Hi, no matter what I try, I get a white screen when previewing. I am using Ollama locally with deepseek-coder-v2:16b. Any ideas what could be causing this issue?
I have also created a Modelfiles folder to edit the Ollama models.
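For anyone trying the same Modelfile route: a fix often suggested for blank previews is raising the model's context window, since the default can truncate longer generations. A minimal sketch (the base model tag and the `num_ctx` value are assumptions you should adjust for your setup):

```
# Modelfile — extend the context window for deepseek-coder-v2
FROM deepseek-coder-v2:16b
PARAMETER num_ctx 32768
```

Then build and use the custom model with `ollama create deepseek-coder-v2-32k -f Modelfile` and point the app at `deepseek-coder-v2-32k`. No guarantee this fixes the white screen, but it rules out context truncation.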
Using DeepSeek-Coder-V2 236B from OpenRouter works better than running Ollama locally; at least I get my preview now.
It will be great when the team figures out how to support these local LLMs.
Building a simple WhatsApp clone.
How did you solve that? I know I'm a total noob, but no matter what I try, I still get nothing in the preview. I noticed I don't even have a URL there. Any ideas?
Here is what I am testing at the moment. I have/had the same issue and just could not get the preview to work, even after pulling the latest code. So I added an explicit instruction at the end of my prompt: "Be sure results display in the preview pane of this application." It is working so far; I will continue testing on more intricate applications. Hope this helps.
What did you do after the last message that says "npm run dev"? ("To view the updated landing page, simply run 'npm run dev' in your terminal…")