How do I add a custom LLM with Ollama?

So yesterday I was trying to add a smaller model on Ollama and everything seemed fine: I added it to bolt and it showed up, but when I ask it to make a website it only answers in the chat, never in the code. I was speaking with @leex279 and he told me it should be handled in the optimized prompt file:

app/lib/common/prompts/optimized.ts
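From what I can tell, that file builds the system prompt bolt sends to the model, so a model-specific tweak would go there. A rough sketch of the idea (the option names here are illustrative, not bolt's actual API):

```ts
// Illustrative sketch only — bolt's real optimized.ts has a different shape.
interface PromptOptions {
  cwd: string;        // working directory shown to the model (assumed field)
  modelName?: string; // hypothetical field, added here for illustration
}

// The optimized prompt is ultimately just a function returning the system
// prompt string, so a model that ignores artifacts could get extra nudging.
export default (options: PromptOptions): string => {
  const base = `You are Bolt, an expert AI coding assistant working in ${options.cwd}.
Return code changes as file artifacts, never only as chat text.`;
  if (options.modelName?.toLowerCase().includes('lumo')) {
    // Hypothetical model-specific addition for a model that answers in chat only.
    return `${base}\nFor every code request, respond with complete files.`;
  }
  return base;
};
```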

I am trying to add lumolabs-ai/Lumo-70B-Instruct from Hugging Face.

please someone help :pray:

@thecodacus Do you know how to optimize for a specific model? Is there a process to find out what the model/bolt needs to make it work with a separate system prompt?

Can anyone help with this?

Does the same happen when selecting the optimized prompt?
And can you create a custom Modelfile in Ollama to check whether the context size is set correctly?
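For example, a Modelfile along these lines would pin the context window (the base model tag and the num_ctx value are placeholders; adjust them to the model you pulled):

```
# Modelfile — base tag and values are placeholders for your setup
FROM lumo-70b-instruct
# Many models default to a small context (often 2048), which is too little
# for bolt's large system prompt; raise it explicitly here.
PARAMETER num_ctx 32768
# Optionally bake a system prompt into the model itself.
SYSTEM """You are a coding assistant. Always return code as complete files."""
```

Then build it with ollama create and select the resulting model in bolt instead of the original tag.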

Yeah, it's also not working with the optimized prompt.

What exactly do you mean by a Modelfile to see the context size? I think you would then set the context size there?
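So, if I understand right, something like this (the model name is made up by me):

```
# Build a custom model from the Modelfile and check its parameters.
ollama create lumo-custom -f Modelfile
ollama show lumo-custom   # should list num_ctx under Parameters
```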

@thecodacus this at least did not help to get it working.

Any updates on this?