How can we integrate this with APIs or server endpoints from Novita.ai?

How can we integrate this with APIs or server endpoints from Novita.ai? (They have GPU servers and model endpoint access.)


Fantastic question @AwesomeMrT!

Novita’s LLM API is actually OpenAI-compatible.

So you can use the OpenAI-like API integration we have in oTToDev to hook into Novita!
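Since the endpoint is OpenAI-compatible, a plain chat-completions request should work against it. A minimal sketch below; the base URL and model name are assumptions on my part, so check Novita's own docs for the current values:

```shell
# Sketch: call Novita's OpenAI-compatible chat completions endpoint.
# ASSUMPTIONS: the base URL path and the model identifier below may differ --
# verify both against Novita's API documentation before use.
curl https://api.novita.ai/v3/openai/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $NOVITA_API_KEY" \
  -d '{
    "model": "meta-llama/llama-3.1-8b-instruct",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```

If that curl works, the same base URL and key are what you'd plug into oTToDev's OpenAI-like provider settings.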

Cole, thank you.
Novita has many different options, but the API pricing is unaffordable for me. What I would like to do instead is spin up a GPU instance on my Novita account and run an LLM inside it, then expose that LLM's URL (API) to use with our frontend. Any idea how to accomplish this?


Yeah, great question, and yes, you can do this! You just need to get Ollama running on the Novita GPU instance, then within oTToDev change the “Ollama base URL” to point to the IP address of your Novita instance!
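The steps above might look something like this. A rough sketch, assuming a Linux GPU instance; the oTToDev env variable name is an assumption from its example config, so double-check it against your checkout, and make sure the instance's firewall actually allows the port:

```shell
# On the Novita GPU instance: install Ollama and expose it on all interfaces.
# By default Ollama listens only on 127.0.0.1; binding 0.0.0.0 makes it
# reachable from outside. Port 11434 is Ollama's default.
curl -fsSL https://ollama.com/install.sh | sh
OLLAMA_HOST=0.0.0.0:11434 ollama serve &

# Pull a model to serve (example model; pick whichever you want).
ollama pull qwen2.5-coder:7b

# Locally, point oTToDev at the instance (replace the placeholder IP).
# ASSUMPTION: the variable name below is taken from oTToDev's example env
# file; verify it in your copy. Set it in .env.local:
#   OLLAMA_API_BASE_URL=http://<your-novita-instance-ip>:11434
```

One caution: binding Ollama to 0.0.0.0 exposes it to anyone who can reach that IP, so consider restricting access with the instance's firewall rules or a reverse proxy with auth.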