Not completely true. While OpenAI and Anthropic require you to set up a payment method and keep a balance, there are MANY other options:
- OpenRouter and many other providers give you free credits to start, and they also offer several models that are free to use.
- HuggingFace has serverless Inference API endpoints that are all free, with pretty generous rate limits (rough sketch after this list).
- Google AI Studio offers their “Experimental” Gemini Flash 2.0+ models for free (they’re quite good and also have generous rate limits; also sketched below).
- Microsoft offers Azure AI models through GitHub Models for free, including GPT-4o.
- You can also run LM Studio or Ollama locally for free with many of the models available on HuggingFace (see the Ollama sketch below)…
- And arguably you could connect it to Kaggle, Google Colab, or various other Jupyter Notebook-style platforms.
- There are even ways to get Anthropic’s Claude 3.5 for free.
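To make the HuggingFace option concrete, here’s a minimal sketch of calling the serverless Inference API with plain HTTP. The model id, prompt, and the `HF_TOKEN` environment variable are just example assumptions; swap in whichever hosted model you like and your own (free) HuggingFace token.

```python
# Minimal sketch: HuggingFace serverless Inference API via plain HTTP.
# Assumes a free HF token in the HF_TOKEN env var; the model id is just an example.
import os
import requests

API_URL = "https://api-inference.huggingface.co/models/mistralai/Mistral-7B-Instruct-v0.3"
headers = {"Authorization": f"Bearer {os.environ['HF_TOKEN']}"}

response = requests.post(
    API_URL,
    headers=headers,
    json={"inputs": "Explain what Bolt.diy does in one sentence."},
)
print(response.json())  # text-generation models return the generated text as JSON
```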
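And a similar sketch for Google AI Studio’s Gemini Flash models over the REST endpoint. The model id (`gemini-2.0-flash-exp`) and the `GOOGLE_API_KEY` env var are assumptions; check AI Studio for the exact id of the free experimental model you want.

```python
# Minimal sketch: Google AI Studio (Gemini) generateContent REST call.
# Assumes an AI Studio key in GOOGLE_API_KEY; the model id is an assumption.
import os
import requests

MODEL = "gemini-2.0-flash-exp"
url = (
    f"https://generativelanguage.googleapis.com/v1beta/models/{MODEL}:generateContent"
    f"?key={os.environ['GOOGLE_API_KEY']}"
)

resp = requests.post(
    url,
    json={"contents": [{"parts": [{"text": "Explain what Bolt.diy does in one sentence."}]}]},
)
print(resp.json()["candidates"][0]["content"]["parts"][0]["text"])
```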
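For the local route, here’s a quick sketch against Ollama’s REST API, which listens on port 11434 once the server is running. The model name is just an example; use whatever you’ve pulled with `ollama pull`.

```python
# Minimal sketch: local Ollama server (default port 11434).
# "llama3" is just an example model you'd have pulled beforehand.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",
        "prompt": "Explain what Bolt.diy does in one sentence.",
        "stream": False,  # return one JSON object instead of a token stream
    },
)
print(resp.json()["response"])
```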
So, there’s really no shortage of options to get started, many of them free and good enough for your average user. But the options can also be overwhelming when you’re first starting out with AI, so I’d suggest just getting started with HuggingFace and going from there.
And if you’re willing to pay, the new DeepSeek-V3 model is quite good and available for under 20¢ per million tokens, which is super cheap. They do charge a fee per transaction though, so I generally just keep a balance on the account and top it up once it gets low.
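If you go that route, DeepSeek exposes an OpenAI-compatible endpoint, so the standard `openai` client works by pointing it at their API. A minimal sketch, assuming your key is in a `DEEPSEEK_API_KEY` env var:

```python
# Minimal sketch: DeepSeek's OpenAI-compatible chat endpoint.
# Assumes a DeepSeek API key in DEEPSEEK_API_KEY.
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["DEEPSEEK_API_KEY"],
    base_url="https://api.deepseek.com",
)

completion = client.chat.completions.create(
    model="deepseek-chat",  # DeepSeek-V3 is served under this model name
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(completion.choices[0].message.content)
```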
P.S. I should also mention that you can even host Bolt.diy (and customize it if you wish) completely for free. See my article on Deploying Bolt.diy with Cloudflare Pages (the easy way!). Or you can just use the one I’m running (you’ll need to provide your own key):
So, you can literally run Bolt.diy COMPLETELY Free! No strings attached.
Good luck!