Have this error


Why do I keep getting this error? I cannot use Archon. I got this when I entered the env variables and clicked enter.

Just had the same issue. I think you have a misconfiguration or a missing environment variable, which causes the error. In my case I did not set the Supabase key, which led to this error. At the moment there is no way back within the UI. @ColeMedin told me he already knows about this error and it will be fixed.

To get it working, you can set the variables via the .env file instead of in the UI (rename .env.example to .env and fill in the values there), as sketched below.
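For reference, a filled-out .env could look roughly like this. The exact variable names come from .env.example in your Archon checkout; the names and values below are only illustrative placeholders, so copy the real ones from that file:

    # Illustrative only – use the exact variable names from .env.example
    SUPABASE_URL=https://your-project-id.supabase.co
    SUPABASE_SERVICE_KEY=your-supabase-service-role-key
    LLM_API_KEY=sk-your-openai-or-anthropic-key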

1 Like

Ohh I see, thanks man, I’ll try it

1 Like

Should be fixed now!

1 Like

Hello, it appears that I am getting the same issue. Every time I put in a command it keeps coming up with “This app has encountered an error”. I have also tried renaming the file to .env. Can you give me some advice on how to solve this issue? Thank you.

Hi,
where do you put in the command when this error comes up? Did you watch my video, and can you tell me at which time/step you got stuck?

The command I used for the chat was “Hi”, and then the error message popped up.
Yes, I have watched your video all the way through. The only problem I had was in the documentation section, where it says the “crawling process has picked up 5 failed URLs”, so I’m not sure if this is the issue.
I’m using Docker to set up Archon.
Thank you for the response.

Do you see documents in the Supabase table?

Unable to add a new comment, so I had to edit my previous message.

Yes, there is data in the Supabase table. Once it has finished the crawling process, it says there is 1 failed URL.

No, I mean whether there is actual data in the Supabase table, as shown in my install video https://www.youtube.com/watch?v=M5uz9DEZtkc at about 17:20.
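If you prefer the command line over the dashboard, you can also query the site_pages table directly via the Supabase REST API (the project URL and service key below are placeholders, substitute your own):

    curl "https://your-project-id.supabase.co/rest/v1/site_pages?select=url&metadata-%3E%3Esource=eq.pydantic_ai_docs" \
      -H "apikey: $SUPABASE_SERVICE_KEY" \
      -H "Authorization: Bearer $SUPABASE_SERVICE_KEY"

If that returns an empty list, the crawl did not actually store anything.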

Hello, I have the same problem reported by user damienjh.
All the extracted documentation is present in Supabase; the tables and database have been populated correctly. The problem is that when I start the agent and ask questions in the chat, it generates the same error.
I have set the API keys correctly, both in the Streamlit UI and in the .env file, as well as in the env_vars.json file inside the container, but I can’t fix it.
In the desktop container app, I inspected the logs and I see that it reports access problems with the OpenAI API.
This is very strange, considering that downloading the Pydantic documentation and populating the Supabase database correctly already requires the API.
I confirm that I have entered the API keys everywhere.
I confirm that the database has been created and populated correctly (more than 600 documents extracted).
I checked the database and did not see any problems. The summary titles and content were generated correctly (at the modest cost of a few tens of cents in API credits).
Do you have any idea what it could be and how to solve the problem?

What specific errors do you get?
Did you test your API key outside of Archon, e.g. with a curl or in bolt.diy? (I had similar problems and I was so sure everything was correct, but then I tried the key outside of Archon and it was not working; not sure what it was, maybe copy/paste problems with whitespace etc.) Just make sure you verify this is working.
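For example, just listing the models your key can access is enough as a smoke test (standard OpenAI endpoint; replace the placeholder key with yours):

    curl https://api.openai.com/v1/models \
      -H "Authorization: Bearer sk-your-key-here"

If this comes back with a 401, the key itself (or stray whitespace around it) is the problem, independent of Archon.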

1 Like

The API keys are working: when I downloaded the Pydantic documentation, it deducted some credits, so that means the keys are working.
I also tried with the Claude API; I have the same problem.

Attached log:

 
2025-03-23 13:42:54     User AI Agent Request: hi
2025-03-23 13:42:54     
2025-03-23 13:42:54     Create detailed scope ...o creating this agent for the user in the scope document.
2025-03-23 13:42:54     
2025-03-23 13:42:54 12:42:54.846   preparing model and tools run_step=1
2025-03-23 13:42:54 12:42:54.846   model request
2025-03-23 14:46:55   Stopping...
2025-03-23 14:47:08 
2025-03-23 14:47:08 Collecting usage statistics. To deactivate, set browser.gatherUsageStats to false.
2025-03-23 14:47:08 
2025-03-23 14:47:08 
2025-03-23 14:47:08   You can now view your Streamlit app in your browser.
2025-03-23 14:47:08 
2025-03-23 14:47:08   URL: http://0.0.0.0:****
2025-03-23 14:47:08 
2025-03-23 14:47:32 INFO:httpx:HTTP Request: GET https://*********************.supabase.co/rest/v1/site_pages?select=url&metadata-%3E%3Esource=eq.pydantic_ai_docs "HTTP/2 200 OK"
2025-03-23 14:47:32 13:47:32.862 reasoner run prompt=
2025-03-23 14:47:32     User AI Agent Request: hi
2025-03-23 14:47:32     
2025-03-23 14:47:32     Create detailed scope ...o creating this agent for the user in the scope document.
2025-03-23 14:47:32     
2025-03-23 14:47:32 13:47:32.883   preparing model and tools run_step=1
2025-03-23 14:47:32 13:47:32.883   model request
2025-03-23 14:47:33 INFO:httpx:HTTP Request: POST https://api.openai.com/v1/chat/completions "HTTP/1.1 404 Not Found"
2025-03-23 14:47:33 2025-03-23 13:47:33.562 Uncaught app execution
2025-03-23 14:47:33 Traceback (most recent call last):
2025-03-23 14:47:33   File "/usr/local/lib/python3.12/site-packages/streamlit/runtime/scriptrunner/exec_code.py", line 88, in exec_func_with_error_handling
2025-03-23 14:47:33     result = func()
2025-03-23 14:47:33              ^^^^^^
2025-03-23 14:47:33   File "/usr/local/lib/python3.12/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 579, in code_to_exec
2025-03-23 14:47:33     exec(code, module.__dict__)
2025-03-23 14:47:33   File "/app/streamlit_ui.py", line 114, in <module>
2025-03-23 14:47:33     asyncio.run(main())
2025-03-23 14:47:33   File "/usr/local/lib/python3.12/asyncio/runners.py", line 195, in run
2025-03-23 14:47:33     return runner.run(main)
2025-03-23 14:47:33            ^^^^^^^^^^^^^^^^
2025-03-23 14:47:33   File "/usr/local/lib/python3.12/asyncio/runners.py", line 118, in run
2025-03-23 14:47:33     return self._loop.run_until_complete(task)
2025-03-23 14:47:33            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-03-23 14:47:33   File "/usr/local/lib/python3.12/asyncio/base_events.py", line 691, in run_until_complete
2025-03-23 14:47:33     return future.result()
2025-03-23 14:47:33            ^^^^^^^^^^^^^^^
2025-03-23 14:47:33   File "/app/streamlit_ui.py", line 93, in main
2025-03-23 14:47:33     await chat_tab()
2025-03-23 14:47:33   File "/app/streamlit_pages/chat.py", line 81, in chat_tab
2025-03-23 14:47:33     async for chunk in run_agent_with_streaming(user_input):
2025-03-23 14:47:33   File "/app/streamlit_pages/chat.py", line 30, in run_agent_with_streaming
2025-03-23 14:47:33     async for msg in agentic_flow.astream(
2025-03-23 14:47:33   File "/usr/local/lib/python3.12/site-packages/langgraph/pregel/__init__.py", line 2007, in astream
2025-03-23 14:47:33     async for _ in runner.atick(
2025-03-23 14:47:33   File "/usr/local/lib/python3.12/site-packages/langgraph/pregel/runner.py", line 527, in atick
2025-03-23 14:47:33     _panic_or_proceed(
2025-03-23 14:47:33   File "/usr/local/lib/python3.12/site-packages/langgraph/pregel/runner.py", line 619, in _panic_or_proceed
2025-03-23 14:47:33     raise exc
2025-03-23 14:47:33   File "/usr/local/lib/python3.12/site-packages/langgraph/pregel/retry.py", line 128, in arun_with_retry
2025-03-23 14:47:33     return await task.proc.ainvoke(task.input, config)
2025-03-23 14:47:33            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-03-23 14:47:33   File "/usr/local/lib/python3.12/site-packages/langgraph/utils/runnable.py", line 532, in ainvoke
2025-03-23 14:47:33     input = await step.ainvoke(input, config, **kwargs)
2025-03-23 14:47:33             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-03-23 14:47:33   File "/usr/local/lib/python3.12/site-packages/langgraph/utils/runnable.py", line 320, in ainvoke
2025-03-23 14:47:33     ret = await asyncio.create_task(coro, context=context)
2025-03-23 14:47:33           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-03-23 14:47:33   File "/app/archon/archon_graph.py", line 102, in define_scope_with_reasoner
2025-03-23 14:47:33     result = await reasoner.run(prompt)
2025-03-23 14:47:33              ^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-03-23 14:47:33   File "/usr/local/lib/python3.12/site-packages/pydantic_ai/agent.py", line 340, in run
2025-03-23 14:47:33     end_result, _ = await graph.run(
2025-03-23 14:47:33                     ^^^^^^^^^^^^^^^^
2025-03-23 14:47:33   File "/usr/local/lib/python3.12/site-packages/pydantic_graph/graph.py", line 187, in run
2025-03-23 14:47:33     next_node = await self.next(next_node, history, state=state, deps=deps, infer_name=False)
2025-03-23 14:47:33                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-03-23 14:47:33   File "/usr/local/lib/python3.12/site-packages/pydantic_graph/graph.py", line 263, in next
2025-03-23 14:47:33     next_node = await node.run(ctx)
2025-03-23 14:47:33                 ^^^^^^^^^^^^^^^^^^^
2025-03-23 14:47:33   File "/usr/local/lib/python3.12/site-packages/pydantic_ai/_agent_graph.py", line 254, in run
2025-03-23 14:47:33     model_response, request_usage = await agent_model.request(ctx.state.message_history, model_settings)
2025-03-23 14:47:33                                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-03-23 14:47:33   File "/usr/local/lib/python3.12/site-packages/pydantic_ai/models/openai.py", line 167, in request
2025-03-23 14:47:33     response = await self._completions_create(messages, False, cast(OpenAIModelSettings, model_settings or {}))
2025-03-23 14:47:33                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-03-23 14:47:33   File "/usr/local/lib/python3.12/site-packages/pydantic_ai/models/openai.py", line 203, in _completions_create
2025-03-23 14:47:33     return await self.client.chat.completions.create(
2025-03-23 14:47:33            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-03-23 14:47:33   File "/usr/local/lib/python3.12/site-packages/openai/resources/chat/completions.py", line 1720, in create
2025-03-23 14:47:33     return await self._post(
2025-03-23 14:47:33            ^^^^^^^^^^^^^^^^^
2025-03-23 14:47:33   File "/usr/local/lib/python3.12/site-packages/openai/_base_client.py", line 1849, in post
2025-03-23 14:47:33     return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
2025-03-23 14:47:33            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2025-03-23 14:47:33   File "/usr/local/lib/python3.12/site-packages/openai/_base_client.py", line 1543, in request
2025-03-23 14:47:33     return await self._request(
2025-03-23 14:47:33            ^^^^^^^^^^^^^^^^^^^^
2025-03-23 14:47:33   File "/usr/local/lib/python3.12/site-packages/openai/_base_client.py", line 1644, in _request
2025-03-23 14:47:33     raise self._make_status_error_from_response(err.response) from None
2025-03-23 14:47:33 openai.NotFoundError: Error code: 404 - {'error': {'message': 'The model `o3-mini` does not exist or you do not have access to it.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}
2025-03-23 14:47:33 During task with name 'define_scope_with_reasoner' and id '*****************************'

Are you on the free tier of the OpenAI API? It looks like it. As far as I know, o3-mini is not available in tiers lower than “Tier 1”, which you get when spending $50+.
I guess that’s the problem.
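If you want to confirm it outside Archon, you can ask the OpenAI API for that specific model with your key (standard models endpoint; the key below is a placeholder):

    curl https://api.openai.com/v1/models/o3-mini \
      -H "Authorization: Bearer sk-your-key-here"

If this returns the same model_not_found error, it is a tier/account limitation rather than an Archon bug, and picking a model your tier can access (e.g. gpt-4o-mini) should get you past it.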

1 Like

In fact, I don’t have an OpenAI subscription, only API credits. How do you suggest I set up the environment to use my credits without an OpenAI subscription? Which models should be used for embedding, reasoning and primary?

You know, I wanted to follow your tutorial to the letter, and this was not specified!
Alternatively, I would be grateful if you could clarify the environment settings for using Anthropic.
Finally, to close the circle, it would also be nice to understand how to get the best performance by combining OpenAI and Anthropic.

I really appreciate both your work and Coleam00’s.
This information would be of great help to the community.

Thank you for your time
Greetings
Luigi

1 Like

My tutorial just follows the defaults Cole specified in Archon within the “?” icons. As this is just an example, you can also just use gpt-4o-mini, which should also work fine, I guess.

OpenAI subscription => You don’t need a ChatGPT subscription. The API key credits are enough; you just need to load your wallet with $50+ to get Tier 1. This does not mean you really have to spend those credits. I bought them months ago and have only spent $6 of it so far.

Instead of going with OpenAI, you can also use Ollama (see examples in ?)

See above. The examples are in the tooltips:
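For reference, a rough sketch of what the model settings could look like in the .env. Treat the variable names and values as illustrative; the authoritative ones are in .env.example and the “?” tooltips:

    # OpenAI on a lower tier (gpt-4o-mini instead of o3-mini)
    REASONER_MODEL=gpt-4o-mini
    PRIMARY_MODEL=gpt-4o-mini
    EMBEDDING_MODEL=text-embedding-3-small

    # Or point the base URL at a local Ollama instance (OpenAI-compatible API)
    # BASE_URL=http://localhost:11434/v1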

Hope it helps and thanks for the feedback :slight_smile:

2 Likes

Thank you and I appreciate the feedback as well!

Thanks @leex279 for all your help here too

2 Likes

I do not see where anyone has fixed this error. Every suggested fix is failing for me (could be me). So, if anyone has actually fixed this (vs giving up), I would appreciate it if you would post/re-post the fix. Thank you

Not fixed. I’m having the same problem and have uninstalled, reinstalled, and reset the environment variables 4 times with no luck. I’ll try the .env fix and see what happens.