Archon OpenAI / Streamlit issue

Hi, massive noob here. I am trying to set up Archon, and I have almost got it working, I think. It looked way easier in the YouTube video!

I have been trying to set up the Streamlit UI, but I am getting an error:
openai.OpenAIError: This app has encountered an error. The original error message is redacted to prevent data leaks. Full error details have been recorded in the logs (if you’re on Streamlit Cloud, click on ‘Manage app’ in the lower right of your app).

Traceback:

File "/mount/src/archon-v3/streamlit_ui.py", line 30, in <module>
    from archon.archon_graph import agentic_flow
File "/mount/src/archon-v3/archon/archon_graph.py", line 51, in <module>
    openai_client = AsyncOpenAI(api_key=os.getenv("OPENAI_API_KEY"))
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/adminuser/venv/lib/python3.12/site-packages/openai/_client.py", line 337, in __init__
    raise OpenAIError(

I troubleshot it for hours with AI. We tried everything, multiple times. It's probably something easy, but it's too late and I'm too frustrated!

Any help much appreciated.

Thanks

Sorry you are running into this! Do you have your OpenAI API key set? Even if you're using local LLMs, you still need OpenAI for the embedding model to create the vectors for RAG. I am making it possible to use Ollama for embeddings as well in V4, but for V3 it's not quite ready yet.

Also, have you checked the full error logs that it says were redacted?
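Here's a quick sanity check you can run first. It's just a rough sketch; it assumes the key is read from OPENAI_API_KEY the way the traceback above shows, and that you're loading it from your .env (or Streamlit secrets if you're on Streamlit Cloud):

import os
from openai import AsyncOpenAI

api_key = os.getenv("OPENAI_API_KEY")
if not api_key:
    # AsyncOpenAI raises OpenAIError when no key is provided, which matches the redacted error above
    raise SystemExit("OPENAI_API_KEY is not set - add it to your .env (or Streamlit secrets) and restart")

openai_client = AsyncOpenAI(api_key=api_key)
print("OpenAI client created OK")

If that script exits with the message about the missing key, the Streamlit error is just the same problem showing up at import time.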

It's all good, I did a clean install of Archon V4! And it is sort of working, but I have a database error. I've run the SQL query and the DB is created.

Archon message - Error checking table status: [Errno 8] nodename nor servname provided, or not known

Proceeding with the assumption that the table needs to be created.

Supabase message:

ERROR: 42P07: relation "site_pages" already exists

I will keep troubleshooting with AI for now.

Keen as to start building, but learning heaps as I go!

Thanks!

I can connect to the Agent!
— SHOWING NEWEST LOGS FIRST (AUTO-SCROLL MODE) —

INFO: Uvicorn running on http://127.0.0.1:8100 (Press CTRL+C to quit)
INFO: Application startup complete.
INFO: Waiting for application startup.
INFO: Started server process [15841]
[14:53:44] Agent service started
[14:53:44] Killed process using port 8100 (PID: 15511)
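For anyone verifying the same thing, here's a quick generic way to confirm something really is listening on 8100 (just a plain socket check, nothing Archon-specific):

import socket

# Raises ConnectionRefusedError or times out if nothing is listening on the port
with socket.create_connection(("127.0.0.1", 8100), timeout=2):
    print("Agent service is reachable on 127.0.0.1:8100")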

Fantastic!

Is that DB error still happening? I've made some more changes recently as well.
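If it is, that "[Errno 8] nodename nor servname provided, or not known" is a DNS lookup failure, which usually means the Supabase URL in your environment is empty or has a typo. The 'relation "site_pages" already exists' message just means the table was already created, so the SQL side looks fine. Here's a rough check you could run (the env var names are only my guess at what your .env uses, so adjust them):

import os
from supabase import create_client

url = os.getenv("SUPABASE_URL")          # hypothetical names - match whatever your .env actually uses
key = os.getenv("SUPABASE_SERVICE_KEY")
print("SUPABASE_URL =", repr(url))       # an empty or malformed URL is what produces the [Errno 8] DNS error

client = create_client(url, key)
# If the URL/key are right and the table exists, this returns a result (possibly with an empty data list)
result = client.table("site_pages").select("*").limit(1).execute()
print("site_pages reachable, sample rows:", result.data)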

I'm getting this error:
01:22:10.264 preparing model and tools run_step=1
01:22:10.266 model request
INFO:openai._base_client:Retrying request to /chat/completions in 0.434398 seconds
INFO:httpx:HTTP Request: POST https://openrouter.ai/api/v1/chat/completions "HTTP/1.1 200 OK"
01:22:27.222 handle model response
01:22:27.248 pydantic_ai_coder run prompt=Build me an AI agent that can search the web with the Brave API
01:22:27.250 preparing model and tools run_step=1
01:22:27.251 model request
INFO:httpx:HTTP Request: POST https://openrouter.ai/api/v1/chat/completions "HTTP/1.1 200 OK"
2025-03-16 18:22:29.077 Uncaught app execution
Traceback (most recent call last):
File "C:\Users\DELL\AppData\Local\Programs\Python\Python313\Lib\site-packages\streamlit\runtime\scriptrunner\exec_code.py", line 88, in exec_func_with_error_handling
result = func()
File "C:\Users\DELL\AppData\Local\Programs\Python\Python313\Lib\site-packages\streamlit\runtime\scriptrunner\script_runner.py", line 579, in code_to_exec
exec(code, module.__dict__)
~~~~^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\DELL\Desktop\git project\archon\streamlit_ui.py", line 114, in <module>
asyncio.run(main())
~~~~~~~~~~~^^^^^^^^
File "C:\Users\DELL\AppData\Local\Programs\Python\Python313\Lib\asyncio\runners.py", line 195, in run
return runner.run(main)
~~~~~~~~~~^^^^^^
File "C:\Users\DELL\AppData\Local\Programs\Python\Python313\Lib\asyncio\runners.py", line 118, in run
return self.loop.run_until_complete(task)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^
File "C:\Users\DELL\AppData\Local\Programs\Python\Python313\Lib\asyncio\base_events.py", line 725, in run_until_complete
return future.result()
~~~~~~~~~~~~~^^
File "C:\Users\DELL\Desktop\git project\archon\streamlit_ui.py", line 93, in main
await chat_tab()
File "C:\Users\DELL\Desktop\git project\archon\streamlit_pages\chat.py", line 73, in chat_tab
async for chunk in run_agent_with_streaming(user_input):
…<2 lines>…
message_placeholder.markdown(response_content)
File "C:\Users\DELL\Desktop\git project\archon\streamlit_pages\chat.py", line 30, in run_agent_with_streaming
async for msg in agentic_flow.astream(
…<2 lines>…
yield msg
File "C:\Users\DELL\AppData\Local\Programs\Python\Python313\Lib\site-packages\langgraph\pregel\__init__.py", line 2007, in astream
async for _ in runner.atick(
…<7 lines>…
yield o
File "C:\Users\DELL\AppData\Local\Programs\Python\Python313\Lib\site-packages\langgraph\pregel\runner.py", line 527, in atick
_panic_or_proceed(
~~~~~~~~~~~~~~~~~^
futures.done.union(f for f, t in futures.items() if t is not None),
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
timeout_exc_cls=asyncio.TimeoutError,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
panic=reraise,
^^^^^^^^^^^^^^
)
^
File "C:\Users\DELL\AppData\Local\Programs\Python\Python313\Lib\site-packages\langgraph\pregel\runner.py", line 619, in _panic_or_proceed
raise exc
File "C:\Users\DELL\AppData\Local\Programs\Python\Python313\Lib\site-packages\langgraph\pregel\retry.py", line 128, in arun_with_retry
return await task.proc.ainvoke(task.input, config)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\DELL\AppData\Local\Programs\Python\Python313\Lib\site-packages\langgraph\utils\runnable.py", line 532, in ainvoke
input = await step.ainvoke(input, config, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\DELL\AppData\Local\Programs\Python\Python313\Lib\site-packages\langgraph\utils\runnable.py", line 320, in ainvoke
ret = await asyncio.create_task(coro, context=context)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\DELL\Desktop\git project\archon\archon\archon_graph.py", line 124, in coder_agent
result = await pydantic_ai_coder.run(state['latest_user_message'], deps=deps, message_history= message_history)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\DELL\AppData\Local\Programs\Python\Python313\Lib\site-packages\pydantic_ai\agent.py", line 340, in run
end_result, _ = await graph.run(
^^^^^^^^^^^^^^^^
…<4 lines>…
)
^
File "C:\Users\DELL\AppData\Local\Programs\Python\Python313\Lib\site-packages\pydantic_graph\graph.py", line 187, in run
next_node = await self.next(next_node, history, state=state, deps=deps, infer_name=False)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\DELL\AppData\Local\Programs\Python\Python313\Lib\site-packages\pydantic_graph\graph.py", line 263, in next
next_node = await node.run(ctx)
^^^^^^^^^^^^^^^^^^^
File "C:\Users\DELL\AppData\Local\Programs\Python\Python313\Lib\site-packages\pydantic_ai\_agent_graph.py", line 254, in run
model_response, request_usage = await agent_model.request(ctx.state.message_history, model_settings)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\Users\DELL\AppData\Local\Programs\Python\Python313\Lib\site-packages\pydantic_ai\models\openai.py", line 168, in request
return self._process_response(response), _map_usage(response)
~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^
File "C:\Users\DELL\AppData\Local\Programs\Python\Python313\Lib\site-packages\pydantic_ai\models\openai.py", line 224, in _process_response
timestamp = datetime.fromtimestamp(response.created, tz=timezone.utc)
TypeError: 'NoneType' object cannot be interpreted as an integer
During task with name 'coder_agent' and id '710c833c-bd75-9c65-dfea-3db09b772285'
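That final TypeError comes from response.created being None when pydantic_ai builds the timestamp, which is usually a sign that the body OpenRouter sent back was not a normal chat completion (for example an error or empty payload wrapped in a 200 response). Here is a sketch of the shape of the guard, just to show where it fails (this is not an official pydantic-ai patch):

from datetime import datetime, timezone

def safe_created_timestamp(created):
    # Mirrors the failing line in pydantic_ai/models/openai.py (_process_response),
    # but falls back to "now" when the provider omits the created field instead of crashing.
    if created is None:
        return datetime.now(timezone.utc)
    return datetime.fromtimestamp(created, tz=timezone.utc)

Retrying the request, switching the model, or logging the raw OpenRouter response is a reasonable way to confirm whether the provider payload is the problem.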