Can't add the 8k token parameter to Ollama

I use the Docker CLI. I'm typing the commands against the docker-compose.yml setup, trying to give my qwen2.5:3b model an 8k token context size using the Modelfile that comes with the package, but I get an error.
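
Roughly, the Modelfile content I'm trying to end up with is just this (a sketch; I'm assuming num_ctx 8192 is the right value for an 8k context):

FROM qwen2.5:3b-instruct-q4_K_M

PARAMETER num_ctx 8192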

Below are the logs, in case they help:

nick@nick-pc:~/local-ai-packaged$ docker ps
CONTAINER ID   IMAGE                                COMMAND                   CREATED             STATUS                          PORTS                                                                                                          NAMES
8280cd02e164   n8nio/n8n:latest                     "tini -- /docker-ent…"    About an hour ago   Up About an hour                0.0.0.0:5678->5678/tcp, [::]:5678->5678/tcp                                                                    n8n
f6cb1df1a70d   ghcr.io/open-webui/open-webui:main   "bash start.sh"           About an hour ago   Up About an hour (healthy)      0.0.0.0:3000->8080/tcp, [::]:3000->8080/tcp                                                                    open-webui
313227e1dc94   qdrant/qdrant                        "./entrypoint.sh"         About an hour ago   Up About an hour                0.0.0.0:6333->6333/tcp, [::]:6333->6333/tcp, 6334/tcp                                                          qdrant
062bc697514a   searxng/searxng:latest               "/sbin/tini -- /usr/…"    About an hour ago   Restarting (1) 14 seconds ago                                                                                                                  searxng
51088c7dbe51   flowiseai/flowise                    "/bin/sh -c 'sleep 3…"    About an hour ago   Up About an hour                0.0.0.0:3001->3001/tcp, [::]:3001->3001/tcp                                                                    flowise
f5908a756100   caddy:2-alpine                       "caddy run --config …"    About an hour ago   Up About an hour                                                                                                                               caddy
1427ffcf6265   valkey/valkey:8-alpine               "docker-entrypoint.s…"    About an hour ago   Up About an hour                6379/tcp                                                                                                       redis
5184a8c5128b   supabase/storage-api:v1.19.3         "docker-entrypoint.s…"    About an hour ago   Up About an hour (healthy)      5000/tcp                                                                                                       supabase-storage
df1463551eac   kong:2.8.1                           "bash -c 'eval \"echo…"   About an hour ago   Up About an hour (healthy)      0.0.0.0:8000->8000/tcp, [::]:8000->8000/tcp, 8001/tcp, 0.0.0.0:8443->8443/tcp, [::]:8443->8443/tcp, 8444/tcp   supabase-kong
26417df2d73e   supabase/studio:20250317-6955350     "docker-entrypoint.s…"    About an hour ago   Up About an hour (healthy)      3000/tcp                                                                                                       supabase-studio
2e009589754e   supabase/postgres-meta:v0.87.1       "docker-entrypoint.s…"    About an hour ago   Up About an hour (healthy)      8080/tcp                                                                                                       supabase-meta
06ca5223e3c1   supabase/supavisor:2.4.14            "/usr/bin/tini -s -g…"    About an hour ago   Restarting (1) 18 seconds ago                                                                                                                  supabase-pooler
9aac64c85d23   postgrest/postgrest:v12.2.8          "postgrest"               About an hour ago   Up About an hour                3000/tcp                                                                                                       supabase-rest
0dc15bfb42fe   supabase/gotrue:v2.170.0             "auth"                    About an hour ago   Up About an hour (healthy)                                                                                                                     supabase-auth
478f8206884a   supabase/realtime:v2.34.43           "/usr/bin/tini -s -g…"    About an hour ago   Up About an hour (healthy)                                                                                                                     realtime-dev.supabase-realtime
5862baa19782   supabase/edge-runtime:v1.67.4        "edge-runtime start …"    About an hour ago   Up About an hour                                                                                                                               supabase-edge-functions
12d3c8c55f59   supabase/logflare:1.12.0             "sh run.sh"               About an hour ago   Up About an hour (healthy)      0.0.0.0:4000->4000/tcp, [::]:4000->4000/tcp                                                                    supabase-analytics
df9781573e0c   supabase/postgres:15.8.1.060         "docker-entrypoint.s…"    About an hour ago   Up About an hour (healthy)      5432/tcp                                                                                                       supabase-db
2915b30bf91c   darthsim/imgproxy:v3.8.0             "imgproxy"                About an hour ago   Up About an hour (healthy)      8080/tcp                                                                                                       supabase-imgproxy
5990cdca8124   timberio/vector:0.28.1-alpine        "/usr/local/bin/vect…"    About an hour ago   Up About an hour (healthy)                                                                                                                     supabase-vector
d7c63dfac2a4   ollama/ollama:rocm                   "/bin/ollama serve"       18 hours ago        Up About an hour                0.0.0.0:11434->11434/tcp, [::]:11434->11434/tcp                                                                ollama
nick@nick-pc:~/local-ai-packaged$ docker exec ollama ollama list
NAME                          ID              SIZE      MODIFIED          
nomic-embed-text:latest       0a109f422b47    274 MB    About an hour ago    
qwen2.5:7b-instruct-q4_K_M    845dbda0ea48    4.7 GB    About an hour ago    
qwen2.5:3b-instruct-q4_K_M    357c53fb659c    1.9 GB    17 hours ago         
nick@nick-pc:~/local-ai-packaged$ docker exec ollama echo "FROM qwen2.5:7b-instruct-q4_K_M\n\nPARAMETER num_ctx 8096" > Modelfile
nick@nick-pc:~/local-ai-packaged$ docker exec ollama ollama create qwen2.5:7b-8k -f ./Modelfile
gathering model components 
Error: no Modelfile or safetensors files found

Which Modelfile are you referring to? Within the default local-ai-packaged, no Modelfile is included.


This is my folder:


nick@nick-pc:~/local-ai-packaged$ ls
assets              Local_RAG_AI_Agent_n8n_Workflow.json  README.md
Caddyfile           Modelfile                             searxng
docker-compose.yml  n8n                                   shared
flowise             n8n_pipe.py                           start_services.py
LICENSE             n8n-tool-workflows                    supabase

I tried running the command, but it can't find it:

nick@nick-pc:~/local-ai-packaged$ docker exec ollama ollama create qwen2.5:3b-8k -f /home/nick/local-ai-packaged/Modelfile 
gathering model components 
Error: no Modelfile or safetensors files found

You use docker exec, which means the command is executed inside the container and not on your local PC.
So the file needs to be inside the container. Therefore you need to mount it into the container via volumes.

You can add your mapping in the volumes section of the ollama service in docker-compose.yml.
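
For example, something along these lines (a sketch; ./Modelfile assumes the file sits next to your docker-compose.yml, and /root/Modelfile is just an arbitrary path inside the container):

volumes:
  - ./Modelfile:/root/Modelfile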


So it has to be changed to

ollama_storage : my_path_to_modelfile

Should the mounted local storage contain anything else besides the Modelfile? Should I pull the whole model there?

No, this stays as is; you just add a second mount. You can see an example in the n8n service:
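
A sketch of how the ollama service's volumes could end up looking (the ollama_storage line is whatever you already have, and /root/Modelfile is just an assumed path inside the container):

  ollama:
    # ... rest of the service definition stays unchanged
    volumes:
      - ollama_storage:/root/.ollama      # existing named volume for the models, keep it as is
      - ./Modelfile:/root/Modelfile       # second mount: the Modelfile from your project folder

Then recreate the ollama container the way you normally start the stack, and run the create command against the path inside the container:

docker exec ollama ollama create qwen2.5:3b-8k -f /root/Modelfile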
