Error: model requires more system memory

I’m running llama3.2 and I’m getting an error “model requires more system memory (23.3 GiB) than is available (19.7)”. Does this sound right? Unfortunately I think I won’t be able to run this locally…

Hi Michael,

You get that error when there isn't enough free RAM to load a model that large.
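
If you want a rough sanity check before loading a model, a small Python sketch like this (assuming you have psutil installed; this is just an illustration, not the exact check Ollama performs internally) compares the figure from the error message against the RAM that's currently free:

```python
# Rough sanity check, not what Ollama does internally: compare the figure
# reported in the error message against the RAM currently available.
# Assumes psutil is installed (pip install psutil).
import psutil

required_gib = 23.3  # number taken from the Ollama error message above
available_gib = psutil.virtual_memory().available / (1024 ** 3)

print(f"available: {available_gib:.1f} GiB, model needs: {required_gib} GiB")
if available_gib < required_gib:
    print("Not enough free memory: close other applications or try a smaller / more quantized model.")
```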


I’m getting the same error using qwen2.5-coder:14b, and it says the model needs 45.7 GB, which seems incorrect.

I was getting the same error with Ollama. The figure sounded incorrect, but I went ahead and freed up some RAM for Ollama anyway, and that did the trick.
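
If you're not sure what to close to free memory, a quick psutil sketch like the one below (my own helper, assuming Python and psutil are available; not something Ollama provides) lists the biggest memory consumers. Close enough of the top entries and try running the model again.

```python
# Quick look at which processes are using the most RAM, so you know
# what to close before starting Ollama. Assumes psutil is installed.
import psutil

procs = []
for p in psutil.process_iter(["name", "memory_info"]):
    mem = p.info.get("memory_info")
    if mem is None:  # skip processes we can't inspect
        continue
    procs.append((mem.rss, p.info.get("name") or "?"))

# Print the ten biggest memory consumers, largest first.
for rss, name in sorted(procs, reverse=True)[:10]:
    print(f"{rss / (1024 ** 3):6.2f} GiB  {name}")
```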