I’m running llama3.2 and I’m getting an error “model requires more system memory (23.3 GiB) than is available (19.7)”. Does this sound right? Unfortunately I think I won’t be able to run this locally…
Hi Michael,
you get that error when you don't have enough free RAM to load the model. Either free up memory, use a smaller or more heavily quantized variant of the model, or run it on a machine with more RAM.
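For what it's worth, you can sanity-check those numbers with a back-of-envelope estimate. This is only a sketch, assuming roughly 20% runtime overhead on top of the raw weights; real usage also depends on context length and the KV cache, so treat the output as a ballpark, not what Ollama actually computes:

```python
def estimate_model_memory_gib(params_billion, bits_per_weight, overhead=1.2):
    """Rough RAM estimate in GiB: raw weight size plus ~20% overhead.

    params_billion  -- parameter count in billions (e.g. 14 for a 14B model)
    bits_per_weight -- 16 for FP16, 4 for a 4-bit quantization, etc.
    """
    weight_bytes = params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 2**30

# A 14B model at 4-bit quantization fits in well under 10 GiB,
# while the same model at FP16 needs over 30 GiB.
print(round(estimate_model_memory_gib(14, 4), 1))   # ~7.8 GiB
print(round(estimate_model_memory_gib(14, 16), 1))  # ~31.3 GiB
```

So if a model reports a requirement far above what this estimate suggests for its default quantization, it may be loading an unquantized variant or reserving a large context window.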
I’m getting the same error with qwen2.5-coder:14b, and it says the model needs 45.7 GB, which seems incorrect for a 14B model.