Ubuntu 24.04 Ollama not working

I have used Unix/Linux for more than a couple of decades, so I decided to build an AI server using Ubuntu Server 24.04. I have 2 Nvidia Tesla P40s and 2 RTX 3090s in the one server, with Ollama installed natively in the OS and all my other AI tools running as Docker containers. I followed the install instructions on the GitHub; however, I don't have Ollama working, and I could only get OpenAI working by having two environment files. The Docker setup requires .env.local, but OpenAI only works properly if I also have .env, so right now I am just using a symlink. The other issue I find interesting is that if you enable only the two options, Ollama and OpenAI, all the other choices are still shown.
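For reference, the symlink workaround I mentioned looks roughly like this (file names are the ones above; the project directory is an assumption on my part):

```shell
# In the project root: keep .env.local as the single source of truth
# and expose the same contents under the second name (.env) that the
# OpenAI setup seems to expect. -s makes a symbolic link, -f replaces
# any existing .env.
ln -sf .env.local .env

# Verify: .env should now show as a link pointing at .env.local.
ls -l .env
```

This avoids maintaining two copies of the same variables, but it obviously shouldn't be necessary if one file were enough.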
Would really appreciate some help getting both working. The interesting thing is that on my laptop, which is also Ubuntu 24.04, I had Ollama working, then updated the code and rebuilt the Docker container, and now it is broken again.