This would be a huge help. RunPod pods, for example, let you rent serious VRAM (e.g. 400 VRAM), but the challenge is that you have to set up the AI yourself, including downloading the model and the software that lets you connect e.g. bolt.diy to your remote instance.
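To show the kind of thing I mean, here's a rough sketch of the steps as I understand them, assuming you run Ollama on the pod (the model name, port, and the bolt.diy env variable are my guesses, not verified):

```shell
# --- On the RunPod pod (SSH in first) ---

# Install Ollama (the official install script)
curl -fsSL https://ollama.com/install.sh | sh

# Make Ollama listen on all interfaces so the pod's exposed
# port is reachable from outside (11434 is Ollama's default port)
export OLLAMA_HOST=0.0.0.0:11434
ollama serve &

# Download a model bigger than what fits locally
# (model name is just an example)
ollama pull llama3.1:70b

# --- On your local machine, in the bolt.diy .env ---
# Point bolt.diy at the pod's public endpoint; replace
# <pod-id> with your actual RunPod proxy hostname
# OLLAMA_API_BASE_URL=https://<pod-id>-11434.proxy.runpod.net
```

Whether the exposed-port/proxy URL works like that, and whether that's the right env variable for bolt.diy, is exactly the kind of detail a proper walkthrough would nail down.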
It's super important to me that my prompts and outputs are 100% private! And I need a bigger LLM than I can run locally with LM Studio.
So: anyone with some experience able to give a write-up/video walkthrough of using a truly private AI (e.g. a RunPod pod)?
I'd be super grateful, and I'm sure others in the community would be too, if one of you who knows the ins and outs could give us a walkthrough for putting a beefier AI model in place.