With the NYT lawsuit against OpenAI and growing concerns about data privacy, I’ve started thinking about moving away from ChatGPT. I’m especially interested in local LLMs that offer more control and privacy. But as a non-technical user, I’m unsure how realistic that is—and I’d really appreciate some advice from people with hands-on experience.
My biggest question is whether a local LLM can come close to the quality and functionality of OpenAI's hosted models, especially o3 or o4-mini. Could a model of comparable quality actually run on a typical home setup, say with 8–16GB of GPU VRAM?
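To make the hardware question concrete, here's my rough back-of-envelope math (the formula and the ~20% overhead figure are my own assumptions from reading around, so please correct me if they're off):

```python
# Rough VRAM estimate for a quantized local model.
# Assumption: memory ~ parameters * bytes-per-weight, plus ~20% overhead
# for KV cache and activations (a crude rule of thumb, not a spec).

def vram_estimate_gb(params_billions: float, bits_per_weight: float,
                     overhead: float = 0.2) -> float:
    """Return an approximate VRAM requirement in gigabytes."""
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return bytes_total * (1 + overhead) / 1e9

# A 7B model at 4-bit quantization:
print(round(vram_estimate_gb(7, 4), 1))   # ~4.2 GB -> fits in 8GB VRAM
# A 70B model at 4-bit:
print(round(vram_estimate_gb(70, 4), 1))  # ~42.0 GB -> far beyond 16GB
```

If that math is roughly right, small (7–14B) quantized models should run on my hardware, but the bigger models people say rival o3 would not. Is that how it works in practice?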
I also rely on features like deep research—web search, document analysis, multi-step reasoning. From what I understand, these often require external search APIs. I’m not sure how much those cost in practice, or whether they raise new privacy concerns. Would using privacy-focused engines like Brave Search or DuckDuckGo help?
Overall, I’m trying to figure out if switching to a local LLM is really possible for someone without a technical background. If you’ve gone down this path, I’d love to hear what worked for you—and what didn’t.
Thanks in advance for any insight!