OpenAI published their latest article on the NYT lawsuit. In it, they suggest introducing client-side encryption. Depending on how they implement it, the resulting privacy and security advantages could give them an edge over competitors like Gemini and Claude.
Our long-term roadmap includes advanced security features designed to keep your data private, including client-side encryption for your messages with ChatGPT. We believe these features will help keep your private conversations private and inaccessible to anyone else, even OpenAI. We will build fully automated systems to detect safety issues in our products. Only serious misuse and critical risks—such as threats to someone’s life, plans to harm others, or cybersecurity threats—may ever be escalated to a small, highly vetted team of human reviewers. These security features are in active development and we will share more details about them, and other short-term mitigations, in the very near future.
Now, do they mean encrypting your message history after your queries have been sent and processed, or do they mean homomorphic encryption, so that they never have your queries in the first place? Both would be good, but the former would, I think, also require additional measures, like Apple’s PCC or Google’s similar recent offering, to protect your messages while they’re being processed.
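For concreteness, the “former” reading would look roughly like this: the conversation gets encrypted on the device with a key only the user holds, and only the ciphertext is synced, so the stored history is opaque to the server. This is just a sketch using the Python `cryptography` package’s Fernet API; it says nothing about how OpenAI would actually build it.

```python
# Illustrative only: client-side at-rest encryption of chat history
# with a user-held key. Not OpenAI's design, just the general idea.
from cryptography.fernet import Fernet

# Key is generated and kept on the user's device (e.g., derived from a
# passphrase or stored in the OS keychain); the server never sees it.
user_key = Fernet.generate_key()
cipher = Fernet(user_key)

def encrypt_for_sync(message: str) -> bytes:
    """Encrypt a message locally before uploading it to the server."""
    return cipher.encrypt(message.encode("utf-8"))

def decrypt_from_sync(blob: bytes) -> str:
    """Decrypt a synced message locally; only the key holder can do this."""
    return cipher.decrypt(blob).decode("utf-8")

ciphertext = encrypt_for_sync("my private conversation")
print(decrypt_from_sync(ciphertext))
```

The catch is that this only protects stored history; the plaintext still has to exist somewhere at inference time, which is exactly where PCC-style confidential computing or homomorphic encryption would come in.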
That’s my interpretation of the sentence, especially since the problem in this case is the Times wanting access to historical data. Something like what Proton Lumo does is all I think they are committing to, not necessarily a PCC approach.
Hm. OpenAI’s careers page (link / mirror) does hint at an Apple PCC-like approach. Even if they don’t roll that out to consumers, some enterprises (finance and medicine, say) may prefer it over on-premises deployments.
Similarly, there are also open job reqs (link / mirror) for “federated learning” (pioneered by Google, initially for Gboard keyboard suggestions) and “differential privacy” (which Apple deployed at scale for telemetry).
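For intuition, here is a toy sketch of those two ideas combined, assuming nothing about OpenAI’s actual systems: each client trains locally and only ships a clipped, noised update, and the server averages the updates (federated averaging with Laplace noise). The model, data, and noise scale are all made up for illustration.

```python
# Toy sketch: federated averaging with differential-privacy noise.
# Purely illustrative; model, data, and noise scale are invented.
import numpy as np

rng = np.random.default_rng(0)

def local_update(weights: np.ndarray, local_data: np.ndarray) -> np.ndarray:
    """One step of on-device training; raw data never leaves the client."""
    gradient = weights - local_data.mean(axis=0)   # stand-in for a real gradient
    return weights - 0.1 * gradient

def privatize(update: np.ndarray, clip: float = 1.0, scale: float = 0.1) -> np.ndarray:
    """Clip the update and add Laplace noise before sending it to the server."""
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip / (norm + 1e-12))
    return clipped + rng.laplace(0.0, scale, size=update.shape)

def federated_round(weights: np.ndarray, clients: list[np.ndarray]) -> np.ndarray:
    """Server averages the noised client updates; it never sees raw data."""
    updates = [privatize(local_update(weights, data)) for data in clients]
    return np.mean(updates, axis=0)

weights = np.zeros(4)
clients = [rng.normal(size=(20, 4)) for _ in range(5)]   # each client's private data
for _ in range(10):
    weights = federated_round(weights, clients)
print(weights)
```

The clipping bounds each client’s influence so the added noise actually buys a differential-privacy guarantee; the scale here is arbitrary.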
Not for practitioners. It is cutting edge, but there are multiple open-source projects (TinFoilSh, for example) that implement “e2ee” for inference.
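Broadly, those projects run the model inside a hardware enclave (TEE) and have the client encrypt the prompt to a key that only the attested enclave holds, so the operator only ever handles ciphertext. Here is a rough sketch of the client side, with a hypothetical, stubbed-out attestation check and the Python `cryptography` library; this is not TinFoilSh’s actual protocol.

```python
# Hypothetical client side of "e2ee" inference against an attested enclave.
# Attestation verification is stubbed; real projects verify a hardware quote.
import os
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric.x25519 import (
    X25519PrivateKey, X25519PublicKey,
)
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def verify_attestation(quote: bytes) -> X25519PublicKey:
    """Placeholder: check the enclave's attestation quote and return the public
    key bound to it. A real implementation verifies the vendor signature and
    the measurement of the code running inside the TEE."""
    raise NotImplementedError("attestation verification is project-specific")

def encrypt_prompt(prompt: str, enclave_pub: X25519PublicKey) -> tuple[bytes, bytes, bytes]:
    """Encrypt a prompt so only the attested enclave can read it."""
    eph = X25519PrivateKey.generate()          # ephemeral client key
    shared = eph.exchange(enclave_pub)         # ECDH shared secret with the enclave
    key = HKDF(algorithm=hashes.SHA256(), length=32,
               salt=None, info=b"prompt-encryption").derive(shared)
    nonce = os.urandom(12)
    ciphertext = AESGCM(key).encrypt(nonce, prompt.encode("utf-8"), None)
    eph_pub = eph.public_key().public_bytes(
        serialization.Encoding.Raw, serialization.PublicFormat.Raw)
    return eph_pub, nonce, ciphertext
```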
I would be very surprised if they were not pursuing this technology at all, but indeed I have no reason to believe it will ever come to a consumer product like ChatGPT.
I could end up being wrong, though. Apple licensing Gemini from Google to run inside PCC could drive competition in this space, which would be good, I suppose.