Did Signal's Founder Create the Most Private AI? | This Week in Privacy #36 (Jan 16, 2026)

Join us at 2026-01-16T22:00:00Z for This Week in Privacy #36 to catch up on the latest Privacy Guides updates and discuss trending news in the privacy space.

During the livestream we’ll answer viewer questions. If you have a question for us, please leave a comment in this forum thread or the YouTube chat.

Members, please leave your questions for us, and we will try to answer them first during the livestream :smiley:

Please be aware that if you post here, your post will be displayed on the YouTube stream.


Streaming Platforms

Beginning this week, we are trialing StreamYard: you can open the link above when the stream starts to watch and chat in the livestream without an account.

We will also be streaming from the following platforms:

If all goes well, chats from all 6 platforms will be combined on our end and displayed on the stream, and we will try to respond to as many questions as we have time for, regardless of where they’re sent.

8 Likes

I still really wish this were at another time; it’s too in-the-middle-of-the-night for me. I hope PG experiments with other times, or at least asks the audience what time they prefer, to see if there is a consensus PG can deliver on.

Was hoping to participate actively during the stream.

Where do the Trusted Execution Environments run? I’d trust them more if I knew where they run and how they’re isolated from the surrounding environment. For now I can’t seem to find much info on it. It’s Marlinspike, so I feel pretty confident I can play with this for fun, but I’m not about to have a therapy session with it any time soon. Though that should never be the use case for AI in an ideal world anyway.

1 Like

Do you think Confer’s approach to AI is something other AI products will follow? How difficult would it be for existing AI products to migrate to it?

Other products already do this, such as Maple AI, and Apple has been doing a more advanced version of it for years with Private Cloud Compute. So yes, I think more AI companies can and should do the same thing. Google recently announced something similar with Private AI Compute, and OpenAI has expressed interest in implementing some form of privacy protections for ChatGPT.

1 Like

This is exactly it - any LLM you don’t run yourself relies on trust. There’s no way around that. Even if 32 audit firms come and thoroughly inspect this - how much do you trust those 32 audits and auditors?
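
To make that trust question concrete, here is a minimal sketch (in Python, with entirely hypothetical names and values, since Confer hasn’t published a client API) of the remote-attestation pattern that TEE-based services generally rely on: before sending any data, the client checks that the enclave attests to running exactly the code the auditors reviewed.

```python
# Minimal sketch of the remote-attestation check a TEE client performs.
# All names and values are hypothetical; this illustrates the general trust
# model (SGX/SEV-SNP/Nitro-style attestation), not Confer's actual protocol.
import hmac
import json

# A code "measurement" published alongside a reproducible build and reviewed
# by auditors. In real systems it arrives inside a document signed by the
# hardware vendor's key chain rather than being hard-coded like this.
EXPECTED_MEASUREMENT = "hypothetical-digest-of-audited-enclave-image"


def verify_attestation(attestation_doc: bytes) -> bool:
    """Accept the server only if it attests to running the audited code."""
    doc = json.loads(attestation_doc)
    # 1. Compare the enclave's reported measurement to the published one.
    measurement_ok = hmac.compare_digest(
        doc.get("measurement", ""), EXPECTED_MEASUREMENT
    )
    # 2. A real client would also verify the vendor's signature chain over the
    #    document and check a freshness nonce; both are omitted in this sketch.
    return measurement_ok


def send_prompt(prompt: str, attestation_doc: bytes) -> None:
    if not verify_attestation(attestation_doc):
        raise RuntimeError("Enclave does not match the audited build; not sending data.")
    # Only now would the client open an encrypted channel that terminates
    # inside the enclave and transmit the prompt.
```

The sketch is only the general pattern, but it shows where the trust actually lands: everything hinges on the published measurement, the reproducible build behind it, and the hardware vendor’s signing keys, which is why the quality of those audits matters so much.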

I see. I just assumed it was new with the whole “Did Signal’s Founder Create the Most Private AI?” title. I guess what makes this different is the company behind the AI? Or is Confer adding something new?

I think for me the hope is that Moxie can get a model backed by a nonprofit, like Signal, once this thing gets legs. I’m not interested in paying to run in some unknown “Trusted” execution environment until I feel like the incentive structure behind whoever pays for that environment is geared towards a public service, and until that environment is transparent enough for scrutiny from privacy labs.

My hope is that, now that the groundwork has been laid for making private messaging a public service, public-good investors will be more willing to build another foundation with Moxie and his network. Ideally they will show the world a better experience with open small-to-medium language models that don’t require the resources large LLMs do, so we don’t only feel safe using Confer, we can also feel like we’re making an ethical choice. I’m not sure if this is the goal of this tech, but if you’re trying to implement AI on the lowest dollar, it’s what makes practical sense. Since around GPT-4, research has repeatedly shown smaller fine-tuned models outperforming larger general-purpose LLMs. Showcasing that trend in a real-world product seems like a natural design choice.

Anyway, that’s a lot of speculation, but given what Moxie and some public-good investors pulled off with Signal, and the vital service it provides today, I’m hopeful that some of this comes to fruition with Confer.

1 Like

Yeah, I mean, at some level you will need to trust someone if you don’t run it yourself. I’m working on a project that can hopefully be the endgame, so we don’t need projects like Signal and Confer. Quoting a post Moxie wrote on his blog about web3:

People don’t want to run their own servers, and never will.

The premise for web1 was that everyone on the internet would be both a publisher and consumer of content as well as a publisher and consumer of infrastructure.

We’d all have our own web server with our own web site, our own mail server for our own email, our own finger server for our own status messages, our own chargen server for our own character generation. However – and I don’t think this can be emphasized enough – that is not what people want. People do not want to run their own servers.

I think this notion motivates Moxie to seek out the transparent “web2.5” option: centralized servers run by nonprofits as a public good. But as a mutualist who has seen plenty of centralized, hierarchical human organizations start with all the right intentions (e.g. unions) only to get the wrong people in charge or be crowded out by newer, faster competing orgs (e.g. monopolies), it’s not a permanent solution to me either.

I actually trust the 32 auditors in your scenario, especially if they publish their findings well. What I don’t trust is the more general pattern Cory Doctorow calls enshittification, applied to any benevolent organization: as it gains power and/or complacency, its incentives drift away from serving the people it once served.

I don’t disagree entirely with Moxie’s premise. I would rephrase it as: people want the servers to run at their house (ownership and agency over their data), but they don’t want to take on the load of maintaining them. This is where I could imagine standardizing the way we deploy home data centers. If we follow shared playbooks for self-hosting that both shrink the decision space and commoditize the maintenance of those systems, we can build an ecosystem where people can run servers without maintaining them. That would offer new business opportunities to the kind of local computer repair and maintenance shops we had in the ’90s. In the same way you can own a home without being a carpenter, electrician, plumber, or HVAC technician, you wouldn’t need to cosplay as a sysadmin to own a server.

This is something I’ve been hinting at a bunch here and hope to share more soon once I run it by the mods.

Anyway, my point is that you’re right, but so is Moxie. We need web2.5 today, and it will be a stepping stone to an idealized web3. To be clear, web3 for me means decentralized ownership of services and data, on either your own server running at your house or a local server farm within about 50 miles of your home. It removes the hyper-centralization of web2 and builds ecosystems on open-source projects for innovation and economic creation. It is not crypto scams, NFTs, and other BS. I think crypto can play into those economies, but the transactions must be private, like Monero’s.

Highlighting an age verification bill making its way through the Florida Legislature.

SB 482 would require you to verify your identity to use an AI chatbot. The privacy concern is obvious for those who use dedicated LLMs, but based on my reading of the bill it could expand to include big tech ecosystems. If Gemini or Meta AI are too tightly integrated into other products, will you have to verify your identity to use anything in the ecosystem?

Below is our blog post with more info.

The Florida legislative session started on January 13 and only runs until March 13, so we have limited time to amend this bill to remove the requirement for age verification. If you live in Florida, please call your state lawmakers and tell them to oppose this bill until the requirement is removed.

1 Like

How does this compare to Proton’s Lumo?

It’s too early to say. As far as I can tell, not enough is known yet to form any conclusive opinion.

Loving the multiple platforms that PG is going to be on.

Darn my frikin time zone though.

This time is actually going to make you tough to reach for approximately 1.7 billion people, at least. That’s a minority of the world, but these time zones are exactly where privacy knowledge is going to be most effective for people learning about digital freedoms. Look at which parts of the world are in the middle of the night at this hour.

Edit: I just realized this stream is already over. I’ve been traveling and have yet to catch up.