Which do you think is more secure?

I have never really looked into these distributed things. They just seem too good to be true though. Have any of them been looked into, privacy-wise?

Security and privacy should be as good as they can be by design, mainly because it’s decentralized, unlike centralized services that build on reputation. Since anyone can run these nodes, there’s no trust given and no reputation; transparency through code that everyone can audit 24/7 is the only truth here.

This is getting better every day. They’re currently busy implementing native file versioning, see here.

I agree with @Reset0609

Further, records received from Google indicate that, in the months preceding pompompurin’s correspondence with omnipotent, FITZPATRICK appears to have registered a Google account with the email address conorfitzpatrick2002@gmail.com to replace the older email address (conorfitzpatrick02@gmail.com) that pompompurin had identified.

Google is indeed always keen to comply with government data requests.

It’s also worth noting this particular individual was involved with:

unauthorized purchasing and selling of stolen identification documents, unauthorized access devices, unauthorized access to victim computer systems, and login credentials through his operation of a data breach website named “BreachForums.”

So you can expect anyone who holds data on you to hand it over if you’re doing that.

Government invaded his privacy because he was selling other people’s privacy :smiley:

This guy was hilariously bad at opsec. If you read further down, “he used a VPN”, but he also accessed the same accounts directly without a VPN, and those Google accounts were associated with Google Pay and with Verizon phone numbers tied to him.

No, I expect them to do the bare minimum to comply with the law. It’s up to the courts to judge whether someone is guilty or not, not for private entities to do so when deciding whether to hand over data. It should be an administrative decision, not a moral one. In fact, if the government already had enough data for a conviction, then they wouldn’t need to ask for more.

The agent would have done exactly that, got a warrant and asked for the data. Google simply complied as any provider would. They’re pretty clear about that.

How does Google handle government requests for user information?

A variety of laws allow government agencies around the world to request the disclosure of user information for civil, administrative, criminal, and national security purposes. Google carefully reviews each request to make sure it satisfies applicable laws. If a request asks for too much information, we try to narrow it, and in some cases we object to producing any information at all. For more information, see our policies for how Google handles government requests for user information.

Those requests go through LERS which allows law enforcement to submit subpoenas and various court orders.

My comment was about the idea that an entity would comply with data requests based on moral judgement, which you had implied is to be expected, not about Google’s actions in this case or their official policy on the matter.

Yes, I believe they followed their stated policy, because they concluded that the request satisfied all applicable legal conditions and not because they morally supported legal action against that individual.

You know that this doesn’t apply to intelligence agencies, right?

Reading through this thread, it seems like everyone is disagreeing only because everyone is making blanket statements without defining their threat models and assumptions. I think 90% of the disagreement would be gone if everyone just prefaced their statements with “in the context of ____” or “if protecting your data from ____ is part of your threat model, then ____.”

IF protecting your photos from Google itself (including the company, its employees, and their AI tools), from the intel agencies they most likely continue to cooperate with, or from law enforcement with legal court orders is part of your threat model, then Google Photos isn’t a good choice and isn’t secure with respect to those threat actors. But it is still quite secure against other common threats.

IF those actors are out of scope for your threat model, then Google Photos is arguably pretty darn secure, though by nature of not being client-side encrypted it is still likely vulnerable to hacking or phishing/social engineering.

So in my view, you are pretty much all mostly correct, you are just making different assumptions about threat model without actually stating those assumptions explicitly.

Given that Google scans your photos and false positives are known to exist, I don’t see how that can be an “if”. Google themselves can report you to the authorities for CSAM or something else. I know someone whose videos on Google Drive were deleted for supposed copyright infringement, and the person was reported by Google to the authors’ society. The person in question had been involved in the production of the content and, as part of that, had been allowed to keep a copy.

With what they had on him, that’s very likely the case, because it is the path of least resistance: a US citizen in the US who apparently did something.

This guy wasn’t some international terrorist/spy; he was a US citizen and a criminal, so that hardly matters. If you were a foreign spy using Google to conduct your trade, then that’s still a you problem :wink: Not everyone has that kind of threat model.

You know how people are though: any chance to hate on “big tech”, while pretending everyone needs the same kind of opsec/consideration as America’s most wanted spy. Over the years I’ve seen this rhetoric in privacy communities, and it largely stems from ancient Snowden-era prejudices, particularly against US companies.

At the end of the day if you’re putting something on someone else’s computer and it’s not encrypted (aka the cloud) then it isn’t safe from legal subpoena anywhere, and it really has nothing to do with “the company” or where they are located, or their size.

A lot of companies which store your photos unencrypted potentially scan against something like PhotoDNA, and this is for their own liability protection.

There are two solutions:

  • Use E2EE first
  • Don’t store CP in the cloud (or have it in the first place because :face_vomiting: )
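The first option above can be sketched in a few lines. This is a toy illustration only: it XORs the file with a random one-time key via Python’s `secrets` module, whereas a real client would use an authenticated cipher such as AES-GCM from a vetted library. The function names are hypothetical; the point is simply that only ciphertext ever leaves the device.

```python
import secrets

def encrypt_before_upload(plaintext: bytes):
    """Toy client-side encryption: XOR with a one-time random key.

    Illustrative only -- real E2EE clients use an authenticated
    cipher (e.g. AES-GCM) from a vetted library. The key is
    generated locally and never leaves the device.
    """
    key = secrets.token_bytes(len(plaintext))            # stays on the device
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def decrypt_after_download(ciphertext: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so applying the key again recovers the file.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

photo = b"holiday.jpg contents"
blob, key = encrypt_before_upload(photo)   # upload `blob`; keep `key` local
assert decrypt_after_download(blob, key) == photo
```

Whatever the provider scans server-side, all it ever sees in this flow is `blob`, which is indistinguishable from random bytes without the key.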

The fact of the matter is, technology like PhotoDNA has extremely low false positives, expected to be less than 1 in 1 trillion. The only documented case I can think of is the one with the two dads, who took a picture of their son’s genitals for medical purposes. In that particular case they really should have thought about not transmitting something so sensitive over the internet. I certainly wouldn’t be uploading that to some “medical provider’s portal”, as that’s likely not E2EE either. I would also want to ensure that data was deleted once the problem was solved.

That uses a different system; unfortunately, DMCA is the issue there. Copyright trolls do, from time to time, go around reporting things which they don’t have proper ownership of.

Not sure how copyright trolls would have access to someone’s personal Google Drive files (if that’s what you’re suggesting), and the DMCA doesn’t apply either here in Portugal or in Ireland.


Living in a country that was a dictatorship just 50 years ago, you can’t imagine how that kind of reasoning triggers alarm bells. I’ll just say that it’s for good reason that such scanning is illegal under Portuguese law. Not that Google cares about that.

Is that when matching against known material, or also when flagging never-before-seen material? According to Google:

“For many years, Google has been working on machine learning classifiers to allow us to proactively identify never-before-seen CSAM imagery so it can be reviewed and, if confirmed as CSAM, removed and reported as quickly as possible.” - Fighting child sexual abuse online

A bit creepy to have employees of some random company (pretty sure they’ll outsource that kind of work) looking at people’s photos for possible matches: people who can live in different parts of the world and have widely different cultural backgrounds, and hence different notions of what is acceptable or not.

They don’t, but they submit a DMCA request to Google, and Google does matching based on the content ID. Google obviously doesn’t want Google Drive to be used for distributing copyrighted content.

They’re a US company, so they have to abide by US law. If it were illegal under Portuguese law, that would likely mean the product would not be available in Portugal (and to users with a Portuguese IP address), but seeing as it is, I doubt this is the case.

I was talking about PhotoDNA. In any case that is not the same system used for copyright infringing content. PhotoDNA works with known content and fingerprints (at least it did originally).

It’s not; it’s largely automated. The actual “real” CSAM is never handled by Google; that’s mostly Interpol/NCMEC, who provide the hashes to PhotoDNA.

It’s not just Google that uses PhotoDNA; Microsoft, Adobe, Reddit, and Discord do as well. Any company of that size generally needs an automated approach to dealing with data: when you’ve got millions of users, there will be a portion who are doing the wrong thing.

This blog article, PhotoDNA and Limitations - The Hacker Factor Blog, provides a good description of how it works.
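The matching flow described above can be sketched roughly as follows. Note that PhotoDNA itself is a proprietary perceptual hash designed to survive resizing and re-encoding; the SHA-256 used here is only a stand-in to show the shape of the pipeline, and the fingerprint set and function names are hypothetical.

```python
import hashlib

# Hypothetical fingerprint set, standing in for the hash lists that
# organizations like NCMEC/Interpol distribute to providers. The
# provider only ever holds hashes, never the material itself.
KNOWN_FINGERPRINTS = {
    hashlib.sha256(b"known-bad-image-bytes").hexdigest(),
}

def scan_upload(image_bytes: bytes) -> bool:
    """Return True if the upload matches a known fingerprint.

    Simplified sketch: real systems use a perceptual hash (robust to
    re-encoding) rather than an exact cryptographic hash, and route
    matches to human review/reporting rather than acting automatically.
    """
    fingerprint = hashlib.sha256(image_bytes).hexdigest()
    return fingerprint in KNOWN_FINGERPRINTS

assert scan_upload(b"known-bad-image-bytes") is True
assert scan_upload(b"an ordinary holiday photo") is False
```

The key design point is that matching happens hash-against-hash, which is why exact-match systems like this only catch *known* content; classifying never-before-seen material, as in the Google quote above, requires a separate ML classifier.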

Sure, makes sense.

The content in question wasn’t being distributed or shared with anyone. Also, by definition, any photo anyone takes is copyrighted to the person who took it, any text document to the person/company who produced it, and so on. The issue here is content being shared without adequate permission from the copyright holder; it certainly is not the mere possession or sharing of copyrighted content.

No, they’re an Irish company, subject to the laws of Ireland and of the country of the customer they’re serving through that subsidiary, hence my mention of Ireland.

That’s an incredibly naive view that I was not expecting. There’s no lack of stuff that is illegal under Portuguese law and is nonetheless allowed to go on for years or decades, and I’m pretty sure that doesn’t happen only in Portugal. For instance, Tesla’s Sentry Mode is illegal, was declared illegal in 2019 by the CNPD, a government agency, and Teslas still come with it to this day. I could give you dozens if not hundreds of other examples. Thing is, you need someone who feels aggrieved by it to go to court for a court to confirm that it is illegal. That can take a long, long, long time to happen. The CNPD could take it to court, and they should be the ones doing so, but they probably don’t have the money or prefer to spend it elsewhere.

Sure, but that doesn’t necessarily justify the means.

Doesn’t need to be. If it was something that matched a content id, they don’t want that on their servers.

As far as I can tell, using PhotoDNA in Ireland is perfectly legal.

I couldn’t find anything about automated scanning like that being illegal server-side when the data is not encrypted. Part of the reason is that, by providing the data to «company», you’re agreeing to their terms and conditions, and none of them keep it a secret that they use that kind of technology.

That article you linked doesn’t support your argument; it’s about photography of roads and the retention of sensitive data like number plates and pictures of people. Obviously law is complex, and sometimes things go unnoticed for a while, but I doubt the use of PhotoDNA on Google’s servers is one of them.

Well that’s less of a legal question and more of an ethical one. These companies will usually do anything to reduce their own liability, when someone does the wrong thing. They will usually seek a solution that is both scalable and brings the least cost to the company.

Thing is, I am not. EULAs, terms, whatever you want to call them, generally have no legal validity here in Portugal, not even because of anything in them, but because only a proper ID process would make them valid. When you click an “I agree” button, or sometimes not even that, there’s no record to conclusively prove who accepted the terms and on what date. That’s required, and it’s why, for anything more serious, one is forced through the cumbersome process of sending out ID copies and signing by hand every page of the terms being agreed to. Thankfully the id.gov.pt system is simplifying that process.

I presented the article to support the fact that big companies violate the law all the time and don’t necessarily care about complying even after being made aware of their non-compliance. Anything else about the case is irrelevant to that point.

Why would you doubt that? Article 26 of the Portuguese Constitution states that everyone has a right to their word, good name and reputation.

It’s as a result of this that, for example, a credit scoring system as it exists in the USA and other places is not allowed. Such a system places on the individual the burden of proving he is able to repay loans, do so on time, and so on. In other words, because there are people who default on their loans, each individual has to demonstrate their ability to fulfill their contractual obligations by first taking small loans, maybe having to put down a deposit to get a credit card (something also illegal here due to the same principle), etc.

It’s exactly the same logic when you mention content filtering by default because some people do bad stuff: each individual is being treated as if they were disreputable.

I’d argue it’s both, but as per the above my focus here is on the legal side.

Indeed; my point is that they might find it cheaper to violate certain laws rather than modify the product for certain markets. In other words, they might actually deem it worthwhile to increase their liability under some circumstances.

That’s extremely misguided. It’s well documented that they use illegal data from mass surveillance to gather information on regular domestic crimes, then use it to go after legal evidence that can be presented in court. Also, if you read Newsweek, you should know that they treat Trump supporters as domestic terrorists. So, yeah, grandma needs to have the same threat modelling as Al Qaeda.

Maybe so, but you’re not a lawyer either and trying to stretch interpretation of a single part of the constitution to apply to whatever it is you want elsewhere is unlikely to ever work.

The servers Google uses likely aren’t located within Portugal, and if someone ever does try it in court (assuming Google lost, including appeals), they would just discontinue that feature there.

I wouldn’t look for legal opinions in a tabloid with its own agenda. Anyway if you’re done having serious discussion, then I won’t bother wasting my time here.

The TLDR of it is, if you don’t want anyone scanning your data when you upload it, use E2EE. There really is no substitute and all the mental gymnastics isn’t going to change that.

I’m pretty sure that the CNPD used a lot of lawyers in the Tesla case, and they cited that same part of the constitution, dealing with personal rights, specifically as they pertain to the use of technology. If anything, I’d argue it’s far-fetched to say Tesla cannot record public space on privacy grounds but, under the same law, Google can scan private data. Perhaps worth mentioning that corporations have no constitutional rights, individuals do, so there wouldn’t be any right elsewhere in the constitution that Google could claim was being protected by said scanning.

They’re not in any way the same thing. You’re giving them particular data to safekeep, and they want to check that they are even allowed to be in possession of that data. If that is a term of service, I doubt there’s anything in Portuguese law that forbids it. You either use the service and agree to the term, or Google decides not to let you use the service.

I very much doubt there is a law in Portugal that states a company must do business with you no matter what.

The matter you referenced was about recording people’s faces and number plates in public places/roads and the retention of that data; that doesn’t sound very similar to me. The end of that article states (translated):

These two concepts must be applied taking into account that the purposes in question “must be legitimate”. «Data processing must always respect the principles of adequacy, necessity and proportionality», concludes the CNPD.

Checking images against known fingerprints of CSAM is likely to be considered “legitimate”, because Google states they don’t want to be holding that kind of content. I also seriously doubt you’d find a judge who would argue otherwise.
