If you’re a Google Gemini user, it’s time to review the privacy settings for Google’s AI chatbot. It looks like Gemini might have access to documents in Google Drive that it shouldn’t be able to read. That’s what happened with one Gemini user who was stunned to discover that Gemini could summarize tax returns in a PDF file hosted in Google Drive.
This was a document the user never gave Gemini access to, prompting an investigation into Google Gemini’s behavior with Google Drive. The conclusions aren’t great, as it looks like Gemini might be ingesting data from Drive without the user’s permission or knowledge. Even if that data isn’t used to train the AI specifically, it’s still a potential violation of privacy.
This incident further proves that any AI product needs strong privacy features, especially if it’s made by Google.
A few days ago, while discussing Proton Docs, I had this to say about the advantages of a cloud-based Docs product that offers end-to-end encryption:
End-to-end encryption protects against data breaches and mass collection, and it ensures your data will not be used to train AI. These are all things to keep in mind when trusting any third-party cloud service provider with your data, especially documents you’d like to keep private.
It’s not that Google Docs and Microsoft 365 won’t secure your docs, because they do. Those companies also won’t train AI with your documents. But Proton’s extra end-to-end encryption protection is much better. And Google and Microsoft can’t match it yet.
Given what Gemini user Kevin Bankston went through, we have reason to worry about how Google handles Drive documents. Bankston accidentally discovered that Gemini had scanned the tax returns stored in his Google Drive without his permission. He then tried to find out what had happened and how to change his privacy settings to prevent Gemini from ingesting data it shouldn't.
The task is rather difficult, as even Gemini can’t help point users in the right direction for managing their privacy settings. The chatbot hallucinates information about its own privacy settings.
Bankston was able to confirm that Gemini doesn't train on data in Google Workspace, but that wasn't good enough to explain why the chatbot accessed the PDF file in the first place. The privacy blunder is all the more shocking because Bankston is a paying Google One subscriber, which means he's getting the premium version of Gemini.
Bankston further discovered that once he asked Gemini for help with one of his PDF files, the chatbot had gone on to read all of the PDF files hosted in his Drive.
The startling discoveries continued after someone pointed Bankston to the correct privacy settings for Gemini AI. The setting that's supposed to keep Gemini out of Google Drive documents was already turned on, yet Gemini accessed those documents nonetheless.
A little more digging may have helped Bankston identify the real issue. He signed up for Workspace Labs last year to test Bard, before he became a Google One Gemini user. This might be the reason why Google Gemini had access to Google Drive documents it shouldn't have been able to read.
Unfortunately, that’s just one theory for the moment. It’s up to Google to explain and clarify this glaring privacy issue. Even if only one user is affected, which I doubt is the case, it’s still a massive issue. AI like Gemini should never access any user data without explicit permission.
I’d rather have to deal with an avalanche of prompts to agree to Gemini getting data — which is what happens with ChatGPT access in iOS 18 — than somehow give the AI the power to access my data in bulk.
If you suspect Gemini has accessed any of your private files in Google Drive, ask the chatbot to summarize one of them and see whether it can.