Google's Gemini Deep Research tool can now reach deep into Gmail, Drive, and Chat to obtain data that might be useful for answering research questions.
Gemini Deep Research is Gemini 2.5 Pro (at present) deputized as an agent, meaning it embarks on a multistep process to respond to a directive rather than spitting out an immediate response. Deep research systems of this kind combine knowledge discovery, workflow automation, and research orchestration.
Google is not the only provider of such systems. OpenAI and Perplexity also offer deep research tools, and various open source implementations exist as well.
"After you enter your question, it creates a multi-step research plan for you to either revise or approve," explained Dave Citron, senior director of product management for Google's Gemini service, in a blog post last year. "Once you approve, it begins deeply analyzing relevant information from across the web on your behalf."
Now Gemini Deep Research can, if allowed, access data in your Gmail, Drive (e.g. Docs, Slides, Sheets, and PDFs), and Google Chat for added context. If the data you have stored in Google Workspace might be relevant to your research question, granting Gemini access to that data may lead to better results.
There is precedent for this sort of data access among other AI vendors, since providing AI models with access to personal files and data tends to make them more useful, at the expense of privacy and security. Anthropic's Claude, for example, has web-based connectors for accessing Google Drive and Slack. Its iOS incarnation can access certain apps like Maps and iMessage. And Claude Desktop supports desktop extensions for access to the local file system.
Nonetheless, it's worth considering Google's expansive privacy notice for Gemini Apps. On the linked Google Privacy & Terms page, the company says it uses "publicly available information to help train Google's AI models and build products and features like Google Translate, Gemini Apps, and Cloud AI capabilities."
As the wording of that passage says nothing about private data, The Register asked Google to clarify. A company spokesperson confirmed that information available to Gemini via connected apps such as Gmail and Drive is not used to improve the company's generative AI.
However, the Gemini Deep Research privacy notice does include this noteworthy passage: "Human reviewers (including trained reviewers from our service providers) review some of the data we collect for these purposes. Please don't enter confidential information that you wouldn't want a reviewer to see or Google to use to improve our services, including machine-learning technologies."
And it comes with a caution not to use Deep Research for matters of consequence: "Don't rely on responses from Gemini Apps as medical, legal, financial, or other professional advice."
So far, reviews of Gemini Deep Research run the gamut, from glowing to cautious approval, meh, mixed, and skeptical, with caveats about source labeling accuracy and lack of access to paywalled research, among other things.
While the quality of the initial prompt has bearing on the end result, this isn't just a case of "you're holding it wrong."
Earlier this year, education consultant and PhD candidate Leon Furze summarized the utility of deep research models as follows:
"The only conclusion I could arrive at is that it is an application for businesses and individuals whose job it is to produce lengthy, seemingly accurate reports that no one will actually read," he wrote in February. "Anyone whose role includes the kind of research destined to end up in a PowerPoint. It is designed to produce the appearance of research, without any actual research happening along the way." ®