Google quietly expands Gemini data use for AI training—here’s how to opt out

Google is quietly expanding how it uses customer data to train its artificial intelligence models, and users who don’t pay attention to their privacy settings might inadvertently become part of the training process.

Starting September 2, files, photos, videos, and screen captures that users share with Gemini, Google’s flagship AI assistant, could be sampled and used to improve the company’s AI services. This represents a significant shift in how Google handles user-generated content within its AI ecosystem, bringing the search giant’s data practices more in line with competitors like OpenAI.

The change arrives as Google races to keep pace with OpenAI’s rapid AI developments, including the anticipated release of GPT-5. This week, Google rolled out several Gemini updates alongside this new data usage policy, signaling an aggressive push to enhance its AI capabilities through expanded access to real-world user interactions.

What’s changing with your Gemini data

Google is rebranding its “Gemini Apps Activity” setting as “Keep Activity” in the coming weeks, but the name change comes with a substantial policy shift. When this setting remains enabled, Google will sample future uploads to help improve its services for all users.

The policy applies to what Google describes as “a subset of uploads,” including files, videos, screen captures, and photos shared with Gemini. While Google emphasizes that only a sample of content will be used, the company hasn’t specified exactly what percentage of uploads might be selected or the criteria used for sampling.

Importantly, Google states that it disconnects chat conversations from user accounts before sending them to service providers for AI training purposes. This means that while your content might be used to improve AI models, it theoretically won’t be directly tied to your personal Google account during the training process.
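
To make the idea of "disconnecting" concrete, here is a minimal Python sketch of what de-identifying a chat record before training could look like: account-linked fields are dropped and replaced with a random sample ID that cannot be traced back to a user. The field names and function here are hypothetical illustrations, not Google's actual pipeline.

```python
import secrets

# Hypothetical field names; Google has not published its real schema.
ACCOUNT_LINKED_FIELDS = {"account_id", "email", "display_name", "device_id"}

def deidentify_for_training(record: dict) -> dict:
    """Drop account-linked fields and attach a random, untraceable sample ID."""
    sanitized = {k: v for k, v in record.items() if k not in ACCOUNT_LINKED_FIELDS}
    # A fresh random token lets trainers keep a conversation's turns together
    # without any way to map the record back to the original user.
    sanitized["sample_id"] = secrets.token_hex(8)
    return sanitized

# Example: only the conversation content survives de-identification.
chat = {
    "account_id": "1234567890",
    "email": "user@example.com",
    "prompt": "Summarize this screenshot",
    "attachment_type": "screen_capture",
}
print(deidentify_for_training(chat))
```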

Why Google needs your data

Training sophisticated AI models requires massive amounts of diverse, real-world data. When users interact with AI assistants by uploading documents, asking questions about images, or sharing screen captures, they’re providing exactly the kind of contextual, practical examples that help AI systems learn to handle similar situations more effectively.

This approach reflects a broader industry trend where AI companies leverage user interactions to create more capable systems. The practice allows companies to identify common use cases, understand how people naturally communicate with AI, and spot areas where their models need improvement.

For Google, this data becomes particularly valuable as it competes with OpenAI’s ChatGPT and Microsoft’s Copilot services. Real user interactions provide insights that synthetic training data simply cannot match, potentially giving Google’s models a competitive edge in understanding practical, everyday AI applications.

How to opt out of AI training

Users who prefer to keep their Gemini interactions out of AI training have several straightforward options:

Disable activity tracking: Navigate to Gemini’s “Settings & help” section, then select “Activity.” From the dropdown menu at the top, choose either “Turn off” or “Turn off and delete activity” to prevent future data sampling.

Use temporary chats: Google’s Temporary Chat feature ensures conversations won’t appear in your chat history, won’t be used for personalization, and won’t contribute to AI model training, giving you a way to use Gemini without adding to Google’s stored activity.

Review audio settings: Scroll down to find “Improve Google services with your audio and Gemini Live recordings.” This setting allows Google to use audio, video, and screen-sharing recordings for training, with some clips reviewed by human evaluators. Turn it off to keep those recordings out of Google’s training data.

Monitor your data: Users can track what information Google has collected through myactivity.google.com/product/gemini, which functions as a comprehensive history of Gemini interactions. The platform allows users to delete conversations using time filters, ranging from hours to custom date ranges.

Understanding the fine print

Even users who disable activity tracking won’t achieve complete data isolation. Google retains user data for up to 72 hours regardless of privacy settings, citing the need to “provide the service, maintain its safety and security, and process any feedback you choose to provide.”

This temporary storage window reflects the technical realities of running cloud-based AI services, where some data retention is necessary for basic functionality, security monitoring, and abuse prevention. However, it also means that truly private interactions with Gemini require users to trust Google’s data handling practices during this retention period.

When Google implements the “Keep Activity” rebrand, all existing preferences from the current “Gemini Apps Activity” setting will automatically transfer to the new system, so users won’t need to reconfigure their privacy choices.

Industry implications

Google’s move reflects the increasingly competitive nature of the AI market, where access to high-quality training data often determines which companies can build the most capable systems. As AI models become more sophisticated, the marginal value of additional training data remains significant, particularly for understanding nuanced human interactions and specialized use cases.

This development also highlights the evolving relationship between AI companies and their users, where free or low-cost AI services often come with the implicit understanding that user interactions contribute to model improvement. Users who value privacy may need to weigh these trade-offs more carefully as AI services become more integrated into daily workflows.

For business users particularly concerned about data privacy, this change underscores the importance of regularly reviewing AI service terms and privacy settings, especially when handling sensitive information through AI assistants.
