Generative AI and Privacy policy recommendations
Protecting user privacy
We support privacy regulation and policy that fosters trust and are committed to providing products and services that are private by design, secure by default, and that put users in control of their information.
Explore our transparency report
Our policy priorities for user privacy
- Privacy in the age of Generative AI
- Personalization and user controls
- Building privacy and safety into agentic AI
- Advancing Privacy-Enhancing Technologies (PETs)
Privacy in the age of Generative AI
At Google, we have over 25 years of experience protecting data from inappropriate access and unauthorized use. In this era of AI, we’re extending our best practices in data protection to ensure that the right data is used the right way to train models. We rely on publicly available data and we advocate for a risk-based approach to AI policy that protects personal information while allowing for scientific and social breakthroughs.
Personalization and user controls
Developing responsible AI means incorporating private-by-design principles in our products and platforms. We empower users to make informed decisions on what data they share to personalize their experience and enable strong privacy protection through transparency, robust controls, and clear reporting channels.
Building privacy and safety into agentic AI
Agents require access to data to stay relevant to your goals and to complete tasks under your supervision. To ensure agents act only within authorized bounds, we take a privacy-by-design approach and encourage policy that scales privacy protections through transparent frameworks and granular user controls.
Advancing Privacy-Enhancing Technologies (PETs)
We’re investing in the next generation of PETs, such as differential privacy and federated learning, which help us train AI models while protecting individuals’ data. However, adoption of these technologies remains low outside the tech sector. Governments can accelerate PET adoption by integrating them into public-service procurement and delivery, helping further secure citizen data.
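Differential privacy, one of the PETs named above, can be illustrated with a minimal sketch: a counting query is released with Laplace noise calibrated to the query's sensitivity and a privacy parameter epsilon. The function names and the example query here are illustrative assumptions, not Google's implementation.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Sample Laplace(0, scale) noise via the inverse-CDF method.
    u = random.random() - 0.5
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))

def dp_count(values, predicate, epsilon: float) -> float:
    # A counting query has sensitivity 1: adding or removing one
    # person's record changes the count by at most 1, so Laplace
    # noise with scale 1/epsilon yields epsilon-differential privacy.
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(1.0 / epsilon)

# Smaller epsilon means more noise: stronger privacy, less accuracy.
noisy = dp_count(range(100), lambda v: v >= 50, epsilon=0.5)
```

The key design point is that the noise depends only on the query's sensitivity and epsilon, never on the data itself, so the privacy guarantee holds regardless of what the dataset contains.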
Recent news
- APR 7, 2026: Here’s how we built Gmail to keep your data secure and private in the Gemini era
- MAR 30, 2026: Evolving expectations of what’s possible
- FEB 9, 2026: Stay in control of your personal information online
- NOV 11, 2025: Private AI Compute: our next step in building private and helpful AI
Looking for something else?
FAQs
Prioritizing user agency and data transparency
Keeping users in control of their data is a core priority of our work. We provide users with robust controls to access, review, export, and delete their personal data through their account settings across our products and platforms. Generative AI is no different, requiring a signed-in experience and giving users the ability to turn off their Gemini Apps activity and conversations. And for emerging AI agents, we are building permission controls that allow users to specify exactly the data an agent can access.
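Permission controls of the kind described above can be sketched as a default-deny scope check: an agent may touch only the data scopes a user has explicitly granted, and any grant can be withdrawn. The class and scope names below are hypothetical, not an actual Google API.

```python
from dataclasses import dataclass, field

@dataclass
class AgentPermissions:
    # Explicit allow-list of data scopes the user has granted.
    allowed_scopes: set = field(default_factory=set)

    def grant(self, scope: str) -> None:
        self.allowed_scopes.add(scope)

    def revoke(self, scope: str) -> None:
        self.allowed_scopes.discard(scope)

    def can_access(self, scope: str) -> bool:
        # Default-deny: anything not explicitly granted is blocked.
        return scope in self.allowed_scopes

perms = AgentPermissions()
perms.grant("calendar.read")       # user allows calendar reads
assert perms.can_access("calendar.read")
assert not perms.can_access("email.read")  # never granted
perms.revoke("calendar.read")      # user withdraws consent
```

The default-deny posture is the design choice that matters: the agent starts with no access, and every scope it gains is traceable to an explicit user decision that remains revocable.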
Our privacy policies
Our foundational commitment to data transparency, outlining how user information is collected, used, and protected.
Privacy check-up
Our comprehensive tool that allows individuals to review and customize their privacy settings across our products and services.
Evaluating requests through legal review
Government agencies from around the world ask Google to disclose user information. We carefully review each request to make sure it satisfies applicable laws. If a request asks for too much information, we try to narrow it, and in some cases we object to producing any information at all. We share the number and types of requests we receive in our Transparency Report.
Transparency report
Our annual data-driven report that provides visibility into how government requests and modern security threats influence online information flows.
Privacy and terms: Government requests for user information
An overview of our legal standards and procedural safeguards that we employ when responding to government requests for user data.
Safeguarding young users through expert-led privacy controls
We are committed to ensuring children have age-appropriate, privacy-preserving experiences across our platforms. Our tailored privacy protections and safeguards, including age assurance technology, parent-managed tools, and strict data policies that limit additional data collection, help balance online protection with a young person’s growing autonomy online. Additionally, we implement strict advertising rules, supporting a complete ban on personalized advertising to anyone under 18 and a ban on the sale of children’s and teens’ personal information to third parties and data brokers.
Our Youth Legislative Framework
Our proactive policy proposal advocating for standardized, industry-wide protections to ensure a safe and age-appropriate digital environment for younger users.
Ensuring a safer online experience for U.S. kids and teens
Our latest age-assurance technologies and product safety measures designed to protect minors in the United States.
Parental Controls
A comprehensive guide to the suite of tools and settings families can use to manage content exposure and digital habits across Google’s platforms.
Our privacy-centered approach to AI training
Our AI models are primarily trained on publicly available information from the open internet. We don't access data from private databases or pages that require a login, unless the website owner allows it.
We have built privacy protections into our AI processes, including cleaning data and removing duplicate information before training. This helps prevent the model from reproducing near-copies of content from its training data and reduces the chance of personal data appearing in its outputs.
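The deduplication step described above can be sketched as exact-match dedup over normalized text. This is a toy illustration under my own assumptions; production training pipelines also use near-duplicate detection (for example, MinHash-based methods), which this sketch omits.

```python
import hashlib

def dedup_exact(documents):
    # Keep the first occurrence of each document, comparing by a
    # hash of whitespace-normalized, lowercased text so trivially
    # reformatted copies count as duplicates.
    seen = set()
    unique = []
    for doc in documents:
        normalized = " ".join(doc.lower().split())
        key = hashlib.sha256(normalized.encode("utf-8")).hexdigest()
        if key not in seen:
            seen.add(key)
            unique.append(doc)
    return unique

corpus = ["Hello world", "hello   WORLD", "Goodbye"]
deduped = dedup_exact(corpus)  # keeps "Hello world" and "Goodbye"
```

Hashing normalized text rather than storing full documents keeps the memory footprint proportional to the number of unique documents, which matters at training-corpus scale.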
Gemini 2.5 Family Model Report
Our Gemini 2.X model family report, with section 5.6 on Privacy & Memorization.
Overview of Google crawlers and fetchers
A guide on how our automated systems interact with web content to power our Search and AI tools.
How Google retains data we collect
An overview of our data lifecycle management, including specific timeframes and logic used for information retention and deletion.
Advancing a uniform national privacy standard
It’s not enough for some organizations to operate responsibly — we need a law that establishes consistent rules and reins in bad actors. We support the passage of a federal privacy law in the U.S. to ensure all Americans have the same strong protections, rather than a confusing patchwork of different state rules.
The urgent necessity of enacting a national privacy law
We advocate for a clear, preemptive federal privacy standard to provide consistent protections for all American consumers.
Partnerships
Perspectives
- Las Vegas Sun: Small businesses would be damaged by decision to break up Google’s ad tool
- QNS: Op-ed: Proposed privacy law would badly hurt small businesses like my historic tavern
- Sacramento Bee: New legislation hinders data collection for targeted ads. That hurts small businesses | Opinion
- Bay County Coastal: Opinion: Breaking Up Google Would Hurt Bloggers Like Me
- Center for Strategic & International Studies: How Japan can operationalize Data Free Flow with Trust
Studies, reports, and whitepapers
- Deduplicating Training Data Makes Language Models Better (Google Research & University of Pennsylvania)
- Hark: A Deep Learning System for Navigating Privacy Feedback at Scale (Google Research)
- Gemini: A Family of Highly Capable Multimodal Models (Google DeepMind)
- Machine Unlearning Doesn’t Do What You Think: Lessons for Generative AI Policy and Research (Google, Cornell University & others)