
Protecting user privacy

OVERVIEW

We support privacy regulation and policy that fosters trust and are committed to providing products and services that are private by design, secure by default, and that put users in control of their information.

REPORT

Explore our transparency report


Our policy priorities for user privacy


Privacy in the age of Generative AI

At Google, we have over 25 years of experience protecting data from inappropriate access and unauthorized use. In this era of AI, we’re extending our best practices in data protection to ensure that the right data is used the right way to train models. We rely on publicly available data and we advocate for a risk-based approach to AI policy that protects personal information while allowing for scientific and social breakthroughs.


Personalization and user controls

Developing responsible AI means incorporating private-by-design principles in our products and platforms. We empower users to make informed decisions on what data they share to personalize their experience and enable strong privacy protection through transparency, robust controls, and clear reporting channels.


Building privacy and safety into agentic AI

Agents require access to data in order to be relevant to your goals and to complete tasks under your supervision. To ensure agents act only within authorized bounds, we apply a privacy-by-design approach and encourage policy that scales privacy through transparent frameworks and granular user controls.


Advancing Privacy-Enhancing Technologies (PETs)

We’re investing in the next generation of PETs, such as differential privacy and federated learning, which help us train AI models while protecting user data. However, adoption of these technologies remains low outside the tech sector. Governments can accelerate PET adoption by integrating these technologies into public service procurement and delivery, helping further secure citizen data.

A video interview with Dr. Janina Voigt, Engineering Manager at GSEC Munich.

Taking on privacy & security engineering at the Google Safety Engineering Center, Munich

GSEC Munich is Google’s global hub of privacy and security engineering in the heart of Europe. Established in 2019, it’s where 200+ dedicated engineers work to create products and tools that will help keep people everywhere safe online, and their information private and secure.
Watch the video

FAQs

How we’re helping users control their data

Prioritizing user agency and data transparency

Keeping users in control of their data is a core priority of our work. We provide users with robust controls to access, review, export, and delete their personal data through their account settings across our products and platforms. Generative AI is no different, requiring a signed-in experience and giving users the ability to turn off their Gemini Apps activity and conversations. And for emerging AI agents, we are building permission controls that allow users to specify exactly the data an agent can access.

Our privacy policies
Our foundational commitment to data transparency, outlining how user information is collected, used, and protected.

Privacy check-up
Our comprehensive tool that allows individuals to review and customize their privacy settings across our products and services.

Our stance on government requests for data
Our privacy protections for kids and families

Safeguarding young users through expert-led privacy controls

We are committed to ensuring children have age-appropriate, privacy-preserving experiences across our platforms. Our tailored privacy protections and safeguards, including age assurance technology, parent-managed tools, and strict data policies that limit additional data collection, help balance online protection with a young person’s growing autonomy online. Additionally, we enforce strict advertising rules and support a complete ban on personalized advertising for anyone under 18, as well as a ban on the sale of children’s and teens’ personal information to third parties and data brokers.

Our Youth Legislative Framework
Our proactive policy proposal advocating for standardized, industry-wide protections to ensure a safe and age-appropriate digital environment for younger users.

Ensuring a safer online experience for U.S. kids and teens
Our latest age-assurance technologies and product safety measures designed to protect minors in the United States.

Parental Controls
A comprehensive guide to the suite of tools and settings families can use to manage content exposure and digital habits across Google’s platforms.

How we use public data in training our AI models

Our privacy-centered approach to AI training

Our AI models are primarily trained on publicly available information from the open internet. We don't access data from private databases or pages that require a login, unless the website owner allows it.

We have built privacy protections into our AI processes, including cleaning data and deleting duplicate information before training. This helps prevent the model from outputting near-copies of content from its training data and reduces the chance of it showing personal data in its outputs.

Gemini 2.5 Family Model Report
Our Gemini 2.X model family report, with section 5.6 on Privacy & Memorization.

Overview of Google crawlers and fetchers
A guide on how our automated systems interact with web content to power our Search and AI tools.

How Google retains data we collect
An overview of our data lifecycle management, including specific timeframes and logic used for information retention and deletion.

Our perspective on a Comprehensive National Privacy Law

Advancing a uniform national privacy standard

It’s not enough for some organizations to operate responsibly — we need a law that establishes consistent rules and reins in bad actors. We support the passage of a federal privacy law in the U.S. to ensure all Americans have the same strong protections, rather than a confusing patchwork of different state rules.

The urgent necessity of enacting a national privacy law
We advocate for a clear, preemptive federal privacy standard to provide consistent protections for all American consumers.
