Artificial intelligence tools each have unique privacy policies and data retention practices, giving users varying levels of control over their personal information. While some tools provide robust options to enhance privacy and limit data usage, others may have more restricted settings. Below, you’ll find concise guides and short videos that demonstrate how to access and adjust privacy settings for each tool, empowering you to take control of your data.
OpenAI provides a straightforward way to control how ChatGPT uses your data. To adjust the privacy settings in ChatGPT, follow these steps:
When this setting is disabled, new conversations will not appear in your history and will be deleted from OpenAI’s systems within 30 days, unless required for abuse monitoring. Note that disabling this feature means you won't have access to chat logs for future reference.
A large amount of data on the internet relates to people, so our training information does incidentally include personal information. We don’t actively seek out personal information to train our models.
We use training information only to teach our models intelligence, such as the ability to predict, reason, and solve problems. We do not and will not use any personal information in training information to build profiles about people, to contact them, to advertise to them, to try to sell them anything, or to sell the information itself.
Our models may learn from personal information to understand how things like names and addresses fit within language and sentences, or to learn about famous people and public figures. This makes our models better at providing relevant responses.
We also take steps to reduce the processing of personal information when training our models. For example, we remove websites that aggregate large volumes of personal information and we train our models to reject requests for private or sensitive information about people.
To manage your privacy when using Microsoft Copilot, you can adjust data-sharing preferences and other privacy-related settings, which vary by Copilot platform (e.g., Microsoft 365 Copilot, Security Copilot, or Windows Copilot). When you access Copilot from a web browser, follow these steps to adjust your privacy settings:
Here, you can manage various privacy-related options, such as personalization, data sharing, and chat history. Adjust these settings according to your preferences.
When a user interacts with Microsoft 365 Copilot (using apps such as Word, PowerPoint, Excel, OneNote, Loop, or Whiteboard), we store data about these interactions. The stored data includes the user's prompt and Copilot's response, including citations to any information used to ground Copilot's response. We refer to the user’s prompt and Copilot’s response to that prompt as the "content of interactions" and the record of those interactions is the user’s Copilot activity history. For example, this stored data provides users with Copilot activity history in Microsoft 365 Copilot Chat (previously named Business Chat) and meetings in Microsoft Teams. This data is processed and stored in alignment with contractual commitments with your organization’s other content in Microsoft 365. The data is encrypted while it's stored and isn't used to train foundation LLMs, including those used by Microsoft 365 Copilot.
Google’s Gemini provides several privacy controls to help you manage how your data is saved and used. These settings are accessible through the Gemini mobile app, web experience, or your Google Account. Here’s how to find and adjust them:
Google collects your Gemini Apps conversations, related product usage information, info about your location, and your feedback. Google uses this data, consistent with our Privacy Policy, to provide, improve, and develop Google products and services and machine-learning technologies, including Google’s enterprise products such as Google Cloud.
Gemini Apps Activity is on by default if you are 18 or older. Users under 18 can choose to turn it on.
If your Gemini Apps Activity setting is on, Google stores your Gemini Apps activity with your Google Account for up to 18 months. You can change this to 3 or 36 months in your Gemini Apps Activity setting. Info about your location, including the general area from your device, IP address, or Home or Work addresses in your Google Account, is also stored with your Gemini Apps activity. Learn more at g.co/privacypolicy/location.
https://support.google.com/gemini/answer/13594961?hl=en#your_data
Perplexity AI provides privacy controls that let you manage how your data is used. Here's how to locate and adjust these settings:
These settings allow you to control how your data is handled while using Perplexity AI. Note that disabling data usage or using Incognito Mode may limit certain features, such as saving threads for future reference.
Claude does not currently offer customizable privacy settings for users. However, Anthropic's privacy policy states that user data is not automatically used to train its models. Data is used for training only if users explicitly provide consent through feedback mechanisms (e.g., thumbs up/down) or if conversations are flagged for trust and safety review to improve policy enforcement. Additionally, Anthropic commits to automatically deleting user inputs and outputs from its backend systems within 30 days, unless longer retention is required for legal or policy compliance.
For more detailed information, please refer to Claude's privacy policy on Anthropic's official website: https://privacy.anthropic.com/en/
We will not use your Inputs or Outputs to train our models, unless: (1) your conversations are flagged for Trust & Safety review (in which case we may use or analyze them to improve our ability to detect and enforce our Usage Policy, including training models for use by our Trust and Safety team, consistent with Anthropic’s safety mission), or (2) you’ve explicitly reported the materials to us (for example via our feedback mechanisms), or (3) by otherwise explicitly opting in to training.
https://privacy.anthropic.com/en/articles/10023555-how-do-you-use-personal-data-in-model-training