Training opt-out: what your company must configure
Which editions train by default, where the opt-out lives, what LGPD requires — and why this is the first item on Autenticare's checklist for any Gemini Enterprise project.
Quick rule by edition
- Trains by default — Prompts and content can be used for model improvement. Not safe for corporate use without an explicit opt-out.
- Doesn't train — Contractual no-training clause. The default for companies that want privacy without complexity.
- Doesn't train + DLP/Audit — Same guarantee, plus DLP, advanced controls, audit and Brazil Data Residency.
Where the opt-out lives (step by step)
- Step 1 — Workspace Admin Console
Apps → Google Workspace → Gemini → Data usage settings. Confirm that Gmail, Drive, Docs, Meet and Chat content is not shared for public model training.
- Step 2 — Google Cloud / Vertex AI
In the Cloud project, enable customer data-use policies for Vertex AI and the Gemini API. Ensure prompts and responses aren't used for training and that retention is set per company policy.
- Step 3 — Contract
Attach an explicit no-training clause and an LGPD-compliant Data Processing Agreement. For regulated industries, require periodic Google Cloud audit reports.
- Step 4 — DLP & governance
Configure Cloud DLP to mask personal data in prompts, define authorized domains and users, and enable full Audit Logs.
- Step 5 — Evidence document
Generate a signed report with screenshots, configurations and clauses. This is what your DPO/legal presents in internal audits or to ANPD.
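To make step 4 concrete, here is a minimal, illustrative sketch of prompt masking. It is not the Cloud DLP API itself (in production you would call Cloud DLP's de-identification service); it uses local regex patterns for two common Brazilian identifiers, and the pattern set and placeholder tokens are assumptions chosen for the example.

```python
import re

# Assumed patterns for illustration: CPF in the formatted
# 000.000.000-00 style, and a simple e-mail address pattern.
PATTERNS = {
    "CPF": re.compile(r"\b\d{3}\.\d{3}\.\d{3}-\d{2}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def mask_prompt(prompt: str) -> str:
    """Replace matched personal data with a typed placeholder
    before the prompt leaves the company boundary."""
    for label, pattern in PATTERNS.items():
        prompt = pattern.sub(f"[{label}]", prompt)
    return prompt

masked = mask_prompt("Cliente 123.456.789-09, contato ana@empresa.com.br")
```

The point is the placement of the control: masking happens before the prompt reaches Gemini, so even with the opt-out in place, personal data never leaves the environment in clear text.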
FAQ
Does paid Gemini Enterprise data train the model?
No. In paid editions (Business, Standard, Plus and Enterprise), Google contractually does not use prompts, responses, or Workspace content to train public models. This is part of the Terms of Service for paid Workspace and Cloud customers.
What about the free Starter edition?
Starter (free) trains by default: prompts and content can be used by Google for model improvement. No company should use Starter without an explicit opt-out; better yet, migrate straight to Business.
Where exactly is the opt-out?
Two places: (1) Workspace Admin Console → Apps → Gemini → Data usage settings; (2) Google Cloud → Vertex AI / Gemini API → customer policies. For full assurance, both must be configured, plus a contract with an explicit no-training clause.
Does LGPD require opt-out?
LGPD requires a legal basis for processing personal data, a specific purpose, and the principle of minimization. Training a third-party model with personal data without informed consent rarely holds up — opt-out is the safe, auditable posture for Brazilian companies.
Does Brazil Data Residency solve it?
It helps but isn't enough. Data Residency keeps data at rest in São Paulo. Model training is a layer above — opt-out is still required, and the combination (residency + opt-out + DLP) is what produces a mature LGPD posture.
Does Autenticare configure all of this?
Yes. In the standard Autenticare setup for Gemini Enterprise, opt-out is applied across all touchpoints, DLP is configured, retention is reviewed, contracts with no-training clauses are aligned, and the customer receives a signed configuration report for audit.
Want an auditable Gemini Enterprise setup from day one?
Autenticare delivers the environment already configured: opt-out applied, DLP active, contracts reviewed, and a signed evidence report in hand.
