GitHub Expands Copilot Data Use for AI Training, Affecting Free and Pro Users – Opt‑Out Required
What Happened — GitHub announced that, starting 24 April 2026, interaction data from Copilot Free, Pro, and Pro+ users will be used to train and improve its AI models unless the user opts out. Business and Enterprise Copilot tiers, as well as users who have already opted out, are excluded.
Why It Matters for TPRM —
- Third‑party risk managers must assess whether the new data‑use policy aligns with their organization’s data‑privacy and IP protection requirements.
- If interaction data drawn from corporate development work is harvested for model training, proprietary code or confidential design information could be exposed.
- Because the opt‑out is configured per account rather than centrally, enforcing it adds administrative overhead and requires verification that corporate policy is actually applied on every developer account.
Who Is Affected — Software development firms, SaaS providers, financial‑services and healthcare organizations, and any organization whose developers use GitHub Copilot under free or individual paid licenses.
Recommended Actions —
- Review internal Copilot usage and identify accounts on Free, Pro, or Pro+ plans.
- Enforce the opt‑out on all corporate accounts before the policy takes effect.
- Update vendor risk questionnaires to capture the new data‑use policy.
- Monitor for inadvertent leakage of proprietary code.
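The "review internal Copilot usage" step above can be partially automated. A minimal sketch, using the GitHub REST API's organization members and Copilot seat endpoints: list org members who do not hold an org‑managed (Business/Enterprise) Copilot seat, since those developers may be working under personal Free/Pro licenses that fall within the new data‑use policy. `ORG` and `TOKEN` are placeholders; pagination and error handling are omitted for brevity.

```python
import json
from urllib.request import Request, urlopen

API = "https://api.github.com"

def _get(path: str, token: str):
    """Fetch one page of a GitHub REST API resource (pagination omitted)."""
    req = Request(
        f"{API}{path}",
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
        },
    )
    with urlopen(req) as resp:
        return json.load(resp)

def members_without_org_seat(members: list[str], seated: set[str]) -> list[str]:
    """Org members lacking an org-managed Copilot seat: candidates for a
    personal Copilot plan whose opt-out status should be verified."""
    return sorted(m for m in members if m not in seated)

def audit(org: str, token: str) -> list[str]:
    """Return logins to follow up on for opt-out verification."""
    members = [m["login"] for m in _get(f"/orgs/{org}/members", token)]
    seats = _get(f"/orgs/{org}/copilot/billing/seats", token)["seats"]
    seated = {s["assignee"]["login"] for s in seats}
    return members_without_org_seat(members, seated)
```

Note that a member without an org seat is only a candidate for follow‑up, not proof of a personal license; the personal data‑sharing toggle itself is a user‑level setting with no organization‑wide API, so final verification remains a manual step.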
Technical Notes — The policy expands data collection to include prompts, generated suggestions, accepted or edited outputs, code context, comments, file names, repository structure, and feedback. Data from private repositories “at rest” is not used, but interaction data may be transmitted to Microsoft affiliates for model training. No new CVEs are involved. Source: https://www.helpnetsecurity.com/2026/03/26/github-copilot-data-privacy-policy-update/