The Legal Gray Zone Is Shrinking
Many companies already use ChatGPT, Claude or Microsoft Copilot. But few have asked: Is this actually data protection compliant?
With the revised Swiss Data Protection Act (revDSG), which came into force on September 1, 2023, clear rules apply – including for the use of AI.
Important Note
This article provides general guidance and does not replace legal advice. For specific questions, you should consult a specialized lawyer or your data protection officer.
The New Data Protection Act (revDSG) and AI
The Federal Data Protection and Information Commissioner (FDPIC/EDÖB) has clarified: The DPA applies directly to AI applications. No separate AI law is needed.
The Key Principles
Transparency
Data subjects must be informed when their data is processed with AI – especially for automated decisions.
Purpose Limitation
Personal data may only be used for the stated purpose. Reuse for AI training is often not permitted.
Data Minimization
Only absolutely necessary data should be processed. Anonymize wherever possible.
Data Security
Appropriate technical and organizational measures must protect the data.
Automated Individual Decisions
Special rules apply when AI systems make decisions without human involvement that have significant effects on the data subject.
In such cases, data subjects have the right to:
- Be informed about the automated decision
- Request human review
- Present their point of view
Example: Automated Credit Check
An AI system rejects a credit application. The applicant must be informed that an AI made this decision and can request human review.
Cloud Providers: What's Allowed?
Most AI tools are operated by US companies. This raises questions about data transfer.
| Provider | Data Location | Specifics |
|---|---|---|
| OpenAI (ChatGPT) | USA | Enterprise version with training opt-out |
| Anthropic (Claude) | USA | Enterprise: Data not used for training |
| Microsoft (Copilot/Azure) | EU/Switzerland possible | Azure OpenAI Service with EU data residency |
| Google (Gemini) | Variable | Workspace version with privacy commitments |
Data Transfer to the USA
Since the Swiss-U.S. Data Privacy Framework (in effect since September 2024), data transfer to certified US companies is simplified again. However, you should check:
- Is the provider certified under the Framework?
- What data is actually being transferred?
- Is there an Enterprise version with better privacy guarantees?
What Data May Be Processed?
Safe
- Publicly available information
- Anonymized or synthetic data
- Internal documentation without personal reference
- Aggregated statistics
- Your own texts for editing
Critical / Prohibited
- Customer data without consent
- Health data (specially protected)
- Employee personnel data
- Confidential business information
- Data of minors
The EU AI Act and Its Impact on Switzerland
The EU adopted the AI Act in 2024 – the world's first comprehensive AI law. It's relevant for Swiss companies if:
- You offer or deploy AI systems in the EU
- The outputs of your AI systems are used in the EU
- You serve customers located in the EU
Risk-Based Approach of the AI Act
The EU AI Act classifies AI systems by risk:
- Unacceptable risk: Prohibited (e.g., social scoring)
- High risk: Strict requirements (e.g., creditworthiness assessment)
- Limited risk: Transparency obligations (e.g., chatbots)
- Minimal risk: No special requirements
Checklist: DSG-Compliant AI Usage
Use this checklist to review your AI usage:
- Create data inventory: What data is transmitted to which AI tools?
- Check providers: Where is data processed? What guarantees are there?
- Establish usage guidelines: Clear rules for employees on what may be entered and what may not.
- Update privacy policy: Inform customers about AI use when their data is affected.
- Check data processing agreement (DPA): For business versions of AI tools, is a DPA in place?
- Train employees: Raise awareness of privacy risks when using AI.
- Keep documentation: Be able to prove that you have taken data protection measures.
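The first checklist item, a data inventory, can be kept machine-readable so that risky tool usage is easy to spot. The following is a minimal sketch; the field names, tool names, and example entries are purely illustrative assumptions, not a prescribed format.

```python
from dataclasses import dataclass

# Hypothetical record for one AI tool in the company's data inventory.
@dataclass
class AIToolRecord:
    tool: str                # name of the AI tool or deployment
    data_categories: list    # e.g. ["marketing drafts", "support tickets"]
    personal_data: bool      # does the input contain personal data?
    data_location: str       # where the provider processes the data
    dpa_in_place: bool       # is a data processing agreement signed?

# Illustrative inventory entries (not real assessments of these products).
inventory = [
    AIToolRecord("ChatGPT Enterprise", ["marketing drafts"], False, "USA", True),
    AIToolRecord("Chatbot pilot", ["support tickets"], True, "USA", False),
]

# Flag entries that need review: personal data processed without a DPA.
needs_review = [r.tool for r in inventory if r.personal_data and not r.dpa_in_place]
print(needs_review)  # → ['Chatbot pilot']
```

Even a short script like this makes the documentation duty from the last checklist item easier to satisfy: the inventory doubles as evidence that you have assessed each tool.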
Practical Recommendations
For Immediate Start
- Use Enterprise versions: ChatGPT Enterprise, Claude for Business, Microsoft Copilot for Business offer better privacy guarantees.
- Anonymize data: Replace names, addresses and other identifying features before entering data into AI tools.
- Exclude sensitive areas: HR, accounting and customer data should not be entered into public AI tools.
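The "anonymize data" step above can be partially automated. The sketch below masks two obvious identifier types (e-mail addresses and Swiss-style phone numbers) with regular expressions; the patterns and placeholder names are assumptions for illustration. Real anonymization needs more than this, e.g. named-entity recognition for personal names and a human review step.

```python
import re

# Assumed patterns: e-mail addresses and Swiss phone numbers
# (+41 xx xxx xx xx or 0xx xxx xx xx). Illustrative only, not exhaustive.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")
PHONE = re.compile(r"\+41[\s\d]{9,12}|0\d{2}[\s/]?\d{3}\s?\d{2}\s?\d{2}")

def mask_identifiers(text: str) -> str:
    """Replace obvious identifiers with placeholders before sending text to an AI tool."""
    text = EMAIL.sub("[EMAIL]", text)
    text = PHONE.sub("[PHONE]", text)
    return text

prompt = "Please summarize the complaint from hans.muster@example.ch (tel. 044 123 45 67)."
print(mask_identifiers(prompt))
# → Please summarize the complaint from [EMAIL] (tel. [PHONE]).
```

Note that masking direct identifiers is pseudonymization, not full anonymization in the legal sense: if the remaining text still allows re-identification, it remains personal data under the DSG.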
For Long-Term Strategy
- Evaluate local AI solutions: On-premise models or Swiss cloud providers for sensitive applications.
- Build AI governance: Clear responsibilities and processes for using AI in the company.
- Data protection impact assessment (DPIA): Required under the DSG whenever the intended processing may pose a high risk to data subjects – which is often the case for AI systems.
Conclusion: Yes to AI, But With Care
Using AI is fundamentally permitted in Switzerland – but not without rules. The Data Protection Act sets clear limits that companies must observe.
The good news: With the right measures, most risks can be minimized. Enterprise versions of common AI tools, clear usage guidelines and trained employees are the key to legally compliant AI use.
Companies that address this topic now are well positioned – also for future regulatory developments.
Implement AI Compliantly
I support you in implementing AI solutions in your company in a data protection-compliant manner.
Schedule Consultation