
AI and Switzerland's Data Protection Act: What legal teams need to know

Published on March 25, 2026 by Céleste Urech, Co-Founder & CTO

Switzerland's revised Federal Data Protection Act (DSG) has been in force since 1 September 2023. One consequence that is frequently overlooked: the DSG is formulated in technology-neutral terms, which means it applies directly to AI-based data processing - without requiring separate AI-specific legislation. The Swiss Federal Data Protection and Information Commissioner (FDPIC/EDÖB) confirmed this explicitly in an updated guidance document issued on 8 May 2025. Law firms and in-house legal teams using AI tools are therefore operating within a defined legal framework - provided the tools they use and their internal processes meet DSG requirements.

What the DSG means in practice for AI applications

The DSG protects personal data of natural persons. Since the revision, this includes genetic and biometric data as specially protected categories. For AI systems, this matters because personal data can be processed at multiple points in the AI lifecycle: as training data, for contextualisation, as user input, and as AI-generated output.

The DSG requires organisations and public bodies deploying AI to take concrete measures. The most relevant ones are:

Transparency obligation: Anyone carrying out AI-based data processing must disclose the purpose, functionality, and data sources. Users interacting with AI language models have the right to know whether they are communicating with a machine and whether their input is used for model training.

Privacy by Design and Privacy by Default: Art. 7 DSG requires data protection to be considered from the earliest development and planning stages - not only at rollout. For external AI tools, this means assessing whether a system meets these requirements before procurement.

Data Protection Impact Assessment (DPIA): Art. 22 DSG requires a DPIA when data processing carries a high risk to the personality rights or fundamental rights of those affected. High-risk AI applications are permitted in principle, but only with appropriate protective measures.

Data breach notification: Under Art. 24 DSG, data security breaches that are likely to result in a high risk to data subjects must be reported to the FDPIC as quickly as possible.

Prohibited AI applications under the DSG

The DSG prohibits applications specifically designed to undermine privacy and the right to informational self-determination. The FDPIC explicitly cites mass real-time facial recognition in public spaces and social scoring - meaning comprehensive behavioural monitoring and rating of individuals - as examples. These practices are primarily observed in authoritarian state systems, but the boundary is still relevant for compliance purposes.

Switzerland's regulatory framework: DSG instead of an AI Act

Unlike the EU, Switzerland does not yet have dedicated AI legislation. In March 2025, the Federal Council signed the Council of Europe Framework Convention on Artificial Intelligence and Human Rights, Democracy and the Rule of Law (adopted 17 May 2024). Ratification is planned, and initial consultation proposals are expected by the end of 2026. The Federal Office of Communications (BAKOM) is leading the regulatory review process.

Switzerland's stated approach has three goals: strengthening innovation, protecting fundamental rights including economic freedom, and building public trust in AI systems. Until a Swiss-specific AI law enters into force, the DSG remains the primary legal instrument.

For organisations with EU exposure: the DSG is closely modelled on the GDPR, which contributed to the EU's adequacy decision for Switzerland. DSG compliance aligns with many GDPR requirements in practice - but the two frameworks are not fully equivalent.

Why legal teams need to pay particular attention

Legal teams handle highly sensitive data every day: contract documents, client information, due diligence files, HR records. When this content is fed into AI tools, data protection questions arise immediately: where is the data stored? Who has access? Are inputs used for training?

A recent AXA study (KMU-Arbeitsmarktstudie 2025) found that two in three Swiss SMEs are already using or experimenting with AI - but only one in three has clear internal rules governing AI use. In most organisations, employees independently decide which tools to use and what data to enter. From a data protection standpoint, that is a significant risk.

The FDPIC has also brought the issue into international focus. In March 2025, it concluded a preliminary investigation into X (formerly Twitter), which had used public posts from users to train its AI model Grok without sufficient transparency. The outcome: X users can opt out of having their posts used for this purpose.

What DSG-compliant AI tools for legal teams look like

An AI tool used in a legal context should concretely meet the following criteria:

No data retention: Inputs and documents are not stored permanently. CASUS, a Swiss legal AI platform for law firms and in-house legal teams, operates with zero data retention - no data is stored after processing.

No human review: Content is not viewed by human staff at the provider. CASUS offers an abuse monitoring opt-out, meaning no human review of user inputs takes place.

Hosting in Switzerland or the EU: Data must not be transferred to third countries without an adequate level of data protection. CASUS hosts exclusively in Switzerland and the EU - no data transfer to the US.

Transparency about processing: Users should be able to understand what happens to their inputs.

These are not marketing promises - they are technical and contractual prerequisites for legally compliant AI use in a legal setting.

Practical implications for law firms and legal teams

Any firm using AI tools for contract analysis, due diligence, or legal research should address three internal questions:

First: what data is being entered into the tool? If it includes personal data of natural persons - which is almost always the case in legal documents - the DSG applies.

Second: is a DPIA required? For high-risk processing, yes. For standard contract analysis without profiling, the risk level is often lower, but a review is still recommended.

Third: does the organisation have internal AI usage policies? According to the AXA study, two in three Swiss companies lack them. Violations can lead to significant fines (under the revised DSG, of up to CHF 250,000, imposed on responsible individuals rather than the company) as well as reputational damage.

CASUS's AI Data Room supports the processing of large document volumes and can detect personal data such as names, email addresses, IDs, and bank details - prioritising sensitive data categories. This supports preparation for anonymisation before documents are shared further.
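The detection step described above can be sketched with simple pattern matching. The patterns below (email, Swiss IBAN, Swiss phone number) are illustrative assumptions for demonstration, not CASUS's actual detection logic, which is likely considerably more robust:

```python
import re

# Illustrative sketch only: regex patterns for a few common personal-data
# categories. A production system would use more robust detection methods;
# these patterns are assumptions, not CASUS's implementation.
PII_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    # Swiss IBAN: "CH" + 2 check digits + 17 alphanumeric characters
    "iban": re.compile(r"\bCH\d{2}(?: ?[0-9A-Za-z]{4}){4} ?[0-9A-Za-z]\b"),
    # Swiss phone number in international notation, e.g. +41 44 123 45 67
    "phone": re.compile(r"\+41 ?\d{2} ?\d{3} ?\d{2} ?\d{2}"),
}

def find_pii(text: str) -> dict[str, list[str]]:
    """Return the matches found in `text`, grouped by category."""
    return {name: pat.findall(text) for name, pat in PII_PATTERNS.items()}
```

Flagged passages can then be reviewed or anonymised before a document leaves the organisation.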

For legal research on data protection questions, CASUS offers a Legal Research mode that draws on statutes, case law, and legally reliable sources, delivering structured, traceable outputs rather than generic internet answers.

CASUS as a data-protection-compliant option

CASUS is a Swiss legal AI platform that works directly in Microsoft Word or as a web app. The platform includes modules for contract analysis (Risk & Quality Review), document comparison (Benchmark), legal proofreading (Proofread), and AI-powered chat with documents (AI Chat).

The architecture is explicitly designed for the Swiss and European legal market: hosting in Switzerland and the EU, no data transfer to the US, zero data retention, and no human review. This means the platform meets the core technical requirements for DSG-compliant use in a legal context.

Firms looking for an AI solution that fits within Swiss data protection law can try CASUS for free at app.getcasus.com/signup.

FAQ

Does Switzerland's DSG apply to AI applications?

Yes. The DSG is formulated in technology-neutral terms and applies directly to any AI-based data processing. The FDPIC confirmed this explicitly in updated guidance issued on 8 May 2025.

Does Switzerland have a dedicated AI law?

Not yet in force. Switzerland signed the Council of Europe Convention on AI in March 2025. Initial consultation proposals are expected by the end of 2026. Until then, the DSG remains the governing instrument.

When is a Data Protection Impact Assessment (DPIA) required for AI projects?

A DPIA under Art. 22 DSG is required when data processing carries a high risk to the personality rights or fundamental rights of those affected. This applies particularly to AI applications involving extensive profiling or automated individual decision-making.

Which AI applications are prohibited under the DSG?

Applications specifically designed to undermine privacy and informational self-determination are prohibited. The FDPIC cites mass real-time facial recognition and social scoring as clear examples.

Can a law firm enter client data into AI tools?

It depends on the tool. The key factors are: where is the data stored, are inputs used for model training, and is there human review? A law firm must ensure that any tool it uses is DSG-compliant - in particular, no data retention and no US hosting.

What does "zero data retention" mean in an AI context?

Zero data retention means that input data and documents are not stored permanently after processing. This is a technical requirement for meeting the DSG obligations around the right to erasure and purpose limitation.
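Conceptually, zero data retention means the input exists only for the duration of the request. A minimal sketch of the idea follows; the function names are illustrative, not CASUS's actual implementation:

```python
def analyse(document: str) -> str:
    # Placeholder for the actual AI processing step.
    return f"analysed {len(document)} characters"

def handle_request(document: str) -> str:
    # The document lives only in local variables: it is never logged,
    # written to disk, or stored in a database, so it is discarded as
    # soon as this function returns its result.
    return analyse(document)
```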

How does Switzerland's DSG differ from the EU GDPR?

The DSG is closely modelled on the GDPR, which supported the EU's adequacy decision for Switzerland. One notable change: the revised DSG, like the GDPR, no longer protects the data of legal persons (the pre-revision DSG did), and certain thresholds, sanctions, and procedures differ between the two frameworks. In practice, DSG compliance aligns with many GDPR requirements, but the two are not fully equivalent.

How can legal teams reduce AI-related data protection risks internally?

Practical steps include: assessing every AI tool for DSG compliance before use, establishing clear internal policies on permitted data inputs, training staff on applicable rules, and - where possible - anonymising personal data before AI processing.
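The last of these steps, anonymising personal data before AI processing, can be sketched as a simple pseudonymisation pass: known names are replaced by placeholders before text is sent to an external tool and restored locally afterwards. The name list and placeholder scheme below are illustrative assumptions:

```python
import re

def pseudonymise(text: str, names: list[str]) -> tuple[str, dict[str, str]]:
    """Replace each known name with a placeholder; return the masked text
    and a local mapping that allows the original names to be restored."""
    mapping = {}
    for i, name in enumerate(names, start=1):
        placeholder = f"[PERSON_{i}]"
        mapping[placeholder] = name
        text = re.sub(re.escape(name), placeholder, text)
    return text, mapping

def restore(text: str, mapping: dict[str, str]) -> str:
    """Re-insert the original names into text returned by the AI tool."""
    for placeholder, name in mapping.items():
        text = text.replace(placeholder, name)
    return text
```

The mapping never leaves the organisation, so the external tool only ever sees the placeholders.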


Contracts on autopilot. With CASUS.


CASUS Technologies AG

Uraniastrasse 31

8001 Zurich

Switzerland

Copyright ©2025 CASUS Technologies AG — All rights reserved.
