Why ChatGPT remains an experiment in most tax firms – and how Microsoft offers the only real compliance perspective


In recent months, the discussion about artificial intelligence in tax consulting has grown increasingly loud. It seems as if every other firm is currently testing how ChatGPT can simplify day-to-day work. Many firm managers report trying out initial prompts for drafting pleadings, pre-drafting emails, or preparing procedural documentation. This experimentation is important and useful – but it has clear limitations arising from the structure of the industry.

Around 80% of all tax firms in Germany have fewer than ten employees. This means they have neither specialized IT departments nor the budget to implement large-scale enterprise solutions. This is precisely where AI compliance becomes a problem: as soon as personal data – client data, employee information, tax documents – is processed, strict data protection and professional confidentiality requirements apply. A simple ChatGPT account, even a paid one, cannot meet these requirements.

Many firms initially take out the so-called ChatGPT Plus subscription. It costs around $20 per month and unlocks GPT-4o, currently the most capable model variant. Plus users can save prompts, activate the memory function, and entrust the AI with complex tasks. What many overlook, however, is that ChatGPT Plus is not a solution for data-protection-compliant work. By default, OpenAI uses the entered data for training purposes, stores it in US data centers, and offers no data processing agreement (DPA) in accordance with Art. 28 GDPR. Even if you deactivate the use of data for model training in the settings, this does not change the fact that there is no contractual framework for legally compliant processing.

Even more frequently, expert articles recommend switching to the so-called Team Plan. Depending on the region, it costs approximately $25–30 per user per month and offers additional administrative functions, a central dashboard, and the ability to manage internal team projects. What is less well known, however, is that the Team Plan likewise includes no legally binding data processing agreement. You can deactivate training use and manage role-based permissions, but the data remains stored in the USA. From the perspective of a firm bound by professional secrecy, the Team Plan therefore cannot be used productively. It may make sense for internal experiments – but not for actual client work.

Only OpenAI's Enterprise plan offers an infrastructure that can, at least on paper, be designed to comply with the GDPR. Here, data is not used for training, access is auditable, and organizational and technical safeguards can be implemented. A customized data processing agreement is also concluded. But this is where the second major obstacle appears: Enterprise typically requires at least 150 users and costs around $60 per user per month. That works out to roughly 150 × $60 = $9,000 per month – a sum that even large tax consultancies would not readily approve. For the 80% of firms with fewer than ten employees, Enterprise is simply out of reach.
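To make the gap concrete, here is a quick back-of-the-envelope comparison for a hypothetical ten-person firm. The per-seat prices and minimum seat counts are the approximate figures quoted in this article, not an official price list:

```python
# Approximate monthly per-seat prices (USD) and contractual seat minimums,
# as quoted in the text above -- not an official OpenAI price list.
PLANS = {
    "ChatGPT Plus": {"per_user": 20, "min_users": 1},
    "ChatGPT Team": {"per_user": 27, "min_users": 2},    # ~$25-30 range
    "OpenAI Enterprise": {"per_user": 60, "min_users": 150},
}

def monthly_cost(plan: str, users: int) -> int:
    """Billable seats = max(actual users, contractual minimum)."""
    p = PLANS[plan]
    return p["per_user"] * max(users, p["min_users"])

# A ten-person firm still pays for the full 150 Enterprise seats:
# monthly_cost("OpenAI Enterprise", 10) -> 9000 (USD/month)
# monthly_cost("ChatGPT Plus", 10)      -> 200  (USD/month)
```

The seat minimum, not the per-user price, is what pushes Enterprise out of reach for small firms: the smallest possible invoice is for 150 users.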

From this perspective, it becomes clear: iterative processes, often described as the future of AI, are currently virtually infeasible for small firms. These processes rely on ChatGPT storing context, developing versions, and continuously refining them over days or weeks. Anyone who wants to use them productively must be able to guarantee that no client data ends up unsupervised in US data centers. Without an Enterprise plan, such guarantees cannot be given.

This is where a misunderstanding arises that many firms have run into in recent months: just because a tool is technologically impressive does not mean its use is legally permissible. Those bound by professional secrecy, in particular, are subject to special obligations. Violations of Section 203 of the German Criminal Code (violation of private secrets) or Article 32 of the GDPR (security of processing) are not a theoretical risk; they can result in hefty fines or criminal proceedings.

Anyone looking for a realistic alternative quickly arrives at Microsoft. Microsoft takes a different approach: via Azure OpenAI, the models run in data centers located in Europe, depending on the configuration. This allows firms to establish a clear accountability structure. Microsoft acts as a data processor, concludes data processing agreements as a standard part of the contract, and provides comprehensive audit logs. Unlike using ChatGPT directly from OpenAI, data processing is embedded in a compliance framework that tax advisors already know from Microsoft 365.

For many smaller firms, Microsoft 365 Copilot and Azure OpenAI are particularly relevant. Copilot integrates into existing Word, Excel, and Outlook installations and processes data only within the firm's own tenant. Access controls, data loss prevention, and eDiscovery can be activated with just a few clicks. The current cost is approximately €28–30 per user per month on top of an E3/E5 license – a price that is within reach for smaller firms as well.

A second approach is to use Azure OpenAI directly. GPT-4 Turbo is provided as an API without using any data for training purposes. Developers or specialized service providers can build their own applications on this basis, for example, for automated document preparation or tax research services. This model is more complex, however, as it requires development effort. In return, it offers maximum control, including the ability to precisely define storage locations, deletion periods, and logs.
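As a rough sketch of what such a direct integration looks like, the following Python snippet builds a chat-completions request against the Azure OpenAI REST endpoint using only the standard library. The endpoint URL, deployment name, system prompt, and API version are placeholders – a real setup would use the firm's own Azure resource in an EU region:

```python
import json
import urllib.request

def build_request(endpoint: str, deployment: str, api_key: str,
                  question: str, api_version: str = "2024-02-01"
                  ) -> urllib.request.Request:
    """Assemble a chat-completions request for the Azure OpenAI REST API.

    Azure addresses the model via the firm's own *deployment name*,
    not a global model id -- that is what keeps processing inside
    the chosen Azure region and tenant.
    """
    url = (f"{endpoint}/openai/deployments/{deployment}"
           f"/chat/completions?api-version={api_version}")
    body = {
        "messages": [
            # Placeholder system prompt for a tax-research assistant.
            {"role": "system",
             "content": "Du bist ein Recherche-Assistent einer Steuerkanzlei."},
            {"role": "user", "content": question},
        ],
        "temperature": 0.2,
    }
    return urllib.request.Request(
        url,
        data=json.dumps(body).encode("utf-8"),
        headers={"api-key": api_key, "Content-Type": "application/json"},
        method="POST",
    )

def ask(endpoint: str, deployment: str, api_key: str, question: str) -> str:
    """Send the request and return the model's answer text."""
    req = build_request(endpoint, deployment, api_key, question)
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]
```

In practice, a specialized service provider would wrap this in an application with logging, redaction of client identifiers before dispatch, and defined deletion periods – exactly the controls the paragraph above describes.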

In summary: the technology has long been available – but for many offerings, the regulatory framework is lacking. As long as OpenAI Enterprise is neither financially nor organizationally feasible for small firms, many AI scenarios will remain purely experimental. Anyone who truly wants to become productive should take a close look at Microsoft's compliance models. Only these offer a pragmatic path that is legally and financially viable.

Anyone who believes they can solve the problem simply by disabling training use in ChatGPT Plus or Team should be aware that without a data processing agreement, clearly defined storage locations, and robust auditing, the use remains highly risky. For tax firms, this means that AI can already create added value today—but only if technology, data protection, and professional regulations are considered together.


