Why Australian businesses should stop using ChatGPT for client work
3 April 2026 · 5 min read
Cloud AI tools are convenient. You type a prompt, you get a useful result, and your afternoon gets a little easier. For a lot of tasks, that trade-off feels perfectly reasonable. For client work, it is not.
The moment you paste a client email, a contract, a brief, or any identifying information into a cloud AI product, that data leaves Australia. It travels to servers operated by a US company, where it is subject to US law and processed under terms that most business owners have never read. Your client almost certainly has no idea this is happening.
What the Privacy Act actually says
Australia's Privacy Act 1988 applies to any organisation with an annual turnover above $3 million, and to many smaller businesses in certain sectors. If you handle personal information about clients or their customers, the Act governs what you can do with it. Sending that information offshore is not a technical loophole. It is a disclosure, and it triggers obligations under Australian Privacy Principle 8.
APP 8 requires that before you disclose personal information to an overseas recipient, you take reasonable steps to ensure the recipient will handle it in accordance with the Australian Privacy Principles. In practice, that means either a binding agreement with the overseas party, or a reasonable belief that the recipient is subject to a law or binding scheme substantially similar to the APPs. A standard terms-of-service agreement with a US AI provider will rarely meet that standard.
The practical risk
Most business owners do not think about this because nothing has gone wrong yet. But the risk compounds over time. Every client document you process through a cloud AI tool adds to a pattern of cross-border disclosure that would be very difficult to defend if a client ever complained to the Office of the Australian Information Commissioner (OAIC), or if a data incident required notification.
There is also the question of client trust. If your clients knew you were processing their sensitive information through an overseas AI service, would they be comfortable with that? Most would not. That gap between what clients expect and what is actually happening is a reputational liability that sits quietly until it does not.
The alternative is not to stop using AI
The productivity benefits of AI are real. The goal should not be to stop using it, but to use it in a way that keeps your data where it belongs: on hardware you control, in the country your clients expect.
A local AI setup processes everything on your own machine. No internet connection is required, no data leaves your premises, and there is no overseas recipient to worry about. You get the same day-to-day productivity without the compliance exposure.
For Australian businesses that handle sensitive client information, that is not just a nice-to-have. It is quickly becoming the only defensible approach.
George runs entirely on your hardware.
No data leaves your machine. No overseas servers. No subscriptions. Just a private AI assistant that works the way Australian businesses need it to.
See George pricing