What is APP 8 and why your AI use might breach it
1 April 2026 · 5 min read
Most Australian business owners who use cloud AI have never heard of Australian Privacy Principle 8. That is understandable. Privacy law is not exactly light reading. But APP 8 is the principle most directly relevant to how AI tools handle your data, and ignoring it carries real consequences.
What APP 8 actually says
APP 8 sits in Schedule 1 of the Privacy Act 1988 (Cth) and deals specifically with cross-border disclosure of personal information. In plain terms: if you send personal information about an Australian individual to a recipient overseas, you are responsible for making sure that recipient handles the data in a way that meets Australian privacy standards.
You cannot discharge that responsibility simply by pointing to a provider's privacy policy. The Act requires you to take reasonable steps to ensure the overseas recipient does not breach the Australian Privacy Principles, and in practice that usually means a contractual arrangement binding the recipient to them, or some other verified basis for compliance. If something goes wrong, the liability stays with you, the Australian business, not the overseas provider.
How cloud AI triggers it
When you type into a cloud AI product — whether that is drafting an email, summarising a document, or working through a client problem — the text you enter is transmitted to servers operated by the AI provider. Those servers are almost always located in the United States or the European Union. The moment personal information about a client or employee enters that prompt, you have made a cross-border disclosure under APP 8.
"Personal information" is broader than most people think. A client's name combined with their project details is personal information. A complaint from a staff member is personal information. An email address alongside a purchase history is personal information. You do not need to be working with medical records or financial data to be in scope.
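To make the breadth concrete, here is a minimal sketch of a prompt screen that flags only the most obvious identifiers before anything leaves your machine. The function name and patterns are hypothetical illustrations, not part of any real compliance tool, and the key caveat cuts the other way: "personal information" under the Privacy Act is far broader than any pattern match, so a clean result from a check like this proves nothing.

```python
import re

# Two deliberately narrow patterns: email addresses and
# Australian-style phone numbers. These catch only the most
# obvious identifiers; a client name next to project details
# is personal information too, and no regex will spot that.
EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE = re.compile(r"(?:\+?61|0)[2-478](?:[ -]?\d){8}")

def obvious_pii(prompt: str) -> list[str]:
    """Return obviously identifying fragments found in a prompt."""
    return EMAIL.findall(prompt) + PHONE.findall(prompt)

prompt = "Summarise the complaint from jane@example.com about order #1182."
print(obvious_pii(prompt))  # ['jane@example.com']
```

The point of the sketch is the gap it leaves: a prompt that passes this screen can still contain personal information, which is why filtering what you send to an overseas provider is a much weaker position than not sending anything at all.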
What the OAIC can do
The Office of the Australian Information Commissioner can investigate complaints, conduct audits, and issue determinations. Serious or repeated breaches can attract civil penalties. More commonly, businesses face mandatory remediation, reputational damage, and the cost of responding to a formal investigation.
The regulatory environment around AI is also tightening. The OAIC has been increasingly active in publishing guidance on AI and privacy, and it is reasonable to expect that scrutiny of AI-related data practices will intensify over the next few years.
The clean solution
The simplest way to eliminate the APP 8 problem is to keep processing on Australian soil, on hardware you control. A local AI assistant that runs entirely offline never sends data to an overseas recipient. There is no cross-border disclosure because there is no disclosure at all. Your data stays on your machine, full stop.
This is not just a compliance workaround. It is the more sensible architecture for any business that takes its obligations to clients seriously.
George never touches an overseas server.
Everything stays on your own hardware. No cross-border disclosure, no APP 8 exposure. George is designed for Australian businesses that cannot afford to be casual with client data.
See George pricing