The Australian Productivity Commission must have known that their interim report on harnessing data and digital technologies (the Report) would get attention. Before publication, the Australian Privacy Commissioner let on that she was “watching with interest to see if privacy is positioned as a barrier to, or an enabler of, a more trustworthy and productive digital economy”, while the Federal Treasurer highlighted that although legislation would matter, his government was “overwhelmingly focused on capabilities and opportunities, not just guardrails” in relation to AI technologies.
The Productivity Commission released the Report in the small hours of last Tuesday night (5 August). If the timing was intended to deflect attention, it did not succeed. Much of Australia’s privacy and media community has expressed the view that the Report veers more into “productivity” and “capabilities” than it does “trust” and “guardrails”. Other industry members support the pragmatic approach taken by the Commission in areas like AI, which stands in contrast to more interventionist players.
Do you have a view? If so, you are encouraged to submit a response by 15 September 2025. For the privacy professionals reading this blog entry, the following recommendations will be most relevant to you:
1. The Privacy Act should be amended to introduce an alternative compliance pathway that enables firms to fulfil their privacy obligations by meeting outcomes-based criteria.
As most Australian privacy professionals know, the Privacy Act has been slated for amendment for at least seven years. In September 2023, the Government published its response to the Privacy Act Review Report (the Government Response), agreeing to many of the 116 proposals intended to strengthen and modernise Australian privacy law. A small number (the “Tranche 1” amendments) were incorporated into the Privacy Act in December 2024, but the majority (the “Tranche 2” amendments) have not yet been legislated.
Much like the Government Response, the Report supports changing Australian privacy rules – but takes a different tack. According to the Productivity Commission, the Privacy Act offers some flexibility but many of its requirements focus too much on specific controls. “Control-based” obligations include those which mandate formal notice when personal information is collected, or which restrict the circumstances in which such personal information is used and disclosed (including by requiring consent before some secondary uses or disclosures).
To avoid overreliance on notice and consent – which may not actually lead to greater privacy protections for individuals – the Productivity Commission recommends incorporating an outcomes-based approach, rather than a “tick box” exercise where businesses “comply with the letter of the law but not the spirit of it”. In the Commission’s view, this would include inserting a more general privacy obligation for organisations. This could include an obligation to act in the “best interests” of an individual in respect of their privacy, or a “duty of care” to prevent reasonably foreseeable harm to their privacy. The obligation could either serve as a defence to non-compliance with the existing Australian Privacy Principles (APPs) or it could be the general rule that applies, with compliance with the APPs serving as a “safe harbour”.
Some thoughts immediately come to mind:
- While we agree that certain APPs are control-based and may create notice and consent fatigue, other APPs are clearly outcome-based. For example, APP 10 imposes a duty to take “reasonable steps” to ensure personal information is accurate, up-to-date and complete, while APP 11 also requires “reasonable steps” to protect personal information from misuse. Many of our clients already complain that these APPs are unclear and difficult to follow without prescriptive guidance. We are unsure that the “alternative path” would present a viable compliance approach for such organisations, or how it would interact with existing outcomes-based obligations.
- The alternative compliance pathway resembles Article 6(1)(f) of the EU GDPR, which allows the processing of personal data where necessary for the “legitimate interests” pursued by an organisation, provided that such legitimate interests are not overridden by an individual’s rights and freedoms. While Article 6(1)(f) is drafted flexibly, most European data protection authorities strongly recommend organisations conduct and formally document the “balancing test” between their legitimate interests and an individual’s rights. Even if the proposed duty is intended to operate flexibly, in practice, Australian organisations would also need to document their compliance, which could lead to the same “excessive regulatory burden” that the pathway is intended to reduce.
- The Commission does not consider the current Privacy Act regime to be “scalable” and believes that it “does not adapt according to risks posed by…the size of different regulated entities”. In fact, the Privacy Act already exempts small businesses from compliance. Where small to medium-sized enterprises are caught by the Act, their reduced access to privacy resources could make compliance with an open-ended obligation more challenging than a guided regime.
All of that being said, there may be a place for such a duty or obligation in the context of emerging tech – please see What about AI? below.
2. Do not implement a right to erasure.
The Productivity Commission was critical of the Tranche 2 reforms, but particularly the proposed right to erasure. As proposed in the Government Response:
- An individual may seek to exercise the right to erasure for any of their personal information.
- An organisation that has collected the information from a third party, or disclosed the information to a third party, must inform the individual about the third party and notify the third party of the erasure request, unless doing so is impossible or involves disproportionate effort.
In the Commission’s view, this right is difficult to implement. Any contemplated reforms should consider both “the benefits to individuals, as well as the costs to regulated entities and the resultant effect on innovation and productivity” – and this was an example of where the costs outweigh the benefits.
A couple of points to note:
- Other than a brief reference to “general exemptions”, the Report does not acknowledge that the right to erasure would not be absolute. Like all other privacy rights, it would be subject to exemptions, such as where there are competing public interests or where compliance with erasure would be inconsistent with law or contract. Rather than recommending a blanket exclusion, another option might be to identify the specific pitfalls of the GDPR’s “right to be forgotten” – which the Report criticises at a very high level – and to propose tailored and effective exemptions that address these issues.
- Even if a total right to erasure is not the answer for Australia, compromise options are available. We only need to look to South Korea, where the Personal Information Protection Commission has committed, as of 2023, to assist minors to have their personal information removed or hidden from the internet where it has been uploaded by parents and friends without their consent.
3. Pause Tranche 2 changes.
Although the Report does not go so far as to make this recommendation, the Productivity Commission is not convinced that the “Tranche 2” amendments to the Privacy Act are necessary or that the cost-benefit analysis for introducing them has been properly carried out. Even though certain Tranche 2 amendments (such as the “fair and reasonable” test for data handling and mandatory privacy impact assessments) seem to achieve the Report’s aim of “[putting] the onus onto businesses instead of holding consumers accountable…for their best interests”, the Report makes the interesting point that these changes impose “additional requirements” on Australian businesses, rather than creating alternative requirements. We have raised this same point before, specifically in relation to the “fair and reasonable” test, which carries the burden of the legitimate interests balancing test without offering businesses a corresponding right to use personal information.
What about AI?
While the Report does not explicitly discuss AI in the context of privacy, the Commission seeks feedback on a proposal to amend the Copyright Act to include a fair dealing exception for text and data mining. Just like the “alternative pathway” for privacy compliance, this approach to AI is intended to facilitate innovation and development, while keeping companies’ obligations flexible.
Although it has its limitations, a duty to act in the “best interests” of an individual in respect of their privacy might play an important role in the context of emerging tech. For example, where an individual’s personal information is used to train a generative AI model, many of the traditional data rights – such as access, correction and the proposed right of erasure – are difficult for individuals to exercise, especially when their personal information cannot be easily extracted from an underlying model (and where such information is unlikely to be critical to the model’s operation anyway). In such circumstances, a proactive commitment by a deployer to use personal information safely and in the individual’s best interests could provide more meaningful protection.
If any of this is important to you or your business then please get in touch – we would love to hear your thoughts or even support you in drafting a response to the Report before 15 September.