Digital Rights Watch Feed Items

Media Release: OAIC Determination on Bunnings

Digital Rights Watch welcomes OAIC landmark determination that Bunnings breached Australians’ privacy with facial recognition

Digital Rights Watch welcomes the determination from the Office of the Australian Information Commissioner today on Bunnings’ use of dangerous and invasive facial surveillance technology. This is a landmark decision, and corporate Australia should take it as a warning about the use of this technology.

Digital Rights Watch is pleased by the outcome of the OAIC investigation. It represents a far-reaching and significant determination on the legality of facial recognition technology in Australia, clearly setting the rules for all businesses and organisations using or considering using the technology.

Facial recognition technology as used by Bunnings collects sensitive biometric information that can uniquely identify you, similar to your fingerprint. The huge public outcry at the time of the CHOICE investigation showed that Australians are deeply concerned about the use of this invasive tech. Our friends at CHOICE should be commended for their groundbreaking investigation and their tireless advocacy to hold Bunnings to account.

Covert use of facial recognition technology in retail settings and in public spaces impinges on our human right to privacy and normalises surveillance. The technology is prone to inaccuracies and bias, with higher rates of false identification for people with darker skin leading to discrimination.

Submission: Privacy and Other Legislation Amendment Bill 2024

Privacy is essential to upholding democracy, reining in corporate power, and building a safe and fair digital future.

The bill has been described as a ‘first tranche’ in the process of reforming the Act. The two central proposals, a statutory tort and the roadmap for a children’s online privacy code, together represent a good first step, but Australia’s privacy legislation remains decades behind other nations. In particular, we note the absence of an updated definition of ‘personal information’ and of a fair and reasonable test, as well as the continuation of exemptions such as those that currently exist for small businesses. Delay in pursuing these reforms leaves gaping holes in Australia’s legal regime for the protection of personal information.

We are past the time for incremental amendments to the Act.

If the Attorney-General’s office intends to introduce these reforms in ‘tranches’, as has been suggested, we expect to see a detailed roadmap and timeline for the introduction of the remaining tranche(s); otherwise we risk the remaining reforms being delayed indefinitely. We concur with many other civil society organisations in calling on the government to implement the remaining reforms within six months of taking office, should they win the next election. We also call on the opposition to make a similar commitment should they win office.

You can read our submission in full below:

Submission: The opportunities and impacts for Australia arising out of the uptake of AI technologies

We need not look to far-future hypothetical scenarios to understand the ways in which AI can cause harm: it is already happening. There is over a decade of case studies from around the world, research, analysis and recommendations to draw from. More than ever before, Australia is in a position to move from identifying problems and toward taking steps to remediate and mitigate them. Digital Rights Watch urges the committee to take this task seriously, and to recognise that there is nothing about AI that is inevitable. The government can—and should—intervene.

Critically, much of the AI hype—both negative and positive—serves the interests of companies who stand to profit the most from widespread adoption of their products in a low-regulation environment. We should not allow our laws and policy to be shaped by AI industry leaders for their own purposes, especially given that those leaders are generally not based in Australia and represent a set of values that does not always translate well to the Australian context.

You can read our full submission below:

Submission: Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024

We have significant concerns about the breadth of powers that the Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024 proposes to grant the Australian Communications and Media Authority (ACMA), with only limited mechanisms for oversight and accountability.

Mis- and disinformation are undoubtedly serious problems. They ought to be understood in the context of advertising-based business models that focus on the extraction of personal information. The widespread amplification of mis- and disinformation is exacerbated by commercial business models that prioritise engagement and ratings above all else, treating users exclusively as consumers rather than citizens.

We welcome efforts to reduce the spread of mis- and disinformation, but these efforts only target the symptoms of the problem rather than the cause.

Our recent submissions relevant to this inquiry include:

Submission: Inquiry into the influence and impacts of social media on Australian society

Human rights must be at the centre of Australia’s approach to tech policy

Protecting, enhancing and upholding human rights is essential to ensuring Australia’s technology policy is robust, fit for purpose, and meaningfully contributes to the wellbeing of individuals and communities, both online and off.

Digital Rights Watch actively participates in public consultations regarding the development of legislation and policy in relation to technology and human rights. We have consistently contributed to the public debate regarding many of the inquiry’s terms of reference, in particular in relation to age verification and the news media bargaining code.

Our recent submissions relevant to this inquiry include:

Media release: Privacy Reform

Digital Rights Watch welcomes privacy reform, but more must be done urgently to bring our laws into the 21st century

Today, the Attorney General tabled a set of privacy reforms, which has been described as a first tranche. The proposed statutory tort and the plan for a children’s code together represent a good first step, but Australia remains decades behind other nations. There is a lot more to do to catch up.

Frankly, it is disappointing that we are here after three years of consultation.

Digital Rights Watch calls on the government to lay out a clear time frame for the remaining 100+ reforms that it has committed to implementing. Taking specific reforms to the election will ensure a mandate to resist the pushback from the powerful vested interests who have always stood in the way of privacy reform.

The Privacy Act has not been meaningfully reformed in over 40 years, and it is urgent that changes be made, including to:

Submission: Statutory Review of the Online Safety Act

The Online Safety Act commenced in January 2022. It provides broad powers to the online safety regulator, the eSafety Commissioner.

In November 2023, the Minister for Communications, the Hon Michelle Rowland MP, announced a statutory review into the operation of the Act.

Digital Rights Watch has actively participated throughout the development of the Online Safety Act and its related parts, such as the Basic Online Safety Expectations and the online safety industry standards.

As always, we emphasise that privacy and digital security are essential to uphold safety.

Our recent submissions relevant to online safety in Australia include:

How Western Australia could lead the nation on privacy

Australians are waiting with anticipation for August 2024, when the Commonwealth Government has promised to deliver a draft bill to update the Privacy Act 1988 (Cth) (Privacy Act). But there’s another bill that’s poised to outshine the Commonwealth’s and champion state privacy rights.

In May 2024, Western Australia (WA) tabled its Privacy and Responsible Information Sharing Bill 2024 (WA) (Bill). We’re delighted by this development, and in this blog post our Board member Piotr Debowski will walk you through what we love about the Bill, what we are less keen on, and what we think the WA government needs to do some more thinking on. We focus our attention on the privacy aspects of the Bill, though the Bill also contains provisions facilitating the sharing of information within the WA government.

The Bill is currently in front of the WA Legislative Assembly. If it passes, it will be handed to the Legislative Council for consideration. 

Submission: Basic Online Safety Expectations Amendment

The Basic Online Safety Expectations (BOSE) outline the Australian Government’s expectations that apps, websites, social media and other services will take reasonable steps to keep Australians safe. Read more about the BOSE on the eSafety website here.

In late 2023 the Department of Infrastructure, Transport, Regional Development, Communications and the Arts announced that it would be amending the BOSE Determination. A range of changes to the original BOSE have been proposed, including expectations related to generative AI and recommender systems. Digital Rights Watch provided feedback, which you can read below or download as a PDF here.

The original BOSE Determination was made in 2022. You can read Digital Rights Watch’s submission in response to the original draft BOSE here.

Public health, children’s rights and privacy organisations deliver open letter to Attorney General calling for bold privacy reform 

The Australian government must act on its commitment to bold reform of Australia’s Privacy Act in order to uphold the safety, wellbeing and autonomy of children, according to an open letter delivered today to Attorney General Mark Dreyfus. The open letter was coordinated by Digital Rights Watch and has been co-signed by 22 organisations across public health, children’s rights, and privacy advocacy. It also has more than 800 signatures from members of the public in support.

The Privacy Act has been the subject of a years-long review process, which has involved extensive community engagement. The latest report from the Attorney General’s department made dozens of recommendations, which were largely accepted by the Australian government in September 2023. The next stage is for a bill to be tabled, but advocates are growing concerned about when this will be prioritised.

Australia’s commitment to privacy rights lags behind similar liberal democracies, posing a particular problem for children given their specific vulnerabilities. There is an urgent need to update Australia’s privacy laws for the twenty-first century. 

The letter highlights the negative impacts of invasive data-driven business models upon children—and indeed everyone—that the Privacy Act currently leaves unchallenged. It warns of the harms caused by endless engagement, targeted online advertising, rampant misinformation, and the normalisation of surveillance as the price for participation in online life.  

Submission: Online Safety Draft Industry Standards

Under the Online Safety Act, the eSafety Commissioner can require industry bodies to draft industry codes to deal with Class 1 and Class 2 material. In 2022, a group of industry bodies commenced drafting industry codes to handle Class 1A and 1B material – this includes Child Sexual Abuse Material (CSAM) and/or Child Sexual Exploitation Material (CSEM), “pro-terror” material, as well as material that deals with crime and violence, and drug-related content.

In June 2023, the eSafety Commissioner registered five of the eight proposed industry codes. Of the remaining three, a sixth code was later registered after amendments to reflect developments in generative AI. The eSafety Commissioner declined to register the final two codes – for ‘Designated Internet Services’ and ‘Relevant Electronic Services’ – on the basis that they did not go far enough to safeguard users in Australia. Because those codes did not meet the eSafety Commissioner’s expectations, the Commissioner then drafted industry standards instead. In November 2023, the eSafety Commissioner opened a 31-day public consultation on the draft industry standards.

Local and international organisations urge Australia’s eSafety Commissioner against requiring the tech industry to scan users’ personal files and messages

Forty organisations from around the world have today delivered a joint letter to Australia’s eSafety Commissioner, calling for protections for privacy, digital security and end-to-end encryption.

The letter was coordinated by Digital Rights Watch, Access Now, and the Global Encryption Coalition Steering Committee, and has been co-signed by organisations including Signal, Mozilla, Proton, the Tor Project, Electronic Frontiers Australia, and more. It was also signed by 560+ supporting members of the public.

The letter is in response to two draft industry standards proposed by the eSafety Commissioner under the Online Safety Act, which are open for public consultation until 21 December. The standards would apply to a broad range of services including email, messaging, and personal file storage, and include a range of proactive detection obligations to detect, remove, disrupt and deter illegal content. However, as there are no safeguards for encryption, the standards would require end-to-end encrypted services to undermine the security and privacy of their users in order to comply. 

Signatories acknowledge the severity of harm caused by the dissemination of illegal content, and recognise the need for regulation to enhance online safety. Contrary to the goal of the standards, what is being proposed will make everyone less safe online. 

2023 Wrap Up: what’s hot, what’s not, what’s coming…

This is our final update for 2023, so here’s a little roundup of the highlights and lowlights of the year, as well as a sneak peek into what’s coming in 2024. But first…

A note from the Digital Rights Watch Chair 

It’s been another big year at Digital Rights Watch HQ (on the internet). Over the course of 2023 we made fourteen submissions to government inquiries, bills and consultations, took part in more than ten roundtables, appeared at three parliamentary hearings, and appeared in the media over 80 times. To me, the critical importance of this work is self-evident. If we don’t fix how our online world is governed, it remains virtually impossible to build functioning community spaces, or a public space to debate difficult problems like climate change, racial injustice and our response to military violence. If we don’t improve our privacy laws, generations of kids will be surveilled by predatory businesses that do not have their best interests at heart. If we don’t get our approach to online safety right, vulnerable people will be pushed further to the margins. I remain hopeful that a rights-based approach gives us the best chance at making good policy that puts the power of tech back in the hands of people. If you agree, please consider supporting our organisation however you can. We have some tough adversaries out there and we welcome your support. – Lizzie O’Shea, Digital Rights Watch Chair

Posting protest photos? Here’s how to protect others’ identities 

Sharing photos of protests is a great way to amplify the impact of collective action, raise awareness on important issues, and encourage more people to participate. We love it! 

But surveillance is on the rise, including the use of facial recognition technology – capturing biometric data from people’s faces. At the same time, the right to protest is under threat in Australia.

Submission: Digital ID Bill 2023 exposure draft

In October 2023 the Digital ID Taskforce (in the Department of Finance) closed a public consultation on the exposure draft of a proposed Digital ID Bill 2023. This follows their previous consultation on a 2021 exposure draft of the Trusted Digital Identity Framework (see our submission for that consultation here).

Following the Optus and Medibank breaches, the idea of a digital identity that enables government bodies and companies to verify people’s identity without each company collecting and holding identity documents has become more popular. Still, it’s not without its privacy and security concerns, and as always, the devil will be in the detail (and implementation).

Digital Rights Watch recognises the potential benefits associated with the establishment of a digital identity system; however, we will continue to advocate for a handful of key components that we believe are fundamental to a robust, fair, trustworthy and successful Digital ID system.

The Digital ID system must:

Submission: Identity Verification Services Bill

In September 2023 the Identity Verification Services Bill 2023 was introduced to Parliament. The Bill was referred to an Inquiry by the Senate Standing Committees on Legal and Constitutional Affairs, and Digital Rights Watch made a submission. The Committee is required to report by the 9th of November 2023.

The bill creates a legislative framework to support the operation of identity verification services already offered by the Commonwealth, which allow government agencies and industry to compare or verify personal information on identity documents against existing government records, such as passports, driver’s licences, and birth certificates.

This includes one-to-one matching services such as the Document Verification Service (DVS) and Facial Verification Service (FVS), which was used over 140 million times in 2022.

Context

This is the ALP’s revamped version of the Coalition’s controversial Identity Matching Services Bill 2019, which was so strongly criticised that it was sent back to the drawing board by the Parliamentary Joint Committee on Intelligence and Security, due to concerns about the lack of privacy protections and its potential to enable mass surveillance.

Campaign win: Australian government will not force sites to implement age verification

Yesterday, the Australian Government released the eSafety Commissioner’s long-awaited roadmap for age verification for online pornography. We are pleased to see that the federal government will not force websites to implement age verification, citing concerns about privacy and the lack of maturity of the technology.

Age verification is rife with privacy and digital security risks, as well as critical effectiveness and implementation issues. We welcome this sensible announcement from the Australian government.

We have been fighting this proposal for close to three years. Over that period, we made eight submissions related to online safety and age verification, advocated in the media, participated in many consultation roundtables and workshops with government and industry, and collaborated with other privacy and security advocates, researchers, and community groups.

This win shows that when we raise the alarm and put pressure on government we can stop harmful and invasive tech policy proposals. We need to keep up the fight to protect human rights, wellbeing and safety.

Submission: Combatting Misinformation and Disinformation Online

In January, the Minister for Communications announced that the Australian Government would introduce new laws to provide the Australian Communications and Media Authority (ACMA) with new powers to combat online misinformation and disinformation. The draft bill was open for public feedback from 20 June to 20 August 2023.

Read the draft bill and the public submissions here.

In our submission, Digital Rights Watch highlights a handful of concerns and areas for improvement, including: