A new guide from the Human Rights Law Centre empowers Australian tech employees to speak out against harmful company practices or products.
The guide, Technology-Related Whistleblowing, summarises legally protected avenues for raising concerns about the harmful impacts of technology, as well as practical considerations.
SEE: 'Right to Disconnect' Laws Push Employers to Reassess Tech Use for Work-Life Balance
"We've heard a lot this year about the harmful conduct of tech-enabled businesses, and there is surely more to come out," Alice Dawkins, Reset Tech Australia executive director, said in a statement. Reset Tech Australia co-authored the report.
She continued, "We know it will take time to develop comprehensive protections for Australians from digital harms, so it's particularly urgent to open up the door for public accountability via whistleblowing."
Technology's potential harms are an emerging area of focus in the Australian market.
Australia has seen relatively little tech-related whistleblowing to date. In fact, Kieran Pender, the Human Rights Law Centre's associate legal director, said, "the tech whistleblowing wave hasn't yet made its way to Australia."
However, recent media coverage and new legislation passed by the Australian government have raised questions about the potential harms that technologies and platforms could cause.
Australia’s restrictions on social media for under 16s
A ban on social media for people under the age of 16 has been passed into law in Australia and takes effect in late 2025. The ban, spurred by concerns about the mental health impacts of social media on younger people, will require platforms such as TikTok, Facebook, Instagram, and Reddit to verify users' ages.
A 'digital duty of care' for technology companies
Following a review of its Online Safety Act 2021, Australia is currently legislating a "digital duty of care." The new law would require tech companies to proactively protect Australians from online threats and to better prevent digital harms. It follows similar legislative models in the U.K. and European Union.
Faulty technology in the tax Robodebt scandal
The Australian Taxation Office pursued 470,000 wrongly issued tax debts using automated processes in the form of tax data matching and income-averaging calculations. A full Royal Commission investigation was held into the unlawfulness of the so-called Robodebt scheme.
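The income-averaging flaw lends itself to a short worked example. The sketch below is purely illustrative, with invented payment rates, thresholds, and function names rather than the scheme's actual rules: it shows how spreading a person's annual income evenly across every fortnight can make correctly reported low-income fortnights look like under-reporting, yielding an apparent overpayment.

```python
# Purely illustrative sketch: invented rates and thresholds, not the real scheme's rules.
FORTNIGHTS_PER_YEAR = 26
INCOME_FREE_AREA = 150.0   # hypothetical fortnightly income a recipient may earn with no reduction
TAPER_RATE = 0.5           # hypothetical reduction per dollar earned above the threshold
MAX_PAYMENT = 500.0        # hypothetical maximum fortnightly benefit


def entitlement(fortnightly_income: float) -> float:
    """Benefit payable for one fortnight under the hypothetical rules above."""
    excess = max(0.0, fortnightly_income - INCOME_FREE_AREA)
    return max(0.0, MAX_PAYMENT - TAPER_RATE * excess)


def alleged_debt(annual_income: float, incomes_while_on_benefits: list[float]) -> float:
    """Difference between what was paid (based on income actually reported in each
    fortnight on benefits) and a recalculation that assumes the annual income was
    earned evenly across all 26 fortnights -- the income-averaging step at issue."""
    paid = sum(entitlement(income) for income in incomes_while_on_benefits)
    averaged_income = annual_income / FORTNIGHTS_PER_YEAR
    recalculated = entitlement(averaged_income) * len(incomes_while_on_benefits)
    return max(0.0, paid - recalculated)


# A person works half the year at $1,200 a fortnight, then spends 13 fortnights
# unemployed, correctly reporting $0 income while on benefits. Averaging the
# $15,600 annual income over 26 fortnights makes it look as if $600 was earned
# in every fortnight a payment was made, producing an apparent overpayment.
debt = alleged_debt(annual_income=13 * 1200.0, incomes_while_on_benefits=[0.0] * 13)
print(f"Alleged overpayment under income averaging: ${debt:,.2f}")  # $2,925.00
```

Because a recalculation of this kind ignores the actual fortnight-by-fortnight pattern of earnings, people with seasonal, casual, or otherwise irregular income are the ones most likely to be hit with debts they never owed.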
Use of AI data and its impact on Australian jobs
An Australian Senate Select Committee recently recommended an AI act to govern AI companies. OpenAI, Meta, and Google LLMs could be classified as "high-risk" under the proposed laws.
Concerns include the potential use of copyrighted material in AI model training data without permission, as well as the impact of AI on the livelihoods of creators and other workers. A former OpenAI researcher recently raised similar concerns in the United States.
Consent problems with health data used by AI models
An Australian imaging company reportedly provided medical scans of patients, without their knowledge or consent, to a medical AI start-up, which used the images to train its AI models, according to reports cited in the Technology-Related Whistleblowing Guide.
Images of Australian children used by AI models
According to an analysis by Human Rights Watch, LAION-5B, a data set built by scraping the internet and used to train some well-known AI models, contains links to identifiable images of Australian children. Neither the children nor their parents gave consent for the images to be used.
Payout following the Cambridge Analytica scandal at Facebook
Following allegations that Facebook user data was harvested by an app, made available for disclosure to Cambridge Analytica and others, and potentially used for political profiling, the Office of the Australian Information Commissioner recently agreed to a $50 million settlement with Meta.
Concerns over immigration detainee algorithm
The Technology-Related Whistleblowing Guide also cites reports of an algorithm being used to rate the risk levels of immigration detainees. The algorithm's rating allegedly influenced how detainees were managed, despite questions about the underlying data and the ratings themselves.
Tech employees in Australia have specific whistleblower protections
The guide provides in-depth information on the potential protections available to tech employee whistleblowers. For instance, it explains that in the Australian private sector, various whistleblower laws cover certain "disclosable matters" that make employees eligible for legal protections.
A "disclosable matter" under the Corporations Act arises when there are reasonable grounds to believe the information concerns misconduct or an improper state of affairs or circumstances in an organization.
SEE: Accenture, SAP Leaders on AI Bias Diversity Problems and Solutions
Public sector employees can use public interest disclosure laws when there are significant health, safety, or environmental risks.
According to the guide, "Digital technology concerns are likely to arise in both the public and private sectors, which means there is a chance that your disclosure may be covered by either the private sector whistleblower laws or a PID scheme, depending on the organization your report relates to."
"In most cases, this will be straightforward to determine, but if not, we encourage you to seek legal advice."
Australia: A testing ground for the 'good, bad, and unlawful' in tech
Whistleblower Frances Haugen, the source of the internal Facebook documents that led to The Wall Street Journal's investigation The Facebook Files, wrote a foreword for the Australian guide. She said the Australian government was signaling moves on tech accountability, but its work "remains nascent."
"Australia is, in many respects, a testing ground for many of the world's incumbent tech giants and an incubator for the good, bad, and the unlawful," she wrote in the whistleblowing guide.
SEE: Australia Proposes Mandatory Guardrails for AI
The authors argue in their release that more people than ever in Australia are being exposed to the harm caused by new technologies, digital platforms, and artificial intelligence. However, they noted that, amidst the policy debate, the role of whistleblowers in exposing wrongdoing has been largely disregarded.
Haugen wrote that "the depth, breadth, and pace of new digital risks are rolling out in real-time."
She said, "Timely disclosures will continue to be crucial for obtaining a more accurate picture of the risks and potential harm arising from digital products and services."