The Australian Human Rights Commission has called for urgent reform of Australia’s privacy laws after releasing a world-first study into the potential impacts of neurotechnology.
The Commission’s report – Peace of Mind: Navigating the ethical frontiers of neurotechnology and human rights – released late last month, is the first international deep dive into the benefits of embracing a range of cutting-edge technologies while balancing them against the protection of human rights.
The report sets out key findings and recommendations – particularly robust limits on the use of neurotechnology in the criminal justice system and on children, people with disabilities, the elderly and people in the workplace.
While definitions of neurotechnology – a rapidly evolving field – vary globally, the report describes it as “devices and procedures which can access, monitor, record or manipulate brain data.”
The United Kingdom’s Parliamentary website describes the new technological and medical frontier as “an umbrella term for a wide range of technologies that can read information from the nervous system or stimulate it.”
“This can range from established devices, such as cochlear implants, to neural implants with new applications, such as the potential to translate thoughts to physical actions,” the UK definition says.
“A subset of neurotechnology is brain-computer interfaces which enable direct communication between the brain and external devices.
“Some novel approaches involve emerging technologies that are in early-stage human trials, that have the potential to increase independence for people with neurological diseases or injuries, by improving or restoring communication and mobility.”
Competition for research funding in the field has been fierce internationally in recent years, according to international news reports, with a mix of government agencies, private foundations and for-profit venture capital firms leading the charge – including the world’s richest man, Elon Musk.
Australian Human Rights Commissioner Lorraine Finlay said the report revealed neurotechnology had the capacity to expose sensitive neural data, pose risks to privacy and freedom of thought, and create new forms of discrimination and surveillance.
“A technology capable of decoding brain activity or influencing thought processes is, by its nature, deeply personal and profoundly powerful,” Ms Finlay said.
“We must ensure that technological progress does not come at the expense of our most fundamental rights and freedoms.
“Australia is well placed to lead not only in technical innovation, but also in ethical and rights-respecting neurotechnology. Achieving this will require continued collaboration, investment, and a commitment to ‘human rights by design’.
“Only by putting human dignity at the centre of our digital lives can we unlock the full potential of neurotechnology – while retaining our peace of mind.”
The United Nations has voiced its own concerns about protections for this “burgeoning frontier.”
United Nations Educational, Scientific and Cultural Organisation (UNESCO) bioethics chief Dafna Feinholz was quoted in The Guardian as saying there is no control of neurotechnology.
“We have to inform the people about the risks, the potential benefits, the alternatives, so that people have the possibility to say, ‘I accept, or I don’t accept’.”
As part of its report, the AHRC has recommended:
- Human rights by design: The Commission calls for human rights protections to be embedded at every stage of neurotechnology development, echoing the ‘safety by design’ approach.
- Privacy and consent: The report urges urgent reform of Australia’s privacy laws to explicitly protect neural data. It calls for plain-English privacy policies and meaningful, informed consent for all users.
- Freedom of thought and expression: Neurotechnology must not be used to coerce, manipulate, or punish individuals for their thoughts. The Commission recommends prohibiting neuromarketing for political and consumer purposes, especially targeting children.
- Workplace and consumer protections: The Commission recommends a ban on workplace neurotechnology except for addressing the most serious work health and safety risks in high-risk industries. It also recommends creating a specialist neurotechnology safety agency to protect consumers and establish effective safety standards.
- Impacted groups: The best interests of children, people with disability, and older people must be central to all neurotechnology policy and practice. The Commission calls for child rights impact assessments and stronger safeguards against discrimination and coercion.
- Criminal justice and military use: The report recommends a moratorium on the use of neurotechnology in criminal justice until an inquiry is conducted by the Australian Law Reform Commission, and calls for regular legal reviews of military applications to ensure compliance with international law.