United Kingdom Research and Innovation has awarded £1 million to establish an interdisciplinary research laboratory focused on the philosophy of neurotechnology, funding a collaboration between the Universities of Glasgow and Oxford that will examine the ethical, epistemological, and cognitive implications of brain-computer interfaces and related technologies.
The grant, announced on March 24, will create MindTech Research Lab, a new inter-university facility that pairs philosophers with neuroscientists and quantum physicists. The project is titled “Navigating the Neural Frontier: Embedding Ethics and Epistemology in Neurotech.”
Four philosophers lead the team: J. Adam Carter, Emma Gordon, and Christoph Kelp, all members of the Cogito research group at the University of Glasgow, and Mona Simion at the University of Oxford. The scientific side draws on the Glasgow Centre for Neurotechnology, with neuroscientists and physicists Daniele Faccio, Simon Hanslmayr, Monika Harvey, and Lars Muckli contributing expertise in brain imaging, neural oscillations, and visual neuroscience.
The lab’s research agenda centres on three questions. First, which neurotechnologies are most likely to become feasible within the next five to ten years, and so warrant prioritised scrutiny? Second, how do advances in neural interfaces affect human vulnerability to manipulation and misinformation? Third, what does brain-computer interaction mean for autonomy, authenticity, and moral responsibility?
These are not abstract concerns. As BCI technology moves from laboratory demonstrations toward clinical and eventually consumer applications, regulators and policymakers will need frameworks for governing how neural data is collected, stored, and used. Existing privacy law was not written with direct brain-to-computer communication in mind, and questions about cognitive liberty — whether individuals have a right to mental privacy and freedom from neural manipulation — are moving from philosophical thought experiments to practical policy problems.
The UKRI funding follows a pattern of European institutions taking an early position on neuroethics governance. The European Union’s AI Act, which entered into force in 2024, addresses some aspects of neural data under its biometric data provisions but does not specifically regulate brain-computer interfaces. UNESCO’s 2024 report on neurotechnology and human rights called for international standards, though no binding framework has yet emerged. The UK grant suggests that British research funders see value in laying the philosophical groundwork before the technology outpaces the governance conversation.
MindTech’s interdisciplinary structure is deliberate. The grant embeds philosophers directly within a neurotechnology research centre rather than housing them in a separate department. The intent, according to the project description, is not only to study the implications of neurotechnology but to develop “philosophically-informed neurotech” — technologies designed from the outset with ethical and epistemological constraints built into their architecture.
The Glasgow Centre for Neurotechnology, which provides the scientific infrastructure for the project, has existing programmes in brain imaging and neural signal processing. The addition of a dedicated philosophy and ethics team gives it a formal mechanism for integrating normative questions into technical research, rather than treating ethics as a downstream compliance exercise.
For the BCI industry, the grant is a signal that the governance conversation is advancing in parallel with the technology. Companies developing neural interfaces will increasingly face not just regulatory scrutiny from medical device authorities, but deeper questions about cognitive rights and neural data governance that MindTech and similar initiatives are now being funded to address.