When Connecticut’s amended Data Privacy Act takes effect on July 1, the state will become the fourth in the US to enforce specific legal protections for neural data. It joins Colorado, California, and Montana in classifying information derived from brain activity as sensitive personal information — a category that triggers stricter consent requirements and limits on how companies can collect, store, and share the data.
The four laws differ in meaningful ways. Colorado’s amendment to the Colorado Privacy Act (HB 24-1058), effective since August 2024, is the narrowest: it treats neural data as “biological data” and applies its protections only when the data is used for identification purposes. California’s SB 1223, which took effect in January 2025, amended the California Consumer Privacy Act to cover information generated by measuring central or peripheral nervous system activity, but only when it is not inferred from non-neural sources. Montana’s SB 163, effective since October 2025, takes a different path entirely, amending the state’s Genetic Information Privacy Act rather than a general consumer privacy law, and includes a notable provision requiring law enforcement to obtain a search warrant before accessing neural data.
Connecticut’s approach (SB 1295, signed June 2025) is narrower still in one respect: it covers only central nervous system activity, excluding peripheral signals. That distinction means it applies primarily to data collected from brain-computer interfaces, EEG headsets, and similar devices, rather than to wearables that measure broader nervous system responses.
The legislative momentum extends well beyond these four states. A March 2026 analysis by Morrison Foerster identified active neural data bills in Virginia, Alabama, California (a second measure targeting workplace surveillance), New York, Illinois, and Vermont, each taking a distinct regulatory approach. Virginia’s HB 654 classifies neural data as biometric data under existing consumer privacy law. Alabama’s HB 263 creates a standalone neural data privacy statute. New York’s S9008 would fold neural data into data broker regulations. Illinois’s SB 2994 expands its genetic privacy law to cover neural data, including a private right of action — a provision that would allow individuals to sue directly rather than relying on state enforcement.
This patchwork poses a practical challenge for BCI companies operating across state lines. A device maker selling an EEG headset in all fifty states could face overlapping and sometimes contradictory requirements on what constitutes neural data, when consent is needed, and what enforcement mechanisms apply.
A Stanford Law analysis published on March 30 by Bo Hyoung Lee of the Center for Law and the Biosciences argues that the problem runs deeper than jurisdictional fragmentation. Lee contends that traditional notice-and-consent frameworks are structurally inadequate for neural data because users cannot meaningfully evaluate how raw brain signals might be processed into inferences about their mental states, emotions, or intentions. The paper calls for protections grounded in cognitive liberty and mental integrity rather than conventional data-as-property models, and draws on the UNESCO Recommendation on the Ethics of Neurotechnology adopted in November 2025.
The US is not acting in isolation. That UNESCO Recommendation, adopted by all 194 member states at the General Conference in Samarkand, is the first attempt at a global ethical framework for neurotechnology. It is not legally binding, but it establishes principles around cognitive liberty, mental privacy, and equitable access that are already shaping domestic legislation in multiple countries.
Chile remains the pioneer. In 2021 it became the first nation to amend its constitution to protect neurorights, requiring that technological development respect citizens’ physical and mental integrity and that the law specifically protect brain activity and the information derived from it. The Chilean Supreme Court has since applied these provisions in at least one case involving a consumer neurotechnology company. Brazil’s state of Rio Grande do Sul followed in December 2023, incorporating neurorights into its state constitution, while a federal bill (PEC 29) to amend Brazil’s national constitution is pending in the Senate. Mexico introduced a comprehensive General Law on Neurorights and Neurotechnologies in July 2024, a 92-article proposal that would create a national Commission of Neuroethics and Neurolaw.
In Europe, the EU’s AI Act, which entered into force in August 2024, partially covers neurotechnology by prohibiting AI systems that use subliminal techniques to distort behavior. Its guidelines explicitly reference brain-computer interfaces as a potential vector for such techniques. However, the Act lacks neurospecific risk classifications, and observers note that proposed GDPR revisions may soon classify raw brain signals as high-risk biometric data. France and Germany are separately drafting laws to prohibit mandatory neurotechnology adoption in employment contracts.
At the federal level in the US, progress has been slower. Senators introduced the MIND Act in late 2025, which would direct the FTC to study neural data privacy and recommend national standards, but the legislation has not advanced beyond committee.
For the BCI industry, the stakes are rising alongside the market. The sector closed Q1 2026 with over $960 million raised, and consumer-facing neural devices from companies like Neurable and others are reaching retail shelves. The gap between the pace of commercial deployment and the coherence of the regulatory response continues to widen.