This Privacy Policy explains how Netneuro Ltd ("we", "us", "our") collects, uses, stores, and protects personal data — including special category health data — when you use the Fractal Resonance Model (FRM) platform.
We are committed to protecting your privacy in accordance with the UK General Data Protection Regulation (UK GDPR), the Data Protection Act 2018 (DPA 2018), and applicable guidance from the Information Commissioner's Office (ICO).
Contents
- Who We Are
- What Data We Collect
- Special Category Health Data
- Legal Basis for Processing
- How We Use Your Data
- Data Storage and Security
- Data Retention
- Third Parties and Data Sharing
- AI Processing and Automated Analysis
- Your Rights
- Cookies and Local Storage
- Children's Data
- Changes to This Policy
- Contact and Complaints
- EEG Data Collection Access Controls
- Technician Admission and Credential Verification
- Research Mode and Academic Anonymisation
- International Compliance (HIPAA / LGPD)
Who We Are
The data controller for the FRM platform is:
Data Controller
Netneuro Ltd
University of Parma — Research Division
Email: privacy@netneuro.ai
ICO Registration Number: [pending registration]
If you have questions about how your data is handled, write to our Data Protection contact at the address above.
What Data We Collect
Account and Identity Data
- Full name, email address, role (clinician, technician, patient)
- Password (stored as a bcrypt hash — never in plain text)
- Account status and access logs
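The salted, one-way hashing described above can be illustrated with a short sketch. The platform itself uses bcrypt; this example substitutes Python's standard-library scrypt to show the same pattern (the stored value can verify a password but never reveal it), and the parameter values are illustrative only:

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple:
    """Derive a salted one-way hash; the plain-text password is never stored."""
    salt = os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Re-derive the hash from the supplied password and compare in constant time."""
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, digest)
```

Because the salt is random per account, identical passwords produce different stored digests, and a database leak does not expose the passwords themselves.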
Patient Clinical Data
- Date of birth, age, sex, handedness
- Education level and clinical conditions (e.g. ADHD, anxiety, depression)
- Medication records: name, dose, frequency, start/end dates
- Height, weight, and biometric measurements
- Blood test results and HRV (Heart Rate Variability) data
Neurological and qEEG Data
- Quantitative EEG recordings: band power, alpha peaks, electrode data
- Fractal analysis results: Higuchi FD, Hurst exponent, DFA, entropy
- Resonance metrics: i-APF, i-FAI, CDI, BDWS scores
- sLORETA source localisation outputs
- Simulation results and neurofeedback protocol parameters
Usage and Technical Data
- Login timestamps, session activity, IP address
- Audit log entries for all data access and modifications
- Browser type and device information (collected automatically)
Special Category Health Data
The FRM platform processes special category data as defined by UK GDPR Article 9 and the Data Protection Act 2018 Schedule 1. This includes:
- Health and medical data (qEEG, HRV, blood tests, diagnoses)
- Mental health and neurological condition information
- Medication and treatment records
All special category data access is logged in an immutable audit trail in accordance with ICO guidance on accountability and record-keeping.
Caldicott Principles
Where patient data is involved, we follow the seven Caldicott Principles for handling confidential patient information:
- Justify the purpose(s) for using confidential information
- Do not use confidential information unless it is absolutely necessary
- Use the minimum necessary confidential information
- Access on a strict need-to-know basis
- Everyone with access understands their responsibilities
- Comply with the law
- The duty to share information is as important as the duty to protect it
Legal Basis for Processing
Under UK GDPR Article 6 and Article 9, we rely on the following legal bases:
| Processing Activity | Legal Basis | UK GDPR Article |
|---|---|---|
| Clinician account management | Legitimate interests / contract performance | Art. 6(1)(b)(f) |
| Patient clinical data and qEEG analysis | Explicit consent of the patient | Art. 6(1)(a) + Art. 9(2)(a) |
| Medical diagnosis and treatment purposes | Healthcare provision by a health professional | Art. 9(2)(h) + DPA 2018 Sch.1 |
| Scientific research and analysis | Research purposes with appropriate safeguards | Art. 9(2)(j) + DPA 2018 Sch.1 Para.4 |
| Audit logging and security | Legal obligation and legitimate interests | Art. 6(1)(c)(f) |
| AI-generated clinical reports | Explicit consent + healthcare provision | Art. 9(2)(a)(h) |
How We Use Your Data
We use the data we collect strictly for the following purposes:
- Clinical analysis: running qEEG spectral, fractal, and resonance analyses on patient recordings
- Report generation: producing Data Reports and AI-assisted Neuro Analysis reports for clinicians
- Neurofeedback protocols: generating personalised treatment parameters from the patient's biological frequencies
- Patient monitoring: tracking BDWS and cognitive metrics across sessions over time
- Platform security: authentication, role-based access control, and audit logging
- Research: with appropriate safeguards and anonymisation, improving the Fractal Resonance Model
We do not use patient data for advertising, profiling for commercial purposes, or any purpose incompatible with the original clinical intent.
Data Storage and Security
Where Data Is Stored
All patient and clinical data is stored in a PostgreSQL database hosted on servers located within the United Kingdom or European Economic Area (EEA). No patient data is transferred to countries without an adequacy decision or appropriate safeguards in place.
Encryption in Transit
All communication between any client (collection pole, clinician browser, patient portal) and the FRM platform takes place over HTTPS, using TLS 1.2 as a minimum (TLS 1.3 preferred). HTTPS is enforced at the network layer: plain HTTP connections are rejected before they reach the application, with no fallback to unencrypted transport.
- Confidentiality: EEG signals, patient identifiers, and session metadata travel as ciphertext — intercepted network traffic is unreadable without the server's private key
- Server identity: TLS certificates prove the client is communicating with the genuine FRM server, preventing man-in-the-middle attacks
- Integrity: TLS message authentication codes (MACs) detect any in-transit tampering — corrupted or altered EEG files are rejected before reaching the database
- Audit evidence: TLS session logs provide a cryptographic record that data was transmitted securely, supporting compliance audits and breach investigations
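The "TLS 1.2 minimum" guarantee above can be sketched with Python's standard ssl module. This is an illustrative server-side configuration, not the platform's actual network-layer setup (which sits in front of the application):

```python
import ssl

def make_min_tls12_context() -> ssl.SSLContext:
    """Server-side TLS context that refuses handshakes below TLS 1.2."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # TLS 1.0/1.1 clients are rejected
    # In a real deployment the certificate and key would be loaded here:
    # ctx.load_cert_chain(certfile="server.crt", keyfile="server.key")
    return ctx
```

Any client attempting an older protocol version fails the handshake before a single byte of EEG or patient data is exchanged.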
Encryption at Rest — Two Layers
EEG recordings and all protected health information (PHI) are encrypted at two independent layers:
| Layer | Mechanism | Scope |
|---|---|---|
| Application-level | Fernet (AES-128-CBC + HMAC-SHA256) | Per-field encryption on PHI columns and EEG data blobs |
| Storage-level | AES-256 (database volume) | Entire database volume including backups |
Even if an attacker obtains a direct copy of the database (e.g. a stolen backup), they see only encrypted ciphertext — not names, dates of birth, addresses, or EEG signal data.
Encryption Key Management
- The application encryption key is stored in a dedicated secrets management service — never in source code, configuration files, or version control
- Key access is controlled via least-privilege IAM policies
- The platform fails to start if the encryption key is absent, preventing accidental operation in an unencrypted state
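The fail-fast behaviour in the last bullet can be sketched as follows. The environment-variable name is hypothetical, and in production the key comes from the secrets management service rather than the environment; the point is only that startup aborts when no key is present:

```python
import os

class MissingEncryptionKeyError(RuntimeError):
    """Raised at startup when no application encryption key is configured."""

def load_encryption_key(env_var: str = "FRM_ENCRYPTION_KEY") -> bytes:
    """Fetch the encryption key, refusing to start in an unencrypted state."""
    key = os.environ.get(env_var)
    if not key:
        raise MissingEncryptionKeyError(
            f"{env_var} is not set; refusing to start without encryption"
        )
    return key.encode()
```

Raising at startup, rather than silently writing plaintext, turns a misconfiguration into an immediate, visible failure.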
Authentication and Access Controls
- Authentication: JWT-based access tokens (30-minute expiry) with 7-day refresh tokens; bcrypt password hashing
- Role-based access control (RBAC): Four roles (super_admin, admin, technician, patient) with granular permission control — patients can only access their own records
- Immutable audit trail: Every data access, creation, modification, and deletion is logged with user ID, timestamp, and IP — records cannot be altered or deleted
- Pseudonymisation: Where used for research, patient data is pseudonymised before processing
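The record-level RBAC rule above ("patients can only access their own records") can be sketched as a simple permission check. The role names match the platform's; the permission strings are illustrative, and technician assignment checks are elided:

```python
# Illustrative permission table; real permissions are more granular.
ROLE_PERMISSIONS = {
    "super_admin": {"read_any_record", "write_any_record", "manage_users"},
    "admin":       {"read_any_record", "manage_users"},
    "technician":  {"read_assigned_record", "upload_eeg"},
    "patient":     {"read_own_record"},
}

def can_read_record(role: str, user_id: str, record_owner_id: str) -> bool:
    """Record-level check: patients may only read records they own."""
    perms = ROLE_PERMISSIONS.get(role, set())
    if "read_any_record" in perms:
        return True
    if "read_own_record" in perms:
        return user_id == record_owner_id
    return False
```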
Organisational Safeguards
- Access restricted to authorised clinical staff only on a strict need-to-know basis
- All personnel with access to patient data are bound by confidentiality obligations (see Technician Admission and Credential Verification below)
- Regular security reviews of access permissions and audit logs
Data Retention
We retain personal data only for as long as necessary for the purposes for which it was collected, in line with UK GDPR Article 5(1)(e) and applicable NHS and professional regulatory guidance.
| Data Type | Retention Period | Basis |
|---|---|---|
| Adult patient clinical records and qEEG data | 8 years from last contact | NHS Records Management Code of Practice 2021 |
| Children's clinical records | Until age 25 (or 8 years from last contact if longer) | NHS Records Management Code of Practice 2021 |
| Account and authentication data | Duration of account + 2 years post-closure | Legitimate interests / legal obligation |
| Audit logs | 7 years minimum (immutable, append-only) | Legal obligation / accountability |
| AI-generated reports | Same as the associated clinical record | Healthcare provision |
| Research data (anonymised) | As specified in the research ethics approval | Scientific research basis |
After the applicable retention period, data is securely deleted or anonymised in a manner that makes re-identification impossible.
Third Parties and Data Sharing
We do not sell, rent, or trade personal data. We may share data only in the following limited circumstances:
Clinical Team Members
Patient data is accessible to authorised clinicians and technicians within the same organisation, strictly on a need-to-know basis enforced by role-based access control.
AI Processing (Ollama / Claude)
When AI-assisted reports are generated, de-identified or pseudonymised clinical summaries may be sent to the configured AI provider (self-hosted Ollama or Anthropic Claude API). Where the Claude API is used, data is processed in accordance with Anthropic's Privacy Policy and their data processing agreement. No raw patient identifiers are included in AI prompts.
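The de-identification step described above can be sketched as a filter applied before any prompt is built. Field names here are hypothetical, not the platform's actual schema; the principle is that direct identifiers are dropped and only a pseudonym travels with the clinical metrics:

```python
# Hypothetical direct-identifier fields, stripped before AI processing.
DIRECT_IDENTIFIERS = {"name", "email", "address", "date_of_birth", "nhs_number"}

def deidentify_for_ai(record: dict) -> dict:
    """Remove direct identifiers and attach only a pseudonymous reference."""
    cleaned = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    cleaned.pop("pseudonym", None)  # re-attached under a neutral key below
    cleaned["patient_ref"] = record.get("pseudonym", "ANON")
    return cleaned
```

Only the cleaned dictionary is ever serialised into an AI prompt, so the provider never receives a name, address, or date of birth.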
Legal Requirements
We may disclose personal data where required by law, court order, or regulatory authority, including the ICO, NHS, or law enforcement agencies acting under lawful authority.
Service Providers
Where we use third-party service providers (e.g. hosting infrastructure), they are bound by data processing agreements (DPAs) and are not permitted to use personal data for any other purpose. All providers are selected to ensure data remains within the UK or EEA.
AI Processing and Automated Analysis
The FRM platform uses automated algorithms and, optionally, AI language models to assist clinicians in generating reports. In line with UK GDPR Article 22:
- All automated analysis outputs (BDWS scores, fractal metrics, AI narratives) are decision-support tools only — they do not constitute medical decisions
- Final clinical decisions remain with the qualified healthcare professional
- Patients have the right to request human review of any automated output that has influenced a clinical decision
- AI-generated narratives are clearly labelled as AI-assisted in all reports
Your Rights
Under UK GDPR and the Data Protection Act 2018, you have the following rights regarding your personal data:
Right of Access (Article 15)
You may request a copy of all personal data we hold about you (a Subject Access Request / SAR). We will respond within one calendar month.
Right to Rectification (Article 16)
You may request correction of inaccurate or incomplete personal data.
Right to Erasure (Article 17)
You may request deletion of your personal data where there is no legitimate reason to continue holding it. Note that some data (clinical records, audit logs) may be retained to comply with legal obligations or professional regulatory requirements even after an erasure request.
Right to Restrict Processing (Article 18)
You may request that we restrict how we process your data in certain circumstances (e.g. while accuracy is contested).
Right to Data Portability (Article 20)
Where processing is based on consent or contract, you may request your data in a structured, commonly used, machine-readable format (e.g. JSON or CSV).
Right to Object (Article 21)
You may object to processing based on legitimate interests, including use of your data for research purposes.
Rights Relating to Automated Decision-Making (Article 22)
You have the right not to be subject to solely automated decisions that produce legal or similarly significant effects, and to request human review of automated outputs.
Right to Withdraw Consent
Where processing is based on consent, you may withdraw it at any time. Withdrawal does not affect the lawfulness of processing carried out before withdrawal.
How to exercise your rights
Submit a request to privacy@netneuro.ai. We will respond within one calendar month. We may request proof of identity before actioning certain requests.
Cookies and Local Storage
The FRM platform does not use third-party tracking or advertising cookies. We use the following minimal local storage mechanisms:
- JWT access tokens: stored in memory / secure HttpOnly cookies for authentication
- Session preferences: sessionStorage is used to remember UI state (e.g. last open settings tab) — this is cleared when the browser tab is closed and never sent to our servers
- Language preference: localStorage stores your selected display language — no personal data is included
This landing page and the privacy page do not set any cookies. The main application uses only strictly necessary session storage for authentication and UI state.
Children's Data
The FRM platform may process clinical data relating to children (under 18) when used by authorised healthcare professionals in a clinical context. Such data is treated with the highest level of protection:
- Children's records are retained until the patient's 25th birthday, or 8 years from the last contact (whichever is longer), in line with NHS Records Management Code of Practice 2021
- Explicit consent from a parent or guardian is required before processing a child's health data
- Children are not permitted to create accounts directly on the platform
Changes to This Policy
We may update this Privacy Policy from time to time to reflect changes in the law, our data practices, or platform functionality. When we make material changes:
- The "Last updated" date at the top of this page will be revised
- Registered users will be notified by email of significant changes
- Continued use of the platform after notification constitutes acceptance of the revised policy
We encourage you to review this policy periodically.
Contact and Complaints
For any privacy-related questions, subject access requests, or concerns:
Data Protection Contact
Netneuro Ltd
Email: privacy@netneuro.ai
Subject line: "Privacy Enquiry — FRM Platform"
Right to Lodge a Complaint
If you are not satisfied with how we have handled your personal data, you have the right to lodge a complaint with the Information Commissioner's Office (ICO) — the UK supervisory authority for data protection:
Information Commissioner's Office (ICO)
Website: ico.org.uk
Helpline: 0303 123 1113
Address: Wycliffe House, Water Lane, Wilmslow, Cheshire, SK9 5AF
We would, however, appreciate the opportunity to address your concerns before you approach the ICO, so please contact us in the first instance.
EEG Data Collection Access Controls
Any physical or remote site that collects raw EEG data (a "collection pole" — a clinic, hospital unit, or mobile assessment centre) receives a dedicated platform login scoped exclusively to uploading recordings. This account cannot view patient records, run analyses, generate reports, or access any other patient's data.
Once an EEG file is uploaded, the platform routes it automatically based on the service model in place:
| Service Model | Assignment Logic |
|---|---|
| Private clinic | The patient selects their preferred technician at registration. The uploaded file is assigned directly and exclusively to that technician's queue. No other technician can access it. |
| Government / public health | Files enter a shared institutional queue. Assignment follows the policy set by the governing body — typically first-uploaded, first-analysed. Patient-technician pre-selection is disabled unless explicitly authorised by the institution. |
This architecture enforces a strict separation of roles at the technical level, ensuring collection staff never have access to the clinical data they submit.
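The routing logic in the table above can be sketched as a single dispatch function. Queue structures and parameter names are illustrative, not the platform's actual implementation:

```python
from collections import deque
from typing import Optional

def route_upload(service_model: str, file_id: str,
                 preferred_technician: Optional[str],
                 technician_queues: dict,
                 shared_queue: deque) -> None:
    """Route an uploaded EEG file according to the service model in place."""
    if service_model == "private_clinic":
        # Patient pre-selected a technician at registration: exclusive assignment.
        technician_queues[preferred_technician].append(file_id)
    elif service_model == "public_health":
        # Shared institutional queue: first-uploaded, first-analysed.
        shared_queue.append(file_id)
    else:
        raise ValueError(f"unknown service model: {service_model}")
```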
Technician Admission and Credential Verification
Technicians are not granted platform access by self-registration alone. Every technician account must pass two verification gates before activation:
Gate 1 — Electronic NDA
Before any account is activated, the applicant must read and electronically sign a Non-Disclosure Agreement within the platform. The NDA covers:
- Confidentiality of all patient data accessed through the platform
- Prohibition on exporting, copying, or sharing raw EEG data or clinical reports outside authorised workflows
- Obligation to report any suspected data breach or unauthorised access immediately
- Consequences of breach, including account termination and potential legal liability
The signed NDA document, timestamp, and the signer's IP address are permanently recorded in the audit log. The account remains locked until both gates pass.
Gate 2 — CPD Credential Verification
The applicant must hold a valid Continuing Professional Development (CPD) credential in neuromodulation, issued by a recognised professional body. The platform:
- Collects the credential number, issuing body, and expiry date
- Verifies the credential against the issuer's registry (via API where available; manual review with documented approval for bodies without an API)
- Records the verification outcome, verifier identity, and date in the audit log
- Monitors expiry — when a CPD credential expires, the technician account is automatically suspended until a renewed credential is submitted and re-verified
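The expiry monitoring in the last bullet reduces to a date comparison. A minimal sketch, using the state names from the lifecycle table:

```python
from datetime import date

def cpd_account_state(cpd_expiry: date, today: date) -> str:
    """Auto-suspend once the CPD credential lapses; renewal re-activates."""
    return "active" if today <= cpd_expiry else "suspended_cpd_expired"
```

A scheduled job running this check daily is enough to guarantee no technician operates on an expired credential.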
Account State Lifecycle
| State | Description |
|---|---|
| pending_nda | Account created; awaiting NDA signature |
| pending_cpd | NDA signed; awaiting CPD credential verification |
| active | Both gates passed; full technician access granted |
| suspended_cpd_expired | CPD credential expired; account suspended pending renewal |
| suspended_admin | Manually suspended by clinic administrator |
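The lifecycle above implies a fixed set of legal transitions; everything else is rejected. The transition set below is a sketch consistent with the table, not a definitive specification:

```python
# Allowed transitions between the account states in the table above.
ALLOWED_TRANSITIONS = {
    "pending_nda": {"pending_cpd"},
    "pending_cpd": {"active"},
    "active": {"suspended_cpd_expired", "suspended_admin"},
    "suspended_cpd_expired": {"active"},   # after re-verified credential
    "suspended_admin": {"active"},         # after administrator reinstatement
}

def transition(state: str, new_state: str) -> str:
    """Apply a state change, rejecting any move not in the lifecycle."""
    if new_state not in ALLOWED_TRANSITIONS.get(state, set()):
        raise ValueError(f"illegal transition: {state} -> {new_state}")
    return new_state
```

Note that no path skips a gate: an account cannot go from pending_nda straight to active.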
Research Mode and Academic Anonymisation
Any clinic or institution registered on the platform can be designated as a Research entity. When Research Mode is active, patient anonymisation is enforced structurally — not as a policy recommendation, but as a technical constraint built into the platform.
How It Works
When a new patient is added under a Research-mode entity, the platform:
- Generates a random, opaque identifier (e.g. P-7F3A2C) in place of the patient's name
- Discards any real name supplied in the registration form — it is never written to the database
- Locks the identifier permanently — no user, including clinic administrators and platform super-admins, can change or reveal a real identity through the platform interface or API
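Generating such an identifier is straightforward with a cryptographic random source. A minimal sketch matching the P-7F3A2C format shown above (the exact format the platform uses is assumed here):

```python
import secrets

def opaque_patient_id() -> str:
    """Generate a random, non-reversible identifier like P-7F3A2C."""
    return "P-" + secrets.token_hex(3).upper()  # 3 random bytes -> 6 hex chars
```

Because the code is drawn from a cryptographic generator and never derived from the patient's details, it carries no information that could be reversed to an identity.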
Why This Matters for Research Integrity
- Ethics board compliance: Most IRB and ethics committee approvals for human-subject research require that data collectors cannot link records to individuals. Research Mode satisfies this requirement at the system level, providing a verifiable technical control rather than a procedural promise
- Audit trail without exposure: All platform audit logs record actions against the anonymised code — not against a name. PHI is structurally absent, so there is no PHI to expose in the event of a breach
- Academic publishing readiness: Datasets exported from a Research-mode entity contain only anonymised codes, EEG metrics, and aggregated clinical indicators — ready for submission to journals or data repositories without further de-identification work
International Compliance (HIPAA / LGPD)
In addition to UK GDPR and DPA 2018, the FRM platform is designed to support compliance with international healthcare data regulations for institutions operating in the United States and Brazil:
| Regulation | Requirement | FRM Implementation |
|---|---|---|
| HIPAA §164.312(a) | Access controls | RBAC with 4 roles; upload-only accounts for collection poles |
| HIPAA §164.312(e)(1) | Transmission security | HTTPS enforced at network layer; no HTTP fallback |
| HIPAA §164.312(a)(2)(iv) | Encryption at rest | AES-128 application-layer encryption + AES-256 storage volume encryption |
| LGPD Art. 46 | Security measures | Equivalent coverage to HIPAA via the same technical controls |
| LGPD Art. 37 | Data processing records | Immutable audit log for all PHI access, NDA signatures, and CPD verifications |
| IRB / Ethics boards | Structural anonymisation | Research Mode enforces opaque identifiers at creation; real names never stored |
HIPAA Business Associate Agreement
Where the platform is hosted on cloud infrastructure, the cloud provider signs a HIPAA Business Associate Agreement (BAA) with the platform operator, ensuring that every service used to store or process PHI is covered by the BAA and eligible for PHI under HIPAA.