2 April 2026 · 14 min read

CRA Annex I Checklist for Firmware Engineers

Map all 13 CRA Annex I security requirements and 8 vulnerability handling obligations to concrete firmware engineering tasks. A checklist for embedded teams.


Annex I of the Cyber Resilience Act is the core of the regulation. It defines the essential cybersecurity requirements that every product with digital elements must meet before it can carry a CE mark and be placed on the EU market.

The problem for firmware engineers: Annex I is written in regulatory language, not engineering language. It says things like "appropriate level of cybersecurity" and "designed and manufactured to ensure an appropriate level of protection" — which doesn't tell you what to implement.

This post translates every Annex I requirement into concrete firmware engineering tasks. Use it as a checklist to assess your current posture and track remediation work. Each requirement links to deeper coverage in our other posts where applicable.

How Annex I Is Structured

Annex I has two parts:

Part I — Security requirements (13 items): These cover the product's design, development, and delivery. They address what the product itself must do.

Part II — Vulnerability handling requirements (8 items): These cover the manufacturer's processes for managing vulnerabilities after the product is shipped. They address what your organisation must do.

Both parts must be satisfied. A product with excellent security engineering but no vulnerability management process is non-compliant, and vice versa.

Harmonised Standards: EN 18031

CEN/CENELEC has developed harmonised standards relevant to cybersecurity for connected products. The EN 18031 series (EN 18031-1, EN 18031-2, EN 18031-3) was originally developed for the Radio Equipment Directive (RED) under Delegated Regulation 2022/30/EU. The series was finalised in August 2024 and harmonised references were published in the Official Journal of the EU in January 2025.

While EN 18031 was not developed specifically for the CRA, it serves as a foundation upon which CRA-specific harmonised standards are being built. Implementing EN 18031 provides a strong starting baseline for CRA compliance, particularly for connected products that also fall under the RED. CEN/CENELEC is developing additional CRA-specific standards that will provide a presumption of conformity with Annex I requirements once published.

In practice: if you implement EN 18031 and can demonstrate alignment with its requirements, you establish a solid technical baseline for CRA conformity, though the CRA-specific harmonised standards may add additional requirements. The requirements below reflect both the Annex I text and the EN 18031 standards where they provide useful clarification.

Part I: Security Requirements

Requirement 1 — Designed with an appropriate level of cybersecurity

Regulation text (paraphrased): Products must be designed, developed, and produced to ensure an appropriate level of cybersecurity based on the risks.

What this means for firmware:

  • Conduct a risk assessment / threat model for your product before and during development
  • Document security design decisions and their rationale
  • Trace security requirements derived from the threat model to their implementation
  • Build security into the development process from the start, rather than bolting it on after functional development

This is the overarching requirement. All subsequent requirements are specific instances of it.

Requirement 2 — No known exploitable vulnerabilities

Regulation text (paraphrased): Products must be delivered without known exploitable vulnerabilities.

What this means for firmware:

  • Run vulnerability scanning against all third-party components before each release (use your SBOM as input)
  • Triage all CVEs affecting components in your firmware (VEX process)
  • Patch or mitigate all exploitable vulnerabilities before shipping
  • Document triage decisions for CVEs you've assessed as not exploitable (retain VEX records)
  • Include vulnerability scan results in your release documentation

Note: "known" means vulnerabilities already published in CVE databases for the components you ship. You're expected to be monitoring those databases continuously, not just at release time.
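
As a minimal sketch of retaining VEX records, a triage decision can be serialised as a CycloneDX-style VEX document. Python stands in for your tooling here, and the CVE ID and component ref are illustrative, not real findings:

```python
import json

def make_vex_record(cve_id: str, component_ref: str, state: str,
                    justification: str, detail: str) -> dict:
    """Capture one CVE triage decision as a minimal CycloneDX VEX document."""
    return {
        "bomFormat": "CycloneDX",
        "specVersion": "1.5",
        "version": 1,
        "vulnerabilities": [{
            "id": cve_id,
            "analysis": {
                "state": state,                  # "not_affected", "affected", ...
                "justification": justification,  # e.g. "code_not_reachable"
                "detail": detail,
            },
            "affects": [{"ref": component_ref}],
        }],
    }

# Hypothetical triage: a CVE in a bundled stack our build never compiles in
record = make_vex_record("CVE-2023-0001", "pkg:generic/lwip@2.1.3",
                         "not_affected", "code_not_reachable",
                         "PPP support is compiled out of our configuration")
assert json.loads(json.dumps(record))["vulnerabilities"][0]["id"] == "CVE-2023-0001"
```

One such record per triaged CVE, kept alongside the release SBOM, is exactly the audit trail a CAB will ask for.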

Requirement 3 — Integrity protection

Regulation text (paraphrased): Protect the integrity of stored, transmitted, and processed data and software, including firmware.

What this means for firmware:

  • Implement secure boot to verify firmware integrity at every boot
  • Sign all firmware updates and verify signatures before installation
  • Use authenticated encryption or HMAC for data stored in external flash/EEPROM
  • Validate all data received from external interfaces before processing (input validation)
  • Protect configuration data against unauthorised modification
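
The HMAC approach for externally stored data can be sketched as follows. Python here stands in for device code; on an MCU you'd call your crypto library's HMAC with a key pulled from secure storage, never one embedded in the binary:

```python
import hmac, hashlib, secrets

def seal_blob(key: bytes, data: bytes) -> bytes:
    """Append an HMAC-SHA-256 tag so tampering with stored data is detectable."""
    return data + hmac.new(key, data, hashlib.sha256).digest()

def open_blob(key: bytes, blob: bytes) -> bytes:
    """Verify the tag before trusting the payload; raise on any mismatch."""
    data, tag = blob[:-32], blob[-32:]
    expected = hmac.new(key, data, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):   # constant-time comparison
        raise ValueError("integrity check failed")
    return data

key = secrets.token_bytes(32)                    # from secure storage on-device
blob = seal_blob(key, b'{"wifi_ssid": "plant-floor-7"}')
assert open_blob(key, blob) == b'{"wifi_ssid": "plant-floor-7"}'
```

Verify-before-use is the key discipline: the firmware never parses configuration from external flash until the tag has checked out.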

Requirement 4 — Confidentiality protection

Regulation text (paraphrased): Protect the confidentiality of stored, transmitted, and processed data, including personal data and secrets.

What this means for firmware:

  • Encrypt sensitive data at rest (credentials, keys, user data) — use AES-256 or ChaCha20
  • Encrypt data in transit — TLS 1.2+ for TCP, DTLS 1.2+ for UDP/CoAP on constrained devices
  • Protect cryptographic keys in secure storage (hardware keystore, TrustZone, secure element) — never in plaintext flash
  • Implement secure key derivation for session keys (HKDF or similar)
  • Don't log or transmit sensitive data in debug output
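
The HKDF step in the fourth bullet is small enough to show in full. This is RFC 5869 with SHA-256, sketched in Python (the input secret and labels are illustrative; on-device you'd use your crypto library's HKDF if it provides one):

```python
import hmac, hashlib

def hkdf_sha256(ikm: bytes, salt: bytes, info: bytes, length: int) -> bytes:
    """HKDF (RFC 5869) with SHA-256: extract a PRK, then expand to `length` bytes."""
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()           # extract
    okm, block = b"", b""
    for i in range((length + 31) // 32):                         # expand
        block = hmac.new(prk, block + info + bytes([i + 1]),
                         hashlib.sha256).digest()
        okm += block
    return okm[:length]

# Derive a per-session key from a device master secret (values illustrative)
session_key = hkdf_sha256(b"device-master-secret", b"nonce-from-handshake",
                          b"fw-session-v1", 32)
assert len(session_key) == 32
```

Binding the `info` label to a purpose ("fw-session-v1") keeps keys derived for one use from being valid for another.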

Requirement 5 — Minimise data collection and processing

Regulation text (paraphrased): Minimise the processing of data, including personal data, to what is necessary for the intended purpose.

What this means for firmware:

  • Only collect data necessary for the product's function (no telemetry beyond what's needed)
  • Provide users with control over optional data collection
  • Implement data retention limits — don't store data indefinitely
  • Document what data your device collects and why

This requirement aligns with GDPR principles. For firmware teams, it typically means auditing your telemetry and logging to ensure you're not overcollecting.
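
Retention limits are usually a few lines of pruning logic run at boot or on a timer. A sketch (the 30-day window is a hypothetical policy, not a CRA-mandated number):

```python
import time

RETENTION_SECONDS = 30 * 24 * 3600  # hypothetical 30-day retention policy

def prune_old_records(records: list[dict], now: float) -> list[dict]:
    """Drop telemetry records older than the retention window."""
    return [r for r in records if now - r["timestamp"] <= RETENTION_SECONDS]

now = time.time()
records = [
    {"timestamp": now - 3600, "event": "boot"},            # 1 hour old: kept
    {"timestamp": now - 90 * 24 * 3600, "event": "boot"},  # 90 days old: dropped
]
assert len(prune_old_records(records, now)) == 1
```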

Requirement 6 — Minimise attack surface

Regulation text (paraphrased): Minimise the attack surface, including external interfaces.

What this means for firmware:

  • Disable all unused network services and protocols in production builds
  • Disable debug interfaces (JTAG/SWD) in production firmware via OTP fuses or firmware configuration
  • Close or disable unused UART, SPI, I2C, and other peripheral interfaces in software
  • Remove debug logging, test endpoints, and development backdoors from production builds
  • Compile with hardening flags (-fstack-protector-strong, -D_FORTIFY_SOURCE=2, ASLR where supported)
  • Use MPU (Memory Protection Unit) to isolate privileged and unprivileged code where the MCU supports it
  • Minimise the firmware binary — strip unused features, libraries, and drivers

Requirement 7 — Secure default configuration

Regulation text (paraphrased): Products must be delivered with a secure default configuration, including the possibility to reset to the original secure state.

What this means for firmware:

  • No default passwords — either unique-per-device credentials or force user setup on first boot
  • All security features enabled by default (encryption, authentication, secure boot)
  • Unnecessary services disabled by default (don't ship with telnet or HTTP debug server enabled)
  • Factory reset restores the device to a secure state (not to a state with known-default credentials)
  • Configuration changes that weaken security require explicit user action and generate a warning

The "no default passwords" requirement is explicit and non-negotiable. If your product ships with admin/admin or a common default across all units, you're in violation.
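
Unique-per-device credentials are typically generated on the provisioning line and printed on the device label. A sketch using a CSPRNG (alphabet and length are illustrative choices, not a standard):

```python
import secrets, string

ALPHABET = string.ascii_letters + string.digits

def provision_device_password(length: int = 16) -> str:
    """Generate a unique random password at manufacturing time,
    instead of shipping a shared default like admin/admin."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

pw_a, pw_b = provision_device_password(), provision_device_password()
assert pw_a != pw_b and len(pw_a) == 16
```

The essential property is the CSPRNG (`secrets`, not `random`): passwords derived from serial numbers or MAC addresses are guessable and do not satisfy this requirement.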

Requirement 8 — Protection against unauthorised access

Regulation text (paraphrased): Products must be designed to protect against unauthorised access through appropriate control mechanisms, including authentication.

What this means for firmware:

  • All remote access interfaces require authentication
  • Authentication mechanisms are resistant to brute force (rate limiting, account lockout, exponential backoff)
  • Credentials stored on device are hashed/salted (not plaintext)
  • Session tokens have appropriate expiry
  • Privilege separation — different access levels for user vs. admin vs. maintenance operations
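
The brute-force resistance bullet often comes down to a small piece of state per account or interface. A sketch of exponential backoff (base and cap values are illustrative):

```python
class LoginThrottle:
    """Track failed logins and compute an exponentially growing retry delay."""

    def __init__(self, base_delay: float = 1.0, max_delay: float = 300.0):
        self.failures = 0
        self.base_delay = base_delay
        self.max_delay = max_delay

    def record_failure(self) -> float:
        """Return how long the next attempt must wait; doubles per failure."""
        self.failures += 1
        return min(self.base_delay * 2 ** (self.failures - 1), self.max_delay)

    def record_success(self) -> None:
        self.failures = 0

t = LoginThrottle()
delays = [t.record_failure() for _ in range(5)]
assert delays == [1.0, 2.0, 4.0, 8.0, 16.0]
```

The cap matters on embedded targets: unbounded delays or permanent lockouts can themselves become a denial-of-service vector against the legitimate owner.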

Requirement 9 — Availability and resilience

Regulation text (paraphrased): Products must be designed to ensure availability, including resilience against denial-of-service attacks.

What this means for firmware:

  • Network stack handles malformed packets gracefully (doesn't crash or hang)
  • Resource limits enforced — connection limits, message rate limits, buffer size limits
  • Watchdog timer configured to recover from hangs
  • Critical functions remain operational under network stress
  • Stack and heap overflow protection enabled
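
Message rate limits are commonly implemented as a token bucket per connection or interface. A sketch with illustrative numbers (on-device you'd feed it a monotonic tick count rather than wall-clock time):

```python
class TokenBucket:
    """Simple token bucket: refuse work when the message rate exceeds budget."""

    def __init__(self, rate: float, capacity: float):
        self.rate, self.capacity = rate, capacity
        self.tokens, self.last = capacity, 0.0

    def allow(self, now: float) -> bool:
        # Refill proportionally to elapsed time, capped at bucket capacity
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

bucket = TokenBucket(rate=10.0, capacity=5.0)  # 10 msg/s sustained, bursts of 5
results = [bucket.allow(0.0) for _ in range(6)]
assert results == [True] * 5 + [False]
```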

Requirement 10 — Secure communications

Regulation text (paraphrased): Products must ensure secure communication, including encryption of data in transit.

What this means for firmware:

  • TLS 1.2+ (or DTLS 1.2+) for all network communications carrying sensitive data
  • Server certificate verification enabled (no verify=false in production)
  • Strong cipher suites only — disable CBC mode ciphers, prefer AEAD (AES-GCM, ChaCha20-Poly1305)
  • For constrained devices: DTLS 1.2 with PSK or certificate authentication over CoAP
  • Certificate or PSK provisioning during manufacturing (not hardcoded shared secrets)

Libraries: Mbed TLS (part of TrustedFirmware), wolfSSL, and BearSSL are common choices for MCU-based TLS. For Zephyr projects, Mbed TLS is the default. For ESP-IDF, mbedtls is bundled. (See our Zephyr RTOS and FreeRTOS guides for RTOS-specific implementation details.)
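
The "no verify=false in production" point is worth making concrete. Shown here with Python's `ssl` module for brevity; the equivalent in Mbed TLS is `mbedtls_ssl_conf_authmode()` set to `MBEDTLS_SSL_VERIFY_REQUIRED`:

```python
import ssl

def make_client_context() -> ssl.SSLContext:
    """TLS client context with certificate verification on and TLS 1.2 as the floor."""
    ctx = ssl.create_default_context()            # CERT_REQUIRED + hostname checks
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse TLS 1.0/1.1
    return ctx

ctx = make_client_context()
assert ctx.verify_mode == ssl.CERT_REQUIRED and ctx.check_hostname
```

Whatever the stack, the test is the same: a connection to a server with a self-signed or mismatched certificate must fail in the production build.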

Requirement 11 — Logging of security-relevant events

Regulation text (paraphrased): Products must log security-relevant events, where technically feasible.

What this means for firmware:

  • Log authentication attempts (success and failure)
  • Log firmware update events (download, verification, installation, rollback)
  • Log security configuration changes
  • Log detected attacks or anomalies (malformed packets, repeated auth failures)
  • Logs are tamper-protected (integrity-checked or stored in a protected region)
  • For constrained devices where persistent logging isn't feasible: document why and what alternatives you provide (e.g., event counters, syslog forwarding)
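
One cheap way to make logs tamper-evident is to chain an HMAC through the entries, so deleting or editing any record breaks every MAC after it. A sketch (the key would be device-unique and kept in secure storage, not a literal as here):

```python
import hmac, hashlib

def append_entry(key: bytes, log: list[tuple[str, str]], message: str) -> None:
    """Chain each entry's MAC over the previous MAC so edits are detectable."""
    prev = log[-1][1] if log else ""
    mac = hmac.new(key, (prev + message).encode(), hashlib.sha256).hexdigest()
    log.append((message, mac))

def verify_log(key: bytes, log: list[tuple[str, str]]) -> bool:
    """Recompute the chain from the start; any mismatch means tampering."""
    prev = ""
    for message, mac in log:
        expected = hmac.new(key, (prev + message).encode(),
                            hashlib.sha256).hexdigest()
        if not hmac.compare_digest(mac, expected):
            return False
        prev = mac
    return True

key = b"log-integrity-key"  # illustrative; device-unique in practice
log: list[tuple[str, str]] = []
append_entry(key, log, "auth_fail user=admin src=10.0.0.9")
append_entry(key, log, "fw_update start version=2.4.1")
assert verify_log(key, log)
```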

Requirement 12 — Secure deletion of data

Regulation text (paraphrased): Provide the possibility for users to securely remove personal and configuration data from the device.

What this means for firmware:

  • Factory reset function that overwrites (not just marks as deleted) user data, credentials, and configuration
  • Secure erase of cryptographic keys during factory reset
  • Factory reset accessible without requiring authentication (for when credentials are lost)
  • Document the factory reset procedure in user documentation
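
The secure-erase step amounts to overwriting key material rather than merely dropping references to it. Sketched in Python, where zeroisation is best-effort; on-device the equivalent is erasing the flash sectors or keystore slots that hold the keys:

```python
import secrets

def zeroize(buf: bytearray) -> None:
    """Overwrite key material in place before releasing it.
    (Best-effort in Python; on an MCU, erase the backing flash sectors.)"""
    for i in range(len(buf)):
        buf[i] = 0

key = bytearray(secrets.token_bytes(32))
zeroize(key)
assert key == bytearray(32)
```

Destroying the keys is also the fast path for encrypted user data: once the data-at-rest key is gone, the remaining ciphertext is unrecoverable without a full flash wipe.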

Requirement 13 — User notification of security issues

Regulation text (paraphrased): Products must be capable of notifying users about security issues and available updates.

What this means for firmware:

  • Mechanism to inform users when a security update is available (LED indicator, companion app notification, web dashboard alert)
  • Users can check the current firmware version easily
  • Security advisories are published in an accessible location (product support page, security bulletin feed)

Part II: Vulnerability Handling Requirements

Part II requirements apply to your organisation's processes, not to the product itself. These must be in place and documented.

VH Requirement 1 — Identify and document vulnerabilities and components

  • Maintain a machine-readable SBOM covering all product components
  • SBOM updated with each firmware release
  • SBOM in SPDX or CycloneDX format with NTIA minimum elements
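
In CI you'd generate this with an SBOM tool, but it helps to see what the NTIA minimum elements look like per component. A hand-built CycloneDX-style sketch (component names, versions, and purls are illustrative):

```python
import json

def make_sbom_component(name: str, version: str, supplier: str, purl: str) -> dict:
    """One SBOM component carrying the per-dependency NTIA minimum elements."""
    return {"type": "library", "name": name, "version": version,
            "supplier": {"name": supplier}, "purl": purl}

sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.5",
    "metadata": {"component": {"type": "firmware",
                               "name": "sensor-fw", "version": "2.4.1"}},
    "components": [
        make_sbom_component("mbedtls", "3.5.1", "Trusted Firmware",
                            "pkg:github/Mbed-TLS/mbedtls@v3.5.1"),
        make_sbom_component("lwip", "2.1.3", "lwIP project",
                            "pkg:generic/lwip@2.1.3"),
    ],
}
assert json.loads(json.dumps(sbom))["specVersion"] == "1.5"
```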

VH Requirement 2 — Address vulnerabilities with security updates

  • OTA update mechanism operational and tested
  • Security updates delivered free of charge
  • Patch timeline: critical/exploited vulnerabilities addressed within days/weeks, not months
  • Update process documented and tested for failure scenarios

VH Requirement 3 — Regular testing and review

  • Regular vulnerability scanning of firmware components (at minimum, per release)
  • Periodic penetration testing (annual for default category; more frequent for Class I/II)
  • Code review process for security-sensitive changes
  • Test results documented and retained

VH Requirement 4 — Public disclosure of fixed vulnerabilities

  • Security advisories published for fixed vulnerabilities
  • Advisories include CVE IDs, affected versions, fixed versions, and mitigation guidance
  • Advisories published at the same time as the security update (not delayed)

VH Requirement 5 — Coordinated vulnerability disclosure policy

  • CVD policy published and accessible (security.txt at /.well-known/security.txt)
  • Monitored security contact (security@yourcompany.com or equivalent)
  • Researcher acknowledgement policy defined
  • Response timeline commitments documented (e.g., acknowledge within 5 business days)
  • See our Article 14 post for the full PSIRT setup
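
The security.txt file itself is tiny; the RFC 9116 fields worth knowing are `Contact` and the mandatory `Expires`. A generator sketch (the contact address and policy URL are placeholders):

```python
from datetime import datetime, timedelta, timezone

def make_security_txt(contact: str, policy_url: str) -> str:
    """Render an RFC 9116 security.txt; Expires is mandatory and must be future-dated."""
    expires = (datetime.now(timezone.utc) +
               timedelta(days=365)).strftime("%Y-%m-%dT%H:%M:%SZ")
    return (f"Contact: mailto:{contact}\n"
            f"Expires: {expires}\n"
            f"Policy: {policy_url}\n"
            "Preferred-Languages: en\n")

txt = make_security_txt("security@example.com",
                        "https://example.com/security/cvd-policy")
assert txt.startswith("Contact: mailto:security@example.com")
```

Because `Expires` is mandatory, put regeneration of this file on a calendar: an expired security.txt signals an unmonitored disclosure channel.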

VH Requirement 6 — Sharing vulnerability information

  • Vulnerability information shared with affected parties when necessary
  • ENISA notification process established for actively exploited vulnerabilities (Article 14)
  • Process for notifying downstream customers when a vulnerability affects their deployment

VH Requirement 7 — Secure distribution of updates

  • Updates cryptographically signed
  • Update distribution infrastructure secured (TLS, access controls)
  • Update integrity verified before installation on device
  • Update distribution documented in technical documentation

VH Requirement 8 — No undue delay in distributing security patches

  • Patch development and release process with defined SLAs
  • No gating of security patches behind feature releases or subscription payments
  • Emergency patch process for critical/exploited vulnerabilities
  • Patch deployment tracking (what percentage of devices have been updated)

Using This Checklist

Priority Order for Firmware Teams

If you're starting from zero, this is the recommended implementation order:

  1. SBOM (VH1) — You need to know what's in your firmware before you can secure it. (SBOM guide)
  2. Secure boot (Req 3) — Foundation for firmware integrity. (Secure boot guide)
  3. OTA updates (VH2, VH7, VH8) — Ability to deliver fixes. (OTA guide)
  4. Vulnerability monitoring (VH1, VH3) — Know when your components have CVEs
  5. CVD policy and PSIRT (VH5, VH6) — Handle external vulnerability reports. (Article 14 guide)
  6. Threat model (Req 1) — Document your security design rationale. (Threat modeling guide)
  7. Secure defaults (Req 7) — No default passwords, services disabled by default
  8. Encryption (Req 4, Req 10) — Data at rest and in transit
  9. Access control (Req 8) — Authentication and authorisation
  10. Everything else (Req 5, 6, 9, 11, 12, 13) — Important but lower priority for initial compliance

Mapping to Product Classification

The checklist applies to all CRA-classified products, but the evidence requirements differ:

| Requirement area | Default (self-declare) | Class I (CAB audit) | Class II (notified body) |
| --- | --- | --- | --- |
| Threat model | Documented, internal review | Reviewed by CAB | Reviewed by notified body |
| Security testing | Self-assessed test results | Third-party test results may be required | Formal penetration testing required |
| SBOM | Generated, on file | Reviewed by CAB | Reviewed by notified body |
| Process documentation | Written, internal | Audited by CAB | Audited by notified body |

See our product classification guide for which tier applies to your product.

Timeline

  • 11 September 2026: Vulnerability handling requirements (Part II) must be in place — specifically Article 14 reporting
  • 11 December 2027: All Annex I requirements (Part I and Part II) must be met for products placed on the market

Start with Part II (vulnerability handling) since that deadline comes first.

Printable Summary

Part I — Security Requirements:

| # | Requirement | Key firmware task |
| --- | --- | --- |
| 1 | Appropriate cybersecurity level | Threat model, security design |
| 2 | No known exploitable vulnerabilities | CVE scanning, VEX triage |
| 3 | Integrity protection | Secure boot, signed updates |
| 4 | Confidentiality | Encryption at rest and in transit |
| 5 | Data minimisation | Audit telemetry, retention limits |
| 6 | Minimise attack surface | Disable debug ports, unused services |
| 7 | Secure defaults | No default passwords, secure out of box |
| 8 | Access control | Authentication, privilege separation |
| 9 | Availability / resilience | DoS protection, watchdog, resource limits |
| 10 | Secure communications | TLS/DTLS, certificate verification |
| 11 | Security event logging | Auth logs, update logs, anomaly detection |
| 12 | Secure data deletion | Factory reset with secure erase |
| 13 | User notification | Update availability notifications |

Part II — Vulnerability Handling:

| # | Requirement | Key organisational task |
| --- | --- | --- |
| 1 | Component identification (SBOM) | Automated SBOM generation in build pipeline |
| 2 | Security update delivery | OTA update mechanism |
| 3 | Regular testing | Vulnerability scanning, penetration testing |
| 4 | Public disclosure | Security advisories with CVE IDs |
| 5 | CVD policy | security.txt, monitored inbox, response SLAs |
| 6 | Information sharing | ENISA reporting, downstream notification |
| 7 | Secure update distribution | Signed updates, TLS delivery |
| 8 | Timely patch delivery | Defined SLAs, no paywall on security patches |

Use the Stack Canary assessment tool to get a personalised assessment of which requirements you've already met and where to focus your remediation effort.


Based on Regulation EU 2024/2847 Annex I Parts I and II, EN 18031 series (finalised 2024, harmonised 2025), ENISA CRA implementation guidance (2025). This does not constitute legal advice.

