After ten weeks of breaking IoT devices, we shift perspective to building them securely. We know
these attacks work because developers make mistakes: buffer overflows, hardcoded keys, cleartext protocols.
The Shift Left: Traditional security testing happens at the end of the lifecycle (penetration
testing). That is too late. As a rule of thumb, fixing a bug at design time costs $1, fixing it in
code costs $10, and fixing it in production costs $10,000 (a product recall). We must move security
to the left of the timeline.
Week Learning Outcomes:
- Architect a secure boot chain from ROM to user space.
- Implement a DevSecOps pipeline for embedded C/C++ (SAST/DAST).
- Fuzz protocols using AFL++ to find zero-day crashes.
- Generate an SBOM (Software Bill of Materials) to track supply-chain risks.
- Sign firmware packages using asymmetric cryptography (PKI).
1) The Secure Boot Chain: Foundation of Trust
If you cannot trust the code running on the device, nothing else matters. Secure boot ensures that
only authorized code executes.
[ THE CHAIN OF TRUST ]
(Power On)
|
[ BOOT ROM ] (Immutable, Hardware Root of Trust)
| Verifies Stage 1 Signature (Public Key Hash in Fuses)
v
[ STAGE 1 BOOTLOADER ] (SPL / FSBL)
| Verifies Stage 2 Signature (U-Boot/LittleKernel)
v
[ STAGE 2 BOOTLOADER ] (U-Boot)
| Verifies Kernel Signature (ITB/FIT Image)
v
[ LINUX KERNEL ]
| Verifies Root Filesystem (dm-verity)
v
[ USER SPACE ] (Application)
Key Concepts
Root of Trust (RoT): The component that is implicitly trusted. Usually
Mask ROM + eFuse (OTP).
eFuse (One-Time Programmable): Physical fuses burned during manufacturing. They hold
the hash of the public key. If the firmware signature doesn't verify against this hash, the CPU
halts.
Anti-Rollback: An eFuse counter prevents downgrading to an older, vulnerable
firmware version.
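The anti-rollback check reduces to a single comparison against the fuse counter. A minimal C sketch; the fuse accessor `read_rollback_fuse` is a stub standing in for a real OTP register read on actual silicon:

```c
#include <stdint.h>

/* Stub: on real hardware this reads an OTP fuse register.
 * Here we pretend 5 anti-rollback fuses are already burned. */
static uint32_t read_rollback_fuse(void) {
    return 5;
}

/* Returns 1 if the candidate firmware may boot, 0 if it is a rollback. */
int rollback_check(uint32_t fw_security_version) {
    return fw_security_version >= read_rollback_fuse() ? 1 : 0;
}
```

A bootloader would run this check after signature verification and, on a successful update to a newer version, burn additional fuses so the counter can only move forward.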
The Signing Ceremony
How do we actually sign firmware? We need an offline root CA: the private key stays on an air-gapped machine or inside an HSM and never touches the network.
OpenSSL Signing Workflow
# 1. Generate Private Key (Keep this OFFLINE/HSM!)
openssl genrsa -out private_key.pem 2048
# 2. Extract Public Key (Embed this in Device/Bootloader)
openssl rsa -in private_key.pem -pubout -out public_key.pem
# 3. Hash the Firmware (reference copy only -- step 4 hashes internally)
sha256sum firmware.bin > firmware.hash
# 4. Sign the Firmware (openssl computes the SHA-256 digest, then signs it)
openssl dgst -sha256 -sign private_key.pem -out firmware.sig firmware.bin
# 5. Verify (On Device)
openssl dgst -sha256 -verify public_key.pem -signature firmware.sig firmware.bin
2) Hardware Security Modules (HSM) vs TPM vs SE
Where do you store secrets such as cloud API keys and private signing keys?
| Component | Cost | Use Case | Strengths |
| --- | --- | --- | --- |
| TPM 2.0 | $1-3 | PC / Server / Gateway | Standardized API, rich feature set (PCRs). |
| Secure Element (SE) | $0.50 | IoT endpoints (ATECC608A) | Cheap, ultra-low power, I2C interface. |
| PUF (Physically Unclonable Function) | $$$ | High assurance | Keys derived from silicon physics; practically impossible to clone. |
| TrustZone (TEE) | $0 (IP) | ARM Cortex-A/M | Virtual separation (Secure World vs. Normal World). |
Rule: Never store keys in the filesystem (even if encrypted). Store them in
hardware.
3) The DevSecOps Pipeline
DevSecOps means automating security checks into the CI/CD pipeline: SAST on every commit, DAST and fuzzing on every build. Security scales only if it is automated.
Observation: Within seconds, AFL++ will typically discover that inputs starting with 'M' followed by
more than 50 characters cause a crash, and it will store each crashing input in `out/crashes/`.
XP REWARD: +800 XP (Code Breaker)
Appendix A: Secure C Coding Standards
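As a taste of what such a standard mandates, the most common rule is bounded string handling. A sketch (`copy_name` is an illustrative helper, not taken from any published standard):

```c
#include <stdio.h>
#include <string.h>

/* strcpy(dst, src) writes past 'dst' if 'src' is too long.
 * snprintf() never writes more than the buffer size and always
 * NUL-terminates, so truncation becomes a detectable error. */
int copy_name(char *dst, size_t dst_size, const char *src) {
    int n = snprintf(dst, dst_size, "%s", src);
    return (n >= 0 && (size_t)n < dst_size) ? 0 : -1; /* -1 = truncated */
}
```

Treating truncation as an error, rather than silently proceeding, is the key design choice: a shortened filename or hostname can be as dangerous as an overflow.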
Appendix B: Crypto Implementation Checklist
| Do's | Don'ts |
| --- | --- |
| Use AES-GCM (authenticated encryption). | Don't use AES-ECB (patterns remain visible). |
| Use SHA-256 or SHA-3. | Don't use MD5 or SHA-1 (collisions are practical). |
| Use PBKDF2 / Argon2 for passwords. | Don't use plain hashes for passwords. |
| Use /dev/urandom or a hardware TRNG. | Don't use rand() or time-based seeds. |
| Verify certificates (chain of trust). | Don't set "SkipVerify: true" just to make it work. |
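The RNG rule deserves emphasis: `rand()` output is fully predictable from its seed. A sketch of pulling key material from `/dev/urandom` instead (on modern Linux, the `getrandom(2)` syscall is the preferred interface; this portable file-based version is shown for simplicity):

```c
#include <stdio.h>

/* Fill 'buf' with 'len' bytes from the kernel CSPRNG.
 * Returns 0 on success, -1 on failure -- never fall back to rand(). */
int get_random_bytes(unsigned char *buf, size_t len) {
    FILE *f = fopen("/dev/urandom", "rb");
    if (!f)
        return -1;
    size_t n = fread(buf, 1, len, f);
    fclose(f);
    return n == len ? 0 : -1;
}
```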
Appendix C: Vulnerability Disclosure Policy (VDP)
Every company needs a SECURITY.md at the root of their repo. It tells researchers how to
report bugs without getting sued.
Template: SECURITY.md
# Security Policy
## Supported Versions
| Version | Supported |
| ------- | ------------------ |
| 5.1.x | :white_check_mark: |
| 5.0.x | :x: |
## Reporting a Vulnerability
We take the security of our systems seriously.
If you believe you have found a security vulnerability, please report it to us as described below.
**Please do not report security vulnerabilities through public GitHub issues.**
### Reporting Process
1. Email your findings to **security@example.com**.
2. Include steps to reproduce the issue.
3. We will acknowledge receipt within 48 hours.
4. We will provide an estimated timeline for the fix.
### Safe Harbor
We pledge not to initiate legal action against researchers who test or attempt to penetrate our systems in good faith, as long as they adhere to this policy.