Opening Framing: The Goal Is Not "Secure." The Goal Is "Known."
Throughout CSY102, we've built a systems-security lens: memory and boundaries (Week 5), identity and permissions (Week 3), services and scheduling (Weeks 6-7), networking and exposure (Week 8), evidence and logging (Week 9), updates and supply chains (Week 10), and virtualisation and isolation (Week 11).
Now we end where professionals actually live:
How do you keep a system predictable, defendable, and provably in a state you trust?
If you cannot answer "what should this system look like when healthy," you cannot reliably detect compromise. If you cannot enforce "what should be allowed by default," you are negotiating with chaos.
Security is not a collection of tricks. It is the discipline of keeping systems in a known-good state. This week unifies everything: hardening, baselining, and assurance as the operating model of real defenders.
Hardening reduces the ways a system can be attacked. Baselining defines what "normal" looks like. Assurance is the ability to prove the system remains within that normal.
Mental Model: The Aircraft Maintenance Standard
An aircraft is not made safe by trusting pilots to "be careful." It is made safe by systematic engineering and continuous verification:
- Standardised configurations: every aircraft of a type is configured identically
- Controlled parts and replacements: only approved components, tracked serials
- Scheduled inspection: mandatory checks at defined intervals
- Strict change management: every modification documented and approved
- Evidence of compliance: maintenance logs prove the aircraft's history
The parallel to systems security is exact:
- Hardening is removing unnecessary parts and capabilities — like removing systems an aircraft doesn't need, reducing weight and failure points
- Baselining is the standard configuration — the documented "this is what a healthy aircraft looks like"
- Assurance is the inspection and evidence trail that proves integrity over time — the maintenance logs that let you trust the aircraft
In other words: security is the discipline of narrowing behaviour and measuring drift. You define what should be true, then continuously verify it remains true.
Mental model: defend systems like you maintain aircraft — through standards, verification, and evidence, not through hope and vigilance alone.
1) Hardening: Reduce Attack Surface by Design
Hardening is not "add more security tools." It is systematically making the system harder to misuse, misconfigure, or exploit by removing unnecessary functionality and constraining what remains.
Hardening principles you should internalise:
- Remove what you don't need: fewer services means fewer things to patch, monitor, and secure. If it's not running, it can't be exploited.
- Constrain what must exist: least privilege (minimum permissions), least exposure (minimum network access), least functionality (disable unused features).
- Prefer defaults that fail closed: deny by default, allow explicitly. Firewalls should block unless told to allow, not vice versa.
- Make security controls resilient: protect logs (Week 9), protect update paths (Week 10), protect control planes (Week 11). Attackers target these specifically.
- Assume breach: design so that when (not if) something is compromised, the blast radius is limited.
A hardened system is not necessarily complicated. It is disciplined. Complexity is often the enemy of security — every additional component is another thing that can fail or be exploited.
Key insight: hardening is subtraction, not addition. Security comes from removing possibilities, not adding controls on top of a complex base.
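The "deny by default, allow explicitly" principle can be sketched in a few lines. This is an illustrative toy, not a real policy engine; the service names and allowlist are invented for the example:

```python
# A minimal sketch of a fail-closed policy: anything not explicitly
# approved is denied. Service names here are hypothetical examples.

ALLOWED_SERVICES = {"sshd", "nginx"}  # explicitly approved; nothing else

def is_allowed(service: str) -> bool:
    """Fail closed: permit a service only if it is explicitly listed."""
    return service in ALLOWED_SERVICES

print(is_allowed("nginx"))    # allowed: explicitly listed
print(is_allowed("telnetd"))  # denied: absent from the allowlist
```

Note the design choice: the safe outcome (deny) requires no code path at all. Forgetting to list a service blocks it, rather than silently exposing it.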
2) Baselining: Define "Normal" Before You Need It
Baselining is creating a reference state so you can answer critical questions during an incident:
- Which services should be running? (Week 6)
- Which ports should be listening? (Week 8)
- Which users and permissions should exist? (Week 3)
- Which scheduled tasks are expected? (Week 7)
- Which software versions are approved? (Week 10)
Without a baseline, "unusual" becomes subjective and investigation becomes guesswork. Is this process supposed to be running? Is this port normally open? You need to know what "healthy" looks like to recognise "sick."
Baselines can be:
- Configuration baselines: documented settings, policies, hardening standards (e.g., CIS Benchmarks)
- Behaviour baselines: typical CPU/memory/network patterns — what does "normal Monday morning" look like?
- Integrity baselines: expected file hashes, signed packages, approved container images — what should be present and unchanged?
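The integrity-baseline idea above can be sketched directly: record hashes of known-good files, then later diff the live system against that record. A minimal sketch, assuming you choose which paths matter:

```python
# An integrity baseline: capture SHA-256 hashes of known-good files,
# then compare the current state against that baseline to spot changes.
import hashlib
from pathlib import Path

def hash_file(path: Path) -> str:
    """SHA-256 digest of a file's contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def build_baseline(paths: list[Path]) -> dict[str, str]:
    """Capture the 'known good' state: path -> hash."""
    return {str(p): hash_file(p) for p in paths}

def find_drift(baseline: dict[str, str], paths: list[Path]) -> list[str]:
    """Return paths whose current hash no longer matches the baseline."""
    return [str(p) for p in paths if baseline.get(str(p)) != hash_file(p)]
```

Real tools (package managers with signed manifests, file-integrity monitors) do the same comparison at scale, but the logic is exactly this: expected hash versus actual hash.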
Baselining is not about perfection. It is about observability with context. A baseline gives meaning to your monitoring — it's the "expected" against which you measure the "actual."
Key insight: you cannot detect anomalies without knowing what's normal. Baselines turn raw data into actionable intelligence.
3) Assurance: Prove the System Is What You Expect
Assurance is the ability to demonstrate — not assume — that security properties hold. It's the difference between "we think the system is secure" and "we can prove it."
Assurance mechanisms include:
- Auditing: periodic review of configuration, access controls, and logs. Did anyone change firewall rules? Who accessed sensitive data?
- Integrity checking: file-hash validation, signed software verification. Has anything been modified that shouldn't have been?
- Compliance scanning: automated comparison against baselines. Does the current state match the documented expected state?
- Penetration testing: adversarial tests to validate assumptions. Can an attacker actually exploit what we think is protected?
- Continuous monitoring: real-time detection of drift from baseline. Alert when something changes unexpectedly.
Without assurance, hardening and baselining are intentions, not facts. You might have a hardening checklist, but did anyone actually follow it? You might have a baseline, but has the system drifted from it?
Assurance closes the loop: you define the standard (baseline), you implement it (hardening), and you verify it continuously (assurance).
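A compliance scan, the automated part of that loop, is conceptually just a diff between documented and observed settings. A hedged sketch; the setting names and values below are hypothetical, not drawn from any real benchmark:

```python
# Compare observed settings against a documented baseline and report
# every deviation. Keys and values are illustrative examples only.

BASELINE = {
    "ssh_root_login": "no",
    "firewall_default": "deny",
    "audit_logging": "enabled",
}

def compliance_scan(observed: dict[str, str]) -> list[str]:
    """Return one human-readable finding per setting that drifted."""
    findings = []
    for key, expected in BASELINE.items():
        actual = observed.get(key, "<missing>")
        if actual != expected:
            findings.append(f"{key}: expected {expected!r}, found {actual!r}")
    return findings
```

An empty findings list is the evidence: "we can prove it" rather than "we think so." A missing setting counts as a finding too, so silence in the config cannot pass the scan.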
Key insight: trust, but verify. Assurance is the "verify" part. Without it, security is hope, not engineering.
4) Unit Synthesis: Systems Thinking for Security
CSY102 has built a complete systems-security lens. Let's bring it together:
- Memory and boundaries (Week 5): systems work by separation — isolation is the foundation of all security guarantees
- Identity and privilege (Week 3): access is mediated by claims — who you are determines what you can do
- Services and daemons (Week 6): authority exists in the background — what runs continuously has persistent power
- Scheduling (Week 7): authority can be deferred to the future — time itself is a security surface
- Networking (Week 8): exposure creates attack surface — listening means accepting input from strangers
- Evidence and logging (Week 9): observability is defence — you can't investigate what you didn't record
- Supply chain (Week 10): trust is inherited through update paths — your security depends on everyone upstream
- Virtualisation (Week 11): isolation partitions risk — contain breaches by limiting blast radius
- Hardening/baselining/assurance (Week 12): predictability is security — define normal, enforce it, verify it continuously
You are not memorising attack names or tool commands. You are building a lens through which any system — current or future — can be analysed. Technologies change; these principles endure.
Key insight: security is systems thinking. Every component interacts with every other. Understanding the system as a whole is how you find the weaknesses.
Capstone Activity (Conceptual)
Imagine you are responsible for a small organisation's server that hosts a web application. Your task is not to "make it secure" in one dramatic step. Your task is to produce an assurance argument:
- What should be running (services)?
- What should be reachable (ports/endpoints)?
- Who should exist (users/roles) and what can they do?
- What evidence will prove your answers over time (logs/telemetry)?
- How will updates be handled safely?
- How will you detect and respond to drift?
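One way to make that assurance argument concrete is to write your answers to these questions down as data, then diff reality against them. Every name below (services, ports, users) is a hypothetical example:

```python
# The expected state of the server, written as data rather than prose.
# All values are invented for illustration.

EXPECTED = {
    "services": {"sshd", "nginx"},
    "ports": {22, 443},
    "users": {"root", "deploy", "www-data"},
}

def drift_report(observed: dict[str, set]) -> dict[str, list]:
    """For each category, list anything present that was never approved."""
    return {
        category: sorted(observed.get(category, set()) - approved)
        for category, approved in EXPECTED.items()
    }
```

Anything the report surfaces is, by definition, outside your known bounds and worth investigating.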
A strong defender is not someone who knows every exploit. A strong defender is someone who can keep systems within known bounds and prove it.
Week 12 Synthesis
- Hardening reduces attack surface by removing and constraining.
- Baselining defines what "normal" looks like before incidents happen.
- Assurance is evidence-based confidence over time.
- Security is a process: narrowing behaviour and measuring drift.
- The unit's content becomes a single operating model for defenders.
You now have a systems-level security lens — not just knowledge, but a way to reason.
Real-World Context: Hardening and Compliance in Practice
Hardening and baselining are not academic concepts. They are industry standards:
CIS Benchmarks: The Center for Internet Security publishes detailed hardening guides for every major operating system, cloud platform, and application. These benchmarks specify exactly which settings to configure, which services to disable, and which permissions to restrict. Organisations use CIS Benchmarks as their baseline and scan against them continuously.
NIST Security Frameworks: The National Institute of Standards and Technology provides frameworks (like NIST SP 800-53) that define security controls for federal systems. These frameworks operationalise the concepts we've discussed: access control, audit logging, configuration management, system integrity.
Compliance Requirements: Regulations like PCI-DSS (payment cards), HIPAA (healthcare), and SOC 2 (service organizations) require documented hardening, baselining, and assurance. Auditors ask: "Show me your baseline. Show me your hardening checklist. Prove you're monitoring for drift."
Whether you become a penetration tester, a blue-team analyst, a DevSecOps engineer, a security architect, or simply a developer who builds safer software — this lens matters more than any specific tool. Tools change. Platforms change. Attack techniques evolve. But the ability to read a system — to ask "what can go wrong, and how do I know if it has?" — never becomes obsolete.
Common thread: professional security is systematic, not heroic. Standards, baselines, and continuous verification replace guesswork and intuition.
Guided Lab: Building a Hardening Checklist
This capstone lab brings together concepts from the entire unit. You will create a systematic hardening checklist based on everything you've learned.