CSY204 Week 10 Intermediate


Security Operations


Opening Framing

Every forensic artifact has a timestamp. File creation, registry modification, log entry, network connection—each event is anchored in time. Individually, these timestamps are data points. Combined into a unified timeline, they become a narrative that reconstructs exactly what happened, when, and in what order.

Timeline analysis is the synthesis phase of digital forensics. After collecting artifacts from file systems, memory, network captures, and logs, timeline analysis weaves them together into a coherent story. A successful login at 09:30 followed by file access at 09:31, data staging at 09:45, and exfiltration at 10:15 transforms from isolated events into a documented attack chain.

This week covers timestamp types and their forensic meaning, super timeline creation with Plaso, timeline filtering and analysis techniques, correlation across artifact sources, and visualization approaches. You'll learn to build comprehensive timelines that tell the complete story of an incident.

Key insight: Timestamps don't lie, but they can be manipulated. Understanding timestamp sources and their reliability is essential for accurate timeline construction.

1) Understanding Timestamps

Different artifacts record different types of timestamps, each with distinct forensic meaning:

MACB Timestamps (File Systems):

M - Modified:   Content of file was changed
A - Accessed:   File was read or executed
C - Changed:    Metadata was modified (permissions, owner)
B - Born:       File was created (not all file systems)

NTFS Timestamps (Windows):
┌─────────────────────────────────────────────────────────────┐
│ $STANDARD_INFORMATION ($SI):                                │
│ - Modified, Accessed, Changed, Birth                        │
│ - Updated by Windows and applications                       │
│ - CAN be modified by user-space tools (timestomping)        │
├─────────────────────────────────────────────────────────────┤
│ $FILE_NAME ($FN):                                           │
│ - Same four timestamps                                      │
│ - Updated only by Windows kernel                            │
│ - Much HARDER to modify (requires raw disk access)          │
│ - More reliable for forensics                               │
└─────────────────────────────────────────────────────────────┘

ext4 Timestamps (Linux):
- atime: Access time
- mtime: Modification time
- ctime: Change time (metadata)
- crtime: Creation time (ext4 only)

Note: Linux atime behavior varies:
- noatime: Never updated (common for performance)
- relatime: Updated if older than mtime
- strictatime: Always updated
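These POSIX timestamps can be read directly with Python's standard library. A minimal sketch (the throwaway temp file exists only for demonstration; note that `st_ctime` is metadata-change time on Linux but creation time on Windows):

```python
import os
import tempfile
from datetime import datetime, timezone

def macb_times(path: str) -> dict:
    """Return a file's POSIX timestamps as ISO 8601 UTC strings."""
    st = os.stat(path)
    to_iso = lambda ts: datetime.fromtimestamp(ts, tz=timezone.utc).isoformat()
    return {
        "atime": to_iso(st.st_atime),   # last access (subject to mount options)
        "mtime": to_iso(st.st_mtime),   # last content modification
        "ctime": to_iso(st.st_ctime),   # last metadata change on Linux
    }

# Example: inspect a throwaway file
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"evidence")
    name = f.name
print(macb_times(name))
os.remove(name)
```

On a `noatime` or `relatime` mount, the `atime` value returned here may be stale, which is exactly why the note above warns against over-reading access times.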

Timestamp Forensic Meaning:

What Each Timestamp Indicates:

Modified (M):
- File content was written
- Document was edited
- Executable was compiled
- Data was appended

Accessed (A):
- File was opened for reading
- Executable was run
- Directory was listed
- Preview was generated
Note: Often unreliable due to atime policies

Changed (C) / MFT Modified:
- Permissions changed
- Owner changed
- File renamed
- File moved (same volume)
- Attributes modified

Birth/Created (B):
- File first appeared on this volume
- Copy creates new birth time
- Move (same volume) preserves birth time

Forensic Interpretations:
┌─────────────────────────────────────────────────────────────┐
│ Scenario                    │ Timestamp Pattern             │
├─────────────────────────────┼───────────────────────────────┤
│ File copied to system       │ B = copy time, M = original   │
│ File created locally        │ B ≈ M                         │
│ File downloaded             │ B = download time             │
│ Timestomping attempt        │ $SI times < $FN times         │
│ File moved (same volume)    │ B preserved, C updated        │
│ File executed               │ A updated (if atime enabled)  │
└─────────────────────────────┴───────────────────────────────┘
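The patterns in the table above can be turned into a rough triage helper. A minimal sketch; the two-second tolerance and the labels are illustrative, not a forensic standard:

```python
from datetime import datetime, timedelta

def classify_file(born: datetime, modified: datetime,
                  tolerance: timedelta = timedelta(seconds=2)) -> str:
    """Rough triage of a file's Birth (B) and Modified (M) timestamps."""
    if modified < born - tolerance:
        # Content older than the file's appearance on this volume:
        # consistent with a copy (M carried over from the source file).
        return "copied-to-system"
    if abs(modified - born) <= tolerance:
        # B ≈ M: consistent with local creation.
        return "created-locally"
    return "modified-after-creation"

print(classify_file(datetime(2024, 3, 15, 9, 30), datetime(2023, 1, 1)))
# copied-to-system: the file "appeared" in 2024 but its content dates to 2023
```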

Timestamp Storage Formats:

Common Timestamp Formats:

Unix Epoch (POSIX):
- Seconds since January 1, 1970 00:00:00 UTC
- Example: 1710495045
- 32-bit limit: January 19, 2038

Windows FILETIME:
- 100-nanosecond intervals since January 1, 1601
- Example: 133555818451230000
- 64-bit value

NTFS Timestamp:
- Same as FILETIME
- Stored in MFT entries

FAT Timestamp:
- 2-second resolution
- Date and time in separate fields
- No timezone (assumed local)

ISO 8601:
- Human-readable standard
- Example: 2024-03-15T09:30:45.123Z
- Z indicates UTC

Conversion Examples:

Unix to Human:
$ date -d @1710495045
Fri Mar 15 09:30:45 UTC 2024

Windows FILETIME to Human (Python):
from datetime import datetime, timedelta
ft = 133555818451230000                      # 100-ns intervals since 1601-01-01
dt = datetime(1601, 1, 1) + timedelta(microseconds=ft // 10)
print(dt.isoformat())

Human to Unix:
$ date -d "2024-03-15 09:30:45" +%s
1710495045

Timestomping Detection:

Timestomping - Timestamp Manipulation:

What Attackers Do:
- Modify timestamps to blend in
- Make malware appear old
- Match legitimate system files
- Hide recent activity

Detection Methods:

1. $SI vs $FN Comparison:
   $SI timestamps earlier than $FN timestamps strongly suggest timestomping
   (legitimate operations rarely produce this pattern)

   Normal: $SI Created ≈ $FN Created
   Stomped: $SI Created = 2019, $FN Created = 2024

2. Logical Inconsistencies:
   - Created time after Modified time
   - Birth time in the future
   - Timestamps before OS installation

3. USN Journal:
   - Records actual file activity
   - Harder to manipulate
   - Shows real creation time

4. $LogFile:
   - Transaction journal
   - May contain original timestamps

Tools for Detection:
- MFTECmd (Eric Zimmerman)
- Autopsy timeline
- NTFS Log Tracker
- Timestomp detection scripts

MFTECmd Output:
$ MFTECmd.exe -f $MFT --csv output/
[Compare SI_Created vs FN_Created columns]
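That comparison can be scripted against MFTECmd's CSV output. A minimal sketch, assuming MFTECmd-style `Created0x10` ($SI) and `Created0x30` ($FN) column names and ISO-formatted timestamps that sort lexicographically; verify the column names against your MFTECmd version:

```python
import csv
import io

def flag_timestomping(csv_text: str) -> list:
    """Flag entries whose $SI creation time predates $FN creation time."""
    suspicious = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        si, fn = row["Created0x10"], row["Created0x30"]
        if si and fn and si < fn:   # $SI earlier than $FN: likely stomped
            suspicious.append(row["FileName"])
    return suspicious

# Inline sample standing in for MFTECmd CSV output
sample = """FileName,Created0x10,Created0x30
legit.dll,2024-03-15 09:30:45,2024-03-15 09:30:45
malware.exe,2019-01-01 00:00:00,2024-03-15 09:30:45
"""
print(flag_timestomping(sample))   # malware.exe's $SI claims 2019
```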

Key insight: $FILE_NAME timestamps are the "truth" for NTFS. When $STANDARD_INFORMATION doesn't match, investigate further.

2) Super Timeline Creation

A super timeline combines timestamps from all available sources into a single, unified view of system activity:

Super Timeline Concept:

Traditional Approach:
- Examine file system separately
- Review logs separately
- Check registry separately
- Manual correlation

Super Timeline Approach:
- Parse ALL artifact sources
- Normalize to common format
- Merge into single timeline
- Filter and analyze together

Sources Included:
┌─────────────────────────────────────────────────────────────┐
│ File System        │ MACB times for all files              │
│ Windows Registry   │ Key last write times                  │
│ Event Logs         │ Event timestamps                      │
│ Prefetch           │ Execution times                       │
│ Browser History    │ Visit timestamps                      │
│ LNK Files          │ Access times                          │
│ Jump Lists         │ Recent access                         │
│ USN Journal        │ File changes                          │
│ $LogFile           │ NTFS transactions                     │
│ Shellbags          │ Folder access                         │
│ SRUM               │ Resource usage                        │
│ Amcache            │ Execution                             │
└─────────────────────────────────────────────────────────────┘

Plaso (log2timeline):

Plaso - Primary Super Timeline Tool:

Components:
- log2timeline.py: Extracts timestamps → Plaso storage
- psort.py: Sorts and outputs timeline
- pinfo.py: Storage file information

Basic Workflow:

1. Extract timestamps:
$ log2timeline.py --storage-file case.plaso disk_image.E01

2. Create timeline output:
$ psort.py -o l2tcsv -w timeline.csv case.plaso

3. Filter during output:
$ psort.py -o l2tcsv -w filtered.csv case.plaso "date > '2024-03-15 00:00:00'"

Common Options:

log2timeline.py:
--storage-file FILE    Output Plaso storage file
--parsers LIST         Specific parsers to use
--partitions all       Process all partitions
--volumes all          Process all volumes
-z TIMEZONE            Source timezone
--workers N            Parallel workers

psort.py:
-o FORMAT              Output format (l2tcsv, dynamic, json)
-w FILE                Output file
--analysis PLUGIN      Run analysis plugin
"FILTER"               Filter expression

Plaso Parsers:

Plaso Parser Categories:

File System:
- filestat: File system timestamps
- usnjrnl: USN Journal
- mft: MFT entries

Windows Artifacts:
- winevt/winevtx: Event logs
- winreg: Registry hives
- prefetch: Prefetch files
- lnk: Shortcut files
- recycler: Recycle bin
- chrome_cache: Chrome artifacts

Linux/macOS:
- syslog: Syslog entries
- utmp: Login records
- bash_history: Command history

Applications:
- chrome_history: Chrome browsing
- firefox_history: Firefox browsing
- skype: Skype database
- sqlite: Generic SQLite

Selecting Parsers:
# List available parsers
$ log2timeline.py --parsers list

# Use specific parsers
$ log2timeline.py --parsers "winevtx,prefetch,mft" ...

# Exclude parsers
$ log2timeline.py --parsers "!filestat" ...

# Parser presets
$ log2timeline.py --parsers "win7" ...    # Windows 7 preset
$ log2timeline.py --parsers "linux" ...   # Linux preset

Timeline Output Formats:

L2TCSV Format (Default):
date,time,timezone,MACB,source,sourcetype,type,user,host,short,desc,version,filename,inode,notes,format,extra

Example Entry:
2024-03-15,09:30:45,UTC,MACB,FILE,NTFS MFT,Creation Time,-,HOST,malware.exe,File created,...

Dynamic Format:
datetime,timestamp_desc,source,source_long,message,parser,display_name,...

JSON Lines:
{"datetime": "2024-03-15T09:30:45", "timestamp_desc": "Creation Time", ...}

Output Selection:
$ psort.py -o l2tcsv -w timeline.csv case.plaso    # CSV
$ psort.py -o json_line -w timeline.json case.plaso # JSON
$ psort.py -o dynamic -w timeline.txt case.plaso    # Dynamic

Large Timeline Handling:
- Can produce millions of entries
- Filter during psort.py
- Use specialized viewers
- Consider SQLite output for queries
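One way to make a multi-million-row l2tcsv timeline queryable is to load it into SQLite. A minimal sketch with an illustrative column subset and inline sample data standing in for a real timeline CSV:

```python
import csv
import io
import sqlite3

def load_timeline(csv_text: str) -> sqlite3.Connection:
    """Load an l2tcsv-style timeline into an in-memory SQLite database."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE events (date TEXT, time TEXT, source TEXT, desc TEXT)")
    rows = csv.DictReader(io.StringIO(csv_text))
    con.executemany(
        "INSERT INTO events VALUES (?, ?, ?, ?)",
        ((r["date"], r["time"], r["source"], r["desc"]) for r in rows),
    )
    return con

sample = """date,time,source,desc
2024-03-15,09:30:45,FILE,malware.exe created
2024-03-15,09:31:02,EVT,Process creation 4688
2024-03-16,02:00:00,FILE,archive.zip created
"""
con = load_timeline(sample)
# SQL replaces sequential reading: pull one day, sorted by time
hits = con.execute(
    "SELECT time, desc FROM events WHERE date = '2024-03-15' ORDER BY time"
).fetchall()
print(hits)
```

For a real case, point the same loader at a file on disk and add indexes on `date` and `source` before querying.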

Key insight: Running log2timeline on a full disk image takes hours. Plan accordingly, or use targeted collection to reduce scope for time-sensitive investigations.

3) Timeline Filtering and Analysis

Raw super timelines contain millions of entries. Effective filtering isolates relevant events:

Plaso Filter Expressions:

Time-Based Filters:
"date > '2024-03-15 00:00:00'"
"date < '2024-03-16 00:00:00'"
"date > '2024-03-15 09:00:00' AND date < '2024-03-15 10:00:00'"

Source Filters:
"source is 'WEBHIST'"
"source contains 'EVT'"
"parser is 'prefetch'"

Content Filters:
"message contains 'malware.exe'"
"filename contains 'Users\\admin'"
"message regexp '.*password.*'"

Combined Filters:
"date > '2024-03-15' AND message contains 'powershell'"
"source is 'FILE' AND filename contains '.exe'"

Examples:
# Events in specific timeframe
$ psort.py -w filtered.csv case.plaso \
    "date > '2024-03-15 09:00:00' AND date < '2024-03-15 12:00:00'"

# Only web history
$ psort.py -w webhistory.csv case.plaso "source is 'WEBHIST'"

# Executable activity
$ psort.py -w exes.csv case.plaso "filename contains '.exe'"

Timeline Explorer (Eric Zimmerman):

Timeline Explorer - GUI Analysis:

Features:
- Open large CSV files efficiently
- Column filtering
- Keyword highlighting
- Time range selection
- Export filtered results

Workflow:
1. Open timeline CSV in Timeline Explorer
2. Apply column filters (source type, path, etc.)
3. Navigate to time of interest
4. Use Find to search for keywords
5. Highlight and tag relevant entries
6. Export filtered subset

Effective Filters:
- Filter "Source" to specific artifact types
- Filter "Message" contains filename
- Sort by datetime
- Group by source for overview

Keyboard Shortcuts:
Ctrl+F: Find
Ctrl+G: Go to line
F5: Refresh
Ctrl+E: Export

Command-Line Analysis:

grep/awk for Timeline Analysis:

Basic Searches:
$ grep "malware.exe" timeline.csv
$ grep "2024-03-15,09:3" timeline.csv
$ grep -i "powershell" timeline.csv

Field Extraction:
$ awk -F',' '{print $1","$2","$5","$11}' timeline.csv
# date, time, source, short description

Time Window Extraction:
$ awk -F',' '$1 == "2024-03-15" && $2 >= "09:00:00" && $2 <= "10:00:00"' timeline.csv

Count by Source:
$ awk -F',' '{print $5}' timeline.csv | sort | uniq -c | sort -rn

Find Specific User Activity:
$ grep "\\\\Users\\\\admin" timeline.csv

Executable Files:
$ grep "\.exe" timeline.csv | grep -i "creation time"

Complex Pipeline:
$ grep "2024-03-15" timeline.csv | \
    grep -E "prefetch|evt|file" | \
    awk -F',' '{print $1" "$2" "$5" "$11}' | \
    sort | less

Pivot Point Analysis:

Pivot Point Strategy:

Start with Known Indicator:
1. Malware filename
2. Suspicious IP
3. Alert timestamp
4. User account

Expand Timeline Around Pivot:

Example: Malware detected at 09:30
┌─────────────────────────────────────────────────────────────┐
│ -1 hour   │ Look for: download, creation, delivery         │
│ -30 min   │ Look for: initial access, exploitation         │
│ -5 min    │ Look for: execution, process creation          │
│ PIVOT     │ Malware detection event                        │
│ +5 min    │ Look for: persistence, C2 connection           │
│ +30 min   │ Look for: lateral movement, discovery          │
│ +1 hour   │ Look for: data staging, exfiltration           │
└─────────────────────────────────────────────────────────────┘
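The pivot expansion above reduces to a simple time-window filter. A minimal sketch over (timestamp, description) pairs; the one-hour defaults mirror the diagram and are adjustable:

```python
from datetime import datetime, timedelta

def pivot_window(events, pivot: datetime,
                 before=timedelta(hours=1), after=timedelta(hours=1)):
    """Return events within [pivot - before, pivot + after], sorted by time."""
    lo, hi = pivot - before, pivot + after
    return sorted(e for e in events if lo <= e[0] <= hi)

events = [
    (datetime(2024, 3, 15, 8, 45), "malware.exe downloaded"),
    (datetime(2024, 3, 15, 9, 30), "PIVOT: malware detection"),
    (datetime(2024, 3, 15, 9, 35), "C2 connection"),
    (datetime(2024, 3, 15, 14, 0), "unrelated backup job"),
]
window = pivot_window(events, datetime(2024, 3, 15, 9, 30))
for ts, desc in window:
    print(ts, desc)   # the 14:00 event falls outside the window
```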

Questions at Each Phase:
- What happened immediately before?
- What process caused this?
- What network connections occurred?
- What files were accessed?
- What user was active?

Document Findings:
- Note timestamp for each event
- Record evidence source
- Link events causally
- Build narrative

Key insight: Start with what you know (pivot point), then expand outward in time. Don't try to read millions of entries sequentially.

4) Cross-Source Correlation

The power of timeline analysis comes from correlating events across different artifact sources:

Correlation Patterns:

File Download + Execution:
09:30:00  WEBHIST   http://evil.com/malware.exe visited
09:30:05  FILE      malware.exe created in Downloads
09:30:10  PREFETCH  MALWARE.EXE-ABCD1234.pf created
09:30:10  EVT       Process creation: malware.exe (4688)

Correlation proves: User downloaded and executed malware

Lateral Movement:
09:30:00  EVT       4624 Logon Type 3 from 192.168.1.50
09:30:01  EVT       4672 Special privileges assigned
09:30:05  FILE      psexec.exe created
09:30:06  EVT       7045 Service installed: PSEXESVC
09:30:10  FILE      Output file created

Correlation proves: Remote access followed by PsExec execution

Data Exfiltration:
14:00:00  FILE      Documents accessed (multiple)
14:05:00  FILE      archive.zip created
14:05:30  WEBHIST   Dropbox upload URL
14:06:00  EVT       Large outbound connection

Correlation proves: Files staged and exfiltrated

Artifact Correlation Table:

What Each Source Contributes:

┌───────────────┬────────────────────────────────────────────┐
│ Source        │ Evidence Provided                          │
├───────────────┼────────────────────────────────────────────┤
│ MFT/File      │ File existence, timestamps                 │
│ USN Journal   │ File changes, deletions, renames           │
│ $LogFile      │ Recent MFT changes (hours)                 │
│ Prefetch      │ Program execution (last 8 times)           │
│ Event Logs    │ Authentication, process, service events    │
│ Registry      │ Configuration, persistence, user activity  │
│ Browser       │ URLs visited, downloads, searches          │
│ LNK Files     │ Files accessed, original locations         │
│ Jump Lists    │ Recent files per application               │
│ ShellBags     │ Folders browsed (even deleted)             │
│ SRUM          │ Network usage, application runtime         │
│ Amcache       │ Execution evidence, SHA1 hashes            │
└───────────────┴────────────────────────────────────────────┘

Correlation Example - Proving Execution:

Question: Did user execute malware.exe?

Evidence Sources:
1. Prefetch: MALWARE.EXE-ABCD.pf exists
2. Amcache: Entry for malware.exe with SHA1
3. UserAssist: Run count > 0
4. Event 4688: Process creation logged
5. ShimCache: Entry in AppCompatCache
6. BAM/DAM: Execution entry (Win10+)

Strong case: Multiple sources agree
Weak case: Only one source (could be artifact manipulation)
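Weighing corroboration can be made explicit. A minimal sketch; the three-source threshold and wording are illustrative, not a formal evidentiary standard:

```python
def corroboration(evidence: dict) -> str:
    """Summarize how many independent sources attest to the same conclusion.

    evidence maps source name -> whether that source shows execution.
    """
    hits = [src for src, found in evidence.items() if found]
    if len(hits) >= 3:
        return f"strong ({len(hits)} sources: {', '.join(sorted(hits))})"
    if len(hits) == 1:
        return "weak (single source; consider artifact manipulation)"
    return f"moderate ({len(hits)} sources)" if hits else "none"

print(corroboration({
    "Prefetch": True, "Amcache": True, "UserAssist": True,
    "Event 4688": False, "ShimCache": True, "BAM": False,
}))
```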

Timeline Gaps Analysis:

Identifying Gaps:

Types of Gaps:

1. Time Gaps:
   - No activity for extended periods
   - May indicate log clearing
   - May indicate system powered off
   - May be normal (overnight, weekend)

2. Source Gaps:
   - Expected artifact missing
   - Event logs cleared (Event 1102)
   - Prefetch disabled
   - USN Journal wrapped

3. Logical Gaps:
   - Events that should exist don't
   - Missing prerequisite activities
   - Incomplete attack chain

Investigating Gaps:

Time Gap Questions:
- Was system powered on?
- Were logs cleared?
- Did attacker stop activity?
- Is this normal for user?

Evidence of Anti-Forensics:
- Event 1102: Audit log cleared
- Large USN Journal gaps
- Missing MFT entries
- Timestomping detected
- Prefetch files deleted

Document Gaps:
- Note time range
- Note expected vs. actual
- Hypothesize reason
- Flag uncertainty in report
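Time gaps can be found mechanically before deciding whether they reflect anti-forensics or normal quiet periods. A minimal sketch; the four-hour threshold is illustrative and should be tuned to the environment:

```python
from datetime import datetime, timedelta

def find_gaps(timestamps, threshold=timedelta(hours=4)):
    """Return (start, end) pairs where consecutive events exceed threshold."""
    ts = sorted(timestamps)
    return [(a, b) for a, b in zip(ts, ts[1:]) if b - a > threshold]

events = [
    datetime(2024, 3, 15, 9, 0),
    datetime(2024, 3, 15, 9, 30),
    datetime(2024, 3, 15, 18, 0),   # 8.5-hour silence before this event
    datetime(2024, 3, 15, 18, 5),
]
for start, end in find_gaps(events):
    print(f"gap: {start} -> {end} ({end - start})")
```

Each flagged gap then gets the questions above: powered off, logs cleared, attacker idle, or simply a weekend?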

Building Causality:

Establishing Event Relationships:

Causation vs. Correlation:
- Correlation: Events near in time
- Causation: One event caused another

Proving Causality:

Parent-Child Processes:
Event 4688 shows:
- New Process: cmd.exe
- Creator Process: winword.exe
→ Word launched command prompt (macro?)

File Creation Context:
Timeline shows:
09:30:00 chrome.exe network connection to evil.com
09:30:01 malware.exe created in Downloads
→ Chrome downloaded malware

Temporal Proximity:
Authentication → File Access → Exfiltration
- Seconds apart
- Same user context
- Related file paths
→ Single attack sequence

Building Narrative:
1. List events chronologically
2. Identify causal relationships
3. Fill gaps with hypothesis
4. Verify with additional evidence
5. Document confidence levels

Key insight: Single artifacts suggest; multiple correlated artifacts prove. Strong conclusions require evidence from independent sources.

5) Visualization and Presentation

Effective visualization transforms complex timelines into understandable narratives for technical and non-technical audiences:

Visualization Approaches:

Chronological Timeline:
─────┬─────────┬─────────┬─────────┬─────────┬─────────▶
     │         │         │         │         │
   09:00     09:15     09:30     09:45     10:00
   Initial   Recon     Exploit   Persist   Exfil
   Access

Swimlane Diagram:
─────────────────────────────────────────────────────────▶
User1    │ ●────────●                    ●───●
         │ Login    Logout               Login Logout
─────────────────────────────────────────────────────────
Attacker │          ●────●────●────●────●
         │          Access Move Stage Exfil
─────────────────────────────────────────────────────────
System   │ ●───────────────────────────────●
         │ Boot                            Shutdown
─────────────────────────────────────────────────────────▶

Event Cluster:
    ┌─────────────────────────────────────┐
    │ 09:30:00 - 09:30:15                │
    │ ├─ Malware downloaded               │
    │ ├─ Malware executed                 │
    │ ├─ Service created                  │
    │ └─ C2 connection established        │
    └─────────────────────────────────────┘
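Clusters like the one above can be derived automatically by grouping events separated by less than a chosen gap. A minimal sketch (assumes a non-empty event list; the 30-second gap is illustrative):

```python
from datetime import datetime, timedelta

def cluster_events(events, max_gap=timedelta(seconds=30)):
    """Group (timestamp, desc) events into bursts separated by > max_gap."""
    events = sorted(events)
    clusters, current = [], [events[0]]
    for ev in events[1:]:
        if ev[0] - current[-1][0] <= max_gap:
            current.append(ev)        # continues the current burst
        else:
            clusters.append(current)  # gap exceeded: start a new burst
            current = [ev]
    clusters.append(current)
    return clusters

events = [
    (datetime(2024, 3, 15, 9, 30, 0), "Malware downloaded"),
    (datetime(2024, 3, 15, 9, 30, 5), "Malware executed"),
    (datetime(2024, 3, 15, 9, 30, 12), "Service created"),
    (datetime(2024, 3, 15, 11, 0, 0), "Scheduled backup"),
]
for c in cluster_events(events):
    print(f"{c[0][0]} - {c[-1][0]}: {len(c)} events")
```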

Timesketch:

Timesketch - Collaborative Timeline Analysis:

Features:
- Web-based interface
- Collaborative investigation
- Search and filter
- Tagging and starring
- Sketch sharing
- Analysis plugins

Setup (Docker):
$ git clone https://github.com/google/timesketch.git
$ cd timesketch
$ docker-compose up -d

Import Timeline:
$ timesketch_importer.py --host localhost \
    --username analyst \
    --timeline_name "Case001" \
    timeline.csv

Interface Features:
- Search bar with filters
- Timeline visualization
- Event details panel
- Tags and comments
- Saved searches
- Team collaboration

Use Cases:
- Team investigations
- Long-running cases
- Knowledge retention
- Training and review

Manual Timeline Documentation:

Timeline Table Format:

| # | Date/Time (UTC)     | Source    | Event Description           | Evidence     | Notes           |
|---|---------------------|-----------|------------------------------|--------------|-----------------|
| 1 | 2024-03-15 09:28:00 | Web Log   | Exploit attempt              | access.log   | Initial attack  |
| 2 | 2024-03-15 09:28:05 | Web Log   | Successful exploitation      | access.log   | RCE achieved    |
| 3 | 2024-03-15 09:28:30 | File      | webshell.php created         | MFT          | Persistence     |
| 4 | 2024-03-15 09:30:00 | Auth Log  | root login via webshell      | auth.log     | Priv esc        |
| 5 | 2024-03-15 09:35:00 | File      | /etc/passwd accessed         | MFT          | Recon           |
| 6 | 2024-03-15 10:00:00 | Network   | Outbound to 10.0.0.50:443    | conn.log     | C2 established  |

Include:
- Sequential numbering
- UTC timestamps
- Source artifact
- Event description
- Evidence location
- Analyst notes

Best Practices:
- Use consistent timezone (UTC)
- Note confidence levels
- Link to raw evidence
- Include hypothesis tags

Presentation Techniques:

Audience-Appropriate Presentation:

Technical Audience (IR Team):
- Full timeline detail
- Command-line artifacts
- Hash values and paths
- Raw log excerpts
- Methodology notes

Management/Legal:
- High-level summary
- Key events only
- Impact focused
- Visual timeline
- Business context

Executive Summary:
┌─────────────────────────────────────────────────────────────┐
│ INCIDENT TIMELINE SUMMARY                                   │
├─────────────────────────────────────────────────────────────┤
│ Initial Compromise: March 15, 2024, 09:28 UTC               │
│ Method: Web application exploitation                        │
│ Duration: 4 hours 32 minutes                                │
│ Data Accessed: Customer database (50,000 records)           │
│ Exfiltration: 2.3 GB to external IP                         │
│ Remediation: Completed March 15, 14:00 UTC                  │
└─────────────────────────────────────────────────────────────┘

Supporting Graphics:
- Attack flow diagram
- Timeline visualization  
- Heat map of activity
- Geographic IP mapping

Key insight: The same timeline data needs different presentations for different audiences. Technical detail for responders, business impact for executives.

Real-World Context

Case Study: Ransomware Timeline

A company suffered ransomware encryption at 03:00 on a Saturday. Timeline analysis reconstructed the full attack: initial access occurred 18 days prior via a phishing email (email logs). The beacon first connected to C2 14 days prior (network logs). Credential harvesting occurred 10 days prior (Events 4648, 4624). Domain admin access was achieved 7 days prior (Events 4672, 4728). Reconnaissance and staging happened over the final 5 days (file timestamps, ShellBags). The ransomware deployment at 03:00 targeted backup deletion first, then encryption (Prefetch, file timestamps). The timeline proved the attack was targeted and methodical, not opportunistic.

Case Study: Insider Threat

An employee was suspected of data theft before resignation. Timeline analysis provided compelling evidence: a USB device was connected two weeks before departure (Registry, Event logs). Sensitive folders the user had never previously opened were accessed (ShellBags, LNK files). Large files were copied to USB (USN Journal, file timestamps). Cloud storage sync was observed (browser history, network logs). Activity patterns showed after-hours access (timestamp clustering). The timeline demonstrated deliberate, premeditated theft, with precise dates and times suitable for legal proceedings.

MITRE ATT&CK Alignment:

Timeline Analysis Enables ATT&CK Mapping:

Temporal Analysis Reveals:
- Attack progression through kill chain
- Dwell time calculation
- Attacker working hours (attribution hint)
- Tool deployment sequence

Example Mapping:

09:28  Initial Access (TA0001)
       └─ T1190: Exploit Public-Facing Application
09:30  Execution (TA0002)
       └─ T1059.003: Windows Command Shell
09:35  Persistence (TA0003)
       └─ T1505.003: Web Shell
10:00  Command and Control (TA0011)
       └─ T1071.001: Web Protocols
10:30  Discovery (TA0007)
       └─ T1083: File and Directory Discovery
11:00  Collection (TA0009)
       └─ T1005: Data from Local System
12:00  Exfiltration (TA0010)
       └─ T1041: Exfiltration Over C2 Channel

Dwell Time Metrics:
- Initial access to detection: X days
- Lateral movement duration: Y hours
- Time to exfiltration: Z hours

Timeline analysis transforms scattered artifacts into a coherent attack narrative, enabling accurate scope assessment and complete remediation.

Guided Lab: Super Timeline Investigation

In this lab, you'll create a super timeline from a forensic image and analyze it to reconstruct an incident.

Lab Environment:

  • SIFT Workstation with Plaso installed
  • Practice forensic image
  • Timeline Explorer or text analysis tools
  • Spreadsheet for documentation

Exercise Steps:

  1. Run log2timeline.py against the forensic image
  2. Create filtered timeline CSV with psort.py
  3. Identify a pivot point (known bad indicator)
  4. Filter timeline to timeframe around pivot
  5. Correlate events across multiple sources
  6. Build chronological attack narrative
  7. Create presentation-ready timeline table

Reflection Questions:

  • What sources provided the most valuable timestamps?
  • Were there any timeline gaps or anti-forensics indicators?
  • How confident are you in the causal relationships identified?

Week Outcome Check

By the end of this week, you should be able to:

  • Explain MACB timestamps and their forensic meaning
  • Detect timestomping through $SI vs $FN comparison
  • Create super timelines using Plaso (log2timeline)
  • Filter and search large timelines effectively
  • Apply pivot point analysis methodology
  • Correlate events across multiple artifact sources
  • Identify timeline gaps and anti-forensics indicators
  • Present timelines for technical and non-technical audiences

🎯 Hands-On Labs (Free & Essential)

Practice timeline building before moving to reading resources.

🎮 TryHackMe: Log Analysis

What you'll do: Parse logs and build event sequences from raw data.
Why it matters: Timeline work starts with clean, structured events.
Time estimate: 1.5-2 hours

Start TryHackMe Log Analysis →

🎮 TryHackMe: DFIR

What you'll do: Correlate evidence and build a narrative from artifacts.
Why it matters: DFIR scenarios stress timeline reasoning.
Time estimate: 2-3 hours

Start TryHackMe DFIR →

📝 Lab Exercise: Super Timeline Build

Task: Build a Plaso timeline and identify a pivot point.
Deliverable: Filtered timeline CSV with 10-15 key events.
Why it matters: Super timelines reveal causality across artifacts.
Time estimate: 90-120 minutes

🛡️ Lab: Ransomware Tabletop Exercise

What you'll do: Run a tabletop IR exercise with a ransomware scenario.
Deliverable: Decision log + containment and communication plan.
Why it matters: IR success depends on coordinated decisions under pressure.
Time estimate: 60-90 minutes

💡 Lab Tip: Normalize all timestamps to UTC before correlating events.
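Normalizing to UTC is straightforward with Python's zoneinfo module. A minimal sketch; the timezone name is only an example, substitute the source system's configured zone:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def to_utc(local_str: str, tz_name: str) -> str:
    """Interpret a naive 'YYYY-MM-DD HH:MM:SS' string in tz_name; emit UTC."""
    naive = datetime.strptime(local_str, "%Y-%m-%d %H:%M:%S")
    aware = naive.replace(tzinfo=ZoneInfo(tz_name))
    return aware.astimezone(ZoneInfo("UTC")).strftime("%Y-%m-%d %H:%M:%S UTC")

# A local-time timestamp (e.g., from FAT) normalized for correlation;
# mid-March 2024 in New York is EDT (UTC-4)
print(to_utc("2024-03-15 09:30:45", "America/New_York"))
```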

🛡️ Incident Response Methodology

Timeline analysis feeds incident response. Use NIST SP 800-61 to structure decisions and document severity.

IR focus areas:
- Incident severity classification
- Communication plans (internal/external)
- Containment vs. business continuity
- Lessons learned and remediation

📚 Case Study: NotPetya (2017) and the cost of delayed containment decisions.

Resources

Lab

Complete the following lab exercises to practice timeline analysis techniques using Plaso and associated tools.

Part 1: Super Timeline Creation (LO7)

Run log2timeline.py against a practice forensic image. Document: (a) parsers used, (b) processing time, (c) total events extracted, (d) storage file size. Export to CSV format.

Deliverable: Processing log and statistics summary for the super timeline creation.

Part 2: Pivot Point Investigation (LO7)

Using a known indicator (provided filename or timestamp), filter the timeline to a 2-hour window around the pivot. Document: (a) events immediately before, (b) events immediately after, (c) causal relationships identified.

Deliverable: Filtered timeline excerpt with annotated events and relationship analysis.

Part 3: Cross-Source Correlation (LO7)

Identify an event that appears in multiple sources (e.g., program execution). Document: (a) all sources showing the event, (b) timestamp consistency, (c) additional details each source provides.

Deliverable: Correlation table showing same event across different artifact sources.

Part 4: Gap and Anti-Forensics Analysis (LO7)

Examine the timeline for: (a) unexplained time gaps, (b) evidence of log clearing, (c) timestomping indicators (if MFT data available), (d) missing expected artifacts.

Deliverable: Gap analysis report documenting anomalies and potential anti-forensics activity.

Part 5: Timeline Report (LO7, LO8)

Create a professional timeline report including: (a) executive summary with key events, (b) detailed chronological table, (c) visual timeline representation, (d) evidence source references.

Deliverable: Complete timeline report suitable for inclusion in a forensic investigation report.

Week 10 Quiz

Test your understanding of Timeline Analysis, Plaso, and Timestamp Forensics.

Format: 10 multiple-choice questions. Passing score: 70%. Time: Untimed.

Take Quiz

Checkpoint Questions

  1. Explain the difference between $STANDARD_INFORMATION and $FILE_NAME timestamps in NTFS. Why is this difference forensically significant?
  2. What is a "super timeline" and what advantages does it provide over analyzing individual artifact sources separately?
  3. Describe the pivot point analysis methodology. Why is this more effective than reading a timeline sequentially?
  4. How would you detect timestomping in a forensic investigation? What specific comparisons would you make?
  5. What does a gap in a timeline potentially indicate, and how would you investigate whether it represents anti-forensics versus normal system behavior?
  6. Why is correlating events across multiple artifact sources important for building strong forensic conclusions?

Weekly Reflection

Timeline analysis is where all forensic artifacts come together to tell a complete story. This week demonstrated how temporal correlation transforms isolated data points into coherent narratives that answer the fundamental questions of any investigation.

Reflect in 200-300 words on the practical challenges of timeline analysis, how you handle uncertainty in your conclusions, and the synthesis skills required to transform raw data into a coherent narrative.

