Incident Management · Day 02 · Research Exercise 02

The Incident Handling & Response Process

A structured, 3-hour in-class research exercise for college cybersecurity students

Weighting: 2-3-3 · Due: April 26, 2026, 23:59 EST · Max 4-page A4 report · 3 h in-class active work
Educator Protection Notice

This exercise has been designed for a supervised educational context. The tools, techniques, and methodologies referenced herein are used exclusively for legal, academic, and ethical training purposes within an isolated or controlled lab environment. Reproduction, redistribution, or application of any content outside of this supervised educational setting, without written authorization from the instructor, is strictly prohibited and may carry legal consequences. The instructor assumes no liability for misuse.

🔓 Open Tools Declaration

Every tool, framework, and technique referenced in this exercise is provided as an example only — none are mandatory. Students are strongly encouraged — and expected — to research alternative tools, explore new platforms, and where appropriate, design their own scripts and workflows. The cybersecurity field rewards those who dig deeper and think independently. Creativity and independent research are graded qualities in this programme.

SECTION 01

Context & Learning Objectives

No organization — regardless of size, sector, or annual budget — is immune to security incidents. What separates a resilient organization from a vulnerable one is not the absence of incidents, but the presence of a structured, rehearsed, and well-documented process for handling them when they occur.

The IH&R (Incident Handling and Response) process is the operational backbone of any mature cybersecurity practice. It is the formalized sequence of steps an organization follows from the moment a potential threat is detected, all the way through containment, recovery, and the post-event analysis that prevents repetition.

🎯 By the End of This Exercise, You Will Be Able To:
  • Articulate the phases of the NIST (National Institute of Standards and Technology) SP 800-61 Rev. 2 incident response lifecycle
  • Compare NIST with the PICERL (Preparation, Identification, Containment, Eradication, Recovery, Lessons Learned) framework popularized by SANS (SysAdmin, Audit, Network, and Security) Institute
  • Identify roles within a CSIRT (Computer Security Incident Response Team) and a SOC (Security Operations Centre)
  • Apply incident classification and triage logic to a realistic scenario
  • Produce a professional, structured research report aligned with industry expectations

⏱ In-Class Time Management

3-Hour Session Breakdown
  • Background Reading — ~30 min
  • Task 1 — ~45 min
  • Task 2 — ~75 min
  • Task 3 — ~60 min
  • Review & Submit — ~10 min

This is a guideline, not a rigid schedule. If Task 2 takes you longer — that is fine. Depth of reasoning is rewarded.

SECTION 02

Core Concepts — Working Reference

Use this section as an anchor while completing the tasks. It provides definitions and frameworks — but is not a substitute for your own research. Every concept here warrants deeper investigation in the primary sources listed in Section 03.

What Qualifies as a Security Incident?

A security incident is any event that actually or potentially jeopardizes the CIA Triad (Confidentiality, Integrity, Availability) of an information system or the data it handles. Not every anomaly is an incident. The typical escalation path is:

  • EVENT — any observable occurrence (e.g., a failed login attempt)
  • ALERT — a flagged event (e.g., 500 failed logins in 60 seconds)
  • INCIDENT — a confirmed violation requiring formal response
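The event-to-alert step above can be sketched as a sliding-window threshold check. A minimal illustration only: the 500-failures-in-60-seconds figures come from the example, while the function name and parameters are hypothetical.

```python
from collections import deque

def is_alert(event_times, window_s=60, threshold=500):
    """Return True if any sliding window of `window_s` seconds
    contains at least `threshold` events (e.g., failed logins).
    Thresholds mirror the example above and are illustrative."""
    recent = deque()
    for t in sorted(event_times):
        recent.append(t)
        # Drop events that have fallen outside the window
        while t - recent[0] > window_s:
            recent.popleft()
        if len(recent) >= threshold:
            return True  # The event stream crossed the alert threshold
    return False
```

Whether a triggered alert is then a confirmed incident remains a human (or playbook) decision, which is the point of the escalation path above.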

The NIST SP 800-61 Rev. 2 Lifecycle

NIST defines four phases that are cyclical, not linear — each incident response feeds back into improved preparation:

🛡️ Phase 1 · Preparation: Building IR plans, training teams, acquiring tools, establishing baselines and communication channels
🔍 Phase 2 · Detection & Analysis: Identifying attack vectors, validating incidents, prioritizing by scope and business impact
🔒 Phase 3 · Containment, Eradication & Recovery: Isolating affected systems, removing threats, restoring services, verifying clean state
📋 Phase 4 · Post-Incident Activity: Writing the incident report, holding lessons-learned reviews, updating defences and playbooks
↻ Each cycle feeds back into a stronger Preparation phase

The SANS PICERL Model

SANS expands the lifecycle into six discrete phases, widely used in practitioner training and certification programmes:

# | Phase | Core Question | Key Activities
1 | Preparation | Are we ready? | IR (Incident Response) plan, playbooks, tools, training, team contacts
2 | Identification | Is this an incident? | Alert triage, scope determination, severity classification
3 | Containment | Can we stop the bleeding? | Short-term isolation, evidence preservation, CoC (Chain of Custody)
4 | Eradication | Is the threat fully removed? | Malware removal, root-cause analysis, system cleaning
5 | Recovery | Are we back to normal safely? | Service restoration, enhanced monitoring, verification
6 | Lessons Learned | What do we improve? | PIR (Post-Incident Review), report writing, control updates

Key Roles in Incident Response

Role / Structure | Responsibility
CISO (Chief Information Security Officer) | Executive accountable for the security programme; authorizes IR declarations
IR Lead / Incident Commander | Coordinates all response activities; owns the incident timeline
SOC (Security Operations Centre) Analyst | First line of detection; monitors alerts around the clock
CSIRT (Computer Security Incident Response Team) | Dedicated team activated during confirmed incidents; may be internal or external
Forensic Analyst | Preserves and analyses digital evidence using sound forensic methodology
Legal / Compliance Officer | Advises on regulatory obligations, breach notification laws
Communications Lead | Manages internal and external messaging during and after an incident

Incident Severity Classification

P1 · Critical: Active breach; data exfiltration in progress; production systems down
P2 · High (← scenario): Confirmed compromise; contained but not eradicated; significant business impact
P3 · Medium: Suspicious activity confirmed as malicious; limited scope
P4 · Low: Policy violation; minor anomaly with no confirmed malicious intent
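The rubric above can be expressed as a small decision function. A sketch only, assuming boolean triage findings as inputs; real severity classification weighs far more context, as Task 2 later emphasises.

```python
def classify_severity(confirmed_malicious: bool,
                      exfiltration_or_outage: bool,
                      limited_scope: bool) -> str:
    """Map illustrative triage findings to the P1-P4 scale above.
    The inputs are hypothetical booleans, not an official taxonomy."""
    if exfiltration_or_outage:
        return "P1"  # Critical: active breach or production down
    if confirmed_malicious and not limited_scope:
        return "P2"  # High: confirmed compromise, significant impact
    if confirmed_malicious:
        return "P3"  # Medium: malicious but limited in scope
    return "P4"      # Low: policy violation or minor anomaly
```

For a confirmed malicious execution with no evidence yet of exfiltration and more than trivial scope, this sketch lands on P2; treating such a function as a starting point rather than a verdict is exactly the judgment call the tasks ask you to defend.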

Chain of Custody

CoC (Chain of Custody) is the documented, unbroken record of who collected, handled, transferred, or analysed a piece of digital evidence — and when. It is legally critical. A broken chain of custody can render evidence inadmissible in court. Every forensic action must be logged: timestamp, analyst name, tool used, hash values (MD5, SHA-256) of all collected artifacts.
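In practice, each CoC entry pairs the handling details with cryptographic hashes of the artifact. A minimal sketch using Python's standard library; the record fields are illustrative, not a legal template.

```python
import hashlib
from datetime import datetime, timezone

def coc_record(evidence_path, analyst, tool):
    """Hash an evidence file (MD5 + SHA-256) and return one
    chain-of-custody log entry. Field names are illustrative."""
    md5, sha256 = hashlib.md5(), hashlib.sha256()
    with open(evidence_path, "rb") as f:
        # Stream in chunks so large disk images do not exhaust memory
        for chunk in iter(lambda: f.read(1 << 20), b""):
            md5.update(chunk)
            sha256.update(chunk)
    return {
        "timestamp_utc": datetime.now(timezone.utc).isoformat(),
        "analyst": analyst,
        "tool": tool,
        "artifact": evidence_path,
        "md5": md5.hexdigest(),
        "sha256": sha256.hexdigest(),
    }
```

Recomputing the hashes at every handoff and comparing them against the first record is what keeps the chain "unbroken".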

SECTION 03

Background Reading & Primary Resources

Spend approximately 30 minutes consulting primary sources before tackling the tasks. Do not rely solely on wikis or summary blogs — consult original documents. Record full citations immediately as you go: title, author(s), publication date, URL, and date accessed.

📌 Research Habit — Record Citations Immediately

Every time you open a document, capture the full citation right then. It takes 30 seconds now and saves 30 minutes of frantic searching later. Format: Author(s). (Year). Title. Publisher. URL (Accessed: [Date]).

SECTION 04

Incident Scenario Briefing

You are a junior SOC (Security Operations Centre) Analyst at SecureNorth Inc., a mid-sized financial services company headquartered in Montréal, Québec. It is 09:17 on a Tuesday morning. Your SIEM (Security Information and Event Management) dashboard has just triggered the following alert:

🔴 ALERT ID: SN-2026-0422-0917 — STATUS: ACTIVE / UNRESOLVED
Source IP: 10.14.22.87 (Finance Department workstation)
Destination IP: 185.220.101.34 (classified as a known Tor exit node)
Port: 443 (HTTPS)
Duration: 22 minutes of repeated connections
Additional context: The same workstation executed a .exe attachment from a vendor-impersonating email received at 08:51. The email claimed to be an invoice from a known supplier.
Initial severity classification: P2 — High (pending your analysis and validation)

Your task throughout Task 2 is to walk through all six PICERL phases as they apply to this scenario, documenting your reasoning at each step.
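Before starting, it can help to hold the alert as structured data rather than prose. A toy representation, with a hard-coded indicator set standing in for the live threat-intelligence lookup you will research in Q5; all names here are hypothetical.

```python
from dataclasses import dataclass

# Toy local indicator set; a real responder queries a live feed
KNOWN_TOR_EXITS = {"185.220.101.34"}

@dataclass
class Alert:
    alert_id: str
    src_ip: str
    dst_ip: str
    dst_port: int
    duration_min: int

def worth_escalating(a: Alert) -> bool:
    """Toy identification check: sustained outbound traffic to a
    known Tor exit node warrants escalation for formal analysis."""
    return a.dst_ip in KNOWN_TOR_EXITS and a.duration_min >= 10

sn_alert = Alert("SN-2026-0422-0917", "10.14.22.87",
                 "185.220.101.34", 443, 22)
```

Here `worth_escalating(sn_alert)` returns True; treat that as the starting point of your Q3 argument, not its conclusion.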

SECTION 05

Lab Tasks

TASK 01
Framework Reconnaissance
~45 minutes  |  Individual research  |  Output feeds into Report Page 2

Objective: Develop a working understanding of the two dominant IH&R frameworks and map their structural and philosophical differences.

Step 1.1 — Framework Comparison

Using the primary sources from Section 03, construct a comparison table (to be included in your report) that addresses each of the following axes. Write in your own words — no copy-pasting from sources.

Comparison Axis | NIST SP 800-61 Rev. 2 | SANS PICERL
Number of phases | Your research | Your research
Originating body | Your research | Your research
Primary audience | Your research | Your research
Strengths | Your research | Your research
Limitations | Your research | Your research
Typical use case | Your research | Your research

Step 1.2 — Real-World Adoption

Research and document two real-world organizations or government bodies that have published their IH&R policies or frameworks based on one of these models. For each organization:

  • Name the organization and provide a clearly visible hyperlink to their official policy or publication
  • Identify which framework they align with and articulate why
  • Note one aspect of their published approach that stands out as particularly mature or innovative
💡 Starting Point (Do Not Copy — Find Your Own)

The Canadian Centre for Cyber Security (CCCS) publishes guidance aligned with NIST. Explore their publications — but your two examples must be different from this one. Look beyond Canada: consider the UK's NCSC (National Cyber Security Centre), the German BSI (Bundesamt für Sicherheit in der Informationstechnik), or sector-specific bodies in finance, healthcare, or critical infrastructure.

Step 1.3 — Personal Glossary

Build a personal glossary of at least 10 terms encountered during your reading. For each entry:

  • The abbreviation (if applicable) with the full expansion in brackets immediately after
  • A definition in your own words — 2–3 sentences maximum
  • The specific source you derived the definition from (author, document, page or section)

This glossary becomes Appendix A in your report. A well-constructed glossary demonstrates genuine engagement with the source material.

TASK 02
Incident Scenario Analysis — PICERL Applied
~75 minutes  |  Analytical work  |  Output feeds into Report Pages 3–4

Objective: Apply all six PICERL phases to the SecureNorth scenario. This is an analytical exercise — you are not expected to execute commands. You are expected to think, reason, and document like a professional responder.

Phase 1   Preparation

Q1

What policies, playbooks, and tools should SecureNorth have had in place before this alert fired? List and justify at least five concrete preparation items — go beyond the obvious and consider what a financial sector organization specifically needs.

Q2

What is a SOAR (Security Orchestration, Automation, and Response) platform, and how would one have assisted in this scenario? Research at least one open-source SOAR tool and describe its architecture, key capabilities, and limitations.

💡 Tools to Investigate (examples only — not exhaustive)

TheHive Project — an open-source, scalable IR platform. Compare it with Cortex XSOAR (community edition) or Shuffle. What distinguishes each? Which would you choose for a 200-employee financial firm and why? There are others — find them.

Phase 2   Identification

Q3

Using the alert details from Section 04, determine whether this event meets the threshold of a confirmed security incident. Structure your determination as a formal argument — present the evidence for and against, then reach a conclusion. A one-word answer is unacceptable.

Q4

What additional data sources would you consult to enrich your analysis? List at least four (e.g., endpoint logs, DNS (Domain Name System) query history, email gateway logs, proxy logs, threat intelligence feeds). For each source, explain the specific question it would help you answer.

Q5

The destination IP (185.220.101.34) is flagged as a known Tor exit node. How would you use an open-source threat intelligence platform to confirm or enrich this indicator? Research and describe at least one platform or tool — include what data it provides, how to interpret it, and its limitations.

💡 Threat Intelligence Platforms to Explore

VirusTotal, AbuseIPDB, AlienVault OTX (Open Threat Exchange), Shodan, ThreatFox by abuse.ch. What does each offer that the others do not? Are there newer platforms not on this list? Find them and compare.

Q6

Assign a formal severity level (P1 through P4) to this incident using the classification from Section 02. Document your full reasoning — severity classification is a professional judgment call, not a lookup table exercise.

Phase 3   Containment

Q7

What short-term containment actions would you take for workstation 10.14.22.87? Describe at least three specific steps and explain why each is necessary. Explicitly address the tension between containment speed and evidence preservation.

Q8

Would you immediately disconnect the workstation from the network? Construct a rigorous argument for both positions — immediate disconnection versus controlled continued monitoring. Conclude with your recommendation and rationale.

Q9

From the moment containment begins, how do you ensure CoC (Chain of Custody) is maintained? Describe the documentation process step-by-step, including what hash algorithms you would use and how you would record handoffs between analysts.

Phase 4   Eradication

Q10

Assuming analysis confirms a malware (malicious software) dropper and an active C2 (Command-and-Control) beacon on the workstation, what eradication steps would you take? List at least four steps in logical sequence and justify the ordering.

Q11

How do you verify that eradication is complete? Research at least one open-source tool suitable for post-eradication verification and describe how it works technically. Generic answers that simply say "run an antivirus scan" will not score well — be specific and demonstrate technical depth.

💡 Areas to Explore

Memory forensics tools (e.g., Volatility Foundation), rootkit (a class of malware that conceals itself within the operating system) detectors, integrity checking tools (e.g., Tripwire Open Source), registry analysis tools, YARA rule scanning. Are there others not on this list? Find them.
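As one concrete (and deliberately limited) illustration of verification tooling, a hash-based IOC sweep flags files that exactly match known-bad digests. A sketch under stated assumptions only: the IOC value below is a placeholder (the SHA-256 of an empty file), and real verification layers YARA rules, memory forensics, and integrity baselines on top of this.

```python
import hashlib
from pathlib import Path

# Illustrative IOC set; this placeholder is the SHA-256 of empty input
KNOWN_BAD_SHA256 = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def ioc_sweep(root):
    """Return paths under `root` whose SHA-256 matches a known-bad
    indicator. Catches exact copies only: any byte changed by the
    attacker defeats plain whole-file hash matching."""
    hits = []
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            if digest in KNOWN_BAD_SHA256:
                hits.append(str(path))
    return hits
```

The limitation in the docstring is exactly why Q11 asks for more than "run an antivirus scan": pattern-based tools such as YARA match structural features of malware, not just whole-file hashes.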

Phase 5   Recovery

Q12

Before restoring the workstation to production use, what validation steps must be completed? List at least five and explain the rationale for each. Consider what "clean" actually means in an operational context.

Q13

What monitoring should be enhanced in the 30 days following recovery to detect any recurrence or lateral movement (the technique attackers use to progressively move through a network after gaining initial access)? Be specific about what you would monitor, with what tools, at what frequency.

Phase 6   Lessons Learned

Q14

Draft the structure of a PIR (Post-Incident Review) meeting agenda for this incident. Include at least six agenda items, assign a sample duration to each, and note who should chair and who should attend.

Q15

Based on this scenario, identify three systemic weaknesses exposed at SecureNorth. For each weakness, propose a concrete, actionable remediation. Vague recommendations ("improve email security") will not score well — be specific about controls, tools, and implementation steps.

Q16

If this incident involved confirmed data exfiltration (unauthorized transfer of data out of an organization), what regulatory notification obligations would apply to SecureNorth as a financial services firm in Québec? Research and identify at least two applicable frameworks, including notification timelines and authorities to notify.

💡 Regulatory Frameworks to Investigate

PIPEDA (Personal Information Protection and Electronic Documents Act), Loi 25 (Québec), OSC (Ontario Securities Commission) cybersecurity guidelines. Are there others applicable to financial institutions specifically? Research FINTRAC (Financial Transactions and Reports Analysis Centre of Canada) guidance as well.

TASK 03
Produce Your Research Report
~60 minutes  |  Writing and synthesis  |  Final deliverable

Objective: Compile your Task 1 and Task 2 findings into a professional, structured research report.

Specification | Requirement
Format | PDF, submitted via the course platform
Length | Maximum 4 pages A4 (body only — appendices are not counted)
Font | Times New Roman 12pt or equivalent professional serif body font
Margins | 2.5 cm on all sides
Line spacing | 1.15
Language | English — professional written register throughout

SECTION 06

Report Construction Guide

A well-structured report communicates competence before a single word is read. Follow this structure exactly — it is not a suggestion.

P1
Title Page + Executive Summary (Page 1)
Title page (top half): report title, your name, student ID, course name, date. Executive Summary (bottom half): 3–5 bullet findings plus a one-sentence conclusion. Write the Executive Summary last: written first, it would summarise a report that does not yet exist. Aim for 90-second readability.
P2
Framework Analysis (Page 2)
Your NIST vs PICERL comparison table (Task 1.1) + your real-world adoption findings (Task 1.2) as 2–3 paragraphs. Dense with referenced content — cite inline using APA or IEEE style consistently throughout.
P3–4
Incident Scenario Analysis (Pages 3–4)
PICERL phases applied to SecureNorth — written as coherent narrative paragraphs with phase headers, not as Q&A reproductions. Include at least one original diagram (timeline or PICERL flow). Tables are welcome for structured data. Every claim must be cited.
App A
Appendix A — Glossary (Not counted in 4-page limit)
Your 10-term minimum glossary from Task 1.3. Each entry: abbreviation + expansion + definition in your own words + source.
App B
Appendix B — References (Not counted in 4-page limit)
All sources cited in the body, formatted consistently. Every URL must include the date you accessed it.

Visual Elements Guidelines

📸 Screenshots

Crop tightly — capture only what is relevant. Add a clear caption: "Figure X: [Description] — [Tool Name], captured [Date]". Redact any personal data, credentials, or sensitive identifiers before inclusion.

🖼 Images from External Sources

Do not copy-paste images from the web without attribution. Cite immediately below: "Source: [Author], [Year]. [URL]". When in doubt, create your own — a hand-drawn diagram photographed and included demonstrates understanding in a way that a copied graphic never will.

🔗 Hyperlinks in Your PDF

All URLs must be clickable hyperlinks in the final PDF. Do not write out raw URLs longer than 60 characters in the body — hyperlink the title of the document instead. Example: write "NIST SP 800-61 Rev. 2" as a hyperlink, not the full PDF URL.

📊 Diagrams and Flowcharts

Recommended tools: draw.io (free, browser-based), Lucidchart (free tier), PowerPoint, or LibreOffice Impress. Your diagram must have a title, a legend (where applicable), and a source note. Original diagrams are worth more than adapted ones.

Writing Quality — Non-Negotiables

SECTION 07

Deliverables & Pre-Submission Checklist

Before you submit, go through this checklist and confirm that every item is complete.

Late Submissions

Late submissions are subject to the programme's standard late penalty policy. The deadline is April 26, 2026 at 23:59 EST. Technical issues with the submission platform are not accepted as grounds for extension — submit early and confirm receipt.

SECTION 08

Grading Criteria

  • Framework Analysis (25%) — depth, accuracy, and genuine comparison of NIST vs PICERL
  • Incident Scenario Analysis (40%) — quality of reasoning across all six phases, not just listing of steps
  • Report Quality (20%) — structure, professionalism, adherence to specifications, visual elements
  • Research & Citation Quality (15%) — source selection, citation correctness, evidence of independent investigation beyond suggested starting points
🌟 Note on Independent Research

Students who go beyond the suggested starting points, explore tools not mentioned in this document, identify novel frameworks, and demonstrate genuine intellectual curiosity will be rewarded. The suggested tools and resources are a floor, not a ceiling. The best submissions will contain ideas and sources the instructor has not seen before.

SECTION 09

Academic Integrity

All submitted work must be your own. Collaborative discussion is permitted and encouraged — it mirrors how real security teams operate. However, the written report must reflect your individual analysis, reasoning, and writing.

Use of AI writing tools to generate report text constitutes academic dishonesty in this context. You may use AI tools to clarify concepts or explore terminology — but your written analysis and conclusions must be entirely your own work, in your own voice.