Overall Expectation
Imposed Limitations
Business Type
Attack Type
Inherent Limitations
Threat Type
Blue Team
White Team
Logistics & Engagement Plan
Required Knowledge
Teams
Red Team
Manuals & Policies
News Groups
War Driving
People Fraud
Initial Discovery Scans
Internal Relations
Access Badges
Partner Data
Theft
Identity Assumption
Known Applications
Direct Technical Investigation: Using various tools and the specific information collected in the previous phase, systems, networks, services, and applications can be queried to gather empirical data on characteristics that can be used as an attack vector
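Direct technical investigation is normally automated. As a minimal sketch (not the methodology's prescribed tooling), a TCP connect probe against a short port list might look like the following; the target address and ports are hypothetical.

```python
import socket

def probe_ports(host, ports, timeout=1.0):
    """Attempt a TCP connection to each port; return the ones that accept."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            try:
                s.connect((host, port))
            except OSError:
                continue  # closed, filtered, or unreachable
            open_ports.append(port)
    return open_ports

if __name__ == "__main__":
    # Hypothetical target; only probe systems you are authorized to test.
    print(probe_ports("192.0.2.10", [22, 80, 443], timeout=0.3))
```

Real engagements would use a purpose-built scanner; the point here is only the query-and-record pattern the phase describes.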
White Team
Network Map
Partner Information
Network
Organizational Structure
Evaluate known threats, tactics, and structure, and compare them with existing information and expectations to devise an attack type, a profile of required knowledge, and the imposed limitations
Operating Systems
Domain Information
Wireless Network
Phone Systems
IP Addresses
Password Change
Account Data
Miscellaneous Data
Threats and Limitations:
Initial Deductions
Information Rationalization
Rationalize: Depending on the tactic, depth, provided data, timeframe, and the overall vulnerability of the target (or the amount of freely available information), all data can be normalized and compared to seek further opportunities to gain information before moving into the next phase
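The normalize-and-compare step can be sketched as follows. The record layout and source names are illustrative assumptions, not part of the methodology; the idea is simply to flag items one source produced that another did not, as an opportunity for further collection.

```python
# Sketch of the "rationalize" step: normalize records gathered from
# different reconnaissance sources, then flag hosts that appear in some
# sources but are missing from others.

def normalize(record):
    """Lowercase and strip free-form string fields so records can be compared."""
    return {k: v.strip().lower() if isinstance(v, str) else v
            for k, v in record.items()}

def find_gaps(sources):
    """For each source, return the hosts it failed to surface."""
    seen = {name: {normalize(r)["host"] for r in records}
            for name, records in sources.items()}
    all_hosts = set().union(*seen.values())
    return {name: sorted(all_hosts - hosts) for name, hosts in seen.items()}

sources = {
    "website": [{"host": "Mail.example.com "}, {"host": "www.example.com"}],
    "dns":     [{"host": "mail.example.com"}],
}
print(find_gaps(sources))  # {'website': [], 'dns': ['www.example.com']}
```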
Passwords
Ping Sweeps
War Chalking
Prowling / Surfing
Custom Applications
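Ping sweeps, one of the discovery techniques listed above, can be sketched as below. The CIDR range is hypothetical, and the probe shells out to the system `ping` utility with Linux-style flags (an assumption about the operator's platform).

```python
import ipaddress
import subprocess

def hosts_in(cidr):
    """Expand a CIDR block into its usable host addresses."""
    return [str(ip) for ip in ipaddress.ip_network(cidr).hosts()]

def is_alive(addr, timeout_s=1):
    """Send a single ICMP echo via the system ping utility (Linux-style flags)."""
    result = subprocess.run(
        ["ping", "-c", "1", "-W", str(timeout_s), addr],
        stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
    return result.returncode == 0

def sweep(cidr):
    return [addr for addr in hosts_in(cidr) if is_alive(addr)]

if __name__ == "__main__":
    print(sweep("192.0.2.0/29"))  # hypothetical range; expect silence offline
```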
Leverage existing information-security data, combine it with the overall business objectives, and establish the expected outcome of the test.
Input/Output
Collect and Define:
Learn and Use: Based on the level and scope of the required knowledge, create a matrix of target information against proposed collection tactics and use it to acquire information about the target. Intensity and scope are defined by the business objectives and the threat type, which in turn establish the role the collected data plays in the remainder of the engagement
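One way to record such an information/collection-tactic matrix is a plain mapping from each item of required knowledge to candidate tactics and an intensity level. The entries below are drawn from the source lists in this diagram, but the intensity scale is an assumption.

```python
# Information / collection-tactic matrix: each item of required knowledge
# maps to candidate tactics (drawn from the reconnaissance source list)
# and an intensity appropriate to the agreed threat type.
matrix = {
    "phone lists":  {"tactics": ["website", "dumpster diving"],    "intensity": "low"},
    "network map":  {"tactics": ["ping sweeps", "war driving"],    "intensity": "high"},
    "account data": {"tactics": ["social eng.", "helpdesk fraud"], "intensity": "medium"},
}

def tactics_for(item):
    """Look up the proposed collection tactics for a required-knowledge item."""
    return matrix[item]["tactics"]

print(tactics_for("network map"))  # ['ping sweeps', 'war driving']
```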
Phone Lists
Network Map
Domain Data
Dumpster Diving
HelpDesk Fraud
Uber Hacker
Input/Output
Feedback Loop
Management: Create teams, provide operational and communication protocols, and define metrics to ensure clear measurement of success and failure factors
Intranet Data
Website
Observation
Physical Security
Internet Sources
e-Mail
Social Eng.
Hacker
Input/Output
ENUMERATE
Input/Output
Starts with a Policy: Fully understanding the security policy of an organization is critical to the interpreted value of any security project.
Risk Analysis
Security Program
Previous Test(s)
Business Objectives
Policy
Script Kiddie
RECONNAISSANCE
Internet
Vendor
Patches & Service Packs
Protocol Standards
Vulnerability Reports
Security Alerts
Input/Output
Vulnerability Analysis: Data from the Internet, product vendors, and even the target itself is reviewed for any documented alignment with a known vulnerability
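A toy version of this alignment check is shown below: service banners collected during enumeration are compared against advisory data gathered from the Internet, vendors, or the target. All products, versions, and issues are hypothetical.

```python
# Toy vulnerability-analysis step: match enumerated service banners
# against collected advisory data. All entries are hypothetical.
advisories = [
    {"product": "ExampleFTPd",  "version": "2.1", "issue": "default password"},
    {"product": "ExampleHTTPd", "version": "1.0", "issue": "path traversal"},
]

def match(banners):
    """Return (host, issue) pairs where a banner aligns with an advisory."""
    hits = []
    for host, product, version in banners:
        for adv in advisories:
            if adv["product"] == product and adv["version"] == version:
                hits.append((host, adv["issue"]))
    return hits

banners = [("10.0.0.5", "ExampleFTPd", "2.1"),
           ("10.0.0.9", "ExampleHTTPd", "2.2")]
print(match(banners))  # [('10.0.0.5', 'default password')]
```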
Information Collection
Obtained
Default Installation
Default Passwords
Incidents
ANALYSIS
Extranet
Intranet
Input/Output
Thread-n
Thread-2
Thread-1
Attack Strategy: Based on the information learned about the target and on the overall objectives, expectations, limitations, and restrictions, an attack plan can be formulated. The data will promote the use of one source point over another, or of any combination of the three primary types
Attack Plan
Internet
Source Filter
HTTP/ SMTP
RAS/ Extranet
SNMP RMON
Citrix/X
*nix Attacks
Quality Loop
FTP/Telnet
Wireless
ACL/FW
Web Attacks
Windows Attacks
Tools
Quality Loop: Without a review of the initial thread results, there is a greater possibility of losing valuable vulnerability information, or of reducing the value of the test through poor validation of a vulnerability thread
Services
Network
Application
Operating System
ATTACK
Thread Results
Group-n
Groups of Threads: A thread represents a single attack; threads can be combined to represent the total impact of a collection of threads, or a vulnerability. Multiple groups represent a web of vulnerabilities founded on technical as well as management weaknesses
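The thread-to-group combination can be sketched numerically. The 1-5 severity scale and the "worst thread plus breadth" combination rule are illustrative assumptions, not the methodology's scoring scheme.

```python
# Threads are single attacks; a group combines threads into an overall
# impact. Severity scale and combination rule are assumptions.
def group_impact(threads):
    """Combine thread severities: the worst thread, plus credit for breadth."""
    if not threads:
        return 0
    severities = [t["severity"] for t in threads]
    return max(severities) + (len(severities) - 1)  # breadth bonus

group = [
    {"name": "weak FTP password",    "severity": 3},
    {"name": "unpatched web server", "severity": 4},
]
print(group_impact(group))  # 5
```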
Misc.
Nodes
Protocols
Custom
Appliance Attacks
Expected?
Initial Results
Blue Team
Expected?
Results Analysis
Red Team
White Team
RESULTS
Overall Expectation
All results must be evaluated against established expectations prior to progression
Final Analysis
Always Compare and Review: Review thread and group data and combine them to formulate other attack scenarios if time permits. Additionally, evaluate the results against the expectations and the agreed-upon tactics. If group results analysis continues not to meet expectations, review the expectations of the test; otherwise you will not be prepared for the results
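The "Expected?" decision gates around this step can be modeled as a simple check of each result set against the established expectations. The field names and the containment test are illustrative assumptions.

```python
# "Expected?" gate: results are checked against established expectations
# before the engagement progresses. Fields and check are illustrative.
def meets_expectations(results, expectations):
    """True only if every expected finding category appears in the results."""
    found = {r["category"] for r in results}
    return expectations <= found

expectations = {"network", "application"}
results = [{"category": "network"}, {"category": "operating system"}]
print(meets_expectations(results, expectations))  # False: no application finding
```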
Yes
PLAN
No
Yes
AU1609_Tip 8/18/04 3:51 PM Page 1
No
Incident Management
Defense Planning
Detect
Architecture Review
Test
Identify
Process Review
Pilot
Isolate
Implement
INTEGRATE
Eradicate
Awareness
Validate
Policy
Based on the test’s prioritized results, the first order of business is to address the remedial, risk-reducing elements
Setup for Long-term ROI: Develop a clear operational and management structure to support full integration of the security recommendations, establish an Information Security Management Program, and prepare for the next test
Response: Developing an incident-response plan will be one of the few investments that gets better with time. Create, evaluate, and test a response plan; document results and expectations; and prepare for the real thing
Ends with a Policy: Fully integrating the results, the expectations for future security endeavors based on the test, and the overall objectives into the security policy is essential for value realization and a better ROI on future tests
Existing Policy
Integrate
Fix What is Broke:
Strategic
Tactical
Mitigation
Risk is Key: The only method for ensuring a usable engagement document is to align it with the existing security policy, the understanding of risk, and the overall expectations (i.e., a comparison of the value of the test to the value of the data). Define an interpretation table and prioritize based on business demands, risk, and time
Deliverable
Remedial
Informational
Warning
Critical
DELIVERABLE
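A possible shape for the interpretation table described above: findings are scored from business demand, risk, and time, then bucketed into the Critical, Warning, and Informational labels from this diagram. The weights and thresholds are assumptions for illustration only.

```python
# Sketch of the deliverable's interpretation table: findings are ranked
# by business demand, risk, and time-to-fix, then bucketed as Critical,
# Warning, or Informational. Thresholds and weights are assumptions.
def classify(score):
    if score >= 8:
        return "Critical"
    if score >= 4:
        return "Warning"
    return "Informational"

def prioritize(findings):
    """Sort findings by combined score, highest risk first."""
    scored = [(f["risk"] + f["business_demand"] - f["time_to_fix"], f["name"])
              for f in findings]
    return [(name, classify(score)) for score, name in sorted(scored, reverse=True)]

findings = [
    {"name": "default admin password", "risk": 5, "business_demand": 4, "time_to_fix": 1},
    {"name": "verbose error pages",    "risk": 2, "business_demand": 1, "time_to_fix": 0},
]
print(prioritize(findings))
# [('default admin password', 'Critical'), ('verbose error pages', 'Informational')]
```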