Mental Model
"In the cloud, misconfiguration is the new vulnerability. The shared responsibility model means your security is only as strong as your configuration." — Cloud Security Principle
Cloud environments shift the security paradigm from patching to configuration. While AWS secures the underlying infrastructure, NovaTech is responsible for how they configure and use those services. A single misconfigured S3 bucket or overly permissive IAM policy can expose the entire organization—without any software vulnerability being exploited.
Learning Outcomes
By the end of this week, you will be able to:
- LO1: Apply the AWS Shared Responsibility Model to identify customer security obligations
- LO2: Execute cloud security assessment tools to identify misconfigurations
- LO3: Evaluate IAM policies for excessive permissions and security risks
- LO4: Identify common cloud misconfigurations across compute, storage, and networking services
- LO5: Document cloud security findings with remediation guidance aligned to AWS best practices
Introduction: NovaTech's Cloud Footprint
NovaTech operates a hybrid environment with AWS as its primary cloud provider. The WorkflowPro SaaS platform runs on AWS infrastructure including EC2, EKS, RDS, S3, and Lambda. Since the company's Series C funding, rapid scaling has outpaced the security team's capacity to review configurations.
This week focuses on assessing NovaTech's AWS security posture. Unlike traditional vulnerability scanning, cloud security assessment emphasizes configuration review—examining how services are set up rather than what software vulnerabilities exist.
Cloud Assessment Scope
| AWS Account | Environment | Key Services |
|---|---|---|
| 123456789012 | Production (us-east-1) | EKS, RDS, S3, Lambda, CloudFront |
| 234567890123 | DR / Staging (us-west-2) | EC2, RDS replica, S3 |
Marcus Webb has provided read-only IAM credentials for assessment purposes. You are authorized to review configurations but not modify any resources.
1. AWS Shared Responsibility Model
Understanding the Shared Responsibility Model is fundamental to cloud security assessment. It defines what AWS secures versus what customers must secure.
┌─────────────────────────────────────────────────────────────────┐
│ AWS SHARED RESPONSIBILITY MODEL │
├─────────────────────────────────────────────────────────────────┤
│ │
│ ┌───────────────────────────────────────────────────────────┐ │
│ │ CUSTOMER RESPONSIBILITY │ │
│ │ "Security IN the Cloud" │ │
│ ├───────────────────────────────────────────────────────────┤ │
│ │ • Customer Data │ │
│ │ • Platform, Applications, Identity & Access Management │ │
│ │ • Operating System, Network & Firewall Configuration │ │
│ │ • Client-side Data Encryption │ │
│ │ • Server-side Encryption (File System/Data) │ │
│ │ • Network Traffic Protection (Encryption/Integrity) │ │
│ └───────────────────────────────────────────────────────────┘ │
│ │
│ ┌───────────────────────────────────────────────────────────┐ │
│ │ AWS RESPONSIBILITY │ │
│ │ "Security OF the Cloud" │ │
│ ├───────────────────────────────────────────────────────────┤ │
│ │ • Hardware / AWS Global Infrastructure │ │
│ │ • Regions, Availability Zones, Edge Locations │ │
│ │ • Compute, Storage, Database, Networking (physical) │ │
│ │ • Software (Hypervisor, etc.) │ │
│ └───────────────────────────────────────────────────────────┘ │
│ │
│ ASSESSMENT FOCUS: Everything in the "Customer Responsibility" │
│ section is within scope for security review │
│ │
└─────────────────────────────────────────────────────────────────┘
Service-Specific Responsibilities
Responsibility varies by service type:
| Service Type | Examples | Customer Responsible For |
|---|---|---|
| Infrastructure (IaaS) | EC2, VPC, EBS | OS patching, network config, firewall rules, application security, data encryption |
| Container | ECS, EKS | Container images, orchestration config, network policies, secrets management |
| Abstracted | S3, DynamoDB, Lambda | Access policies, encryption settings, data classification, function code security |
Assessment Implication
For NovaTech's assessment, we're not looking for unpatched EC2 instances (that's infrastructure vulnerability assessment from Week 3). Instead, we're examining whether S3 buckets are publicly accessible, whether IAM policies follow least privilege, whether encryption is enabled, and whether network security groups are properly restrictive.
2. Cloud Security Assessment Tools
Several tools automate cloud configuration assessment by querying AWS APIs and comparing configurations against security benchmarks.
Prowler
Prowler is a widely used open-source AWS security assessment tool that checks configurations against CIS Benchmarks, PCI DSS, HIPAA, and AWS best practices.
Installation
# Install via pip
pip3 install prowler
# Or clone from GitHub
git clone https://github.com/prowler-cloud/prowler
cd prowler
pip3 install -r requirements.txt
# Verify installation
prowler --version
Configuration
# Configure AWS credentials (provided by NovaTech)
# Option 1: Environment variables
export AWS_ACCESS_KEY_ID="AKIAXXXXXXXXXXXXXXXX"
export AWS_SECRET_ACCESS_KEY="xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
export AWS_DEFAULT_REGION="us-east-1"
# Option 2: AWS credentials file
aws configure --profile novatech-assessment
# Enter Access Key ID, Secret Access Key, Region
# Verify credentials work
aws sts get-caller-identity --profile novatech-assessment
Running Prowler
# Full assessment (all checks)
prowler aws --profile novatech-assessment
# Specific compliance framework
prowler aws --profile novatech-assessment --compliance cis_2.0_aws
# Specific services only
prowler aws --profile novatech-assessment --services s3 iam ec2 rds
# Generate HTML report
prowler aws --profile novatech-assessment -M html -o ./prowler-results/
# Specific region
prowler aws --profile novatech-assessment --region us-east-1
# Multiple output formats
prowler aws --profile novatech-assessment -M json html csv
Understanding Prowler Output
# Prowler categorizes findings by severity:
# CRITICAL - Immediate action required
# HIGH - Priority remediation
# MEDIUM - Should be addressed
# LOW - Best practice recommendations
# INFORMATIONAL - Awareness items
# Example output:
# [CRITICAL] s3_bucket_public_access - S3 bucket "novatech-uploads" is publicly accessible
# [HIGH] iam_policy_allows_privilege_escalation - IAM policy allows privilege escalation
# [MEDIUM] ec2_security_group_allows_all_traffic - Security group allows all inbound traffic
ScoutSuite
ScoutSuite is an open-source multi-cloud security auditing tool that produces an interactive HTML report for visual review of findings.
Installation and Usage
# Install via pip
pip3 install scoutsuite
# Run assessment
scout aws --profile novatech-assessment
# Results are generated as an interactive HTML report
# Open scoutsuite-report/aws-xxxxxx.html in browser
# Specific services
scout aws --profile novatech-assessment --services iam s3 ec2
AWS CLI for Manual Checks
While automated tools provide broad coverage, manual AWS CLI queries allow targeted investigation and validation.
# IAM Checks
aws iam list-users
aws iam list-roles
aws iam list-policies --scope Local
aws iam get-account-password-policy
aws iam generate-credential-report
aws iam get-credential-report --output text --query Content | base64 -d
# S3 Checks
aws s3 ls
aws s3api get-bucket-acl --bucket bucket-name
aws s3api get-bucket-policy --bucket bucket-name
aws s3api get-public-access-block --bucket bucket-name
aws s3api get-bucket-encryption --bucket bucket-name
# EC2/VPC Checks
aws ec2 describe-security-groups
aws ec2 describe-network-acls
aws ec2 describe-instances --query 'Reservations[].Instances[].{ID:InstanceId,State:State.Name,PublicIP:PublicIpAddress}'
aws ec2 describe-vpcs
# RDS Checks
aws rds describe-db-instances
aws rds describe-db-clusters
# CloudTrail (Logging)
aws cloudtrail describe-trails
aws cloudtrail get-trail-status --name trail-name
Steampipe
Steampipe allows SQL queries against cloud APIs, enabling flexible custom security checks.
# Install Steampipe
/bin/sh -c "$(curl -fsSL https://raw.githubusercontent.com/turbot/steampipe/main/install.sh)"
# Install AWS plugin
steampipe plugin install aws
# Start interactive query
steampipe query
# Example queries:
# Find public S3 buckets
select name, region, bucket_policy_is_public
from aws_s3_bucket
where bucket_policy_is_public = true;
# Find security groups with 0.0.0.0/0
select group_name, group_id, vpc_id, ip_permission
from aws_vpc_security_group_rule
where cidr_ip = '0.0.0.0/0';
# Find IAM users without MFA
select name, mfa_enabled, password_last_used
from aws_iam_user
where mfa_enabled = false;
3. IAM Security Assessment
Identity and Access Management (IAM) is the foundation of AWS security. Misconfigured IAM policies are among the most common and dangerous cloud security issues.
IAM Assessment Areas
┌─────────────────────────────────────────────────────────────────┐
│ IAM SECURITY ASSESSMENT │
├─────────────────────────────────────────────────────────────────┤
│ │
│ USERS & AUTHENTICATION POLICIES & PERMISSIONS │
│ ────────────────────── ────────────────────── │
│ • MFA enabled? • Least privilege followed? │
│ • Password policy strong? • Wildcard permissions (*)? │
│ • Access keys rotated? • Inline vs managed policies │
│ • Unused credentials? • Resource-based policies │
│ • Root account protected? • Permission boundaries │
│ │
│ ROLES & TRUST MONITORING & GOVERNANCE │
│ ──────────────── ─────────────────────── │
│ • Cross-account trust • CloudTrail enabled? │
│ • Service role permissions • Access Analyzer findings? │
│ • Instance profiles • IAM credential report │
│ • Trust policy scope • Last accessed info │
│ │
└─────────────────────────────────────────────────────────────────┘
Root Account Security
# Check root account configuration
# Root account should have:
# - MFA enabled
# - No access keys
# - Not used for daily operations
# Check via credential report
aws iam generate-credential-report
aws iam get-credential-report --output text --query Content | base64 -d > cred-report.csv
# In the report, check the <root_account> row:
# - mfa_active should be "true"
# - access_key_1_active should be "false"
# - access_key_2_active should be "false"
# - password_last_used should be minimal/never for ops
Password Policy Analysis
# Retrieve password policy
aws iam get-account-password-policy
# Expected secure configuration:
{
"PasswordPolicy": {
"MinimumPasswordLength": 14, # Should be 14+
"RequireSymbols": true,
"RequireNumbers": true,
"RequireUppercaseCharacters": true,
"RequireLowercaseCharacters": true,
"AllowUsersToChangePassword": true,
"ExpirePasswords": true,
"MaxPasswordAge": 90, # 90 days or less
"PasswordReusePrevention": 24, # Remember 24 passwords
"HardExpiry": false
}
}
User MFA Status
# List all users and MFA status
aws iam list-users --query 'Users[].UserName' --output text | \
while read user; do
mfa=$(aws iam list-mfa-devices --user-name $user --query 'MFADevices[].SerialNumber' --output text)
if [ -z "$mfa" ]; then
echo "NO MFA: $user"
else
echo "MFA OK: $user"
fi
done
# Or via credential report - check mfa_active column
Access Key Analysis
# Find old/unused access keys
aws iam list-users --query 'Users[].UserName' --output text | \
while read user; do
echo "=== $user ==="
aws iam list-access-keys --user-name $user --query 'AccessKeyMetadata[].{KeyId:AccessKeyId,Status:Status,Created:CreateDate}'
done
# Check last used date for each key
aws iam get-access-key-last-used --access-key-id AKIAXXXXXXXXXXXXXXXX
# Keys should be:
# - Rotated every 90 days
# - Disabled if not used in 90 days
# - Deleted if not used in 180 days
Policy Analysis
Analyze IAM policies for overly permissive configurations:
# List all customer-managed policies
aws iam list-policies --scope Local --query 'Policies[].{Name:PolicyName,Arn:Arn}'
# Get policy document
aws iam get-policy-version \
--policy-arn arn:aws:iam::123456789012:policy/PolicyName \
--version-id v1
# DANGEROUS PATTERNS TO LOOK FOR:
# 1. Full admin access
{
"Effect": "Allow",
"Action": "*",
"Resource": "*"
}
# 2. Wildcard actions on sensitive services
{
"Effect": "Allow",
"Action": "iam:*",
"Resource": "*"
}
# 3. Privilege escalation paths
{
"Effect": "Allow",
"Action": [
"iam:CreateAccessKey",
"iam:CreateLoginProfile",
"iam:UpdateLoginProfile",
"iam:AttachUserPolicy",
"iam:AttachRolePolicy",
"iam:PutUserPolicy",
"iam:PutRolePolicy"
],
"Resource": "*"
}
# 4. S3 full access
{
"Effect": "Allow",
"Action": "s3:*",
"Resource": "*"
}
IAM Access Analyzer
# Check if Access Analyzer is enabled
aws accessanalyzer list-analyzers
# Get findings (external access to resources)
aws accessanalyzer list-findings --analyzer-arn arn:aws:access-analyzer:us-east-1:123456789012:analyzer/name
# Access Analyzer identifies:
# - S3 buckets accessible from outside the account
# - IAM roles with trust policies allowing external access
# - KMS keys with external access
# - Lambda functions with resource-based policies allowing external access
# - SQS queues with external access
Sample IAM Finding
IAM Users Without MFA Enabled
Description
Multiple IAM users with console access do not have Multi-Factor Authentication (MFA) enabled. Without MFA, compromised passwords provide direct access to AWS resources.
Affected Resources
| dev-user-1 | Console access, no MFA |
| jenkins-admin | Console access, no MFA |
| ops-user-3 | Console access, no MFA |
Evidence
$ aws iam get-credential-report --output text --query Content | base64 -d | cut -d, -f1,8
user,mfa_active
dev-user-1,false
jenkins-admin,false
ops-user-3,false
Risk
Password compromise through phishing, credential stuffing, or data breaches would provide immediate AWS access. Given NovaTech's SOC 2 requirements, this represents a control deficiency.
Remediation
- Enable MFA for all IAM users with console access
- Enforce MFA via IAM policy condition keys
- Consider AWS IAM Identity Center (successor to AWS SSO) with MFA enforced at the IdP level
Reference: AWS MFA Documentation
4. S3 Security Assessment
S3 bucket misconfigurations are responsible for numerous high-profile data breaches. Assessment focuses on access controls, encryption, and logging.
S3 Security Checklist
┌─────────────────────────────────────────────────────────────────┐
│ S3 SECURITY CHECKLIST │
├─────────────────────────────────────────────────────────────────┤
│ │
│ ACCESS CONTROL ENCRYPTION │
│ ────────────── ────────── │
│ □ Public access blocked □ Default encryption enabled │
│ □ Bucket policy reviewed □ SSE-S3 or SSE-KMS │
│ □ ACLs restrictive □ Encryption enforced (policy) │
│ □ No wildcard principals □ TLS enforced for transit │
│ │
│ LOGGING & MONITORING DATA PROTECTION │
│ ─────────────────── ─────────────── │
│ □ Access logging enabled □ Versioning enabled │
│ □ CloudTrail data events □ Object lock (if needed) │
│ □ S3 Inventory configured □ Lifecycle policies │
│ □ Macie scanning (sensitive) □ Cross-region replication │
│ │
└─────────────────────────────────────────────────────────────────┘
Public Access Assessment
# List all buckets
aws s3api list-buckets --query 'Buckets[].Name' --output text
# Check public access block settings for each bucket
for bucket in $(aws s3api list-buckets --query 'Buckets[].Name' --output text); do
echo "=== $bucket ==="
aws s3api get-public-access-block --bucket $bucket 2>/dev/null || echo "No public access block configured!"
done
# Expected secure configuration:
{
"PublicAccessBlockConfiguration": {
"BlockPublicAcls": true,
"IgnorePublicAcls": true,
"BlockPublicPolicy": true,
"RestrictPublicBuckets": true
}
}
# Check bucket ACLs
aws s3api get-bucket-acl --bucket bucket-name
# DANGEROUS: Look for grants to:
# - "AllUsers" (public)
# - "AuthenticatedUsers" (any AWS account)
# Check bucket policy
aws s3api get-bucket-policy --bucket bucket-name --query Policy --output text | jq .
# DANGEROUS POLICY PATTERNS:
# - Principal: "*"
# - Principal: {"AWS": "*"}
# - Condition-less Allow statements with public principal
Encryption Assessment
# Check default encryption
for bucket in $(aws s3api list-buckets --query 'Buckets[].Name' --output text); do
echo "=== $bucket ==="
aws s3api get-bucket-encryption --bucket $bucket 2>/dev/null || echo "NO ENCRYPTION CONFIGURED"
done
# Expected configuration:
{
"ServerSideEncryptionConfiguration": {
"Rules": [
{
"ApplyServerSideEncryptionByDefault": {
"SSEAlgorithm": "AES256" # or "aws:kms"
},
"BucketKeyEnabled": true
}
]
}
}
# Check if bucket policy enforces encryption
# Look for condition requiring encryption:
{
"Effect": "Deny",
"Principal": "*",
"Action": "s3:PutObject",
"Resource": "arn:aws:s3:::bucket-name/*",
"Condition": {
"StringNotEquals": {
"s3:x-amz-server-side-encryption": "AES256"
}
}
}
Logging Assessment
# Check if access logging is enabled
for bucket in $(aws s3api list-buckets --query 'Buckets[].Name' --output text); do
echo "=== $bucket ==="
aws s3api get-bucket-logging --bucket $bucket
done
# Check CloudTrail for S3 data events
aws cloudtrail describe-trails
aws cloudtrail get-event-selectors --trail-name trail-name
# Look for data events configuration:
{
"DataResources": [
{
"Type": "AWS::S3::Object",
"Values": ["arn:aws:s3:::bucket-name/"]
}
]
}
Sample S3 Finding
S3 Bucket Publicly Accessible
Affected Resource
novatech-customer-uploads (us-east-1)
Description
The S3 bucket novatech-customer-uploads is configured
with a bucket policy that allows public read access. This bucket
appears to contain customer-uploaded files from the WorkflowPro
application.
Evidence
$ aws s3api get-bucket-policy --bucket novatech-customer-uploads --query Policy --output text | jq .
{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "PublicRead",
"Effect": "Allow",
"Principal": "*",
"Action": "s3:GetObject",
"Resource": "arn:aws:s3:::novatech-customer-uploads/*"
}
]
}
$ aws s3api get-public-access-block --bucket novatech-customer-uploads
{
"PublicAccessBlockConfiguration": {
"BlockPublicAcls": false,
"IgnorePublicAcls": false,
"BlockPublicPolicy": false,
"RestrictPublicBuckets": false
}
}
# Verified public access by accessing object without credentials:
$ curl -I https://novatech-customer-uploads.s3.amazonaws.com/uploads/customer123/document.pdf
HTTP/1.1 200 OK
Impact
- Customer files accessible to anyone on the internet
- Potential data breach affecting all WorkflowPro customers
- GDPR/CCPA compliance violation
- SOC 2 control failure
Remediation
- Immediately remove the public access policy statement
- Enable public access block at bucket level
- Implement pre-signed URLs for legitimate customer access
- Review access logs for unauthorized downloads
- Enable S3 access logging and CloudTrail data events
5. Network Security Configuration
VPC configuration, security groups, and network ACLs control traffic flow in AWS. Misconfigured network controls can expose resources to unintended access.
Security Group Assessment
# List all security groups
aws ec2 describe-security-groups --query 'SecurityGroups[].{Name:GroupName,ID:GroupId,VPC:VpcId}'
# Find security groups with risky inbound rules
# 0.0.0.0/0 to sensitive ports
aws ec2 describe-security-groups --query 'SecurityGroups[*].{
Name:GroupName,
ID:GroupId,
InboundRules:IpPermissions[?contains(IpRanges[].CidrIp, `0.0.0.0/0`)]
}'
# RISKY PATTERNS:
# - SSH (22) open to 0.0.0.0/0
# - RDP (3389) open to 0.0.0.0/0
# - Database ports (3306, 5432, 1433) open to 0.0.0.0/0
# - All traffic (-1) from 0.0.0.0/0
# Check for overly permissive outbound rules
aws ec2 describe-security-groups --query 'SecurityGroups[].{
Name:GroupName,
OutboundRules:IpPermissionsEgress
}'
VPC Configuration Review
# List VPCs
aws ec2 describe-vpcs --query 'Vpcs[].{ID:VpcId,CIDR:CidrBlock,Default:IsDefault}'
# Check VPC Flow Logs (should be enabled)
aws ec2 describe-flow-logs
# Expected: Flow logs enabled for all VPCs
# If empty or missing VPCs, this is a finding
# Check Network ACLs
aws ec2 describe-network-acls --query 'NetworkAcls[].{
ID:NetworkAclId,
VPC:VpcId,
Entries:Entries
}'
# Check for default NACLs with allow all
# Default NACL allows all traffic - should be customized
Public Exposure Assessment
# Find EC2 instances with public IPs
aws ec2 describe-instances --query 'Reservations[].Instances[].{
ID:InstanceId,
Name:Tags[?Key==`Name`].Value|[0],
PublicIP:PublicIpAddress,
PrivateIP:PrivateIpAddress,
State:State.Name
}' --output table
# Find RDS instances publicly accessible
aws rds describe-db-instances --query 'DBInstances[].{
ID:DBInstanceIdentifier,
Engine:Engine,
PubliclyAccessible:PubliclyAccessible,
Endpoint:Endpoint.Address
}'
# Databases should have PubliclyAccessible: false
# Check for public Elastic IPs
aws ec2 describe-addresses
# Check for public load balancers
aws elbv2 describe-load-balancers --query 'LoadBalancers[].{
Name:LoadBalancerName,
Scheme:Scheme,
Type:Type,
VPC:VpcId
}'
# Internet-facing vs internal
Sample Network Finding
Security Group Allows SSH from Any IP
Affected Resource
Security Group: sg-0abc123def456 (bastion-sg)
Evidence
$ aws ec2 describe-security-groups --group-ids sg-0abc123def456
{
"IpPermissions": [
{
"FromPort": 22,
"ToPort": 22,
"IpProtocol": "tcp",
"IpRanges": [
{
"CidrIp": "0.0.0.0/0",
"Description": "SSH access"
}
]
}
]
}
Risk
SSH access from any IP address exposes the bastion host to brute-force attacks from the entire internet. If weak credentials exist or SSH vulnerabilities emerge, the system could be compromised.
Remediation
- Restrict SSH access to known IP ranges (office IPs, VPN ranges)
- Implement AWS Systems Manager Session Manager as SSH alternative
- If wide access is required, use AWS Network Firewall or a third-party IPS
- Enable VPC Flow Logs to monitor connection attempts
6. Additional AWS Security Checks
CloudTrail (Audit Logging)
# Verify CloudTrail is enabled
aws cloudtrail describe-trails
# Check trail status
aws cloudtrail get-trail-status --name trail-name
# Required configuration:
# - Multi-region trail enabled
# - Log file validation enabled
# - Logs encrypted with KMS
# - Logs stored in protected S3 bucket
# - CloudWatch Logs integration (for alerting)
# Check for organization trail (if AWS Organizations)
aws cloudtrail describe-trails --query 'trailList[?IsOrganizationTrail==`true`]'
RDS Security
# Check RDS encryption and public access
aws rds describe-db-instances --query 'DBInstances[].{
ID:DBInstanceIdentifier,
Engine:Engine,
Encrypted:StorageEncrypted,
PubliclyAccessible:PubliclyAccessible,
MultiAZ:MultiAZ,
AutoMinorVersionUpgrade:AutoMinorVersionUpgrade
}'
# Check for encryption
# StorageEncrypted should be true
# Check for public accessibility
# PubliclyAccessible should be false for most databases
# Check backup retention
aws rds describe-db-instances --query 'DBInstances[].{
ID:DBInstanceIdentifier,
BackupRetention:BackupRetentionPeriod
}'
# Should be > 0 (7+ days recommended)
EKS Security
# List EKS clusters
aws eks list-clusters
# Describe cluster security configuration
aws eks describe-cluster --name cluster-name --query 'cluster.{
Name:name,
Endpoint:endpoint,
EndpointPublicAccess:resourcesVpcConfig.endpointPublicAccess,
EndpointPrivateAccess:resourcesVpcConfig.endpointPrivateAccess,
PublicAccessCidrs:resourcesVpcConfig.publicAccessCidrs,
EncryptionConfig:encryptionConfig,
Logging:logging
}'
# Security considerations:
# - Endpoint should be private or restricted public access
# - Encryption should be enabled
# - Logging should be enabled
Lambda Security
# List Lambda functions
aws lambda list-functions --query 'Functions[].{Name:FunctionName,Runtime:Runtime,Role:Role}'
# Check function configuration
aws lambda get-function-configuration --function-name function-name
# Check resource-based policy (who can invoke)
aws lambda get-policy --function-name function-name
# Security considerations:
# - Functions should use recent runtimes (not deprecated)
# - Execution roles should follow least privilege
# - Environment variables should not contain secrets (use Secrets Manager)
# - VPC configuration for functions accessing internal resources
Secrets Management
# Check Secrets Manager usage
aws secretsmanager list-secrets
# Check for rotation configuration
aws secretsmanager describe-secret --secret-id secret-name --query '{
Name:Name,
RotationEnabled:RotationEnabled,
LastRotatedDate:LastRotatedDate
}'
# Findings:
# - Secrets should have rotation enabled
# - Rotation should occur regularly (30-90 days)
# - Secrets should not be stored in plaintext (environment variables, code)
7. Cloud Security Finding Summary Template
Use this template structure for documenting cloud security findings:
┌─────────────────────────────────────────────────────────────────┐
│ CLOUD SECURITY FINDING TEMPLATE │
├─────────────────────────────────────────────────────────────────┤
│ │
│ IDENTIFICATION │
│ • Finding ID (NOVA-CLOUD-XXX) │
│ • AWS Service affected │
│ • Resource ARN/identifier │
│ • AWS Region │
│ • Severity (Critical/High/Medium/Low) │
│ • Compliance mapping (CIS, SOC 2, etc.) │
│ │
│ DESCRIPTION │
│ • What is misconfigured │
│ • Why it's a security issue │
│ • How it could be exploited │
│ │
│ EVIDENCE │
│ • AWS CLI command used │
│ • Output showing the misconfiguration │
│ • Prowler/ScoutSuite check ID (if applicable) │
│ │
│ IMPACT │
│ • Data exposure risk │
│ • Compliance implications │
│ • Business impact │
│ │
│ REMEDIATION │
│ • Specific steps to fix │
│ • AWS CLI commands for remediation │
│ • AWS documentation reference │
│ • Verification steps │
│ │
└─────────────────────────────────────────────────────────────────┘
Self-Check Questions
Test your understanding of cloud security assessment:
Question 1
An S3 bucket has public access block enabled at the bucket level, but Prowler still reports it as potentially public. What could explain this?
Reveal Answer
Several possibilities:
- Account-level setting: Public access block wasn't enabled at the account level, only bucket level
- ACLs: Object-level ACLs might still allow public access on specific objects uploaded before the block was enabled
- Bucket policy: The bucket policy might have a statement that could become public if the block is removed
- Cross-account access: The bucket might allow access from another AWS account that is considered "external"
- Access points: S3 Access Points might have different public access settings
Validate by checking: account-level public access block, object ACLs, bucket policy for external principals, and Access Analyzer findings.
Question 2
You discover an IAM policy that grants iam:PassRole permission
with Resource: "*". Why is this dangerous even if the user
doesn't have other IAM permissions?
Reveal Answer
iam:PassRole with Resource: "*" allows privilege escalation:
- The user can pass ANY role in the account to AWS services
- If they can create EC2 instances or Lambda functions, they can pass an admin role to that resource
- The resource then operates with admin privileges
- The user can interact with that resource to perform admin actions
Example attack chain:
- User has ec2:RunInstances and iam:PassRole on Resource "*"
- An admin role exists: arn:aws:iam::123456789012:role/AdminRole
- User launches an EC2 instance with AdminRole attached
- User SSHs to EC2, uses instance metadata to get admin credentials
- User now has admin access
Remediation: Restrict PassRole to specific roles needed for the user's function.
Question 3
NovaTech's CloudTrail is enabled but only in us-east-1. Their DR environment is in us-west-2. What security gap does this create?
Reveal Answer
Single-region CloudTrail creates audit gaps:
- Regional services: EC2, RDS, Lambda API calls in us-west-2 won't be logged
- Attack detection: An attacker could operate in us-west-2 without leaving audit trails
- Compliance: SOC 2 requires complete audit logging; gaps are control failures
- Incident response: Can't investigate what happened in unmonitored regions
Note: Some global services (IAM, CloudFront, Route 53) are logged to us-east-1 regardless, but regional services in other regions won't be captured.
Remediation: Enable multi-region trail or create organization-level trail.
Question 4
Explain the difference between a security group and a network ACL. Which provides better security, and why might you use both?
Reveal Answer
Security Groups:
- Stateful (return traffic automatically allowed)
- Instance-level (attached to ENIs)
- Allow rules only (implicit deny)
- Evaluated before traffic reaches instance
Network ACLs:
- Stateless (must explicitly allow return traffic)
- Subnet-level (applies to all instances in subnet)
- Allow and Deny rules
- Rules evaluated in order by rule number
Neither is "better"—they serve different purposes:
- Security Groups: Fine-grained, application-specific access control
- NACLs: Subnet-wide policies, explicit deny capability, defense-in-depth
Use both for defense-in-depth: NACLs provide subnet-level baseline (block known-bad IPs, restrict ports at subnet boundary), while security groups provide instance-specific rules.
Question 5
You find that NovaTech's RDS PostgreSQL instance has PubliclyAccessible: true
but the security group only allows access from the application security group.
Is this a finding? Why or why not?
Reveal Answer
Yes, this is still a finding, though mitigated:
- Defense-in-depth violation: Security depends on a single control (security group)
- Configuration drift risk: If someone modifies the security group, database becomes exposed
- Public IP assigned: The database has a public IP that could be targeted
- Compliance: CIS Benchmarks and many compliance frameworks require PubliclyAccessible: false
- Attack surface: Even with security group protection, the public endpoint could be probed
Recommendation: Set PubliclyAccessible: false. The database
can still be accessed via private IP from the VPC. This provides defense-in-depth
and meets compliance requirements.
Report as: Medium severity with note about mitigating control.
Question 6
How would you verify that encryption at rest is actually working for an S3 bucket, beyond just checking the configuration?
Reveal Answer
Verification steps beyond configuration check:
- Check individual objects: run aws s3api head-object --bucket bucket-name --key object-key and look for ServerSideEncryption in the response
- Upload a test object and verify: aws s3 cp test.txt s3://bucket-name/, then aws s3api head-object --bucket bucket-name --key test.txt to confirm encryption was applied
- Check bucket policy enforcement: look for Deny statements that reject unencrypted uploads
- Sample existing objects: check several existing objects to ensure they're encrypted (objects uploaded before default encryption was enabled may be unencrypted)
- For KMS encryption, verify key access: aws kms describe-key --key-id key-id and confirm the KMS key is enabled and accessible
Lab: AWS Security Configuration Assessment
Objective
Conduct a comprehensive AWS security configuration assessment using automated tools and manual verification, documenting findings with remediation guidance.
Deliverables
- Prowler Assessment Report: Full HTML/JSON output from Prowler scan
- Manual Verification Notes: CLI commands and outputs validating key findings
- Cloud Security Findings Document: Top 10 findings documented professionally
Time Estimate
4-5 hours
Practice Environments
For hands-on AWS security practice without a real account:
- AWS Free Tier: Create free account, deploy intentional misconfigurations — aws.amazon.com/free
- CloudGoat: Vulnerable-by-design AWS deployment — GitHub
- flAWS Challenge: AWS security CTF — flaws.cloud
- flAWS2 Challenge: More advanced AWS security CTF — flaws2.cloud
- Sadcloud: Terraform-deployable vulnerable infrastructure — GitHub
If using your own AWS account, ensure you understand the costs and clean up resources after testing. Never run security tools against accounts you don't own.
Lab Tasks
Part 1: Tool Setup (LO2)
- Install Prowler: pip3 install prowler
- Configure AWS credentials (for your practice account)
- Verify credentials work: aws sts get-caller-identity
- Install additional tools (Steampipe, ScoutSuite) if desired
Part 2: Automated Assessment (LO2, LO4)
- Run a full Prowler assessment: prowler aws -M html json -o ./prowler-results/
- Review the HTML report for a findings overview
- Categorize findings by:
- Severity (Critical, High, Medium, Low)
- Service (IAM, S3, EC2, etc.)
- Compliance framework (CIS, SOC 2)
- Identify top 10 findings requiring remediation
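The categorization and top-10 selection above can be scripted against Prowler's JSON output. This is a hedged sketch: the field names (Status, Severity, ServiceName, CheckTitle) follow Prowler v3's JSON schema and may differ in other versions, and the sample findings are invented for illustration:

```python
from collections import Counter

def triage(findings: list[dict], top_n: int = 10):
    """Count failed findings by severity and service, and return the
    highest-severity failures first for the top-N remediation list."""
    order = {"critical": 0, "high": 1, "medium": 2, "low": 3}
    fails = [f for f in findings if f.get("Status") == "FAIL"]
    by_severity = Counter(f.get("Severity", "unknown").lower() for f in fails)
    by_service = Counter(f.get("ServiceName", "unknown") for f in fails)
    top = sorted(fails, key=lambda f: order.get(f.get("Severity", "").lower(), 9))
    return by_severity, by_service, [f.get("CheckTitle") for f in top[:top_n]]

# Illustrative findings, not real scan output
sample_findings = [
    {"Status": "FAIL", "Severity": "critical", "ServiceName": "iam",
     "CheckTitle": "Root account has active access keys"},
    {"Status": "FAIL", "Severity": "medium", "ServiceName": "s3",
     "CheckTitle": "Bucket without server-side encryption"},
    {"Status": "PASS", "Severity": "high", "ServiceName": "ec2",
     "CheckTitle": "IMDSv2 enforced"},
]
sev, svc, top = triage(sample_findings)
print(sev["critical"], svc["s3"], top[0])
```

Loading the real `prowler-results/` JSON file with `json.load` and feeding it to a helper like this gives you the severity/service breakdown in seconds.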
Part 3: IAM Deep Dive (LO3)
- Generate and analyze the credential report:
aws iam generate-credential-report
aws iam get-credential-report --output text --query Content | base64 -d > cred-report.csv
- Check the password policy
- Review IAM policies for overly permissive statements
- Check for privilege escalation paths
- Document IAM findings
Part 4: S3 Security Review (LO4)
- List all buckets and check public access block
- Review bucket policies for dangerous statements
- Verify encryption configuration
- Check access logging configuration
- Document S3 findings
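For the bucket policy review, the most dangerous pattern is an unconditioned Allow granting access to everyone. A minimal sketch of that check, assuming the policy JSON shape returned by `aws s3api get-bucket-policy` (the sample policy is invented for illustration):

```python
import json

def is_policy_public(bucket_policy_json: str) -> bool:
    """Return True if any Allow statement grants access to all principals
    (Principal "*" or {"AWS": "*"}) without a restricting Condition."""
    doc = json.loads(bucket_policy_json)
    for stmt in doc.get("Statement", []):
        principal = stmt.get("Principal")
        open_principal = principal == "*" or (
            isinstance(principal, dict) and principal.get("AWS") == "*"
        )
        if stmt.get("Effect") == "Allow" and open_principal and "Condition" not in stmt:
            return True
    return False

# Hypothetical world-readable bucket policy
public = ('{"Statement": [{"Effect": "Allow", "Principal": "*", '
          '"Action": "s3:GetObject", "Resource": "arn:aws:s3:::demo-bucket/*"}]}')
print(is_policy_public(public))  # True
```

Note this is deliberately conservative: statements with a Condition are skipped even though some conditions (e.g. a broad aws:Referer match) still leave a bucket effectively public, so flagged-clean policies still deserve a manual read.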
Part 5: Network Security Review (LO4)
- Review security groups for risky rules
- Check for publicly accessible resources
- Verify VPC flow logs are enabled
- Review network ACL configurations
- Document network findings
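The security group review above can be partially automated by filtering ingress rules for world-open sensitive ports. A sketch assuming the IpPermissions structure returned by `aws ec2 describe-security-groups` (the sample rules are illustrative; IPv6 ranges in Ipv6Ranges would need the same treatment):

```python
def risky_rules(ip_permissions, sensitive_ports=(22, 3389, 3306, 5432)):
    """Flag ingress rules open to 0.0.0.0/0 on sensitive ports
    (SSH, RDP, MySQL, PostgreSQL) or allowing all traffic."""
    flagged = []
    for rule in ip_permissions:
        world = any(r.get("CidrIp") == "0.0.0.0/0" for r in rule.get("IpRanges", []))
        if not world:
            continue
        if rule.get("IpProtocol") == "-1":  # all traffic: no port range present
            flagged.append(rule)
            continue
        lo, hi = rule.get("FromPort"), rule.get("ToPort")
        if lo is not None and any(lo <= p <= hi for p in sensitive_ports):
            flagged.append(rule)
    return flagged

# Illustrative rules: SSH open to the world (risky), HTTPS open (expected)
rules = [
    {"IpProtocol": "tcp", "FromPort": 22, "ToPort": 22,
     "IpRanges": [{"CidrIp": "0.0.0.0/0"}]},
    {"IpProtocol": "tcp", "FromPort": 443, "ToPort": 443,
     "IpRanges": [{"CidrIp": "0.0.0.0/0"}]},
]
print(len(risky_rules(rules)))  # 1
```

Whether a world-open port is a finding still depends on context (a public load balancer on 443 is normal; SSH rarely is), so treat the output as a candidate list, not a verdict.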
Part 6: Documentation (LO5)
- Select top 10 findings from assessment
- Document each finding using the cloud security template
- Include:
- AWS CLI evidence
- Prowler check reference
- Compliance mapping
- Specific remediation steps
- Create executive summary of cloud security posture
Self-Assessment Checklist
Tool Execution
- ☐ Prowler completed successfully
- ☐ Reports generated in multiple formats
- ☐ Manual CLI verification performed
- ☐ Findings validated (not just accepted from tool)
Assessment Coverage
- ☐ IAM security thoroughly reviewed
- ☐ S3 security assessed
- ☐ Network configuration examined
- ☐ Logging and monitoring checked
Documentation Quality
- ☐ Top 10 findings professionally documented
- ☐ Evidence includes CLI commands and outputs
- ☐ Remediation guidance is specific and actionable
- ☐ Compliance mapping included
Professional Standards
- ☐ Findings could be presented to client
- ☐ Severity ratings justified
- ☐ Business impact articulated
- ☐ AWS best practice references included
Portfolio Integration
Save your cloud security assessment deliverables:
06-cloud-security/
- prowler-results/ — Automated scan outputs
- manual-verification/ — CLI commands and outputs
- cloud-security-findings.pdf — Top 10 documented findings
- executive-summary.pdf — Cloud security posture overview
🎯 Hands-On Capstone Activities
Week 6 capstone activities - Cloud Security Review
🎯 Cloud Security Review Lab
Deliverable: Professional capstone component ready for portfolio
Time estimate: 4-8 hours
☁️ Cloud Security Case Study
What you'll do: Convert your top three cloud findings into a concise case study with risks, business impact, and prioritized fixes.
Why it matters: Case studies show you can translate technical findings into business outcomes.
Time estimate: 2-3 hours
Deliverable: 1-2 page cloud security case study for portfolio
💡 Capstone Strategy: This work becomes your portfolio—make it job-interview ready.
Resources
CIS Amazon Web Services Foundations Benchmark
Industry-standard security configuration benchmark for AWS. Prowler checks align with these recommendations.
CIS AWS Benchmark 90 minutes (review key sections)
AWS Security Best Practices
AWS's official guidance on securing your AWS environment, covering IAM, data protection, infrastructure security, and incident response.
AWS Well-Architected Security Pillar 60 minutes
flAWS Challenge
Hands-on challenge walking through common AWS security mistakes. Excellent practical learning for S3 and IAM misconfigurations.
flaws.cloud 2-3 hours
Weekly Reflection
Prompt
Reflect on how cloud security assessment differs from traditional infrastructure vulnerability assessment. How does the Shared Responsibility Model change what you're looking for? What surprised you about the types of misconfigurations that tools like Prowler identify?
Consider NovaTech's rapid growth following their Series C funding. How might fast-moving organizations balance speed of deployment against security configuration? What processes or tools could help prevent the types of misconfigurations you identified without slowing development?
Target length: 250-350 words