- Continuous Clearing Tool
- Introduction
- Continuous Clearing Tool workflow diagram
- Prerequisite
- Installation Guide
- Demo project after consuming the package
- Continuous Clearing Tool Execution
- Overview
- Prerequisite for Continuous Clearing Tool execution
- SPDX v2.3 Support
- SPDX SBOM Signature Validator
- SBOM Signing and Verification
- Overview
- Configuration
- Azure DevOps Pipeline Integration
- Parameters Reference
- Configuring the Continuous Clearing Tool
- Below rows repeat for each supported package type.
- Continuous Clearing Tool Execution
- Prerequisite
- Package Identifier
- SW360 Package Creator
- Artifactory Uploader
- Prerequisite
- Package Identifier
- SW360 Package Creator
- Artifactory Uploader
- Artifactory Uploader Release Execution
- How to handle multiple project types in same project
- Templates
- Troubleshoot
- Manual Update
- Bug or Enhancements
- Glossary of Terms
- References
Welcome to the Continuous Clearing Tool, your automated solution for streamlining the SW360 clearing process. Designed with Project Managers and Developers in mind, this tool efficiently manages third-party components across various platforms, including npm, NuGet, Maven, Python, Conan, Choco, Cargo, Alpine, and Debian.
- Automated Scanning and Identification: The tool automatically scans and identifies third-party components in your projects.
- Integration with SW360: It creates entries in SW360 for any components not already present, linking them to their respective projects.
- FOSSology Code Scanning: Initiates jobs for code scans in FOSSology, ensuring compliance and thorough analysis.
- SBOM Generation: Produces a Software Bill of Materials (SBOM) file detailing the nested descriptions of software artifact components and associated metadata.
- Complete Dependency Mapping: Generates SBOMs with a comprehensive
dependenciessection that lists all direct and transitive package relationships, providing full traceability for compliance, security, and auditing. - SBOM Signing: Provides cryptographic signing and verification capabilities to ensure SBOM integrity and authenticity using Azure Key Vault.
- Efficiency in Component Management: Reduces the manual effort required to create and manage components in SW360.
- Error Reduction: Minimizes the risk of manual errors while creating components and identifying the correct version of source codes from public repositories.
- Harmonized Component Creation: Streamlines and harmonizes the creation of third-party components by automatically filling in necessary information in SW360.
- Security and Trust: Ensures SBOM integrity through cryptographic signatures, providing confidence in supply chain security.
Simply integrate the Continuous Clearing Tool into your project workflow to experience seamless clearing processes and enhanced productivity.
- Package Identifier
- SW360 Package Creator
- Artifactory Uploader
To ensure a smooth operation of the Continuous Clearing Tool, please follow these prerequisites:
- Project Entry in SW360:
- Make sure your project is registered in SW360 for license clearance and is set to an Active state when running the Continuous Clearing Tool.
- Access Requirements:
  - SW360 REST API Authentication Token:
    - Users can generate a token from their functional account.
    - Required credentials include the client ID and client secret.
  - Artifactory Token:
    - Necessary for uploading cleared, internal, and development packages into JFrog Artifactory. Users must obtain their own JFrog Artifactory token.
For certain scenarios, the tool uses predefined exit codes, which are described below:
| Exit Code | Scenario |
|---|---|
| 0 | Success |
| 1 | Critical failure/error in the run |
| 2 | Action item required from user's side |
When configuring the Continuous Clearing Tool in the pipeline, users can set up each stage to reflect results based on these exit codes. This configuration can be implemented by the configuration management team during pipeline modification to support the Continuous Clearing Tool.
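A pipeline step can branch on these exit codes. The sketch below is illustrative Python, not part of the tool itself; the outcome labels are hypothetical names chosen for this example.

```python
# Map the tool's documented exit codes to pipeline outcomes.
# The outcome labels here are illustrative, not defined by the tool.
EXIT_OUTCOMES = {
    0: "succeeded",              # clean run
    1: "failed",                 # critical failure - stop the pipeline
    2: "succeeded_with_issues",  # action items pending on the user's side
}

def classify_run(exit_code):
    """Translate a Continuous Clearing Tool exit code into a pipeline outcome."""
    return EXIT_OUTCOMES.get(exit_code, "failed")
```

A pipeline task would run the tool, capture its exit code, and set the stage result from `classify_run`.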
Once configured, your pipeline will look something like this:
Setting up the Continuous Clearing Tool is straightforward with these installation methods:
Deploy the Continuous Clearing Tool instantly using Docker by pulling the latest container image with the following command:
```
docker pull ghcr.io/siemens/continuous-clearing:latest
```

Integrate the tool into your .NET projects by downloading the NuGet package:
- Download the .nupkg file from the GitHub releases page to obtain the latest version of the package.
Dive into practical examples of how the Continuous Clearing Tool can be integrated and utilized within your projects. We provide sample YAML files that demonstrate the tool's setup and functionality.
Explore these configurations in the DemoProject. These samples are crafted to guide and inspire your pipeline configurations, ensuring a smooth and effective integration process after consuming the package.
The Continuous Clearing Tool comprises three executable DLLs, each playing a crucial role in achieving a comprehensive license clearing process. Execute them sequentially as listed below:
Note: The SBOM created by this tool follows CycloneDX v1.6 and the Siemens SBOM standard v3. These formats ensure the SBOM is detailed, secure, and meets industry and Siemens-specific requirements.
Enhanced Dependency Mapping: The generated SBOM now includes a complete dependencies section that maps all direct and transitive package relationships. This enhancement:
- Lists every package dependency with its `dependsOn` relationships
- Provides full traceability of the dependency tree
- Supports cdxgen-generated SBOM enrichment for more accurate dependency data
- Automatically marks components as direct dependencies using the `siemens:direct` property
- Validates and removes invalid dependency references to ensure SBOM integrity
See the Acknowledgments section for cdxgen credit and licensing details.
1. Package Identifier
- This DLL processes the input file and generates a CycloneDX BOM file with comprehensive dependency mapping. The input can be a package file or a CycloneDX BOM file created using a standard tool. If multiple input files are present, simply provide the path to the directory as an argument.
- cdxgen Integration (for NPM, NuGet, Maven, Poetry): For enhanced dependency accuracy, provide a cdxgen-generated SBOM file named `cdx_dep.json` in your input directory alongside your package files. The tool will automatically detect and merge dependency information from cdxgen output. This file pattern is configured in the `Include` section of appSettings.json for supported project types.
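Conceptually, the merge keeps the components the tool identified and enriches them with dependency relationships from the cdxgen output. The sketch below is illustrative Python; the merge policy shown (drop references to unknown components) is an assumption for illustration, not the tool's exact implementation.

```python
def merge_cdxgen_dependencies(tool_bom, cdxgen_bom):
    """Enrich the tool's BOM with 'dependencies' entries from a cdxgen SBOM,
    keeping only references that resolve to known components."""
    known_refs = {c["bom-ref"] for c in tool_bom.get("components", [])}
    merged = []
    for dep in cdxgen_bom.get("dependencies", []):
        if dep["ref"] in known_refs:
            merged.append({
                "ref": dep["ref"],
                # drop dependsOn entries pointing at unknown components
                "dependsOn": [d for d in dep.get("dependsOn", []) if d in known_refs],
            })
    tool_bom["dependencies"] = merged
    return tool_bom
```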
Functionality Without Connections: Users have the flexibility to generate a basic SBOM even if connections to SW360, JFrog, or both are unavailable. The tool maintains essential SBOM generation functionality with limited capabilities in such scenarios.
2. SW360 Package Creator
- Processes the SBOM file (output from the Package Identifier) to create missing components/releases in SW360 and link all components to the project within the SW360 portal. This executable also triggers the upload of components to Fossology and automatically updates the clearing state in SW360.
Note: Since the Package Identifier generates an SBOM file with Dev dependencies and internal components, ensure the RemoveDevDependency flag is set to true when executing this DLL.
3. Artifactory Uploader
- This DLL processes the CycloneDX BOM file generated by the SW360 Package Creator. It targets components with a cleared status ("Report approved") and facilitates their transfer from a remote repository to the configured third-party repository in JFrog Artifactory. Additionally, it manages the transfer of development components from the remote repository to the designated development repository. Internal packages are relocated to the configured release repository.
Note: The default setting for the JFrog dry run is true. This flag is intended to perform a dry run of the component copy/move operation, verifying the components' paths and permissions before executing the actual operation.
- Input files according to project type

- Project Type: npm

- Project Type: NuGet
  - .NET Core/.NET Standard projects: the input file repository should contain a project.assets.json file. If it is not present, run `dotnet restore`.
  - .NET Framework projects: the input file repository should contain a packages.config file.
- Project Type: Maven
  - Apache Maven has to be installed on the build machine and added to the PATH variable. Add the CycloneDX Maven plugin to the main pom.xml and run the following command to generate the input BOM file:

    ```
    mvn install cyclonedx:makeAggregateBom
    ```
  - The input file repository should contain a bom.cdx.json file, which is the output of the CycloneDX Maven plugin.
  - Note: In case your project has internal dependencies, compile the project prior to running the clearing tool:

    ```
    mvn clean install -DskipTests=true
    ```
- Note: To enable development dependency identification, provide two BOM files in the input folder.

  Step 1 - Generate the full BOM (all scopes, including test and provided):

  ```
  mvn install cyclonedx:makeAggregateBom -DincludeTestScope=true -DoutputName=bom
  ```

  Step 2 - Generate the production-only BOM (compile and runtime scopes only):

  ```
  mvn install cyclonedx:makeAggregateBom -DincludeTestScope=false -DoutputName=bom-without
  ```

  Step 3 - Place both generated files in the same input folder:

  ```
  InputFolder/
  ├── bom.cdx.json          ← all dependencies (including test/dev)
  └── bom-without.cdx.json  ← production dependencies only
  ```

  CCTool will compare the two files. Any package present in `bom.cdx.json` but absent from `bom-without.cdx.json` will be marked as a development dependency (IsDevelopment = true).

  Note: The file names `bom.cdx.json` and `bom-without.cdx.json` are not compulsory. Any two files matching the `Include` pattern (`*.cdx.json` by default in appSettings.json) will work. CCTool automatically treats the file with more components as the full BOM and the file with fewer components as the production BOM; file naming and scan order do not matter.

  Note: If only one BOM file is provided, CCTool will not identify any development dependencies; all packages will be treated as production dependencies.
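The two-file comparison can be sketched as follows (illustrative Python; the real tool is .NET and the function name here is hypothetical):

```python
def mark_development_dependencies(full_bom_refs, prod_bom_refs):
    """Packages present in the full BOM but absent from the production BOM
    are development dependencies (IsDevelopment = true)."""
    # The file with more components is treated as the full BOM,
    # so argument order does not matter.
    if len(prod_bom_refs) > len(full_bom_refs):
        full_bom_refs, prod_bom_refs = prod_bom_refs, full_bom_refs
    prod = set(prod_bom_refs)
    return {ref: ref not in prod for ref in full_bom_refs}
```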
- Project Type: Python
  - The input file repository should contain a poetry.lock file.
- Project Type: Conan
  - The input file repository should contain a *.dep.json file.
  - Note: Only Conan v2 is supported.
  - If you previously used `conan.lock` files with Conan v1, you now need to generate `*.dep.json` files using the `conan graph info` command.
  - To generate the required dependency graph file for Conan v2 projects, run the following command in your project directory where conanfile.py is present:

    ```
    conan graph info . -f json > conan.dep.json
    ```
  - The file name should end with dep.json, as appSettings.json is configured to pick up files with the *.dep.json suffix.
  - Users can change the file pattern in the app settings if required.
- Project Type: Cargo
  - To generate a metadata file for your project, run the command below in your project directory (where your Cargo.toml is located). For creating the metadata.json file, use format version 1.

    Example:

    ```
    cargo metadata --format-version 1 > cargo.metadata.json
    ```

    After successful execution, a *.metadata.json file will be created in the specified directory.
  - The resulting cargo.metadata.json file contains the list of installed packages and is used as input to the Continuous Clearing Tool Package Identifier via the input directory parameter. The remaining process is the same as for other project types.
- Project Type: Debian & Alpine

  Note: The steps below are required only if you have a tar file to process; otherwise you can keep the CycloneDX.json file in the input directory.
  - Create an InputImage directory for keeping tar images and an InputDirectory for storing the resulting file.
  - Run the command given below, replacing the placeholder values (i.e., path to the input image directory, path to the input directory, and file name of the Debian image to be cleared) with actual values.

    Example:

    ```
    docker run --rm -v <path/to/InputImageDirectory>:/tmp/InputImages -v <path/to/InputDirectory>:/tmp/OutputFiles ghcr.io/siemens/continuous-clearing /opt/DebianImageClearing/./syft /tmp/InputImages/<fileNameoftheImageTobeCleared.tar> -o cyclonedx-json --file "/tmp/OutputFiles/output.sbom.cdx.json"
    ```

    After successful execution, an output.sbom.cdx.json (CycloneDX JSON) file will be created in the specified directory.
  - The resulting output.sbom.cdx.json file contains the list of installed packages and is used as input to the Continuous Clearing Tool Package Identifier via the input directory parameter. The remaining process is the same as for other project types.
- Project Type: Choco (Chocolatey)
  - The input file repository should contain a choco.config file.
  - Manual license clearing in SW360 is required for Choco packages.
The Package Identifier supports importing both supported and unsupported SPDX SBOMs and processes them correctly for inclusion in workflows.
- Automatic detection of SPDX files with the `.spdx.sbom.json` suffix from the input directory
- Conversion of SPDX files to CycloneDX SBOM format while preserving all relationships
- Addition of the custom property "internal:siemens:clearing:spdx-file-name" in the converted CycloneDX SBOM
- Support for both single and multiple SPDX file processing
- SPDX files should use the suffix `.spdx.sbom.json`
- Example: `component.spdx.sbom.json`
The tool now includes automated validation of SPDX SBOM signatures and certificates to ensure integrity and authenticity.
- The input file repository should contain spdx.sbom.json files.

For each SPDX SBOM file, the following associated files are expected:

```
example.spdx.sbom.json      # SBOM file
example.spdx.sbom.json.sig  # Signature file
example.spdx.sbom.json.pem  # Public certificate file
```

- The system automatically detects SPDX SBOM files in the input directory
- For each SBOM file, it locates the corresponding `.sig` and `.pem` files
- Performs signature verification using the public certificate
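The file-pairing step can be sketched like this (illustrative Python; the actual cryptographic verification is performed by the tool using the public certificate):

```python
from pathlib import Path

def find_signed_sboms(input_dir):
    """Pair each *.spdx.sbom.json file with its .sig and .pem companions.
    Returns only complete triples that are ready for signature verification."""
    triples = []
    for sbom in sorted(Path(input_dir).glob("*.spdx.sbom.json")):
        sig = sbom.with_name(sbom.name + ".sig")
        pem = sbom.with_name(sbom.name + ".pem")
        if sig.exists() and pem.exists():
            triples.append((sbom, sig, pem))
    return triples
```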
The Continuous Clearing Tool provides comprehensive SBOM (Software Bill of Materials) signing and verification capabilities to ensure the integrity and authenticity of generated SBOMs. This feature uses Azure Key Vault for secure certificate management and cryptographic operations.
Key Features:
- Signing: Signs SBOMs using RSA-SHA256 algorithm with certificates stored in Azure Key Vault
- Signature Verification: Validates SBOM signatures to ensure integrity and authenticity
- Mandatory Signing: Option to enforce SBOM signing for all generated SBOMs
- Flexible Configuration: Can be enabled/disabled based on project requirements
Workflow:
- SBOM Generation: Tool generates a CycloneDX SBOM during the package identification process
- Automatic Signing: If signing is enabled, the SBOM is automatically signed using Azure Key Vault
- Signature Embedding: The cryptographic signature is embedded directly into the SBOM JSON structure
- Verification: Signed SBOMs can be verified at any time to ensure they haven't been tampered with
The SBOM signing feature can be configured in two ways:
- Through appSettings.json (for standalone execution):
```json
{
  "SbomSigning": {
    "KeyVaultURI": "https://your-keyvault.vault.azure.net/",
    "CertificateName": "your-signing-certificate",
    "ClientId": "your-app-registration-client-id",
    "ClientSecret": "your-app-registration-client-secret",
    "TenantId": "your-azure-tenant-id",
    "SBOMSignVerify": true
  }
}
```

Note: TenantId is the Siemens AG tenant ID by default.

- Through Azure DevOps Variable Groups (recommended for pipeline execution)
Configuration Behavior:
- SBOMSignVerify = true (default): SBOM signing is mandatory. The tool will:
- Generate signed SBOMs automatically
- Fail if signing credentials are missing or invalid
- Ensure all output SBOMs contain valid cryptographic signatures
- SBOMSignVerify = false: SBOM signing is optional. The tool will:
- Generate unsigned SBOMs (standard workflow)
- Skip signing operations entirely
- Continue normal processing without signature validation
Required Variables:
| Variable Name | Description |
|---|---|
| `keyVaultUri` | The Key Vault containing the certificate to be used |
| `certificateName` | The certificate name stored in the Key Vault |
| `clientId` | Client ID of the App Registration with access to the Key Vault |
| `clientSecret` | Client Secret of the App Registration with access to the Key Vault |
| `tenantId` | Azure tenant ID; defaults to Siemens AG |

*Required when SBOMSignVerify is true.
When signing is enabled, the generated SBOM will include a signature section:
```json
{
  "bomFormat": "CycloneDX",
  "specVersion": "1.6",
  "version": 1,
  "metadata": {
    "timestamp": "2024-01-15T10:30:00Z"
    // ... other metadata
  },
  "components": [
    // ... component list
  ],
  "dependencies": [
    // ... dependency mappings
  ],
  "signature": {
    "algorithm": "http://www.w3.org/2001/04/xmldsig-more#rsa-sha256",
    "value": "MEUCIQDw8yKn...base64-encoded-signature...xyz="
  }
}
```

Signature Details:
- Algorithm: Uses RSA-SHA256 (`http://www.w3.org/2001/04/xmldsig-more#rsa-sha256`)
- Value: Base64-encoded cryptographic signature of the SBOM content (excluding the signature field itself)
- Process: The tool removes any existing signature, computes the hash of the clean SBOM, signs the hash, and embeds the signature
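The remove-hash-sign-embed sequence can be sketched as follows. This is an illustrative Python sketch, not the tool's implementation: the `sign` callable is a stand-in for the RSA operation that Azure Key Vault performs, and the canonical JSON serialization shown is an assumption.

```python
import base64
import hashlib
import json

def sign_sbom(sbom, sign):
    """Remove any existing signature, hash the clean SBOM, sign the hash via
    the supplied callable (Azure Key Vault in the real tool), embed the result."""
    # Strip any existing signature so it is excluded from the hashed content.
    clean = {k: v for k, v in sbom.items() if k != "signature"}
    digest = hashlib.sha256(
        json.dumps(clean, sort_keys=True, separators=(",", ":")).encode()
    ).digest()
    sbom["signature"] = {
        "algorithm": "http://www.w3.org/2001/04/xmldsig-more#rsa-sha256",
        "value": base64.b64encode(sign(digest)).decode(),
    }
    return sbom
```

Because the existing signature is stripped before hashing, re-signing an already-signed SBOM yields the same signature value for unchanged content.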
- Certificate Management:
- Certificates are centrally managed in Azure Key Vault
- All projects use the same trusted signing certificate
- Validation:
- Always verify signed SBOMs in downstream processes
- Implement signature verification in your SBOM consumption workflows
- Log and alert on signature validation failures
Arguments can be provided to the tool in two ways:
Copy content from the sample app settings and create a new appSettings.json file in the Continuous Clearing Tool Config directory.
The app settings can be passed to the tool via the command line parameter --settingsfilepath. The structure of the app settings can be found here.
Description for the settings in appSettings.json file
| S.No | Argument Name | Description | Mandatory | Example |
|---|---|---|---|---|
| 1 | TimeOut | Timeout in seconds | No | 400 |
| 2 | ProjectType | Type of the project | Yes | NuGet, npm, Poetry, Conan, Choco, Alpine, Debian, Maven, Cargo |
| 3 | MultipleProjectType | Whether multiple project types are supported | No | False |
| 4 | Telemetry.Enable | Enable telemetry | No | False |
| 5 | Telemetry.ApplicationInsightsConnectionString | Application Insights instrumentation key | No | 123-456-789-123-123 |
| 6 | SW360.URL | URL of the SW360 server | Yes | https://sw360.example.com |
| 7 | SW360.ProjectName | Name of the SW360 project | Yes | MyProject |
| 8 | SW360.ProjectID | ID of the SW360 project | Yes | 57362e4179ce4e839f286ddf0b91d177 |
| 9 | SW360.AuthTokenType | Type of the SW360 token | Yes | Bearer or Token |
| 10 | SW360.Token | Auth token for SW360 | Yes | xxxxxx |
| 11 | SW360.Fossology.URL | URL of Fossology server | Yes | https://fossology.example.com |
| 12 | SW360.Fossology.EnableTrigger | Enable Fossology scan trigger | No | True |
| 13 | SW360.IgnoreDevDependency | Ignore development dependencies | No | True |
| 14 | SW360.ExcludeComponents | Components to exclude (PURL format or ComponentName:Version) | No | ["pkg:npm/foobar@12.3.1", "foobar:12.3.1", "foobar:12.*", "foobar:*"] |
| 15 | Directory.InputFolder | Path to input directory | Yes | "/mnt/Input" |
| 16 | Directory.OutputFolder | Path to output directory | Yes | "/mnt/Output" |
| 17 | Jfrog.URL | URL of JFrog Artifactory | Yes | https://jfrog.example.com |
| 18 | Jfrog.Token | Token for authenticating to JFrog | Yes | xxxxxx |
| 19 | Jfrog.DryRun | Enable dry run (no actual copy/move) | No | True |
| 20 | SbomSigning.KeyVaultURI | Azure Key Vault URI containing signing certificate | No* | "https://your-keyvault.vault.azure.net/" |
| 21 | SbomSigning.CertificateName | Name of the certificate in Key Vault | No* | "signing-certificate" |
| 22 | SbomSigning.ClientId | Azure AD App Registration Client ID | No* | "12345678-1234-1234-1234-123456789abc" |
| 23 | SbomSigning.ClientSecret | Azure AD App Registration Client Secret | No* | "your-client-secret" |
| 24 | SbomSigning.TenantId | Azure tenant ID (Siemens AG by default) | No* | "your-tenant-id" |
| 25 | SbomSigning.SBOMSignVerify | Enable mandatory SBOM signing | No | true (default) |
*Required when SbomSigning.SBOMSignVerify is true
| S.No | Argument Name | Description | Mandatory | Example |
|---|---|---|---|---|
| 26 | Npm.Include | File patterns to include for NPM | Yes | ["p*-lock.json", "*.cdx.json"] |
| 27 | Npm.Exclude | Folders/files to exclude for NPM | No | ["node_modules"] |
| 28 | Npm.Artifactory.ThirdPartyRepos | 3rd-party NPM repos and upload toggle | Yes | [{"Name": "npm-remote", "Upload": true}] |
| 29 | Npm.Artifactory.InternalRepos | Internal NPM repos | Yes | ["npm-internal"] |
| 30 | Npm.Artifactory.DevRepos | Development NPM repos | Yes | ["npm-dev"] |
| 31 | Npm.Artifactory.RemoteRepos | Remote NPM repos | Yes | ["npm-remote"] |
| 32 | Npm.ReleaseRepo | NPM release repository name | Yes | "npm-release" |
| 33 | Npm.DevDepRepo | NPM dev dependency repo name | Yes | "npm-devdep" |
You can also pass the above-mentioned arguments on the command line. Note: If the second approach is followed, make sure you provide all the settings mentioned in appSettings.json on the command line.
- Secrets Management: Sensitive data such as the JFrog token, SW360 token, and SBOM signing credentials should be passed as secure variables via command line parameters. This practice ensures that confidential information remains protected.
- Project Configuration: Project-specific details such as the project type, SW360 project ID, project name, and directories can be conveniently passed as command line parameters. This allows for flexible and dynamic execution based on project requirements.
- Application Settings: For other configuration details, maintain them in an `appSettings` file. You can then pass the path to this settings file using the `--settingsfilepath` command line option. This approach centralizes configuration management, making it easier to track and update.
To exclude any components, configure the packageName:version or the PURL in the SW360:ExcludeComponents field, either in appSettings.json or via the command line parameter.
- In case you want to exclude a single component of the format "@group/componentname", e.g. @angular/common, specify it as "@group/componentname:version", i.e. @angular/common:4.2.6
- If multiple versions of the same component have to be excluded, specify it as "@group/componentname:*", i.e. @angular/common:*
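The matching behavior for name:version entries can be sketched with simple glob-style matching (illustrative Python; the tool's exact matching rules may differ, and PURL entries are compared against the component's PURL instead):

```python
from fnmatch import fnmatch

def is_excluded(name, version, patterns):
    """Check a component against ExcludeComponents entries of the form
    'name:version', 'name:12.*' or 'name:*'."""
    candidate = f"{name}:{version}"
    return any(fnmatch(candidate, pattern) for pattern in patterns)
```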
To exclude specific folders from the execution, specify them under the Exclude section of that specific package type.
The Continuous Clearing Tool can be executed as a container or as binaries.
Docker run
-
Install Docker (Latest stable version).
-
Create local directories for mapping to the Continuous Clearing Tool container directories:
- Input : Place to keep input files.
- Output : Resulted files will be stored here.
- Log : Continuous clearing log files.
- CAConfig : Place to keep Config files i.e., appSettings.json.
Note: It is not recommended to use the primary drive (e.g., C:) for project execution or directory creation; the drive used should also be configured as a shared drive in Docker.
- To run PackageIdentifier.dll, execute the command below.

  Example:

  ```
  docker run --rm -it -v /path/to/InputDirectory:/mnt/Input -v /path/to/OutputDirectory:/mnt/Output -v /path/to/LogDirectory:/var/log -v /path/to/configDirectory:/etc/CATool ghcr.io/siemens/continuous-clearing dotnet PackageIdentifier.dll --settingsfilepath /etc/CATool/appSettings.json
  ```

- With SBOM signing enabled (credentials from appSettings.json or command line parameters):

  Example:

  ```
  docker run --rm -it -v /path/to/InputDirectory:/mnt/Input -v /path/to/OutputDirectory:/mnt/Output -v /path/to/LogDirectory:/var/log -v /path/to/configDirectory:/etc/CATool ghcr.io/siemens/continuous-clearing dotnet PackageIdentifier.dll --settingsfilepath /etc/CATool/appSettings.json --SbomSigning:SBOMSignVerify true --SbomSigning:KeyVaultURI "https://your-keyvault.vault.azure.net/" --SbomSigning:CertificateName "your-certificate" --SbomSigning:ClientId "your-client-id" --SbomSigning:ClientSecret "your-client-secret" --SbomSigning:TenantId "your-tenant-id"
  ```
- To run SW360PackageCreator.dll, execute the command below.

  Example:

  ```
  docker run --rm -it -v /path/to/OutputDirectory:/mnt/Output -v /path/to/LogDirectory:/var/log -v /path/to/configDirectory:/etc/CATool ghcr.io/siemens/continuous-clearing dotnet SW360PackageCreator.dll --settingsfilepath /etc/CATool/appSettings.json
  ```
- The Artifactory Uploader is not applicable for Debian and Alpine package clearance.
- To run the Artifactory Uploader DLL, execute the command below.

  Example:

  ```
  docker run --rm -it -v /path/to/OutputDirectory:/mnt/Output -v /path/to/LogDirectory:/var/log -v /path/to/configDirectory:/etc/CATool ghcr.io/siemens/continuous-clearing dotnet ArtifactoryUploader.dll --settingsfilepath /etc/CATool/appSettings.json
  ```
Binary execution
- .NET 8 runtime https://dotnet.microsoft.com/download/dotnet-core/8.0
- Latest versions of Node.js and Git
- To run PackageIdentifier.exe, execute the command below.

  Example:

  ```
  PackageIdentifier.exe --settingsfilepath /<PathToConfig>/appSettings.json
  ```

- With SBOM signing enabled (credentials from appSettings.json or command line parameters):

  Example:

  ```
  PackageIdentifier.exe --settingsfilepath /<PathToConfig>/appSettings.json --SbomSigning:SBOMSignVerify true --SbomSigning:KeyVaultURI "https://your-keyvault.vault.azure.net/" --SbomSigning:CertificateName "your-certificate" --SbomSigning:ClientId "your-client-id" --SbomSigning:ClientSecret "your-client-secret" --SbomSigning:TenantId "your-tenant-id"
  ```
- To run SW360PackageCreator.exe, execute the command below.

  Example:

  ```
  SW360PackageCreator.exe --settingsfilepath /<PathToConfig>/appSettings.json
  ```
- The Artifactory Uploader is not applicable for Debian and Alpine package clearance.
- To run the Artifactory Uploader exe, execute the command below.

  Example:

  ```
  ArtifactoryUploader.exe --settingsfilepath /<PathToConfig>/appSettings.json
  ```
By default, the JFrogDryRun is set to True. This configuration is designed for the routine execution of the Artifactory uploader on a daily basis during the project's development phase. The primary objective is to continuously verify the accuracy of component paths and permissions before actual operations. When the JFrogDryRun is set to False, it indicates a shift towards deployment in a production environment. In this mode, the Artifactory uploader is prepared for live operations, signaling the transition from the verification stage to the actual copy/move of components.
- To execute the tool in release mode, pass an extra parameter to the existing argument list.

  Example:

  ```
  docker run --rm -it -v /D/Projects/Output:/mnt/Output -v /D/Projects/DockerLog:/var/log -v /D/Projects/CAConfig:/etc/CATool ghcr.io/siemens/continuous-clearing dotnet ArtifactoryUploader.dll --settingsfilepath /etc/CATool/appSettings.json --Jfrog:DryRun false
  ```

  or

  ```
  ArtifactoryUploader.exe --settingsfilepath //appSettings.json --Jfrog:DryRun false
  ```
In case your project has both npm and NuGet (or other) components, this can be handled by running the Package Identifier DLL multiple times and generating a single SBOM.
- Run the Package Identifier DLL with "ProjectType" set as "npm".
- A CycloneDX BOM will be generated in the output directory path that you provided.
- Next, run the Package Identifier DLL with "ProjectType" set as "NuGet". In this run, make sure that along with the usual arguments you also provide an additional argument, "--MultipleProjectType", set to True.

  Note: Do not change the output directories during the multiple runs, as the tool automatically picks up the previously generated SBOM and combines it.
- Once this run completes, the components from the first run for "npm" and the components from the second run for "NuGet" will be merged into one BOM file.
- The remaining steps for the Package Creator and Artifactory Uploader remain the same.
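Conceptually, successive runs combine component lists and de-duplicate them, e.g. by package URL. The sketch below is illustrative Python under that assumption; the tool's actual merge logic may differ.

```python
def merge_boms(previous, current):
    """Combine components from successive Package Identifier runs into one BOM,
    de-duplicating by purl (illustrative merge policy)."""
    seen = set()
    merged = []
    for comp in previous.get("components", []) + current.get("components", []):
        if comp["purl"] not in seen:
            seen.add(comp["purl"])
            merged.append(comp)
    return {"components": merged}
```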
Sample templates for integrating the Continuous Clearing Tool (CCTool) workflow in Azure Pipelines can be found at templates\azureDevops. For more details on Azure DevOps templates, refer to the official Microsoft Documentation.
- Simplified Setup: Avoids adding manual steps for different CCTool stages.
- Consistency and Standardization: Ensures uniform execution across the organization.
- Automated File Uploads: Handles uploading of logs and BOM files after execution.
- Check-in Templates: Commit the templates into an Azure DevOps repository.
- Reference the Repository: Include the repository in a new pipeline as shown below:
```yaml
resources:
  repositories:
    - repository: Templates_Pipeline
      type: git
      name: YourProject/Templates_Pipeline
```

👉 Note: If the Appsettingsfilepath parameter is not passed, the sample default app settings file is used by the template.
The sample default app settings file is located at templates\sample-default-app-settings.json and can be customized as needed.
```yaml
- template: pipeline/build/pipeline-template-step-install-run-cctool-binary.yml@Templates_Pipeline
  parameters:
    workingDirectory: $(Build.SourcesDirectory)/MyProject
    sw360Token: '$(sw360ApiKey)'
    sw360ProjectId: '$(sw360ProjectID)'
    sw360ProjectName: 'My Project'
    projectDefinitions:
      - projectType: 'NuGet'
        inputFolder: $(Build.SourcesDirectory)/src
        exclude: 'Test'
    outputFolder: '$(Build.SourcesDirectory)/output'
    JfrogToken: '$(jfrogToken)'
```

```yaml
- template: pipeline/build/pipeline-template-step-install-run-cctool-docker.yml@Templates_Pipeline
  parameters:
    workingDirectory: $(Build.SourcesDirectory)/MyProject
    sw360Token: '$(sw360ApiKey)'
    sw360ProjectId: '$(sw360ImagePrjID)'
    sw360ProjectName: 'My Docker Image'
    projectDefinitions:
      - projectType: 'debian'
        inputFolder: $(Build.SourcesDirectory)/images
        imageName: 'myapp'
    outputFolder: '$(Build.SourcesDirectory)/output'
    JfrogToken: '$(jfrogToken)'
```

The projectDefinitions parameter accepts an array of objects with project-specific configurations:
```yaml
projectDefinitions:
  - projectType: 'NuGet'          # Mandatory - Project type (NuGet, Debian, npm, etc.)
    inputFolder: '/path/to/input' # Mandatory - Folder containing project files
    exclude: 'Test;Temp'          # Optional - semicolon-separated exclusion list
```
For Docker/image-based scanning, additional parameters are available:
```yaml
projectDefinitions:
  - projectType: 'Debian'         # Mandatory - Project type (NuGet, Debian, npm, etc.)
    inputFolder: '/path/to/input' # Mandatory - Folder containing project files
    imageName: 'debian'           # Mandatory - only if you want to run against an image on the machine
    imageVersion: 'bookworm-slim' # Optional - version of the Docker image; defaults to the latest image tag
```
👉 Please ensure that the image being passed above is present in the machine where the CC tool is being run, as a tar file will be generated based on the image.
Both templates share common parameters with some implementation-specific differences.
👉 Do note that the parameters marked as not mandatory take their values from the default app settings file; if you use your own app settings, you need to pass these values or maintain them in your app settings.
| Parameter | Type | Default | Description | Mandatory |
|---|---|---|---|---|
| `sw360Token` | string | `''` | SW360 authentication token | ✅ |
| `sw360ProjectId` | string | `''` | Target SW360 project ID | ✅ |
| `sw360ProjectName` | string | `''` | SW360 project name | ✅ |
| `projectDefinitions` | object | `[]` | List of project configurations to scan | ✅ |
| `outputFolder` | string | `''` | Output directory for reports and artifacts | ✅ |
| `PackageCreatorEnabled` | boolean | `true` | Enable/disable SW360 package creation | ❌ |
| `ArtifactoryUploaderEnabled` | boolean | `true` | Enable/disable Artifactory uploads | ❌ |
| `JfrogToken` | string | `''` | JFrog Artifactory authentication token | ✅ |
| `JfrogDryRun` | boolean | `true` | Run Artifactory uploads in dry-run mode | ❌ |
| `sw360Url` | string | `''` | Optional SW360 instance URL | ❌ |
| `sw360AuthTokenType` | string | `''` | Token type for SW360 (e.g., `Bearer`) | ❌ |
| `fossologyUrl` | string | `''` | Optional Fossology instance URL | ❌ |
| `fossologyEnableTrigger` | boolean | `true` | Enable/disable Fossology scanning trigger | ❌ |
| `JfrogUrl` | string | `''` | Optional JFrog Artifactory URL | ❌ |
| `enableTelemetry` | boolean | `true` | Enable/disable telemetry collection | ❌ |
| `workingDirectory` | string | `'$(Build.SourcesDirectory)'` | Base working directory | ❌ |
| `excludeComponents` | string | `''` | Semicolon-separated list of components to exclude | ❌ |
| `appSettingsPath` | string | `''` | Path to your own app settings file | ❌ |
| Parameter | Type | Default | Description | Mandatory |
|---|---|---|---|---|
| `toolVersion` | string | `''` | Specific version of the binary tool to use | ❌ |
| `branchName_powerUser` | string | `''` | GitHub branch to build the tool from (for power users) | ❌ |
👉 Use `branchName_powerUser` only after prior coordination with the Enabler team.
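To pin the pipeline to a specific release of the binary tool instead of the latest, set `toolVersion`; the version string below is a placeholder, not a real release number:

```yaml
parameters:
  toolVersion: '1.2.3'   # placeholder version; omit to use the latest
```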
| Parameter | Type | Default | Description | Mandatory |
|---|---|---|---|---|
| `branchName_powerUser` | string | `''` | GitHub branch to build the Docker image from | ❌ |
To uphold our commitment to robust security, licensing adherence, and architectural best practices, the CC Tool has been enhanced to provide proactive guidance on third-party component usage. This feature aims to help development teams identify and address potential compliance considerations early in the project lifecycle, thereby reducing the need for compliance exceptions and streamlining the component selection process.
The Tool includes an intelligent scanning capability for projects. Specifically, when the tool detects the inclusion of certain components that may have specific compliance implications (e.g., those requiring particular scrutiny due to licensing, security, or architectural considerations, such as components based on Chromium), it will:
- Issue a Warning: A clear warning message will be displayed, indicating the presence of the identified component.
- Suggest Alternatives: Alongside the warning, the tool will provide a curated list of recommended alternative components. These alternatives have been vetted to align with our compliance standards and offer similar functionality, enabling teams to easily pivot to approved solutions.
Example: How to handle Chromium or CefSharp software packages?

When you use CefSharp packages, you are implicitly using the Chromium Embedded Framework (CEF), which bundles the full Chromium binary component. The automation adds the CefSharp component automatically, but you must add the Chromium component to your SW360 project manually. To determine which Chromium version is in use, check this mapping table: https://bitbucket.org/chromiumembedded/cef/wiki/BranchesAndBuilding
- In case your pipeline takes a long time to run (more than 1 hour) when there are many components, it is advisable to increase the pipeline timeout and set it to a minimum of 1 hour.
- In case of any failures in the pipeline while running the tool, check the following configurations:
  - Make sure your build agents are running.
  - Check if there are any action items to be handled from the user's end (in this case the pipeline fails with exit code 2).
  - Check that the proxy settings environment variables for SW360 are correctly configured on the build machine.
  - Upload the attachment manually for the Debian type.
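Since exit code 2 specifically indicates pending user action items, pipeline scripts can report it separately from other failures. A minimal sketch; only exit code 2 is documented above, so the remaining mappings are illustrative assumptions:

```python
# Map the CC Tool's process exit code to a human-readable cause.
# Only exit code 2 (user action items pending) is documented; the
# other mappings are illustrative assumptions.
def classify_exit(returncode: int) -> str:
    if returncode == 0:
        return "success"
    if returncode == 2:
        return "failed: action items pending on the user's end"
    return "failed: check build agents, proxy settings, and configuration"
```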
For reporting any bug or enhancement, and for your feedback, click here
| Term | Definition |
|---|---|
| 3P Components | 3rd Party Components |
| BOM | Bill of Materials |
| apiAuthToken | SW360 authorization token |
| SBOM Signing | Cryptographic signing of a Software Bill of Materials using digital certificates |
| Azure Key Vault | Microsoft Azure service for securely storing certificates, keys, and secrets |
This project integrates dependency data from cdxgen to enhance SBOM dependency accuracy for supported ecosystems (npm, NuGet, Maven, Poetry).
- Tool: cdxgen
- URL: https://github.com/cdxgen/cdxgen
- Fetching Project Id from SW360
- SW360 API Guide : https://www.eclipse.org/sw360/docs/development/restapi/dev-rest-api/
- FOSSology API Guide: https://www.fossology.org/get-started/basic-rest-api-calls/
- cdxgen: https://github.com/cdxgen/cdxgen
Copyright © Siemens AG ▪ 2026