The MITRE Security Automation Framework (SAF) Command Line Interface (CLI) brings together applications, techniques, libraries, and tools developed by MITRE and the security community to streamline security automation for systems and DevOps pipelines.
The SAF CLI is the successor to Heimdall Tools and InSpec Tools.
- "Heimdall" - A visualizer for all security result data
- "OASIS Heimdall Data Format (OHDF) - aka HDF" - A common data format to preserve and transform security data
>[!NOTE]
> All mention of HDF in this document refers to the OHDF.
* Via NPM
* Update via NPM
* Via Brew
* Update via Brew
* Via Docker
* Update via Docker
* Via Windows Installer
* Update via Windows Installer
* Attest
  * Create Attestations
  * Apply Attestations
* Convert From HDF
  * HDF to ASFF
  * HDF to Splunk
  * HDF to XCCDF Results
  * HDF to Checklist
  * HDF to CSV
  * HDF to Condensed JSON
* Convert To HDF
  * Anchore Grype to HDF
  * ASFF to HDF
  * AWS Config to HDF
  * Burp Suite to HDF
  * CKL to POA&M
  * CycloneDX SBOM to HDF
  * DBProtect to HDF
  * Dependency-Track to HDF
  * Fortify to HDF
  * gosec to HDF
  * Ion Channel 2 HDF
  * JFrog Xray to HDF
  * Tenable Nessus to HDF
  * Microsoft Secure Score to HDF
  * Netsparker to HDF
  * NeuVector to HDF
  * Nikto to HDF
  * Prisma to HDF
  * Prowler to HDF
  * Sarif to HDF
  * Scoutsuite to HDF
  * Snyk to HDF
  * SonarQube to HDF
  * Splunk to HDF
  * Trivy to HDF
  * Trufflehog to HDF
  * Twistlock to HDF
  * Veracode to HDF
  * XCCDF Results to HDF
  * OWASP ZAP to HDF
* Validate
  * Thresholds
* Generate
  * Delta
  * Delta Supporting Commands
  * CKL Templates
  * InSpec Metadata
  * InSpec Profile
  * Thresholds
  * Spreadsheet (csv/xlsx) to InSpec
    * DoD Stub vs CIS Stub Formatting
    * Mapping Files
* Supplement
  * Passthrough
    * Read
    * Write
  * Target
    * Read
    * Write
---
#### Via NPM

The SAF CLI can be installed and kept up to date using `npm`, which is included with most versions of NodeJS.

```bash
npm install -g @mitre/saf
```

#### Update via NPM

To update the SAF CLI with npm:

```bash
npm update -g @mitre/saf
```
top
---
#### Via Brew

The SAF CLI can be installed and kept up to date using `brew`.

```bash
brew install mitre/saf/saf-cli
```

#### Update via Brew

To update the SAF CLI with brew:

```bash
brew upgrade mitre/saf/saf-cli
```
top
---
#### Via Docker

On Linux and Mac:

The docker command below can be used to run the SAF CLI once, where `arguments` contains the command and flags you want to run. For example: `--version` or `view summary -i hdf-results.json`.

```bash
docker run -it -v$(pwd):/share mitre/saf
```

To run the SAF CLI with a persistent shell for one or more commands, use the following, then run each full command, e.g., `saf --version` or `saf view summary -i hdf-results.json`. You can change the entrypoint you wish to use; for example, run with `--entrypoint sh` to open in a shell terminal. If the specified entrypoint is not found, try using the full path, such as `--entrypoint /bin/bash`.

```bash
docker run --rm -it --entrypoint bash -v$(pwd):/share mitre/saf
```

On Windows:

The docker command below can be used to run the SAF CLI once, where `arguments` contains the command and flags you want to run. For example: `--version` or `view summary -i hdf-results.json`.

```bash
docker run -it -v%cd%:/share mitre/saf
```

To run the SAF CLI with a persistent shell for one or more commands, use the following, then run each full command, e.g., `saf --version` or `saf view summary -i hdf-results.json`. As above, you can change the entrypoint if needed.

```bash
docker run --rm -it --entrypoint sh -v%cd%:/share mitre/saf
```

NOTE:
Remember to use Docker CLI flags as necessary to run the various subcommands.

For example, to run the `emasser configure` subcommand, you need to pass in a volume that contains your certificates and where you can store the resultant `.env`. Furthermore, you need to pass in flags for enabling the pseudo-TTY and interactivity.

```bash
docker run -it -v "$(pwd)":/share mitre/saf emasser configure
```

Other commands might not require the `-i` or `-t` flags and instead only need a bind-mounted volume, such as a file-based convert:

```bash
docker run --rm -v "$(pwd)":/share mitre/saf convert -i test/sample_data/trivy/sample_input_report/trivy-image_golang-1.12-alpine_sample.json -o test.json
```

Other flags exist to open network ports or pass through environment variables, so make sure to use whichever ones are required to successfully run a command.

#### Update via Docker

To update the SAF CLI with docker:

```bash
docker pull mitre/saf:latest
```
top
---
#### Via Windows Installer

To install the latest release of the SAF CLI on Windows, download and run the most recent installer for your system architecture from the Releases page.

#### Update via Windows Installer

To update the SAF CLI on Windows, uninstall any existing version from your system and then download and run the most recent installer for your system architecture from the Releases page.
### Attest

Attest to 'Not Reviewed' controls: sometimes requirements can't be tested automatically by security tools and hence require manual review, whereby someone interviews people and/or examines a system to confirm (i.e., attest as to) whether the control requirements have been satisfied.
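For orientation, an attestation file is essentially a list of manual-review decisions keyed to control IDs. The sketch below is illustrative only — the field names and frequency value are assumptions about the schema, so generate a real file with `saf attest create` rather than hand-copying this one:

```json
[
  {
    "control_id": "V-72029",
    "explanation": "Manually reviewed with the ISSO; the requirement is met.",
    "frequency": "3m",
    "status": "passed",
    "updated": "2024-01-15",
    "updated_by": "Jane Doe, Security Team"
  }
]
```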
#### Create Attestations
```
attest create             Create attestation files for use with `saf attest apply`
USAGE
$ saf attest create -o
FLAGS
-i, --input=
-o, --output=
-t, --format=
GLOBAL FLAGS
-h, --help Show CLI help
-L, --logLevel=
EXAMPLES
$ saf attest create -o attestation.json -i hdf.json
$ saf attest create -o attestation.xlsx -t xlsx
```
top
#### Apply Attestations
```
attest apply              Apply one or more attestation files to one or more HDF results sets
USAGE
$ saf attest apply -i
FLAGS
-i, --input=
-o, --output=
GLOBAL FLAGS
-h, --help Show CLI help
-L, --logLevel=
EXAMPLES
$ saf attest apply -i hdf.json attestation.json -o new-hdf.json
$ saf attest apply -i hdf1.json hdf2.json attestation.xlsx -o outputDir
```
top
Translating your data to and from Heimdall Data Format (HDF) is done using the `saf convert` command.

Want to recommend or help develop a converter? See how to get started.
top
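All of the converters below emit (or consume) this common format. As a rough sketch of the shape involved — using an illustrative subset of the HDF schema (`profiles` → `controls` → `results`, each result carrying a `status`); real files produced by `saf convert` contain many more fields — here is how a consumer might tally test results:

```python
# Minimal HDF-shaped document (illustrative subset of the schema only).
hdf = {
    "profiles": [
        {
            "name": "example-profile",
            "controls": [
                {"id": "V-1", "impact": 0.7,
                 "results": [{"status": "passed"}, {"status": "passed"}]},
                {"id": "V-2", "impact": 0.5,
                 "results": [{"status": "failed"}]},
                {"id": "V-3", "impact": 0.0,
                 "results": [{"status": "skipped"}]},
            ],
        }
    ]
}

def summarize(hdf_doc):
    """Count individual test results by status across all profiles/controls."""
    counts = {}
    for profile in hdf_doc.get("profiles", []):
        for control in profile.get("controls", []):
            for result in control.get("results", []):
                status = result.get("status", "unknown")
                counts[status] = counts.get(status, 0) + 1
    return counts

print(summarize(hdf))  # {'passed': 2, 'failed': 1, 'skipped': 1}
```

This is the kind of roll-up that `saf view summary` presents from a converted results file.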
#### Anchore Grype to HDF
```
convert anchoregrype2hdf Translate a Anchore Grype output file into an HDF results set
USAGE
$ saf convert anchoregrype2hdf -i
FLAGS
-i, --input=
-o, --output=
-w, --includeRaw Include raw data from the input Anchore Grype file
GLOBAL FLAGS
-h, --help Show CLI help
-L, --logLevel=
EXAMPLES
$ saf convert anchoregrype2hdf -i anchoregrype.json -o output-hdf-name.json
```
top
#### HDF to ASFF
Note: Uploading findings into AWS Security Hub requires configuration of the AWS CLI (see the AWS documentation) or configuration of environment variables via Docker.
```
convert hdf2asff Translate a Heimdall Data Format JSON file into
AWS Security Findings Format JSON file(s) and/or
upload to AWS Security Hub
USAGE
$ saf convert hdf2asff -a
FLAGS
-C, --certificate=
-I, --insecure Disable SSL verification, this is insecure.
-R, --specifyRegionAttribute Manually specify the top-level Region attribute - SecurityHubBatchImportFindings
populates this attribute automatically and prohibits one from
updating it using BatchUpdateFindings
-i, --input=
-o, --output=
-r, --region=
-t, --target=
-u, --upload Upload findings to AWS Security Hub
GLOBAL FLAGS
-h, --help Show CLI help
-L, --logLevel=
EXAMPLES
Send output to local file system
$ saf convert hdf2asff -i rhel7-scan_02032022A.json -a 123456789 -r us-east-1 -t rhel7_example_host -o rhel7.asff
Upload findings to AWS Security Hub
$ saf convert hdf2asff -i rds_mysql_i123456789scan_03042022A.json -a 987654321 -r us-west-1 -t Instance_i123456789 -u
Upload findings to AWS Security Hub and Send output to local file system
$ saf convert hdf2asff -i snyk_acme_project5_hdf_04052022A.json -a 2143658798 -r us-east-1 -t acme_project5 -o snyk_acme_project5 -u
```
top
#### HDF to Splunk
Notice: HDF to Splunk requires configuration on the Splunk server. See Splunk Configuration.
```
convert hdf2splunk Translate and upload a Heimdall Data Format JSON file into a Splunk server
USAGE
$ saf convert hdf2splunk -i
FLAGS
-H, --host=
-I, --index=
-P, --port=
-i, --input=
-p, --password=
-s, --scheme=
GLOBAL FLAGS
-h, --help Show CLI help
-L, --logLevel=
EXAMPLES
User name/password Authentication
$ saf convert hdf2splunk -i rhel7-results.json -H 127.0.0.1 -u admin -p Valid_password! -I hdf
Token Authentication
$ saf convert hdf2splunk -i rhel7-results.json -H 127.0.0.1 -t your.splunk.token -I hdf
```
For HDF Splunk Schema documentation, visit the Heimdall converter schemas.
Previewing HDF Data Within Splunk:
An example of a full raw search query:
```sql
index="<
| join meta.guid
[search index="<
| join meta.guid
[search index="<
```
An example of a formatted table search query:
```sql
index="<
| join meta.guid
[search index="<
| join meta.guid
[search index="<
| rename values(meta.filename) AS "Results Set", values(meta.filetype) AS "Scan Type", list(statistics.duration) AS "Scan Duration", first(meta.status) AS "Control Status", list(results{}.status) AS "Test(s) Status", id AS "ID", values(title) AS "Title", values(desc) AS "Description", values(impact) AS "Impact", last(code) AS Code, values(descriptions.check) AS "Check", values(descriptions.fix) AS "Fix", values(tags.cci{}) AS "CCI IDs", list(results{}.code_desc) AS "Results Description", list(results{}.skip_message) AS "Results Skip Message (if applicable)", values(tags.nist{}) AS "NIST SP 800-53 Controls", last(name) AS "Scan (Profile) Name", last(summary) AS "Scan (Profile) Summary", last(version) AS "Scan (Profile) Version"
| table meta.guid "Results Set" "Scan Type" "Scan (Profile) Name" ID "NIST SP 800-53 Controls" Title "Control Status" "Test(s) Status" "Results Description" "Results Skip Message (if applicable)" Description Impact Severity Check Fix "CCI IDs" Code "Scan Duration" "Scan (Profile) Summary" "Scan (Profile) Version"
```
top
#### HDF to XCCDF Results
```
convert hdf2xccdf         Translate an HDF file into an XCCDF XML
USAGE
$ saf convert hdf2xccdf -i
FLAGS
-i, --input=
-o, --output=
GLOBAL FLAGS
-h, --help Show CLI help
-L, --logLevel=
EXAMPLES
$ saf convert hdf2xccdf -i hdf_input.json -o xccdf-results.xml
```
top
#### HDF to Checklist
```
convert hdf2ckl           Translate a Heimdall Data Format JSON file into a
DISA checklist file
USAGE
$ saf convert hdf2ckl -i
FLAGS
-h, --help Show CLI help.
-i, --input=
-o, --output=
CHECKLIST METADATA FLAGS
-F, --fqdn=
-H, --hostname=
-I, --ip=
-M, --mac=
-m, --metadata=
--assettype=
DESCRIPTION
Translate a Heimdall Data Format JSON file into a DISA checklist file
EXAMPLES
$ saf convert hdf2ckl -i rhel7-results.json -o rhel7.ckl --fqdn reverseproxy.example.org --hostname reverseproxy --ip 10.0.0.3 --mac 12:34:56:78:90:AB
$ saf convert hdf2ckl -i rhel8-results.json -o rhel8.ckl -m rhel8-metadata.json
```
top
#### HDF to CSV
```
convert hdf2csv           Translate a Heimdall Data Format JSON file into a
Comma Separated Values (CSV) file
USAGE
$ saf convert hdf2csv -i
FLAGS
-f, --fields=
-i, --input=
-o, --output=
-t, --noTruncate Don't truncate fields longer than 32,767 characters (the cell limit in Excel)
GLOBAL FLAGS
-h, --help Show CLI help
-L, --logLevel=
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
Running the CLI interactively
$ saf convert hdf2csv --interactive
Providing flags at the command line
$ saf convert hdf2csv -i rhel7-results.json -o rhel7.csv --fields "Results Set,Status,ID,Title,Severity"
```
top
#### HDF to Condensed JSON
```
convert hdf2condensed     Condensed format used by some community members
to pre-process data for elasticsearch and custom dashboards
USAGE
$ saf convert hdf2condensed -i
FLAGS
-i, --input=
-o, --output=
GLOBAL FLAGS
-h, --help Show CLI help
-L, --logLevel=
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf convert hdf2condensed -i rhel7-results.json -o rhel7-condensed.json
```
top
---
#### ASFF to HDF
Output|Use|Command
---|---|---
ASFF JSON|All the findings that will be fed into the mapper|`aws securityhub get-findings > asff.json`
AWS Security Hub enabled standards JSON|Get all the enabled standards so you can get their identifiers|`aws securityhub get-enabled-standards > asff_standards.json`
AWS Security Hub standard controls JSON|Get all the controls for a standard that will be fed into the mapper|`aws securityhub describe-standards-controls --standards-subscription-arn "arn:aws:securityhub:us-east-1:123456789123:subscription/cis-aws-foundations-benchmark/v/1.2.0" > asff_cis_standard.json`
```
convert asff2hdf Translate a AWS Security Finding Format JSON into a
Heimdall Data Format JSON file(s)
USAGE
$ saf convert asff2hdf -o
FLAGS
-C, --certificate=
-I, --insecure Disable SSL verification, this is insecure
-H, --securityHub=
such as the CIS AWS Foundations or AWS Foundational Security Best
Practices documents (in ASFF compliant JSON form)
-a, --aws Pull findings from AWS Security Hub
-i, --input=
-o, --output=
-r, --region=
-t, --target=
GLOBAL FLAGS
-h, --help Show CLI help
-L, --logLevel=
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
Using ASFF JSON file
$ saf convert asff2hdf -i asff-findings.json -o output-folder-name
Using ASFF JSON file with additional input files
$ saf convert asff2hdf -i asff-findings.json --securityhub
Using AWS to pull ASFF JSON findings
$ saf convert asff2hdf --aws -o out -r us-west-2 --target rhel7
```
top
#### AWS Config to HDF
Note: Pulling AWS Config results data requires configuration of the AWS CLI (see the AWS documentation) or configuration of environment variables via Docker.
```
convert aws_config2hdf Pull Configuration findings from AWS Config and convert
into a Heimdall Data Format JSON file
USAGE
$ saf convert aws_config2hdf -r
FLAGS
-a, --accessKeyId=
-i, --insecure Disable SSL verification, this is insecure.
-o, --output=
-r, --region=
-s, --secretAccessKey=
-t, --sessionToken=
GLOBAL FLAGS
-h, --help Show CLI help
-L, --logLevel=
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf convert aws_config2hdf -a ABCDEFGHIJKLMNOPQRSTUV -s +4NOT39A48REAL93SECRET934 -r us-east-1 -o output-hdf-name.json
```
top
#### Burp Suite to HDF
```
convert burpsuite2hdf     Translate a BurpSuite Pro XML file into a Heimdall
Data Format JSON file
USAGE
$ saf convert burpsuite2hdf -i
FLAGS
-i, --input=
-o, --output=
-w, --includeRaw Include raw input file in HDF JSON file
GLOBAL FLAGS
-h, --help Show CLI help
-L, --logLevel=
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf convert burpsuite2hdf -i burpsuite_results.xml -o output-hdf-name.json
```
top
#### CKL to POA&M
Note: The included CCI to NIST mappings are extracted from NIST.gov; for mappings specific to eMASS, use this file instead.
```
convert ckl2POAM Translate DISA Checklist CKL file(s) to POA&M files
USAGE
$ saf convert ckl2POAM -i
FLAGS
-O, --officeOrg=
-d, --deviceName=
-i, --input=
-o, --output=
-s, --rowsToSkip=
GLOBAL FLAGS
-h, --help Show CLI help
-L, --logLevel=
--interactive Collect input tags interactively (not available on all CLI commands)
ALIASES
$ saf convert ckl2poam
EXAMPLES
$ saf convert ckl2POAM -i checklist_file.ckl -o output-folder -d abcdefg -s 2
```
top
#### CycloneDX SBOM to HDF
Note: Currently, only the CycloneDX SBOM, VEX, and HBOM formats are officially supported in the CycloneDX SBOM convert command (formats like SaaSBOM are NOT supported and will result in errors). To convert other non-CycloneDX SBOM formats, first convert your current SBOM data file into the CycloneDX SBOM data format with their provided utility and then convert the CycloneDX SBOM file to OHDF with the saf convert cyclonedx_sbom2hdf command.
For example, to convert the SPDX SBOM format to the CycloneDX SBOM format using the CycloneDX CLI, you can perform the following:
```bash
cyclonedx-cli convert --input-file spdx-sbom.json --output-file cyclonedx-sbom.json --input-format spdxjson --output-format json
```
And then use that resulting CycloneDX SBOM file to convert to OHDF.
```
convert cyclonedx_sbom2hdf Translate a CycloneDX SBOM report into an HDF results set
USAGE
$ saf convert cyclonedx_sbom2hdf -i
FLAGS
-i, --input=
-o, --output=
-w, --includeRaw Include raw input file in HDF JSON file
GLOBAL FLAGS
-h, --help Show CLI help
-L, --logLevel=
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf convert cyclonedx_sbom2hdf -i cyclonedx_sbom.json -o output-hdf-name.json
```
top
#### DBProtect to HDF
```
convert dbprotect2hdf Translate a DBProtect report in "Check Results
Details" XML format into a Heimdall Data Format JSON file
USAGE
$ saf convert dbprotect2hdf -i
FLAGS
-i, --input=
-o, --output=
-w, --includeRaw Include raw input file in HDF JSON file
GLOBAL FLAGS
-h, --help Show CLI help
-L, --logLevel=
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf convert dbprotect2hdf -i check_results_details_report.xml -o output-hdf-name.json
```
top
#### Dependency-Track to HDF
```
convert dependency_track2hdf Translate a Dependency-Track results JSON
file into a Heimdall Data Format JSON file
USAGE
$ saf convert dependency_track2hdf -i
FLAGS
-h, --help Show CLI help.
-i, --input=
-o, --output=
-w, --with-raw
GLOBAL FLAGS
-L, --logLevel=
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf convert dependency_track2hdf -i dt-fpf.json -o output-hdf-name.json
```
top
#### Fortify to HDF
```
convert fortify2hdf Translate a Fortify results FVDL file into a Heimdall
Data Format JSON file; the FVDL file is an XML that can be
extracted from the Fortify FPR project file using standard
file compression tools
USAGE
$ saf convert fortify2hdf -i
FLAGS
-i, --input=
-o, --output=
-w, --includeRaw Include raw input file in HDF JSON file
GLOBAL FLAGS
-h, --help Show CLI help
-L, --logLevel=
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf convert fortify2hdf -i audit.fvdl -o output-hdf-name.json
```
top
#### gosec to HDF
```
convert gosec2hdf Translate a gosec (Golang Security Checker) results file
into a Heimdall Data Format JSON file
USAGE
$ saf convert gosec2hdf -i
FLAGS
-h, --help Show CLI help.
-i, --input=
-o, --output=
-w, --includeRaw Include raw input file in HDF JSON file
GLOBAL FLAGS
-L, --logLevel=
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf convert gosec2hdf -i gosec_results.json -o output-hdf-name.json
```
top
#### Ion Channel 2 HDF
```
convert ionchannel2hdf Pull and translate SBOM data from Ion Channel
into Heimdall Data Format
USAGE
$ saf convert ionchannel2hdf -o
FLAGS
-A, --allProjects Pull all projects available within your team
-L, --logLevel=
-a, --apiKey=
-i, --input=
-o, --output=
-p, --project=
-t, --teamName=
--raw Output Ion Channel raw data
GLOBAL FLAGS
-h, --help Show CLI help
-L, --logLevel=
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
Using Input IonChannel JSON file
$ saf convert ionchannel2hdf -o output-folder-name -i ion-channel-file.json
Using IonChannel API Key (pull one project)
$ saf convert ionchannel2hdf -o output-folder-name -a ion-channel-apikey -t team-name -p project-name-to-pull --raw
Using IonChannel API Key (pull all projects)
$ saf convert ionchannel2hdf -o output-folder-name -a ion-channel-apikey -t team-name -A --raw
```
top
#### JFrog Xray to HDF
```
convert jfrog_xray2hdf Translate a JFrog Xray results JSON file into a
Heimdall Data Format JSON file
USAGE
$ saf convert jfrog_xray2hdf -i
FLAGS
-i, --input=
-o, --output=
-w, --includeRaw Include raw input file in HDF JSON file
GLOBAL FLAGS
-h, --help Show CLI help
-L, --logLevel=
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf convert jfrog_xray2hdf -i xray_results.json -o output-hdf-name.json
```
top
#### Tenable Nessus to HDF
```
convert nessus2hdf Translate a Nessus XML results file into a Heimdall Data Format JSON file.
The current iteration maps all plugin families except for 'Policy Compliance'
A separate HDF JSON is generated for each host reported in the Nessus Report.
USAGE
$ saf convert nessus2hdf -i
FLAGS
-i, --input=
-o, --output=
-w, --includeRaw Include raw input file in HDF JSON file
GLOBAL FLAGS
-L, --logLevel=
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf convert nessus2hdf -i nessus_results.xml -o output-hdf-name.json
```
top
#### Microsoft Secure Score to HDF
Output|Use|Command
---|---|---
Microsoft Secure Score JSON|This file contains the Graph API response for the `security/secureScore` endpoint|PowerShell: `Get-MgSecuritySecureScore -Top 500`
Microsoft Secure Score Control Profiles JSON|This file contains the Graph API response for the `security/secureScoreControlProfiles` endpoint|PowerShell: `Get-MgSecuritySecureScoreControlProfile -Top 500`
Combined JSON|Combine the outputs from the `security/secureScore` and `security/secureScoreControlProfiles` endpoints|`jq -s '{"secureScore": .[0], "profiles": .[1]}' secureScore.json secureScoreControlProfiles.json`
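If `jq` is not available, the same combined shape can be produced with a few lines of script. This Python sketch (the function name is ours, not part of the SAF CLI) merges the two Graph API responses into the single document that `msft_secure2hdf -i` expects:

```python
import json

def combine(secure_score: dict, profiles: dict) -> str:
    """Nest the secureScore response under "secureScore" and the control
    profiles response under "profiles" -- the same shape the jq one-liner
    above produces."""
    return json.dumps({"secureScore": secure_score, "profiles": profiles},
                      indent=2)

# Stand-ins for the two Graph API responses (real ones are much larger).
score = {"value": []}
profs = {"value": []}
combined = json.loads(combine(score, profs))
assert list(combined) == ["secureScore", "profiles"]
```

Write the output to a file (e.g. `combined.json`) and pass it to `saf convert msft_secure2hdf -i`.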
```
convert msft_secure2hdf Translate a Microsoft Secure Score report and Secure Score Control to a Heimdall Data Format JSON file
USAGE
$ saf convert msft_secure2hdf -p
$ saf convert msft_secure2hdf -t
$ saf convert msft_secure2hdf -i
FLAGS
-C, --certificate=
-I, --insecure Disable SSL verification, this is insecure.
-a, --appId=
-i, --combinedInputs=
{secureScore:
-o, --output=
-p, --inputProfiles=
-r, --inputScoreDoc=
-s, --appSecret=
-t, --tenantId=
-w, --includeRaw Include raw input file in HDF JSON file
GLOBAL FLAGS
-h, --help Show CLI help
-L, --logLevel=
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
Using input files
$ saf convert msft_secure2hdf -p secureScore.json -r secureScoreControlProfiles -o output-hdf-name.json [-w]
Using Azure tenant ID
$ saf convert msft_secure2hdf -t "12345678-1234-1234-1234-1234567890abcd" \
-a "12345678-1234-1234-1234-1234567890abcd" \
-s "aaaaa~bbbbbbbbbbbbbbbbbbbbbbbbb-cccccccc" \
-o output-hdf-name.json [-I | -C
Using combined inputs
$ saf convert msft_secure2hdf -i <(jq '{"secureScore": .[0], "profiles": .[1]}' secureScore.json secureScoreControlProfiles.json) \
-o output-hdf-name.json [-w]
```
top
#### Netsparker to HDF
```
convert netsparker2hdf Translate a Netsparker XML results file into a
Heimdall Data Format JSON file. The current
iteration only works with Netsparker Enterprise
Vulnerabilities Scan.
USAGE
$ saf convert netsparker2hdf -i
FLAGS
-i, --input=
-o, --output=
-w, --includeRaw Include raw input file in HDF JSON file
GLOBAL FLAGS
-h, --help Show CLI help
-L, --logLevel=
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf convert netsparker2hdf -i netsparker_results.xml -o output-hdf-name.json
```
top
#### NeuVector to HDF
```
convert neuvector2hdf Translate a NeuVector results JSON to a Heimdall Data Format JSON file
USAGE
$ saf convert neuvector2hdf -i
FLAGS
-i, --input=
-o, --output=
-w, --includeRaw Include raw input file in HDF JSON file
GLOBAL FLAGS
-L, --logLevel=
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf convert neuvector2hdf -i neuvector.json -o output-hdf-name.json
```
top
#### Nikto to HDF
```
convert nikto2hdf         Translate a Nikto results JSON file into a Heimdall
Data Format JSON file.
Note: Currently this mapper only supports single
target Nikto Scans
USAGE
$ saf convert nikto2hdf -i
FLAGS
-i, --input=
-o, --output=
-w, --includeRaw Include raw input file in HDF JSON file
GLOBAL FLAGS
-h, --help Show CLI help
-L, --logLevel=
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf convert nikto2hdf -i nikto-results.json -o output-hdf-name.json
```
top
#### Prisma to HDF
```
convert prisma2hdf        Translate a Prisma Cloud Scan Report CSV file into
Heimdall Data Format JSON files
USAGE
$ saf convert prisma2hdf -i
FLAGS
-i, --input=
-o, --output=
GLOBAL FLAGS
-L, --logLevel=
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf convert prisma2hdf -i prismacloud-report.csv -o output-hdf-name.json
```
top
#### Prowler to HDF
```
convert prowler2hdf       Translate a Prowler-derived AWS Security Finding
Format results from JSONL
into a Heimdall Data Format JSON file
USAGE
$ saf convert prowler2hdf -i
FLAGS
-i, --input=
-o, --output=
GLOBAL FLAGS
-h, --help Show CLI help
-L, --logLevel=
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf convert prowler2hdf -i prowler-asff.json -o output-folder
```
top
#### Sarif to HDF
```
convert sarif2hdf         Translate a SARIF JSON file into a Heimdall Data
Format JSON file
USAGE
$ saf convert sarif2hdf -i
FLAGS
-i, --input=
-o, --output=
-w, --includeRaw Include raw input file in HDF JSON file
GLOBAL FLAGS
-h, --help Show CLI help
-L, --logLevel=
--interactive Collect input tags interactively (not available on all CLI commands)
DESCRIPTION
SARIF level to HDF impact mapping are:
SARIF level error -> HDF impact 0.7
SARIF level warning -> HDF impact 0.5
SARIF level note -> HDF impact 0.3
SARIF level none -> HDF impact 0.1
SARIF level not provided -> HDF impact 0.1 as default
EXAMPLES
$ saf convert sarif2hdf -i sarif-results.json -o output-hdf-name.json
```
top
#### Scoutsuite to HDF
```
convert scoutsuite2hdf Translate a ScoutSuite results from a Javascript
object into a Heimdall Data Format JSON file
Note: Currently this mapper only supports AWS
USAGE
$ saf convert scoutsuite2hdf -i
FLAGS
-i, --input=
-o, --output=
-w, --includeRaw Include raw input file in HDF JSON file
GLOBAL FLAGS
-h, --help Show CLI help
-L, --logLevel=
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf convert scoutsuite2hdf -i scoutsuite-results.js -o output-hdf-name.json
```
top
#### Snyk to HDF
```
convert snyk2hdf          Translate a Snyk results JSON file into a Heimdall
Data Format JSON file
A separate HDF JSON is generated for each project
reported in the Snyk Report
USAGE
$ saf convert snyk2hdf -i
FLAGS
-i, --input=
-o, --output=
GLOBAL FLAGS
-h, --help Show CLI help
-L, --logLevel=
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf convert snyk2hdf -i snyk_results.json -o output-file-prefix
```
top
#### SonarQube to HDF
NOTE: Pulling data from the SonarQube instance could take an extended amount of time depending on network conditions and the scale of the project being assessed.
NOTE: The SonarQube instance might need "warming up" before it properly returns all the code snippets and rules from its API, so repeated attempts at this command might be necessary.
```
convert sonarqube2hdf Pull SonarQube vulnerabilities for the specified
project name and optional branch or pull/merge
request ID name from an API and convert into a
Heimdall Data Format JSON file
USAGE
$ saf convert sonarqube2hdf -n
FLAGS
-a, --auth=
-n, --projectKey=
-o, --output=
-u, --url=
-b, --branch=
-p, --pullRequestID=
-g, --organization=
-w, --includeRaw Include raw input requests in HDF JSON file
GLOBAL FLAGS
-h, --help Show CLI help
-L, --logLevel=
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf convert sonarqube2hdf -n sonar_project_key -u http://sonar:9000 --auth abcdefg -p 123 -o scan_results.json -w
```
top
#### Splunk to HDF
```
convert splunk2hdf Pull HDF data from your Splunk instance back into an HDF file
USAGE
$ saf splunk2hdf -H
FLAGS
-H, --host=
-I, --index=
-P, --port=
-i, --input=
-o, --output=
-p, --password=
-s, --scheme=
-t, --token=
-u, --username=
GLOBAL FLAGS
-L, --logLevel=
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf convert splunk2hdf -H 127.0.0.1 -u admin -p Valid_password! -I hdf -i some-file-in-your-splunk-instance.json -i yBNxQsE1mi4f3mkjtpap5YxNTttpeG -o output-folder
```
top
#### Trivy to HDF
```
convert trivy2hdf Translate a Trivy-derived AWS Security Finding
Format results from JSONL
into a Heimdall Data Format JSON file
USAGE
$ saf convert trivy2hdf -i
FLAGS
-i, --input=
-o, --output=
GLOBAL FLAGS
-h, --help Show CLI help
-L, --logLevel=
--interactive Collect input tags interactively (not available on all CLI commands)
DESCRIPTION
Note: Currently this mapper only supports the results of Trivy's image
subcommand (featuring the CVE findings) while using the ASFF template format
(which comes bundled with the repo). An example call to Trivy to get this
type of file looks as follows:
AWS_REGION=us-east-1 AWS_ACCOUNT_ID=123456789012 trivy image --no-progress --format template --template "@/absolute_path_to/git_clone_of/trivy/contrib/asff.tpl" -o trivy_asff.json golang:1.12-alpine
EXAMPLES
$ saf convert trivy2hdf -i trivy-asff.json -o output-folder
```
top
#### Trufflehog to HDF
```
convert trufflehog2hdf Translate a Trufflehog output file into an HDF results set
USAGE
$ saf convert trufflehog2hdf -i
FLAGS
-i, --input=
-o, --output=
-w, --includeRaw Include raw input file in HDF JSON file
GLOBAL FLAGS
-h, --help Show CLI help
-L, --logLevel=
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf convert trufflehog2hdf -i trufflehog.json -o output-hdf-name.json
```
top
#### Twistlock to HDF
```
convert twistlock2hdf Translate a Twistlock CLI output file into an HDF results set
USAGE
$ saf convert twistlock2hdf -i
FLAGS
-i, --input=
-o, --output=
-w, --includeRaw Include raw input file in HDF JSON file
GLOBAL FLAGS
-h, --help Show CLI help
-L, --logLevel=
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf convert twistlock2hdf -i twistlock.json -o output-hdf-name.json
```
top
#### Veracode to HDF
```
convert veracode2hdf Translate a Veracode XML file into a Heimdall Data
Format JSON file
USAGE
$ saf convert veracode2hdf -i
FLAGS
-i, --input=
-o, --output=
GLOBAL FLAGS
-h, --help Show CLI help
-L, --logLevel=
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf convert veracode2hdf -i veracode_results.xml -o output-hdf-name.json
```
top
#### XCCDF Results to HDF
Note: only supports native OpenSCAP and SCC output.

```
convert xccdf_results2hdf Translate a SCAP client XCCDF-Results XML report
to a Heimdall Data Format JSON file
USAGE
$ saf convert xccdf_results2hdf -i
FLAGS
-i, --input=
-o, --output=
-w, --includeRaw Include raw input file in HDF JSON file
GLOBAL FLAGS
-h, --help Show CLI help
-L, --logLevel=
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf convert xccdf_results2hdf -i results-xccdf.xml -o output-hdf-name.json
```
top
#### OWASP ZAP to HDF
```
convert zap2hdf Translate a OWASP ZAP results JSON to a Heimdall Data Format JSON file
USAGE
$ saf convert zap2hdf -i
FLAGS
-i, --input=
-n, --name=
-o, --output=
-w, --includeRaw Include raw input file in HDF JSON file
GLOBAL FLAGS
-h, --help Show CLI help
-L, --logLevel=
--interactive Collect input tags interactively (not available on all CLI commands)
EXAMPLES
$ saf convert zap2hdf -i zap_results.json -n mitre.org -o scan_results.json
```
top
---
The SAF CLI implements the eMASS REST API capabilities by incorporating the eMASSer CLI into the SAF CLI. Please reference the eMASSer Features documentation for additional information.

To get top-level help, execute the following command:
```
$ saf emasser [-h or -help]
[eMASS] The eMASS REST API implementation
USAGE
$ saf emasser COMMAND
TOPICS
emasser delete eMass REST API DELETE endpoint commands
emasser get eMass REST API GET endpoint commands
emasser post eMass REST API POST endpoint commands
emasser put eMass REST API PUT endpoint commands
COMMANDS
emasser configure Generate a configuration file (.env) for accessing an eMASS instances.
emasser version Display the eMASS API specification version the CLI implements.
```
---
#### Heimdall
You can start a local Heimdall Lite instance to visualize your findings with the SAF CLI. To start an instance, use the `saf view heimdall` command:
```
view heimdall Run an instance of Heimdall Lite to
visualize your data
USAGE
$ saf view heimdall [-h] [-p
FLAGS
-h, --help Show CLI help
-f, --files=
-n, --noOpenBrowser Don't open the default browser automatically
-p, --port=
ALIASES
$ saf heimdall
EXAMPLES
$ saf view heimdall -p 8080
```
top
#### Summary
To get a quick compliance summary from an HDF file (grou