Supply-Chain Security

CSA-0032 — reference for the CSA-in-a-Box supply-chain guarantees. This document is the single place to look when answering:

  • How do I regenerate the dependency lock files?
  • Where are the SBOMs published?
  • How do I verify SLSA provenance on a release artifact?
  • How do I run Trivy locally?
  • A CVE just landed in a dependency — what do I do?

Overview

| Control | Workflow | Artifact | Audience |
| --- | --- | --- | --- |
| Pinned, hash-verified Python deps | (local) scripts/update-locks.sh | requirements/*.lock | Maintainers, CI, Docker |
| CycloneDX + SPDX SBOM (Python + images) | .github/workflows/sbom.yml | Workflow artifacts + Release assets | SOC, customers, auditors |
| CVE scanning (CRITICAL gate) | .github/workflows/trivy.yml | PR comment, Code Scanning SARIF | Reviewers |
| SLSA Level 3 build provenance | .github/workflows/slsa-provenance.yml | Signed *.intoto.jsonl on Release | Downstream consumers |
| Weekly dependency upgrades | .github/dependabot.yml | PRs labeled dependencies | Maintainers |

1. Regenerating the lock files

The lock files in requirements/*.lock are the single source of truth for reproducible installs. They are generated by pip-compile (from the pip-tools package) directly from pyproject.toml — each optional-dependency extra gets its own lock.

# One-time setup
python -m pip install --upgrade 'pip-tools>=7'

# Regenerate every lock (recommended)
./scripts/update-locks.sh

# Regenerate a single extra
./scripts/update-locks.sh portal

# Regenerate a subset
./scripts/update-locks.sh portal dev governance

The script:

  • Invokes python -m piptools compile --generate-hashes --strip-extras --allow-unsafe --no-emit-index-url --extra=<name> pyproject.toml for each extra.
  • Runs from the repo root so pyproject.toml is picked up correctly.
  • Produces requirements/<extra>.lock with full sha256 pinning.

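Sketched in Python, the per-extra invocation looks like this (the `--output-file` flag and the way arguments are assembled are assumptions about the script's internals, not a copy of it):

```python
# Hypothetical sketch of the pip-compile invocation scripts/update-locks.sh
# runs once per extra, mirroring the flags in the bullet list above.
def compile_argv(extra: str) -> list[str]:
    return ["python", "-m", "piptools", "compile",
            "--generate-hashes", "--strip-extras", "--allow-unsafe",
            "--no-emit-index-url", f"--extra={extra}",
            f"--output-file=requirements/{extra}.lock", "pyproject.toml"]

# Print the command for one extra instead of executing it.
print(" ".join(compile_argv("portal")))
```
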
When to regenerate:

  • Any time pyproject.toml changes (new dep, version bump, new extra).
  • When Dependabot proposes an upgrade PR — regenerate the locks locally and push them to the same PR branch.
  • After a CVE-response upgrade (see §6).

Never hand-edit *.lock — it is a mechanical output. Edit pyproject.toml and re-run scripts/update-locks.sh.
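
For orientation, a pip-compile-generated entry in one of these locks looks like this (package, version, and via-chain are illustrative; hashes elided):

```
fastapi==0.115.0 \
    --hash=sha256:... \
    --hash=sha256:...
    # via csa-inabox (pyproject.toml)
```

The `# via` annotation records the dependency chain, which is what makes CVE triage greppable (see §6.1).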

2. SBOM publication

The sbom.yml workflow runs:

  • On every push to main that touches requirements/*.lock, pyproject.toml, or the portal Dockerfiles.
  • On every published GitHub Release.
  • On manual workflow_dispatch.

It emits two SBOM formats — CycloneDX (JSON) and SPDX (JSON) — for each of two surfaces:

  1. Python lock files — one SBOM pair per extra (base, dev, governance, functions, platform, portal, bff, postgres, copilot, streaming).
  2. Portal container images — portal-backend, portal-frontend (built locally from portal/kubernetes/docker/*/Dockerfile).

Where to find the SBOMs

| Location | How | Retention |
| --- | --- | --- |
| Workflow run artifacts | Actions → "SBOM" run → artifacts | 90 days |
| GitHub Release assets | Releases page → expand release | Permanent |

Artifact naming pattern:

  • sbom-python-<extra>-cyclonedx.json
  • sbom-python-<extra>-spdx.json
  • sbom-image-<component>-cyclonedx.json
  • sbom-image-<component>-spdx.json

Validating an SBOM locally

# CycloneDX CLI validator (binary from the CycloneDX/cyclonedx-cli releases page)
cyclonedx validate --input-file sbom-python-portal-cyclonedx.json

# SPDX tools
pip install spdx-tools
pyspdxtools --infile sbom-python-portal-spdx.json
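
If neither validator is installed, a quick structural sanity check needs only the standard library. The sample document below is a hypothetical stand-in for a real sbom-python-portal-cyclonedx.json:

```python
import json

# Hypothetical, trimmed CycloneDX JSON document; in practice, load the
# downloaded SBOM file with json.load(open(path)).
sample = json.loads("""{
  "bomFormat": "CycloneDX",
  "specVersion": "1.5",
  "components": [
    {"type": "library", "name": "requests", "version": "2.32.3"}
  ]
}""")

def check_cyclonedx(doc):
    # Assert the two fields every CycloneDX JSON document must carry,
    # then return the component inventory as (name, version) pairs.
    assert doc.get("bomFormat") == "CycloneDX", "not a CycloneDX document"
    assert "specVersion" in doc, "missing specVersion"
    return [(c["name"], c.get("version")) for c in doc.get("components", [])]

print(check_cyclonedx(sample))   # [('requests', '2.32.3')]
```

This is a smoke test only; schema-level validation still belongs to the dedicated tools above.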

3. SLSA Level 3 provenance

The slsa-provenance.yml workflow runs on every published Release (on: release: types: [published]). It uses the official slsa-framework/slsa-github-generator reusable workflows:

  • Python artifacts — generator_generic_slsa3.yml@v2.1.0 produces a signed csa-inabox-python.intoto.jsonl attestation for the sdist and wheel. Sigstore Fulcio + Rekor provide keyless signing; no secret signing key is stored in the repo.
  • Portal container images — generator_container_slsa3.yml@v2.1.0 produces a signed attestation for each image by digest after it has been pushed to ghcr.io/<repo>/portal-{backend,frontend}:<tag>.

Verifying provenance on a release artifact

# Install slsa-verifier
go install github.com/slsa-framework/slsa-verifier/v2/cli/slsa-verifier@latest

# Verify a Python artifact
slsa-verifier verify-artifact csa_inabox-0.1.0-py3-none-any.whl \
    --provenance-path csa-inabox-python.intoto.jsonl \
    --source-uri github.com/<owner>/csa-inabox \
    --source-tag v0.1.0

# Verify a container image
slsa-verifier verify-image \
    ghcr.io/<owner>/csa-inabox/portal-backend@sha256:<digest> \
    --source-uri github.com/<owner>/csa-inabox \
    --source-tag v0.1.0

A PASS means the artifact was built by the exact workflow in this repo at the specified tag, on GitHub-hosted runners, with the exact source commit that the tag points to. Any tampering — modified bytes, forged signatures, swapped registry — is rejected.

What SLSA Level 3 gives you

  • Non-falsifiable — provenance is produced by an isolated builder (the SLSA reusable workflow) that the calling repo cannot influence.
  • Signed — every *.intoto.jsonl is Sigstore-signed against a short-lived Fulcio certificate; the signature is logged in the public Rekor transparency log.
  • Reproducible source mapping — the attestation binds the artifact hash to the exact commit, workflow file path, and trigger event.
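
The *.intoto.jsonl attestation itself is a DSSE envelope whose payload is base64-encoded JSON. This sketch builds a hypothetical envelope and decodes it to show the subject binding; it inspects the attestation but does not verify signatures (that is slsa-verifier's job):

```python
import base64, json

# Hypothetical in-toto statement: the real one is produced by the SLSA
# reusable workflow and carries a full provenance predicate.
statement = {
    "_type": "https://in-toto.io/Statement/v0.1",
    "subject": [{"name": "csa_inabox-0.1.0-py3-none-any.whl",
                 "digest": {"sha256": "0" * 64}}],
    "predicateType": "https://slsa.dev/provenance/v0.2",
}

# DSSE envelope shape, as found in a *.intoto.jsonl release asset.
envelope = {
    "payloadType": "application/vnd.in-toto+json",
    "payload": base64.b64encode(json.dumps(statement).encode()).decode(),
    "signatures": [],
}

# Decode the payload and print what each subject's hash is bound to.
decoded = json.loads(base64.b64decode(envelope["payload"]))
for s in decoded["subject"]:
    print(s["name"], s["digest"]["sha256"][:12])
```
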

4. Running Trivy locally

The trivy.yml workflow scans three surfaces on every PR. You can reproduce each scan locally before opening a PR.

# Install
brew install aquasecurity/trivy/trivy         # macOS
# or
scoop install trivy                            # Windows
# or
sudo apt install trivy                         # Debian/Ubuntu

# 1. Filesystem scan — the one that gates PRs on CRITICAL.
trivy fs --severity CRITICAL --exit-code 1 \
    --ignore-unfixed --vuln-type library \
    requirements/

# 2. Docker image scan (build locally first).
docker build -t csa/portal-backend:local \
    -f portal/kubernetes/docker/backend/Dockerfile .
trivy image --severity CRITICAL --exit-code 1 \
    --ignore-unfixed csa/portal-backend:local

# 3. Config scan (Dockerfile + k8s).
trivy config --severity HIGH,CRITICAL portal/kubernetes/

PR gating rules

| Severity | Lock files | Container images | Config |
| --- | --- | --- | --- |
| CRITICAL | Blocks PR | Blocks PR | Reports only |
| HIGH | PR comment + SARIF | SARIF | Reports only |
| MEDIUM and below | Workflow summary | Workflow summary | Ignored |

The HIGH PR comment is idempotent — the workflow updates an existing comment rather than spamming a new one each run. SARIF output is surfaced in GitHub Code Scanning alongside CodeQL.
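
The CRITICAL gate on the lock files can also be reproduced from `trivy fs --format json` output. The report below is a hypothetical, trimmed example of Trivy's JSON shape:

```python
import json

# Hypothetical trimmed `trivy fs --format json` report; in practice, load
# the file Trivy writes with --output.
report = json.loads("""{
  "Results": [
    {"Target": "requirements/portal.lock",
     "Vulnerabilities": [
       {"VulnerabilityID": "CVE-2024-0001", "Severity": "HIGH"},
       {"VulnerabilityID": "CVE-2024-0002", "Severity": "CRITICAL"}
     ]}
  ]
}""")

# Collect every CRITICAL finding; a non-empty list fails the gate,
# matching `--severity CRITICAL --exit-code 1`.
criticals = [v["VulnerabilityID"]
             for r in report.get("Results", [])
             for v in r.get("Vulnerabilities", []) or []
             if v["Severity"] == "CRITICAL"]
print("gate:", "FAIL" if criticals else "PASS", criticals)
```
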

5. Dependabot configuration

.github/dependabot.yml watches three ecosystems:

  • pip (root, weekly) — Python deps with an azure-* group.
  • github-actions (root, weekly) — pins action SHAs.
  • npm (portal/react-webapp, weekly) — frontend deps.

When Dependabot opens a Python PR, run ./scripts/update-locks.sh locally and force-push the regenerated requirements/*.lock to the same PR branch so the lock stays in sync with the pyproject.toml bump.

Suggested further groupings (non-blocking)

  • An opentelemetry-* group to batch OTel instrumentation bumps (shipped across portal, copilot).
  • An azure-mgmt-* group separate from azure-* so management-plane churn doesn't dominate PR review.
  • A pytest-* group to batch the dev test stack.

These are quality-of-life improvements and are not required for CSA-0032 closure.
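
If adopted, those groupings would look roughly like this in .github/dependabot.yml (a sketch: the schedule and directory mirror the existing pip entry, and how these patterns interact with the current azure-* group would need checking):

```yaml
version: 2
updates:
  - package-ecosystem: "pip"
    directory: "/"
    schedule:
      interval: "weekly"
    groups:
      opentelemetry:
        patterns:
          - "opentelemetry-*"
      azure-mgmt:
        patterns:
          - "azure-mgmt-*"
      pytest:
        patterns:
          - "pytest-*"
```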

6. CVE incident response

When a CVE is disclosed against a dependency that CSA-in-a-Box ships (detected by Trivy, CodeQL, a Dependabot security alert, or an external report):

6.1. Triage (within 1 business day)

  1. Confirm the CVE ID and affected version range on the GitHub Advisory DB.
  2. Identify which extras pull in the vulnerable package. Grep requirements/*.lock for the package name — the # via comment in each lock shows the dependency chain.
  3. Classify severity using CVSS + reachability:
    • Critical — remotely exploitable, reachable from the portal or an Azure Function entry point. Mitigate within 72 hours.
    • High — exploitable but not network-reachable. Mitigate within 7 days.
    • Medium/Low — track in the next Dependabot batch.
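
Step 2 of the triage can be scripted. This sketch searches lock content for the vulnerable package; the lock snippets here are hypothetical, and in practice you would read each requirements/*.lock file from disk:

```python
import re

# Hypothetical lock contents keyed by extra name, in pip-compile's
# output format (pinned version plus a "# via" dependency chain).
locks = {
    "portal": "cryptography==42.0.0\n    # via\n    #   azure-identity\n",
    "dev": "pytest==8.2.0\n    # via csa-inabox (pyproject.toml)\n",
}

def extras_pulling(package: str, locks: dict[str, str]) -> list[str]:
    # A pinned line starts with "<package>==" at the beginning of a line.
    pat = re.compile(rf"^{re.escape(package)}==", re.MULTILINE)
    return sorted(name for name, text in locks.items() if pat.search(text))

print(extras_pulling("cryptography", locks))   # ['portal']
```
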

6.2. Remediate

Option A — patched version available (the common case):

  1. Bump the floor in pyproject.toml to the fixed version (preserving the upper bound). Example: cryptography>=42.0.4,<47.0.0.
  2. Regenerate the affected locks:
    ./scripts/update-locks.sh portal copilot  # only the impacted extras
    
  3. Open a PR titled fix(security): bump <pkg> to <ver> for CVE-YYYY-NNNN.
  4. Confirm the Trivy CRITICAL gate now passes on the PR.
  5. Merge and cut a patch release (vX.Y.Z+1).

Option B — no patch available yet:

  1. Evaluate whether the vulnerable code path is reachable. Document the reachability analysis in the PR or advisory.
  2. If unreachable, add a time-boxed Trivy ignore with the CVE ID and an expiry date — track in .trivyignore with a comment linking to the upstream issue.
  3. If reachable, mitigate at the application layer (WAF rule, feature flag, configuration change) and pin to the last-known-good version.
  4. Subscribe to the upstream advisory so we're notified when the patch lands.
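
A time-boxed ignore entry in .trivyignore might look like this (the CVE ID, dates, and link are placeholders):

```
# CVE-YYYY-NNNN in <pkg>: vulnerable code path unreachable from our entry
# points; reachability analysis in the advisory. Upstream fix tracked at
# <link to upstream issue>. Remove this entry by YYYY-MM-DD.
CVE-YYYY-NNNN
```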

6.3. Publish the fix

  1. After merge, the sbom.yml workflow re-publishes SBOMs for the next build — these reflect the patched versions.
  2. The Release pipeline issues new SLSA provenance for the patched artifacts.
  3. File a security advisory on GitHub if the CVE affected a published release (Security tab → Advisories → New).
  4. Update the CHANGELOG under ### Security.

6.4. Post-mortem (for Critical CVEs only)

Within 2 weeks of remediation, open an ADR covering:

  • Timeline from disclosure → patch merge → release.
  • Whether existing controls (Dependabot, Trivy, CodeQL) detected the issue, and if not, why.
  • Any changes to the supply-chain pipeline to catch the next one earlier.

7. Responsible disclosure

If you believe you have found a vulnerability in CSA-in-a-Box itself (not a dependency), report it privately per .github/SECURITY.md. Do not open a public issue or PR.


Last updated: CSA-0032 closure. Owners: Platform Security — @CODEOWNERS:.github/workflows/


See also: