Revert "Remove driver builds, relying on downstream for support packages" #1740

Closed · wants to merge 1 commit
37 changes: 1 addition & 36 deletions .github/workflows/check-drivers-failures.yml
@@ -24,42 +24,7 @@ jobs:

- name: Check build failures
run: |
set -eo pipefail

FAILURES_DIR=${FAILURES_DIR:-/tmp/FAILURES}

shopt -s nullglob
cd "$FAILURES_DIR"
failure_files=(*/*/*.log)

for failure_file in "${failure_files[@]}"; do
if [[ "$failure_file" =~ ^([^/]+)/([^/]+)/([^/]+)\.log$ ]]; then
# If the file is empty, there's nothing to alert on. Sometimes kernels
# that don't support eBPF leave the error file hanging around for no
# good reason (I suspect some shenanigans with tee spawning after we
# mark the file for deletion).
if [[ ! -s "$failure_file" ]]; then
rm -f "$failure_file"
continue
fi

kernel_version="${BASH_REMATCH[1]}"
module_version="${BASH_REMATCH[2]}"
probe_type="${BASH_REMATCH[3]}"
echo >&2 "============================================================================"
echo >&2 "Failed to build ${probe_type} probe"
echo >&2 "Module version: ${module_version}"
echo >&2 "Kernel version: ${kernel_version}"
echo >&2
cat >&2 "$failure_file"
echo >&2
echo >&2
fi
done

# Re-expand the glob so the empty files removed above are not counted.
failure_files=(*/*/*.log)
[[ "${#failure_files[@]}" == 0 ]]
FAILURES_DIR=/tmp/FAILURES/ ${{ github.workspace }}/kernel-modules/build/drivers-build-failures.sh

- name: Slack notification
if: failure() && github.event_name != 'pull_request'
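
The 36 inline lines removed above are replaced by the single call to kernel-modules/build/drivers-build-failures.sh. That script is not part of this diff; the following is only a minimal sketch of what it presumably contains, assuming it simply wraps the inline logic being deleted here:

#!/usr/bin/env bash
# Sketch only -- the actual script ships as kernel-modules/build/drivers-build-failures.sh
set -eo pipefail

FAILURES_DIR=${FAILURES_DIR:-/tmp/FAILURES}

shopt -s nullglob
cd "$FAILURES_DIR"

for failure_file in */*/*.log; do
    # Empty logs carry no information worth alerting on; drop them.
    if [[ ! -s "$failure_file" ]]; then
        rm -f "$failure_file"
        continue
    fi

    if [[ "$failure_file" =~ ^([^/]+)/([^/]+)/([^/]+)\.log$ ]]; then
        echo >&2 "Failed to build ${BASH_REMATCH[3]} probe (module ${BASH_REMATCH[2]}, kernel ${BASH_REMATCH[1]})"
        cat >&2 "$failure_file"
    fi
done

# Re-expand the glob; the step fails if any non-empty failure log remains.
failure_files=(*/*/*.log)
[[ "${#failure_files[@]}" == 0 ]]
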
7 changes: 6 additions & 1 deletion .github/workflows/cpaas-sync-drivers.yml
@@ -120,24 +120,29 @@ jobs:
fi
done

# x86 support packages will not be uploaded to GCP.
# This is to prevent mixing them up with the upstream-built packages.
- name: Generate 'latest' and checksum files
if: matrix.platform != 'x86_64'
run: ${{ github.workspace }}/kernel-modules/support-packages/cpaas-additional-files.sh

- name: Create metadata.json
uses: ./.github/actions/support-package-metadata-json
if: matrix.platform != 'x86_64'
with:
support-pkg-dir: ${{ env.SUPPORT_PACKAGE_TMP_DIR }}

- name: Push support-packages
uses: 'google-github-actions/upload-cloud-storage@v2'
if: matrix.platform != 'x86_64'
with:
path: ${{ env.SUPPORT_PACKAGE_TMP_DIR }}
parent: false
destination: ${{ inputs.support-packages-bucket }}/${{ matrix.platform }}

- name: Push support-packages to public bucket
uses: 'google-github-actions/upload-cloud-storage@v2'
if: github.event_name != 'pull_request'
if: github.event_name != 'pull_request' && matrix.platform != 'x86_64'
with:
path: ${{ env.SUPPORT_PACKAGE_TMP_DIR }}
parent: false
252 changes: 252 additions & 0 deletions .github/workflows/drivers.yml
@@ -0,0 +1,252 @@
name: Build collector drivers

on:
workflow_call:
inputs:
drivers-bucket:
type: string
required: true
description: The GCP bucket to pull cached drivers from
bundles-bucket:
type: string
required: true
description: The GCP bucket to pull bundles from
branch-name:
type: string
required: true
description: Branch CI is running on
outputs:
parallel-jobs:
description: |
Number of builders used to build drivers, 0 if no driver is built
value: ${{ jobs.split-tasks.outputs.parallel-jobs }}

env:
DRIVERS_BUCKET: ${{ inputs.drivers-bucket }}

jobs:
# This sentinel job exists so that this workflow call "runs" even
# if the subsequent driver build jobs are skipped. That way, other
# workflows can depend on this workflow and still run when it is skipped.
#
# This only matters on release branches and tags, so the job is limited to those.
sentinel:
runs-on: ubuntu-latest
if: startsWith(github.ref_name, 'release-') || github.ref_type == 'tag'
steps:
- run: echo Drivers Build
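
To illustrate the comment above: in a calling workflow, a job that needs this reusable workflow would otherwise be skipped whenever every driver-build job is skipped. A hypothetical caller, with job names and bucket values that are illustrative rather than taken from this repository:

jobs:
  drivers:
    uses: ./.github/workflows/drivers.yml
    secrets: inherit   # the called workflow reads GCP credentials from secrets
    with:
      drivers-bucket: example-drivers-bucket   # illustrative value
      bundles-bucket: example-bundles-bucket   # illustrative value
      branch-name: ${{ github.ref_name }}

  package-drivers:
    runs-on: ubuntu-latest
    # On release branches and tags the sentinel gives the called workflow at
    # least one successful job, so this plain dependency is satisfied even
    # though split-tasks and build-drivers are skipped there.
    needs: drivers
    steps:
      - run: echo "driver builds finished or were skipped without failing"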

split-tasks:
runs-on: ubuntu-latest
if: |
(github.event_name == 'push' && github.ref_name == 'master') ||
github.event_name == 'pull_request'
outputs:
parallel-jobs: ${{ steps.set-parallel.outputs.parallel-jobs }}
parallel-array: ${{ steps.set-parallel.outputs.parallel-array }}
env:
KERNELS_FILE: ${{ github.workspace }}/kernel-modules/KERNEL_VERSIONS

steps:
- uses: actions/checkout@v4

- name: Patch files
env:
BUILD_LEGACY_DRIVERS: ${{ contains(github.event.pull_request.labels.*.name, 'build-legacy-probes') || github.event_name == 'push' }}
OSCI_RUN: 1
DOCKERIZED: 1
CHECKOUT_BEFORE_PATCHING: false

run: |
git fetch

# Initialize just the falco submodule
git submodule update --init ${{ github.workspace }}/falcosecurity-libs

${{ github.workspace }}/kernel-modules/build/patch-files.sh \
${{ inputs.branch-name }} \
"${BUILD_LEGACY_DRIVERS}" \
${{ github.workspace }} \
kernel-modules/build/prepare-src \
/tmp

- name: Authenticate with GCP
uses: 'google-github-actions/auth@v2'
with:
credentials_json: '${{ secrets.GOOGLE_CREDENTIALS_COLLECTOR_SVC_ACCT }}'

- name: 'Set up Cloud SDK'
uses: 'google-github-actions/setup-gcloud@v2'

- name: Create a mock cache
if: ${{ !contains(github.event.pull_request.labels.*.name, 'no-cache') }}
run: |
${{ github.workspace }}/kernel-modules/build/mock-cache.sh

- name: Get build tasks
env:
USE_KERNELS_FILE: true
DOCKERIZED: 1
OUTPUT_DIR: /tmp
CACHE_DIR: /tmp
BLOCKLIST_DIR: ${{ github.workspace }}/kernel-modules
SCRIPTS_DIR: ${{ github.workspace }}/kernel-modules/build

run: |
${{ github.workspace }}/kernel-modules/build/get-build-tasks.sh

mkdir -p /tmp/tasks
mv /tmp/build-tasks /tmp/tasks/all

- name: Set number of parallel builds
id: set-parallel
shell: python
run: |
import json
import math
import os

kernels = set()

with open('/tmp/tasks/all', 'r') as tasks:
for line in tasks.readlines():
kernel = line.split()[0]
kernels.add(kernel)

# Add a parallel job every 10 kernels, capped at 32
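# e.g. 45 distinct kernels -> ceil(45/10) = 5 parallel jobs; 500 kernels -> capped at 32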
parallel_jobs = math.ceil(len(kernels)/10)
if parallel_jobs > 32:
parallel_jobs=32

parallel = [a for a in range(parallel_jobs)]

with open(os.environ['GITHUB_OUTPUT'], 'a') as f:
f.write(f'parallel-jobs={parallel_jobs}\n')
f.write(f'parallel-array={json.dumps(parallel)}\n')

- name: Split kernels
env:
TASKS_DIR: /tmp/tasks
DRIVER_BUILDERS: ${{ steps.set-parallel.outputs.parallel-jobs }}

run: |
${{ github.workspace }}/kernel-modules/build/kernel-splitter.py

- name: Store tasks and sources
uses: actions/upload-artifact@v4
with:
name: tasks-and-sources
if-no-files-found: ignore
path: |
/tmp/kobuild-tmp/versions-src/
/tmp/tasks/**/**/all
retention-days: 7

build-drivers:
runs-on: ubuntu-latest
needs:
- split-tasks
if: ${{ needs.split-tasks.outputs.parallel-jobs > 0 }}
env:
BUILDERS_DIR: ${{ github.workspace }}/kernel-modules/build

strategy:
matrix:
builder: ${{ fromJSON(needs.split-tasks.outputs.parallel-array) }}

steps:
- uses: actions/checkout@v4

- name: Authenticate with GCP
uses: 'google-github-actions/auth@v2'
with:
credentials_json: '${{ secrets.GOOGLE_CREDENTIALS_COLLECTOR_SVC_ACCT }}'

- name: 'Set up Cloud SDK'
uses: 'google-github-actions/setup-gcloud@v2'

- name: Restore tasks and sources
uses: actions/download-artifact@v4
with:
name: tasks-and-sources
path: /tmp

- name: Set required builders
id: required-builders
run: |
builders=()

for builder_file in "${BUILDERS_DIR}/"*.Dockerfile; do
builder="${builder_file%".Dockerfile"}"
builder="${builder#"${BUILDERS_DIR}/"}"

if [[ ! -f "/tmp/tasks/${builder}/${{ matrix.builder }}/all" ]]; then
continue
fi

tasks="$(wc -l < "/tmp/tasks/${builder}/${{ matrix.builder }}/all")"

if ((tasks)); then
builders+=("${builder}")
fi
done

echo "builders=${builders[*]}" >> "$GITHUB_OUTPUT"

- name: Build builders
if: ${{ steps.required-builders.outputs.builders != '' }}
run: |
# Shellcheck gets confused here: the for loop expands to the required
# builders and runs once for each of them.
# shellcheck disable=SC2043
for builder in ${{ steps.required-builders.outputs.builders }}; do
docker build --tag "${builder}:latest" \
-f "${BUILDERS_DIR}/${builder}.Dockerfile" \
${{ github.workspace }}/kernel-modules/build/
done

- name: Build drivers
if: ${{ steps.required-builders.outputs.builders != '' }}
run: |
mkdir -p /tmp/{output,bundles,FAILURES}

# This is required for GHA to upload the build-failures artifact even when
# no build fails.
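# (upload-artifact creates no artifact at all for an empty directory, even with
# if-no-files-found: ignore, so the dummy file presumably guarantees downstream
# jobs always have a build-failures artifact to download.)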
touch /tmp/FAILURES/.dummy

# Shellcheck gets confused here: the for loop expands to the required
# builders and runs once for each of them.
# shellcheck disable=SC2043
for builder in ${{ steps.required-builders.outputs.builders }}; do
# Download bundles for current builder
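# Each task line is expected to start with the kernel version ($1), which is
# why it maps directly onto the bundle-<kernel>.tgz objects in the bucket.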
awk '{ print "gs://${{ inputs.bundles-bucket }}/bundle-"$1".tgz" }' "/tmp/tasks/${builder}/${{ matrix.builder }}/all" |
sort | uniq | gsutil -m cp -I /tmp/bundles

docker run --rm -i \
-v /tmp/tasks:/tasks:ro \
-v /tmp/kobuild-tmp/versions-src:/kobuild-tmp/versions-src \
-v /tmp/output:/kernel-modules \
-v /tmp/bundles:/bundles:ro \
-v /tmp/FAILURES:/FAILURES \
-e DOCKERIZED=1 \
--name "${builder}" \
"${builder}:latest" < "/tmp/tasks/${builder}/${{ matrix.builder }}/all"

rm -rf /tmp/bundles/*
done

- name: Store built drivers
uses: actions/upload-artifact@v4
with:
name: built-drivers-${{ matrix.builder }}
path: /tmp/output
if-no-files-found: ignore
retention-days: 1

- name: Store build failures
uses: actions/upload-artifact@v4
with:
name: driver-build-failures-${{ matrix.builder }}
path: /tmp/FAILURES
if-no-files-found: ignore
retention-days: 1
18 changes: 18 additions & 0 deletions .github/workflows/init.yml
@@ -22,9 +22,19 @@ on:
description: |
Branch CI is running on
value: ${{ jobs.common-variables.outputs.branch-name }}
drivers-bucket:
description: Bucket used to pull collector drivers from
value: ${{ jobs.common-variables.outputs.drivers-bucket }}
merged-drivers-bucket:
description: Bucket used to push collector drivers into
value: ${{ jobs.common-variables.outputs.merged-drivers-bucket }}
push-drivers-bucket:
description: Bucket used to push collector drivers into
value: ${{ jobs.common-variables.outputs.push-drivers-bucket }}
bundles-bucket:
description: |
Bucket used to download kernel bundles when building drivers
value: ${{ jobs.common-variables.outputs.bundles-bucket }}
support-packages-bucket:
description: |
Bucket to push built support-packages into
@@ -66,7 +76,10 @@ jobs:
collector-qa-tag: ${{ steps.collector-env.outputs.collector-qa-tag }}
collector-image: ${{ steps.collector-env.outputs.collector-image }}
branch-name: ${{ steps.collector-env.outputs.branch-name }}
drivers-bucket: ${{ steps.gcp-buckets.outputs.drivers-bucket }}
merged-drivers-bucket: ${{ steps.gcp-buckets.outputs.merged-drivers-bucket }}
push-drivers-bucket: ${{ steps.gcp-buckets.outputs.push-drivers-bucket }}
bundles-bucket: ${{ steps.gcp-buckets.outputs.bundles-bucket }}
support-packages-bucket: ${{ steps.gcp-buckets.outputs.support-packages-bucket }}
public-support-packages-bucket: ${{ steps.gcp-buckets.outputs.public-support-packages-bucket }}
cpaas-drivers-bucket: ${{ steps.gcp-buckets.outputs.cpaas-drivers-bucket }}
@@ -127,6 +140,7 @@ jobs:
MERGED_DRIVER_BUCKET="${MAIN_DRIVER_BUCKET}/merged-build"
STAGING_DRIVER_BUCKET="stackrox-collector-modules-staging/pr-builds/${STAGING_RELATIVE_PATH}"
STAGING_MERGED_DRIVER_BUCKET="${STAGING_DRIVER_BUCKET}/merged-build"
BUNDLES_BUCKET="collector-kernel-bundles-public"
SUPPORT_PACKAGES_BUCKET="sr-roxc/collector/support-packages"
STAGING_SUPPORT_PACKAGES_BUCKET="${SUPPORT_PACKAGES_BUCKET}/${STAGING_RELATIVE_PATH}"
PUBLIC_SUPPORT_PACKAGES_BUCKET="collector-support-public/offline/v1/support-packages"
@@ -139,11 +153,14 @@
CPAAS_STAGING_SUPPORT_PACKAGES_BUCKET="${STAGING_SUPPORT_PACKAGES_BUCKET}"

{
echo "drivers-bucket=${MAIN_DRIVER_BUCKET}"
echo "bundles-bucket=${BUNDLES_BUCKET}"
echo "public-support-packages-bucket=${PUBLIC_SUPPORT_PACKAGES_BUCKET}"
} >> "$GITHUB_OUTPUT"

if [[ ${{ github.event_name }} == "pull_request" ]]; then
{
echo "push-drivers-bucket=${STAGING_DRIVER_BUCKET}"
echo "merged-drivers-bucket=${STAGING_MERGED_DRIVER_BUCKET}"
echo "support-packages-bucket=${STAGING_SUPPORT_PACKAGES_BUCKET}"
if [[ "${{ inputs.cpaas-workflow }}" == "true" &&
@@ -161,6 +178,7 @@
} >> "$GITHUB_OUTPUT"
else
{
echo "push-drivers-bucket=${MAIN_DRIVER_BUCKET}"
echo "merged-drivers-bucket=${MERGED_DRIVER_BUCKET}"
echo "support-packages-bucket=${SUPPORT_PACKAGES_BUCKET}"
echo "cpaas-drivers-bucket=${CPAAS_DRIVERS_BUCKET}"