Allow LAVA jobs to pass artifacts back to the runner #17

Open · wants to merge 1 commit into main
Commits on Jan 22, 2023

  1. Allow LAVA jobs to pass artifacts back to the runner

    We'd like to be able to add artifacts created by LAVA jobs to the
    archive stored by GitLab. To achieve this, we run a cut-down web
    server on the lava-gitlab-runner that responds to POST requests at
    
       http://<runner ip:ephemeral port>/<key>/artifacts/
    
    The LAVA job learns the upload URL because we introduce a new
    namespace, `runner`, for templated variables in the lava-gitlab-runner
    and add `ARTIFACT_UPLOAD_URL` to it.
    
    It's still relatively complicated to get variables into LAVA tests;
    the pattern I used in my test repository
    
       https://gitlab.collabora.com/eds/callback-tests
    
    is to create a parameter called `CALLBACK_URL` in the test itself, and
    then in the job we can use a stanza like:
    
    ```yaml
       test: foo
         parameters:
           CALLBACK_URL: {{ '{{ runner.ARTIFACT_UPLOAD_URL }}' }}
    ```
    
    to make it available to the test. Bear in mind, if you use this, that
    LAVA does not automatically export parameters as environment
    variables, so you will need to export the parameter inside your
    `steps:` if you want to use it in scripts, as in the sketch below.
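
    For illustration, a test definition along these lines could export the
    parameter and push a file back to the runner. This sketch is not part
    of the change itself: the test name, the file produced, and the exact
    shape of the upload request (file name appended to the `/artifacts/`
    URL, raw request body) are assumptions to be checked against the
    runner's web server.
    
    ```yaml
    # Hypothetical LAVA test definition using the CALLBACK_URL parameter.
    metadata:
      format: Lava-Test Test Definition 1.0
      name: callback-upload
      description: Push a generated file back to the lava-gitlab-runner
    run:
      steps:
        # Parameters are not exported automatically, so make CALLBACK_URL
        # visible to any scripts these steps invoke.
        - export CALLBACK_URL
        - echo "hello from the device" > result.txt
        # Assumption: the artifact is sent as the raw POST body, with the
        # file name appended to the /artifacts/ upload URL (which ends in
        # a trailing slash).
        - curl --fail -X POST --data-binary @result.txt "${CALLBACK_URL}result.txt"
    ```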
    
    The `key` part of the upload URL is a long random string. It's
    generated uniquely per GitLab job, not per LAVA job, although this
    detail is not important unless you are performing multi-node tests.
    
    Because of the dynamic nature of the key, and the runner's port and
    IP, artifact upload is only possible for `submit` jobs. For `monitor`
    jobs, there's simply no way to communicate the necessary URL to them.
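
    For reference, a `submit` job on the GitLab side might look like the
    sketch below. The job name, runner tag, and job file path are
    placeholders, and driving the runner with a `submit` line in `script:`
    is assumed from the runner's usual usage rather than from this change.
    
    ```yaml
    # Hypothetical .gitlab-ci.yml job handled by the lava-gitlab-runner.
    lava-test:
      tags:
        - lava-runner          # placeholder tag routing the job to the runner
      script:
        # A submitted job definition is templated by the runner, so it can
        # reference {{ runner.ARTIFACT_UPLOAD_URL }}; a `monitor` job that
        # attaches to an already-running LAVA job cannot receive the URL.
        - submit lava/callback-job.yaml
    ```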
    
    Backing the web server, there is a shared `UploadServer` that stores
    the uploaded artifacts and bridges between the web server thread and
    the job thread. It stores a `JobArtifacts` for each active job, which
    the `ArtifactStore` can query when we come to upload files. I've elected
    to put the uploaded artifacts at:
    
      `<job_id>_artifacts/`
    
    in the archive, to match the other uploads, which also lead with the
    job ID. Note, however, that it will require significant reworking
    to support distinct directories for multi-node jobs. That's because we
    do not know how many nodes there are in the job until after we submit,
    at which point it's too late to create new keys for the other jobs.
    We could speculatively create a surplus of keys, but we couldn't
    then tie them to job IDs anyway.
    
    For the generated upload URL, note that you can use the environment
    variables `LAVA_GITLAB_RUNNER_ROUTABLE_HOST` and
    `LAVA_GITLAB_RUNNER_ROUTABLE_PORT` to specify an arbitrary external
    address, for example that of an appropriate reverse proxy. Also, the
    upload URL will always use `https`, even though the service itself
    does not have TLS support.
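
    For example, if the runner happened to be deployed with docker-compose
    behind a TLS-terminating reverse proxy, the two variables could be set
    as below. The service name, image, and proxy hostname are illustrative
    assumptions, not part of this change.
    
    ```yaml
    # Hypothetical docker-compose service for the runner behind a reverse
    # proxy; only the two LAVA_GITLAB_RUNNER_ROUTABLE_* variables come from
    # this change.
    services:
      lava-gitlab-runner:
        image: registry.example.com/lava-gitlab-runner:latest   # placeholder
        environment:
          # Address that LAVA devices can actually reach, i.e. the proxy.
          LAVA_GITLAB_RUNNER_ROUTABLE_HOST: runner-uploads.example.com
          LAVA_GITLAB_RUNNER_ROUTABLE_PORT: "443"
    ```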
    
    Signed-off-by: Ed Smith <[email protected]>
    eds-collabora committed efd3f33 on Jan 22, 2023