- Oct 17, 2024 -- Marc Burnie [AWS]
- Apr 06, 2024 -- David Diederich
- Jun 07, 2023 -- David Diederich
  Dependencies listed here should only be packages used as tools, never packages used as libraries. They are excluded from the license analysis step, because most license terms are not triggered when a package is used only as a tool.
- Apr 07, 2023 -- David Diederich
- Jan 16, 2023 -- David Diederich
- Nov 04, 2022 -- Christophe Lallement
- Oct 25, 2022 -- Christophe Lallement
- Oct 25, 2022 -- Christophe Lallement
- May 27, 2022 -- David Diederich
  GitLab 15.0 removed this field. Documentation: https://docs.gitlab.com/ee/ci/yaml/artifacts_reports.html
- Apr 20, 2022 -- David Diederich
  This works by capturing the standard output of the `pip install` command to see where it downloads packages from. The analysis step uses that output to separate the libraries into first-party and third-party dependencies. Package names and versions are discovered by looking for setup.py files; individual files can be skipped by adding them to the IGNORE_PYTHON_SETUP variable.
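A rough sketch of that log-scraping step, with placeholder host and file names (the internal index host and package names below are invented for illustration, not the pipeline's actual values). In CI, the log would come from `pip install ... | tee pip-install.log`; here a sample log is processed directly:

```shell
#!/bin/sh
# Simulate a captured pip install log (in CI: pip install ... | tee pip-install.log).
cat > pip-install.log <<'EOF'
Collecting requests
  Downloading https://pypi.org/packages/requests-2.27.1-py2.py3-none-any.whl
Collecting osdu-core-lib
  Downloading https://repo.internal.example.com/packages/osdu-core-lib-1.0.0.tar.gz
EOF

# Extract the download URLs pip reported.
grep -oE 'https?://[^ ]+' pip-install.log > download-urls.txt

# Split by source host: packages fetched from the (placeholder) internal
# index count as first-party, everything else as third-party.
grep    'repo.internal.example.com' download-urls.txt > first-party.txt || true
grep -v 'repo.internal.example.com' download-urls.txt > third-party.txt || true
```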
- Feb 28, 2022 -- David Diederich
  The for loops are both more verbose and hidden in the job logs. xargs is a little arcane, but on balance, having to look up the CI logic seems worse than using "clever" UNIX commands.
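The trade-off can be seen side by side. Both forms below delete the same set of files (the file names are invented for illustration); the for loop spells the logic out but expands invisibly in the job trace, while xargs stays a single visible command:

```shell
#!/bin/sh
# A list of files to remove, plus the files themselves.
printf 'a.tmp\nb.tmp\n' > files.txt
touch a.tmp b.tmp

# Verbose form: each iteration is hidden inside the loop in the job log.
for f in $(cat files.txt); do
  rm -f "$f"
done

# Recreate the files, then do the same with the terse form:
# one line in the trace, same effect.
touch a.tmp b.tmp
xargs rm -f < files.txt
```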
- Feb 28, 2022 -- David Diederich
- Feb 28, 2022 -- David Diederich
  In case the same URL is present in multiple requirements files, we pass them through `sort -u` before appending the rest of the frozen requirements. This avoids duplication, which isn't wrong per se but can be confusing.
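A minimal sketch of that deduplication, assuming illustrative file names and URLs: a URL collected twice survives `sort -u` only once, and the frozen requirements are appended afterwards.

```shell
#!/bin/sh
# Index URLs gathered from several requirements files, with a duplicate.
cat > urls-raw.txt <<'EOF'
--extra-index-url https://repo.example.com/simple
--extra-index-url https://repo.example.com/simple
EOF
# Stand-in for the `pip freeze` output.
printf 'requests==2.27.1\n' > frozen.txt

# Deduplicate the URLs, then append the frozen requirements.
sort -u urls-raw.txt > all-requirements.txt
cat frozen.txt >> all-requirements.txt
```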
- Feb 28, 2022 -- David Diederich
  This makes the job trace look better: because multi-line commands are collapsed in the output, it is less clear what's happening.
- Feb 28, 2022 -- David Diederich
  This can be done from the requirements files directly, or via a PIP_EXTRA_INDEX_URLS variable in the GitLab CI file. These are used in the `pip install` steps, but also embedded in the resulting `all-requirements.txt` file. Additionally, there is a `REQ_COLLECT` variable available, which can be used to provide custom steps for generating a suitable `all-requirements.txt` file.
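One way those variables could be wired together, sketched under assumed defaults (the URL, file names, and exact branching are illustrative; the real CI template's logic may differ):

```shell
#!/bin/sh
# Placeholder values; in CI these would come from the GitLab variables.
PIP_EXTRA_INDEX_URLS="https://repo.example.com/simple"
printf 'requests==2.27.1\n' > frozen.txt   # stand-in for pip freeze output

if [ -n "${REQ_COLLECT:-}" ]; then
  # Project-supplied custom steps produce all-requirements.txt themselves.
  sh -c "$REQ_COLLECT"
else
  # Default: embed each extra index URL, then append the frozen requirements.
  : > all-requirements.txt
  for url in $PIP_EXTRA_INDEX_URLS; do
    echo "--extra-index-url $url" >> all-requirements.txt
  done
  cat frozen.txt >> all-requirements.txt
fi
```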
- Feb 10, 2022 -- David Diederich
  This cache is shared across branches, and can cause FOSSA to see dependencies from other branches. As a fast and simple implementation, download everything fresh every time.
- Feb 03, 2022 -- fabian serin
- Feb 02, 2022 -- fabian serin
- Jan 28, 2022 -- fabian serin
- Sep 17, 2021 -- David Diederich
  Pipeline artifacts are a substantial part of the utilized disk space, so I'm hoping this helps free up some space.
- Nov 18, 2020 -- David Diederich
- Nov 18, 2020 -- David Diederich
- Nov 18, 2020 -- David Diederich
- Nov 18, 2020 -- David Diederich
  Change of FOSSA strategy: instead of trying to generate the requirements in the fossa-analyze stage, do it in the compile job. This way, a project needing a different version of Python can override the image in the build-and-unit-test job, and the new pip / venv commands will be used to generate the full dependency list (`pip freeze`). Such a project couldn't reasonably be expected to come up with a new version of the fossa-cli-utilities image based on a different Python package.
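The compile-job flow might look roughly like this (the venv path and default command are assumptions for illustration): create a virtualenv with whatever Python the project's image provides, then let `pip freeze` write the full transitive dependency list for the later fossa-analyze stage.

```shell
#!/bin/sh
# Create an isolated environment with the image's own Python.
# (In the real job, the project's requirements would be installed first.)
VENV_CMD="python3 -m venv"
$VENV_CMD build-venv

# Freeze everything installed into the env as the complete dependency list.
./build-venv/bin/pip freeze > all-requirements.txt
```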
- Nov 18, 2020 -- David Diederich
  If you're going to go through the trouble of defining an environment variable for PIP_CMD, you should probably use it everywhere.
- Nov 18, 2020 -- David Diederich
  The python-git image was based on python:3.7-slim-buster with git installed on top. The non-slim version comes with git preinstalled, which is easier than maintaining our own separate container. Also, services using this build script can now specify a different version of Python if they want. If necessary, VENV_CMD can be altered to enable the use of virtualenv over venv.
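A sketch of that override point (VENV_CMD comes from the commit message; the default shown and the environment path are assumptions): the rest of the job only ever invokes `$VENV_CMD`, so swapping the tool touches one variable.

```shell
#!/bin/sh
# Default to the stdlib venv module unless the project overrides VENV_CMD.
VENV_CMD="${VENV_CMD:-python3 -m venv}"

# The job uses the variable, never a hard-coded tool name.
$VENV_CMD project-env
project-env/bin/python --version
```

A project whose base image ships virtualenv instead of venv could then set something like `VENV_CMD: "virtualenv"` in its CI variables without touching the rest of the job.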
- Nov 04, 2020 -- David Diederich
- Nov 04, 2020 -- David Diederich