Merge branch 'slb/dperez50/update-to-use-ci' into 'master'

ci: update scripts to use npm ci instead of npm install

See merge request !135
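The change this merge describes, switching the CI scripts from `npm install` to `npm ci`, can be sketched as below. The function name and script are hypothetical stand-ins, not the repository's actual configuration; the point is that `npm ci` installs exactly what package-lock.json pins and fails outright when the lock file is missing or out of sync, instead of silently updating it the way `npm install` can.

```shell
set -eu

# Hypothetical CI install step (not the project's real script).
# `npm ci` deletes node_modules and installs strictly from package-lock.json.
install_deps() {
  if [ ! -f package-lock.json ]; then
    echo "package-lock.json missing: npm ci would fail"
    return 1
  fi
  npm ci          # was: npm install
}

# Demonstrate in an empty directory, where the lock-file check trips.
cd "$(mktemp -d)"
install_deps || echo "install step aborted"
```

This is why `npm ci` suits pipelines: the build either reproduces the locked dependency tree exactly or fails loudly.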
17 jobs for master in 10 minutes and 1 second (queued for 7 seconds)
Status  Name                           Job ID   Runner       Duration  Notes

Build
passed  compile-and-unit-test          #426548  osdu-medium  00:02:09

Containerize
passed  aws-containerize               #426551  osdu-medium  00:03:39
passed  osdu-gcp-containerize-gcloud   #426552               00:04:43
failed  osdu-gcp-containerize-gitlab   #426553  osdu-medium  00:00:07  allowed to fail
passed  push_runtime_image             #426549  osdu-medium  00:03:23
passed  push_runtime_image_azure       #426550  osdu-medium  00:02:49

Scan
passed  fossa-analyze                  #426554  osdu-medium  00:01:30
passed  lint                           #426555  osdu-small   00:00:39

Deploy
passed  aws-update-ecs                 #426556  osdu-medium  00:00:41
passed  azure_deploy                   #426557  osdu-medium  00:02:44
passed  ibm-deploy                     #426558  osdu-medium  00:02:25
passed  osdu-gcp-deploy                #426559               00:01:16

Integration
passed  aws-test-newman                #426560  osdu-medium  00:02:05
passed  azure_test                     #426561  osdu-medium  00:02:10
failed  ibm-test                       #426562  osdu-medium  00:01:53  allowed to fail
passed  osdu-gcp-test                  #426563               00:00:21

Attribution
failed  fossa-check-notice             #426564  osdu-small   00:01:02
Name Stage Failure
failed  ibm-test  (Integration)
 4.  JSONError       Validate ctag                                                                   
Unexpected token 's' at 1:2
[seismic-store-service] The dataset sd://opendes/lxegvjkjpkbjbsiq/WAJ95GiN/WXsA
^
at assertion:1 in test-script
inside "datasets / DATASET DSX01 CHECK CTAG FALSE"
--------------------------------------------
Cleaning up file based variables
ERROR: Job failed: exit code 1
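The JSONError above is the typical signature of a test asserting on `json()` when the response body is actually plain text: the body begins with `[seismic-store-service] ...`, so the parser fails on the second character ("Unexpected token 's' at 1:2"). The failure mode can be reproduced with the sketch below; the sample body is a made-up stand-in for the real service response.

```shell
# Simulate the failure: a plain-text service message where JSON was expected.
# The body below is a hypothetical stand-in for the real response.
body='[seismic-store-service] The dataset cannot be validated'

if printf '%s' "$body" | python3 -c 'import json,sys; json.load(sys.stdin)' 2>/dev/null; then
  echo "body is valid JSON"
else
  echo "body is not JSON: guard the ctag assertion before parsing"
fi
```

A guard like this (or its equivalent inside the test script) turns a cryptic parse error into an explicit assertion about the response format.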
failed  fossa-check-notice  (Attribution)
You can download the NOTICE file from this job's artifacts and directly commit it to the repository to resolve
this. Before doing so, review the differences to make sure that they make sense given the changes that you have
made. If they do not, reach out to a maintainer to help diagnose the issue.
Uploading artifacts for failed job
Uploading artifacts...
public: found 2 matching files and directories

Uploading artifacts as "archive" to coordinator... ok
id=426564 responseStatus=201 Created token=R_zmMagq
Cleaning up file based variables
ERROR: Job failed: exit code 1
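The remediation the job output describes, download the generated NOTICE from the job's artifacts, review the differences against the committed file, and commit the update if they make sense, can be sketched as follows. File names and contents here are hypothetical; in practice the generated file comes from the fossa-check-notice job's artifact archive.

```shell
set -eu

# Hypothetical stand-ins: the committed NOTICE and the generated artifact.
workdir="$(mktemp -d)"
printf 'Apache-2.0: dependency-a\n' > "$workdir/NOTICE"
printf 'Apache-2.0: dependency-a\nMIT: dependency-b\n' > "$workdir/NOTICE.generated"

# Review the differences before committing, as the job output advises.
if diff -u "$workdir/NOTICE" "$workdir/NOTICE.generated"; then
  echo "NOTICE is up to date"
else
  echo "NOTICE differs: review, then commit the generated file"
fi
```

Reviewing the diff first matters because an unexpected license entry may indicate a dependency problem rather than a stale NOTICE, which is when the log suggests involving a maintainer.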
failed  osdu-gcp-containerize-gitlab  (Containerize)
Downloading artifacts for compile-and-unit-test (426548)...
Downloading artifacts from coordinator... ok
id=426548 responseStatus=200 OK token=yeisNi-i
Executing "step_script" stage of the job script
Using docker image sha256:33d3743ceb6ae9c655769327f4d723433841510d5d9602307e79189807dac391 for docker:19.03 with digest docker@sha256:8209ca87a5bd61f58b988d239bfeeea1e55ba0b0e7e3256c1bc1e5abe68a178b ...
$ export EXTRA_DOCKER_TAG=""; if [ "$CI_COMMIT_TAG" != "" ] ; then EXTRA_DOCKER_TAG="-t $CI_REGISTRY_IMAGE/osdu-gcp:$CI_COMMIT_TAG" ; elif [ "$CI_COMMIT_REF_NAME" = "master" ] ; then EXTRA_DOCKER_TAG="-t $CI_REGISTRY_IMAGE/osdu-gcp:latest" ; fi
$ docker build -t $OSDU_GCP_LOCAL_IMAGE_TAG_SHA $EXTRA_DOCKER_TAG --file provider/$OSDU_GCP_SERVICE-$OSDU_GCP_VENDOR/cloudbuild/Dockerfile.cloudbuild --build-arg PROVIDER_NAME=$OSDU_GCP_VENDOR --build-arg PORT=$OSDU_GCP_PORT .
unable to prepare context: unable to evaluate symlinks in Dockerfile path: lstat /builds/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/provider/seismic-store-gcp/cloudbuild/Dockerfile.cloudbuild: no such file or directory
Cleaning up file based variables
ERROR: Job failed: exit code 1
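The lstat error above means the Dockerfile path the job computes from its variables does not exist in the checkout. A fail-fast guard before the build can be sketched as below; the variable values are hypothetical examples, not the pipeline's real settings.

```shell
# Hypothetical values standing in for the job's CI variables.
OSDU_GCP_SERVICE="seismic-store"
OSDU_GCP_VENDOR="gcp"
DOCKERFILE="provider/$OSDU_GCP_SERVICE-$OSDU_GCP_VENDOR/cloudbuild/Dockerfile.cloudbuild"

# `docker build --file` surfaces a missing path only as "unable to prepare
# context ... no such file or directory"; checking first names the path clearly.
if [ -f "$DOCKERFILE" ]; then
  echo "building with $DOCKERFILE"
else
  echo "missing Dockerfile: $DOCKERFILE" >&2
fi
```

With the guard in place, a renamed or relocated `Dockerfile.cloudbuild` produces a one-line message pointing at the exact expected path, rather than a docker context error.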