Ingestion Workflow issues
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/160
**Manifest-based ingestion workflow - Custom attributes check fails** (2023-11-29, Samiullah Ghousudeen)
The reference data manifest below, containing custom attributes, was successfully ingested against the WKS schema `osdu:wks:reference-data--FacilityType:1.0.0` using the Manifest-based ingestion workflow.
It looks like the manifest ingestion workflow fails to validate the custom attributes against the WKS schema defined in the manifest.
:zap: _Note: this test was performed in the pre-shipping Azure environment (workflow RUN-ID: ce3ce29f-47b6-47f5-ab93-5cca3163350a). It was also tested in the pre-shipping GCP environment, with the same results._
cc @chad @debasisc @todaiks
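For context, the failing check can be illustrated with a minimal sketch (the allowed-key set below is illustrative, not the real FacilityType schema): keys in the manifest's `data` block that the schema does not declare, such as `Name-test` and `test-ingestion` in the record below, should be reported by validation rather than silently ingested.

```python
# Minimal sketch of a custom-attribute check. The ALLOWED set is illustrative;
# a real check would come from the resolved WKS schema's declared properties
# (effectively "additionalProperties": false).
ALLOWED = {"Code", "ID", "Name", "Source"}

def find_custom_attributes(data_block: dict) -> list:
    """Return manifest data-block keys that the schema does not declare."""
    return sorted(k for k in data_block if k not in ALLOWED)

data = {
    "Code": "Well",
    "ID": "Well",
    "Name-test": "Well",
    "test-ingestion": [{"test-id": "12345"}],
}
print(find_custom_attributes(data))  # ['Name-test', 'test-ingestion']
```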
````json
{
"data": {
"Code": "Well",
"ID": "Well",
"Name-test": "Well",
"Source": "Workbook Published/FacilityTypeType.1.0.0.xlsx; commit SHA 0b4db59a.",
"test-ingestion": [
{
"test-id": "12345"
}
]
},
"meta": null,
"modifyUser": "preshipping@azureglobal1.onmicrosoft.com",
"modifyTime": "2023-11-08T10:34:42.243Z",
"id": "opendes:reference-data--FacilityType:Well-06112023",
"version": 1699439681616809,
"kind": "osdu:wks:reference-data--FacilityType:1.0.0",
"acl": {
"viewers": [
"data.default.viewers@opendes.contoso.com"
],
"owners": [
"data.default.owners@opendes.contoso.com"
]
},
"legal": {
"legaltags": [
"opendes-Test-Legal-Tag-1007568"
],
"otherRelevantDataCountries": [
"US"
],
"status": "compliant"
},
"createUser": "preshipping@azureglobal1.onmicrosoft.com",
"createTime": "2023-11-06T18:14:31.762Z"
}
````

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/159
**ADR: Implement Airflow facade endpoint** (2024-01-08, Riabokon Stanislav (EPAM) [GCP])
# Context
OSDU Platform uses Apache Airflow for orchestration of various data ingestion and processing jobs.
# Problem statement
Currently, the OSDU Airflow component does not support data isolation for multi-tenant deployments. The Airflow administrative UI is available to all users and makes it possible to observe the processing data of every existing tenant, which may cause data leaks and security issues.
# Proposal of the solution
It is proposed to introduce a facade that replaces the Airflow admin UI and collects job execution information (namely the resulting XCom variables) in a tenant-specific way via the Airflow REST API. To do this, we need to add a new endpoint to the Workflow service API that collects the details of a DAG run using the existing Airflow REST API v2.
The new API endpoint `/v1/workflow/{workflow_name}/workflowRun/{runId}/lastInfo` should implement the following business logic:
![image-2023-10-18_17-48-20](/uploads/44f53a3de410b8dff0276b127387f29a/image-2023-10-18_17-48-20.png)
- Get the internal workflow entity with `getWorkflowRunByName` and check that `submittedBy` matches the user submitted in the header; otherwise return 401 NOT_AUTHORIZED
- Get the list of all task instances with `/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances`, where `dag_id` is `workflow_name` and `dag_run_id` is `runId`
- Select the task instance with the maximal `end_date`
- With the `task_id` of the selected task instance, get the list of XCom entry keys via `/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/xcomEntries`
- Obtain the XCom values by their keys using `/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/xcomEntries/{xcom_key}`
- Return the task instance details from step 3 combined with the XCom values map in a single JSON response

Milestone: M23 - Release 0.26. Assignees: Rustam Lotsmanenko (EPAM), Andrei Dalhikh [EPAM/GC], Riabokon Stanislav (EPAM) [GCP].

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/158
**A custom header 'x-user-id' is used in core part** (2023-11-08, Riabokon Stanislav (EPAM) [GCP])
I wanted to bring to your attention an issue that was identified by our GC team while they were in the process of addressing https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/157.

`org.opengroup.osdu.workflow.service.WorkflowRunServiceImpl#addUserId`:
```java
private Map<String, Object> addUserId(String workflowName, TriggerWorkflowRequest request) {
final Map<String, Object> executionContext = request.getExecutionContext();
if (executionContext.get(KEY_USER_ID) != null) {
String errorMessage = String.format("Request to trigger workflow with name %s failed because execution context contains reserved key 'userId'", workflowName);
throw new AppException(400, "Failed to trigger workflow run", errorMessage);
}
String userId = dpsHeaders.getUserId();
log.debug("putting user id: " + userId + " in execution context");
executionContext.put(KEY_USER_ID, userId);
return executionContext;
}
```
The current logic relies on a custom header that is primarily intended for use at an infrastructural level, as outlined in https://community.opengroup.org/osdu/platform/data-flow/ingestion/home/-/issues/52. The GC team approved an ADR with the understanding that this custom header would not be utilized within the core codebase.
However, as indicated in https://community.opengroup.org/osdu/platform/deployment-and-operations/helm-charts-azure/-/merge_requests/366, a header named 'x-user-id' is populated with data from 'x-on-behalf-of' using a specific rule. This mechanism aligns with the requirements of the CSP provider but may not be entirely suitable for the Core Part of the Workflow Service.
```lua
if (jwt_authn[msft_issuer]["appid"] == serviceAccountClientId and on_behalf_of_header ~= nil and on_behalf_of_header ~= '') then
request_handle:headers():add("x-user-id", request_handle:headers():get("x-on-behalf-of"))
else
request_handle:headers():add("x-user-id", jwt_authn[msft_issuer]["appid"])
end
```
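The Lua rule above can be restated as a small sketch (Python here for readability; the names mirror the snippet, and this is an illustration of the rule, not the deployed filter):

```python
def resolve_user_id(appid: str, service_account_client_id: str,
                    on_behalf_of: str) -> str:
    """Mirror of the Envoy/Lua rule: a service-to-service call may carry the
    real user in x-on-behalf-of; otherwise the token's appid is used as-is."""
    if appid == service_account_client_id and on_behalf_of:
        return on_behalf_of
    return appid

# Service account forwarding a user identity:
print(resolve_user_id("svc-client", "svc-client", "alice@contoso.com"))
```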
This logic introduces **three key issues**:
- The core part of the Workflow service depends on a custom CSP header to populate the execution context, which may not be in alignment with the intended architecture.
- The Workflow service may not operate correctly without Istio and the accompanying special rule, potentially limiting its usability.
- There is a security concern: 'x-user-id' is not currently validated on the backend side, so any user can set it to an arbitrary value.
_As for the third problem_, there is a test case:
1. A user is authorized within the Workflow service.
2. This user sends 'x-user-id' with the name of another user, resulting in a workflow being triggered under the identity of a different user.

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/157
**Pass workflow user ID to the Airflow as part of payload** (2023-11-08, Riabokon Stanislav (EPAM) [GCP])
This issue was discovered by the GC team while the QA team was testing the platform.
It revolves around triggering workflows and the addition of the User ID into the execution context through the 'x-user-id' header.
Upon further investigation, we came across the MR https://community.opengroup.org/osdu/platform/deployment-and-operations/helm-charts-azure/-/merge_requests/366, which appears to implement this logic with a dependency on the infrastructure level.
However, we have to add some kind of validation or additional logic before relying on a user header in the core logic. This adjustment is essential because we may want to use the service without a service mesh or similar infrastructure.
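One possible shape for such validation, purely illustrative (the function and its contract are hypothetical, not part of the service): accept a user header only when it is consistent with the authenticated principal, instead of trusting it blindly.

```python
def effective_user_id(headers: dict, token_subject: str) -> str:
    """Hypothetical guard: prefer the authenticated token subject and reject
    an x-user-id header that names somebody else."""
    claimed = headers.get("x-user-id")
    if claimed and claimed != token_subject:
        raise PermissionError("x-user-id does not match the authenticated subject")
    return token_subject
```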
`org.opengroup.osdu.workflow.service.WorkflowRunServiceImpl#addUserId`:
```java
private Map<String, Object> addUserId(String workflowName, TriggerWorkflowRequest request) {
final Map<String, Object> executionContext = request.getExecutionContext();
if (executionContext.get(KEY_USER_ID) != null) {
String errorMessage = String.format("Request to trigger workflow with name %s failed because execution context contains reserved key 'userId'", workflowName);
throw new AppException(400, "Failed to trigger workflow run", errorMessage);
}
String userId = dpsHeaders.getUserId();
log.debug("putting user id: " + userId + " in execution context");
executionContext.put(KEY_USER_ID, userId);
return executionContext;
}
```

Milestone: M21 - Release 0.24.

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/156
**listallworkflow API does not return Version of services/DAG deployed** (2023-10-23, vinisha krishna)
The following API returns version 1, which seems to be hardcoded. We need to get the version of the services/DAGs deployed. With ADME we do not have visibility of which version is needed.
`{base_url}/solutions/data-flow/apis/workflow-service#/Workflow/listAllWorkflow`
![image](/uploads/c99bde240f7cd2276d899e3f05f7cd73/image.png)

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/155
**Workflow Run API - Providing incorrect data partition Id in payload does not trigger workflow but gives 200 response** (2023-08-03, Surabhi Seth)
API: Workflow Service API > Workflow Run `/workflow/{workflow_name}/workflowRun`
![image.png](/uploads/46a0c21db8f5f38c5fd62c233cfff6f9/image.png){width=916 height=473}
If an incorrect dataPartitionId is passed in the payload, the workflow trigger does not happen.

Actual: the status code of the trigger API is nevertheless 200.

Expected: a non-200 status code (4xx or 5xx) should be returned.

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/154
**Workflow Run API - requires dataPartitionId in body as well as header** (2023-10-26, Surabhi Seth)
API: Workflow Service API > Workflow Run `/workflow/{workflow_name}/workflowRun`
This service takes data-partition-id as part of the headers as well as in the payload body: `{ "executionContext": { "id": "string", "dataPartitionId": "string" }, "runId": "string" }` (note the `dataPartitionId` field).
![MicrosoftTeams-image__5_](/uploads/5e8d61cdc1316019ab905597094525b9/MicrosoftTeams-image__5_.png)

Issue: requiring dataPartitionId in the payload body is redundant and inconsistent with the implementation of all other OSDU APIs (where data-partition-id is taken from the header).
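A sketch of the consistency rule the issue implies (the behavior shown is a suggestion, not the service's current logic): if the body carries a partition id at all, it should at least agree with the header rather than silently win.

```python
def resolve_partition_id(header_value: str, body_value=None) -> str:
    """Illustrative rule: the header is authoritative; a body value, if
    present, must agree with it or be rejected."""
    if body_value is not None and body_value != header_value:
        raise ValueError("dataPartitionId in body conflicts with the header")
    return header_value

print(resolve_partition_id("opendes"))             # 'opendes'
print(resolve_partition_id("opendes", "opendes"))  # 'opendes'
```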
Ref: https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/blob/master/docs/api/openapi.workflow.yaml?plain=0

Assignees: Chad Leong, Deepa Kumari.

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/153
**IBM workflow integration test failing - for M18** (2023-06-01, vikas rana)

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/jobs/1996356

Milestone: M18 - Release 0.21. Assignee: vikas rana.

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/152
**APIs to get the XCOM summary (Entries) are working in AWS environment, but are NOT working in other CSPs (Azure, GC and IBM) environments** (2023-11-09, Kamlesh Todai)
The APIs to get the xcomEntries using the runId and the task instance are working in the AWS environment. The endpoints are not implemented/deployed in the other CSPs' (Azure, GC, IBM) environments.
<details><summary>curl --location 'https://r3m16.forumtesting.osdu.aws/api/airflow/api/v1/dags/Osdu_ingest/dagRuns/45eb9f45-aada-4e2c-b618-818fb5dfcf28/taskInstances/process_single_manifest_file_task/**xcomEntries/record_ids**' \
--header 'data-partition-id: osdu' \
--header 'Authorization: Bearer eyJraWQiOi...fWbOUA3RcQ'</summary>
</details>
Response 200 OK:

```json
{
    "dag_id": "Osdu_ingest",
    "execution_date": "2023-04-04T21:19:27.327451+00:00",
    "key": "record_ids",
    "task_id": "process_single_manifest_file_task",
    "timestamp": "2023-04-04T21:19:48.761929+00:00",
    "value": "['osdu:reference-data--FacilityType:WELL_999259423605', 'osdu:master-data--Organisation:Auto_Test_999259423605', 'osdu:reference-data--FacilityEventType:SPUD_DATE_999259423605', 'osdu:reference-data--VerticalMeasurementPath:DEPTH_DATUM_ELEV_999259423605', 'osdu:reference-data--AliasNameType:WELL_NAME_999259423605', 'osdu:master-data--Well:999259423605']"
}
```
---
`curl --location 'https://r3m16.forumtesting.osdu.aws/api/airflow/api/v1/dags/Osdu_ingest/dagRuns/45eb9f45-aada-4e2c-b618-818fb5dfcf28/taskInstances/process_single_manifest_file_task/xcomEntries/skipped_ids' \
--header 'data-partition-id: osdu' \
--header 'Authorization: Bearer eyJraWQiOi...fWbOUA3RcQ'`
Response 200 OK:

```json
{
    "dag_id": "Osdu_ingest",
    "execution_date": "2023-04-04T21:19:27.327451+00:00",
    "key": "skipped_ids",
    "task_id": "process_single_manifest_file_task",
    "timestamp": "2023-04-04T21:19:48.783236+00:00",
    "value": "[]"
}
```
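The calls above follow the Airflow stable REST API path layout; a small sketch that composes the same URLs (the base URL is taken from the example and is deployment-specific):

```python
# Deployment-specific base URL, taken from the AWS example above.
BASE = "https://r3m16.forumtesting.osdu.aws/api/airflow/api/v1"

def xcom_entry_url(dag_id: str, dag_run_id: str, task_id: str, key: str) -> str:
    """Compose the Airflow stable REST API path for a single XCom entry."""
    return (f"{BASE}/dags/{dag_id}/dagRuns/{dag_run_id}"
            f"/taskInstances/{task_id}/xcomEntries/{key}")

url = xcom_entry_url("Osdu_ingest",
                     "45eb9f45-aada-4e2c-b618-818fb5dfcf28",
                     "process_single_manifest_file_task",
                     "record_ids")
print(url)
# A real request would also send the data-partition-id and Authorization headers.
```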
cc @chad @debasisc @Srinivasan_Narayanan @dzmitry_malkevich @anujgupta

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/151
**Misleading message in Xcom summary when legal tag is missing** (2023-03-16, Debasis Chatterjee)
I was running a simple test case in AWS/M16/Preship.
I was getting this message.
Now, I get a failure: `[{'id': 'osdu:reference-data--FacilityEventType:DC13MAR', 'kind': 'osdu:wks:reference-data--FacilityEventType:1.0.0', 'reason': '400 Client Error: Bad Request for url: http://os-storage.osdu-services:8080/api/storage/v2/records'}]`
It turns out (thanks to AWS Support, Nazeem Akbar Ali) that this happens because the legal tag was not defined; he found the reason by checking the relevant log file.
See details here:
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/470#note_207244

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/150
**Misleading log statements** (2022-12-12, Maksim Malkov)
The Workflow service searches for a triggered workflow first in the provided data partition. A system workflow like CSV will not be available in the data partition; in such cases the service logs "workflow not found".
Next, the same workflow is searched for in the system DB, where it is found, and processing completes.
But these logs create confusion that some workflow was not found by the Workflow service, when actually there is no such issue.

Milestone: M16 - Release 0.19.

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/149
**Untagged release image removed** (2022-09-30, Dzmitry Malkevich (EPAM))

Image community.opengroup.org:5555/osdu/platform/data-flow/ingestion/ingestion-workflow/osdu-gcp-workflow:34edf0e7 was deployed to DEV2 with pipeline https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/pipelines/134615 on Sep. 8.
Recently this image was removed from the repository, and now DEV2 is partially broken.
Expectation: untagged images created from a release branch are not purged at least until the next release.

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/147
**Update Status end point is publishing INGESTOR stage GSM messages** (2022-08-11, devesh bajpai)

The update status endpoint in the Workflow service is publishing "INGESTOR" stage GSM messages. This endpoint can be called by a user, and in that case the "INGESTOR" stage in the published GSM message does not seem valid. The Workflow service can publish GSM messages with a "WORKFLOW" stage to give a clear distinction regarding the source of the GSM message.

Milestone: M13 - Release 0.16.

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/146
**Azure - One Service IT Test is flaky** (2022-05-31, harshit aggarwal)
`TestWorkflowRunV3Integration.updateWorkflowRunStatus_should_returnSuccess_when_givenValidRequest_StatusRunning()`
The following test seems to be flaky and fails occasionally:
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/jobs/1041557

Assignee: Akshat Joshi.

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/145
**workflow_id instead of dag_name** (2022-05-20, Riabokon Stanislav (EPAM) [GCP])

The Workflow service passes _workflow_id_ instead of _dag_name_ when it tries to get dag_run_status.

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/144
**WhiteSource update** (2022-08-23, Maksim Malkov)

Update the `core` and `azure` modules according to the WhiteSource reports.

Milestone: M12 - Release 0.15. Assignee: Maksim Malkov.

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/143
**Status publisher incorrectly sets status to FAILED** (2022-04-20, Morris Estepa)

The ingestion workflow incorrectly publishes the status of a CSV ingestion as FAILED when a DAG attempts to set the workflow status to FINISHED. The problem occurs because `org.opengroup.osdu.workflow.service.WorkflowRunServiceImpl` calls the `publishStatusWithUnexpectedErrors` method when it receives "finished" as the workflow status in its `logUpdatedStatus` method.
Replication steps:
1) Subscribe to ingestion workflow status topic (SNS topic in AWS) to receive status messages.
2) Ingest a CSV file.
Ingestion workflow will publish 3 messages with the following statuses:
* SUBMITTED
* IN_PROGRESS
* FAILED
The 3rd status should have said SUCCESS.

Assignee: Okoun-Ola Fabien Houeto.

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/141
**Incorrect Info endpoint [GONRG-4528]** (2022-03-31, Denis Karpenok (EPAM))

```
GET 'https://preship.gcp.gnrg-osdu.projects.epam.com/api/workflow/v1/info' - 404, "Not Found",
GET 'https://preship.gcp.gnrg-osdu.projects.epam.com/api/workflow/info' - works well
```
Expected:
`GET 'https://preship.gcp.gnrg-osdu.projects.epam.com/api/workflow/v1/info' - works well`

Milestone: M12 - Release 0.15. Assignees: Chris Zhang, Dzmitry Malkevich (EPAM).

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/140
**Airflow 2.0. A new dag state 'queued'** (2022-02-18, Riabokon Stanislav (EPAM) [GCP])
Response from https://airflow.apache.org/api/v1/dags/{dag_id}/dagRuns with Airflow 2.0 contains
a new dag state 'queued'
`Changed in version 2.1.3: 'queued' is added as a possible value.`
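Any service that maps dag-run states to workflow statuses therefore needs to account for the new value; a hedged sketch (the status names on the right are illustrative, not the service's actual enum):

```python
# Illustrative mapping of Airflow dag-run states to workflow statuses.
# 'queued' (new in Airflow 2.1.3) must be handled explicitly, otherwise
# state-handling code will hit an unexpected value.
AIRFLOW_TO_WORKFLOW = {
    "queued": "SUBMITTED",
    "running": "RUNNING",
    "success": "FINISHED",
    "failed": "FAILED",
}

def to_workflow_status(dag_state: str) -> str:
    try:
        return AIRFLOW_TO_WORKFLOW[dag_state]
    except KeyError:
        raise ValueError(f"unexpected dag state: {dag_state}")

print(to_workflow_status("queued"))  # SUBMITTED
```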
https://airflow.apache.org/docs/apache-airflow/stable/stable-rest-api-ref.html#operation/post_dag_run

Milestone: M11 - Release 0.14. Assignee: Riabokon Stanislav (EPAM) [GCP].

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/139
**osdu_ingest - Make dataset optional inside "Data" block of JSON payload** (2022-02-03, Debasis Chatterjee)
In the current structure, the "Data" block expects all 3 sections: work-product, work-product component, and then Dataset.
During a recent discussion with @todaiks and @Kateryna_Kurach, it transpired that the work-product component may simply refer to an existing Dataset record (created in a previous step), so we do not want to spend double effort dealing with the Dataset record.
See collection 29 from Platform Validation. Steps 4c and 5a.
https://community.opengroup.org/osdu/platform/testing/-/blob/master/Postman%20Collection/29_CICD_Setup_Ingestion/R3%20Full%20manifest-based%20ingestion.postman_collection.json
![osdu_ingest-Postman](/uploads/b5b956bf197bd8ce06542fda59697a0b/osdu_ingest-Postman.PNG)
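A hypothetical shape of the requested payload, written as a Python literal for brevity (the field names follow the manifest convention loosely and are illustrative, not a schema proposal): the work-product component points at an already-ingested Dataset by record ID, and the Data block carries no Datasets section of its own.

```python
# Hypothetical "Data" block under the proposal: the WPC references an
# existing Dataset record by ID instead of inlining a Datasets section.
# All kinds and IDs here are illustrative.
data_block = {
    "WorkProduct": {"kind": "osdu:wks:work-product--WorkProduct:1.0.0"},
    "WorkProductComponents": [
        {
            "kind": "osdu:wks:work-product-component--WellLog:1.0.0",
            "data": {
                # reference to the Dataset record created in a previous step
                "Datasets": ["opendes:dataset--File.Generic:example-id:"],
            },
        }
    ],
    # no top-level "Datasets" section: no double effort on the Dataset record
}
print("Datasets" in data_block)  # False
```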