# Pre-Shipping issues
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues

## CSV Parser - R3M7/AWS - issue loading Basin Type data
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/6 · 2021-09-02T01:29:50Z · kenneth liew · Milestone: M8 - Release 0.11

The CSV Parser (DAG Portal) reports some errors when importing Basin Type master data via CSV ingestion.
[_R3M7_AWS_CSV_Ingestion.txt](/uploads/ddcb5b29680b8383177ebdc7c48f50e4/_R3M7_AWS_CSV_Ingestion.txt)
[BasinType.csv](/uploads/6311a0e844e869bd80e3918832e9bb24/BasinType.csv)
[CSVParserErrorLog.txt](/uploads/c71ff4835981391648a0f88544542243/CSVParserErrorLog.txt)

## Retrieving entitlements/groups
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/12 · 2021-09-03T14:26:39Z · etienne peysson

For testing purposes, I'm trying to get the proper ACL groups from the Entitlements endpoint as follows:
```
curl --location --request GET 'https://osdu-cpd-osdu.odi-osdu-og-fa7661852f2ab29a6be32f560b2f5573-0000.us-south.containers.appdomain.cloud/osdu-entitlements/api/entitlements/v1/groups' \
--header 'data-partition-id: opendes' \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer HIDDEN'
```
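For repeat testing, the same call can be scripted; a minimal sketch using Python `requests`, assuming the token is exported in an environment variable:

```python
import os
import requests

BASE = ("https://osdu-cpd-osdu.odi-osdu-og-fa7661852f2ab29a6be32f560b2f5573"
        "-0000.us-south.containers.appdomain.cloud")

# Same endpoint and headers as the curl call above.
resp = requests.get(
    f"{BASE}/osdu-entitlements/api/entitlements/v1/groups",
    headers={
        "data-partition-id": "opendes",
        "Content-Type": "application/json",
        "Authorization": f"Bearer {os.environ['OSDU_TOKEN']}",  # token redacted above
    },
)
print(resp.status_code, resp.text)
```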
But I'm getting a 502 Bad Gateway response.
I haven't found a reference to this API in the provided Postman collections.

## Test email issue
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/9 · 2021-09-03T19:04:30Z · Sergey Krupenin (EPAM)

Test ticket.

## Access to pre-shipping environment (need service account key)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/55 · 2021-09-21T13:06:04Z · upendra kumar

## Summary
(Summarize the bug encountered concisely)
## Steps to reproduce
(How one can reproduce the issue - this is very important)
## Example Environment (Tenant)
(Tenant name)
## What is the current bug behavior?
(What actually happens)
## What is the expected correct behavior?
(What you should see instead)
## Relevant logs and/or screenshots
(Paste any relevant logs - please use code blocks (```) to format console output, logs, and code, as
it's very hard to read otherwise.)
## Possible fixes
(If you can, link to the line of code that might be responsible for the problem)
/cc @kateryna_kurach
/cc @Aliaksandr_Ramanovich1
Need access to the pre-shipping environment (need a service account key).

## WITSML Parser (Trajectory) - Error 500 on POST metadata - R3M8
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/53 · 2021-09-23T11:09:44Z · etienne peysson · Milestone: Pre-Shipping R3-M8 · Assignee: Gokul Nagare

I'm receiving the following error:
"Client failed to authenticate using SASL: PLAIN" and code : 500
After making the following call :
POST https://osdu-cpd-osdu.odi-osdu-og-fa7661852f2ab29a6be32f560b2f5573-0000.us-south.containers...I'm receiving the following error :
"Client failed to authenticate using SASL: PLAIN" and code : 500
After making the following call :
POST https://osdu-cpd-osdu.odi-osdu-og-fa7661852f2ab29a6be32f560b2f5573-0000.us-south.containers.appdomain.cloud/osdu-file/api/file/v2/files/metadata
- Given Authorization with Access/id token
- Given data-partition-id opendes
- Given Content-Type application/json
- Given x-ms-blob-type BlockBlob
Given body:
```json
{
"data" : {
"TotalSize" : 5299.0,
"Source" : "TNO Data Source",
"Name" : "Trajectory DC",
"Endian" : "BIG",
"Description" : "Trajectory WITSML dataset",
"DatasetProperties" : {
"FileSourceInfo" : {
"FileSource" : "567637002f924633833787511ab77dfa",
"Name" : "trajectory_DC.xml",
"PreloadFilePath" : "s3://oc-cpd-opendes-staging-bucket/567637002f924633833787511ab77dfa",
"PreloadFileCreateUser" : null,
"PreloadFileModifyDate" : 1631859302.437914453,
"PreloadFileModifyUser" : null
}
}
},
"kind" : "opendes:wks:dataset--File.Generic:1.0.0",
"acl" : {
"viewers" : [ "data.default.viewers@opendes.ibm.com" ],
"owners" : [ "data.default.owners@opendes.ibm.com" ]
},
"legal" : {
"otherRelevantDataCountries" : [ "US" ],
"status" : "compliant",
"legaltags" : [ "opendes-Test-Legal-Tag-7292798" ]
},
"createUser" : null,
"createTime" : 1631859302.433772431,
"modifyUser" : null,
"modifyTime" : 1631859302.437564058
}
```
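For reproducibility, the failing call as a script; a minimal sketch assuming the JSON body above is saved as `metadata.json` and the token sits in an environment variable:

```python
import json
import os
import requests

BASE = ("https://osdu-cpd-osdu.odi-osdu-og-fa7661852f2ab29a6be32f560b2f5573"
        "-0000.us-south.containers.appdomain.cloud")

with open("metadata.json") as f:  # the request body shown above
    body = json.load(f)

resp = requests.post(
    f"{BASE}/osdu-file/api/file/v2/files/metadata",
    headers={
        "Authorization": f"Bearer {os.environ['OSDU_TOKEN']}",
        "data-partition-id": "opendes",
        "x-ms-blob-type": "BlockBlob",
    },
    json=body,  # requests sets Content-Type: application/json
)
print(resp.status_code, resp.text)  # currently: 500, SASL PLAIN authentication failure
```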
It was working properly a few days ago.

Another question: I see in the DAG that you generate the metadata if we don't provide it when triggering the WITSML parser. What is the recommended way of doing this?
Do you still allow the metadata to be sent?

## WITSML Parser (Well) - "code": 403; "reason": "Access denied" - AWS
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/63 · 2021-09-23T15:22:31Z · Esmira Rafigayeva · Milestone: Pre-Shipping R3-M8 · Assignee: Greg

![image](/uploads/bb3de7c1157d9cdb656243f6abdb391c/image.png)

## R3M8 - Commands with {{FILE_HOST}} do not work
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/59 · 2021-09-23T15:38:23Z · Valery Ginak · Assignee: Aliaksandr Ramanovich (EPAM)

## Summary
When running command 7 in the CSV Workflow or command 1 in Manifest Ingestion (both include {{FILE_HOST}}), the error `Error: getaddrinfo ENOTFOUND https` occurs. (The message suggests {{FILE_HOST}} resolves to an empty value, leaving the literal scheme `https` to be looked up as a hostname.)
## Steps to reproduce
Run any command with {{FILE_HOST}}
/cc @kateryna_kurach
/cc @Aliaksandr_Ramanovich1

## IBM Airflow job hangs
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/64 · 2021-09-23T19:10:16Z · Yanbin Zhang

Just submitted a job which hangs:
{"runId": "4a307ab1-aeb4-47e9-ab54-c85a9a35362c", "startTimeStamp": 1632406417846, "status": "running", "submittedBy": "preshipteamd@osdu.opengroup.org"}Just submitted a job which hangs
{"runId": "4a307ab1-aeb4-47e9-ab54-c85a9a35362c", "startTimeStamp": 1632406417846, "status": "running", "submittedBy": "preshipteamd@osdu.opengroup.org"}https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/65IBM M8 csv parser dag - Airflow in continual "running" state for custom schem...2021-09-24T06:57:43ZSteven EvansIBM M8 csv parser dag - Airflow in continual "running" state for custom schema collectionAirflow csv parser DAG for the M8 custom schema using the platform validation collection and environment jsons has bee stuck in “running” mode for over 4 hours. When viewing the logs – no logs appear or are available.![Custom_schema_Dag_...Airflow csv parser DAG for the M8 custom schema using the platform validation collection and environment jsons has bee stuck in “running” mode for over 4 hours. When viewing the logs – no logs appear or are available.![Custom_schema_Dag__-_no_log_display](/uploads/f35b098cd399154dcdc02d889392a4a6/Custom_schema_Dag__-_no_log_display.PNG)
![Custom_Schema_Dag_run_issue_1](/uploads/0fd4a116d50316aa3c7649b71a805af6/Custom_Schema_Dag_run_issue_1.PNG)
All Postman collection steps were successful, including Trigger Workflow.

## CSV Parser failure - R3M7 IBM - Custom Schema - Airflow DAG error
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/52 · 2021-09-24T07:00:12Z · Steven Evans · Milestone: Pre-Shipping R3-M7 · Assignees: Shrikant Garg, Steven Evans

Running a practice test of the M7 CSV custom schema prior to the M8 testing program, an issue was encountered when "Triggering the Workflow" (step 09) was actioned. The trigger succeeded and created a runId; however, the Airflow DAG logs show the process failed with the following error:
![image](/uploads/fcf25be25a5dcd82f619c84b33e1f728/image.png)
![Custom_Schema_DAG_failure_log](/uploads/95971f525950934d1271d0e77cbccde9/Custom_Schema_DAG_failure_log.PNG)

## IBM master data injection shows success without passing validation
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/70 · 2021-09-27T12:34:18Z · Yanbin Zhang

Ingest the following master data: [3.load_Well.json](/uploads/dc357b7638da45ad6caa5846715a5668/3.load_Well.json)
In task provide_manifest_integrity_task, I see the following warning:
```
[2021-09-23 20:40:11,579] {logging_mixin.py:112} INFO - Running %s on host %s <TaskInstance: Osdu_ingest.provide_manifest_integrity_task 2021-09-23T20:38:08+00:00 [running]> airflow-worker-0.airflow-worker.odi-airflow-ns.svc.cluster.local
[2021-09-23 20:40:50,247] {logging_mixin.py:112} INFO - [2021-09-23 20:40:50,246] {validate_referential_integrity.py:210} WARNING - Resource with kind opendes:wks:master-data--Well:1.0.0 and id: 'opendes:master-data--Well:fcd27e71-28a5-4cb9-b8ad-4bad397b3613' was rejected. Missing ids '{'opendes:master-data--GeoPoliticalEntity:Limburg:', 'opendes:master-data--GeoPoliticalEntity:Netherlands:', 'opendes:master-data--GeoPoliticalEntity:L:'}'
[2021-09-23 20:40:50,367] {taskinstance.py:1065} INFO - Marking task as SUCCESS.dag_id=Osdu_ingest, task_id=provide_manifest_integrity_task, execution_date=20210923T203808, start_date=20210923T204011, end_date=20210923T204050
[2021-09-23 20:40:51,537] {logging_mixin.py:112} INFO - [2021-09-23 20:40:51,536] {local_task_job.py:103} INFO - Task exited with return code 0
```
The master data is not ingested because some referenced ids are missing, yet the entire workflow ("runId": "331b1678-5168-40ee-bfd1-8dccef2eabd1") finished successfully.
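A sketch of the behavior one would expect from the integrity task instead; hypothetical, not the current implementation: rejected records should fail the Airflow task rather than let the run finish as a success:

```python
from airflow.exceptions import AirflowException

def finalize_integrity_check(rejected_records: dict) -> None:
    """Hypothetical sketch: fail the task when any record was rejected,
    instead of exiting with return code 0 and a SUCCESS mark."""
    if rejected_records:
        raise AirflowException(
            f"{len(rejected_records)} record(s) rejected due to missing "
            f"referenced ids: {rejected_records}"
        )
```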
```
"acl": {
"owners": [
"data.default.owners@{{data_partition_id}}.testing.com"
],
"viewers": [
...Please see this excerpt from Manifest Ingestion Smoke test sample.
```
"acl": {
"owners": [
"data.default.owners@{{data_partition_id}}.testing.com"
],
"viewers": [
"data.default.viewers@{{data_partition_id}}.testing.com"
]
},
```
When I check "Get group for members", these are not present; see the enclosed file.
[AWS-ACLs.txt](/uploads/827a727407992ee0ff5c4dd3e66a18ff/AWS-ACLs.txt)
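A quick scripted check for the default data groups; a minimal sketch reusing the GET /groups call from issue 12, with host, partition, and response field names as assumptions:

```python
import os
import requests

ENTITLEMENTS_HOST = os.environ["ENTITLEMENTS_HOST"]  # hypothetical base URL

resp = requests.get(
    f"{ENTITLEMENTS_HOST}/api/entitlements/v1/groups",
    headers={
        "data-partition-id": os.environ.get("DATA_PARTITION_ID", "osdu"),
        "Authorization": f"Bearer {os.environ['OSDU_TOKEN']}",
    },
)
emails = [g.get("email", "") for g in resp.json().get("groups", [])]
# Expect both data.default.owners@... and data.default.viewers@... to be listed.
print([e for e in emails if e.startswith("data.default.")])
```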
cc - @esmira.rafigayeva, @sje7253bp, @Wibben for information

## WITSML Parser (Trajectory) - DAG failure - R3M8
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/60 · 2021-09-28T09:46:37Z · etienne peysson · Assignees: Shrikant Garg, Gokul Nagare

The following are the steps I followed to get the error from Airflow (the error log is at the bottom).
1. Get the SignedUrl from the File DMS
GET https://osdu-cpd-osdu.odi-osdu-og-fa7661852f2ab29a6be32f560b2f5573-0000.us-south.containers.appdomain.cloud/osdu-file/api/file/v2/files/uploadURL
- Given Authorization access/id token
- Given data-partition-id: opendes
- Signed Url : https://minio-osdu-minio.odi-osdu-og-fa7661852f2ab29a6be32f560b2f5573-0000.us-south.containers.appdomain.cloud/oc-cpd-opendes-staging-bucket/dbbe4c019ef5476597b417574ffcdb6a?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Date=20210922T121558Z&X-Amz-SignedHeaders=host&X-Amz-Expires=86399&X-Amz-Credential=minio%2F20210922%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Signature=97344e6d73c77e073e27dfe5942eb9744cfac56c4d96a6f6062b84e83691182e
- File source ID: dbbe4c019ef5476597b417574ffcdb6a
2. File upload
PUT https://minio-osdu-minio.odi-osdu-og-fa7661852f2ab29a6be32f560b2f5573-0000.us-south.containers.appdomain.cloud/oc-cpd-opendes-staging-bucket/dbbe4c019ef5476597b417574ffcdb6a?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Date=20210922T121558Z&X-Amz-SignedHeaders=host&X-Amz-Expires=86399&X-Amz-Credential=minio%2F20210922%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Signature=97344e6d73c77e073e27dfe5942eb9744cfac56c4d96a6f6062b84e83691182e
- Given headers:
Content-Type: Application/xml
- Given Binary file stream
Upload file response code: 200
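Steps 1 and 2 as a script, for reproducibility; a minimal sketch where the token, the local file path, and the exact response field names of the File service are assumptions:

```python
import os
import requests

BASE = ("https://osdu-cpd-osdu.odi-osdu-og-fa7661852f2ab29a6be32f560b2f5573"
        "-0000.us-south.containers.appdomain.cloud")
HEADERS = {
    "Authorization": f"Bearer {os.environ['OSDU_TOKEN']}",
    "data-partition-id": "opendes",
}

# Step 1: ask the File DMS for a signed upload URL and file source id.
upload = requests.get(f"{BASE}/osdu-file/api/file/v2/files/uploadURL", headers=HEADERS).json()
signed_url = upload["Location"]["SignedURL"]    # field names assumed from the
file_source = upload["Location"]["FileSource"]  # File service v2 response shape

# Step 2: PUT the WITSML file to the signed URL (pre-signed, so no auth header).
with open("trajectory_DC.xml", "rb") as f:
    put = requests.put(signed_url, data=f, headers={"Content-Type": "application/xml"})
print(put.status_code, file_source)
```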
3. Load metadata
POST https://osdu-cpd-osdu.odi-osdu-og-fa7661852f2ab29a6be32f560b2f5573-0000.us-south.containers.appdomain.cloud/osdu-file/api/file/v2/files/metadata
- Given Authorization with Access/id token
- Given data-partition-id opendes
- Given Content-Type application/json
- Given x-ms-blob-type BlockBlob
Given body:
```json
{
"data" : {
"TotalSize" : 5299.0,
"Source" : "TNO Data Source",
"Name" : "Trajectory DC",
"Endian" : "BIG",
"Description" : "Trajectory WITSML dataset",
"DatasetProperties" : {
"FileSourceInfo" : {
"FileSource" : "dbbe4c019ef5476597b417574ffcdb6a",
"Name" : "trajectory_DC.xml",
"PreloadFilePath" : "s3://oc-cpd-opendes-staging-bucket/dbbe4c019ef5476597b417574ffcdb6a",
"PreloadFileCreateUser" : "preshipteama",
"PreloadFileModifyDate" : "2021-09-22 02:15:59",
"PreloadFileModifyUser" : "preshipteama"
}
}
},
"kind" : "opendes:wks:dataset--File.Generic:1.0.0",
"acl" : {
"viewers" : [ "data.default.viewers@opendes.ibm.com" ],
"owners" : [ "data.default.owners@opendes.ibm.com" ]
},
"legal" : {
"otherRelevantDataCountries" : [ "US" ],
"status" : "compliant",
"legaltags" : [ "opendes-Test-Legal-Tag-7292798" ]
},
"createUser" : "preshipteama",
"createTime" : "2021-09-22 02:15:59",
"modifyUser" : "preshipteama",
"modifyTime" : "2021-09-22 02:15:59"
}
```
Response obtained with dataset ID opendes:dataset--File.Generic:3f2d8cb1-dfec-48e6-ba90-986b5a89a0ed
4. Trigger DAG Workflow
POST https://osdu-cpd-osdu.odi-osdu-og-fa7661852f2ab29a6be32f560b2f5573-0000.us-south.containers.appdomain.cloud/osdu-workflow/api/workflow/v1/workflow/Energistics_xml_ingest/workflowRun
Given Authorization with Access/id token
Given data-partition-id opendes
Given Content-Type application/json
Given body:
```json
{
"executionContext" : {
"Payload" : {
"AppKey" : "test-app",
"data-partition-id" : "opendes"
},
"Context" : {
"acl" : {
"viewers" : [ "data.default.viewers@opendes.ibm.com" ],
"owners" : [ "data.default.owners@opendes.ibm.com" ]
},
"legal" : {
"otherRelevantDataCountries" : [ "US" ],
"status" : "compliant",
"legaltags" : [ "opendes-Test-Legal-Tag-7292798" ]
},
"kind" : "opendes:wks:dataset--File.Generic:1.0.0",
"version" : 1.0,
"dataset_id" : "opendes:dataset--File.Generic:3f2d8cb1-dfec-48e6-ba90-986b5a89a0ed",
"file_name" : "trajectory_DC.xml",
"preload_file_path" : "s3://oc-cpd-opendes-staging-bucket/dbbe4c019ef5476597b417574ffcdb6a"
}
}
}
```
Response with run ID: 0174ebc2-1c65-47b4-9c06-7ac5a8823925
Error logs:
```
[2021-09-22 12:16:49,662] {logging_mixin.py:112} INFO - [2021-09-22 12:16:49,662] {pod_launcher.py:142} INFO - Event: witsml-parser-task-25a574f5 had an event of type Running
[2021-09-22 12:16:59,071] {logging_mixin.py:112} INFO - [2021-09-22 12:16:59,071] {pod_launcher.py:125} INFO - b'IBMBlobStorageFactory().get_s3_client\n'
[2021-09-22 12:16:59,071] {logging_mixin.py:112} INFO - [2021-09-22 12:16:59,071] {pod_launcher.py:125} INFO - b'IBMBlobStorageFactory().s3_client: <botocore.client.S3 object at 0x7fd4d46e21d0>\n'
[2021-09-22 12:16:59,072] {logging_mixin.py:112} INFO - [2021-09-22 12:16:59,072] {pod_launcher.py:125} INFO - b'Traceback (most recent call last):\n'
[2021-09-22 12:16:59,075] {logging_mixin.py:112} INFO - [2021-09-22 12:16:59,072] {pod_launcher.py:125} INFO - b' File "main.py", line 45, in <module>\n'
[2021-09-22 12:16:59,075] {logging_mixin.py:112} INFO - [2021-09-22 12:16:59,075] {pod_launcher.py:125} INFO - b' manifest = json.dumps(main(json.loads(args.context), args.file_service))\n'
[2021-09-22 12:16:59,076] {logging_mixin.py:112} INFO - [2021-09-22 12:16:59,076] {pod_launcher.py:125} INFO - b' File "main.py", line 35, in main\n'
[2021-09-22 12:16:59,076] {logging_mixin.py:112} INFO - [2021-09-22 12:16:59,076] {pod_launcher.py:125} INFO - b' manifest = manifest_creator.create_manifest(execution_context)\n'
[2021-09-22 12:16:59,076] {logging_mixin.py:112} INFO - [2021-09-22 12:16:59,076] {pod_launcher.py:125} INFO - b' File "/home/witsml_parser/create_energistics_manifest.py", line 234, in create_manifest\n'
[2021-09-22 12:16:59,077] {logging_mixin.py:112} INFO - [2021-09-22 12:16:59,076] {pod_launcher.py:125} INFO - b' dataset_dms_client = DatasetDmsClient(DefaultConfigManager(), self.payload_context.data_partition_id)\n'
[2021-09-22 12:16:59,077] {logging_mixin.py:112} INFO - [2021-09-22 12:16:59,077] {pod_launcher.py:125} INFO - b'TypeError: __init__() takes from 1 to 2 positional arguments but 3 were given\n'
[2021-09-22 12:17:02,050] {logging_mixin.py:112} INFO - [2021-09-22 12:17:02,049] {pod_launcher.py:217} INFO - Running command... cat /airflow/xcom/return.json
[2021-09-22 12:17:02,383] {logging_mixin.py:112} INFO - [2021-09-22 12:17:02,382] {pod_launcher.py:224} INFO - cat: can't open '/airflow/xcom/return.json': No such file or directory
```
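The traceback points at the `DatasetDmsClient` constructor call in `create_energistics_manifest.py`. A hypothetical illustration of the kind of signature mismatch the TypeError indicates; the real classes live in the WITSML parser image and the osdu-api SDK it bundles:

```python
# Hypothetical illustration only; not the actual SDK code.
class DatasetDmsClient:
    def __init__(self, config_manager=None):
        # Accepts 1-2 positional arguments (self + optional config_manager).
        self.config_manager = config_manager

# The parser still passes the partition id as a third positional argument:
#   DatasetDmsClient(DefaultConfigManager(), data_partition_id)
# which raises:
#   TypeError: __init__() takes from 1 to 2 positional arguments but 3 were given
```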
More details here:
http://airflow-web-odi-airflow-ns.odi-osdu-og-fa7661852f2ab29a6be32f560b2f5573-0000.us-south.containers.appdomain.cloud/log?task_id=witsml_parser_task&dag_id=Energistics_xml_ingest&execution_date=2021-09-22T12%3A16%3A06%2B00%3A00

## IBM R3M8 - Wellbore DMS - issue with bulk access API
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/77 · 2021-09-28T10:00:39Z · Mariia Zver · Assignee: Anuj Gupta

Something happened at the step of getting Wellbore DDMS data while testing the Wellbore DDMS (IBM). What might be the problem?
![Снимок_экрана_2021-09-25_в_23.35.39](/uploads/e7b5efb47d3b29beade74ddf016a4866/Снимок_экрана_2021-09-25_в_23.35.39.png)

## DAG tasks in queued state - WITSML parser
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/83 · 2021-09-28T12:12:30Z · etienne peysson

I've been having an issue since yesterday when triggering the WITSML parser ingestion workflow.
None of my tasks are getting scheduled.
![Screenshot_from_2021-09-28_08-18-09](/uploads/a32c6a4b40c36d63d009cc0abcc51527/Screenshot_from_2021-09-28_08-18-09.png)
![Screenshot_from_2021-09-28_08-18-01](/uploads/74b25c33124926f4baf2a7b60e418799/Screenshot_from_2021-09-28_08-18-01.png)
Given body:
```json
{
"executionContext" : {
"Payload" : {
"AppKey" : "test-app",
"data-partition-id" : "opendes"
},
"Context" : {
"acl" : {
"viewers" : [ "data.default.viewers@opendes.ibm.com" ],
"owners" : [ "data.default.owners@opendes.ibm.com" ]
},
"legal" : {
"otherRelevantDataCountries" : [ "US" ],
"status" : "compliant",
"legaltags" : [ "opendes-Test-Legal-Tag-7292798" ]
},
"kind" : "opendes:wks:dataset--File.Generic:1.0.0",
"version" : 1.0,
"file_name" : "1.xml",
"preload_file_path" : "s3://oc-cpd-opendes-staging-bucket/77142592e5c5491795bc7e7b4d41914a"
}
}
}
```
Response with run ID: e45edb27-0b9f-4a4f-b163-a07b9f5f506c

## [GCP Airflow] No Error Logging Recorded for Missing Reference Record During Manifest Ingestion
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/86 · 2021-09-28T12:48:50Z · Naufal Mohamed Noori · Assignee: Aleksandr Spivakov (EPAM)

Manifest ingestion does not produce any ERROR logging in the Airflow log if a record's JSON manifest contains a non-existing reference/master-data parameter.
Steps to reproduce:
a) Using DAG manifest ingestion, load a master-data wellbore record. Note that I inserted a data.WellID that does not exist in the current database: **"WellID": "{{data-partition-id}}:master-data--Well:TEST_ERROR:"**
BODY:
```json
{
"executionContext": {
"Payload": {
"AppKey": "test-app",
"data-partition-id": "{{data-partition-id}}"
},
"manifest": {
"kind": "{{data-partition-id}}:wks:Manifest:1.0.0",
"MasterData": [
{
"id": "{{data-partition-id}}:master-data--Wellbore:Test_NN_2021_09_24_01",
"kind": "{{data-partition-id}}:wks:master-data--Wellbore:1.0.0",
"acl": {
"owners": [
"data.default.owners@{{data-partition-id}}.osdu-gcp.go3-nrg.projects.epam.com"
],
"viewers": [
"data.default.viewers@{{data-partition-id}}.osdu-gcp.go3-nrg.projects.epam.com"
]
},
"legal": {
"legaltags": [
"{{data-partition-id}}-demo-legaltag"
],
"otherRelevantDataCountries": [
"US"
]
},
"data": {
"WellID": "{{data-partition-id}}:master-data--Well:TEST_ERROR:",
"FacilityName": "TEST_NN_1_ALIAS",
"SequenceNumber": 1,
"Source": "TEST_NN_1_ALIAS_SOURCE",
"NameAliases": [
{
"AliasName": "TEST_NN_1_ALIAS"
}
]
}
}
]
}
}
}
```
b) Run DAG Manifest POST: https://{{WORKFLOW_HOST}}/workflow/Osdu_ingest/workflowRun:
```json
{
"workflowId": "ef82cba0-0e45-4df3-91bf-4df1553102d3",
"runId": "5a786c6f-103e-44d3-b192-d34e3026b722",
"startTimeStamp": 1632812342734,
"status": "submitted",
"submittedBy": "preshipping_test_user@osdu-gcp.go3-nrg.projects.epam.com"
}
```
c) Observe the Airflow log. At no stage does the log contain an ERROR entry, even though the DAG run fails at the end and no new record is stored. There is a trace of DEBUG logging inside Airflow that indicates some record checking, but no ERROR logging:
```
[2021-09-28 06:59:54,321] {search_record_ids.py:78} DEBUG - Search query "odesprod:master-data--Well:TEST_ERROR"
[2021-09-28 06:59:54,365] {connectionpool.py:939} DEBUG - Starting new HTTPS connection (1): preship-asm.osdu-gcp.go3-nrg.projects.epam.com:443
[2021-09-28 06:59:56,781] {connectionpool.py:433} DEBUG - https://preship-asm.osdu-gcp.go3-nrg.projects.epam.com:443 "POST /api/search/v2/query HTTP/1.1" 200 None
[2021-09-28 06:59:56,785] {search_record_ids.py:183} DEBUG - {"results":[],"aggregations":[],"totalCount":0}
[2021-09-28 06:59:56,785] {search_record_ids.py:188} DEBUG - Got total count 0
[2021-09-28 06:59:56,786] {search_record_ids.py:169} DEBUG - response ids: []
```
EXPECTATION:
If a record is not stored because a referenced record is not found in the database, we should observe ERROR-level logging in Airflow.
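A sketch of the expected behavior in the validation step; hypothetical, the real code lives in `search_record_ids.py` / `validate_referential_integrity.py`:

```python
import logging

logger = logging.getLogger(__name__)

def report_missing_references(record_id: str, missing_ids: set) -> None:
    """Hypothetical sketch: surface missing references at ERROR level so the
    failure is visible in the Airflow log, not only as DEBUG traces."""
    if missing_ids:
        logger.error(
            "Record %s rejected: referenced ids not found: %s",
            record_id, sorted(missing_ids),
        )
```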
**(TESTED ON R3M8 Preship GCP environment on 27 September 2021)**
cc @esmira.rafigayeva @debasisc @aliaksandr_ramanovich

## Step 9 from CSV Workflow does not work in GCP environment
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/67 · 2021-09-28T12:49:06Z · Valery Ginak

![Снимок_экрана_2021-09-23_в_18.37.42](/uploads/fd63b9ce817231de990b07e9f3a2532d/Снимок_экрана_2021-09-23_в_18.37.42.png)

## IBM R3M8 - Reference data population
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/75 · 2021-09-28T16:44:20Z · Debasis Chatterjee

@shamazum - Please check the current list. CRS is not there, and UNIT is incomplete, I think.
[IBM-Reference-data-populated.txt](/uploads/29da32a5bc074bf5d4ebe4b4182b2254/IBM-Reference-data-populated.txt)
Source is here -
https://community.opengroup.org/osdu/data/data-definitions/-/tree/master/ReferenceValues/Manifests/reference-data
Thank you
cc - @esmira.rafigayeva and @sehuboy

## AWS Segy to ZGY workflow not working
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/92 · 2021-09-28T22:39:44Z · Michael · Assignee: Greg

While attempting to convert a SEG-Y dataset to a ZGY dataset in the Seismic DDMS using the SEGY_TO_ZGY DAG, the task failed. Here are the workflow task logs.
```
*** Reading remote log from s3://prer3m8-ingest-logbucket-59uj-s3airflowbucketprod-udbcrvficay5/logs/SEGY_TO_ZGY/segy-to-zgy/2021-09-28T13:45:15.858388+00:00/1.log.
[2021-09-28 13:45:21,816] {taskinstance.py:670} INFO - Dependencies all met for <TaskInstance: SEGY_TO_ZGY.segy-to-zgy 2021-09-28T13:45:15.858388+00:00 [queued]>
[2021-09-28 13:45:21,840] {taskinstance.py:670} INFO - Dependencies all met for <TaskInstance: SEGY_TO_ZGY.segy-to-zgy 2021-09-28T13:45:15.858388+00:00 [queued]>
[2021-09-28 13:45:21,840] {taskinstance.py:880} INFO -
--------------------------------------------------------------------------------
[2021-09-28 13:45:21,840] {taskinstance.py:881} INFO - Starting attempt 1 of 1
[2021-09-28 13:45:21,840] {taskinstance.py:882} INFO -
--------------------------------------------------------------------------------
[2021-09-28 13:45:21,858] {taskinstance.py:901} INFO - Executing <Task(KubernetesPodOperator): segy-to-zgy> on 2021-09-28T13:45:15.858388+00:00
[2021-09-28 13:45:21,861] {standard_task_runner.py:54} INFO - Started process 470 to run task
[2021-09-28 13:45:21,892] {standard_task_runner.py:77} INFO - Running: ['airflow', 'run', 'SEGY_TO_ZGY', 'segy-to-zgy', '2021-09-28T13:45:15.858388+00:00', '--job_id', '5128', '--pool', 'default_pool', '--raw', '-sd', 'DAGS_FOLDER/openzgy/segy_to_zgy_ingestion_dag.py', '--cfg_path', '/tmp/tmpxqzm08nt']
[2021-09-28 13:45:21,893] {standard_task_runner.py:78} INFO - Job 5128: Subtask segy-to-zgy
[2021-09-28 13:45:21,956] {logging_mixin.py:120} INFO - Running <TaskInstance: SEGY_TO_ZGY.segy-to-zgy 2021-09-28T13:45:15.858388+00:00 [running]> on host airflow-worker-0.airflow-worker.osdu-airflow.svc.cluster.local
[2021-09-28 13:45:22,028] {logging_mixin.py:120} WARNING - /home/airflow/.local/lib/python3.6/site-packages/airflow/kubernetes/pod_launcher.py:331: DeprecationWarning: Using `airflow.contrib.kubernetes.pod.Pod` is deprecated. Please use `k8s.V1Pod`.
security_context=_extract_security_context(pod.spec.security_context)
[2021-09-28 13:45:22,028] {logging_mixin.py:120} WARNING - /home/airflow/.local/lib/python3.6/site-packages/airflow/kubernetes/pod_launcher.py:77: DeprecationWarning: Using `airflow.contrib.kubernetes.pod.Pod` is deprecated. Please use `k8s.V1Pod` instead.
pod = self._mutate_pod_backcompat(pod)
[2021-09-28 13:45:22,078] {pod_launcher.py:173} INFO - Event: segy-to-zgy-a6019acd8c6641a587114d9efa5cfa6a had an event of type Pending
[2021-09-28 13:45:22,078] {pod_launcher.py:139} WARNING - Pod not yet started: segy-to-zgy-a6019acd8c6641a587114d9efa5cfa6a
[2021-09-28 13:45:23,087] {pod_launcher.py:173} INFO - Event: segy-to-zgy-a6019acd8c6641a587114d9efa5cfa6a had an event of type Pending
[2021-09-28 13:45:23,087] {pod_launcher.py:139} WARNING - Pod not yet started: segy-to-zgy-a6019acd8c6641a587114d9efa5cfa6a
[2021-09-28 13:45:24,096] {pod_launcher.py:173} INFO - Event: segy-to-zgy-a6019acd8c6641a587114d9efa5cfa6a had an event of type Failed
[2021-09-28 13:45:24,097] {pod_launcher.py:284} INFO - Event with job id segy-to-zgy-a6019acd8c6641a587114d9efa5cfa6a Failed
[2021-09-28 13:45:24,116] {pod_launcher.py:156} INFO - b'[0.003890] SEGYTOZGY_ZFP_LOD_COMPRESS=[]\n'
[2021-09-28 13:45:24,116] {pod_launcher.py:156} INFO - b'[0.003896] SEGYTOZGY_ZFP_LOD_SNR=[]\n'
[2021-09-28 13:45:24,117] {pod_launcher.py:156} INFO - b'[0.003903] set SEGYTOZGY_INSECURE_PRINT_TOKEN=1 to print token values\n'
[2021-09-28 13:45:24,117] {pod_launcher.py:156} INFO - b'[0.003912] END Environment variables\n'
[2021-09-28 13:45:24,117] {pod_launcher.py:156} INFO - b'[0.003921] Command line arguments: [/usr/local/bin/segy/SegyToZgy] [--osdu] [osdu:dataset--FileCollection.SEGY:e1d8444c4ae545c1b3446211be7995bb] [{{SeismicTraceDataBinGridWPId}}]\n'
[2021-09-28 13:45:24,117] {pod_launcher.py:156} INFO - b'[0.003933] Fetching work product [{{SeismicTraceDataBinGridWPId}}]\n'
[2021-09-28 13:45:24,117] {pod_launcher.py:156} INFO - b'[0.003949] About to get record [{{SeismicTraceDataBinGridWPId}}]\n'
[2021-09-28 13:45:24,117] {pod_launcher.py:156} INFO - b'[0.004017] Storage service URL: [https://preshiptesting.osdu.aws/api/storage/v2]\n'
[2021-09-28 13:45:24,117] {pod_launcher.py:156} INFO - b'[0.004026] Data partition ID : [osdu]\n'
[2021-09-28 13:45:24,117] {pod_launcher.py:156} INFO - b'Invalid format of object reference.\n'
[2021-09-28 13:45:24,132] {pod_launcher.py:173} INFO - Event: segy-to-zgy-a6019acd8c6641a587114d9efa5cfa6a had an event of type Failed
[2021-09-28 13:45:24,132] {pod_launcher.py:284} INFO - Event with job id segy-to-zgy-a6019acd8c6641a587114d9efa5cfa6a Failed
[2021-09-28 13:45:24,138] {pod_launcher.py:173} INFO - Event: segy-to-zgy-a6019acd8c6641a587114d9efa5cfa6a had an event of type Failed
[2021-09-28 13:45:24,138] {pod_launcher.py:284} INFO - Event with job id segy-to-zgy-a6019acd8c6641a587114d9efa5cfa6a Failed
[2021-09-28 13:45:24,172] {taskinstance.py:1150} ERROR - Pod Launching failed: Pod returned a failure: failed
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/contrib/operators/kubernetes_pod_operator.py", line 309, in execute
'Pod returned a failure: {state}'.format(state=final_state))
airflow.exceptions.AirflowException: Pod returned a failure: failed
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/models/taskinstance.py", line 979, in _run_raw_task
result = task_copy.execute(context=context)
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/contrib/operators/kubernetes_pod_operator.py", line 312, in execute
raise AirflowException('Pod Launching failed: {error}'.format(error=ex))
airflow.exceptions.AirflowException: Pod Launching failed: Pod returned a failure: failed
[2021-09-28 13:45:24,173] {taskinstance.py:1194} INFO - Marking task as FAILED. dag_id=SEGY_TO_ZGY, task_id=segy-to-zgy, execution_date=20210928T134515, start_date=20210928T134521, end_date=20210928T134524
[2021-09-28 13:45:26,726] {local_task_job.py:102} INFO - Task exited with return code 1
```
Note that the pod's command-line arguments contain the literal string `{{SeismicTraceDataBinGridWPId}}`; the Postman variable apparently was not substituted, which would explain the "Invalid format of object reference." error.

The steps I used to test this feature are in the attached file.
[AWS_M8_OpenZGY_Test_3_Results.docx](/uploads/649c2b58582cd7401d44dc3007b298af/AWS_M8_OpenZGY_Test_3_Results.docx)

## IBM missing standard reference data
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/66 · 2021-09-28T22:43:02Z · Yanbin Zhang
Trying to find all the schemas available in the IBM pre-shipping environment: under authority='opendes' and source='wks', only 59 distinct schemas are preloaded. We expect all reference data to be preloaded in the pre-shipping environment.
@anujgupta
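The schema count can be reproduced against the Schema service; a minimal sketch where the host variable is an assumption and the filter parameters follow the Schema service's GET /schema API:

```python
import os
import requests

SCHEMA_HOST = os.environ["SCHEMA_HOST"]  # hypothetical: Schema service base URL

resp = requests.get(
    f"{SCHEMA_HOST}/api/schema-service/v1/schema",
    params={"authority": "opendes", "source": "wks", "limit": 1000},
    headers={
        "data-partition-id": "opendes",
        "Authorization": f"Bearer {os.environ['OSDU_TOKEN']}",
    },
)
infos = resp.json().get("schemaInfos", [])
entities = {i["schemaIdentity"]["entityType"] for i in infos}
print(len(entities), "distinct schemas")  # observed here: 59
```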