# Pre-Shipping issues
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues

---
**Issue #24: CSV Ingestion** (Esmira Rafigayeva; assignee: Taylor Graber; milestone: Pre-Shipping R3-M8; updated 2022-08-23)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/24

---
**Issue #27: WITSML Parser** (Esmira Rafigayeva; assignee: Esmira Rafigayeva; milestone: Pre-Shipping R3-M8; updated 2022-08-24)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/27

---
**Issue #19: Manifest Ingestion** (Esmira Rafigayeva; assignee: Naufal Mohamed Noori; milestone: Pre-Shipping R3-M8; updated 2022-09-03)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/19

---
**Issue #344: GCP - M13 - WITSML Parser - fails when we try to upload Well data type** (Debasis Chatterjee; assignee: Dzmitry Malkevich (EPAM); updated 2022-09-01)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/344

Please see the detailed log here:
[M13-GCP-WITSML-failure.txt](/uploads/11daf057a0fe6da81ad3ea70b0299db4/M13-GCP-WITSML-failure.txt)
cc @andrei_dalhikh

---
**Issue #62: Wellbore DDMS (R3M8)** (Esmira Rafigayeva; assignee: Mariia Zver; milestone: Pre-Shipping R3-M8; updated 2022-08-23)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/62

I've gone through the Wellbore DDMS collection with no issues until the step "get data - wellbore ddms", where something went wrong:
![Снимок_экрана_2021-09-25_в_23.35.39](/uploads/0ed795b0c668aad32a34625d3f64f9ea/Снимок_экрана_2021-09-25_в_23.35.39.png)

---
**Issue #13: Manifest Ingestion** (Esmira Rafigayeva; assignee: Steven Evans; milestone: Pre-Shipping R3-M8; updated 2022-08-24)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/13

---
**Issue #46: Admin UI** (Sehubo Akinyanmi; assignee: Grant Marblestone; milestone: Pre-Shipping R3-M8; updated 2022-08-23)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/46

Admin UI Test Validation
(All CSPs should now be on EV2; potential to see the same behaviour across all CSPs)

---
**Issue #47: Admin UI** (Sehubo Akinyanmi; assignee: Grant Marblestone; milestone: Pre-Shipping R3-M8; updated 2022-08-23)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/47

Admin UI Test Validation (All CSPs should now be on EV2; potential to see the same behaviour across all CSPs)

---
**Issue #48: Admin UI** (Sehubo Akinyanmi; assignee: Grant Marblestone; milestone: Pre-Shipping R3-M8; updated 2022-08-23)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/48

Admin UI
(All CSPs should now be on EV2; potential to see the same behaviour across all CSPs)

---
**Issue #49: Admin UI** (Sehubo Akinyanmi; assignee: Grant Marblestone; milestone: Pre-Shipping R3-M8; updated 2022-08-23)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/49

Admin UI
(All CSPs should now be on EV2; potential to see the same behaviour across all CSPs)

---
**Issue #50: HTTP Status Error 404 Not found - workflow service osdu_ingest** (Chad Leong; assignee: Aliaksandr Ramanovich (EPAM); updated 2022-07-04)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/50

## Summary
Tried to submit a manifest through
`POST https://workflow-drgfbg5txq-uc.a.run.app/v1/workflow/Osdu_ingest/workflowRun`
Got the error 404.
## Steps to reproduce
1) Get auth token
2) Submit the manifest to the workflow service via Postman
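The failing call can be reproduced outside Postman. Below is a minimal sketch (standard library only; the host is the one from the issue) that builds the endpoint URL. Since the workflow name is embedded in the path, one possible cause of a 404 from this endpoint is a name mismatch such as "Osdu_ingest" vs "osdu_ingest"; that is a hypothesis, not a confirmed root cause.

```python
from urllib.parse import quote

def workflow_run_url(host: str, workflow_name: str) -> str:
    """Build the v1 workflowRun endpoint; the workflow name is part of the URL path."""
    return f"{host}/v1/workflow/{quote(workflow_name)}/workflowRun"

# The exact URL the issue reports as returning 404:
print(workflow_run_url("https://workflow-drgfbg5txq-uc.a.run.app", "Osdu_ingest"))
```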
## Example Environment(Tenant)
gcp preship
## What is the current bug behavior?
Error 404
## What is the expected correct behavior?
Status code 200
## Relevant logs and/or screenshots
(Paste any relevant logs - please use code blocks (```) to format console output, logs, and code, as
it's very hard to read otherwise.)
## Possible fixes
(If you can, link to the line of code that might be responsible for the problem)
/cc @kateryna_kurach
/cc @aliaksandr_ramanovich
/cc @sergey_krupenin

---
**Issue #58: Could not find osdu_api.ini config file - Witsml Ingestion - Airflow DAG - R3M8** (etienne peysson; assignees: Aliaksandr Ramanovich (EPAM), Aleksandr Spivakov (EPAM); updated 2023-02-24)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/58

When the energetics_xml_ingest DAG is triggered.
Entering the Running state gives the following error :
```
[2021-09-20 18:56:56,511] {pod_launcher.py:149} INFO - File "/usr/local/lib/python3.6/site-packages/osdu_api/clients/base_client.py", line 57, in _parse_config
[2021-09-20 18:56:56,512] {pod_launcher.py:149} INFO - raise Exception('Could not find osdu_api.ini config file')
[2021-09-20 18:56:56,512] {pod_launcher.py:149} INFO - Exception: Could not find osdu_api.ini config file
```
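For orientation, what the failing `_parse_config` step amounts to can be sketched as below; the environment-variable name and default path here are assumptions for illustration, not the library's actual lookup logic.

```python
import configparser
import os

def parse_config(path_env: str = "OSDU_API_CONFIG_INI", default_path: str = "osdu_api.ini"):
    """Resolve and load osdu_api.ini, raising the same error seen in the Airflow log
    when the file is absent (env-var name and default path are illustrative)."""
    path = os.environ.get(path_env) or default_path
    if not os.path.exists(path):
        raise Exception('Could not find osdu_api.ini config file')
    config = configparser.ConfigParser()
    config.read(path)
    return config
```

In a pod-launched task the usual remedy is to make sure the config file is baked into the task image or mounted into the pod so the resolved path exists inside the container.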
More details here :
https://waa6f12562960aaa4p-tp.appspot.com/log?dag_id=Energistics_xml_ingest&task_id=witsml_parser_task&execution_date=2021-09-20T18%3A55%3A07%2B00%3A00

---
**Issue #57: [CSV parser] Ingested data cannot be found** (Kateryna Kurach (EPAM); assignee: Aliaksandr Ramanovich (EPAM); updated 2021-10-12)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/57

## Summary
(Summarize the bug encountered concisely)
## Steps to reproduce
(How one can reproduce the issue - this is very important)
## Example Environment(Tenant)
Pre-shipping
## What is the current bug behavior?
CSV ingestion is executed successfully, but the ingested data cannot be found.
## What is the expected correct behavior?
Ingested data is found.
## Relevant logs and/or screenshots
Please see attached file.
## Possible fixes
(If you can, link to the line of code that might be responsible for the problem)
/cc @kateryna_kurach
/cc @Aliaksandr_Ramanovich1
[Bug_57.txt](/uploads/707f0fd88cd336cae73c4356e7bfb8ce/Bug_57.txt)

---
**Issue #72: GCP Smoke test collection - Issue with Seismic DMS folder, requests** (Debasis Chatterjee; assignee: Aliaksandr Ramanovich (EPAM); updated 2021-09-29)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/72

I reported an authentication problem to @aliaksandr_ramanovich from the Smoke Test collection.
After you authenticate (refresh token), requests from “Core Services” and Ingestion folders work fine.
But requests from any other folder (Dataset, SeismicDMS) fail with 401 Unauthorized.

---
**Issue #96: GCP - R3M8, oZgy - problem with subproject creation (Collection 27)** (Debasis Chatterjee; assignees: Yan Sushchynski (EPAM), Aleksandr Spivakov (EPAM); updated 2021-09-30)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/96

@aliaksandr_ramanovich and @Denis_Karpenok - After resolving the Domain API authentication problem, I could create a tenant and sub-project.
But when copying a SegY file to SDMS using sdutil in command-line mode, it turns out that the subproject is created without a proper legal tag.
I use collection #27 from Platform Validation.
Step 01 creates the tenant.
Steps 04 and 05 create the subproject; the response shows the legal tag is empty.
I suspect this is the reason why "sdutil cp" fails at a later stage.
Can you please check?
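To make the empty-legal-tag case easy to spot mechanically, here is a tiny illustrative helper; the `ltag` and `legal.legaltags` field names are assumptions about the response shape, not a documented contract.

```python
def has_legal_tag(subproject_response: dict) -> bool:
    """True when the subproject registration response carries a non-empty legal tag."""
    ltag = subproject_response.get("ltag") or subproject_response.get("legal", {}).get("legaltags")
    return bool(ltag)

print(has_legal_tag({"name": "dc-proj", "ltag": ""}))            # empty tag, as in steps 04/05
print(has_legal_tag({"name": "dc-proj", "ltag": "demo-legal"}))  # expected healthy case
```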
[GCP-sdutil-issue-legal-tag.txt](/uploads/75590f9207a4767de93d04546cae01c3/GCP-sdutil-issue-legal-tag.txt)

---
**Issue #101: GCP - SegY to zgy conversion - DAG run shows failure** (Debasis Chatterjee; assignee: Yan Sushchynski (EPAM); updated 2021-10-01)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/101

cc @Yan_Sushchynski for information
runid = 3b837205-e9da-47bd-9565-e168e0883010
See Airflow log
[GCP-Airflow-log-failure-conversion-zgy.txt](/uploads/b37518eba176b0123fe397323d5fa3b6/GCP-Airflow-log-failure-conversion-zgy.txt)
Looks like the program is unable to access the file from SDMS.
Checked stat of the file from sdutil
```
(sdutilenv) C:\seismic-store-sdutil-master>python sdutil stat sd://dc-test30septry2/dc-proj/osdu-volve.segy --idtoken=%ID_TOKEN%
- Name: sd://dc-test30septry2/dc-proj/osdu-volve.segy
- Created By: 110703984333908487442
- Created Date: Thu Sep 30 2021 18:48:58 GMT+0000 (Coordinated Universal Time)
- Size: 1.0 GB
```
In the JSON definition:
```
"FileSource": "sd://dc-test30septry2/dc-proj/osdu-volve.segy",
```
Relevant records for this task:
```
"recordIds": [
"odesprod:work-product--WorkProduct:dc-30sep-wp",
"odesprod:work-product-component--SeismicTraceData:dc-30sep-tracedata",
"odesprod:work-product-component--SeismicBinGrid:dc-30sep-bingrid",
  "odesprod:dataset--FileCollection.SEGY:dc-30sep-dataset"
]
```

---
**Issue #95: GCP - R3M8 - oVDS workflow - problem in actual conversion (collection-40) (GONRG-3608)** (Debasis Chatterjee; assignee: Yan Sushchynski (EPAM); milestone: M9 - Release 0.12; updated 2021-10-15)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/95

I am using collection-40 from the Platform Validation site.
https://community.opengroup.org/osdu/platform/testing/-/tree/master/Postman%20Collection
Steps 1 through 6 went fine. I get this **error at Step 7**:
POST https://{{DATASET_HOST}}/getRetrievalInstructions
Body
```
{
"datasetRegistryIds": [
"{{SegyDatasetRegistryId}}"
]
}
```
Response
```
{
"code": 400,
"reason": "Malformed URL",
"message": "Exception creating signed url"
}
```
Curl command
```
curl --location --request POST 'https://dataset-drgfbg5txq-uc.a.run.app/api/dataset/v1/getRetrievalInstructions' \
--header 'data-partition-id: odesprod' \
--header 'Authorization: Bearer ya29.a0ARrdaM8gAVViJQwUnRpdKfvKez-G706vLQ-SM_LlsJfuPkG4dzi8XRf5le4nl8z52xlm8wCRq9sHNQtNLHm4h3HuluK9yhSM2pjf0VmRfY7kCyUTOkJyZkvIHLXRB8ltTyX7ZGm9g7fdA8_grebvxsjArSwSyA' \
--header 'Content-Type: application/json' \
--data-raw '{
"datasetRegistryIds": [
"odesprod:dataset--File.Generic:b3ce1dfb-b2b3-4c28-a6e1-6762372dfac5"
]
}'
```
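The same request can also be built with the Python standard library; the host is taken from the curl above, the token is a placeholder, and the request is only constructed here, not sent.

```python
import json
from urllib import request

DATASET_HOST = "https://dataset-drgfbg5txq-uc.a.run.app/api/dataset/v1"

def build_retrieval_request(dataset_ids, token, partition="odesprod"):
    """Construct the POST getRetrievalInstructions request for the Dataset service."""
    return request.Request(
        f"{DATASET_HOST}/getRetrievalInstructions",
        data=json.dumps({"datasetRegistryIds": dataset_ids}).encode(),
        headers={
            "data-partition-id": partition,
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_retrieval_request(
    ["odesprod:dataset--File.Generic:b3ce1dfb-b2b3-4c28-a6e1-6762372dfac5"],
    "<access-token>",
)
```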
Please check and advise.
Thanks
Debasis
cc @aliaksandr_ramanovich, @esmira.rafigayeva

---
**Issue #103: Loading of 50,000 records using Osdu_ingest DAG fails for GCP Platform [GONRG-3584]** (Kamlesh Todai; assignees: Aliaksandr Ramanovich (EPAM), Artem Dobrynin (EPAM); updated 2022-08-23)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/103

When trying to load-test manifest ingestion (the Osdu_ingest DAG) with 50,000 organization records, the process fails with response code 500.
**First attempt**
```
DEBUG:root:Response: 500
DEBUG:root:json = {"timestamp": 1633097264234, "status": 500, "error": "Internal Server Error", "message": "Java heap space", "path": "/api/workflow/v1/workflow/Osdu_ingest/workflowRun"}
```
**The second attempt, after @aliaksandr_ramanovich added more memory**
```
_639_50000", "Source": "Source_TEST_BULK_2021-09-22_639_50000"}}]}}}, "headers": {"data-partition-id": "odesprod", "Accept": "application/json", "Authorization": "Bearer ya29.a0ARrdaM_7bjuhcD4aNLPF2jbK_MsQL1ncG7k_kwSR00_AxgkOPVdXhgOJH_iLzxMSj3mLDooFbusYwCM-avGyTphGmN1-BITJ6xzTwQzar88AFIdcYBYdSjbQ8ECne-IVen3aJdRKVW3YJaisEsBwXRdrL276"}, "verify": false}
DEBUG:root:Response: 500
DEBUG:root:text =
>>>
```
This time it did not give any details in the text field of the response.

---
**Issue #88: Invalid URL - generating manifest - Airflow - Witsml Parser (GONRG-3529)** (etienne peysson; assignee: Aleksandr Spivakov (EPAM); updated 2021-09-30)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/88
```
[2021-09-28 09:42:41,225] {pod_launcher.py:149} INFO - File "/usr/local/lib/python3.6/site-packages/requests/models.py", line 390, in prepare_url
[2021-09-28 09:42:41,225] {pod_launcher.py:149} INFO - raise MissingSchema(error)
[2021-09-28 09:42:41,226] {pod_launcher.py:149} INFO - requests.exceptions.MissingSchema: Invalid URL '': No schema supplied. Perhaps you meant http://?
```
More details in the logs on Airflow :
https://waa6f12562960aaa4p-tp.appspot.com/log?dag_id=Energistics_xml_ingest&task_id=witsml_parser_task&execution_date=2021-09-28T09%3A41%3A32%2B00%3A00

---
**Issue #69: Steps 14 and 15 from CSV Workflow of GCP smoke tests do not search the record you created earlier** (Valery Ginak; updated 2021-10-01)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/69

The date of the result you are getting is May.

---
**Issue #68: Post OSDU WellLog data command from Wellbore DDMS collection does not work in GCP environment** (Valery Ginak; assignee: Siarhei Khaletski (EPAM); updated 2021-10-01)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/68

![Снимок_экрана_2021-09-23_в_18.40.42](/uploads/645ac5aea795e028e9aee2c6957d8984/Снимок_экрана_2021-09-23_в_18.40.42.png)

---
**Issue #87: [GCP Airflow] XCOM Skipped_Ids Is Not Working** (Naufal Mohamed Noori; assignee: Aleksandr Spivakov (EPAM); updated 2021-09-29)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/87

When an ingestion record fails and no record is saved, we don't see any entry in XCOM skipped_ids.
Steps to reproduce:
a) Using DAG manifest ingestion, load a master-data wellbore record. Note that this record will definitely fail due to the missing WellId reference in the record, so at the end we expect this record's JSON won't be saved:
BODY:
```
{
"executionContext": {
"Payload": {
"AppKey": "test-app",
"data-partition-id": "{{data-partition-id}}"
},
"manifest": {
"kind": "{{data-partition-id}}:wks:Manifest:1.0.0",
"MasterData": [
{
"id": "{{data-partition-id}}:master-data--Wellbore:Test_NN_2021_09_24_01",
"kind": "{{data-partition-id}}:wks:master-data--Wellbore:1.0.0",
"acl": {
"owners": [
"data.default.owners@{{data-partition-id}}.osdu-gcp.go3-nrg.projects.epam.com"
],
"viewers": [
"data.default.viewers@{{data-partition-id}}.osdu-gcp.go3-nrg.projects.epam.com"
]
},
"legal": {
"legaltags": [
"{{data-partition-id}}-demo-legaltag"
],
"otherRelevantDataCountries": [
"US"
]
},
"data": {
"WellID": "{{data-partition-id}}:master-data--Well:TEST_ERROR:",
"FacilityName": "TEST_NN_1_ALIAS",
"SequenceNumber": 1,
"Source": "TEST_NN_1_ALIAS_SOURCE",
"NameAliases": [
{
"AliasName": "TEST_NN_1_ALIAS"
}
]
}
}
]
}
}
}
```
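One visible defect in the record above is the trailing colon in the WellID value. Below is a quick local sanity check (an illustrative helper; the pattern is an approximation of the OSDU id format, not the platform's validator):

```python
import re

# partition:group-type--Type:name, with an optional numeric :version suffix
REF_PATTERN = re.compile(r"^[\w\-\.]+:[\w\-\.]+--[\w\-\.]+:[\w\-\.]+(:[0-9]+)?$")

def is_valid_reference(ref: str) -> bool:
    """Rough check that an OSDU record reference is well-formed."""
    return bool(REF_PATTERN.match(ref))

print(is_valid_reference("odesprod:master-data--Well:TEST_ERROR:"))  # trailing ':' -> malformed
print(is_valid_reference("odesprod:master-data--Well:TEST_NN_1"))
```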
b) Run DAG Manifest POST: https://{{WORKFLOW_HOST}}/workflow/Osdu_ingest/workflowRun:
```
{ "workflowId": "ef82cba0-0e45-4df3-91bf-4df1553102d3", "runId": "5a786c6f-103e-44d3-b192-d34e3026b722", "startTimeStamp": 1632812342734, "status": "submitted", "submittedBy": "preshipping_test_user@osdu-gcp.go3-nrg.projects.epam.com" }
```
c) Observe the Airflow log. No skipped_ids are recorded in XCOM even though the log says no record was saved:
```
[2021-09-28 07:00:05,878] {process_manifest_r3.py:167} DEBUG - Manifest data: {'ReferenceData': [], 'MasterData': [], 'Data': {'Datasets': [], 'WorkProductComponents': [], 'WorkProduct': {}}, 'kind': 'odesprod:wks:Manifest:1.0.0'}
[2021-09-28 07:00:05,879] {manifest_analyzer.py:286} DEBUG - Entity graph {}.
[2021-09-28 07:00:05,881] {single_manifest_processor.py:136} INFO - Processed ids []
[2021-09-28 07:00:05,882] {process_manifest_r3.py:173} INFO - Processed ids []
```
![image](/uploads/207710337ef15f86d0d10caf150a42ca/image.png)
(TESTED ON R3M8 Preship GCP environment on 27 September 2021)
cc @esmira.rafigayeva @debasisc @aliaksandr_ramanovich

---
**Issue #67: Step 9 from CSV Workflow does not work in GCP environment** (Valery Ginak; updated 2021-09-28)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/67

![Снимок_экрана_2021-09-23_в_18.37.42](/uploads/fd63b9ce817231de990b07e9f3a2532d/Снимок_экрана_2021-09-23_в_18.37.42.png)

---
**Issue #51: ADR: E2E preshipment team A workflow bot** (etienne peysson; updated 2023-08-09)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/51
## Status
- [x] Draft
- [ ] Proposed
- [x] Trialing
- [ ] Under Review
- [ ] Approved
- [ ] Retired
## Context & Scope
Following the Preshipment validation dashboard :
![preship-validation-scope](/uploads/1205798b98bfc6cbc4b2f341fe68eb45/preship-validation-scope.png)
There are multiple manual steps to perform in order to test each workflow.
This is error-prone and time-consuming.
This ADR focuses on the following steps :
- Authenticate to any CSP
- File uploading whenever required
- Trigger DAG
- Validate the workflow
- Generate a report
- Clean up
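The steps listed above could be modeled by the proposed bot roughly as follows (a sketch under the assumption that each step is an independent callable producing a pass/fail result; all names are illustrative):

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class StepResult:
    name: str
    ok: bool
    detail: str = ""

@dataclass
class WorkflowBot:
    """Runs the per-workflow steps in order and stops at the first failure."""
    steps: List[Callable[[], StepResult]] = field(default_factory=list)

    def run(self) -> List[StepResult]:
        report: List[StepResult] = []
        for step in self.steps:
            result = step()
            report.append(result)
            if not result.ok:   # fail fast so the report points at the broken step
                break
        return report

bot = WorkflowBot(steps=[
    lambda: StepResult("authenticate", True),
    lambda: StepResult("upload file", True),
    lambda: StepResult("trigger DAG", False, "HTTP 404"),
    lambda: StepResult("validate workflow", True),   # never reached
])
print([(r.name, r.ok) for r in bot.run()])
```

A run like this would feed directly into the "Generate a report" step: the report is simply the list of StepResult entries.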
That work could also be extended with the following:
- Bulk loading
Another ADR has been approved for maintaining Postman collections to be integrated for testing on DAGs and services endpoints end to end executions.
There might be some overlap with the [Postman Collection ADR](https://community.opengroup.org/osdu/platform/data-flow/ingestion/home/-/issues/49).
## Proposition
Implement a script/framework so we can test each use case independently from any computer but also a Gitlab pipeline (e2e tests).
- Pros:
  - Get a clear report of the process so we can more easily provide feedback.
  - Have a common place with configuration templates to fill in for CSPs (another script could also help with that part).
  - Add other workflows using the existing framework.
  - Test using multiple source files.
  - Match the framework version with the releases so pipelines can be run in multiple test environments at the same time (as long as environments are available).
  - Easy onboarding for new developers.
- Cons:
  - Maintenance of the configuration parameters (CSPs) should follow the release cadence.
  - Needs developers.
## Decision
## Rationale
## Consequences
- Direct consequence on the Preshipment team A.
## When to revisit
## Tradeoff analysis - input to decision
## Decision timeline

---
**Issue #52: CSV Parser failure - R3M7 IBM - Custom Schema - Airflow DAG error** (Steven Evans; assignee: Shrikant Garg; milestone: Pre-Shipping R3-M7; updated 2021-09-24)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/52

Running a practice test on the M7 CSV custom schema prior to the M8 testing program, an issue was encountered when the "triggering the Workflow" step 09 was actioned. This successfully triggered and created a runid; however, on checking the Airflow DAG logs, the process failed with the following error:
![image](/uploads/fcf25be25a5dcd82f619c84b33e1f728/image.png)
![Custom_Schema_DAG_failure_log](/uploads/95971f525950934d1271d0e77cbccde9/Custom_Schema_DAG_failure_log.PNG)

---
**Issue #53: WITSML Parser (Trajectory) - Error 500 on Post metadata - R3M8** (etienne peysson; assignee: Gokul Nagare; milestone: Pre-Shipping R3-M8; updated 2021-09-23)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/53

I'm receiving the following error:
"Client failed to authenticate using SASL: PLAIN" and code : 500
After making the following call :
POST https://osdu-cpd-osdu.odi-osdu-og-fa7661852f2ab29a6be32f560b2f5573-0000.us-south.containers.appdomain.cloud/osdu-file/api/file/v2/files/metadata
- Given Authorization with Access/id token
- Given data-partition-id opendes
- Given Content-Type application/json
- Given x-ms-blob-type BlockBlob
Given body :
```json
{
"data" : {
"TotalSize" : 5299.0,
"Source" : "TNO Data Source",
"Name" : "Trajectory DC",
"Endian" : "BIG",
"Description" : "Trajectory WITSML dataset",
"DatasetProperties" : {
"FileSourceInfo" : {
"FileSource" : "567637002f924633833787511ab77dfa",
"Name" : "trajectory_DC.xml",
"PreloadFilePath" : "s3://oc-cpd-opendes-staging-bucket/567637002f924633833787511ab77dfa",
"PreloadFileCreateUser" : null,
"PreloadFileModifyDate" : 1631859302.437914453,
"PreloadFileModifyUser" : null
}
}
},
"kind" : "opendes:wks:dataset--File.Generic:1.0.0",
"acl" : {
"viewers" : [ "data.default.viewers@opendes.ibm.com" ],
"owners" : [ "data.default.owners@opendes.ibm.com" ]
},
"legal" : {
"otherRelevantDataCountries" : [ "US" ],
"status" : "compliant",
"legaltags" : [ "opendes-Test-Legal-Tag-7292798" ]
},
"createUser" : null,
"createTime" : 1631859302.433772431,
"modifyUser" : null,
"modifyTime" : 1631859302.437564058
}
```
It was working properly a few days ago.
Another question:
I see in the DAG that you generate the metadata if we don't provide it when triggering the WITSML parser.
What is the recommended way of doing this?
Do you still allow the metadata to be sent?

---
**Issue #16: Version Endpoint** (Esmira Rafigayeva; assignee: Sehubo Akinyanmi; milestone: Pre-Shipping R3-M8; updated 2022-08-23)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/16

---
**Issue #54: Not able to copy a small file (with one line of text) using sdutil from seismic-dms-sdutil and following the steps described in the readme file** (Kamlesh Todai; assignee: Walter D; updated 2021-09-30)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/54
```
(env) D:\OSDU\PreShipping\M8\seismic-store-sdutil>type data1.txt
"My Test Data"

(env) D:\OSDU\PreShipping\M8\seismic-store-sdutil>python sdutil cp data1.txt sd://opendes/kttestsubprojsep16/mydata1.txt
[423] [seismic-store-service] opendes/kttestsubprojsep16/mydata1.txt is write locked [RCODE:WL86400]   <-- locked from yesterday's attempt

(env) D:\OSDU\PreShipping\M8\seismic-store-sdutil>python sdutil cp data2.txt sd://opendes/kttestsubprojsep16/mydata2.txt
- Uploading Data [ 0% | | 0.00/17.0 - 00:06|? - ?B/s ]
maximum recursion depth exceeded while calling a Python object
```
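For reference, the sd:// paths used above decompose into tenant, subproject, and dataset path. Below is a small illustrative parser (not sdutil's actual implementation):

```python
from urllib.parse import urlparse

def parse_sd_uri(uri: str):
    """Split an sd:// URI into (tenant, subproject, dataset path)."""
    parts = urlparse(uri)
    if parts.scheme != "sd":
        raise ValueError(f"not an sd:// URI: {uri}")
    subproject, _, dataset = parts.path.lstrip("/").partition("/")
    return parts.netloc, subproject, dataset

print(parse_sd_uri("sd://opendes/kttestsubprojsep16/mydata1.txt"))
```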
Attached is the list of commands used to set up the environment and the other steps taken before trying to execute the copy command:
[sdutil_problemLog.docx](/uploads/91875b1726f4ec8cd25a087677f741fb/sdutil_problemLog.docx)

---
**Issue #90: Unable to upload larger file to Azure ('Connection aborted.', timeout('The write operation timed out'))** (Grant Marblestone; assignee: Sumra Zafar; updated 2021-10-01)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/90

I have attempted to write a SEGY file to Azure via sdutil.
I successfully uploaded a small 17k file using the following command:
`python sdutil cp data1.txt sd://opendes/grant-test/grant/data1.txt`
But I fail on an 80 MB file with the following command:
`python sdutil cp data2.txt sd://opendes/grant-test/grant/data2.txt`
I attempted to track down the source of the issue but was not able to spend the time.
In \seismic-store-sdutil\sdlib\cmd\cp\cmd.py, at approximately line 47, there is the method **upload_data_chunks**.
The small file goes through the if/else for a single put in seismic-store-sdutil\sdutilenv\Lib\site-packages\azure\storage\blob\_upload_helpers.py,
while the larger file goes through use_original_upload_path (line 122).
After that I followed the data through the code to seismic-store-sdutil\sdutilenv\Lib\site-packages\azure\storage\blob\_shared\uploads.py.
The large file seems to have a max_concurrency of 1, which seems strange.
Anyway, at this point I gave up; I was unable to set the timeout anywhere.
Note: Debasis and I are unable to upload, but Chris can.

---
**Issue #55: Access of preshipping env (Need service account key)** (upendra kumar; updated 2021-09-21)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/55

## Summary
(Summarize the bug encountered concisely)
## Steps to reproduce
(How one can reproduce the issue - this is very important)
## Example Environment(Tenant)
(Tenant name)
## What is the current bug behavior?
(What actually happens)
## What is the expected correct behavior?
(What you should see instead)
## Relevant logs and/or screenshots
(Paste any relevant logs - please use code blocks (```) to format console output, logs, and code, as
it's very hard to read otherwise.)
## Possible fixes
(If you can, link to the line of code that might be responsible for the problem)
/cc @kateryna_kurach
/cc @Aliaksandr_Ramanovich1
Need access to the pre-shipping env (need a service account key).

---
**Issue #56: While accessing the uploaded segy file using openvds_import DAG getting Invalid DNS Label found in URI host** (Kamlesh Todai; assignee: Walter D; updated 2021-10-05)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/56

Using a Python script and MinIO, I uploaded the segy file. Then, using the Airflow DAG openvds_import and the workflow service, I tried to execute the DAG to ingest the data from the uploaded segy file. The Airflow dashboard showed that the DAG was not executed successfully, and the log shows the following message:
```
[2021-09-20 15:17:22,948] {logging_mixin.py:112} INFO - [2021-09-20 15:17:22,947] {pod_launcher.py:142} INFO - Event: openvds-0b96eeff had an event of type Running
[2021-09-20 15:17:24,063] {logging_mixin.py:112} INFO - [2021-09-20 15:17:24,063] {pod_launcher.py:125} INFO - b'[Could not open input file] s3://osdu-seismic-test-data/ST10010ZC11_PZ_PSDM_RAW_NEAR_T.MIG_RAW.POST_STACK.3D.JS-017536.segy: : Invalid DNS Label found in URI host\n'
[2021-09-20 15:17:24,350] {logging_mixin.py:112} INFO - [2021-09-20 15:17:24,350] {pod_launcher.py:142} INFO - Event: openvds-0b96eeff had an event of type Running
```
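The error text suggests the S3/MinIO host or bucket name is being validated as a DNS name. Here is a quick illustrative check of DNS-label compatibility; note the bucket from the log passes, which hints the problem may lie in the endpoint configuration rather than the bucket name (an inference, not a confirmed cause):

```python
import re

# One DNS label: 1-63 chars of lowercase letters, digits, hyphens; no leading/trailing hyphen.
DNS_LABEL = re.compile(r"^(?!-)[a-z0-9-]{1,63}(?<!-)$")

def is_dns_compatible(name: str) -> bool:
    """True when every dot-separated label of the name is DNS-compatible."""
    return all(DNS_LABEL.match(label) for label in name.split("."))

print(is_dns_compatible("osdu-seismic-test-data"))  # the bucket from the log
print(is_dns_compatible("Bad_Bucket"))
```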
For more details see the attached word doc
[SegyFileNotFoundLog.docx](/uploads/884ede71295f07f6dfc69bfb638e3713/SegyFileNotFoundLog.docx)

---
**Issue #59: R3M8 - Commands with {{FILE_HOST}} do not work** (Valery Ginak; assignee: Aliaksandr Ramanovich (EPAM); updated 2021-09-23)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/59

## Summary
When trying command 7 in the CSV Workflow or command 1 in Manifest Ingestion (both include {{FILE_HOST}}), the error `Error: getaddrinfo ENOTFOUND https` occurs.
## Steps to reproduce
Run any command with {{FILE_HOST}}
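`Error: getaddrinfo ENOTFOUND https` is a classic symptom of an unresolved Postman environment variable: with `{{FILE_HOST}}` undefined, the URL no longer parses into a sensible hostname. Below is a small illustration of the substitution (a hypothetical helper, not Postman's code):

```python
from urllib.parse import urlparse

def effective_host(url_template: str, env: dict) -> str:
    """Substitute {{VAR}} placeholders Postman-style, then report the parsed hostname."""
    url = url_template
    for key, value in env.items():
        url = url.replace("{{%s}}" % key, value)
    return urlparse(url).hostname

# Resolved variable -> a real host; unresolved -> a nonsense "hostname".
print(effective_host("https://{{FILE_HOST}}/api/file/v2/files/uploadURL",
                     {"FILE_HOST": "file.example.com"}))
print(effective_host("https://{{FILE_HOST}}/api/file/v2/files/uploadURL", {}))
```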
/cc @kateryna_kurach
/cc @Aliaksandr_Ramanovich1

---
**Issue #60: Witsml Parser (Trajectory) - DAG failure - R3M8** (etienne peysson; updated 2021-09-28)
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/60

Following are the steps I've followed to get the error from Airflow (you'll see the error at the bottom).
1. Get the SignedUrl from the File DMS
GET https://osdu-cpd-osdu.odi-osdu-og-fa7661852f2ab29a6be32f560b2f5573-0000.us-south.containers.appdomain.cloud/osdu-file/api/file/v2/files/uploadURL
- Given Authorization access/id token
- Given data-partition-id : opendes
- Signed Url : https://minio-osdu-minio.odi-osdu-og-fa7661852f2ab29a6be32f560b2f5573-0000.us-south.containers.appdomain.cloud/oc-cpd-opendes-staging-bucket/dbbe4c019ef5476597b417574ffcdb6a?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Date=20210922T121558Z&X-Amz-SignedHeaders=host&X-Amz-Expires=86399&X-Amz-Credential=minio%2F20210922%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Signature=97344e6d73c77e073e27dfe5942eb9744cfac56c4d96a6f6062b84e83691182e
- File source ID : dbbe4c019ef5476597b417574ffcdb6a
2. File upload
PUT https://minio-osdu-minio.odi-osdu-og-fa7661852f2ab29a6be32f560b2f5573-0000.us-south.containers.appdomain.cloud/oc-cpd-opendes-staging-bucket/dbbe4c019ef5476597b417574ffcdb6a?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Date=20210922T121558Z&X-Amz-SignedHeaders=host&X-Amz-Expires=86399&X-Amz-Credential=minio%2F20210922%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Signature=97344e6d73c77e073e27dfe5942eb9744cfac56c4d96a6f6062b84e83691182e
- Given headers :
Content-Type: application/xml
- Given Binary file stream
Upload file response code 200
3. Load metadata
POST https://osdu-cpd-osdu.odi-osdu-og-fa7661852f2ab29a6be32f560b2f5573-0000.us-south.containers.appdomain.cloud/osdu-file/api/file/v2/files/metadata
- Given Authorization with Access/id token
- Given data-partition-id opendes
- Given Content-Type application/json
- Given x-ms-blob-type BlockBlob
Given body :
```json
{
"data" : {
"TotalSize" : 5299.0,
"Source" : "TNO Data Source",
"Name" : "Trajectory DC",
"Endian" : "BIG",
"Description" : "Trajectory WITSML dataset",
"DatasetProperties" : {
"FileSourceInfo" : {
"FileSource" : "dbbe4c019ef5476597b417574ffcdb6a",
"Name" : "trajectory_DC.xml",
"PreloadFilePath" : "s3://oc-cpd-opendes-staging-bucket/dbbe4c019ef5476597b417574ffcdb6a",
"PreloadFileCreateUser" : "preshipteama",
"PreloadFileModifyDate" : "2021-09-22 02:15:59",
"PreloadFileModifyUser" : "preshipteama"
}
}
},
"kind" : "opendes:wks:dataset--File.Generic:1.0.0",
"acl" : {
"viewers" : [ "data.default.viewers@opendes.ibm.com" ],
"owners" : [ "data.default.owners@opendes.ibm.com" ]
},
"legal" : {
"otherRelevantDataCountries" : [ "US" ],
"status" : "compliant",
"legaltags" : [ "opendes-Test-Legal-Tag-7292798" ]
},
"createUser" : "preshipteama",
"createTime" : "2021-09-22 02:15:59",
"modifyUser" : "preshipteama",
"modifyTime" : "2021-09-22 02:15:59"
}
```
Response obtained with dataset ID opendes:dataset--File.Generic:3f2d8cb1-dfec-48e6-ba90-986b5a89a0ed
4. Trigger DAG Workflow
POST https://osdu-cpd-osdu.odi-osdu-og-fa7661852f2ab29a6be32f560b2f5573-0000.us-south.containers.appdomain.cloud/osdu-workflow/api/workflow/v1/workflow/Energistics_xml_ingest/workflowRun
Given Authorization with Access/id token
Given data-partition-id opendes
Given Content-Type application/json
Given body :
```json
{
"executionContext" : {
"Payload" : {
"AppKey" : "test-app",
"data-partition-id" : "opendes"
},
"Context" : {
"acl" : {
"viewers" : [ "data.default.viewers@opendes.ibm.com" ],
"owners" : [ "data.default.owners@opendes.ibm.com" ]
},
"legal" : {
"otherRelevantDataCountries" : [ "US" ],
"status" : "compliant",
"legaltags" : [ "opendes-Test-Legal-Tag-7292798" ]
},
"kind" : "opendes:wks:dataset--File.Generic:1.0.0",
"version" : 1.0,
"dataset_id" : "opendes:dataset--File.Generic:3f2d8cb1-dfec-48e6-ba90-986b5a89a0ed",
"file_name" : "trajectory_DC.xml",
"preload_file_path" : "s3://oc-cpd-opendes-staging-bucket/dbbe4c019ef5476597b417574ffcdb6a"
}
}
}
```
Response with run ID : 0174ebc2-1c65-47b4-9c06-7ac5a8823925
Error logs :
```
[2021-09-22 12:16:49,662] {logging_mixin.py:112} INFO - [2021-09-22 12:16:49,662] {pod_launcher.py:142} INFO - Event: witsml-parser-task-25a574f5 had an event of type Running
[2021-09-22 12:16:59,071] {logging_mixin.py:112} INFO - [2021-09-22 12:16:59,071] {pod_launcher.py:125} INFO - b'IBMBlobStorageFactory().get_s3_client\n'
[2021-09-22 12:16:59,071] {logging_mixin.py:112} INFO - [2021-09-22 12:16:59,071] {pod_launcher.py:125} INFO - b'IBMBlobStorageFactory().s3_client: <botocore.client.S3 object at 0x7fd4d46e21d0>\n'
[2021-09-22 12:16:59,072] {logging_mixin.py:112} INFO - [2021-09-22 12:16:59,072] {pod_launcher.py:125} INFO - b'Traceback (most recent call last):\n'
[2021-09-22 12:16:59,075] {logging_mixin.py:112} INFO - [2021-09-22 12:16:59,072] {pod_launcher.py:125} INFO - b' File "main.py", line 45, in <module>\n'
[2021-09-22 12:16:59,075] {logging_mixin.py:112} INFO - [2021-09-22 12:16:59,075] {pod_launcher.py:125} INFO - b' manifest = json.dumps(main(json.loads(args.context), args.file_service))\n'
[2021-09-22 12:16:59,076] {logging_mixin.py:112} INFO - [2021-09-22 12:16:59,076] {pod_launcher.py:125} INFO - b' File "main.py", line 35, in main\n'
[2021-09-22 12:16:59,076] {logging_mixin.py:112} INFO - [2021-09-22 12:16:59,076] {pod_launcher.py:125} INFO - b' manifest = manifest_creator.create_manifest(execution_context)\n'
[2021-09-22 12:16:59,076] {logging_mixin.py:112} INFO - [2021-09-22 12:16:59,076] {pod_launcher.py:125} INFO - b' File "/home/witsml_parser/create_energistics_manifest.py", line 234, in create_manifest\n'
[2021-09-22 12:16:59,077] {logging_mixin.py:112} INFO - [2021-09-22 12:16:59,076] {pod_launcher.py:125} INFO - b' dataset_dms_client = DatasetDmsClient(DefaultConfigManager(), self.payload_context.data_partition_id)\n'
[2021-09-22 12:16:59,077] {logging_mixin.py:112} INFO - [2021-09-22 12:16:59,077] {pod_launcher.py:125} INFO - b'TypeError: __init__() takes from 1 to 2 positional arguments but 3 were given\n'
[2021-09-22 12:17:02,050] {logging_mixin.py:112} INFO - [2021-09-22 12:17:02,049] {pod_launcher.py:217} INFO - Running command... cat /airflow/xcom/return.json
[2021-09-22 12:17:02,383] {logging_mixin.py:112} INFO - [2021-09-22 12:17:02,382] {pod_launcher.py:224} INFO - cat: can't open '/airflow/xcom/return.json': No such file or directory
```
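The `TypeError` points at a signature mismatch between the parser code and the installed `osdu-api` client library (likely version skew between the `witsml_parser` image and its dependency; this is an inference from the traceback). The failure mode can be reproduced with a stand-in stub — the `DatasetDmsClient` below is hypothetical, not the real class:

```python
class DatasetDmsClient:
    """Stand-in stub; the real client lives in the osdu-api package."""
    def __init__(self, config_manager=None):
        # Accepts 1-2 positional arguments (including self), like the
        # installed library version apparently does.
        self.config_manager = config_manager

# create_energistics_manifest.py passes a third positional argument
# (the data partition id), which the installed __init__ does not accept.
try:
    DatasetDmsClient(object(), "opendes")
    msg = None
except TypeError as exc:
    msg = str(exc)

print(msg)
```

Pinning the parser image and the `osdu-api` package to matching versions (or passing the partition id the way the installed signature expects) would be the usual resolution.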
More details here :
http://airflow-web-odi-airflow-ns.odi-osdu-og-fa7661852f2ab29a6be32f560b2f5573-0000.us-south.containers.appdomain.cloud/log?task_id=witsml_parser_task&dag_id=Energistics_xml_ingest&execution_date=2021-09-22T12%3A16%3A06%2B00%3A00Shrikant GargGokul NagareShrikant Garghttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/61Storage service (R3M8) not finding record with Ids returned from Search Query2021-09-29T16:39:33ZChad LeongStorage service (R3M8) not finding record with Ids returned from Search Query## Description
The Storage service in M8 is not finding records with the IDs returned from a Search query.
## Steps to reproduce
**1. Make a query through the search services for Welllog**
`POST https://osdu-ship.msft-osdu-test.org/api/search/v2/query`
```json
{
"kind": "osdu:wks:work-product-component--Welllog:1.0.0",
"query": "*",
"aggregateBy": "kind",
"returnedFields": [
"id"
]
}
```
Results
```json
{
"results": [
{
"id": "opendes:work-product-component--WellLog:a6b516e24d2342d5903e52d2d9bc816e"
},
{
"id": "opendes:work-product-component--WellLog:731bb5687e5f46ad992c86c5d916e7b9"
},
...
}
```
**2. Copy any of these id and put it to storage record.**
`GET https://osdu-ship.msft-osdu-test.org/api/storage/v2/records/opendes:work-product-component--WellLog:731bb5687e5f46ad992c86c5d916e7b9`
Results
```json
{
"code": 404,
"reason": "Record not found",
"message": "The record 'opendes:work-product-component--WellLog:731bb5687e5f46ad992c86c5d916e7b9' was not found"
}
```Krishna Nikhil VedurumudiChad LeongKrishna Nikhil Vedurumudihttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/63WITSML Parser (Well) - "code":403; "reason": "Access denied" AWS2021-09-23T15:22:31ZEsmira RafigayevaWITSML Parser (Well) - "code":403; "reason": "Access denied" AWS![image](/uploads/bb3de7c1157d9cdb656243f6abdb391c/image.png)![image](/uploads/bb3de7c1157d9cdb656243f6abdb391c/image.png)Pre-Shipping R3-M8GregGreghttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/64IBM Airflow job hangs2021-09-23T19:10:16ZYanbin ZhangIBM Airflow job hangsJust submitted a job which hangs
{"runId": "4a307ab1-aeb4-47e9-ab54-c85a9a35362c", "startTimeStamp": 1632406417846, "status": "running", "submittedBy": "preshipteamd@osdu.opengroup.org"}https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/65IBM M8 csv parser dag - Airflow in continual "running" state for custom schem...2021-09-24T06:57:43ZSteven EvansIBM M8 csv parser dag - Airflow in continual "running" state for custom schema collectionAirflow CSV parser DAG for the M8 custom schema using the platform validation collection and environment JSONs has been stuck in “running” mode for over 4 hours. When viewing the logs, no logs appear or are available.![Custom_schema_Dag__-_no_log_display](/uploads/f35b098cd399154dcdc02d889392a4a6/Custom_schema_Dag__-_no_log_display.PNG)
![Custom_Schema_Dag_run_issue_1](/uploads/0fd4a116d50316aa3c7649b71a805af6/Custom_Schema_Dag_run_issue_1.PNG)
All Postman collection steps were successful including Trigger Workflow.Steven EvansSteven Evanshttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/66IBM missing standard reference data2021-09-28T22:43:02ZYanbin ZhangIBM missing standard reference dataI tried to find all the schemas available in the IBM preshipping environment. Under authority='opendes' and source='wks', there are only 59 different schemas preloaded. We expect all reference data to be preloaded in the preshipping environment.
@anujguptahttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/70IBM master data injection shows success without passing validation2021-09-27T12:34:18ZYanbin ZhangIBM master data injection shows success without passing validationIngest the following master data:[3.load_Well.json](/uploads/dc357b7638da45ad6caa5846715a5668/3.load_Well.json)
In task provide_manifest_integrity_task, I see the following warning:
```
[2021-09-23 20:40:11,579] {logging_mixin.py:112} INFO - Running %s on host %s <TaskInstance: Osdu_ingest.provide_manifest_integrity_task 2021-09-23T20:38:08+00:00 [running]> airflow-worker-0.airflow-worker.odi-airflow-ns.svc.cluster.local
[2021-09-23 20:40:50,247] {logging_mixin.py:112} INFO - [2021-09-23 20:40:50,246] {validate_referential_integrity.py:210} WARNING - Resource with kind opendes:wks:master-data--Well:1.0.0 and id: 'opendes:master-data--Well:fcd27e71-28a5-4cb9-b8ad-4bad397b3613' was rejected. Missing ids '{'opendes:master-data--GeoPoliticalEntity:Limburg:', 'opendes:master-data--GeoPoliticalEntity:Netherlands:', 'opendes:master-data--GeoPoliticalEntity:L:'}'
[2021-09-23 20:40:50,367] {taskinstance.py:1065} INFO - Marking task as SUCCESS.dag_id=Osdu_ingest, task_id=provide_manifest_integrity_task, execution_date=20210923T203808, start_date=20210923T204011, end_date=20210923T204050
[2021-09-23 20:40:51,537] {logging_mixin.py:112} INFO - [2021-09-23 20:40:51,536] {local_task_job.py:103} INFO - Task exited with return code 0
```
The master data is not being ingested because some reference ids are missing. But the entire workflow ("runId": "331b1678-5168-40ee-bfd1-8dccef2eabd1") finished successfully.https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/71AWS/WITSML Parser: not getting Processed ids and master data in Log view (Air...2023-02-26T13:48:25ZEsmira RafigayevaAWS/WITSML Parser: not getting Processed ids and master data in Log view (Airflow) when Triggering workflowTriggering workflow on WITSML Parser/AWS – Create Dataset Registry – Well: Airflow shows success and Postman has no issue. Id number is identical in both locations:
![image](/uploads/15e554fb42cae3344f674bb2c90e16c6/image.png)
However, I don’t have Processed ids and master data in my Log view:
![image](/uploads/f73894eb24cddc910b2160542ccd93cc/image.png)
"runId": "9a7cf175-a1de-4da0-9bb1-c10fe55f611a"M10 - Release 0.13Gregetienne peyssonGreghttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/73AWS R3M8 - standard ACLs are not available for Preship user2021-09-27T22:25:42ZDebasis ChatterjeeAWS R3M8 - standard ACLs are not available for Preship userPlease see this excerpt from Manifest Ingestion Smoke test sample.
```
"acl": {
"owners": [
"data.default.owners@{{data_partition_id}}.testing.com"
],
"viewers": [
"data.default.viewers@{{data_partition_id}}.testing.com"
]
},
```
When I check "Get group for members", these are not present. See enclosed.
[AWS-ACLs.txt](/uploads/827a727407992ee0ff5c4dd3e66a18ff/AWS-ACLs.txt)
cc - @esmira.rafigayeva , @sje7253bp , @Wibben for informationhttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/74AWS R3M8 - Reference data population2021-12-03T01:30:27ZDebasis ChatterjeeAWS R3M8 - Reference data population@Wibben - Please see enclosed file. It seems that standard set of values have not been populated for Preship environment.
[AWS-Reference-data-populated.txt](/uploads/01747e5e3aa037b7bf1f6be3e0e7c5c5/AWS-Reference-data-populated.txt)
Please check this location for source JSON files.
https://community.opengroup.org/osdu/data/data-definitions/-/tree/master/ReferenceValues/Manifests/reference-data
cc - @esmira.rafigayevahttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/75IBM R3M8 - Reference data population2021-09-28T16:44:20ZDebasis ChatterjeeIBM R3M8 - Reference data population@shamazum - Please check current list. CRS is not there and UNIT is incomplete, I think.
[IBM-Reference-data-populated.txt](/uploads/29da32a5bc074bf5d4ebe4b4182b2254/IBM-Reference-data-populated.txt)
Source is here -
https://community.opengroup.org/osdu/data/data-definitions/-/tree/master/ReferenceValues/Manifests/reference-data
Thank you
cc - @esmira.rafigayeva and @sehuboyhttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/76Azure R3M8 - Smoke Test collection2021-12-03T01:32:17ZDebasis ChatterjeeAzure R3M8 - Smoke Test collection@vivekojha - Please note that I could not find the Smoke test collection in this folder.
https://community.opengroup.org/osdu/platform/pre-shipping/-/tree/main/R3-M8/Azure/M8%20collections
Can you please check?
Thank you
cc - @esmira.rafigayeva and @sehuboyM9 - Release 0.12Krishnan GanesanKrishnan Ganesanhttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/77IBM R3M8 - Wellbore DMS - issue with bulk access API2021-09-28T10:00:39ZMariia ZverIBM R3M8 - Wellbore DMS - issue with bulk access APISomething happened on the step of getting wellbore ddms data while testing Wellbore DDMS (IBM). What might be a problem?
![Снимок_экрана_2021-09-25_в_23.35.39](/uploads/e7b5efb47d3b29beade74ddf016a4866/Снимок_экрана_2021-09-25_в_23.35.39.png)Anuj GuptaAnuj Guptahttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/78Azure R3M8 - Search collection (no-37) from Platform Validation2023-09-28T13:06:07ZDebasis ChatterjeeAzure R3M8 - Search collection (no-37) from Platform ValidationWhen I tried this from Platform validation, initially I found that schema version is set to 0.2.0 (should be 1.0.0) and schema source (wks) was not set.
https://community.opengroup.org/osdu/platform/testing/-/tree/master/Postman%20Collection
So, if we run this we get failure.
![Azure-Preship-Search](/uploads/caf63653d01d2253de91eeae7b8385e6/Azure-Preship-Search.PNG)
cc - @vivekojha , @manishk and @esmira.rafigayevaM9 - Release 0.12Vibhuti Sharma [Microsoft]Vibhuti Sharma [Microsoft]https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/79Azure R3M8 - Requst to populate standard Reference values from Data Definitio...2021-12-05T00:42:04ZDebasis ChatterjeeAzure R3M8 - Requst to populate standard Reference values from Data Definition Team's sourcePlease load up standard reference values.
See this place for good resource.
https://community.opengroup.org/osdu/data/data-definitions/-/tree/master/ReferenceValues/Manifests/reference-data
cc - @vivekojha , @manishk , @sehuboy and @esmira.rafigayevaKrishna Nikhil VedurumudiKrishna Nikhil Vedurumudihttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/80Azure R3M8 - demo data (TNO, Volve) population - Master and Work-product comp...2021-12-04T13:33:49ZDebasis ChatterjeeAzure R3M8 - demo data (TNO, Volve) population - Master and Work-product componentCan you please check and advise if this step was completed for Preship environment?
Thank you
cc - @sehuboy , @esmira.rafigayeva , @vivekojha and @manishkM9 - Release 0.12harshit aggarwalKrishna Nikhil Vedurumudiharshit aggarwalhttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/81Azure, R3M8, Wellbore DMS, Environment variables, worked collection2021-10-25T20:43:43ZDebasis ChatterjeeAzure, R3M8, Wellbore DMS, Environment variables, worked collectionI am using the collection provided by Azure team here.
https://community.opengroup.org/osdu/platform/pre-shipping/-/tree/main/R3-M8/Azure/M8%20collections
What we requested from each CSP is a worked example/collection, so that we can simply use the environment JSON file (provided by the CSP) and execute each request successfully **without having to debug, troubleshoot, or change anything anywhere.**
That way a Preship tester can learn from a simple “click, click, click”, build their own variation test cases, and perform true business UAT.
Even after I adjusted the environment variable (per a suggestion from @chad ), I got failures for the first 3 requests – About, Version, “Create Well”. Did anyone from the Azure team actually execute these steps in the Preship environment?
WELLBORE_DDMS_HOST = osdu-ship.msft-osdu-test.org/api/os-wellbore-ddms/ddms/v3
About or version
GET https://{{WELLBORE_DDMS_HOST}}/about
```
{
"detail": "Not Found"
}
```
Create Well
https://{{WELLBORE_DDMS_HOST}}/wells
Response
```
{
"detail": [
{
"loc": [
"body",
0,
"id"
],
"msg": "string does not match regex \"^[\\w\\-\\.]+:master-data\\-\\-Well:[\\w\\-\\.\\:\\%]+$\"",
"type": "value_error.str.regex",
"ctx": {
"pattern": "^[\\w\\-\\.]+:master-data\\-\\-Well:[\\w\\-\\.\\:\\%]+$"
}
}
]
}
```
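Before re-running the collection, the ids in the request body can be checked locally against the exact regex returned in the 422 response; a small sketch (the sample ids below are made up):

```python
import re

# The pattern quoted verbatim in the Wellbore DDMS error response:
WELL_ID = re.compile(r"^[\w\-\.]+:master-data\-\-Well:[\w\-\.\:\%]+$")

# A well id must be <partition>:master-data--Well:<code>; anything else
# (e.g. an old "wks:well" kind string) is rejected by the endpoint.
print(bool(WELL_ID.match("opendes:master-data--Well:1234")))   # True
print(bool(WELL_ID.match("opendes:wks:well:2.0.0")))           # False
```

This makes it easy to spot whether the collection's body still uses a pre-v3 id format.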
Thanks for your help
cc - @manishk , @vivekojha , @esmira.rafigayeva , @sehuboy , @gmarblestone , @s0rhe1mSumra ZafarSumra Zafarhttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/82IBM - R3M8 - Wellbore DMS - Schema version numbers2021-12-16T12:38:29ZDebasis ChatterjeeIBM - R3M8 - Wellbore DMS - Schema version numbers@shamazum , @shrikgar and @paromitr
A.
Please check supported Schema version numbers here.
Also partition/source information (mixup between opendes and osdu?)
https://community.opengroup.org/osdu/data/data-definitions/-/tree/master/E-R
With that reference please see records created by Wellbore DMS (using your worked collection) -
[IBM_R3M8-Wellbore-DDMS-records.txt](/uploads/8231145a49497a8e4d34015b35a2c7c6/IBM_R3M8-Wellbore-DDMS-records.txt)
I could not see "logset" as part of your worked collection.
"kind": "opendes:wks:well:2.0.0", (official version is 1.0.0)
"kind": "osdu:wks:master-data--Wellbore:1.0.0", (matches supported schema)
"kind": "opendes:wks:welllogs:2.0.0", (official versions are 1.0.0 and 1.1.0)
B.
Noted that Wellbore does not refer to the Master (Well).
C.
Do you know if integrity validation is performed when creating these records?
Or are all checks bypassed, since the Storage service (PUT) is used to create the records, and we know that it is very forgiving?
D.
The WellLog record does not show a Curves array, so we are unsure how to tie up the subsequent request for writing log data.
Your worked example shows the "Split" approach as per the Swagger documentation, with a depth index and 2 curves. Right?
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/wellbore/wellbore-domain-services/-/blob/master/spec/generated/openapi.json
```
{
"columns": [
"Ref",
"col_1",
"col_2"
],
"index": [
0,
1,
2,
3,
4
],
"data": [
[
0,
1111.1,
2222.1
],
[
0.5,
1111.2,
2222.2
],
[
1,
1111.3,
2222.3
],
[
1.5,
1111.4,
2222.4
],
[
2,
1111.5,
2222.5
]
]
}
```
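The bulk-data body above uses a columns/index/data layout (the same shape as pandas' `orient="split"`). A small local reassembly is a quick sanity check that each data row lines up with the declared columns before writing log data; `rows_from_split` is a hypothetical helper, not part of the Wellbore DMS API:

```python
def rows_from_split(payload: dict) -> list:
    """Rebuild row dicts from a split-orientation bulk-data payload."""
    cols = payload["columns"]
    # Every data row must have exactly one value per declared column.
    assert all(len(row) == len(cols) for row in payload["data"]), "ragged rows"
    return [dict(zip(cols, row)) for row in payload["data"]]

bulk = {
    "columns": ["Ref", "col_1", "col_2"],
    "index": [0, 1, 2, 3, 4],
    "data": [[0, 1111.1, 2222.1], [0.5, 1111.2, 2222.2], [1, 1111.3, 2222.3],
             [1.5, 1111.4, 2222.4], [2, 1111.5, 2222.5]],
}
print(rows_from_split(bulk)[0])   # {'Ref': 0, 'col_1': 1111.1, 'col_2': 2222.1}
```

Here `Ref` plays the role of the depth index and `col_1`/`col_2` are the two curves.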
cc - @Mariiazver , @esmira.rafigayeva , @s0rhe1m , @chad , @gmarblestone , @valeraginakhttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/83DAG tasks in queue state - Witsml parser2021-09-28T12:12:30Zetienne peyssonDAG tasks in queue state - Witsml parserI'm having an issue since yesterday while triggering the Witsml parser ingestion workflow.
None of my tasks are getting scheduled.
![Screenshot_from_2021-09-28_08-18-09](/uploads/a32c6a4b40c36d63d009cc0abcc51527/Screenshot_from_2021-09-28_08-18-09.png)
![Screenshot_from_2021-09-28_08-18-01](/uploads/74b25c33124926f4baf2a7b60e418799/Screenshot_from_2021-09-28_08-18-01.png)
Given body :
```json
{
"executionContext" : {
"Payload" : {
"AppKey" : "test-app",
"data-partition-id" : "opendes"
},
"Context" : {
"acl" : {
"viewers" : [ "data.default.viewers@opendes.ibm.com" ],
"owners" : [ "data.default.owners@opendes.ibm.com" ]
},
"legal" : {
"otherRelevantDataCountries" : [ "US" ],
"status" : "compliant",
"legaltags" : [ "opendes-Test-Legal-Tag-7292798" ]
},
"kind" : "opendes:wks:dataset--File.Generic:1.0.0",
"version" : 1.0,
"file_name" : "1.xml",
"preload_file_path" : "s3://oc-cpd-opendes-staging-bucket/77142592e5c5491795bc7e7b4d41914a"
}
}
}
```
Response with run ID : e45edb27-0b9f-4a4f-b163-a07b9f5f506chttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/84Azure/Manifest Ingestion: MarkerPropertyType, SurveyToolType, TrajectoryStati...2021-12-09T14:35:55ZEsmira RafigayevaAzure/Manifest Ingestion: MarkerPropertyType, SurveyToolType, TrajectoryStationPropertyTypeUnable to ingest
MarkerPropertyType, SurveyToolType, TrajectoryStationPropertyType
https://osdu-ship.msft-osdu-test.org/airflow/graph?dag_id=Osdu_ingest&run_id=f4ebd33f-fcbb-4a92-864d-c6a79fc11998&execution_date=2021-09-28+06%3A45%3A29.777460%2B00%3A00
The specific issue has not been identified; we need some assistance to pinpoint the problem.
@s0rhe1mharshit aggarwalharshit aggarwalhttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/85Airflow regex issue - likely not CSP related (?)2021-10-01T12:15:53ZEsmira RafigayevaAirflow regex issue - likely not CSP related (?)
Encountered issue through manifest ingestion inside airflow DAG:
"UnitQuantityID": "opendes:reference-data--UnitQuantity:1:", The error here seems to be regex related, as airflow does the search for: "opendes:reference-data--UnitQuantity"
It seems that the number 1 at the end is identified as a version and removed. Who can fix this?
This issue is CSP-independent and is therefore a bug across all CSPs.
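A safer version-stripping rule would only drop the segment after the final colon when it is a (possibly empty) numeric version, so a purely numeric *code* such as `1` survives. This is a sketch of the intended behaviour, not the DAG's actual regex:

```python
import re

# An OSDU record id is "<partition>:<type>:<code>" with an optional
# trailing ":<version>"; a bare trailing colon means "no version yet".
ID_RE = re.compile(r"^(?P<id>[^:]+:[\w\-\.]+:[\w\-\.\%]+):(?P<version>\d*)$")

def strip_version(record_id: str) -> str:
    """Remove only the trailing version segment, keeping a numeric code."""
    m = ID_RE.match(record_id)
    return m.group("id") if m else record_id

print(strip_version("opendes:reference-data--UnitQuantity:1:"))
# -> 'opendes:reference-data--UnitQuantity:1'  (code kept, empty version dropped)
```

Splitting naively on `:` (or greedily stripping digits from the end) eats the `1` as well, which matches the search query seen in the DAG log.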
@s0rhe1mivar SoerheimMANISH KUMARVivek Ojhaivar Soerheimhttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/86[GCP Airflow] No Error Logging Recorded for Missing Reference Record During M...2021-09-28T12:48:50ZNaufal Mohamed Noori[GCP Airflow] No Error Logging Recorded for Missing Reference Record During Manifest IngestionManifest ingestion does not provide any ERROR logging in the Airflow log if the record JSON manifest contains a non-existent reference/master-data parameter.
Steps to reproduce:
a) Using DAG manifest ingestion, load a master-data wellbore record. Note that I inserted a data.WellID which does not exist in the current database: **"WellID": "{{data-partition-id}}:master-data--Well:TEST_ERROR:"**
BODY:
`{
"executionContext": {
"Payload": {
"AppKey": "test-app",
"data-partition-id": "{{data-partition-id}}"
},
"manifest": {
"kind": "{{data-partition-id}}:wks:Manifest:1.0.0",
"MasterData": [
{
"id": "{{data-partition-id}}:master-data--Wellbore:Test_NN_2021_09_24_01",
"kind": "{{data-partition-id}}:wks:master-data--Wellbore:1.0.0",
"acl": {
"owners": [
"data.default.owners@{{data-partition-id}}.osdu-gcp.go3-nrg.projects.epam.com"
],
"viewers": [
"data.default.viewers@{{data-partition-id}}.osdu-gcp.go3-nrg.projects.epam.com"
]
},
"legal": {
"legaltags": [
"{{data-partition-id}}-demo-legaltag"
],
"otherRelevantDataCountries": [
"US"
]
},
"data": {
"WellID": "{{data-partition-id}}:master-data--Well:TEST_ERROR:",
"FacilityName": "TEST_NN_1_ALIAS",
"SequenceNumber": 1,
"Source": "TEST_NN_1_ALIAS_SOURCE",
"NameAliases": [
{
"AliasName": "TEST_NN_1_ALIAS"
}
]
}
}
]
}
}
}
`
b) Run DAG Manifest POST: https://{{WORKFLOW_HOST}}/workflow/Osdu_ingest/workflowRun:
_{
"workflowId": "ef82cba0-0e45-4df3-91bf-4df1553102d3",
"runId": "5a786c6f-103e-44d3-b192-d34e3026b722",
"startTimeStamp": 1632812342734,
"status": "submitted",
"submittedBy": "preshipping_test_user@osdu-gcp.go3-nrg.projects.epam.com"
}_
c) Observe the Airflow log. At no stage does the log indicate an ERROR, even though the DAG run fails at the end and no new record is stored. I found a trace of DEBUG logging inside Airflow which indicates some kind of record checking, but no ERROR logging was observed:
_[2021-09-28 06:59:54,321] {search_record_ids.py:78} DEBUG - Search query "odesprod:master-data--Well:TEST_ERROR"
[2021-09-28 06:59:54,365] {connectionpool.py:939} DEBUG - Starting new HTTPS connection (1): preship-asm.osdu-gcp.go3-nrg.projects.epam.com:443
[2021-09-28 06:59:56,781] {connectionpool.py:433} DEBUG - https://preship-asm.osdu-gcp.go3-nrg.projects.epam.com:443 "POST /api/search/v2/query HTTP/1.1" 200 None
[2021-09-28 06:59:56,785] {search_record_ids.py:183} DEBUG - {"results":[],"aggregations":[],"totalCount":0}
[2021-09-28 06:59:56,785] {search_record_ids.py:188} DEBUG - Got total count 0
[2021-09-28 06:59:56,786] {search_record_ids.py:169} DEBUG - response ids: []_
EXPECTATION:
If the record is not stored because a referenced record cannot be found in the database, we should observe ERROR-level logging in Airflow.
**(TESTED ON R3M8 Preship GCP environment on 27 September 2021)**
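The expected behaviour could look like the following sketch (hypothetical code, not the actual Manifest Ingestion task): missing referenced ids are logged at ERROR level, and raising instead of returning would also fail the Airflow task rather than letting it be marked SUCCESS.

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("manifest_integrity")

def check_references(manifest_ids: set, found_ids: set) -> set:
    """Return the referenced ids Search could not find, logging at ERROR."""
    missing = manifest_ids - found_ids
    if missing:
        # ERROR (not DEBUG) so the failure is visible in the Airflow log;
        # raising an exception here would additionally fail the task.
        log.error("Record rejected: missing referenced records %s", sorted(missing))
    return missing

# Simulates the reported run: one referenced Well id, Search totalCount 0.
missing = check_references({"odesprod:master-data--Well:TEST_ERROR"}, set())
print(sorted(missing))
```

In the reported run the equivalent information only surfaced as DEBUG lines from `search_record_ids.py`.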
cc @esmira.rafigayeva @debasisc @aliaksandr_ramanovichAleksandr Spivakov (EPAM)Aleksandr Spivakov (EPAM)https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/89IBM - R3M8 - Cannot use collection from Platform validation site and run "as is"2021-12-16T12:39:18ZDebasis ChatterjeeIBM - R3M8 - Cannot use collection from Platform validation site and run "as is"Reported by Ananth (LTI) as he was testing in IBM R3M8 Preship environment.
General expectation is that we can use **Environment JSON file** as provided by IBM team,
(https://community.opengroup.org/osdu/platform/pre-shipping/-/tree/main/R3-M8/IBM)
perform authentication, then get any of the collections from here and run smoothly.
https://community.opengroup.org/osdu/platform/testing/-/tree/master/Postman%20Collection
Ananth tried with Entitlements. (Collection - 14)
This fails.
Ananth did the necessary troubleshooting and figured out that we need to tweak a variable value to make it work: the collection uses old environment variables/values.
ENTITLEMENTS_HOST = https://{{osdu-cpd}}/osdu-entitlements/api/entitlements/v1
Request#26 shows endpoint as GET https://{{ENTITLEMENTS_HOST}}/groups
He looked at
ENTITLEMENTS_V2_HOST https://{{osdu-cpd}}/osdu-entitlements-v2/api/entitlements/v2
He made a suitable change to the value of ENTITLEMENTS_HOST to make it work.
cc - @jingdongsun , @anujgupta , @shamazum , @todaiks for informationhttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/91Missing records on Storage API after successful DAG run - Witsml Parser - Tub...2021-10-19T19:30:25Zetienne peyssonMissing records on Storage API after successful DAG run - Witsml Parser - Tubular & Well logSee report here :
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/29#note_74961
and here :
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/29#note_74970
After a successful workflow, the Airflow logs show record IDs for tubular components:
'Components': ['opendes:work-product-component--TubularComponent:e779f566-2483-48af-b338-ca5b55406c1f:', 'opendes:work-product-component--TubularComponent:b1490ef6-479a-400a-bc82-93e5923a1e08:', 'opendes:work-product-component--TubularComponent:10761ddc-a772-49c7-b2bb-57140fdaf1ea:', 'opendes:work-product-component--TubularComponent:735c62a2-e193-4447-bd7a-9b47abb85575:', 'opendes:work-product-component--TubularComponent:c9966b89-d811-4406-a6f4-995745a79934:', 'opendes:work-product-component--TubularComponent:9542d2b7-8001-4928-9849-be02de35b470:', 'opendes:work-product-component--TubularComponent:b388e356-df0a-4e12-b938-a5b9a76af4ab:', 'opendes:work-product-component--TubularComponent:a2f47889-1e7f-46b0-bed5-bf673c8fbc65:', 'opendes:work-product-component--TubularComponent:319879b6-5a64-4225-8c3c-bcc23e4424a0:', 'opendes:work-product-component--TubularComponent:3b9d6ee1-5621-4c12-b982-5061f8ca55a2:']
But also for well logs:
'Components': ['opendes:work-product-component--WellLog:20C60DDC-D36D-4A3C-800F-504CE0B5605D:']
However, I'm not able to find them via the Storage API:
e.g. {{STORAGE_HOST}}/records/opendes:work-product-component--TubularComponent:e779f566-2483-48af-b338-ca5b55406c1f
```json
{
"code": 404,
"reason": "Record not found",
"message": "The record 'opendes:work-product-component--TubularComponent:e779f566-2483-48af-b338-ca5b55406c1f' was not found"
}
```
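One thing worth ruling out when reproducing: the component IDs logged by the DAG carry a trailing colon (a versionless reference), while the Storage GET above uses the bare record ID. A tiny helper of my own (not OSDU tooling) for deriving the ID to query:

```python
def storage_record_id(component_ref: str) -> str:
    """Strip the trailing colon of a versionless manifest reference
    (e.g. 'opendes:...--TubularComponent:e779f566-...:') to get the
    record id accepted by {{STORAGE_HOST}}/records/<id>."""
    return component_ref[:-1] if component_ref.endswith(":") else component_ref
```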
The issue might not be CSP-specific.M9 - Release 0.12Chris ZhangChris Zhanghttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/92AWS Segy to ZGY workflow not working2021-09-28T22:39:44ZMichaelAWS Segy to ZGY workflow not workingWhile attempting to convert a SEG-Y dataset to a ZGY dataset in the Seismic DDMS using the SEGY_TO_ZGY DAG, the task failed. Here are the workflow task logs.
```
*** Reading remote log from s3://prer3m8-ingest-logbucket-59uj-s3airflowbucketprod-udbcrvficay5/logs/SEGY_TO_ZGY/segy-to-zgy/2021-09-28T13:45:15.858388+00:00/1.log.
[2021-09-28 13:45:21,816] {taskinstance.py:670} INFO - Dependencies all met for <TaskInstance: SEGY_TO_ZGY.segy-to-zgy 2021-09-28T13:45:15.858388+00:00 [queued]>
[2021-09-28 13:45:21,840] {taskinstance.py:670} INFO - Dependencies all met for <TaskInstance: SEGY_TO_ZGY.segy-to-zgy 2021-09-28T13:45:15.858388+00:00 [queued]>
[2021-09-28 13:45:21,840] {taskinstance.py:880} INFO -
--------------------------------------------------------------------------------
[2021-09-28 13:45:21,840] {taskinstance.py:881} INFO - Starting attempt 1 of 1
[2021-09-28 13:45:21,840] {taskinstance.py:882} INFO -
--------------------------------------------------------------------------------
[2021-09-28 13:45:21,858] {taskinstance.py:901} INFO - Executing <Task(KubernetesPodOperator): segy-to-zgy> on 2021-09-28T13:45:15.858388+00:00
[2021-09-28 13:45:21,861] {standard_task_runner.py:54} INFO - Started process 470 to run task
[2021-09-28 13:45:21,892] {standard_task_runner.py:77} INFO - Running: ['airflow', 'run', 'SEGY_TO_ZGY', 'segy-to-zgy', '2021-09-28T13:45:15.858388+00:00', '--job_id', '5128', '--pool', 'default_pool', '--raw', '-sd', 'DAGS_FOLDER/openzgy/segy_to_zgy_ingestion_dag.py', '--cfg_path', '/tmp/tmpxqzm08nt']
[2021-09-28 13:45:21,893] {standard_task_runner.py:78} INFO - Job 5128: Subtask segy-to-zgy
[2021-09-28 13:45:21,956] {logging_mixin.py:120} INFO - Running <TaskInstance: SEGY_TO_ZGY.segy-to-zgy 2021-09-28T13:45:15.858388+00:00 [running]> on host airflow-worker-0.airflow-worker.osdu-airflow.svc.cluster.local
[2021-09-28 13:45:22,028] {logging_mixin.py:120} WARNING - /home/airflow/.local/lib/python3.6/site-packages/airflow/kubernetes/pod_launcher.py:331: DeprecationWarning: Using `airflow.contrib.kubernetes.pod.Pod` is deprecated. Please use `k8s.V1Pod`.
security_context=_extract_security_context(pod.spec.security_context)
[2021-09-28 13:45:22,028] {logging_mixin.py:120} WARNING - /home/airflow/.local/lib/python3.6/site-packages/airflow/kubernetes/pod_launcher.py:77: DeprecationWarning: Using `airflow.contrib.kubernetes.pod.Pod` is deprecated. Please use `k8s.V1Pod` instead.
pod = self._mutate_pod_backcompat(pod)
[2021-09-28 13:45:22,078] {pod_launcher.py:173} INFO - Event: segy-to-zgy-a6019acd8c6641a587114d9efa5cfa6a had an event of type Pending
[2021-09-28 13:45:22,078] {pod_launcher.py:139} WARNING - Pod not yet started: segy-to-zgy-a6019acd8c6641a587114d9efa5cfa6a
[2021-09-28 13:45:23,087] {pod_launcher.py:173} INFO - Event: segy-to-zgy-a6019acd8c6641a587114d9efa5cfa6a had an event of type Pending
[2021-09-28 13:45:23,087] {pod_launcher.py:139} WARNING - Pod not yet started: segy-to-zgy-a6019acd8c6641a587114d9efa5cfa6a
[2021-09-28 13:45:24,096] {pod_launcher.py:173} INFO - Event: segy-to-zgy-a6019acd8c6641a587114d9efa5cfa6a had an event of type Failed
[2021-09-28 13:45:24,097] {pod_launcher.py:284} INFO - Event with job id segy-to-zgy-a6019acd8c6641a587114d9efa5cfa6a Failed
[2021-09-28 13:45:24,116] {pod_launcher.py:156} INFO - b'[0.003890] SEGYTOZGY_ZFP_LOD_COMPRESS=[]\n'
[2021-09-28 13:45:24,116] {pod_launcher.py:156} INFO - b'[0.003896] SEGYTOZGY_ZFP_LOD_SNR=[]\n'
[2021-09-28 13:45:24,117] {pod_launcher.py:156} INFO - b'[0.003903] set SEGYTOZGY_INSECURE_PRINT_TOKEN=1 to print token values\n'
[2021-09-28 13:45:24,117] {pod_launcher.py:156} INFO - b'[0.003912] END Environment variables\n'
[2021-09-28 13:45:24,117] {pod_launcher.py:156} INFO - b'[0.003921] Command line arguments: [/usr/local/bin/segy/SegyToZgy] [--osdu] [osdu:dataset--FileCollection.SEGY:e1d8444c4ae545c1b3446211be7995bb] [{{SeismicTraceDataBinGridWPId}}]\n'
[2021-09-28 13:45:24,117] {pod_launcher.py:156} INFO - b'[0.003933] Fetching work product [{{SeismicTraceDataBinGridWPId}}]\n'
[2021-09-28 13:45:24,117] {pod_launcher.py:156} INFO - b'[0.003949] About to get record [{{SeismicTraceDataBinGridWPId}}]\n'
[2021-09-28 13:45:24,117] {pod_launcher.py:156} INFO - b'[0.004017] Storage service URL: [https://preshiptesting.osdu.aws/api/storage/v2]\n'
[2021-09-28 13:45:24,117] {pod_launcher.py:156} INFO - b'[0.004026] Data partition ID : [osdu]\n'
[2021-09-28 13:45:24,117] {pod_launcher.py:156} INFO - b'Invalid format of object reference.\n'
[2021-09-28 13:45:24,132] {pod_launcher.py:173} INFO - Event: segy-to-zgy-a6019acd8c6641a587114d9efa5cfa6a had an event of type Failed
[2021-09-28 13:45:24,132] {pod_launcher.py:284} INFO - Event with job id segy-to-zgy-a6019acd8c6641a587114d9efa5cfa6a Failed
[2021-09-28 13:45:24,138] {pod_launcher.py:173} INFO - Event: segy-to-zgy-a6019acd8c6641a587114d9efa5cfa6a had an event of type Failed
[2021-09-28 13:45:24,138] {pod_launcher.py:284} INFO - Event with job id segy-to-zgy-a6019acd8c6641a587114d9efa5cfa6a Failed
[2021-09-28 13:45:24,172] {taskinstance.py:1150} ERROR - Pod Launching failed: Pod returned a failure: failed
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/contrib/operators/kubernetes_pod_operator.py", line 309, in execute
'Pod returned a failure: {state}'.format(state=final_state))
airflow.exceptions.AirflowException: Pod returned a failure: failed
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/models/taskinstance.py", line 979, in _run_raw_task
result = task_copy.execute(context=context)
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/contrib/operators/kubernetes_pod_operator.py", line 312, in execute
raise AirflowException('Pod Launching failed: {error}'.format(error=ex))
airflow.exceptions.AirflowException: Pod Launching failed: Pod returned a failure: failed
[2021-09-28 13:45:24,173] {taskinstance.py:1194} INFO - Marking task as FAILED. dag_id=SEGY_TO_ZGY, task_id=segy-to-zgy, execution_date=20210928T134515, start_date=20210928T134521, end_date=20210928T134524
[2021-09-28 13:45:26,726] {local_task_job.py:102} INFO - Task exited with return code 1
```
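The log shows the pod was launched with the literal argument `[{{SeismicTraceDataBinGridWPId}}]`, i.e. the template variable was never substituted, which matches the "Invalid format of object reference" failure. A hedged pre-flight check one could run on the trigger parameters before starting the DAG (my own sketch, not part of the Workflow service):

```python
import re

_PLACEHOLDER = re.compile(r"\{\{[^{}]+\}\}")

def unresolved_placeholders(params: dict) -> list:
    """Return parameter values that still contain unresolved {{...}}
    template variables, so a run can be rejected up front instead of
    failing inside the pod."""
    return [v for v in params.values()
            if isinstance(v, str) and _PLACEHOLDER.search(v)]
```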
The steps I used to test this feature are in the file attached to this ticket.
[AWS_M8_OpenZGY_Test_3_Results.docx](/uploads/649c2b58582cd7401d44dc3007b298af/AWS_M8_OpenZGY_Test_3_Results.docx)GregGreghttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/93Version Info Error for Both Entitlement & Search Service2021-10-05T06:04:42ZSehubo AkinyanmiVersion Info Error for Both Entitlement & Search ServiceVersion Info is expected to work on all core services in M8, but currently an error is noted when run against the Entitlements service, as shown below. Can you investigate and fix the Version Info returned for the Entitlements service?
**{{ENTITLEMENTS_V2_HOST}}/info**
```
{
  "code": 500,
  "reason": "Internal Server Error",
  "message": "An unknown error has occurred"
}
```
**{{SEARCH_HOST}}/Info**
```
{
  "timestamp": "2021-09-29T17:22:18.123+00:00",
  "status": 404,
  "error": "Not Found",
  "message": "No message available",
  "path": "/api/search/v2/Info"
}
```
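For the Search case, note the request path in the 404 body is `/api/search/v2/Info` with a capital I; these paths are case-sensitive, so lowercase `/info` is likely what was intended. A trivial helper, under the assumption that each core service exposes `/info` directly under its host URL:

```python
def info_url(host: str) -> str:
    """Build the version-info URL for a service host. The path is
    case-sensitive, so always use lowercase '/info'."""
    return host.rstrip("/") + "/info"
```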
Below is the result returned when run against the Schema service (although version 0.11.0 is expected for M8):
```
{
  "groupId": "org.opengroup.osdu",
  "artifactId": "os-schema-ibm",
  "version": "0.12.0-SNAPSHOT",
  "buildTime": "2021-09-28T19:10:08.221Z",
  "branch": "master",
  "commitId": "6ae053df18123b1abc036cea5ed4dc9bd4f44bd3",
  "commitMessage": "Merge branch 'schema-offset-issue-slb' into 'master'",
  "connectedOuterServices": []
}
```
cc @debasiscAnuj GuptaAnuj Guptahttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/94WPC surrogate keys not working in M82023-08-11T12:57:49Zivar SoerheimWPC surrogate keys not working in M8Trying to load the sample WPC data e.g. TNO\output_tno_document\load_document_prov33_pdf.json doesn’t work in M8
First failure is due to schema validation failure:
[2021-09-29 09:14:29,765] {validate_schema.py:290} ERROR - Manifest kind: osdu:wks:work-product-component--Document:1.0.0
[2021-09-29 09:14:29,766] {validate_schema.py:291} ERROR - Error: 'surrogate-key:file-1' does not match '^[\\w\\-\\.]+:dataset\\-\\-[\\w\\-\\.]+:[\\w\\-\\.\\:\\%]+:[0-9]*$'
Failed validating 'pattern' in schema['properties']['data']['allOf'][1]['properties']['Datasets']['items']:
{'description': 'The SRN which identifies this OSDU File resource.',
'pattern': '^[\\w\\-\\.]+:dataset\\-\\-[\\w\\-\\.]+:[\\w\\-\\.\\:\\%]+:[0-9]*$',
'type': 'string',
'x-osdu-relationship': [{'GroupType': 'dataset'}]}
On instance['data']['Datasets'][0]:
'surrogate-key:file-1'
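The pattern from the error can be exercised directly to confirm which surrogate-key shapes pass validation. A quick sketch using the schema's own regex (JSON escaping removed):

```python
import re

# Pattern copied from the Datasets items schema in the validation error above.
DATASET_REF = re.compile(r"^[\w\-\.]+:dataset\-\-[\w\-\.]+:[\w\-\.\:\%]+:[0-9]*$")

print(bool(DATASET_REF.match("surrogate-key:file-1")))          # False: no 'dataset--' group
print(bool(DATASET_REF.match("surrogate-key:dataset--1:0:0")))  # True: dataset-shaped key
```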
Correcting the format to something like “surrogate-key:dataset--1:0:0” makes the manifest pass schema validation, but it then seems to be stripped away in the provide_manifest_integrity_check:
[2021-09-29 09:05:53,883] {ensure_manifest_integrity.py:70} DEBUG - Manifest data: {'ReferenceData': [], 'MasterData': [], 'kind': 'osdu:wks:Manifest:1.0.0', 'Data': {'WorkProduct': {'data': {'Components': ['surrogate-key:wpc-1'], 'Description': 'Document', 'ResourceSecurityClassification': 'opendes:reference-data--ResourceSecurityClassification:RESTRICTED:', 'Name': 'prov33'}, 'kind': 'osdu:wks:work-product--WorkProduct:1.0.0', 'legal': {'legaltags': ['opendes-public-usa-dataset-7643990'], 'otherRelevantDataCountries': ['US']}, 'acl': {'viewers': ['data.default.viewers@opendes.contoso.com'], 'owners': ['data.default.owners@opendes.contoso.com']}}, 'Datasets': [{'data': {'DatasetProperties': {'FileSourceInfo': {'FileSource': '/osdu-user/1632906292949-2021-09-29-09-04-52-949/1de301fc52664f49b90b3a9bee81ae48', 'PreloadFilePath': 's3://osdu-seismic-test-data/r1/data/provided/USGS_docs/prov33.pdf', 'Name': 'prov33.pdf'}}, 'ResourceSecurityClassification': 'opendes:reference-data--ResourceSecurityClassification:RESTRICTED:'}, 'kind': 'osdu:wks:dataset--File.Generic:1.0.0', 'legal': {'legaltags': ['opendes-public-usa-dataset-7643990'], 'otherRelevantDataCountries': ['US']}, 'id': 'surrogate-key:dataset--1:0:0', 'acl': {'viewers': ['data.default.viewers@opendes.contoso.com'], 'owners': ['data.default.owners@opendes.contoso.com']}}], 'WorkProductComponents': [{'data': {'Datasets': ['surrogate-key:dataset--1:0:0'], 'Description': 'Document', 'ResourceSecurityClassification': 'opendes:reference-data--ResourceSecurityClassification:RESTRICTED:', 'Name': 'prov33'}, 'kind': 'osdu:wks:work-product-component--Document:1.0.0', 'meta': [], 'legal': {'legaltags': ['opendes-public-usa-dataset-7643990'], 'otherRelevantDataCountries': ['US']}, 'id': 'surrogate-key:wpc-1', 'acl': {'viewers': ['data.default.viewers@opendes.contoso.com'], 'owners': ['data.default.owners@opendes.contoso.com']}}]}}
[2021-09-29 09:05:53,884] {validate_referential_integrity.py:124} DEBUG - WPC: surrogate-key:wpc-1 doesn't have Artefacts field. Mark it as valid.
[2021-09-29 09:05:53,884] {search_record_ids.py:78} DEBUG - Search query "opendes:reference-data--ResourceSecurityClassification:RESTRICTED"
[2021-09-29 09:05:54,264] {connectionpool.py:230} DEBUG - Starting new HTTP connection (1): search.osdu-azure.svc.cluster.local:80
[2021-09-29 09:05:54,655] {connectionpool.py:442} DEBUG - http://search.osdu-azure.svc.cluster.local:80 "POST /api/search/v2/query HTTP/1.1" 200 None
[2021-09-29 09:05:54,656] {search_record_ids.py:183} DEBUG - {"results":[],"aggregations":[],"totalCount":0}
[2021-09-29 09:05:54,656] {search_record_ids.py:188} DEBUG - Got total count 0
[2021-09-29 09:05:54,656] {search_record_ids.py:169} DEBUG - response ids: []
[2021-09-29 09:05:54,657] {ensure_manifest_integrity.py:76} DEBUG - Valid manifest data: {'ReferenceData': [], 'MasterData': [], 'kind': 'osdu:wks:Manifest:1.0.0', 'Data': {'WorkProduct': {'data': {'Components': ['surrogate-key:wpc-1'], 'Description': 'Document', 'ResourceSecurityClassification': 'opendes:reference-data--ResourceSecurityClassification:RESTRICTED:', 'Name': 'prov33'}, 'kind': 'osdu:wks:work-product--WorkProduct:1.0.0', 'legal': {'legaltags': ['opendes-public-usa-dataset-7643990'], 'otherRelevantDataCountries': ['US']}, 'acl': {'viewers': ['data.default.viewers@opendes.contoso.com'], 'owners': ['data.default.owners@opendes.contoso.com']}}, 'Datasets': [{'data': {'DatasetProperties': {'FileSourceInfo': {'FileSource': '/osdu-user/1632906292949-2021-09-29-09-04-52-949/1de301fc52664f49b90b3a9bee81ae48', 'PreloadFilePath': 's3://osdu-seismic-test-data/r1/data/provided/USGS_docs/prov33.pdf', 'Name': 'prov33.pdf'}}, 'ResourceSecurityClassification': 'opendes:reference-data--ResourceSecurityClassification:RESTRICTED:'}, 'kind': 'osdu:wks:dataset--File.Generic:1.0.0', 'legal': {'legaltags': ['opendes-public-usa-dataset-7643990'], 'otherRelevantDataCountries': ['US']}, 'id': 'surrogate-key:dataset--1:0:0', 'acl': {'viewers': ['data.default.viewers@opendes.contoso.com'], 'owners': ['data.default.owners@opendes.contoso.com']}}], 'WorkProductComponents': [{'data': {'Datasets': ['surrogate-key:dataset--1:0:0'], 'Description': 'Document', 'ResourceSecurityClassification': 'opendes:reference-data--ResourceSecurityClassification:RESTRICTED:', 'Name': 'prov33'}, 'kind': 'osdu:wks:work-product-component--Document:1.0.0', 'meta': [], 'legal': {'legaltags': ['opendes-public-usa-dataset-7643990'], 'otherRelevantDataCountries': ['US']}, 'id': 'surrogate-key:wpc-1', 'acl': {'viewers': ['data.default.viewers@opendes.contoso.com'], 'owners': ['data.default.owners@opendes.contoso.com']}}]}}
[2021-09-29 09:05:55,170] {__init__.py:62} DEBUG - Backend: None, Lineage called with inlets: [], outlets: []
[2021-09-29 09:05:55,409] {taskinstance.py:1070} INFO - Marking task as SUCCESS.dag_id=Osdu_ingest, task_id=provide_manifest_integrity_task, execution_date=20210929T090455, start_date=20210929T090548, end_date=20210929T090555
[2021-09-29 09:05:57,884] {base_job.py:197} DEBUG - [heartbeat]
[2021-09-29 09:05:57,884] {local_task_job.py:102} INFO - Task exited with return code 0
![image](/uploads/800899711bab7c4b4aa8cf6a8c7f11ce/image.png)MANISH KUMARVivek OjhaMANISH KUMARhttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/97Azure R3M8 - Worked example from R3M8 folder (CSV Ingestion) - comments2021-12-07T12:22:03ZDebasis ChatterjeeAzure R3M8 - Worked example from R3M8 folder (CSV Ingestion) - commentsI started off by using the worked collection provided in this folder.
https://community.opengroup.org/osdu/platform/pre-shipping/-/tree/main/R3-M8/Azure/M8%20collections
Noted that this is using the deprecated schema feature of the Storage service (Step 04).
![Azure-CSV-Schema-by-Storage-service](/uploads/92b5e24fcc625b659724ca2703af0c4a/Azure-CSV-Schema-by-Storage-service.PNG)
Also, I see a Step 05 that populates information in the created schema. It is not clear why this step is required.
Expected flow: create the schema (only if it is not wks), define the metadata of the source CSV file, and then populate data from the CSV source.
Please check and advise.
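To make the expected flow concrete, here is a rough sketch. The workflowRun URL shape matches the Workflow service paths that appear elsewhere in this tracker; the workflow name and the three-step ordering are illustrative assumptions about the collection, not its actual requests:

```python
# Expected flow, in order; Step 05's extra schema-population call would
# not appear here.
CSV_FLOW = (
    "create schema (only if it is not a wks schema)",
    "define metadata of the source CSV file",
    "populate data from the CSV source",
)

def workflow_run_url(base: str, workflow_name: str = "csv_parser") -> str:
    """Trigger URL for a named workflow; 'csv_parser' is a hypothetical name."""
    return f"{base}/api/workflow/v1/workflow/{workflow_name}/workflowRun"
```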
cc -@manishk , @vivekojha , @TaylorGraber , @esmira.rafigayeva , @sje7253bp for informationhttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/98Issue with Master Data - CSV Ingestion Azure2021-09-30T17:29:27ZTaylor GraberIssue with Master Data - CSV Ingestion AzureWhen trying to test Master Data in Azure CSV ingestion, I am unable to get data back from the storage service successfully. I tried with the Field and Play master data with no success. I've attached both test files that I created and tri...When trying to test Master Data in Azure CSV ingestion, I am unable to get data back from the storage service successfully. I tried with the Field and Play master data with no success. I've attached both test files that I created and tried. I've attached the storage service GET result. The data is always returned in an unreadable form. I followed the CSV collection steps and have successfully done the same for Reference data. Reference data was successful.
[StorageGET.txt](/uploads/e391f4fe1b8ca34abd72e24d4d45552d/StorageGET.txt)
[Sample_data-field.xlsx](/uploads/39b961bb73c892902a75e33eca016726/Sample_data-field.xlsx)
[Play-TG-Azure.xlsx](/uploads/fecfc852c2f9b6789543756dfa67ab18/Play-TG-Azure.xlsx)MANISH KUMARVivek OjhaMANISH KUMARhttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/99IBM manifest ingestion unit conversion does not work2021-10-13T14:25:26ZYanbin ZhangIBM manifest ingestion unit conversion does not workIngest the following manifest with unit conversion:[unit_conv.json](/uploads/601d8c9f8d68e41661d8dd63a7bb1b64/unit_conv.json)
Ingestion failed with the following:
```
[2021-09-30 16:04:02,265] {standard_task_runner.py:53} INFO - Started process 20749 to run task
[2021-09-30 16:04:02,381] {logging_mixin.py:112} INFO - Running %s on host %s <TaskInstance: Osdu_ingest.provide_manifest_integrity_task 2021-09-30T16:02:52+00:00 [running]> airflow-worker-0.airflow-worker.odi-airflow-ns.svc.cluster.local
[2021-09-30 16:04:17,903] {logging_mixin.py:112} INFO - [2021-09-30 16:04:17,902] {validate_referential_integrity.py:210} WARNING - Resource with kind opendes:wks:master-data--SeismicAcquisitionSurvey:1.0.0 and id: 'opendes:master-data--SeismicAcquisitionSurvey:TEST_CONVERSION_NN_2021_09_29_02' was rejected. Missing ids '{'opendes:reference-data--UnitOfMeasure:ft:', 'opendes:reference-data--UnitOfMeasure:ms:'}'
[2021-09-30 16:04:17,959] {taskinstance.py:1065} INFO - Marking task as SUCCESS.dag_id=Osdu_ingest, task_id=provide_manifest_integrity_task, execution_date=20210930T160252, start_date=20210930T160402, end_date=20210930T160417
[2021-09-30 16:04:22,227] {logging_mixin.py:112} INFO - [2021-09-30 16:04:22,226] {local_task_job.py:103} INFO - Task exited with return code 0
```
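The warning shows the record was rejected because the referenced UnitOfMeasure IDs could not be found. A pre-check of my own that builds the corresponding Search query body: the trailing colon of each versionless reference is stripped, mirroring the integrity-check logs in other issues here; the Lucene-style query string is an assumption:

```python
def reference_check_query(ref_ids):
    """Build a Search query body verifying that referenced records
    (e.g. reference-data--UnitOfMeasure entries) exist before ingesting."""
    ids = " OR ".join('"%s"' % r.rstrip(":") for r in ref_ids)
    return {"kind": "*:*:*:*", "query": f"id:({ids})", "returnedFields": ["id"]}
```

POST the returned body to the Search `/query` endpoint and compare `totalCount` with the number of references.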
@anujguptahttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/100IBM wpc surrogate key will not be replaced by actual id2021-10-13T14:24:00ZYanbin ZhangIBM wpc surrogate key will not be replaced by actual idIngest the following manifest for wpc with surrogate keys:[wpc.json](/uploads/db5016c10d9d3cdd7ac3812680378980/wpc.json)
The ingestion itself is successful:
| Key | Value |
|------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| record_ids | ['opendes:work-product-component--WellLog:382f3dbd3b2f435f8195622eef9732ce', 'opendes:dataset--File.Generic:2986b06750794aafb854f53d79bdf9d9', 'opendes:work-product--WorkProduct:298463fe56cc4c0a8c7b47124e00befd'] |
However, when searching for the ingested records, one finds that the original surrogate key is saved "as is" rather than replaced by the actual IDs generated by the system. E.g., here is the record of the WP:
```
{
"results": [
{
"data": {
"Description": "TEST_NN_QA_1_wp",
"Name": "TEST_NN_QA_1_wp",
"Components": [
"surrogate-key:wpc-1:"
]
},
"kind": "opendes:wks:work-product--WorkProduct:1.0.0",
"source": "wks",
"acl": {
"viewers": [
"data.default.viewers@opendes.ibm.com"
],
"owners": [
"data.default.owners@opendes.ibm.com"
]
},
"type": "work-product--WorkProduct",
"version": 1633021107588724,
"createTime": "2021-09-30T16:58:27.698Z",
"authority": "opendes",
"namespace": "opendes:wks",
"legal": {
"legaltags": [
"opendes-demo-legaltag"
],
"otherRelevantDataCountries": [
"US"
],
"status": "compliant"
},
"createUser": "airflow-user@osdu.opengroup.org",
"id": "opendes:work-product--WorkProduct:298463fe56cc4c0a8c7b47124e00befd"
}
],
"totalCount": 1
}
```
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/102Loading of 50,000 records using Osdu_ingest DAG fails for Azure platform2022-08-23T12:49:24ZKamlesh TodaiLoading of 50,000 records using Osdu_ingest DAG fails for Azure platformWhen trying to load test manifest ingestion (Osdu_ingest DAG) with 50,000 organization records, the DAG is failing. The test with 500 and 1000 records was successful but not 50,000 records.
While checking the status of the DAG, it returns a failed status:
**>>> azure_client.get_workflow('Osdu_ingest', '2c032948-3a8f-441f-9578-7a356709aa64')**
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): osdu-ship.msft-osdu-test.org:443
DEBUG:urllib3.connectionpool:https://osdu-ship.msft-osdu-test.org:443 "GET /api/workflow/v1/workflow/Osdu_ingest/workflowRun/2c032948-3a8f-441f-9578-7a356709aa64 HTTP/1.1" 200 None
DEBUG:root:HTTP GET https://osdu-ship.msft-osdu-test.org/api/workflow/v1/workflow/Osdu_ingest/workflowRun/2c032948-3a8f-441f-9578-7a356709aa64
...
DEBUG:root:Response: 200
DEBUG:root:json = {"workflowId": "Osdu_ingest", "runId": "2c032948-3a8f-441f-9578-7a356709aa64", "startTimeStamp": 1633056515159, "endTimeStamp": 1633069037479, **"status": "failed",** "submittedBy": "preshipping@azureglobal1.onmicrosoft.com"}
<osdu_client.OsduClient object at 0x00000261C7BC2B80>
The link to the Airflow log is:
https://osdu-ship.msft-osdu-test.org/airflow/log?task_id=update_status_finished_task&dag_id=Osdu_ingest&execution_date=2021-10-01T02%3A48%3A37.320308%2B00%3A00
The tail of the log is posted here:
DAG: Osdu_ingest R3 manifest processing with providing integrity
[2021-10-01 06:17:16,034] {decorators.py:28} INFO - ManagedIdentityCredential.get_token succeeded
[2021-10-01 06:17:16,034] {chained.py:68} INFO - DefaultAzureCredential acquired a token from ManagedIdentityCredential
[2021-10-01 06:17:16,034] {_universal.py:474} INFO - Request URL: 'https://osdu-mvp-crship-9vns-kv.vault.azure.net/secrets/app-dev-sp-username/?api-version=REDACTED'/nRequest method: 'GET'/nRequest headers:/n 'Accept': 'application/json'/n 'x-ms-client-request-id': '2a6154b8-227f-11ec-8125-3644e0c71593'/n 'User-Agent': 'azsdk-python-keyvault-secrets/4.2.0 Python/3.6.12 (Linux-5.4.0-1051-azure-x86_64-with-debian-10.5)'/n 'Authorization': 'REDACTED'/nNo body was attached to the request
[2021-10-01 06:17:16,064] {connectionpool.py:442} DEBUG - https://osdu-mvp-crship-9vns-kv.vault.azure.net:443 "GET /secrets/app-dev-sp-username/?api-version=7.1 HTTP/1.1" 200 324
[2021-10-01 06:17:16,065] {_universal.py:502} INFO - Response status: 200/nResponse headers:/n 'Cache-Control': 'no-cache'/n 'Pragma': 'no-cache'/n 'Content-Type': 'application/json; charset=utf-8'/n 'Expires': '-1'/n 'x-ms-keyvault-region': 'centralus'/n 'x-ms-client-request-id': '2a6154b8-227f-11ec-8125-3644e0c71593'/n 'x-ms-request-id': 'fb03f535-9856-47e6-83af-8f35058b8bb1'/n 'x-ms-keyvault-service-version': '1.9.79.2'/n 'x-ms-keyvault-network-info': 'conn_type=Subnet;addr=10.10.2.151;act_addr_fam=InterNetworkV6;'/n 'X-Powered-By': 'REDACTED'/n 'Strict-Transport-Security': 'REDACTED'/n 'X-Content-Type-Options': 'REDACTED'/n 'Date': 'Fri, 01 Oct 2021 06:17:15 GMT'/n 'Content-Length': '324'
[2021-10-01 06:17:16,066] {_universal.py:474} INFO - Request URL: 'https://osdu-mvp-crship-9vns-kv.vault.azure.net/secrets/app-dev-sp-password/?api-version=REDACTED'/nRequest method: 'GET'/nRequest headers:/n 'Accept': 'application/json'/n 'x-ms-client-request-id': '39639520-227f-11ec-8125-3644e0c71593'/n 'User-Agent': 'azsdk-python-keyvault-secrets/4.2.0 Python/3.6.12 (Linux-5.4.0-1051-azure-x86_64-with-debian-10.5)'/n 'Authorization': 'REDACTED'/nNo body was attached to the request
[2021-10-01 06:17:16,083] {connectionpool.py:442} DEBUG - https://osdu-mvp-crship-9vns-kv.vault.azure.net:443 "GET /secrets/app-dev-sp-password/?api-version=7.1 HTTP/1.1" 200 322
[2021-10-01 06:17:16,084] {_universal.py:502} INFO - Response status: 200/nResponse headers:/n 'Cache-Control': 'no-cache'/n 'Pragma': 'no-cache'/n 'Content-Type': 'application/json; charset=utf-8'/n 'Expires': '-1'/n 'x-ms-keyvault-region': 'centralus'/n 'x-ms-client-request-id': '39639520-227f-11ec-8125-3644e0c71593'/n 'x-ms-request-id': 'd8fd8cfa-d69a-4721-89f7-7afda24a9e30'/n 'x-ms-keyvault-service-version': '1.9.79.2'/n 'x-ms-keyvault-network-info': 'conn_type=Subnet;addr=10.10.2.151;act_addr_fam=InterNetworkV6;'/n 'X-Powered-By': 'REDACTED'/n 'Strict-Transport-Security': 'REDACTED'/n 'X-Content-Type-Options': 'REDACTED'/n 'Date': 'Fri, 01 Oct 2021 06:17:15 GMT'/n 'Content-Length': '322'
[2021-10-01 06:17:16,085] {_universal.py:474} INFO - Request URL: 'https://osdu-mvp-crship-9vns-kv.vault.azure.net/secrets/app-dev-sp-tenant-id/?api-version=REDACTED'/nRequest method: 'GET'/nRequest headers:/n 'Accept': 'application/json'/n 'x-ms-client-request-id': '39668c30-227f-11ec-8125-3644e0c71593'/n 'User-Agent': 'azsdk-python-keyvault-secrets/4.2.0 Python/3.6.12 (Linux-5.4.0-1051-azure-x86_64-with-debian-10.5)'/n 'Authorization': 'REDACTED'/nNo body was attached to the request
[2021-10-01 06:17:16,136] {connectionpool.py:442} DEBUG - https://osdu-mvp-crship-9vns-kv.vault.azure.net:443 "GET /secrets/app-dev-sp-tenant-id/?api-version=7.1 HTTP/1.1" 200 325
[2021-10-01 06:17:16,137] {_universal.py:502} INFO - Response status: 200/nResponse headers:/n 'Cache-Control': 'no-cache'/n 'Pragma': 'no-cache'/n 'Content-Type': 'application/json; charset=utf-8'/n 'Expires': '-1'/n 'x-ms-keyvault-region': 'centralus'/n 'x-ms-client-request-id': '39668c30-227f-11ec-8125-3644e0c71593'/n 'x-ms-request-id': '5955db9c-8001-411d-9b62-ffdfd53b142d'/n 'x-ms-keyvault-service-version': '1.9.79.2'/n 'x-ms-keyvault-network-info': 'conn_type=Subnet;addr=10.10.2.151;act_addr_fam=InterNetworkV6;'/n 'X-Powered-By': 'REDACTED'/n 'Strict-Transport-Security': 'REDACTED'/n 'X-Content-Type-Options': 'REDACTED'/n 'Date': 'Fri, 01 Oct 2021 06:17:15 GMT'/n 'Content-Length': '325'
[2021-10-01 06:17:16,139] {_universal.py:474} INFO - Request URL: 'https://osdu-mvp-crship-9vns-kv.vault.azure.net/secrets/aad-client-id/?api-version=REDACTED'/nRequest method: 'GET'/nRequest headers:/n 'Accept': 'application/json'/n 'x-ms-client-request-id': '396eb9d2-227f-11ec-8125-3644e0c71593'/n 'User-Agent': 'azsdk-python-keyvault-secrets/4.2.0 Python/3.6.12 (Linux-5.4.0-1051-azure-x86_64-with-debian-10.5)'/n 'Authorization': 'REDACTED'/nNo body was attached to the request
[2021-10-01 06:17:16,162] {connectionpool.py:442} DEBUG - https://osdu-mvp-crship-9vns-kv.vault.azure.net:443 "GET /secrets/aad-client-id/?api-version=7.1 HTTP/1.1" 200 318
[2021-10-01 06:17:16,163] {_universal.py:502} INFO - Response status: 200/nResponse headers:/n 'Cache-Control': 'no-cache'/n 'Pragma': 'no-cache'/n 'Content-Type': 'application/json; charset=utf-8'/n 'Expires': '-1'/n 'x-ms-keyvault-region': 'centralus'/n 'x-ms-client-request-id': '396eb9d2-227f-11ec-8125-3644e0c71593'/n 'x-ms-request-id': 'bbc4c81b-e04b-414b-a1a8-2473db395859'/n 'x-ms-keyvault-service-version': '1.9.79.2'/n 'x-ms-keyvault-network-info': 'conn_type=Subnet;addr=10.10.2.151;act_addr_fam=InterNetworkV6;'/n 'X-Powered-By': 'REDACTED'/n 'Strict-Transport-Security': 'REDACTED'/n 'X-Content-Type-Options': 'REDACTED'/n 'Date': 'Fri, 01 Oct 2021 06:17:15 GMT'/n 'Content-Length': '318'
[2021-10-01 06:17:16,167] {connectionpool.py:943} DEBUG - Starting new HTTPS connection (1): login.microsoftonline.com:443
[2021-10-01 06:17:16,232] {connectionpool.py:442} DEBUG - https://login.microsoftonline.com:443 "GET /58975fd3-4977-44d0-bea8-37af0baac100/v2.0/.well-known/openid-configuration HTTP/1.1" 200 1753
[2021-10-01 06:17:16,234] {authority.py:92} DEBUG - openid_config = {'token_endpoint': 'https://login.microsoftonline.com/58975fd3-4977-44d0-bea8-37af0baac100/oauth2/v2.0/token', 'token_endpoint_auth_methods_supported': ['client_secret_post', 'private_key_jwt', 'client_secret_basic'], 'jwks_uri': 'https://login.microsoftonline.com/58975fd3-4977-44d0-bea8-37af0baac100/discovery/v2.0/keys', 'response_modes_supported': ['query', 'fragment', 'form_post'], 'subject_types_supported': ['pairwise'], 'id_token_signing_alg_values_supported': ['RS256'], 'response_types_supported': ['code', 'id_token', 'code id_token', 'id_token token'], 'scopes_supported': ['openid', 'profile', 'email', 'offline_access'], 'issuer': 'https://login.microsoftonline.com/58975fd3-4977-44d0-bea8-37af0baac100/v2.0', 'request_uri_parameter_supported': False, 'userinfo_endpoint': 'https://graph.microsoft.com/oidc/userinfo', 'authorization_endpoint': 'https://login.microsoftonline.com/58975fd3-4977-44d0-bea8-37af0baac100/oauth2/v2.0/authorize', 'device_authorization_endpoint': 'https://login.microsoftonline.com/58975fd3-4977-44d0-bea8-37af0baac100/oauth2/v2.0/devicecode', 'http_logout_supported': True, 'frontchannel_logout_supported': True, 'end_session_endpoint': 'https://login.microsoftonline.com/58975fd3-4977-44d0-bea8-37af0baac100/oauth2/v2.0/logout', 'claims_supported': ['sub', 'iss', 'cloud_instance_name', 'cloud_instance_host_name', 'cloud_graph_host_name', 'msgraph_host', 'aud', 'exp', 'iat', 'auth_time', 'acr', 'nonce', 'preferred_username', 'name', 'tid', 'ver', 'at_hash', 'c_hash', 'email'], 'kerberos_endpoint': 'https://login.microsoftonline.com/58975fd3-4977-44d0-bea8-37af0baac100/kerberos', 'tenant_region_scope': 'NA', 'cloud_instance_name': 'microsoftonline.com', 'cloud_graph_host_name': 'graph.windows.net', 'msgraph_host': 'graph.microsoft.com', 'rbac_url': 'https://pas.windows.net'}
[2021-10-01 06:17:16,235] {application.py:60} DEBUG - Generates correlation_id: 5cdce50a-b115-4ce8-81c5-abef3222005e
[2021-10-01 06:17:16,306] {connectionpool.py:442} DEBUG - https://login.microsoftonline.com:443 "POST /58975fd3-4977-44d0-bea8-37af0baac100/oauth2/v2.0/token HTTP/1.1" 200 1331
[2021-10-01 06:17:16,307] {token_cache.py:120} DEBUG - event={
"client_id": "60c4b736-2aa4-4889-88a0-d50503d63de7",
"data": {
"claims": null,
"scope": [
"ab320ed3-9cdd-4798-8e3c-2a657800183b/.default"
]
},
"environment": "login.microsoftonline.com",
"grant_type": "client_credentials",
"params": null,
"response": {
"access_token": "********",
"expires_in": 3599,
"ext_expires_in": 3599,
"token_type": "Bearer"
},
"scope": [
"ab320ed3-9cdd-4798-8e3c-2a657800183b/.default"
],
"token_endpoint": "https://login.microsoftonline.com/58975fd3-4977-44d0-bea8-37af0baac100/oauth2/v2.0/token"
}
[2021-10-01 06:17:16,560] {update_status.py:82} DEBUG - Sending request '{"status": "failed"}'
[2021-10-01 06:17:16,560] {update_status.py:84} DEBUG - Workflow URL: http://workflow.osdu-azure.svc.cluster.local/api/workflow/v1/workflow/Osdu_ingest/workflowRun/2c032948-3a8f-441f-9578-7a356709aa64
[2021-10-01 06:17:16,563] {connectionpool.py:230} DEBUG - Starting new HTTP connection (1): workflow.osdu-azure.svc.cluster.local:80
[2021-10-01 06:17:18,754] {connectionpool.py:442} DEBUG - http://workflow.osdu-azure.svc.cluster.local:80 "PUT /api/workflow/v1/workflow/Osdu_ingest/workflowRun/2c032948-3a8f-441f-9578-7a356709aa64 HTTP/1.1" 200 None
[2021-10-01 06:17:18,755] {taskinstance.py:1150} ERROR - Dag failed
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/models/taskinstance.py", line 984, in _run_raw_task
result = task_copy.execute(context=context)
File "/opt/airflow/dags/osdu_manifest/operators/update_status.py", line 133, in execute
raise PipelineFailedError("Dag failed")
osdu_api.libs.exceptions.PipelineFailedError: Dag failed
[2021-10-01 06:17:18,795] {taskinstance.py:1194} INFO - Marking task as FAILED. dag_id=Osdu_ingest, task_id=update_status_finished_task, execution_date=20211001T024837, start_date=20211001T061641, end_date=20211001T061718
[2021-10-01 06:17:18,983] {cli_action_loggers.py:86} DEBUG - Calling callbacks: []
[2021-10-01 06:17:20,064] {base_job.py:197} DEBUG - [heartbeat]
[2021-10-01 06:17:20,064] {local_task_job.py:102} INFO - Task exited with return code 1
Kishore BattulaMANISH KUMARKishore Battulahttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/104AWS - R3M8 - Manifest-based Ingestion - Frame of Reference2021-12-13T06:52:48ZDebasis ChatterjeeAWS - R3M8 - Manifest-based Ingestion - Frame of ReferencePlease check enclosed file for details. I do not see values being converted from "ft" to "m".
[AWS-Ingest-Master-SeismicAcquisitionSurvey-ST0202R08-DC-2Oct-steps.txt](/uploads/936d11727d51090b157b8053d02285fb/AWS-Ingest-Master-SeismicAcquisitionSurvey-ST0202R08-DC-2Oct-steps.txt)
Also see issue #99 by @yanbinzhanghttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/105AWS - R3M8 - Manifest-based Ingestion - Master data - Well2021-12-14T16:39:37ZDebasis ChatterjeeAWS - R3M8 - Manifest-based Ingestion - Master data - Well@sje7253bp reported issue with this load manifest/JSON file.
[Steve_AWS_Master_Data_body_example-problem.txt](/uploads/077a37c05219a6b3d4e53235dfc7028b/Steve_AWS_Master_Data_body_example-problem.txt)
When he runs this, it does not show any error in any of the log files (Airflow).
Yet it does not perform the desired task of creating the Well master record.
I took his load manifest and could recreate the same issue in AWS.
I also took his load manifest and used it in GCP (after adjusting some of the environment variables). In GCP, it does the job properly and creates a Well record.
What baffles us is: why do we not see any error in any of the AWS logs?
Can you please check this and advise?
Thank you
cc - @WibbenM10 - Release 0.13https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/117Wellbore DMS | Marker2022-08-23T10:47:26ZGrant MarblestoneWellbore DMS | MarkerM9 - Release 0.12Jefferson OliveiraJefferson Oliveirahttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/110WITSML Parser2022-08-23T15:55:56ZEsmira RafigayevaWITSML ParserM9 - Release 0.12etienne peyssonetienne peyssonhttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/111Seismic DMS Integration(oVDS)2022-01-18T04:11:27ZGrant MarblestoneSeismic DMS Integration(oVDS)M9 - Release 0.12https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/113Seismic DMS Integration(oZGY)2021-12-16T15:15:06ZGrant MarblestoneSeismic DMS Integration(oZGY)M9 - Release 0.12Grant MarblestoneGrant Marblestonehttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/112Sdutil oZGY2021-12-16T15:15:02ZGrant MarblestoneSdutil oZGYM9 - Release 0.12Grant MarblestoneGrant Marblestonehttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/118Wellbore DMS | Load curve data from LAS file2021-12-16T15:14:30ZGrant MarblestoneWellbore DMS | Load curve data from LAS fileM9 - Release 0.12Grant MarblestoneGrant Marblestonehttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/119Well Delivery DMS | Prototype - API functionality tests2022-04-28T17:02:58ZGrant MarblestoneWell Delivery DMS | Prototype - API functionality tests@esmira.rafigayeva working with Andrei
supposed to be assigned to Juliana F., but could not find her in the assignees list.Juliana Fernandesjuliana.fernandes@iesbrazil.com.brJuliana Fernandesjuliana.fernandes@iesbrazil.com.brhttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/120Operations Procedure2022-08-24T14:34:02ZSehubo AkinyanmiOperations ProcedureM9 - Release 0.12Naufal Mohamed NooriNaufal Mohamed Noorihttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/121Performance Loading Tests2022-08-24T14:34:28ZSehubo AkinyanmiPerformance Loading Tests| Performance tests on record loading : #500 records |
|-------------------------------------------------------|
| Performance tests on record loading : #1000 records |
| Performance tests on record loading : #50000 records |M9 - Release 0.12Wira YarmanWira Yarmanhttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/123Manifest ingestion - GCP R3M9 {workflow URL} not found, when previously work...2022-08-23T12:47:20ZSteven EvansManifest ingestion - GCP R3M9 {workflow URL} not found, when previously worked for M9 smoke testsSummary
After successful token authorization, when running any of the R3M9 smoke test collections and trying the R3 validation reference data manifest ingestion, each process fails with the same error: getaddrinfo ENOTFOUND preship-asm.osdu-gcp.go3-nrg.projects.epam.com
Steps to reproduce
1. Refresh token for M9 environment
2. Run any of the smoke test collections (e.g. legal tag) and step 01a - Reference Data Ingestion (Facility Type) from the R3 Validation full manifest-based ingestion collection - #29 from the platform validation repository in GitLab
The following error appears: getaddrinfo ENOTFOUND preship-asm.osdu-gcp.go3-nrg.projects.epam.com
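For context, `getaddrinfo ENOTFOUND` is a client-side DNS failure: the host name never resolved, so the request never reached the platform. As a quick check independent of Postman (a sketch reusing the host name from the error message), the same lookup can be reproduced directly:

```python
# getaddrinfo ENOTFOUND means the host name did not resolve at all -- the
# request never left the client. This reproduces the lookup Postman performs;
# the host below is the one reported in the error message.
import socket

host = "preship-asm.osdu-gcp.go3-nrg.projects.epam.com"
try:
    socket.getaddrinfo(host, 443)
    resolved = True
except socket.gaierror:
    resolved = False

print(host, "resolves" if resolved else "does not resolve (ENOTFOUND)")
```

If the name does not resolve, the fix lies on the environment/DNS side (VPN, hosts file, or a renamed endpoint), not in the collection itself.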
Example Environment(Tenant)
R3M9
What is the current bug behavior?
As above: the collections fail to run.
What is the expected correct behavior?
The collection should run and return a valid response.
Relevant logs and/or screenshots -
![image](/uploads/4d7504192ef0ccb4dfe250c8c572c15b/image.png)
![image](/uploads/d25b22defbb523b69ea3c81c34c0434a/image.png)Aliaksandr Ramanovich (EPAM)Aliaksandr Ramanovich (EPAM)https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/124File metadata API is not called in Osdu_Ingest DAG2022-08-24T13:34:19Zivar SoerheimFile metadata API is not called in Osdu_Ingest DAGWhen I send in a manifest WPC to the Osdu_Ingest DAG, the file supplied (in the file-generic WPC) using a surrogate key is not moved from staging to persistent.
In the Postman collection this is done by calling the file metadata API.
As I see it, this is missing functionality in the DAG.
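For reference, the missing step the Postman collection performs is roughly the following (a hedged sketch: host, token, and FileSource values are placeholders, and the `POST /files/metadata` path follows the usual OSDU File service convention, which may differ per deployment):

```python
# Hedged sketch of the Postman step referred to above: registering file
# metadata via the File service, which is what moves the uploaded file from
# the staging area to the persistent area. FILE_HOST, the bearer token and
# FileSource are placeholders; the endpoint path is assumed from the OSDU
# File service convention and may differ per deployment.
import json
import urllib.request

FILE_HOST = "https://example.com/api/file/v2"  # placeholder host
headers = {
    "Authorization": "Bearer <token>",          # placeholder token
    "data-partition-id": "opendes",
    "Content-Type": "application/json",
}
metadata = {
    "kind": "osdu:wks:dataset--File.Generic:1.0.0",
    "acl": {"owners": ["data.default.owners@opendes.contoso.com"],
            "viewers": ["data.default.viewers@opendes.contoso.com"]},
    "legal": {"legaltags": ["opendes-public-usa-dataset-7643990"],
              "otherRelevantDataCountries": ["US"]},
    "data": {"DatasetProperties": {"FileSourceInfo": {
        "FileSource": "<FileSource from the uploadURL call>",
        "Name": "15_9-19_SR_CPI.las"}}},
}
req = urllib.request.Request(f"{FILE_HOST}/files/metadata",
                             data=json.dumps(metadata).encode(),
                             headers=headers, method="POST")
# urllib.request.urlopen(req)  # when run live, returns the generated dataset id
print(req.get_method(), req.full_url)
```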
Manifest for reference:
{
"kind": "osdu:wks:Manifest:1.0.0",
"ReferenceData": [],
"MasterData": [],
"Data": {
"WorkProduct": {
"kind": "osdu:wks:work-product--WorkProduct:1.0.0",
"acl": {
"owners": [
"data.default.owners@opendes.contoso.com"
],
"viewers": [
"data.default.viewers@opendes.contoso.com"
]
},
"legal": {
"legaltags": ["opendes-public-usa-dataset-7643990"],
"otherRelevantDataCountries": [ "US"]
},
"data": {
"ResourceSecurityClassification": "opendes:reference-data--ResourceSecurityClassification:RESTRICTED:",
"Name": "15_9-19_SR_CPI.las",
"Description": "Well Log",
"Components": [
"surrogate-key:wpc-1"
]
}
},
"WorkProductComponents": [
{
"id": "surrogate-key:wpc-1",
"kind": "osdu:wks:work-product-component--WellLog:1.0.0",
"acl": {
"owners": [
"data.default.owners@opendes.contoso.com"
],
"viewers": [
"data.default.viewers@opendes.contoso.com"
]
},
"legal": {
"legaltags": ["opendes-public-usa-dataset-7643990"],
"otherRelevantDataCountries": [ "US"]
},
"meta": [],
"data": {
"ResourceSecurityClassification": "opendes:reference-data--ResourceSecurityClassification:RESTRICTED:",
"Name": "15_9-19_SR_CPI.las",
"Description": "Well Log",
"Datasets": [
"surrogate-key:file-1"
],
"WellboreID": "opendes:master-data--Wellbore:NPD-2105:",
"TopMeasuredDepth": 3549.7008,
"BottomMeasuredDepth": 4618.3296,
"Curves": [
{
"Mnemonic": "DEPTH",
"TopDepth": 3549.7008,
"BaseDepth": 4618.3296,
"DepthUnit": "opendes:reference-data--UnitOfMeasure:M:",
"CurveUnit": "opendes:reference-data--UnitOfMeasure:M:"
},
{
"Mnemonic": "BWV",
"TopDepth": 3549.7008,
"BaseDepth": 4618.1772,
"DepthUnit": "opendes:reference-data--UnitOfMeasure:M:"
},
{
"Mnemonic": "DT",
"TopDepth": 3549.8532,
"BaseDepth": 4618.3296,
"DepthUnit": "opendes:reference-data--UnitOfMeasure:M:"
},
{
"Mnemonic": "KLOGH",
"TopDepth": 3568.9032,
"BaseDepth": 4618.1772,
"DepthUnit": "opendes:reference-data--UnitOfMeasure:M:",
"CurveUnit": "opendes:reference-data--UnitOfMeasure:MD:"
},
{
"Mnemonic": "KLOGV",
"TopDepth": 3568.9032,
"BaseDepth": 4578.7056,
"DepthUnit": "opendes:reference-data--UnitOfMeasure:M:",
"CurveUnit": "opendes:reference-data--UnitOfMeasure:MD:"
},
{
"Mnemonic": "PHIF",
"TopDepth": 3568.9032,
"BaseDepth": 4618.1772,
"DepthUnit": "opendes:reference-data--UnitOfMeasure:M:",
"CurveUnit": "opendes:reference-data--UnitOfMeasure:V%2FV:"
},
{
"Mnemonic": "SAND_FLAG",
"TopDepth": 3568.9032,
"BaseDepth": 4618.1772,
"DepthUnit": "opendes:reference-data--UnitOfMeasure:M:",
"CurveUnit": "opendes:reference-data--UnitOfMeasure:UNITLESS:"
},
{
"Mnemonic": "SW",
"TopDepth": 3568.9032,
"BaseDepth": 4618.1772,
"DepthUnit": "opendes:reference-data--UnitOfMeasure:M:",
"CurveUnit": "opendes:reference-data--UnitOfMeasure:V%2FV:"
},
{
"Mnemonic": "VSH",
"TopDepth": 4304.2332,
"BaseDepth": 4591.5072,
"DepthUnit": "opendes:reference-data--UnitOfMeasure:M:",
"CurveUnit": "opendes:reference-data--UnitOfMeasure:V%2FV:"
}
]
}
}
],
"Datasets": [
{
"id": "surrogate-key:file-1",
"kind": "osdu:wks:dataset--File.Generic:1.0.0",
"acl": {
"owners": [
"data.default.owners@opendes.contoso.com"
],
"viewers": [
"data.default.viewers@opendes.contoso.com"
]
},
"legal": {
"legaltags": ["opendes-public-usa-dataset-7643990"],
"otherRelevantDataCountries": [ "US"]
},
"data": {
"ResourceSecurityClassification": "opendes:reference-data--ResourceSecurityClassification:RESTRICTED:",
"SchemaFormatTypeID": "opendes:reference-data--SchemaFormatType:LAS2:",
"DatasetProperties": {
"FileSourceInfo": {
"FileSource": "<Filesource from file api call goes here>",
"Name": "15_9-19_SR_CPI.las",
"PreloadFilePath": <>
}
}
}
}
]
}
}Debasis ChatterjeeChad LeongDebasis Chatterjeehttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/125IBM R3M9 Manifest ingestion - Vertical Measurement Path failure2022-08-23T12:47:21ZSteven EvansIBM R3M9 Manifest ingestion - Vertical Measurement Path failureThere appears to be an issue with the reference data schema for Vertical Measurement Path. The Airflow DAG validate schema log reports the following:
[2021-11-18 11:07:58,477] {validate_schema.py:292} ERROR - Schema validation error. Data field.
[2021-11-18 11:07:58,478] {validate_schema.py:293} ERROR - Manifest kind: osdu:wks:reference-data--VerticalMeasurementPath:1.0.0
[2021-11-18 11:07:58,503] {validate_schema.py:294} ERROR - Error: Additional properties are not allowed ('Data' was unexpected)
Failed validating 'additionalProperties' in schema:
{'$id': 'http://os-schema-ibm.osdu.svc.cluster.local:8080/api/schema-service/v1/schema/osdu:wks:reference-data--VerticalMeasurementPath:1.0.0',
'$schema': 'http://json-schema.org/draft-07/schema#',
'additionalProperties': False,
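For context, this is the draft-07 `additionalProperties` rule at work: with `"additionalProperties": false`, any key not declared under `properties` is rejected, so a payload placed under `Data` instead of the declared `data` fails exactly as logged. A minimal stand-in sketch (toy property set, not the real VerticalMeasurementPath schema):

```python
# Simplified illustration of draft-07 "additionalProperties": false -- any
# key not declared in the schema's "properties" is rejected. The declared
# set below is a toy stand-in, not the real VerticalMeasurementPath schema.
declared_properties = {"kind", "acl", "legal", "data"}

manifest_entry = {
    "kind": "osdu:wks:reference-data--VerticalMeasurementPath:1.0.0",
    "Data": {"Name": "MD"},  # wrong casing: only lowercase "data" is declared
}

unexpected = sorted(k for k in manifest_entry if k not in declared_properties)
print(unexpected)  # ['Data'] -> "Additional properties are not allowed"
```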
I have attached the complete DAG schema validation log [IBM_R3M9_Manifets_ingestion_VerticalMeasurementPath_Error.txt](/uploads/8719d88acc2e13be6899c08b11bd6698/IBM_R3M9_Manifets_ingestion_VerticalMeasurementPath_Error.txt)Anuj GuptaShaonShrikant GargAnuj Guptahttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/126IBM R3M9 Manifest ingestion - Frame of Reference no "Processed ids" in Airflo...2022-08-23T12:47:07ZEsmira RafigayevaIBM R3M9 Manifest ingestion - Frame of Reference no "Processed ids" in Airflow Response@Shaon Mazumder [IBM] @Anuj Gupta [IBM] Hi Shaon, Anuj - FoR unit conversion (Manifest Ingestion) in IBM - no "Processed ids", also Airflow version is showing 2.1.1; Xcom summary is empty:
[Airflow_Response.txt](/uploads/219b68dd99b6551b24a4eaac3f4a38a3/Airflow_Response.txt)M9 - Release 0.12Anuj GuptaShaonShrikant GargAnuj Guptahttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/127IBM R3M9 - Storage Service - Frame of Reference - Unit conversion is not working2022-08-23T12:47:05ZDebasis ChatterjeeIBM R3M9 - Storage Service - Frame of Reference - Unit conversion is not workingI tested this for unit conversion from "ft" to "m". It does not seem to work. I did some troubleshooting but found no clue as to the reason for the failure.
[IBM-Storage-service-Unit-conversion.txt](/uploads/264b69b5f285abf5e4cb5ae54af3b085/IBM-Storage-service-Unit-conversion.txt)Anuj GuptaShrikant GargAnuj Guptahttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/129Azure R3M9 - Search Service - Frame of Reference - Unit conversion is not wor...2022-08-23T12:47:06ZEsmira RafigayevaAzure R3M9 - Search Service - Frame of Reference - Unit conversion is not workingTested unit conversion Search Service "m" and "ft".
Response is:
{
"results": [],
"aggregations": [],
"totalCount": 0
}M9 - Release 0.12Shrikant GargMANISH KUMARVivek OjhaNikhil Singh[MicroSoft]Shrikant Garghttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/130GCP R3M9 - Manifest Ingestion - Reference data - newly created record cannot ...2022-08-23T12:47:19ZDebasis ChatterjeeGCP R3M9 - Manifest Ingestion - Reference data - newly created record cannot be seen using Search Service (OK from Storage service)Please see enclosed.
https://community.opengroup.org/osdu/platform/pre-shipping/-/blob/main/R3-M9/Test_Plan_Results%20_M9/Manifest_Ingestion/OSDU_PTP_M9_TeamA-GCP-Manifest-Ingestion-Reference-Debasis.txt
cc - @aliaksandr_ramanovich and @esmira.rafigayevahttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/131GCP R3M9 - Version endpoint - Workflow Service - does not show version of Air...2022-08-23T12:47:20ZDebasis ChatterjeeGCP R3M9 - Version endpoint - Workflow Service - does not show version of AirflowGET https://{{WORKFLOW_HOST}}/info
Response
```
{
"groupId": "org.opengroup.osdu",
"artifactId": "workflow-gcp",
"version": "0.12.0",
"buildTime": "2021-10-28T04:01:37.449Z",
"branch": "v0.12.0",
"commitId": "fed0537df69a41e652164c33eb9103c3e1b719b3",
"commitMessage": "Creating Release Commit",
"connectedOuterServices": []
}
```
cc - @Kateryna_Kurach for informationhttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/132GCP-R3M9 - Smoke test Collection - CSV Ingestion uses deprecated Storage serv...2023-08-07T10:49:19ZDebasis ChatterjeeGCP-R3M9 - Smoke test Collection - CSV Ingestion uses deprecated Storage service (schema create)See enclosed.
![GCP-CSV-schema-create](/uploads/4802d2b2bbfe1f8d2b125e7fe12b0d34/GCP-CSV-schema-create.PNG)
Note that schema creation from Storage Service is deprecated.
Can you please change to "Schema service" for this?
You may check collection-31 from Platform Validation.
https://community.opengroup.org/osdu/platform/testing/-/tree/master/Postman%20CollectionDebasis ChatterjeeYauhen Shaliou [EPAM/GCP]Debasis Chatterjeehttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/136Smoke Collection2022-08-24T14:35:12ZSehubo AkinyanmiSmoke Collection**Perform Smoke Collection validations across all the CSP deployments (IBM, Azure, GCP & AWS)**
- [ ] Core Services
- [ ] Ingestion
- [ ] Global Status Monitoring (Kateryna to update on scope)
- [ ] DDMSM9 - Release 0.12Kamlesh TodaiKamlesh Todaihttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/137Version Endpoint - Stage 22022-08-23T15:55:57ZSehubo AkinyanmiVersion Endpoint - Stage 2**Validate the version endpoint of the underlisted on GCP**
- [ ] Elasticsearch via Indexer Service
- [ ] Airflow via Workflow ServiceM10 - Release 0.13https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/138Audit & Metrics2022-09-03T02:22:12ZSehubo AkinyanmiAudit & MetricsKPI Batch 1 - Scope validation from Project Lead, @Srinivasan Ramamoorthi [LTI]KPI Batch 1 - Scope validation from Project Lead, @Srinivasan Ramamoorthi [LTI]M9 - Release 0.12Naufal Mohamed NooriNaufal Mohamed Noorihttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/139Azure R3M9 Preship - Airflow is Failing2022-08-23T12:47:12ZEsmira RafigayevaAzure R3M9 Preship - Airflow is FailingM9 - Release 0.12MANISH KUMARVivek OjhaMANISH KUMARhttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/140Calls to M9 IBM Entitlements and Indexer Version Endpoints are Failing2022-08-23T13:29:58ZMichaelCalls to M9 IBM Entitlements and Indexer Version Endpoints are FailingIBM Entitlements Version Endpoint failed with 502 error for url https://osdu-cpd-osdu.odi-osdu-og-fa7661852f2ab29a6be32f560b2f5573-0000.us-south.containers.appdomain.cloud/osdu-entitlements/api/entitlements/v1/info
IBM Indexer Version Endpoint failed with 404 error for url https://osdu-cpd-osdu.odi-osdu-og-fa7661852f2ab29a6be32f560b2f5573-0000.us-south.containers.appdomain.cloud/osdu-indexer/api/indexer/v2/infoM9 - Release 0.12https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/141Azure R3M9 Smoke Test collection missing2022-08-23T12:47:13ZEsmira RafigayevaAzure R3M9 Smoke Test collection missing@krganesan Hi Krishnan, could you provide Azure Smoke Test collection as it is missing from this location: https://community.opengroup.org/osdu/platform/pre-shipping/-/tree/main/R3-M9/Azure-M9/
[RE__Preshipping_Env_Details_-_Smoke_Test_Collection.msg](/uploads/d671efb36a6f181592f37134eb0f22a3/RE__Preshipping_Env_Details_-_Smoke_Test_Collection.msg) Regards, Esmira cc: @manishk @harshit283 @debasisc @sehuboyM9 - Release 0.12Krishnan GanesanKrishnan Ganesanhttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/144Azure R3M9 - Storage service, Update gives 403 (unauthorized) error2022-08-23T12:47:02ZDebasis ChatterjeeAzure R3M9 - Storage service, Update gives 403 (unauthorized) errorReported by @s0rhe1m
Collection-12 from the Platform Validation site, request 11:
PUT https://{{STORAGE_HOST}}/records?skipdupes=true
Body
```
[
{
"acl": {
"owners": [
"{{New_OwnerDataGroup}}@{{data-partition-id}}{{domain}}"
],
"viewers": [
"{{New_ViewerDataGroup}}@{{data-partition-id}}{{domain}}"
]
},
"data": {
"msg": "Updated Message from AutoTest while testing update",
"weight": 888.0
},
"id": "{{data-partition-id}}:master-data--SeismicAcquisitionSurvey:ST0202R08-DC-25Nov",
"kind": "{{schemaAuth}}:wks:master-data--SeismicAcquisitionSurvey:1.0.0",
"legal": {
"legaltags": [
"{{LegalTagNameExists}}"
],
"otherRelevantDataCountries": [
"US"
],
"status": "compliant"
},
"meta": [
{}
],
"version": 0
}
]
```
**Error in response**
```
{
"code": 403,
"reason": "User Unauthorized",
"message": "User is not authorized to update records."
}
```
cc - @manishk for informationhttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/145Azure R3M9 - Storage service, GET gives 403 (unauthorized) error2022-08-23T12:47:03ZDebasis ChatterjeeAzure R3M9 - Storage service, GET gives 403 (unauthorized) errorSomething strange happening.
I created a fresh record and then get a 403 error when I try to retrieve it using GET (Storage service).
GET https://{{STORAGE_HOST}}/records/opendes:master-data--SeismicAcquisitionSurvey:ST0202R08-DC-25Novtry2
```
{
"code": 403,
"reason": "Access denied",
"message": "The user does not have access to the record"
}
```
This works
GET https://{{STORAGE_HOST}}/records/opendes:master-data--Field:VOLVE
See enclosed for details.
[OSDU_PTP_M9_TeamA-Azure-Storage-service-FoR-Unit-conversion-Debasis-issue.txt](/uploads/592c10f0e7c9836db78b4c3f7ee658bd/OSDU_PTP_M9_TeamA-Azure-Storage-service-FoR-Unit-conversion-Debasis-issue.txt)
cc - @esmira.rafigayeva and @manishk for informationhttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/146Azure R3M9 - Audit & Metrics: Issue to Access Certain APIs2022-08-23T11:28:56ZNaufal Mohamed NooriAzure R3M9 - Audit & Metrics: Issue to Access Certain APIsUnable to access the following APIs (with 404 response) (2 out 11 API requests):
- Data Access Rights Entitlement usage or non-usage Data Access Rights (Post)
- Total Data Size transferred A metric capturing file size of data ingested (Post)
Waiting for LTI engineer to rectify the issueMohd Asad ShaikhMohd Asad Shaikhhttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/147IBM R3M9 - Standard Reference values not yet populated entirely2023-09-28T13:04:40ZDebasis ChatterjeeIBM R3M9 - Standard Reference values not yet populated entirelyWhile working with @sehuboy earlier today, we were checking for WellInterestType (at random) and found that the standard value list has not been completely loaded yet.
[OSDU_PTP_M9_TeamA-IBM-Reference-data-aggregate-query-Debasis.txt](/uploads/77df88389ba765168884ae625e85e6ab/OSDU_PTP_M9_TeamA-IBM-Reference-data-aggregate-query-Debasis.txt)
Please check enclosed.
Thank you
cc @anujgupta and @shamazum for informationhttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/152Azure R3M9 - unable to achieve successful run with manifest Surrogate key - S...2022-08-23T12:47:16ZSteven EvansAzure R3M9 - unable to achieve successful run with manifest Surrogate key - Schema validation errorMultiple attempts to achieve a successful run for Surrogate key manifest ingestion results in failure with the same error in the DAG validate manifest schema log. Using the latest provisioned environment file and the validation collectio...Multiple attempts to achieve a successful run for Surrogate key manifest ingestion results in failure with the same error in the DAG validate manifest schema log. Using the latest provisioned environment file and the validation collection #29 CICD Setup Ingestion running 04a - 04d and 06a. Attached a text file outlining the steps and log response containing the error
This routine mimics Debasis's successful Surrogate run in GCP R3M9.MANISH KUMARMANISH KUMARhttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/153Azure R3M9 - Legal Tag with no opendes prefix2022-08-30T20:40:43ZSehubo AkinyanmiAzure R3M9 - Legal Tag with no opendes prefixAfter creating legal tags as noted below, the opendes prefix is not present in the stored variable (stored as "Test-Legal-Tag-7343099"), which makes the subsequent record-creation step throw a wrong-legal-tag error.
I was able to get around this by editing the legal tag variable to the created value "opendes-Test-Legal-Tag-7343099". Why is the opendes prefix missing from the stored variable string for legal tags?
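The workaround above can be sketched as a small client-side step (names mirror the issue's example; `partition_id` is assumed to be the data partition the collection targets):

```python
# Sketch of the workaround described above: the Legal service stores the tag
# as "<partition>-<name>", so the client-side variable needs the partition
# prefix added before reuse. Names mirror the issue's example.
partition_id = "opendes"
stored_variable = "Test-Legal-Tag-7343099"  # what the collection saved

full_tag = stored_variable
if not full_tag.startswith(partition_id + "-"):
    full_tag = f"{partition_id}-{stored_variable}"

print(full_tag)  # opendes-Test-Legal-Tag-7343099
```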
RESPONSE
{
"name": "opendes-Test-Legal-Tag-7343099",
"description": "Legal Tag added for Well",
"properties": {
"countryOfOrigin": [
"US",
"CA"
],
"contractId": "123456",
"expirationDate": "2025-12-25",
"originator": "Schlumberger",
"dataType": "Third Party Data",
"securityClassification": "Private",
"personalData": "No Personal Data",
"exportClassification": "EAR99"
}
}M9 - Release 0.12MANISH KUMARMANISH KUMARhttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/154Azure R3M9 - Unable to GET Created Record in Storage2021-12-09T14:07:00ZSehubo AkinyanmiAzure R3M9 - Unable to GET Created Record in Storage**I have created below record.**
{
"recordCount": 1,
"recordIds": [
"opendes:master-data--SeismicAcquisitionSurvey:Sehubo-UnitConv-30Nov"
],
"skippedRecordIds": [],
"recordIdVersions": [
"opendes:master-data--SeismicAcquisitionSurvey:Sehubo-UnitConv-30Nov:1638236008097439"
]
}
**I am unable to GET this record in Storage nor Search the same record.**
GET https://{{STORAGE_HOST}}/records/opendes:master-data--SeismicAcquisitionSurvey:Sehubo-UnitConv-30Nov
Response
{
"code": 403,
"reason": "Access denied",
"message": "The user does not have access to the record"
}
**Not SEARCHABLE**
POST https://{{SEARCH_HOST}}/query
body:
{
"kind": "osdu:wks:reference-data--UnitOfMeasure:1.0.0",
"query": "id: \"opendes:master-data--SeismicAcquisitionSurvey:Sehubo-UnitConv-30Nov\""
}
Response:
{
"results": [],
"aggregations": [],
"totalCount": 0
}M9 - Release 0.12MANISH KUMARMANISH KUMARhttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/155IBM R3M9 - Manifest Ingestion - record is not created, but cannot find any er...2022-08-23T12:47:19ZDebasis ChatterjeeIBM R3M9 - Manifest Ingestion - record is not created, but cannot find any error in log filesBoth @esmira.rafigayeva and I encountered this issue in IBM for Master data creation, by Manifest-based Ingestion.
No clue from any of the logs in the Airflow Console, but the job did not succeed.
See details here.
[IBM-Manifest-ingestion-Master-Seismic-issue.txt](/uploads/24bc65c18a1cb6958fad3fb8e5ff8628/IBM-Manifest-ingestion-Master-Seismic-issue.txt)
For another attempt (Well), we see the job completing successfully.
[IBM-Manifest-ingestion-Master-Well-success.txt](/uploads/7e3e8ffffd60a0f56e1d68617a61ef35/IBM-Manifest-ingestion-Master-Well-success.txt)
Can you please check to see if you can get some clue? How can one troubleshoot the problem if none of the DAG components is showing any error?
Thank you
cc - @sje7253bp , @chad , @anujgupta for informationhttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/156Azure - R3M9 - Load testing - 500 records2022-08-23T12:47:24ZWira YarmanAzure - R3M9 - Load testing - 500 recordsLoad testing for Azure on 500 records. The Airflow DAG is not showing the progress. Run id: faa3303b-e31e-403c-9f63-3ce5c09e5f2f
cc - @sehuboy @todaiks @debasisc @naufalnoori89 @manishkhttps://community.opengroup.org/osdu/platform/pre-shipping/-/issues/157Azure- Admin UI keeps logging out when clicking on all features in the website2022-11-06T15:25:32ZNaufal Mohamed NooriAzure- Admin UI keeps logging out when clicking on all features in the websiteClicking any function in https://osdu-qa-admin-ui.azurewebsites.net/dashboard logs the user out and back in. No function (i.e. entitlement / storage / search) can be performed from this website.
CC @debasisc @sehuboy @todaiksMANISH KUMARYaraslau SushchykMANISH KUMAR