Pre-Shipping issues: https://community.opengroup.org/osdu/platform/pre-shipping/-/issues

## M22 IBM Unauthorized error when getting upload signed URL in CSV ingestion (issue 654, 2024-01-15, Taylor Graber)

When following step 5 in the CSV ingestion collection, I receive the following. Please let me know if there is something I need to do on my side that's causing the issue.
GET {{FILE_HOST}}/files/uploadURL
Response:
{
"error": {
"code": 401,
"message": "Unauthorized",
"errors": [
{
"domain": "global",
"reason": "unauthorized",
"message": "Unauthorized"
}
]
}
}
Test Results:
Couldn't evaluate the test script:
TypeError: Cannot read properties of undefined (reading 'SignedURL')
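The TypeError is a side effect of the 401: the Postman test script dereferences `Location.SignedURL` on an error body that has no such field. Before digging into service configuration, it can help to rule out an expired token. A minimal sketch (not part of the collection) that decodes the JWT payload without verifying the signature, purely for debugging:

```python
import base64
import json
import time

def jwt_payload(token: str) -> dict:
    """Decode a JWT's payload without verifying the signature (debugging only)."""
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore stripped base64url padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

def is_expired(token: str) -> bool:
    """True when the token's exp claim is in the past (or absent)."""
    return jwt_payload(token).get("exp", 0) < time.time()
```

If `is_expired` returns True for the bearer token in the failing request, the 401 is explained by token expiry rather than entitlements.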
cURL:
curl --location 'https://cpd-osdu.apps.osdu-preship.ibmodi.com/osdu-file/api/file/v2/files/uploadURL' \
--header 'data-partition-id: opendes' \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer eyJhbGciOiJSUzI1NiIsInR5cCIgOiAiSldUIiwia2lkIiA6ICJ2UmhTM0hBSnB5TU5jSTkwMko2UjR0R0tYYk0zTE94UWhSdkNIMFN5RDJvIn0.eyJleHAiOjE3MDQ5OTUyOTEsImlhdCI6MTcwNDk4NDQ5MSwianRpIjoiMzhmZWYxOGQtMWUxNy00NDJhLWI4MzAtOGMxYjk3YTZhOGY5IiwiaXNzIjoiaHR0cHM6Ly9rZXljbG9hay1vc2R1LmFwcHMub3NkdS1wcmVzaGlwLmlibW9kaS5jb20vYXV0aC9yZWFsbXMvT1NEVSIsInN1YiI6ImE5OWFmMTEzLTBkODktNDI1Ny1iZDIwLWE5Mzg0OWI4MjkwYSIsInR5cCI6IkJlYXJlciIsImF6cCI6Im9zZHUtbG9naW4iLCJzZXNzaW9uX3N0YXRlIjoiNzY3ZDYzZDctYjFjOC00MzQwLWFjZTQtZmU3OGNjYTVmNzJmIiwicmVhbG1fYWNjZXNzIjp7InJvbGVzIjpbImRlZmF1bHQtcm9sZXMtb3NkdSJdfSwic2NvcGUiOiJvcGVuaWQgZW1haWwgcHJvZmlsZSIsInNpZCI6Ijc2N2Q2M2Q3LWIxYzgtNDM0MC1hY2U0LWZlNzhjY2E1ZjcyZiIsInJvb3RVc2VyIjp0cnVlLCJlbWFpbF92ZXJpZmllZCI6ZmFsc2UsIm5hbWUiOiJvc2R1IGJ2dCIsInByZWZlcnJlZF91c2VybmFtZSI6Im9zZHUtYnZ0IiwiZ2l2ZW5fbmFtZSI6Im9zZHUiLCJmYW1pbHlfbmFtZSI6ImJ2dCIsImVtYWlsIjoib3NkdS1idnRAb3NkdS5vcGVuZ3JvdXAub3JnIn0.Snc2NEovp1pNXZIuV_gnfC_GxRJ0tNQv7rNY6hiqKc0WI_H7q7HeYyOy_wtGPzw1Na_30rKi4pnCVE8JA3tBYR6NXI8p8vbFQ3mNRt0u7UXFk8-EzOA_hCiYnyEq69lXLMcbHB2MJYxds_D8GsqW_QFV7YvG1OXxBGQ3KnHuQMks1VNU6Pci3MLgujUI5B-NPg3MYNQzzXpQaVNCpZgsFedEXL1I0RJKxtzcGpC3JvvCbWE5S0w8__TssgNeWIeqgVzgVWgq61DwIEne_mxYevn6OMDHssbiZ5gag9KlZGlsN8lKBRD5j1JxSJeT5ILwKj6f-vpxi8v1ZUVbc4W3eg' \
--header 'Cookie: 5e7c4a992d2558462d1c9fcde6f1cee7=4ec498d778b559cf74a987cfbc06b21b'
Milestone: M22 - Release 0.25

## M22 - RAFS-Azure - HTTPS Code : 422 Data validation failed (issue 653, 2024-01-19, Esakkiprem Subramaniyan)

API Details:
Version : V2
Folder : Samples Analysis [Wettability Index]
API : Add Data
Method : Post
URL : https://{{RAFS_DDMS_HOST}}/v2/samplesanalysis/{{wettabilityindex_record_id}}/data/wettabilityindex
Response :
{"code":422,"reason":"Data validation failed.","errors":{"Invalid type":[{"DataSchema":"{'WettabilityIndexData': {'CapillaryPressureAnalysisID': '{{cp_record_id}}:', 'ForcedImbibedBrineVolume': {'Value': 12, 'UnitOfMeasure': 'opendes:reference-data--UnitOfMeasure:cm3:'}, 'ForcedImbibedOilVolume': {'Value': 32, 'UnitOfMeasure': 'opendes:reference-data--UnitOfMeasure:cm3:'}, 'Temperature': {'Value': 1.0, 'UnitOfMeasure': 'opendes:reference-data--UnitOfMeasure:degF:'}, 'InitialBrineSaturation': {'Value': 0.457, 'UnitOfMeasure': 'opendes:reference-data--UnitOfMeasure:m3%2Fm3:'}, 'InitialOilSaturation': {'Value': 0.98, 'UnitOfMeasure': 'opendes:reference-data--UnitOfMeasure:m3%2Fm3:'}, 'SpontaneousImbibedBrineVolume': {'Value': 6, 'UnitOfMeasure': 'opendes:reference-data--UnitOfMeasure:cm3:'}, 'DisplacedOilVolume': {'Value': 0.7768, 'UnitOfMeasure': 'opendes:reference-data--UnitOfMeasure:cm3:'}, 'BrineImbibitionBrineSaturation': {'Value': 7, 'UnitOfMeasure': 'opendes:reference-data--UnitOfMeasure:m3%2Fm3:'}, 'BrineDisplacementOilSaturation': {'Value': 0.36, 'UnitOfMeasure': 'opendes:reference-data--UnitOfMeasure:m3%2Fm3:'}, 'SpontaneousImbibedOilVolume': {'Value': 0.7768, 'UnitOfMeasure': 'opendes:reference-data--UnitOfMeasure:cm3:'}, 'OilImbibitionBrineSaturation': {'Value': 3.98, 'UnitOfMeasure': 'opendes:reference-data--UnitOfMeasure:m3%2Fm3:'}, 'DisplacedBrineVolume': {'Value': 0.24234000000000003, 'UnitOfMeasure': 'opendes:reference-data--UnitOfMeasure:cm3:'}, 'DisplacementRatio': {'Value': 5, 'UnitOfMeasure': 'opendes:reference-data--UnitOfMeasure:cm3%2Fcm3:'}, 'OilImbibitionOilSaturation': {'Value': 8.1, 'UnitOfMeasure': 'opendes:reference-data--UnitOfMeasure:m3%2Fm3:'}, 'WettabilityIndex': [{'Value': 4.67, 'UnitOfMeasure': 'opendes:reference-data--UnitOfMeasure:m3%2Fm3:', 'WettabilityIndexType': 'opendes:reference-data--WettabilityIndexType:AmottHarvey:'}, {'Value': 3.2, 'UnitOfMeasure': 'opendes:reference-data--UnitOfMeasure:m3%2Fm3:', 'WettabilityIndexType': 
'opendes:reference-data--WettabilityIndexType:AmottHarvey:'}], 'ConfiningPressure': {'Value': 0.453, 'UnitOfMeasure': 'opendes:reference-data--UnitOfMeasure:psi:'}, 'DesaturationMethod': 'opendes:reference-data--DesaturationMethod:CentrifugeOilWater:', 'DisplacingFluid': 'opendes:reference-data--DisplacingFluidType:Decane:', 'DisplacedFluid': 'opendes:reference-data--DisplacedFluidType:Carnation:', 'FluidSystem': 'opendes:reference-data--FluidSystemType:GasOil:', 'InitialWaterSaturation': {'Value': 32, 'UnitOfMeasure': 'opendes:reference-data--UnitOfMeasure:ppk:'}, 'InitialCapillaryPressure': {'Value': 65, 'UnitOfMeasure': 'opendes:reference-data--UnitOfMeasure:bar:'}, 'ForcedWaterImbibition': {'Value': 12, 'UnitOfMeasure': 'opendes:reference-data--UnitOfMeasure:m3%2Fm3:'}, 'ForcedOilImbibition': {'Value': 5.9, 'UnitOfMeasure': 'opendes:reference-data--UnitOfMeasure:m3%2Fm3:'}}}"}]}}

## M22/AWS/Preship - Wellbore DDMS - "Create markerset" step fails (issue 652, 2024-01-10, Debasis Chatterjee)

Step 7.1 from the collection which is provided by CSP.
https://community.opengroup.org/osdu/platform/pre-shipping/-/blob/main/R3-M22/AWS-M22/DDMS%20Wellbore/AWS_OSDU_R3M22_WellboreDDMS_Collection.postman_collection.json
POST {{osduonaws_base_url}}/api/os-wellbore-ddms/ddms/v3/wellboremarkersets
fails with a legal tag problem.
{
"origin": "osdu-data-ecosystem-storage",
"errors": [
{
"code": 400,
"reason": "Invalid legal tags",
"message": "Invalid legal tags found on record"
}
]
}
I checked the curl command and found that the proper tag is being used. I also tried hard-coding the legal tag name instead of using the variable.
Even then it fails with the same error.
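For triaging "Invalid legal tags" errors like this, a small helper (hypothetical, not from the collection) that diffs a record's legal tags against the tags actually present in the partition can confirm whether the tag name itself is the problem:

```python
def invalid_legal_tags(record: dict, partition_tags: set) -> list:
    """Return legal tags on the record that the partition does not know about.
    `partition_tags` would come from the Legal service's list-tags endpoint."""
    tags = record.get("legal", {}).get("legaltags", [])
    return [t for t in tags if t not in partition_tags]
```

An empty result means the tag names match, pointing the investigation toward tag validity (expiry, country of origin) rather than the name.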
cc @ydzeng, @cailletg and @Diddakuntla

## RI: Entitlement swagger UI is down (issue 651, 2024-01-08, Chad Leong)

Entitlement swagger UI is down: https://osdu.bm22.gcp.gnrg-osdu.projects.epam.com/api/entitlements/v2/swagger-ui/index.html
Expected behavior:
https://osdu-ship.msft-osdu-test.org/api/entitlements/v2/swagger-ui/index.html
Assignee: Dzmitry Malkevich (EPAM)

## M21 Azure, Storage service - one field in record is not indexed and is not seen from Search response (issue 649, 2023-12-16, Debasis Chatterjee)

Working with a Dataset record, I am facing a problem with the field TotalSize.
https://community.opengroup.org/osdu/data/data-definitions/-/blob/master/Examples/dataset/FileCollection.Slb.OpenZGY.1.1.0.json
It seems to be a valid field; see line 45 of that file.
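One way to narrow down whether TotalSize was indexed is to ask Search to return only that field for the record. A sketch of the request body (assuming the usual Search API query syntax; the kind and id below are placeholders):

```python
def field_probe_query(kind: str, record_id: str, field: str) -> dict:
    """Build a Search API request body that returns only the field under
    investigation; a hit without the field suggests it was not indexed."""
    return {
        "kind": kind,
        "query": f'id:"{record_id}"',
        "returnedFields": [field],
        "limit": 1,
    }
```

If the record comes back but the returned field is absent, the problem is on the indexing side rather than in the stored record.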
I am enclosing documented steps showing the problem.
[M21-Azure-Storage-service-issue-in-Dataset-record.docx](/uploads/145c09eae42b90ab3ee6fb3ed433921e/M21-Azure-Storage-service-issue-in-Dataset-record.docx)
Assignee: Debasis Chatterjee

## M21 Azure RAFS DDMS - Gas Chromatography fails when adding content data (issue 648, 2024-01-08, Debasis Chatterjee)

Check this record.
opendes:work-product-component--SamplesAnalysis:2aa099e6b39840578b341c3f3535b038:1701535074042641
I get failure when adding data.
```
{
"code": 422,
"reason": "Data validation failed.",
"errors": {
"Missing records in storage": [
"opendes:reference-data--CompoundsAreaHeight:HEIGHT",
"opendes:reference-data--CompoundsAreaHeight:AREA"
]
}
}
```
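Since the 422 lists reference-data records that are missing from storage, it can help to collect every reference-data ID a record uses before ingesting it, and check that list against storage in one pass. A hypothetical helper:

```python
import re

# Matches OSDU reference-data record IDs such as
# "opendes:reference-data--UnitOfMeasure:cm3:" (trailing colon optional).
_REF_ID = re.compile(r"[\w.\-]+:reference-data--[\w.]+:[^:\s]+:?")

def referenced_reference_data(obj) -> set:
    """Recursively collect reference-data record IDs used anywhere in a
    record, so they can be checked against storage before ingestion."""
    found = set()
    if isinstance(obj, dict):
        for value in obj.values():
            found |= referenced_reference_data(value)
    elif isinstance(obj, list):
        for value in obj:
            found |= referenced_reference_data(value)
    elif isinstance(obj, str):
        found |= set(_REF_ID.findall(obj))
    return found
```

Feeding the collected IDs to the Storage query endpoint before ingestion would surface missing manifests like CompoundsAreaHeight:HEIGHT/AREA up front.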
cc @ernesto_gutierrez, @Siarhei_Khaletski
Milestone: M22 - Release 0.25

## M21 Azure RAFS DDMS - RCA and Electricalproperties, GCTMS, TriaxialTest, CEC, CapPressure, EDS, Wettability Index missing content schema (issue 647, 2024-01-08, Debasis Chatterjee)

Similar to NMR, I tried RCA.
Made similar changes to create the record.
opendes:work-product-component--SamplesAnalysis:6f761704b53d420c819bf3a5204e3a77:1701530363638834
But when I try to add data, then I get this error.
```
{
"code": 422,
"reason": "Unimplemented model for route /api/rafs-ddms/v2/samplesanalysis/opendes:work-product-component--SamplesAnalysis:6f761704b53d420c819bf3a5204e3a77/data/routinecoreanalyses."
}
```
Next I tried to see how many tests are supported.
GET https://{{RAFS_DDMS_HOST}}/v2/samplesanalysis/analysistypes
The response does not include routinecoreanalyses; it shows:
```
"rocksampleanalyses": [
"1.0.0"
],
```
Please check this.
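The mismatch between the analysistypes listing and the data routes suggests a simple pre-flight guard (an assumption about the intended workflow, not from the collection): look the content type up in the `/samplesanalysis/analysistypes` response before calling its data route.

```python
def supported_versions(analysistypes: dict, analysis_type: str):
    """Return the schema versions listed for a content type, or None when the
    data route would presumably answer 'Unimplemented model'."""
    return analysistypes.get(analysis_type)
```

Note this guard would not have caught the electricalproperties case below, where the type is listed but the route still fails, which points at a service-side gap rather than client error.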
=============== Next I tried electricalproperties ====
Here also I get this error.
```
{
"code": 422,
"reason": "Unimplemented model for route /api/rafs-ddms/v2/samplesanalysis/opendes:work-product-component--SamplesAnalysis:31eb4ec0e5644ee990b7ef0165492d34/data/electricalproperties."
}
```
Although when I list the available content schemas, this entry appears:
```
"electricalproperties": [
"1.0.0"
],
```
Milestone: M22 - Release 0.25
Assignee: Debasis Chatterjee

## M21/Azure/Preship - RAFS DDMS V2 API - fails to create record for NMR (issue 646, 2023-11-29, Debasis Chatterjee)

I used this collection.
First step works. But step to create NMR Sample Analysis wpc record fails.
@marneson confirmed that he gets the same failure.
```
{
"code": 422,
"reason": "Validation failed for rafsddms:wks:work-product-component--SamplesAnalysis:1.0.0. Skipped records [{'id': None, 'reason': ValidationError(model='SamplesAnalysis', errors=[{'loc': ('data', 'ParentSamplesAnalysesReports', 0), 'msg': 'str type expected', 'type': 'type_error.str'}, {'loc': ('data', 'DatePublished'), 'msg': 'invalid datetime format', 'type': 'value_error.datetime'}])}]"
}
```
I tried changing the date to the full format (with timezone), but the error persists. There is also a second error, with the parent record.
"AnalysisDate": "2022-12-16T11:46:20.163Z",
Please check and advise.
Assignee: Siarhei Khaletski (EPAM)

## M21 Azure RAFS - Sample Analysis Aromatics Request fails (issue 645, 2023-12-04, Michael)

I encountered a "Data validation failed" error when executing the "V2/Samples Analysis[aromatics]/02- Add data" request in the RAFS Postman collection: https://community.opengroup.org/osdu/platform/pre-shipping/-/blob/main/R3-M20/Azure-M20/Services/DDMS/RAFS/RAFSDDMSAPI/RAFSDDMS_API_CI-CD_v1.0.postman_collection.json.
Full details are below:
```
curl --location 'https://osdu-ship.msft-osdu-test.org/api/rafs-ddms/v2/samplesanalysis/opendes:work-product-component--SamplesAnalysis:19e52be4b008425e93f2ba04fa247f89/data/gcmsaromatics' \
--header 'Authorization: Bearer eyJ0eXAiOiJKV1QiLCJhbGciOiJSUzI1NiIsIng1dCI6IlQxU3QtZExUdnlXUmd4Ql82NzZ1OGtyWFMtSSIsImtpZCI6IlQxU3QtZExUdnlXUmd4Ql82NzZ1OGtyWFMtSSJ9.eyJhdWQiOiJhYjMyMGVkMy05Y2RkLTQ3OTgtOGUzYy0yYTY1NzgwMDE4M2IiLCJpc3MiOiJodHRwczovL3N0cy53aW5kb3dzLm5ldC81ODk3NWZkMy00OTc3LTQ0ZDAtYmVhOC0zN2FmMGJhYWMxMDAvIiwiaWF0IjoxNzAwNjc3NDcyLCJuYmYiOjE3MDA2Nzc0NzIsImV4cCI6MTcwMDY4MjU3NCwiYWNyIjoiMSIsImFpbyI6IkFUUUF5LzhWQUFBQW5lM250NU5PMkYxdWhTYlB4bzA3ZVcrb3JHaVJhZ0FTRXhuTGwrNHZxMVNqNlhwbGxVQVlwbTB0R1BvajBSMUEiLCJhbXIiOlsicHdkIl0sImFwcGlkIjoiYWIzMjBlZDMtOWNkZC00Nzk4LThlM2MtMmE2NTc4MDAxODNiIiwiYXBwaWRhY3IiOiIxIiwiZmFtaWx5X25hbWUiOiJEZWZhdWx0IiwiZ2l2ZW5fbmFtZSI6IlByZXNoaXAiLCJpcGFkZHIiOiIxMDYuMjA4LjQ2LjQ1IiwibmFtZSI6IlByZXNoaXBwaW5nIiwib2lkIjoiOGUwYjQ2NDQtMGVkMC00MWYyLThjNmQtZDZhYWE3OTg2MTgyIiwicmgiOiIwLkFUY0EwMS1YV0hkSjBFUy1xRGV2QzZyQkFOTU9NcXZkbkpoSGpqd3FaWGdBR0RzM0FNVS4iLCJzY3AiOiJEaXJlY3RvcnkuUmVhZC5BbGwgVXNlci5SZWFkIiwic3ViIjoiOTdwUWdKdFJGSDk5WTFLVml3RlY0R2FBRHhLc0llUkc5WlBKLTRQbk1iMCIsInRpZCI6IjU4OTc1ZmQzLTQ5NzctNDRkMC1iZWE4LTM3YWYwYmFhYzEwMCIsInVuaXF1ZV9uYW1lIjoicHJlc2hpcHBpbmdAYXp1cmVnbG9iYWwxLm9ubWljcm9zb2Z0LmNvbSIsInVwbiI6InByZXNoaXBwaW5nQGF6dXJlZ2xvYmFsMS5vbm1pY3Jvc29mdC5jb20iLCJ1dGkiOiJkdzFHTlhBbklVLTNpbVl3VjF4T0FRIiwidmVyIjoiMS4wIn0.j_H7lu_en2yZwK-TyzwSpHhWx39s5TEGdZeu1wIzG_97TdIAjiOZBmGtJdFNAzHtyAMOK-gSW9dUblg3FOQvgid1O9eK0EhnhUuFeQ3E3lChFBTVxYLNKyZxXtIHsw9CQKwDS2oahyozEGqLHxA87OINC_2Ix_Mfy5w-dyfuFV_zsiP_tnxBsEMjAabJM4DWs4HDAwNRTJ-46_e52r7f9o8IXRrMH-2ehfKmSFuPPJnKo2Q1jCrLQXkGJ_o5731ihQ6UqKdD-MMsYlVvEqwoDTlKwrPTUR0zO2iL5EVxJVVb7bEG1GUKkbcYmpJ9FnSx81k_-j7JRUq49ZAB68PHMQ' \
--header 'data-partition-id: opendes' \
--header 'Content-Type: application/json' \
--header 'Accept: */*;version=1.0.0' \
--data '{
"columns": [
"SamplesAnalysisID",
"SampleID",
"AromaticBiomarkers",
"StdCompound"
],
"index": [
0
],
"data": [
[
"opendes:work-product-component--SamplesAnalysis:19e52be4b008425e93f2ba04fa247f89:",
"opendes:master-data--Sample:test:",
[
{
"CompoundCode": "opendes:reference-data--AromaticBiomarkersCompounds:mn1:",
"RetentionTime": {
"Value": 12.34,
"UnitOfMeasure": "opendes:reference-data--UnitOfMeasure:min:"
},
"Ion": "K+",
"Peak": [
{
"Value": 56.78,
"UnitOfMeasure": "opendes:reference-data--UnitOfMeasure:g:"
},
{
"Value": 90.12,
"UnitOfMeasure": "opendes:reference-data--UnitOfMeasure:g:"
}
]
}
],
{
"CompoundCode": "opendes:reference-data--AromaticBiomarkersCompounds:mn1:",
"RetentionTime": {
"Value": 45.67,
"UnitOfMeasure": "opendes:reference-data--UnitOfMeasure:min:"
},
"Ion": "Mg2+",
"Peak": [
{
"Value": 78.9,
"UnitOfMeasure": "opendes:reference-data--UnitOfMeasure:g:"
},
{
"Value": 12.34,
"UnitOfMeasure": "opendes:reference-data--UnitOfMeasure:g:"
}
]
}
]
]
}'
```
Response:
```
{
"code": 422,
"reason": "Data validation failed.",
"errors": {
"Invalid type": [
{
"DataSchema": "{'StdCompound': {'CompoundCode': 'opendes:reference-data--AromaticBiomarkersCompounds:mn1:', 'RetentionTime': {'Value': 45.67, 'UnitOfMeasure': 'opendes:reference-data--UnitOfMeasure:min:'}, 'Ion': 'Mg2+', 'Peak': [{'Value': 78.9, 'UnitOfMeasure': 'opendes:reference-data--UnitOfMeasure:g:'}, {'Value': 12.34, 'UnitOfMeasure': 'opendes:reference-data--UnitOfMeasure:g:'}]}}"
}
]
}
}
```
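The "Invalid type" entry points at StdCompound. One plausible cause (an assumption; the source does not confirm the expected schema) is that the column expects an array of compound objects, as AromaticBiomarkers is sent, while the payload passes a bare object. A sketch that normalizes such columns before posting:

```python
def normalize_row(columns, row, array_columns=("StdCompound", "AromaticBiomarkers")):
    """Wrap bare objects in a one-element list for columns assumed to take
    arrays of compound objects (hypothetical schema assumption)."""
    fixed = []
    for name, value in zip(columns, row):
        if name in array_columns and isinstance(value, dict):
            value = [value]
        fixed.append(value)
    return fixed
```

If the service still rejects the wrapped form, the schema expectation differs and the DataSchema string in the error body is the place to compare against.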
The steps I used can be found here: [M21-Azure-RAFS-testing-steps-Michael.docx](/uploads/97b917ad78b2097e010a4f3fa8ac5a30/M21-Azure-RAFS-testing-steps-Michael.docx)
Assignees: Siarhei Khaletski (EPAM), Om Prakash Gupta

## M21 Azure GCZ - query requests give "does not exist in cluster with these caches" error (issue 644, 2023-11-22, Michael)

All `/gcz/FeatureServer/{{number}}/query` requests fail from the Postman collection: https://community.opengroup.org/osdu/platform/pre-shipping/-/blob/main/R3-M21/Azure-M21/Services/GCZ/Azure-Geospatial%20Consumption%20Zone%20-%20Provider%20Postman%20Tests.postman_collection.json
The GCZ Postman collection uses a PROVIDER_URL variable which is not defined in the Azure M21 Postman environment: https://community.opengroup.org/osdu/platform/pre-shipping/-/blob/main/R3-M21/Azure-M21/Environment/Verify_Preshipping_New_TeamA-E.postman_environment.json
I used the PROVIDER_URL variable from a previous milestone: `https://osdu-gcz.msft-osdu-test.org`
Below are some examples of the failed requests:
`curl --location 'https://osdu-gcz.msft-osdu-test.org/ignite-provider/gcz/FeatureServer/1/query'`
Response:
```
{
"error": "osdu_wks_master-data--Well_1.0.0 does not exist in cluster with these caches: "
}
```
`curl --location 'https://osdu-gcz.msft-osdu-test.org/ignite-provider/gcz/FeatureServer/2/query'`
Response:
```
{
"error": "osdu_wks_master-data--Wellbore_1.0.0 does not exist in cluster with these caches: "
}
```
`curl --location 'https://osdu-gcz.msft-osdu-test.org/ignite-provider/gcz/FeatureServer/3/query'`
Response:
```
{
"error": "osdu_wks_master-data--SeismicAcquisitionSurvey_1.0.0 does not exist in cluster with these caches: "
}
```
When running the Get Cache Size request, a "CACHE_NOT_INITIALIZED" error is returned:
`curl --location 'https://osdu-gcz.msft-osdu-test.org/gcz/transformer/admin/cacheSize?kind=osdu%3Awks%3Amaster-data--Well%3A1.0.0&userEmail=preshipping%40azureglobal1.onmicrosoft.com'`
Response:
```
{
"error": "CACHE_NOT_INITIALIZED"
}
```
The UPDATE Cache request does work and the Get Layer Definition request does work.
Assignees: shivani karipe, Om Prakash Gupta

## In R3M21 Pre-ship RI environment the endpoint to patch commit WellLog session for Wellbore DDMS is failing with 500 Internal Server Error (issue 643, 2024-02-26, Kamlesh Todai)

In the R3M21 Pre-ship RI environment, the endpoint to PATCH-commit a WellLog session for Wellbore DDMS is failing with 500 Internal Server Error.
<details><summary>Request to commit wellog session</summary>
curl --location --request PATCH 'https://osdu.bm21.gcp.gnrg-osdu.projects.epam.com/api/os-wellbore-ddms/ddms/v3/welllogs/osdu:work-product-component--WellLog:AutoTest_999714120507/sessions/null' \
--header 'data-partition-id: osdu' \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer eyJhbGciOiJSUzI1Ni...Truncated...JzRkR1sswQtS-Mtg' \
--data '{
"state": "commit"
}'
</details>
<details><summary>Response for the above request</summary>
Response 500 Internal Server Error
{
"error": [
"Access Denied."
]
}
</details>
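Worth noting that the failing request PATCHes `.../sessions/null`: the session id in the URL is the literal string `null`, which suggests the session-creation step did not populate the collection variable. A small guard (hypothetical helper, not from the collection) that refuses to build such a URL:

```python
def build_commit_url(base_url: str, welllog_id: str, session_id: str) -> str:
    """Refuse to build the commit URL when the session id is unset, instead of
    sending a PATCH to .../sessions/null."""
    if not session_id or session_id in ("null", "undefined"):
        raise ValueError("session id is unset; create a session first")
    return f"{base_url}/ddms/v3/welllogs/{welllog_id}/sessions/{session_id}"
```

That said, the service answering 500 "Access Denied." for a bad path parameter (rather than 4xx) is still a server-side issue in its own right.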
The same request from the same collection is working in the GC and Azure environments.
The collection can be found [here](https://community.opengroup.org/osdu/platform/pre-shipping/-/blob/main/R3-M21/QA_Artifacts_M21/envFilesAndCollections/Wellbore%20DDMS%20CI-CD%20v3.0.postman_collection.json?ref_type=heads).
Milestone: M22 - Release 0.25
Assignees: Kamlesh Todai, Yan Sushchynski (EPAM), Dzmitry Malkevich (EPAM)

## Azure M21 Preshipping - Issue with Osdu_ingest Validation task (issue 642, 2023-11-17, Priyanka Bhongade)

- Osdu_ingest fails at the Validation task with the error below.
ClientSecretCredential.get_token succeeded
[2023-11-15, 13:20:30 UTC] {validate_schema.py:166} **ERROR - Error on getting schema of kind 'osdu:wks:Manifest:1.0.0'**
[2023-11-15, 13:20:30 UTC] {validate_schema.py:167} ERROR - 401 Client Error: Unauthorized for url: http://schema.osdu-azure.svc.cluster.local/api/schema-service/v1/schema/osdu:wks:Manifest:1.0.0
[2023-11-15, 13:20:30 UTC] {taskinstance.py:1718} ERROR - Task failed with exception
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1334, in _run_raw_task self._execute_task_with_callbacks(context)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1460, in _execute_task_with_callbacks result = self._execute_task(context, self.task)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1516, in _execute_task result = execute_callable(context=context)
File "/home/airflow/.local/lib/python3.8/site-packages/osdu_airflow/operators/validate_manifest_schema.py", line 84, in execute _ = schema_validator.validate_common_schema(manifest_data)
File "/home/airflow/.local/lib/python3.8/site-packages/osdu_ingestion/libs/validation/validate_schema.py", line 362, in validate_common_schema raise GenericManifestSchemaError(
osdu_ingestion.libs.exceptions.GenericManifestSchemaError: **There is no schema for Manifest kind osdu:wks:Manifest:1.0.0**
Milestone: M21 - Release 0.24
Assignee: saketh somaraju

## Ingestion By Reference in RI (baremetal/anthos) implementation is not ingesting the data (issue 639, 2023-11-14, Kamlesh Todai)

Ingestion By Reference (Osdu_ingest_by_reference) in RI (baremetal/anthos) implementation is not ingesting the data. I am able to upload and download the JSON file used for ingestion.
Airflow is returning the status "finished" (green), but the records are not getting ingested.
Upon looking at the airflow logs, one sees the following messages
<details><summary>airflow log</summary>
osdu-ingest-by-reference-update-status-finished-task-a6jlna1m
*** Found logs in s3:
*** * s3://airflow-log/logs/dag_id=Osdu_ingest_by_reference/run_id=f7882867-d918-41f1-b047-f4c21be6c00c/task_id=update_status_finished_task/attempt=1.log
[2023-11-13, 19:03:06 UTC] {taskinstance.py:1103} INFO - Dependencies all met for dep_context=non-requeueable deps ti=<TaskInstance: Osdu_ingest_by_reference.update_status_finished_task f7882867-d918-41f1-b047-f4c21be6c00c [queued]>
[2023-11-13, 19:03:07 UTC] {taskinstance.py:1103} INFO - Dependencies all met for dep_context=requeueable deps ti=<TaskInstance: Osdu_ingest_by_reference.update_status_finished_task f7882867-d918-41f1-b047-f4c21be6c00c [queued]>
[2023-11-13, 19:03:07 UTC] {taskinstance.py:1308} INFO - Starting attempt 1 of 1
[2023-11-13, 19:03:07 UTC] {taskinstance.py:1327} INFO - Executing <Task(UpdateStatusOperatorByReference): update_status_finished_task> on 2023-11-13 19:02:09.292958+00:00
[2023-11-13, 19:03:07 UTC] {standard_task_runner.py:57} INFO - Started process 17 to run task
[2023-11-13, 19:03:07 UTC] {standard_task_runner.py:84} INFO - Running: ['airflow', 'tasks', 'run', 'Osdu_ingest_by_reference', 'update_status_finished_task', 'f7882867-d918-41f1-b047-f4c21be6c00c', '--job-id', '6044', '--raw', '--subdir', 'DAGS_FOLDER/external/osdu-ingest-r3-by-reference.py', '--cfg-path', '/tmp/tmpop0igj0q']
[2023-11-13, 19:03:07 UTC] {standard_task_runner.py:85} INFO - Job 6044: Subtask update_status_finished_task
[2023-11-13, 19:03:07 UTC] {task_command.py:410} INFO - Running <TaskInstance: Osdu_ingest_by_reference.update_status_finished_task f7882867-d918-41f1-b047-f4c21be6c00c [running]> on host osdu-ingest-by-reference-update-status-finished-task-a6jlna1m
[2023-11-13, 19:03:07 UTC] {pod_generator.py:529} WARNING - Model file does not exist
[2023-11-13, 19:03:07 UTC] {taskinstance.py:1545} INFO - Exporting env vars: AIRFLOW_CTX_DAG_OWNER='airflow' AIRFLOW_CTX_DAG_ID='Osdu_ingest_by_reference' AIRFLOW_CTX_TASK_ID='update_status_finished_task' AIRFLOW_CTX_EXECUTION_DATE='2023-11-13T19:02:09.292958+00:00' AIRFLOW_CTX_TRY_NUMBER='1' AIRFLOW_CTX_DAG_RUN_ID='f7882867-d918-41f1-b047-f4c21be6c00c'
[2023-11-13, 19:03:07 UTC] {update_status_by_reference.py:75} INFO - There are successed tasks before this one. So it has status SUCCESSED
[2023-11-13, 19:03:07 UTC] {logging_mixin.py:149} INFO - user_id in Context Initialization is None
[2023-11-13, 19:03:08 UTC] {logging_mixin.py:149} WARNING - /opt/bitnami/airflow/venv/lib/python3.9/site-packages/urllib3/connectionpool.py:1045 InsecureRequestWarning: Unverified HTTPS request is being made to host 's3.bm21.gcp.gnrg-osdu.projects.epam.com'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings
[2023-11-13, 19:03:08 UTC] {update_status_by_reference.py:201} **ERROR - #SKIPPED_IDS: Some ids in the manifest were skipped**. You can find the report in the datasetService with this record id : osdu:dataset--File.Generic:ea9c9e40f8474a57952fd8df4870ad64
[2023-11-13, 19:03:08 UTC] {taskinstance.py:1345} INFO - Marking task as SUCCESS. dag_id=Osdu_ingest_by_reference, task_id=update_status_finished_task, execution_date=20231113T190209, start_date=20231113T190306, end_date=20231113T190308
[2023-11-13, 19:03:08 UTC] {local_task_job_runner.py:225} INFO - Task exited with return code 0
[2023-11-13, 19:03:08 UTC] {taskinstance.py:2651} INFO - 0 downstream tasks scheduled from follow-on schedule check
</details>
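The log marks the task SUCCESS even though SKIPPED_IDS was reported, and the skip report itself is stored as a dataset record. A helper (hypothetical) to pull that report's record id out of the airflow log text, so the report can then be fetched via the Dataset service:

```python
import re

def skipped_report_id(log_text: str):
    """Extract the dataset record id of the SKIPPED_IDS report from an
    airflow log, or None when no report was mentioned."""
    match = re.search(r"record id\s*:\s*(\S+:dataset--\S+:[0-9a-f]+)", log_text)
    return match.group(1) if match else None
```

Downloading that report should explain why the manifest ids were skipped despite the "finished" workflow status.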
<details><summary>Upload the file</summary>
curl --location --request PUT 'https://s3.bm21.gcp.gnrg-osdu.projects.epam.com/refi-osdu-staging-area/d89cb375-6ce1-48d6-8b2c-681ee8b2c776/3d578febbd01444e94e208b09dbc3722?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=fileUser%2F20231113%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20231113T190106Z&X-Amz-Expires=3600&X-Amz-SignedHeaders=host&X-Amz-Signature=7fedf88d44eec464d3b9348c62ae07300be7136d70501cca021f199416861757' \
--header 'x-ms-blob-type: BlockBlob' \
--header 'data-partition-id: osdu' \
--header 'Content-Type: application/json' \
--data '@anthos_IngestByRefTest_2Master_records.json'
Response 200 OK
</details>
<details><summary>Request</summary>
curl --location 'https://osdu.bm21.gcp.gnrg-osdu.projects.epam.com/api/workflow/v1/workflow/Osdu_ingest_by_reference/workflowRun' \
--header 'data-partition-id: osdu' \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer eyJhbGciOi...Truncated...iw2woo0P53Q' \
--data '{
"executionContext": {
"Payload": {
"AppKey": "test-app",
"data-partition-id": "osdu"
},
"manifest": "osdu:dataset--File.Generic:dbd47f02fa1a4ab3b48ede6777406840"
}
}'
Response 200 OK
{
"workflowId": "09b47b8a-b0e1-4c08-8742-c3eba971d203",
"runId": "f7882867-d918-41f1-b047-f4c21be6c00c",
"startTimeStamp": 1699902128741,
"status": "submitted",
"submittedBy": "osdu-tester@service.local"
}
</details>
<details><summary>Check the ingestion status</summary>
curl --location 'https://osdu.bm21.gcp.gnrg-osdu.projects.epam.com/api/workflow/v1/workflow/Osdu_ingest_by_reference/workflowRun/f7882867-d918-41f1-b047-f4c21be6c00c' \
--header 'Data-Partition-Id: osdu' \
--header 'Authorization: Bearer eyJhbGciOi...Truncated...iw2woo0P53Q' \
--data ''
Response 200 OK
{
"workflowId": "09b47b8a-b0e1-4c08-8742-c3eba971d203",
"runId": "f7882867-d918-41f1-b047-f4c21be6c00c",
"startTimeStamp": 1699902128741,
"endTimeStamp": 1699902187583,
"status": "finished",
"submittedBy": "osdu-tester@service.local"
}
</details>
<details><summary>Search the record</summary>
curl --location 'https://osdu.bm21.gcp.gnrg-osdu.projects.epam.com/api/storage/v2/records/osdu:master-data--Well:0' \
--header 'Content-Type: application/json' \
--header 'data-partition-id: osdu' \
--header 'Authorization: Bearer eyJhbGciOi...Truncated...iw2woo0P53Q' \
--data ''
Response 404 Not Found
{
"code": 404,
"reason": "Record not found",
"message": "The record 'osdu:master-data--Well:0' was not found"
}
</details>
Payload file attached
[anthos_IngestByRefTest_2Master_records.json](/uploads/4ded7158408e06f28d5d72e63f29a018/anthos_IngestByRefTest_2Master_records.json)
Milestone: M21 - Release 0.24
Assignee: Dzmitry Malkevich (EPAM)

## In RI R3M21 pre-ship environment not able to upload the file using the SignedURL obtained from the uploadURL endpoint of the FILE API (issue 638, 2023-11-10, Kamlesh Todai)

<details><summary>Request to get the signedURL</summary>
curl --location 'https://osdu.bm21.gcp.gnrg-osdu.projects.epam.com/api/file/v2/files/uploadURL' \
--header 'Data-Partition-Id: osdu' \
--header 'Authorization: Bearer eyJhbGciOi...Truncated...jfa80htjkDM2QwkSbnf1orug'
</details>
<details><summary>Response to the above request</summary>
{
"FileID": "64d6d4c4a0fb4e30b6ebcee92194a0dd",
"Location": {
"SignedURL": "http://minio:9000/refi-osdu-staging-area/345679c2-1a66-4bcf-b4ec-2d1d46247bb7/64d6d4c4a0fb4e30b6ebcee92194a0dd?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=fileUser%2F20231109%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20231109T204858Z&X-Amz-Expires=3600&X-Amz-SignedHeaders=host&X-Amz-Signature=dd1fc0acfcada76611d31c28c0764f8c0df9cdc5d897d4f16e3bc024de0e8a82",
"FileSource": "/345679c2-1a66-4bcf-b4ec-2d1d46247bb7/64d6d4c4a0fb4e30b6ebcee92194a0dd"
}
}
</details>
<details><summary>Upload File by using SignedURL from the above response</summary>
curl --location --request PUT 'http://minio:9000/refi-osdu-staging-area/345679c2-1a66-4bcf-b4ec-2d1d46247bb7/64d6d4c4a0fb4e30b6ebcee92194a0dd?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=fileUser%2F20231109%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20231109T204858Z&X-Amz-Expires=3600&X-Amz-SignedHeaders=host&X-Amz-Signature=dd1fc0acfcada76611d31c28c0764f8c0df9cdc5d897d4f16e3bc024de0e8a82' \
--header 'x-ms-blob-type: BlockBlob' \
--form '=@"7004_a1501_1978_comp.las"'
</details>
<details><summary>Response to the above request</summary>
Error: getaddrinfo ENOTFOUND minio
Request Headers
x-ms-blob-type: BlockBlob
User-Agent: PostmanRuntime/7.34.0
Accept: */*
Postman-Token: ecf0cbe1-42a3-46c2-ad08-c9e90780102b
Host: minio:9000
Accept-Encoding: gzip, deflate, br
Connection: keep-alive
</details>
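The `ENOTFOUND minio` error above suggests the signed URL points at an in-cluster hostname (`minio:9000`) that external clients cannot resolve, so the upload fails before any HTTP request is made. A minimal pre-flight sketch in Python (the dotted-hostname heuristic for "externally resolvable" is an assumption, not part of the File service contract):

```python
from urllib.parse import urlparse

def looks_external(signed_url: str) -> bool:
    """Heuristic: a bare, dot-free host (e.g. 'minio') is almost certainly
    an in-cluster service name that external clients cannot resolve."""
    host = urlparse(signed_url).hostname or ""
    return "." in host

# SignedURL value taken from the response shown above (query string omitted)
signed_url = ("http://minio:9000/refi-osdu-staging-area/"
              "345679c2-1a66-4bcf-b4ec-2d1d46247bb7/64d6d4c4a0fb4e30b6ebcee92194a0dd")

if not looks_external(signed_url):
    # Flag the problem instead of attempting a PUT that will fail with ENOTFOUND
    print(f"SignedURL host {urlparse(signed_url).hostname!r} looks internal; "
          "the File service's storage endpoint is likely misconfigured.")
```

A check like this distinguishes a client-side DNS problem from a deployment problem: here the fix belongs in the environment's File service configuration, not in the Postman collection.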
Because of this, workflows that require uploading files cannot be performed.
Milestone: M21 - Release 0.24 · Assignee: Dzmitry Malkevich (EPAM)

https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/637
In GC R3 M21 the Ingestion by Reference DAG does not seem to be running. (Kamlesh Todai, 2023-11-10)
I can register the DAG:
{
"code": 409,
"reason": "Conflict",
"message": "Workflow with name Osdu_ingest_by_reference_partially already exists."
}
But when I trigger the DAG, I get submitted status but then it fails
{
"workflowId": "4b26e189-ae13-4f22-91f3-759748391dd9",
"runId": "24a7152e-532c-450d-a6af-0f665cd1cb34",
"startTimeStamp": 1699562145265,
"status": "submitted",
"submittedBy": "preshipping_test_user_m19@gcp.gnrg-osdu.projects.epam.com"
}
When I check the status
{
"workflowId": "4b26e189-ae13-4f22-91f3-759748391dd9",
"runId": "24a7152e-532c-450d-a6af-0f665cd1cb34",
"startTimeStamp": 1699562145265,
"endTimeStamp": 1699562189176,
"status": "failed",
"submittedBy": "preshipping_test_user_m19@gcp.gnrg-osdu.projects.epam.com"
}
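The submitted-then-failed sequence above can be watched with a simple status poll. A sketch in Python, with the Workflow service call abstracted behind a `fetch_status` callable (the endpoint path in the comment and the set of terminal states are assumptions based on the responses shown here):

```python
import time
from typing import Callable

TERMINAL = {"finished", "failed"}  # assumed terminal statuses

def poll_run(fetch_status: Callable[[], dict], interval: float = 5.0,
             timeout: float = 300.0) -> dict:
    """Poll a workflow run until it reaches a terminal status or times out."""
    deadline = time.monotonic() + timeout
    while True:
        run = fetch_status()  # e.g. GET /workflow/{name}/workflowRun/{runId}
        if run.get("status") in TERMINAL:
            return run
        if time.monotonic() >= deadline:
            raise TimeoutError(f"run still in status {run.get('status')!r}")
        time.sleep(interval)

# Stubbed example reproducing the submitted -> failed transition:
states = iter([{"status": "submitted"}, {"status": "failed"}])
result = poll_run(lambda: next(states), interval=0.0)
print(result["status"])  # failed
```

With the run reported as failed but no Airflow task instances executed, the failure most likely happened before task scheduling, which is why the dashboard shows nothing to inspect.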
When I go to the airflow dashboard I see
![image](/uploads/bb77c42765b3e7040036c83c57b76c6b/image.png)
No jobs have been run (executed), so one cannot even check the reason for the failure.
Milestone: M21 - Release 0.24 · Assignee: Dzmitry Malkevich (EPAM)

https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/636
Postman Collection wrt RDDMS Manifest Ingestion Check Status fails (Soumik Dutta, 2023-11-17)
The Workflow API might have an issue when checking Manifest Ingestion Status (MA4); it returns an unexpected response.
Postman Collection for reference: https://community.opengroup.org/osdu/platform/pre-shipping/-/blob/main/R3-M21/Azure-M21/Services/DDMS/Reservoir-DDMS/Reservoir_DDMS_Extra_v1.0.postman_collection.json
Query:
curl --location --request GET 'https://osdu-ship.msft-osdu-test.org/api/workflow/v1/workflow/Osdu_ingest/workflowRun/' \
--header 'Authorization: Bearer eyJ0e****' \
--header 'data-partition-id: opendes' \
--header 'Content-Type: application/json' \
--header 'Cookie: JSESSIONID=B23BF32461D12FE7BFF8D016D65D7645' \
--data '{
"executionContext": {
"Payload": {
"AppKey": "test-app",
"data-partition-id": "opendes"
},
"manifest": [
]
}
}'
Response: FAIL
Status is finished | AssertionError: expected undefined to equal 'finished'
Assignee: Deepa Kumari

https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/635
M21 Policy validation issue for non-ascii characters (Dadong Zhou, 2024-03-14)
Related issue: https://community.opengroup.org/osdu/platform/security-and-compliance/policy/-/issues/122
In M21, the new policy validation has an issue with non-ASCII characters. Here is a sample policy tested in M21 GC:
```
package osdu.partition["m19"].organisation_code_2
organisation_code := {
"AGÊNCIA NACIONAL DO PETRÓLEO": {
"Name": "ANP",
"Code": "G0013"
}
}
```
Failed to load the policy with the following error:
```
{
"detail":"Unable to validate policy! Error: {\n \"code\": \"invalid_parameter\",\n \"message\": \"error(s) occurred while compiling module(s)\",\n \"errors\": [\n {\n \"code\": \"rego_parse_error\",\n \"message\": \"unexpected assign token: expected rule value term (e.g., organisation_code := \<VALUE\> { ... })\",\n \"location\": {\n \"file\": \"tmp/m19/organisation_code_2.rego\",\n \"row\": 3,\n \"col\": 19\n },\n \"details\": {\n \"line\": \"organisation_code := {\",\n \"idx\": 18\n }\n }\n ]\n}\n 400."
}
```
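Until the validator is fixed, one possible workaround (an assumption, not a confirmed fix) is to keep the `.rego` source ASCII-only by replacing non-ASCII characters in string literals with `\uXXXX` escapes, which Rego string literals accept. A small Python helper that rewrites such a key:

```python
def ascii_escape(s: str) -> str:
    """Replace every non-ASCII character with its \\uXXXX escape so the
    resulting source contains only ASCII bytes.  (Characters above
    U+FFFF would need surrogate pairs; not handled in this sketch.)"""
    return "".join(ch if ord(ch) < 128 else f"\\u{ord(ch):04x}" for ch in s)

key = "AGÊNCIA NACIONAL DO PETRÓLEO"
print(ascii_escape(key))  # AG\u00caNCIA NACIONAL DO PETR\u00d3LEO
```

If the escaped form loads cleanly, that would support the hypothesis that the validator mishandles raw non-ASCII bytes rather than the policy's structure.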
This policy can be loaded before the policy validation is added in M21.
Please fix it for M22. Thanks.

https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/634
M21 GC Segy to ZGY conversion airflow job failed (Bhuwan Prasad Upadhyay, 2023-11-08)
Test results attached with this issue.
[M21-GCP-Segy-To-OpenZGY-Conversion-Bhuwan.txt](/uploads/f798c3667d6cd04235541138cfd15f4f/M21-GCP-Segy-To-OpenZGY-Conversion-Bhuwan.txt)
Assignee: Dzmitry Malkevich (EPAM)

https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/633
Error in M21 GC OSDU Smoke tests (Carl Godkin, 2024-01-26)
While doing some testing, I noticed a mistake (I think) in one of the Google Postman collections, [GC OSDU Smoke Tests](https://community.opengroup.org/osdu/platform/pre-shipping/-/blob/main/R3-M21/GC-M21/GC_OSDU_Smoke_Tests.postman_collection.json).
The tests under Core Services/Search beginning with "A01" and "C01" each have a mistake in the body:
```
"kind": "{{data-partition-id}}:{{schemaSource}}:master-data--Well:*.*.*",
```
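A mechanical correction, sketched in Python (treating `{{authority}}` as the intended placeholder is the reporter's suggestion, not a verified fact), is to swap the placeholder in each affected kind string:

```python
def fix_kind(kind: str) -> str:
    """Swap the mistaken partition-id placeholder for the authority one."""
    return kind.replace("{{data-partition-id}}", "{{authority}}")

buggy = "{{data-partition-id}}:{{schemaSource}}:master-data--Well:*.*.*"
print(fix_kind(buggy))  # {{authority}}:{{schemaSource}}:master-data--Well:*.*.*
```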
For the M21 Google pre-ship environment, the `data-partition-id` is `m19`, but these tests don't match anything unless you change `{{data-partition-id}}` to `osdu`.
Fixing these to use `{{authority}}` (I think that's correct) would help future newbies. Thanks.
Assignees: Denis Karpenok (EPAM), Dzmitry Malkevich (EPAM)

https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/632
GCP M21 Unable upload SEGY file using sdutil (Bhuwan Prasad Upadhyay, 2023-11-07)
Not able to ingest a SEGY file using [sdutil](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-sdutil/-/tree/master/) in GCP M21. Receiving the following error:
```
Error encountered during upload, deleting the partially created record from seismic store
not enough values to unpack (expected 2, got 1)
```
Test results attached with this issue.
[M21-GCP-Upload-Segy-File-SDUTIL-Bhuwan.txt](/uploads/964edbf8aa35d70ffe2b727a0af2d028/M21-GCP-Upload-Segy-File-SDUTIL-Bhuwan.txt)
Assignee: Dzmitry Malkevich (EPAM)