OSDU Software issues
https://community.opengroup.org/groups/osdu/-/issues

Issue: Provider - Bug: initial extent in AGO not reflecting extent of features
URL: https://community.opengroup.org/osdu/platform/consumption/geospatial/-/issues/143
Author: Levi Remington | Updated: 2022-09-13T15:27:19Z

When viewing a Koop-provided layer in AGO, it should automatically zoom to the extent of all visible features. This was previously working, but for some reason it is not working now.
Need to investigate the initial query made by AGO to see how Provider is handling a `returnExtentOnly` query.
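For reference, the extent a `returnExtentOnly=true` response should carry is just the bounding box of all features. A minimal sketch of that computation (Python, illustrative only; Koop providers are Node.js, and this assumes simple GeoJSON Point features):

```python
def feature_extent(features):
    """Bounding box [xmin, ymin, xmax, ymax] over GeoJSON Point features.
    Illustrative sketch only; a real provider must handle every geometry type."""
    xs = [f["geometry"]["coordinates"][0] for f in features]
    ys = [f["geometry"]["coordinates"][1] for f in features]
    return [min(xs), min(ys), max(xs), max(ys)]

# Hypothetical features, roughly in the Volve area
points = [
    {"geometry": {"coordinates": [5.0, 58.4]}},
    {"geometry": {"coordinates": [5.3, 58.2]}},
]
print(feature_extent(points))  # → [5.0, 58.2, 5.3, 58.4]
```

If the provider returns an empty or zeroed extent for this query shape, AGO has nothing to zoom to, which would match the observed behavior.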
Acceptance Criteria:
- Opening a Koop-provided layer in AGO initializes with the accurate extent of features

Milestone: GCZ Sprint 24 | Assignee: Levi Remington

Issue: Deployment - Fix GCZ configuration
URL: https://community.opengroup.org/osdu/platform/consumption/geospatial/-/issues/158
Author: Ankita Srivastava | Updated: 2022-09-28T15:18:57Z

- Remove the authorize URL
- Add the Data partition Id

Milestone: GCZ Sprint 25 | Assignee: Ankita Srivastava

Issue: SEGY-to-VDS DAG fails on Azure
URL: https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/393
Author: Raman Singh | Updated: 2022-11-28T10:15:45Z

As seen in the attached log, the DAG reports failure. The SEGY file is from the Volve dataset.
[AirFlow_log_segy_to_vds.txt](/uploads/41b61e7fbf84e37f067883e11af8c136/AirFlow_log_segy_to_vds.txt)

Milestone: M14 - Release 0.17

Issue: SEGY-to-ZGY does not create any output even after reporting 'success'
URL: https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/392
Author: Raman Singh | Updated: 2023-01-12T05:00:58Z

As seen in the attached log, the Airflow DAG reports success, but there are no ZGY files created in the SD store.
[AirFlow_log_segy_to_zgy.txt](/uploads/fc2e40225af88f051a586cd3a14d05a5/AirFlow_log_segy_to_zgy.txt)

Milestone: M14 - Release 0.17

Issue: Search/Policy integration is still failing on Azure
URL: https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/387
Author: Dadong Zhou | Updated: 2022-11-07T13:17:55Z

On the Azure Pre-Shipping environment, I just reran the policy test Postman collection. The Storage/Policy integration is working now, but the Search/Policy integration is still failing. Thanks.

Milestone: M14 - Release 0.17 | Assignee: Thulasi Dass Subramanian

Issue: No Log Registered for Any OSDU Core Services in AWS Cloudwatch
URL: https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/386
Author: Naufal Mohamed Noori | Updated: 2022-11-06T11:30:07Z

I encountered an issue viewing logs in AWS CloudWatch: whenever I submit a Postman request, e.g. POST XX//api/search/v2/query, I cannot see any log generated in the CloudWatch console. Browsing the container map under os-search, I can see some activity in the metric view, but when I click "view log" no event is registered. The same goes for os-ingestion/os-storage (these are a few OSDU core services I have checked).
![image](/uploads/9a178e34ad4a701f8abcbf3d2eb70964/image.png)
![image__1_](/uploads/3499817038f033a8de45660178084635/image__1_.png)
![image__2_](/uploads/acc459e22017c9c1feb2d32249eee4c9/image__2_.png)

Milestone: M14 - Release 0.17

Issue: In pre-ship R3 M14 IBM environment Storage API is not working as expected
URL: https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/382
Author: Kamlesh Todai | Updated: 2022-11-02T18:37:33Z

While performing the test for Storage (Create-Update-Retrieve) & Unit Conversion in the pre-ship R3 M14 IBM environment, the Storage API is not working as expected.

I create a record:
curl --location --request PUT 'https://cpd-osdu.odi-og-osdu-ba8e38d4c011d627379af1a4280c4e35-0000.us-south.containers.appdomain.cloud/osdu-storage/api/storage/v2/records' \
--header 'data-partition-id: opendes' \
--header 'Accept: application/json' \
--header 'Authorization: Bearer eyJhbGciOi...DgtIIKxmQ' \
--header 'Content-Type: application/json' \
--data-raw '[
{
"id": "opendes:master-data--SeismicAcquisitionSurvey:Autotest_999294098303",
"kind": "osdu:wks:master-data--SeismicAcquisitionSurvey:1.0.0",
"meta": [
{
"kind": "Unit",
"name": "ms",
"persistableReference": "{\"abcd\":{\"a\":0.0,\"b\":0.001,\"c\":1.0,\"d\":0.0},\"symbol\":\"ms\",\"baseMeasurement\":{\"ancestry\":\"T\",\"type\":\"UM\"},\"type\":\"UAD\"}",
"unitOfMeasureID": "opendes:reference-data--UnitOfMeasure:ms:",
"propertyNames": [
"RecordLength"
]
},
{
"kind": "Unit",
"name": "ft",
"persistableReference": "{\"abcd\":{\"a\":0.0,\"b\":0.3048,\"c\":1.0,\"d\":0.0},\"symbol\":\"ft\",\"baseMeasurement\":{\"ancestry\":\"L\",\"type\":\"UM\"},\"type\":\"UAD\"}",
"unitOfMeasureID": "opendes:reference-data--UnitOfMeasure:ft:",
"propertyNames": [
"ShotpointIncrementDistance",
"CableLength",
"CableSpacingDistance",
"MaxOffsetDistance"
]
}
],
"data": {
"Purpose": "Acquisition for Volve",
"SeismicGeometryTypeID": "opendes:reference-data--SeismicGeometryType:3D:",
"OperatingEnvironmentID": "opendes:reference-data--OperatingEnvironment:Offshore:",
"ShotpointIncrementDistance": 25.0,
"EnergySourceTypeID": "opendes:reference-data--SeismicEnergySourceType:Airgun:",
"SourceArrayCount": 2,
"SourceArraySeparationDistance": 100.0,
"SampleInterval": 4.0,
"RecordLength": 10200,
"CableCount": 4,
"CableLength": 6000.0,
"CableSpacingDistance": 100.0,
"MaxOffsetDistance": 6000.0,
"FoldCount": 120,
"VesselNames": [
"GECO ANGLER"
]
},
"acl": {
"owners": [
"data.default.owner@opendes.ibm.com"
],
"viewers": [
"data.wellboreMarkerdb47059.viewers@opendes.ibm.com"
]
},
"legal": {
"legaltags": [
"opendes-STORUNIT-Legal-Tag-Test6690637"
],
"otherRelevantDataCountries": [
"US"
],
"status": "compliant"
}
}
]'
Response: 201 Created
{
"recordCount": 1,
"recordIds": [
"opendes:master-data--SeismicAcquisitionSurvey:Autotest_999294098303"
],
"skippedRecordIds": [],
"recordIdVersions": [
"opendes:master-data--SeismicAcquisitionSurvey:Autotest_999294098303:1667320492120321"
]
}
I try to retrieve the record using the record ID:
curl --location --request GET 'https://cpd-osdu.odi-og-osdu-ba8e38d4c011d627379af1a4280c4e35-0000.us-south.containers.appdomain.cloud/osdu-storage/api/storage/v2/records/opendes:master-data--SeismicAcquisitionSurvey:Autotest_999294098303' \
--header 'data-partition-id: opendes' \
--header 'Accept: application/json' \
--header 'Authorization: Bearer eyJhbGciOi...DgtIIKxmQ'
Response: 403 Forbidden
{
"code": 403,
"reason": "Access denied",
"message": "The user is not authorized to perform this action"
}
The same workflow is working in AWS and Azure. In GCP the storage part is working, but Search is failing.

Milestone: M14 - Release 0.17 | Assignees: Anuj Gupta, Shrikant Garg

Issue: Search does not seem to be working as expected in pre-ship R3 M14 GCP environment
URL: https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/381
Author: Kamlesh Todai | Updated: 2022-11-03T13:53:39Z

While performing the test for Storage (Create-Update-Retrieve) & Unit Conversion, I find that the Search API does not seem to be working as expected in the pre-ship R3 M14 GCP environment.

**I create a record using the storage API; I can retrieve the record using the storage API:**
```
curl --location --request GET 'https://preship.gcp.gnrg-osdu.projects.epam.com/api/storage/v2/records/odesprod:master-data--SeismicAcquisitionSurvey:Autotest_999704543963' \
--header 'data-partition-id: odesprod' \
--header 'Accept: application/json' \
--header 'Authorization: Bearer ya29.a0Aa4xr...KA0165'
```
Response:
```json
{
"data": {
"Purpose": "Acquisition for Volve",
"SeismicGeometryTypeID": "odesprod:reference-data--SeismicGeometryType:3D:",
...
"FoldCount": 120,
"VesselNames": [
"GECO ANGLER"
]
},
"meta": [
{
...
},
{
"kind": "Unit",
"name": "ft",
...
}
],
"id": "odesprod:master-data--SeismicAcquisitionSurvey:Autotest_999704543963",
"version": 1667319211819809,
"kind": "odesprod:wks:master-data--SeismicAcquisitionSurvey:1.0.0",
"acl": {
...
},
"legal": {
...
},
"createUser": "preshipping_test_user@osdu-gcp.go3-nrg.projects.epam.com",
"createTime": "2022-11-01T16:13:33.370Z"
}
```
**When I try to use the search API to get the record, it does not return that record.**
```
curl --location --request POST 'https://preship.gcp.gnrg-osdu.projects.epam.com/api/search/v2/query' \
--header 'Content-Type: application/json' \
--header 'data-partition-id: odesprod' \
--header 'Authorization: Bearer ya29.a0Aa4xr....KA0165' \
--data-raw '{
  "kind": "odesprod:wks:master-data--SeismicAcquisitionSurvey:1.0.0",
  "query": "id: \"odesprod:master-data--SeismicAcquisitionSurvey:Autotest_999704543963\""
}'
```
**Response:**
```json
{
"results": [],
"aggregations": [],
"totalCount": 0
}
```
I have tried:
```
curl --location --request POST 'https://preship.gcp.gnrg-osdu.projects.epam.com/api/search/v2/query' \
--header 'Content-Type: application/json' \
--header 'data-partition-id: odesprod' \
--header 'Authorization: Bearer ya29.a0Aa4xr...KA0165' \
--data-raw '{
  "kind": "odesprod:wks:master-data--SeismicAcquisitionSurvey:1.0.0",
  "query": "*"
}'
```
In the last case I do get records, but not the one (odesprod:master-data--SeismicAcquisitionSurvey:Autotest_999704543963) that was inserted using the storage API.
This workflow is working in the AWS and Azure R3 M14 environments.

Milestone: M14 - Release 0.17 | Assignees: Dzmitry Malkevich (EPAM), Yauhen Shaliou [EPAM/GCP]

Issue: AZURE M14: Nested Spud date Search throws 400 error
URL: https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/379
Author: Michael | Updated: 2022-11-04T13:05:29Z

When searching for a well with a spud date using the following query, a 400 error is returned:
```
curl --location --request POST 'https://osdu-ship.msft-osdu-test.org/api/search/v2/query' \
--header 'Content-Type: application/json' \
--header 'data-partition-id: opendes' \
--header 'Authorization: Bearer ...' \
--data-raw '{
"kind": "osdu:wks:master-data--Well:1.0.0",
"query": "(nested(data.FacilityEvents, (EffectiveDateTime: [1901-05-01 TO 1972-06-01] AND FacilityEventTypeID:\"opendes:reference-data--FacilityEventType:SPUD\")))",
"limit": 5,
"returnedFields": ["id","kind","data.FacilityEvents"],
"offset": 0
}'
```
Response:
```
{
"code": 400,
"reason": "Bad Request",
"message": "failed to create query: [nested] failed to find nested object under path [data.FacilityEvents]"
}
```
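The `[nested] failed to find nested object under path [data.FacilityEvents]` message is what Elasticsearch returns when a `nested` query targets a field that is not mapped as `nested` in that index. For comparison, the index mapping would need to look roughly like this (illustrative fragment only, not the actual OSDU index template; field types are assumptions):

```
{
  "mappings": {
    "properties": {
      "data": {
        "properties": {
          "FacilityEvents": {
            "type": "nested",
            "properties": {
              "EffectiveDateTime": { "type": "date" },
              "FacilityEventTypeID": { "type": "keyword" }
            }
          }
        }
      }
    }
  }
}
```

That the same query succeeds in the other environments suggests the Azure index was built from a template missing this `nested` mapping, rather than a schema difference.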
This same request works in the other pre-shipping M14 environments.
There are wells that have the data.FacilityEvents field populated, as shown below:
```
curl --location --request POST 'https://osdu-ship.msft-osdu-test.org/api/search/v2/query' \
--header 'Content-Type: application/json' \
--header 'data-partition-id: opendes' \
--header 'Authorization: Bearer ...' \
--data-raw '{
"kind": "osdu:wks:master-data--Well:*.*.*",
"query": "id:\"opendes:master-data--Well:ceb2fb4c35b042b5ba9e815785050b7f\"",
"returnedFields": [
"id",
"kind",
"data.FacilityEvents"
]
}'
```
Response:
```
{
"results": [
{
"data": {
"FacilityEvents": [
{
"EffectiveDateTime": "1971-02-16T00:00:00+0000",
"FacilityEventTypeID": "opendes:reference-data--FacilityEventType:Spud:"
},
{
"EffectiveDateTime": "1971-04-16T00:00:00+0000",
"FacilityEventTypeID": "opendes:reference-data--FacilityEventType:TDReached:"
}
]
},
"kind": "osdu:wks:master-data--Well:1.0.0",
"id": "opendes:master-data--Well:ceb2fb4c35b042b5ba9e815785050b7f"
}
],
"aggregations": null,
"totalCount": 1
}
```
I checked osdu:wks:master-data--Well:1.0.0, and the well schema being used is identical to the one in the AWS M14 pre-shipping environment.

Milestone: M14 - Release 0.17

Issue: AWS M14: 500 error when calling dataset service for volve seismic datasets
URL: https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/376
Author: Michael | Updated: 2023-01-05T05:30:18Z

I am getting a 500 error when calling the Get Retrieval Instructions Service (/api/dataset/v1/getRetrievalInstructions) with the dataset IDs of the Volve SeismicTraceData records (osdu:work-product-component--SeismicTraceData:ST0202R08-time-volume, osdu:work-product-component--SeismicTraceData:ST0202R08-depth-volume).
Below are the specific Get Retrieval Instructions Service requests that are failing:
1. For osdu:work-product-component--SeismicTraceData:ST0202R08-time-volume
```
curl --location --request POST 'https://r3m14.preshiptesting.osdu.aws/api/dataset/v1/getRetrievalInstructions' \
--header 'data-partition-id: osdu' \
--header 'Authorization: Bearer ...' \
--header 'Content-Type: application/json' \
--data-raw '{
"datasetRegistryIds": [
"osdu:dataset--FileCollection.SEGY:4822148772d1e2c523d92cf891b68dcb1105988abd59bee79487ed7e2ae183eb"
]
}'
```
Response:
```
{
"code": 500,
"reason": "Internal Server Error",
"message": "Unrecognized field \"code\" (class org.opengroup.osdu.core.common.dms.model.RetrievalInstructionsResponse), not marked as ignorable (2 known properties: \"datasets\", \"providerKey\"])_ at [Source: (String)\"{\"code\":500,\"reason\":\"Invalid/Empty File Collection Path\",\"message\":\"Invalid/Empty File Collection Path - File collection not found at specified S3 path or is empty\"}\"; line: 1, column: 12] (through reference chain: org.opengroup.osdu.core.common.dms.model.RetrievalInstructionsResponse[\"code\"])"
}
```
2. For osdu:work-product-component--SeismicTraceData:ST0202R08-depth-volume
```
curl --location --request POST 'https://r3m14.preshiptesting.osdu.aws/api/dataset/v1/getRetrievalInstructions' \
--header 'data-partition-id: osdu' \
--header 'Authorization: Bearer eyJraWQiOi...CabuBCgaw' \
--header 'Content-Type: application/json' \
--data-raw '{
"datasetRegistryIds": [
"osdu:dataset--FileCollection.SEGY:373f17757dcdb47abc42198e764f1386e75233ce9247d84453e67be09a0bcb69"
]
}'
```
Response:
```
{
"code": 500,
"reason": "Internal Server Error",
"message": "Unrecognized field \"code\" (class org.opengroup.osdu.core.common.dms.model.RetrievalInstructionsResponse), not marked as ignorable (2 known properties: \"datasets\", \"providerKey\"])_ at [Source: (String)\"{\"code\":500,\"reason\":\"Invalid/Empty File Collection Path\",\"message\":\"Invalid/Empty File Collection Path - File collection not found at specified S3 path or is empty\"}\"; line: 1, column: 12] (through reference chain: org.opengroup.osdu.core.common.dms.model.RetrievalInstructionsResponse[\"code\"])"
}
```

Milestone: M14 - Release 0.17 | Assignee: Okoun-Ola Fabien Houeto

Issue: EDS Preship test - DAG import errors
URL: https://community.opengroup.org/osdu/platform/data-flow/ingestion/external-data-sources/core-external-data-workflow/-/issues/6
Author: Nur Sheikh | Updated: 2022-11-23T16:27:56Z

The following DAG import errors are affecting testing in the M14 Azure/Preship environment:
![image](/uploads/a25921553222fe6486dd6ba265eab944/image.png)
This could possibly mean that the EDS ingestion and EDS scheduler DAGs were not imported into Airflow.
Milestone: M14 - Release 0.17 | Assignee: shivani karipe

Issue: IBM - M14 - Sdutil problem with legal tags
URL: https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/373
Author: Thiago Moreira | Updated: 2022-10-27T13:48:03Z

Hi,
I am trying to create a new subproject with sdutil, but it keeps failing with Legal-Tags errors.
When I try to use an older Legal-tag (still available in the environment), I get this error:
![error2](/uploads/6176f355da981b1d05563ab505c82d89/Error1.png)
When I try to generate a new Legal-tag, the error changes to:
![error2](/uploads/81971617aea4a16ce3d52241f663d199/error2.png)
I am using the same config.yaml file as M13, and the authentication part was fine.

Milestone: M14 - Release 0.17

Issue: Policy removed with confusing msg
URL: https://community.opengroup.org/osdu/platform/security-and-compliance/policy/-/issues/75
Author: Dadong Zhou | Updated: 2022-11-10T15:14:35Z

Hi, I am testing the policy APIs locally. I added a partition policy and tried to remove it. The policy is removed successfully, but the response msg says Error:
```json
{
  "policy_id": "storage.rego",
  "data_partition": "osdu",
  "status": true,
  "message": "Error while removing policy osdu/partition/osdu/storage.rego",
  "result": {}
}
```
Please fix.
Thanks,
Dadong

Milestone: M14 - Release 0.17 | Assignee: Shane Hutchins

Issue: Migrate swagger from /docs to /api/policy/v1/docs
URL: https://community.opengroup.org/osdu/platform/security-and-compliance/policy/-/issues/73
Author: Arturo Hernandez [EPAM] | Updated: 2022-10-02T00:45:23Z

Migrate swagger from /docs to SERVICE_BASE_PATH (/api/policy/v1/docs).
It does not seem quite right to keep `/docs`, as it can be confusing: we expose services through paths, and the policy service's exposed path matches `api/policy/v1*`. Following the other services' convention, it would make sense to have something like `api/policy/v1/docs` instead, as wellbore and other services do.

Milestone: M14 - Release 0.17 | Assignee: Shane Hutchins

Issue: Facing issue while deploying open-etp-server
URL: https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/reservoir/open-etp-server/-/issues/22
Author: Brindaban Das | Updated: 2022-10-04T22:28:44Z

We are facing an issue while trying to deploy the open-etp-server in an OpenShift environment. The pod is crashing due to the "**-h**" option used in `CMD [ "openETPServer", "server", "-h" ]` in the **Dockerfile.runtime** file. The issue is resolved after replacing "-h" with "--start": `CMD [ "openETPServer", "server", "--start" ]`.
Has anyone faced this issue? Can we make this change in "Dockerfile.runtime"?

Milestone: M14 - Release 0.17 | Assignee: Laurent Deny

Issue: Add support for wellbore1.1.1 and wellboremarkerset1.2.1
URL: https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/wellbore/wellbore-domain-services/-/issues/52
Author: Yannick | Updated: 2022-09-30T09:38:41Z

Add support for [wellbore1.1.1](https://community.opengroup.org/osdu/data/data-definitions/-/blob/v0.17.0/E-R/work-product-component/WellboreMarkerSet.1.2.1.md) and [wellboremarkerset1.2.1](https://community.opengroup.org/osdu/data/data-definitions/-/blob/v0.17.0/E-R/work-product-component/WellboreMarkerSet.1.2.1.md)

Milestone: M14 - Release 0.17 | Assignee: fabian serin

Issue: data-partition-id header should not come from environment variable when the server calls Entitlement service API
URL: https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/reservoir/open-etp-server/-/issues/20
Author: Shuai Li | Updated: 2022-10-06T22:11:09Z

The Open ETP server supports multi-partition capability.
When the Open ETP server calls the Entitlement service API, it should add the data-partition-id header to the request. Currently the code gets the data-partition-id value from the environment variable DATA_PARTITION_ID. This seems wrong: if data-partition-id comes from a single environment variable, it is not possible to support multiple partitions.
If I create a new dataspace with partition id "partition-A" as a command argument, the breakpoint I set in the following code shows that the value of the environment variable DATA_PARTITION_ID is added to the HTTP header when calling the Entitlement service, not the "partition-A" I pass on the client command line.
```c++
oes::core::HTTP::Response BaseClient::makeRequest(
oes::core::HTTP::Method method,
std::string& bearerToken,
const std::string& url,
const std::string& queryParams,
const oes::core::HTTP::Headers& additionalHeaders,
const oes::core::HTTP::Fields& params,
const std::string* body
)
{
oes::core::HTTP::Headers headers {
"content-type: application/json",
};
if (!_dataPartitionId.empty()) {
headers.push_back("data-partition-id: " + _dataPartitionId);
}
//some other codes below
}
```
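A sketch of the behavior this issue asks for: prefer the partition id carried by the request or command, falling back to `DATA_PARTITION_ID` only when none is supplied (Python for brevity; the server itself is C++, and these names are illustrative, not the actual fix):

```python
import os

def partition_headers(request_partition_id=None):
    """Build HTTP headers for an Entitlements call; hypothetical sketch."""
    # Per-request value (e.g. "partition-A" from the client command line)
    # wins over the process-wide environment variable.
    pid = request_partition_id or os.environ.get("DATA_PARTITION_ID", "")
    headers = {"content-type": "application/json"}
    if pid:
        headers["data-partition-id"] = pid
    return headers

os.environ["DATA_PARTITION_ID"] = "env-partition"
print(partition_headers("partition-A")["data-partition-id"])  # → partition-A
print(partition_headers()["data-partition-id"])               # → env-partition
```

The second C++ snippet below shows where the environment-only lookup currently happens.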
```c++
class EntitlementOSDUConfigFromEnv::Impl {
public:
Impl(const std::string& delegateUri) {
auto from_env = [](const char* envName) {
const char* p = getenv(envName);
return p ? std::string(p) : std::string();
};
base_url_ = (!delegateUri.empty()) ? delegateUri : from_env("OSDU_HOST");
data_partition_id_ = from_env("DATA_PARTITION_ID");
schema_id_ = from_env("SCHEMA_ID");
domain_name_ = from_env("DOMAIN_NAME");
legal_tags_ = from_env("LEGAL_TAGS");;
legal_countries_ = from_env("LEGAL_COUNTRIES");;
}
}
```

Milestone: M14 - Release 0.17

Issue: Not able to run Open ETP Client in local system
URL: https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/reservoir/open-etp-client/-/issues/4
Author: Brindaban Das | Updated: 2023-05-05T00:45:27Z

We are not able to run the Open ETP client on our local machine. We have followed the steps below as per the documentation.
1. As per the setup installation steps, we see that the link "**https://community.opengroup.org/api/v4/projects/osdu/packages/npm/**" is not available; we are not sure whether this is due to an access issue, or who would provide the access. The command `npm install @osdu/open-etp-client` didn't work properly; below is the screenshot.
![image](/uploads/1f3f880f95647f392794959c88bb5908/image.png)
2. In the "**config.user.env**" file we are not able to provide npm_token, as we could not find the password in the **.npmrc** file.
3. Then, after running the `npm install && npm run build` command, we get the output shown in the screenshot below.
![image](/uploads/8944333e773cda064bc9ad5c70b71339/image.png)
We are not sure whether the command executed successfully, and we are not able to verify it.
4. After that we ran the `npm run all` command, but we get the error below.
![image](/uploads/4300d37dbf41b3f858c33c854c5e6afd/image.png)
5. Finally we tried the `npm run test` command; we get the error below, and 1 test suite out of 4 fails.
![image](/uploads/aeb8ea051af15d1bacf3e25abdaa49fd/image.png)
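Regarding items 1 and 2 above: with GitLab package registries, the scoped-registry and token configuration normally lives in `.npmrc`. An illustrative fragment follows; the exact registry path and the `NPM_TOKEN` variable are assumptions, not taken from the project documentation:

```
@osdu:registry=https://community.opengroup.org/api/v4/packages/npm/
//community.opengroup.org/api/v4/packages/npm/:_authToken=${NPM_TOKEN}
```

A 404 on the registry URL when opened in a browser does not by itself mean the registry is broken; npm sends authenticated package requests to sub-paths of it.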
We cannot get a proper understanding of how to run the open-etp-client on a local system and verify that it is running, nor how to test the functionality it provides from the documentation.
Could you please help resolve the above-mentioned issues and guide us to run the open-etp-client on a local system?

Milestone: M14 - Release 0.17 | Assignees: Laurent Deny, Alice Chanvin

Issue: Fix the search and indexing performance issues when the geometry of the document is large
URL: https://community.opengroup.org/osdu/platform/system/search-service/-/issues/99
Author: Zhibin Mai | Updated: 2023-07-10T16:17:26Z

##### Background:
Today the geometry (also called shapes) in the indexed records is not decimated. The geometry data can be large, reaching tens or even hundreds of MB. As we know, the geometry in the search index can be used to support spatial query, data preview, or data discovery.
However, large geometry in the indexed records can significantly affect the performance of retrieving search results and prevent the results from being used efficiently in some utilities, such as a GIS map. In O&G applications, the GIS map is a critical component that users rely on to render the shapes in a given region as a tool for data discovery. It may be required to retrieve and render thousands or even millions of shapes from the OSDU index. If there are tens of thousands of shapes to retrieve and render, performance won't be good enough even if the shapes are decimated. On the other hand, it is unnecessary to show the detail of the shapes when tens of thousands of indexed records are returned from the search.
##### Proposal:
We propose to decimate the geometry of the following GeoJSON geometry types by implementing the Ramer–Douglas–Peucker algorithm for the original shape attribute and the shape attribute "data.VirtualProperties.DefaultLocation.Wgs84Coordinates", if it exists.
- LineString
- MultiLineString
- Polygon
- MultiPolygon
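A minimal sketch of the Ramer–Douglas–Peucker decimation proposed above (Python, over 2D coordinate pairs; illustrative only, not the service's implementation):

```python
import math

def _perp_dist(pt, a, b):
    """Perpendicular distance from pt to the line through a and b."""
    (x, y), (x1, y1), (x2, y2) = pt, a, b
    dx, dy = x2 - x1, y2 - y1
    if dx == 0 and dy == 0:
        return math.hypot(x - x1, y - y1)
    return abs(dy * x - dx * y + x2 * y1 - y2 * x1) / math.hypot(dx, dy)

def rdp(points, epsilon):
    """Ramer-Douglas-Peucker: drop vertices closer than epsilon to the chord."""
    if len(points) < 3:
        return list(points)
    # Find the vertex farthest from the chord joining the endpoints.
    dmax, idx = 0.0, 0
    for i in range(1, len(points) - 1):
        d = _perp_dist(points[i], points[0], points[-1])
        if d > dmax:
            dmax, idx = d, i
    if dmax <= epsilon:
        return [points[0], points[-1]]
    # Keep the farthest vertex and recurse on both halves.
    return rdp(points[: idx + 1], epsilon)[:-1] + rdp(points[idx:], epsilon)

line = [(0.0, 0.0), (1.0, 0.05), (2.0, -0.05), (3.0, 5.0), (4.0, 5.05)]
print(rdp(line, 0.1))  # → [(0.0, 0.0), (2.0, -0.05), (3.0, 5.0), (4.0, 5.05)]
```

With a tolerance like the ~0.0001-degree epsilon used in the evaluation below, near-collinear vertices along a survey line collapse while genuine corners survive.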
Regarding shape attribute "data.VirtualProperties.DefaultLocation", please refer to ADR [Common discovery within and across kinds](https://community.opengroup.org/osdu/platform/system/search-service/-/issues/69)
##### Performance Evaluation:
We did some performance evaluation of the prototype, decimating the original shape attribute and the shape attribute "data.VirtualProperties.DefaultLocation.Wgs84Coordinates" using some seismic 2D surveys. The tolerance (epsilon) is about 10 meters, which is roughly 0.0001 degrees at the equator.
The information of the test dataset and summary of the test report are attached below:
- [performance_test_summary.txt](/uploads/dc913a11d5cead3a1b5b54529c5449de/performance_test_summary.txt)
- [test_dataset.csv](/uploads/0263b8e976526c246e4dd8074a8c52f2/test_dataset.csv)
##### Summary:
1. The decimation of the shape attributes significantly improves the end-to-end search performance (search and data retrieval from Elasticsearch to the test client).
2. The extra overhead of decimation during indexing is offset by the time saved on Elasticsearch indexing of the geo-shapes. The test result indicates that it reduced the overall indexing time by 58%.

Milestone: M14 - Release 0.17 | Assignee: Zhibin Mai

Issue: changes in AWS pipeline breaks CI/CD
URL: https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/wellbore/wellbore-domain-services/-/issues/51
Author: Yannick | Updated: 2022-09-26T13:47:42Z

A recent change in the [pipeline for AWS](https://community.opengroup.org/osdu/platform/ci-cd-pipelines/-/merge_requests/722) breaks the WDMS CI/CD.

Milestone: M14 - Release 0.17