# OSDU Software issues
https://community.opengroup.org/groups/osdu/-/issues

---
**IBM R3 M19 environment having issue with Search and/or indexer service**
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/575 (Kamlesh Todai, 2023-08-11)

After ingesting the data using manifest ingestion (Osdu_ingest), one can retrieve the data using the Storage API, but cannot retrieve it using the Search API, even after waiting for more than 15 minutes.

Using the Storage API:
```
curl --location 'https://cpd-osdu.apps.ibmosdu-preship.lndu.p1.openshiftapps.com/osdu-storage/api/storage/v2/records/opendes:master-data--GeoPoliticalEntity:Limburg' \
--header 'data-partition-id: opendes' \
--header 'Authorization: Bearer eyJhbGciOi...Truncated...3k3p_JLqg'
Response 200 OK
{
"data": {
"GeoPoliticalEntityName": "Limburg",
"Source": "TNO",
"VirtualProperties.DefaultName": "Limburg"
},
"meta": null,
"modifyUser": "osdu-bvt@osdu.opengroup.org",
"modifyTime": "2023-08-08T13:40:37.481Z",
"id": "opendes:master-data--GeoPoliticalEntity:Limburg",
"version": 1691502037052901,
"kind": "osdu:wks:master-data--GeoPoliticalEntity:1.0.0",
"acl": {
"viewers": [
"data.default.viewers@opendes.ibm.com"
],
"owners": [
"data.default.owners@opendes.ibm.com"
]
},
"legal": {
"legaltags": [
"opendes-R3FullManifest-Legal-Tag-Test7791699"
],
"otherRelevantDataCountries": [
"US"
],
"status": "compliant"
},
"createUser": "osdu-bvt@osdu.opengroup.org",
"createTime": "2023-08-04T20:51:59.686Z"
}
```
But when one performs the search using the Search API, zero results are returned:
```
curl --location 'https://cpd-osdu.apps.ibmosdu-preship.lndu.p1.openshiftapps.com/osdu-search/api/search/v2/query' \
--header 'Content-Type: application/json' \
--header 'data-partition-id: opendes' \
--header 'Authorization: Bearer eyJhbGciOi...Truncated...3p_JLqg' \
--data '{
"kind": "osdu:wks:master-data--GeoPoliticalEntity:1.0.0",
"query" : "*",
"limit" : 500,
"returnedFields" : ["id"]
}'
Response 200 OK
{
"results": [],
"aggregations": [],
"totalCount": 0
}
```
No matter what search is performed, the total count returned is zero.
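Since Storage returns the record but Search never does, the record was most likely never indexed. A small polling sketch (illustrative only, not part of the issue) helps distinguish indexing lag from outright indexer failure: it rebuilds the query payload used above and retries until `totalCount` becomes non-zero or a deadline passes. The HTTP transport is injected as `post_fn` so nothing here assumes a particular client library.

```python
import time

def build_query(kind, query="*", limit=500, returned_fields=("id",)):
    """Build the Search API payload shown in the curl call above."""
    return {
        "kind": kind,
        "query": query,
        "limit": limit,
        "returnedFields": list(returned_fields),
    }

def wait_until_indexed(post_fn, payload, timeout_s=900, interval_s=30):
    """Poll Search until totalCount > 0 or the timeout expires.

    post_fn(payload) -> decoded JSON response dict; injected so the
    polling logic is testable without network access.
    Returns the first non-empty response, or None on timeout.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        body = post_fn(payload)
        if body.get("totalCount", 0) > 0:
            return body
        time.sleep(interval_s)
    return None
```

If the count is still zero after the full wait, the indexer (or its message queue) is the place to look, not the Search service itself.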
@chad @debasisc @davidglass

---
**Azure M19 - Search results not returning expected fields**
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/574 (Juliana Fernandes, juliana.fernandes@iesbrazil.com.br, 2023-08-10)

Hi,
I'm trying to perform the search query for Unit Conversion but I'm not getting what I'm expecting.

What I'm doing is:

POST
```
curl --location 'https://osdu-ship.msft-osdu-test.org/api/search/v2/query' \
```
Response
```json
{
"results": [
{
"kind": "osdu:wks:master-data--SeismicAcquisitionSurvey:1.0.0",
"source": "wks",
"acl": {
"viewers": [
"data.default.viewers@opendes.contoso.com"
],
"owners": [
"data.default.owners@opendes.contoso.com"
]
},
"type": "master-data--SeismicAcquisitionSurvey",
"version": 1686905173380617,
"tags": {
"normalizedKind": "osdu:wks:master-data--SeismicAcquisitionSurvey:1"
},
"modifyUser": "preshipping@azureglobal1.onmicrosoft.com",
"modifyTime": "2023-06-16T08:46:13.991Z",
"createTime": "2023-06-15T13:18:03.831Z",
"authority": "osdu",
"namespace": "osdu:wks",
"legal": {
"legaltags": [
"opendes-R3FullManifest-Legal-Tag-Test6917654"
],
"otherRelevantDataCountries": [
"US"
],
"status": "compliant"
},
"createUser": "preshipping@azureglobal1.onmicrosoft.com",
"id": "opendes:master-data--SeismicAcquisitionSurvey:ST0202R08_UC_JFA_Jun152023"
}
],
"aggregations": null,
"totalCount": 1
}
```
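The actual response above is missing the whole `data` block that the expected response carries. One client-side check worth doing (a sketch, not a confirmed fix: `returnedFields` is the Search API parameter already used elsewhere in this thread) is to request the `data` block explicitly and test whether any hit carries it; if it still comes back empty, the record's data was likely never indexed:

```python
# Hypothetical query: ask for the "data" block explicitly via returnedFields.
query = {
    "kind": "osdu:wks:master-data--SeismicAcquisitionSurvey:1.0.0",
    "query": "*",
    "returnedFields": ["data", "kind", "id"],
}

def has_data_block(result: dict) -> bool:
    """True when a search hit carries a non-empty 'data' payload."""
    return isinstance(result.get("data"), dict) and bool(result["data"])
```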
What I'm expecting as a response is something like:
```json
{
"results": [
{
"data": {
"Operator": null,
"SourceArrayCount": 2,
"ProjectName": null,
"ShotpointIncrementDistance": 7.62,
"VirtualProperties.DefaultLocation.QuantitativeAccuracyBandID": null,
"ResourceLifecycleStatus": null,
"ProjectID": null,
"CableCount": 4,
"CableLength": 1828.8000000000002,
"ResourceCurationStatus": null,
"TechnicalAssuranceID": null,
"VirtualProperties.DefaultLocation.SpatialGeometryTypeID": null,
"Source": null,
"SourceArraySeparationDistance": 100.0,
"VirtualProperties.DefaultName": null,
"CableSpacingDistance": 30.48,
"VirtualProperties.DefaultLocation.CoordinateQualityCheckPerformedBy": null,
"VersionCreationReason": null,
"ResourceSecurityClassification": null,
"SpatialLocation.SpatialParameterTypeID": null,
"ExistenceKind": null,
"SpatialLocation.CoordinateQualityCheckPerformedBy": null,
"VesselNames": [
"GECO ANGLER"
],
"MaxOffsetDistance": 1828.8000000000002,
"TechnicalAssuranceTypeID": null,
"Purpose": "Acquisition for Volve",
"RecordLength": 10.200000000000001,
"FoldCount": 120,
"SpatialLocation.QualitativeSpatialAccuracyTypeID": null,
"SpatialLocation.SpatialGeometryTypeID": null,
"OperatingEnvironmentID": "opendes:reference-data--OperatingEnvironment:Offshore:",
"VirtualProperties.DefaultLocation.SpatialParameterTypeID": null,
"ResourceHomeRegionID": null,
"SeismicGeometryTypeID": "opendes:reference-data--SeismicGeometryType:3D:",
"VirtualProperties.DefaultLocation.QualitativeSpatialAccuracyTypeID": null,
"SampleInterval": 4.0,
"EnergySourceTypeID": "opendes:reference-data--SeismicEnergySourceType:Airgun:",
"SpatialLocation.QuantitativeAccuracyBandID": null
},
"kind": "osdu:wks:master-data--SeismicAcquisitionSurvey:1.0.0",
"source": "wks",
"acl": {
"viewers": [
"data.default.viewers@opendes.contoso.com"
],
"owners": [
"data.default.owners@opendes.contoso.com"
]
},
"type": "master-data--SeismicAcquisitionSurvey",
"version": 1687473559876188,
"tags": {
"normalizedKind": "osdu:wks:master-data--SeismicAcquisitionSurvey:1"
},
"modifyUser": "preshipping@azureglobal1.onmicrosoft.com",
"modifyTime": "2023-06-22T22:39:20.957Z",
"createTime": "2023-06-22T19:35:49.945Z",
"authority": "osdu",
"namespace": "osdu:wks",
"legal": {
"legaltags": [
"opendes-Test-Legal-Tag-9727379"
],
"otherRelevantDataCountries": [
"US"
],
"status": "compliant"
},
"createUser": "preshipping@azureglobal1.onmicrosoft.com",
"id": "opendes:master-data--SeismicAcquisitionSurvey:ST0202R08_UC_JFA_Jun152023_DB22Jun"
}
],
"aggregations": null,
"totalCount": 1
}
```

---
**Incorrect validation error message for updating LegalTags - HV000028: Unexpected exception during isValid call**
https://community.opengroup.org/osdu/platform/security-and-compliance/legal/-/issues/44 (Chad Leong, 2023-11-22)

## Summary
If you update legal tags with an invalid request body, the error message is incorrect.

```PUT https://osdu.bm-preship.gcp.gnrg-osdu.projects.epam.com/api/legal/v1/legaltags/```

Request body:
```json
{
"name": "opendes-Test-Legal-Tag-chad-123456",
"description": "updated desc 2",
"properties": {
"contractId": "123456",
"countryOfOrigin": [
"US",
"CA"
],
"dataType": "Third Party Data",
"exportClassification": "EAR99",
"originator": "Schlumberger",
"personalData": "No Personal Data",
"securityClassification": "Private",
"expirationDate": "2023-07-31"
}
}
```
## Expected Behavior
Response
```json
{
"code": 400,
"reason": "Validation error.",
"message": "The request body is invalid."
}
```
## Actual Behavior
Response
```json
{
"code": 400,
"reason": "Validation error.",
"message": "HV000028: Unexpected exception during isValid call."
}
```
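HV000028 is the generic message Hibernate Validator emits when a custom `isValid` implementation itself throws. The fix pattern (sketched here in Python with illustrative names, since the actual service is Java) is to catch any exception inside the validator and report a plain validation failure instead of leaking the framework's internal message:

```python
def safe_is_valid(validator, value):
    """Run a validator, mapping any internal crash to a plain 'invalid' verdict.

    Mirrors the fix needed in the service: isValid should never throw,
    because the framework surfaces the raw exception (HV000028) to callers.
    """
    try:
        return bool(validator(value))
    except Exception:
        # An unexpected payload shape (e.g. a missing "properties" key)
        # should read as "request body invalid", not as an internal error.
        return False

def error_response(valid):
    """Build the API error body the Expected Behavior section asks for."""
    if valid:
        return {"code": 200}
    return {
        "code": 400,
        "reason": "Validation error.",
        "message": "The request body is invalid.",
    }
```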
For reference, this is an example of a valid request body:
```json
{
"name": "osdu-Test-Legal-Tag-chad-123456",
"description": "Legal Tag added for Well",
"contractId": "chad-AE33334",
"expirationDate": "2100-12-21",
"extensionProperties": {
"test_attr": "chad-test"
}
}
```

---
**While performing ingestion to check FoR for CRS conversion there is a schema validation error which is not understood**
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/565 (Kamlesh Todai, 2023-08-07)

When using the collection [Manifest_Based_Ingestion_Osdu_ingest_CI-CD_v2.0](https://community.opengroup.org/osdu/platform/pre-shipping/-/blob/main/R3-M19/QA_Artifacts_M19/envFilesAndCollections/Manifest_Based_Ingestion_Osdu_ingest_CI-CD_v2.0.postman_collection.json) and the folder "Integrity Check FoR for CRS Conversion" to test, one gets a failure message from Osdu_ingest complaining about schema validation, which is not clear. (This collection works in other cloud providers' environments, i.e. Azure and GC.)
```
File "/home/airflow/.local/lib/python3.8/site-packages/jsonschema/validators.py", line 353, in validate
raise error
jsonschema.exceptions.ValidationError: 'acl' is a required property
Failed validating 'required' in schema:
{'$id': 'https://schema.osdu.opengroup.org/json/master-data/Well.1.0.0.json',
'$schema': 'http://json-schema.org/draft-07/schema#',
'additionalProperties': False,
'definitions': {'osdu:wks:AbstractAccessControlList:1.0.0': {'$id': 'https://schema.osdu.opengroup.org/json/abstract/AbstractAccessControlList.1.0.0.json',
'$schema': 'http://json-schema.org/draft-07/schema#',
'additionalProperties': False,
                                                                     'description': 'The access control tags associated with this entity. This structure is included by the SystemProperties "acl", which is part of all OSDU records. Not extensible.',
```
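The validator fires because a Well record reaches schema validation without its `acl` property, even though the triggering payload below does include `acl`, which suggests the property is lost or not seen somewhere in the DAG's validation path. A stdlib-only sketch of the check that raises (the real DAG uses `jsonschema`; the constant below is an illustrative subset of the Well schema's `required` list):

```python
# Illustrative subset of Well.1.0.0's required top-level properties.
REQUIRED_RECORD_PROPS = ("id", "kind", "acl", "legal")

def missing_required(record):
    """Return the required top-level properties absent from a manifest record."""
    return [p for p in REQUIRED_RECORD_PROPS if p not in record]

# A record stripped of acl/legal reproduces "'acl' is a required property".
record = {
    "id": "opendes:master-data--Well:818188-FoR-CRS",
    "kind": "osdu:wks:master-data--Well:1.0.0",
}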
**More details can be found in the logs of run id 1d22e6b1-29aa-4960-b0dd-ea3744dd1866, or by following this link:**
[http://airflow-web-osdu.apps.ibmosdu-preship.lndu.p1.openshiftapps.com/log?dag_id=Osdu_ingest&task_id=validate_manifest_schema_task&execution_date=2023-08-04T19%3A07%3A31.213150%2B00%3A00](http://airflow-web-osdu.apps.ibmosdu-preship.lndu.p1.openshiftapps.com/log?dag_id=Osdu_ingest&task_id=validate_manifest_schema_task&execution_date=2023-08-04T21%3A48%3A04.758668%2B00%3A00)
**Please note the CRS:**
```plaintext
{{data-partition-id}}:reference-data--CoordinateReferenceSystem:23031024 is missing from this instance
```
**Payload while triggering the DAG**
```
curl --location 'https://cpd-osdu.apps.ibmosdu-preship.lndu.p1.openshiftapps.com/osdu-workflow/api/workflow/v1/workflow/Osdu_ingest/workflowRun' \
--header 'data-partition-id: opendes' \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer eyJhbGciOi...Truncated...2NgrgQQCk8qWg' \
--data-raw '{
"runId": "f3aaac37-1366-4205-a389-e84659b0c6b1",
"executionContext": {
"Payload": {
"AppKey": "test-app",
"data-partition-id": "opendes"
},
"manifest": {
"kind": "osdu:wks:master-data--Well:1.0.0",
"ReferenceData": [ ],
"MasterData": [
{
"id": "opendes:master-data--Well:818188-FoR-CRS",
"kind": "osdu:wks:master-data--Well:1.0.0",
"acl": {
"owners": [
"data.default.owners@opendes.ibm.com"
],
"viewers": [
"data.default.viewers@opendes.ibm.com"
]
},
"legal": {
"legaltags": [
"opendes-Manifest-Ingestion-Legal-Tag"
],
"otherRelevantDataCountries": [
"US"
],
"status": "compliant"
},
"meta": [
{
"kind": "Unit",
"name": "m",
"persistableReference": "{\"abcd\":{\"a\":0.0,\"b\":1.0,\"c\":1.0,\"d\":0.0},\"symbol\":\"m\",\"baseMeasurement\":{\"ancestry\":\"L\",\"type\":\"UM\"},\"type\":\"UAD\"}",
"unitOfMeasureID": "opendes:reference-data--UnitOfMeasure:m:",
"propertyNames": [
"VerticalMeasurements[].VerticalMeasurement"
]
}
],
"data": {
"SpatialLocation": {
"AsIngestedCoordinates": {
"type": "AnyCrsFeatureCollection",
"CoordinateReferenceSystemID": "opendes:reference-data--CoordinateReferenceSystem:23031024:",
"persistableReferenceCrs": "{\"authCode\":{\"auth\":\"OSDU\",\"code\":\"23031024\"},\"lateBoundCRS\":{\"authCode\":{\"auth\":\"EPSG\",\"code\":\"23031\"},\"name\":\"ED_1950_UTM_Zone_31N\",\"type\":\"LBC\",\"ver\":\"PE_10_3_1\",\"wkt\":\"PROJCS[\\\"ED_1950_UTM_Zone_31N\\\",GEOGCS[\\\"GCS_European_1950\\\",DATUM[\\\"D_European_1950\\\",SPHEROID[\\\"International_1924\\\",6378388.0,297.0]],PRIMEM[\\\"Greenwich\\\",0.0],UNIT[\\\"Degree\\\",0.0174532925199433]],PROJECTION[\\\"Transverse_Mercator\\\"],PARAMETER[\\\"False_Easting\\\",500000.0],PARAMETER[\\\"False_Northing\\\",0.0],PARAMETER[\\\"Central_Meridian\\\",3.0],PARAMETER[\\\"Scale_Factor\\\",0.9996],PARAMETER[\\\"Latitude_Of_Origin\\\",0.0],UNIT[\\\"Meter\\\",1.0],AUTHORITY[\\\"EPSG\\\",23031]]\"},\"name\":\"ED50 * EPSG-Nor S62 2001 / UTM zone 31N [23031,1613]\",\"singleCT\":{\"authCode\":{\"auth\":\"EPSG\",\"code\":\"1613\"},\"name\":\"ED_1950_To_WGS_1984_24\",\"type\":\"ST\",\"ver\":\"PE_10_3_1\",\"wkt\":\"GEOGTRAN[\\\"ED_1950_To_WGS_1984_24\\\",GEOGCS[\\\"GCS_European_1950\\\",DATUM[\\\"D_European_1950\\\",SPHEROID[\\\"International_1924\\\",6378388.0,297.0]],PRIMEM[\\\"Greenwich\\\",0.0],UNIT[\\\"Degree\\\",0.0174532925199433]],GEOGCS[\\\"GCS_WGS_1984\\\",DATUM[\\\"D_WGS_1984\\\",SPHEROID[\\\"WGS_1984\\\",6378137.0,298.257223563]],PRIMEM[\\\"Greenwich\\\",0.0],UNIT[\\\"Degree\\\",0.0174532925199433]],METHOD[\\\"Position_Vector\\\"],PARAMETER[\\\"X_Axis_Translation\\\",-90.365],PARAMETER[\\\"Y_Axis_Translation\\\",-101.13],PARAMETER[\\\"Z_Axis_Translation\\\",-123.384],PARAMETER[\\\"X_Axis_Rotation\\\",0.333],PARAMETER[\\\"Y_Axis_Rotation\\\",0.077],PARAMETER[\\\"Z_Axis_Rotation\\\",0.894],PARAMETER[\\\"Scale_Difference\\\",1.994],AUTHORITY[\\\"EPSG\\\",1613]]\"},\"type\":\"EBC\",\"ver\":\"PE_10_3_1\"}",
"features": [
{
"type": "AnyCrsFeature",
"geometry": {
"type": "AnyCrsPoint",
"coordinates": [
700113.0,
5757315.0
]
},
"properties": {}
}
]
}
},
"FacilityTypeID": "opendes:reference-data--FacilityType:Wellbore:",
"FacilityName": "1001",
"SequenceNumber": 1,
"VerticalMeasurements": [
{
"VerticalMeasurementID": "TD-Original",
"VerticalMeasurement": 662.0,
"VerticalMeasurementTypeID": "opendes:reference-data--VerticalMeasurementType:TD:",
"VerticalMeasurementPathID": "opendes:reference-data--VerticalMeasurementPath:MD:",
"VerticalReferenceID": "Measured_From"
},
{
"VerticalMeasurementID": "TVD",
"VerticalMeasurement": 662.0,
"VerticalMeasurementTypeID": "opendes:reference-data--VerticalMeasurementType:TD:",
"VerticalMeasurementPathID": "opendes:reference-data--VerticalMeasurementPath:TVD:",
"VerticalReferenceID": "Measured_From"
},
{
"VerticalMeasurementID": "Measured_From",
"VerticalMeasurement": 11.01,
"VerticalMeasurementTypeID": "opendes:reference-data--VerticalMeasurementType:RT:",
"VerticalMeasurementPathID": "opendes:reference-data--VerticalMeasurementPath:ELEV:"
}
],
"DefaultVerticalMeasurementID": "Measured_From",
"ProjectedBottomHoleLocation": {
"AsIngestedCoordinates": {
"type": "AnyCrsFeatureCollection",
"CoordinateReferenceSystemID": "opendes:reference-data--CoordinateReferenceSystem:23031024:",
"persistableReferenceCrs": "{\"authCode\":{\"auth\":\"OSDU\",\"code\":\"23031024\"},\"lateBoundCRS\":{\"authCode\":{\"auth\":\"EPSG\",\"code\":\"23031\"},\"name\":\"ED_1950_UTM_Zone_31N\",\"type\":\"LBC\",\"ver\":\"PE_10_3_1\",\"wkt\":\"PROJCS[\\\"ED_1950_UTM_Zone_31N\\\",GEOGCS[\\\"GCS_European_1950\\\",DATUM[\\\"D_European_1950\\\",SPHEROID[\\\"International_1924\\\",6378388.0,297.0]],PRIMEM[\\\"Greenwich\\\",0.0],UNIT[\\\"Degree\\\",0.0174532925199433]],PROJECTION[\\\"Transverse_Mercator\\\"],PARAMETER[\\\"False_Easting\\\",500000.0],PARAMETER[\\\"False_Northing\\\",0.0],PARAMETER[\\\"Central_Meridian\\\",3.0],PARAMETER[\\\"Scale_Factor\\\",0.9996],PARAMETER[\\\"Latitude_Of_Origin\\\",0.0],UNIT[\\\"Meter\\\",1.0],AUTHORITY[\\\"EPSG\\\",23031]]\"},\"name\":\"ED50 * EPSG-Nor S62 2001 / UTM zone 31N [23031,1613]\",\"singleCT\":{\"authCode\":{\"auth\":\"EPSG\",\"code\":\"1613\"},\"name\":\"ED_1950_To_WGS_1984_24\",\"type\":\"ST\",\"ver\":\"PE_10_3_1\",\"wkt\":\"GEOGTRAN[\\\"ED_1950_To_WGS_1984_24\\\",GEOGCS[\\\"GCS_European_1950\\\",DATUM[\\\"D_European_1950\\\",SPHEROID[\\\"International_1924\\\",6378388.0,297.0]],PRIMEM[\\\"Greenwich\\\",0.0],UNIT[\\\"Degree\\\",0.0174532925199433]],GEOGCS[\\\"GCS_WGS_1984\\\",DATUM[\\\"D_WGS_1984\\\",SPHEROID[\\\"WGS_1984\\\",6378137.0,298.257223563]],PRIMEM[\\\"Greenwich\\\",0.0],UNIT[\\\"Degree\\\",0.0174532925199433]],METHOD[\\\"Position_Vector\\\"],PARAMETER[\\\"X_Axis_Translation\\\",-90.365],PARAMETER[\\\"Y_Axis_Translation\\\",-101.13],PARAMETER[\\\"Z_Axis_Translation\\\",-123.384],PARAMETER[\\\"X_Axis_Rotation\\\",0.333],PARAMETER[\\\"Y_Axis_Rotation\\\",0.077],PARAMETER[\\\"Z_Axis_Rotation\\\",0.894],PARAMETER[\\\"Scale_Difference\\\",1.994],AUTHORITY[\\\"EPSG\\\",1613]]\"},\"type\":\"EBC\",\"ver\":\"PE_10_3_1\"}",
"features": [
{
"type": "AnyCrsFeature",
"geometry": {
"type": "AnyCrsPoint",
"coordinates": [
700113.0,
5757315.0
]
},
"properties": {}
}
]
}
}
}
}
]
}
}
}'
```

---
**In IBM R3 M19 environment some of the reference data for Entity type UNIT is missing**
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/564 (Kamlesh Todai, 2023-08-24)

The following UNIT reference data for entity reference-data--UnitOfMeasure is missing:
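Several of the SRNs in the list below are URL-encoded; decoding them (stdlib sketch, added for readability only) shows the actual unit symbols that are missing:

```python
from urllib.parse import unquote

encoded = [
    "opendes:reference-data--UnitOfMeasure:g%2Fcm3",
    "opendes:reference-data--UnitOfMeasure:m3%2Fm3",
    "opendes:reference-data--UnitOfMeasure:m%2Fh",
]
# %2F decodes to "/": g/cm3, m3/m3, m/h
decoded = [unquote(s) for s in encoded]
```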
```plaintext
'Missing parents: {SRN: opendes:reference-data--UnitOfMeasure:in,
SRN: opendes:reference-data--UnitOfMeasure:gAPI,
SRN: opendes:reference-data--UnitOfMeasure:cC,
SRN: opendes:dataset--File.Generic:999283442886,
SRN: opendes:reference-data--UnitOfMeasure:psi,
SRN: opendes:reference-data--UnitOfMeasure:ohm.m,
SRN: opendes:reference-data--UnitOfMeasure:g%2Fcm3,
SRN: opendes:reference-data--UnitOfMeasure:h,
SRN: opendes:reference-data--UnitOfMeasure:m3%2Fm3,
SRN: opendes:reference-data--UnitOfMeasure:m%2Fh,
SRN: opendes:reference-data--UnitOfMeasure:degC}'
```

---
**Energistics XML parser DAG not able to execute successfully. Fails while creating a new pod for the operator**
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/563 (Kamlesh Todai, 2023-08-16)

When one looks at the DAG runs, there has not been a single successful run. It fails while creating a new pod:
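The traceback below ends in "Pod took too long to start", which `pod_launcher.start_pod` raises once `startup_timeout_seconds` (default 120) elapses without the pod reaching Running. The polling loop it implements looks roughly like this sketch (illustrative, not the actual Airflow code); if the pod is merely slow to schedule or pull its image, raising `startup_timeout_seconds` on the `KubernetesPodOperator` is a first thing to try:

```python
import time

def wait_for_pod_start(poll_fn, startup_timeout_seconds=120, interval_s=2):
    """Poll pod state until Running, raising once the startup timeout expires.

    poll_fn() -> pod phase string; injected so the logic is testable
    without a cluster. Mirrors pod_launcher.start_pod's behaviour.
    """
    deadline = time.monotonic() + startup_timeout_seconds
    while time.monotonic() < deadline:
        if poll_fn() == "Running":
            return True
        time.sleep(interval_s)
    raise TimeoutError("Pod took too long to start")
```

If a longer timeout does not help, `kubectl describe pod` on the stuck witsml-parser pod should show whether the cause is an image pull failure or insufficient node resources.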
```plaintext
[2023-08-04 20:00:50,255] {pod_launcher.py:128} WARNING - Pod not yet started: witsml-parser-task.10c656bb27934c3c90951ea67d10e8a0
[2023-08-04 20:00:50,272] {taskinstance.py:1463} ERROR - Task failed with exception
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/providers/cncf/kubernetes/operators/kubernetes_pod.py", line 367, in execute
final_state, remote_pod, result = self.create_new_pod_for_operator(labels, launcher)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/providers/cncf/kubernetes/operators/kubernetes_pod.py", line 520, in create_new_pod_for_operator
launcher.start_pod(self.pod, startup_timeout=self.startup_timeout_seconds)
File "/home/airflow/.local/lib/python3.8/site-packages/tenacity/__init__.py", line 329, in wrapped_f
return self.call(f, *args, **kw)
File "/home/airflow/.local/lib/python3.8/site-packages/tenacity/__init__.py", line 409, in call
do = self.iter(retry_state=retry_state)
File "/home/airflow/.local/lib/python3.8/site-packages/tenacity/__init__.py", line 356, in iter
return fut.result()
File "/usr/local/lib/python3.8/concurrent/futures/_base.py", line 437, in result
return self.__get_result()
File "/usr/local/lib/python3.8/concurrent/futures/_base.py", line 389, in __get_result
raise self._exception
File "/home/airflow/.local/lib/python3.8/site-packages/tenacity/__init__.py", line 412, in call
result = fn(*args, **kwargs)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/providers/cncf/kubernetes/utils/pod_launcher.py", line 131, in start_pod
raise AirflowException("Pod took too long to start")
airflow.exceptions.AirflowException: Pod took too long to start
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1165, in _run_raw_task
self._prepare_and_execute_task_with_callbacks(context, task)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1283, in _prepare_and_execute_task_with_callbacks
result = self._execute_task(context, task_copy)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1313, in _execute_task
result = task_copy.execute(context=context)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/providers/cncf/kubernetes/operators/kubernetes_pod.py", line 374, in execute
raise AirflowException(f'Pod Launching failed: {ex}')
airflow.exceptions.AirflowException: Pod Launching failed: Pod took too long to start
[2023-08-04 20:00:50,273] {taskinstance.py:1506} INFO - Marking task as FAILED. dag_id=Energistics_xml_ingest, task_id=witsml_parser_task, execution_date=20230804T195547, start_date=20230804T195549, end_date=20230804T200050
[2023-08-04 20:00:50,318] {local_task_job.py:151} INFO - Task exited with return code 1
[2023-08-04 20:00:50,339] {local_task_job.py:261} INFO - 0 downstream tasks scheduled from follow-on schedule check
```

---
**CSV parser DAG on IBM R3 M19 environment is failing. Issue seems to be configuration and/or resource allocation**
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/562 (Kamlesh Todai, 2023-08-22)

When one looks at the DAG runs for the CSV parser, not a single run has been successful.
The log indicates an issue with an undefined dict attribute (may be a setup/configuration issue):
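The traceback below ends in `jinja2.exceptions.UndefinedError: 'dict object' has no attribute 'correlationId'`, i.e. a templated field in the DAG dereferences a `correlationId` key (presumably via `dag_run.conf`) that the triggering payload never supplied. A stdlib sketch of guarding the trigger payload, with illustrative names rather than the actual DAG code:

```python
def with_correlation_id(conf, run_id):
    """Ensure the workflow trigger payload carries a correlationId.

    If the Workflow service does not inject the key, the DAG's Jinja
    template raises UndefinedError. Falling back to the runId keeps
    the template renderable without mutating the caller's dict.
    """
    patched = dict(conf)
    patched.setdefault("correlationId", run_id)
    return patched

conf = {"Payload": {"AppKey": "test-app", "data-partition-id": "opendes"}}
patched = with_correlation_id(conf, run_id="f3aaac37-1366-4205-a389-e84659b0c6b1")
```

An equivalent server-agnostic fix is to make the template itself defensive, e.g. `{{ dag_run.conf.get('correlationId', '') }}` instead of attribute access.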
```
[2023-08-04 19:00:31,847] {standard_task_runner.py:77} INFO - Job 986: Subtask csv-parser
[2023-08-04 19:00:32,162] {logging_mixin.py:109} INFO - Running <TaskInstance: csv-parser-dag.csv-parser 2023-08-04T18:55:28.858482+00:00 [running]> on host ***-worker-0.***-worker.osdu.svc.cluster.local
[2023-08-04 19:00:32,194] {taskinstance.py:1463} ERROR - Task failed with exception
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1165, in _run_raw_task
self._prepare_and_execute_task_with_callbacks(context, task)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1246, in _prepare_and_execute_task_with_callbacks
self.render_templates(context=context)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/taskinstance.py", line 1755, in render_templates
self.task.render_template_fields(context)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/baseoperator.py", line 997, in render_template_fields
self._do_render_template_fields(self, self.template_fields, context, jinja_env, set())
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/baseoperator.py", line 1010, in _do_render_template_fields
rendered_content = self.render_template(content, context, jinja_env, seen_oids)
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/baseoperator.py", line 1061, in render_template
return [self.render_template(element, context, jinja_env) for element in content]
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/baseoperator.py", line 1061, in <listcomp>
return [self.render_template(element, context, jinja_env) for element in content]
File "/home/airflow/.local/lib/python3.8/site-packages/airflow/models/baseoperator.py", line 1047, in render_template
return jinja_env.from_string(content).render(**context)
File "/home/airflow/.local/lib/python3.8/site-packages/jinja2/environment.py", line 1090, in render
self.environment.handle_exception()
File "/home/airflow/.local/lib/python3.8/site-packages/jinja2/environment.py", line 832, in handle_exception
reraise(*rewrite_traceback_stack(source=source))
File "/home/airflow/.local/lib/python3.8/site-packages/jinja2/_compat.py", line 28, in reraise
raise value.with_traceback(tb)
File "<template>", line 1, in top-level template code
jinja2.exceptions.UndefinedError: 'dict object' has no attribute 'correlationId'
[2023-08-04 19:00:32,197] {taskinstance.py:1506} INFO - Marking task as FAILED. dag_id=csv-parser-dag, task_id=csv-parser, execution_date=20230804T185528, start_date=20230804T190031, end_date=20230804T190032
```

---
**Restrict high limit value on get schema endpoint**
https://community.opengroup.org/osdu/platform/system/schema-service/-/issues/133 (Abhishek Kumar (SLB), 2023-08-07)

At present there is no upper limit enforced by the API on the `limit` parameter when getting the list of Schema Info.
The default value is 100 when `limit` is not explicitly provided; likewise, there should be an upper bound to restrict abuse of this endpoint.
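A sketch of the kind of server-side bound being requested, with illustrative values (the cap is not specified by the issue):

```python
DEFAULT_LIMIT = 100   # applied when the client omits ?limit=
MAX_LIMIT = 1000      # illustrative cap; the issue only asks for *some* upper bound

def effective_limit(requested=None):
    """Clamp the client-supplied limit into [1, MAX_LIMIT]."""
    if requested is None:
        return DEFAULT_LIMIT
    return max(1, min(requested, MAX_LIMIT))
```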
---
**docker-desktop Installation issue?**
https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-gcp-provisioning/-/issues/25 (Shane Hutchins, 2023-08-16)
- Docker Desktop: 4.21.1 (114176)
- istioctl: 1.18.2
- OS/Arch: darwin/arm64
custom-values:
- domain: "localhost"
- useHttps: false
- limitsEnabled: false
- useInternalServerUrl: true
`% kubectl get po|egrep "Error|CrashLoopBackOff"`
<details><summary>Click to expand</summary>
```plaintext
airflow-bootstrap-deployment-78959b9768-b426s 0/1 CrashLoopBackOff 11 (80s ago) 58m
elastic-bootstrap-deployment-75444774c7-f64km 0/1 CrashLoopBackOff 12 (4m23s ago) 58m
elasticsearch-0 0/1 CrashLoopBackOff 13 (4m54s ago) 58m
entitlements-7c78fffcc9-cr6xf 1/2 CrashLoopBackOff 9 (2m30s ago) 58m
entitlements-bootstrap-58566c8847-45k47 0/1 CrashLoopBackOff 12 (2m12s ago) 58m
file-84864855f9-wfk74 1/2 CrashLoopBackOff 8 (4m3s ago) 57m
indexer-869f9fd455-pswx2 1/2 CrashLoopBackOff 10 (4m28s ago) 58m
keycloak-bootstrap-deployment-76cf5d84bc-kb4d2 0/1 CrashLoopBackOff 11 (2m43s ago) 57m
notification-575f8db75c-2g75f 0/1 CrashLoopBackOff 9 (3m43s ago) 57m
partition-bootstrap-7856b465f9-lbgwt 1/2 CrashLoopBackOff 13 (3m51s ago) 58m
rabbitmq-bootstrap-deployment-bb7f6c6fd-zzdx4 0/1 CrashLoopBackOff 11 (4m30s ago) 58m
schema-bootstrap-7ff4dd7d5f-qgv7z 0/1 CrashLoopBackOff 12 (2m33s ago) 57m
storage-8658c8d47d-bxd7l 0/1 CrashLoopBackOff 9 (3m8s ago) 57m
wellbore-68b8c4799b-slmrf 0/1 CrashLoopBackOff 20 (4m6s ago) 57m
workflow-bootstrap-859dc856c6-4lfp6 0/1 CrashLoopBackOff 12 (116s ago) 57m
```
</details>
`% helm install -f custom-values.yaml osdu-baremetal oci://community.opengroup.org:5555/osdu/platform/deployment-and-operations/infra-gcp-provisioning/gc-helm/osdu-gc-baremetal`
<details><summary>Click to expand</summary>
```plaintext
Pulled: community.opengroup.org:5555/osdu/platform/deployment-and-operations/infra-gcp-provisioning/gc-helm/osdu-gc-baremetal:0.22.2
Digest: sha256:76f7b528ba2d8266ce8fa9274a93fe47240f294cfa492d9d7c0dc6dbfc63364e
W0803 14:23:07.067208 96416 warnings.go:70] configured AuthorizationPolicy will deny all traffic to TCP ports under its scope due to the use of only HTTP attributes in a DENY rule; it is recommended to explicitly specify the port
W0803 14:23:07.073368 96416 warnings.go:70] configured AuthorizationPolicy will deny all traffic to TCP ports under its scope due to the use of only HTTP attributes in a DENY rule; it is recommended to explicitly specify the port
NAME: osdu-baremetal
LAST DEPLOYED: Thu Aug 3 14:22:54 2023
NAMESPACE: default
STATUS: deployed
REVISION: 1
TEST SUITE: None
```
</details>

---
**Cannot trigger the DAG Osdu_ingest_by_reference to execute manifest ingestion by Reference in IBM R3 M19 environment**
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/559 (Kamlesh Todai, 2023-08-08)

Request
```
curl --location 'https://cpd-osdu.apps.ibmosdu-preship.lndu.p1.openshiftapps.com/osdu-workflow/api/workflow/v1/workflow/Osdu_ingest_by_reference/workflowRun' \
--header 'data-partition-id: opendes' \
--header 'Content-Type: application/json' \
--header 'Authorization: Bearer eyJhbGciOiJSUzI1NiIsIn...JjQhFlbeJb0gg' \
--data '{ "executionContext": { "Payload": { "AppKey": "test-app", "data-partition-id": "opendes" }, "manifest": "opendes:dataset--File.Generic:5bb42ca88eca49da90c48272601fca68" } }'
```
Response
```json
{
  "code": 500,
  "reason": "could not create addressopendes-status-event-default-topic",
  "message": "io.netty.util.internal.ReferenceCountUpdater.setInitialValue(Lio/netty/util/ReferenceCounted;)V"
}
```
@anujgupta @chad @debasisc @davidglass

---
**[Feature] Airflow2 stage with private endpoints**
https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/issues/315 (Arturo Hernandez [EPAM], 2023-08-02)

# Airflow2 stage
---
By default, airflow2 is deployed at the service-resources stage; one airflow instance is configured for the whole OSDU.
It looks like one airflow2 is not enough for service resources in a multi-partition environment; therefore, airflow2 is deployed externally per data partition, in a separate network and subnet (brand-new airflow2 resources will be created).
To secure traffic and improve performance when using an external airflow2, we need to set up private endpoints for those resources, including a private endpoint for the partition-airflow2 application gateway from the main AKS cluster.
Airflow2 interacts mostly with the storage accounts, therefore I guess those private endpoints will be needed as well.
## Airflow2 independent stage
---
I have a strong opinion that airflow2 should be segregated from the partition resources. If there is a need for a new external airflow, it should be created as a separate stage (like data-partition, service, central): an "airflow" resources stage, which should provide a configured airflow out of the box.
## ADF Replacement
---
To achieve convergence between ADME and community, we might want to start thinking about Azure Data Factory, which is already available in [terraform - AzureRM](https://registry.terraform.io/providers/hashicorp/azurerm/latest/docs/resources/data_factory.html). We might start this migration smoothly at the data partition level and then at the service resource level.
## Action items
---
@lucynliu @nursheikh I would like to start this discussion here in the forum; it would be nice to start convergence between community and ADME. I have the feeling we should get rid of the per-partition Airflow2 resources (including the AKS cluster for airflow), and at this first stage consider using ADF per partition as an optional feature, then move forward with ADF at Service Resources (I don't know if this is really convenient).
We should also include the optional feature of private endpoints from AKS to ADF/AKS-Airflow in any case.
cc. @lucynliu @vleskiv Arturo Hernandez [EPAM]

https://community.opengroup.org/osdu/platform/consumption/geospatial/-/issues/284
SPIKE: Data - Investigate options for incremental updates (Noel Okanya, 2023-08-02)
Cache updates should take into account what records have been loaded since the last update, and then attempt to add only those new records instead of doing a complete destroy-rebuild of the Ignite cache.
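One possible shape for such an incremental refresh, as a hedged Python sketch: the Search query field name `modifyTime` is an assumption (the exact property is precisely this issue's open question), and a plain dict stands in for the Ignite cache.

```python
# Illustrative sketch of an incremental cache refresh (not the service code).
# Assumption: Search API records expose a sortable modification timestamp,
# called "modifyTime" here; the real property name is this issue's question.

def build_incremental_query(kind, last_sync_millis, limit=1000):
    """Build a Search API query body for records changed since the last sync."""
    return {
        "kind": kind,
        "query": "modifyTime:[{} TO *]".format(last_sync_millis),  # assumed field
        "limit": limit,
    }

def apply_increment(cache, records):
    """Upsert only new or changed records instead of a destroy-rebuild."""
    changed = 0
    for rec in records:
        if cache.get(rec["id"]) != rec:
            cache[rec["id"]] = rec
            changed += 1
    return changed
```

Records that violate the fixed Ignite schema would still need to be filtered out before the upsert, per the schema constraint noted in this issue.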
Important details:
- What property on a Search API record dictates the last time the record was modified or added?
- Need to enforce that new records abide by the schema currently set for the Ignite cache. New records that break the Ignite schema will not be added. The schema cannot be modified once it is in place.

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/154
Workflow Run API - requires dataPartitionId in body as well as header (Surabhi Seth, 2023-10-26)
API: Workflow Service API > Workflow Run `/workflow/{workflow_name}/workflowRun`

This service takes data-partition-id as part of the headers as well as in the payload body: { "executionContext": { "id": "string", **"dataPartitionId": "string"** }, "runId": "string" }

![MicrosoftTeams-image__5_](/uploads/5e8d61cdc1316019ab905597094525b9/MicrosoftTeams-image__5_.png)

Issue: Requesting dataPartitionId in the payload body is redundant and inconsistent with the implementation of all other OSDU APIs, where data-partition-id is taken from the header.
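A hedged sketch of the proposed shape, where the partition id travels only in the header (illustrative, not the service implementation; field names mirror the payload above):

```python
# Sketch: build a workflowRun request in which data-partition-id appears
# only in the header, never in executionContext (the change proposed here).

def build_workflow_run_request(base_url, workflow_name, partition, token, run_id=None):
    body = {"executionContext": {"id": "string"}}  # no dataPartitionId in the body
    if run_id is not None:
        body["runId"] = run_id
    return {
        "url": "{}/workflow/{}/workflowRun".format(base_url, workflow_name),
        "headers": {
            "data-partition-id": partition,  # single source of truth
            "Content-Type": "application/json",
            "Authorization": "Bearer " + token,
        },
        "json": body,
    }
```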
Ref: https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/blob/master/docs/api/openapi.workflow.yaml?plain=0
Chad Leong, Deepa Kumari

https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/558
IBM M19 Policy engine incorrect settings (Dadong Zhou, 2023-08-29)

Kamlesh reported the errors when calling the Policy Evaluate API:
```
{
"result": {
"records": [
{
"errors": [
{
"code": 404,
"id": "opendes:master-data--Well:test1111111111",
"message": "Entitlements response 404 Error 404 - Not Found ",
"reason": "Unauthorized"
},
{
"code": 404,
"id": "opendes:master-data--Well:test1111111111",
"message": "Legal response 404 Error 404 - Not Found ",
"reason": "Error from compliance service"
},
{
"code": "403",
"id": "opendes:master-data--Well:test1111111111",
"message": "The user is not authorized to perform this action",
"reason": "Access denied"
}
],
"id": "opendes:master-data--Well:test1111111111"
}
]
}
}
```
I confirmed the error in the IBM M19 environment.
From the error messages, it seems the two environment variables ENTITLEMENTS_BASE_URL and LEGAL_BASE_URL in the Policy engine are not set correctly. Please check and correct.
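A small, hedged sanity check that could be run in the policy service environment (only the two variable names come from this report; everything else is illustrative):

```python
import os
from urllib.parse import urlparse

# The two variables this report suspects are misconfigured.
REQUIRED_VARS = ("ENTITLEMENTS_BASE_URL", "LEGAL_BASE_URL")

def check_base_urls(env=None):
    """Return {var: problem} for vars that are missing or not http(s) URLs."""
    env = os.environ if env is None else env
    problems = {}
    for name in REQUIRED_VARS:
        value = env.get(name)
        if not value:
            problems[name] = "not set"
        else:
            parsed = urlparse(value)
            if parsed.scheme not in ("http", "https") or not parsed.netloc:
                problems[name] = "not a valid http(s) URL: " + value
    return problems
```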
Thanks.
cc @todaiks Anuj Gupta vikas rana

https://community.opengroup.org/osdu/platform/system/reference/crs-conversion-service/-/issues/76
convertTrajectory API returns NaN value for the input request (KIRAN ALLAMSETY, 2024-03-13)

The convertTrajectory API internally calls the convert API; when the convert API throws an error, it is not handled, so the WGS84 coordinates for x and y are set to "NaN" in the convertTrajectory response.
Request:
{
"azimuthReference": "GN",
"interpolate": false,
"referencePoint": {
"x": 400000,
"y": 6500000,
"z": 100
},
"unitZ": "osdu:reference-data--UnitOfMeasure:ft:",
"unitMD": "osdu:reference-data--UnitOfMeasure:m:",
"inputStations": [
{
"md": 0,
"inclination": 0,
"azimuth": 20
},
{
"md": 100,
"inclination": 10,
"azimuth": 40
}
],
"trajectoryCRS": "osdu:reference-data--CoordinateReferenceSystem:Projected:EPSG::32066:",
"inputKind": "MD_Incl_Azim",
"method": "AzimuthalEquidistant"
}
Response:
{
"trajectoryCRS": "{\"authCode\":{\"auth\":\"EPSG\",\"code\":\"32066\"},\"name\":\"NAD_1927_BLM_Zone_16N\",\"type\":\"LBC\",\"ver\":\"PE_10_9_1\",\"wkt\":\"PROJCS[\\\"NAD_1927_BLM_Zone_16N\\\",GEOGCS[\\\"GCS_North_American_1927\\\",DATUM[\\\"D_North_American_1927\\\",SPHEROID[\\\"Clarke_1866\\\",6378206.4,294.9786982]],PRIMEM[\\\"Greenwich\\\",0.0],UNIT[\\\"Degree\\\",0.0174532925199433]],PROJECTION[\\\"Transverse_Mercator\\\"],PARAMETER[\\\"False_Easting\\\",1640416.666666667],PARAMETER[\\\"False_Northing\\\",0.0],PARAMETER[\\\"Central_Meridian\\\",-87.0],PARAMETER[\\\"Scale_Factor\\\",0.9996],PARAMETER[\\\"Latitude_Of_Origin\\\",0.0],UNIT[\\\"Foot_US\\\",0.3048006096012192],AUTHORITY[\\\"EPSG\\\",32066]]\"}",
"unitXY": "{\"abcd\":{\"a\":0.0,\"b\":1200.0,\"c\":3937.0,\"d\":0.0},\"symbol\":\"ft[US]\",\"baseMeasurement\":{\"ancestry\":\"L\",\"type\":\"UM\"},\"type\":\"UAD\"}",
"unitZ": "{\"abcd\":{\"a\":0.0,\"b\":0.3048,\"c\":1.0,\"d\":0.0},\"symbol\":\"ft\",\"baseMeasurement\":{\"ancestry\":\"L\",\"type\":\"UM\"},\"type\":\"UAD\"}",
"unitDls": "{\"scaleOffset\":{\"scale\":5.72614583987641E-4,\"offset\":0.0},\"symbol\":\"deg/100ft\",\"baseMeasurement\":{\"ancestry\":\"Rotation_Per_Length\",\"type\":\"UM\"},\"type\":\"USO\"}",
"stations": [
{
"md": 0.0,
"inclination": 0.0,
"azimuthTN": 18.903042055778428,
"azimuthGN": 20.0,
"dxTN": 0.0,
"dyTN": 0.0,
"point": {
"x": 121920.24384049278,
"y": 1981203.9624078341,
"z": 99.99999999999999
},
"wgs84Longitude": "NaN",
"wgs84Latitude": "NaN",
"dls": 0.0,
"original": true,
"dz": 0.0
},
{
"md": 100.0,
"inclination": 10.0,
"azimuthTN": 38.90304205577843,
"azimuthGN": 40.0,
"dxTN": 17.934591286244686,
"dyTN": 22.224168064134773,
"point": {
"x": 121925.84664895518,
"y": 1981210.6395745277,
"z": -226.42085631407443
},
"wgs84Longitude": "NaN",
"wgs84Latitude": "NaN",
"dls": 3.048000000000004,
"original": true,
"dz": 326.4208563140744
}
],
"localCRS": "{\"name\":\"Azimuthal Equidistant\",\"type\":\"LBC\",\"ver\":\"PE_10_9_1\",\"wkt\":\"PROJCS[\\\"Azimuthal Equidistant Lng=-90.56722112;Lat=17.88719244\\\",GEOGCS[\\\"GCS_North_American_1927\\\",DATUM[\\\"D_North_American_1927\\\",SPHEROID[\\\"Clarke_1866\\\",6378206.4,294.9786982]],PRIMEM[\\\"Greenwich\\\",0.0],UNIT[\\\"Degree\\\",0.0174532925199433]],PROJECTION[\\\"Modified Azimuthal_Equidistant\\\"],PARAMETER[\\\"False_Easting\\\",0.0],PARAMETER[\\\"False_Northing\\\",0.0],PARAMETER[\\\"Central_Meridian\\\",-90.56722111685697],PARAMETER[\\\"Latitude_Of_Origin\\\",17.887192439357598],UNIT[\\\"Foot_US\\\",0.3048006096012192]]\"}",
"method": "AzimuthalEquidistant",
"operationsApplied": [
"derived TN from GN azimuth by grid convergence 358.903042",
"unitMD Factor value: 1.0 is used for computation of MD",
"computed deflections via minimum curvature method",
"computation method: AzimuthalEquidistant",
"conversion from 'Azimuthal Equidistant' to 'GCS_North_American_1927'",
"conversion from 'GCS_North_American_1927' to 'NAD_1927_BLM_Zone_16N'"
],
"scaleConvergenceList": [
{
"scalefactor": 1.001368,
"convergence": -1.09696,
"point": {
"x": 121920.24384049278,
"y": 1981203.9624078341,
"z": 99.99999999999999
}
},
{
"scalefactor": 1.001368,
"convergence": -1.09695,
"point": {
"x": 121925.84664895518,
"y": 1981210.6395745277,
"z": -226.42085631407443
}
}
],
"unitMD": "{\"abcd\":{\"a\":0.0,\"b\":1.0,\"c\":1.0,\"d\":0.0},\"symbol\":\"m\",\"baseMeasurement\":{\"ancestry\":\"L\",\"type\":\"UM\"},\"type\":\"UAD\"}",
"inputKind": "MD_Incl_Azim"
}M20 - Release 0.23Puneet BhardwajKIRAN ALLAMSETYPuneet Bhardwajhttps://community.opengroup.org/osdu/platform/consumption/geospatial/-/issues/283Postman - Update automated tests2023-07-27T22:27:12ZLevi RemingtonPostman - Update automated testsThe GCZ Postman Collection features a set of automated tests for light validation on features returned by GCZ Feature Layers. These tests are not applicable to every endpoint within GCZ services, but they still apply - and fail.
The GCZ Postman Collection features a set of automated tests for light validation on features returned by GCZ Feature Layers. These tests are not applicable to every endpoint within GCZ services, but they still apply - and fail.
The Collection should be updated to run tests more selectively, so that failure may reliably indicate a problem.
Acceptance Criteria:
- Postman Collection can be run in its entirety without failure

https://community.opengroup.org/osdu/platform/consumption/geospatial/-/issues/282
Transformer - Testing Support - Add Partial Record Loading capability for shorter testing durations (Levi Remington, 2023-10-11)

For the automated JUnit tests, we require a configurable limiter on the getAllRecords function. The goal is to artificially limit the number of records plugged into lengthy workflows so they can be tested 100% without taking hours to complete. This is useful for advanced datatypes which require a great deal of processing.
This is an important task: the more complex workflows are added without automated testing support, the lower our test coverage numbers become.
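The requested limiter could look like this (sketched in Python for brevity; the actual Transformer code is Java, and the names `get_all_records`/`max_records` are illustrative):

```python
def get_all_records(fetch_pages, max_records=None):
    """Accumulate records page by page, honouring an optional hard limit,
    which is the testing knob this issue asks for."""
    records = []
    for page in fetch_pages():
        for record in page:
            records.append(record)
            if max_records is not None and len(records) >= max_records:
                return records  # stop early: enough records for the test run
    return records
```

In the JUnit scenarios, the limit would be set to a small number so even heavyweight workflows complete quickly while still exercising the full code path.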
Acceptance criteria:
- getAllRecords function extended with optional parameter for imposing a hard-limit on the max number of features that can be ingested in total.
- Updated function leveraged in all applicable test scenarios.

https://community.opengroup.org/osdu/platform/consumption/geospatial/-/issues/281
Transformer - Enhance Wellbore Marker Ingestion with dynamic RefPoint (Levi Remington, 2023-07-27)

The upcoming wellbore marker set ingestion workflow will support dynamic creation of Marker Points by interpolating measured depth along a Trajectory. However, **the current assumption is that a Wellbore Trajectory CSV will possess a SurfaceX and SurfaceY column** to dictate the Surface Location in the same CRS as the Trajectory CSV.
The goal of this issue is to expand support to CSVs which do not possess these columns. The workflow would be:
If the CSV does not possess SurfaceX and SurfaceY...
1. Locate the original Well or Wellbore record through a series of ID references
2. Make a query to the Storage API to get the AsIngestedCoordinates of the surface well.
3. Detect the CRS of the coordinates and convert to the WKID of the Trajectory (if necessary)
4. Plug results into the -x and -y inputs of the interpolation script
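The fallback steps above could be sketched roughly as follows (hypothetical helper names; the record traversal, Storage query, and CRS conversion are stubbed, and the interpolation is plain linear interpolation between bracketing stations):

```python
def surface_xy(csv_columns, csv_row, fetch_as_ingested, convert_crs, target_wkid):
    """Prefer SurfaceX/SurfaceY from the CSV; otherwise fall back to the
    Well/Wellbore record's AsIngestedCoordinates (steps 1-4 above)."""
    if "SurfaceX" in csv_columns and "SurfaceY" in csv_columns:
        return csv_row["SurfaceX"], csv_row["SurfaceY"]
    x, y, source_wkid = fetch_as_ingested()        # steps 1-2 (stubbed lookup)
    if source_wkid != target_wkid:                 # step 3: CRS conversion
        x, y = convert_crs(x, y, source_wkid, target_wkid)
    return x, y                                    # step 4: the -x/-y inputs

def interpolate_at_md(stations, md):
    """Linear interpolation of (x, y) between the two stations bracketing md.
    stations: list of (md, x, y) tuples sorted by md."""
    for (md0, x0, y0), (md1, x1, y1) in zip(stations, stations[1:]):
        if md0 <= md <= md1:
            t = (md - md0) / (md1 - md0)
            return (x0 + t * (x1 - x0), y0 + t * (y1 - y0))
    raise ValueError("md outside trajectory range")
```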
Acceptance Criteria:
- Wellbore Markers produced and displayed on a map from a MarkerSet record where the related Trajectory CSV did not contain SurfaceX or SurfaceY columns

https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/104
'path' parameter should be optional not required (Zachary Keirn, 2023-08-29)

The API docs all state that 'path' is a required field. It looks like it is actually optional, and I am told by Mark Yan that the service inserts a "/" if it is not provided. Current collections for testing this all set the 'path' parameter to an empty string in a pre-request script, but they could be made clearer by just not entering this parameter at all. Also, I would like to see an example of when setting the path is needed and how it should be set in that circumstance. From reading the API doc, it seems to suggest that you would enter the path to the SEG-Y file, but if you do that you get the following error, which seems to insert slashes at the start and end of the provided 'path' parameter: The 'path' parameter /sd://osdu/testtenant2/ST0202R08_PS_PSDM_RAW_PP_TIME.MIG_RAW.POST_STACK.3D.JS-017534.segy/ is in a wrong format. It should match the regex expression ^[/A-Za-z0-9_.-]*$.
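The format error can be reproduced locally with the regex quoted in the message (an illustrative check, not the service's code; the folder path below is a made-up example):

```python
import re

# Regex quoted by the seismic-store error message.
PATH_RE = re.compile(r"^[/A-Za-z0-9_.-]*$")

def normalized(path):
    """Mimic the reported behaviour of wrapping the value in slashes."""
    return "/{}/".format(path)

# A plain folder-style path passes; a full sd:// URI fails on the ':' character.
assert PATH_RE.fullmatch(normalized("testtenant2/folderA")) is not None
assert PATH_RE.fullmatch(normalized("sd://osdu/testtenant2/file.segy")) is None
```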
In this case, 'path' was set in the params as "sd://osdu/testtenant2/ST0202R08_PS_PSDM_RAW_PP_TIME.MIG_RAW.POST_STACK.3D.JS-017534.segy".
Mark Yan

https://community.opengroup.org/osdu/platform/system/register/-/issues/46
Use Secret service for storing and fetching subscriber secrets (Rustam Lotsmanenko (EPAM), rustam_lotsmanenko@epam.com, 2023-11-08)