Ingestion Workflow issues
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues

---

**Issue #160 — Manifest-based ingestion workflow - Custom attributes check fails**
Samiullah Ghousudeen · 2023-11-29
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/160
The reference data manifest below, containing custom attributes, was successfully ingested against the WKS schema `osdu:wks:reference-data--FacilityType:1.0.0` using the Manifest-based ingestion workflow.
It looks like the manifest ingestion workflow fails to validate the custom attributes against the WKS schema declared in the manifest.
:zap: _Note: this test was performed in a pre-shipping Azure environment. It was also tested in a pre-shipping GCP environment, with the same results._
cc @chad @debasisc @todaiks
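The failing check amounts to rejecting attributes that the WKS schema does not define (the effect of `additionalProperties: false` in a JSON schema). A minimal sketch of such a check, using an illustrative allowed-attribute set rather than the real FacilityType schema:

```python
# Illustrative subset of the properties the FacilityType WKS schema defines;
# a real implementation would derive the allowed set from the schema itself.
ALLOWED_ATTRIBUTES = {"Code", "ID", "Name", "Source"}

def unexpected_attributes(data_block: dict) -> list:
    """Return custom attributes on the data block that the schema does not define."""
    return sorted(set(data_block) - ALLOWED_ATTRIBUTES)

manifest_data = {
    "Code": "Well",
    "ID": "Well",
    "Name-test": "Well",
    "test-ingestion": [{"test-id": "12345"}],
}
print(unexpected_attributes(manifest_data))  # → ['Name-test', 'test-ingestion']
```

With a check like this in the workflow, the record below would have been rejected instead of silently ingested.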
```json
{
"data": {
"Code": "Well",
"ID": "Well",
"Name-test": "Well",
"Source": "Workbook Published/FacilityTypeType.1.0.0.xlsx; commit SHA 0b4db59a.",
"test-ingestion": [
{
"test-id": "12345"
}
]
},
"meta": null,
"modifyUser": "preshipping@azureglobal1.onmicrosoft.com",
"modifyTime": "2023-11-08T10:34:42.243Z",
"id": "opendes:reference-data--FacilityType:Well-06112023",
"version": 1699439681616809,
"kind": "osdu:wks:reference-data--FacilityType:1.0.0",
"acl": {
"viewers": [
"data.default.viewers@opendes.contoso.com"
],
"owners": [
"data.default.owners@opendes.contoso.com"
]
},
"legal": {
"legaltags": [
"opendes-Test-Legal-Tag-1007568"
],
"otherRelevantDataCountries": [
"US"
],
"status": "compliant"
},
"createUser": "preshipping@azureglobal1.onmicrosoft.com",
"createTime": "2023-11-06T18:14:31.762Z"
}
```

---

**Issue #156 — listallworkflow API does not return Version of services/DAG deployed**
vinisha krishna · 2023-10-23
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/156

The following API returns Version 1, which appears to be hardcoded. It should return the version of the deployed services/DAGs; with ADME we have no visibility into which version is deployed.
{base_url}/solutions/data-flow/apis/workflow-service#/Workflow/listAllWorkflow
![image](/uploads/c99bde240f7cd2276d899e3f05f7cd73/image.png)

---

**Issue #154 — Workflow Run API - requires dataPartitionId in body as well as header**
Surabhi Seth · 2023-10-26
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/154
API: Workflow Service API > Workflow Run `/workflow/{workflow_name}/workflowRun`

This service takes `data-partition-id` as part of the headers as well as in the payload body:

```json
{
  "executionContext": {
    "id": "string",
    "dataPartitionId": "string"
  },
  "runId": "string"
}
```

![MicrosoftTeams-image__5_](/uploads/5e8d61cdc1316019ab905597094525b9/MicrosoftTeams-image__5_.png)

Issue: requiring `dataPartitionId` in the payload body is redundant and inconsistent with the implementation of all other OSDU APIs, where `data-partition-id` is taken from the header.
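For comparison, a request that follows the header-only convention used by the other OSDU APIs. This is a sketch: the endpoint path, host, and token are placeholders, and the point is simply that `data-partition-id` travels once, in the header, with no duplicate in the body.

```python
import json
import urllib.request

def build_trigger_request(base_url, token, partition, workflow_name, run_id):
    """Build a workflowRun request with data-partition-id in the header only."""
    body = json.dumps({"executionContext": {}, "runId": run_id}).encode()
    return urllib.request.Request(
        f"{base_url}/api/workflow/v1/workflow/{workflow_name}/workflowRun",
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "data-partition-id": partition,  # sent once, in the header
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_trigger_request("https://osdu.example.com", "<token>", "opendes",
                            "Osdu_ingest", "run-0001")
print(req.get_header("Data-partition-id"))  # → opendes
```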
Ref: https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/blob/master/docs/api/openapi.workflow.yaml?plain=0

---

**Issue #153 — IBM workflow integration test failing - for M18**
vikas rana · 2023-06-01 · M18 - Release 0.21
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/153

Failing job: https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/jobs/1996356

---

**Issue #152 — APIs to get the XCOM summary (Entries) are working in AWS environment, but are NOT working in other CSPs (Azure, GC and IBM) environments**
Kamlesh Todai · 2023-11-09
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/152

The APIs to get the xcomEntries using the runId and the taskInstance work in the AWS environment. The endpoints are not implemented/deployed in the other CSPs' (Azure, GC, IBM) environments.
```shell
curl --location 'https://r3m16.forumtesting.osdu.aws/api/airflow/api/v1/dags/Osdu_ingest/dagRuns/45eb9f45-aada-4e2c-b618-818fb5dfcf28/taskInstances/process_single_manifest_file_task/xcomEntries/record_ids' \
--header 'data-partition-id: osdu' \
--header 'Authorization: Bearer eyJraWQiOi...fWbOUA3RcQ'
```
Response: 200 OK

```json
{
  "dag_id": "Osdu_ingest",
  "execution_date": "2023-04-04T21:19:27.327451+00:00",
  "key": "record_ids",
  "task_id": "process_single_manifest_file_task",
  "timestamp": "2023-04-04T21:19:48.761929+00:00",
  "value": "['osdu:reference-data--FacilityType:WELL_999259423605', 'osdu:master-data--Organisation:Auto_Test_999259423605', 'osdu:reference-data--FacilityEventType:SPUD_DATE_999259423605', 'osdu:reference-data--VerticalMeasurementPath:DEPTH_DATUM_ELEV_999259423605', 'osdu:reference-data--AliasNameType:WELL_NAME_999259423605', 'osdu:master-data--Well:999259423605']"
}
```
```shell
curl --location 'https://r3m16.forumtesting.osdu.aws/api/airflow/api/v1/dags/Osdu_ingest/dagRuns/45eb9f45-aada-4e2c-b618-818fb5dfcf28/taskInstances/process_single_manifest_file_task/xcomEntries/skipped_ids' \
--header 'data-partition-id: osdu' \
--header 'Authorization: Bearer eyJraWQiOi...fWbOUA3RcQ'
```
Response: 200 OK

```json
{
  "dag_id": "Osdu_ingest",
  "execution_date": "2023-04-04T21:19:27.327451+00:00",
  "key": "skipped_ids",
  "task_id": "process_single_manifest_file_task",
  "timestamp": "2023-04-04T21:19:48.783236+00:00",
  "value": "[]"
}
```
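The endpoint shape in the curl calls above follows Airflow's stable REST API. A small helper for building the same paths for other run/task/key combinations (host and auth are omitted; only the path is constructed):

```python
def xcom_entry_path(dag_id: str, dag_run_id: str, task_id: str, key: str) -> str:
    """Path for Airflow's stable REST API GET xcomEntries endpoint."""
    return (f"/api/v1/dags/{dag_id}/dagRuns/{dag_run_id}"
            f"/taskInstances/{task_id}/xcomEntries/{key}")

print(xcom_entry_path("Osdu_ingest",
                      "45eb9f45-aada-4e2c-b618-818fb5dfcf28",
                      "process_single_manifest_file_task",
                      "record_ids"))
```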
@chad @debasisc @Srinivasan_Narayanan @dzmitry_malkevich @anujgupta

---

**Issue #151 — Misleading message in Xcom summary when legal tag is missing**
Debasis Chatterjee · 2023-03-16
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/151

I was running a simple test case in AWS/M16/Preship.
I was getting this message.
Now, I get this failure:

```
[{'id': 'osdu:reference-data--FacilityEventType:DC13MAR', 'kind': 'osdu:wks:reference-data--FacilityEventType:1.0.0', 'reason': '400 Client Error: Bad Request for url: http://os-storage.osdu-services:8080/api/storage/v2/records'}]
```
It turns out (thanks to AWS support, Nazeem Akbar Ali) that this happened because the legal tag was not defined; he found the reason by checking the relevant log file.
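A clearer failure could be produced by checking the manifest's legal tags up front instead of surfacing the bare storage 400. A sketch, where the set of tags known to the partition stands in for a lookup against the Legal service (the tag name below is illustrative):

```python
def missing_legal_tags(records, known_tags):
    """Legal tags referenced by records but not defined in the partition."""
    referenced = {tag for rec in records for tag in rec["legal"]["legaltags"]}
    return sorted(referenced - set(known_tags))

records = [{
    "id": "osdu:reference-data--FacilityEventType:DC13MAR",
    "legal": {"legaltags": ["osdu-demo-legaltag"]},
}]
# The tag was never created, so the check names it explicitly
# instead of reporting a generic 400 from the storage service.
print(missing_legal_tags(records, known_tags=set()))  # → ['osdu-demo-legaltag']
```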
See details here:
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/470#note_207244

---

**Issue #150 — Misleading log statements**
Maksim Malkov · 2022-12-12
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/150

The workflow service first searches for a triggered workflow in the provided data partition. A system workflow such as CSV is not available in a data partition, so in such cases the service logs "workflow not found".
The same workflow is then searched for in the system database, where it is found, and processing completes.
But these logs create the impression that some workflow was not found by the workflow service, when in fact there is no such issue.

Milestone: M16 - Release 0.19

---

**Issue #149 — Untagged release image removed**
Dzmitry Malkevich (EPAM) · 2022-09-30
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/149

Image community.opengroup.org:5555/osdu/platform/data-flow/ingestion/ingestion-workflow/osdu-gcp-workflow:34edf0e7 was deployed to DEV2 with pipeline https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/pipelines/134615 on Sep. 8.
Recently this image was removed from the repository, and now DEV2 is partially broken.
Expectation: untagged images created from a release branch are not purged, at least until the next release.

---

**Issue #139 — osdu_ingest - Make dataset optional inside "Data" block of JSON payload**
Debasis Chatterjee · 2022-02-03
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/139

In the current structure, the "Data" block expects all three sections: work product, work product component, and then Dataset.
During a recent discussion with @todaiks and @Kateryna_Kurach, it transpired that a work product component may simply refer to an existing Dataset record (created in a previous step), so we do not want to spend double the effort dealing with the Dataset record.
See collection 29 from Platform Validation, steps 4c and 5a.
https://community.opengroup.org/osdu/platform/testing/-/blob/master/Postman%20Collection/29_CICD_Setup_Ingestion/R3%20Full%20manifest-based%20ingestion.postman_collection.json
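For illustration, a shape the trimmed payload could take: the work product component points at an already-registered dataset record and the manifest's Datasets section is omitted entirely. The kind and ids here are placeholders, not taken from the collection:

```json
{
  "Data": {
    "WorkProduct": { "data": { "Name": "existing-wp" } },
    "WorkProductComponents": [
      {
        "kind": "osdu:wks:work-product-component--WellLog:1.0.0",
        "data": {
          "Datasets": [
            "opendes:dataset--File.Generic:existing-dataset-record:"
          ]
        }
      }
    ]
  }
}
```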
![osdu_ingest-Postman](/uploads/b5b956bf197bd8ce06542fda59697a0b/osdu_ingest-Postman.PNG)

---

**Issue #137 — Osdu_Ingest - Provide additional integrity check to catch inconsistencies in denormalized data (Ex: Master Entity "Play")**
Debasis Chatterjee · 2022-01-03
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/137

When looking at an example provided by the development team (CSV ingestion), I have a question regarding the Master Data (Play) definition.
https://gitlab.opengroup.org/osdu/subcommittees/data-def/work-products/schema/-/blob/master/E-R/master-data/Play.1.0.0.md
data.GeoContexts[].BasinID -> Basin
data.GeoContexts[].GeoTypeID -> BasinType
Isn't the second field unnecessary, as one can find that information from the Basin master record itself?
"BasinTypeID": "namespace:reference-data--BasinType:ArcWrenchOceanContinent:"
In addition, offering two separate fields in the "Play" master record opens up the possibility of conflicting information.
So, a suitable integrity check is required.
-----------------------------------
See notes from @gehrmann -
Hi Debasis,
The schema is de-normalised to support queries by BasinType/GeoPoliticalEntityType/...
Yes, every de-normalisation carries the risk of introducing contradictions. This was considered a trade-off, and worthwhile in the interest of easier query handling.
Finally, it is possible to organise the master data as parent-child structures with self-references. This is most easily understood with the GeoPoliticalEntity hierarchy: country, state, county, ...
Best regards,
Thomas
________________________________________
Additional notes from Thomas
Hi Debasis,
whether or not the extra validation during ingestion is sufficient - I am not so sure. Basin, Play, Prospect, GeoPoliticalEntity are all master-data and therefore subject to continuous improvement. I would think a generic set of data quality rules, which can be re-evaluated after any change, might be a better choice.
The schema, by the way, does mark derived properties (=de-normalised properties) - please check the schema definitions with the dedicated extension tag x-osdu-is-derived:
Example for AbstractGeoBasinContext:
```
"x-osdu-is-derived": {
"RelationshipPropertyName": "BasinID",
"TargetPropertyName": "BasinTypeID"
}
```
In other words: the property GeoTypeID is derived via the sibling property BasinID linking to the target object's property BasinTypeID.
This decoration has been done in other places as well. It should be possible to create a generic implementation of a quality rule covering all of the derived/de-normalised values.
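Such a generic rule can be sketched directly from the `x-osdu-is-derived` decoration: the decorated (derived) property must equal the value of the target property on the object referenced by the sibling relationship property. The record and target payloads below are illustrative, not real data:

```python
# Decoration taken from the AbstractGeoBasinContext example above.
DECORATION = {"RelationshipPropertyName": "BasinID",
              "TargetPropertyName": "BasinTypeID"}

def check_derived(data, derived_prop, decoration, resolve):
    """Generic quality rule: resolve maps a record id to that record's data block."""
    target = resolve(data[decoration["RelationshipPropertyName"]])
    return data[derived_prop] == target[decoration["TargetPropertyName"]]

# Illustrative Basin master record and GeoContexts entry.
basins = {"ns:master-data--Basin:B1:": {
    "BasinTypeID": "namespace:reference-data--BasinType:ArcWrenchOceanContinent:"}}
geo_context = {
    "BasinID": "ns:master-data--Basin:B1:",
    "GeoTypeID": "namespace:reference-data--BasinType:ArcWrenchOceanContinent:",
}
print(check_derived(geo_context, "GeoTypeID", DECORATION, basins.get))  # → True
```

The same function works for any property carrying the decoration, which is what makes a single rule covering all derived values feasible.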
Best regards,
Thomas

---

**Issue #136 — ADR: Workflow Versioning and Update Workflow API**
Vineeth Guna [Microsoft] · 2024-01-17
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/136

# Workflow Versioning

Workflow versioning is a feature to enable seamless running of a newer version of an existing workflow via the ingestion workflow. Below are the design challenges/questions around workflow versioning that we will discuss going forward.
## How to create a new version of an Airflow OSDU DAG?
As Airflow does not have a way to distinguish between two different versions of the same DAG, we will build this functionality on top of Airflow's capabilities.
Airflow distinguishes DAGs by name; hence we can create multiple versions of a single DAG by adding the version as a suffix to the DAG name. For example:
|Workflow Name|Workflow Version|DAG Name|
|-------------|----------------|--------|
|CSV Parser| 1.0.0 |csv-parser-1.0.0|
|CSV Parser| 2.0.0 |csv-parser-2.0.0|
|CSV Parser| 1.3.1 |csv-parser-1.3.1|
Workflow Version can be one or more of the following
- Git SHA
- Release Version
We can extend the pipelines that build the final/packaged DAG to suffix this version onto the Airflow DAG name before generating the final artifact, which Airflow can then consume to get the new version of an existing DAG up and running.
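The convention in the table above amounts to a trivial naming rule applied by the build pipeline. A sketch (lower-casing the workflow name is an assumption, chosen to match the `csv-parser-1.0.0` examples):

```python
def dag_name(workflow_name: str, version: str) -> str:
    """Suffix the workflow version onto the DAG name, e.g. csv-parser-2.0.0."""
    return f"{workflow_name.lower().replace(' ', '-')}-{version}"

print(dag_name("CSV Parser", "2.0.0"))  # → csv-parser-2.0.0
```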
## How does the ingestion workflow service learn about different versions of an existing workflow/DAG?
A workflow metadata in ingestion consists of the following properties
- Workflow ID
- Workflow Name
- Version
- Registration Instructions
- DAG Name (In Airflow)
We can use the combination of version and DAG name to identify different versions of a workflow. For example:
|Workflow Name| Workflow Version| DAG Name| Explanation|
|-------------|-----------------|---------------|------------------|
|csv-parser| 1 |csv-parser-1.0.0| This corresponds to a workflow with name “csv-parser” with version “1” which when triggered will use “csv-parser-1.0.0” as the DAG to create a DAG run|
|csv-parser| 2 |csv-parser-1.2.0| This corresponds to a workflow with name “csv-parser” with version “2” which when triggered will use “csv-parser-1.2.0” as the DAG to create a DAG run, note that this is a minor version change |
|csv-parser| 3 |csv-parser-2.0.0| This corresponds to a workflow with name “csv-parser” with version “3” which when triggered will use “csv-parser-2.0.0” as the DAG to create a DAG run; note that this is a major version change|
## Can we trigger different versions of a workflow?
There will always be exactly one active version of a workflow, and only that version can be triggered. So, to answer the question: we cannot trigger different versions of the same workflow, only the active version.
The existing trigger workflow API does not support triggering different versions of a workflow.
To illustrate, consider the following example:
|Workflow Name| Workflow Version| DAG Name| ACTIVE?|
|-------------|-----------------|---------------|--------------|
|Foo| 1| Foo-1.0.0| Yes|
|Foo| 2| Foo-2.0.0| No|
|Bar| 2| Bar-2.0.0| Yes|
|Bar| 1| Bar-1.0.0| No|
In this case:
- When the “Foo” workflow is triggered, the ingestion workflow triggers the DAG associated with the active version, i.e. “1”, so it triggers the “Foo-1.0.0” DAG on Airflow
- When the “Bar” workflow is triggered, the ingestion workflow triggers the DAG associated with the active version, i.e. “2”, so it triggers the “Bar-2.0.0” DAG on Airflow
**There can always be only one active version for a workflow**
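Trigger-time resolution from the example above can be sketched as a lookup of the single active row (an in-memory stand-in for the workflow metadata store):

```python
# Rows mirror the table above.
REGISTRY = [
    {"name": "Foo", "version": 1, "dag": "Foo-1.0.0", "active": True},
    {"name": "Foo", "version": 2, "dag": "Foo-2.0.0", "active": False},
    {"name": "Bar", "version": 2, "dag": "Bar-2.0.0", "active": True},
    {"name": "Bar", "version": 1, "dag": "Bar-1.0.0", "active": False},
]

def active_dag(workflow_name: str) -> str:
    """Return the DAG behind the one active version of the workflow."""
    for row in REGISTRY:
        if row["name"] == workflow_name and row["active"]:
            return row["dag"]
    raise LookupError(f"no active version for {workflow_name!r}")

print(active_dag("Foo"), active_dag("Bar"))  # → Foo-1.0.0 Bar-2.0.0
```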
## How to add a new version of a workflow?
We can use the update workflow API to add a new version of a workflow; the details of the API are discussed below.
## How to mark a version of a workflow as active?
We can use the mark workflow version active API; the details are discussed below.
## How to get all versions of a workflow?
A new API is introduced to get all versions of a workflow. It returns the workflow metadata for every version present in the system.
Refer to get versions API in this specification here - [API Specification](https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/blob/update_api_spec/docs/api/openapi.workflow.yaml)
## How to activate an older version of a workflow?
By default, once you add a new version of a workflow using the update workflow API, it becomes active. To make an older version active, use the mark workflow version active API.
Steps to activate an older version of a workflow:
1. Call the get all versions API to fetch the existing versions of the workflow
2. Determine the version that needs to be activated
3. Call the mark workflow version active API, passing the version and workflow name
We can use the mark workflow version active API to make an older version active, refer to the API in this [specification]( https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/blob/update_api_spec/docs/api/openapi.workflow.yaml)
## Will updates to a workflow affect existing in-progress workflow runs?
Existing workflow runs will not be affected by this change; they will run to a completion state.
Any new workflow runs triggered after this change will trigger the DAG associated with the active version, as discussed above.
## Any changes to the existing API specifications?
|API| Any Changes?| API Specification Changes| Behavioral Changes|
|---|---------------|----------------------------|------------------------|
|Register workflow| No| N/A| N/A|
|Get all workflows| Yes| N/A| It should return active version of workflow|
|Delete workflow| Yes| N/A| It should delete all versions of a workflow|
|Get workflow by name| Yes| N/A| It should return active version of workflow|
|Trigger workflow| Yes| N/A| It should only trigger the active version of a workflow|
|Get all workflow runs| Yes| N/A| It should return all workflow runs across all versions of a workflow|
|Get specific workflow run| Yes| N/A| It should get the status of the workflow run based on the DAG associated to the version|
|Update workflow run| No| N/A| N/A|
|Info| No| N/A| N/A|
## How does this change affect existing workflows in the system?
All existing workflows have only one version; hence we treat existing workflows as having a single version and use it to trigger the respective DAGs.
If any new metadata is missing from a workflow, the workflow should be updated with this metadata if it has some benefit; otherwise we can keep it as is.
## Any limitations on the number of versions supported per workflow?
For now, there are no limitations set, but we can revisit this part if we see any issues
## Can we disable DAGs in Airflow for inactive versions?
We cannot simply disable DAGs in Airflow, as that would stop all in-progress DAG runs, which is not acceptable.
If we can build a solution that asynchronously checks whether all DAG runs for an inactive DAG have completed, we could then disable it; this is only needed if it helps improve Airflow performance.
## How does workflow versioning apply to system workflows?
It is similar to normal workflows: the concept of versioning applies to system workflows in the same way. Since system workflows apply to all data partitions, any change in the version of a system workflow will affect all data partitions.
# Workflow Update API
Check the update API in this specification - [API Specification](https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/blob/update_api_spec/docs/api/openapi.workflow.yaml)
## Update API supports the following
- To add a new version of workflow and activate it
```bash
curl --location --request PUT 'https://<osdu_endpoint>/v1/workflow/csv-parser' \
--header 'Content-Type: application/json' \
--header 'Authorization: <API Key>' \
--data-raw '{
"registrationInstructions": {
"dagName": "csv-parser-2.0.0",
"dagContent": ""
}
}'
```
- To activate an older version (version 1) of workflow
```bash
curl --location --request PUT 'https://<osdu_endpoint>/v1/workflow/csv-parser/version/1/active' \
--header 'Content-Type: application/json' \
--header 'Authorization: <API Key>'
```
## Update API Limitations
- Cannot update dagName for an already existing version of workflow
- Cannot update description of workflow
- Cannot disable any version of workflow
# Sequence Diagrams for API's after introducing this feature
## Get workflow by name
![Get_Workflow_By_Name](/uploads/8b39b7ceaab1169c03da21309687c800/Get_Workflow_By_Name.png)
## Get all workflows
![Get_All_Workflows](/uploads/626124ec56f069e39ad6c342dbaf4426/Get_All_Workflows.png)
## Trigger workflow
![Trigger_workflow](/uploads/455ac50e420c910f1df519d1a0ef292f/Trigger_workflow.png)
## Get workflow run
![Get_workflow_run](/uploads/20b719d081b6ab83f821a0e2719923ea/Get_workflow_run.png)
## Delete workflow
![Delete_Workflow](/uploads/c7bcc09428fa07bc1af456cda63b0636/Delete_Workflow.png)

---

**Issue #131 — Elasticsearch is not able to index the geo_shape defined in the 9 missing records**
Monalisa Srivastava · 2021-11-11
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/131

During testing of a negative scenario, we ingested the same data for the kind opendes:staticrela:states:4.0.0, which led to the records being updated.
Hence, to recreate the scenario, I created a new schema with kind opendes:relationships:teststates:1.0.0
and re-ingested the file with the metadata id opendes:dataset--File.Generic:6645b7c0-821c-4c32-91dc-cb985a6707fd.
The run id for the shapefile_ingestor_wf was 17c0935d-9562-4274-ae7b-54f5285ce792, which ingested 49 records (logs attached).
However, the search service returns only 40 records for this query:
```json
{
  "kind": "opendes:relationships:teststates:1.0.0",
  "returnedFields": [
    "id"
  ],
  "limit": 100
}
```
Response:
{
"cursor": "932F1FA4A28082E6FC5F97FE4D4BC102",
"results": [
{
"id": "opendes:teststates:relationships-YXJpem9uYQ"
},
{
"id": "opendes:teststates:relationships-Y29sb3JhZG8"
},
{
"id": "opendes:teststates:relationships-bWFyeWxhbmQ"
},
{
"id": "opendes:teststates:relationships-dGV4YXM"
},
{
"id": "opendes:teststates:relationships-bm9ydGggY2Fyb2xpbmE"
},
{
"id": "opendes:teststates:relationships-bmV3IG1leGljbw"
},
{
"id": "opendes:teststates:relationships-Z2VvcmdpYQ"
},
{
"id": "opendes:teststates:relationships-ZGlzdHJpY3Qgb2YgY29sdW1iaWE"
},
{
"id": "opendes:teststates:relationships-d2VzdCB2aXJnaW5pYQ"
},
{
"id": "opendes:teststates:relationships-c291dGggY2Fyb2xpbmE"
},
{
"id": "opendes:teststates:relationships-dmlyZ2luaWE"
},
{
"id": "opendes:teststates:relationships-a2Fuc2Fz"
},
{
"id": "opendes:teststates:relationships-b2tsYWhvbWE"
},
{
"id": "opendes:teststates:relationships-ZGVsYXdhcmU"
},
{
"id": "opendes:teststates:relationships-a2VudHVja3k"
},
{
"id": "opendes:teststates:relationships-YWxhYmFtYQ"
},
{
"id": "opendes:teststates:relationships-bWFzc2FjaHVzZXR0cw"
},
{
"id": "opendes:teststates:relationships-d3lvbWluZw"
},
{
"id": "opendes:teststates:relationships-b3JlZ29u"
},
{
"id": "opendes:teststates:relationships-aWRhaG8"
},
{
"id": "opendes:teststates:relationships-bmVicmFza2E"
},
{
"id": "opendes:teststates:relationships-bm9ydGggZGFrb3Rh"
},
{
"id": "opendes:teststates:relationships-bWFpbmU"
},
{
"id": "opendes:teststates:relationships-ZmxvcmlkYQ"
},
{
"id": "opendes:teststates:relationships-bmV3IGhhbXBzaGlyZQ"
},
{
"id": "opendes:teststates:relationships-c291dGggZGFrb3Rh"
},
{
"id": "opendes:teststates:relationships-cGVubnN5bHZhbmlh"
},
{
"id": "opendes:teststates:relationships-aW93YQ"
},
{
"id": "opendes:teststates:relationships-bW9udGFuYQ"
},
{
"id": "opendes:teststates:relationships-dmVybW9udA"
},
{
"id": "opendes:teststates:relationships-bmV3IHlvcms"
},
{
"id": "opendes:teststates:relationships-bmV3IGplcnNleQ"
},
{
"id": "opendes:teststates:relationships-Y29ubmVjdGljdXQ"
},
{
"id": "opendes:teststates:relationships-cmhvZGUgaXNsYW5k"
},
{
"id": "opendes:teststates:relationships-b2hpbw"
},
{
"id": "opendes:teststates:relationships-d2FzaGluZ3Rvbg"
},
{
"id": "opendes:teststates:relationships-dXRhaA"
},
{
"id": "opendes:teststates:relationships-bmV2YWRh"
},
{
"id": "opendes:teststates:relationships-aW5kaWFuYQ"
},
{
"id": "opendes:teststates:relationships-Y2FsaWZvcm5pYQ"
}
],
"totalCount": 40
}
When I validated the information, I found these 9 ids missing again:
opendes:teststates:relationships-aWxsaW5vaXM
opendes:teststates:relationships-bWlzc291cmk
opendes:teststates:relationships-dGVubmVzc2Vl
opendes:teststates:relationships-bWlzc2lzc2lwcGk
opendes:teststates:relationships-YXJrYW5zYXM
opendes:teststates:relationships-bG91aXNpYW5h
opendes:teststates:relationships-bWljaGlnYW4
opendes:teststates:relationships-d2lzY29uc2lu
opendes:teststates:relationships-bWlubmVzb3Rh
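The id suffixes are URL-safe base64 of the shapefile's state names, so decoding the nine missing ids shows exactly which rows failed to index (Illinois, Missouri, Tennessee, Mississippi, Arkansas, Louisiana, Michigan, Wisconsin, Minnesota). A quick way to correlate them with the source data:

```python
import base64

missing = ["aWxsaW5vaXM", "bWlzc291cmk", "dGVubmVzc2Vl", "bWlzc2lzc2lwcGk",
           "YXJrYW5zYXM", "bG91aXNpYW5h", "bWljaGlnYW4", "d2lzY29uc2lu",
           "bWlubmVzb3Rh"]

def decode_suffix(s: str) -> str:
    """Decode an id suffix, restoring the base64 padding the ids drop."""
    return base64.urlsafe_b64decode(s + "=" * (-len(s) % 4)).decode()

print([decode_suffix(s) for s in missing])
```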
Tried to fetch one of them from storage:
{
"data": {
"STATE_NAME": "Illinois",
"STATE_FIPS": 17,
"SUB_REGION": "E N Cen",
"STATE_ABBR": "slb:wing:ABBR-sxs0f1a5219-4640-50af-9f63-c09140d57c4d:",
"LAND_KM": 143986.61,
"WATER_KM": 1993.335,
"PERSONS": 11430602,
"FAMILIES": 2924880,
"HOUSHOLD": 4202240,
"MALE": 5552233,
"FEMALE": 5878369,
"WORKERS": "4199206.0",
"DRVALONE": "3741715.0",
"CARPOOL": "652603.0",
"PUBTRANS": "538071.0",
"EMPLOYED": "5417967.0",
"UNEMPLOY": "385040.0",
"SERVICE": "1360159.0",
"MANUAL": "828906.0",
"P_MALE": "0.486",
"P_FEMALE": "0.514",
"SAMP_POP": "1747776.0",
"SpatialLocation": {
"AsIngestedCoordinates": {
"type": "AnyCrsFeatureCollection",
"features": [
{
"type": "AnyCrsFeature",
"properties": {},
"geometry": {
"type": "AnyCrsPolygon",
"coordinates": [
[
[
37.51099000000001,
-88.071564
],
[
37.583572000000004,
-88.134171
],
[
37.628479,
-88.157631
],
[
37.660686,
-88.15937
],
[
37.700745,
-88.133636
],
[
37.735400999999996,
-88.072472
],
[
37.805683,
-88.035576
],
[
37.817612,
-88.086029
],
[
37.831249,
-88.089264
],
[
37.827522,
-88.042137
],
[
37.843745999999996,
-88.034241
],
[
37.867808999999994,
-88.075737
],
[
37.895306000000005,
-88.101456
],
[
37.90617,
-88.100082
],
[
37.896004000000005,
-88.044868
],
[
37.905758000000006,
-88.026588
],
[
37.917591,
-88.030441
],
[
37.92366,
-88.084
],
[
37.944,
-88.078941
],
[
37.929783,
-88.064621
],
[
37.934498000000005,
-88.041771
],
[
37.956264000000004,
-88.042511
],
[
37.975055999999995,
-88.021706
],
[
38.00823600000001,
-88.029213
],
[
38.03353100000001,
-88.021698
],
[
38.03830300000001,
-88.041473
],
[
38.04512,
-88.043091
],
[
38.054084999999986,
-88.034729
],
[
38.073307,
-87.975296
],
[
38.09674799999999,
-87.964867
],
[
38.09234599999999,
-88.012329
],
[
38.10330200000001,
-88.018547
],
[
38.131760000000014,
-87.973503
],
[
38.13691299999999,
-87.950569
],
[
38.15752800000001,
-87.931992
],
[
38.171131,
-87.932289
],
[
38.200714000000005,
-87.977928
],
[
38.234814,
-87.986008
],
[
38.241085,
-87.980019
],
[
38.30477099999999,
-87.925919
],
[
38.302345,
-87.913651
],
[
38.281048,
-87.914108
],
[
38.300658999999996,
-87.888466
],
[
38.315552,
-87.883446
],
[
38.316788,
-87.874039
],
[
38.28536199999999,
-87.863007
],
[
38.28609800000001,
-87.850082
],
[
38.35252399999999,
-87.834503
],
[
38.378124000000014,
-87.784019
],
[
38.41796500000001,
-87.748428
],
[
38.44548,
-87.738953
],
[
38.45709600000001,
-87.758659
],
[
38.466125000000005,
-87.756096
],
[
38.48153300000001,
-87.692818
],
[
38.50400500000001,
-87.679909
],
[
38.50044299999999,
-87.653534
],
[
38.51536899999999,
-87.65139
],
[
38.54742400000001,
-87.672943
],
[
38.573871999999994,
-87.652855
],
[
38.593177999999995,
-87.640594
],
[
38.599209,
-87.619827
],
[
38.622917,
-87.628647
],
[
38.642810999999995,
-87.625191
],
[
38.672169,
-87.588478
],
[
38.68597399999999,
-87.543892
],
[
38.73663300000001,
-87.508316
],
[
38.769722,
-87.508003
],
[
38.77669900000001,
-87.519028
],
[
38.795559,
-87.507889
],
[
38.857890999999995,
-87.550507
],
[
38.869811999999996,
-87.559059
],
[
38.90486100000001,
-87.5392
],
[
38.93191899999999,
-87.530182
],
[
38.96370300000001,
-87.53347
],
[
38.97707700000001,
-87.547905
],
[
38.99408299999999,
-87.591858
],
[
38.995743000000004,
-87.581749
],
[
39.062434999999994,
-87.58532
],
[
39.08460600000001,
-87.612007
],
[
39.08897400000001,
-87.630867
],
[
39.10394299999999,
-87.631668
],
[
39.11346800000001,
-87.662262
],
[
39.130652999999995,
-87.659454
],
[
39.146679000000006,
-87.670326
],
[
39.168507000000005,
-87.644257
],
[
39.196068,
-87.607925
],
[
39.198128,
-87.594208
],
[
39.20846599999999,
-87.588593
],
[
39.248752999999994,
-87.584564
],
[
39.258162999999996,
-87.606895
],
[
39.281418,
-87.615799
],
[
39.297661000000005,
-87.610619
],
[
39.30740399999999,
-87.625237
],
[
39.338268,
-87.597664
],
[
39.350525000000005,
-87.540215
],
[
39.47744800000001,
-87.538567
],
[
39.609341,
-87.535576
],
[
39.887302000000005,
-87.535774
],
[
40.16619499999999,
-87.535339
],
[
40.48324600000001,
-87.535675
],
[
40.494609999999994,
-87.53717
],
[
40.74541099999999,
-87.532669
],
[
41.00993,
-87.532021
],
[
41.173756,
-87.531731
],
[
41.30130399999999,
-87.532448
],
[
41.46971500000001,
-87.532646
],
[
41.723591,
-87.529861
],
[
41.847331999999994,
-87.612625
],
[
42.059822,
-87.670547
],
[
42.15645599999999,
-87.760239
],
[
42.314212999999995,
-87.836945
],
[
42.48913200000001,
-87.79731
],
[
42.48961299999999,
-88.194702
],
[
42.49197000000001,
-88.297897
],
[
42.489655,
-88.70652
],
[
42.490905999999995,
-88.764954
],
[
42.49086399999999,
-88.939079
],
[
42.497906,
-89.359444
],
[
42.49749,
-89.400497
],
[
42.50345999999999,
-89.834618
],
[
42.504108,
-89.923569
],
[
42.508362000000005,
-90.419975
],
[
42.50936100000001,
-90.638329
],
[
42.494698,
-90.651772
],
[
42.47564299999999,
-90.648346
],
[
42.46055999999999,
-90.605827
],
[
42.42183700000001,
-90.563583
],
[
42.38878299999999,
-90.491043
],
[
42.360073,
-90.441597
],
[
42.340633,
-90.427681
],
[
42.263924,
-90.417984
],
[
42.24264500000001,
-90.407173
],
[
42.21020899999999,
-90.367729
],
[
42.19731899999999,
-90.323601
],
[
42.15972099999999,
-90.230934
],
[
42.12268800000001,
-90.191574
],
[
42.12050199999999,
-90.176086
],
[
42.103745,
-90.166649
],
[
42.06104300000001,
-90.168098
],
[
42.03342799999999,
-90.150536
],
[
41.98396299999999,
-90.14267
],
[
41.93077500000001,
-90.154518
],
[
41.80613700000001,
-90.195839
],
[
41.78173799999999,
-90.25531
],
[
41.75646599999999,
-90.304886
],
[
41.722736,
-90.326027
],
[
41.64909,
-90.341133
],
[
41.60279800000001,
-90.339348
],
[
41.586849,
-90.348366
],
[
41.567272,
-90.423004
],
[
41.543578999999994,
-90.434967
],
[
41.527546,
-90.454994
],
[
41.52597,
-90.54084
],
[
41.50958600000001,
-90.6007
],
[
41.46231800000001,
-90.658791
],
[
41.450062,
-90.708214
],
[
41.449820999999986,
-90.7799
],
[
41.44462200000001,
-90.844139
],
[
41.421234,
-90.949654
],
[
41.431084,
-91.000694
],
[
41.423508,
-91.027489
],
[
41.40137899999999,
-91.055786
],
[
41.334895999999986,
-91.07328
],
[
41.267818000000005,
-91.102348
],
[
41.23152200000001,
-91.101524
],
[
41.17625799999999,
-91.05632
],
[
41.16582500000001,
-91.018257
],
[
41.14437100000001,
-90.990341
],
[
41.10435899999999,
-90.957787
],
[
41.07036199999999,
-90.954651
],
[
40.950503999999995,
-90.960709
],
[
40.92392699999999,
-90.983276
],
[
40.87958499999999,
-91.04921
],
[
40.833729000000005,
-91.088905
],
[
40.76154700000001,
-91.092751
],
[
40.70540199999999,
-91.119987
],
[
40.68214800000001,
-91.129158
],
[
40.65631099999999,
-91.162498
],
[
40.64381800000001,
-91.214912
],
[
40.639545,
-91.262062
],
[
40.60343900000001,
-91.37561
],
[
40.572970999999995,
-91.411118
],
[
40.54799299999999,
-91.412872
],
[
40.52849599999999,
-91.382103
],
[
40.50365400000001,
-91.374794
],
[
40.44725,
-91.385399
],
[
40.40298799999999,
-91.372757
],
[
40.392360999999994,
-91.385757
],
[
40.386875,
-91.418816
],
[
40.371902000000006,
-91.448593
],
[
40.309624000000014,
-91.486694
],
[
40.25137699999999,
-91.498932
],
[
40.200458999999995,
-91.506546
],
[
40.134544000000005,
-91.516129
],
[
40.066711,
-91.504005
],
[
40.005753,
-91.487289
],
[
39.94606400000001,
-91.447243
],
[
39.92183700000001,
-91.430389
],
[
39.90182899999999,
-91.434052
],
[
39.885242000000005,
-91.450989
],
[
39.86304899999999,
-91.449188
],
[
39.80377200000001,
-91.381714
],
[
39.76127199999999,
-91.373421
],
[
39.724639999999994,
-91.367088
],
[
39.68591699999999,
-91.317665
],
[
39.600021,
-91.203247
],
[
39.552593,
-91.156189
],
[
39.52892700000001,
-91.093613
],
[
39.473984,
-91.064384
],
[
39.444412,
-91.036339
],
[
39.40058500000001,
-90.947891
],
[
39.35045199999999,
-90.850494
],
[
39.29680300000001,
-90.779343
],
[
39.24780999999999,
-90.738083
],
[
39.22474700000001,
-90.732338
],
[
39.195873000000006,
-90.718193
],
[
39.14421100000001,
-90.716736
],
[
39.09370000000001,
-90.690399
],
[
39.058178,
-90.707588
],
[
39.037791999999996,
-90.70607
],
[
38.93525299999999,
-90.668877
],
[
38.880795000000006,
-90.627213
],
[
38.87132600000001,
-90.570328
],
[
38.89160899999999,
-90.530426
],
[
38.959179000000006,
-90.469841
],
[
38.96233000000001,
-90.413071
],
[
38.92490799999999,
-90.31974
],
[
38.92471699999999,
-90.278931
],
[
38.91450900000001,
-90.243927
],
[
38.85303099999999,
-90.132812
],
[
38.830467,
-90.113121
],
[
38.80051,
-90.121727
],
[
38.785484,
-90.135178
],
[
38.773098000000005,
-90.163399
],
[
38.72396499999999,
-90.196571
],
[
38.70036300000001,
-90.20224
],
[
38.658772,
-90.183578
],
[
38.61027100000001,
-90.183708
],
[
38.562805,
-90.240944
],
[
38.532768000000004,
-90.26123
],
[
38.518688,
-90.265785
],
[
38.427357,
-90.301842
],
[
38.39084600000001,
-90.339607
],
[
38.36533,
-90.358688
],
[
38.32355899999999,
-90.369347
],
[
38.23429899999999,
-90.364769
],
[
38.18871300000001,
-90.336716
],
[
38.16681700000001,
-90.289635
],
[
38.122169000000014,
-90.254059
],
[
38.08890500000001,
-90.207527
],
[
38.05395100000001,
-90.134712
],
[
38.032272000000006,
-90.119339
],
[
37.993206,
-90.041924
],
[
37.969318,
-90.010811
],
[
37.963634,
-89.958229
],
[
37.911884,
-89.978912
],
[
37.878044,
-89.937874
],
[
37.875904000000006,
-89.900551
],
[
37.891875999999996,
-89.866814
],
[
37.905486999999994,
-89.861046
],
[
37.905063999999996,
-89.851715
],
[
37.840992,
-89.728447
],
[
37.804794,
-89.691055
],
[
37.78397,
-89.675858
],
[
37.745453,
-89.666458
],
[
37.706103999999996,
-89.581436
],
[
37.694798000000006,
-89.521523
],
[
37.67984,
-89.513374
],
[
37.650375,
-89.51918
],
[
37.615928999999994,
-89.513367
],
[
37.571957,
-89.524971
],
[
37.491726,
-89.494781
],
[
37.453186,
-89.453621
],
[
37.411018,
-89.427574
],
[
37.355717,
-89.435738
],
[
37.339409,
-89.468742
],
[
37.329441,
-89.50058
],
[
37.304962,
-89.513885
],
[
37.276402000000004,
-89.513885
],
[
37.256001,
-89.489594
],
[
37.253731,
-89.465309
],
[
37.224266,
-89.468216
],
[
37.165318,
-89.440521
],
[
37.137203,
-89.423798
],
[
37.09908299999999,
-89.37999
],
[
37.049212999999995,
-89.38295
],
[
37.009682,
-89.310982
],
[
36.999207,
-89.282768
],
[
37.008686,
-89.262001
],
[
37.027733,
-89.264244
],
[
37.060908999999995,
-89.3097
],
[
37.085384000000005,
-89.303291
],
[
37.091244,
-89.284233
],
[
37.087124,
-89.264053
],
[
37.041732999999994,
-89.237679
],
[
37.02897299999999,
-89.210052
],
[
36.986771000000005,
-89.193512
],
[
36.988113,
-89.12986
],
[
36.99844,
-89.150246
],
[
37.025711,
-89.174332
],
[
37.064235999999994,
-89.169548
],
[
37.093185000000005,
-89.146347
],
[
37.112137000000004,
-89.116821
],
[
37.185860000000005,
-89.065033
],
[
37.22003599999999,
-88.993172
],
[
37.218407,
-88.932503
],
[
37.202194000000006,
-88.863289
],
[
37.152107,
-88.746506
],
[
37.141182,
-88.739113
],
[
37.13540999999999,
-88.68837
],
[
37.109047000000004,
-88.61422
],
[
37.072815000000006,
-88.559273
],
[
37.064769999999996,
-88.517273
],
[
37.06818,
-88.4907
],
[
37.072143999999994,
-88.476799
],
[
37.098670999999996,
-88.45047
],
[
37.156909999999996,
-88.422516
],
[
37.205669,
-88.450699
],
[
37.257782000000006,
-88.501427
],
[
37.296852,
-88.511322
],
[
37.400757,
-88.467644
],
[
37.420292,
-88.419853
],
[
37.40930899999999,
-88.359177
],
[
37.442852,
-88.311707
],
[
37.476273000000006,
-88.087883
],
[
37.51099000000001,
-88.071564
]
]
]
}
}
],
"persistableReferenceCrs": "{\"lateBoundCRS\":{\"wkt\":\"GEOGCS[\\\"GCS_North_American_1983\\\",DATUM[\\\"D_North_American_1983\\\",SPHEROID[\\\"GRS_1980\\\",6378137.0,298.257222101]],PRIMEM[\\\"Greenwich\\\",0.0],UNIT[\\\"Degree\\\",0.0174532925199433],AUTHORITY[\\\"EPSG\\\",4269]]\",\"ver\":\"PE_10_3_1\",\"name\":\"GCS_North_American_1983\",\"authCode\":{\"auth\":\"EPSG\",\"code\":\"4269\"},\"type\":\"LBC\"},\"singleCT\":{\"wkt\":\"GEOGTRAN[\\\"NAD_1983_To_WGS_1984_1\\\",GEOGCS[\\\"GCS_North_American_1983\\\",DATUM[\\\"D_North_American_1983\\\",SPHEROID[\\\"GRS_1980\\\",6378137.0,298.257222101]],PRIMEM[\\\"Greenwich\\\",0.0],UNIT[\\\"Degree\\\",0.0174532925199433]],GEOGCS[\\\"GCS_WGS_1984\\\",DATUM[\\\"D_WGS_1984\\\",SPHEROID[\\\"WGS_1984\\\",6378137.0,298.257223563]],PRIMEM[\\\"Greenwich\\\",0.0],UNIT[\\\"Degree\\\",0.0174532925199433]],METHOD[\\\"Geocentric_Translation\\\"],PARAMETER[\\\"X_Axis_Translation\\\",0.0],PARAMETER[\\\"Y_Axis_Translation\\\",0.0],PARAMETER[\\\"Z_Axis_Translation\\\",0.0],AUTHORITY[\\\"EPSG\\\",1188]]\",\"ver\":\"PE_10_3_1\",\"name\":\"NAD_1983_To_WGS_1984_1\",\"authCode\":{\"auth\":\"EPSG\",\"code\":\"1188\"},\"type\":\"ST\"},\"ver\":\"PE_10_3_1\",\"name\":\"NAD83 * DMA-N Am [4269,1188]\",\"authCode\":{\"auth\":\"SLB\",\"code\":\"4269001\"},\"type\":\"EBC\"}"
}
},
"relationships": {
"projects": {
"ids": [
"slb:wing:project-sxs0f1a5219-4640-50af-9f63-c09140d57c4d:"
]
}
}
},
"meta": [
{
"kind": "CRS",
"name": "GCS_North_American_1983",
"persistableReference": "{\"lateBoundCRS\":{\"wkt\":\"GEOGCS[\\\"GCS_North_American_1983\\\",DATUM[\\\"D_North_American_1983\\\",SPHEROID[\\\"GRS_1980\\\",6378137.0,298.257222101]],PRIMEM[\\\"Greenwich\\\",0.0],UNIT[\\\"Degree\\\",0.0174532925199433],AUTHORITY[\\\"EPSG\\\",4269]]\",\"ver\":\"PE_10_3_1\",\"name\":\"GCS_North_American_1983\",\"authCode\":{\"auth\":\"EPSG\",\"code\":\"4269\"},\"type\":\"LBC\"},\"singleCT\":{\"wkt\":\"GEOGTRAN[\\\"NAD_1983_To_WGS_1984_1\\\",GEOGCS[\\\"GCS_North_American_1983\\\",DATUM[\\\"D_North_American_1983\\\",SPHEROID[\\\"GRS_1980\\\",6378137.0,298.257222101]],PRIMEM[\\\"Greenwich\\\",0.0],UNIT[\\\"Degree\\\",0.0174532925199433]],GEOGCS[\\\"GCS_WGS_1984\\\",DATUM[\\\"D_WGS_1984\\\",SPHEROID[\\\"WGS_1984\\\",6378137.0,298.257223563]],PRIMEM[\\\"Greenwich\\\",0.0],UNIT[\\\"Degree\\\",0.0174532925199433]],METHOD[\\\"Geocentric_Translation\\\"],PARAMETER[\\\"X_Axis_Translation\\\",0.0],PARAMETER[\\\"Y_Axis_Translation\\\",0.0],PARAMETER[\\\"Z_Axis_Translation\\\",0.0],AUTHORITY[\\\"EPSG\\\",1188]]\",\"ver\":\"PE_10_3_1\",\"name\":\"NAD_1983_To_WGS_1984_1\",\"authCode\":{\"auth\":\"EPSG\",\"code\":\"1188\"},\"type\":\"ST\"},\"ver\":\"PE_10_3_1\",\"name\":\"NAD83 * DMA-N Am [4269,1188]\",\"authCode\":{\"auth\":\"SLB\",\"code\":\"4269001\"},\"type\":\"EBC\"}",
"propertyNames": [
"SpatialLocation.AsIngestedCoordinates"
]
}
],
"id": "opendes:teststates:relationships-aWxsaW5vaXM",
"version": 1631682054027397,
"kind": "opendes:relationships:teststates:1.0.0",
"acl": {
"viewers": [
"data.default.viewers@opendes.enterprisedata.cloud.slb-ds.com"
],
"owners": [
"data.default.viewers@opendes.enterprisedata.cloud.slb-ds.com"
]
},
"legal": {
"legaltags": [
"opendes-public-usa-dataset-7643990"
],
"otherRelevantDataCountries": [
"US"
],
"status": "compliant"
},
"createUser": "39916b94-71a9-409e-856e-0f29558fa908",
"createTime": "2021-09-15T05:00:55.838Z"
}
Please let me know if you need any other information.
Thanks and Regards,
Monalisa Srivastava
Schlumberger-Private
From: Sanjeev Pellikoduku <SPellikoduku@slb.com>
Sent: Wednesday, September 15, 2021 2:00 AM
To: Monalisa Srivastava <MSrivastava4@slb.com>
Cc: Neelesh Thakur <NThakur4@slb.com>; Bhakti Ashwin Thakkar <BThakkar@slb.com>; Nitin Jain <NJain5@slb.com>
Subject: RE: Issue with search
Hi Monalisa,
The records were ingested using multiple kinds "kind": "opendes:staticrela:states:3.0.0" and "kind": "opendes:staticrela:states:4.0.0".
"kind": "opendes:staticrela:states:3.0.0" "totalCount": 40
"kind": "opendes:staticrela:states:4.0.0" "totalCount": 49
The 9 records below were ingested with kind “opendes:staticrela:states:4.0.0”.
opendes:states:staticrela-aWxsaW5vaXMxNw
opendes:states:staticrela-bWlzc291cmkyOQ
opendes:states:staticrela-dGVubmVzc2VlNDc
opendes:states:staticrela-bWlzc2lzc2lwcGkyOA
opendes:states:staticrela-YXJrYW5zYXM1
opendes:states:staticrela-bG91aXNpYW5hMjI
opendes:states:staticrela-bWljaGlnYW4yNg
opendes:states:staticrela-d2lzY29uc2luNTU
opendes:states:staticrela-bWlubmVzb3RhMjc
Please try this query
{
"kind": "opendes:staticrela:states:4.0.0",
"returnedFields": [
"id",
"index"
]
}
Regards.
Sanjeev
From: Monalisa Srivastava <MSrivastava4@slb.com>
Sent: Tuesday, September 14, 2021 6:45 AM
To: Sanjeev Pellikoduku <SPellikoduku@slb.com>
Cc: Neelesh Thakur <NThakur4@slb.com>; Bhakti Ashwin Thakkar <BThakkar@slb.com>
Subject: Issue with search
Hi Sanjeev,
Need your help to validate an issue we are facing with the search service. Below are the details (environment: QA).
The shapefile ingestor ingested 49 records :
opendes:states:staticrela-aWxsaW5vaXMxNw
opendes:states:staticrela-bWlzc291cmkyOQ
opendes:states:staticrela-dGVubmVzc2VlNDc
opendes:states:staticrela-bWlzc2lzc2lwcGkyOA
opendes:states:staticrela-YXJrYW5zYXM1
opendes:states:staticrela-bG91aXNpYW5hMjI
opendes:states:staticrela-bWljaGlnYW4yNg
opendes:states:staticrela-d2lzY29uc2luNTU
opendes:states:staticrela-bWlubmVzb3RhMjc
opendes:states:staticrela-ZGlzdHJpY3Qgb2YgY29sdW1iaWExMQ
opendes:states:staticrela-ZGVsYXdhcmUxMA
opendes:states:staticrela-d2VzdCB2aXJnaW5pYTU0
opendes:states:staticrela-bWFyeWxhbmQyNA
opendes:states:staticrela-Y29sb3JhZG84
opendes:states:staticrela-a2VudHVja3kyMQ
opendes:states:staticrela-a2Fuc2FzMjA
opendes:states:staticrela-dmlyZ2luaWE1MQ
opendes:states:staticrela-YXJpem9uYTQ
opendes:states:staticrela-b2tsYWhvbWE0MA
opendes:states:staticrela-bm9ydGggY2Fyb2xpbmEzNw
opendes:states:staticrela-dGV4YXM0OA
opendes:states:staticrela-bmV3IG1leGljbzM1
opendes:states:staticrela-YWxhYmFtYTE
opendes:states:staticrela-Z2VvcmdpYTEz
opendes:states:staticrela-c291dGggY2Fyb2xpbmE0NQ
opendes:states:staticrela-ZmxvcmlkYTEy
opendes:states:staticrela-bW9udGFuYTMw
opendes:states:staticrela-bWFpbmUyMw
opendes:states:staticrela-bm9ydGggZGFrb3RhMzg
opendes:states:staticrela-c291dGggZGFrb3RhNDY
opendes:states:staticrela-d3lvbWluZzU2
opendes:states:staticrela-aWRhaG8xNg
opendes:states:staticrela-dmVybW9udDUw
opendes:states:staticrela-b3JlZ29uNDE
opendes:states:staticrela-bmV3IGhhbXBzaGlyZTMz
opendes:states:staticrela-aW93YTE5
opendes:states:staticrela-bWFzc2FjaHVzZXR0czI1
opendes:states:staticrela-bmVicmFza2EzMQ
opendes:states:staticrela-bmV3IHlvcmszNg
opendes:states:staticrela-cGVubnN5bHZhbmlhNDI
opendes:states:staticrela-Y29ubmVjdGljdXQ5
opendes:states:staticrela-cmhvZGUgaXNsYW5kNDQ
opendes:states:staticrela-bmV3IGplcnNleTM0
opendes:states:staticrela-aW5kaWFuYTE4
opendes:states:staticrela-bmV2YWRhMzI
opendes:states:staticrela-dXRhaDQ5
opendes:states:staticrela-Y2FsaWZvcm5pYTY
opendes:states:staticrela-b2hpbzM5
opendes:states:staticrela-d2FzaGluZ3RvbjUz
However, the search query gives me only 40 records. I even checked the index and its statusCode is 200, yet the record count and IDs are still 40:
{
"kind": "opendes:staticrela:states:3.0.0",
"returnedFields": [
"id",
"index"
],
"limit": 100
}
Response :
{
"cursor": "75E5D618FA7C168226980DA08027ECAB",
"results": [
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-bmV3IHlvcmszNg"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-bm9ydGggY2Fyb2xpbmEzNw"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-aW5kaWFuYTE4"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-b2tsYWhvbWE0MA"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-YWxhYmFtYTE"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-aWRhaG8xNg"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-bmV3IGhhbXBzaGlyZTMz"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-b3JlZ29uNDE"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-bmV2YWRhMzI"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-d2VzdCB2aXJnaW5pYTU0"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-dmlyZ2luaWE1MQ"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-bm9ydGggZGFrb3RhMzg"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-ZGlzdHJpY3Qgb2YgY29sdW1iaWExMQ"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-c291dGggY2Fyb2xpbmE0NQ"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-d2FzaGluZ3RvbjUz"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-YXJpem9uYTQ"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-Y29ubmVjdGljdXQ5"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-b2hpbzM5"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-bWFzc2FjaHVzZXR0czI1"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-a2Fuc2FzMjA"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-bmV3IG1leGljbzM1"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-cmhvZGUgaXNsYW5kNDQ"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-Y29sb3JhZG84"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-Y2FsaWZvcm5pYTY"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-bWFpbmUyMw"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-a2VudHVja3kyMQ"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-aW93YTE5"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-bmV3IGplcnNleTM0"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-dmVybW9udDUw"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-Z2VvcmdpYTEz"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-d3lvbWluZzU2"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-bW9udGFuYTMw"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-dXRhaDQ5"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-bmVicmFza2EzMQ"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-ZmxvcmlkYTEy"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-bWFyeWxhbmQyNA"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-ZGVsYXdhcmUxMA"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-c291dGggZGFrb3RhNDY"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-dGV4YXM0OA"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-cGVubnN5bHZhbmlhNDI"
}
],
"totalCount": 40
}
Missing 9 records :
opendes:states:staticrela-aWxsaW5vaXMxNw
opendes:states:staticrela-bWlzc291cmkyOQ
opendes:states:staticrela-dGVubmVzc2VlNDc
opendes:states:staticrela-bWlzc2lzc2lwcGkyOA
opendes:states:staticrela-YXJrYW5zYXM1
opendes:states:staticrela-bG91aXNpYW5hMjI
opendes:states:staticrela-bWljaGlnYW4yNg
opendes:states:staticrela-d2lzY29uc2luNTU
opendes:states:staticrela-bWlubmVzb3RhMjc
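The missing set above can be reproduced by diffing the ingested IDs against the IDs returned in the search response. A minimal sketch, with the ID lists abbreviated for illustration:

```python
# Sketch: find records that were ingested but are absent from the search response.
ingested_ids = {
    "opendes:states:staticrela-aWxsaW5vaXMxNw",
    "opendes:states:staticrela-bmV3IHlvcmszNg",
    "opendes:states:staticrela-dGV4YXM0OA",
}

# Abbreviated stand-in for the search service response shown above.
search_response = {
    "results": [
        {"id": "opendes:states:staticrela-bmV3IHlvcmszNg"},
        {"id": "opendes:states:staticrela-dGV4YXM0OA"},
    ],
    "totalCount": 2,
}

returned_ids = {r["id"] for r in search_response["results"]}
missing = sorted(ingested_ids - returned_ids)
print(missing)  # the Illinois record is ingested but not indexed
```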
As discussed today, further investigation shows that Elasticsearch is not able to index the geo_shape defined in the 9 missing records.
reason=Unable to Tessellate shape. This is a bug in the Lucene tessellator; it appears to be fixed in the latest version of Elasticsearch.
All the records below have considerably more coordinates of type “AnyCrsPolygon” and “AnyCrsMultiPolygon” than the other records.
opendes:teststates:relationships-aWxsaW5vaXM
opendes:teststates:relationships-bWlzc291cmk
opendes:teststates:relationships-dGVubmVzc2Vl
opendes:teststates:relationships-bWlzc2lzc2lwcGk
opendes:teststates:relationships-YXJrYW5zYXM
opendes:teststates:relationships-bG91aXNpYW5h
opendes:teststates:relationships-bWljaGlnYW4
opendes:teststates:relationships-d2lzY29uc2lu
opendes:teststates:relationships-bWlubmVzb3Rh

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/128
System Dags Implementation for AWS, GCP and IBM (2021-09-24T13:28:18Z, Aalekh Jain)
Link to the ADR: #118
Link to the MR: !146
In order to support system dags, the following changes are required for AWS, GCP and IBM -
1. `IWorkflowSystemMetadataRepository` (Link to azure implementation for reference: [here](https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/blob/bc469fb101f27c24670bf03bc92760ad7303d747/provider/workflow-azure/src/main/java/org/opengroup/osdu/workflow/provider/azure/repository/WorkflowSystemMetadataRepository.java))
2. `IAdminAuthorizationService` (Link to azure implementation for reference: [here](https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/blob/bc469fb101f27c24670bf03bc92760ad7303d747/provider/workflow-azure/src/main/java/org/opengroup/osdu/workflow/provider/azure/service/AdminAuthorizationServiceImpl.java))
Once these SPIs are implemented, the corresponding ITs can be extended for each cloud provider by extending the base abstract classes `DeleteSystemWorkflowV3IntegrationTests` and `PostCreateSystemWorkflowV3IntegrationTests`. For reference, this is how it is done for Azure.
1. Extending ITs for delete system workflow - [TestDeleteSystemWorkflowV3Integration.java](https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/blob/bc469fb101f27c24670bf03bc92760ad7303d747/testing/workflow-test-azure/src/test/java/org/opengroup/osdu/azure/workflow/workflow/TestDeleteSystemWorkflowV3Integration.java)
2. Extending ITs for create system workflow - [TestPostCreateSystemWorkflowV3Integration.java](https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/blob/bc469fb101f27c24670bf03bc92760ad7303d747/testing/workflow-test-azure/src/test/java/org/opengroup/osdu/azure/workflow/workflow/TestPostCreateSystemWorkflowV3Integration.java)
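The extension pattern described above — shared test steps in an abstract base class, with each cloud provider supplying only its own client — can be sketched in Python (illustrative only; the real classes are Java, and the fake client below is a stand-in):

```python
import abc

class BaseSystemWorkflowIT(abc.ABC):
    """Cloud-agnostic test steps; each provider supplies only its own client."""

    @abc.abstractmethod
    def make_client(self):
        """Return a provider-specific workflow-service client."""

    def test_create_system_workflow(self):
        client = self.make_client()
        created = client.create_system_workflow("test_system_dag")
        # The shared assertion lives in the base class, not the provider class.
        assert created["workflowName"] == "test_system_dag"
        return created

class AzureSystemWorkflowIT(BaseSystemWorkflowIT):
    def make_client(self):
        class FakeAzureClient:  # stand-in for the real Azure test client
            def create_system_workflow(self, name):
                return {"workflowName": name}
        return FakeAzureClient()

print(AzureSystemWorkflowIT().test_create_system_workflow())
```

A GCP or IBM provider would add only its own `make_client` override, inheriting every test unchanged.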
The expected behaviour of system workflows is presented in the ADR.

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/127
Code refactoring - WorkflowEngineRequest (2022-02-15T09:38:05Z, Aalekh Jain)
There are too many constructors for [`WorkflowEngineRequest`](https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/blob/master/workflow-core/src/main/java/org/opengroup/osdu/workflow/model/WorkflowEngineRequest.java). Using a builder pattern instead would keep the code cleaner.

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/119
Integrate notification service to trigger DAG automatically (2022-01-18T04:10:04Z, Chris Zhang)
As of today, the ingestion DAGs are triggered manually. With the notification service in place, and the design pattern aligned on OSDU platform events, the DAG trigger could be done automatically in code by subscribing to certain events. This should improve the overall ingestion workflow.
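The event-driven trigger idea can be sketched as a subscriber callback that maps a platform event to a workflow-run trigger payload. Everything here is an assumption for illustration: the event shape, the `recordsChanged` kind, the `Osdu_ingest` workflow name, and the payload fields are not specified by this issue.

```python
# Sketch: trigger an ingestion DAG from a platform event instead of a manual call.
def on_platform_event(event):
    """Subscriber callback: map a notification event to a trigger payload."""
    if event.get("kind") != "recordsChanged":
        return None  # only react to the event kinds the DAG cares about
    return {
        "workflowName": "Osdu_ingest",  # assumed DAG/workflow name
        "executionContext": {"recordIds": event.get("recordIds", [])},
    }

payload = on_platform_event({"kind": "recordsChanged", "recordIds": ["opendes:doc:1"]})
print(payload)
```

In a real integration, the returned payload would be posted to the Workflow service's trigger endpoint rather than printed.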
Ref: https://community.opengroup.org/osdu/platform/system/home/-/issues/58
OSDU Platform Events: https://community.opengroup.org/osdu/platform/system/notification/-/wikis/OSDU-Platform-Events

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/117
Ingestion process - Differentiate status "Success" to reflect true success and partial success (2021-05-11T15:34:21Z, Debasis Chatterjee)
Right now, DAG status may show "Success" and be colored green in the graph/tree view although inside the log we can see some failures, such as when 10 IDs are created and 2 IDs rejected.
Propose two changes - Status - "Completed with some failures" instead of "Success" if some or all records could not be processed for some reason.
In Airflow graph/tree view - perhaps use different shade of green.
Copy of old ticket
https://gitlab.opengroup.org/osdu/subcommittees/ea/projects/pre-shipping/home/-/issues/171#note_37091
cc - @Keith_Wall for information.

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/116
Airflow Experimental Delete API not implemented for RBAC enabled Airflow (2021-06-14T16:39:57Z, Mayank Saggar [Microsoft])
There is an issue with Airflow where, if RBAC for the webserver is enabled, the delete API from Airflow returns 404. Digging into the Airflow source code revealed that there is a separate handler for API requests with RBAC enabled, which does not provide the delete API.
Github Source: [endpoints when rbac enabled](https://github.com/apache/airflow/blob/1.10.12/airflow/www_rbac/api/experimental/endpoints.py)
[endpoints when rbac disabled](https://github.com/apache/airflow/blob/1.10.12/airflow/www/api/experimental/endpoints.py)
So if we remove RBAC from the webserver, we get the experimental delete API.

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/114
POST /v1/workflow/{workflow_name}/workflowRun (Trigger Workflow) with empty runId throws 500 internal server error (2021-06-14T16:18:30Z, Aalekh Jain)
## Description
**Current Behaviour**
For the given request body
```json
{
"runId": "",
"executionContext": {
}
}
```
Error thrown is
```json
{
"code": 500,
"reason": "Unexpectedly failed to insert item into CosmosDB",
"message": "[\"The input name '' is invalid. Ensure to provide a unique non-empty string less than '1024' characters.\"], {\"userAgent\":\"azsdk-java-cosmos/4.7.1 Windows10/10.0 JRE/1.8.0_265\",\"requestLatencyInMs\":212,\"requestStartTimeUTC\":\"2021-04-19T09:01:33.929Z\",\"requestEndTimeUTC\":\"2021-04-19T09:01:34.141Z\",\"connectionMode\":\"DIRECT\",\"responseStatisticsList\":[{\"storeResult\":{\"storePhysicalAddress\":\"rntbd://cdb-ms-prod-eastus2-fd7.documents.azure.com:14178/apps/a78846d5-27aa-45e8-bef0-0950c8a3c1d2/services/9a98fb60-a0fb-43e4-be05-3fe8dc8d6498/partitions/e158dec0-4caf-42e6-b7e8-1eb9dc9b7c84/replicas/132593652864958972p/\",\"lsn\":84405,\"globalCommittedLsn\":84405,\"partitionKeyRangeId\":\"1\",\"isValid\":true,\"statusCode\":400,\"subStatusCode\":0,\"isGone\":false,\"isNotFound\":false,\"isInvalidPartition\":false,\"requestCharge\":1.24,\"itemLSN\":-1,\"sessionToken\":\"-1#84405\",\"exception\":\"[\\\"The input name '' is invalid. Ensure to provide a unique non-empty string less than '1024' 
characters.\\\"]\",\"transportRequestTimeline\":[{\"eventName\":\"created\",\"durationInMicroSec\":\"0\",\"startTime\":\"2021-04-19T09:01:33.931Z\"},{\"eventName\":\"queued\",\"durationInMicroSec\":\"0\",\"startTime\":\"2021-04-19T09:01:33.931Z\"},{\"eventName\":\"channelAcquisitionStarted\",\"durationInMicroSec\":\"3000\",\"startTime\":\"2021-04-19T09:01:33.931Z\"},{\"eventName\":\"pipelined\",\"durationInMicroSec\":\"1000\",\"startTime\":\"2021-04-19T09:01:33.934Z\"},{\"eventName\":\"transitTime\",\"durationInMicroSec\":\"204000\",\"startTime\":\"2021-04-19T09:01:33.935Z\"},{\"eventName\":\"received\",\"durationInMicroSec\":\"1000\",\"startTime\":\"2021-04-19T09:01:34.139Z\"},{\"eventName\":\"completed\",\"durationInMicroSec\":\"1000\",\"startTime\":\"2021-04-19T09:01:34.140Z\"}],\"rntbdRequestLengthInBytes\":714,\"rntbdResponseLengthInBytes\":325,\"requestPayloadLengthInBytes\":282,\"responsePayloadLengthInBytes\":null,\"channelTaskQueueSize\":1,\"pendingRequestsCount\":1,\"serviceEndpointStatistics\":{\"availableChannels\":1,\"acquiredChannels\":0,\"executorTaskQueueSize\":0,\"inflightRequests\":1,\"lastSuccessfulRequestTime\":\"2021-04-19T08:52:54.424Z\",\"lastRequestTime\":\"2021-04-19T08:52:54.211Z\",\"createdTime\":\"2021-04-19T08:34:55.741Z\",\"isClosed\":false}},\"requestResponseTimeUTC\":\"2021-04-19T09:01:34.141Z\",\"requestResourceType\":\"Document\",\"requestOperationType\":\"Create\"}],\"supplementalResponseStatisticsList\":[],\"addressResolutionStatistics\":{},\"regionsContacted\":[\"https://osdu-mvp-dp1dev-qs29-db-eastus2.documents.azure.com:443/\"],\"retryContext\":{\"retryCount\":0,\"statusAndSubStatusCodes\":null,\"retryLatency\":0},\"metadataDiagnosticsContext\":{\"metadataDiagnosticList\":null},\"serializationDiagnosticsContext\":{\"serializationDiagnosticsList\":[{\"serializationType\":\"ITEM_SERIALIZATION\",\"startTimeUTC\":\"2021-04-19T09:01:33.929Z\",\"endTimeUTC\":\"2021-04-19T09:01:33.929Z\",\"durationInMicroSec\":0}]},\"gatewayStatistics
\":null,\"systemInformation\":{\"usedMemory\":\"237293 KB\",\"availableMemory\":\"3432723 KB\",\"systemCpuLoad\":\"(2021-04-19T09:01:08.919Z 5.1%), (2021-04-19T09:01:13.920Z 6.9%), (2021-04-19T09:01:18.921Z 4.0%), (2021-04-19T09:01:23.919Z 5.1%), (2021-04-19T09:01:28.920Z 5.3%), (2021-04-19T09:01:33.922Z 10.5%)\"},\"clientCfgs\":{\"id\":0,\"numberOfClients\":1,\"connCfg\":{\"rntbd\":\"(cto:PT5S, rto:PT5S, icto:PT0S, ieto:PT1H, mcpe:130, mrpc:30)\",\"gw\":\"(cps:1000, rto:PT5S, icto:null, p:false)\",\"other\":\"(ed: true, cs: false)\"},\"consistencyCfg\":\"(consistency: null, mm: true, prgns: [])\"}}"
}
```
**Expected Behaviour**
The service should either throw an error stating that `runId` cannot be empty or is invalid, OR generate a new `runId` (similar to what happens when the `runId` field is not present in the request body). Confirmation of the expected behaviour is needed.
The workflow runs fine when the request body does not contain `runId` as a key:
```json
{
"executionContext": {
}
}
```
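The expected handling described above could be sketched as a small pre-check on the incoming `runId` before the workflow run is triggered. This is a minimal illustration only; `RunIdValidator` and its error messages are hypothetical, not the actual ingestion-workflow code:

```java
import java.util.UUID;

// Hypothetical sketch: resolve the runId from the request body.
// A missing runId is generated (mirroring the current behaviour when the
// field is absent); an empty or malformed runId is rejected.
public class RunIdValidator {

    public static String resolveRunId(String runId) {
        if (runId == null) {
            // Field absent from the request body: generate one.
            return UUID.randomUUID().toString();
        }
        if (runId.trim().isEmpty()) {
            throw new IllegalArgumentException("runId cannot be empty");
        }
        try {
            UUID.fromString(runId); // reject values that are not valid UUIDs
        } catch (IllegalArgumentException e) {
            throw new IllegalArgumentException("invalid runId: " + runId);
        }
        return runId;
    }
}
```

Whether an empty `runId` should be rejected or silently replaced is exactly the open question; the sketch takes the stricter option.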
cc: @kibattul @vineethguna

**[core] Validating mandatory headers in authorization filter**
Issue: https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/113 · Aalekh Jain · 2021-04-09T09:27:50Z

The validation for mandatory headers, such as `data-partition-id` and `authorization`, is **not** present in `AuthorizationFilter`. This check needs to be added to core.
For reference: [Schema service validation of mandatory headers](https://community.opengroup.org/osdu/platform/system/schema-service/-/blob/master/schema-core/src/main/java/org/opengroup/osdu/schema/security/AuthorizationFilter.java#L55)
Original issue: #96
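As a rough illustration of the missing check (class and method names here are hypothetical, not the schema-service or ingestion-workflow code), the filter could reject requests that lack either mandatory header before any further processing:

```java
import java.util.Map;

// Hypothetical sketch of a mandatory-header check, in the spirit of the
// schema-service AuthorizationFilter referenced above.
public class MandatoryHeaderCheck {

    /** Throws if either mandatory header is missing or blank. */
    public static void validate(Map<String, String> headers) {
        requireHeader(headers, "data-partition-id");
        requireHeader(headers, "authorization");
    }

    private static void requireHeader(Map<String, String> headers, String name) {
        String value = headers.get(name);
        if (value == null || value.trim().isEmpty()) {
            throw new IllegalArgumentException(name + " header is mandatory");
        }
    }
}
```

In the real service this would run inside the servlet filter, reading headers from the `HttpServletRequest` rather than a `Map`.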
cc: @kibattul @msrivastava

**Swagger is not behaving correctly for the API /v1/workflow/{workflow_name}/workflowRun getAllRunInstances**
Issue: https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/112 · Monalisa Srivastava · 2021-04-07T12:08:26Z

The Swagger UI at https://osdu-glab.msft-osdu-test.org/api/workflow/swagger-ui.html#/workflow-run-api/getAllRunInstancesUsingGET does not accept the request parameters. Through Postman, however, the same API works fine: it accepts the four parameters `prefix`, `startDate`, `endDate`, and `limit`, and returns a proper response.
Screenshot attached:
![Swagger_Error](/uploads/0f1b760a28aa3501b4a88acef12cfbfd/Swagger_Error.JPG)