# Ingestion Workflow issues
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues

---

## Osdu_Ingest - Provide additional integrity check to catch inconsistencies in denormalized data (Ex: Master Entity "Play")
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/137
Author: Debasis Chatterjee · 2022-01-03

When looking at an example provided by the Development team (CSV Ingestion), I have this question regarding the Master Data (Play) definition.
https://gitlab.opengroup.org/osdu/subcommittees/data-def/work-products/schema/-/blob/master/E-R/master-data/Play.1.0.0.md
data.GeoContexts[].BasinID -> Basin
data.GeoContexts[].GeoTypeID -> BasinType
Isn’t the second field unnecessary (as one can find that information from the Basin master record itself)?
"BasinTypeID": "namespace:reference-data--BasinType:ArcWrenchOceanContinent:"
In addition, we open up the possibility of conflicting information by offering two separate fields in the “Play” master record.
So a suitable integrity check is required.
-----------------------------------
See notes from @gehrmann -
Hi Debasis,
The schema is de-normalised to support queries by BasinType/GeoPoliticalEntityType/...
Yes, every de-normalisation carries the risk of introducing contradictions. This was considered a trade-off, and judged worthwhile in the interest of easier query handling.
Finally, it is possible to organise the master-data as parent-child structures with self-references. This is easiest understood with the GeoPoliticalEntity hierarchy: country, state, county, ...
Best regards,
Thomas
________________________________________
Additional notes from Thomas
Hi Debasis,
Whether or not the extra validation during ingestion is sufficient - I am not so sure. Basin, Play, Prospect, and GeoPoliticalEntity are all master-data and therefore subject to continuous improvement. I would think a generic set of data quality rules, which can be re-evaluated after any change, might be a better choice.
The schema, by the way, does mark derived properties (= de-normalised properties) - please check the schema definitions for the dedicated extension tag x-osdu-is-derived:
Example for AbstractGeoBasinContext:
```
"x-osdu-is-derived": {
  "RelationshipPropertyName": "BasinID",
  "TargetPropertyName": "BasinTypeID"
}
```
In other words: the property GeoTypeID is derived via the sibling property BasinID linking to the target object's property BasinTypeID.
This decoration has been done in other places as well. It should be possible to create a generic implementation of a quality rule covering all of the derived/de-normalised values.
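As a sketch, such a generic rule could compare each de-normalised value against the target record it was derived from. The record shapes and the `resolve` callback below are hypothetical stand-ins for lookups via the Storage/Search services:

```python
# Minimal sketch of a generic quality rule driven by an x-osdu-is-derived tag.
# Record shapes and the resolve() callback are hypothetical stand-ins for
# real Storage/Search lookups.

def check_derived(contexts, derivation, resolve):
    """Return inconsistency messages for one derived property.

    contexts   -- list of dicts, e.g. data.GeoContexts[]
    derivation -- the x-osdu-is-derived tag, e.g.
                  {"RelationshipPropertyName": "BasinID",
                   "TargetPropertyName": "BasinTypeID"}
    resolve    -- callable mapping a record ID to that record's data block
    """
    rel_prop = derivation["RelationshipPropertyName"]
    target_prop = derivation["TargetPropertyName"]
    issues = []
    for ctx in contexts:
        target_id = ctx.get(rel_prop)
        if target_id is None:
            continue
        expected = resolve(target_id).get(target_prop)
        actual = ctx.get(target_prop)
        if actual != expected:
            issues.append(f"{target_prop}: stored {actual!r}, "
                          f"but {target_id} has {expected!r}")
    return issues
```

Because the rule is driven entirely by the tag, the same function covers every derived/de-normalised property the schema decorates, not just BasinTypeID.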
Best regards,
Thomas

---

## ADR: Workflow Versioning and Update Workflow API
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/136
Author: Vineeth Guna [Microsoft] · 2024-01-17

# Workflow Versioning
Workflow versioning is a feature to enable seamless running of a newer version of an existing workflow via the ingestion workflow service. Below are the design challenges and questions around workflow versioning that we will discuss going forward.
## How to create a new version of Airflow OSDU DAGs?
As Airflow does not have a way to distinguish between two different versions of the same DAG, we will build this functionality around Airflow's capabilities.
Airflow distinguishes DAGs by DAG name; hence we can create multiple versions of a single DAG by adding the version as a suffix to the DAG name. For example:
|Workflow Name|Workflow Version|DAG Name|
|-------------|----------------|--------|
|CSV Parser| 1.0.0 |csv-parser-1.0.0|
|CSV Parser| 2.0.0 |csv-parser-2.0.0|
|CSV Parser| 1.3.1 |csv-parser-1.3.1|
The workflow version can be one or more of the following:
- Git SHA
- Release version
We can leverage the pipelines that build the final DAG/packaged DAG to append this version to the Airflow DAG name before generating the final artifact, which Airflow then consumes to get the new version of an existing DAG up and running.
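As a sketch, the pipeline step amounts to concatenating the workflow name and version (names taken from the table above; the helper itself is hypothetical, not part of the service):

```python
# Hypothetical build-pipeline helper: derive the Airflow DAG name
# for one version of a workflow by suffixing the version.

def versioned_dag_name(workflow_name: str, version: str) -> str:
    """Suffix the version (release version or Git SHA) onto the DAG name."""
    return f"{workflow_name.lower().replace(' ', '-')}-{version}"
```

For example, `versioned_dag_name("CSV Parser", "2.0.0")` yields `csv-parser-2.0.0`, matching the table above.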
## How does the ingestion workflow service learn about different versions of an existing workflow/DAG?
Workflow metadata in the ingestion workflow service consists of the following properties:
- Workflow ID
- Workflow Name
- Version
- Registration Instructions
- DAG Name (In Airflow)
We can use the combination of version and DAG name to identify different versions of a workflow. For example:
|Workflow Name| Workflow Version| DAG Name| Explanation|
|-------------|-----------------|---------------|------------------|
|csv-parser| 1 |csv-parser-1.0.0| This corresponds to a workflow named “csv-parser” with version “1” which, when triggered, will use “csv-parser-1.0.0” as the DAG to create a DAG run|
|csv-parser| 2 |csv-parser-1.2.0| This corresponds to a workflow named “csv-parser” with version “2” which, when triggered, will use “csv-parser-1.2.0” as the DAG to create a DAG run; note that this is a minor version change|
|csv-parser| 3 |csv-parser-2.0.0| This corresponds to a workflow named “csv-parser” with version “3” which, when triggered, will use “csv-parser-2.0.0” as the DAG to create a DAG run; note that this is a major version change|
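As data, the table above is just a lookup from (workflow name, version) to DAG name. A hypothetical in-memory stand-in for the workflow metadata store:

```python
# Hypothetical in-memory stand-in for the workflow metadata store.
# Keys are (workflow name, version); values are the Airflow DAG names.
WORKFLOW_METADATA = {
    ("csv-parser", 1): "csv-parser-1.0.0",
    ("csv-parser", 2): "csv-parser-1.2.0",
    ("csv-parser", 3): "csv-parser-2.0.0",
}

def dag_for(workflow_name: str, version: int) -> str:
    """Resolve the Airflow DAG to run for one version of a workflow."""
    return WORKFLOW_METADATA[(workflow_name, version)]
```

Note that the workflow version is an opaque counter; it does not have to track the semantic version embedded in the DAG name.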
## Can we trigger different versions of a workflow?
There will always be exactly one active version of a workflow that can be triggered. So, to answer the question: no, we cannot trigger different versions of the same workflow; we can only trigger the active version.
The existing trigger workflow API does not support triggering different versions of a workflow.
The following example illustrates this:
|Workflow Name| Workflow Version| DAG Name| ACTIVE?|
|-------------|-----------------|---------------|--------------|
|Foo| 1| Foo-1.0.0| Yes|
|Foo| 2| Foo-2.0.0| No|
|Bar| 2| Bar-2.0.0| Yes|
|Bar| 1| Bar-1.0.0| No|
In this case, triggering behaves as follows:
- When the “Foo” workflow is triggered, the ingestion workflow service triggers the DAG associated with the active version, i.e. “1”, so it triggers the “Foo-1.0.0” DAG on Airflow
- When the “Bar” workflow is triggered, the ingestion workflow service triggers the DAG associated with the active version, i.e. “2”, so it triggers the “Bar-2.0.0” DAG on Airflow
**There can always be only one active version of a workflow**
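The trigger-time resolution above can be sketched as follows (the metadata shape mirrors the example table and is hypothetical, not the service's actual model):

```python
# Sketch of active-version resolution at trigger time
# (hypothetical metadata shape, mirroring the table above).
VERSIONS = [
    {"name": "Foo", "version": 1, "dagName": "Foo-1.0.0", "active": True},
    {"name": "Foo", "version": 2, "dagName": "Foo-2.0.0", "active": False},
    {"name": "Bar", "version": 2, "dagName": "Bar-2.0.0", "active": True},
    {"name": "Bar", "version": 1, "dagName": "Bar-1.0.0", "active": False},
]

def dag_to_trigger(workflow_name: str) -> str:
    """Return the DAG for the single active version of a workflow."""
    active = [v for v in VERSIONS
              if v["name"] == workflow_name and v["active"]]
    if len(active) != 1:
        raise ValueError(f"expected exactly one active version of {workflow_name}")
    return active[0]["dagName"]
```

The single-active-version invariant is enforced here as a hard error, which keeps trigger behaviour deterministic.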
## How to add a new version of a workflow?
We can use the update workflow API to add a new version of a workflow; the details of the API are discussed below.
## How to get all versions of a workflow?
A new API is introduced to get all the versions of a workflow. This API returns the workflow metadata for all versions present in the system.
Refer to get versions API in this specification here - [API Specification](https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/blob/update_api_spec/docs/api/openapi.workflow.yaml)
## How to mark a version of workflow as active?
By default, once you add a new version of a workflow using the update workflow API, it becomes active. To make an older version of a workflow active, use the mark workflow version active API.
Steps to activate an older version of a workflow:
1. Call the get all versions API to fetch the existing versions of the workflow
2. Determine the version of the workflow which needs to be activated
3. Call the mark workflow version active API, passing the version and workflow name
We can use the mark workflow version active API to make an older version active; refer to the API in this [specification](https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/blob/update_api_spec/docs/api/openapi.workflow.yaml)
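The three steps reduce to: fetch the versions, pick one, and build the activation call. A sketch of the client-side logic, where the versions list stands in for the get-all-versions response (hypothetical shape) and the path format follows the curl examples later in this ADR:

```python
# Sketch of the activation flow. The versions list stands in for the
# response of the get-all-versions API (hypothetical shape); the path
# format follows the curl examples in this ADR.

def activation_path(workflow_name: str, versions: list, wanted_dag: str) -> str:
    """Find the version that runs wanted_dag and build the activation URL path."""
    for v in versions:
        if v["dagName"] == wanted_dag:
            return f"/v1/workflow/{workflow_name}/version/{v['version']}/active"
    raise ValueError(f"no version of {workflow_name} runs {wanted_dag}")
```

The returned path would then be issued as a PUT against the service endpoint, as in the curl examples below.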
## Will updates to a workflow affect existing in-progress workflow runs?
Existing workflow runs will not be affected by this change; they will run to a completion state.
Any new workflow runs triggered after this change will trigger the DAG associated with the active version, as discussed above.
## Any changes to the existing API specifications?
|API| Any Changes?| API Specification Changes| Behavioral Changes|
|---|---------------|----------------------------|------------------------|
|Register workflow| No| N/A| N/A|
|Get all workflows| Yes| N/A| It should return the active version of each workflow|
|Delete workflow| Yes| N/A| It should delete all versions of a workflow|
|Get workflow by name| Yes| N/A| It should return the active version of the workflow|
|Trigger workflow| Yes| N/A| It should only trigger the active version of a workflow|
|Get all workflow runs| Yes| N/A| It should return all workflow runs across all versions of a workflow|
|Get specific workflow run| Yes| N/A| It should get the status of the workflow run based on the DAG associated with the version|
|Update workflow run| No| N/A| N/A|
|Info| No| N/A| N/A|
## How does this change affect existing workflows in the system?
All existing workflows have only one version; hence we treat each existing workflow as having a single version and use it to trigger the respective DAG.
If any new metadata is missing from a workflow, the workflow should be updated with this metadata if it provides some benefit; otherwise we can keep it as is.
## Any limitations on the number of versions supported per workflow?
For now, there are no limits set, but we can revisit this if we see any issues.
## Can we disable DAGs in airflow for inactive versions?
We cannot simply disable DAGs in Airflow, as that would stop all in-progress DAG runs, which is not acceptable.
If we can build a solution that asynchronously checks whether all DAG runs for an inactive DAG have completed, we can disable the DAG; this is only needed if it helps improve Airflow performance.
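The asynchronous check reduces to one decision: an inactive DAG may be disabled only once every one of its runs is in a terminal state. A sketch of that decision (the terminal-state set assumes Airflow 2 DAG-run states):

```python
# A DAG can be safely disabled only when all of its runs have finished.
# The terminal-state set assumes Airflow 2 DAG-run states.
TERMINAL_STATES = {"success", "failed"}

def safe_to_disable(run_states: list) -> bool:
    """True when no run of the inactive DAG is still queued or running."""
    return all(state in TERMINAL_STATES for state in run_states)
```

A background job could poll the run states for each inactive DAG and pause the DAG once this predicate holds.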
## How does workflow versioning apply for system workflows?
It is similar to normal workflows; the versioning concept applies to system workflows in the same way. Since system workflows apply to all data partitions, any change to the version of a system workflow will affect all data partitions.
# Workflow Update API
Check the update API in this specification - [API Specification](https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/blob/update_api_spec/docs/api/openapi.workflow.yaml)
## Update API supports the following
- To add a new version of a workflow and activate it
```bash
curl --location --request PUT 'https://<osdu_endpoint>/v1/workflow/csv-parser' \
--header 'Content-Type: application/json' \
--header 'Authorization: <API Key>' \
--data-raw '{
  "registrationInstructions": {
    "dagName": "csv-parser-2.0.0",
    "dagContent": ""
  }
}'
```
- To activate an older version (version 1) of a workflow
```bash
curl --location --request PUT 'https://<osdu_endpoint>/v1/workflow/csv-parser/version/1/active' \
--header 'Content-Type: application/json' \
--header 'Authorization: <API Key>'
```
## Update API Limitations
- Cannot update dagName for an already existing version of workflow
- Cannot update description of workflow
- Cannot disable any version of workflow
# Sequence Diagrams for APIs after introducing this feature
## Get workflow by name
![Get_Workflow_By_Name](/uploads/8b39b7ceaab1169c03da21309687c800/Get_Workflow_By_Name.png)
## Get all workflows
![Get_All_Workflows](/uploads/626124ec56f069e39ad6c342dbaf4426/Get_All_Workflows.png)
## Trigger workflow
![Trigger_workflow](/uploads/455ac50e420c910f1df519d1a0ef292f/Trigger_workflow.png)
## Get workflow run
![Get_workflow_run](/uploads/20b719d081b6ab83f821a0e2719923ea/Get_workflow_run.png)
## Delete workflow
![Delete_Workflow](/uploads/c7bcc09428fa07bc1af456cda63b0636/Delete_Workflow.png)

---

## Upgrade to Log4J 2.17
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/135
Author: David Diederich · 2021-12-21

The Apache Foundation released another Log4j2 update, version 2.17, which addresses a denial-of-service vulnerability.
This issue tracks progress on upgrading this dependency for this project.

---

## Log4J Expedient Updates and Patches
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/134
Author: David Diederich · 2021-12-16

This issue associates MRs that were applied to this project quickly to get a patched version ready as soon as possible. The intent is to provide a reference point for later, more thoughtful, analysis.

---

## Update dependencies according to WhiteSource reports [SLB]
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/133
Author: Maksim Malkov · 2022-05-10

This is just a regular update raised by the WhiteSource check we have conducted on the SLB side.
Dependency updates for:
* root pom
* core module pom
* azure module pom

Milestone: M10 - Release 0.13

---

## Add a version of Airflow into an endpoint 'info' for Workflow Service [GONRG-3777]
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/132
Author: Kateryna Kurach (EPAM) · 2021-12-15

Add a version of Airflow into the 'info' endpoint for the Workflow Service.
Add v1 into /api/workflow/info
Expected path:
{workflow}
/api/workflow/v1/info

Milestone: M10 - Release 0.13

---

## Elasticsearch is not able to index the geo_shape defined in the 9 missing records
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/131
Author: Monalisa Srivastava · 2021-11-11

During testing of a negative scenario, we ingested the same data for the kind opendes:staticrela:states:4.0.0, which led to the update of the records.
Hence, to recreate the scenario, I created a new schema with kind opendes:relationships:teststates:1.0.0.
I re-ingested the file with the metadata id opendes:dataset--File.Generic:6645b7c0-821c-4c32-91dc-cb985a6707fd.
The run id for the shapefile_ingestor_wf run is 17c0935d-9562-4274-ae7b-54f5285ce792; it ingested 49 records (logs attached).
However, the search service returns only 40 records for this query:
```
{
  "kind": "opendes:relationships:teststates:1.0.0",
  "returnedFields": [
    "id"
  ],
  "limit": 100
}
```
Response:
{
"cursor": "932F1FA4A28082E6FC5F97FE4D4BC102",
"results": [
{
"id": "opendes:teststates:relationships-YXJpem9uYQ"
},
{
"id": "opendes:teststates:relationships-Y29sb3JhZG8"
},
{
"id": "opendes:teststates:relationships-bWFyeWxhbmQ"
},
{
"id": "opendes:teststates:relationships-dGV4YXM"
},
{
"id": "opendes:teststates:relationships-bm9ydGggY2Fyb2xpbmE"
},
{
"id": "opendes:teststates:relationships-bmV3IG1leGljbw"
},
{
"id": "opendes:teststates:relationships-Z2VvcmdpYQ"
},
{
"id": "opendes:teststates:relationships-ZGlzdHJpY3Qgb2YgY29sdW1iaWE"
},
{
"id": "opendes:teststates:relationships-d2VzdCB2aXJnaW5pYQ"
},
{
"id": "opendes:teststates:relationships-c291dGggY2Fyb2xpbmE"
},
{
"id": "opendes:teststates:relationships-dmlyZ2luaWE"
},
{
"id": "opendes:teststates:relationships-a2Fuc2Fz"
},
{
"id": "opendes:teststates:relationships-b2tsYWhvbWE"
},
{
"id": "opendes:teststates:relationships-ZGVsYXdhcmU"
},
{
"id": "opendes:teststates:relationships-a2VudHVja3k"
},
{
"id": "opendes:teststates:relationships-YWxhYmFtYQ"
},
{
"id": "opendes:teststates:relationships-bWFzc2FjaHVzZXR0cw"
},
{
"id": "opendes:teststates:relationships-d3lvbWluZw"
},
{
"id": "opendes:teststates:relationships-b3JlZ29u"
},
{
"id": "opendes:teststates:relationships-aWRhaG8"
},
{
"id": "opendes:teststates:relationships-bmVicmFza2E"
},
{
"id": "opendes:teststates:relationships-bm9ydGggZGFrb3Rh"
},
{
"id": "opendes:teststates:relationships-bWFpbmU"
},
{
"id": "opendes:teststates:relationships-ZmxvcmlkYQ"
},
{
"id": "opendes:teststates:relationships-bmV3IGhhbXBzaGlyZQ"
},
{
"id": "opendes:teststates:relationships-c291dGggZGFrb3Rh"
},
{
"id": "opendes:teststates:relationships-cGVubnN5bHZhbmlh"
},
{
"id": "opendes:teststates:relationships-aW93YQ"
},
{
"id": "opendes:teststates:relationships-bW9udGFuYQ"
},
{
"id": "opendes:teststates:relationships-dmVybW9udA"
},
{
"id": "opendes:teststates:relationships-bmV3IHlvcms"
},
{
"id": "opendes:teststates:relationships-bmV3IGplcnNleQ"
},
{
"id": "opendes:teststates:relationships-Y29ubmVjdGljdXQ"
},
{
"id": "opendes:teststates:relationships-cmhvZGUgaXNsYW5k"
},
{
"id": "opendes:teststates:relationships-b2hpbw"
},
{
"id": "opendes:teststates:relationships-d2FzaGluZ3Rvbg"
},
{
"id": "opendes:teststates:relationships-dXRhaA"
},
{
"id": "opendes:teststates:relationships-bmV2YWRh"
},
{
"id": "opendes:teststates:relationships-aW5kaWFuYQ"
},
{
"id": "opendes:teststates:relationships-Y2FsaWZvcm5pYQ"
}
],
"totalCount": 40
}
When I validated the information, I found these 9 ids missing again:
opendes:teststates:relationships-aWxsaW5vaXM
opendes:teststates:relationships-bWlzc291cmk
opendes:teststates:relationships-dGVubmVzc2Vl
opendes:teststates:relationships-bWlzc2lzc2lwcGk
opendes:teststates:relationships-YXJrYW5zYXM
opendes:teststates:relationships-bG91aXNpYW5h
opendes:teststates:relationships-bWljaGlnYW4
opendes:teststates:relationships-d2lzY29uc2lu
opendes:teststates:relationships-bWlubmVzb3Rh
I tried to get one of them from storage:
{
"data": {
"STATE_NAME": "Illinois",
"STATE_FIPS": 17,
"SUB_REGION": "E N Cen",
"STATE_ABBR": "slb:wing:ABBR-sxs0f1a5219-4640-50af-9f63-c09140d57c4d:",
"LAND_KM": 143986.61,
"WATER_KM": 1993.335,
"PERSONS": 11430602,
"FAMILIES": 2924880,
"HOUSHOLD": 4202240,
"MALE": 5552233,
"FEMALE": 5878369,
"WORKERS": "4199206.0",
"DRVALONE": "3741715.0",
"CARPOOL": "652603.0",
"PUBTRANS": "538071.0",
"EMPLOYED": "5417967.0",
"UNEMPLOY": "385040.0",
"SERVICE": "1360159.0",
"MANUAL": "828906.0",
"P_MALE": "0.486",
"P_FEMALE": "0.514",
"SAMP_POP": "1747776.0",
"SpatialLocation": {
"AsIngestedCoordinates": {
"type": "AnyCrsFeatureCollection",
"features": [
{
"type": "AnyCrsFeature",
"properties": {},
"geometry": {
"type": "AnyCrsPolygon",
"coordinates": [
[
[
37.51099000000001,
-88.071564
],
[
37.583572000000004,
-88.134171
],
[
37.628479,
-88.157631
],
[
37.660686,
-88.15937
],
[
37.700745,
-88.133636
],
[
37.735400999999996,
-88.072472
],
[
37.805683,
-88.035576
],
[
37.817612,
-88.086029
],
[
37.831249,
-88.089264
],
[
37.827522,
-88.042137
],
[
37.843745999999996,
-88.034241
],
[
37.867808999999994,
-88.075737
],
[
37.895306000000005,
-88.101456
],
[
37.90617,
-88.100082
],
[
37.896004000000005,
-88.044868
],
[
37.905758000000006,
-88.026588
],
[
37.917591,
-88.030441
],
[
37.92366,
-88.084
],
[
37.944,
-88.078941
],
[
37.929783,
-88.064621
],
[
37.934498000000005,
-88.041771
],
[
37.956264000000004,
-88.042511
],
[
37.975055999999995,
-88.021706
],
[
38.00823600000001,
-88.029213
],
[
38.03353100000001,
-88.021698
],
[
38.03830300000001,
-88.041473
],
[
38.04512,
-88.043091
],
[
38.054084999999986,
-88.034729
],
[
38.073307,
-87.975296
],
[
38.09674799999999,
-87.964867
],
[
38.09234599999999,
-88.012329
],
[
38.10330200000001,
-88.018547
],
[
38.131760000000014,
-87.973503
],
[
38.13691299999999,
-87.950569
],
[
38.15752800000001,
-87.931992
],
[
38.171131,
-87.932289
],
[
38.200714000000005,
-87.977928
],
[
38.234814,
-87.986008
],
[
38.241085,
-87.980019
],
[
38.30477099999999,
-87.925919
],
[
38.302345,
-87.913651
],
[
38.281048,
-87.914108
],
[
38.300658999999996,
-87.888466
],
[
38.315552,
-87.883446
],
[
38.316788,
-87.874039
],
[
38.28536199999999,
-87.863007
],
[
38.28609800000001,
-87.850082
],
[
38.35252399999999,
-87.834503
],
[
38.378124000000014,
-87.784019
],
[
38.41796500000001,
-87.748428
],
[
38.44548,
-87.738953
],
[
38.45709600000001,
-87.758659
],
[
38.466125000000005,
-87.756096
],
[
38.48153300000001,
-87.692818
],
[
38.50400500000001,
-87.679909
],
[
38.50044299999999,
-87.653534
],
[
38.51536899999999,
-87.65139
],
[
38.54742400000001,
-87.672943
],
[
38.573871999999994,
-87.652855
],
[
38.593177999999995,
-87.640594
],
[
38.599209,
-87.619827
],
[
38.622917,
-87.628647
],
[
38.642810999999995,
-87.625191
],
[
38.672169,
-87.588478
],
[
38.68597399999999,
-87.543892
],
[
38.73663300000001,
-87.508316
],
[
38.769722,
-87.508003
],
[
38.77669900000001,
-87.519028
],
[
38.795559,
-87.507889
],
[
38.857890999999995,
-87.550507
],
[
38.869811999999996,
-87.559059
],
[
38.90486100000001,
-87.5392
],
[
38.93191899999999,
-87.530182
],
[
38.96370300000001,
-87.53347
],
[
38.97707700000001,
-87.547905
],
[
38.99408299999999,
-87.591858
],
[
38.995743000000004,
-87.581749
],
[
39.062434999999994,
-87.58532
],
[
39.08460600000001,
-87.612007
],
[
39.08897400000001,
-87.630867
],
[
39.10394299999999,
-87.631668
],
[
39.11346800000001,
-87.662262
],
[
39.130652999999995,
-87.659454
],
[
39.146679000000006,
-87.670326
],
[
39.168507000000005,
-87.644257
],
[
39.196068,
-87.607925
],
[
39.198128,
-87.594208
],
[
39.20846599999999,
-87.588593
],
[
39.248752999999994,
-87.584564
],
[
39.258162999999996,
-87.606895
],
[
39.281418,
-87.615799
],
[
39.297661000000005,
-87.610619
],
[
39.30740399999999,
-87.625237
],
[
39.338268,
-87.597664
],
[
39.350525000000005,
-87.540215
],
[
39.47744800000001,
-87.538567
],
[
39.609341,
-87.535576
],
[
39.887302000000005,
-87.535774
],
[
40.16619499999999,
-87.535339
],
[
40.48324600000001,
-87.535675
],
[
40.494609999999994,
-87.53717
],
[
40.74541099999999,
-87.532669
],
[
41.00993,
-87.532021
],
[
41.173756,
-87.531731
],
[
41.30130399999999,
-87.532448
],
[
41.46971500000001,
-87.532646
],
[
41.723591,
-87.529861
],
[
41.847331999999994,
-87.612625
],
[
42.059822,
-87.670547
],
[
42.15645599999999,
-87.760239
],
[
42.314212999999995,
-87.836945
],
[
42.48913200000001,
-87.79731
],
[
42.48961299999999,
-88.194702
],
[
42.49197000000001,
-88.297897
],
[
42.489655,
-88.70652
],
[
42.490905999999995,
-88.764954
],
[
42.49086399999999,
-88.939079
],
[
42.497906,
-89.359444
],
[
42.49749,
-89.400497
],
[
42.50345999999999,
-89.834618
],
[
42.504108,
-89.923569
],
[
42.508362000000005,
-90.419975
],
[
42.50936100000001,
-90.638329
],
[
42.494698,
-90.651772
],
[
42.47564299999999,
-90.648346
],
[
42.46055999999999,
-90.605827
],
[
42.42183700000001,
-90.563583
],
[
42.38878299999999,
-90.491043
],
[
42.360073,
-90.441597
],
[
42.340633,
-90.427681
],
[
42.263924,
-90.417984
],
[
42.24264500000001,
-90.407173
],
[
42.21020899999999,
-90.367729
],
[
42.19731899999999,
-90.323601
],
[
42.15972099999999,
-90.230934
],
[
42.12268800000001,
-90.191574
],
[
42.12050199999999,
-90.176086
],
[
42.103745,
-90.166649
],
[
42.06104300000001,
-90.168098
],
[
42.03342799999999,
-90.150536
],
[
41.98396299999999,
-90.14267
],
[
41.93077500000001,
-90.154518
],
[
41.80613700000001,
-90.195839
],
[
41.78173799999999,
-90.25531
],
[
41.75646599999999,
-90.304886
],
[
41.722736,
-90.326027
],
[
41.64909,
-90.341133
],
[
41.60279800000001,
-90.339348
],
[
41.586849,
-90.348366
],
[
41.567272,
-90.423004
],
[
41.543578999999994,
-90.434967
],
[
41.527546,
-90.454994
],
[
41.52597,
-90.54084
],
[
41.50958600000001,
-90.6007
],
[
41.46231800000001,
-90.658791
],
[
41.450062,
-90.708214
],
[
41.449820999999986,
-90.7799
],
[
41.44462200000001,
-90.844139
],
[
41.421234,
-90.949654
],
[
41.431084,
-91.000694
],
[
41.423508,
-91.027489
],
[
41.40137899999999,
-91.055786
],
[
41.334895999999986,
-91.07328
],
[
41.267818000000005,
-91.102348
],
[
41.23152200000001,
-91.101524
],
[
41.17625799999999,
-91.05632
],
[
41.16582500000001,
-91.018257
],
[
41.14437100000001,
-90.990341
],
[
41.10435899999999,
-90.957787
],
[
41.07036199999999,
-90.954651
],
[
40.950503999999995,
-90.960709
],
[
40.92392699999999,
-90.983276
],
[
40.87958499999999,
-91.04921
],
[
40.833729000000005,
-91.088905
],
[
40.76154700000001,
-91.092751
],
[
40.70540199999999,
-91.119987
],
[
40.68214800000001,
-91.129158
],
[
40.65631099999999,
-91.162498
],
[
40.64381800000001,
-91.214912
],
[
40.639545,
-91.262062
],
[
40.60343900000001,
-91.37561
],
[
40.572970999999995,
-91.411118
],
[
40.54799299999999,
-91.412872
],
[
40.52849599999999,
-91.382103
],
[
40.50365400000001,
-91.374794
],
[
40.44725,
-91.385399
],
[
40.40298799999999,
-91.372757
],
[
40.392360999999994,
-91.385757
],
[
40.386875,
-91.418816
],
[
40.371902000000006,
-91.448593
],
[
40.309624000000014,
-91.486694
],
[
40.25137699999999,
-91.498932
],
[
40.200458999999995,
-91.506546
],
[
40.134544000000005,
-91.516129
],
[
40.066711,
-91.504005
],
[
40.005753,
-91.487289
],
[
39.94606400000001,
-91.447243
],
[
39.92183700000001,
-91.430389
],
[
39.90182899999999,
-91.434052
],
[
39.885242000000005,
-91.450989
],
[
39.86304899999999,
-91.449188
],
[
39.80377200000001,
-91.381714
],
[
39.76127199999999,
-91.373421
],
[
39.724639999999994,
-91.367088
],
[
39.68591699999999,
-91.317665
],
[
39.600021,
-91.203247
],
[
39.552593,
-91.156189
],
[
39.52892700000001,
-91.093613
],
[
39.473984,
-91.064384
],
[
39.444412,
-91.036339
],
[
39.40058500000001,
-90.947891
],
[
39.35045199999999,
-90.850494
],
[
39.29680300000001,
-90.779343
],
[
39.24780999999999,
-90.738083
],
[
39.22474700000001,
-90.732338
],
[
39.195873000000006,
-90.718193
],
[
39.14421100000001,
-90.716736
],
[
39.09370000000001,
-90.690399
],
[
39.058178,
-90.707588
],
[
39.037791999999996,
-90.70607
],
[
38.93525299999999,
-90.668877
],
[
38.880795000000006,
-90.627213
],
[
38.87132600000001,
-90.570328
],
[
38.89160899999999,
-90.530426
],
[
38.959179000000006,
-90.469841
],
[
38.96233000000001,
-90.413071
],
[
38.92490799999999,
-90.31974
],
[
38.92471699999999,
-90.278931
],
[
38.91450900000001,
-90.243927
],
[
38.85303099999999,
-90.132812
],
[
38.830467,
-90.113121
],
[
38.80051,
-90.121727
],
[
38.785484,
-90.135178
],
[
38.773098000000005,
-90.163399
],
[
38.72396499999999,
-90.196571
],
[
38.70036300000001,
-90.20224
],
[
38.658772,
-90.183578
],
[
38.61027100000001,
-90.183708
],
[
38.562805,
-90.240944
],
[
38.532768000000004,
-90.26123
],
[
38.518688,
-90.265785
],
[
38.427357,
-90.301842
],
[
38.39084600000001,
-90.339607
],
[
38.36533,
-90.358688
],
[
38.32355899999999,
-90.369347
],
[
38.23429899999999,
-90.364769
],
[
38.18871300000001,
-90.336716
],
[
38.16681700000001,
-90.289635
],
[
38.122169000000014,
-90.254059
],
[
38.08890500000001,
-90.207527
],
[
38.05395100000001,
-90.134712
],
[
38.032272000000006,
-90.119339
],
[
37.993206,
-90.041924
],
[
37.969318,
-90.010811
],
[
37.963634,
-89.958229
],
[
37.911884,
-89.978912
],
[
37.878044,
-89.937874
],
[
37.875904000000006,
-89.900551
],
[
37.891875999999996,
-89.866814
],
[
37.905486999999994,
-89.861046
],
[
37.905063999999996,
-89.851715
],
[
37.840992,
-89.728447
],
[
37.804794,
-89.691055
],
[
37.78397,
-89.675858
],
[
37.745453,
-89.666458
],
[
37.706103999999996,
-89.581436
],
[
37.694798000000006,
-89.521523
],
[
37.67984,
-89.513374
],
[
37.650375,
-89.51918
],
[
37.615928999999994,
-89.513367
],
[
37.571957,
-89.524971
],
[
37.491726,
-89.494781
],
[
37.453186,
-89.453621
],
[
37.411018,
-89.427574
],
[
37.355717,
-89.435738
],
[
37.339409,
-89.468742
],
[
37.329441,
-89.50058
],
[
37.304962,
-89.513885
],
[
37.276402000000004,
-89.513885
],
[
37.256001,
-89.489594
],
[
37.253731,
-89.465309
],
[
37.224266,
-89.468216
],
[
37.165318,
-89.440521
],
[
37.137203,
-89.423798
],
[
37.09908299999999,
-89.37999
],
[
37.049212999999995,
-89.38295
],
[
37.009682,
-89.310982
],
[
36.999207,
-89.282768
],
[
37.008686,
-89.262001
],
[
37.027733,
-89.264244
],
[
37.060908999999995,
-89.3097
],
[
37.085384000000005,
-89.303291
],
[
37.091244,
-89.284233
],
[
37.087124,
-89.264053
],
[
37.041732999999994,
-89.237679
],
[
37.02897299999999,
-89.210052
],
[
36.986771000000005,
-89.193512
],
[
36.988113,
-89.12986
],
[
36.99844,
-89.150246
],
[
37.025711,
-89.174332
],
[
37.064235999999994,
-89.169548
],
[
37.093185000000005,
-89.146347
],
[
37.112137000000004,
-89.116821
],
[
37.185860000000005,
-89.065033
],
[
37.22003599999999,
-88.993172
],
[
37.218407,
-88.932503
],
[
37.202194000000006,
-88.863289
],
[
37.152107,
-88.746506
],
[
37.141182,
-88.739113
],
[
37.13540999999999,
-88.68837
],
[
37.109047000000004,
-88.61422
],
[
37.072815000000006,
-88.559273
],
[
37.064769999999996,
-88.517273
],
[
37.06818,
-88.4907
],
[
37.072143999999994,
-88.476799
],
[
37.098670999999996,
-88.45047
],
[
37.156909999999996,
-88.422516
],
[
37.205669,
-88.450699
],
[
37.257782000000006,
-88.501427
],
[
37.296852,
-88.511322
],
[
37.400757,
-88.467644
],
[
37.420292,
-88.419853
],
[
37.40930899999999,
-88.359177
],
[
37.442852,
-88.311707
],
[
37.476273000000006,
-88.087883
],
[
37.51099000000001,
-88.071564
]
]
]
}
}
],
"persistableReferenceCrs": "{\"lateBoundCRS\":{\"wkt\":\"GEOGCS[\\\"GCS_North_American_1983\\\",DATUM[\\\"D_North_American_1983\\\",SPHEROID[\\\"GRS_1980\\\",6378137.0,298.257222101]],PRIMEM[\\\"Greenwich\\\",0.0],UNIT[\\\"Degree\\\",0.0174532925199433],AUTHORITY[\\\"EPSG\\\",4269]]\",\"ver\":\"PE_10_3_1\",\"name\":\"GCS_North_American_1983\",\"authCode\":{\"auth\":\"EPSG\",\"code\":\"4269\"},\"type\":\"LBC\"},\"singleCT\":{\"wkt\":\"GEOGTRAN[\\\"NAD_1983_To_WGS_1984_1\\\",GEOGCS[\\\"GCS_North_American_1983\\\",DATUM[\\\"D_North_American_1983\\\",SPHEROID[\\\"GRS_1980\\\",6378137.0,298.257222101]],PRIMEM[\\\"Greenwich\\\",0.0],UNIT[\\\"Degree\\\",0.0174532925199433]],GEOGCS[\\\"GCS_WGS_1984\\\",DATUM[\\\"D_WGS_1984\\\",SPHEROID[\\\"WGS_1984\\\",6378137.0,298.257223563]],PRIMEM[\\\"Greenwich\\\",0.0],UNIT[\\\"Degree\\\",0.0174532925199433]],METHOD[\\\"Geocentric_Translation\\\"],PARAMETER[\\\"X_Axis_Translation\\\",0.0],PARAMETER[\\\"Y_Axis_Translation\\\",0.0],PARAMETER[\\\"Z_Axis_Translation\\\",0.0],AUTHORITY[\\\"EPSG\\\",1188]]\",\"ver\":\"PE_10_3_1\",\"name\":\"NAD_1983_To_WGS_1984_1\",\"authCode\":{\"auth\":\"EPSG\",\"code\":\"1188\"},\"type\":\"ST\"},\"ver\":\"PE_10_3_1\",\"name\":\"NAD83 * DMA-N Am [4269,1188]\",\"authCode\":{\"auth\":\"SLB\",\"code\":\"4269001\"},\"type\":\"EBC\"}"
}
},
"relationships": {
"projects": {
"ids": [
"slb:wing:project-sxs0f1a5219-4640-50af-9f63-c09140d57c4d:"
]
}
}
},
"meta": [
{
"kind": "CRS",
"name": "GCS_North_American_1983",
"persistableReference": "{\"lateBoundCRS\":{\"wkt\":\"GEOGCS[\\\"GCS_North_American_1983\\\",DATUM[\\\"D_North_American_1983\\\",SPHEROID[\\\"GRS_1980\\\",6378137.0,298.257222101]],PRIMEM[\\\"Greenwich\\\",0.0],UNIT[\\\"Degree\\\",0.0174532925199433],AUTHORITY[\\\"EPSG\\\",4269]]\",\"ver\":\"PE_10_3_1\",\"name\":\"GCS_North_American_1983\",\"authCode\":{\"auth\":\"EPSG\",\"code\":\"4269\"},\"type\":\"LBC\"},\"singleCT\":{\"wkt\":\"GEOGTRAN[\\\"NAD_1983_To_WGS_1984_1\\\",GEOGCS[\\\"GCS_North_American_1983\\\",DATUM[\\\"D_North_American_1983\\\",SPHEROID[\\\"GRS_1980\\\",6378137.0,298.257222101]],PRIMEM[\\\"Greenwich\\\",0.0],UNIT[\\\"Degree\\\",0.0174532925199433]],GEOGCS[\\\"GCS_WGS_1984\\\",DATUM[\\\"D_WGS_1984\\\",SPHEROID[\\\"WGS_1984\\\",6378137.0,298.257223563]],PRIMEM[\\\"Greenwich\\\",0.0],UNIT[\\\"Degree\\\",0.0174532925199433]],METHOD[\\\"Geocentric_Translation\\\"],PARAMETER[\\\"X_Axis_Translation\\\",0.0],PARAMETER[\\\"Y_Axis_Translation\\\",0.0],PARAMETER[\\\"Z_Axis_Translation\\\",0.0],AUTHORITY[\\\"EPSG\\\",1188]]\",\"ver\":\"PE_10_3_1\",\"name\":\"NAD_1983_To_WGS_1984_1\",\"authCode\":{\"auth\":\"EPSG\",\"code\":\"1188\"},\"type\":\"ST\"},\"ver\":\"PE_10_3_1\",\"name\":\"NAD83 * DMA-N Am [4269,1188]\",\"authCode\":{\"auth\":\"SLB\",\"code\":\"4269001\"},\"type\":\"EBC\"}",
"propertyNames": [
"SpatialLocation.AsIngestedCoordinates"
]
}
],
"id": "opendes:teststates:relationships-aWxsaW5vaXM",
"version": 1631682054027397,
"kind": "opendes:relationships:teststates:1.0.0",
"acl": {
"viewers": [
"data.default.viewers@opendes.enterprisedata.cloud.slb-ds.com"
],
"owners": [
"data.default.viewers@opendes.enterprisedata.cloud.slb-ds.com"
]
},
"legal": {
"legaltags": [
"opendes-public-usa-dataset-7643990"
],
"otherRelevantDataCountries": [
"US"
],
"status": "compliant"
},
"createUser": "39916b94-71a9-409e-856e-0f29558fa908",
"createTime": "2021-09-15T05:00:55.838Z"
}
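A side note for anyone consuming records like the one above: `persistableReference` is itself a JSON document serialized as a string inside the record, so it needs a second decode pass. A minimal Python sketch using a trimmed-down literal (not the full CRS payload from the record above):

```python
import json

# "persistableReference" is a JSON document stored as a string inside the
# record JSON, so it has to be parsed twice: once for the record, once for
# the embedded CRS reference. Trimmed-down literal for illustration only.
record = {
    "meta": [{
        "kind": "CRS",
        "name": "GCS_North_American_1983",
        "persistableReference":
            "{\"name\":\"GCS_North_American_1983\","
            "\"authCode\":{\"auth\":\"EPSG\",\"code\":\"4269\"},\"type\":\"LBC\"}",
    }]
}

crs = json.loads(record["meta"][0]["persistableReference"])  # second decode
print(crs["authCode"]["auth"], crs["authCode"]["code"])  # EPSG 4269
```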
Please let me know if you need any other information.
Thanks and Regards,
Monalisa Srivastava
Schlumberger-Private
From: Sanjeev Pellikoduku <SPellikoduku@slb.com>
Sent: Wednesday, September 15, 2021 2:00 AM
To: Monalisa Srivastava <MSrivastava4@slb.com>
Cc: Neelesh Thakur <NThakur4@slb.com>; Bhakti Ashwin Thakkar <BThakkar@slb.com>; Nitin Jain <NJain5@slb.com>
Subject: RE: Issue with search
Hi Monalisa,
The records were ingested using multiple kinds "kind": "opendes:staticrela:states:3.0.0" and "kind": "opendes:staticrela:states:4.0.0".
"kind": "opendes:staticrela:states:3.0.0" "totalCount": 40
"kind": "opendes:staticrela:states:4.0.0" "totalCount": 49
The 9 records below were ingested with the kind “opendes:staticrela:states:4.0.0”.
opendes:states:staticrela-aWxsaW5vaXMxNw
opendes:states:staticrela-bWlzc291cmkyOQ
opendes:states:staticrela-dGVubmVzc2VlNDc
opendes:states:staticrela-bWlzc2lzc2lwcGkyOA
opendes:states:staticrela-YXJrYW5zYXM1
opendes:states:staticrela-bG91aXNpYW5hMjI
opendes:states:staticrela-bWljaGlnYW4yNg
opendes:states:staticrela-d2lzY29uc2luNTU
opendes:states:staticrela-bWlubmVzb3RhMjc
Please try this query
{
"kind": "opendes:staticrela:states:4.0.0",
"returnedFields": [
"id",
"index"
]
}
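To double-check Sanjeev's explanation, one can query each kind separately and diff the returned IDs against the full ingested list. A small sketch of the set logic (the actual Search service call is left out; the sample lists here are placeholders, not the full 49-record set):

```python
def missing_ids(ingested_ids, returned_ids):
    """IDs that were ingested but did not come back from the search query."""
    return sorted(set(ingested_ids) - set(returned_ids))

# Placeholder sample: two ingested IDs, only one returned by the query.
ingested = [
    "opendes:states:staticrela-aWxsaW5vaXMxNw",
    "opendes:states:staticrela-YWxhYmFtYTE",
]
returned = ["opendes:states:staticrela-YWxhYmFtYTE"]

print(missing_ids(ingested, returned))
# ['opendes:states:staticrela-aWxsaW5vaXMxNw']
```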
Regards.
Sanjeev
From: Monalisa Srivastava <MSrivastava4@slb.com>
Sent: Tuesday, September 14, 2021 6:45 AM
To: Sanjeev Pellikoduku <SPellikoduku@slb.com>
Cc: Neelesh Thakur <NThakur4@slb.com>; Bhakti Ashwin Thakkar <BThakkar@slb.com>
Subject: Issue with search
Hi Sanjeev,
I need your help to validate an issue we are facing with the search service; below are the details (environment: QA).
The shapefile ingestor ingested 49 records:
opendes:states:staticrela-aWxsaW5vaXMxNw
opendes:states:staticrela-bWlzc291cmkyOQ
opendes:states:staticrela-dGVubmVzc2VlNDc
opendes:states:staticrela-bWlzc2lzc2lwcGkyOA
opendes:states:staticrela-YXJrYW5zYXM1
opendes:states:staticrela-bG91aXNpYW5hMjI
opendes:states:staticrela-bWljaGlnYW4yNg
opendes:states:staticrela-d2lzY29uc2luNTU
opendes:states:staticrela-bWlubmVzb3RhMjc
opendes:states:staticrela-ZGlzdHJpY3Qgb2YgY29sdW1iaWExMQ
opendes:states:staticrela-ZGVsYXdhcmUxMA
opendes:states:staticrela-d2VzdCB2aXJnaW5pYTU0
opendes:states:staticrela-bWFyeWxhbmQyNA
opendes:states:staticrela-Y29sb3JhZG84
opendes:states:staticrela-a2VudHVja3kyMQ
opendes:states:staticrela-a2Fuc2FzMjA
opendes:states:staticrela-dmlyZ2luaWE1MQ
opendes:states:staticrela-YXJpem9uYTQ
opendes:states:staticrela-b2tsYWhvbWE0MA
opendes:states:staticrela-bm9ydGggY2Fyb2xpbmEzNw
opendes:states:staticrela-dGV4YXM0OA
opendes:states:staticrela-bmV3IG1leGljbzM1
opendes:states:staticrela-YWxhYmFtYTE
opendes:states:staticrela-Z2VvcmdpYTEz
opendes:states:staticrela-c291dGggY2Fyb2xpbmE0NQ
opendes:states:staticrela-ZmxvcmlkYTEy
opendes:states:staticrela-bW9udGFuYTMw
opendes:states:staticrela-bWFpbmUyMw
opendes:states:staticrela-bm9ydGggZGFrb3RhMzg
opendes:states:staticrela-c291dGggZGFrb3RhNDY
opendes:states:staticrela-d3lvbWluZzU2
opendes:states:staticrela-aWRhaG8xNg
opendes:states:staticrela-dmVybW9udDUw
opendes:states:staticrela-b3JlZ29uNDE
opendes:states:staticrela-bmV3IGhhbXBzaGlyZTMz
opendes:states:staticrela-aW93YTE5
opendes:states:staticrela-bWFzc2FjaHVzZXR0czI1
opendes:states:staticrela-bmVicmFza2EzMQ
opendes:states:staticrela-bmV3IHlvcmszNg
opendes:states:staticrela-cGVubnN5bHZhbmlhNDI
opendes:states:staticrela-Y29ubmVjdGljdXQ5
opendes:states:staticrela-cmhvZGUgaXNsYW5kNDQ
opendes:states:staticrela-bmV3IGplcnNleTM0
opendes:states:staticrela-aW5kaWFuYTE4
opendes:states:staticrela-bmV2YWRhMzI
opendes:states:staticrela-dXRhaDQ5
opendes:states:staticrela-Y2FsaWZvcm5pYTY
opendes:states:staticrela-b2hpbzM5
opendes:states:staticrela-d2FzaGluZ3RvbjUz
However, the search query gives me only 40 records. I even returned the index status and it is 200, yet the record count and IDs are still 40:
{
"kind": "opendes:staticrela:states:3.0.0",
"returnedFields": [
"id",
"index"
],
"limit": 100
}
Response:
{
"cursor": "75E5D618FA7C168226980DA08027ECAB",
"results": [
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-bmV3IHlvcmszNg"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-bm9ydGggY2Fyb2xpbmEzNw"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-aW5kaWFuYTE4"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-b2tsYWhvbWE0MA"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-YWxhYmFtYTE"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-aWRhaG8xNg"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-bmV3IGhhbXBzaGlyZTMz"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-b3JlZ29uNDE"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-bmV2YWRhMzI"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-d2VzdCB2aXJnaW5pYTU0"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-dmlyZ2luaWE1MQ"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-bm9ydGggZGFrb3RhMzg"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-ZGlzdHJpY3Qgb2YgY29sdW1iaWExMQ"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-c291dGggY2Fyb2xpbmE0NQ"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-d2FzaGluZ3RvbjUz"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-YXJpem9uYTQ"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-Y29ubmVjdGljdXQ5"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-b2hpbzM5"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-bWFzc2FjaHVzZXR0czI1"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-a2Fuc2FzMjA"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-bmV3IG1leGljbzM1"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-cmhvZGUgaXNsYW5kNDQ"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-Y29sb3JhZG84"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-Y2FsaWZvcm5pYTY"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-bWFpbmUyMw"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-a2VudHVja3kyMQ"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-aW93YTE5"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-bmV3IGplcnNleTM0"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-dmVybW9udDUw"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-Z2VvcmdpYTEz"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-d3lvbWluZzU2"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-bW9udGFuYTMw"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-dXRhaDQ5"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-bmVicmFza2EzMQ"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-ZmxvcmlkYTEy"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-bWFyeWxhbmQyNA"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-ZGVsYXdhcmUxMA"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-c291dGggZGFrb3RhNDY"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-dGV4YXM0OA"
},
{
"index": {
"trace": [],
"statusCode": 200,
"lastUpdateTime": "2021-09-14T08:14:52.956Z"
},
"id": "opendes:states:staticrela-cGVubnN5bHZhbmlhNDI"
}
],
"totalCount": 40
}
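Pagination is not the cause here, since the limit was 100 and totalCount is 40, but the `cursor` in the response above is how larger result sets are walked: re-post the query with the cursor from the previous page until no cursor comes back. A hedged sketch of that loop, with the HTTP POST replaced by an in-memory stub so the control flow can be exercised:

```python
def fetch_all(fetch_page, body):
    """Follow the response cursor until the result set is exhausted.

    fetch_page stands in for the POST to the Search query-with-cursor
    endpoint; the cursor format here is a stub, the real one is opaque.
    """
    results, cursor = [], None
    while True:
        page = fetch_page({**body, **({"cursor": cursor} if cursor else {})})
        results.extend(page["results"])
        cursor = page.get("cursor")
        if not cursor or not page["results"]:
            return results

def make_fake_fetcher(items, page_size):
    """Build an in-memory stand-in that pages through `items`."""
    def fetch(body):
        start = int(body.get("cursor") or 0)
        chunk = items[start:start + page_size]
        nxt = start + page_size
        return {"results": chunk,
                "cursor": str(nxt) if nxt < len(items) else None}
    return fetch

total = len(fetch_all(make_fake_fetcher(list(range(95)), 40), {"kind": "k"}))
print(total)  # 95
```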
Missing 9 records:
opendes:states:staticrela-aWxsaW5vaXMxNw
opendes:states:staticrela-bWlzc291cmkyOQ
opendes:states:staticrela-dGVubmVzc2VlNDc
opendes:states:staticrela-bWlzc2lzc2lwcGkyOA
opendes:states:staticrela-YXJrYW5zYXM1
opendes:states:staticrela-bG91aXNpYW5hMjI
opendes:states:staticrela-bWljaGlnYW4yNg
opendes:states:staticrela-d2lzY29uc2luNTU
opendes:states:staticrela-bWlubmVzb3RhMjc
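Incidentally, the ID suffix after the first hyphen is just unpadded URL-safe Base64 of the state name plus a numeric code, which makes lists like this human-readable. A quick decoding sketch:

```python
import base64

def decode_suffix(record_id):
    """Decode the unpadded URL-safe Base64 suffix of these record IDs."""
    suffix = record_id.split("-", 1)[1]          # everything after the first '-'
    padded = suffix + "=" * (-len(suffix) % 4)   # restore the stripped padding
    return base64.urlsafe_b64decode(padded).decode()

print(decode_suffix("opendes:states:staticrela-aWxsaW5vaXMxNw"))   # illinois17
print(decode_suffix("opendes:states:staticrela-bWlubmVzb3RhMjc"))  # minnesota27
```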
As discussed today, based on further investigation, Elasticsearch is not able to index the geo_shape defined in the 9 missing records.
The reason given is "Unable to Tessellate shape". It is a bug in the Lucene tessellator; I think it is fixed in the latest version of Elasticsearch.
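Until the Elasticsearch/Lucene fix is picked up, a cheap pre-ingestion sanity check on polygon rings can flag the usual tessellation tripwires. This sketch does not reproduce Lucene's tessellator; it only catches unclosed rings, repeated consecutive vertices, and degenerate rings:

```python
def ring_problems(ring):
    """Flag common defects in a GeoJSON-style polygon ring (list of [x, y]).

    Not a tessellation check, just a cheap pre-ingestion screen.
    """
    problems = []
    if ring[0] != ring[-1]:
        problems.append("ring not closed")
    for a, b in zip(ring, ring[1:]):
        if a == b:
            problems.append("duplicate consecutive vertex at %s" % (a,))
            break
    if len(set(map(tuple, ring[:-1]))) < 3:
        problems.append("fewer than 3 distinct vertices")
    return problems

ok = [[0, 0], [4, 0], [4, 4], [0, 4], [0, 0]]
bad = [[0, 0], [4, 0], [4, 0], [4, 4], [0, 4]]
print(ring_problems(ok))   # []
print(ring_problems(bad))
```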
All the below records have many more coordinates of type “AnyCrsPolygon” and “AnyCrsMultiPolygon” than the other records.
opendes:teststates:relationships-aWxsaW5vaXM
opendes:teststates:relationships-bWlzc291cmk
opendes:teststates:relationships-dGVubmVzc2Vl
opendes:teststates:relationships-bWlzc2lzc2lwcGk
opendes:teststates:relationships-YXJrYW5zYXM
opendes:teststates:relationships-bG91aXNpYW5h
opendes:teststates:relationships-bWljaGlnYW4
opendes:teststates:relationships-d2lzY29uc2lu
opendes:teststates:relationships-bWlubmVzb3Rh

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/130
Bump os-core-common.version to 0.12.0-rc3 (2021-10-20, Maksim Malkov; milestone M9 - Release 0.12)
Same as here:
https://community.opengroup.org/osdu/platform/system/file/-/merge_requests/181

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/129
Fixed get workflow status request by workflowName (2021-10-11, Riabokon Stanislav (EPAM) [GCP])
The GET "/{workflow_name}/workflowRun/{runId}" request used 'dagName' instead of 'workflowName' under the hood during the Airflow call.
In the case of different 'dagName' and 'workflowName' values, this leads to an error:
```
{
"code": 404,
"reason": "Failed to send request.",
"message": "Unable to send request to Airflow. 404 NOT FOUND_{\"error\":\"Dag id workflow_name not found in DagModel\"}_"
}
```
Milestone: M9 - Release 0.12 (Riabokon Stanislav (EPAM) [GCP])

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/128
System Dags Implementation for AWS, GCP and IBM (2021-09-24, Aalekh Jain)
Link to the ADR: #118
Link to the MR: !146
In order to support system dags, the following changes are required for AWS, GCP and IBM -
1. `IWorkflowSystemMetadataRepository` (Link to azure implementation for reference: [here](https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/blob/bc469fb101f27c24670bf03bc92760ad7303d747/provider/workflow-azure/src/main/java/org/opengroup/osdu/workflow/provider/azure/repository/WorkflowSystemMetadataRepository.java))
2. `IAdminAuthorizationService` (Link to azure implementation for reference: [here](https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/blob/bc469fb101f27c24670bf03bc92760ad7303d747/provider/workflow-azure/src/main/java/org/opengroup/osdu/workflow/provider/azure/service/AdminAuthorizationServiceImpl.java))
Once these SPIs are implemented, the corresponding ITs can be extended for each cloud provider by extending the base abstract classes, `DeleteSystemWorkflowV3IntegrationTests` and `PostCreateSystemWorkflowV3IntegrationTests`. For reference, this is how it's done for Azure.
1. Extending ITs for delete system workflow - [TestDeleteSystemWorkflowV3Integration.java](https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/blob/bc469fb101f27c24670bf03bc92760ad7303d747/testing/workflow-test-azure/src/test/java/org/opengroup/osdu/azure/workflow/workflow/TestDeleteSystemWorkflowV3Integration.java)
2. Extending ITs for create system workflow - [TestPostCreateSystemWorkflowV3Integration.java](https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/blob/bc469fb101f27c24670bf03bc92760ad7303d747/testing/workflow-test-azure/src/test/java/org/opengroup/osdu/azure/workflow/workflow/TestPostCreateSystemWorkflowV3Integration.java)
The expected behaviour of system workflows is presented in the ADR.

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/127
Code refactoring - WorkflowEngineRequest (2022-02-15, Aalekh Jain)
There are too many constructors for [`WorkflowEngineRequest`](https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/blob/master/workflow-core/src/main/java/org/opengroup/osdu/workflow/model/WorkflowEngineRequest.java). It would be better to use a builder pattern instead, to maintain cleaner code.

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/126
As part of GSM feature, Implement status publishing method in MSFT AZURE (2021-08-27, Maksim Malkov; milestone M8 - Release 0.11)
This is as per the GSM requirement to be implemented in each CSP. This issue has been created for the Microsoft Azure team to implement the publish method to publish the status events in the message queue.

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/125
Unable to Ingest the Seismic horizon data using the manifest ingestion dag: Osdu_ingest (2022-09-29, Ananth T)
@todaiks @ChrisZhang @
GCP & AWS clouds
While working on the Seismic Horizon API, Refresh_Token was working fine. Similarly,
10 (Post) - Ingestion Seismic Horizon record R3 and 10a (Get) - Check the Seismic Horizon Workflow Status were executed normally. However, at 10b (Post) Storage - Retrieve ingested Seismic Horizon Data record, the API stopped with 404 - Record not found:
{
"code": 404,
"reason": "Record not found",
"message": "The record 'odesprod:work-product-component--SeismicHorizon:Auto_Test_999645280744' was not found"
}
Further, when we checked the Airflow log files,
this "SeismicHorizon:Auto_Test_999645280744" record was created in the earlier steps, but it was throwing an error at 10b - Retrieve ingested Seismic Horizon Data record.
Both cloud platforms, i.e. GCP & AWS, have given errors at this stage.
The corresponding GCP code snippets and body responses for the 10, 10a and 10b APIs have been compiled and the document is enclosed.
Kindly review in detail and revert to us with a possible solution.
[Seismic_Horizon_Issue_GCP_020821.docx](/uploads/25668e785006d46aed881bf3151a5e2d/Seismic_Horizon_Issue_GCP_020821.docx)

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/124
Contributing Integration tests to core [Failing ITs for AWS] (2021-08-25, Aalekh Jain)
**Current behaviour**
There are **too few integration tests** in core, which leads to non-robust testing of the ingestion workflow across the cloud providers. This has also led to inconsistent behaviour for some of the APIs across the various cloud providers' implementations, which sometimes becomes a huge bottleneck in the development process and a hurdle in this setting of cross-cloud collaboration.
**Expected behaviour**
The target is to make sure that the ingestion workflow APIs exhibit the same behaviour across all the cloud providers, and to have a robust mechanism to improve integration test coverage, resulting in reliable service development going forward.
---
**MAJOR CONTRIBUTIONS**
The integration tests have been contributed by Azure which will benefit all the other cloud providers as well. This has been done as part of the following MR: !131
A brief comparison of the contributions (number of ITs) are as follows -
| **AREA**| **API ENDPOINT** | **TYPE** | **BEFORE** | **AFTER** |
|---|---|---|---|---|
| WORKFLOW | /v1/workflow | POST | 3 | 8 |
| WORKFLOW | /v1/workflow | GET | 2 | 7 |
| WORKFLOW | /v1/workflow/{workflow_name} | DELETE | 1 | 5 |
| WORKFLOW | /v1/workflow/{workflow_name} | GET | 2 | 7 |
| WORKFLOW RUN | /v1/workflow/{workflow_name}/workflowRun | POST | 1 | 6 |
| WORKFLOW RUN | /v1/workflow/{workflow_name}/workflowRun | GET | 1 | 5 |
| WORKFLOW RUN | /v1/workflow/{workflow_name}/workflowRun/{runId} | GET| 1 | 6 |
| WORKFLOW RUN | /v1/workflow/{workflow_name}/workflowRun/{runId} | PUT| 1 | 10 |
| COMBINED | - | TOTAL | **12**| **54** |
Integration tests help in identifying functionality gaps. Going forward, the expectation is to add/update the integration tests as per the common API spec in the **core module**, which will aid in robust service development for all the cloud providers.
---
**Current State of the MR**
Link: !131
The majority of the tests are passing for the majority of the cloud providers; a brief summary of the failing tests is as follows -
| **Cloud Provider** | **Number of failing tests** | **LINK TO TEST RUN** |
|---|---|---|
| AWS | 0 | https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/jobs/483006 |
| AZURE | 0 | https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/jobs/483007 |
| GCP | 0 | https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/jobs/483005 |
| IBM | 0 | https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/jobs/483004 |
Would request the respective owners to have a look into this.
cc: @kibattul @shrikgar @wsmatth @Kateryna_Kurach
Milestone: M8 - Release 0.11

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/123
Upgrade Core IBM Dependency (2022-02-11, David Diederich, d.diederich@opengroup.org)

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/122
Upgrade Core Azure Dependency (2022-02-11, David Diederich, d.diederich@opengroup.org)

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/121
Upgrade Core Common Dependency (2022-02-11, David Diederich, d.diederich@opengroup.org)

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/142
Communication between calling program and a launched run of manifest-based Ingestion process (2023-07-05, Debasis Chatterjee)
As you can imagine, it is common practice for software vendors to offer UI-based insert/update/delete capability for metadata.
Such a program would have to interact with the user (a human) to gather information about a new Wellbore (for example) and then, behind the scenes, make up the JSON load/manifest to actually populate the OSDU DP by creating a new Wellbore.
Ideally, such a program needs to report back to the user almost right away whether the attempt to create a new Wellbore in the OSDU DP has succeeded or not.
It is not enough to tell the user "Here is the RunID, go and check the status from the Airflow console".
Linked to osdu/platform/system/home#80
Perhaps a solution through Notification Service?
This also impacts EDS fetch-and-ingest workflow. cc - @jrougeau (for information)
cc - @lasscock.b, @Kateryna_Kurach (for information)

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/119
Integrate notification service to trigger DAG automatically (2022-01-18, Chris Zhang)
As of today, the ingestion DAGs are triggered manually. With the notification service in place, and the design pattern aligned on OSDU platform events, the DAG trigger could be done automatically in code by subscribing to certain events. This should improve the overall ingestion workflow.
Ref: https://community.opengroup.org/osdu/platform/system/home/-/issues/58
OSDU Platform Events: https://community.opengroup.org/osdu/platform/system/notification/-/wikis/OSDU-Platform-Events

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/118
ADR : New API to handle System workflows (2023-12-15, preeti singh [Microsoft]; milestone M9 - Release 0.12)
**Context:**
===
System workflows are the workflows which are available to all the data partitions. Any System workflow can be triggered and retrieved by any of the tenants, but it can be created, updated, or deleted only by a user with a special privilege (let's say it has a system role).
This is more with respect to the workflows or DAGs that OSDU provides.
**How it's done today:**
===
- There is no concept of system workflows.
- The workflow metadata is stored in partition specific cosmos collection.
**Issue with current design:**
===
- The behavior of the create API endpoint will change and can be confusing to users if we use the same endpoint for system as well as private workflows. Users might end up unknowingly creating a system workflow by passing the data-partition-id of the special partition.
- It is difficult to manage updates for changes coming from the OSDU community if we try to copy or replicate the information across all the customer partitions.
**Proposal:**
===
There are two types of workflows in the system, System workflows and Private workflows. The proposal is to create a new API endpoint to register System workflows.
- The new API shall be termed as `workflow/system`
- To **create/update/delete** System workflows - `/workflow/system` endpoint shall be used
- To **Get/Trigger** System workflows, existing workflow service endpoint must be used.
- The authorization of the new endpoint shall be different from the existing groups; we'll use service-principal-based authorization.
- The new API shall not accept data-partition-id as a header. The service would be aware of where the System workflows are located.
- This API should interact *only* with System workflows. It should not have access to other workflows.
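The routing rule in the bullets above can be summarized as a small dispatch helper (illustrative sketch only, not the service implementation; the paths follow the ADR text):

```python
# Lifecycle operations on System workflows go through the new
# /workflow/system endpoint; get/trigger, and everything on private
# workflows, reuse the existing /workflow endpoint.
SYSTEM_LIFECYCLE_OPS = {"create", "update", "delete"}

def endpoint_for(operation, system_workflow):
    """Pick the endpoint prefix per the ADR's routing rule (sketch)."""
    if system_workflow and operation in SYSTEM_LIFECYCLE_OPS:
        return "/workflow/system"
    return "/workflow"

print(endpoint_for("create", system_workflow=True))   # /workflow/system
print(endpoint_for("trigger", system_workflow=True))  # /workflow
print(endpoint_for("create", system_workflow=False))  # /workflow
```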
**Sequence Diagram for createWorkflow**
![createWorkflow](/uploads/da01a45cf14062aad0a5cfc48bd51c3d/createWorkflow.png)
**Sequence Diagram for getallWorkflows**
![getallWorkflows](/uploads/f767cc09ea1b27baeb6137e6dbdd9959/getallWorkflows.png)
**Sequence Diagram for getWorkflow option 1** (this one got finalized)
![getWorkflowOption1](/uploads/55dcd0f4ece4c0df83f5bef057c1cfbb/getWorkflowOption1.png)
**Sequence Diagram for getWorkflow option 2**
![getWorkflowOption2](/uploads/5ee667cb7226a62a05a03a8effa27988/getWorkflowOption2.png)M9 - Release 0.12preeti singh[Microsoft]preeti singh[Microsoft]