**Open Test Data issues**
https://community.opengroup.org/osdu/platform/data-flow/data-loading/open-test-data/-/issues

**Issue 96: Fix TNO/Volve CRS reference data** (Bert Kampes, updated 2024-02-19)
https://community.opengroup.org/osdu/platform/data-flow/data-loading/open-test-data/-/issues/96

The TNO/Volve test data repo contains 3 reference data manifests that are not desired because they do not follow the standard record id agreement. Additionally, they are not BoundCRSs, and they use schema version 1.0.0 rather than 1.1.0. Meanwhile, OSDU comes "out of the box" with a distributed set of standard CRSs. The objective of this issue is to:
- [ ] Replace any CRS id referenced in the (thousands of) Well and Wellbore TNO/Volve to use the standard "out of the box" CRSs (per table below).
- [ ] Remove the local CRS reference data manifests in the TNO/Volve repo because they won't be needed anymore (i.e., the Well and Wellbore manifest will point to the OSDU distributed CRSs).
The OSDU CRS record id standard is described at
https://gitlab.opengroup.org/osdu/subcommittees/data-def/work-products/schema/-/blob/master/Guides/Chapters/04-FrameOfReference.md?ref_type=heads#4331-record-id-dataid-for-coordinatereferencesystem
The standard CRS reference data included with an OSDU distribution are contained in the file CRS_CT.json at
https://gitlab.opengroup.org/osdu/subcommittees/data-def/work-products/schema/-/blob/master/ReferenceValues/Manifests/reference-data/LOCAL/CRS_CT.json?ref_type=heads
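For quick triage of candidate ids, a validity check can be sketched as below. The pattern is only inferred from the example ids in the table further down; the linked guide remains the normative definition:

```python
import re

# Approximation of the OSDU CRS record id shape, inferred from the examples
# in the table below (e.g. BoundProjected:EPSG::23031_EPSG::1311); not the
# normative rule from the Frame of Reference guide.
CRS_ID_RE = re.compile(
    r"^osdu:reference-data--CoordinateReferenceSystem:"
    r"(?:Bound)?(?:Projected|Geographic2D|Geographic3D|Geocentric|Vertical)"
    r":EPSG::\d+(?:_EPSG::\d+)?:?$"
)

def looks_like_standard_crs_id(record_id: str) -> bool:
    """Return True when a record id matches the inferred standard shape."""
    return CRS_ID_RE.match(record_id) is not None
```

This flags the legacy ids (bare EPSG codes, names with `%20`) while accepting the "out of the box" ids, with or without the trailing colon used in record references.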
**Detailed analysis:**
There are 3 files with local reference data; the letters A, B, C in the table below refer to them:
* A) https://community.opengroup.org/osdu/platform/data-flow/data-loading/open-test-data/-/blob/master/rc--3.0.0/4-instances/Volve/reference-data/volve_ref_CoordinateReferenceSystem.1.0.0.json
* B) https://community.opengroup.org/osdu/platform/data-flow/data-loading/open-test-data/-/blob/master/rc--3.0.0/4-instances/Volve/reference-data/load_refCoordinateReferenceSystem.OPEN.json
* C) https://community.opengroup.org/osdu/platform/data-flow/data-loading/open-test-data/-/blob/master/rc--3.0.0/4-instances/TNO/reference-data/load_refCoordinateReferenceSystem.OPEN.json
<table>
<tr>
<th>
</th>
<th>
**_Volve/TNO 1.0.0 bogus string (record id)_**
</th>
<th>
**_Count in TNO/Volve manifests_**
</th>
<th>Replace in (Well and Wellbore) data manifests with proper record id</th>
<th>
**_Comment_**
</th>
</tr>
<tr>
<td>A,C</td>
<td>osdu:reference-data--CoordinateReferenceSystem:23031</td>
<td>12000+</td>
<td>osdu:reference-data--CoordinateReferenceSystem:BoundProjected:EPSG::23031_EPSG::1311</td>
<td>
_23031 is ED50 / UTM zone 31N; we should confirm where the data is located: CT 1311 is valid only in a specific area, and if the data were in northern Norway it would not be correct._
_See below: 23031024 refers to a different CT that is valid in Norway. I assumed the TNO data is in the North Sea near the Netherlands._
_Note: 23031 is a normal CRS, not a BoundCRS. In OSDU the standard is to use a BoundCRS to ingest data, so we should fix the test data and assign a BoundCRS to the original coordinates. The normalizer on ingest is then able to compute the WGS 84 coordinates (WGS 84 coordinates should not appear in the ingested input manifest)._
_We can confirm the CT by checking the WGS 84 coordinates._
</td>
</tr>
<tr>
<td>A</td>
<td>osdu:reference-data--CoordinateReferenceSystem:4230</td>
<td>
</td>
<td>osdu:reference-data--CoordinateReferenceSystem:BoundGeographic2D:EPSG::4230_EPSG::1311</td>
<td>4230 is ED50, base of above. Bound the same to CT 1311 (common offshore in North Sea)</td>
</tr>
<tr>
<td>A</td>
<td>osdu:reference-data--CoordinateReferenceSystem:4979</td>
<td>0 (confirmed)</td>
<td>osdu:reference-data--CoordinateReferenceSystem:Geographic3D:EPSG::4979</td>
<td>
_WGS 84 3D. Cannot be used by data. (I cannot believe that; it should only be referenced in 4326.) We should be OK to simply delete it from the Volve reference data._
</td>
</tr>
<tr>
<td>A</td>
<td>osdu:reference-data--CoordinateReferenceSystem:4978</td>
<td>0 (confirmed)</td>
<td>osdu:reference-data--CoordinateReferenceSystem:Geocentric:EPSG::4978</td>
<td>
_WGS 84 geocentric. Cannot be used by data. (I cannot believe that; it should only be referenced in 4979.) We should be OK to simply delete it from the Volve reference data._
</td>
</tr>
<tr>
<td>A</td>
<td>osdu:reference-data--CoordinateReferenceSystem:4326</td>
<td>many</td>
<td>osdu:reference-data--CoordinateReferenceSystem:Geographic2D:EPSG::4326</td>
<td>
_WGS 84 2D. Not bound._
</td>
</tr>
<tr>
<td>B</td>
<td>osdu:reference-data--CoordinateReferenceSystem:WGS%2084</td>
<td>
0
(confirmed)
</td>
<td>osdu:reference-data--CoordinateReferenceSystem:Geographic2D:EPSG::4326</td>
<td>
_Violates the current OSDU record id standard (space not allowed). Since it is not used, we can simply delete it from the local reference data .json, or let the issue resolve itself when that file is deleted from this repo. If it were used, replace it with 4326 as above (WGS 84 2D)._
</td>
</tr>
<tr>
<td>B</td>
<td>osdu:reference-data--CoordinateReferenceSystem:23031024</td>
<td>
</td>
<td>osdu:reference-data--CoordinateReferenceSystem:BoundProjected:EPSG::23031_EPSG::1613</td>
<td>
_024 is the variant for CT EPSG::1613 which is valid in Norway offshore, South of 62N._
[https://epsg.org/transformation_1613/ED50-to-WGS-84-24.html](https://epsg.org/transformation_1613/ED50-to-WGS-84-24.html)
</td>
</tr>
<tr>
<td>B,C</td>
<td>osdu:reference-data--CoordinateReferenceSystem:MSL</td>
<td>
</td>
<td>osdu:reference-data--CoordinateReferenceSystem:Vertical:EPSG::5714</td>
<td>
_MSL height_ [https://epsg.org/crs_5714/MSL-height.html](https://epsg.org/crs_5714/MSL-height.html)
</td>
</tr>
<tr>
<td>C</td>
<td>osdu:reference-data--CoordinateReferenceSystem:5709</td>
<td>
</td>
<td>osdu:reference-data--CoordinateReferenceSystem:Vertical:EPSG::5709</td>
<td>
_5709 is NAP height (Dutch onshore vertical)_
</td>
</tr>
<tr>
<td>C</td>
<td>osdu:reference-data--CoordinateReferenceSystem:23095</td>
<td>
</td>
<td>osdu:reference-data--CoordinateReferenceSystem:BoundProjected:EPSG::23095_EPSG::1311</td>
<td>ED50 / TM 5 NE; generally used in NLD with CT 1311</td>
</tr>
<tr>
<td>C</td>
<td>osdu:reference-data--CoordinateReferenceSystem:23032</td>
<td>
</td>
<td>osdu:reference-data--CoordinateReferenceSystem:BoundProjected:EPSG::23032_EPSG::1311</td>
<td>ED50 / UTM zone 32N. Several CTs are used (1311, 1133, 1612, 1613, 1627, 1631, 1810), depending on area.</td>
</tr>
<tr>
<td>C</td>
<td>osdu:reference-data--CoordinateReferenceSystem:25831</td>
<td>
</td>
<td>osdu:reference-data--CoordinateReferenceSystem:BoundProjected:EPSG::25831_EPSG::1149</td>
<td>
ETRS89 / UTM zone 31N [1149_25831]
</td>
</tr>
<tr>
<td>C</td>
<td>osdu:reference-data--CoordinateReferenceSystem:28992</td>
<td>
</td>
<td>osdu:reference-data--CoordinateReferenceSystem:BoundProjected:EPSG::28992_EPSG::1672</td>
<td>Amersfoort/RD New used in NLD with CT 1672</td>
</tr>
<tr>
<td>C</td>
<td>osdu:reference-data--CoordinateReferenceSystem:4230</td>
<td>
</td>
<td>See above – first check whether there is data in TNO using 4230, or whether this is just a dependency for defining ED50 / UTM (the projected CRSs point to the 4230 geographic CRS). If 4230 is used in data records, then the BoundGeographic here needs the same CT as used in the BoundProjected entries.</td>
<td>
</td>
</tr>
</table>
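For the first checklist item, the bulk rewrite over the (thousands of) Well and Wellbore manifests can be sketched as below. This is a sketch, not the agreed tooling: `CRS_ID_MAP` shows only a subset of the table above, the trailing `:` follows the record-reference style seen in the example manifests, and the chosen CTs must be confirmed before running at scale:

```python
import json
import os

# Legacy TNO/Volve CRS record ids -> standard "out of the box" ids
# (illustrative subset of the table above; extend and confirm CTs first).
CRS_ID_MAP = {
    "osdu:reference-data--CoordinateReferenceSystem:23031:":
        "osdu:reference-data--CoordinateReferenceSystem:BoundProjected:EPSG::23031_EPSG::1311:",
    "osdu:reference-data--CoordinateReferenceSystem:4326:":
        "osdu:reference-data--CoordinateReferenceSystem:Geographic2D:EPSG::4326:",
    "osdu:reference-data--CoordinateReferenceSystem:MSL:":
        "osdu:reference-data--CoordinateReferenceSystem:Vertical:EPSG::5714:",
}

def rewrite_crs_ids(text: str) -> str:
    """Replace every legacy CRS id occurring in a manifest's raw JSON text."""
    for old, new in CRS_ID_MAP.items():
        text = text.replace(old, new)
    return text

def rewrite_manifest_tree(root: str) -> int:
    """Rewrite all .json manifests under `root` in place; return files changed."""
    changed = 0
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if not name.endswith(".json"):
                continue
            path = os.path.join(dirpath, name)
            with open(path, encoding="utf-8") as f:
                original = f.read()
            updated = rewrite_crs_ids(original)
            if updated != original:
                json.loads(updated)  # sanity check: result is still valid JSON
                with open(path, "w", encoding="utf-8") as f:
                    f.write(updated)
                changed += 1
    return changed
```

Working on the raw JSON text (rather than parsed objects) keeps the manifests byte-identical except for the replaced ids, which makes the change easy to review in a diff.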
**Example manifests for a Well and Wellbore (search e.g., for AsIngestedCoordinates in data SpatialLocation):**
* https://community.opengroup.org/osdu/platform/data-flow/data-loading/open-test-data/-/blob/master/rc--3.0.0/4-instances/Volve/master-data/Well/load_Well.1.0.0_15%252F9-F-14.json
* containing "CoordinateReferenceSystemID": "osdu:reference-data--CoordinateReferenceSystem:4326:"
* https://community.opengroup.org/osdu/platform/data-flow/data-loading/open-test-data/-/blob/master/rc--3.0.0/4-instances/TNO/master-data/Wellbore/load_Wellbore.1.0.0_1000.json
* containing "CoordinateReferenceSystemID": "osdu:reference-data--CoordinateReferenceSystem:23031:", with a PR as well.

**Issue 94: WorkProduct manifests generation fails while parsing LAS files** (Zhubin Salehi, updated 2024-01-17)
https://community.opengroup.org/osdu/platform/data-flow/data-loading/open-test-data/-/issues/94

Parsing the LAS files in the well_logs and well_logs_1_1_0 directories for TNO fails with the following error log for each file:
```
2024-01-12 15:31:53 ERROR Unable to read laslog file: /home/zhubin/tno-wpc-datasets/well-logs/3938_del08_1994_comp.las
Traceback (most recent call last):
File "/home/zhubin/.local/lib/python3.10/site-packages/lasio/las.py", line 176, in read
assert version in (1.2, 2, None)
AssertionError
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/zhubin/open-test-data/rc--3.0.0/2-scripts/load_manifest_scripts/src/loading_manifest/osdu_laslog_manifest.py", line 405, in create_laslog_manifest_from_path
log_data = read_data_from_log_file(full_valid_file_path)
File "/home/zhubin/open-test-data/rc--3.0.0/2-scripts/load_manifest_scripts/src/loading_manifest/osdu_laslog_manifest.py", line 39, in read_data_from_log_file
las = lasio.read(fp)
File "/home/zhubin/.local/lib/python3.10/site-packages/lasio/__init__.py", line 41, in read
return LASFile(file_ref, **kwargs)
File "/home/zhubin/.local/lib/python3.10/site-packages/lasio/las.py", line 77, in __init__
self.read(file_ref, **read_kwargs)
File "/home/zhubin/.local/lib/python3.10/site-packages/lasio/las.py", line 178, in read
if version < 2:
TypeError: '<' not supported between instances of 'str' and 'int'
```
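The `TypeError` indicates that the VERS value in these files was parsed as a string rather than one of the numeric versions lasio asserts on. As a stdlib-only illustration (not lasio's actual implementation), this is roughly how a LAS reader pulls VERS out of the `~V` section, and where a non-numeric value (the `2.0/3.0` below is a hypothetical example) surfaces as a string:

```python
import re

def las_vers_value(header_text: str):
    """Extract the raw VERS value from a LAS ~V(ersion) section.

    Rough sketch of what a LAS reader does; real parsers (lasio) are far
    more forgiving. Returns a float when the value is numeric, otherwise
    the raw string -- the case that trips the version assertion above.
    """
    in_version = False
    for line in header_text.splitlines():
        line = line.strip()
        if line.startswith("~"):
            # Section marker: ~V / ~Version starts the version section.
            in_version = line[1:2].upper() == "V"
            continue
        if in_version and line.upper().startswith("VERS"):
            # LAS line layout: MNEM.UNIT  VALUE : DESCRIPTION
            m = re.match(r"VERS\s*\.\s*([^:]*):?", line, re.IGNORECASE)
            if m:
                value = m.group(1).strip()
                try:
                    return float(value)
                except ValueError:
                    return value  # non-numeric VERS stays a string
    return None
```

A string return here is what older lasio releases compare against `2` and fail on with the `TypeError` shown in the traceback.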
I tried different versions of `lasio` in the `open-test-data/rc--3.0.0/2-scripts/load_manifest_scripts/requirements.txt` file. Using version 0.30 resolved the issue.

**Issue 93: Volve Stratigraphy** (Thomas Gehrmann [slb], updated 2023-10-25)
https://community.opengroup.org/osdu/platform/data-flow/data-loading/open-test-data/-/issues/93

Add a complete stratigraphic column for Volve. This is based on the spreadsheet provided by Bjarne Bøklepp [Equinor] in the member GitLab [here](https://gitlab.opengroup.org/osdu/subcommittees/data-def/projects/well-delivery/docs/-/blob/master/Design%20Documents/WellLogExtensions/Volve_WellLog_examples/stratigraphy/volve_lithostratigraphy.xlsx?ref_type=heads).

**Issue 92: Create Seismic 2D Navigation sample JSON payloads - ready to support display of SP labels** (Debasis Chatterjee, updated 2023-09-14)
https://community.opengroup.org/osdu/platform/data-flow/data-loading/open-test-data/-/issues/92

Please see
https://gitlab.opengroup.org/osdu/subcommittees/data-def/work-products/schema/-/issues/348#note_69692
With that information, I think we may need to overhaul these (SEGP1) examples.
Existing JSON payloads here
https://community.opengroup.org/osdu/platform/data-flow/data-loading/open-test-data/-/tree/master/rc--3.0.0/4-instances/Volve/work-products/seismics_1_2_0
cc @Keith_Wall

**Issue 91: TNO dataset, master-data missing required value** (Do Dang, updated 2023-06-05)
https://community.opengroup.org/osdu/platform/data-flow/data-loading/open-test-data/-/issues/91

Hi,
I'm using [generate_manifests.sh](https://community.opengroup.org/osdu/platform/data-flow/data-loading/open-test-data/-/blob/master/rc--3.0.0/6-data-load-scripts/scripts/generate-manifests.sh) to generate manifests. However, there are some missing properties in the provided TNO dataset, in this [file](https://community.opengroup.org/osdu/platform/data-flow/data-loading/open-test-data/-/blob/master/rc--3.0.0/1-data/3-provided/TNO/master-data/Well/well_tno_csv_0915_2021.csv)
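The validation failure in the log below boils down to generated GeoJSON Features missing the required "properties" member (the schema allows it to be null, per its `oneOf`). A minimal stdlib sketch of a workaround, as a hypothetical helper rather than part of the repo's load scripts:

```python
def add_missing_feature_properties(manifest: dict) -> int:
    """Add a null "properties" member to GeoJSON Features that lack one.

    GeoJSON requires every Feature to carry "properties", which may be
    null. Walks the manifest dict, patches features in place, and returns
    how many features were patched.
    """
    patched = 0

    def walk(node):
        nonlocal patched
        if isinstance(node, dict):
            if node.get("type") == "Feature" and "properties" not in node:
                node["properties"] = None
                patched += 1
            for value in node.values():
                walk(value)
        elif isinstance(node, list):
            for item in node:
                walk(item)

    walk(manifest)
    return patched
```

The cleaner long-term fix is for the CSV-to-manifest template to emit "properties" in the first place, but patching lets existing generated manifests validate.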
Log:

```
Creating manifest using /Users/dodang/workspace/free/osdu/open-test-data-master/rc--3.0.0/6-data-load-scripts/scripts/../../1-data/3-provided/TNO/master-data/Well/well_tno_csv_0915_2021.csv with template /Users/dodang/workspace/free/osdu/open-test-data-master/rc--3.0.0/6-data-load-scripts/scripts/../../5-templates/master_data/template_maWell.1.0.0.json
Not using group file
2023-06-04 23:07:55 INFO input_csv: /Users/dodang/workspace/free/osdu/open-test-data-master/rc--3.0.0/1-data/3-provided/TNO/master-data/Well/well_tno_csv_0915_2021.csv
2023-06-04 23:07:55 INFO template_json: /Users/dodang/workspace/free/osdu/open-test-data-master/rc--3.0.0/5-templates/master_data/template_maWell.1.0.0.json
2023-06-04 23:07:55 INFO output_path: /Users/dodang/workspace/free/osdu/open-test-data-master/rc--3.0.0/generated-manifests/tno/master-well-data-manifests
2023-06-04 23:07:55 INFO schema_path: /Users/dodang/workspace/free/osdu/open-test-data-master/rc--3.0.0/3-schema
2023-06-04 23:07:55 INFO schema_ns_name: <namespace>
2023-06-04 23:07:55 INFO schema_ns_value: --required_template
2023-06-04 23:07:55 INFO
2023-06-04 23:07:56 ERROR Unable to process data row: 1
Traceback (most recent call last):
File "/Users/dodang/workspace/free/osdu/open-test-data-master/rc--3.0.0/2-scripts/load_manifest_scripts/src/loading_manifest/csv_to_json.py", line 631, in create_manifest_from_csv
jsonschema.validate(to_be_validated, schema, resolver=resolver)
File "/Users/dodang/workspace/arm-venv/lib/python3.9/site-packages/jsonschema/validators.py", line 1121, in validate
raise error
jsonschema.exceptions.ValidationError: 'properties' is a required property
Failed validating 'required' in schema['properties']['data']['allOf'][1]['properties']['SpatialLocation']['properties']['Wgs84Coordinates']['properties']['features']['items']:
{'properties': {'bbox': {'items': {'type': 'number'},
'minItems': 4,
'type': 'array'},
'geometry': {'oneOf': [{'type': 'null'},
{'properties': {'bbox': {'items': {'type': 'number'},
'minItems': 4,
'type': 'array'},
'coordinates': {'items': {'type': 'number'},
'minItems': 2,
'type': 'array'},
'type': {'enum': ['Point'],
'type': 'string'}},
'required': ['type',
'coordinates'],
'title': 'GeoJSON Point',
'type': 'object'},
{'properties': {'bbox': {'items': {'type': 'number'},
'minItems': 4,
'type': 'array'},
'coordinates': {'items': {'items': {'type': 'number'},
'minItems': 2,
'type': 'array'},
'minItems': 2,
'type': 'array'},
'type': {'enum': ['LineString'],
'type': 'string'}},
'required': ['type',
'coordinates'],
'title': 'GeoJSON LineString',
'type': 'object'},
{'properties': {'bbox': {'items': {'type': 'number'},
'minItems': 4,
'type': 'array'},
'coordinates': {'items': {'items': {'items': {'type': 'number'},
'minItems': 2,
'type': 'array'},
'minItems': 4,
'type': 'array'},
'type': 'array'},
'type': {'enum': ['Polygon'],
'type': 'string'}},
'required': ['type',
'coordinates'],
'title': 'GeoJSON Polygon',
'type': 'object'},
{'properties': {'bbox': {'items': {'type': 'number'},
'minItems': 4,
'type': 'array'},
'coordinates': {'items': {'items': {'type': 'number'},
'minItems': 2,
'type': 'array'},
'type': 'array'},
'type': {'enum': ['MultiPoint'],
'type': 'string'}},
'required': ['type',
'coordinates'],
'title': 'GeoJSON MultiPoint',
'type': 'object'},
{'properties': {'bbox': {'items': {'type': 'number'},
'minItems': 4,
'type': 'array'},
'coordinates': {'items': {'items': {'items': {'type': 'number'},
'minItems': 2,
'type': 'array'},
'minItems': 2,
'type': 'array'},
'type': 'array'},
'type': {'enum': ['MultiLineString'],
'type': 'string'}},
'required': ['type',
'coordinates'],
'title': 'GeoJSON '
'MultiLineString',
'type': 'object'},
{'properties': {'bbox': {'items': {'type': 'number'},
'minItems': 4,
'type': 'array'},
'coordinates': {'items': {'items': {'items': {'items': {'type': 'number'},
'minItems': 2,
'type': 'array'},
'minItems': 4,
'type': 'array'},
'type': 'array'},
'type': 'array'},
'type': {'enum': ['MultiPolygon'],
'type': 'string'}},
'required': ['type',
'coordinates'],
'title': 'GeoJSON MultiPolygon',
'type': 'object'},
{'properties': {'bbox': {'items': {'type': 'number'},
'minItems': 4,
'type': 'array'},
'geometries': {'items': {'oneOf': [{'properties': {'bbox': {'items': {'type': 'number'},
'minItems': 4,
'type': 'array'},
'coordinates': {'items': {'type': 'number'},
'minItems': 2,
'type': 'array'},
'type': {'enum': ['Point'],
'type': 'string'}},
'required': ['type',
'coordinates'],
'title': 'GeoJSON '
'Point',
'type': 'object'},
{'properties': {'bbox': {'items': {'type': 'number'},
'minItems': 4,
'type': 'array'},
'coordinates': {'items': {'items': {'type': 'number'},
'minItems': 2,
'type': 'array'},
'minItems': 2,
'type': 'array'},
'type': {'enum': ['LineString'],
'type': 'string'}},
'required': ['type',
'coordinates'],
'title': 'GeoJSON '
'LineString',
'type': 'object'},
{'properties': {'bbox': {'items': {'type': 'number'},
'minItems': 4,
'type': 'array'},
'coordinates': {'items': {'items': {'items': {'type': 'number'},
'minItems': 2,
'type': 'array'},
'minItems': 4,
'type': 'array'},
'type': 'array'},
'type': {'enum': ['Polygon'],
'type': 'string'}},
'required': ['type',
'coordinates'],
'title': 'GeoJSON '
'Polygon',
'type': 'object'},
{'properties': {'bbox': {'items': {'type': 'number'},
'minItems': 4,
'type': 'array'},
'coordinates': {'items': {'items': {'type': 'number'},
'minItems': 2,
'type': 'array'},
'type': 'array'},
'type': {'enum': ['MultiPoint'],
'type': 'string'}},
'required': ['type',
'coordinates'],
'title': 'GeoJSON '
'MultiPoint',
'type': 'object'},
{'properties': {'bbox': {'items': {'type': 'number'},
'minItems': 4,
'type': 'array'},
'coordinates': {'items': {'items': {'items': {'type': 'number'},
'minItems': 2,
'type': 'array'},
'minItems': 2,
'type': 'array'},
'type': 'array'},
'type': {'enum': ['MultiLineString'],
'type': 'string'}},
'required': ['type',
'coordinates'],
'title': 'GeoJSON '
'MultiLineString',
'type': 'object'},
{'properties': {'bbox': {'items': {'type': 'number'},
'minItems': 4,
'type': 'array'},
'coordinates': {'items': {'items': {'items': {'items': {'type': 'number'},
'minItems': 2,
'type': 'array'},
'minItems': 4,
'type': 'array'},
'type': 'array'},
'type': 'array'},
'type': {'enum': ['MultiPolygon'],
'type': 'string'}},
'required': ['type',
'coordinates'],
'title': 'GeoJSON '
'MultiPolygon',
'type': 'object'}]},
'type': 'array'},
'type': {'enum': ['GeometryCollection'],
'type': 'string'}},
'required': ['type',
'geometries'],
'title': 'GeoJSON '
'GeometryCollection',
'type': 'object'}]},
'properties': {'oneOf': [{'type': 'null'},
{'type': 'object'}]},
'type': {'enum': ['Feature'], 'type': 'string'}},
'required': ['type', 'properties', 'geometry'],
'title': 'GeoJSON Feature',
'type': 'object'}
On instance['data']['SpatialLocation']['Wgs84Coordinates']['features'][0]:
{'geometry': {'coordinates': [3.51906683, 55.68101428],
'type': 'Point'},
'type': 'Feature'}
```

**Issue 90: Add all released new schema versions to test data** (Michael, updated 2023-05-22)
https://community.opengroup.org/osdu/platform/data-flow/data-loading/open-test-data/-/issues/90

New schemas are released (for instance master-data--Wellbore:1.3.0); however, these new schemas are not added to the pre-shipping environments for testing.
There should at least be a few records that use the latest released schemas for each major data type (master-data--Well, master-data--Wellbore, wpc--SeismicTraceData, wpc--SeismicHorizon, wpc--WellLog, etc.) available in the pre-shipping environments by addition to the test dataset.

**Issue 89: Stratigraphy and WellboreMarkerSet - questions and concerns** (Debasis Chatterjee, updated 2023-10-25)
https://community.opengroup.org/osdu/platform/data-flow/data-loading/open-test-data/-/issues/89

Refer to this excellent documentation (worked example): https://gitlab.opengroup.org/osdu/subcommittees/data-def/work-products/schema/-/blob/master/Examples/WorkedExamples/Reservoir%20Data/Stratigraphy/README.md
For wellbore 15/3-7,
> Top of Viking (group, rank=1) = 4049.0 m Top of Draupne (formation, rank=2) is 4049.0 m. Top of Heather (formation, rank=2) is 4049.0 m.
Looking at the populated WellboreMarkerSet record in https://community.opengroup.org/osdu/data/data-definitions/-/blob/master/Examples/work-product-component/WellboreMarkerSet.1.2.1.json
The populated load manifest of the sample data (TNO marker data) does not utilize an adequate number of properties from the Markers array: https://community.opengroup.org/osdu/platform/data-flow/data-loading/open-test-data/-/blob/master/rc--3.0.0/4-instances/TNO/work-products/markers_1_1_0/load_top_1.1.0_1001_csv.json
```
"Markers": [
{
"MarkerName": "QUATER. UNDIFF.",
"MarkerMeasuredDepth": 0.0
},
{
"MarkerName": "Breda Formation",
"MarkerMeasuredDepth": 282.0
},
{
"MarkerName": "Veldhoven Clay Member",
"MarkerMeasuredDepth": 501.0
},
```
It would be nice to get suitable sample data and JSON files/load-manifests (for the related entities) that actually match this excellent documentation. Some hints are in the worked example, such as the "Gudrun" Stratigraphic Column: https://gitlab.opengroup.org/osdu/subcommittees/data-def/work-products/schema/-/blob/master/Examples/WorkedExamples/Reservoir%20Data/Stratigraphy/README.md#stratigraphic-column It is necessary to convert this information into a complete (loadable) package so as to get a proper reference.
My concerns: Markers.MarkerName is "free text" and thus open to human error. When a data loader is populating data from many wells in this NPD field, he/she may use "Top of Draupne" for one well and "Top - Draupne" for another. Use case: to obtain a contour map of "Top of Draupne", it becomes necessary to get MD (or TVD-SS), X, Y from all wellbores.
Questions:
1. FeatureTypeID and FeatureName. FeatureType can be "Top" or "Base"; those values are clear. FeatureName: why is this left as "free text" rather than a link to an existing record in some other parent entity?
https://community.opengroup.org/osdu/data/data-definitions/-/blob/master/E-R/reference-data/FeatureType.1.0.0.md Description = "Used to describe the type of features. Common values being Top, Base, OWC, Fault etc."
Is the property name unambiguous? Is this more of a "Contact type"? In any case, what would be a typical value of FeatureName in the NPD example when we have to build the Markers array for NPD wellbore 15/3-7?
> Top of Viking (group, rank=1) = 4049.0 m Top of Draupne (formation, rank=2) is 4049.0 m. Top of Heather (formation, rank=2) is 4049.0 m.
1. In the Markers array there are some properties, such as MarkerTypeID and Missing. Being an array, it can hold information for several markers within one specific Wellbore.
Now there is also a provision for a new property/block:
AvailableMarkerProperties, such as MissingThickness. It is not obvious how this will be used for the multiple elements present in the Markers array.
1. The WellboreMarkerSet is linked to one StratigraphicColumn. The link is from the overall record, not from individual array elements.
In any case, what would be a typical value of StratigraphicColumn in the NPD example when we have to build the Markers array for NPD wellbore 15/3-7? Leave it as "Gudrun" for the column overall?
> Top of Viking (group, rank=1) = 4049.0 m Top of Draupne (formation, rank=2) is 4049.0 m. Top of Heather (formation, rank=2) is 4049.0 m.
cc - @gehrmann and @keith_wall for information

**Issue 88: Enhance sample load manifests for Marker data** (Debasis Chatterjee, updated 2022-10-08)
https://community.opengroup.org/osdu/platform/data-flow/data-loading/open-test-data/-/issues/88

Refer to this example:
https://community.opengroup.org/osdu/platform/data-flow/data-loading/open-test-data/-/blob/master/rc--3.0.0/4-instances/TNO/work-products/markers_1_1_0/load_top_1.1.0_1002_csv.json
Schema version 1.1.0
Also see example of populated schema (1.2.1)
https://community.opengroup.org/osdu/data/data-definitions/-/blob/master/Examples/work-product-component/WellboreMarkerSet.1.2.1.json
The comments here are not about the schema difference (1.1.0 vs. 1.2.1), but about the lack of information shown for each marker in the sample load manifest.
See the Markers array in the sample load manifest (only two properties are filled for each marker inside the array):
```
"WellboreID": "osdu:master-data--Wellbore:1002:",
"Markers": [
{
"MarkerName": "Maassluis Formation",
"MarkerMeasuredDepth": 0.0
},
{
"MarkerName": "Oosterhout Formation",
"MarkerMeasuredDepth": 242.5
},
....
```
This does not showcase the use of a detailed description of each marker for Wellbore "1002". Compare the populated 1.2.1 example:
```
"AvailableMarkerProperties": [
{
"MarkerPropertyTypeID": "partition-id:reference-data--MarkerPropertyType:MissingThickness:",
"MarkerPropertyUnitID": "partition-id:reference-data--UnitOfMeasure:ft:",
"Name": "MissingThickness"
}
],
"Markers": [
{
"MarkerName": "Example MarkerName",
"MarkerID": "Example Marker ID",
"InterpretationID": "namespace:work-product-component--GeobodyBoundaryInterpretation:GeobodyBoundaryInterpretation-911bb71f-06ab-4deb-8e68-b8c9229dc76b:",
"MarkerMeasuredDepth": 12345.6,
"MarkerSubSeaVerticalDepth": 12345.6,
"MarkerDate": "2020-02-13T09:13:15.55Z",
"MarkerObservationNumber": 12345.6,
"MarkerInterpreter": "Example MarkerInterpreter",
"MarkerTypeID": "namespace:reference-data--MarkerType:BioStratigraphy:",
"FeatureTypeID": "namespace:reference-data--FeatureType:Base:",
"FeatureName": "Example FeatureName",
"PositiveVerticalDelta": 12345.6,
"NegativeVerticalDelta": 12345.6,
"SurfaceDipAngle": 12345.6,
"SurfaceDipAzimuth": 12345.6,
"Missing": "Example Missing",
"GeologicalAge": "Example GeologicalAge"
}
],
"StratigraphicColumnID": "namespace:work-product-component--StratigraphicColumn:StratigraphicColumn-911bb71f-06ab-4deb-8e68-b8c9229dc76b:",
"StratigraphicColumnRankInterpretationID": "namespace:work-product-component--StratigraphicColumnRankInterpretation:StratigraphicColumnRankInterpretation-911bb71f-06ab-4deb-8e68-b8c9229dc76b:",
```
This CSV file may be helpful.
https://gitlab.opengroup.org/osdu/subcommittees/data-def/work-products/schema/-/blob/master/Examples/WorkedExamples/WellboreMarkerSet/dataset_and_wpc/MarkerSet-b8fd398a-5d74-45fa-8ecb-03b1ad927026.csv
cc @Keith_Wall

**Issue 87: Invalid dataset records used in ingestion of Volve seismic data** (Michael, updated 2022-10-06)
https://community.opengroup.org/osdu/platform/data-flow/data-loading/open-test-data/-/issues/87

We have noticed in several different environments that the dataset records used for the Volve seismic data are incorrect.
The dataset records being used are dataset--FileCollection.SEGY dataset records.
When attempting to call the dataset get retrieval instruction service (api/dataset/v1/getRetrievalInstructions) a 500 error is returned.
Can the Volve ingestion process be changed to either fix the dataset--FileCollection.SEGY dataset records so that a 500 error does not occur when calling the dataset get retrieval instructions service, or change the dataset records to dataset--File.Generic?

**Issue 80: Add spatial information for TNO Well Log Records** (Michael, updated 2022-06-10)
https://community.opengroup.org/osdu/platform/data-flow/data-loading/open-test-data/-/issues/80

The manifest records for the TNO work-product-component--WellLog lack any Spatial fields. This prevents any spatialFilter queries from being done on the well logs. The location information should be provided for the trajectories when available, because this allows users to filter well logs by location on a large scale without having to link a well log to its parent Wellbore or Well.

**Issue 73: Fix Generated Manifest contains names in ID instead of code** (etienne peysson, updated 2022-08-23)
https://community.opengroup.org/osdu/platform/data-flow/data-loading/open-test-data/-/issues/73
Steps to reproduce:
Generate Manifest files from CSV files for reference, misc master data, well and wellbore data for TNO/Volve data.
The generated manifest for Well master data contains reference values like
"VerticalMeasurementTypeID": "osdu:reference-data--VerticalMeasurementType:Rotary%20Table:",
"VerticalMeasurementPathID": "osdu:reference-data--VerticalMeasurementPath:Elevation:"
As per the latest update, these IDs should contain the code instead of the name, like
"VerticalMeasurementTypeID": "osdu:reference-data--VerticalMeasurementType:RT:",
"VerticalMeasurementPathID": "osdu:reference-data--VerticalMeasurementPath:ELEV:"
The mapping goes like this:

| Name | Code |
| --- | --- |
| Rotary Table | RT |
| Elevation | ELEV |
https://community.opengroup.org/osdu/platform/open-test-data/-/blob/master/rc--3.0.0/4-instances/TNO/master-data/Well/load_Well.1.0.0_1011.json
The current CSV files need to be updated to use the code instead of the name.
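The id rewrite implied by this mapping can be sketched as below. This is a hypothetical helper, not the project's tooling, and `NAME_TO_CODE` shows only the two pairs named in this issue; the full mapping would come from the reference-data CSV files:

```python
import urllib.parse

# Name -> code mapping from the issue description (illustrative subset).
NAME_TO_CODE = {
    "VerticalMeasurementType": {"Rotary Table": "RT"},
    "VerticalMeasurementPath": {"Elevation": "ELEV"},
}

def rewrite_reference_id(record_id: str) -> str:
    """Rewrite a name-based reference id to its code-based form, e.g.
    'osdu:reference-data--VerticalMeasurementType:Rotary%20Table:' ->
    'osdu:reference-data--VerticalMeasurementType:RT:'.
    Unknown or already-correct ids are returned unchanged."""
    parts = record_id.rstrip(":").split(":")
    if len(parts) < 3 or not parts[1].startswith("reference-data--"):
        return record_id
    entity = parts[1][len("reference-data--"):]
    # Ids carry the display name percent-encoded ("Rotary%20Table").
    name = urllib.parse.unquote(parts[2])
    code = NAME_TO_CODE.get(entity, {}).get(name)
    if code is None:
        return record_id
    return f"{parts[0]}:{parts[1]}:{code}:"
```

Running such a helper over the generated manifests would fix existing files, but as the issue says, the lasting fix is to correct the source CSV files themselves.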