# Seismic issues
https://community.opengroup.org/groups/osdu/platform/domain-data-mgmt-services/seismic/-/issues (updated 2023-01-13)

---

**[Create a custom LOD generation example](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/12)** (Morten Ofstad, 2023-01-13; assignee: Morten Ofstad)

Add a program to the examples that reads a lower LOD, downsamples it, and creates a higher LOD. This example has value because it 1) shows how to write LODs and 2) can be run in the cloud (e.g. on AWS) after uploading LOD 0 of the data, either as a trigger or as part of the ingestion service.

---

**[Make it possible to set ActualValueRange during import](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/28)** (Morten Ofstad, 2023-09-21)

Currently the VolumeDataLayout is immutable, so there is no way to update the ActualValueRange once it is known (after importing all the data). For this to work there needs to be a setter for the ActualValueRange, a dirty flag for the layout, and code to re-upload the layout if it is dirty when committing.
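For open-vds issue 12 above, the core downsampling step of such an example could look like the following plain-Python sketch. The 2x2x2 averaging scheme and nested-list volume representation are assumptions for illustration; a real example would use the OpenVDS page accessors to read LOD N and write LOD N+1 chunk by chunk, and would handle partial blocks at the survey edges.

```python
def downsample_lod(chunk):
    """Average 2x2x2 blocks of a LOD-N volume (nested lists) to produce LOD-N+1.

    Assumes every dimension is even; a real implementation would pad
    partial blocks at the survey edges.
    """
    nz, ny, nx = len(chunk), len(chunk[0]), len(chunk[0][0])
    out = [[[0.0] * (nx // 2) for _ in range(ny // 2)] for _ in range(nz // 2)]
    for i in range(nz // 2):
        for j in range(ny // 2):
            for k in range(nx // 2):
                # Sum the 8 samples of the 2x2x2 block, then average.
                s = 0.0
                for a in range(2):
                    for b in range(2):
                        for c in range(2):
                            s += chunk[2 * i + a][2 * j + b][2 * k + c]
                out[i][j][k] = s / 8.0
    return out
```

Running this over every chunk of LOD 0 and writing the results as LOD 1 (then repeating for higher LODs) is the loop the requested example would demonstrate.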
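For open-vds issue 28 above, the setter-plus-dirty-flag flow could be tracked during import roughly as below. This is a plain-Python sketch; the class and method names are hypothetical, not the OpenVDS API.

```python
class LayoutState:
    """Hypothetical holder for the mutable layout metadata the issue asks for."""

    def __init__(self):
        self.actual_value_range = None  # (min, max) once known
        self.dirty = False

    def update_value_range(self, samples):
        """Fold a batch of imported samples into the running value range."""
        lo, hi = min(samples), max(samples)
        if self.actual_value_range is not None:
            lo = min(lo, self.actual_value_range[0])
            hi = max(hi, self.actual_value_range[1])
        if (lo, hi) != self.actual_value_range:
            self.actual_value_range = (lo, hi)
            self.dirty = True  # layout must be re-uploaded on commit

    def commit(self, upload_layout):
        """Re-upload the layout only if it changed since the last commit."""
        if self.dirty:
            upload_layout(self.actual_value_range)
            self.dirty = False
```

The key design point is the dirty flag: the layout upload happens at most once per commit, only when the range actually changed.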
---

**[[Seismic DDMS] Provide coverage for metadata](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/home/-/issues/3)** (Celina Marcolino, 2021-06-15; milestone: M1 - Release 0.1)

1. Provide coverage and support for seismic metadata following the OSDU Seismic Data Model. Ensure seismic metadata are defined and follow the OSDU Data Model Schema as in: https://gitlab.opengroup.org/osdu/json-schemas/-/tree/OpenDES_Archive/geophysics
2. Provide coverage for Processing Seismic Header information
3. Provide coverage for Survey Definition (grid definition)
4. Provide coverage for Seismic Navigation
5. Provide coverage for Dataset type and domain

---

**[[Seismic DDMS] Define SEG-Y ingestion flow](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/home/-/issues/6)** (Celina Marcolino, 2021-03-01; milestone: M1 - Release 0.1)

Define the SEG-Y ingestion flow and how it gets optimized into ZGY and/or OpenVDS files within the Seismic DMS. To be linked with the SEGY Ingestion Project from the Open Community.
---

**[golang binding](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-zgy/-/issues/1)** (Jackie Li, 2024-03-16)

Hi, I'm Jackie from Target Energy Solutions. We have a golang binding for the ZGY library. Do you mind us contributing to this repo? Best regards.

---

**[Delete dataset API does not delete COS (Blob Storage) object](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/1)** (Walter D, 2023-03-27; assignee: ethiraj krishnamanaidu)

The delete dataset API of seismic-store-service calls the Storage Service POST delete record API, which should delete the dataset's object from COS (Blob Storage). However, the COS object is still available even though the response is 204 No Content. We realize that the Storage Service POST delete is only doing a soft delete. We wanted to confirm whether this is the expected behavior.
---

**[GCP specific naming conventions](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/4)** (Rucha Deshpande, 2023-03-27; assignee: Diego Molteni)

There are many GCP-specific names used in the models, such as gcpid, gcp_bucket, etc. There is also an API called /api/v3/utility/gcs-access-token. The code should be revisited to remove any CSP-specific naming.

---

**[Integration tests assume a valid legal tag exists](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/8)** (Rucha Deshpande, 2023-03-27; assignee: Diego Molteni)

If the FEATURE FLAG for legal is set to true, the service checks the validity of the legal tag passed in 'ltag'.
---

**[Tenants - usage of Partition Service](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/17)** (Rucha Deshpande, 2023-03-27; assignee: Diego Molteni)

Since we are using a new tenant model in this service, there are new tenant-related APIs. Can we not use the existing Partition Service APIs instead?

---

**[Dataset with seismic metadata fails due to updates in R3 data definitions in Storage Service](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/18)** (Rucha Deshpande, 2023-03-27; assignee: Diego Molteni)

Posting a dataset with seismic metadata that is to be stored as a Storage record fails. The Seismic DMS service needs to be updated to work with the R3 Data Definitions. See issue: https://community.opengroup.org/osdu/platform/system/storage/-/issues/44
---

**[e2e test script needs to run from repository root only](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/22)** (Rucha Deshpande, 2023-03-27; assignee: Diego Molteni)

The run-e2e-tests.sh script has the following check, which will not work in internal pipelines where the distribution folder structure is different:

```sh
if [ ! -f "tsconfig.json" ]; then
    printf "\n%s\n" "[ERROR] The script must be called from the project root directory."
    exit 1
fi
```
---

**[createQuery and createKey - generalize structure](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/23)** (Rucha Deshpande, 2023-03-27; assignee: Diego Molteni)

The structure of the parameters of the following two methods should be abstracted:

```ts
createQuery(namespace: string, kind: string): IJournalQueryModel;
createKey(specs: any): object;
```

AWS wants to be able to pass information of type 'any', such as:

```ts
{
    table_name: ...,
    tenant_name: ...,
    subproject_name: ...,
    // etc.
}
```

This is required for AWS, as we are restricted to parsing and using the 'namespace' and 'kind', which does not work in all scenarios for the models we have.
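One way the abstraction requested above could look, sketched in Python for illustration (the service itself is TypeScript, and the field names and join scheme below are examples, not the actual model):

```python
def create_key(specs: dict) -> str:
    """Build a storage key from whatever fields the provider supplies.

    Instead of hard-coding (namespace, kind), accept an open dictionary so a
    provider such as AWS can pass table_name / tenant_name / subproject_name.
    Unknown fields are simply ignored; the provider decides what it needs.
    """
    order = ("table_name", "tenant_name", "subproject_name", "dataset_name")
    parts = [str(specs[field]) for field in order if field in specs]
    if not parts:
        raise ValueError("key spec must contain at least one known field")
    return "/".join(parts)
```

The point of the sketch is that each cloud provider interprets the open spec its own way, rather than all providers being forced through the namespace/kind pair.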
---

**[e2e tests: setup step must create the subproject](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-sdutil/-/issues/1)** (Rucha Deshpande, 2023-03-30; assignees: Diego Molteni, Yunhua Koglin)

The e2e tests assume that a subproject exists. Just as some files are uploaded in the 'setup' step, the subproject must also be created as part of the setup step here: https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-sdutil/-/blob/master/test/e2e/conftest.py

---

**[Handle existing VDSFile better when calling create()](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/74)** (Morten Ofstad, 2023-08-24)

We should report an error if you try to create a VDSFile which already exists. An 'overwriteExisting' option can be added to VDSFileOpenOptions in order to override this. Currently we just start writing objects inside the existing file, which only works if the layout is identical.
---

**[Code linting IBM](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/26)** (Diego Molteni, 2023-03-27; assignee: Walter D)

Code linting to apply on the IBM code:

```sh
$ tslint -c tslint.json 'src/cloud/providers/ibm/**/*.ts'
```

---

**[AWS seismic store service CI/CD testing yaml needs to be in CI/CD repo](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/27)** (Daniel Perez, 2023-03-27; assignee: Rucha Deshpande)

The GitLab yaml for AWS testing has been included in the seismic store service (https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/tree/master/devops/aws); this file needs to be in the CI/CD repo https://community.opengroup.org/osdu/platform/ci-cd-pipelines/-/tree/master/. Please also follow the standard and integrate it in the same yaml inside cloud-providers, as we do for the other providers: https://community.opengroup.org/osdu/platform/ci-cd-pipelines/-/tree/master/cloud-providers
---

**[.NET bindings](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-zgy/-/issues/18)** (Robert Schmidt, 2021-06-24)

We ([Cegal](https://cegal.com/)) have a branch with basic C and .NET bindings to the (new) OpenZGY API:

- netstandard2.0, with xunit tests and a console app for net5.0
- Capable of building Debug and Release nuget packages
- Console app can dump ZGY metadata, and copy and compare ZGY content
- Windows only for now (I would need assistance for Linux support)
- Seismic Store support - sdapi binaries and headers are added to the branch
- 2 source files are added to OpenZGY.vcxproj to provide the C bindings; no changes to existing code
- Bindings expose a simplified API for now (e.g. no compression or const support)

Would this be of interest for this repo? Until more complete, it should probably live as a separate branch.
---

**[[ADR] Domain API](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/32)** (Sacha Brants, 2023-10-23; assignee: Diego Molteni)

# Introduction

In order to natively support seismic datasets as defined by the OSDU authority, and to avoid duplicating (and potentially diverging) logic in applications to convert seismic data from one schema version to another, the Seismic DMS should provide APIs that support and manage seismic datasets by validating their schema model and returning them at the latest version of the schema.
## Status
- [x] Proposed
- [x] Trialing
- [x] Under review
- [x] Approved
- [ ] Retired
## Context & Scope
### (1) OSDU SCHEMAS ORGANIZATION
In the [OSDU schemas organization](https://community.opengroup.org/osdu/data/data-definitions/-/tree/master/E-R), schemas are organized into different categories. A "dataset" schema provides a piece of bulk data along with its logical representation, while seismic records in the other categories are required to be linked to an existing (pre-ingested) dataset.
![image](/uploads/4f34b082ffc589ca6cc0a549994c1f0a/image.png)
SDMS will provide a set of domain-specific APIs to support these schema formats:
- FileCollection Datasets:
- [FileCollection.Generic.1.0.0](https://community.opengroup.org/osdu/data/data-definitions/-/blob/master/E-R/dataset/FileCollection.Generic.1.0.0.md)
- [FileCollection.SEGY.1.0.0](https://community.opengroup.org/osdu/data/data-definitions/-/blob/master/E-R/dataset/FileCollection.SEGY.1.0.0.md)
- [FileCollection.Slb.OpenZGY.1.0.0](https://community.opengroup.org/osdu/data/data-definitions/-/blob/master/E-R/dataset/FileCollection.Slb.OpenZGY.1.0.0.md)
- [FileCollection.Bluware.OpenVDS.1.0.0](https://community.opengroup.org/osdu/data/data-definitions/-/blob/master/E-R/dataset/FileCollection.Bluware.OpenVDS.1.0.0.md)
- Work Product Components:
- [SeismicTraceData.1.1.0](https://community.opengroup.org/osdu/data/data-definitions/-/blob/master/E-R/work-product-component/SeismicTraceData.1.1.0.md)
- [SeismicBinGrid.1.0.0](https://community.opengroup.org/osdu/data/data-definitions/-/blob/master/E-R/work-product-component/SeismicBinGrid.1.0.0.md)
- [SeismicLineGeometry.1.0.0.md](https://community.opengroup.org/osdu/data/data-definitions/-/blob/master/E-R/work-product-component/SeismicLineGeometry.1.0.0.md)
- Master Data:
- [SeismicAcquisitionSurvey.1.2.0.md](https://community.opengroup.org/osdu/data/data-definitions/-/blob/master/E-R/master-data/SeismicAcquisitionSurvey.1.2.0.md)
- [SeismicProcessingProject.1.1.0.md](https://community.opengroup.org/osdu/data/data-definitions/-/blob/master/E-R/master-data/SeismicProcessingProject.1.1.0.md)
References and Naming Convention:
- SCHEMA: The seismic schema model, for example, SeismicTraceData / SeismicBinGrid / FileCollection.SEGY / ...
- SCHEMA VERSION: The schema model versions, for example, SeismicTraceData 1.0.0 / SeismicTraceData 1.6.2 /...
- RECORD: The seismic object schema recorded in the DE Storage Service
- RECORD-ID: the unique record ID, for example, ABC1234
- RECORD-VERSION: The record versions, for example, ABC1234 V1 / ABC1234 V2 /...
### (2) SDMS DOMAIN SPECIFIC APIs
SDMS will provide domain-specific APIs to handle the ingestion, schema validation, and underlying bulk management for seismic dataset and component SCHEMAs as defined by the OSDU authority.
For each supported SCHEMA, we will document the model with examples and provide APIs to manage both RECORDs and their VERSIONs.
![image](/uploads/b9b6d2e5f2371c680bf4ffebc7f741cb/image.png)
- An endpoint to ingest the seismic dataset:
  - When an object is ingested using this endpoint, a new RECORD will be created if no RECORD-ID is specified in the request model. A new RECORD-ID and RECORD-VERSION will be generated and returned. In addition, for the FileCollection dataset schemas only, a storage resource will be created to host the bulk.
  - When an object is ingested using this endpoint, the RECORD will be updated if a RECORD-ID is specified in the request model. A new RECORD-VERSION will be generated and returned.
- An endpoint to list all datasets of a specific kind:
  - This endpoint will support paginated queries.
- An endpoint to retrieve the latest version of a RECORD-ID
- An endpoint to retrieve a specific version of a RECORD-ID
- An endpoint to retrieve all versions of a RECORD-ID
- An endpoint to delete a RECORD with all associated versions:
  - This endpoint will perform a hard delete by removing all RECORD-VERSIONs and the associated bulk.
- ~~An endpoint to reindex datasets ingested with the V3 version of SDMS into V4~~
The SDMS service will support the highest Patch.Minor version of each Major version, with automatic conversion between versions. For example, if the client calls the v1 endpoint, which supports schema version 1.1.0, to request a record that was ingested with schema version 1.0.0, SDMS will automatically convert the requested record from the ingested version 1.0.0 to the supported 1.1.0. In addition, we will support conversion between Major versions where conversion rules have been correctly specified (otherwise an error will be thrown).
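The automatic up-conversion described above could be organized as a registry of per-version converters. The sketch below is illustrative Python, and the 1.0.0 → 1.1.0 rule shown (adding a defaulted field) is an invented example, not an actual SDMS conversion rule:

```python
# Registry of converters keyed by (from_version, to_version).
CONVERTERS = {}

def converter(src, dst):
    """Decorator that registers a conversion rule between two schema versions."""
    def register(fn):
        CONVERTERS[(src, dst)] = fn
        return fn
    return register

@converter("1.0.0", "1.1.0")
def _upgrade_1_0_0(record):
    # Illustrative rule only: pretend 1.1.0 adds an optional field.
    upgraded = dict(record, schema_version="1.1.0")
    upgraded.setdefault("NewOptionalField", None)
    return upgraded

def convert(record, target):
    """Apply registered converters until the record reaches the target version.

    If no rule exists for the required step, raise - mirroring the ADR's
    "an error will be thrown" behavior for unspecified conversions.
    """
    while record["schema_version"] != target:
        key = (record["schema_version"], target)
        if key not in CONVERTERS:
            raise ValueError(f"no conversion rule from {key[0]} to {key[1]}")
        record = CONVERTERS[key](record)
    return record
```

A record ingested at 1.0.0 is then served at 1.1.0 transparently, while requesting an unreachable version fails loudly instead of returning a mismatched schema.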
For each SCHEMA VERSION, the schema model will be documented (in the shared swagger) and examples will also be provided:
**Schema Definition**
![image](/uploads/fdd3145394948c65a09882281d9f91bc/image.png)
**Schema Example**
![image](/uploads/e9a9332bd5699d52ce113a19b3597213/image.png)
### (3) STORAGE ORGANIZATION AND CONNECTION STRINGS
Each time a FileCollection SCHEMA is ingested into SDMS, a new storage container is created in the CSP storage service. The container name is automatically generated by SDMS by hashing the dataset name information specified in the request schema together with the generated ID, to guarantee the uniqueness of the storage resource within each partition. SDMS will provide specific endpoints to generate connection strings for a RECORD-ID and/or RECORD-VERSION, letting the caller independently ingest the associated bulk. These storage resources are protected, and the connection strings are released only after the caller has been authorized by the service via the shared Entitlement Service (ACL check).
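The container-naming scheme described above could be sketched as follows. The hash algorithm, prefix, and truncation length here are assumptions for illustration, not the actual SDMS implementation:

```python
import hashlib

def container_name(dataset_name: str, record_id: str) -> str:
    """Derive a deterministic, per-partition-unique container name by hashing
    the dataset name together with the generated record id."""
    digest = hashlib.sha256(f"{dataset_name}:{record_id}".encode()).hexdigest()
    # Truncate so the name stays within typical container-name length limits.
    return "sdms-" + digest[:24]
```

Hashing both inputs makes the name deterministic for a given record while avoiding collisions between datasets that share a display name.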
These are the endpoints SDMS will expose for generating upload or download connection strings:
![image](/uploads/f63a7470d79b673e06a58c401471bb0e/image.png)
### (4) DATASET UPLOAD EXAMPLE WORKFLOW
**Dataset Registration**
![image](/uploads/e4dd914cbad85087991611aa6a4b651c/image.png)
**Dataset Ingestion**
![image](/uploads/6ff4caa1beb7699a908694370d0f6ee4/image.png)
### (5) DATASET DOWNLOAD EXAMPLE WORKFLOW
**Dataset Registration**
![image](/uploads/e4dd914cbad85087991611aa6a4b651c/image.png)
**Dataset Consumption**
![image](/uploads/8cf6d9f3d2ffeb87376202c03dc21090/image.png)
### (6) Implementation
Check the [Merge Requests](#related-merge-requests) associated with this issue and the [OpenAPI](app/sdms-v4/docs/openapi.yaml) definition for details.
## Decision
## Rationale
## Consequences
## When to revisit

---

**[AWS upload/download implementation does not respect the applied manifest](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-sdutil/-/issues/8)** (Diego Molteni, 2023-03-30; assignee: Yunhua Koglin)

The upload/download storage methods in the AWS implementation store the object as path/dataset-name, without respecting the applied "GENERIC" manifest, which expects blocks to be numbered from 0 to N. sdutil uploads the object as a single block, so the block must be named "0" or consumer applications won't be able to read it back (according to the generic manifest).

Change `object_name = f"{s3_folder_name}/{dataset.name}"` to `object_name = f"{s3_folder_name}/0"` in both the upload and download methods.
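A small sketch of the manifest-compliant naming the fix above implies (the helper is hypothetical; `s3_folder_name` mirrors the variable quoted in the issue):

```python
def block_object_names(s3_folder_name: str, num_blocks: int):
    """Name uploaded blocks "0" .. "N-1" under the dataset folder, as the
    GENERIC manifest expects, instead of using the dataset name."""
    return [f"{s3_folder_name}/{i}" for i in range(num_blocks)]
```

With the single-block upload sdutil performs today, this reduces to exactly the `f"{s3_folder_name}/0"` name the issue asks for.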
---

**[Seismic DDMS - Exxonmobil perspective - Domain API and other details](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/home/-/issues/13)** (Debasis Chatterjee, 2021-09-27)

Input from Brad Yates (Exxonmobil), from a discussion involving EA, operators, and vendors on 19-Aug-2021:

- An early presentation indicated that there will indeed be one common overlying API, irrespective of the choice of VDS or ZGY. That would help simple web applications, and vendors may then be able to create applications faster by using the common API.
- Such a common API should be the lowest common denominator, as some features of OpenVDS may not be supported by OpenZGY.
- Desktop applications may still need to use the native OpenZGY or OpenVDS libraries for read/write, for performance reasons and also in a suitable language (e.g. C++, Python).
- @doniger suggested that we wait to receive a clear picture of the future Seismic DDMS (as per the recommended architecture principles), and then re-evaluate gaps, if any.

cc: @vishal for information