Seismic issues (https://community.opengroup.org/groups/osdu/platform/domain-data-mgmt-services/seismic/-/issues)

[File metadata service](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/36) (Sacha Brants, 2021-09-29)
Placeholder for the File metadata service ADR.
Assignee: Duo Chen

[Fix deployment to allow access to https://osdu-glab.msft-osdu-test.org/seistore-svc/api/v4/swagger-ui.html](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/90) (Sacha Brants, 2023-06-15)

[For Tenant there is no endpoint that can be used to list all the available tenants](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/58) (Kamlesh Todai, 2023-03-24)
There should be a way to list all the tenants to which the user has access. At present, there is no way to do that. If one had created a tenant in the past and cannot remember its name, there is no way to find that name.
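A hypothetical shape for such an endpoint, sketched in Python. The `/tenant` route, the response fields, and the helper names below are assumptions for illustration only; no such endpoint exists today, which is exactly what this issue requests.

```python
# Sketch of a hypothetical "list tenants" call for Seismic DMS.
# Route name and response shape are assumptions, not the actual SDMS API.

def list_tenants_url(sdms_base: str) -> str:
    """Build the URL for a hypothetical GET /tenant endpoint."""
    return f"{sdms_base.rstrip('/')}/tenant"

def accessible_tenants(response_json: dict) -> list[str]:
    """Extract tenant names from a hypothetical response payload."""
    return sorted(t["name"] for t in response_json.get("tenants", []))

# Example with a mocked response payload:
sample = {"tenants": [{"name": "acme"}, {"name": "osdu-demo"}]}
print(list_tenants_url("https://host/seistore-svc/api/v3/"))  # https://host/seistore-svc/api/v3/tenant
print(accessible_tenants(sample))  # ['acme', 'osdu-demo']
```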
[\[GCP\] Can't download data from Seismic Store if it consists of more than one file](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-sdutil/-/issues/12) (Yan Sushchynski (EPAM), 2023-03-30)
After uploading an oVDS dataset to Seismic Store with the SEGY-to-oVDS converter, I want to download the result to my local machine.
But I got this error
![image](/uploads/ebb3058d89b9dd27ddf937bd259b8125/image.png)
As I understand, `sdutil` uses `gcsurl` of the dataset and attempts to download it directly from the bucket, but it can't do this if the dataset consists of more than one file.
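A client-side sketch of how multi-file datasets could be handled, assuming the `gcsurl` has the usual `gs://bucket/prefix` form and using the `google-cloud-storage` client. The helper names are illustrative, not the actual `sdutil` code.

```python
# Sketch: download every object under a dataset prefix instead of a single
# object. Helper names are illustrative; this is not the actual sdutil code.

def split_gcsurl(gcsurl: str) -> tuple[str, str]:
    """Split 'gs://bucket/path/to/dataset' into (bucket, prefix)."""
    if not gcsurl.startswith("gs://"):
        raise ValueError(f"not a gs:// URL: {gcsurl}")
    bucket, _, prefix = gcsurl[len("gs://"):].partition("/")
    return bucket, prefix

def download_dataset(gcsurl: str, dest_dir: str) -> None:
    """Download all objects sharing the dataset prefix (multi-file oVDS)."""
    from pathlib import Path
    from google.cloud import storage  # assumed available

    bucket_name, prefix = split_gcsurl(gcsurl)
    client = storage.Client()
    for blob in client.list_blobs(bucket_name, prefix=prefix):
        target = Path(dest_dir) / blob.name[len(prefix):].lstrip("/")
        target.parent.mkdir(parents=True, exist_ok=True)
        blob.download_to_filename(str(target))

print(split_gcsurl("gs://my-bucket/tenant/subproject/dataset"))  # ('my-bucket', 'tenant/subproject/dataset')
```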
This is how the oVDS dataset looks in the bucket
![image](/uploads/361a48c79afa88e2542dd269d4cb94b8/image.png)

[\[GCP\] Docker Image loading time is very long](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/66) (Maksimelyan Tamashevich (EPAM), 2023-03-24)
![image](/uploads/014b51a301ac9672301c884b1f9414da/image.png)
As you can see, the image loading time is more than 15 minutes. This problem is temporary; it was noticed several times in the evening (UTC+2).
As a result, our deployment job fails because the default Helm timeout of 5 minutes is exceeded.

[\[GCP\] Seismic store doesn't use Partition Service to get a GCP project-id of Google Cloud Project](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/42) (Yan Sushchynski (EPAM), 2023-03-27)
The main problems are the following:
- There is no sign that SSDMS uses the Partition Service at all; it accepts requests with no `data-partition-id` header.
- When we create an SSDMS tenant, we have to specify `gcpid`, the project where data will be stored if we use this tenant in our `sd-path`.

This causes two problems:
- users have to know the actual `gcpid`
- users can specify a `gcpid` that doesn't correspond to the `data-partition-id`
Example of a create-tenant request:
```json
{
"gcpid": "{{gcp_project_id}}",
"esd": "{{data-partition-id}}.osdu-gcp.go3-nrg.projects.epam.com",
"default_acl": "data.default.owners@{{data-partition-id}}.osdu-gcp.go3-nrg.projects.epam.com"
}
```
The solution is to use the Partition Service to get the GCP project-id, so users don't need to specify `gcpid` manually and the correct project-id is always chosen.
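A minimal sketch of the proposed lookup, assuming the Partition Service returns partition properties as a JSON map and that the GCP project id is stored under a `projectId` property (the property name and response shape are assumptions for illustration):

```python
# Sketch: resolve the GCP project id from Partition Service properties
# instead of a user-supplied gcpid. The "projectId" property name and the
# response shape are assumptions, not a documented contract.

def resolve_project_id(partition_properties: dict) -> str:
    """Pick the GCP project id out of a partition's property map."""
    try:
        return partition_properties["projectId"]["value"]
    except KeyError as exc:
        raise KeyError("partition has no projectId property") from exc

# Mocked Partition Service response for a data partition:
props = {"projectId": {"sensitive": False, "value": "my-gcp-project"}}
print(resolve_project_id(props))  # my-gcp-project
```

With a lookup like this keyed on the `data-partition-id` header, users no longer have to know (or can no longer mis-specify) the `gcpid` for a tenant.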
cc:
@Kateryna_Kurach @Siarhei_Khaletski
Milestone: M13 - Release 0.16

[GCP specific naming conventions](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/4) (Rucha Deshpande, 2023-03-27)
There are many GCP-specific names used in the models,
such as `gcpid`, `gcp_bucket`, etc.
There is also an API called `/api/v3/utility/gcs-access-token`.
The code should be revisited to remove any CSP-specific naming.
Assignees: Diego Molteni, Rucha Deshpande

[Getting histogram/statistics from OpenVDS data](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/191) (Qiang Fu, 2023-06-22)
Does OpenVDS data have embedded metadata for histograms/statistics? Going through all bricks to calculate the histogram or statistics could be quite expensive.

[golang binding](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-zgy/-/issues/1) (Jackie Li, 2024-03-16)
Hi,
I'm Jackie from Target Energy Solutions. We have a golang binding for the ZGY library. Do you mind us contributing to this repo?
Best Regards

[Handle existing VDSFile better when calling create()](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/74) (Morten Ofstad, 2023-08-24)
We should report an error if you try to create a VDSFile which already exists. An `overwriteExisting` option can be added to `VDSFileOpenOptions` in order to override this. Currently we will just start writing objects inside the existing file, which only works if the layout is identical.

[IBM E2E tests fail](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/130) (Daniel Perez, 2024-03-19)
E2E tests for IBM in SDMS V3 are failing with "no healthy upstream"; this seems to be an issue with the environment itself.
Assignees: Anuj Gupta, Isha Kumari

[\[IBM\] replace keycloak-admin with @keycloak/keycloak-admin-client](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/123) (Diego Molteni, 2024-03-05)
Please replace the deprecated and vulnerable package [keycloak-admin](https://www.npmjs.com/package/keycloak-admin) with the new [@keycloak/keycloak-admin-client](https://www.npmjs.com/package/@keycloak/keycloak-admin-client).
Milestone: M23 - Release 0.26. Assignee: Isha Kumari

[IBM service not responding correctly on Seismic DMS Service dataset list endpoint](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/22) (Rashaad Gray, 2023-03-30)
`POST /dataset/tenant/{tenantid}/subproject/{subprojectid}`
Get the list of datasets in a subproject.
The endpoint is not responding correctly, causing an error in the e2e tests; the specific test will be skipped for now.

[Implement dataset storage for AWS](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/89) (Sacha Brants, 2023-09-27)

[Implement dataset storage for GCP](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/113) (Mark Yan, 2023-09-20)

[Implement dataset storage for IBM](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/114) (Mark Yan, 2023-09-20)

[Implementing DDMSDatasets\[\] standardize content data](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/97) (Chad Leong, 2023-03-30)
DDMS references to optimized content were found to be created ad hoc and outside the work-product-component schemas.
Following the [original observation](https://gitlab.opengroup.org/osdu/subcommittees/ea/docs/-/issues/7), an [ADR was created](https://gitlab.opengroup.org/osdu/subcommittees/ea/docs/-/issues/10) which standardizes the optimized content references from work-product-component entity types. Over time, DDMSs are expected to implement optimized content references using the `data.DDMSDatasets[]` property and support migration.

[Implement resumable file transfer (upload and download)](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-sdutil/-/issues/32) (Sacha Brants, 2024-01-11)
Given the size of data in Seismic DMS, users want to be able to resume a file transfer (upload/download). It should be ensured that there are no integrity issues.
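A minimal sketch of what resumable download with an integrity check could look like. A bytes buffer stands in for the remote object (e.g. one fetched through ranged reads), and all names are illustrative rather than the actual sdutil implementation:

```python
# Sketch: resumable download with an integrity check. A bytes buffer stands
# in for the remote object; names are illustrative, not actual sdutil code.
import hashlib
import tempfile
from pathlib import Path

def resume_download(remote: bytes, local: Path, chunk: int = 4) -> None:
    """Append only the bytes the local partial file is still missing."""
    offset = local.stat().st_size if local.exists() else 0
    with local.open("ab") as out:
        for pos in range(offset, len(remote), chunk):
            out.write(remote[pos:pos + chunk])

def verify(remote: bytes, local: Path) -> bool:
    """Integrity check: compare checksums once the transfer completes."""
    return (hashlib.sha256(remote).hexdigest()
            == hashlib.sha256(local.read_bytes()).hexdigest())

data = b"seismic-trace-bytes"
with tempfile.TemporaryDirectory() as tmp:
    target = Path(tmp) / "dataset.bin"
    target.write_bytes(data[:7])   # simulate an interrupted transfer
    resume_download(data, target)  # resumes at byte 7
    print(verify(data, target))    # True
```

The checksum comparison at the end addresses the integrity requirement: a resumed transfer is only accepted if the reassembled file matches the source.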
[Implications associated with the Python pickle module](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-zgy/-/issues/25) (Jayesh Bagul, 2022-12-30)
I am working on a vulnerability issue of the Pickle module in open-zgy.
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-zgy/-/security/vulnerabilities/18266
The pickle library's documentation discourages unpickling untrusted data. Currently, deserialization happens with a naive approach.
There are multiple approaches to prevent unsafe deserialization:
1) Implementing a message authentication code (MAC) to ensure the data integrity of the payload. (hmac and hashlib)
2) Run the deserialization code with limited access permissions.
3) Validate Inputs.
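Approach 1 can be sketched as follows; the key handling is simplified and all names are illustrative, not the open-zgy code:

```python
# Sketch of approach 1: sign pickled payloads with an HMAC so tampered
# data is rejected before unpickling. Key management is out of scope here.
import hashlib
import hmac
import pickle

SECRET_KEY = b"replace-with-a-real-secret"  # assumption: provisioned securely

def dumps_signed(obj) -> bytes:
    """Serialize obj and prepend a SHA-256 HMAC over the pickle bytes."""
    payload = pickle.dumps(obj)
    tag = hmac.new(SECRET_KEY, payload, hashlib.sha256).digest()
    return tag + payload

def loads_signed(blob: bytes):
    """Verify the HMAC in constant time before touching pickle.loads."""
    tag, payload = blob[:32], blob[32:]
    expected = hmac.new(SECRET_KEY, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("payload failed integrity check; refusing to unpickle")
    return pickle.loads(payload)

blob = dumps_signed({"ilines": 256, "xlines": 512})
print(loads_signed(blob))  # {'ilines': 256, 'xlines': 512}
```

Note that this only helps when the pickling side is trusted and shares the key; it does not make unpickling arbitrary third-party data safe, which is why approaches 2 and 3 may still be needed alongside it.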
I would like to hear which approach would suit best, as well as which options remain compatible with everything existing.
CC: @Srinivasan_Narayanan @nursheikh @chad
Assignee: Jayesh Bagul

[Include aws region in dataset information for AWS Seismic DDMS data](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/99) (Michael, 2024-02-26)
When using sdapi to retrieve Seismic DDMS data coming from AWS, a user first needs to set the AWS_REGION environment variable (see ticket https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/21).
To better handle this use case, the get-dataset service `/dataset/tenant/{tenantid}/subproject/{subproject}/dataset/{datasetid}` should provide information about the AWS region when the dataset is stored in S3.
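A client-side sketch of how such information could be consumed, assuming the dataset response gained a hypothetical `region` field (the field name and the helper are assumptions, not the current sdapi behavior):

```python
# Sketch: choose the S3 region from dataset metadata when present, falling
# back to the AWS_REGION environment variable. The "region" field is the
# hypothetical addition this issue proposes, not an existing response field.
import os

def resolve_region(dataset_info: dict, default: str = "us-east-1") -> str:
    """Prefer per-dataset region metadata over the environment variable."""
    region = dataset_info.get("region")  # hypothetical new field
    if region:
        return region
    return os.environ.get("AWS_REGION", default)  # current workaround

print(resolve_region({"region": "eu-west-1"}))  # eu-west-1
```

With the region carried in the dataset response, the client no longer depends on the user exporting AWS_REGION correctly before every transfer.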