seismic-dms-service issues
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues
Updated: 2023-03-27

Issue #1: Delete dataset API does not delete COS (Blob Storage) object
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/1
Author: Walter D · Updated: 2023-03-27

The delete dataset API of seismic-store-service calls the Storage Service POST delete-record API, which deletes the COS (Blob Storage) object belonging to the dataset. However, the COS object is still available even though the response is 204 No Content. We realize that the Storage Service POST delete performs only a soft delete. We want to confirm whether this is the expected behavior.
Participants: ethiraj krishnamanaidu

Issue #3: Documentation of latest code changes
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/3
Author: Walter D · Updated: 2023-03-28

Hi Team.
Hope everyone is safe and doing well.
The latest code check-in has a lot of changes, from the addition of files (cloud.ts, trace.ts) to the deletion of files (iam.ts), along with several updated files.
The folder structure has also changed:
1. /config and /swaggerdocs are no longer present.
2. config.ts has moved into /cloud.
Is it possible to provide us with a document containing the important changes that result in changes to the CSP implementations? Thank you.
Participants: Diego Molteni

Issue #4: GCP specific naming conventions
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/4
Author: Rucha Deshpande · Updated: 2023-03-27
There are many GCP-specific names used in the models, such as gcpid, gcp_bucket, etc.
There is also an API called /api/v3/utility/gcs-access-token.
The code should be revisited to remove any CSP-specific naming.
Participants: Rucha Deshpande, Diego Molteni

Issue #10: E2E test issue
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/10
Author: Rucha Deshpande · Updated: 2021-02-23

DATASET LIST AFTER DELETE
expects the returned list to be of length 6. The collection posts only datasets and deletes one; in this case, the expected value must be 2.
Participants: Rucha Deshpande

Issue #11: e2e tests: imptoken collection should be optional for CSPs that do not have an impersonation token implemented
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/11
Author: Rucha Deshpande · Updated: 2021-02-23

The imptoken collection should be optional for CSPs that have not implemented an impersonation token.
Participants: Rucha Deshpande, Diego Molteni

Issue #12: e2e test collection: clean up of collection using gcs urls
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/12
Author: Rucha Deshpande · Updated: 2021-02-23

There are some requests in the collection that use a 'googleapis' URL for testing.
These need to be cleaned up or changed to environment variables.
Participants: Rucha Deshpande, Diego Molteni

Issue #14: Validating REGION against hardcoded region names instead of dynamic provider-specific regions
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/14
Author: Brady Spiva [AWS] · Updated: 2021-02-23

## Observed behavior
When attempting to create subprojects, the `storage_location` parameter is restricted to a hardcoded list of values (GCP values?).
These values are hardcoded here, in `services/subproject/parser.ts`:
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/blob/master/src/services/subproject/parser.ts#L25-31
When attempting to pass an AWS-specific region, this is the error code received:
> [400] [seismic-store-service] The storage_location body field US-EAST-1 is not valid. It must be one of ASIA, EU, US, NORTHAMERICA-NORTHEAST1, US-CENTRAL1, US-EAST1, US-EAST4, US-WEST1, SOUTHAMERICA-EAST1, EUROPE-WEST1, EUROPE-WEST2, EUROPE-WEST3, EUROPE-WEST4, ASIA-EAST1, ASIA-NORTHEAST1, ASIA-SOUTH1, ASIA-SOUTHEAST1, AUSTRALIA-SOUTHEAST1
## Expected behavior
Subproject creation should validate the user-provided region against the cloud provider's regions dynamically.
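One way to sketch this (hypothetical code; the class and function names below are invented, not the service's actual API) is to hide the `storage_location` check behind a per-provider validator, so `parser.ts` only calls an abstraction:

```typescript
// Hypothetical sketch (invented names): hide the storage_location check
// behind a per-provider validator instead of a hardcoded GCP region list.
interface RegionValidator {
  isValidStorageLocation(location: string): boolean;
}

class GcpRegionValidator implements RegionValidator {
  // Subset of the multi-regions/regions from the current hardcoded list.
  private static readonly REGIONS = new Set([
    'US', 'EU', 'ASIA', 'US-CENTRAL1', 'US-EAST1', 'EUROPE-WEST1',
  ]);
  isValidStorageLocation(location: string): boolean {
    return GcpRegionValidator.REGIONS.has(location.toUpperCase());
  }
}

class AwsRegionValidator implements RegionValidator {
  // AWS regions follow a lowercase <partition>-<direction>-<n> pattern, e.g. us-east-1.
  isValidStorageLocation(location: string): boolean {
    return /^[a-z]{2}(-[a-z]+)+-\d$/.test(location.toLowerCase());
  }
}

const validators: Record<string, RegionValidator> = {
  google: new GcpRegionValidator(),
  aws: new AwsRegionValidator(),
};

function validateStorageLocation(provider: string, location: string): boolean {
  const validator = validators[provider];
  if (!validator) {
    throw new Error(`unknown cloud provider: ${provider}`);
  }
  return validator.isValidStorageLocation(location);
}
```

With such an abstraction, `US-EAST-1` would validate when the deployment's provider is AWS, without the core parser knowing any region names.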
## Potential solutions
This might already be addressed in @DiegoMolteni's upcoming updates to make the Seismic service more abstracted and generic. If not, can we work to abstract `parser.ts` to have cloud-provider-specific implementations?
Participants: Rucha Deshpande, Diego Molteni, Brady Spiva [AWS], Yunhua Koglin

Issue #15: 502 Gateway Error on calling Redis get and set functions
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/15
Author: Walter D · Updated: 2021-02-23

Hi @DiegoMolteni
We have been getting a 502 Gateway Error for some APIs; one of them is the Create Subproject API. On debugging the code, the error is thrown on the following 2 lines in the compliance.ts file in the create-subproject flow:
1. await this._cache.set(ltag, results.invalidLegalTags.length === 0);
2. await this._cache.set(ltag, results.invalidLegalTags.length === 0);
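Not a root-cause fix, but if the 502 stems from Redis connectivity, one defensive pattern worth considering is to guard the cache write so a cache outage degrades gracefully (a hypothetical sketch; `Cache` below is an assumed minimal interface, not the service's actual cache class):

```typescript
// Hypothetical sketch: tolerate a failing cache write instead of letting the
// error propagate and surface as a 502 at the gateway.
interface Cache {
  set(key: string, value: boolean): Promise<void>;
}

async function safeCacheSet(cache: Cache, key: string, value: boolean): Promise<boolean> {
  try {
    await cache.set(key, value);
    return true;
  } catch (err) {
    // Log and continue; the cached legal-tag validity can be recomputed later.
    console.warn(`cache set failed for key ${key}:`, err);
    return false;
  }
}
```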
Interestingly, this is happening in the DEV environment; I have not encountered the issue locally. Have you faced this error, or do you have a clue what the problem could be? Any help would be appreciated. Thank you.
Milestone: M1 - Release 0.1 · Participants: Diego Molteni, Daniel Perez · 2021-02-26

Issue #16: Standardizing the OAUTH2.0 JWT payload
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/16
Author: Brady Spiva [AWS] · Updated: 2021-03-04

## Observed behavior
The service assumes an "email" attribute is present in the JWT payload and attempts to get it. See [/src/shared/utils.ts](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/blob/master/src/shared/utils.ts#L38). There is no standardized naming convention for this attribute; it could be "email", "username", etc.
When using sdutil to upload data with the `--idtoken=` parameter, the service silently handles an error and returns a cryptic response: `'created_by'`
## Expected behavior
The service should send a request to the OAUTH2.0 `/userInfo/` endpoint to determine what this custom email attribute is, and then get it from the JWT dynamically to avoid naming conflicts between cloud providers and identity-provider services.
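As an interim step, the identifier could be resolved by trying several common claim names rather than assuming "email" (a sketch; the candidate list is an assumption, and a full fix would still consult the `/userInfo/` endpoint):

```typescript
// Hypothetical sketch: resolve a user identifier from a decoded JWT payload
// by trying common claim names instead of assuming "email" is present.
type JwtPayload = Record<string, unknown>;

// Assumed candidate order; real deployments would configure this per IdP.
const USER_CLAIM_CANDIDATES = ['email', 'username', 'upn', 'preferred_username', 'sub'];

function resolveUserId(payload: JwtPayload): string {
  for (const claim of USER_CLAIM_CANDIDATES) {
    const value = payload[claim];
    if (typeof value === 'string' && value.length > 0) {
      return value;
    }
  }
  throw new Error('no user identifier claim found in JWT payload');
}
```

Failing loudly, as above, would also replace the cryptic `'created_by'` response with a diagnosable error.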
## Potential solutions
Here is an existing implementation that uses this flow to first discover the custom email attribute and then get it from the JWT payload: https://community.opengroup.org/osdu/platform/system/lib/cloud/aws/os-core-lib-aws/-/blob/master/src/main/java/org/opengroup/osdu/core/aws/entitlements/Authorizer.java#L121

Issue #18: Dataset with seismic metadata fails due to updates in R3 data definitions in Storage Service
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/18
Author: Rucha Deshpande · Updated: 2023-03-27

Posting a dataset with seismic metadata that is to be stored as a Storage record fails.
The Seismic DMS service needs to be updated to work with the R3 Data Definitions.
See issue: https://community.opengroup.org/osdu/platform/system/storage/-/issues/44
Participants: Rucha Deshpande, Diego Molteni

Issue #19: e2e test: getGCSAccessToken API validates token type
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/19
Author: Rucha Deshpande · Updated: 2021-02-23

The getGCSAccessToken API test script validates the returned token type to be 'Bearer'.
For AWS the returned type could be an 'STS token'.
This validation should be removed.
Participants: Rucha Deshpande, Diego Molteni

Issue #20: e2e test: Subproject Get New - checks returned child length
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/20
Author: Rucha Deshpande · Updated: 2021-02-23

The Subproject Get New test script validates the response child length. It is hardcoded to 8.
The length of the response can vary per CSP.
pm.expect(child.length).to.eql(8);
Participants: Rucha Deshpande, Diego Molteni

Issue #21: e2e test - Dataset exist / Dataset sizes validates a hardcoded list
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/21
Author: Rucha Deshpande · Updated: 2021-03-04

This test validates a hardcoded list. It must be changed to environment variables, or the test script must be updated.
{
  "datasets": [
    "/async/dsx01",
    "/a/b/c/dsx02",
    "async/dsx03",
    "test/dsx01",
    "{{path01}}/{{dataset01}}"
  ]
}
Similarly, the Dataset Sizes test also validates the sizes for a hardcoded list.
Participants: Rucha Deshpande, Diego Molteni

Issue #22: e2e test script needs to run from repository root only
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/22
Author: Rucha Deshpande · Updated: 2023-03-27

The run-e2e-tests.sh script has the following check. This will not work in internal pipelines where the distribution folder structure is different.
if [ ! -f "tsconfig.json" ]; then
    printf "\n%s\n" "[ERROR] The script must be called from the project root directory."
    exit 1
fi
Participants: Rucha Deshpande, Diego Molteni

Issue #23: createQuery and createKey - generalize structure
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/23
Author: Rucha Deshpande · Updated: 2023-03-27

The following 2 methods:
createQuery(namespace: string, kind: string): IJournalQueryModel;
createKey(specs: any): object;
The structure of the parameter should be abstracted. AWS wants to be able to pass information such as:
{
  table_name,
  tenant_name,
  subproject_name,
  ...etc
}
of type 'any'.
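For illustration, a provider-neutral spec object could carry both the Datastore-style and the AWS-style fields (a hypothetical sketch; `KeySpecs` and `DemoJournal` are invented names, not the proposed API):

```typescript
// Hypothetical sketch (invented names): a provider-neutral spec object that
// carries both Datastore-style and AWS-style addressing fields.
interface KeySpecs {
  namespace?: string;        // GCP Datastore style
  kind?: string;
  table_name?: string;       // AWS style
  tenant_name?: string;
  subproject_name?: string;
  [extra: string]: unknown;  // room for other provider-specific fields
}

// A toy implementation used only to illustrate the shape of the contract.
class DemoJournal {
  createKey(specs: KeySpecs): object {
    // Each provider picks the fields it understands and ignores the rest.
    return { partition: specs.table_name ?? specs.namespace, kind: specs.kind };
  }
  createQuery(specs: KeySpecs): object {
    return { from: specs.table_name ?? `${specs.namespace}-${specs.kind}` };
  }
}
```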
This is required for AWS, as we are restricted to parsing and using the 'namespace' and 'kind', which does not work in all scenarios for the models we have.
Participants: Rucha Deshpande, Diego Molteni

Issue #34: While testing the Seismic API - list of endpoints returning incorrect responses
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/34
Author: Kamlesh Todai · Updated: 2023-03-27

Not able to retrieve tenant metadata - get a 403 Forbidden response.
Upon trying to list subprojects in a tenant with an exported authorization token - get a 500 Internal Server Error.
After patching the dataset, not able to retrieve the dataset info.
Upon trying to patch a dataset with an invalid/expired authorization token - get a 404 Not Found response instead of 401 or 403.
Upon trying to validate the ctag of a dataset with an invalid/expired auth token - it validates successfully instead of returning 401 or 403.
Upserting tags to a dataset with an invalid gtag gives a 200 OK response instead of 400 or 404.
Deleting a dataset with an invalid datasetid gives a 200 OK response instead of 400 or 404.
Deleting a dataset with an invalid path gives a 200 OK response instead of 400 or 404.
Retrieving a list of datasets and sub-directories inside a seismic store path with an invalid cursor gives a 200 OK response instead of 400.
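The expected behavior across these cases can be summarized as a precedence of checks (a sketch of the convention the reports imply, not the service's actual code):

```typescript
// Hypothetical sketch: precedence of checks behind the expected status codes.
interface RequestChecks {
  tokenPresent: boolean;      // Authorization header supplied
  tokenValid: boolean;        // not expired, signature OK
  requestWellFormed: boolean; // e.g. cursor parses, body validates
  resourceExists: boolean;    // datasetid / path resolves
}

function expectedStatus(c: RequestChecks): number {
  // Auth problems first (401, or 403 once the identity is known but
  // unauthorized), then malformed input, then missing resources.
  if (!c.tokenPresent || !c.tokenValid) { return 401; }
  if (!c.requestWellFormed) { return 400; }
  if (!c.resourceExists) { return 404; }
  return 200;
}
```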
Attached is a document giving the details of the requests and the invalid responses received:
[SeismicDMS_CollectionNotes.json](/uploads/a5f8e41de5251679b22c277f9a210ec3/SeismicDMS_CollectionNotes.json)
The collection can be found here:
https://community.opengroup.org/osdu/platform/testing/-/blob/master/Postman%20Collection/27_CICD_Setup_SeismicDMSAPI/SeismicDMS%20API%20CI-CD%20v2.0.postman_collection.json
The testing was done primarily on IBM and some on AWS.
@ChrisZhang @sacha @anujgupta @Wibben

Issue #37: Support auth with access_token
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/37
Author: Aleksandr Spivakov (EPAM) · Updated: 2023-03-27

Currently the service supports only id_token for authorization. It would be good to have support for access_token.

Issue #38: Enable forwarding of original request headers to Dataecosystem APIs
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/38
Author: Rucha Deshpande · Updated: 2021-11-18

In the current implementation, wherever Dataecosystem APIs are called, headers are re-created. This does not allow any headers from the original request (for example, 'x-user-id') to be forwarded.
![image](/uploads/d41f01c1648b16625ded5b18f13d9363/image.png)
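A minimal sketch of the append behavior (the helper name is invented; the real change would live in the core request-building code):

```typescript
// Hypothetical sketch: start from the original request headers and append the
// outbound-call headers, so e.g. 'x-user-id' survives the hop.
type HttpHeaders = Record<string, string>;

function buildForwardHeaders(original: HttpHeaders, added: HttpHeaders): HttpHeaders {
  // Later spreads win: explicitly added headers (authorization, content-type)
  // override the originals; everything else is passed through unchanged.
  return { ...original, ...added };
}
```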
Core code has to be modified to append any new headers to the original headers; otherwise, it should re-use the original request headers.
Participants: Rucha Deshpande, Diego Molteni, Greg, Yunhua Koglin

Issue #40: Domain API - provide read/write access to trace data
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/40
Author: Debasis Chatterjee · Updated: 2023-03-27

A neutral domain API to access seismic trace data irrespective of content storage in oZgy or in oVDS.
Consider a suitable protocol, keeping in mind the large volume of data involved.
This will open up an opportunity for interoperability for cross-vendor applications.
cc - @pq for information

Issue #41: Add new version info endpoint
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/41
Author: Siarhei Khaletski (EPAM) · Updated: 2023-06-13

Original ADR: https://community.opengroup.org/osdu/platform/system/lib/core/os-core-common/-/issues/47
Additional info: https://community.opengroup.org/osdu/platform/home/-/issues/36

Issue #42: [GCP] Seismic store doesn't use Partition Service to get a GCP project-id of Google Cloud Project
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/42
Author: Yan Sushchynski (EPAM) · Updated: 2023-03-27

The main problems are the following:
- We see no signs that SSDMS uses the Partition Service at all; it accepts requests with no data-partition-id header.
- When we create an SSDMS tenant, we have to specify `gcpid`, the project where data will be stored if we use this tenant in our `sd-path`.
This causes two problems:
- users have to know the actual `gcpid`
- users can specify a `gcpid` that doesn't correspond to the `data-partition-id`
Example of create tenant request:
```
{
"gcpid": "{{gcp_project_id}}",
"esd": "{{data-partition-id}}.osdu-gcp.go3-nrg.projects.epam.com",
"default_acl": "data.default.owners@{{data-partition-id}}.osdu-gcp.go3-nrg.projects.epam.com"
}
```
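Building on the request above, a sketch of how the project-id could instead be resolved from the Partition Service (the client interface and the 'projectId' property name are assumptions about how the Partition Service would be consumed here):

```typescript
// Hypothetical sketch: resolve the storage project-id from the Partition
// Service via the data-partition-id header, instead of a user-supplied gcpid.
// The client interface and the 'projectId' property name are assumptions.
interface PartitionClient {
  getPartitionProperties(partitionId: string): Promise<Record<string, string>>;
}

async function resolveProjectId(client: PartitionClient, dataPartitionId: string): Promise<string> {
  const props = await client.getPartitionProperties(dataPartitionId);
  const projectId = props['projectId'];
  if (!projectId) {
    throw new Error(`no projectId configured for partition ${dataPartitionId}`);
  }
  return projectId;
}
```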
The solution is to use the Partition Service to get the GCP project-id, so users don't need to specify `gcpid` manually and the GCP project-id is chosen correctly.
cc:
@Kateryna_Kurach @Siarhei_Khaletski
Milestone: M13 - Release 0.16

Issue #46: SegyImport and OpenVDS DAG
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/46
Author: Greg · Milestone: M10 - Release 0.13 · Participants: Chris Zhang · Updated: 2023-03-27

The OpenVDS DAG should allow header parameters to be passed for the conversion, which can override header information in the SEG-Y. The DAG uses SEGYImport, which can accept header parameters (see http://osdu.pages.community.opengroup.org/platform/domain-data-mgmt-services/seismic/open-vds/tools/SEGYImport/README.html); however, there is no mechanism to pass these header fields to the DAG.

Issue #52: Difference in documentation and functionality of utility cp endpoint
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/52
Author: Walter D · Updated: 2023-03-24

The documentation for the Utility CP endpoint states: 'The source and destination dataset must be in the same sub-project.'
However, the endpoint returns a 202 Accepted response even when the source and destination sub-projects are not the same.
Milestone: M11 - Release 0.14 · Participants: Diego Molteni

Issue #53: Provide Domain API to read/write Seismic 2D Navigation data (sourced from multiple formats)
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/53
Author: Debasis Chatterjee · Updated: 2023-03-24

This will be very similar to the approach in the Wellbore DDMS.
Access to Well Log data is provided via an API, although the source data can be LAS, DLIS, LIS, or WITSML.
Similarly, in this case the source data can be in UKOOA, SEG-P1, or IOGP format.
But a uniform set of Domain APIs should allow programmatic access to this information to help applications.
A common use case is an application that wants to display 2D Navigation on a map, with the option to show SP labels at certain zoom levels.
Please also check this issue for Data Definition:
https://gitlab.opengroup.org/osdu/subcommittees/data-def/work-products/schema/-/issues/348
cc - @Keith_Wall (for information)

Issue #57: Utilizing Standard Pipelines
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/57
Author: David Diederich (d.diederich@opengroup.org) · Updated: 2023-03-24

I'd like this project to consider merging your CI pipeline work with the osdu/platform/ci-cd-pipelines project, and utilizing more jobs via includes rather than local CI config.
### Some Reasons to Consider
**Copy/paste code is hard to keep maintained**
Most of your CI logic appears to have started as a copy/paste from the main repository, anyway.
But keeping it local means that developers need to update changes in multiple places, and when they're working on the improvements they don't have your use case in mind.
This included some recent developments to get the dev2 environment going, but it also includes the changes to the FOSSA scanning -- you're still using an older, unmaintained image for the scanning.
And, when I did the changes, I worked test examples for maven and pip, the two supported build systems.
If npm had been there, I would have had it in mind.
**You miss new pipeline developments**
I'm moving pieces of the release management scripts into the pipeline to make more aspects of the tagging process happen automatically from branch creation.
For now, it's only dependency scanning data, but upgrades are planned to do more stages from there.
The GitLab Ultimate scanners check for security vulnerabilities, and the InfoSec team utilizes these results to plan their work.
These scanners aren't running on your project, but they would be if you included the appropriate CI configuration -- or at least, we'd see what needs to be improved for those scanners to function if they don't work out of the box.
**Your improvements aren't available to others**
Any improvements you make to the CI process after you've copied it remains in your local repository.
Others could benefit from having this available in a common location.
Supporting another language gives future OSDU projects more capabilities right at the start.
You'd even get to define the basic processes for these.
### Open to Discussion
I'd like to hear more about how the custom pipelines came to be, and if they are serving a need that can't be generalized.
For steps that are truly custom and unique to your project, it makes sense to have them as local CI config files.
If we do decide to start using more of the standard pipeline logic, I think we'll need to implement it slowly, a piece at a time.
Of course, if you think a big bang MR is better, I'd consider that, too.
Thank you in advance for your thoughts.

Issue #58: For Tenant there is no endpoint that can be used to list all the available tenants
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/58
Author: Kamlesh Todai · Updated: 2023-03-24

There should be a way to list all the tenants to which the user has access. At present, there is no way to do that. If one created a tenant in the past and cannot remember its name, there is no way to find it.

Issue #59: What are some of the variables in tests/e2e/postman_env.json and where does one find the values for them?
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/59
Author: Kamlesh Todai · Updated: 2022-06-16

Looking at the following file:
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/blob/master/app/sdms/tests/e2e/postman_env.json
I am not sure what these variables are or where one gets their values. This is the list of variables that need clarification, along with where to get the values for them:
**SVC_API_KEY**
**STOKEN** - is it id_token, access_token, or refresh_token?
**DE_APP_KEY**
**VCS_Provider**
Milestone: M12 - Release 0.15

Issue #60: PATCH API endpoint on subproject - where can one find the list or guiding document that helps determine what can and cannot be patched?
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/60
Author: Kamlesh Todai · Updated: 2022-07-05

Where can one find the list or guiding document that helps determine what can be patched on a subproject and what cannot? When I try to execute:
curl --location --request PATCH 'https://osdu-cpd-osdu.osdu-og-platform-validati-ba8e38d4c011d627379af1a4280c4e35-0000.sjc03.containers.appdomain.cloud/osdu-seismic/api/v3/subproject/tenant/opendes/subproject/autotest?recursive=false' \
--header 'Content-Type: application/json' \
--header 'data-partition-id: opendes' \
--header 'ltag: opendes-SeismicDMS-Legal-Tag-Test2951201' \
--header 'Authorization: Bearer eyJhbGciOiJSUzI1NiIsInR5c...B2FJ2Oqvw' \
--data-raw '{
"access_policy": "dataset",
"acls": {
"admins": [
"data.sdms.opendes.autotest.783673ca-3095-4900-ab02-fafe9bb5246f.admin@opendes.ibm.com"
],
"viewers": [
"data.sdms.opendes.autotest.783673ca-3095-4900-ab02-fafe9bb5246f.viewer@opendes.ibm.com"
]
}
}'
I get the response 400 Bad Request:
[seismic-store-service] The subproject access policy cannot be patched.

Issue #65: Make cloud interfaces and abstract classes less GCP specific
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/65
Author: Yan Sushchynski (EPAM) · Updated: 2023-03-24

Hello!
While implementing the `Anthos` provider, we ran into trouble creating the concrete `Journal` class: [here](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/blob/feat/Anthos_GCP/app/sdms/src/cloud/providers/anthos/postgresql.ts#L96).
If we understand correctly, `AbstractJournal` and `AbstractJournalTransaction` classes simply reproduce GCP Datastore interfaces. It is ok for GCP implementation, because there is no extra effort needed for implementing concrete journal classes. However, it is hard to implement concrete journal classes for other CSPs. This becomes obvious when we compare the number of lines for GCP and other CSPs particular journal classes (e.g., [GCP](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/blob/feat/Anthos_GCP/app/sdms/src/cloud/providers/google/datastore.ts) and [Azure](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/blob/feat/Anthos_GCP/app/sdms/src/cloud/providers/azure/cosmosdb.ts)).
Also, using Datastore "low-level" logic in the core code makes that code hard to read and debug. E.g., for Anthos we use a PostgreSQL database, and the concrete implementation required a lot of workarounds to fit the Datastore "low-level" methods onto SQL.
For example, there are specific Datastore operators in the abstract class ([here](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/blob/feat/Anthos_GCP/app/sdms/src/cloud/journal.ts#L25)).
I'd suggest refactoring the common code and switching from Datastore methods to more general, high-level logic.
For example, instead of using [this](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/blob/master/app/sdms/src/services/dataset/dao.ts#L46) in the common code
```ts
let query = journalClient.createQuery(
    Config.SEISMIC_STORE_NS + '-' + dataset.tenant + '-' + dataset.subproject, Config.DATASETS_KIND);
query = query.filter('name', dataset.name).filter('path', dataset.path);
const [entities] = await journalClient.runQuery(query);
```
We could use something like this:
```ts
// Just an example
const entity = await journalClient.getEntity(dataset.path, dataset.name);
```
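One possible shape for such a high-level abstraction, as a purely hypothetical sketch (the names `IJournal`, `DatasetKey`, `getEntity`, `saveEntity`, `deleteEntity`, and `InMemoryJournal` below are illustrative, not the actual SDMS API):

```ts
// Hypothetical sketch only: illustrative names, not the actual SDMS API.
interface DatasetKey {
    tenant: string;
    subproject: string;
    path: string;
    name: string;
}

interface IJournal {
    getEntity(key: DatasetKey): Promise<Record<string, unknown> | undefined>;
    saveEntity(key: DatasetKey, data: Record<string, unknown>): Promise<void>;
    deleteEntity(key: DatasetKey): Promise<void>;
}

// A concrete class stays small because it only maps these few high-level
// operations onto its backing store: a parameterized SELECT/INSERT/DELETE for
// PostgreSQL, a point read/upsert for CosmosDB, and so on. Shown here with a
// plain in-memory Map as the simplest possible backend.
class InMemoryJournal implements IJournal {
    private readonly store = new Map<string, Record<string, unknown>>();

    private keyOf(key: DatasetKey): string {
        return [key.tenant, key.subproject, key.path, key.name].join('/');
    }

    public async getEntity(key: DatasetKey): Promise<Record<string, unknown> | undefined> {
        return this.store.get(this.keyOf(key));
    }

    public async saveEntity(key: DatasetKey, data: Record<string, unknown>): Promise<void> {
        this.store.set(this.keyOf(key), data);
    }

    public async deleteEntity(key: DatasetKey): Promise<void> {
        this.store.delete(this.keyOf(key));
    }
}
```

A PostgreSQL-backed class could then implement `getEntity` with a single parameterized SELECT, instead of emulating Datastore query semantics.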
In this case, the concrete CSP implementations could be more concise and cleaner. Also, implementing high-level classes lets us use the best practices of each particular database.

- [Issue 80](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/80): osdu:wks:master-data--Seismic2DInterpretationSet:1.1.0 (Sacha Brants; assignee: Rashaad Gray; updated 2023-06-06)
- [Issue 81](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/81): osdu:wks:master-data--Seismic3DInterpretationSet:1.1.0 (Sacha Brants; assignee: Rashaad Gray; updated 2023-06-06)
- [Issue 82](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/82): osdu:wks:work-product-component--NotionalSeismicLine:1.0.0 (Sacha Brants; assignee: Nova Lader; updated 2023-07-12)
- [Issue 83](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/83): osdu:wks:work-product-component--SeismicHorizon:1.1.0 (Sacha Brants; assignee: Nova Lader; updated 2023-06-13)
- [Issue 84](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/84): osdu:wks:master-data--Seismic2DInterpretationSet:1.1.0 (Sacha Brants; updated 2023-06-13)
- [Issue 85](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/85): osdu:wks:master-data--Seismic3DInterpretationSet:1.1.0 (Sacha Brants; updated 2023-06-13)
- [Issue 86](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/86): osdu:wks:work-product-component--NotionalSeismicLine:1.0.0 (Sacha Brants; updated 2023-03-24)
- [Issue 88](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/88): Generate OpenAPI from the code (Sacha Brants; assignee: Nova Lader; updated 2023-07-12)
- [Issue 91](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/91): The v3 to v4 sync process needs to be implemented for all models (Sacha Brants; updated 2023-09-20)
- [Issue 92](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/92): Refactor base images and npm dependencies version (Aliaksandr Ramanovich (EPAM); assignee: Sacha Brants; updated 2023-03-24): It seems it is time to update the versions of the base images used in the Dockerfiles and of the dependencies in the package.json file.
- [Issue 93](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/93): Create service.seismicddms.ops group (Jan Mortensen; updated 2023-07-05)

As mentioned in [issue 73 in entitlements](https://community.opengroup.org/osdu/platform/security-and-compliance/entitlements/-/issues/73), there is a hardcoded dependency on being a member of users.datalake.admins for some of the functionality in the Seismic DMS service. This causes some confusion, especially given that the users.datalake.* groups are not inherited, so even a member of the higher-level users.datalake.ops would not be able to use the functionality, as it specifically targets the admins group.
**Suggestion**
Instead of creating a hard-coded dependency on this group, a new service group should, in my opinion, have been created for this purpose, e.g. service.seismicddms.ops (or service.sddms.ops, or...). This would give better transparency and make the access needed to use the service independent of these group-of-groups/convenience groups.

- [Issue 94](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/94): Info endpoint is missed (Denis Karpenok (EPAM); updated 2023-06-13)

curl --location --request GET 'https://preship.gcp.gnrg-osdu.projects.epam.com/api/seismic-store/v3/info'
Response:
[seismic-store-service] Unauthenticated Access. Authorizations not found in the request.
With authentication, the response is:
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
<title>Error</title>
</head>
<body>
<pre>Cannot GET /api/v3/info</pre>
</body>
</html>
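For reference, the intended contract (version info served without credentials, everything else behind authentication) can be sketched as routing logic. The paths, messages, and version string below are illustrative assumptions, not the actual SDMS Express implementation:

```ts
// Hypothetical sketch: illustrative routing contract, not the actual SDMS code.
interface RouteResult {
    status: number;
    body: string;
}

// Illustrative payload; the real service reports its own name/version metadata.
const SERVICE_INFO = JSON.stringify({ service: 'seismic-store-service', version: '3.x' });

function route(url: string, authHeader?: string): RouteResult {
    // The info endpoint is deliberately public: version metadata is returned
    // before any authentication middleware runs.
    if (url === '/api/seismic-store/v3/info') {
        return { status: 200, body: SERVICE_INFO };
    }
    // Every other route requires credentials.
    if (!authHeader) {
        return {
            status: 401,
            body: '[seismic-store-service] Unauthenticated Access. Authorizations not found in the request.',
        };
    }
    return { status: 404, body: `Cannot GET ${url}` };
}
```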
Expected:
Version is returned without authentication.

- [Issue 96](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/96): Read Only Root File System for Seismic Pods Crashes (Abhay Joshi; updated 2023-04-12)

When making a change to have the Os-Seismic-Store pods use a read-only root filesystem, the pods crash without any kubectl logs whatsoever. We suspect the application is writing to the pods' filesystem but are unable to see where things are being written. We would like to fix this issue, as it is a security concern.
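A common mitigation for this class of failure, sketched below with purely illustrative names and paths (the actual SDMS deployment manifests may differ), is to keep `readOnlyRootFilesystem: true` and mount writable `emptyDir` volumes over the specific paths the application writes to at runtime, such as `/tmp`:

```yaml
# Hypothetical sketch: illustrative pod spec, not the actual SDMS manifest.
apiVersion: v1
kind: Pod
metadata:
  name: seismic-store-service   # illustrative name
spec:
  containers:
    - name: sdms
      image: seismic-store-service:latest   # illustrative image
      securityContext:
        readOnlyRootFilesystem: true
      volumeMounts:
        # Writable scratch space for runtime writes; the root filesystem stays read-only.
        - name: tmp
          mountPath: /tmp
  volumes:
    - name: tmp
      emptyDir: {}
```

Running the container with a read-only root plus targeted `emptyDir` mounts would also surface exactly which paths the application writes to, since writes anywhere else fail with an explicit permission error.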
Following the [original observation](https://gitlab.opengroup.org/osdu/subcommittees/ea/docs/-/issues/7) an [ADR was c...DDMS references to optimized content were found to be created ad-hoc and outside the work-product-component schemas.
Following the [original observation](https://gitlab.opengroup.org/osdu/subcommittees/ea/docs/-/issues/7) an [ADR was created](https://gitlab.opengroup.org/osdu/subcommittees/ea/docs/-/issues/10), which standardizes the optimized content references from work-product-component entity types. Over time, DDMSs are expected to implement optimized content references using the `data.DDMSDatasets[]` property and support migration.https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/110v3 to v4 sync design2023-09-19T14:01:16ZMark Yanv3 to v4 sync designDiego MolteniDiego Molteni