seismic-dms-service issues
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues

502 Gateway Error on calling Redis get and set functions
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/15
Walter D | 2021-02-23T04:27:47Z

Hi @DiegoMolteni,
We have been getting a 502 Gateway Error for some APIs; one of them is the Create Subproject API. While debugging the code, the error is thrown on the following two lines in the compliance.ts file in the create-subproject flow:
1. await this._cache.set(ltag, results.invalidLegalTags.length === 0);
2. await this._cache.set(ltag, results.invalidLegalTags.length === 0);
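The issue as filed does not identify a root cause, but one common way such cache calls turn into gateway 502s is an unreachable or slow Redis instance holding the request open past the gateway timeout. A minimal sketch (hypothetical, not the service's actual code; the helper name and timeout value are illustrative) that races the cache operation against a timeout and falls back to a default:

```typescript
// Hypothetical sketch: race a cache operation against a timeout so a slow or
// unreachable Redis instance degrades to a fallback value instead of holding
// the request open until the gateway returns a 502. The helper name and the
// default timeout are illustrative, not part of the actual service code.
async function withCacheTimeout<T>(
    op: Promise<T>,
    fallback: T,
    ms: number = 250,
): Promise<T> {
    const timeout = new Promise<T>((resolve) =>
        setTimeout(() => resolve(fallback), ms),
    );
    return Promise.race([op, timeout]);
}
```

With a wrapper like this, a Redis outage would degrade to a cache miss rather than a failed request, which would at least isolate whether the 502 originates from the cache calls.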
Interestingly, this is happening on the DEV environment; I have not encountered the issue locally. Have you faced this error, or do you have a clue what the problem could be? Any help would be appreciated. Thank you.
M1 - Release 0.1 | Diego Molteni, Daniel Perez, Diego Molteni | 2021-02-26

Add new version info endpoint
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/41
Siarhei Khaletski (EPAM) | 2023-06-13T20:06:33Z

Original ADR: https://community.opengroup.org/osdu/platform/system/lib/core/os-core-common/-/issues/47
Additional info: https://community.opengroup.org/osdu/platform/home/-/issues/36

createQuery and createKey - generalize structure
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/23
Rucha Deshpande | 2023-03-27T19:27:48Z

The following two methods:
```
createQuery(namespace: string, kind: string): IJournalQueryModel;
createKey(specs: any): object;
```
The structure of the parameter should be abstracted.
AWS wants to be able to pass information such as
```
{
  table_name,
  tenant_name,
  subproject_name,
  ...etc
}
```
of type 'any'.
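One way to express the generalization (a hypothetical sketch; the field names are taken from the example above, and the return shape is illustrative) is to type the specs as an open record, so each cloud provider can pass whatever keying information it needs:

```typescript
// Hypothetical sketch of a generalized createKey: the specs object keeps the
// fields the current implementation parses (namespace/kind) but also allows
// arbitrary CSP-specific fields such as table_name or subproject_name.
interface JournalKeySpecs {
    namespace?: string;
    kind?: string;
    [cspSpecific: string]: unknown; // e.g. table_name, tenant_name, ...
}

function createKey(specs: JournalKeySpecs): object {
    // Each CSP implementation interprets the fields it understands and
    // ignores the rest; here we simply return a copy as the key descriptor.
    return { ...specs };
}
```

Under this shape, a GCP implementation could keep reading `namespace` and `kind`, while an AWS implementation could read `table_name`, `tenant_name`, and `subproject_name` from the same object.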
This is required for AWS, as we are currently restricted to parsing and using 'namespace' and 'kind', which does not work in all scenarios for the models we have.
Rucha Deshpande, Diego Molteni, Rucha Deshpande

Create service.seismicddms.ops group
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/93
Jan Mortensen | 2023-07-05T09:34:24Z

As mentioned in [issue 73 in entitlements](https://community.opengroup.org/osdu/platform/security-and-compliance/entitlements/-/issues/73), there is a hardcoded dependency on being a member of users.datalake.admins for some of the functionality in the Seismic DMS service. This causes confusion, especially since the users.datalake.* groups are not inherited, so even a member of the higher-level users.datalake.ops group would not be able to use this functionality, as it specifically targets the admins group.
**Suggestion**
Instead of creating a hard-coded dependency on this group, a new service group should, in my opinion, be created for this purpose, e.g. service.seismicddms.ops (or service.sddms.ops, or...). This would give better transparency and make the access needed to use the service independent of these group-of-groups/convenience groups.

Dataset with seismic metadata fails due to updates in R3 data definitions in Storage Service
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/18
Rucha Deshpande | 2023-03-27T19:29:25Z

Posting a dataset with seismic metadata that is to be stored as a Storage record fails.
Seismic DMS service needs to be updated to work with R3 Data Definitions.
See issue:
https://community.opengroup.org/osdu/platform/system/storage/-/issues/44
Rucha Deshpande, Diego Molteni, Rucha Deshpande

Delete dataset API does not delete COS (Blob Storage) object
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/1
Walter D | 2023-03-27T19:35:26Z

The delete dataset API of seismic-store-service calls the storage service POST delete record API, which deletes the COS (Blob Storage) object belonging to the dataset. However, the COS object is still available even though the response is 204 No Content. We realize that the storage service POST delete only performs a soft delete. We wanted to confirm whether this is the expected behavior.
ethiraj krishnamanaidu, ethiraj krishnamanaidu

Difference in documentation and functionality of utility cp endpoint
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/52
Walter D | 2023-03-24T19:32:53Z

The documentation for the Utility CP endpoint states: 'The source and destination dataset must be in the same sub-project.'
However, the endpoint returns a 202 Accepted response even when the source and destination sub-projects are not the same.
M11 - Release 0.14 | Diego Molteni, Diego Molteni

Documentation of latest code changes
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/3
Walter D | 2023-03-28T04:07:18Z

Hi Team,
Hope everyone is safe and doing well.
The latest code check-in has a lot of changes, from the addition of files such as
1. cloud.ts
2. trace.ts
to the deletion of files (iam.ts), along with several updated files.
Also, the folder structure has changed:
1. /config and /swaggerdocs are not present.
2. config.ts is moved into /cloud
Is it possible to provide us with a document containing the important changes that result in changes to CSP implementations? Thank you.
Diego Molteni, Diego Molteni

Domain API - provide read/write access to trace data
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/40
Debasis Chatterjee | 2023-03-27T19:16:48Z

A neutral domain API to access seismic trace data irrespective of whether the content is stored in oZgy or in oVDS.
Consider a suitable protocol, keeping in mind the large volume of data involved.
This will open up interoperability opportunities for cross-vendor applications.
cc - @pq for information

e2e test collection: clean up of collection using gcs urls
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/12
Rucha Deshpande | 2021-02-23T17:48:51Z

There are some requests in the collection that use a 'googleapis' URL for testing.
These need to be cleaned up or changed to environment variables.
Rucha Deshpande, Diego Molteni, Rucha Deshpande

e2e test - Dataset exist / Dataset sizes validates a hardcoded list
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/21
Rucha Deshpande | 2021-03-04T22:50:06Z

This test validates a hardcoded list. It must be changed to environment variables, OR the test script must be updated.
```
{
  "datasets": [
    "/async/dsx01",
    "/a/b/c/dsx02",
    "async/dsx03",
    "test/dsx01",
    "{{path01}}/{{dataset01}}"
  ]
}
```
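One possible fix, sketched here under the assumption of a single comma-separated environment variable (the name `E2E_DATASETS` is invented for illustration, not an existing variable), is to derive the expected list at run time instead of hardcoding it:

```typescript
// Hypothetical sketch: derive the expected dataset list from an environment
// variable (the name E2E_DATASETS is illustrative) so each CSP's pipeline can
// supply its own paths instead of relying on a hardcoded list.
function expectedDatasets(env: Record<string, string | undefined>): string[] {
    return (env["E2E_DATASETS"] ?? "")
        .split(",")
        .map((p) => p.trim())
        .filter((p) => p.length > 0);
}
```

The test script would then iterate over `expectedDatasets(process.env)` rather than the fixed array, which also covers the Dataset Sizes variant of the same problem.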
Similarly, the Dataset Sizes test also validates the sizes against a hardcoded list.
Rucha Deshpande, Diego Molteni, Rucha Deshpande

e2e test: getGCSAccessToken API validates token type
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/19
Rucha Deshpande | 2021-02-23T04:26:36Z

The getGCSAccessToken API test script validates that the returned token type is 'Bearer'.
For AWS the return type could be 'STS token'.
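If the assertion were kept at all rather than removed, it could be relaxed to an allow-list of per-CSP token types. A sketch (hypothetical; the list contains only the two types mentioned in this issue):

```typescript
// Hypothetical sketch: instead of asserting the token type is exactly
// 'Bearer', accept any of the types the CSPs are known to return. The list
// below contains only the two types mentioned in this issue.
const ALLOWED_TOKEN_TYPES: string[] = ["Bearer", "STS token"];

function isAllowedTokenType(tokenType: string): boolean {
    return ALLOWED_TOKEN_TYPES.includes(tokenType);
}
```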
This validation should be removed.
Rucha Deshpande, Diego Molteni, Rucha Deshpande

E2E test issue
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/10
Rucha Deshpande | 2021-02-23T17:51:46Z

DATASET LIST AFTER DELETE
expects the returned list to be of length 6.
The collection posts only datasets and deletes one. In this case, the expected value must be 2.
Rucha Deshpande, Rucha Deshpande

e2e test script needs to run from repository root only
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/22
Rucha Deshpande | 2023-03-27T19:28:43Z

The run-e2e-tests.sh script has the following check. This will not work in internal pipelines where the distribution folder structure is different.
```
if [ ! -f "tsconfig.json" ]; then
    printf "\n%s\n" "[ERROR] The script must be called from the project root directory."
    exit 1
fi
```
Rucha Deshpande, Diego Molteni, Rucha Deshpande

e2e tests: imptoken collection should be optional for CSPs that do not have impersonation token implemented
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/11
Rucha Deshpande | 2021-02-23T04:29:18Z

The imptoken collection should be optional for CSPs that do not have the impersonation token implemented.
Rucha Deshpande, Diego Molteni, Rucha Deshpande

e2e test: Subproject Get New - checks return child length
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/20
Rucha Deshpande | 2021-02-23T04:24:39Z

The Subproject Get New test script validates the response child length. It is hardcoded to 8.
The length of the response can vary per CSP.
pm.expect(child.length).to.eql(8);
Rucha Deshpande, Diego Molteni, Rucha Deshpande

Enable forwarding of original request headers to Dataecosystem APIs
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/38
Rucha Deshpande | 2021-11-18T13:21:05Z

In the current implementation, wherever Dataecosystem APIs are called, the headers are re-created. This prevents headers from the original request, for example 'x-user-id', from being forwarded.
![image](/uploads/d41f01c1648b16625ded5b18f13d9363/image.png)
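One way to preserve the original headers (a hypothetical sketch, not the service's implementation; the function and type names are invented) is to merge any newly required headers into them instead of re-creating the set:

```typescript
// Hypothetical sketch of the fix: merge new headers into the original
// request headers instead of re-creating them, so headers such as
// 'x-user-id' survive the hop to the Dataecosystem APIs.
type HeaderMap = Record<string, string>;

function forwardHeaders(original: HeaderMap, extra: HeaderMap): HeaderMap {
    // extra entries win on conflict (e.g. a refreshed authorization header)
    return { ...original, ...extra };
}
```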
The core code has to be modified to append any new headers to the original headers, and otherwise re-use the original request headers.
Rucha Deshpande, Diego Molteni, Greg, Yunhua Koglin, Rucha Deshpande

For Tenant there is no endpoint that can be used to list all the available tenants
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/58
Kamlesh Todai | 2023-03-24T19:22:43Z

There should be a way to list all the tenants to which the user has access. At present, there is no way to do that. If one had created a tenant in the past and cannot remember the name, then there is no way to find that name.

[GCP] Seismic store doesn't use Partition Service to get a GCP project-id of Google Cloud Project
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/42
Yan Sushchynski (EPAM) | 2023-03-27T19:16:22Z

The main problems are the following:
- There are no signs that SSDMS uses the Partition Service at all; it accepts requests with no data-partition-id header.
- When we create an SSDMS tenant, we have to specify `gcpid`, the project where data will be stored if we use this tenant in our `sd-path`.
This causes two problems:
- users have to know the actual `gcpid`
- users can specify a `gcpid` that doesn't correspond to the `data-partition-id`
Example of a create tenant request:
```
{
"gcpid": "{{gcp_project_id}}",
"esd": "{{data-partition-id}}.osdu-gcp.go3-nrg.projects.epam.com",
"default_acl": "data.default.owners@{{data-partition-id}}.osdu-gcp.go3-nrg.projects.epam.com"
}
```
The solution is to use the Partition Service to get the GCP project-id, so that users don't need to specify `gcpid` manually and the GCP project-id is chosen correctly.
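The resolution step could be sketched as below, assuming the Partition Service returns a property map for the partition; the `projectId` property name and the `{sensitive, value}` shape are illustrative assumptions, not the actual contract:

```typescript
// Hypothetical sketch: pick the GCP project id out of a Partition Service
// property map instead of trusting a user-supplied `gcpid`. The property
// name "projectId" and the {sensitive, value} shape are assumptions.
interface PartitionProperty {
    sensitive: boolean;
    value: string;
}

function resolveProjectId(
    properties: Record<string, PartitionProperty | undefined>,
): string {
    const prop = properties["projectId"];
    if (!prop) {
        throw new Error("partition does not define a projectId property");
    }
    return prop.value;
}
```

With this in place, the create-tenant request would no longer need a `gcpid` field at all; the service would look the project up from the caller's `data-partition-id`.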
cc:
@Kateryna_Kurach @Siarhei_Khaletski
M13 - Release 0.16

GCP specific naming conventions
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/4
Rucha Deshpande | 2023-03-27T19:32:16Z

There are many GCP-specific names used in the models:
such as gcpid, gcp_bucket etc.
There is also an API called /api/v3/utility/gcs-access-token.
The code should be revisited to remove any CSP-specific naming.
Rucha Deshpande, Diego Molteni, Rucha Deshpande