# Seismic issues
Source: https://community.opengroup.org/groups/osdu/platform/domain-data-mgmt-services/seismic/-/issues

## [OpenVDS cuts one char from folder name in GC path](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/218)
Dzmitry Malkevich (EPAM), 2023-11-15

We've found an issue with OpenVDS 3.2.7 (and possibly all 3.2.* versions) and 3.3.1 on GC: the first character of the folder name is lost in the path to the SEGY file.
We have the following Seismic dataset:
```json
{
  "sbit_count": 0,
  "last_modified_date": "Wed Sep 27 2023 18:04:33 GMT+0000 (Coordinated Universal Time)",
  "created_by": "109239448567816450362",
  "sbit": null,
  "subproject": "fgx",
  "path": "/",
  "gcsurl": "osdu-data-prod-m19-ss-seismic/5a3a7d7b-3a94-4d7a-9bcb-68c133d19e77/96bd9293-3358-4716-8a04-23806f63053e",
  "readonly": false,
  "filemetadata": {
    "md5Checksum": null,
    "nobjects": 1,
    "size": 277427976,
    "type": "GENERIC",
    "tier_class": null
  },
  "name": "ST0202R08_PS_PSDM_RAW_PP_TIME.MIG_RAW.POST_STACK.3D.JS-017534.segy",
  "ctag": "l7nnwm4Onkcjg79Dm19;m19",
  "created_date": "Wed Sep 27 2023 18:04:03 GMT+0000 (Coordinated Universal Time)",
  "ltag": "m19-seismic-DDMS-Legal-Tag-PRFC",
  "tenant": "m19",
  "access_policy": "uniform"
}
```
and the Seismic path is `sd://m19/fgx/ST0202R08_PS_PSDM_RAW_PP_TIME.MIG_RAW.POST_STACK.3D.JS-017534.segy`.
When we run the SEGY-to-VDS conversion, OpenVDS tries to download the file from `https://storage.googleapis.com/osdu-data-prod-m19-ss-seismic/a3a7d7b-3a94-4d7a-9bcb-68c133d19e77/96bd9293-3358-4716-8a04-23806f63053e/0` and fails because the URL is incorrect: the first character of the folder name is missing. The correct path would be `https://storage.googleapis.com/osdu-data-prod-m19-ss-seismic/5a3a7d7b-3a94-4d7a-9bcb-68c133d19e77/96bd9293-3358-4716-8a04-23806f63053e/0`.
As a result, the conversion fails:
```text
[2023-10-27, 09:36:52 UTC] {pod_launcher.py:198} INFO - Event: segy-vds-conversion.1ff9faad21db4204a03ac62025f93454 had an event of type Running
[2023-10-27, 09:36:52 UTC] {pod_launcher.py:149} INFO - [Could not open input file] sd://m19/fgx/ST0202R08_PS_PSDM_RAW_PP_TIME.MIG_RAW.POST_STACK.3D.JS-017534.segy: Http error response: 403 -> https://storage.googleapis.com/osdu-data-prod-m19-ss-seismic/a3a7d7b-3a94-4d7a-9bcb-68c133d19e77/96bd9293-3358-4716-8a04-23806f63053e/0
[2023-10-27, 09:36:53 UTC] {pod_launcher.py:198} INFO - Event: segy-vds-conversion.1ff9faad21db4204a03ac62025f93454 had an event of type Running
```
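To illustrate the symptom (this is not the actual OpenVDS code, just a guess at the failure mode): losing exactly the first character of the object key is what an off-by-one slice after the bucket separator would produce:

```python
# Hypothetical illustration of the symptom only -- not the OpenVDS implementation.
gcsurl = ("osdu-data-prod-m19-ss-seismic/"
          "5a3a7d7b-3a94-4d7a-9bcb-68c133d19e77/96bd9293-3358-4716-8a04-23806f63053e")

sep = gcsurl.index("/")      # split bucket from object key at the first "/"
bucket = gcsurl[:sep]        # "osdu-data-prod-m19-ss-seismic"
key_ok = gcsurl[sep + 1:]    # "5a3a7d7b-..." (correct)
key_bad = gcsurl[sep + 2:]   # "a3a7d7b-..." (first character lost, as observed)

assert key_bad == key_ok[1:]
```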
Conversion works with the image `community.opengroup.org:5555/osdu/platform/domain-data-mgmt-services/seismic/open-vds/openvds-ingestion:latest`, which seems to be version 3.1.41.
Please check and advise as this affects M21 Pre-shipping testing.
cc: @Yan_Sushchynski, @Yauhen_Shaliou

## [Reading concurrently](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/217)
Vasilii Sinkevich, 2024-01-24

Hi,
Not an issue, rather a question
I am getting familiar with VDS and am trying to read a slice of data from a VDS file. I split the slice (e.g., [1,0:1000,0:500]) into several portions along one axis (e.g., [1,0:200,0:500], [1,200:400,0:500], [1,400:600,0:500], ...) and try to read them with requestVolumeSubset concurrently using the multiprocessing module. But even though the reading in each thread starts simultaneously (confirmed by text output), it looks like the actual reading happens consecutively, one portion after another.
I tried opening the VDS file in the main thread and using the handle in the worker threads (concurrent.futures allows it), and also opening the file separately in each thread. In the first case, reading of each portion starts only after the previous one has finished, as if in a single thread; in the second case, the reads start simultaneously, but each portion takes much longer than normal, so the overall time is the same as in the first case.
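For reference, the pattern looks roughly like this (a sketch only: the URL, connection string, bounds, and dimension ordering are placeholders, the `requestVolumeSubset` usage mirrors other reports in this tracker, and it assumes the request object exposes `waitForCompletion()` and a `data` buffer):

```python
# Sketch of the concurrent read pattern described above (placeholder values).
from concurrent.futures import ThreadPoolExecutor
import openvds

vds = openvds.open("sd://tenant/subproject/file.vds", "<connection string>")
accessManager = openvds.getAccessManager(vds)

def readPortion(bounds):
    minTup, maxTup = bounds
    request = accessManager.requestVolumeSubset(minTup, maxTup)
    request.waitForCompletion()
    return request.data

# Split the slice [1, 0:1000, 0:500] into 200-trace portions along one axis.
portions = [((0, start, 1, 0, 0, 0), (500, start + 200, 2, 0, 0, 0))
            for start in range(0, 1000, 200)]

with ThreadPoolExecutor(max_workers=len(portions)) as pool:
    results = list(pool.map(readPortion, portions))
```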
So the question is: is there some sort of queueing system for reading in the openvds library, or is this just a limitation of the free version?
Can reading data by pages resolve it?
Sorry, I cannot post our actual code, as I am not sure I am allowed to; the sketch above only mirrors the access pattern.
Platform: Windows
API: Python
Thank you,
Vasilii

## [Rename "IStorage" methods for v4](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/119)
Yan Sushchynski (EPAM), 2023-10-24

Hello,
I noticed that the cloud-storage [interface](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/blob/master/app/sdms-v4/src/cloud/storage.ts?ref_type=heads#L19) has the following methods:
```typescript
createBucket(bucketName: string): Promise<void>;
bucketExists(bucketName: string): Promise<boolean>;
deleteBucket(bucketName: string): Promise<void>;
```
These method names suggest that new buckets are getting created, checked for existence, or deleted within a single data-partition. However, the GC and Baremetal implementations are different -- a data-partition is expected to work with its own pre-created bucket instead of creating new ones. This discrepancy between the method names and their actual functionality could lead to confusion and misunderstanding.
A similar situation exists in the AWS implementation, where comments had to be added to clarify that 'bucketNames' are actually BLOBs, which can be seen [here](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/blob/master/app/sdms-v4/src/cloud/providers/aws/storage.ts?ref_type=heads#L45).
I propose that we consider renaming these methods to more accurately reflect their functionality and better align with the actual implementation.
Thank you.

## [Data retrieval from requestVolumeTraces produces results inconsistent with original segy and requestVolumeSubset](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/216)
Daniel Morgan, 2023-10-23

Using Python libraries built from 3.2.6 to read, but SEGYImport is version 3.3.255.
We have been trying to verify the fidelity of SEGY-to-VDS conversions by comparing individual trace data from the original SEGY with the resulting VDS file. Since we were retrieving individual traces, we thought to use VolumeDataAccessManager.requestVolumeTraces, but the retrieved data did not match the original in many cases.
The test file was from the Volve test set: "/ST0202/ST10010ZC11_MIG_VEL.MIG_VEL.VELOCITY.3D.JS-017527.segy"
The import used no compression; data retrieval used LOD 0.
For ease of comparison, we wanted to retrieve the last trace from the file, corresponding to inline/crossline 10396, 2800 (trace number 418996 from SEGY). The inline/crossline coordinates convert to inline/crossline indices 435 and 960, respectively.
Retrieval: `trace = accessManager.requestVolumeTraces([[435, 960]], traceDimension=0, lod=0)`
This is the result when compared with the original trace data:
![image](/uploads/abd85b26d19f538913dd687dee607edd/image.png)
Now when we use VolumeDataAccessManager.requestVolumeSubset with min/max tuples narrowed to a single trace, our retrieved data matches the original SEGY perfectly.
```python
minTup = (0, 960, 435, 0, 0, 0)
maxTup = (236, 961, 436, 0, 0, 0)
trace = accessManager.requestVolumeSubset(minTup, maxTup)
```
![image](/uploads/6494e54698f8d841b853b93bcefa953c/image.png)
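In code, the comparison amounts to the following (a sketch; it assumes both request objects expose `waitForCompletion()` and a `data` buffer):

```python
# Sketch of the fidelity check described above (buffer handling is illustrative).
import numpy as np

req = accessManager.requestVolumeTraces([[435, 960]], traceDimension=0, lod=0)
req.waitForCompletion()
fromTraces = np.asarray(req.data).reshape(-1)

req = accessManager.requestVolumeSubset((0, 960, 435, 0, 0, 0), (236, 961, 436, 0, 0, 0))
req.waitForCompletion()
fromSubset = np.asarray(req.data).reshape(-1)

# The subset result matches the original SEGY trace; the traces result does not.
print(np.allclose(fromTraces, fromSubset))
```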
My assumption is that we may be misusing requestVolumeTraces, but I am unclear in what way.

## [[ADR] Advanced filters for dataset search](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/117)
Alexandre Gattiker, 2023-12-04

### Introduction
We need additional filtering support to be able to filter the `POST /dataset/tenant/{tenantid}/subproject/{subprojectid}` and `PUT /operation/bulk-delete` (added in [!891](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/merge_requests/891/diffs#fafb01a8314993d61fca390beef912c7813278eb)) operations by metadata fields with more complex expressions than a single key-value match.
### Status
* [x] Initiated
* [x] Proposed
* [x] Under Review
* [ ] Approved
* [ ] Rejected
### Problem statement
The SDMS API `POST /dataset/tenant/{tenantid}/subproject/{subprojectid}` currently accepts the following body parameters, among others:
* `search`, a single SQL-like search parameter, for example: `search=name=file%`
* `gtags`, an array of strings matching tags associated with dataset metadata.
The `search` field does not support more than one field, or more than one possible value for a field.
The SDMS API `PUT /operation/bulk-delete` (added in [!891](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/merge_requests/891/diffs#fafb01a8314993d61fca390beef912c7813278eb)) requires a `path` parameter containing `tenantid`, `subprojectid` and `path` but does not support filtering by metadata fields or tags.
For both search and delete, we need to be able to filter by more than one field, or more than one possible value for a field.
Furthermore, we expect a need for more complex filter solutions, such as combining `AND`, `OR` and `NOT` operators. The proposed solution should ideally be extensible to support additional expressions and operators in the future if needed.
### Proposed solution
Add an optional `filter` parameter to the `POST /dataset/tenant/{tenantid}/subproject/{subprojectid}` and `PUT /operation/bulk-delete` API endpoints.
The `search` and `gtags` parameters are to be deprecated.
#### Overview
The `filter` parameter takes a payload with a variable format, allowing a simple filter on a single field to be expressed, as well as logical combinations of filters of arbitrary complexity.
The `POST /dataset/tenant/{tenantid}/subproject/{subprojectid}` operation has been selected for extension because:
* Advanced metadata filtering, encompassing select and search functionalities, has already been incorporated into that operation.
* The SDMS API also accepts the `GET` method for the operation with parameters provided in the query string, as a legacy endpoint. The `POST` version of the endpoints has been introduced to address issues related to handling large request parameters, where sending the cursor as a query parameter can lead to oversized requests and subsequent failures.
#### Examples
Example value for the `filter` parameter:
```json
{
  "and": [
    {
      "not": {
        "property": "gtags",
        "operator": "CONTAINS",
        "value": "tagA"
      }
    },
    {
      "or": [
        {
          "property": "name",
          "operator": "LIKE",
          "value": "test.%"
        },
        {
          "property": "name",
          "operator": "=",
          "value": "dataset.sgy"
        }
      ]
    }
  ]
}
```
This is equivalent to the following pseudo-SQL statement:
```sql
SELECT * FROM datasets d WHERE
NOT (EXISTS (SELECT VALUE 1 FROM t IN d.data.gtags WHERE t = 'tagA')
OR (IS_STRING(d.data.gtags) AND STRINGEQUALS(d.data.gtags, 'tagA')))
AND (
d.name LIKE 'test.%'
OR d.name = 'dataset.sgy'
)
```
#### Details
The `filter` parameter can be one of the following (an evaluation sketch follows this list):
* A **property match filter**:
```json
{
"property": "...",
"operator": "...",
"value": "..."
}
```
The implementation will be extensible with additional keys if needed in the future, e.g. to specify case sensitivity.
* An **`and` or `or` filter**, i.e. an object containing only the key `and` or `or`, of which the value is an array of one or more filters (i.e. a property match filter or an `and`, `or` or `not` filter)
```json
{
"and": [...]
}
```
* A **`not` filter**, i.e. an object containing only the key `not`, of which the value is a filter (i.e. a property match filter or an `and`, `or` or `not` filter)
```json
{
"not": ...
}
```
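To illustrate the recursion (illustrative pseudocode for this ADR, not SDMS implementation code), a filter of this shape can be evaluated against a metadata record with a small recursive walk; only the operators used in the examples above are sketched:

```python
# Illustrative recursive evaluation of a filter payload (not SDMS code).
import fnmatch

def matches(flt, record):
    if "and" in flt:
        return all(matches(f, record) for f in flt["and"])
    if "or" in flt:
        return any(matches(f, record) for f in flt["or"])
    if "not" in flt:
        return not matches(flt["not"], record)
    # Property match filter: {"property": ..., "operator": ..., "value": ...}
    actual = record.get(flt["property"])
    op, value = flt["operator"], flt["value"]
    if op == "=":
        return actual == value
    if op == "LIKE":
        # Translate SQL wildcards (%, _) into fnmatch wildcards (*, ?).
        pattern = value.replace("%", "*").replace("_", "?")
        return isinstance(actual, str) and fnmatch.fnmatchcase(actual, pattern)
    if op == "CONTAINS":
        return actual is not None and value in actual
    raise ValueError(f"Unsupported operator: {op}")
```

With the example payload above, a dataset named `test.segy` that is not tagged `tagA` matches, while anything tagged `tagA` is excluded.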
### Out of scope / limitations
The operations at `GET /utility/ls` and `POST /utility/ls` can also be used for retrieving datasets, but will not be extended with advanced filtering at the moment. That functionality can be added later if required.

## [The sd protocol is failing for IBM](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/215)
Anuj Gupta, 2023-10-27

The sd protocol is failing for IBM: calling `vds = openvds.open(url, con)` results in a 404 error, and it seems the first character after the `/` is getting escaped/skipped.
If the path is `ss-dev-seismic-dh2cqj2dwyr3tsz9/f013db48-47f5-430b-a10e-c5f6622712d2`,
the bucket name is: ss-dev-seismic-dh2cqj2dwyr3tsz9
and the subpath/key should be: f013db48-47f5-430b-a10e-c5f6622712d2
whereas the subpath/key actually used is:
`013db48-47f5-430b-a10e-c5f6622712d2` (~~f~~013db48-47f5-430b-a10e-c5f6622712d2)

## [Job Failed #2191076. Openvds-ingestion image for tag 3.3.0 not created](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/214)
Andrei Skorkin [EPAM / GCP], 2023-10-11

Job [#2191076](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/jobs/2191076) failed for commit 50c3f90f370390cfd5d23defeb7603bbfa01374c.
During the release cycle, we automatically change the latest image tag to the latest one presented here for the **openvds-ingestion** image. In this case it was **3.3.0** (https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-gcp-provisioning/-/blob/release/0.23/helm/osdu-infra-baremetal/values.yaml?ref_type=heads#L182). But due to the failure of the mentioned job, the image does not exist, and we need to manually fix the wrong version in the values.yaml file to make a deployment.
Could you please check this?
How can we overcome similar problems in the future?
Thanks
CC: @Yauhen_Shaliou @Yan_Sushchynski

## [delete user returns 400 on success instead of 200](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/116)
Zachary Keirn, 2023-10-03

The delete user from subproject endpoint (observed in m18/AWS) returns 400 even though the delete completes successfully. Then if you run it again, it will correctly return 404.

## [NEWBIE - installation on ARM 64 Linux graviton](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/213)
Klaas Koster, 2024-02-01

Following the instructions, I executed:
1) `cmake ..`
2) `make -j8`
3) `make install`
No errors or warnings are generated, and five executables are placed in Dist/OpenVDS/bin.
Two issues:
1) The README file states that ./SEGYImport should show the Wavelet Compression option, but it does not.
2) I cannot find the .whl file anywhere that would allow me to use 'pip install' to get Python to work with OpenVDS.
What did I do wrong or forget?

## [VDSCopy hanging when uploading to Seismic DDMS](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/212)
vinicius Vicente Silva Rosa, 2023-10-23

I am attempting to upload a local VDS file (1.5TB) to an SD path. After approximately an hour, there is no visible progress in the file upload, creating the impression that the process is stalled. No error messages are being displayed. I suspect it may be related to the token refresh.
We are using the command line below:
OSDU/ADME M16
Lib: VDSCopy, OpenVDS+ 3.3.0, installed on Linux
```bash
VDSCopy -a 01 -a 02 -a 12 --tolerance=1.0 --compression-method=Wavelet \
  -d 'sdAuthorityUrl=https://{HOST}.energy.azure.com/seistore-svc/api/v3;authTokenUrl=https://login.microsoftonline.com/{TENANT}/oauth2/v2.0/token/;client_id={APP_ID};client_secret={APP_SECRET};scopes={APP_ID}/.default;' \
  '/local_disk0/vds/FILE.vds' 'sd://{TENANT}/{SUBPROJECT}/dataset_name.vds'
```
The intention of the command above is to authenticate using the ClientID and ClientSecret.
The upload completes successfully when the file is processed within an hour or less.

## [Any advices for saving vds file?](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/211)
nanting liu, 2023-09-29

- SEGYImport [OPTION...] <input file>
- Using _--url <string>_ saves the VDS to a cloud environment; the VDS is then split into several objects (Dimensions_012LOD0, VolumeDataLayout, etc.).
- Using _--vdsfile <string>_ saves the VDS to the local file system as a single file ending in .vds (test.vds).
- When I request data from a file ending in .vds, it is very fast.
- What is the difference between --url and --vdsfile? Both are used to set the path of the output VDS file. I would like to know how to use these two parameters correctly to improve the efficiency of querying data.
- Thank you so much.

## [path and sdpath not used consistently, error in /user with parameter path](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/115)
Zachary Keirn, 2023-09-27

There are, I think, two issues here. One is documentation: the yaml doc for the /user delete option has 'path' instead of 'sdpath', and I believe it should be 'sdpath'. The other is that when I try to delete someone that does not exist, I get 400 instead of 404 in response. This is regardless of whether I try 'path' or 'sdpath' for the parameter.

## [Error uploading VDS into SD Path using OpenVDS+](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/210)
Juliana Fernandes (juliana.fernandes@iesbrazil.com.br), 2023-11-14

Hello,
I'm trying to upload a local VDS into an SD path in AWS M20 pre-shipping and I get an SDMS error (wrong location).
Some values were provided by e-mail, so I will not paste them here, but if you need to test, just let me know.
The command I'm using is:
```bash
VDSCopy.exe -d "SdAuthorityUrl=https://prsh.testing.preshiptesting.osdu.aws/api/seismic-store/v3;SdApiKey=ABC;AuthTokenUrl={{received_by_email}};client_id={{received_by_email}};client_secret={{received_by_email}};grant_type=refresh_token;refresh_token={{generated_in_the_login}};LegalTag=osdu-public-usa-dataset-1;scopes=openid email;Region=us-east-2" E:\Juliana\osdu\osdu_test\ST0202R08_PS_PSDM_RAW_PP_TIME_MIG_RAW_POST_STACK_3D_JS_017534_tol1_JFA.vds sd://osdu/vdstestsjfa/ST0202R08_PS_PSDM_RAW_PP_TIME_MIG_RAW_POST_STACK_3D_JS_017534_tol1_JFA.vds
```
And the error I'm getting is:
```text
[Could not create VDS sd://osdu/vdstestsjfa/test/ST0202R08_PS_PSDM_RAW_PP_TIME_MIG_RAW_POST_STACK_3D_JS_017534_tol1_JFA.vds] Error on uploading VolumeDataLayout object: Http error response: 301 -> https://psosdu-shared-seismicddms-20230814174725984500000004.s3.us-east-1.amazonaws.com/3o7c5j88s1ko0oyg/2b9b212b-21b5-4ccf-aaed-67485c113ae4/VolumeDataLayout: The bucket you are attempting to access must be addressed using the specified endpoint. Please send all future requests to this endpoint.
```
It seems to be accessing the wrong location, since the instance is located in us-east-2.
Regards,
Juliana.

## [Adding CRS to a VDS generated by Openvds+](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/209)
Juliana Fernandes (juliana.fernandes@iesbrazil.com.br), 2023-11-16

Hello,
I was taking a look at the documentation in order to add a CRS to the VDS I'm generating with OpenVDS+.
In the documentation I saw the option "--crs-wkt <string>". WKT is Well-Known Text and seems to describe a geographic coordinate system. Is there a way to add a UTM coordinate system to the data?
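From what I can tell, WKT can also describe projected coordinate systems such as UTM zones, so a UTM CRS should be expressible as a WKT string; for example (illustrative, using pyproj, which is not part of OpenVDS):

```python
# Produce a WKT string for a projected (UTM) CRS using pyproj (illustrative).
from pyproj import CRS

wkt = CRS.from_epsg(32631).to_wkt()  # EPSG:32631 = WGS 84 / UTM zone 31N
print(wkt)
```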
Regards,
Juliana

## [Multiple queries of data will be much slower.](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/208)
nanting liu, 2023-09-27

Querying data multiple times at different depths makes the response time much slower (from a few tens of milliseconds at first to seven seconds).
```java
VolumeDataRequestDouble data = accessManager.requestVolumeSubsetDouble(managedBuffer.getByteBuffer(), Dimensions_012, 0, 0, min, max);
data.waitForCompletion();
getdata();
managedBuffer.close();
vds.close();
```
I found that when calling _requestVolumeSubsetDouble_ in a loop, each response takes longer than the last.

## [Is there any way to get min value of sample or max value?](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/207)
nanting liu, 2023-09-21

I use `VolumeDataAccessManager.requestVolumeSubsetDouble()` to get sample data, and then I want to get the maximum and minimum values over all the data, so I have to write code to compare them. Is there any method I can invoke on _VolumeDataLayout_ or _VolumeDataAccessManager_ to get those two values?

## [SEGYImport will create a ramdon directory on my cloud environment?](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/206)
nanting liu, 2023-09-20

I already defined a subpath in the command ("s3://vds/1704505448285212672"), but I found this record in the log: "Successfully imported into s3://vds/1704505448285212672/CD352FB47324BA63". How can I avoid this situation?

## [Implement dataset storage for IBM](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/114)
Mark Yan, 2023-09-20

## [Implement dataset storage for GCP](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/113)
Mark Yan, 2023-09-20

## [fail to copy file from cloud server](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/205)
nanting liu, 2023-09-19

I am trying to read a VDS from MinIO (an open-source, S3-compatible object store).
As a first step, I tried to read a VDS file with _OpenVDS.open()_, but it failed;
the exception was "Error on downloading VolumeDataLayout object: Http error response: 404 -> https://endpoint/bucket-name/test.vds/VolumeDataLayout: The specified key does not exist.".
Then I realized that _open()_ cannot read the VDS file directly, because the file was uploaded manually.
As a second step, I tried to use VDSCopy to copy the VDS file to the cloud environment, which also fails, with the error "Error on uploading VolumeDataLayout object: unexpected AWS signing failure". Here is my command: `VDSCopy.exe E:\PPCoef.vds s3://endpoint/bucket-name/testVDS -d "Region=us-west-rack-2;SecretKey=xxx;SecretAccessKey=xxx"`. My SecretKey and SecretAccessKey are correct, but I don't know why it prints this.
Could you please help me figure out how to deal with this situation?