Seismic issues
https://community.opengroup.org/groups/osdu/platform/domain-data-mgmt-services/seismic/-/issues

# Issue 222: VolumeData class not included in Java Library (Julien Lacoste, 2024-02-26)
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/222

The VolumeData class is not generated in the Java version.
Apparently it is missing from the CMakeLists.txt.

# Issue 227: Documentation request: Unclear on constraints/expectations on axes for 3D datasets (Kevin McCarty, 2024-02-20)
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/227

Hello,
I'm writing some utilities for my employer, Dynamic Graphics Inc., that are intended to convert between OpenVDS datasets and our own 3D grid file formats (one of them being a quite old internally-invented proprietary binary format, and the more recent one being HDF5-based), not unlike `SEGYImport` and `SEGYExport`.
As I started working on this, I found that numerous questions arose regarding the axes/dimensions for OpenVDS datasets that aren't really covered, or are at best glossed over, by the online docs. Many thanks in advance for light that you can shed on them!
1) I'm presuming that for 3D datasets, axis 0 (that is, the axis named by `layout->GetAxisDescriptor(0)`) is always defined as being along the most rapidly-varying index in linear memory, while axis 2 is always along the most slowly-varying index in linear memory. Is this correct?
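If that assumption holds, the index-to-offset relationship can be sketched as follows (a toy illustration of the layout being asked about, not OpenVDS API code; the dimension sizes are hypothetical):

```python
# Sketch: axis 0 varies fastest in linear memory, axis 2 slowest
# (the assumption stated above, not something verified against OpenVDS).
def flat_offset(i0: int, i1: int, i2: int, n0: int, n1: int) -> int:
    """Offset of sample (i0, i1, i2) when axis 0 is contiguous."""
    return i0 + n0 * (i1 + n1 * i2)

# In a 4 x 3 x 2 volume, stepping i0 moves one element,
# while stepping i2 jumps over a whole 4 x 3 slab:
assert flat_offset(1, 0, 0, 4, 3) == 1
assert flat_offset(0, 1, 0, 4, 3) == 4
assert flat_offset(0, 0, 1, 4, 3) == 12
```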
2) For data in the inline/crossline/sample axis description system:
a) Will it always be the case that axis 0 is "sample", axis 1 is "crossline" and axis 2 is "inline" ? That seems to be what this line of `examples/GettingStarted/main.cpp` (as of OpenVDS 3.3.1) implies:
```cpp
const int sampleDimension = 0, crosslineDimension = 1, inlineDimension = 2;
```
but is it a robust assumption? Currently I am checking that the names of the three axes are in this expected order and erroring out otherwise (which is an expectation met by all 3D datasets to which I have access with "inline" / "crossline" / "sample" axes).
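The defensive check described above can be sketched like this (a hypothetical helper; in real code the names would come from `layout->GetAxisDescriptor(i)`, and the exact spelling of the axis names is an assumption):

```python
# Hypothetical guard: error out unless the axes appear in the expected
# "sample", "crossline", "inline" order. The name spellings below are
# an assumption, not taken from the OpenVDS documentation.
EXPECTED_AXIS_ORDER = ("Sample", "Crossline", "Inline")

def check_axis_order(axis_names) -> bool:
    """True iff the three axis names match the expected order."""
    return tuple(axis_names) == EXPECTED_AXIS_ORDER

assert check_axis_order(["Sample", "Crossline", "Inline"])
assert not check_axis_order(["Inline", "Crossline", "Sample"])
```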
b) If I have a dataset in our own 3D format for which the inline or crossline number _decreases_ in the positive direction (for the ordering in linear memory) of one or both of the two horizontal axes, this does not seem to be supported by OpenVDS. Do I have that correct? In other words, if I have such a 3D grid in our own format, it looks like I'll need to take care of moving the XY origin to the appropriate one of the other three grid corners in XY and swapping the nodes around correspondingly in memory, in order to be able to create an OpenVDS dataset where the inline/crossline numbers increase in the same direction as the node ordering in memory? Or am I missing something that would make this unnecessary (e.g. the axis descriptor min/max being possible to set in reversed order?)
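The workaround being asked about, moving the origin to another corner and reversing the node order, can be sketched for a single axis (pure Python with a toy data layout; the names and the per-crossline XY-step convention here are my assumptions, not OpenVDS API):

```python
# Toy sketch of the workaround: reverse the node order along one
# horizontal axis so the crossline number increases with the memory
# index, and move the XY origin to the opposite corner along that axis.
# The data layout (a list of crossline rows) is hypothetical.
def flip_crossline_axis(volume, origin_xy, step_xy, n_crossline):
    new_volume = list(reversed(volume))
    ox, oy = origin_xy
    sx, sy = step_xy  # XY displacement per crossline increment
    new_origin = (ox + sx * (n_crossline - 1), oy + sy * (n_crossline - 1))
    return new_volume, new_origin

vol = [[1, 2], [3, 4], [5, 6]]  # 3 crosslines, 2 samples each
flipped, origin = flip_crossline_axis(vol, (0.0, 0.0), (10.0, 0.0), 3)
assert flipped == [[5, 6], [3, 4], [1, 2]]
assert origin == (20.0, 0.0)
```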
3) Regarding the I/J/K axis description system:
a) Is there any specific mapping that is required/expected between the I/J/K labels and the 0/1/2 axis numbering? Or is it possible that I might find (say) one dataset where axis 2 is the K axis, and another dataset where axis 0 is the K axis (and so on) ?
b) In the transformation from axes to world coordinates using the metadata, are the world Z-coordinates always in the downward sense (i.e. more positive values are farther underground) as is stated to be the case with the inline/crossline/sample system? If not, how can I tell whether the Z-coordinate axis points upward or downward?
c) In the I/J/K system, do the axis descriptors' `GetCoordinateMin()/Max()` accessors return anything meaningful? (Since it would seem that the I/J/K axes are defined completely by the origin and step metadata, and by the axes' `GetNumSamples()` getters?) If so, in what way would they be used?
d) Should I expect that all VDS datasets with the I/J/K axis description system have the same handedness for the I/J/K axes (which?), or is this not guaranteed? For the inline/crossline/sample system I have found datasets with both chiralities.
4) Basically the same questions as (3a,b) for the X/Y/Z axis description system.
I'm not aware of any VDS datasets that exhibit the I/J/K or X/Y/Z coordinate systems; if there are public datasets available that do so, a pointer would be enormously appreciated!
5) It appears that the only units for Z coordinates that are supported are milliseconds, meters, and feet (and also apparently US survey feet?). I'm unclear what happens on attempting to create an OpenVDS dataset having a vertical axis with a different unit -- e.g. "fathoms" or "km" or "seconds". Is it allowed but unsupported (and if so is it round-trippable)? Or will it provoke an exception or crash?
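If unusual vertical units turn out not to be supported, one conceivable workaround is converting the axis to a supported unit before creating the dataset. A sketch follows (the length-conversion factors are exact by definition; whether this workaround is appropriate for OpenVDS is an assumption):

```python
# Workaround sketch: convert an unsupported vertical length unit to
# meters before creating the VDS. Illustrative only, not OpenVDS code.
TO_METERS = {
    "m": 1.0,
    "km": 1000.0,
    "ft": 0.3048,      # international foot
    "fathom": 1.8288,  # 6 international feet
}

def vertical_to_meters(value: float, unit: str) -> float:
    try:
        return value * TO_METERS[unit]
    except KeyError:
        raise ValueError(f"no conversion defined for unit {unit!r}")

assert vertical_to_meters(2.0, "km") == 2000.0
assert abs(vertical_to_meters(10.0, "fathom") - 18.288) < 1e-9
```

Time units like "seconds" would analogously need rescaling to milliseconds rather than a length conversion.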
Again, thanks so much for your help!

# Issue 229: AWS Directory Buckets (Klaas Koster, 2024-02-17)
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/229

AWS has recently introduced something called Directory Buckets in addition to their General Purpose Buckets:
https://opsinsights.dev/exploring-the-newest-s3-bucket
Supposedly these come with lower cost and lower latency. Is there a recommendation to use these new Directory Buckets for VDS?

Assignee: Morten Ofstad

# Issue 213: NEWBIE - installation on ARM 64 Linux graviton (Klaas Koster, 2024-02-01)
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/213

Following the instructions, I executed:
1) cmake ..
2) make -j8
3) make install
No errors or warnings are generated, and five executables are placed in Dist/OpenVDS/bin.
Two issues:
1) The README file states that ./SEGYImport should show the Wavelet Compression option, but it does not.
2) I cannot find the .whl file anywhere that would allow me to use 'pip install' to get Python to work with OpenVDS.
What did I do wrong or forget?

# Issue 158: Installation of master fails on MacOS 13.0 with Arm64 (Alexander Jaust, 2024-02-01)
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/158

## Description
I am trying to build OpenVDS `master` on an Arm64 Mac with the current MacOS release (Ventura), but it fails. Any input would be appreciated. I would also try to supply patches/merge requests where it makes sense.
I am using the following command to configure
```text
cmake -S . \
-B build \
-DCMAKE_BUILD_TYPE=Release \
-DBUILD_SHARED_LIBS=ON \
-DBUILD_JAVA=OFF \
-DBUILD_PYTHON=ON \
-DBUILD_EXAMPLES=ON \
-DBUILD_TESTS=OFF \
-DBUILD_DOCS=OFF \
-DDISABLE_AWS_IOMANAGER=ON \
-DDISABLE_AZURESDKFORCPP_IOMANAGER=ON \
-DDISABLE_GCP_IOMANAGER=ON \
-DDISABLE_DMS_IOMANAGER=ON \
-DDISABLE_STRICT_WARNINGS=ON \
  -DCMAKE_INSTALL_PREFIX="${INSTALLATION_DIR}"
```
where `INSTALLATION_DIR` points to `/Users/aej/software/openvds-master-install-python`.
and the following command line for building OpenVDS
```text
cmake --build "build" \
--config Release \
--target install \
-j 1 \
  --verbose
```
## Expectation
OpenVDS is built and installed in the specified directory.
## Actual behavior
The build fails. I found the following problems:
1. If I delete a downloaded third-party dependency from the `3rdParty` directory, delete my build directory, and then rerun the CMake configuration step, the automatic fetching of the library fails. After a second deletion of the build directory and rerunning the CMake configuration step, the third-party library seems to be fetched correctly.
2. I get a problem due to the inclusion of `curl.h` by `cpprestsdk`
```text
In file included from /Users/aej/software/compilescripts/openvds/openvds-3.1.0-src/src/OpenVDS/IO/IOManagerCurl.h:41:
/Library/Developer/CommandLineTools/SDKs/MacOSX13.0.sdk/usr/include/curl/curl.h:115:41: error: too few arguments provided to function-like macro invocation
__has_declspec_attribute(dllimport))
```
This seems to be related to changed behavior of LLVM/clang and it appears [with other projects](https://github.com/llvm/llvm-project/issues/53269) and has been reported to [cURL as well](https://github.com/curl/curl/issues/8293). It seems to be some interaction of cURL and casablanca. There are an [issue](https://github.com/microsoft/cpprestsdk/issues/1710) and a [pull request](https://github.com/microsoft/cpprestsdk/pull/1723) in the `cpprestsdk` repository for this, but they say that this will not be fixed since `cpprestsdk` is in maintenance mode.
I can fix it by commenting out the `#define dllimport`, but I am not sure if that is the best thing to do.
```text
cpprestapi_file="${SOURCE_DIR}/3rdparty/cpprestapi-2.10.16/Release/include/cpprest/details/cpprest_compat.h"
sed -i '' 's/\#define dllimport/\/\/\#define dllimport/' "${cpprestapi_file}"
sed -i '' 's/\/\/\/\/\#define dllimport/\/\/\#define dllimport/' "${cpprestapi_file}"
```
As the [`cpprestsdk`](https://github.com/microsoft/cpprestsdk) project is marked as being in maintenance mode, it may be necessary to move to another project in the (near?) future.
Side question: Why is the package called `cpprestapi` within the OpenVDS project? It makes debugging a bit confusing since the actual package/repository is called `cpprestsdk`.
3. Building the AWS IOManager fails
```text
/Users/aej/software/compilescripts/openvds/openvds-master-src/src/OpenVDS/IO/IOManagerAWSCurl.h:10:10: fatal error: 'aws/crt/auth/Credentials.h' file not found
#include <aws/crt/auth/Credentials.h>
^~~~~~~~~~~~~~~~~~~~~~~~~~~~
1 error generated.
```
The file does exist, though, if I search for it from the OpenVDS repository root:
```text
$ find . -iname "Credentials.h" -type f
./3rdparty/google-cloud-cpp-1.14.0/google/cloud/storage/oauth2/credentials.h
./3rdparty/aws-cpp-sdk-1.9.336_/aws-cpp-sdk-cognito-identity/include/aws/cognito-identity/model/Credentials.h
./3rdparty/aws-cpp-sdk-1.9.336_/aws-cpp-sdk-finspace-data/include/aws/finspace-data/model/Credentials.h
./3rdparty/aws-cpp-sdk-1.9.336_/aws-cpp-sdk-connect/include/aws/connect/model/Credentials.h
./3rdparty/aws-cpp-sdk-1.9.336_/aws-cpp-sdk-sts/include/aws/sts/model/Credentials.h
./3rdparty/aws-cpp-sdk-1.9.336_/crt/aws-crt-cpp/include/aws/crt/auth/Credentials.h
./3rdparty/aws-cpp-sdk-1.9.336_/crt/aws-crt-cpp/crt/aws-c-auth/include/aws/auth/credentials.h
```
I am currently a bit stuck at this step, since I have not yet found any straightforward way to avoid this problem. I assume that the include paths are not populated properly.
## System
- Arm64 MacOS 13.0.1
- OpenVDS `master` branch
- clang 14.0.0
```text
$ clang --version
Apple clang version 14.0.0 (clang-1400.0.29.202)
Target: arm64-apple-darwin22.1.0
Thread model: posix
InstalledDir: /Library/Developer/CommandLineTools/usr/bin
```
- cmake 3.24.3 (via Homebrew)

# Issue 219: [IOManagerAWS] - question about STS credential provider for AWS (Filip Brzęk, 2024-01-31)
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/219

Hi,
Is it currently possible to form `OpenOptions` for the AWS backend by specifying a roleARN that should be used to generate an STS token?
After glancing at [IOManagerAWSCurl](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/blob/master/src/OpenVDS/IO/IOManagerAWSCurl.cpp?ref_type=heads#L180) it seems it defaults to `CreateCredentialsProviderChainDefault()`.
Is there a way in the current code path to specify a particular role to be used for S3 access, or would that have to be added separately?
If so, are there some style/contribution guidelines about adding new `OpenOptions`?
Thanks,
Filip

# Issue 223: Java Binding crash when querying VolumeTraces with LOD (Julien Lacoste, 2024-01-31)
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/223

Using the Java binding, a call to the VolumeDataAccessManager::requestVolumeTraces method causes an application crash if LOD > 0.

# Issue 182: The sdapi proxy is neither forward nor backward compatible (Paal Kvamme, 2024-01-30)
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/182

I have run tests on version 3.2.1 using both the old "sdapi" I/O manager and the new "proxy" I/O manager, using VDSCopy. I have built OpenVDS from sources. Observations:
- A file on the cloud written using the "sdapi" manager cannot be read with the "proxy" manager.
- A file on the cloud written using the "proxy" manager cannot be read with the "sdapi" manager.
- Additionally, the file written using the "proxy" manager is seen as corrupt by other DMS applications. The list of block names is empty. Somehow the metadata isn't stored correctly.

# Issue 225: DmsDataset class is using resources that have been moved from (Deepa Kumari, 2024-01-29)
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/225

The vector containing the response is moved here: https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/blob/master/src/OpenVDS/IO/DmsIoFactories/DmsIoManagerFactory.cpp#L200
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/blob/master/src/OpenVDS/IO/DmsIoFactories/DmsIoManagerFactory.cpp#L223
The same resource which was already moved from, is being used here:
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/blob/master/src/OpenVDS/IO/DmsIoFactories/DmsIoManagerFactory.cpp#L204
and https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/blob/master/src/OpenVDS/IO/DmsIoFactories/DmsIoManagerFactory.cpp#L227
Resources that have been moved from should not be used; avoiding that is the whole point of a move. More ref: https://en.cppreference.com/w/cpp/utility/move

Milestone: M23 - Release 0.26. Assignee: Deepa Kumari

# Issue 226: Building from source (Vasilii Sinkevich, 2024-01-29)
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/226

Hi,
I am trying to build the static library from the source code. As a first step I compiled dynamically as instructed, and noticed that Wavelet compression is not supported in the resulting library - I checked with OpenVDS::IsCompressionMethodSupported(OpenVDS::CompressionMethod::Wavelet).
Is this how it is supposed to be, or did I do something wrong?
Does it mean that both compression and decompression are unsupported, or will decompression still work?
Thank you,
Vasilii

# Issue 224: log out "0.00 % Done." then terminated while converting segy file to vds file (nanting liu, 2024-01-28)
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/224

![1706185787839](/uploads/55ea2a4c9e6cd925c4a8c5d56edb5cf4/1706185787839.jpg)
![1706185713377](/uploads/b5ae9b60826865228a3349ece03e8d5d/1706185713377.jpg)

# Issue 217: Reading concurrently (Vasilii Sinkevich, 2024-01-24)
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/217

Hi,
Not an issue, rather a question
I am getting familiar with VDS and I am trying to read a slice of data from a VDS file. I split the slice (e.g., [1,0:1000,0:500]) into several portions along one axis (e.g., [1,0:200,0:500], [1,200:400,0:500], [1,400:600,0:500], ...) and try to read them with requestVolumeSubset concurrently using the multiprocessing module. But even though the reading in each thread starts simultaneously (confirmed by text output), it looks like the actual reading happens consecutively, one portion after another.
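The splitting into portions described above can be sketched with a generic helper (independent of the openvds API; names are illustrative):

```python
# Generic sketch of the splitting described above: divide [start, stop)
# into contiguous portions, one per worker, e.g. for handing each
# portion to a separate requestVolumeSubset call.
def split_range(start: int, stop: int, n_chunks: int):
    step = -(-(stop - start) // n_chunks)  # ceiling division
    return [(lo, min(lo + step, stop)) for lo in range(start, stop, step)]

assert split_range(0, 1000, 5) == [(0, 200), (200, 400),
                                   (400, 600), (600, 800), (800, 1000)]
```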
I tried opening the VDS file in the main thread and using its identifier in the threads (concurrent.futures allows it), and also opening the file separately in each thread. In the first case, reading of each portion starts after the previous one has finished, as if in a single thread; in the second case, the reading starts simultaneously, but each portion takes much longer than normal to read, so the overall time is the same as in the first case.
So the question is: is there some sort of queueing system for reading in the openvds library, or is it just a limitation of the free version?
Can reading data by pages resolve it?
Sorry, no code snippet as I am not sure if I am allowed to post the code
Platform: Windows
API: Python
Thank you,
Vasilii

# Issue 31 (open-zgy): Build fix to allow building the native library with clang 16 (Jon Jenssen, 2024-01-04)
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-zgy/-/issues/31

Clang 16 requires an additional #include <cstdint> to be added to structaccess.h for the build to work.
See the attached patch.
[clang16_patch_for_structaccess.diff](/uploads/a2dfaf3f64562fe857db97ef344f199e/clang16_patch_for_structaccess.diff)

# Issue 209: Adding CRS to a VDS generated by OpenVDS+ (Juliana Fernandes, 2023-11-16)
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/209

Hello,
I was taking a look into the documentation in order to add a CRS to the VDS I'm generating with OpenVDS+.
In the documentation I saw the option `--crs-wkt <string>`. WKT is Well-Known Text and seems to describe a geographical coordinate system. Is there a way to add a UTM coordinate system to the data?
Regards,
Juliana

# Issue 218: OpenVDS cuts one char from folder name in GC path (Dzmitry Malkevich (EPAM), 2023-11-15)
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/218

We've found an issue with OpenVDS 3.2.7 (and possibly all 3.2.* versions) and 3.3.1 in GC: the first character of the folder name is lost in the path to the SEGY file.
We have Seismic dataset:
```json
{
"sbit_count": 0,
"last_modified_date": "Wed Sep 27 2023 18:04:33 GMT+0000 (Coordinated Universal Time)",
"created_by": "109239448567816450362",
"sbit": null,
"subproject": "fgx",
"path": "/",
"gcsurl": "osdu-data-prod-m19-ss-seismic/5a3a7d7b-3a94-4d7a-9bcb-68c133d19e77/96bd9293-3358-4716-8a04-23806f63053e",
"readonly": false,
"filemetadata": {
"md5Checksum": null,
"nobjects": 1,
"size": 277427976,
"type": "GENERIC",
"tier_class": null
},
"name": "ST0202R08_PS_PSDM_RAW_PP_TIME.MIG_RAW.POST_STACK.3D.JS-017534.segy",
"ctag": "l7nnwm4Onkcjg79Dm19;m19",
"created_date": "Wed Sep 27 2023 18:04:03 GMT+0000 (Coordinated Universal Time)",
"ltag": "m19-seismic-DDMS-Legal-Tag-PRFC",
"tenant": "m19",
"access_policy": "uniform"
}
```
and Seismic path is `sd://m19/fgx/ST0202R08_PS_PSDM_RAW_PP_TIME.MIG_RAW.POST_STACK.3D.JS-017534.segy`.
When we run the SEGY-to-VDS conversion, OpenVDS tries to download this file from `https://storage.googleapis.com/osdu-data-prod-m19-ss-seismic/a3a7d7b-3a94-4d7a-9bcb-68c133d19e77/96bd9293-3358-4716-8a04-23806f63053e/0` and fails, as the URL is not correct: the first character of the folder name is missing. The correct path should be `https://storage.googleapis.com/osdu-data-prod-m19-ss-seismic/5a3a7d7b-3a94-4d7a-9bcb-68c133d19e77/96bd9293-3358-4716-8a04-23806f63053e/0`.
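Losing exactly one leading character of a path segment is the signature of an off-by-one while stripping a `/` separator. A toy illustration of that failure mode (this is not the actual OpenVDS code, just a demonstration of the symptom):

```python
# Toy demonstration of the symptom (NOT the actual OpenVDS code):
# slicing one character too many after a '/' separator drops the
# first character of the following folder name.
path = "5a3a7d7b-3a94-4d7a-9bcb-68c133d19e77/96bd9293-3358-4716-8a04-23806f63053e"
with_separator = "/" + path

wrong = with_separator[2:]  # skips the '/' AND the first character
right = with_separator[1:]  # skips only the '/'

assert wrong.startswith("a3a7d7b")   # leading '5' lost, as in the report
assert right.startswith("5a3a7d7b")
```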
As a result, the conversion fails:
```text
[2023-10-27, 09:36:52 UTC] {pod_launcher.py:198} INFO - Event: segy-vds-conversion.1ff9faad21db4204a03ac62025f93454 had an event of type Running
[2023-10-27, 09:36:52 UTC] {pod_launcher.py:149} INFO - [Could not open input file] sd://m19/fgx/ST0202R08_PS_PSDM_RAW_PP_TIME.MIG_RAW.POST_STACK.3D.JS-017534.segy: Http error response: 403 -> https://storage.googleapis.com/osdu-data-prod-m19-ss-seismic/a3a7d7b-3a94-4d7a-9bcb-68c133d19e77/96bd9293-3358-4716-8a04-23806f63053e/0
[2023-10-27, 09:36:53 UTC] {pod_launcher.py:198} INFO - Event: segy-vds-conversion.1ff9faad21db4204a03ac62025f93454 had an event of type Running
```
Conversion works with the image community.opengroup.org:5555/osdu/platform/domain-data-mgmt-services/seismic/open-vds/openvds-ingestion:latest, which seems to be version 3.1.41.
Please check and advise as this affects M21 Pre-shipping testing.
cc: @Yan_Sushchynski, @Yauhen_Shaliou

Assignee: Morten Ofstad

# Issue 210: Error uploading VDS into SD Path using OpenVDS+ (Juliana Fernandes, 2023-11-14)
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/210

Hello,
I'm trying to upload a local VDS into an SD Path in AWS M20 pre-shipping and get an SDMS error (wrong location).
Some values were provided by e-mail, so I will not paste them here, but if you need to test, just let me know.
The command I'm using is:
```
VDSCopy.exe -d "SdAuthorityUrl=https://prsh.testing.preshiptesting.osdu.aws/api/seismic-store/v3;SdApiKey=ABC;AuthTokenUrl={{received_by_email}};client_id={{received_by_email}};client_secret={{received_by_email}};grant_type=refresh_token;refresh_token={{generated_in_the_login}};LegalTag=osdu-public-usa-dataset-1;scopes=openid email;Region=us-east-2" E:\Juliana\osdu\osdu_test\ST0202R08_PS_PSDM_RAW_PP_TIME_MIG_RAW_POST_STACK_3D_JS_017534_tol1_JFA.vds sd://osdu/vdstestsjfa/ST0202R08_PS_PSDM_RAW_PP_TIME_MIG_RAW_POST_STACK_3D_JS_017534_tol1_JFA.vds
```
And the error I'm getting is:
```
[Could not create VDS sd://osdu/vdstestsjfa/test/ST0202R08_PS_PSDM_RAW_PP_TIME_MIG_RAW_POST_STACK_3D_JS_017534_tol1_JFA.vds] Error on uploading VolumeDataLayout object: Http error response: 301 -> https://psosdu-shared-seismicddms-20230814174725984500000004.s3.us-east-1.amazonaws.com/3o7c5j88s1ko0oyg/2b9b212b-21b5-4ccf-aaed-67485c113ae4/VolumeDataLayout: The bucket you are attempting to access must be addressed using the specified endpoint. Please send all future requests to this endpoint.
```
It seems to be accessing the wrong location, since the instance is located in us-east-2.
Regards,
Juliana.

# Issue 215: The sd protocol is failing for IBM (Anuj Gupta, 2023-10-27)
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/215

The sd protocol is failing for IBM: a call to `vds = openvds.open(url, con)` results in a 404 error, and it seems a character after the `/` is getting escaped/skipped.
If the path is `ss-dev-seismic-dh2cqj2dwyr3tsz9/f013db48-47f5-430b-a10e-c5f6622712d2`,
the bucket name is: `ss-dev-seismic-dh2cqj2dwyr3tsz9`
and the subpath/key should be: `f013db48-47f5-430b-a10e-c5f6622712d2`,
whereas the subpath/key actually used is:
`013db48-47f5-430b-a10e-c5f6622712d2` (~~f~~013db48-47f5-430b-a10e-c5f6622712d2)

Milestone: 2023-10-13

# Issue 216: Data retrieval from requestVolumeTraces produces results inconsistent with original segy and requestVolumeSubset (Daniel Morgan, 2023-10-23)
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/216

Using python libraries built from 3.2.6 to read, but SEGYImport is version 3.3.255.
We have been trying to verify the fidelity of SEGY-to-VDS conversions by comparing individual trace data from the original SEGY with the resulting VDS file. Since we were retrieving individual traces, we thought to use VolumeDataAccessManager.requestVolumeTraces, but the retrieved data did not match the original in many cases.
Test file was from Volve test set: "/ST0202/ST10010ZC11_MIG_VEL.MIG_VEL.VELOCITY.3D.JS-017527.segy"
Import used no compression, data retrieval LOD 0.
For ease of comparison, we wanted to retrieve the last trace from the file, corresponding to inline/crossline 10396, 2800 (trace number 418996 from the SEGY). The inline/crossline coordinates convert to inline/crossline indices 435 and 960 respectively.
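The annotation-to-index conversion mentioned above can be sketched as follows (the axis minimum and step values are hypothetical, chosen only so the example reproduces the indices quoted in the report):

```python
# Sketch of annotation-to-index conversion: index = (coord - min) / step.
# The min/step values below are hypothetical, picked to reproduce the
# indices quoted above; real values come from the dataset's axis metadata.
def annotation_to_index(coord: int, coord_min: int, step: int) -> int:
    index, remainder = divmod(coord - coord_min, step)
    if remainder:
        raise ValueError(f"{coord} is not on the annotation grid")
    return index

assert annotation_to_index(10396, 9961, 1) == 435  # inline
assert annotation_to_index(2800, 1840, 1) == 960   # crossline
```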
Retrieval: `trace = accessManager.requestVolumeTraces([[435, 960]], traceDimension=0, lod=0)`
This is the result when compared with original trace data:
![image](/uploads/abd85b26d19f538913dd687dee607edd/image.png)
Now when we use VolumeDataAccessManager.requestVolumeSubset using min/max tuples narrowed to a single trace, our retrieved data matches the original segy perfectly.
```python
minTup = (0, 960, 435, 0, 0, 0)
maxTup = (236, 961, 436, 0, 0, 0)
trace = accessManager.requestVolumeSubset(minTup, maxTup)
```
![image](/uploads/6494e54698f8d841b853b93bcefa953c/image.png)
My assumption is that we may be misusing requestVolumeTraces, but I am unclear in what way.

# Issue 212: VDSCopy hanging when uploading to Seismic DDMS (Vinicius Vicente Silva Rosa, 2023-10-23)
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/212

I am attempting to upload a local VDS file (1.5 TB) to an SD Path, and after approximately an hour there is no visible progress in the file upload, creating the impression that the process is stalled. No error messages are being displayed. I suspect it may be related to the token refresh.
We are using the command line below:
OSDU/ADME M16
Lib: VDSCopy - OpenVDS+ 3.3.0, installed on Linux
```bash
VDSCopy -a 01 -a 02 -a 12 --tolerance=1.0 --compression-method=Wavelet -d 'sdAuthorityUrl=
https://{HOST}.energy.azure.com/seistore-svc/api/v3;authTokenUrl=https://login.microsoftonline.com/{TENANT}/oauth2/v2.0/token/;client_id={APP_ID};client_secret={APP_SECRET};scopes={APP_ID}/.default;'
'/local_disk0/vds/FILE.vds' 'sd://{TENANT}/{SUBPROJECT}/dataset_name.vds'
```
The intention of the command above is to authenticate using the ClientID and ClientSecret.
The upload completes successfully when the file is processed within an hour or less.

# Issue 63 (seismic-store-service): Pipeline for GCP does not include all required variables (Mikhail Piatliou (EPAM), 2023-10-12)
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/63

The job https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/jobs/1180886 fails because the pipeline does not include all required `yaml` files from the common CI/CD pipelines, e.g. `$OSDU_GCP_GCR_REGISTRY`, which is currently defined here: https://community.opengroup.org/osdu/platform/ci-cd-pipelines/-/blob/master/cloud-providers/osdu-gcp-global.yml.
The GCP team uses a common approach in pipelines for all environments: in a service we include the common `yaml` file, like here: https://community.opengroup.org/osdu/platform/ci-cd-pipelines/-/blob/master/cloud-providers/osdu-gcp-global.yml. This global file itself has all the required configurations for the different environments inside.
Because of that, it is not enough to include just the variables `yaml`, as we can see here: https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/blob/master/devops/osdu/cloud-providers/gcp.yml.
It would be great if the seismic team could adjust their pipelines to avoid future GCP pipeline failures.
Cc: @Kateryna_Kurach @Oleksandr_Kosse

Assignee: Daniel Perez