Seismic issues — https://community.opengroup.org/groups/osdu/platform/domain-data-mgmt-services/seismic/-/issues (export dated 2024-02-26)

https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/21
Unable to use OpenVDS and OpenZGY libraries to access data in Seismic DDMS on AWS M15 (Michael, updated 2024-02-26)
More details can be found here: https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/439
Both OpenVDS and OpenZGY data have been ingested into Seismic DDMS on AWS M15. I was able to download the OpenZGY file using SDUtil.
However, when I try to access the OpenZGY data using the OpenZGY library, I get the following exception:
`Initialize: Seismic Store: sdapi 3.16.0 - Encountered network error when sending http request`
When I try to access the OpenVDS data using the latest OpenVDS library, I get the following exception:
`-- sdapi 3.17.0 - Fri Feb 17 16:43:30 2023 -- Head Object error: - Encountered network error when sending http request`
I can download the OpenZGY data using the latest version of SDUtil. This leads me to think that the SDAPI library is not able to properly access seismic data in Seismic DDMS for AWS M15.
Is there another version of the SDAPI library that I should be using with the OpenVDS and OpenZGY libraries?

https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/99
Include aws region in dataset information for AWS Seismic DDMS data (Michael, updated 2024-02-26)
When using sdapi to retrieve Seismic DDMS data coming from AWS, a user needs to first set the AWS_REGION environment variable (see ticket https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/21).
To better handle this use case, the get-dataset service `/dataset/tenant/{tenantid}/subproject/{subproject}/dataset/{datasetid}` should provide information regarding the AWS region if the dataset is stored in S3 storage.

https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/125
"OpenVDS Works IBM Platform Validation SSDMS_to_SSDMS conversion CI/CD" is failing at Segy to VDS conversion "Check the triggered OpenVDS workflow status" (Ankit Goyal, updated 2024-02-26)
```text
AIRFLOW_CTX_DAG_RUN_ID=2d859565-e230-40f1-824e-84294e90ef94
[2022-06-15 13:17:49,173] {kubernetes_pod.py:365} INFO - creating pod with labels {'dag_id': 'openvds_import', 'task_id': 'segy_to_vds_ssdms_conversion', 'execution_date': '2022-06-15T131744.4632090000-bb7a04682', 'try_number': '1'} and launcher <airflow.providers.cncf.kubernetes.utils.pod_launcher.PodLauncher object at 0x7f0ed4a2ad60>
[2022-06-15 13:17:49,300] {pod_launcher.py:198} INFO - Event: segy-vds-conversion.4b797bbebaa84dd8b718869a9505e28b had an event of type Pending
[2022-06-15 13:17:49,300] {pod_launcher.py:128} WARNING - Pod not yet started: segy-vds-conversion.4b797bbebaa84dd8b718869a9505e28b
[2022-06-15 13:17:50,322] {pod_launcher.py:198} INFO - Event: segy-vds-conversion.4b797bbebaa84dd8b718869a9505e28b had an event of type Pending
[2022-06-15 13:17:50,322] {pod_launcher.py:128} WARNING - Pod not yet started: segy-vds-conversion.4b797bbebaa84dd8b718869a9505e28b
[2022-06-15 13:17:51,341] {pod_launcher.py:198} INFO - Event: segy-vds-conversion.4b797bbebaa84dd8b718869a9505e28b had an event of type Pending
[2022-06-15 13:17:51,341] {pod_launcher.py:128} WARNING - Pod not yet started: segy-vds-conversion.4b797bbebaa84dd8b718869a9505e28b
[2022-06-15 13:17:52,358] {pod_launcher.py:198} INFO - Event: segy-vds-conversion.4b797bbebaa84dd8b718869a9505e28b had an event of type Pending
[2022-06-15 13:17:52,358] {pod_launcher.py:128} WARNING - Pod not yet started: segy-vds-conversion.4b797bbebaa84dd8b718869a9505e28b
[2022-06-15 13:17:53,372] {pod_launcher.py:198} INFO - Event: segy-vds-conversion.4b797bbebaa84dd8b718869a9505e28b had an event of type Failed
[2022-06-15 13:17:53,373] {pod_launcher.py:308} ERROR - Event with job id segy-vds-conversion.4b797bbebaa84dd8b718869a9505e28b Failed
[2022-06-15 13:17:53,386] {pod_launcher.py:198} INFO - Event: segy-vds-conversion.4b797bbebaa84dd8b718869a9505e28b had an event of type Failed
[2022-06-15 13:17:53,386] {pod_launcher.py:308} ERROR - Event with job id segy-vds-conversion.4b797bbebaa84dd8b718869a9505e28b Failed
[2022-06-15 13:17:53,402] {pod_launcher.py:198} INFO - Event: segy-vds-conversion.4b797bbebaa84dd8b718869a9505e28b had an event of type Failed
[2022-06-15 13:17:53,402] {pod_launcher.py:308} ERROR - Event with job id segy-vds-conversion.4b797bbebaa84dd8b718869a9505e28b Failed
[2022-06-15 13:17:53,468] {taskinstance.py:1501} ERROR - Task failed with exception
```
Assignee: Anuj Gupta

https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/141
SEGYImport ignores scaling factor for integral samples (Paal Kvamme, updated 2024-02-26)
SEGYImport doesn't seem to handle offset/scale when reading SEG-Y files with integral samples. Actually, SEG-Y doesn't allow specifying an offset, but it can specify a scale. The scale is found in the TRWF field, bytes 169-170, in the trace header (assuming I understand the spec correctly).
If the TRWF is the same for all traces, the scale factor (2^-TRWF) could easily be stored in the VDS metadata, so this is arguably a bug. See createChannelDescriptors() in SEGYImport.cpp.
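As an illustration, a small Python sketch of the constant-TRWF case described above (assuming TRWF holds the number of fractional bits so the scale factor is 2^-TRWF; the function names are illustrative, not SEGYImport's actual code):

```python
def trwf_scale_factor(trwf: int) -> float:
    """Scale factor implied by the TRWF trace-header field (bytes 169-170)."""
    return 2.0 ** -trwf

def scale_trace(samples, trwf):
    """Convert integral samples to floats using a constant per-file TRWF."""
    scale = trwf_scale_factor(trwf)
    return [s * scale for s in samples]

# With TRWF = 2 the scale factor is 0.25, so integral sample 8 becomes 2.0.
```

For the varying-TRWF case discussed next, the same per-trace scaling would have to be applied trace by trace before the samples are stored as float.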
Varying TRWF is trickier and would probably require the samples to be converted to float, with each trace scaled individually. Note that I have not seen such files in the wild, so this second issue might be of academic interest only.

https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/188
Compressed and Uncompressed size of a VDS (Jørgen Lind, updated 2024-02-26)
It would be nice to be able to get the compressed and uncompressed size of a VDS. It would also be very handy if this was exposed in VDSInfo.

https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/199
SEGYImport DataProvider for DMS needs to support chunked datasets (Morten Ofstad, updated 2024-02-26)
SEG-Y datasets in DMS uploaded with sdutil will have a default chunk size of 32 MB; the DataProvider class needs to support this in order to successfully import the data. See this issue for details:
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/585#note_246359
Assignee: Deepa Kumari

https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/222
VolumeData class not included in Java Library (Julien Lacoste, updated 2024-02-26)
The VolumeData class is not generated in the Java version.
Apparently it's missing from the CMakeLists.txt.

https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/128
Subproject creation accepts non-existing groups in ACLs (Yan Sushchynski (EPAM), updated 2024-02-26)
## Description of the problem
There is an issue where it is possible to create a new subproject with non-existing groups in the `acls` field. After that, any action in the subproject, except deleting it, throws a `403`.
## Steps to reproduce it
1. Create a new subproject with invalid acls:
```shell
curl --location --request POST 'https://<svc_url>/v3/subproject/tenant/osdu/subproject/test-123' \
--header 'x-api-key: {{SVC_API_KEY}}' \
--header 'Content-Type: application/json' \
--header 'ltag: osdu-demo-legaltag' \
--header 'appkey: {{DE_APP_KEY}}' \
--header 'Authorization: Bearer <token>' \
--data-raw '{
"storage_class": "REGIONAL",
"storage_location": "US-CENTRAL1",
"acls": {
"admins": [
"data.sdms.non-existing.admin@osdu.group"
],
"viewers": [
"data.sdms.non-existing.viewer@osdu.group"
]
}
}'
```
This request is executed without any error.
2. Try to upload any file to the subproject:
```shell
python sdutil cp somefile sd://osdu/test-123/somefile
```
Output:
```
[403] [seismic-store-service] User not authorized to perform this operation
```
Assignees: Diego Molteni, Sacha Brants

https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/227
Documentation request: Unclear on constraints/expectations on axes for 3D datasets (Kevin McCarty, updated 2024-02-20)
Hello,
I'm writing some utilities for my employer, Dynamic Graphics Inc., that are intended to convert between OpenVDS datasets and our own 3D grid file formats (one of them being a quite old internally-invented proprietary binary format, and the more recent one being HDF5-based), not unlike `SEGYImport` and `SEGYExport`.
As I started working on this, I found that numerous questions arose regarding the axes/dimensions for OpenVDS datasets that aren't really covered, or are at best glossed over, by the online docs. Many thanks in advance for any light you can shed on them!
1) I'm presuming that for 3D datasets, axis 0 (that is, the axis named by `layout->GetAxisDescriptor(0)`) is always defined as being along the most rapidly-varying index in linear memory, while axis 2 is always along the most slowly-varying index in linear memory, is this correct?
2) For data in the inline/crossline/sample axis description system:
a) Will it always be the case that axis 0 is "sample", axis 1 is "crossline" and axis 2 is "inline" ? That seems to be what this line of `examples/GettingStarted/main.cpp` (as of OpenVDS 3.3.1) implies:
```plaintext
const int sampleDimension = 0, crosslineDimension = 1, inlineDimension = 2;
```
but is it a robust assumption? Currently I am checking that the names of the three axes are in this expected order and erroring out otherwise (which is an expectation met by all 3D datasets to which I have access with "inline" / "crossline" / "sample" axes).
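The defensive check described above can be sketched in pure Python as follows (the expected axis-name order is exactly the assumption being asked about, and `axis_names` stands in for the values an application would collect from the layout's axis descriptors):

```python
# Assumed ordering for seismic VDS datasets: axis 0 = Sample,
# axis 1 = Crossline, axis 2 = Inline (the convention GettingStarted uses).
EXPECTED_SEISMIC_AXES = ("Sample", "Crossline", "Inline")

def check_axis_order(axis_names) -> bool:
    """Return True iff the axes come in the assumed sample/crossline/inline order."""
    return tuple(axis_names) == EXPECTED_SEISMIC_AXES
```

An importer can run this check up front and error out, rather than silently producing transposed volumes, which is the strategy described above.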
b) If I have a dataset in our own 3D format for which the inline or crossline number _decreases_ in the positive direction (for the ordering in linear memory) of one or both of the two horizontal axes, this does not seem to be supported by OpenVDS. Do I have that correct? In other words, if I have such a 3D grid in our own format, it looks like I'll need to take care of moving the XY origin to the appropriate one of the other three grid corners in XY and swapping the nodes around correspondingly in memory, in order to be able to create an OpenVDS dataset where the inline/crossline numbers increase in the same direction as the node ordering in memory? Or am I missing something that would make this unnecessary (e.g. the axis descriptor min/max being possible to set in reversed order?)
3) Regarding the I/J/K axis description system:
a) Is there any specific mapping that is required/expected between the I/J/K labels and the 0/1/2 axis numbering? Or is it possible that I might find (say) one dataset where axis 2 is the K axis, and another dataset where axis 0 is the K axis (and so on) ?
b) In the transformation from axes to world coordinates using the metadata, are the world Z-coordinates always in the downward sense (i.e. more positive values are farther underground) as is stated to be the case with the inline/crossline/sample system? If not, how can I tell whether the Z-coordinate axis points upward or downward?
c) In the I/J/K system, do the axis descriptors' `GetCoordinateMin()/Max()` accessors return anything meaningful? (Since it would seem that the I/J/K axes are defined completely by the origin and step metadata, and by the axes' `GetNumSamples()` getters?) If so, in what way would they be used?
d) Should I expect that all VDS datasets with the I/J/K axis description system have the same handedness for the I/J/K axes (which?), or is this not guaranteed? For the inline/crossline/sample system I have found datasets with both chiralities.
4) Basically the same questions as (3a,b) for the X/Y/Z axis description system.
I'm not aware of any VDS datasets that exhibit the I/J/K or X/Y/Z coordinate systems; if there are public datasets available that do so, a pointer would be enormously appreciated!
5) It appears that the only units for Z coordinates that are supported are milliseconds, meters, and feet (and also apparently US survey feet?). I'm unclear what happens on attempting to create an OpenVDS dataset having a vertical axis with a different unit -- e.g. "fathoms" or "km" or "seconds". Is it allowed but unsupported (and if so is it round-trippable)? Or will it provoke an exception or crash?
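To make question 5 concrete, here is a sketch of the kind of unit whitelist being asked about. The exact strings are an assumption on my part; the authoritative list is the `KNOWNMETADATA_UNIT_` defines in `KnownMetadata.h`:

```python
# Assumed vertical-axis unit strings; verify against KnownMetadata.h.
KNOWN_Z_UNITS = {"ms", "m", "ft", "ussft"}

def is_known_z_unit(unit: str) -> bool:
    """Return True iff the unit string is in the assumed known set."""
    return unit in KNOWN_Z_UNITS
```

The open question is what a writer should do when this check fails for units like "fathoms" or "seconds": reject, pass through, or convert.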
Again, thanks so much for your help!

https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/229
AWS Directory Buckets (Klaas Koster, updated 2024-02-17)
AWS has recently introduced something called Directory Buckets in addition to their General Purpose Buckets:
https://opsinsights.dev/exploring-the-newest-s3-bucket
Supposedly these come with lower cost and lower latency. Is there a recommendation to use these new Directory Buckets for VDS?
Assignee: Morten Ofstad

https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/228
Documentation request: Unclear on constraints/expectations for channel names/units (Kevin McCarty, updated 2024-02-16)
Hi, this is a companion issue to #227, but here I'm asking about channels rather than axes.
* Is there a limit on the number of different "full-sized" channels (e.g. 3D for a dataset with 3 dimensions) that may be written to a VDS dataset?
* All the datasets that I see have "Amplitude" as the primary channel (channel 0) name. Is it legal to have a dataset where "Amplitude" is not the primary channel, or where it is not present at all?
* The datasets imported from SEGY all seem to have auxiliary "Trace" and "PDSTraceHeader" channels with per-trace values. Is it legal to create datasets that do not have those channels?
* What are the constraints on the _names_ of channels?
* Are they restricted to solely the names #define'd in `GlobalMetadataCommon.h` in section "Attributes' names"?
* Or, are they allowed to be any alphanumeric ASCII string?
* Or, are additional printable ASCII characters allowed as well (which ones?)?
* Or, is any printable string allowed? (Is the encoding required to be UTF-8 or something else?)
* What are the constraints on the _units_ of channels?
* Are they restricted to solely the names #define'd in `KnownMetadata.h` with `KNOWNMETADATA_UNIT_` prefixes / accessible from the `KnownUnitNames` class?
* Or, are they allowed to be any alphanumeric ASCII string? (And if so, is that round-trippable?)
* Etc.
As a motivation for this question, I might have need to create a channel named "Density" whose units are "g/cm^3" for instance. Or a channel named "Temperature" whose units are "degrees C", or "Salinity" with units "ppm". These are just examples off the top of my head; our own internal file formats allow for near-arbitrary channel names and units.
Thank you in advance for whatever information you can provide!

https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/127
Issue with Get Status API (Jiman Kim, updated 2024-02-09)
Hello, we are running some authentication testing and are running into some behaviors that may or may not be a bug.
For this endpoint:
`/seistore-svc/api/v4/status`
We have 3 tests running:
1. Sends an invalid token
2. Sends a valid token but signed with a wrong secret
3. Sends the HTTP request without an authorization header.
Tests 1 and 2 return a 401, but test 3 returns a 200.
Is this a bug or intended behavior?
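A hypothetical Python sketch of how this behavior can arise (purely illustrative, not the actual seismic-store-service code): middleware that only validates a token when the Authorization header is present, and otherwise falls through to the handler.

```python
def is_valid_token(token: str) -> bool:
    # Stand-in for real JWT signature verification.
    return token == "good-token"

def check_status_request(headers: dict) -> int:
    """Return an HTTP status code for GET /seistore-svc/api/v4/status."""
    auth = headers.get("Authorization")
    if auth is None:
        return 200  # header absent: validation is skipped entirely
    token = auth[len("Bearer "):] if auth.startswith("Bearer ") else auth
    if not is_valid_token(token):
        return 401  # invalid or wrongly-signed token
    return 200
```

If the service's middleware follows this shape, the missing-header case would need an explicit check to return 401 as well.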
Thank you!
Milestone: M21 - Release 0.24

https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/213
NEWBIE - installation on ARM 64 Linux graviton (Klaas Koster, updated 2024-02-01)
Following the instructions, I executed:
1) cmake ..
2) make -j8
3) make install
No errors or warnings are generated, and five executables are placed in Dist/OpenVDS/bin.
Two issues:
1) The README file states that ./SEGYImport should show the Wavelet Compression option, but it does not.
2) I cannot find the .whl file anywhere that would allow me to use 'pip install' to get Python to work with OpenVDS.
What did I do wrong or forget?

https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/158
Installation of master fails on MacOS 13.0 with Arm64 (Alexander Jaust, updated 2024-02-01)
## Description
I am trying to build OpenVDS `master` on an Arm64 Mac with the current MacOS release (Ventura), but it fails. Any input would be appreciated. I would also try to supply patches/merge requests where it makes sense.
I am using the following command to configure
```text
cmake -S . \
-B build \
-DCMAKE_BUILD_TYPE=Release \
-DBUILD_SHARED_LIBS=ON \
-DBUILD_JAVA=OFF \
-DBUILD_PYTHON=ON \
-DBUILD_EXAMPLES=ON \
-DBUILD_TESTS=OFF \
-DBUILD_DOCS=OFF \
-DDISABLE_AWS_IOMANAGER=ON \
-DDISABLE_AZURESDKFORCPP_IOMANAGER=ON \
-DDISABLE_GCP_IOMANAGER=ON \
-DDISABLE_DMS_IOMANAGER=ON \
-DDISABLE_STRICT_WARNINGS=ON \
-DCMAKE_INSTALL_PREFIX="${INSTALLATION_DIR}" \
```
where `INSTALLATION_DIR` points to `/Users/aej/software/openvds-master-install-python`.
and the following command line for building OpenVDS
```text
cmake --build "build" \
--config Release \
--target install \
-j 1 \
--verbose \
```
## Expectation
OpenVDS is built and installed in the specified directory.
## Actual behavior
The build fails. I found the following problems
1. If I delete a downloaded third-party dependency from the `3rdParty` directory, delete my build directory and then rerun the CMake configuration step the automatic fetching of the library fails. After a second deletion of the build directory and rerunning the CMake configuration step the third-party library seems to be fetched correctly.
2. I get a problem due to the inclusion of `curl.h` by `cpprestsdk`
```text
In file included from /Users/aej/software/compilescripts/openvds/openvds-3.1.0-src/src/OpenVDS/IO/IOManagerCurl.h:41:
/Library/Developer/CommandLineTools/SDKs/MacOSX13.0.sdk/usr/include/curl/curl.h:115:41: error: too few arguments provided to function-like macro invocation
__has_declspec_attribute(dllimport))
```
This seems to be related to changed behavior of LLVM/clang; it appears [with other projects](https://github.com/llvm/llvm-project/issues/53269) and has been reported to [cURL as well](https://github.com/curl/curl/issues/8293). It seems to be some interaction of cURL and Casablanca. There is an [issue](https://github.com/microsoft/cpprestsdk/issues/1710) and a [pull request](https://github.com/microsoft/cpprestsdk/pull/1723) in the `cpprestsdk` repository for this, but they say that this will not be fixed since `cpprestsdk` is in maintenance mode.
I can fix it by commenting out the `#define dllimport`, but I am not sure if that is the best thing to do.
```text
cpprestapi_file="${SOURCE_DIR}/3rdparty/cpprestapi-2.10.16/Release/include/cpprest/details/cpprest_compat.h"
sed -i '' 's/\#define dllimport/\/\/\#define dllimport/' "${cpprestapi_file}"
sed -i '' 's/\/\/\/\/\#define dllimport/\/\/\#define dllimport/' "${cpprestapi_file}"
```
As the [`cpprestsdk`](https://github.com/microsoft/cpprestsdk) project is marked as being in maintenance mode, it may be necessary to move to another project in the (near?) future.
Side question: Why is the package called `cpprestapi` within the OpenVDS project? It makes debugging a bit confusing since the actual package/repository is called `cpprestsdk`.
3. Building the AWS IOManager fails
```text
/Users/aej/software/compilescripts/openvds/openvds-master-src/src/OpenVDS/IO/IOManagerAWSCurl.h:10:10: fatal error: 'aws/crt/auth/Credentials.h' file not found
#include <aws/crt/auth/Credentials.h>
^~~~~~~~~~~~~~~~~~~~~~~~~~~~
1 error generated.
```
The file exists, though, if I search for it from the OpenVDS repository root
```text
$ find . -iname "Credentials.h" -type f
./3rdparty/google-cloud-cpp-1.14.0/google/cloud/storage/oauth2/credentials.h
./3rdparty/aws-cpp-sdk-1.9.336_/aws-cpp-sdk-cognito-identity/include/aws/cognito-identity/model/Credentials.h
./3rdparty/aws-cpp-sdk-1.9.336_/aws-cpp-sdk-finspace-data/include/aws/finspace-data/model/Credentials.h
./3rdparty/aws-cpp-sdk-1.9.336_/aws-cpp-sdk-connect/include/aws/connect/model/Credentials.h
./3rdparty/aws-cpp-sdk-1.9.336_/aws-cpp-sdk-sts/include/aws/sts/model/Credentials.h
./3rdparty/aws-cpp-sdk-1.9.336_/crt/aws-crt-cpp/include/aws/crt/auth/Credentials.h
./3rdparty/aws-cpp-sdk-1.9.336_/crt/aws-crt-cpp/crt/aws-c-auth/include/aws/auth/credentials.h
```
I am currently a bit stuck at this step since I have not yet found any straightforward way to avoid this problem. I assume that the include paths are not populated properly.
## System
- Arm64 MacOS 13.0.1
- OpenVDS `master` branch
- clang 14.0.0
```text
$ clang --version
Apple clang version 14.0.0 (clang-1400.0.29.202)
Target: arm64-apple-darwin22.1.0
Thread model: posix
InstalledDir: /Library/Developer/CommandLineTools/usr/bin
```
- cmake 3.24.3 (via Homebrew)

https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/219
[IOManagerAWS] - question about STS credential provider for AWS (Filip Brzęk, updated 2024-01-31)
Hi,
Is it currently possible to form `OpenOptions` for the AWS backend by specifying a roleARN that should be used to generate an STS token?
After glancing at [IOManagerAWSCurl](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/blob/master/src/OpenVDS/IO/IOManagerAWSCurl.cpp?ref_type=heads#L180) it seems it defaults to `CreateCredentialsProviderChainDefault()`.
Is there a way in the current code path to specify a specific role to be used for S3 access, or would it have to be added separately?
If so, are there some style/contribution guidelines about adding new `OpenOptions`?
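To make the question concrete, here is a hypothetical sketch (Python, purely illustrative — the real `OpenOptions` is a C++ class and has no such field today) of what carrying a role ARN through the open options might look like, with the provider chosen based on whether a role is supplied:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AWSOpenOptions:
    """Hypothetical option set; 'role_arn' is the proposed addition."""
    bucket: str
    region: str
    role_arn: Optional[str] = None  # if set, assume this role via STS

def credentials_strategy(opts: AWSOpenOptions) -> str:
    """Pick a credential provider, mirroring the defaulting seen in IOManagerAWSCurl."""
    if opts.role_arn:
        return "sts-assume-role"
    return "default-provider-chain"
```

The real change would presumably replace the unconditional `CreateCredentialsProviderChainDefault()` call with a branch of this shape.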
Thanks,
Filip

https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/223
Java Binding crash when querying VolumeTraces with LOD (Julien Lacoste, updated 2024-01-31)
Using the Java binding, the call to the VolumeDataAccessManager::requestVolumeTraces method causes an application crash if LOD > 0.

https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/182
The sdapi proxy is neither forward nor backward compatible. (Paal Kvamme, updated 2024-01-30)
I have run tests on version 3.2.1 using both the old "sdapi" I/O manager and the new "proxy" I/O manager, using VDSCopy. I have built OpenVDS from sources. Observations:
- A file on the cloud written using the "sdapi" manager cannot be read with the "proxy" manager.
- A file on the cloud written using the "proxy" manager cannot be read with the "sdapi" manager.
- Additionally, the file written using the "proxy" manager is seen as corrupt by other DMS applications. The list of block names is empty. Somehow the metadata isn't stored correctly.

https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/225
DmsDataset class is using resources that have been moved from (Deepa Kumari, updated 2024-01-29)
The vector containing the response is moved from here: https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/blob/master/src/OpenVDS/IO/DmsIoFactories/DmsIoManagerFactory.cpp#L200
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/blob/master/src/OpenVDS/IO/DmsIoFactories/DmsIoManagerFactory.cpp#L223
The same resource, which was already moved from, is then used here:
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/blob/master/src/OpenVDS/IO/DmsIoFactories/DmsIoManagerFactory.cpp#L204
and https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/blob/master/src/OpenVDS/IO/DmsIoFactories/DmsIoManagerFactory.cpp#L227
Resources that have been moved from should not be used; that is the objective of the move. More ref: https://en.cppreference.com/w/cpp/utility/move
Milestone: M23 - Release 0.26
Assignee: Deepa Kumari

https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/226
Building from source (Vasilii Sinkevich, updated 2024-01-29)
Hi,
I am trying to build the static library from the source code and as a first step I did compile dynamically as instructed and noticed that Wavelet compression is not supported in the resulting library - just checked with OpenVDS::IsCompressionMethodSupported(OpenVDS::CompressionMethod::Wavelet).
Is that how it is supposed to be, or did I do something wrong?
Does it mean that both compression and decompression are unsupported, or will decompression still work?
Thank you,
Vasilii

https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/224
Logs "0.00 % Done." then terminates while converting a SEG-Y file to a VDS file (nanting liu, updated 2024-01-28)
![1706185787839](/uploads/55ea2a4c9e6cd925c4a8c5d56edb5cf4/1706185787839.jpg)
![1706185713377](/uploads/b5ae9b60826865228a3349ece03e8d5d/1706185713377.jpg)