# Seismic issues
https://community.opengroup.org/groups/osdu/platform/domain-data-mgmt-services/seismic/-/issues (feed updated 2024-02-26T20:07:38Z)

---
# VolumeData class not included in Java Library
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/222 — Julien Lacoste, 2024-02-26T20:07:38Z

The VolumeData class is not generated in the Java version. Apparently it is missing from the CMakeLists.txt.

---
# AWS Directory Bucket access
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/240 — Klaas Koster, 2024-03-24T22:23:58Z

OpenVDS 3.4.0 does not connect to the new AWS Directory Bucket.
```python
url = 's3://vds--use1-az4--x-s3/rline1601'
connection = "Region = us-east-1"
vds = openvds.open(url, connection)
```

generates the error:

```text
RuntimeError: Error on downloading VolumeDataLayout object: Http error response: 404 -> vds--use1-az4--x-s3.s3.us-east-1.amazonaws.com/rline1601/VolumeDataLayout: The specified bucket does not exist.
```
The error makes sense, because the correct address for the file is:
vds--use1-az4--x-s3.s3express-use1-az4.us-east-1.amazonaws.com/rline1601/VolumeDataLayout
So, currently OpenVDS forms the address by inserting 's3' between the bucket and the region, but for a Directory Bucket this is incorrect and should be 's3express-use1-az4'.
I would assume that the 's3express' part is universal, but that '-use1-az4' needs to come from the specified URL.
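The bucket-to-endpoint mapping described above could be sketched as follows. This is a hypothetical helper, assuming the availability-zone id can always be parsed from the `--<az-id>--x-s3` suffix of a directory bucket name:

```python
import re

def s3_endpoint_host(bucket: str, region: str) -> str:
    """Map a bucket name to its virtual-hosted endpoint host name (sketch)."""
    m = re.match(r".*--([a-z0-9-]+)--x-s3$", bucket)
    if m:
        # Directory bucket: zonal "s3express" endpoint carrying the AZ id
        return f"{bucket}.s3express-{m.group(1)}.{region}.amazonaws.com"
    # General purpose bucket: regional "s3" endpoint
    return f"{bucket}.s3.{region}.amazonaws.com"

# The bucket from this report resolves to the expected host:
# s3_endpoint_host("vds--use1-az4--x-s3", "us-east-1")
#   -> "vds--use1-az4--x-s3.s3express-use1-az4.us-east-1.amazonaws.com"
```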
I don't mind tinkering with the code and seeing how this can be fixed, but it would be great if someone (Morten?) could point me to the part of the code where these addresses are formed. I have an example Python script that correctly reads a file from a Directory Bucket, see below.

---
# GC and GC baremetal deploy fail
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/131 — Daniel Perez, 2024-03-19T14:35:58Z (assignees: Aliaksandr Ramanovich (EPAM), Yauheni Rykhter (EPAM))

---
# IBM E2E tests fail
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/130 — Daniel Perez, 2024-03-19T14:59:18Z (assignees: Anuj Gupta, Isha Kumari)

E2E tests for IBM in SDMS V3 are failing with "no healthy upstream"; this seems to be an issue with the environment itself.

---
# OpenVDS fails to build on MacOS if BUILD_CURL=ON
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/239 — Alexander Jaust, 2024-03-22T12:46:24Z

Building OpenVDS fails for me on MacOS if I want OpenVDS to build its own version of curl (`-DBUILD_CURL=ON`, which is the default). The build passes if the system's curl is used (by setting `-DBUILD_CURL=OFF`). I am also deactivating some I/O managers that I do not need, so this may influence the behavior. The error message suggests the linker is missing some part of curl when linking.
## System
- Apple M1 Max CPU
- MacOS Ventura
- OpenVDS 3.4.0 checked out via git, but also `master`
- cmake version 3.28.3
## Build command
```text
cmake -S . \
-B build \
-DCMAKE_BUILD_TYPE=Release \
-DBUILD_SHARED_LIBS=ON \
-DBUILD_JAVA=OFF \
-DBUILD_PYTHON=ON \
-DBUILD_EXAMPLES=ON \
-DBUILD_TESTS=OFF \
-DBUILD_DOCS=OFF \
-DENABLE_OPENMP=ON \
-DDISABLE_AWS_IOMANAGER=ON \
-DDISABLE_AZURESDKFORCPP_IOMANAGER=OFF \
-DDISABLE_GCP_IOMANAGER=ON \
-DDISABLE_DMS_IOMANAGER=ON \
-DDISABLE_STRICT_WARNINGS=ON \
-DCMAKE_FIND_FRAMEWORK=LAST \
-DAUTO_ADJUST_UUID=OFF \
-DBUILD_CURL=ON \
-DCMAKE_INSTALL_PREFIX="${INSTALLATION_DIR}"
```
## Error
```text
[ 79%] Linking CXX shared library libopenvds.dylib
cd /Users/AEJ/software/compilescripts/openvds/openvds-3.4.0-src/build/src/OpenVDS && /opt/homebrew/Cellar/cmake/3.28.3/bin/cmake -E cmake_link_script CMakeFiles/openvds.dir/link.txt --verbose=1
ccache /Library/Developer/CommandLineTools/usr/bin/c++ -O3 -DNDEBUG -flto=thin -arch arm64 -isysroot /Library/Developer/CommandLineTools/SDKs/MacOSX14.2.sdk -dynamiclib -Wl,-headerpad_max_install_names -s -compatibility_version 3.0.0 -current_version 3.4.0 -o libopenvds.3.4.0.dylib -install_name @rpath/libopenvds.3.dylib CMakeFiles/openvds_objects.dir/OpenVDS.cpp.o CMakeFiles/openvds_objects.dir/IO/File.cpp.o CMakeFiles/openvds_objects.dir/IO/Linux_File.cpp.o CMakeFiles/openvds_objects.dir/IO/IOManager.cpp.o CMakeFiles/openvds_objects.dir/IO/IOManagerAzureSdkForCpp.cpp.o CMakeFiles/openvds_objects.dir/IO/IOManagerInMemory.cpp.o CMakeFiles/openvds_objects.dir/IO/IOManagerCurl.cpp.o CMakeFiles/openvds_objects.dir/IO/IOManagerAzurePresigned.cpp.o CMakeFiles/openvds_objects.dir/IO/IOManagerHttp.cpp.o CMakeFiles/openvds_objects.dir/IO/IORefreshToken.cpp.o CMakeFiles/openvds_objects.dir/IO/IOManagerDmsProxy.cpp.o CMakeFiles/openvds_objects.dir/IO/DmsIoFactories/AzureDmsIoManagerFactory.cpp.o CMakeFiles/openvds_objects.dir/IO/DmsIoFactories/DmsIoManagerFactory.cpp.o CMakeFiles/openvds_objects.dir/IO/SslVerifyPeerEnv.cpp.o CMakeFiles/openvds_objects.dir/IO/SDPath.cpp.o CMakeFiles/openvds_objects.dir/VDS/VolumeDataPartition.cpp.o CMakeFiles/openvds_objects.dir/VDS/VolumeDataChannelMapping.cpp.o CMakeFiles/openvds_objects.dir/VDS/VolumeDataLayer.cpp.o CMakeFiles/openvds_objects.dir/VDS/VolumeDataLayoutImpl.cpp.o CMakeFiles/openvds_objects.dir/VDS/VolumeDataChunk.cpp.o CMakeFiles/openvds_objects.dir/VDS/VolumeDataRegion.cpp.o CMakeFiles/openvds_objects.dir/VDS/VolumeDataHash.cpp.o CMakeFiles/openvds_objects.dir/VDS/VolumeDataPageAccessorImpl.cpp.o CMakeFiles/openvds_objects.dir/VDS/VolumeDataAccessManagerImpl.cpp.o CMakeFiles/openvds_objects.dir/VDS/VolumeDataPageImpl.cpp.o CMakeFiles/openvds_objects.dir/VDS/VolumeDataAccessor.cpp.o CMakeFiles/openvds_objects.dir/VDS/DimensionGroup.cpp.o CMakeFiles/openvds_objects.dir/VDS/ParseVDSJson.cpp.o 
CMakeFiles/openvds_objects.dir/VDS/MetadataManager.cpp.o CMakeFiles/openvds_objects.dir/VDS/VolumeDataStore.cpp.o CMakeFiles/openvds_objects.dir/VDS/VolumeDataStoreIOManager.cpp.o CMakeFiles/openvds_objects.dir/VDS/VolumeDataStoreVDSFile.cpp.o CMakeFiles/openvds_objects.dir/VDS/DataBlock.cpp.o CMakeFiles/openvds_objects.dir/VDS/VolumeDataRequestProcessor.cpp.o CMakeFiles/openvds_objects.dir/VDS/VolumeIndexer.cpp.o CMakeFiles/openvds_objects.dir/VDS/Env.cpp.o CMakeFiles/openvds_objects.dir/VDS/StringToDouble.cpp.o CMakeFiles/openvds_objects.dir/VDS/GlobalStateImpl.cpp.o CMakeFiles/openvds_objects.dir/VDS/WaveletAdaptiveLLDecompress.cpp.o CMakeFiles/openvds_objects.dir/VDS/WaveletDecompress.cpp.o CMakeFiles/openvds_objects.dir/VDS/WaveletInverseTransform.cpp.o CMakeFiles/openvds_objects.dir/VDS/WaveletTypes.cpp.o CMakeFiles/openvds_objects.dir/VDS/FSE/entropy_common.cpp.o CMakeFiles/openvds_objects.dir/VDS/FSE/fse_decompress.cpp.o CMakeFiles/openvds_objects.dir/VDS/Rle.cpp.o CMakeFiles/openvds_objects.dir/__/__/common/Base64/Base64.cpp.o "CMakeFiles/openvds_objects.dir/__/__/3rdparty/jsoncpp-1.8.4/src/lib_json/json_reader.cpp.o" "CMakeFiles/openvds_objects.dir/__/__/3rdparty/jsoncpp-1.8.4/src/lib_json/json_value.cpp.o" "CMakeFiles/openvds_objects.dir/__/__/3rdparty/jsoncpp-1.8.4/src/lib_json/json_writer.cpp.o" -Wl,-rpath,@executable_path -ldl -pthread ../../3rdparty/BuildAzureSdkForCpp/libazure-core.a ../../3rdparty/BuildLibXml2/libLibXml2.a libhue_bds_objects.a ../../fmt_9.1.0/libfmt.a ../../libuv_1.44.2_install/Release/lib/libuv_a.a ../../curl_7.85.0_install/Release/lib/libcurl.a ../../openssl_3.0.12_install/Release/lib/libssl.a ../../openssl_3.0.12_install/Release/lib/libcrypto.a ../../zlib_1.2.12_install/Release/lib/libz.a
ld: warning: -s is obsolete
ld: Undefined symbols:
_CFRelease, referenced from:
_Curl_resolv in libcurl.a[53](hostip.c.o)
_SCDynamicStoreCopyProxies, referenced from:
_Curl_resolv in libcurl.a[53](hostip.c.o)
clang: error: linker command failed with exit code 1 (use -v to see invocation)
make[2]: *** [src/OpenVDS/libopenvds.3.4.0.dylib] Error 1
make[1]: *** [src/OpenVDS/CMakeFiles/openvds.dir/all] Error 2
make: *** [all] Error 2
```

---
# Segmentation fault when opening a corrupted VDS file
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/238 — Sveinung Rundhovde, 2024-03-22T12:47:53Z

Version: 3.4.0

When opening an invalid VDS file, a pointer to 0x0 is returned in src/OpenVDS/VDS/VolumeDataStoreVDSFile.cpp:645. There is an assert on the next line checking for this. Perhaps this should also be checked in release mode, to provide a better error message?

---
# Buffer over-read
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/237 — Sveinung Rundhovde, 2024-03-07T10:00:39Z

There is a buffer over-read in src/OpenVDS/VDS/ConvertValues.h:
```cpp
template <typename T>
static void CopyFrom1Bit(void * __restrict voiddst, const void* __restrict voidsrc, int32_t count)
{
  T* target = (T*)voiddst;
  uint8_t* source = (uint8_t*)voidsrc;
  uint8_t bits = *source;
  int32_t mask = 1;
  for (int i = 0; i < count; i++)
  {
    *target = (bits & mask) ? T(1) : T(0);
    target++;
    mask <<= 1;
    if (mask == 0x100)
    {
      source++;
      bits = *source; // <-- Read one past end of buffer on this line.
      mask = 1;
    }
  }
}
```
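For illustration, the same 1-bit unpacking can be written so that the source byte is fetched on demand, which never touches a byte past index `(count - 1) // 8`. This is a Python sketch of the technique, not the OpenVDS implementation:

```python
def copy_from_1bit(src: bytes, count: int) -> list:
    """Unpack `count` 1-bit values, least significant bit first, without over-reading."""
    out = []
    for i in range(count):
        byte = src[i // 8]  # highest index ever used is (count - 1) // 8
        out.append((byte >> (i % 8)) & 1)
    return out

# copy_from_1bit(bytes([0b10110101]), 8) -> [1, 0, 1, 0, 1, 1, 0, 1]
```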
I assume the over-read happens on the last iteration of the loop, so the value is never used. It is highly unlikely to cause any issues, but there is a theoretical possibility that it causes a segmentation fault.

---
# Unexpected errors when accessing dataset with OpenVDS 3.4.0
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/236 — Alexander Jaust, 2024-03-22T12:46:03Z

Hi,

I am currently testing OpenVDS 3.4.0 and I really appreciate the increased number of error messages. (Un)fortunately, I am also hit by some of them. I am not sure if I am doing anything wrong or if our data set in the Azure storage account is broken. The error messages I see do not appear when using OpenVDS 3.3.3.
I have attached, at the bottom of this issue, some Python scripts that I used to produce the error messages. I can also split this issue into several if it is too much content.
Any hints to what is going wrong and on how to debug this would be greatly appreciated.
## Versions, storage, dataset
* OpenVDS 3.4.0
* Azure blob store
* Dataset: Volve `ST10010ZC11_PZ_PSDM_KIRCH_FULL_T.MIG_FIN.POST_STACK.3D.JS-017536` with RLE compression and brick size 32. The file was created in June 2023 with OpenVDS/SEGYImport, but I cannot tell which OpenVDS version was used.
This is the only file we did some testing on. I noticed that there are a few blobs with size 0. I am not sure if this is relevant.
## Observed errors
### Invalid headers
I see several messages about invalid headers, but I am not sure which part is invalid. I also don't understand why this affects only a few blobs (2-4) and not all of them.
```text
[...]
Request was canceled.
OpenVDS reported error:
Invalid header (e.g. unsupported Wavelet compression version) for chunk: Dimensions_012LOD0/2647
[...]
```
### Invalid HTTP request range (HTTP 416 error) and logging of SAS token
1. The [range header of the HTTP request](https://learn.microsoft.com/en-us/rest/api/storageservices/specifying-the-range-header-for-blob-service-operations) seems to be wrong.
2. The error message exposes the SAS token. I am not sure whether this is done on purpose; at the least, I see the risk that the token may appear unexpectedly in log files.
```text
[...]
Http error response: 416 -> https://STORAGEACCOUNT/volve/ST10010ZC11_PZ_PSDM_KIRCH_FULL_T.MIG_FIN.POST_STACK.3D.JS-017536/vds_32_RLE/Dimensions_012LOD0/6945?SASTOKEN: The range specified is invalid for the current size of the resource.
RequestId: REQUESTID
Time:2024-03-06T13:27:02.0923545Z
[...]
```
I redacted potentially sensitive information.
None of the blobs mentioned in the error message has zero size:
```text
Blob name, Access tier, Blob type, Size
7747, Hot (Inferred), Block blob, 68.04 KiB
5470, Hot (Inferred), Block blob, 128.04 KiB
4302, Hot (Inferred), Block blob, 128.04 KiB
```
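For context on what makes the 416 response valid here: per RFC 9110, a byte range is unsatisfiable when its first byte position is at or beyond the current length of the resource. A minimal check of that rule could look like:

```python
def range_is_satisfiable(first_byte: int, blob_size: int) -> bool:
    """A Range request is satisfiable only if it starts inside the resource."""
    return 0 <= first_byte < blob_size

# So a 416 on a non-empty blob suggests the request's start offset was
# computed at or past the blob's actual size.
```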
So far I have only been able to reproduce this error when requesting VolumeSamples. My script requesting VolumeSubsets hangs after inline 42, with several error messages about invalid headers.
## Python scripts for reproduction

Below is the Python code used to request data from the VDS dataset. The scripts need the Python packages `openvds`, `python-dotenv`, and `numpy`.
### Requesting subsamples
```python
#!/usr/bin/env python3
import os

import numpy as np
import openvds
from dotenv import load_dotenv

load_dotenv()
SAS_TOKEN = os.getenv("SAS_TOKEN")
BLOB_URL = os.getenv("BLOB_URL")

if __name__ == "__main__":
    print("Accessing samples from Azure blob")
    try:
        with openvds.open(url=BLOB_URL, connectionString=SAS_TOKEN) as vds:
            manager = openvds.getAccessManager(vds)
            layout = manager.volumeDataLayout
            # Voxel extents are hardcoded for the Volve cube
            # ST10010ZC11_PZ_PSDM_KIRCH_FULL_T.MIG_FIN.POST_STACK.3D.JS-017536
            for depth in range(850):
                samplePositions = []
                for il in range(401):
                    for xl in range(720):
                        samplePositions.append((depth + 0.5, xl + 0.5, il + 0.5))
                samplePositions = np.array(samplePositions)
                print(f"Request at depth voxel {depth + 0.5}")
                request = manager.requestVolumeSamples(
                    samplePositions=samplePositions,
                    dimensionsND=openvds.DimensionsND.Dimensions_012,
                    lod=0,
                    channel=0,
                    interpolationMethod=openvds.InterpolationMethod.Cubic,
                )
                success = request.waitForCompletion()
                if success:
                    print("Request was successful")
                else:
                    print(f"request:\n{request}")
                    if request.isCanceled:
                        print("Request was canceled.\n OpenVDS reported error:\n"
                              f" {request.errorMessage}")
                    else:
                        print("Request failed due to timeout")
    except RuntimeError as error:
        print(f"Could not open VDS: {error}")
```
### Requesting subsets
```python
#!/usr/bin/env python3
import os

import numpy as np
import openvds
from dotenv import load_dotenv

load_dotenv()
SAS_TOKEN = os.getenv("SAS_TOKEN")
BLOB_URL = os.getenv("BLOB_URL")

if __name__ == "__main__":
    print("Accessing samples from Azure blob")
    try:
        with openvds.open(url=BLOB_URL, connectionString=SAS_TOKEN) as vds:
            manager = openvds.getAccessManager(vds)
            layout = manager.volumeDataLayout
            sampleDimension, crosslineDimension, inlineDimension = (0, 1, 2)
            for inlineIndex in range(401):
                voxelMin = (0, 0, inlineIndex)
                voxelMax = (
                    layout.getDimensionNumSamples(sampleDimension),
                    layout.getDimensionNumSamples(crosslineDimension),
                    inlineIndex + 1,
                )
                buffer = np.empty(
                    (
                        layout.getDimensionNumSamples(crosslineDimension),
                        layout.getDimensionNumSamples(sampleDimension),
                    )
                )
                print(f"Request at inline {inlineIndex}")
                request = manager.requestVolumeSubset(
                    data_out=buffer,
                    dimensionsND=openvds.DimensionsND.Dimensions_012,
                    min=voxelMin,
                    max=voxelMax,
                    lod=0,
                    channel=0,
                )
                success = request.waitForCompletion()
                if success:
                    print("Request was successful")
                else:
                    print(f"request:\n{request}")
                    if request.isCanceled:
                        print("Request was canceled.\n OpenVDS reported error:\n"
                              f"{request.errorMessage}")
                    else:
                        print("Request failed due to timeout")
    except RuntimeError as error:
        print(f"Could not open VDS: {error}")
```

---
# Integer scale/offset not applied
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/235 — Julien Lacoste, 2024-03-07T09:12:54Z

When reading a U16 dataset as R32, the integer scale and offset are not applied to the result buffer. Tested on version 3.3.3 through the Java bindings; the call to VolumeDataAccessManager::requestVolumeSubsetFloat doesn't seem to apply the scale and offset correction.

---
# DATASET SELECT LS POST: invalid characters in select give response code 200; it should give 400
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/129 — Isha Kumari, 2024-02-29T12:14:56Z

DATASET SELECT LS POST: when putting invalid characters in select, it gives response code 200. It should give 400.

---
# Missing error message caused by trivial bug
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/234 — Paal Kvamme, 2024-02-28T12:47:12Z

The error message from DmsDataset::registerDataset() is missing the important part.
In src/OpenVDS/IO/DmsIoFactories/DmsIoManagerFactory.cpp:

- Line 200: `responseData = std::move(request->m_uploadHandler->responseData);`
- Line 204: `respons_str.insert(...); // access request->m_uploadHandler->responseData`

(Line 200 moves the string out of the upload handler, so the access at line 204 presumably reads a moved-from, empty string.)
The bug is also copy-pasted into DmsDataset::lockDataset().

---
# CRS - Problem when the data is displayed in different UTM zones at the same project
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/233 — Juliana Fernandes (juliana.fernandes@iesbrazil.com.br), 2024-03-07T09:15:19Z

Hello,

The IesBrazil team is testing OpenVDS+ with CRS, and one of the steps was to QC the data using Headwave from Bluware. The team noticed a problem, and we documented the tests, which I present below:
**Goal of the Tests:** Check whether OpenVDS+ correctly adds the CRS to the VDS file,<br>
**Methodology:** Convert SEGY to VDS using OpenVDS+/Headwave and QC the data using Headwave,<br>
**Data used:** Volve and Brazilian data (Volve doesn't present any problem). From Brazil we used 4 files from the Solimões Basin and 1 file from the Amazonas Basin, provided by ANP. The data can be found [HERE](https://reate.cprm.gov.br/anp/TERRESTRE); below you can directly download all the files used in this test (from Brazil, which is where we identified the problem):
* [0233_LESTE_URUCU.3D.MIG_FIN.1.sgy](https://reate.cprm.gov.br/arquivos/index.php/s/DKD0oj9FsZAU8tI/download?path=%2FSISMICA_3D%2F0233_LESTE_URUCU%2FTEMPO%2FSISMICA&files=0233_LESTE_URUCU.3D.MIG_FIN.1.sgy) - Solimões Basin, SAD69/UTM 20S, EPSG:29190
* [0237_AEROPORTO.3D.MIG_FIN.1.sgy](https://reate.cprm.gov.br/arquivos/index.php/s/DKD0oj9FsZAU8tI/download?path=%2FSISMICA_3D%2F0237_AEROPORTO%2FTEMPO%2FSISMICA&files=0237_AEROPORTO.3D.MIG_FIN.1.sgy) - Solimões Basin, SAD69/UTM 20S, EPSG:29190
* [0237_IGARAPE_MARTA.3D.MIG_FIN.2.sgy](https://reate.cprm.gov.br/arquivos/index.php/s/DKD0oj9FsZAU8tI/download?path=%2FSISMICA_3D%2F0237_IGARAPE_MARTA%2FTEMPO%2FSISMICA&files=0237_IGARAPE_MARTA.3D.MIG_FIN.2.sgy) - Solimões Basin, SAD69/UTM 20S, EPSG:29190
* [R0300_3D_CHIBATA_PSTM.3D.PSTM.1.sgy](https://reate.cprm.gov.br/arquivos/index.php/s/DKD0oj9FsZAU8tI/download?path=%2FSISMICA_3D%2FR0300_3D_CHIBATA%2FTEMPO%2FSISMICA&files=R0300_3D_CHIBATA_PSTM.3D.PSTM.1.sgy) - Solimões Basin, SIRGAS 2000/UTM 20S, EPSG:31980
* [R0300_2D_AM_URUCARA.3D.PSTM.1.sgy](https://reate.cprm.gov.br/arquivos/index.php/s/IPNA8z7hO1vHsxI/download?path=%2FSISMICA_3D%2FR0300_3D_AM_URUCARA%2FTEMPO%2FSISMICA&files=R0300_3D_AM_URUCARA.3D.PSTM.1.sgy) - Amazonas Basin, SAD69/UTM 21S, EPSG:29191<br>
**Shapefile:** Georeferenced polygons of exploratory blocks in geographic coordinates and datum SAD69, available [HERE](https://geomaps.anp.gov.br/geoanp/),<br>
**Problem:** When the project has a different zone from the data (e.g: the project is located at SAD69/ UTM 20S and the data is located at SAD69/ UTM 21S), the file is wrongly spatially positioned (We used a VDS converted by Headwave and the original SEGY to compare),<br>
**OpenVDS+ Version:** 3.3.0,<br>
**Comparative Scenario:**
* SEGY with Original CRS
* SEGY with WGS84 CRS
* VDS from HW with Original CRS
* VDS from HW with WGS84 CRS
* VDS from OpenVDS+ with WGS84 CRS (only option available)
### First Scenario - SEGY with Original CRS
The project is under the coordinate reference system CRS SAD69/ UTM 20S (EPSG:29190) that includes most of the data of the test.<br>
In this test the team uploaded all the segy files, listed in the "Data used" topic, under the Original CRS (also informed with the data list) and displayed. The polygon in red at the superior right corner in the image is the shapefile for the block R0300_3D_AM_URUCARA.<br>
**Result: All the data are at the expected spatial position.**
![SEGY_with_Original_CRS](/uploads/285ab921116f92658bbf43007924bdbb/SEGY_with_Original_CRS.png)
### Second Scenario - SEGY with WGS84 CRS
The project is under the coordinate reference system CRS SAD69/ UTM 20S (EPSG:29190) that includes most of the data of the test.<br>
In this test the team uploaded all the segy files, listed in the "Data used" topic, under the CRS WGS84 and displayed. The polygon in red at the superior right corner in the image is the shapefile for the block R0300_3D_AM_URUCARA.<br>
**Result: All the data are at the expected spatial position.**
![SEGY_with_Original_CRS](/uploads/285ab921116f92658bbf43007924bdbb/SEGY_with_Original_CRS.png)
### Third Scenario - VDS from HW with Original CRS
The project is under the coordinate reference system CRS SAD69/ UTM 20S (EPSG:29190) that includes most of the data of the test.<br>
In this test the team converted to vds, using Headwave, all the segy files, listed in the "Data used" topic, under the Original CRS (also informed with the data list) and displayed. The polygon in red at the superior right corner in the image is the shapefile for the block R0300_3D_AM_URUCARA.<br>
**Result: All the data are at the expected spatial position.**
![SEGY_with_Original_CRS](/uploads/285ab921116f92658bbf43007924bdbb/SEGY_with_Original_CRS.png)
### Fourth Scenario - VDS from HW with WGS84 CRS
The project is under the coordinate reference system CRS SAD69/ UTM 20S (EPSG:29190) that includes most of the data of the test.<br>
In this test the team converted to vds, using Headwave, all the segy files, listed in the "Data used" topic, under the CRS WGS84 and displayed. The polygon in red at the superior right corner in the image is the shapefile for the block R0300_3D_AM_URUCARA.<br>
**Result: All the data are at the expected spatial position.**
![SEGY_with_Original_CRS](/uploads/285ab921116f92658bbf43007924bdbb/SEGY_with_Original_CRS.png)
### Fifth Scenario - VDS from OpenVDS+ with WGS84 CRS (only option available)
The project is under the coordinate reference system CRS SAD69/ UTM 20S (EPSG:29190) that includes most of the data of the test.<br>
In this test the team converted to vds, using OpenVDS+, all the segy files, listed in the "Data used" topic, under the CRS WGS84 and displayed. The polygon in red at the superior right corner in the image is the shapefile for the block R0300_3D_AM_URUCARA.<br>
**Result: The file R0300_3D_AM_URUCARA that is located under a different zone from the project (21S) is wrongly spatially positioned (Should be at the same position that the red polygon is).**
![VDS_with_WGS84_Open](/uploads/fc6638d20fffea098cd2fde1ee44dcac/VDS_with_WGS84_Open.png)
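The magnitude of the misplacement is consistent with the zone mismatch described above: each UTM zone spans 6 degrees of longitude, so zone-21S coordinates interpreted in a zone-20S project land about one zone width away. A quick check using the standard UTM central-meridian formula:

```python
def utm_central_meridian(zone: int) -> int:
    """Central meridian (degrees) of a UTM zone: 6 * zone - 183."""
    return 6 * zone - 183

# Zone 20 (SAD69 / UTM 20S): -63 degrees; zone 21 (SAD69 / UTM 21S): -57 degrees.
# Identical easting/northing values in the two zones are about 6 degrees apart.
```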
We are available for any additional information needed.
Regards,
Juliana

---
# Subproject creation accepts non-existing groups in ACLs
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/128 — Yan Sushchynski (EPAM), 2024-02-26T17:21:16Z

## Description of the problem

It is possible to create a new subproject with non-existing groups in the `acls` field. After that, any action in the subproject, except deleting it, throws a `403`.
## Steps to reproduce it
1. Create a new subproject with invalid acls:
```
curl --location --request POST 'https://<svc_url>/v3/subproject/tenant/osdu/subproject/test-123' \
--header 'x-api-key: {{SVC_API_KEY}}' \
--header 'Content-Type: application/json' \
--header 'ltag: osdu-demo-legaltag' \
--header 'appkey: {{DE_APP_KEY}}' \
--header 'Authorization: Bearer <token>' \
--data-raw '{
"storage_class": "REGIONAL",
"storage_location": "US-CENTRAL1",
"acls": {
"admins": [
"data.sdms.non-existing.admin@osdu.group"
],
"viewers": [
"data.sdms.non-existing.viewer@osdu.group"
]
}
}'
```
This request is executed without any error.
2. Try to upload any file to the subproject:
```shell
python sdutil cp somefile sd://osdu/test-123/somefile
```
Output:
```
[403] [seismic-store-service] User not authorized to perform this operation
```

(assignees: Diego Molteni, Sacha Brants)

---
# Build fails if curl is preinstalled (Alpine Linux)
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/232 — Alexander Jaust, 2024-02-29T07:29:59Z

The linking step of OpenVDS and its executables fails if cURL is preinstalled on the system and OpenVDS builds its own cURL as well. I have attached a [Dockerfile](/uploads/3caea214874b12856dc12bfc7755bb7e/broken-openvds-build.dockerfile) to reproduce the error. It seems that CMake picks up a different cURL or some other dependency which pulls in `libidn2`.
Do you have any idea what causes this?
```text
[...]
[ 96%] Linking CXX executable VDSCopy
cd /open-vds/build/tools/VDSCopy && /usr/bin/cmake -E cmake_link_script CMakeFiles/VDSCopy.dir/link.txt --verbose=1
/usr/bin/c++ -O3 -DNDEBUG -s -Wl,--exclude-libs=ALL -Wl,--no-export-dynamic CMakeFiles/VDSCopy.dir/VDSCopy.cpp.o ../Shared/CMakeFiles/tools_shared.dir/HelpConnection.cpp.o -o VDSCopy -Wl,-rpath,/open-vds/build/src/OpenVDS: ../../src/OpenVDS/libopenvds.so.3.4.255 ../../fmt_9.1.0/lib
fmt.a ../../jsoncpp_1.8.4/src/lib_json/libjsoncpp_static.a ../../docs/libhelp_connection.a
/usr/lib/gcc/x86_64-alpine-linux-musl/12.2.1/../../../../x86_64-alpine-linux-musl/bin/ld: ../../src/OpenVDS/libopenvds.so.3.4.255: undefined reference to `idn2_lookup_ul'
/usr/lib/gcc/x86_64-alpine-linux-musl/12.2.1/../../../../x86_64-alpine-linux-musl/bin/ld: ../../src/OpenVDS/libopenvds.so.3.4.255: undefined reference to `idn2_check_version'
/usr/lib/gcc/x86_64-alpine-linux-musl/12.2.1/../../../../x86_64-alpine-linux-musl/bin/ld: ../../src/OpenVDS/libopenvds.so.3.4.255: undefined reference to `idn2_strerror'
/usr/lib/gcc/x86_64-alpine-linux-musl/12.2.1/../../../../x86_64-alpine-linux-musl/bin/ld: ../../src/OpenVDS/libopenvds.so.3.4.255: undefined reference to `idn2_free'
collect2: error: ld returned 1 exit status
```
## Operating system
Alpine Linux (latest); the problem seems to appear starting from Alpine Linux 3.18. It builds fine on Alpine Linux 3.17.
I tested in Docker on an M1 Mac with and without x86_64 emulation. The error shows up in both cases.
## What did I try
### Without success
- Enabled/disabled different I/O managers.
- Had a look into the CMake files, but I must admit that I do not know how to properly debug this.
- Installing `libidn2-dev`, but it seems to be present already if one installs `curl-dev`.
- Switching between Make and Ninja as build system
### Successfully
- Setting `-DBUILD_CURL=OFF` seems to fix the problem. However, I am not sure if this has implications for features or performance.

---
# VDSInfo showing textual header for a VDS file that has been converted from SegY
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/231 — Debasis Chatterjee, 2024-02-27T07:06:58Z

`VDSInfo --metadata-name TextHeader -e -w 80 <data set>`

Is there an option to show the actual text content? Currently, when I try the following command, it does not decode to text.

```
C:\openvds-3.0.4\bin\msvc_140>VDSInfo --metadata-name TextHeader -e -w 80 debasis9Feb.vds
{
"category" : "SEGY",
"name" : "TextHeader",
"type" : "BLOB",
"value" : "w0DxQM....AQEA="
}
C:\openvds-3.0.4\bin\msvc_140>
```
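As a possible workaround until VDSInfo decodes it: the `value` field is base64, and the leading bytes of the example (`w0Dx…` -> 0xC3 0x40 0xF1) appear to decode to `C 1` in EBCDIC, the usual encoding of a SEG-Y textual header. A hedged Python sketch; the code page and 80-column layout are assumptions based on the SEG-Y standard:

```python
import base64

def decode_text_header(b64_value: str) -> str:
    """Decode a base64 BLOB assumed to hold an EBCDIC SEG-Y textual header."""
    raw = base64.b64decode(b64_value)
    text = raw.decode("cp037")  # EBCDIC code page 037 (assumption)
    # SEG-Y textual headers are 40 "card images" of 80 characters each
    return "\n".join(text[i:i + 80] for i in range(0, len(text), 80))
```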
cc @Ofstad

---
# VDSInfo to point to VDS folder in SD-STORE
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/230 — Debasis Chatterjee, 2024-02-27T07:20:04Z

@Ofstad - Please check the following.
The VDS folder has been created successfully from SegYImport step (converter).
We can check stat from sdutil.
We can also see the folder content from Backend (Amazon S3).
This is performed using AWS/M22/Preship.
Always the same error.
[Option 'c' does not exist]
```text
VDSInfo –connection “sdauthorityurl=https://prsh.testing.preshiptesting.osdu.aws/api/seismic-store/v3;sdapikey=;sdtoken=access token" sd://osdu/subproject/debasi.0a2ed74b-c8c9-4289-a4b0-ce8e135e2cbd.vds
VDSInfo –connection “sdauthorityurl=https://prsh.testing.preshiptesting.osdu.aws/api/seismic-store/v3;sdtoken=access token" sd://osdu/subproject/debasi.0a2ed74b-c8c9-4289-a4b0-ce8e135e2cbd.vds
VDSInfo –connection “sdauthorityurl=https://prsh.testing.preshiptesting.osdu.aws/api/seismic-store/v3;sdapikey=ABC;sdtoken=access token" sd://osdu/subproject/debasi.0a2ed74b-c8c9-4289-a4b0-ce8e135e2cbd.vds
```

---
# AWS Directory Buckets
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/229 — Klaas Koster, 2024-02-17T18:40:52Z

AWS has recently introduced something called Directory Buckets in addition to their General Purpose Buckets:
https://opsinsights.dev/exploring-the-newest-s3-bucket
Supposedly these come with lower cost and lower latency. Is there a recommendation to use these new Directory Buckets for VDS?
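Directory bucket names embed an availability-zone id (e.g. `vds--use1-az4--x-s3`), and the zone-specific endpoint repeats that id. A sketch of how the endpoint could be derived from the bucket name (illustrative only, based on the S3 Express One Zone naming scheme; this is not OpenVDS code):

```python
# Sketch (not OpenVDS code): S3 Express One Zone "directory bucket" names end
# in "--<zone-id>--x-s3", and the zonal endpoint embeds the same zone id.
def directory_bucket_endpoint(bucket: str, region: str) -> str:
    parts = bucket.split("--")
    if len(parts) < 3 or parts[-1] != "x-s3":
        raise ValueError(f"{bucket!r} does not look like a directory-bucket name")
    zone_id = parts[-2]  # e.g. "use1-az4"
    return f"{bucket}.s3express-{zone_id}.{region}.amazonaws.com"

print(directory_bucket_endpoint("vds--use1-az4--x-s3", "us-east-1"))
# -> vds--use1-az4--x-s3.s3express-use1-az4.us-east-1.amazonaws.com
```

This differs from the general-purpose bucket pattern `<bucket>.s3.<region>.amazonaws.com`, which is presumably why a client that only knows the latter would fail against a directory bucket.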
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/228
Documentation request: Unclear on constraints/expectations for channel names/units (2024-02-16, Kevin McCarty)

Hi, this is a companion issue to #227, but here I'm asking about channels rather than axes.
* Is there a limit on the number of different "full-sized" channels (e.g. 3D for a dataset with 3 dimensions) that may be written to a VDS dataset?
* All the datasets that I see have "Amplitude" as the primary channel (channel 0) name. Is it legal to have a dataset where "Amplitude" is not the primary channel, or where it is not present at all?
* The datasets imported from SEGY all seem to have auxiliary "Trace" and "PDSTraceHeader" channels with per-trace values. Is it legal to create datasets that do not have those channels?
* What are the constraints on the _names_ of channels?
* Are they restricted to solely the names #define'd in `GlobalMetadataCommon.h` in section "Attributes' names"?
* Or, are they allowed to be any alphanumeric ASCII string?
* Or, are additional printable ASCII characters allowed as well (which ones?)?
* Or, is any printable string allowed? (Is the encoding required to be UTF-8 or something else?)
* What are the constraints on the _units_ of channels?
* Are they restricted to solely the names #define'd in `KnownMetadata.h` with `KNOWNMETADATA_UNIT_` prefixes / accessible from the `KnownUnitNames` class?
* Or, are they allowed to be any alphanumeric ASCII string? (And if so, is that round-trippable?)
* Etc.
As a motivation for this question: I might need to create a channel named "Density" whose units are "g/cm^3", for instance. Or a channel named "Temperature" whose units are "degrees C", or "Salinity" with units "ppm". These are just examples off the top of my head; our own internal file formats allow for near-arbitrary channel names and units.
Thank you in advance for whatever information you can provide!

https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/227
Documentation request: Unclear on constraints/expectations on axes for 3D datasets (2024-02-20, Kevin McCarty)

Hello,
I'm writing some utilities for my employer, Dynamic Graphics Inc., that are intended to convert between OpenVDS datasets and our own 3D grid file formats (one of them being a quite old internally-invented proprietary binary format, and the more recent one being HDF5-based), not unlike `SEGYImport` and `SEGYExport`.
As I started working on this, I found that numerous questions arose regarding the axes/dimensions of OpenVDS datasets that aren't really covered, or are at best glossed over, by the online docs. Many thanks in advance for any light you can shed on them!
1) I'm presuming that for 3D datasets, axis 0 (that is, the axis named by `layout->GetAxisDescriptor(0)`) is always defined as being along the most rapidly-varying index in linear memory, while axis 2 is always along the most slowly-varying index in linear memory, is this correct?
2) For data in the inline/crossline/sample axis description system:
a) Will it always be the case that axis 0 is "sample", axis 1 is "crossline" and axis 2 is "inline" ? That seems to be what this line of `examples/GettingStarted/main.cpp` (as of OpenVDS 3.3.1) implies:
```cpp
const int sampleDimension = 0, crosslineDimension = 1, inlineDimension = 2;
```
but is it a robust assumption? Currently I am checking that the names of the three axes are in this expected order and erroring out otherwise (which is an expectation met by all 3D datasets to which I have access with "inline" / "crossline" / "sample" axes).
b) If I have a dataset in our own 3D format for which the inline or crossline number _decreases_ in the positive direction (for the ordering in linear memory) of one or both of the two horizontal axes, this does not seem to be supported by OpenVDS. Do I have that correct? In other words, if I have such a 3D grid in our own format, it looks like I'll need to take care of moving the XY origin to the appropriate one of the other three grid corners in XY and swapping the nodes around correspondingly in memory, in order to be able to create an OpenVDS dataset where the inline/crossline numbers increase in the same direction as the node ordering in memory? Or am I missing something that would make this unnecessary (e.g. the axis descriptor min/max being possible to set in reversed order?)
3) Regarding the I/J/K axis description system:
a) Is there any specific mapping that is required/expected between the I/J/K labels and the 0/1/2 axis numbering? Or is it possible that I might find (say) one dataset where axis 2 is the K axis, and another dataset where axis 0 is the K axis (and so on) ?
b) In the transformation from axes to world coordinates using the metadata, are the world Z-coordinates always in the downward sense (i.e. more positive values are farther underground) as is stated to be the case with the inline/crossline/sample system? If not, how can I tell whether the Z-coordinate axis points upward or downward?
c) In the I/J/K system, do the axis descriptors' `GetCoordinateMin()/Max()` accessors return anything meaningful? (Since it would seem that the I/J/K axes are defined completely by the origin and step metadata, and by the axes' `GetNumSamples()` getters?) If so, in what way would they be used?
d) Should I expect that all VDS datasets with the I/J/K axis description system have the same handedness for the I/J/K axes (which?), or is this not guaranteed? For the inline/crossline/sample system I have found datasets with both chiralities.
4) Basically the same questions as (3a,b) for the X/Y/Z axis description system.
I'm not aware of any VDS datasets that exhibit the I/J/K or X/Y/Z coordinate systems; if there are public datasets available that do so, a pointer would be enormously appreciated!
5) It appears that the only units for Z coordinates that are supported are milliseconds, meters, and feet (and also apparently US survey feet?). I'm unclear what happens on attempting to create an OpenVDS dataset having a vertical axis with a different unit -- e.g. "fathoms" or "km" or "seconds". Is it allowed but unsupported (and if so is it round-trippable)? Or will it provoke an exception or crash?
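For reference, the index-to-world mapping that questions 3a-c presuppose (an origin plus per-axis step vectors, with axis lengths from `GetNumSamples()`) can be sketched as follows; the origin and step values are made up purely for illustration:

```python
# Illustrative origin and per-axis step vectors (made-up numbers, standing in
# for the VDS metadata the question refers to).
origin = (1000.0, 2000.0, 0.0)
step_i = (25.0, 0.0, 0.0)  # world offset per +1 index step along I
step_j = (0.0, 25.0, 0.0)  # per +1 step along J
step_k = (0.0, 0.0, 4.0)   # per +1 step along K

def ijk_to_world(i: int, j: int, k: int) -> tuple:
    # Affine map: world = origin + i*step_i + j*step_j + k*step_k
    return tuple(origin[d] + i * step_i[d] + j * step_j[d] + k * step_k[d]
                 for d in range(3))

print(ijk_to_world(2, 1, 3))  # -> (1050.0, 2025.0, 12.0)
```

Under this formulation, an axis whose annotation value decreases with increasing memory index would simply correspond to a negative step component, which is essentially what question 2(b) asks whether OpenVDS permits.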
Again, thanks so much for your help!

https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/127
Issue with Get Status API (2024-02-09, Jiman Kim)

Hello, we are running some authentication testing and are running into some behaviors that may or may not be a bug.
For the endpoint `/seistore-svc/api/v4/status` we have 3 tests running:
1. Sends an invalid token
2. Sends a valid token but signed with a wrong secret
3. Sends the HTTP request without an authorization header.
Tests 1 and 2 return a 401, but test 3 returns a 200.
Is this a bug or intended behavior?
Thank you!
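The 200-without-header result is consistent with middleware that only validates the Authorization header when one is present. A minimal, hypothetical sketch (not the seismic-store code) that reproduces the observed behavior:

```python
# Hypothetical middleware sketch -- NOT the seismic-store implementation --
# showing how "bad token -> 401, missing header -> 200" can arise when token
# validation is only triggered by the presence of the header.
def handle_status(headers: dict) -> int:
    token = headers.get("Authorization")
    if token is None:
        return 200  # validation silently skipped: the reported case 3
    if not is_valid_token(token):
        return 401  # reported cases 1 and 2
    return 200

def is_valid_token(token: str) -> bool:
    # Stand-in for real JWT signature verification.
    return token == "Bearer valid-token"

print(handle_status({}))                             # -> 200
print(handle_status({"Authorization": "Bearer x"}))  # -> 401
```

Whether that is a bug depends on whether `/status` is meant to be an unauthenticated health endpoint; if it is, returning 200 for an anonymous request may well be intended.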