# seismic-store-cpp-lib issues
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues

## Issue #21: Unable to use OpenVDS and OpenZGY libraries to access data in Seismic DDMS on AWS M15
Opened by Michael, updated 2024-02-26
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/21

More details can be found here: https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/439

Both OpenVDS and OpenZGY data have been ingested into Seismic DDMS on AWS M15. I was able to download the OpenZGY file using SDUtil.
However, when I try to access the OpenZGY data using the OpenZGY library, I get the following exception:
`Initialize: Seismic Store: sdapi 3.16.0 - Encountered network error when sending http request`
When I try to access the OpenVDS data using the latest OpenVDS library, I get the following exception:
`-- sdapi 3.17.0 - Fri Feb 17 16:43:30 2023 -- Head Object error: - Encountered network error when sending http request`
I can download the OpenZGY data using the latest version of SDUtil. This leads me to think that the SDAPI library is not able to properly access seismic data in Seismic DDMS for AWS M15.
Is there another version of the sdapi library that I should be using with the OpenVDS and OpenZGY libraries?

## Issue #20: Azure: add support for partitioned dnsZone blob endpoints
Opened by Fabien Bosquet, updated 2023-03-30
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/20

Recent development in some Azure OSDU implementations shows that the blob endpoint is moving from [standard endpoints](https://learn.microsoft.com/en-us/azure/storage/common/storage-account-overview#standard-endpoints) to [Azure DNS zone endpoints](https://learn.microsoft.com/en-us/azure/storage/common/storage-account-overview#azure-dns-zone-endpoints-preview).

The current code in [azure/AzureCommon.cc](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/blob/master/src/src/lib/cloud/providers/azure/AzureCommon.cc#L32) is not ready to use the new Azure DNS zone endpoints.
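For context, the two endpoint layouts differ only in the host suffix. A minimal sketch of the difference (the account name and zone here are made-up examples; the authoritative formats are in the Microsoft docs linked above, and this helper is not part of AzureCommon.cc):

```cpp
#include <string>

// Builds a blob endpoint URL. With an empty dnsZone this yields the classic
// standard endpoint; with a zone such as "z33" it yields the newer
// DNS-zone layout. Illustrative only.
std::string blobEndpoint(const std::string& account,
                         const std::string& dnsZone = std::string()) {
    if (dnsZone.empty())
        return "https://" + account + ".blob.core.windows.net";
    return "https://" + account + "." + dnsZone + ".blob.storage.azure.net";
}
```

Any endpoint parsing or construction in the library that assumes the fixed `.blob.core.windows.net` suffix would need to accept the zone-qualified form as well.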
## Issue #17: Build fails when using Ninja generator
Opened by Pavel Kisliak, updated 2023-02-24
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/17

Currently the build fails when using the Ninja generator. This is related to Ninja's slightly different mechanism for scanning dependencies ([more details](https://github.com/ninja-build/ninja/issues/760)).
How to reproduce:
```
cmake -GNinja -B build
cmake --build build --config Release
```
Build error:
> ninja: error: 'crc32c/lib/libcrc32c.a', needed by 'libsdapi.so.0.0.0', missing and no known rule to make it
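A common workaround for this class of Ninja error is to tell CMake which files an external build step produces, via `BUILD_BYPRODUCTS`, so Ninja has a rule that "makes" the archive. The fragment below is an illustrative sketch only; the target name and paths are assumptions, not copied from this repository's CMakeLists:

```
# Illustrative CMake fragment: declare the static library produced by the
# external crc32c build so Ninja knows a rule exists for it.
include(ExternalProject)
ExternalProject_Add(crc32c_external
  SOURCE_DIR "${CMAKE_SOURCE_DIR}/crc32c"
  # ... existing configure/build options ...
  BUILD_BYPRODUCTS "${CMAKE_BINARY_DIR}/crc32c/lib/libcrc32c.a"
)
```

Makefile generators tolerate the missing declaration because they rescan dependencies at build time; Ninja requires outputs to be declared up front.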
## Issue #14: Pipeline not starting for master and protected branches
Opened by Walter D, updated 2022-06-15
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/14

Starting a pipeline for master or a newly created protected branch results in the following error message:

> Pipeline cannot be run.
> 'container_scanning' job needs 'azure_containerize' job, but 'azure_containerize' is not in any previous stage
![image](/uploads/4654b10ae7a0011077e09c238ec295a6/image.png)

## Issue #13: Add a possibility to work with Anthos/MinIO
Opened by Yan Sushchynski (EPAM), updated 2023-03-30
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/13

Hello!

We implemented Seismic DMS for an `Anthos` environment with MinIO as the storage backend. The MinIO implementation works with `s3` and mostly follows the AWS implementation.
For `sdutil`, all we needed was to override a single method of the `AwsStorageService` service; we just added the MinIO endpoint:
(https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-sdutil/-/merge_requests/50/diffs#316b48789c54e1de5292718db6abac8856b6cec3_0_48)
The problem is that the `Open VDS` and `Open ZGY` converters use `seismic-cpp-lib`, and there is no way to access MinIO storage to manipulate files.
I found that the library gets the information about the cloud provider, used to choose a storage class, from the response header of the Seismic Store service
(https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/blob/master/src/src/lib/cloud/SeismicStore.cc#L704).
This value is taken from the `CLOUDPROVIDER` env var of the Seismic deployment. As our `CLOUDPROVIDER` value is `anthos`, `seismic-cpp-lib` chooses `GcsAccessorStorage` (
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/blob/master/src/src/lib/cloud/Storage.cc#L172)
What we need:
1. Choose `AwsStorage` if the cloud provider is `anthos`;
2. The possibility to override AWS's endpoint URL with an environment variable's value.
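Requested behaviour 2 could be sketched roughly as follows. The variable name `AWS_ENDPOINT_URL` and the helper are assumptions for illustration, not existing sdapi symbols:

```cpp
#include <cstdlib>
#include <string>

// If an (assumed) endpoint-override env var is set and non-empty, use it;
// otherwise fall back to the regional AWS endpoint the library would have
// chosen anyway. This lets the same code path talk to MinIO.
std::string resolveS3Endpoint(const std::string& regionalDefault) {
    const char* override_ = std::getenv("AWS_ENDPOINT_URL");
    return (override_ && *override_) ? std::string(override_) : regionalDefault;
}
```

This mirrors what the `sdutil` merge request linked above did on the Python side: the AWS code path stays intact and only the endpoint is swapped.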
SDMS MR:
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/merge_requests/322
Thanks.
fyi: @Siarhei_Khaletski @jorgen

## Issue #12: Compilation conflict with Windows.h
Opened by Alberto Secchi, updated 2022-08-29
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/12

I have a Windows application consuming SDAPI. In my code I have to include both Windows.h and SDException.h.
When compiling the application, I have the following error:
`illegal token on right side of '::'` (SDException.h, line 294)
This is due to the fact that Windows.h defines function-style `min` and `max` macros, which conflict with the `std::numeric_limits<std::uint32_t>::max()` call (SDException.h, line 294).
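There are two standard ways to sidestep this clash; both are shown in the sketch below (it compiles unchanged on non-Windows platforms, where the macros simply don't exist):

```cpp
// Fix 1: define NOMINMAX before including <Windows.h> so the min/max
// function-style macros are never defined.
#define NOMINMAX
// #include <Windows.h>   // (only relevant when building on Windows)

#include <cstdint>
#include <limits>

std::uint32_t maxU32() {
    // Fix 2: the extra parentheses around the name prevent function-style
    // macro expansion, so this is safe even without NOMINMAX. This is the
    // usual fix for library headers, which cannot control includer flags.
    return (std::numeric_limits<std::uint32_t>::max)();
}
```

For a header shipped to consumers, fix 2 inside the header is the more robust choice, since the library cannot guarantee that every consumer defines NOMINMAX.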
See also [this article](https://stackoverflow.com/questions/11544073/how-do-i-deal-with-the-max-macro-in-windows-h-colliding-with-max-in-std) for further information.

## Issue #9: Issue building seismic-store-cpp-lib
Opened by Houari Zegai, updated 2022-01-04
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/9

We cloned the repo (with submodules, recursively) and followed the build instructions in the README (under the `Windows build instructions` section) to compile and install the library locally on Windows 10 64-bit. We installed Visual Studio 2017 along with the MFC and ATL optional sub-components, and vcpkg.
When we ran the Windows build using the command `.\devops\scripts\build-win64.ps1`, we got the following error:
```
-- Configuring done
-- Generating done
CMake Warning:
Manually-specified variables were not used by the project:
CMAKE_TOOLCHAIN_FILE
-- Build files have been written to: D:/seismic-store-cpp-lib/_build_Release_plain
Microsoft (R) Build Engine version 15.9.21+g9802d43bc3 for .NET Framework
Copyright (C) Microsoft Corporation. All rights reserved.
Build started 1/4/2022 5:08:33 PM.
Project "D:\seismic-store-cpp-lib\_build_Release_plain\ALL_BUILD.vcxproj" on node 1 (default targets).
Project "D:\seismic-store-cpp-lib\_build_Release_plain\ALL_BUILD.vcxproj" (1) is building "D:\seismic-store-cpp-lib\_build_Release_plain\ZERO_CHECK.vcxproj" (2) on node 1 (default targets).
InitializeBuildStatus:
Creating "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
CustomBuild:
All outputs are up-to-date.
FinalizeBuildStatus:
Deleting file "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\unsuccessfulbuild".
Touching "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\ZERO_CHECK.lastbuildstate".
Done Building Project "D:\seismic-store-cpp-lib\_build_Release_plain\ZERO_CHECK.vcxproj" (default targets).
Project "D:\seismic-store-cpp-lib\_build_Release_plain\ALL_BUILD.vcxproj" (1) is building "D:\seismic-store-cpp-lib\_build_Release_plain\crc32c.vcxproj" (3) on node 1 (default targets).
InitializeBuildStatus:
Touching "x64\Release\crc32c\crc32c.tlog\unsuccessfulbuild".
CustomBuild:
  Creating directories for 'crc32c'
  Building Custom Rule D:/seismic-store-cpp-lib/src/CMakeLists.txt
  Performing download step (DIR copy) for 'crc32c'
  No update step for 'crc32c'
  Performing patch step for 'crc32c'
C:\Program Files (x86)\Microsoft Visual Studio\2017\Community\Common7\IDE\VC\VCTargets\Microsoft.CppCommon.targets(209,5): error MSB6006: "cmd.exe" exited with code 1. [D:\seismic-store-cpp-lib\_build_Release_plain\crc32c.vcxproj]
Done Building Project "D:\seismic-store-cpp-lib\_build_Release_plain\crc32c.vcxproj" (default targets) -- FAILED.
Done Building Project "D:\seismic-store-cpp-lib\_build_Release_plain\ALL_BUILD.vcxproj" (default targets) -- FAILED.
Build FAILED.
"D:\seismic-store-cpp-lib\_build_Release_plain\ALL_BUILD.vcxproj" (default target) (1) ->
"D:\seismic-store-cpp-lib\_build_Release_plain\crc32c.vcxproj" (default target) (3) ->
(CustomBuild target) ->
  C:\Program Files (x86)\Microsoft Visual Studio\2017\Community\Common7\IDE\VC\VCTargets\Microsoft.CppCommon.targets(209,5): error MSB6006: "cmd.exe" exited with code 1. [D:\seismic-store-cpp-lib\_build_Release_plain\crc32c.vcxproj]
0 Warning(s)
1 Error(s)
Time Elapsed 00:00:05.33
D:\seismic-store-cpp-lib\devops\scripts\build-win64.ps1 : Failed on cmake build plain. Please scroll up to see verbose result of cmake command.
At line:1 char:1
+ .\devops\scripts\build-win64.ps1
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [Write-Error], WriteErrorException
+ FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,build-win64.ps1
```
Is there any issue with the build process or environment?

## Issue #8: [GCP] HTTP/2 stream 0 was not closed cleanly: PROTOCOL_ERROR (1)
Opened by Yan Sushchynski (EPAM), updated 2021-12-10
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/8

Hi!
When we attempt to convert SEG-Y to OpenVDS using the latest image `community.opengroup.org:5555/osdu/platform/domain-data-mgmt-services/seismic/open-vds/openvds-ingestion:2.1.8` in `SSDMS`, we get the following error:
```
(ERROR) HTTPRequest::Send, performing request, detailed error: HTTP/2 stream 0 was not closed cleanly: PROTOCOL_ERROR (err 1) The error occurred while executing an HTTP request. error code: 92, error message: 'Stream error in the HTTP/2 framing layer']
Retrying http_request for POST request with https://storage.googleapis.com/upload/storage/v1/b/<subproject-bucket>/o?uploadType=multipart - retry number: 1
```
@jorgen said that the OpenVDS converter uses this code to upload the result dataset:
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/blob/master/src/src/lib/cloud/providers/gcp/GcsStorage.cc#L384
@jorgen, please provide more info on what version of `seismic-dms-cpp-lib` you used in the latest SEG-Y to OpenVDS converter image.
cc:
@Siarhei_Khaletski @Kateryna_Kurach
(Milestone: M9 - Release 0.12)

## Issue #7: AWS: Missing regression/e2e tests execution in the pipeline
Opened by Diego Molteni, updated 2022-08-29
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/7

Regression/e2e tests are not executed in the pipeline.

## Issue #6: IBM: Missing regression/e2e tests execution in the pipeline
Opened by Diego Molteni, updated 2023-03-28
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/6

Regression/e2e tests are not executed in the pipeline.

## Issue #5: SDGenericDataset::readBlock(int blocknum, char **data, size_t &len) not correctly implemented in Azure backends
Opened by Jørgen Lind, updated 2021-06-08
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/5

There are some discrepancies in how this function is implemented in the different backends.
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/blob/master/src/src/lib/accessors/GcsAccessor.cc#L722
dictates that both `data` and `len` are out parameters: on function exit, `data` will point to a newly allocated buffer, and `len` will hold the size of that buffer.
This is implemented in a similar manner for AWS here:
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/blob/master/src/src/lib/cloud/providers/aws/AwsStorage.cc#L453
However, both Azure implementations seem to be broken:
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/blob/master/src/src/lib/cloud/providers/azure/cpprest/AzureStorage.cc#L430
This one seems to take `len` as a maximum-size input parameter and copies at most `len` or `objectSize` bytes, but it does not expose `objectSize` when `len > objectSize`.
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/blob/master/src/src/lib/cloud/providers/azure/curl/AzureStorage.cc#L586
This one dereferences the `data` pointer, which probably points to garbage if the function is called according to the other APIs' contract. It also doesn't expose the size of the object.
For OpenVDS it is convenient if this function allocates a buffer, fills it with the object data, and exposes the object size through `len`.
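The contract described above, as the GCS and AWS backends apparently implement it, can be sketched like this. The fixed payload and helper name are illustrative stand-ins for a real backend read, not sdapi code:

```cpp
#include <cstddef>
#include <cstring>

// Stand-in for the bytes a backend read would return.
static const char kObjectBytes[] = "blockdata";

// Out-parameter contract: allocate *data with new[], fill it with the whole
// object, and report the object size through len. The caller owns the buffer
// and must delete[] it. blocknum is unused in this sketch.
void readBlockSketch(int /*blocknum*/, char** data, std::size_t& len) {
    len = sizeof(kObjectBytes) - 1;   // object size, reported to the caller
    *data = new char[len];            // newly allocated buffer, caller-owned
    std::memcpy(*data, kObjectBytes, len);
}
```

Under this contract `len` is purely an output; an implementation that treats it as a maximum input size, or that reads `*data` before assigning it, breaks callers written against the GCS/AWS behaviour.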
## Issue #3: Access dataset in seismic-drive via cloud URL/credentials
Opened by Raghu Jayan Menon, updated 2021-06-07
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/3

Hello,
Is there a way to:
1. Get the fully qualified cloud URL for the backing cloud provider of a dataset, either via the API or through a REST endpoint to the service.
2. Is there a mechanism to generate/refresh credentials that can work with (1)?
I have tried to use the gcsUrl API; however, it does not, for example, provide `account_name` in the case of Azure, and the SDToken cannot be used outside SDAPI (I assume).
Thank you,
Raghu

## Issue #2: Setting up and integrating SDAPI with IBM implementation
Opened by Walter D, updated 2021-08-23
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/2

To continue the discussion from issue #4, created in the deprecated project Seismic DMS. Link to the old issue:
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms/-/issues/4

## Issue #1: devops: Dockerfiles do not have AWS SDK built in the base image
Opened by Rucha Deshpande, updated 2021-05-25
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/1

The base image required to build AWS artifacts does not include the AWS SDK.