seismic-dms-cpp-lib issueshttps://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues2021-06-07T19:20:09Zhttps://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/3Access dataset in seismi-drive via cloud url/credentials2021-06-07T19:20:09ZRaghu Jayan MenonAccess dataset in seismi-drive via cloud url/credentialsHello,
Is there a way to:
1. Get the fully qualified cloud URL of the backing cloud provider for a dataset, either via the API or through a REST endpoint to the service.
2. Is there a mechanism to generate/refresh credentials that can work with (1)?
I have tried to use the gcsUrl API; however, it does not, for example, provide the account_name in the case of Azure, and the SDToken cannot be used outside SDAPI (I assume).
Thank you,
Raghuhttps://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/13Add a possibility to work with Anthos/MinIO.2023-03-30T16:59:36ZYan Sushchynski (EPAM)Add a possibility to work with Anthos/MinIO.Hello!
We implemented Seismic DMS for the `Anthos` environment with MinIO as the storage backend. The MinIO implementation works with `s3` and mostly follows the AWS implementation.
For `sdutil`, all we needed was to override a single method of the `AwsStorageService` service; we just added the MinIO endpoint.
(https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-sdutil/-/merge_requests/50/diffs#316b48789c54e1de5292718db6abac8856b6cec3_0_48)
The problem is that the `Open VDS` and `Open ZGY` converters use `seismic-cpp-lib`, and there is no way to access MinIO storage to manipulate files.
I found that the library gets the information about the cloud provider, used for choosing a storage class, from the response header of the Seismic service
(https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/blob/master/src/src/lib/cloud/SeismicStore.cc#L704).
This value is taken from the `CLOUDPROVIDER` env var of the Seismic deployment. As our `CLOUDPROVIDER` value is `anthos`, `seismic-cpp-lib` chooses `GcsAccessorStorage` (
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/blob/master/src/src/lib/cloud/Storage.cc#L172)
What we need:
1. Choose `AwsStorage` if the cloud provider is `anthos`;
1. The possibility to override AWS's endpoint URL with an environment variable's value.
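The endpoint override in the second point could be as simple as consulting an environment variable before falling back to the default AWS endpoint. A minimal sketch (the variable name `MINIO_ENDPOINT_URL` is hypothetical, not an existing SDMS setting):

```cpp
#include <cstdlib>
#include <string>

// Sketch: prefer an endpoint supplied via the environment (e.g. a
// MinIO address) over the default AWS endpoint.
// MINIO_ENDPOINT_URL is a hypothetical variable name.
std::string resolveS3Endpoint(const std::string& defaultEndpoint) {
    if (const char* ep = std::getenv("MINIO_ENDPOINT_URL")) {
        return std::string(ep);
    }
    return defaultEndpoint;
}
```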
SDMS MR:
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/merge_requests/322
Thanks.
fyi: @Siarhei_Khaletski @jorgenhttps://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/7AWS Missing regression/e2e tests execution in the pipeline2022-08-29T04:56:35ZDiego MolteniAWS Missing regression/e2e tests execution in the pipelineMissing regression/e2e tests execution in the pipelineMissing regression/e2e tests execution in the pipelineMatt WiseMatt Wisehttps://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/10AWS sdk to be cloned from the official repo in the build images definition2023-03-30T16:39:15ZDiego MolteniAWS sdk to be cloned from the official repo in the build images definitionbuild images are provided with static dependencies to build both CSP dedicated and polycloud releases.
We had to patch the aws-sdk-cpp sources to be able to statically build them. The sdk should be cloned from the official repo (in both images [ubuntu](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/blob/master/devops/docker/build.ubuntu.staticdeps.dockerfile#L216)/[centos7](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/blob/master/devops/docker/build.centos7.staticdeps.dockerfile#L254)).Matt WiseMatt Wisehttps://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/20Azure add support for partitioned dnsZone blob endpoints2023-03-30T16:32:25ZFabien BosquetAzure add support for partitioned dnsZone blob endpointsRecent development in some Azure OSDU implementation shows that the blob endpoint is moving from [standard-endpoints](https://learn.microsoft.com/en-us/azure/storage/common/storage-account-overview#standard-endpoints) to [Azure DNS zone ...Recent development in some Azure OSDU implementation shows that the blob endpoint is moving from [standard-endpoints](https://learn.microsoft.com/en-us/azure/storage/common/storage-account-overview#standard-endpoints) to [Azure DNS zone endpoints](https://learn.microsoft.com/en-us/azure/storage/common/storage-account-overview#azure-dns-zone-endpoints-preview)
The current code in [azure/AzureCommon.cc](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/blob/master/src/src/lib/cloud/providers/azure/AzureCommon.cc#L32) is not ready to use the new Azure DNS zone endpoints.https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/23Bug in escape function defined in utils.cc2023-06-13T20:11:56ZMichaelBug in escape function defined in utils.ccIn the file https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/blob/master/src/src/lib/shared/utils.cc, line 264, there's an escape function defined. It contains a bug in line 270:
`if (c < 0 && (isalnum(c) || c == '-' || c == '_' || c == '.' || c == '~'))`
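For comparison, dropping the stray `c < 0` check gives the conventional unreserved-character test (a sketch, not the actual utils.cc code; note that `isalnum` should be given an `unsigned char` value to avoid undefined behavior for bytes above 0x7F):

```cpp
#include <cctype>

// A character is left unescaped only if it is an RFC 3986
// "unreserved" character: ALPHA / DIGIT / "-" / "_" / "." / "~"
bool isUnreserved(char ch) {
    const unsigned char c = static_cast<unsigned char>(ch);
    return std::isalnum(c) || c == '-' || c == '_' || c == '.' || c == '~';
}
```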
Specifically, the "if c < 0" condition shouldn't be there, at least from my point of view. As a result, the following code from the sdapi library:
```
http.set_url(_sdmanager->getSDUrl() +
             "/dataset/tenant/" + escape(tenant) +
             "/subproject/" + escape(subproject) +
             "/exist");
```
produces the following URL:
`"https://osdu-ship.msft-osdu-test.org/seistore-svc/api/v3/dataset/tenant/%6F%70%65%6E%64%65%73/subproject/%6D%69%63%68%61%65%6C%76%31/exist"`
instead of
`"https://osdu-ship.msft-osdu-test.org/seistore-svc/api/v3/dataset/tenant/opendes/subproject/michaelv1/exist"`https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/17Build fails when using Ninja generator2023-02-24T06:45:34ZPavel KisliakBuild fails when using Ninja generatorCurrently the build fails when using the Ninja generator; this is related to Ninja's slightly different mechanism of scanning dependencies ([more details](https://github.com/ninja-build/ninja/issues/760)).
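For what it's worth, this class of Ninja error ("missing and no known rule to make it") is usually resolved by declaring external-project outputs via `BUILD_BYPRODUCTS`, since Ninja, unlike the Makefile generators, must know which rule produces every declared dependency. A sketch with hypothetical target names, paths, and arguments:

```cmake
include(ExternalProject)
# Hypothetical sketch: telling Ninja which rule produces libcrc32c.a.
ExternalProject_Add(crc32c_external
  SOURCE_DIR ${CMAKE_SOURCE_DIR}/crc32c
  CMAKE_ARGS -DCRC32C_BUILD_TESTS=OFF
  BUILD_BYPRODUCTS ${CMAKE_BINARY_DIR}/crc32c/lib/libcrc32c.a
)
```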
How to reproduce:
```
cmake -GNinja -B build
cmake --build build --config Release
```
Build error:
> ninja: error: 'crc32c/lib/libcrc32c.a', needed by 'libsdapi.so.0.0.0', missing and no known rule to make itPavel KisliakPavel Kisliakhttps://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/12Compilation conflict with Windows.h2022-08-29T04:56:48ZAlberto SecchiCompilation conflict with Windows.hI have a Windows application consuming SDAPI. In my code I have to include both Windows.h and SDEXception.h.
When compiling the application, I have the following error:
illegal token on right side of '::' (SDEXception.h, line 294)
This is due to the fact that Windows.h defines `min` and `max` function-like macros, which conflict with the `std::numeric_limits<std::uint32_t>::max()` call (SDEXception.h, line 294).
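Two common workarounds are to define `NOMINMAX` before the first `Windows.h` include, or to parenthesize the call so the function-like macro cannot expand. A portable sketch of the latter (not the actual SDEXception.h code):

```cpp
#include <cstdint>
#include <limits>

// Parenthesizing the name suppresses function-like macro expansion,
// so this compiles even when Windows.h has defined a max() macro.
std::uint32_t maxU32() {
    return (std::numeric_limits<std::uint32_t>::max)();
}
```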
See also [this](https://stackoverflow.com/questions/11544073/how-do-i-deal-with-the-max-macro-in-windows-h-colliding-with-max-in-std) article for further information.https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/1devops: Dockerfiles do not have AWS SDK built in the base image2021-05-25T12:51:52ZRucha Deshpandedevops: Dockerfiles do not have AWS SDK built in the base imageThe base image required to build AWS artifacts does not include the AWS SDK.Rucha DeshpandeDiego MolteniRucha Deshpandehttps://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/11enable performance tests analisys in the pipeline2023-03-30T16:38:04ZDiego Moltenienable performance tests analisys in the pipelineA certain level of performance through sdapi should be guaranteed by all CSP implementations. There are already simple program examples ([pwrite](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/blob/master/src/test/seismic-store/performance_write.cc), [pread](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/blob/master/src/test/seismic-store/performance_read.cc)) and a more complete one [here](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/tree/master/src/test/performance) (evaluating read-write performance on different combinations of thread counts and block sizes) that can be used in the pipeline to guarantee library performance.
The latter can be built using the provided build script (passing the --build-ptest option; see README.md). Both AWS and IBM should start looking into these examples to understand whether they work for them and how we can generally enable them in the pipeline (define acceptance levels and execution context). I’ve created a bug hereDiego MolteniDiego Moltenihttps://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/8[GCP] HTTP/2 stream 0 was not closed cleanly: PROTOCOL_ERROR (1)2021-12-10T19:04:55ZYan Sushchynski (EPAM)[GCP] HTTP/2 stream 0 was not closed cleanly: PROTOCOL_ERROR (1)Hi!
When we attempt to convert Segy to OpenVDS using the latest image `community.opengroup.org:5555/osdu/platform/domain-data-mgmt-services/seismic/open-vds/openvds-ingestion:2.1.8` in `SSDMS`, we get the following error:
```
(ERROR) HTTPRequest::Send, performing request, detailed error: HTTP/2 stream 0 was not closed cleanly: PROTOCOL_ERROR (err 1) The error occurred while executing an HTTP request. error code: 92, error message: 'Stream error in the HTTP/2 framing layer']
Retrying http_request for POST request with https://storage.googleapis.com/upload/storage/v1/b/<subproject-bucket>/o?uploadType=multipart - retry number: 1
```
@jorgen said that OpenVDS convertor used this code to upload the result dataset:
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/blob/master/src/src/lib/cloud/providers/gcp/GcsStorage.cc#L384
@jorgen, please, provide more info on what version of `seismic-dms-cpp-lib` you used in the latest Segy->OpenVDS converter image.
cc:
@Siarhei_Khaletski @Kateryna_KurachM9 - Release 0.12Sehubo AkinyanmiSehubo Akinyanmihttps://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/6IBM Missing regression/e2e tests execution in the pipeline2023-03-28T06:12:54ZDiego MolteniIBM Missing regression/e2e tests execution in the pipelineMissing regression/e2e tests execution in the pipelineMissing regression/e2e tests execution in the pipelineWalter DWalter Dhttps://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/22IBM service not responding correctly on Seismic DMS Service dataset list endp...2023-03-30T16:31:30ZRashaad GrayIBM service not responding correctly on Seismic DMS Service dataset list endpointPOST /dataset/tenant/{tenantid}/subproject/{subprojectid}
Get the list of datasets in a subproject.
The endpoint is not responding correctly, causing an error with the e2e tests; we will skip the specific test for now.https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/9Issue in build the seismic-store-cpp-lib2022-01-04T16:56:11ZHouari ZegaiIssue in build the seismic-store-cpp-libWe cloned the repo (with submodules, recursively) and followed the build instructions in the README file (under the `Windows build instructions` section) to compile and install the library locally on Windows 10 64-bit. We have installed Visual Studio 2017 along with the MFC and ATL optional sub-components, and vcpkg.
When we run the build for Windows using the command `.\devops\scripts\build-win64.ps1`, we get the following error:
```
-- Configuring done
-- Generating done
CMake Warning:
Manually-specified variables were not used by the project:
CMAKE_TOOLCHAIN_FILE
-- Build files have been written to: D:/seismic-store-cpp-lib/_build_Release_plain
Microsoft (R) Build Engine version 15.9.21+g9802d43bc3 for .NET Framework
Copyright (C) Microsoft Corporation. All rights reserved.
Build started 1/4/2022 5:08:33 PM.
Project "D:\seismic-store-cpp-lib\_build_Release_plain\ALL_BUILD.vcxproj" on node 1 (default targets).
Project "D:\seismic-store-cpp-lib\_build_Release_plain\ALL_BUILD.vcxproj" (1) is building "D:\seismic-store-cpp-lib\_build_Release_plain\ZERO_CHECK.vcxproj" (2) on node 1 (default targets).
InitializeBuildStatus:
Creating "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\unsuccessfulbuild" because "AlwaysCreate" was specified.
CustomBuild:
All outputs are up-to-date.
FinalizeBuildStatus:
Deleting file "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\unsuccessfulbuild".
Touching "x64\Release\ZERO_CHECK\ZERO_CHECK.tlog\ZERO_CHECK.lastbuildstate".
Done Building Project "D:\seismic-store-cpp-lib\_build_Release_plain\ZERO_CHECK.vcxproj" (default targets).
Project "D:\seismic-store-cpp-lib\_build_Release_plain\ALL_BUILD.vcxproj" (1) is building "D:\seismic-store-cpp-lib\_build_Release_plain\crc32c.vcxproj" (3) on node 1 (default targets).
InitializeBuildStatus:
Touching "x64\Release\crc32c\crc32c.tlog\unsuccessfulbuild".
CustomBuild:
  Creating directories for 'crc32c'
  Building Custom Rule D:/seismic-store-cpp-lib/src/CMakeLists.txt
  Performing download step (DIR copy) for 'crc32c'
  No update step for 'crc32c'
  Performing patch step for 'crc32c'
C:\Program Files (x86)\Microsoft Visual Studio\2017\Community\Common7\IDE\VC\VCTargets\Microsoft.CppCommon.targets(209,5): error MSB6006: "cmd.exe" exited with code 1. [D:\seismic-store-cpp-lib\_build_Release_plain\crc32c.vcxproj]
Done Building Project "D:\seismic-store-cpp-lib\_build_Release_plain\crc32c.vcxproj" (default targets) -- FAILED.
Done Building Project "D:\seismic-store-cpp-lib\_build_Release_plain\ALL_BUILD.vcxproj" (default targets) -- FAILED.
Build FAILED.
"D:\seismic-store-cpp-lib\_build_Release_plain\ALL_BUILD.vcxproj" (default target) (1) ->
"D:\seismic-store-cpp-lib\_build_Release_plain\crc32c.vcxproj" (default target) (3) ->
(CustomBuild target) ->
C:\Program Files (x86)\Microsoft Visual Studio\2017\Community\Common7\IDE\VC\VCTargets\Microsoft.CppCommon.targets(209,5): error MSB6006: "cmd.exe" exited with code 1. [D:\seismic-store-cpp-lib\_build_Release_plain\crc32c.vcxproj]
0 Warning(s)
1 Error(s)
Time Elapsed 00:00:05.33
D:\seismic-store-cpp-lib\devops\scripts\build-win64.ps1 : Failed on cmake build plain. Please scroll up to see verbose result of cmake command.
At line:1 char:1
+ .\devops\scripts\build-win64.ps1
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [Write-Error], WriteErrorException
+ FullyQualifiedErrorId : Microsoft.PowerShell.Commands.WriteErrorException,build-win64.ps1
```
Is there any issue with the build process or environment?https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/19New versions does not support m11 release or older2023-03-30T16:33:04ZJørgen Lindjorgen.lind@3lc.aiNew versions does not support m11 release or olderIt seems that new versions of seismic-dms-cpp-lib do not support the m11 release, giving the error message:
`sdapi 3.16.0 - SeismicStore::DatasetRegister: Error executing an HTTP request [ HTTP 404 ]`https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/14Pipeline not starting for master and protected branches2022-06-15T05:55:48ZWalter DPipeline not starting for master and protected branchesStarting a pipeline for master or a newly created protected branch results in the following error message:
Pipeline cannot be run.
'container_scanning' job needs 'azure_containerize' job, but 'azure_containerize' is not in any previous stage
![image](/uploads/4654b10ae7a0011077e09c238ec295a6/image.png)Walter DWalter Dhttps://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/5SDGenericDataset::readBlock(int blocknum, char **data, size_t &len) not corre...2021-06-08T19:40:54ZJørgen Lindjorgen.lind@3lc.aiSDGenericDataset::readBlock(int blocknum, char **data, size_t &len) not correctly implemented in Azure backendsThere are some discrepancies in how this function is implemented in the different backends.
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/blob/master/src/src/lib/accessors/GcsAccessor.cc#L722
dictates that both `data` and `len` are out parameters: on function exit, `data` will point to a newly allocated buffer, and `len` will hold the size of that buffer.
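Sketched with simplified names (these are not the actual SDAPI signatures), the contract described above is: the callee allocates, the caller receives the buffer and its size:

```cpp
#include <cstddef>
#include <cstring>

// Sketch of the out-parameter contract: allocate a buffer of the
// object's size, copy the object into it, and report the size
// through len. The caller owns (and must delete[]) the buffer.
void readBlockContract(const char* object, std::size_t objectSize,
                       char** data, std::size_t& len) {
    len = objectSize;
    *data = new char[objectSize];
    std::memcpy(*data, object, objectSize);
}
```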
This is implemented in a similar manner for AWS here:
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/blob/master/src/src/lib/cloud/providers/aws/AwsStorage.cc#L453
However, the implementation in both Azure backends seems to be broken:
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/blob/master/src/src/lib/cloud/providers/azure/cpprest/AzureStorage.cc#L430
It seems to take `len` as a maximum-size input parameter and copies at most `len` or `objectSize` bytes, but it does not expose the `objectSize` if `len` > the `objectSize`.
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/blob/master/src/src/lib/cloud/providers/azure/curl/AzureStorage.cc#L586
It dereferences the `data` pointer, which probably points to garbage if the function is used according to the other APIs. Also, it doesn't expose the size of the object either.
For OpenVDS it's convenient if this function would allocate a buffer, fill it with the object data, and expose the object size through len.https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/2Setting up and integrating SDAPI with IBM implementation2021-08-23T10:53:21ZWalter DSetting up and integrating SDAPI with IBM implementationTo continue the discussion from issue #4 created in the deprecated project Seismic DMS. Link to the old issue -
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms/-/issues/4https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/21Unable to use OpenVDS and OpenZGY libraries to access data in Seismic DDMS on...2024-02-26T21:52:50ZMichaelUnable to use OpenVDS and OpenZGY libraries to access data in Seismic DDMS on AWS M15More details can be found here: https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/439
Both OpenVDS and OpenZGY data has been ingested into Seismic DDMS on AWS M15. I was able to download the OpenZGY file using SDUtil.
However, when I try to access the OpenZGY data using the OpenZGY library, I get the following exception:
`Initialize: Seismic Store: sdapi 3.16.0 - Encountered network error when sending http request`
When I try to access the OpenVDS data using the latest OpenVDS library, I get the following exception:
`-- sdapi 3.17.0 - Fri Feb 17 16:43:30 2023 -- Head Object error: - Encountered network error when sending http request`
I can download the OpenZGY data using the latest version of SDUtil. This leads me to think that the SDAPI library is not able to properly access seismic data in Seismic DDMS for AWS M15.
Is there another version of the sdapi library that I should be using with the OpenVDS and OpenZGY libraries?https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/16Using VCPKG manifest file2023-06-28T12:29:17ZPavel KisliakUsing VCPKG manifest fileCan we think about starting to use [VCPKG manifests](https://vcpkg.readthedocs.io/en/latest/specifications/manifests/)?
I see that VCPKG is already used for the Windows build; I hope it can also be unified, because VCPKG is available for Windows/Linux/Mac.
In addition to helping to get a first build faster, there are other benefits:
- It will help to avoid interference with globally installed libraries across different projects.
- The manifest allows sticking to specified versions of third-party libraries.
- It will reduce the complexity of the CMake by removing constructions like "if (WIN32) else" for linking dependencies.
- It allows moving toward publishing 'seismic-store-cpp-lib' to VCPKG.
As a starting point, I've prepared a branch with a [VCPKG manifest](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/commit/9ee956e09612cb4f69d174147db2d040040656ef).
Currently I have not added the **aws-sdk-cpp** and **google-cloud-cpp** libraries, because the linking in the CMake needs to be fixed.
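For readers unfamiliar with manifest mode: such a branch adds a `vcpkg.json` at the repository root of roughly this shape (the dependency list below is illustrative, not the actual file from the branch):

```json
{
  "name": "seismic-store-cpp-lib",
  "version-string": "0.0.1",
  "dependencies": [
    "curl",
    "crc32c",
    "rapidjson"
  ]
}
```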
How to build:
```
cmake -B build -DCMAKE_TOOLCHAIN_FILE=~/vcpkg/scripts/buildsystems/vcpkg.cmake -DVCPKG_FEATURE_FLAGS=versions
cmake --build build --config Release
```
[Please correct the path to the installed VCPKG]
The VCPKG way should work on all platforms, but currently there exist a few issues which should be fixed.
To have a better experience with Visual Studio, I've also prepared a "CMakeSettings.json" file
that allows you to just use the "Open Folder" command from VS and build without any other configuration actions.
(Just VCPKG needs to be installed, and its path added to the VCPKG_ROOT environment variable.)
One thing to keep in mind: VCPKG does not officially support dynamic linkage on Linux, which
is related to system-provided libraries ([more info](https://github.com/microsoft/vcpkg/issues/15006)).
But there exists a community-supported triplet which can be used at our own risk.
As I am new to OSDU and to this particular library, please point out any other impediments you see.
Edited 1/26/2023: Btw, the same work was done for [Reservoir/Open-ETP-server](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/reservoir/open-etp-server/-/issues/30)