Commit 95817dc5 authored by Daniel Perez's avatar Daniel Perez

ci: merge master

parents 6dccef65 94c74c7f
Pipeline #78812 failed
......@@ -6,15 +6,18 @@ Seismic Store is a Schlumberger cloud-based solution designed to store and manag
The sdapi library allows you to easily interact with seismic store and perform operations such as uploading, downloading and managing datasets of any size.
- [Linux Build Instructions](#linux-build-instructions)
- [Linux Build DockerFile](#linux-build-dockerfile)
- [Windows Build Instructions](#windows-build-instructions)
- [Environment Control Variables](#environment-control-variables)
- [Usage Examples](#usage-examples)
- [Linux build instructions](#linux-build-instructions)
- [Linux build Dockerfile](#linux-build-dockerfile)
- [Windows build instructions](#windows-build-instructions)
- [Environment variables](#environment-variables)
- [Usage examples](#usage-examples)
- [Exception handling](#exception-handling)
![components interaction diagram](docs/components-interaction-diagram-plain.png)
## Linux Build Instructions
---
## Linux build instructions
The sdapi library requires third-party storage providers to be enabled.
The sdapi shared library only exports its publicly visible API and symbols.
......@@ -155,7 +158,9 @@ docker run \
--build-ftest --build-utest --build-rtest --build-ptest
```
## Linux Build DockerFile
---
## Linux build Dockerfile
The library can be built using one of the provided Docker container definitions under [devops/docker](devops/docker).
CentOS 7, CentOS 8 and Ubuntu 20.04 Dockerfiles are provided, both for the hidden-dependencies and the shared-library variants.
......@@ -198,7 +203,9 @@ docker run \
/lib64/ld-linux-x86-64.so.2 (0x00007fd1f4f47000)
```
## Windows Build Instructions
---
## Windows build instructions
To build sdapi for Windows, [Visual Studio 2017](https://docs.microsoft.com/en-us/visualstudio/releasenotes/vs2017-relnotes) is required. In addition, the optional [MFC and ATL](https://docs.microsoft.com/en-us/cpp/mfc/mfc-and-atl?view=vs-2019) sub-components must be checked under the *Desktop development with C++* workload in the Visual Studio Installer.
......@@ -232,7 +239,9 @@ PS .\devops\scripts\build-win64.ps1
PS .\devops\scripts\build-win64.ps1 -build_type 'Release' -vcpkg_path 'D:/vcpkg' -vcversion_cmake 'Visual Studio 15 2017 Win64' -lib_version '3.5.323'
```
## Environment Control Variables
---
## Environment variables
Library behaviors can be controlled via these environment variables:
......@@ -242,7 +251,9 @@ Library behaviors can be controlled via these environment variables:
- DISABLE_SDAPI_401_RETRY=ON|OFF, enable or disable HTTP 401 retries for SDAPI (retries are enabled by default)
- SDAPI_LOGLEVEL=0|1|..., sets the logging level (0 by default)
## Usage Examples
---
## Usage examples
The SDManager is the main configuration object for seismic store. It stores and manages the service URL, key and user credentials. It must be initialized first in client code.
......@@ -371,3 +382,146 @@ void downloader(
```
More examples can be found in the [src/test/seismic-store](src/test/seismic-store) folder.
---
## Exception handling
### Classes
```
seismicdrive::
error::authprovider::
Error
error::dataset::
Error
OpenFailed
DeleteFailed
FlushFailed
NotOpen
AlreadyOpen
UpdateFailed
InvalidPrefix
InvalidPath
InvalidNodeId
Locked
error::dataset::context::
NotReadOnly
Expired
UnknownFormat
PathMismatch
error::format::
Error
Json
ToJson
FromJson
error::internal::
Error
error::manager::
Error
AuthProviderNotSet
ServiceUrlNotSet
ApiKeyNotSet
context::UnknownFormat
error::seismicstore::
ActionFailed
NotJson
error::storage::
Error
NotImplemented
InvalidSasUri
ProviderError
NotBuilt
AccessError
```
### Usage example
```cpp
#include "SDException.h"
#include "SDGenericDataset.h"
#include "SDManager.h"
#include <iostream>
using namespace seismicdrive;
void func(SDManager *manager)
{
try
{
std::string datasetName = "XXX";
SDGenericDataset dataset(manager, datasetName);
dataset.open(SDDatasetDisposition::OVERWRITE);
char buffer[100] = {};
dataset.writeBlock(0, buffer, sizeof buffer);
dataset.close();
std::cout << "OK\n";
}
catch (error::manager::AuthProviderNotSet &e)
{
std::cout << "Auth provider must be set\n\n"
<< e.what() << '\n';
}
catch (error::dataset::Error &e)
{
auto path = e.getPath();
std::cout << "Dataset error for " << path << "\n\n"
<< e.what() << '\n';
}
catch (SDException &e)
{
std::cout << "Something went wrong\n\n"
<< e.what() << '\n';
}
}
```
Possible output:
```
Dataset error for XXX

sdapi 3.14.0 - Invalid dataset path: XXX
Expected format: sd://<tenant_name>/<subproject_name>/<path>/<dataset_name>
```
### Migration guide
Steps to migrate software from sdapi older than 3.14:
1. Rebuild software with the new sdapi
1. Verify that it still runs ok
1. Enable compiler deprecation warnings
1. Rebuild
1. Address each warning by changing catch statements to catch the new exceptions instead of the old ones
1. Rebuild
1. Verify that it still runs ok
### Superseded classes
|Old catch|New catch|
|:---|:---|
|```SDException```|```SDException``` (same as before)|
|```SDExceptionAuthProviderError```|```error::authprovider::Error```|
|```SDExceptionAzureStorageError```|```error::storage::Error```|
|```SDExceptionContextExpired```|```error::dataset::context::Expired```|
|```SDExceptionContextPathNoMatch```|```error::dataset::context::PathMismatch```|
|```SDExceptionContextVersionTooNew```|```error::dataset::context::UnknownFormat```|
|```SDExceptionDatasetError```|```error::dataset::Error```|
|```SDExceptionDatasetLocked```|```error::dataset::Locked```|
|```SDExceptionGCSAccessorError```|```error::storage::AccessError```|
|```SDExceptionInternalError```|```error::internal::Error```|
|```SDExceptionOutdatedMethod```|N/A, not thrown by the SDAPI|
|```SDExceptionSDAccessorError```|```error::seismicstore::ActionFailed```|
|```SDExceptionStorageError```|```error::storage::Error```|
|```SDExceptionValueError```|```error::format::Json```|
|```SDExpectedJsonException```|```error::seismicstore::NotJson```|
|```SDExpectedReadOnlyDatasetException```|```error::dataset::context::NotReadOnly```|
......@@ -3,6 +3,7 @@ abi-check-for-sdapi-azure:
tags: ["osdu-small"]
stage: scan
needs: ['compile-and-unit-test-azure']
allow_failure: true
script:
- yum update -y
- yum -y install https://dl.fedoraproject.org/pub/epel/epel-release-latest-7.noarch.rpm
......@@ -24,6 +25,8 @@ abi-check-for-sdapi-azure:
- echo "Current branch $current_version"
- chmod +x ./devops/scripts/abi-check.sh
- ./devops/scripts/abi-check.sh
- if [ -f logs/sdapi/$current_version/log.txt ]; then cat logs/sdapi/$current_version/log.txt; fi
- if [ -f logs/sdapi/$master_version/log.txt ]; then cat logs/sdapi/$master_version/log.txt; fi
- mkdir compat_reports/sdapi/azure
- mv compat_reports/sdapi/${master_version}_to_${current_version} compat_reports/sdapi/azure/.
artifacts:
......@@ -36,6 +39,7 @@ abi-check-for-sdapi-azure-curl:
tags: ["osdu-small"]
stage: scan
needs: ['compile-and-unit-test-azure-curl']
allow_failure: true
script:
- yum update -y
- yum -y install https://dl.fedoraproject.org/pub/epel/epel-release-latest-7.noarch.rpm
......@@ -57,6 +61,8 @@ abi-check-for-sdapi-azure-curl:
- echo "Current branch $current_version"
- chmod +x ./devops/scripts/abi-check.sh
- ./devops/scripts/abi-check.sh
- if [ -f logs/sdapi/$current_version/log.txt ]; then cat logs/sdapi/$current_version/log.txt; fi
- if [ -f logs/sdapi/$master_version/log.txt ]; then cat logs/sdapi/$master_version/log.txt; fi
- mkdir compat_reports/sdapi/azure_curl
- mv compat_reports/sdapi/${master_version}_to_${current_version} compat_reports/sdapi/azure_curl/.
artifacts:
......
......@@ -3,6 +3,7 @@ abi-check-for-sdapi-ibm:
tags: ["osdu-small"]
stage: scan
needs: ['compile-and-unit-test-ibm']
allow_failure: true
script:
- yum update -y
- yum -y install https://dl.fedoraproject.org/pub/epel/epel-release-latest-7.noarch.rpm
......@@ -24,6 +25,8 @@ abi-check-for-sdapi-ibm:
- echo "Current branch $current_version"
- chmod +x ./devops/scripts/abi-check.sh
- ./devops/scripts/abi-check.sh
- if [ -f logs/sdapi/$current_version/log.txt ]; then cat logs/sdapi/$current_version/log.txt; fi
- if [ -f logs/sdapi/$master_version/log.txt ]; then cat logs/sdapi/$master_version/log.txt; fi
- mkdir compat_reports/sdapi/ibm
- mv compat_reports/sdapi/${master_version}_to_${current_version} compat_reports/sdapi/ibm/.
artifacts:
......
......@@ -3,6 +3,7 @@ abi-check-for-sdapi-polycloud:
tags: ["osdu-small"]
stage: scan
needs: ['compile-and-unit-test-polycloud']
allow_failure: true
script:
- yum update -y
- yum -y install https://dl.fedoraproject.org/pub/epel/epel-release-latest-7.noarch.rpm
......@@ -24,6 +25,8 @@ abi-check-for-sdapi-polycloud:
- echo "Current branch $current_version"
- chmod +x ./devops/scripts/abi-check.sh
- ./devops/scripts/abi-check.sh
- if [ -f logs/sdapi/$current_version/log.txt ]; then cat logs/sdapi/$current_version/log.txt; fi
- if [ -f logs/sdapi/$master_version/log.txt ]; then cat logs/sdapi/$master_version/log.txt; fi
- mkdir compat_reports/sdapi/polycloud
- mv compat_reports/sdapi/${master_version}_to_${current_version} compat_reports/sdapi/polycloud/.
artifacts:
......
......@@ -317,15 +317,15 @@ else
rm -rf $build_dir/*
fi
# clean the dist directory
# set the dist dir
dist_dir=${mnt_volume}/dist
if [ -d ${dist_dir} ]; then
rm -rf ${dist_dir}
fi
# set the install directory
# set the install directory
install_dir=${dist_dir}/${build_mode}
# remove the install directory
rm -rf ${install_dir}
# ============================================================================
# Build
# ============================================================================
......@@ -358,7 +358,7 @@ args+=( "-Wno-dev" )
cmake -B${build_dir} -H${mnt_volume}/src "${args[@]}"
# make
cmake --build ${build_dir} -- -j $(nproc)
cmake --build ${build_dir} -j $(nproc)
if [ $? -ne 0 ]; then exit 1; fi
# install
......
......@@ -117,7 +117,7 @@ if ($build_static){
$build_static_str = "-DENABLE_STATIC=ON"
}
&cmake $src_dir -G $vcversion_cmake $tests -DOPTIONAL_STORAGE_PROVIDERS_ENABLED:STRING="$providers" -DCMAKE_INSTALL_PREFIX="$dist_dir_name" -DCMAKE_BUILD_TYPE="$build_type" -DCMAKE_TOOLCHAIN_FILE="$vcpkg_path\scripts\buildsystems\vcpkg.cmake" -DLIB_VERSION_MAJOR="$lib_version_major" -DLIB_VERSION_MINOR="$lib_version_minor" -DLIB_VERSION_PATCH="$lib_version_patch" $libcurl_version -DLIB_VERSION_ON_NAME=ON $build_static_str
&cmake -j ${env:NUMBER_OF_PROCESSORS} $src_dir -G $vcversion_cmake $tests -DOPTIONAL_STORAGE_PROVIDERS_ENABLED:STRING="$providers" -DCMAKE_INSTALL_PREFIX="$dist_dir_name" -DCMAKE_BUILD_TYPE="$build_type" -DCMAKE_TOOLCHAIN_FILE="$vcpkg_path\scripts\buildsystems\vcpkg.cmake" -DLIB_VERSION_MAJOR="$lib_version_major" -DLIB_VERSION_MINOR="$lib_version_minor" -DLIB_VERSION_PATCH="$lib_version_patch" $libcurl_version -DLIB_VERSION_ON_NAME=ON $build_static_str
if($LASTEXITCODE -ne 0) {
Write-Error "Failed on cmake (configuration build). Rerunning with verbose option."
......@@ -132,7 +132,7 @@ if($LASTEXITCODE -ne 0) {
# Switch to $bld_dir_plain
Set-Location $bld_dir_plain
&cmake $src_dir -G $vcversion_cmake $tests -DOPTIONAL_STORAGE_PROVIDERS_ENABLED:STRING="$providers" -DCMAKE_INSTALL_PREFIX="$dist_dir_name" -DCMAKE_BUILD_TYPE="$build_type" -DCMAKE_TOOLCHAIN_FILE="$vcpkg_path\scripts\buildsystems\vcpkg.cmake" -DLIB_VERSION_MAJOR="$lib_version_major" -DLIB_VERSION_MINOR="$lib_version_minor" -DLIB_VERSION_PATCH="$lib_version_patch" $libcurl_version $build_static_str
&cmake -j ${env:NUMBER_OF_PROCESSORS} $src_dir -G $vcversion_cmake $tests -DOPTIONAL_STORAGE_PROVIDERS_ENABLED:STRING="$providers" -DCMAKE_INSTALL_PREFIX="$dist_dir_name" -DCMAKE_BUILD_TYPE="$build_type" -DCMAKE_TOOLCHAIN_FILE="$vcpkg_path\scripts\buildsystems\vcpkg.cmake" -DLIB_VERSION_MAJOR="$lib_version_major" -DLIB_VERSION_MINOR="$lib_version_minor" -DLIB_VERSION_PATCH="$lib_version_patch" $libcurl_version $build_static_str
if($LASTEXITCODE -ne 0) {
Write-Error "Failed on cmake (configuration build plain). Rerunning with verbose option."
......@@ -147,7 +147,7 @@ if($LASTEXITCODE -ne 0) {
# Switch to $bld_dir
Set-Location $bld_dir
&cmake --build . --config "$build_type"
&cmake --build . --config "$build_type" -j ${env:NUMBER_OF_PROCESSORS}
if($LASTEXITCODE -ne 0) {
Write-Error "Failed on cmake build. Rerunning with verbose option."
......@@ -164,7 +164,7 @@ if($LASTEXITCODE -ne 0) {
# Switch to $bld_dir_plain
Set-Location $bld_dir_plain
&cmake --build . --config "$build_type"
&cmake --build . --config "$build_type" -j ${env:NUMBER_OF_PROCESSORS}
if($LASTEXITCODE -ne 0) {
Write-Error "Failed on cmake build plain. Rerunning with verbose option."
......@@ -190,4 +190,4 @@ Copy-Item "$osdu_dir/LICENSE" -Destination "$dist_dir/$build_type/sdapi-$version
Copy-Item "$bld_dir/version.txt" -Destination "$dist_dir/$build_type/sdapi-$version" -Recurse -Force
Compress-Archive -Path "$dist_dir/$build_type/sdapi-$version" -Update -DestinationPath "$zip_name"
Set-Location $work_dir
\ No newline at end of file
Set-Location $work_dir
......@@ -234,6 +234,10 @@ if (AWS_PROVIDER_ENABLED)
endif()
if (IBM_PROVIDER_ENABLED)
file(GLOB SRC_LIB_PROVIDERS_IBM ${sdapi_SOURCE_DIR}/src/lib/cloud/providers/ibm/*.cc)
if (NOT AWS_PROVIDER_ENABLED)
file(GLOB SRC_LIB_PROVIDERS_IBM_EXTRA ${sdapi_SOURCE_DIR}/src/lib/cloud/providers/aws/*.cc)
list(APPEND SRC_LIB_PROVIDERS_IBM "${SRC_LIB_PROVIDERS_IBM_EXTRA}")
endif()
endif()
if (GCP_PROVIDER_ENABLED)
file(GLOB SRC_LIB_PROVIDERS_GCP ${sdapi_SOURCE_DIR}/src/lib/cloud/providers/gcp/*.cc)
......
......@@ -35,19 +35,14 @@
#endif
#endif
#define DLL_LOCAL
#define DEPRECATED(x) [[deprecated("Deprecated: " x)]]
#else
#if __GNUC__ >= 4
#ifdef NO_DLL_PUBLIC
#define DLL_PUBLIC
#else
#define DLL_PUBLIC __attribute__((visibility("default")))
#endif
#define DLL_LOCAL __attribute__((visibility("hidden")))
#else
#ifdef NO_DLL_PUBLIC
#define DLL_PUBLIC
#define DLL_LOCAL
#else
#define DLL_PUBLIC [[gnu::visibility("default")]]
#endif
#define DEPRECATED(x) [[deprecated(x)]]
#endif
......@@ -82,9 +82,9 @@ namespace seismicdrive
_ctag = stringFromArchive(archive);
_type = stringFromArchive(archive);
_cloud_provider = stringFromArchive(archive);
_metadata = JsonUtils::tojson(stringFromArchive(archive));
_filemetadata = JsonUtils::tojson(stringFromArchive(archive));
_seismicmeta = JsonUtils::tojson(stringFromArchive(archive));
_metadata = jsonutils::toJson(stringFromArchive(archive));
_filemetadata = jsonutils::toJson(stringFromArchive(archive));
_seismicmeta = jsonutils::toJson(stringFromArchive(archive));
}
Dataset()
......@@ -102,27 +102,27 @@ namespace seismicdrive
{
}
Dataset(const Json::Value &root, const std::string &cloud_provider = "", const std::string &info = "")
Dataset(const json::Value &root, const std::string &cloud_provider, const std::string &info)
: _cloud_provider(cloud_provider)
{
_name = JsonUtils::getStringValueForVariableFromJSON(root, "name", true, info);
_path = JsonUtils::getStringValueForVariableFromJSON(root, "path", true, info);
_created_by = JsonUtils::getStringValueForVariableFromJSON(root, "created_by", false, info);
_created_date = JsonUtils::getStringValueForVariableFromJSON(root, "created_date", true, info);
_last_modified_date = JsonUtils::getStringValueForVariableFromJSON(root, "last_modified_date", true, info);
_storage_url = JsonUtils::getStringValueForVariableFromJSON(root, "gcsurl", true, info);
_ctag = JsonUtils::getStringValueForVariableFromJSON(root, "ctag", true, info);
_type = JsonUtils::getStringValueForVariableFromJSON(root, "type", false, info);
_ltag = JsonUtils::getStringValueForVariableFromJSON(root, "ltag", false, info);
_tags = JsonUtils::getStringVectorForVariableFromJSON(root, "gtags", false, info);
_tenant_name = JsonUtils::getStringValueForVariableFromJSON(root, "tenant", true, info);
_sub_project_name = JsonUtils::getStringValueForVariableFromJSON(root, "subproject", true, info);
_metadata = root.get("metadata", Json::Value::null);
_filemetadata = root.get("filemetadata", Json::Value::null);
_seismicmeta = root.get("seismicmeta", Json::Value::null);
_sbit = JsonUtils::getStringValueForVariableFromJSON(root, "sbit", false, info);
_sbit_count = JsonUtils::getStringValueForVariableFromJSON(root, "sbit_count", false, info);
_readonly = JsonUtils::getStringValueForVariableFromJSON(root, "readonly", false, info) == "true";
_name = jsonutils::getString(root, "name", info);
_path = jsonutils::getString(root, "path", info);
_created_by = jsonutils::getString(root, "created_by", info, false);
_created_date = jsonutils::getString(root, "created_date", info);
_last_modified_date = jsonutils::getString(root, "last_modified_date", info);
_storage_url = jsonutils::getString(root, "gcsurl", info);
_ctag = jsonutils::getString(root, "ctag", info);
_type = jsonutils::getString(root, "type", info, false);
_ltag = jsonutils::getString(root, "ltag", info, false);
_tags = jsonutils::getStringVector(root, "gtags", info, false);
_tenant_name = jsonutils::getString(root, "tenant", info);
_sub_project_name = jsonutils::getString(root, "subproject", info);
_metadata = root.get("metadata", json::Value::null);
_filemetadata = root.get("filemetadata", json::Value::null);
_seismicmeta = root.get("seismicmeta", json::Value::null);
_sbit = jsonutils::getString(root, "sbit", info, false);
_sbit_count = jsonutils::getString(root, "sbit_count", info, false);
_readonly = jsonutils::getString(root, "readonly", info, false) == "true";
}
// ---------------------------------------------------------------
......@@ -177,17 +177,17 @@ namespace seismicdrive
return _last_modified_date;
}
Json::Value get_metadata() const
json::Value get_metadata() const
{
return _metadata;
}
Json::Value get_filemetadata() const
json::Value get_filemetadata() const
{
return _filemetadata;
}
Json::Value get_seismicmeta() const
json::Value get_seismicmeta() const
{
return _seismicmeta;
}
......@@ -259,17 +259,17 @@ namespace seismicdrive
_last_modified_date = last_modified_date;
}
void set_metadata(const Json::Value &metadata)
void set_metadata(const json::Value &metadata)
{
_metadata = metadata;
}
void set_filemetadata(const Json::Value &filemetadata)
void set_filemetadata(const json::Value &filemetadata)
{
_filemetadata = filemetadata;
}
void set_seismicmeta(const Json::Value &seismicmeta)
void set_seismicmeta(const json::Value &seismicmeta)
{
_seismicmeta = seismicmeta;
}
......@@ -330,9 +330,9 @@ namespace seismicdrive
std::string _ctag;
std::string _type;
std::string _cloud_provider;
Json::Value _metadata{Json::Value::null};
Json::Value _filemetadata{Json::Value::null};
Json::Value _seismicmeta{Json::Value::null};
json::Value _metadata{json::Value::null};
json::Value _filemetadata{json::Value::null};
json::Value _seismicmeta{json::Value::null};
std::vector<std::string> _tags;
bool _readonly{false};
......
......@@ -20,7 +20,7 @@
#include <iostream>
#include <regex>
#include "SDException.h"
#include "SDExceptionImpl.h"
#include "shared/base64.h"
#include "shared/config.h"
#include "shared/utils.h"
......@@ -35,11 +35,6 @@ namespace seismicdrive
}
DatasetPath(const std::string &sdfilename, bool asdir = false)
{
init(sdfilename, asdir);
}
void init(const std::string &sdfilename, bool asdir = false)
{
_sdpath = sdfilename;
{
......@@ -47,16 +42,21 @@ namespace seismicdrive
auto contextIndex = _sdpath.find(contextkey);
if (contextIndex != std::string::npos)
{
_context = seismicdrive::Base64UrlDecode(_sdpath.substr(contextIndex + contextkey.length()));
_context = Base64UrlDecode(_sdpath.substr(contextIndex + contextkey.length()));
_sdpath = _sdpath.substr(0, contextIndex);
}
}
{
std::string emex;
if (!(asdir ? sdutils::isSDPath
: sdutils::isSDDatasetPath)(_sdpath, &emex))
if (asdir)
{
if (!sdutils::isSDPath(_sdpath))
{
throw error::dataset::InvalidPrefix(_sdpath);
}
}
else if (!sdutils::isSDDatasetPath(_sdpath))
{
throw SDExceptionDatasetError(emex);
throw error::dataset::InvalidPath(_sdpath);
}
}
if (_sdpath.back() == '/')
......@@ -106,7 +106,7 @@ namespace seismicdrive
return uriencode ? _path.empty() ? "%2F"
: std::regex_replace(_path, std::regex("/"), "%2F")
: _path.empty() ? "/"
: _path;
: _path;
}
std::string getDatasetName() const { return _dataset; }
......
......@@ -19,6 +19,7 @@
#endif
#include "SDDataset.h"
#include "SDExceptionImpl.h"
#include "Constants.h"
#include "SDManagerImpl.h"
......@@ -30,7 +31,7 @@ namespace seismicdrive
{
constexpr int SerialContextVersion = 1;
SDDataset::SDDataset(seismicdrive::SDManager *sdmanager, const std::string &sdfilename, const std::string &filetype)
SDDataset::SDDataset(SDManager *sdmanager, const std::string &sdfilename, const std::string &filetype)
: _sdmanager(sdmanager),
_sdpath(sdfilename),
_sdfiletype(filetype),
......@@ -65,12 +66,7 @@ namespace seismicdrive
{
if (!_storage)
{
const std::string cloud_provider = _dataset.get_cloud_provider();
if (cloud_provider.empty())
{
throw std::invalid_argument("Cloud provider not specified");
}
_storage.reset(Storage::create(cloud_provider,
_storage.reset(Storage::create(_dataset.get_cloud_provider(),
_sdmanager->_impl->getAuthProvider(),
sdconfig::PATHPREFIX + _sdpath.getTenantName() + '/' + _sdpath.getSubprojectName(),
_disposition == SDDatasetDisposition::READ_ONLY));
......@@ -106,14 +102,14 @@ namespace seismicdrive
// check if activate the pedantic mode (auto flush of file-metadata WID session applied only)
if (!pedantic.empty() && pedantic != api::json::KEnableProperty && pedantic != api::json::KDisableProperty)
{
throw SDExceptionDatasetError(sdmex::dataset::OpenArgsValue(api::json::Constants::KPedantic));
throw error::dataset::Error({sdmex::dataset::OpenArgsValue(api::json::Constants::KPedantic), _sdpath.getDatasetPath()});
}
_pedantic = pedantic.empty() || pedantic == api::json::KEnableProperty; // by default pedantic = true
// check if diable the flush of the manifest mode (auto flush of file-metadata globally applied)
if (!flushManifest.empty() && flushManifest != api::json::KEnableProperty && flushManifest != api::json::KDisableProperty)
{