Seismic issues
https://community.opengroup.org/groups/osdu/platform/domain-data-mgmt-services/seismic/-/issues

# Subproject creation Bad Request
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/69
Reported by Denis Karpenok (EPAM), updated 2023-03-24

GCP preshipping environment.
Tenant was created:
`{
"name": "autotesttenantid436502",
"esd": "odesprod.osdu-gcp.go3-nrg.projects.epam.com",
"gcpid": "osdu-data-prod",
"default_acls": "users.datalake.admins@odesprod.osdu-gcp.go3-nrg.projects.epam.com"
}`
Trying to create subproject.
Request:
`curl --location --request POST 'https://preship.gcp.gnrg-osdu.projects.epam.com/api/seismic-store/v3/subproject/tenant/autotesttenantid436502/subproject/subprojectodi725168' \
--header 'Content-Type: application/json' \
--header 'data-partition-id: odesprod' \
--header 'ltag: odesprod-SeismicDMS-Legal-Tag-Test7116874' \
--header 'Authorization: Bearer ID_TOCKEN' \
--data-raw '{
"admin": "admin@odesprod.osdu-gcp.go3-nrg.projects.epam.com",
"storage_class": "MULTI_REGIONAL",
"storage_location": "US",
"legal": {
"legaltags": [
"odesprod-SeismicDMS-Legal-Tag-Test7116874"
],
"otherRelevantDataCountries": [
"US"
]
}
}'`
Response:
`[seismic-store-service] Bad Request`
Seismic-store logs:
`2022-10-21 15:40:40.798 EET{"error":{"code":400,"message":"[seismic-store-service] Bad Request","status":"BAD_REQUEST"}}`
Assignee: Sacha Brants

# Subproject deletion bad request
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/70
Reported by Yan Sushchynski (EPAM), updated 2023-03-24

We found a bug connected with deleting subprojects: when we delete them, the call to the Entitlements service uses a wrong URL.
From the logs we can see that Seismic sends a request to the following URL:
https://entitlements/api/entitlements/v2/groups/data/<groupname>
The __data__ segment is extra here.
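To make the difference concrete, here is a minimal sketch of the two URL shapes (in Python for illustration only; the actual service is TypeScript, and `ENTITLEMENTS_BASE` is taken from the log line above):

```python
# Illustration only: the deletion call should hit /groups/<groupname>,
# but the service configuration inserts an extra "data" path segment.
ENTITLEMENTS_BASE = "https://entitlements/api/entitlements/v2"

def group_url(group_name: str) -> str:
    """URL without the spurious 'data' segment (expected behavior)."""
    return f"{ENTITLEMENTS_BASE}/groups/{group_name}"

def buggy_group_url(group_name: str) -> str:
    """URL the service currently builds, per the logs above (buggy)."""
    return f"{ENTITLEMENTS_BASE}/groups/data/{group_name}"
```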
The bug is here: https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/blob/master/app/sdms/src/cloud/providers/google/config.ts#L131
Milestone: M15 - Release 0.18. Assignees: Diego Molteni, Sacha Brants.

# osdu:wks:work-product-component--NotionalSeismicLine:1.0.0
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/86
Reported by Sacha Brants, updated 2023-03-24

# Refactor base images and npm dependencies version
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/92
Reported by Aliaksandr Ramanovich (EPAM), updated 2023-03-24

It seems it is time to update the versions of the base images used in the Dockerfiles and of the dependencies in the package.json file.
Assignee: Sacha Brants

# VDS with LOD levels unexpectedly large
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/176
Reported by Alexander Jaust, updated 2023-03-22

I find that VDS data sets increase much more in size than expected when LOD levels are added.
## Observation/Experiment
I created VDS data sets from a Volve data set (`ST0202R08_PS_PSDM_FULL_OFFSET_PP_TIME.MIG_FIN.POST_STACK.3D.JS-017534.segy`, about 1 GiB) for different LOD levels. This is a 3D post-stack dataset, so I would expect the file size to increase by about 15% if all LOD levels are created, with the major increase appearing for the first few LOD levels.
I observe the following file sizes:
| LOD level | Size in MiB | Relative size |
|-----------|-------------|---------------|
| 0         | 864.04      | 100%          |
| 1         | 1248.06     | 144%          |
| 2         | 1456.09     | 169%          |
| 3         | 1456.09     | 169%          |
| 4         | 1454.31     | 168%          |
| 5         | 1456.09     | 169%          |
| 6         | 1456.09     | 169%          |
| 7         | 1456.09     | 169%          |
| 8         | 1456.09     | 169%          |
| 9         | 1454.88     | 168%          |
| 10        | 1456.10     | 169%          |
| 11        | 1456.10     | 169%          |
| 12        | 1456.11     | 169%          |
Now I see that the file size increases by more than 50%. If I scale my estimate, I see that the increase at least flattens out quickly, which agrees with the geometric series. However, I also see a small dip for LOD levels 10 and 11 which I don't exactly understand; maybe that is due to some collapsed blocks.
I used the default settings, but I also tested with the LOD level and compression set explicitly, with the same resulting file sizes. The file is stored on a local hard drive and the file size is checked with `du`.
I also tested with a larger SEG-Y file whose VDS with no LOD levels is 11 GiB. When adding 4 LOD levels, the file size goes up to 20 GiB, an increase of even more than **80%**.
## Questions
Is this increase in file size expected? I assumed that the file size increase would follow the [geometric series](https://en.wikipedia.org/wiki/Geometric_series) with `a=1` and `r=(1/2)^d`, which gives `r=1/8` for 3D data sets.
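Under that assumption the expected overhead is easy to check with a short calculation (a sketch; `d` is the dataset dimensionality, and each successive LOD level is assumed to shrink the sample count by `(1/2)^d`):

```python
# Expected total size relative to LOD 0 when each extra LOD level adds
# r = (1/2)**d of the previous level's size: 1 + r + r**2 + ... (geometric series).
def expected_relative_size(d: int, n_levels: int) -> float:
    r = (1 / 2) ** d
    return sum(r ** k for k in range(n_levels + 1))

# For a 3D dataset (d=3, r=1/8) the series converges to 1/(1 - 1/8) = 8/7,
# i.e. roughly 14.3% overhead -- far below the observed 69%.
limit_3d = 1 / (1 - (1 / 8))
```

For `d=3` this matches the ~15% estimate above, which is why the observed 69% (and 80% in the larger test) looks surprising.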
## Platform
* Apple Arm M1 Max
* MacOS 13.2.1
* OpenVDS 3.0.3 (compiled from source)

# Get the schemas from the schema service instead of hard-coding them in this repo
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/87
Reported by Sacha Brants, updated 2023-03-21. Assignee: Rashaad Gray.

# osdu:wks:work-product-component--SeismicLineGeometry:1.0.0
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/79
Reported by Sacha Brants, updated 2023-03-21. Assignee: Rashaad Gray.

# osdu:wks:work-product-component--SeismicBinGrid:1.1.0
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/78
Reported by Sacha Brants, updated 2023-03-21. Assignee: Rashaad Gray.

# osdu:wks:master-data--SeismicProcessingProject:1.2.0
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/76
Reported by Sacha Brants, updated 2023-03-21. Assignee: Rashaad Gray.

# osdu:wks:master-data--SeismicAcquisitionSurvey:1.2.0
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/75
Reported by Sacha Brants, updated 2023-03-21. Assignee: Rashaad Gray.

# osdu:wks:work-product-component--SeismicTraceData:1.3.0
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/77
Reported by Sacha Brants, updated 2023-03-21. Assignee: Rashaad Gray.

# Allow VDSCopy to overwrite existing SDMS dataset: `--allow-overwrite`
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/170
Reported by Filip Brzęk, updated 2023-03-15

Hello,
first of all, thank you very much for the great `3.1` release and the re-implementation of the `IOManagers` for SDMS; it has been very helpful, as `sdapi` was troublesome.
While examining the new IOManagers with `OPENVDS_DMS_CURL=1`, I noticed a change in behavior that prevents us from writing bulk trace data (VDS content) into a previously created dataset (created with the metadata we require) that does not yet have any data loaded.
```shell
OPENVDS_DMS_CURL=1 AWS_REGION=us-east-1 VDSCopy ./data/syntethic_data.wavelet.vds sd://osdu/<test-project>/demo-1
[CURL http respons error 409. Automatic rety https://<REDACTED>/api/seismic-store/v3/dataset/tenant/osdu/subproject/<REDACTED>/dataset/demo-1?path=%2F]
...
[Could not create VDS sd://osdu/<REDACTED>/gsi-demo-1] Seismic dms lock failed: Http error respons: 409 -> https://<REDACTED>/api/seismic-store/v3/dataset/tenant/osdu/subproject/<REDACTED>/dataset/demo-1?path=%2F
- [seismic-store-service] The dataset sd://osdu/<REDACTED>/demo-1 already exists[seismic-store-service]
```
When run through the old DMS flow (using `sdapi`), VDSCopy happily ignores the 409 and proceeds to write the data; the command is below for reference. However, it results in a `seg-fault` at the end (previously reported in #123):
```shell
AWS_REGION=us-east-1 VDSCopy ./data/syntethic_data.wavelet.vds sd://osdu/<test-project>/demo-1
```
Is it possible to add an `--allow-overwrite` flag, similar to the one available in `VDSUploader.sh` in the HueSpace SDK, which would ignore the 409 when the dataset was created previously?
Rationale: we want control over how the dataset is created in SDMS for data-lineage reasons, so we need to create it ourselves and then populate the sd location with VDS content.
Regards,
Filip
PS. Is there a way to submit a feature request for `VDSUploader.sh` as well? If so, what is the best channel? We have been exploring both tools for loading bulk data into OSDU, and we see some gaps, e.g. in S3 auth (only the profile/dedicated-role auth flow is available) and no control over the dataset name when loading to SDMS: it generates a random name, which prevents targeting a previously created dataset.

# Downsampling for LOD levels wrong for first point
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/175
Reported by Alexander Jaust, updated 2023-03-07

I have been playing around with LOD level generation on a VDS with synthetic content. My goal is to understand the LOD levels and the downsampling better.
## Observation/Experiment
I wrote a Python script ([test_lod_levels_sine_function.py](/uploads/f7a3eb217cdba359ea77b0b7c96ec0a4/test_lod_levels_sine_function.py)) which generates a sine function (with specified frequency, amplitude, etc.) and writes it to a 3D VDS. I let OpenVDS add 4 LOD levels so that I have 5 levels in total. In the current setup in the script, I have a single brick with 32 samples in each direction on level 0.
In the next step I load the VDS file and extract the data along a line. The data is plotted against the analytical function and I compute some error norms (only printed on screen).
## Observations and questions
From what I understand, the behavior seems to be the following:
- When data is downsampled, only every second, fourth, etc. sample is kept on each level. This looks somewhat like this:
```text
level Samples
0 0 1 2 3 4 5 6 7 8 ...
1 0 2 4 6 8 ...
2 0 4 8 ...
```
Missing values indicate samples that are not available on the LOD level.
Is this understanding correct?
- At the moment, it looks to me as if downsampling simply removes/ignores the values in between the samples that are kept. At least that is what the plot suggests to me.
Is this understanding correct or do you do any kind of elaborate downsampling that includes some kind of anti-aliasing?
- From my experiments I expect that for my sine wave, I simply get bigger gaps between discrete points the higher I go up in level. I plotted this for all levels in my VDS:
![lod_level_sine_function](/uploads/7e94e7385e12e5b044569d64b1ecc2c0/lod_level_sine_function.png){width=60%}
My assumptions about how the downsampling is done seem to mostly hold. However, for some reason the **first** sample on levels > 0 seems to be shifted (the point at the top left in the plot). The y-value here is close to 1 instead of 0.
If my interpretation of how the x-coordinate is determined were wrong, I would expect all points to show the same offset. However, all other samples on a level of detail seem to match a sample of level 0.
Am I doing something wrong here when determining the x-coordinate or does the downsampling have a bug here?
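For reference, the decimation pattern described above can be sketched as follows (a minimal illustration of plain keep-every-2^level-th-sample downsampling under my assumptions, not OpenVDS's actual implementation):

```python
import math

def decimate(samples, level):
    """Keep every 2**level-th sample, starting at index 0
    (plain decimation, no anti-aliasing filter)."""
    step = 2 ** level
    return samples[::step]

# A sampled sine wave, as in the experiment above (one period over 32 samples).
n = 32
signal = [math.sin(2 * math.pi * i / n) for i in range(n)]

lod1 = decimate(signal, 1)  # keeps samples 0, 2, 4, ...
lod2 = decimate(signal, 2)  # keeps samples 0, 4, 8, ...

# With plain decimation, every sample on a higher level coincides exactly
# with a level-0 sample -- including the first one, which stays at signal[0].
# A shifted first sample would indicate something else is going on.
assert lod1[0] == signal[0]
assert lod2 == signal[::4]
```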
## Platform
* Apple Arm M1 Max
* MacOS 13.2.1
* OpenVDS 3.0.3 (compiled from source)

# Can't connect to Redis protected by password and TLS disabled
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/95
Reported by Volodymyr Pienskoi [EPAM / GCP], updated 2023-03-04

The Google Cloud environment has a Redis instance protected by a password, with no TLS. All required config variables are provided:
- LOCKSMAP_REDIS_INSTANCE_ADDRESS: minio-seismic-store
- LOCKSMAP_REDIS_INSTANCE_PORT: 6379
- LOCKSMAP_REDIS_INSTANCE_KEY: password
- LOCKSMAP_REDIS_INSTANCE_TLS_DISABLE: true
When the application tries to connect to the Redis instance, I get the following errors in the logs:
```
"[ioredis] Unhandled error event: Error: connect ETIMEDOUT
at TLSSocket.<anonymous> (/seistore-service/node_modules/ioredis/built/Redis.js:170:41)
at Object.onceWrapper (events.js:519:28)
at TLSSocket.emit (events.js:400:28)
at TLSSocket.emit (domain.js:475:12)
at TLSSocket.Socket._onTimeout (net.js:495:8)
at listOnTimeout (internal/timers.js:557:17)
at processTimers (internal/timers.js:500:7)"
```
It seems that `redisSubscriptionClient` in [dataset/locker.ts](./app/sdms/src/services/dataset/locker.ts) is always created with the TLS option when `LOCKSMAP_REDIS_INSTANCE_KEY` is provided, even though `LOCKSMAP_REDIS_INSTANCE_TLS_DISABLE` is set to true.
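The intended conditional can be sketched like this (a Python sketch mirroring ioredis-style options; the real client lives in the TypeScript service, so the names here are illustrative, taken from the config variables above):

```python
def build_redis_options(address: str, port: int, key: str = None,
                        tls_disable: bool = False) -> dict:
    """Build Redis connection options. Intended behavior: enable TLS only
    when a key is provided AND TLS has not been explicitly disabled.
    (The reported bug enables TLS whenever a key is present, ignoring
    the disable flag.)"""
    options = {"host": address, "port": port}
    if key:
        options["password"] = key
        if not tls_disable:
            options["tls"] = {"servername": address}
    return options
```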
A possible fix is included in MR: https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/merge_requests/656
Milestone: M16 - Release 0.19

# (Another) Build fix patch
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-zgy/-/issues/27
Reported by Jon Jenssen, updated 2023-02-24

Thanks for merging the previous build patch I sent.
The latest code builds fine on both Windows (VS2022) and Linux (gcc/clang) with C++20 enabled, except for one thing that needs to be fixed.
I've attached a patch for this: [build_fix.patch](/uploads/98fd94543becf137c025e41184355a0c/build_fix.patch)

# Building the native library with MSVC (Windows) and Clang or GCC on Linux
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-zgy/-/issues/26
Reported by Jon Jenssen, updated 2023-02-24

The attached patch allows you to build the native library with C++20 standard support using either MSVC 2022, clang, or gcc. It supports building the library as a static library as well.
[0001-Make-the-native-code-build-with-both-msvc-gcc-and-cl.zip](/uploads/52a4ba6beab9fe7dfd54ad13bbc62419/0001-Make-the-native-code-build-with-both-msvc-gcc-and-cl.zip)

# Build fails when using Ninja generator
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-cpp-lib/-/issues/17
Reported by Pavel Kisliak, updated 2023-02-24

Currently the build fails when using the Ninja generator; it's related to a slightly different mechanism of scanning dependencies ([more details](https://github.com/ninja-build/ninja/issues/760)).
How to reproduce:
```shell
cmake -GNinja -B build
cmake --build build --config Release
```
Build error:
> ninja: error: 'crc32c/lib/libcrc32c.a', needed by 'libsdapi.so.0.0.0', missing and no known rule to make it

Assignee: Pavel Kisliak

# Addition of valid input for SEGYImport
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/174
Reported by Alexander Jaust, updated 2023-02-23

I wonder what the correct way would be to convert SEG-Y files to VDS using `SEGYImport` when the options provided by `SEGYImport` are not rich enough.
Example: I have an attribute map in a SEG-Y file. However, none of the allowed names for the `Attribute` property (`--attribute-name`: Amplitude (default), Attribute, Depth, Probability, Time, Vavg, Vint, or Vrms) fits my needs. The name `Attribute` would be too vague for my use case, and the other allowed names do not fit either.
- Should I import the SEG-Y file to VDS and afterwards change the name? I am not sure if that is possible, cf. #173.
- Could I provide a patch that extends SEGYImport by the units, names etc. that I would need?
- Should I fork and create my own SEGYImport? I would really like to avoid this.

# KnownChannelNames class
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/172
Reported by Morten Ofstad, updated 2023-02-23. Assignee: Morten Ofstad.

There should be a KnownChannelNames class with the names from the documentation (https://osdu.pages.opengroup.org/platform/domain-data-mgmt-services/seismic/open-vds/vds/specification/Metadata.html#named-channels). For C++ these are found in GlobalMetadataCommon.h, but they are not available in Python/Java.

# Rename ZGY/VDS ingestor DAGs
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/home/-/issues/20
Reported by Nur Sheikh, updated 2023-02-07

DAG names are currently in segy-to-[vds|zgy]-conversion-xxxxxx format.
They should be renamed to:
- segy-to-vds-conversion
- segy-to-zgy-conversion

**Impact**
There is no way for a user to identify which segy-to-vds or segy-to-zgy DAG to use.
Milestone: M16 - Release 0.19. Assignees: Srinivasan Narayanan, shivani karipe, Naresh Jampala.