# OSDU Software issues
https://community.opengroup.org/groups/osdu/-/issues (feed updated 2023-09-19)

---

**fail to copy file from cloud server**
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/205 · nanting liu · 2023-09-19

I am trying to read a VDS from MinIO (an open-source, S3-compatible object store).
As a first step, I tried to read a VDS file with _OpenVDS.open()_, but it failed with the exception "Error on downloading VolumeDataLayout object: Http error response: 404 -> https://endpoint/bucket-name/test.vds/VolumeDataLayout: The specified key does not exist.".
I then realized that _open()_ cannot read the VDS file directly, because the file was uploaded manually.
As a second step, I tried to use VDSCopy to copy the VDS file to the cloud environment, but it still fails, with the error "Error on uploading VolumeDataLayout object: unexpected AWS signing failure". Here is my command: `VDSCopy.exe E:\PPCoef.vds s3://endpoint/bucket-name/testVDS -d "Region=us-west-rack-2;SecretKey=xxx;SecretAccessKey=xxx"`. My SecretKey and SecretAccessKey are correct, but I don't know why it prints this.
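The `-d` option above takes semicolon-separated `key=value` pairs. As a sketch only: the key names below (`Region`, `EndpointOverride`, `AccessKeyId`, `SecretKey`) are assumptions based on my understanding of OpenVDS's AWS open options and should be verified against the OpenVDS documentation; an S3-compatible store such as MinIO typically also needs an explicit endpoint override.

```python
# Hypothetical helper: assemble an OpenVDS-style S3 connection string.
# Key names (Region, EndpointOverride, AccessKeyId, SecretKey) are assumptions
# taken from OpenVDS's AWS open options -- verify against the OpenVDS docs.
def s3_connection_string(**params: str) -> str:
    """Join key=value pairs with ';' in the format VDSCopy's -d option uses."""
    return ";".join(f"{key}={value}" for key, value in params.items())

conn = s3_connection_string(
    Region="us-east-1",
    EndpointOverride="https://endpoint",  # assumed key for MinIO / non-AWS S3
    AccessKeyId="minio-access-key",
    SecretKey="minio-secret-key",
)
# e.g. openvds.open("s3://bucket-name/testVDS", conn)
print(conn)
```

If the store is not real AWS, a missing endpoint override or mismatched credential key names could plausibly surface as a signing failure.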
Could you please help me figure out how to deal with this situation?

---

**Manifest reference list of GasChromatographyGasCompositionComponents**
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/rock-and-fluid-sample/rafs-ddms-services/-/issues/243 · Mykhailo Buriak · 2023-09-27

Manifest the reference list of GasChromatographyGasCompositionComponents.
Link to manifest file: https://gitlab.opengroup.org/osdu/subcommittees/data-def/projects/RAFSDDMSDEV/docs/-/blob/main/Design%20Documents/ReferenceValues/Manifests/reference-data/OPEN/GasChromatographyGasCompositionComponents.1.0.0.json
Ensure that the reference data can be used in the Gas Composition content schema, which will be implemented in #205.

Milestone: RAFS DDMS Sprint 17 · Assignee: Ernesto Gutierrez

---

**README issues after latest update**
https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/issues/318 · Dmytro Komisar · 2023-09-21

The link "Steps to create Flux Manifest Repository" [here](https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/blame/master/README.md#L45), pointing to
https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/blob/az/sa-fix-documentation/docs/flux.md
is broken.
In the same paragraph it says "This step is optional and not recommended one for new installations", but the next step, running common_prepare.sh, fails with:
```bash
./infra/scripts/common_prepare.sh $(az account show --query id -otsv) $UNIQUE $PREFIX
ERROR: GIT_REPO not provided
```
So it looks like the previous step is needed after all, or common_prepare.sh should be fixed.

---

**Update technical documentation**
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/rock-and-fluid-sample/rafs-ddms-services/-/issues/242 · Siarhei Khaletski (EPAM) · 2024-02-03

**Context**
The documentation requires review.

**Scope**
- Review and update existing [architectural diagrams](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/rock-and-fluid-sample/rafs-ddms-services/-/tree/main/docs/architecture)
- Review and update project [tutorial](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/rock-and-fluid-sample/rafs-ddms-services/-/tree/main/docs/tutorial). Note: It makes sense to use references to the [QA Collection](https://community.opengroup.org/osdu/qa/-/tree/main/Dev/48_CICD_Setup_RAFSDDMSAPI?ref_type=heads) as much as possible
- Review and update the schema management and ingestion document: https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/rock-and-fluid-sample/rafs-ddms-services/-/tree/main/deployments?ref_type=heads

Assignee: Ernesto Gutierrez

---

**M20, Azure - CSV Parser - Created record-ID - does not show "Data" from Search response**
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/597 · Debasis Chatterjee · 2023-09-22

@omprakash_epam
16-Sep-2023 updates.
I can find record IDs in the Airflow log now.
I can retrieve the data by using the Storage Service.
But when I retrieve it using Search, the response does not show the Data block. [M20-Azure-CSV-Parser-steps-Debasis.docx](/uploads/c660fae027788df878c5363d94f7b042/M20-Azure-CSV-Parser-steps-Debasis.docx)
I tested recently.
"runId": "test_workflow512151",
The log file is enclosed here: [Azure-CSV-Parser-runID-test_workflow512151.txt](/uploads/4cc20eaed9dcf7deeaab29eda4cf8ae0/Azure-CSV-Parser-runID-test_workflow512151.txt)
I get the impression that the approach has changed and we have to find record-IDs differently.
Can you please check with the DEV team and let us know?
Thank you.

Participants: Zhibin Mai, Om Prakash Gupta

---

**M20 Azure RAFS - Create Legal Tag - No standard Error Message for Expiration date Leap year/Date format validation Handling**
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/596 · Esakkiprem Subramaniyan · 2023-09-28

While creating legal tags in RAFS DDMS, the expiration date is validated only against past dates. No error message is returned for a leap-year mistake or a wrong date format.
Request body, Case 1:

```json
{
  "name": "{{LtagName}}",
  "description": "Legal Tag added for RAFS DDMS",
  "properties": {
    "contractId": "123456",
    "countryOfOrigin": ["US", "CA"],
    "dataType": "Public Domain Data",
    "exportClassification": "EAR99",
    "originator": "Autotest",
    "personalData": "No Personal Data",
    "securityClassification": "Private",
    "expirationDate": "2021-01-29"
  }
}
```

Response:

```json
{
  "code": 400,
  "reason": "Validation error.",
  "message": "{\"errors\":[\"Expiration date must be a value in the future. Given 2021-01-29\"]}"
}
```
Case 2 (invalid leap-year date):

```json
{
  "name": "{{LtagName}}",
  "description": "Legal Tag added for RAFS DDMS",
  "properties": {
    "contractId": "123456",
    "countryOfOrigin": ["US", "CA"],
    "dataType": "Public Domain Data",
    "exportClassification": "EAR99",
    "originator": "Autotest",
    "personalData": "No Personal Data",
    "securityClassification": "Private",
    "expirationDate": "2025-02-29"
  }
}
```
response:
```plaintext
No error msg found
```
Case 3 (invalid day of month):

```json
{
  "name": "{{LtagName}}",
  "description": "Legal Tag added for RAFS DDMS",
  "properties": {
    "contractId": "123456",
    "countryOfOrigin": ["US", "CA"],
    "dataType": "Public Domain Data",
    "exportClassification": "EAR99",
    "originator": "Autotest",
    "personalData": "No Personal Data",
    "securityClassification": "Private",
    "expirationDate": "2025-03-32"
  }
}
```
response:
```plaintext
No error msg found
```
It would be better to provide a standard error message in all of these cases.

Participants: Debasis Chatterjee, Siarhei Khaletski (EPAM), Om Prakash Gupta

---

**Change weight to mass to align with OSDU naming convention requirements**
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/rock-and-fluid-sample/rafs-ddms-services/-/issues/241 · Raghd Gadrbouh · 2023-09-21

Based on the #dd-fluidsamples-and-geochemistry group review of the extraction and fractionation data schemas, the forum asked to change the term "Weight" to "Mass" in the following schemas:
- https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/rock-and-fluid-sample/rafs-ddms-services/-/blob/main/app/models/data_schemas/jsonschema/fractionation_data_schema.json?ref_type=heads
- https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/rock-and-fluid-sample/rafs-ddms-services/-/blob/main/app/models/data_schemas/jsonschema/extraction_data_schema.json?ref_type=heads

Milestone: RAFS DDMS Sprint 17 · Assignee: Ernesto Gutierrez

---

**Add method to all geochemistry schemas**
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/rock-and-fluid-sample/rafs-ddms-services/-/issues/240 · Raghd Gadrbouh · 2023-10-02

Add a Method attribute to all geochemistry schemas:
- https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/rock-and-fluid-sample/rafs-ddms-services/-/blob/main/app/models/data_schemas/jsonschema/extraction_data_schema.json?ref_type=heads
- https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/rock-and-fluid-sample/rafs-ddms-services/-/blob/main/app/models/data_schemas/jsonschema/fractionation_data_schema.json?ref_type=heads
- https://gitlab.opengroup.org/osdu/subcommittees/data-def/projects/RAFSDDMSDEV/docs/-/blob/main/Design%20Documents/gc_gasoline_data_schema.json (not published yet)
- https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/rock-and-fluid-sample/rafs-ddms-services/-/blob/main/app/models/data_schemas/jsonschema/gcms_alkanes_data_schema.json?ref_type=heads
- https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/rock-and-fluid-sample/rafs-ddms-services/-/blob/main/app/models/data_schemas/jsonschema/gcms_aromatics_data_schema.json?ref_type=heads
- https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/rock-and-fluid-sample/rafs-ddms-services/-/blob/main/app/models/data_schemas/jsonschema/gcms_ratios_data_schema.json?ref_type=heads
- https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/rock-and-fluid-sample/rafs-ddms-services/-/blob/main/app/models/data_schemas/jsonschema/physchem_data_schema.json?ref_type=heads
- https://gitlab.opengroup.org/osdu/subcommittees/data-def/projects/RAFSDDMSDEV/docs/-/blob/main/Design%20Documents/whole_oil_gc_data_schema.json (not published yet)
```json
"Method": {
"type": "string",
"description": "The sample analysis method used for this analysis"
}
```
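To restrict Method to reference-data values, OSDU schemas typically add a `pattern` constraint on relationship-style IDs. The pattern below follows the usual OSDU form but is an assumption pending the confirmation mentioned below:

```python
import re

# Assumed pattern, modeled on the common OSDU reference-data ID form:
#   <partition>:reference-data--GeochemistryMethod:<value>:<optional version>
# The exact type name and character classes need confirmation.
METHOD_ID = re.compile(
    r"^[\w\-\.]+:reference-data\-\-GeochemistryMethod:[\w\-\.\:\%]*:[0-9]*$"
)

print(bool(METHOD_ID.match("osdu:reference-data--GeochemistryMethod:Extraction:")))  # True
print(bool(METHOD_ID.match("free-text method name")))                                # False
```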
@michael_jones_epam can confirm the correct pattern required to allow values only from the SampleAnalysis reference data.
Draft Reference List:
https://gitlab.opengroup.org/osdu/subcommittees/data-def/projects/RAFSDDMSDEV/docs/-/blob/main/Design%20Documents/ReferenceValues/Manifests/reference-data/OPEN/GeochemistryMethod.json

Milestone: RAFS DDMS Sprint 17 · Assignee: Ernesto Gutierrez

---

**ADR: Create field for case insensitive search**
https://community.opengroup.org/osdu/platform/system/indexer-service/-/issues/112 · Mark Chance · 2024-02-26
# ADR: Add keywordLower Index Mapping field
<a name="TOC"></a>
[[_TOC_]]
# Status
- [x] Proposed
- [x] Trialing
- [x] Under review
- [x] Approved
- [ ] Retired
# Background
Application developers would like to provide their users a simple search mechanism, much like SQL "LIKE" queries combined with the LOWER function. Currently, none of the existing ElasticSearch fields implements this.
# Context & Scope
## Requirements
The desire is to support the following search query:
```json
{
"kind": "osdu:wks:master-data--Well:1.0.0",
"query": "data.FacilityName.keywordLower:exam*"
}
```
which would return:
```json
{
"results": [
{
"data": {
"FacilityName": "Example test"
},
"id": "osdu:master-data--Well:1012"
}
]
}
```
# Tradeoff Analysis
# Proposed solution
Add a field to the index, keywordLower, in which all input is normalized to lower case.
For example, this mapping in master-data--Well would be created:
```json
"CurrentOperatorID": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"null_value": "null",
"ignore_above": 256
},
"keywordLower": {
"type": "keyword",
"normalizer": "lowercase",
"null_value": "null",
"ignore_above": 256
}
}
},
```
The `keywordLower` field is added with the additional attribute `"normalizer": "lowercase"`.
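The intended semantics can be illustrated with a sketch (this is not how ElasticSearch evaluates wildcard queries internally): with a lowercase normalizer, the stored value and the query pattern are effectively compared in lower case.

```python
from fnmatch import fnmatch

def keyword_lower_match(stored_value: str, pattern: str) -> bool:
    """Sketch of keywordLower semantics: both sides compared in lower case."""
    return fnmatch(stored_value.lower(), pattern.lower())

print(keyword_lower_match("Example test", "exam*"))  # True
print(keyword_lower_match("Example test", "EXAM*"))  # True
print(keyword_lower_match("Other name", "exam*"))    # False
```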
# Change Management
* Operators may need to re-ingest data or update the index. Is it possible to "patch" data to re-run the indexer on data already ingested?
# Decision
# Consequences
* The indexer code changes should have no noticeable impact on the system or applications (only an additional property is created).
* The index will be larger with the addition of the many instances of this field.
Draft MR: https://community.opengroup.org/osdu/platform/system/indexer-service/-/merge_requests/618

Milestone: M22 - Release 0.25 · Assignee: Stanisław Bieniecki

---

**Documentation - Create Documents for GCZ Security Pattern**
https://community.opengroup.org/osdu/platform/consumption/geospatial/-/issues/293 · Noel Okanya · 2024-01-17

As a GCZ Product Owner, I want to understand potential GCZ Security Patterns, so that users can be aware of security options.
Acceptance Criteria:
- How to consume GCZ
- Roles, Privileges, and Responsibilities
- Workarounds

Participants: Brian, Ankita Srivastava, David Jacob

---

**Request to have a M18 patch on AWS for memory leak fix**
https://community.opengroup.org/osdu/platform/security-and-compliance/policy/-/issues/110 · Dadong Zhou · 2023-09-20

Shell would like to request an M18 patch for the Policy service memory leak fix. The memory leak issue, with test results, is documented in https://community.opengroup.org/osdu/platform/security-and-compliance/policy/-/issues/93.
As shown in the test results, the memory leak is caused by version 15.0.1 of the logging package "coloredlogs". Testing showed that using the older version 14.2 of the package fixes the memory leak. The logging package is completely removed in M20.
We would like the M18 patch to be made available on AWS, with the logging package version number as the only change.
Thanks.
cc @KellyZhou @hutchins @MonicaJohns

Assignee: Shane Hutchins

---

**Validation for requests is not working**
https://community.opengroup.org/osdu/platform/system/dataset/-/issues/59 · Riabokon Stanislav(EPAM)[GCP] · 2023-09-12

The request looks like this:
```
curl -i -X PUT \
-H "Authorization:Bearer <token>" \
-H "data-partition-id:osdu" \
-H "Content-Type:application/json" \
-d \
'{
"datasetRegistries": [
]
}' \
'https://<server>/api/dataset/v1/registerDataset'
```
returns only HTTP status 400 with an empty body.
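A sketch of what the validation could return instead of an empty 400 body (the function name and message wording are illustrative, not the actual Dataset service code):

```python
def register_dataset(body: dict) -> tuple[int, dict]:
    """Illustrative request validation for PUT /registerDataset."""
    registries = body.get("datasetRegistries")
    if not isinstance(registries, list) or not registries:
        # Reject with a descriptive message rather than an empty body.
        return 400, {
            "code": 400,
            "reason": "Validation error.",
            "message": "datasetRegistries must be a non-empty array",
        }
    return 200, {"datasetRegistries": registries}

status, payload = register_dataset({"datasetRegistries": []})
print(status, payload["message"])
```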
![Screenshot_2023-09-11_at_19.21.35](/uploads/bf2a6045b170b2287d9d3dea9c469277/Screenshot_2023-09-11_at_19.21.35.png)

Milestone: M21 - Release 0.24 · Assignee: Riabokon Stanislav(EPAM)[GCP]

---

**Speedup getting dataspace information by caching update time and size value**
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/reservoir/open-etp-server/-/issues/81 · Laurent Deny · 2023-09-21

* Getting the list of dataspaces is a frequent request, since there is no API to get information about an individual dataspace.
* The request uses SQL functions that are expensive when there is a large number of datasets.

Milestone: M21 - Release 0.24 · Assignee: Laurent Deny

---

**exportEpc : output estimated progress percentage values**
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/reservoir/open-etp-server/-/issues/80 · Virginie Marcout · 2023-10-06

exportEpc should output estimated progress percentage values step by step.
Related to [Task 60162: open-etp-server: log estimated phase percentage](https://dev.azure.com/pdgm/EP%20Connect/_workitems/edit/60162).

Milestone: M21 - Release 0.24 · Assignee: Virginie Marcout

---

**M20 AWS Airflow Missing Skipped IDs and Logs Level Error**
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/595 · Naufal Mohamed Noori · 2023-09-28

There are two issues with the current pre-ship AWS Airflow. When a user performs a manifest ingestion with an expected failure (i.e. a missing reference value):
1. The Airflow log's XCom section does not show a proper skipped_ids table, as it did in older milestone releases:
![image](/uploads/74e688293bc55a5449cbf789d9be0d63/image.png)
In the R3M16 version, we can see the XCom table has 3 rows, including the column header:
![image](/uploads/a00c416e70e221e938fce44a2d880a72/image.png)
2. The log does not show any ERROR logging level entries where they would be expected.
I have checked every stage of the Airflow log and I don't see any ERROR-type logging, even though the ingestion is failing (due to a reference value I purposely included in the JSON manifest that does not exist in the database).
Check runID: be401335-e916-4e14-a38b-0feb52ca1b97
Sample JSON used:
```
"data": {
"ProjectName": "KALICO 3D",
"SpatialLocation": {
"QualitativeSpatialAccuracyTypeID": "{{data_partition_id}}:reference-data--QualitativeSpatialAccuracyType:Assumed_HAHAHA:",
...
...
...}
```

---

**Endpoint for readiness/healthness**
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/reservoir/open-etp-server/-/issues/79 · Dzmitry Malkevich (EPAM) · 2023-09-21

We are facing periodic issues where the Open ETP Server responds to the client with an error or does not respond at all, and the pod needs to be restarted, so we need an endpoint to monitor readiness/health.
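A minimal sketch of such an endpoint (the paths `/healthz` and `/readyz` are assumptions following common Kubernetes probe conventions; open-etp-server would implement this in its own HTTP layer, not in Python):

```python
# Sketch: tiny HTTP health endpoint suitable for liveness/readiness probes.
from http.server import BaseHTTPRequestHandler, HTTPServer
import threading
import urllib.request

class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path in ("/healthz", "/readyz"):  # assumed probe paths
            body = b"ok"
            self.send_response(200)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), HealthHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]
status = urllib.request.urlopen(f"http://127.0.0.1:{port}/healthz").status
server.shutdown()
print(status)
```

A Kubernetes liveness probe pointed at such a path would restart the pod automatically instead of requiring manual intervention.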
cc: @Oleksandr_Kosse, @Yan_Sushchynski, @Yauhen_Shaliou

Assignee: Laurent Deny

---

**ReIndex API does not always update the schema mapping to ElasticSearch**
https://community.opengroup.org/osdu/platform/system/indexer-service/-/issues/110 · Zhibin Mai · 2023-09-08

When an augmenter configuration is deployed, two operations are required in order to make use of the updated configuration:
1. Update the schema mapping with extended schema from the augmenter configuration to ElasticSearch
2. Re-index the records of the affected kind.
In this scenario, it is expected that users still can search the "old" data before the re-index is completed.
The current implementation of the ReIndex API does not always update the schema mapping in ElasticSearch if the forceClean option is not set to true. However, when forceClean is set to true, the original index is deleted/purged, so users may not be able to search the expected data until the new index is fully populated.

Assignee: Zhibin Mai

---

**Add --config flag to specify location of json configuration file**
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/reservoir/open-etp-server/-/issues/78 · Fabiola Rivera · 2023-09-21

Current implementation:
The default config JSON file lives alongside the executable; in containers based on the open_etp_server_runtime image it is in the container's /usr/bin/ folder.
The openETPServer will look for the .json file (custom and default) in the following order:
1) Current working directory
2) Parent folder of the working directory (for pipeline tests)
3) /usr/bin
We need to add an optional --config flag. If given, this is the path of the JSON configuration file to use.

Milestone: M21 - Release 0.24 · Assignee: Fabiola Rivera

---

**Exec SEGYIMPORT.exe(or .sh) is the only way to convert a .sgy file to .vds file?**
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/203 · nanting liu · 2023-09-08

I am confused: is running SEGYIMPORT.exe (or .sh) the only method to convert a .sgy file to a .vds file?

---

**[docs][webpage] seems Python API docs are gone**
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/open-vds/-/issues/202 · Filip Brzęk · 2023-09-15

[Python API Link](https://osdu.pages.opengroup.org/platform/domain-data-mgmt-services/seismic/open-vds/python-api.html)
![image](/uploads/5fc6bed0cd3a145846b921ac6d2221c1/image.png)