# OSDU Software issues
https://community.opengroup.org/groups/osdu/-/issues

## Provide suitable clue (error message) when the batch size is large and no records are processed/created
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-dags/-/issues/110 · Debasis Chatterjee · 2023-10-19

The JSON payloads are clean; for my test cases they were JSON files from the OSDU Reference data.
And the Policy service is not enabled in my OSDU instance.
I ran the payload a first time, and it failed to create records.
I checked all the log files and they are all clean, which is very misleading.
I then split the initial payload into smaller pieces, and the job went through smoothly.
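The workaround described above, splitting the payload into smaller pieces until the job succeeds, can be sketched as follows; `submit_batch` and the batch size are hypothetical placeholders, not part of the actual ingestion DAG code:

```python
def split_into_batches(records, batch_size):
    """Yield successive chunks of at most batch_size records."""
    for start in range(0, len(records), batch_size):
        yield records[start:start + batch_size]

# Hypothetical usage: submit the payload in chunks instead of one large batch.
# for batch in split_into_batches(payload_records, 500):
#     submit_batch(batch)
```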
What we need is a clear error message such as "in your current system configuration, it is not possible to handle a payload larger than XXX".

## Package AdminCLI
https://community.opengroup.org/osdu/ui/admincli/-/issues/6 · Shane Hutchins · 2023-10-24

Make admincli pip-installable or the like.
https://osdu-community.ideas.aha.io/ideas/IDEA-I-83

## Can't use eds ingest without specific kinds in Storage service
https://community.opengroup.org/osdu/platform/data-flow/ingestion/external-data-sources/core-external-data-workflow/-/issues/41 · Yauheni Rykhter (EPAM) · 2023-10-24

Hello!
Right now, we need to create some Storage kinds for eds ingest dag (ConnectedSourceRegistryEntry and ConnectedSourceDataJob).
But we also need to create 8-9 files for those two specific kinds.
Can you explain the reason for that?
JFYI, @Oleksandr_Kosse, @Yauhen_Shaliou

## Add a sample DataAuthz policy using LegalTag extension properties for policy integration testing
https://community.opengroup.org/osdu/platform/security-and-compliance/policy/-/issues/116 · Dadong Zhou · 2023-10-31

Add a sample DataAuthz policy using LegalTag extension properties, to be included in the policy integration tests.

## Policy service isn't working properly after integration tests
https://community.opengroup.org/osdu/platform/security-and-compliance/policy/-/issues/115 · Yauheni Rykhter (EPAM) · 2023-10-30

Hello!
Our pipeline has a step that deploys the Policy service and the initialization bundles (see the example):
![image](/uploads/5f5b05870bbea7937bdc09d716c1b5a5/image.png)
After the integration tests, the size of the bundle has changed (see the example):
![image](/uploads/087374c11b7716116d8445fcfd9063b7/image.png)
Can you update the tests so that they leave the bundles as they were initialized?
Thanks in advance!
JFYI, @Oleksandr_Kosse, @Yauhen_Shaliou

## Add fluid system type to cap pressure + reference data
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/rock-and-fluid-sample/rafs-ddms-services/-/issues/273 · Mykhailo Buriak · 2023-12-13

* Add FluidSystemType attribute to Capillary Pressure content schema
* Manifest the supporting reference list
community.opengroup.org/osdu/platform/domain-data-mgmt-services/rock-and-fluid-sample/rafs-ddms-services/-/tree/main/app/models/data_schemas/jsonschema

## M20 IBM Error in CSV parser
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/628 · Taylor Graber · 2024-01-29

I am unable to succeed with the csv-parser-dag. The log returns these error messages:
```plaintext
[2023-10-17 16:12:26,580] {pod_launcher.py:149} INFO - Error starting ApplicationContext. To display the conditions report re-run your application with 'debug' enabled.
[2023-10-17 16:12:26,584] {pod_launcher.py:149} INFO - 2023-10-17 16:12:26.583 ERROR 1 --- [ main] o.s.boot.SpringApplication : Application run failed
```
I've attached the full log as well, in case it is helpful. Please let me know if you need more information from me: [FullLog.txt](/uploads/102a9075fb9de58b0189f0b5ec6650a2/FullLog.txt)

## Enable any number of threads
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/reservoir/open-etp-server/-/issues/88 · Gilson Martins · 2023-10-20

Currently, openETPServer limits the number of threads to hardware_concurrency. The purpose is to create a new environment variable to enable/disable this behavior.
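Such a toggle might look like the following sketch; the variable name `OPENETP_UNLIMITED_THREADS` is a hypothetical placeholder (the issue does not name the variable), and the sketch is in Python only for illustration, since openETPServer itself is C++:

```python
import os

def worker_thread_count(requested: int) -> int:
    """Cap the thread count at hardware concurrency unless the toggle is set."""
    # Hypothetical env variable name; the issue only asks for "a new environment variable".
    unlimited = os.environ.get("OPENETP_UNLIMITED_THREADS", "false").lower() in ("1", "true", "yes")
    if unlimited:
        return requested
    return min(requested, os.cpu_count() or 1)
```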
This way, if the env variable is set to true, openETPServer accepts any number of threads.

## GCP Search service failing with "Error making request to Policy service"
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/627 · Rakesh Sharma · 2023-10-17

Hi, we are using the GCP pre-shipping instance hosted at https://preship.gcp.gnrg-osdu.projects.epam.com and the Search service is failing to return data. Following is the error:
```
Status of search response 404 : Not Found {"code":404,"reason":"Error making request to Policy service. Check the inner HttpResponse for more info.","message":"{\"detail\":\"No query in opa response. Translate failed. Policy may not exist. {'result': {}}\"}"}
```

## Unit Test cases
https://community.opengroup.org/osdu/platform/consumption/geospatial/-/issues/308 · vikas rana · 2023-10-18

Data preparation should also be part of the unit test cases.

## 'host' in koop-config.json - Configurable
https://community.opengroup.org/osdu/platform/consumption/geospatial/-/issues/307 · vikas rana · 2023-10-18

The 'host' value in the koop-config.json file is currently hard-coded as "127.0.0.1". We need to make this value configurable.

## Volve Stratigraphy
https://community.opengroup.org/osdu/data/open-test-data/-/issues/93 · Thomas Gehrmann [slb] · 2023-10-25

Add a complete stratigraphic column for Volve. This is based on the spreadsheet provided by Bjarne Bøklepp [Equinor] in the member GitLab [here](https://gitlab.opengroup.org/osdu/subcommittees/data-def/projects/well-delivery/docs/-/blob/master/Design%20Documents/WellLogExtensions/Volve_WellLog_examples/stratigraphy/volve_lithostratigraphy.xlsx?ref_type=heads).

## "add data" API to cross-check Analysis properties and show error if any mismatch
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/rock-and-fluid-sample/rafs-ddms-services/-/issues/271 · Debasis Chatterjee · 2024-01-11

## Initial request
Refer to the related issue in the Data Definition space: https://gitlab.opengroup.org/osdu/subcommittees/data-def/work-products/schema/-/issues/619
We need enough information in the catalog (work-product-component) about what measurements to expect in the "content", plus a strict check (as in Wellbore DDMS) that raises an error when the "add content data" step uses more fields than stated in the WPC. The WPC also does not provide much of a clue about what to expect in the "content" - single-value, multi-value, and so on.
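The strict check described above could be sketched as follows; the helper name and the idea of validating against the WPC's `AvailableSampleAnalysisProperties` list are illustrative assumptions, not the actual RAFS DDMS implementation:

```python
def check_content_fields(content: dict, available_properties: list) -> None:
    """Raise if the content payload uses fields not declared in the WPC record."""
    extra = sorted(set(content) - set(available_properties))
    if extra:
        raise ValueError(
            "Content uses fields not declared in "
            f"AvailableSampleAnalysisProperties: {extra}"
        )
```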
For the NMR example, there are more details when we look at the JSON payload.
```plaintext
"AvailableSampleAnalysisProperties": [
"SamplesAnalysisID",
"SampleID",
"FreshState",
"FullBrineSaturation",
"PartialBrineSaturation",
"LiquidFilledPorosity",
"Porosity",
"EffectivePorosity",
"BVI",
"FFI",
"Swi",
"T2Cutoff"
],
```
This does not quite match page 92 of the sample Kentish report.
```plaintext
"Saturation": {
"Value": 100.0,
"UnitOfMeasure": "{{data-partition-id}}:reference-data--UnitOfMeasure:%25:"
},
"T2": {
"Value": 5.01,
"UnitOfMeasure": "{{data-partition-id}}:reference-data--UnitOfMeasure:ms:"
},
"Porosity": {
"Value": 0.136,
"UnitOfMeasure": "{{data-partition-id}}:reference-data--UnitOfMeasure:%25:"
}
},
```
---
# Proposed Solution
**Precondition**: the content schema is recorded and an existing SampleAnalysisID is defined in the content schema
* The system collects the attribute names, object names, and array names specified in the content schema
* The system drops duplicate names and updates the AvailableSampleAnalysisProperties array in the samplesanalysis record with the names from the content schema
* If the array already included some values, they should be preserved after the update

## Patch M21 Sample Analysis Family/Type
https://community.opengroup.org/osdu/data/data-definitions/-/issues/65 · Thomas Gehrmann [slb] · 2023-10-28

By mistake:
1. some SampleAnalysisType record ids contain `%20` (due to blanks in the code).
2. The SampleAnalysisFamily FlowAssuranceProperties should have been called FlowAssurance.

## Seismic 3D Polygon and attributes
https://community.opengroup.org/osdu/platform/consumption/geospatial/-/issues/306 · Debasis Chatterjee · 2023-10-12

Initial request to @LeviRemi
```
{
"attributes": {
"ProjectName": "LOCKWOOD 3D",
"ProjectEndDate": 1319040000000,
"Operator": "opendes:master-data--Organisation:CHESAPEAKE:",
"OperatingEnvironmentID": "opendes:reference-data--OperatingEnvironment:Onshore:",
"SeismicGeometryTypeID": "opendes:reference-data--SeismicGeometryType:3D:",
"SampleInterval": "0.002",
"RecordLength": "3.0",
"FoldCount": "999",
"esriOid": 10
},
```
Do you think there is an option to truncate the value when appropriate?
Ex: Operator = CHESAPEAKE.
Operating Environment = Onshore
SeismicGeometryType = 3D
Also, check the ProjectEndDate.
And what about units of measure for fields such as SampleInterval and RecordLength?
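For reference, the truncation asked about above amounts to taking the last code segment of an OSDU record id, and ProjectEndDate looks like a Unix epoch in milliseconds; a minimal sketch (the helper names are made up):

```python
from datetime import datetime, timezone

def local_code(record_id: str) -> str:
    """Return the trailing code segment of an OSDU record id.

    e.g. "opendes:master-data--Organisation:CHESAPEAKE:" -> "CHESAPEAKE"
    """
    return record_id.rstrip(":").rsplit(":", 1)[-1]

def epoch_ms_to_iso_date(ms: int) -> str:
    """Interpret an epoch-milliseconds value as a UTC calendar date."""
    return datetime.fromtimestamp(ms / 1000, tz=timezone.utc).date().isoformat()
```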
Was there any consideration to expose AreaCalculated or AreaNominal?

## A custom header 'x-user-id' is used in core part
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/158 · Riabokon Stanislav (EPAM) · 2023-11-08

I wanted to bring to your attention an issue that was identified by our GC Team while they were addressing https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/157.
`org.opengroup.osdu.workflow.service.WorkflowRunServiceImpl#addUserId`
```java
private Map<String, Object> addUserId(String workflowName, TriggerWorkflowRequest request) {
final Map<String, Object> executionContext = request.getExecutionContext();
if (executionContext.get(KEY_USER_ID) != null) {
String errorMessage = String.format("Request to trigger workflow with name %s failed because execution context contains reserved key 'userId'", workflowName);
throw new AppException(400, "Failed to trigger workflow run", errorMessage);
}
String userId = dpsHeaders.getUserId();
log.debug("putting user id: " + userId + " in execution context");
executionContext.put(KEY_USER_ID, userId);
return executionContext;
}
```
The current logic relies on a custom header that is primarily intended for use at an infrastructural level, as outlined in https://community.opengroup.org/osdu/platform/data-flow/ingestion/home/-/issues/52. The GC team approved an ADR with the understanding that this custom header would not be utilized within the core codebase.
However, as indicated in https://community.opengroup.org/osdu/platform/deployment-and-operations/helm-charts-azure/-/merge_requests/366, a header named 'x-user-id' is populated with data from 'x-on-behalf-of' using a specific rule. This mechanism aligns with the requirements of the CSP provider but may not be entirely suitable for the Core Part of the Workflow Service.
```lua
if (jwt_authn[msft_issuer]["appid"] == serviceAccountClientId and on_behalf_of_header ~= nil and on_behalf_of_header ~= '') then
request_handle:headers():add("x-user-id", request_handle:headers():get("x-on-behalf-of"))
else
request_handle:headers():add("x-user-id", jwt_authn[msft_issuer]["appid"])
end
```
This logic introduces **three key issues**:
- The core part of the Workflow service depends on a custom CSP header to build its execution context, which may not be in alignment with the intended architecture.
- The Workflow service may not operate correctly without Istio and the accompanying special rule, potentially limiting its usability.
- There is a security concern in that 'x-user-id' is not currently validated on the backend side, allowing any user to set it for their own purposes.
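One way to mitigate the third issue would be to strip any client-supplied 'x-user-id' at the service boundary unless it comes from the trusted gateway. This is a framework-agnostic sketch of the idea, not the Workflow service's actual code, and the gateway-trust flag is a hypothetical assumption:

```python
USER_ID_HEADER = "x-user-id"

def sanitize_headers(headers: dict, from_trusted_gateway: bool) -> dict:
    """Drop a client-supplied x-user-id unless the trusted gateway set it.

    Without such a check, any authorized caller can impersonate another
    user simply by sending the header directly.
    """
    if from_trusted_gateway:
        return dict(headers)
    return {k: v for k, v in headers.items() if k.lower() != USER_ID_HEADER}
```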
_As for the third problem_, consider this test case:
1. A user was authorized within Workflow Service.
2. This user sends 'x-user-id' with the name of another user, resulting in a workflow being triggered under the identity of a different user.

## Implement CRUD endpoints to manage (master) data
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/rock-and-fluid-sample/rafs-ddms-services/-/issues/270 · Mykhailo Buriak · 2023-12-13

The RAFS DDMS API should have endpoints that support CRUD of RAFS DDMS related master data:
- POST
- GET
- DELETE
- GET/version
Endpoints should cover the following master data in v2:
1. Sample
2. Coring
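The endpoint set above (POST, GET, DELETE, GET/version) can be sketched as an in-memory store; the record layout and versioning rules here are illustrative assumptions, not the actual RAFS DDMS design:

```python
class MasterDataStore:
    """In-memory sketch of the CRUD-plus-versioning surface described above."""

    def __init__(self):
        self._versions = {}  # record id -> list of record versions

    def post(self, record_id, record):
        """Create or update a record; returns the new version number."""
        self._versions.setdefault(record_id, []).append(dict(record))
        return len(self._versions[record_id])

    def get(self, record_id):
        """Return the latest version of a record."""
        return self._versions[record_id][-1]

    def get_version(self, record_id, version):
        """Return a specific (1-based) version of a record."""
        return self._versions[record_id][version - 1]

    def delete(self, record_id):
        """Remove a record and all its versions."""
        self._versions.pop(record_id)
```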
Link to Sample master data schema - https://gitlab.opengroup.org/osdu/subcommittees/data-def/work-products/schema/-/blob/master/E-R/master-data/Sample.1.0.0.md?ref_type=heads
Ensure that a SampleID created via the DDMS can be referenced from the SamplesAnalysis and SamplesAnalysesReport WPCs.
No need to migrate the rocksample schema, as it will be superseded by the Sample master-data schema.

## Manifest supporting Sample Analysis reference lists
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/rock-and-fluid-sample/rafs-ddms-services/-/issues/269 · Mykhailo Buriak · 2023-12-13

Manifest the following reference lists to support SamplesAnalysisReport and SamplesAnalysis metadata:
https://gitlab.opengroup.org/osdu/subcommittees/data-def/work-products/schema/-/blob/master/E-R/reference-data/SampleAnalysisType.1.0.0.md?ref_type=heads
https://gitlab.opengroup.org/osdu/subcommittees/data-def/work-products/schema/-/blob/master/E-R/reference-data/SampleAnalysisFamily.1.0.0.md?ref_type=heads
https://gitlab.opengroup.org/osdu/subcommittees/data-def/work-products/schema/-/blob/master/E-R/reference-data/SampleAnalysisSubFamily.1.0.0.md?ref_type=heads
https://gitlab.opengroup.org/osdu/subcommittees/data-def/work-products/schema/-/blob/master/E-R/reference-data/SamplesAnalysisCategoryTag.1.0.0.md?ref_type=heads

## Switch to published SamplesAnalysesReport WPC
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/rock-and-fluid-sample/rafs-ddms-services/-/issues/268 · Mykhailo Buriak · 2023-10-13

## Switch to published SamplesAnalysis WPC
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/rock-and-fluid-sample/rafs-ddms-services/-/issues/267 · Mykhailo Buriak · 2023-10-13