# OSDU Software issues
https://community.opengroup.org/groups/osdu/-/issues · updated 2022-09-29T13:41:06Z

## Integration test to include input value from timezone different from UTC and test Normalizer
https://community.opengroup.org/osdu/platform/system/indexer-service/-/issues/69 · Debasis Chatterjee · updated 2022-09-29

Please see the working example from @Kateryna_Kurach:
https://community.opengroup.org/osdu/platform/pre-shipping/-/blob/main/R3-M12/Test_Plan_Results_M12/Manifest_Ingestion/M12-GCP-Master-FoR-Date-check-Debasis-Kateryna.txt
Please include tests with positive and negative time shifts.
https://community.opengroup.org/osdu/platform/system/indexer-service/-/blob/master/indexer-core/src/test/java/org/opengroup/osdu/indexer/util/parser/DateTimeParserTest.java#L31
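A plain `java.time` sketch (not the indexer's `DateTimeParser` itself) of what the requested tests should assert: the same instant expressed with a positive and a negative offset must normalize to a single UTC value.

```java
import java.time.Instant;
import java.time.OffsetDateTime;

public class TimezoneNormalizationSketch {
    static Instant toUtcInstant(String input) {
        // Offsets such as +02:00 or -05:00 are folded into the instant.
        return OffsetDateTime.parse(input).toInstant();
    }

    public static void main(String[] args) {
        Instant positiveShift = toUtcInstant("2022-09-29T14:30:00+02:00"); // ahead of UTC
        Instant negativeShift = toUtcInstant("2022-09-29T07:30:00-05:00"); // behind UTC
        Instant utc = toUtcInstant("2022-09-29T12:30:00Z");
        // All three represent 12:30 UTC on the same day.
        System.out.println(positiveShift.equals(utc) && negativeShift.equals(utc)); // true
    }
}
```

A normalizer test along these lines covers both shift directions with one expected UTC value.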
cc @nthakur

## Integration with wellbore ddms loader
https://community.opengroup.org/osdu/ui/data-loading/osdu-cli/-/issues/10 · Chad Leong · updated 2021-12-07

We are developing in parallel a loader for the Wellbore DDMS. This ought to be integrated with osdu-cli as the base framework for data loading exercises in OSDU as the community tool.
https://community.opengroup.org/osdu/platform/data-flow/data-loading/wellbore-ddms-las-loader/-/issues/29

## Intermittent GET Schema endpoint response with 404
https://community.opengroup.org/osdu/platform/system/schema-service/-/issues/69 · Neelesh Thakur · updated 2023-06-28

Indexer/Search consumers have reported that the data block is not indexed for certain records.
While triaging the issue, we found that the Schema service responds with 404 when it receives multiple concurrent requests (please take a look at the logs below).
![Schema_Error](/uploads/e2eb86fb1697c46dc09fa3283e0133f1/Schema_Error.jpg)

Assignee: Aman Verma

## [Intermittent] Record Metadata is available in Cosmos but the Blob store returns a 404.
https://community.opengroup.org/osdu/platform/system/storage/-/issues/66 · Krishna Nikhil Vedurumudi · updated 2022-09-27

If record metadata exists but the actual record doesn't exist in the Blob store, the FetchBatchRecords API returns a 500 with the following response:
```
{
"code": 500,
"reason": "Unable to process parallel blob download",
"message": "AppException(error=AppError(code=404, reason=Specified blob was not found, message=Status code 404, \"<?xml version=\"1.0\" encoding=\"utf-8\"?><Error><Code>BlobNotFound</Code><Message>The specified blob does not exist._RequestId:580b9915-f01e-0009-2c0a-3c65a8000000_Time:2021-04-28T08:45:41.2917696Z</Message></Error>\", errors=null, debuggingInfo=null, originalException=com.azure.storage.blob.models.BlobStorageException: Status code 404, \"<?xml version=\"1.0\" encoding=\"utf-8\"?><Error><Code>BlobNotFound</Code><Message>The specified blob does not exist._RequestId:580b9915-f01e-0009-2c0a-3c65a8000000_Time:2021-04-28T08:45:41.2917696Z</Message></Error>\"), originalException=com.azure.storage.blob.models.BlobStorageException: Status code 404, \"<?xml version=\"1.0\" encoding=\"utf-8\"?><Error><Code>BlobNotFound</Code><Message>The specified blob does not exist._RequestId:580b9915-f01e-0009-2c0a-3c65a8000000_Time:2021-04-28T08:45:41.2917696Z</Message></Error>\")"
}
```
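The 500 above wraps a per-blob 404. A hypothetical sketch (all names invented, not the storage service's actual API) of the defensive pattern: collect missing record ids separately instead of letting one missing blob abort the whole batch.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Optional;
import java.util.function.Function;

public class BatchFetchSketch {
    public record BatchResult(List<String> found, List<String> notFound) {}

    static BatchResult fetchBatch(List<String> ids, Function<String, Optional<String>> blobStore) {
        List<String> found = new ArrayList<>();
        List<String> notFound = new ArrayList<>();
        for (String id : ids) {
            // A missing blob becomes an entry in notFound,
            // not an exception that fails the entire batch with a 500.
            blobStore.apply(id).ifPresentOrElse(blob -> found.add(id), () -> notFound.add(id));
        }
        return new BatchResult(found, notFound);
    }

    public static void main(String[] args) {
        BatchResult r = fetchBatch(List.of("rec-1", "rec-2"),
                id -> id.equals("rec-1") ? Optional.of("blob-1") : Optional.empty());
        System.out.println(r.found());    // [rec-1]
        System.out.println(r.notFound()); // [rec-2]
    }
}
```

The caller can then report the missing records per-item while still returning the records that were found.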
A couple of issues to investigate / fix:
- The PersistentServiceImpl ensures that if the blob write fails, the Cosmos DB update will not happen. How did we run into this inconsistency?
- If one blob does not exist, the entire FetchBatchRecords call should not fail with a 500.
- Error messages for 5xx should always be standard, so the 500 in this case should be "Internal Server Error".

## Invalidate derived data when parent record is deleted
https://community.opengroup.org/osdu/platform/system/storage/-/issues/170 · An Ngo · updated 2023-03-31

Derived data (records with ancestry/parent) inherit the legal tags from the parent record(s).
So when at least one of the parent records is deleted, the child records are no longer valid. Without this step, records with invalid legal tags (or no legal tag) still exist in the system.

## Invalid data-partition-id will create 500 code with no Authorize message.
https://community.opengroup.org/osdu/platform/system/lib/core/os-core-common/-/issues/70 · Bruce Jin · updated 2023-10-10

While making calls to OSDU services such as the `secret` and `storage` services, testers discovered that putting invalid symbols in `data_partition_id` produces a 500 code, but with a reason of Access Denied.
After investigation, we realized the Partition service did not consider the situation where a user puts invalid URI symbols like `@#$%` in the data partition id, which makes the `normalizeStringUrl` function throw a `java.lang` exception in [UrlNormalizationUtil.java](https://community.opengroup.org/osdu/platform/system/lib/core/os-core-common/-/blob/master/src/main/java/org/opengroup/osdu/core/common/util/UrlNormalizationUtil.java).
```
Caused by: java.lang.IllegalArgumentException: Malformed escape pair at index 57: http://os-partition:8080/api/partition/v1/partitions/osdu%
at java.net.URI.create(URI.java:852)
at org.opengroup.osdu.core.common.util.UrlNormalizationUtil.normalizeStringUrl(UrlNormalizationUtil.java:27)
```
This generates a 500 code in the Entitlements service, since the service treats this error as a general error in [SpringExceptionMapper.java](handleGeneralException), instead of a 400 code.
Also, in Entitlements the error message is processed within [AuthorizationServiceImpl.java](https://community.opengroup.org/osdu/platform/system/lib/core/os-core-common/-/blob/master/src/main/java/org/opengroup/osdu/core/common/entitlements/AuthorizationServiceImpl.java), so it carries the `"Access denied", "The user is not authorized to perform this action"` error message.
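The failure mode can be reproduced with plain `java.net.URI`: `URI.create` wraps the `URISyntaxException` for a malformed percent-escape in an `IllegalArgumentException`, which, if uncaught, surfaces as a 500. A sketch (the status-code mapping here is illustrative, not the actual SpringExceptionMapper wiring):

```java
import java.net.URI;

public class PartitionIdSketch {
    static int statusFor(String partitionId) {
        try {
            URI.create("http://os-partition:8080/api/partition/v1/partitions/" + partitionId);
            return 200;
        } catch (IllegalArgumentException e) {
            // e.g. "Malformed escape pair at index ...": the client sent a bad
            // partition id, so this should map to a 400, not a server fault.
            return 400;
        }
    }

    public static void main(String[] args) {
        System.out.println(statusFor("osdu"));   // 200
        System.out.println(statusFor("osdu%"));  // 400
    }
}
```

Catching the `IllegalArgumentException` at the normalization boundary is what allows the service to answer 400 instead of 500.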
Here is an MR that handles the 500 code produced from `java.lang.IllegalArgumentException`: https://community.opengroup.org/osdu/platform/system/lib/core/os-core-common/-/merge_requests/219

Assignee: Bruce Jin

## Invalid dataset records used in ingestion of Volve seismic data
https://community.opengroup.org/osdu/data/open-test-data/-/issues/87 · Michael · updated 2022-10-06

We have noticed in several different environments that the dataset records used for the Volve seismic data are incorrect.
The dataset records being used are dataset--FileCollection.SEGY dataset records.
When attempting to call the dataset get retrieval instruction service (api/dataset/v1/getRetrievalInstructions), a 500 error is returned.
Can the Volve ingestion process be changed to either fix the dataset--FileCollection.SEGY dataset records so that a 500 error does not occur when calling the dataset get retrieval instruction service, or change the dataset records to dataset--File.Generic?

## Investigate batch upload capability for OSDU API
https://community.opengroup.org/osdu/platform/consumption/geospatial/-/issues/152 · Levi Remington · updated 2022-11-30

GCZ Sprint 24

Assignee: Bryan Gunter

## Investigate WITSML parser capability
https://community.opengroup.org/osdu/ui/data-loading/wellbore-ddms-data-loader/-/issues/35 · Chad Leong · updated 2022-11-29

Investigate the extensibility of the current WITSML parser vs a new parser for WITSML files that supports Wellbore DDMS.
The parser is here: https://community.opengroup.org/osdu/platform/data-flow/ingestion/energistics-osdu-integration/-/tree/master/energistics

Assignee: Niall McDaid

## Ipsam officia atque ullam.
https://community.opengroup.org/osdu/platform/data-flow/real-time/consumers/python-cli-kafka-consumer/-/issues/6 · Douglas Dohmeyer · updated 2021-02-15

# Sunt
Qui eos est. Tempora aut qui. Aliquid doloribus consequuntur. Quasi similique qui. Qui deleniti quasi.
`Sit.`

## Ipsam officia atque ullam.
https://community.opengroup.org/osdu/platform/data-flow/real-time/processors/pipe/-/issues/6 · Dmitry Kniazev · updated 2021-02-15

# Sunt
Qui eos est. Tempora aut qui. Aliquid doloribus consequuntur. Quasi similique qui. Qui deleniti quasi.
`Sit.`

## Ipsa molestiae sit aut.
https://community.opengroup.org/osdu/platform/data-flow/real-time/consumers/python-cli-kafka-consumer/-/issues/12 · Douglas Dohmeyer · updated 2021-02-15

### Maxime
Ipsum officia itaque. Illo ex rerum. Molestias quidem voluptatum. Ut dolor alias. Ad sed aspernatur.
quibusdam | quia | quia
---- | ---- | ----
voluptate | sit | et
porro | et | placeat

## Ipsa molestiae sit aut.
https://community.opengroup.org/osdu/platform/data-flow/real-time/processors/pipe/-/issues/12 · Dmitry Kniazev · updated 2021-02-15

### Maxime
Ipsum officia itaque. Illo ex rerum. Molestias quidem voluptatum. Ut dolor alias. Ad sed aspernatur.
quibusdam | quia | quia
---- | ---- | ----
voluptate | sit | et
porro | et | placeat

## is_batch option still supported?
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-dags/-/issues/111 · Bruce Jin · updated 2023-11-14

In the `manifest_ingestion` DAGs we have two payload types, which lead to two flows: one is batch upload, the other is a 3-step flow with some validations for schema and integrity.
My question is, are we still using the batch_upload option on the left?
Since we are not checking schema and integrity on that flow, and we have the option to split one large file into batches in `processing_single_manifest_file_task`: ![Screenshot_2023-11-13_at_3.24.18_PM](/uploads/8a9242b16809d5dcaf1237c4c9152049/Screenshot_2023-11-13_at_3.24.18_PM.png)

## Issue - Cosmos DB Limitations || Shared throughputs
https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/issues/139 · Krishna Nikhil Vedurumudi · updated 2021-06-23

Cosmos DB's shared throughput cannot be stretched beyond 25 collections. Quoting the Azure documentation:
> Azure Cosmos DB accounts using shared database throughput are now limited to 25 collections per database. This will allow for better throughput sharing across collections. Create additional databases with shared throughput and add more collections or add collections to the same database with dedicated throughput.
In OSDU we are currently at 25 collections. So, we have already hit the limitation.
This would require us to re-organize our collections into multiple databases so that they can scale in a better way.

Assignee: Krishna Nikhil Vedurumudi

## Issue in Publisher Facade
https://community.opengroup.org/osdu/platform/system/storage/-/issues/110 · Abhishek Kumar (SLB) · updated 2022-11-21

The branch is not in a running state due to a bug in the Azure core library.
Please refer to this issue: https://community.opengroup.org/osdu/platform/system/lib/cloud/azure/os-core-lib-azure/-/issues/17
**Branch:** UsageOfPublishFacade

Assignee: Nikhil Singh[MicroSoft]

## Issue in Publisher Facade
https://community.opengroup.org/osdu/platform/system/lib/cloud/azure/os-core-lib-azure/-/issues/17 · Abhishek Kumar (SLB) · updated 2022-01-21

Services using the Publisher Facade from this MR would fail with the below error:
```
***************************
APPLICATION FAILED TO START
***************************
Description:
Field pubSubAttributesBuilder in org.opengroup.osdu.azure.publisherFacade.EventGridPublisher required a bean of type 'org.opengroup.osdu.azure.publisherFacade.models.PubSubAttributesBuilder' that could not be found.
The injection point has the following annotations:
- @org.springframework.beans.factory.annotation.Autowired(required=true)
Action:
Consider defining a bean of type 'org.opengroup.osdu.azure.publisherFacade.models.PubSubAttributesBuilder' in your configuration.
```
**Root cause:**
Autowiring beans that are not declared as Spring beans:

```java
@Autowired
private PubSubAttributesBuilder pubSubAttributesBuilder;
```

```java
@Lazy
@Builder
public class PubSubAttributesBuilder {
```
**Solution:**
Remove the unused references to `private PubSubAttributesBuilder pubSubAttributesBuilder` from `src/main/java/org/opengroup/osdu/azure/publisherFacade/EventGridPublisher.java` & `src/main/java/org/opengroup/osdu/azure/publisherFacade/ServiceBusPublisher.java`.

Assignee: Nikhil Singh[MicroSoft]

## Issues 17/22: OSDU pods are in CrashLoopBackOff state
https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-gcp-provisioning/-/issues/23 · m s · updated 2023-08-16

Local HTTP minikube cluster with issues 17/22 errors; documentation (odt), custom-values.yaml, and logs attached:
[commTicket.tar.gz](/uploads/74cda85e867b3addebf02ccac559f86f/commTicket.tar.gz)

Assignee: Dzmitry Malkevich (EPAM)

## Issue with Get Status API
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/127 · Jiman Kim · updated 2024-02-09

Hello, we are running some authentication testing and are running into some behaviors that may or may not be a bug.
For this endpoint:
`/seistore-svc/api/v4/status`
We have 3 tests running:
1. Sends an invalid token
2. Sends a valid token but signed with a wrong secret
3. Sends the HTTP request without an authorization header.
Tests 1 and 2 return a 401, but test 3 returns a 200.
Is this a bug or intended behavior?
Thank you!

Milestone: M21 - Release 0.24

## Issue with Legal tag hardcoded information
https://community.opengroup.org/osdu/ui/data-loading/osdu-cli/-/issues/23 · Durga Prasad Reddy Nadavaluri · updated 2023-12-14

I am attempting to ingest a reference record using the OSDU CLI and have noticed that it utilizes the following part of the schema in the Legal tag section:
```
"legal": {
"legaltags": [
"<Your-legaltag-name>"
],
"otherRelevantDataCountries": [
"US"
],
"status": "compliant"
}
```
From the above, it is evident that "otherRelevantDataCountries" is hardcoded to "US". However, if we are using a specific legal tag, would it be possible to derive the country information from the provided legal tag instead of manually changing it every time for each legal tag?
Reference link to code (ingest.py): https://community.opengroup.org/osdu/ui/data-loading/osdu-cli/-/blob/main/src/osducli/commands/dataload/ingest.py?ref_type=heads#L631

Assignee: Chad Leong