OSDU Software issues
https://community.opengroup.org/groups/osdu/-/issues (2023-09-20T02:17:59Z)

https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/114
Implement dataset storage for IBM (2023-09-20T02:17:59Z, Mark Yan)

https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/113
Implement dataset storage for GCP (2023-09-20T02:17:21Z, Mark Yan)

https://community.opengroup.org/osdu/platform/consumption/geospatial/-/issues/299
Transformer - Ingest Seismic Nav/Shot-Points using new GeoJSON Pattern (2023-09-19T14:38:06Z, Levi Remington)

Support for TraceNav will be added to OSDU. [This pattern](https://gitlab.opengroup.org/osdu/subcommittees/data-def/work-products/schema/-/blob/731231e814366336ac8f7e928bed2d0eadbb63ef/Examples/WorkedExamples/Seismic2DLineNavigation/content/TraceNav.ST0299-05005_MIG_FIN.json) will be used. The following tasks should be addressed:
* Confirm existence of TraceData utilizing the new pattern in IBM Preship RI Environment
* Update the Transformer to support the new data format.
For additional info/context, see [this slack conversation with Thomas Gehrmann](https://opensdu.slack.com/archives/C022X4RC0UR/p1693269631009799?thread_ts=1693225511.708599&cid=C022X4RC0UR).

https://community.opengroup.org/osdu/platform/deployment-and-operations/helm-charts-azure/-/issues/29
Missing `helm dependency update` in Helm Chart for OSDU on Azure (2023-09-18T15:40:58Z, Paweł Grudzień)

**Description:**
The current instructions for deploying Helm charts do not include the essential step of updating dependencies using `helm dependency update`. This omission results in errors during chart deployment because dependencies are missing.
**Details:**
When attempting to deploy Helm charts using the provided instructions, the following error occurs:
```
Error: An error occurred while checking for chart dependencies. You may need to run `helm dependency build` to fetch missing dependencies: found in Chart.yaml, but missing in charts/ directory: unit, crs-catalog, crs-conversion, osdu-helm-library
```
Strangely enough, I do not remember having this problem during the previous installation, so I assume either something changed in the code or my execution was slightly different. Nevertheless, adding `helm dependency update` should not break anything and would make the instructions more robust.
**Expected Behavior:**
The Helm chart should deploy successfully without dependency errors.
**Actual Behavior:**
The deployment fails due to missing chart dependencies.
**Steps to Reproduce:**
1. Follow the provided instructions to deploy Helm charts.
2. Observe the error indicating missing chart dependencies.
**Suggested Fix:**
Include the `helm dependency update` command before each `helm upgrade` command in the instructions:
```bash
# Ensure your context is set.
az aks get-credentials -n <your kubernetes service> --admin -g <resource group>
# Create Namespace
NAMESPACE=osdu-azure
kubectl create namespace $NAMESPACE && kubectl label namespace $NAMESPACE istio-injection=enabled
# Update dependencies and install charts
helm dependency update osdu-azure/osdu-partition_base
helm upgrade -i partition-services osdu-azure/osdu-partition_base -n $NAMESPACE -f osdu_azure_custom_values.yaml
helm dependency update osdu-azure/osdu-opa
helm upgrade -i opa osdu-azure/osdu-opa -n $NAMESPACE -f osdu_azure_custom_values.yaml --set global.replicaCount=3
helm dependency update osdu-azure/osdu-security_compliance
helm upgrade -i security-services osdu-azure/osdu-security_compliance -n $NAMESPACE -f osdu_azure_custom_values.yaml
helm dependency update osdu-azure/osdu-core_services
helm upgrade -i core-services osdu-azure/osdu-core_services -n $NAMESPACE -f osdu_azure_custom_values.yaml
helm dependency update osdu-azure/osdu-reference_helper
helm upgrade -i reference-services osdu-azure/osdu-reference_helper -n $NAMESPACE -f osdu_azure_custom_values.yaml
helm dependency update osdu-azure/osdu-ingest_enrich
helm upgrade -i ingest-services osdu-azure/osdu-ingest_enrich -n $NAMESPACE -f osdu_azure_custom_values.yaml
```

https://community.opengroup.org/osdu/platform/deployment-and-operations/helm-charts-azure/-/issues/28
Outdated Airflow create_user command in instructions (update for instructions) (2023-09-18T15:23:17Z, Paweł Grudzień)

**Title:** Outdated Airflow `create_user` command in instructions
**Description:**
The provided instructions for creating a user in Airflow use the old `create_user` command syntax. However, in the newer version of Airflow (Airflow 2), the correct command is `users create`.
**Details:**
The current documentation instructs users to utilize the following command:
```bash
airflow create_user \
--role Admin \
--username $USER_FIRST \
--firstname $USER_FIRST \
--lastname $USER_LAST \
--email $EMAIL \
--password $PASSWORD
```
This command is outdated and is not supported in Airflow 2.
**Expected Behavior:**
Instructions should utilize the updated command syntax compatible with Airflow 2:
```bash
airflow users create \
--role Admin \
--username $USER_FIRST \
--firstname $USER_FIRST \
--lastname $USER_LAST \
--email $EMAIL \
--password $PASSWORD
```
**Actual Behavior:**
Using the outdated command results in an error or unrecognized command in the Airflow 2 environment.
**Steps to Reproduce:**
1. Install Airflow 2.
2. Attempt to create a user using the provided `create_user` command.
3. Observe the error indicating the command is not recognized.
**Suggested Fix:**
Update the documentation to use the correct command syntax for creating a user in Airflow 2.

https://community.opengroup.org/osdu/platform/deployment-and-operations/helm-charts-azure/-/issues/27
Incorrect Kubernetes namespace in Airflow container retrieval instructions (2023-09-18T12:37:46Z, Paweł Grudzień)

**Title:** Incorrect Kubernetes namespace in Airflow container retrieval instructions
**Description:**
The provided instructions for accessing the Airflow web container refer to the wrong Kubernetes namespace. The documentation currently indicates the namespace as `airflow`, whereas the setup instructions establish it as `airflow2`. This is a minor bug, but it catches me every time I deploy (and was not obvious the first time I deployed).
**Details:**
In the provided documentation, users are instructed to set up Airflow in the `airflow2` namespace:
```bash
# Create Namespace
NAMESPACE=airflow2
kubectl create namespace $NAMESPACE
```
However, subsequent instructions to retrieve the Airflow web container are using the `airflow` namespace:
```bash
# Get Airflow web container
AIRFLOW_WEB_CONTAINER=$(kubectl get pod -n airflow | grep "web" | cut -f 1 -d " ")
```
```
$ AIRFLOW_WEB_CONTAINER=$(kubectl get pod -n airflow | grep "web" | cut -f 1 -d " ")
No resources found in airflow namespace.
```
**Expected Behavior:**
The instructions should be consistent, with both referring to the same Kubernetes namespace.
**Actual Behavior:**
There's an inconsistency between setup instructions and the container retrieval instructions in terms of the namespace used.
**Steps to Reproduce:**
1. Follow the provided instructions to set up Airflow.
2. Attempt to retrieve the Airflow web container using the given command.
3. Observe the mismatch in namespace usage.
**Suggested Fix:**
Update the container retrieval instructions to use the `airflow2` namespace:
```bash
# Get Airflow web container
AIRFLOW_WEB_CONTAINER=$(kubectl get pod -n airflow2 | grep "web" | cut -f 1 -d " ")
```

https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/issues/320
Authentication hangs with AzureRM provider version 2.64.0 in Monitoring Resources terraform and needs update (2023-09-18T11:13:02Z, Paweł Grudzień)

**Description:**
When using the AzureRM Terraform provider at version `2.64.0`, the Monitoring Resources `terraform apply` script hangs indefinitely without providing any error message. After updating the AzureRM provider to a newer version, the problem is resolved, suggesting an authentication issue with version `2.64.0`. Sadly, I did not capture the logs.
**Details:**
In a Terraform script with the below configuration:
```
terraform {
required_version = ">= 1.3"
backend "azurerm" {
key = "terraform.tfstate"
}
required_providers {
azurerm = {
source = "hashicorp/azurerm"
version = "=2.64.0"
}
random = {
source = "hashicorp/random"
version = "=2.3.1"
}
}
}
```
The terraform apply command hangs indefinitely during execution. Although no error message was shown in standard logs, verbose logs indicated an authentication error to Azure.
**Expected Behavior:**
The `terraform apply` command should either execute successfully or fail fast with a clear error message.
**Actual Behavior:**
The script hangs indefinitely without any feedback to the user.
**Steps to Reproduce:**
1. Use the AzureRM provider at version `2.64.0` in a Terraform script.
2. Execute the script.
3. Observe that it hangs without any clear error message.
**Workaround:**
Upgrading the AzureRM provider to a newer version (e.g., `3.73.0`) resolves the problem.
**Suggested Fix:**
Update the documentation to use the latest version of the provider.
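A hedged sketch of what the updated provider block could look like (the exact version number is only an example; pin to whatever current release has been verified). Using a pessimistic constraint (`~>`) instead of an exact pin makes it easier to pick up fixed patch releases:

```
terraform {
  required_providers {
    azurerm = {
      source  = "hashicorp/azurerm"
      # Example constraint: allows any 3.73.x patch release.
      version = "~> 3.73"
    }
  }
}
```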
**Environment:**
- Terraform version: (e.g., 1.5.1)
- AzureRM provider version where the issue was observed: `2.64.0`

https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/issues/319
Incorrect usage of trim function leads to malformed resource names in monitoring resources terraform (2023-09-18T10:59:58Z, Paweł Grudzień)

Description:
In the Monitoring Resources main.tf Terraform module, the `trim` function is being used to remove specific suffixes from strings. However, the current usage can lead to the unintended removal of characters, causing malformed resource names in Azure resources.
Details:
The specific instance observed is in the trimming of the `-rg` suffix from resource group names. The current code uses:
```
central_group_prefix = trim(data.terraform_remote_state.central_resources.outputs.central_resource_group_name, "-rg")
```
The intention is to remove the `-rg` suffix, but due to the behavior of `trim`, it also removes any individual `-`, `r`, and `g` characters from the ends of the string, leading to unexpected results.
For instance, a name like "osdu-pl2-crpl2-583g-rg" is trimmed to "osdu-pl2-crpl2-583" instead of the expected "osdu-pl2-crpl2-583g".
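As a side illustration, the difference can be reproduced with Python's analogous string methods: `str.strip` takes a character set, like Terraform's `trim`, while `str.removesuffix` removes only an exact trailing suffix, like `trimsuffix`:

```python
name = "osdu-pl2-crpl2-583g-rg"

# Like Terraform's trim(name, "-rg"): strips any of the characters
# '-', 'r', 'g' from both ends, so the trailing 'g' is lost too.
trimmed = name.strip("-rg")          # "osdu-pl2-crpl2-583"

# Like Terraform's trimsuffix(name, "-rg"): removes only the literal
# "-rg" suffix, preserving the rest of the name.
suffixed = name.removesuffix("-rg")  # "osdu-pl2-crpl2-583g"
```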
Expected Behavior:
The `-rg` suffix should be removed without affecting other characters in the string.
Actual Behavior:
Characters within the `-rg` suffix are removed individually if they appear at the ends of the string, leading to unexpected results.
Steps to Reproduce:
1. Use a resource group name like "osdu-pl2-crpl2-583g-rg".
2. Apply the Terraform module.
3. Observe that resources dependent on the `central_group_prefix` variable have the `g` character missing.
Suggested Fix:
Replace the trim function with the trimsuffix function, which will only remove the exact -rg suffix:
```
central_group_prefix = trimsuffix(data.terraform_remote_state.central_resources.outputs.central_resource_group_name, "-rg")
```
This change should be applied wherever the `trim` function is used in a similar context.

https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-sdutil/-/issues/29
Add verification that the Seismic's cloud provider matches the one from the config.yaml (2023-10-26T12:19:36Z, Yan Sushchynski (EPAM))

Hello,
Recently we introduced a new implementation of Seismic for Google Cloud. It mostly follows the same workflow as the previous Google implementation, but there are some crucial differences, and we got unexpected results. As far as I remember, the service's responses contain information about the cloud provider.
What if some extra checks are added?
Thanks

Diego Molteni, Mark Yan, Diego Molteni

https://community.opengroup.org/osdu/platform/system/search-service/-/issues/135
ADR Provide suggestions for auto-complete of input (2024-01-15T11:56:08Z, Mark Chance)

# ADR: Autocomplete
<a name="TOC"></a>
[[_TOC_]]
# Status
- [x] Proposed
- [x] Trialing
- [ ] Under review
- [ ] Approved
- [ ] Retired
# Background
Shell application developer stakeholders want to offer their users auto-complete suggestions based on partial input.
# Context & Scope
Based on words occurring in OSDU platform records, a comparison is made against all text tokens occurring in all fields of a record. For this case we propose using the `bagOfWords` field described in the indexer [ADR](https://community.opengroup.org/osdu/platform/system/indexer-service/-/issues/113).
[Back to TOC](#TOC)
## Requirements
The partial input is passed to the search service and a list of suggestions is returned.
To be useful, the response time must be under 2 seconds.
[Back to TOC](#TOC)
# Tradeoff Analysis
[Back to TOC](#TOC)
# Proposed solution
The search query json will support this syntax:
```json
{
"suggestPhrase": "united"
}
```
Which would return something of the form:
```json
{
"phraseSuggestions": [
"United States",
"United States therm",
"United Kingdom",
"United Kingdom British thermal unit",
"United Kingdom term",
"United Kingdom nautical mile"
]
}
```
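One way the proposed `suggestPhrase` lookup could be served (an assumption for illustration, not part of this ADR; the `bagOfWords.autocomplete` subfield name is hypothetical) is an Elasticsearch completion suggester query:

```json
{
  "suggest": {
    "phrase-suggest": {
      "prefix": "united",
      "completion": {
        "field": "bagOfWords.autocomplete",
        "size": 10
      }
    }
  }
}
```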
[Back to TOC](#TOC)
# Change Management
* Operators may need to execute reindex with force_clean=true action on indices to enable this feature.
# Decision
# Consequences
* The search code changes will not impact any existing queries or functionality since this is a new field.
[Back to TOC](#TOC)
#EOF.

M23 - Release 0.26 (Mark Chance)

https://community.opengroup.org/osdu/platform/system/indexer-service/-/issues/113
ADR: Bag of Words (2024-03-18T14:07:18Z, Mark Chance)

# ADR: Copy all text field to BagOfWords field
<a name="TOC"></a>
[[_TOC_]]
# Status
- [x] Proposed
- [x] Trialing
- [x] Under review
- [x] Approved
- [ ] Retired
# Background
The application development stakeholders want to provide their users a mechanism to search for words in a record regardless of where they appear in the record. Currently this does not work for nested fields, as the inner mechanism relies on the `query_string` ES query, which does not allow searching through nested documents.
# Context & Scope
[Back to TOC](#TOC)
## Requirements
- User is able to find resources by words stored in any field, using a query without explicit field names.
- User is able to find resources referencing a given ID from external systems if this ID is part of the referencing OSDU ID.
- (Additional) The list of all phrases is stored inside a single field, to enable a simple autocompletion implementation.
[Back to TOC](#TOC)
# Tradeoff Analysis
## Option 1
All the fields are copied to the word bag using the `copy_to` mechanism. We are proposing `bagOfWords` as the internal field name for this use case. This enables the user to find wells through their alias names using a fulltext query (name aliases are stored in a nested array, so currently this is not possible without explicitly specifying the field name).

Additionally, we would like to add ID details to `bagOfWords`, as they are often IDs from external source systems (in "osdu:wks::master-data--Well-1.0.0:43234324" the detail may contain a UWI). So, when users know 4323424 (for example, from the source system) but do not know the OSDU internal ID, they are still able to find records referencing it (for example, find all DS related to a given wellbore). Such a field is also valuable for implementing search-as-you-type autocompletion: we can create a simple but powerful version of it by just adding a subfield with ES completion indexing and exposing it for searching.
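As a rough illustration of the `copy_to` idea (the source field name is an example, and the handling of nested fields is precisely what this ADR needs to work out), an Elasticsearch mapping fragment might look like:

```json
{
  "mappings": {
    "properties": {
      "FacilityName": { "type": "text", "copy_to": "bagOfWords" },
      "bagOfWords": {
        "type": "text",
        "fields": {
          "autocomplete": { "type": "completion" }
        }
      }
    }
  }
}
```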
## Option 2
If for some reason Option 1 is too broad, it is suggested to use the indexing hints added to the schema files as described here: https://gitlab.opengroup.org/osdu/subcommittees/ea/work-products/adr-elaboration/-/issues/66. A tag such as `x-osdu-indexing-copytowordbag` could indicate that the associated field is to be added to the word-bag field, for example:
"x-osdu-indexing-copytowordbag": "enabled"/"disabled"
However, such an approach would make schemas less portable, as every OSDU installation may have different needs.
[Back to TOC](#TOC)
# Proposed solution
For each kind of resource, an index will be created, and the `bagOfWords` value will contain all (normalized) tokens from all other text fields in the mapping.
This will enable a query of the form:
```json
{
"kind": "osdu:*:*:*",
"query": "test"
}
```
which would return
```json
{
"results": [
{
"data": {
"FacilityName": "Example test"
},
"id": "osdu:master-data--Well:1012"
},
{
"data": {
"FacilityNameAlias": "Example test"
},
"id": "osdu:master-data--Well:30142"
}
]
}
```
The search service queries against the word-bag field, so both wells would be returned despite 'test' occurring in different fields.
[Back to TOC](#TOC)
## Accepted Limitations / things to work out
[Back to TOC](#TOC)
# Change Management
* Operators may need to execute reindex with force_clean=true action on indices to enable this feature.
# Decision
# Consequences
* The indexer code changes should have no impact on automated applications, as they use field-based queries, which are unchanged. Applications where the user controls the top-level query might show additional results (for matches in nested objects and in ID details), but this is expected behavior.
[Back to TOC](#TOC)
#EOF.

M22 - Release 0.25 (Mark Chance)

https://community.opengroup.org/osdu/data/open-test-data/-/issues/92
Create Seismic 2D Navigation sample JSON payloads - ready to support display of SP labels (2023-09-14T13:02:22Z, Debasis Chatterjee)

Please see
https://gitlab.opengroup.org/osdu/subcommittees/data-def/work-products/schema/-/issues/348#note_69692
With that information, I think we may need to overhaul these (SEGP1) examples.
Existing JSON payloads here
https://community.opengroup.org/osdu/platform/data-flow/data-loading/open-test-data/-/tree/master/rc--3.0.0/4-instances/Volve/work-products/seismics_1_2_0
cc @Keith_Wall

https://community.opengroup.org/osdu/platform/system/search-service/-/issues/134
Search should not return 404 in case there are no matching data in Elasticsearch (2023-11-08T14:07:37Z, Denis Karpenok (EPAM))

**The expected result:**
- When no data matches the query, the response is 200 OK with an empty list.
**Actual results are:**
- Inconsistent: sometimes it is 200 OK, sometimes 400.
**Reason:**
- Not all requests to Elasticsearch have parameters to ignore user errors; usually those are preliminary requests to get details for further search queries, for example: https://community.opengroup.org/osdu/platform/system/search-service/-/blob/master/search-core/src/main/java/org/opengroup/osdu/search/service/FieldMappingTypeService.java#L49
**Solution:**
- Suppress all 400 errors from Elasticsearch and respond to the end user only with 200 OK.
**Pros:**
- More consistent workflow for client applications.
- Reduced error handling for client applications.
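The suppression idea can be sketched as follows (a hedged illustration only, not the actual search-service code, which is Java): preliminary Elasticsearch lookups are wrapped so that client (4xx) errors degrade to an empty result, letting the API answer 200 OK with an empty list.

```python
class ElasticClientError(Exception):
    """Stand-in for an Elasticsearch error carrying an HTTP status code."""
    def __init__(self, status):
        super().__init__(f"elasticsearch error: {status}")
        self.status = status

def query_or_empty(run_query):
    """Run a query callable; map 4xx errors to an empty hit list."""
    try:
        return run_query()
    except ElasticClientError as exc:
        if 400 <= exc.status < 500:
            return []   # caller responds 200 OK with an empty result set
        raise           # genuine server-side failures still propagate
```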
More details are in the attached CSV files:
[test_results_2023-08-29_11-34-31.csv](/uploads/03bf18c852387f4da493aa13b97ad5d3/test_results_2023-08-29_11-34-31.csv)
[test_results_2023-08-29_11-51-20.csv](/uploads/6071b35ea688e57bdf24112198a9ddd7/test_results_2023-08-29_11-51-20.csv)

https://community.opengroup.org/osdu/platform/consumption/geospatial/-/issues/297
Look at adding URL to file as attribute for any data item we index that includes a file in OSDU (e.g. SEGY for seismic, or LAS for log, etc.) (2023-09-13T20:56:44Z, Brian)

https://community.opengroup.org/osdu/platform/consumption/geospatial/-/issues/296
Transformer pod crashloopBackoff in Azure K8s (2023-09-22T05:23:58Z, Benjamin LaGrone)

kubectl logs gcz-transformer-7c4dbd8dcf-qrsz6 -n ignite
2023-09-13 18:34:19,443 main DEBUG Apache Log4j Core 2.17.2 initializing configuration YamlConfiguration[location=jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml]
2023-09-13 18:34:19,445 main DEBUG PluginManager 'Core' found 127 plugins
2023-09-13 18:34:19,445 main DEBUG PluginManager 'Level' found 0 plugins
2023-09-13 18:34:19,446 main DEBUG Processing node for object appenders
2023-09-13 18:34:19,446 main DEBUG Processing node for object Console
2023-09-13 18:34:19,446 main DEBUG Node name is of type STRING
2023-09-13 18:34:19,447 main DEBUG Processing node for object PatternLayout
2023-09-13 18:34:19,447 main DEBUG Node Pattern is of type STRING
2023-09-13 18:34:19,447 main DEBUG Returning PatternLayout with parent Console of type layout:class org.apache.logging.log4j.core.layout.PatternLayout
2023-09-13 18:34:19,448 main DEBUG Returning Console with parent appenders of type appender:class org.apache.logging.log4j.core.appender.ConsoleAppender
2023-09-13 18:34:19,448 main DEBUG Processing node for array RollingFile
2023-09-13 18:34:19,449 main DEBUG Processing RollingFile[0]
2023-09-13 18:34:19,449 main DEBUG Processing node for object PatternLayout
2023-09-13 18:34:19,450 main DEBUG Node pattern is of type STRING
2023-09-13 18:34:19,450 main DEBUG Returning PatternLayout with parent RollingFile of type layout:class org.apache.logging.log4j.core.layout.PatternLayout
2023-09-13 18:34:19,450 main DEBUG Processing node for object Policies
2023-09-13 18:34:19,450 main DEBUG Processing node for object SizeBasedTriggeringPolicy
2023-09-13 18:34:19,451 main DEBUG Node size is of type STRING
2023-09-13 18:34:19,451 main DEBUG Returning SizeBasedTriggeringPolicy with parent Policies of type SizeBasedTriggeringPolicy:class org.apache.logging.log4j.core.appender.rolling.SizeBasedTriggeringPolicy
2023-09-13 18:34:19,457 main DEBUG Returning Policies with parent RollingFile of type Policies:class org.apache.logging.log4j.core.appender.rolling.CompositeTriggeringPolicy
2023-09-13 18:34:19,457 main DEBUG Processing node for object DefaultRollOverStrategy
2023-09-13 18:34:19,460 main DEBUG Node max is of type NUMBER
2023-09-13 18:34:19,460 main DEBUG Returning DefaultRollOverStrategy with parent RollingFile of type DefaultRolloverStrategy:class org.apache.logging.log4j.core.appender.rolling.DefaultRolloverStrategy
2023-09-13 18:34:19,460 main DEBUG Processing node for array File
2023-09-13 18:34:19,461 main DEBUG Processing File[0]
2023-09-13 18:34:19,461 main DEBUG Processing node for object PatternLayout
2023-09-13 18:34:19,462 main DEBUG Node pattern is of type STRING
2023-09-13 18:34:19,463 main DEBUG Returning PatternLayout with parent File of type layout:class org.apache.logging.log4j.core.layout.PatternLayout
2023-09-13 18:34:19,463 main DEBUG Processing File[1]
2023-09-13 18:34:19,463 main DEBUG Processing node for object PatternLayout
2023-09-13 18:34:19,464 main DEBUG Node pattern is of type STRING
2023-09-13 18:34:19,464 main DEBUG Returning PatternLayout with parent File of type layout:class org.apache.logging.log4j.core.layout.PatternLayout
2023-09-13 18:34:19,465 main DEBUG Returning appenders with parent root of type appenders:class org.apache.logging.log4j.core.config.AppendersPlugin
2023-09-13 18:34:19,465 main DEBUG Processing node for object Loggers
2023-09-13 18:34:19,465 main DEBUG Processing node for array logger
2023-09-13 18:34:19,466 main DEBUG Processing logger[0]
2023-09-13 18:34:19,466 main DEBUG Processing array for object AppenderRef
2023-09-13 18:34:19,466 main DEBUG Node ref is of type STRING
2023-09-13 18:34:19,467 main DEBUG Returning AppenderRef with parent logger of type AppenderRef:class org.apache.logging.log4j.core.config.AppenderRef
2023-09-13 18:34:19,467 main DEBUG Processing logger[1]
2023-09-13 18:34:19,467 main DEBUG Processing array for object AppenderRef
2023-09-13 18:34:19,468 main DEBUG Node ref is of type STRING
2023-09-13 18:34:19,468 main DEBUG Returning AppenderRef with parent logger of type AppenderRef:class org.apache.logging.log4j.core.config.AppenderRef
2023-09-13 18:34:19,469 main DEBUG Processing logger[2]
2023-09-13 18:34:19,469 main DEBUG Processing array for object AppenderRef
2023-09-13 18:34:19,469 main DEBUG Node ref is of type STRING
2023-09-13 18:34:19,470 main DEBUG Returning AppenderRef with parent logger of type AppenderRef:class org.apache.logging.log4j.core.config.AppenderRef
2023-09-13 18:34:19,470 main DEBUG Processing node for object Root
2023-09-13 18:34:19,470 main DEBUG Node level is of type STRING
2023-09-13 18:34:19,471 main DEBUG Processing node for array AppenderRef
2023-09-13 18:34:19,471 main DEBUG Processing AppenderRef[0]
2023-09-13 18:34:19,471 main DEBUG Processing AppenderRef[1]
2023-09-13 18:34:19,472 main DEBUG Returning Root with parent Loggers of type root:class org.apache.logging.log4j.core.config.LoggerConfig$RootLogger
2023-09-13 18:34:19,472 main DEBUG Returning Loggers with parent root of type loggers:class org.apache.logging.log4j.core.config.LoggersPlugin
2023-09-13 18:34:19,474 main DEBUG Completed parsing configuration
2023-09-13 18:34:19,477 main DEBUG PluginManager 'Lookup' found 16 plugins
2023-09-13 18:34:19,479 main DEBUG Building Plugin[name=layout, class=org.apache.logging.log4j.core.layout.PatternLayout].
2023-09-13 18:34:19,497 main DEBUG PluginManager 'TypeConverter' found 26 plugins
2023-09-13 18:34:19,516 main DEBUG PatternLayout$Builder(pattern="[%-5level] %d{yyyy-MM-dd HH:mm:ss.SSS} [%t] %c{1} - %msg%n", PatternSelector=null, Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Replace=null, charset="null", alwaysWriteExceptions="null", disableAnsi="null", noConsoleNoAnsi="null", header="null", footer="null")
2023-09-13 18:34:19,516 main DEBUG PluginManager 'Converter' found 48 plugins
2023-09-13 18:34:19,530 main DEBUG Building Plugin[name=appender, class=org.apache.logging.log4j.core.appender.ConsoleAppender].
2023-09-13 18:34:19,544 main DEBUG ConsoleAppender$Builder(target="null", follow="null", direct="null", bufferedIo="null", bufferSize="null", immediateFlush="null", ignoreExceptions="null", PatternLayout([%-5level] %d{yyyy-MM-dd HH:mm:ss.SSS} [%t] %c{1} - %msg%n), name="LogToConsole", Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Filter=null, ={})
2023-09-13 18:34:19,547 main DEBUG Starting OutputStreamManager SYSTEM_OUT.false.false
2023-09-13 18:34:19,548 main DEBUG Building Plugin[name=layout, class=org.apache.logging.log4j.core.layout.PatternLayout].
2023-09-13 18:34:19,549 main DEBUG PatternLayout$Builder(pattern="[%-5level] %d{yyyy-MM-dd HH:mm:ss.SSS} [%t] %c{1} - %msg%n", PatternSelector=null, Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Replace=null, charset="null", alwaysWriteExceptions="null", disableAnsi="null", noConsoleNoAnsi="null", header="null", footer="null")
2023-09-13 18:34:19,550 main DEBUG Building Plugin[name=SizeBasedTriggeringPolicy, class=org.apache.logging.log4j.core.appender.rolling.SizeBasedTriggeringPolicy].
2023-09-13 18:34:19,556 main DEBUG createPolicy(size="10MB")
2023-09-13 18:34:19,558 main DEBUG Building Plugin[name=Policies, class=org.apache.logging.log4j.core.appender.rolling.CompositeTriggeringPolicy].
2023-09-13 18:34:19,559 main DEBUG createPolicy(={SizeBasedTriggeringPolicy(size=10485760)})
2023-09-13 18:34:19,559 main DEBUG Building Plugin[name=DefaultRolloverStrategy, class=org.apache.logging.log4j.core.appender.rolling.DefaultRolloverStrategy].
2023-09-13 18:34:19,563 main DEBUG DefaultRolloverStrategy$Builder(max="10", min="null", fileIndex="null", compressionLevel="null", ={}, stopCustomActionsOnError="null", tempCompressedFilePattern="null", Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml))
2023-09-13 18:34:19,564 main DEBUG Building Plugin[name=appender, class=org.apache.logging.log4j.core.appender.RollingFileAppender].
2023-09-13 18:34:19,567 main DEBUG RollingFileAppender$Builder(fileName="logs/app.log", filePattern="logs/${date:yyyy-MM}/app-%d{MM-dd-yyyy}-%i.log.gz", append="null", locking="null", Policies(CompositeTriggeringPolicy(policies=[SizeBasedTriggeringPolicy(size=10485760)])), DefaultRollOverStrategy(DefaultRolloverStrategy(min=1, max=10, useMax=true)), advertise="null", advertiseUri="null", createOnDemand="null", filePermissions="null", fileOwner="null", fileGroup="null", bufferedIo="null", bufferSize="null", immediateFlush="null", ignoreExceptions="null", PatternLayout([%-5level] %d{yyyy-MM-dd HH:mm:ss.SSS} [%t] %c{1} - %msg%n), name="LogToRollingFile", Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Filter=null, ={})
2023-09-13 18:34:19,575 main DEBUG Returning file creation time for /logs/app.log
2023-09-13 18:34:19,576 main DEBUG Starting RollingFileManager logs/app.log
2023-09-13 18:34:19,581 main DEBUG PluginManager 'FileConverter' found 2 plugins
2023-09-13 18:34:19,587 main DEBUG Setting prev file time to 2023-09-13T18:34:19.000+0000
2023-09-13 18:34:19,588 main DEBUG Initializing triggering policy CompositeTriggeringPolicy(policies=[SizeBasedTriggeringPolicy(size=10485760)])
2023-09-13 18:34:19,588 main DEBUG Initializing triggering policy SizeBasedTriggeringPolicy(size=10485760)
2023-09-13 18:34:19,589 main DEBUG Building Plugin[name=layout, class=org.apache.logging.log4j.core.layout.PatternLayout].
2023-09-13 18:34:19,590 main DEBUG PatternLayout$Builder(pattern="%d{yyyy-MM-dd HH:mm:ss.SSS} - %msg%n", PatternSelector=null, Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Replace=null, charset="null", alwaysWriteExceptions="null", disableAnsi="null", noConsoleNoAnsi="null", header="null", footer="null")
2023-09-13 18:34:19,592 main DEBUG Building Plugin[name=appender, class=org.apache.logging.log4j.core.appender.FileAppender].
2023-09-13 18:34:19,596 main DEBUG FileAppender$Builder(fileName="logs/dataIngestionError_13-09-2023-06-34.log", append="false", locking="null", advertise="null", advertiseUri="null", createOnDemand="null", filePermissions="null", fileOwner="null", fileGroup="null", bufferedIo="null", bufferSize="null", immediateFlush="null", ignoreExceptions="null", PatternLayout(%d{yyyy-MM-dd HH:mm:ss.SSS} - %msg%n), name="LogToGeoJsonSummaryFile", Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Filter=null, ={})
2023-09-13 18:34:19,598 main DEBUG Starting FileManager logs/dataIngestionError_13-09-2023-06-34.log
2023-09-13 18:34:19,599 main DEBUG Building Plugin[name=layout, class=org.apache.logging.log4j.core.layout.PatternLayout].
2023-09-13 18:34:19,600 main DEBUG PatternLayout$Builder(pattern="%d{yyyy-MM-dd HH:mm:ss.SSS} - %msg%n", PatternSelector=null, Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Replace=null, charset="null", alwaysWriteExceptions="null", disableAnsi="null", noConsoleNoAnsi="null", header="null", footer="null")
2023-09-13 18:34:19,601 main DEBUG Building Plugin[name=appender, class=org.apache.logging.log4j.core.appender.FileAppender].
2023-09-13 18:34:19,602 main DEBUG FileAppender$Builder(fileName="logs/trajectoryLog_13-09-2023-06-34.log", append="false", locking="null", advertise="null", advertiseUri="null", createOnDemand="null", filePermissions="null", fileOwner="null", fileGroup="null", bufferedIo="null", bufferSize="null", immediateFlush="null", ignoreExceptions="null", PatternLayout(%d{yyyy-MM-dd HH:mm:ss.SSS} - %msg%n), name="TrajectoryLog", Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Filter=null, ={})
2023-09-13 18:34:19,603 main DEBUG Starting FileManager logs/trajectoryLog_13-09-2023-06-34.log
2023-09-13 18:34:19,603 main DEBUG Building Plugin[name=appenders, class=org.apache.logging.log4j.core.config.AppendersPlugin].
2023-09-13 18:34:19,604 main DEBUG createAppenders(={LogToConsole, LogToRollingFile, LogToGeoJsonSummaryFile, TrajectoryLog})
2023-09-13 18:34:19,605 main DEBUG Building Plugin[name=AppenderRef, class=org.apache.logging.log4j.core.config.AppenderRef].
2023-09-13 18:34:19,606 main DEBUG createAppenderRef(ref="LogToRollingFile", level="null", Filter=null)
2023-09-13 18:34:19,606 main DEBUG Building Plugin[name=logger, class=org.apache.logging.log4j.core.config.LoggerConfig].
2023-09-13 18:34:19,609 main DEBUG LoggerConfig$Builder(additivity="false", level="INFO", levelAndRefs="null", name="org.osdu.gcz.transformer", includeLocation="null", ={LogToRollingFile}, ={}, Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Filter=null)
2023-09-13 18:34:19,611 main DEBUG Building Plugin[name=AppenderRef, class=org.apache.logging.log4j.core.config.AppenderRef].
2023-09-13 18:34:19,612 main DEBUG createAppenderRef(ref="LogToGeoJsonSummaryFile", level="null", Filter=null)
2023-09-13 18:34:19,612 main DEBUG Building Plugin[name=logger, class=org.apache.logging.log4j.core.config.LoggerConfig].
2023-09-13 18:34:19,613 main DEBUG LoggerConfig$Builder(additivity="false", level="INFO", levelAndRefs="null", name="geoJsonSummaryLog", includeLocation="null", ={LogToGeoJsonSummaryFile}, ={}, Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Filter=null)
2023-09-13 18:34:19,613 main DEBUG Building Plugin[name=AppenderRef, class=org.apache.logging.log4j.core.config.AppenderRef].
2023-09-13 18:34:19,614 main DEBUG createAppenderRef(ref="TrajectoryLog", level="null", Filter=null)
2023-09-13 18:34:19,615 main DEBUG Building Plugin[name=logger, class=org.apache.logging.log4j.core.config.LoggerConfig].
2023-09-13 18:34:19,616 main DEBUG LoggerConfig$Builder(additivity="false", level="INFO", levelAndRefs="null", name="trajectoryLog", includeLocation="null", ={TrajectoryLog}, ={}, Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Filter=null)
2023-09-13 18:34:19,616 main DEBUG Building Plugin[name=AppenderRef, class=org.apache.logging.log4j.core.config.AppenderRef].
2023-09-13 18:34:19,617 main DEBUG createAppenderRef(ref="LogToConsole", level="null", Filter=null)
2023-09-13 18:34:19,617 main DEBUG Building Plugin[name=AppenderRef, class=org.apache.logging.log4j.core.config.AppenderRef].
2023-09-13 18:34:19,618 main DEBUG createAppenderRef(ref="LogToRollingFile", level="null", Filter=null)
2023-09-13 18:34:19,618 main DEBUG Building Plugin[name=root, class=org.apache.logging.log4j.core.config.LoggerConfig$RootLogger].
2023-09-13 18:34:19,620 main DEBUG LoggerConfig$RootLogger$Builder(additivity="null", level="ERROR", levelAndRefs="null", includeLocation="null", ={LogToConsole, LogToRollingFile}, ={}, Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Filter=null)
2023-09-13 18:34:19,621 main DEBUG Building Plugin[name=loggers, class=org.apache.logging.log4j.core.config.LoggersPlugin].
2023-09-13 18:34:19,622 main DEBUG createLoggers(={org.osdu.gcz.transformer, geoJsonSummaryLog, trajectoryLog, root})
2023-09-13 18:34:19,625 main DEBUG Configuration YamlConfiguration[location=jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml] initialized
2023-09-13 18:34:19,625 main DEBUG Starting configuration YamlConfiguration[location=jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml]
2023-09-13 18:34:19,625 main DEBUG Started configuration YamlConfiguration[location=jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml] OK.
2023-09-13 18:34:19,626 main DEBUG Shutting down OutputStreamManager SYSTEM_OUT.false.false-1
2023-09-13 18:34:19,627 main DEBUG OutputStream closed
2023-09-13 18:34:19,627 main DEBUG Shut down OutputStreamManager SYSTEM_OUT.false.false-1, all resources released: true
2023-09-13 18:34:19,627 main DEBUG Appender DefaultConsole-1 stopped with status true
2023-09-13 18:34:19,628 main DEBUG Stopped org.apache.logging.log4j.core.config.DefaultConfiguration@7506e922 OK
2023-09-13 18:34:19,708 main DEBUG Registering MBean org.apache.logging.log4j2:type=31221be2
2023-09-13 18:34:19,712 main DEBUG Registering MBean org.apache.logging.log4j2:type=31221be2,component=StatusLogger
2023-09-13 18:34:19,713 main DEBUG Registering MBean org.apache.logging.log4j2:type=31221be2,component=ContextSelector
2023-09-13 18:34:19,715 main DEBUG Registering MBean org.apache.logging.log4j2:type=31221be2,component=Loggers,name=
2023-09-13 18:34:19,716 main DEBUG Registering MBean org.apache.logging.log4j2:type=31221be2,component=Loggers,name=trajectoryLog
2023-09-13 18:34:19,717 main DEBUG Registering MBean org.apache.logging.log4j2:type=31221be2,component=Loggers,name=org.osdu.gcz.transformer
2023-09-13 18:34:19,717 main DEBUG Registering MBean org.apache.logging.log4j2:type=31221be2,component=Loggers,name=geoJsonSummaryLog
2023-09-13 18:34:19,718 main DEBUG Registering MBean org.apache.logging.log4j2:type=31221be2,component=Appenders,name=LogToConsole
2023-09-13 18:34:19,719 main DEBUG Registering MBean org.apache.logging.log4j2:type=31221be2,component=Appenders,name=LogToGeoJsonSummaryFile
2023-09-13 18:34:19,720 main DEBUG Registering MBean org.apache.logging.log4j2:type=31221be2,component=Appenders,name=LogToRollingFile
2023-09-13 18:34:19,720 main DEBUG Registering MBean org.apache.logging.log4j2:type=31221be2,component=Appenders,name=TrajectoryLog
2023-09-13 18:34:19,724 main DEBUG org.apache.logging.log4j.core.util.SystemClock does not support precise timestamps.
2023-09-13 18:34:19,724 main DEBUG Reconfiguration complete for context[name=31221be2] at URI jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml (org.apache.logging.log4j.core.LoggerContext@449b2d27) with optional ClassLoader: null
2023-09-13 18:34:19,724 main DEBUG Shutdown hook enabled. Registering a new one.
2023-09-13 18:34:19,726 main DEBUG LoggerContext[name=31221be2, org.apache.logging.log4j.core.LoggerContext@449b2d27] started OK.
2023-09-13 18:34:20,593 main DEBUG Reconfiguration started for context[name=31221be2] at URI null (org.apache.logging.log4j.core.LoggerContext@449b2d27) with optional ClassLoader: null
2023-09-13 18:34:20,594 main DEBUG Using configurationFactory org.apache.logging.log4j.core.config.ConfigurationFactory$Factory@1c3a4799
2023-09-13 18:34:20,607 main DEBUG PluginManager 'Lookup' found 16 plugins
2023-09-13 18:34:20,612 main DEBUG Apache Log4j Core 2.17.2 initializing configuration YamlConfiguration[location=jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml]
2023-09-13 18:34:20,612 main DEBUG PluginManager 'Core' found 127 plugins
2023-09-13 18:34:20,613 main DEBUG PluginManager 'Level' found 0 plugins
2023-09-13 18:34:20,613 main DEBUG Processing node for object appenders
2023-09-13 18:34:20,613 main DEBUG Processing node for object Console
2023-09-13 18:34:20,614 main DEBUG Node name is of type STRING
2023-09-13 18:34:20,614 main DEBUG Processing node for object PatternLayout
2023-09-13 18:34:20,614 main DEBUG Node Pattern is of type STRING
2023-09-13 18:34:20,615 main DEBUG Returning PatternLayout with parent Console of type layout:class org.apache.logging.log4j.core.layout.PatternLayout
2023-09-13 18:34:20,615 main DEBUG Returning Console with parent appenders of type appender:class org.apache.logging.log4j.core.appender.ConsoleAppender
2023-09-13 18:34:20,616 main DEBUG Processing node for array RollingFile
2023-09-13 18:34:20,616 main DEBUG Processing RollingFile[0]
2023-09-13 18:34:20,616 main DEBUG Processing node for object PatternLayout
2023-09-13 18:34:20,617 main DEBUG Node pattern is of type STRING
2023-09-13 18:34:20,617 main DEBUG Returning PatternLayout with parent RollingFile of type layout:class org.apache.logging.log4j.core.layout.PatternLayout
2023-09-13 18:34:20,617 main DEBUG Processing node for object Policies
2023-09-13 18:34:20,618 main DEBUG Processing node for object SizeBasedTriggeringPolicy
2023-09-13 18:34:20,618 main DEBUG Node size is of type STRING
2023-09-13 18:34:20,619 main DEBUG Returning SizeBasedTriggeringPolicy with parent Policies of type SizeBasedTriggeringPolicy:class org.apache.logging.log4j.core.appender.rolling.SizeBasedTriggeringPolicy
2023-09-13 18:34:20,619 main DEBUG Returning Policies with parent RollingFile of type Policies:class org.apache.logging.log4j.core.appender.rolling.CompositeTriggeringPolicy
2023-09-13 18:34:20,620 main DEBUG Processing node for object DefaultRollOverStrategy
2023-09-13 18:34:20,620 main DEBUG Node max is of type NUMBER
2023-09-13 18:34:20,620 main DEBUG Returning DefaultRollOverStrategy with parent RollingFile of type DefaultRolloverStrategy:class org.apache.logging.log4j.core.appender.rolling.DefaultRolloverStrategy
2023-09-13 18:34:20,621 main DEBUG Processing node for array File
2023-09-13 18:34:20,621 main DEBUG Processing File[0]
2023-09-13 18:34:20,621 main DEBUG Processing node for object PatternLayout
2023-09-13 18:34:20,622 main DEBUG Node pattern is of type STRING
2023-09-13 18:34:20,622 main DEBUG Returning PatternLayout with parent File of type layout:class org.apache.logging.log4j.core.layout.PatternLayout
2023-09-13 18:34:20,623 main DEBUG Processing File[1]
2023-09-13 18:34:20,623 main DEBUG Processing node for object PatternLayout
2023-09-13 18:34:20,623 main DEBUG Node pattern is of type STRING
2023-09-13 18:34:20,624 main DEBUG Returning PatternLayout with parent File of type layout:class org.apache.logging.log4j.core.layout.PatternLayout
2023-09-13 18:34:20,624 main DEBUG Returning appenders with parent root of type appenders:class org.apache.logging.log4j.core.config.AppendersPlugin
2023-09-13 18:34:20,625 main DEBUG Processing node for object Loggers
2023-09-13 18:34:20,625 main DEBUG Processing node for array logger
2023-09-13 18:34:20,625 main DEBUG Processing logger[0]
2023-09-13 18:34:20,626 main DEBUG Processing array for object AppenderRef
2023-09-13 18:34:20,626 main DEBUG Node ref is of type STRING
2023-09-13 18:34:20,627 main DEBUG Returning AppenderRef with parent logger of type AppenderRef:class org.apache.logging.log4j.core.config.AppenderRef
2023-09-13 18:34:20,627 main DEBUG Processing logger[1]
2023-09-13 18:34:20,627 main DEBUG Processing array for object AppenderRef
2023-09-13 18:34:20,628 main DEBUG Node ref is of type STRING
2023-09-13 18:34:20,628 main DEBUG Returning AppenderRef with parent logger of type AppenderRef:class org.apache.logging.log4j.core.config.AppenderRef
2023-09-13 18:34:20,628 main DEBUG Processing logger[2]
2023-09-13 18:34:20,629 main DEBUG Processing array for object AppenderRef
2023-09-13 18:34:20,629 main DEBUG Node ref is of type STRING
2023-09-13 18:34:20,630 main DEBUG Returning AppenderRef with parent logger of type AppenderRef:class org.apache.logging.log4j.core.config.AppenderRef
2023-09-13 18:34:20,630 main DEBUG Processing node for object Root
2023-09-13 18:34:20,630 main DEBUG Node level is of type STRING
2023-09-13 18:34:20,631 main DEBUG Processing node for array AppenderRef
2023-09-13 18:34:20,631 main DEBUG Processing AppenderRef[0]
2023-09-13 18:34:20,631 main DEBUG Processing AppenderRef[1]
2023-09-13 18:34:20,632 main DEBUG Returning Root with parent Loggers of type root:class org.apache.logging.log4j.core.config.LoggerConfig$RootLogger
2023-09-13 18:34:20,632 main DEBUG Returning Loggers with parent root of type loggers:class org.apache.logging.log4j.core.config.LoggersPlugin
2023-09-13 18:34:20,633 main DEBUG Completed parsing configuration
2023-09-13 18:34:20,633 main DEBUG PluginManager 'Lookup' found 16 plugins
2023-09-13 18:34:20,634 main DEBUG Building Plugin[name=layout, class=org.apache.logging.log4j.core.layout.PatternLayout].
2023-09-13 18:34:20,634 main DEBUG PatternLayout$Builder(pattern="[%-5level] %d{yyyy-MM-dd HH:mm:ss.SSS} [%t] %c{1} - %msg%n", PatternSelector=null, Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Replace=null, charset="null", alwaysWriteExceptions="null", disableAnsi="null", noConsoleNoAnsi="null", header="null", footer="null")
2023-09-13 18:34:20,635 main DEBUG PluginManager 'Converter' found 48 plugins
2023-09-13 18:34:20,636 main DEBUG Building Plugin[name=appender, class=org.apache.logging.log4j.core.appender.ConsoleAppender].
2023-09-13 18:34:20,637 main DEBUG ConsoleAppender$Builder(target="null", follow="null", direct="null", bufferedIo="null", bufferSize="null", immediateFlush="null", ignoreExceptions="null", PatternLayout([%-5level] %d{yyyy-MM-dd HH:mm:ss.SSS} [%t] %c{1} - %msg%n), name="LogToConsole", Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Filter=null, ={})
2023-09-13 18:34:20,639 main DEBUG Building Plugin[name=layout, class=org.apache.logging.log4j.core.layout.PatternLayout].
2023-09-13 18:34:20,640 main DEBUG PatternLayout$Builder(pattern="[%-5level] %d{yyyy-MM-dd HH:mm:ss.SSS} [%t] %c{1} - %msg%n", PatternSelector=null, Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Replace=null, charset="null", alwaysWriteExceptions="null", disableAnsi="null", noConsoleNoAnsi="null", header="null", footer="null")
2023-09-13 18:34:20,640 main DEBUG Building Plugin[name=SizeBasedTriggeringPolicy, class=org.apache.logging.log4j.core.appender.rolling.SizeBasedTriggeringPolicy].
2023-09-13 18:34:20,641 main DEBUG createPolicy(size="10MB")
2023-09-13 18:34:20,641 main DEBUG Building Plugin[name=Policies, class=org.apache.logging.log4j.core.appender.rolling.CompositeTriggeringPolicy].
2023-09-13 18:34:20,642 main DEBUG createPolicy(={SizeBasedTriggeringPolicy(size=10485760)})
2023-09-13 18:34:20,642 main DEBUG Building Plugin[name=DefaultRolloverStrategy, class=org.apache.logging.log4j.core.appender.rolling.DefaultRolloverStrategy].
2023-09-13 18:34:20,643 main DEBUG DefaultRolloverStrategy$Builder(max="10", min="null", fileIndex="null", compressionLevel="null", ={}, stopCustomActionsOnError="null", tempCompressedFilePattern="null", Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml))
2023-09-13 18:34:20,643 main DEBUG Building Plugin[name=appender, class=org.apache.logging.log4j.core.appender.RollingFileAppender].
2023-09-13 18:34:20,645 main DEBUG RollingFileAppender$Builder(fileName="logs/app.log", filePattern="logs/${date:yyyy-MM}/app-%d{MM-dd-yyyy}-%i.log.gz", append="null", locking="null", Policies(CompositeTriggeringPolicy(policies=[SizeBasedTriggeringPolicy(size=10485760)])), DefaultRollOverStrategy(DefaultRolloverStrategy(min=1, max=10, useMax=true)), advertise="null", advertiseUri="null", createOnDemand="null", filePermissions="null", fileOwner="null", fileGroup="null", bufferedIo="null", bufferSize="null", immediateFlush="null", ignoreExceptions="null", PatternLayout([%-5level] %d{yyyy-MM-dd HH:mm:ss.SSS} [%t] %c{1} - %msg%n), name="LogToRollingFile", Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Filter=null, ={})
2023-09-13 18:34:20,645 main DEBUG PluginManager 'FileConverter' found 2 plugins
2023-09-13 18:34:20,646 main DEBUG Initializing triggering policy SizeBasedTriggeringPolicy(size=10485760)
2023-09-13 18:34:20,646 main DEBUG Building Plugin[name=layout, class=org.apache.logging.log4j.core.layout.PatternLayout].
2023-09-13 18:34:20,647 main DEBUG PatternLayout$Builder(pattern="%d{yyyy-MM-dd HH:mm:ss.SSS} - %msg%n", PatternSelector=null, Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Replace=null, charset="null", alwaysWriteExceptions="null", disableAnsi="null", noConsoleNoAnsi="null", header="null", footer="null")
2023-09-13 18:34:20,648 main DEBUG Building Plugin[name=appender, class=org.apache.logging.log4j.core.appender.FileAppender].
2023-09-13 18:34:20,649 main DEBUG FileAppender$Builder(fileName="logs/dataIngestionError_13-09-2023-06-34.log", append="false", locking="null", advertise="null", advertiseUri="null", createOnDemand="null", filePermissions="null", fileOwner="null", fileGroup="null", bufferedIo="null", bufferSize="null", immediateFlush="null", ignoreExceptions="null", PatternLayout(%d{yyyy-MM-dd HH:mm:ss.SSS} - %msg%n), name="LogToGeoJsonSummaryFile", Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Filter=null, ={})
2023-09-13 18:34:20,649 main DEBUG Building Plugin[name=layout, class=org.apache.logging.log4j.core.layout.PatternLayout].
2023-09-13 18:34:20,650 main DEBUG PatternLayout$Builder(pattern="%d{yyyy-MM-dd HH:mm:ss.SSS} - %msg%n", PatternSelector=null, Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Replace=null, charset="null", alwaysWriteExceptions="null", disableAnsi="null", noConsoleNoAnsi="null", header="null", footer="null")
2023-09-13 18:34:20,651 main DEBUG Building Plugin[name=appender, class=org.apache.logging.log4j.core.appender.FileAppender].
2023-09-13 18:34:20,652 main DEBUG FileAppender$Builder(fileName="logs/trajectoryLog_13-09-2023-06-34.log", append="false", locking="null", advertise="null", advertiseUri="null", createOnDemand="null", filePermissions="null", fileOwner="null", fileGroup="null", bufferedIo="null", bufferSize="null", immediateFlush="null", ignoreExceptions="null", PatternLayout(%d{yyyy-MM-dd HH:mm:ss.SSS} - %msg%n), name="TrajectoryLog", Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Filter=null, ={})
2023-09-13 18:34:20,652 main DEBUG Building Plugin[name=appenders, class=org.apache.logging.log4j.core.config.AppendersPlugin].
2023-09-13 18:34:20,652 main DEBUG createAppenders(={LogToConsole, LogToRollingFile, LogToGeoJsonSummaryFile, TrajectoryLog})
2023-09-13 18:34:20,653 main DEBUG Building Plugin[name=AppenderRef, class=org.apache.logging.log4j.core.config.AppenderRef].
2023-09-13 18:34:20,654 main DEBUG createAppenderRef(ref="LogToRollingFile", level="null", Filter=null)
2023-09-13 18:34:20,654 main DEBUG Building Plugin[name=logger, class=org.apache.logging.log4j.core.config.LoggerConfig].
2023-09-13 18:34:20,655 main DEBUG LoggerConfig$Builder(additivity="false", level="INFO", levelAndRefs="null", name="org.osdu.gcz.transformer", includeLocation="null", ={LogToRollingFile}, ={}, Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Filter=null)
2023-09-13 18:34:20,655 main DEBUG Building Plugin[name=AppenderRef, class=org.apache.logging.log4j.core.config.AppenderRef].
2023-09-13 18:34:20,656 main DEBUG createAppenderRef(ref="LogToGeoJsonSummaryFile", level="null", Filter=null)
2023-09-13 18:34:20,656 main DEBUG Building Plugin[name=logger, class=org.apache.logging.log4j.core.config.LoggerConfig].
2023-09-13 18:34:20,657 main DEBUG LoggerConfig$Builder(additivity="false", level="INFO", levelAndRefs="null", name="geoJsonSummaryLog", includeLocation="null", ={LogToGeoJsonSummaryFile}, ={}, Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Filter=null)
2023-09-13 18:34:20,657 main DEBUG Building Plugin[name=AppenderRef, class=org.apache.logging.log4j.core.config.AppenderRef].
2023-09-13 18:34:20,658 main DEBUG createAppenderRef(ref="TrajectoryLog", level="null", Filter=null)
2023-09-13 18:34:20,658 main DEBUG Building Plugin[name=logger, class=org.apache.logging.log4j.core.config.LoggerConfig].
2023-09-13 18:34:20,659 main DEBUG LoggerConfig$Builder(additivity="false", level="INFO", levelAndRefs="null", name="trajectoryLog", includeLocation="null", ={TrajectoryLog}, ={}, Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Filter=null)
2023-09-13 18:34:20,660 main DEBUG Building Plugin[name=AppenderRef, class=org.apache.logging.log4j.core.config.AppenderRef].
2023-09-13 18:34:20,660 main DEBUG createAppenderRef(ref="LogToConsole", level="null", Filter=null)
2023-09-13 18:34:20,661 main DEBUG Building Plugin[name=AppenderRef, class=org.apache.logging.log4j.core.config.AppenderRef].
2023-09-13 18:34:20,661 main DEBUG createAppenderRef(ref="LogToRollingFile", level="null", Filter=null)
2023-09-13 18:34:20,662 main DEBUG Building Plugin[name=root, class=org.apache.logging.log4j.core.config.LoggerConfig$RootLogger].
2023-09-13 18:34:20,662 main DEBUG LoggerConfig$RootLogger$Builder(additivity="null", level="ERROR", levelAndRefs="null", includeLocation="null", ={LogToConsole, LogToRollingFile}, ={}, Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Filter=null)
2023-09-13 18:34:20,663 main DEBUG Building Plugin[name=loggers, class=org.apache.logging.log4j.core.config.LoggersPlugin].
2023-09-13 18:34:20,663 main DEBUG createLoggers(={org.osdu.gcz.transformer, geoJsonSummaryLog, trajectoryLog, root})
2023-09-13 18:34:20,664 main DEBUG Configuration YamlConfiguration[location=jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml] initialized
2023-09-13 18:34:20,664 main DEBUG Starting configuration YamlConfiguration[location=jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml]
2023-09-13 18:34:20,665 main DEBUG Started configuration YamlConfiguration[location=jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml] OK.
2023-09-13 18:34:20,666 main DEBUG Appender TrajectoryLog stopped with status true
2023-09-13 18:34:20,666 main DEBUG Appender LogToRollingFile stopped with status true
2023-09-13 18:34:20,667 main DEBUG Appender LogToGeoJsonSummaryFile stopped with status true
2023-09-13 18:34:20,667 main DEBUG Appender LogToConsole stopped with status true
2023-09-13 18:34:20,668 main DEBUG Stopped YamlConfiguration[location=jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml] OK
2023-09-13 18:34:20,670 main DEBUG Registering MBean org.apache.logging.log4j2:type=31221be2
2023-09-13 18:34:20,671 main DEBUG Registering MBean org.apache.logging.log4j2:type=31221be2,component=StatusLogger
2023-09-13 18:34:20,671 main DEBUG Registering MBean org.apache.logging.log4j2:type=31221be2,component=ContextSelector
2023-09-13 18:34:20,672 main DEBUG Registering MBean org.apache.logging.log4j2:type=31221be2,component=Loggers,name=
2023-09-13 18:34:20,673 main DEBUG Registering MBean org.apache.logging.log4j2:type=31221be2,component=Loggers,name=trajectoryLog
2023-09-13 18:34:20,673 main DEBUG Registering MBean org.apache.logging.log4j2:type=31221be2,component=Loggers,name=org.osdu.gcz.transformer
2023-09-13 18:34:20,674 main DEBUG Registering MBean org.apache.logging.log4j2:type=31221be2,component=Loggers,name=geoJsonSummaryLog
2023-09-13 18:34:20,674 main DEBUG Registering MBean org.apache.logging.log4j2:type=31221be2,component=Appenders,name=LogToConsole
2023-09-13 18:34:20,675 main DEBUG Registering MBean org.apache.logging.log4j2:type=31221be2,component=Appenders,name=LogToGeoJsonSummaryFile
2023-09-13 18:34:20,676 main DEBUG Registering MBean org.apache.logging.log4j2:type=31221be2,component=Appenders,name=LogToRollingFile
2023-09-13 18:34:20,676 main DEBUG Registering MBean org.apache.logging.log4j2:type=31221be2,component=Appenders,name=TrajectoryLog
2023-09-13 18:34:20,677 main DEBUG Reconfiguration complete for context[name=31221be2] at URI jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml (org.apache.logging.log4j.core.LoggerContext@449b2d27) with optional ClassLoader: null
2023-09-13 18:34:21,283 main DEBUG Reconfiguration started for context[name=31221be2] at URI null (org.apache.logging.log4j.core.LoggerContext@449b2d27) with optional ClassLoader: null
2023-09-13 18:34:21,283 main DEBUG Using configurationFactory org.apache.logging.log4j.core.config.ConfigurationFactory$Factory@1c3a4799
2023-09-13 18:34:21,295 main DEBUG PluginManager 'Lookup' found 16 plugins
2023-09-13 18:34:21,300 main DEBUG Apache Log4j Core 2.17.2 initializing configuration YamlConfiguration[location=jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml]
2023-09-13 18:34:21,300 main DEBUG PluginManager 'Core' found 127 plugins
2023-09-13 18:34:21,301 main DEBUG PluginManager 'Level' found 0 plugins
2023-09-13 18:34:21,301 main DEBUG Processing node for object appenders
2023-09-13 18:34:21,301 main DEBUG Processing node for object Console
2023-09-13 18:34:21,302 main DEBUG Node name is of type STRING
2023-09-13 18:34:21,302 main DEBUG Processing node for object PatternLayout
2023-09-13 18:34:21,303 main DEBUG Node Pattern is of type STRING
2023-09-13 18:34:21,303 main DEBUG Returning PatternLayout with parent Console of type layout:class org.apache.logging.log4j.core.layout.PatternLayout
2023-09-13 18:34:21,303 main DEBUG Returning Console with parent appenders of type appender:class org.apache.logging.log4j.core.appender.ConsoleAppender
2023-09-13 18:34:21,304 main DEBUG Processing node for array RollingFile
2023-09-13 18:34:21,304 main DEBUG Processing RollingFile[0]
2023-09-13 18:34:21,304 main DEBUG Processing node for object PatternLayout
2023-09-13 18:34:21,305 main DEBUG Node pattern is of type STRING
2023-09-13 18:34:21,305 main DEBUG Returning PatternLayout with parent RollingFile of type layout:class org.apache.logging.log4j.core.layout.PatternLayout
2023-09-13 18:34:21,306 main DEBUG Processing node for object Policies
2023-09-13 18:34:21,306 main DEBUG Processing node for object SizeBasedTriggeringPolicy
2023-09-13 18:34:21,306 main DEBUG Node size is of type STRING
2023-09-13 18:34:21,307 main DEBUG Returning SizeBasedTriggeringPolicy with parent Policies of type SizeBasedTriggeringPolicy:class org.apache.logging.log4j.core.appender.rolling.SizeBasedTriggeringPolicy
2023-09-13 18:34:21,307 main DEBUG Returning Policies with parent RollingFile of type Policies:class org.apache.logging.log4j.core.appender.rolling.CompositeTriggeringPolicy
2023-09-13 18:34:21,307 main DEBUG Processing node for object DefaultRollOverStrategy
2023-09-13 18:34:21,307 main DEBUG Node max is of type NUMBER
2023-09-13 18:34:21,308 main DEBUG Returning DefaultRollOverStrategy with parent RollingFile of type DefaultRolloverStrategy:class org.apache.logging.log4j.core.appender.rolling.DefaultRolloverStrategy
2023-09-13 18:34:21,308 main DEBUG Processing node for array File
2023-09-13 18:34:21,308 main DEBUG Processing File[0]
2023-09-13 18:34:21,308 main DEBUG Processing node for object PatternLayout
2023-09-13 18:34:21,309 main DEBUG Node pattern is of type STRING
2023-09-13 18:34:21,309 main DEBUG Returning PatternLayout with parent File of type layout:class org.apache.logging.log4j.core.layout.PatternLayout
2023-09-13 18:34:21,309 main DEBUG Processing File[1]
2023-09-13 18:34:21,309 main DEBUG Processing node for object PatternLayout
2023-09-13 18:34:21,310 main DEBUG Node pattern is of type STRING
2023-09-13 18:34:21,310 main DEBUG Returning PatternLayout with parent File of type layout:class org.apache.logging.log4j.core.layout.PatternLayout
2023-09-13 18:34:21,310 main DEBUG Returning appenders with parent root of type appenders:class org.apache.logging.log4j.core.config.AppendersPlugin
2023-09-13 18:34:21,311 main DEBUG Processing node for object Loggers
2023-09-13 18:34:21,311 main DEBUG Processing node for array logger
2023-09-13 18:34:21,311 main DEBUG Processing logger[0]
2023-09-13 18:34:21,312 main DEBUG Processing array for object AppenderRef
2023-09-13 18:34:21,312 main DEBUG Node ref is of type STRING
2023-09-13 18:34:21,312 main DEBUG Returning AppenderRef with parent logger of type AppenderRef:class org.apache.logging.log4j.core.config.AppenderRef
2023-09-13 18:34:21,313 main DEBUG Processing logger[1]
2023-09-13 18:34:21,313 main DEBUG Processing array for object AppenderRef
2023-09-13 18:34:21,314 main DEBUG Node ref is of type STRING
2023-09-13 18:34:21,314 main DEBUG Returning AppenderRef with parent logger of type AppenderRef:class org.apache.logging.log4j.core.config.AppenderRef
2023-09-13 18:34:21,314 main DEBUG Processing logger[2]
2023-09-13 18:34:21,314 main DEBUG Processing array for object AppenderRef
2023-09-13 18:34:21,315 main DEBUG Node ref is of type STRING
2023-09-13 18:34:21,315 main DEBUG Returning AppenderRef with parent logger of type AppenderRef:class org.apache.logging.log4j.core.config.AppenderRef
2023-09-13 18:34:21,316 main DEBUG Processing node for object Root
2023-09-13 18:34:21,316 main DEBUG Node level is of type STRING
2023-09-13 18:34:21,316 main DEBUG Processing node for array AppenderRef
2023-09-13 18:34:21,316 main DEBUG Processing AppenderRef[0]
2023-09-13 18:34:21,317 main DEBUG Processing AppenderRef[1]
2023-09-13 18:34:21,317 main DEBUG Returning Root with parent Loggers of type root:class org.apache.logging.log4j.core.config.LoggerConfig$RootLogger
2023-09-13 18:34:21,317 main DEBUG Returning Loggers with parent root of type loggers:class org.apache.logging.log4j.core.config.LoggersPlugin
2023-09-13 18:34:21,318 main DEBUG Completed parsing configuration
2023-09-13 18:34:21,318 main DEBUG PluginManager 'Lookup' found 16 plugins
2023-09-13 18:34:21,319 main DEBUG Building Plugin[name=layout, class=org.apache.logging.log4j.core.layout.PatternLayout].
2023-09-13 18:34:21,320 main DEBUG PatternLayout$Builder(pattern="[%-5level] %d{yyyy-MM-dd HH:mm:ss.SSS} [%t] %c{1} - %msg%n", PatternSelector=null, Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Replace=null, charset="null", alwaysWriteExceptions="null", disableAnsi="null", noConsoleNoAnsi="null", header="null", footer="null")
2023-09-13 18:34:21,320 main DEBUG PluginManager 'Converter' found 48 plugins
2023-09-13 18:34:21,321 main DEBUG Building Plugin[name=appender, class=org.apache.logging.log4j.core.appender.ConsoleAppender].
2023-09-13 18:34:21,322 main DEBUG ConsoleAppender$Builder(target="null", follow="null", direct="null", bufferedIo="null", bufferSize="null", immediateFlush="null", ignoreExceptions="null", PatternLayout([%-5level] %d{yyyy-MM-dd HH:mm:ss.SSS} [%t] %c{1} - %msg%n), name="LogToConsole", Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Filter=null, ={})
2023-09-13 18:34:21,324 main DEBUG Building Plugin[name=layout, class=org.apache.logging.log4j.core.layout.PatternLayout].
2023-09-13 18:34:21,325 main DEBUG PatternLayout$Builder(pattern="[%-5level] %d{yyyy-MM-dd HH:mm:ss.SSS} [%t] %c{1} - %msg%n", PatternSelector=null, Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Replace=null, charset="null", alwaysWriteExceptions="null", disableAnsi="null", noConsoleNoAnsi="null", header="null", footer="null")
2023-09-13 18:34:21,326 main DEBUG Building Plugin[name=SizeBasedTriggeringPolicy, class=org.apache.logging.log4j.core.appender.rolling.SizeBasedTriggeringPolicy].
2023-09-13 18:34:21,326 main DEBUG createPolicy(size="10MB")
2023-09-13 18:34:21,327 main DEBUG Building Plugin[name=Policies, class=org.apache.logging.log4j.core.appender.rolling.CompositeTriggeringPolicy].
2023-09-13 18:34:21,327 main DEBUG createPolicy(={SizeBasedTriggeringPolicy(size=10485760)})
2023-09-13 18:34:21,328 main DEBUG Building Plugin[name=DefaultRolloverStrategy, class=org.apache.logging.log4j.core.appender.rolling.DefaultRolloverStrategy].
2023-09-13 18:34:21,328 main DEBUG DefaultRolloverStrategy$Builder(max="10", min="null", fileIndex="null", compressionLevel="null", ={}, stopCustomActionsOnError="null", tempCompressedFilePattern="null", Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml))
2023-09-13 18:34:21,329 main DEBUG Building Plugin[name=appender, class=org.apache.logging.log4j.core.appender.RollingFileAppender].
2023-09-13 18:34:21,330 main DEBUG RollingFileAppender$Builder(fileName="logs/app.log", filePattern="logs/${date:yyyy-MM}/app-%d{MM-dd-yyyy}-%i.log.gz", append="null", locking="null", Policies(CompositeTriggeringPolicy(policies=[SizeBasedTriggeringPolicy(size=10485760)])), DefaultRollOverStrategy(DefaultRolloverStrategy(min=1, max=10, useMax=true)), advertise="null", advertiseUri="null", createOnDemand="null", filePermissions="null", fileOwner="null", fileGroup="null", bufferedIo="null", bufferSize="null", immediateFlush="null", ignoreExceptions="null", PatternLayout([%-5level] %d{yyyy-MM-dd HH:mm:ss.SSS} [%t] %c{1} - %msg%n), name="LogToRollingFile", Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Filter=null, ={})
2023-09-13 18:34:21,330 main DEBUG PluginManager 'FileConverter' found 2 plugins
2023-09-13 18:34:21,331 main DEBUG Initializing triggering policy SizeBasedTriggeringPolicy(size=10485760)
2023-09-13 18:34:21,331 main DEBUG Building Plugin[name=layout, class=org.apache.logging.log4j.core.layout.PatternLayout].
2023-09-13 18:34:21,332 main DEBUG PatternLayout$Builder(pattern="%d{yyyy-MM-dd HH:mm:ss.SSS} - %msg%n", PatternSelector=null, Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Replace=null, charset="null", alwaysWriteExceptions="null", disableAnsi="null", noConsoleNoAnsi="null", header="null", footer="null")
2023-09-13 18:34:21,332 main DEBUG Building Plugin[name=appender, class=org.apache.logging.log4j.core.appender.FileAppender].
2023-09-13 18:34:21,333 main DEBUG FileAppender$Builder(fileName="logs/dataIngestionError_13-09-2023-06-34.log", append="false", locking="null", advertise="null", advertiseUri="null", createOnDemand="null", filePermissions="null", fileOwner="null", fileGroup="null", bufferedIo="null", bufferSize="null", immediateFlush="null", ignoreExceptions="null", PatternLayout(%d{yyyy-MM-dd HH:mm:ss.SSS} - %msg%n), name="LogToGeoJsonSummaryFile", Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Filter=null, ={})
2023-09-13 18:34:21,333 main DEBUG Building Plugin[name=layout, class=org.apache.logging.log4j.core.layout.PatternLayout].
2023-09-13 18:34:21,334 main DEBUG PatternLayout$Builder(pattern="%d{yyyy-MM-dd HH:mm:ss.SSS} - %msg%n", PatternSelector=null, Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Replace=null, charset="null", alwaysWriteExceptions="null", disableAnsi="null", noConsoleNoAnsi="null", header="null", footer="null")
2023-09-13 18:34:21,334 main DEBUG Building Plugin[name=appender, class=org.apache.logging.log4j.core.appender.FileAppender].
2023-09-13 18:34:21,335 main DEBUG FileAppender$Builder(fileName="logs/trajectoryLog_13-09-2023-06-34.log", append="false", locking="null", advertise="null", advertiseUri="null", createOnDemand="null", filePermissions="null", fileOwner="null", fileGroup="null", bufferedIo="null", bufferSize="null", immediateFlush="null", ignoreExceptions="null", PatternLayout(%d{yyyy-MM-dd HH:mm:ss.SSS} - %msg%n), name="TrajectoryLog", Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Filter=null, ={})
2023-09-13 18:34:21,336 main DEBUG Building Plugin[name=appenders, class=org.apache.logging.log4j.core.config.AppendersPlugin].
2023-09-13 18:34:21,336 main DEBUG createAppenders(={LogToConsole, LogToRollingFile, LogToGeoJsonSummaryFile, TrajectoryLog})
2023-09-13 18:34:21,336 main DEBUG Building Plugin[name=AppenderRef, class=org.apache.logging.log4j.core.config.AppenderRef].
2023-09-13 18:34:21,337 main DEBUG createAppenderRef(ref="LogToRollingFile", level="null", Filter=null)
2023-09-13 18:34:21,338 main DEBUG Building Plugin[name=logger, class=org.apache.logging.log4j.core.config.LoggerConfig].
2023-09-13 18:34:21,338 main DEBUG LoggerConfig$Builder(additivity="false", level="INFO", levelAndRefs="null", name="org.osdu.gcz.transformer", includeLocation="null", ={LogToRollingFile}, ={}, Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Filter=null)
2023-09-13 18:34:21,339 main DEBUG Building Plugin[name=AppenderRef, class=org.apache.logging.log4j.core.config.AppenderRef].
2023-09-13 18:34:21,339 main DEBUG createAppenderRef(ref="LogToGeoJsonSummaryFile", level="null", Filter=null)
2023-09-13 18:34:21,340 main DEBUG Building Plugin[name=logger, class=org.apache.logging.log4j.core.config.LoggerConfig].
2023-09-13 18:34:21,340 main DEBUG LoggerConfig$Builder(additivity="false", level="INFO", levelAndRefs="null", name="geoJsonSummaryLog", includeLocation="null", ={LogToGeoJsonSummaryFile}, ={}, Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Filter=null)
2023-09-13 18:34:21,340 main DEBUG Building Plugin[name=AppenderRef, class=org.apache.logging.log4j.core.config.AppenderRef].
2023-09-13 18:34:21,341 main DEBUG createAppenderRef(ref="TrajectoryLog", level="null", Filter=null)
2023-09-13 18:34:21,341 main DEBUG Building Plugin[name=logger, class=org.apache.logging.log4j.core.config.LoggerConfig].
2023-09-13 18:34:21,342 main DEBUG LoggerConfig$Builder(additivity="false", level="INFO", levelAndRefs="null", name="trajectoryLog", includeLocation="null", ={TrajectoryLog}, ={}, Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Filter=null)
2023-09-13 18:34:21,342 main DEBUG Building Plugin[name=AppenderRef, class=org.apache.logging.log4j.core.config.AppenderRef].
2023-09-13 18:34:21,343 main DEBUG createAppenderRef(ref="LogToConsole", level="null", Filter=null)
2023-09-13 18:34:21,343 main DEBUG Building Plugin[name=AppenderRef, class=org.apache.logging.log4j.core.config.AppenderRef].
2023-09-13 18:34:21,344 main DEBUG createAppenderRef(ref="LogToRollingFile", level="null", Filter=null)
2023-09-13 18:34:21,344 main DEBUG Building Plugin[name=root, class=org.apache.logging.log4j.core.config.LoggerConfig$RootLogger].
2023-09-13 18:34:21,345 main DEBUG LoggerConfig$RootLogger$Builder(additivity="null", level="ERROR", levelAndRefs="null", includeLocation="null", ={LogToConsole, LogToRollingFile}, ={}, Configuration(jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml), Filter=null)
2023-09-13 18:34:21,346 main DEBUG Building Plugin[name=loggers, class=org.apache.logging.log4j.core.config.LoggersPlugin].
2023-09-13 18:34:21,346 main DEBUG createLoggers(={org.osdu.gcz.transformer, geoJsonSummaryLog, trajectoryLog, root})
2023-09-13 18:34:21,346 main DEBUG Configuration YamlConfiguration[location=jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml] initialized
2023-09-13 18:34:21,347 main DEBUG Starting configuration YamlConfiguration[location=jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml]
2023-09-13 18:34:21,347 main DEBUG Started configuration YamlConfiguration[location=jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml] OK.
2023-09-13 18:34:21,348 main DEBUG Appender TrajectoryLog stopped with status true
2023-09-13 18:34:21,349 main DEBUG Appender LogToRollingFile stopped with status true
2023-09-13 18:34:21,349 main DEBUG Appender LogToGeoJsonSummaryFile stopped with status true
2023-09-13 18:34:21,350 main DEBUG Appender LogToConsole stopped with status true
2023-09-13 18:34:21,350 main DEBUG Stopped YamlConfiguration[location=jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml] OK
2023-09-13 18:34:21,352 main DEBUG Registering MBean org.apache.logging.log4j2:type=31221be2
2023-09-13 18:34:21,353 main DEBUG Registering MBean org.apache.logging.log4j2:type=31221be2,component=StatusLogger
2023-09-13 18:34:21,353 main DEBUG Registering MBean org.apache.logging.log4j2:type=31221be2,component=ContextSelector
2023-09-13 18:34:21,354 main DEBUG Registering MBean org.apache.logging.log4j2:type=31221be2,component=Loggers,name=
2023-09-13 18:34:21,355 main DEBUG Registering MBean org.apache.logging.log4j2:type=31221be2,component=Loggers,name=trajectoryLog
2023-09-13 18:34:21,355 main DEBUG Registering MBean org.apache.logging.log4j2:type=31221be2,component=Loggers,name=org.osdu.gcz.transformer
2023-09-13 18:34:21,356 main DEBUG Registering MBean org.apache.logging.log4j2:type=31221be2,component=Loggers,name=geoJsonSummaryLog
2023-09-13 18:34:21,356 main DEBUG Registering MBean org.apache.logging.log4j2:type=31221be2,component=Appenders,name=LogToConsole
2023-09-13 18:34:21,357 main DEBUG Registering MBean org.apache.logging.log4j2:type=31221be2,component=Appenders,name=LogToGeoJsonSummaryFile
2023-09-13 18:34:21,357 main DEBUG Registering MBean org.apache.logging.log4j2:type=31221be2,component=Appenders,name=LogToRollingFile
2023-09-13 18:34:21,358 main DEBUG Registering MBean org.apache.logging.log4j2:type=31221be2,component=Appenders,name=TrajectoryLog
2023-09-13 18:34:21,358 main DEBUG Reconfiguration complete for context[name=31221be2] at URI jar:file:/app.jar!/BOOT-INF/classes!/log4j2.yml (org.apache.logging.log4j.core.LoggerContext@449b2d27) with optional ClassLoader: null
. ____ _ __ _ _
/\\ / ___'_ __ _ _(_)_ __ __ _ \ \ \ \
( ( )\___ | '_ | '_| | '_ \/ _` | \ \ \ \
\\/ ___)| |_)| | | | | || (_| | ) ) ) )
' |____| .__|_| |_|_| |_\__, | / / / /
=========|_|==============|___/=/_/_/_/
:: Spring Boot :: (v2.7.10)
2023-09-13 18:34:21,705 main DEBUG AsyncLogger.ThreadNameStrategy=UNCACHED (user specified null, default is UNCACHED)
2023-09-13 18:34:21,705 main DEBUG org.apache.logging.log4j.core.util.SystemClock does not support precise timestamps.
[18:34:23] (wrn) Failed to resolve IGNITE_HOME automatically for class codebase [class=class o.a.i.i.util.IgniteUtils, e=URI is not hierarchical]
Console logging handler is not configured.
[18:34:23] __________ ________________
[18:34:23] / _/ ___/ |/ / _/_ __/ __/
[18:34:23] _/ // (7 7 // / / / / _/
[18:34:23] /___/\___/_/|_/___/ /_/ /___/
[18:34:23]
[18:34:23] ver. 8.8.13#20211223-sha1:80557a10
[18:34:23] 2021 Copyright(C) GridGain Systems, Inc. and Contributors
[18:34:23]
[18:34:23] Ignite documentation: http://gridgain.com
[18:34:23]
[18:34:23] Quiet mode.
[18:34:23] ^-- Logging by 'JavaLogger [quiet=true, config=null]'
[18:34:23] ^-- To see **FULL** console log here add -DIGNITE_QUIET=false or "-v" to ignite.{sh|bat}
[18:34:23]
[18:34:23] OS: Linux 5.4.0-1091-azure amd64
[18:34:23] VM information: OpenJDK Runtime Environment 1.8.0_212-b04 IcedTea OpenJDK 64-Bit Server VM 25.212-b04
[18:34:23] Please set system property '-Djava.net.preferIPv4Stack=true' to avoid possible problems in mixed environments.
[18:34:23] Configured plugins:
[18:34:23] ^-- None
[18:34:23]
[18:34:23] Configured failure handler: [hnd=StopNodeOrHaltFailureHandler [tryStop=false, timeout=0, super=AbstractFailureHandler [ignoredFailureTypes=UnmodifiableSet [SYSTEM_WORKER_BLOCKED, SYSTEM_CRITICAL_OPERATION_TIMEOUT]]]]
[18:34:24] Message queue limit is set to 0 which may lead to potential OOMEs when running cache operations in FULL_ASYNC or PRIMARY_SYNC modes due to message queues growth on sender and receiver sides.
[18:34:24] Security status [authentication=off, tls/ssl=off]
[18:34:25] REST protocols do not start on client node. To start the protocols on client node set '-DIGNITE_REST_START_ON_CLIENT=true' system property.

kubernetes deployment docs osdu-istio does not configure GCZ deployment for istio enabled environments, it appears to create new istio-system namespace instead (2023-09-22, Benjamin LaGrone) - https://community.opengroup.org/osdu/platform/consumption/geospatial/-/issues/295

in docs folder geospatial/docs/deployment/kubernetes/osdu-istio
provided chart does not seem to configure GCZ deployment for istio enabled environments; instead it appears to create a new istio-system namespace. I'm not sure what the intent was, but this would likely fail to deploy where istio is already deployed/enabled.
Could we have some clarification on the intent?
It would be helpful to be able to deploy into an environment where istio is already deployed.

Azure Kubernetes with istio Helm Chart ignite (Gridgain) service deployment template seems to be missing (2023-09-22, Benjamin LaGrone) - https://community.opengroup.org/osdu/platform/consumption/geospatial/-/issues/294

02f2760524014f5d634077420102883d761d7a8b had helm template `docs/deployment/kubernetes/helm-charts/templates/service.yaml`, which was deleted in 47ad25719f658a3d71fd10601d4e51c8d1f27fd4.
Without this template, Ignite/Gridgain has no service attached to the pod.
I restored the file in my local repo and was able to deploy successfully; however, I can see no internal communication between pods.
In our configuration we have the istio sidecar deployed in the cluster.

WPC - MasterData GroupType Change (2023-09-13, Chad Leong) - https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/well-delivery/well-delivery/-/issues/20

Hi, we have this upcoming planned change - I suspect this will affect the Well Delivery DDMS. Please be informed.
#### M21 (v0.24.0) Change Warning
OSDU uses group-type classifications for the entities. The definitions for the group-types are provided in
the [Schema Usage Guide](https://community.opengroup.org/osdu/data/data-definitions/-/blob/v0.22.0/Guides/Chapters/02-GroupType.md#2-group-type).
Over time some of the entities' group-type classifications have been challenged. The following types appear in the wrong
group-type:
1. Reports are seen as non-tangible state descriptions/snapshots
1. master-data--FluidsReport proposed migrated to → work-product-component--WellFluidsReport.
2. master-data--OperationsReport proposed migrated to → work-product-component--WellOperationsReport.
2. Tubulars - when exchanged they are often transported using datasets/files, but the data themselves are tangible and
associated with an investment making them master-data.
1. work-product-component--TubularAssembly proposed migrated to → master-data--TubularAssembly.
2. work-product-component--TubularComponent proposed migrated to → master-data--TubularComponent.
3. work-product-component--TubularExternalComponent proposed migrated to →
master-data--TubularExternalComponent.
Obviously, this will impact operators who have ingested a large number of such data/records. This advance notice is
intended to prepare for this change and/or engage with data definitions, specifically
the [Well Delivery work-stream](https://opensdu.slack.com/archives/CL7MK8KMW), to influence the implementation of the
change.
There is a [schema preview including documentation provided](https://community.opengroup.org/osdu/data/data-definitions/-/blob/368b3a703581bb2d4a19210c935f922d9b3a4f4a/E-R/ChangeReport.md#snapshot-2023-08-18-towards-m21) in the Data Definitions community mirror.

[ADR] Synching SDMS V3 datasets in SDMS V4 (2024-02-28, Diego Molteni) - https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/112

# Introduction
We need a solution for making datasets ingested in SDMS V3 visible to, and consumable by, SDMS V4.
The purpose of this ADR is to describe how to enable a synchronization mechanism that allows users of SDMS V4 to consume seismic dataset entities ingested in SDMS V3 via client applications, even though the two versions of the system have entirely different architectures.
# Status
* [x] Initiated
* [x] Proposed
* [ ] Under Review
* [ ] Approved
* [ ] Rejected
# Problem statement
The Seismic Data Management Service V4 (SDMS V4) stores and manages data types as defined by the Open Subsurface Data Universe (OSDU) Authority. The APIs (Application Programming Interfaces) provide robust data type checks and are fully integrated with the OSDU policy service. The goal is to minimize ambiguity in the authorization model and facilitate straightforward adoption through a consistent usage pattern. In contrast, the V3 version of the service defines, saves, and manages proprietary metadata records, interacts directly with the entitlement service, and organizes records into collections/data-groups named subprojects.
<div align="center">
<br/><img src="/uploads/5e1a58219ca35be9da530b0eba2ed9fa/arch-diagram.png"
alt="sdms-architectural-diagram"
style="display: block; margin: 0 auto;"/><br/>
</div>
The key difference between the two versions of the service lies in how the cloud storage URI is generated. In SDMS V4 it is derived from the record-id value, while in SDMS V3 the generated URI is a random UUID.
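This difference can be illustrated with a minimal sketch (hypothetical, not the services' actual code; the hash-based derivation shown for V4 is an assumption, the point being deterministic versus random):

```python
import hashlib
import uuid


def v4_storage_uri(bucket: str, record_id: str) -> str:
    # Hypothetical sketch: V4 derives the object key deterministically
    # from the record id, so the URI can always be reconstructed.
    key = hashlib.sha256(record_id.encode("utf-8")).hexdigest()
    return f"{bucket}/{key}"


def v3_storage_uri(bucket: str) -> str:
    # V3 assigns a random UUID at ingestion time, so the URI cannot be
    # reconstructed from the record and must be looked up in the V3 journal.
    return f"{bucket}/{uuid.uuid4()}"
```

This is why SDMS V4 can resolve its own datasets directly, while V3 datasets require a journal lookup.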
# Proposed solution
Update SDMS V4 by adding the capability to correctly retrieve the storage location for the dataset's bulk data if the dataset was ingested via SDMS V3.
## Scenarios
When a dataset is ingested in SDMS V3 from a seismic application, the latter also creates an OSDU Bulk record linked to a Work Product Component, as shown in the following diagram:
<div align="center">
<br/><img src="/uploads/3d73191098963a80675c2ed6e96472cc/image.png"
alt="sdms-architectural-diagram"
style="display: block; margin: 0 auto; height: 30%; width: 30%" /><br/>
</div>
The seismic application saves the SDMS V3 URI (also known as the `sdpath`) in the `FileSourceInfo` property of the created OSDU Bulk record. This is done to later facilitate communication of the URI to SDMS V3 for retrieving the storage connection string required to access the dataset's bulk data.
### Example of SDMS V3 dataset metadata
```json
{
  "name": "test-data.zgy",
  "tenant": "partition",
  "subproject": "subproject",
  "path": "/",
  "ltag": "test-legal",
  "created_by": "test-user@slb.com",
  "last_modified_date": "Tue Sep 12 2023 11:04:29 GMT+0000 (Coordinated Universal Time)",
  "created_date": "Tue Sep 12 10:54:10 GMT+0000 (Coordinated Universal Time)",
  "gcsurl": "ss-weu-xkz32bjwg2425gn/bdf36c8a-3c62-3151-12b7-227af4727520",
  "ctag": "sMTz0oWeId1nOnrx",
  "readonly": true,
  "sbit": null,
  "sbit_count": 0,
  "filemetadata": {
    "type": "GENERIC",
    "size": 1544552448,
    "nobjects": 47
  },
  "seismicmeta_guid": "partition:work-product-component--SeismicTraceData:326bac9a-1fb2-5c73-9c64-6ca122c5025a",
  "access_policy": "uniform"
}
```
### Example of OSDU storage associated Work Product Component
```json
{
  "id": "partition:work-product-component--SeismicTraceData:326bac9a-1fb2-5c73-9c64-6ca122c5025",
  "kind": "osdu:wks:work-product-component--SeismicTraceData:1.3.0",
  "version": 1685099234631439,
  "acl": {
    "viewers": [
      "data.test@domain.slb.com"
    ],
    "owners": [
      "data.test@domain.com"
    ]
  },
  "legal": {
    "legaltags": [
      "test-legal"
    ],
    "otherRelevantDataCountries": [
      "US"
    ],
    "status": "compliant"
  },
  "data": {
    "BinGridID": "partition:work-product-component--SeismicBinGrid:2a714f2b12aa346d16a08c5a2f4e157e:",
    "Datasets": [
      "partition:dataset--FileCollection.Slb.OpenZGY:1de532c2-4d1b-5316-ba4a-422342321d55"
    ],
    "DDMSDatasets": [
      "urn:dataset--FileCollection.Slb.OpenZGY:1de532c2-4d1b-5316-ba4a-422342321d55"
    ],
    "Name": "test-data.zgy",
    "Source": "osdu",
    "SubmitterName": "test-user@domain.com"
  },
  "createUser": "test-user@domain.com",
  "createTime": "2023-09-12T11:04:30.321Z",
  "modifyUser": "test-user@domain.com",
  "modifyTime": "2023-09-12T18:09:12.703Z"
}
```
### Example of OSDU storage associated File Collection
```json
{
"id": "partition:dataset--FileCollection.Slb.OpenZGY:1de532c2-4d1b-5316-ba4a-422342321d55",
"version": "4426199321664216",
"kind": "osdu:wks:dataset--FileCollection.Slb.OpenZGY:1.0.0",
"acl": {
"viewers": [
"data.test@domain.slb.com"
],
"owners": [
"data.test@domain.com"
]
},
"legal": {
"legaltags": [
"test-legal"
],
"otherRelevantDataCountries": [
"US"
],
"status": "compliant"
},
"createUser": "test-user@domain.com",
"createTime": "2023-09-12T11:04:02.705Z",
"data": {
"Endian": "BIG",
"SEGYRevision": "rev 1",
"TotalSize": "1544552448",
"Name": "test-data.zgy",
"DatasetProperties": {
"FileCollectionPath": "sd://tenant/subproject/",
"FileSourceInfos": [
{
"FileSource": "test-data.zgy",
"Name": "test-data.zgy",
"FileSize": "1544552448",
}
]
}
}
}
```
## Proposed Solution
To enable applications to access bulk datasets ingested in SDMS V3 through SDMS V4, we need to update the mechanism in SDMS V4 for retrieving the correct storage URI associated with the Bulk record. This update is necessary to generate a valid connection string for accessing the bulk data.
When a Bulk record is created, the SDMS V3 URI (also known as the `sdpath`) is typically saved in the `FileCollectionPath` and `FileSource` properties. In the most common scenarios, the `sd://tenant/subproject/path` portion of the URI is stored in the `FileCollectionPath` property, while the URI's name is stored in the `FileSource` property.
When a connection access string is requested for a Bulk record through SDMS V4, the service should detect whether the record's file source refers to a V3 dataset's URI. If so, the service should then:
1. extract the `subproject` name from the `FileCollectionPath`
```python
subproject = record.data.DatasetProperties.FileCollectionPath.replace("sd://", "").split('/')[1]
```
2. extract the `path` from the `FileCollectionPath`
```python
path = ("/" + "/".join(record.data.DatasetProperties.FileCollectionPath.replace("sd://", "").split('/')[2:])).replace("//", "/")
```
3. extract the `name` from the `FileSource`
```python
name = record.data.DatasetProperties.FileSourceInfos[0].FileSource
```
4. retrieve the storage URL from the V3 journal
```sql
SELECT c.data.gcsurl
FROM c
WHERE
c.data.subproject="{subproject}"
AND c.data.path="{path}"
AND c.data.name="{name}"
```
5. generate the connection string using the retrieved storage URL
```python
storage_client = StorageClient("{storage-url}")
return storage_client.getConnectionString()
```
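The extraction steps 1-3 above can be combined into a single helper. This is a sketch assuming the record shape from the examples above and plain dict access; the journal query and storage client from steps 4-5 are left out, since they depend on cloud-specific APIs:

```python
def parse_v3_location(record: dict) -> tuple:
    """Extract (subproject, path, name) from a record whose
    FileCollectionPath holds sd://tenant/subproject/path and whose
    FileSource holds the dataset name (the most common scenario)."""
    props = record["data"]["DatasetProperties"]
    parts = props["FileCollectionPath"].replace("sd://", "").split("/")
    subproject = parts[1]
    # Re-join everything after tenant/subproject, dropping empty segments
    # left by trailing or doubled slashes.
    path = "/" + "/".join(p for p in parts[2:] if p)
    name = props["FileSourceInfos"][0]["FileSource"]
    return subproject, path, name
```

For the worked example above (`FileCollectionPath` = `sd://tenant/subproject/`, `FileSource` = `test-data.zgy`) this yields `("subproject", "/", "test-data.zgy")`, matching the `subproject`, `path`, and `name` fields of the V3 dataset metadata that the journal query filters on.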
#### Notes
Seismic applications use different approaches to save the SDMS V3 URI in the Bulk record, and all these cases should be considered:
1. The sd://tenant/subproject/path is saved in the `FileCollectionPath`, and the name is saved in `FileSource`.
2. The full sd://tenant/subproject/path/name URI is saved in both `FileCollectionPath` and `FileSource`.
3. The sd://tenant/subproject/path URI is saved in `FileCollectionPath`, and the name in `FileSource`, but the latter starts with the `./` prefix (which should be removed).
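These three variants could be normalized to a single full sd:// URI before parsing, along these lines (a sketch under the stated assumptions, not the service's actual implementation):

```python
def normalize_source(file_collection_path: str, file_source: str) -> str:
    """Return the full sd:// URI of the dataset, covering the three
    variants: (1) path and name split across the two properties,
    (2) the full URI duplicated in both, (3) as (1) but with the
    name carrying a './' prefix that must be stripped."""
    name = file_source
    if name.startswith("sd://"):
        # Variant 2: FileSource already holds the full URI.
        return name
    if name.startswith("./"):
        # Variant 3: strip the special './' prefix.
        name = name[2:]
    # Variant 1: join path and name without doubling the slash.
    return file_collection_path.rstrip("/") + "/" + name
```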
### Limitations
Applications that do not match the described flow should be reviewed with the application owner before defining the right strategy to enable the synchronization of datasets ingested in SDMS V3 with SDMS V4.

Milestone: M22 - Release 0.25. Assignees: Sacha Brants, Sneha Poddar.

Add multipartition support (2023-09-13, Yan Sushchynski (EPAM)) - https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/wellbore/lib/wellbore-cloud/wellbore-gcp-lib/-/issues/1

There is no multipartition support for the `GC` implementation. It causes a few problems:
1. We are forced to specify a GC project id
2. We have to use default bucket names
3. We can't separate the data in different partitions