# OSDU Software issues
https://community.opengroup.org/groups/osdu/-/issues

## Spotbugs Failing in some Services with Out of Memory Exception
https://community.opengroup.org/osdu/platform/ci-cd-pipelines/-/issues/23 · 2021-07-06 · David Diederich (d.diederich@opengroup.org)

Some services, such as [Partition](https://community.opengroup.org/osdu/platform/system/partition/-/jobs/404507), fail in the Spotbugs step. If re-run with `SECURE_LOG_LEVEL` set to `"debug"`, we see that the [failure](https://community.opengroup.org/osdu/platform/system/partition/-/jobs/404635#L878) is a `java.lang.OutOfMemoryError`.
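One possible mitigation, sketched below for `.gitlab-ci.yml`: give the SpotBugs SAST analyzer more heap. `spotbugs-sast` is the job name in GitLab's SAST template, and `JAVA_OPTS` is documented as extra arguments for the analyzer's `java` process; whether it overrides the computed `-Xmx` value should be verified, so treat this sketch as an assumption:

```yaml
include:
  - template: Security/SAST.gitlab-ci.yml

spotbugs-sast:
  variables:
    SECURE_LOG_LEVEL: "debug"   # keep debug logging to confirm the heap actually changes
    JAVA_OPTS: "-Xmx4g"         # assumption: raises the analyzer JVM heap above 1900M
```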
From that [same debug output](https://community.opengroup.org/osdu/platform/system/partition/-/jobs/404635#L868), Spotbugs is run with `java -Xmx1900M`.

Milestone: M7 - Release 0.10 · Assignee: David Diederich

## CSV parser not deleting the workflow that is registered after integration tests
https://community.opengroup.org/osdu/platform/data-flow/ingestion/csv-parser/csv-parser/-/issues/14 · 2021-07-01 · Kishore Battula

The CSV Parser on Azure registers the parser through the Workflow service's register-workflow API. After the integration tests, the registered workflow must be deleted; otherwise each run creates new CSV workflows in the system, which slows down Airflow as it loads a huge number of DAGs at runtime.

Milestone: M7 - Release 0.10 · Assignee: Swapnil

## CSV parser program uses File service but should use Dataset service
https://community.opengroup.org/osdu/platform/data-flow/ingestion/csv-parser/csv-parser/-/issues/10 · 2021-07-01 · Spencer Sutton (suttonsp@amazon.com)

It looks like this uses the File service to pull down a CSV before doing the parsing logic. The Dataset service should be used when interacting with any bulk data or files via OSDU.
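For reference, a minimal client-side sketch of what the switch might look like, assuming the Dataset service's retrieval-instructions endpoint; the path, query parameter, and dataset id below are illustrative assumptions, not taken from the parser's code:

```python
import urllib.parse
import urllib.request

def build_retrieval_request(base_url: str, dataset_ids: list, token: str, partition: str):
    """Build a GET request asking the Dataset service for retrieval
    instructions (e.g. signed download URLs) instead of calling the File
    service directly. Path and parameter names are assumptions."""
    query = urllib.parse.urlencode({"datasetRegistryIds": ",".join(dataset_ids)})
    url = f"{base_url}/api/dataset/v1/retrievalInstructions?{query}"
    return urllib.request.Request(url, headers={
        "Authorization": f"Bearer {token}",
        "data-partition-id": partition,
    })

# Hypothetical usage with a made-up dataset registry id:
req = build_retrieval_request(
    "https://osdu.example.com",
    ["opendes:dataset--File.Generic:abc123"],
    "ACCESS_TOKEN",
    "opendes",
)
```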
**I'm planning on updating this code to use the Dataset service instead; would this be mergeable when done?**

Milestone: M7 - Release 0.10 · Assignees: ethiraj krishnamanaidu, Dania Kodeih (Microsoft), Joe

## SLB Specific Variable defined in ADO pipeline yaml files
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/well-delivery/well-delivery/-/issues/1 · 2021-06-29 · Jason

There are currently two issues preventing generic customers from deploying the ADO pipelines in this repo:
- The non-development pipeline is currently hard coded with an SLB-specific repo name [here](https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/well-delivery/well-delivery/-/blob/master/devops/azure/pipeline.yml#L37). This causes the ADO pipelines to fail for other users, because they can't find a repo titled `security-infrastructure`.
- The non-development pipeline is hard coded with SLB environment names: https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/well-delivery/well-delivery/-/blob/master/devops/azure/pipeline.yml#L89
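Rather than deleting the lines outright, the repo and environment names could be promoted to pipeline parameters so any organization can supply its own values; a sketch (parameter names and defaults are illustrative, not the pipeline's actual ones):

```yaml
parameters:
  - name: infrastructureRepo        # replaces the hard-coded SLB repo name
    type: string
    default: 'my-org/security-infrastructure'
  - name: environmentName           # replaces the hard-coded SLB environment names
    type: string
    default: 'dev'

resources:
  repositories:
    - repository: security-infrastructure
      type: git
      name: ${{ parameters.infrastructureRepo }}
```

Template expressions (`${{ … }}`) are resolved at compile time, which is why parameters, rather than runtime variables, are needed inside `resources`.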
We will need to remove these SLB-specific lines to make the pipelines work for everyone.

Milestone: M7 - Release 0.10 · Assignee: Jason

## Need for DELETE endpoint
https://community.opengroup.org/osdu/platform/system/file/-/issues/30 · 2021-06-29 · Paresh Behede

Currently there is no way for a user to delete an already uploaded file from the data platform; if a user uploads the wrong file by mistake, that file cannot be deleted.
We must give users the ability to delete a metadata record and the file associated with it, so that they can delete a file they uploaded whenever necessary.
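A client-side sketch of what calling such an endpoint could look like, assuming the route proposed in this issue and a `/api/file` path prefix; nothing here exists in the File service yet, and the record id is made up:

```python
import urllib.request

def build_delete_request(base_url: str, file_id: str, token: str, partition: str):
    """Build a DELETE request for the proposed metadata + file deletion
    endpoint. The route is this issue's proposal, not an implemented API."""
    url = f"{base_url}/api/file/v2/files/{file_id}/metadata"
    return urllib.request.Request(url, method="DELETE", headers={
        "Authorization": f"Bearer {token}",
        "data-partition-id": partition,
    })

# Hypothetical usage:
req = build_delete_request(
    "https://osdu.example.com",
    "opendes:dataset--File.Generic:wrong-upload",
    "ACCESS_TOKEN",
    "opendes",
)
```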
A new endpoint in the File service could be `DELETE /v2/files/{id}/metadata`.

Milestone: M7 - Release 0.10 · Assignee: Paresh Behede

## opendes hardcoded in http scripts
https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/issues/173 · 2021-06-23 · Kishore Battula

`opendes` is hardcoded in the http scripts even though a variable exists at the top of the scripts. This results in unexpected behavior when changing the data-partition-id.
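The fix is mechanical: the scripts already follow the REST Client convention of declaring variables with `@name = value` at the top, so each literal `opendes` just needs to become a variable reference. A sketch with illustrative variable and request names (not copied from the actual scripts):

```http
@DATA_PARTITION = opendes

### Use the variable instead of a literal partition id
GET {{OSDU_HOST}}/api/legal/v1/legaltags
data-partition-id: {{DATA_PARTITION}}
Authorization: Bearer {{ACCESS_TOKEN}}
```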
One of the hardcoded locations: https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/blob/master/tools/rest/check.http#L202

Milestone: M7 - Release 0.10

## Adding new properties in Partition service for Workflow Ingestion Service (storage account)
https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/issues/171 · 2021-06-14 · Aalekh Jain

In order to support multi-partition storage accounts in the Workflow Ingestion service, we need to add the following properties to the Partition service:
1. `ingest-storage-account-name`
2. `ingest-storage-account-key`

MR is raised here: !317

Changes to core-lib-azure are introduced here: https://community.opengroup.org/osdu/platform/system/lib/cloud/azure/os-core-lib-azure/-/merge_requests/110

Milestone: M7 - Release 0.10
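A sketch of how the two properties might be added to an existing partition via the Partition service. The `{"sensitive": ..., "value": ...}` property shape and the PATCH route reflect common Partition service usage, but the exact shape, and whether the key is stored as a Key Vault secret reference, should be checked against the MRs above:

```python
import json
import urllib.request

def build_partition_patch(base_url: str, partition_id: str,
                          account_name: str, key_secret_ref: str, token: str):
    """Build a PATCH that adds the two new ingest storage account properties
    to an existing partition. Property shape and route are assumptions."""
    body = {
        "properties": {
            # Non-sensitive: the value is stored as-is.
            "ingest-storage-account-name": {"sensitive": False, "value": account_name},
            # Sensitive: the value typically references a secret (e.g. a
            # Key Vault secret name) rather than the raw account key.
            "ingest-storage-account-key": {"sensitive": True, "value": key_secret_ref},
        }
    }
    return urllib.request.Request(
        f"{base_url}/api/partition/v1/partitions/{partition_id}",
        data=json.dumps(body).encode("utf-8"),
        method="PATCH",
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )

# Hypothetical usage with made-up names:
req = build_partition_patch("https://osdu.example.com", "opendes",
                            "ingeststorage", "ingest-storage-account-key-secret",
                            "ACCESS_TOKEN")
```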