# infra-azure-provisioning issues

## Onboard Well Delivery DDMS
https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/issues/155 (Jason, 2022-08-23)

**Service name**: `Well Delivery DDMS`
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/well-delivery/well-delivery
The following steps must be completed for a service to onboard with OSDU on Azure. Additionally, please add the `Service Onboarding` tag to this issue when it is created.
For more information, visit our service onboarding documentation [here](https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/blob/master/docs/service-onboarding.md).
## Steps:
**Infrastructure and Initial Requirements**
- [ ] Add any additional Azure cloud infrastructure (Cosmos containers, Storage containers, fileshares, etc.) to the Terraform template. [Link](https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/tree/master/infra/templates/osdu-r3-mvp). Note that if the infrastructure is a part of the data-partition template, you may need to add secrets to the keyvault that are partition specific; if doing so, update the createPartition REST request to include the keys that you have added so they are accessible in service code. [Link](https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/blob/master/tools/rest/partition.http#L48)
- [x] Create an ingress point for the service. [Link](https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/blob/master/charts/osdu-common/templates/appgw-ingress.yaml)
- [x] Add any test data that is required for the service integration tests. [Link](https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/tree/master/tools/test_data)
- [x] Update `upload-data.py` to upload any new test data files you created. [Link](https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/blob/master/tools/test_data/upload-data.py).
- [x] Update the integration tester with any entitlements required to test the service. [Link](https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/blob/master/tools/test_data/user_info_1.json)
- [x] Add in any new secrets that the service needs to run. [Link](https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/blob/master/charts/osdu-common/templates/kv-secrets.yaml)
- [ ] Create environment variable script to generate .yaml files to be used with Intellij [EnvFile](https://plugins.jetbrains.com/plugin/7861-envfile) plugin and .envrc files to be used with [direnv](https://direnv.net/). [Link](https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/tree/master/tools/variables)
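As a sketch of the environment-variable step above, a single generator could emit both target formats from one set of variables. This is a hedged illustration, not the actual script in `tools/variables`; the variable names below are placeholders.

```python
# Hedged sketch: render one dict of service variables as both an
# EnvFile-compatible .yaml mapping (IntelliJ) and a direnv .envrc.
# Variable names and values here are illustrative placeholders only.

def render_envfile_yaml(variables: dict) -> str:
    """Render variables as a flat YAML mapping for the IntelliJ EnvFile plugin."""
    return "\n".join(f'{key}: "{value}"' for key, value in sorted(variables.items()))

def render_envrc(variables: dict) -> str:
    """Render variables as `export` lines for direnv's .envrc."""
    return "\n".join(f'export {key}="{value}"' for key, value in sorted(variables.items()))

if __name__ == "__main__":
    sample = {
        "AZURE_TENANT_ID": "<tenant-id>",        # placeholder
        "AZURE_AD_APP_RESOURCE_ID": "<app-id>",  # placeholder
    }
    print(render_envfile_yaml(sample))
    print(render_envrc(sample))
```

Keeping one source of truth for the variables avoids the two files drifting apart as services add configuration.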
**Gitlab Code and Documentation**
- [x] Complete the service code such that it passes all integration tests locally. There is documentation on getting started with implementing an Azure provider. [Link](./gitlab-service-readme-template.md)
- [x] Create helm charts for the service. The charts for each service are located in the `devops/azure` directory. You can look at charts from other services as a model. The charts will be nearly identical except for the environment variables, values, etc. that each service needs to run. [Link](./gitlab-service-guide.md)
- [x] Implement Istio for the service if this has not already been done. Here is an example MR that shows what steps are required.
- [ ] Create an Istio auth policy in the `devops/azure/chart/templates` directory. Here is an example of an Istio auth policy that is generic and can be used by other services. [Link](https://community.opengroup.org/osdu/platform/system/storage/-/blob/master/devops/azure/chart/templates/azure-istio-auth-policy.yaml)
- [x] Add any variables that are required for the service integration tests to the Azure CI-CD file. [Link](https://community.opengroup.org/osdu/platform/ci-cd-pipelines/-/blob/master/cloud-providers/azure.yml)
- [ ] Verify that the README for the Azure provider correctly and clearly describes how to run and test the service. There is a README template to help. [Link](./gitlab-service-readme-template.md)
- [ ] Push any changes and verify that the Gitlab pipeline is passing in master.
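For the Istio auth policy step above, a generic policy typically requires an authenticated JWT principal on all application paths. The following is a hedged sketch only; the chart name, namespace, and excluded paths are placeholders, and the linked storage-service chart is the authoritative example.

```yaml
# Sketch of a generic Istio AuthorizationPolicy for a service chart.
# All values below are illustrative placeholders, not the real chart values.
apiVersion: security.istio.io/v1beta1
kind: AuthorizationPolicy
metadata:
  name: {{ .Chart.Name }}-jwt-authz   # helm templating, as in other service charts
  namespace: osdu
spec:
  selector:
    matchLabels:
      app: {{ .Chart.Name }}
  rules:
    - from:
        - source:
            requestPrincipals: ["*"]   # any authenticated JWT principal
      to:
        - operation:
            notPaths: ["/", "*/swagger-ui.html", "*/actuator/health"]
```

Because the policy only references the chart name, the same template can be copied between services, which is why a generic policy is called out above.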
**Development and Demo Azure Devops Pipelines**
- [x] Create development ADO pipeline at `devops/azure/development-pipeline.yml` in the service repo.
- [ ] Verify development pipeline passes in ADO.
- [x] Create Demo ADO pipeline at `devops/azure/pipeline.yml` in the service repo.
- [ ] Verify demo pipeline is passing in ADO.
**User Documentation**
- [ ] Add the service to the mirror pipeline instructions. [Link](https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/blob/master/docs/code-mirroring.md)
- [ ] Add the service to the manual deployment instructions. [Link](https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/tree/master/charts)
- [ ] Add any required variables to the already existing variable group instructions for automated deployment. You should know if any variables need to be added to existing variable groups from creating the development and demo pipelines. [Link](https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/blob/master/docs/service-automation.md#create-osdu-service-libraries)
- [ ] Add a variable group `Azure Service Release - $SERVICE_NAME` to the documentation. You should know what values to set for this variable group from creating the development and demo pipelines. [Link](https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/blob/master/docs/service-automation.md#create-osdu-service-libraries)
- [ ] Add a step for creating the service pipeline at the bottom of the service-automation page. [Link](https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/blob/master/docs/service-automation.md#create-osdu-service-libraries)
- [ ] Create a rest script with sample calls to the service for users. [Link](https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/tree/master/tools/rest)

Milestone: M7 - Release 0.10

## Implement Legal Tag Update Workflow
https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/issues/161 (Abhishek Chowdhry, 2021-07-27)

We need a system in place that will periodically evaluate the status of each legal tag as "valid" or "invalid" and update storage records accordingly. This system should also expose these events so that they can be consumed by other subscribers within and outside of OSDU.
The changed legal tags will be sent to a new topic in Event Grid, from where they will be forwarded to a topic on Service Bus. The Storage service will poll the legal tags from the Service Bus topic and update the status of all the records associated with those tags.

Milestone: M7 - Release 0.10

## Architecture change: service resources - Add Cosmos DB and Storage account
https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/issues/163 (Aman Verma, 2022-08-23)

An additional Cosmos DB and Storage account are needed in the services resource group to support shared schemas. This database/storage account would be in addition to all the partition-specific Cosmos DBs/storage accounts.
---
__Design__
1. We already have a module for Cosmos DB. The same module can be leveraged to create the Cosmos DB in service resources.
2. We already have a module for Storage accounts. The same module can be leveraged to create the Storage account in service resources.
_Module Requirements_
- Required modules are already present
_Template Requirements_
- The database will be named with the suffix "system" to distinguish it from partition tables or databases
- The database will be created as part of the service Resource Template
- The database will be locked
- The database location and replication location will follow naming patterns consistent with Data Partitions
- By default, the database will use the same type of throughput settings as the other Cosmos DBs
- The storage account will be named with the suffix "system" to distinguish it from other storage accounts
- The storage account will be created as part of the service Resource Template
- The storage account will be locked
- The storage account location and replication location will follow naming patterns consistent with Data Partitions
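To make the "system" suffix convention concrete, a small sketch of the two naming rules follows. The base-name pattern is assumed rather than taken from the actual Terraform template; the storage-account normalization reflects Azure's 3-24 lowercase-alphanumeric constraint.

```python
# Hedged sketch of the "system" suffix naming convention for shared service
# resources. Base names are hypothetical, not from the template.
import re

def system_cosmosdb_name(base: str) -> str:
    """Cosmos DB account for shared schemas: base name plus 'system' suffix."""
    return f"{base}-system"

def system_storage_account_name(base: str) -> str:
    """Storage account names allow only 3-24 lowercase alphanumerics, so the
    suffixed name is normalized to meet Azure's constraint."""
    name = re.sub(r"[^a-z0-9]", "", (base + "system").lower())
    return name[:24]

print(system_cosmosdb_name("osdu-mvp"))         # hypothetical base name
print(system_storage_account_name("osdu-MVP"))  # normalized, hyphen stripped
```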
---
__Acceptance Criteria__
1. Architecture Diagram Change
2. Modify Central service to add the additional database/storage account
3. Ensure all Module Unit Tests pass
4. Ensure all Template Unit Tests and Integration Tests pass
5. Update all required documentation
cc: @polavishnu, @manishk

Milestone: M7 - Release 0.10

## Adding new in-partition service properties for Workflow Ingestion Service (storage account)
https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/issues/171 (Aalekh Jain, 2021-06-14)

In order to support multi-partition storage accounts in the Workflow Ingestion service, we need to add the following properties to the Partition service:
1. `ingest-storage-account-name`
2. `ingest-storage-account-key`
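As a sketch of how these two properties might be registered, the fragment below builds the properties portion of a Partition service request body. The `{"sensitive": ..., "value": ...}` shape mirrors the partition payload used elsewhere in this repo, but treat the exact field names as an assumption to verify against `tools/rest/partition.http`.

```python
# Hedged sketch: build the partition-properties fragment for the two new
# workflow-ingestion keys. Verify the property shape against
# tools/rest/partition.http before relying on it.

def ingest_storage_properties(account_name_secret: str, account_key_secret: str) -> dict:
    """Return the properties to merge into a createPartition request body.

    Sensitive properties carry a Key Vault secret name in `value`; the
    Partition service resolves the secret at read time.
    """
    return {
        "ingest-storage-account-name": {"sensitive": True, "value": account_name_secret},
        "ingest-storage-account-key": {"sensitive": True, "value": account_key_secret},
    }

props = ingest_storage_properties("ingest-storage", "ingest-storage-key")
```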
MR is raised here: !317
Changes in os-core-lib-azure are introduced here: https://community.opengroup.org/osdu/platform/system/lib/cloud/azure/os-core-lib-azure/-/merge_requests/110

Milestone: M7 - Release 0.10

## opendes hardcoded in http scripts
https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/issues/173 (Kishore Battula, 2021-06-23)

`opendes` is hardcoded in the http scripts even though a variable exists at the top of the scripts. This results in unexpected behavior when changing the data-partition-id.
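A fixed call site would reference the variable declared at the top of the script instead of the literal partition id. In http-script syntax (a hedged sketch with illustrative variable names and request paths, not the actual `check.http` contents):

```http
# Hedged sketch, not the actual check.http contents: reference the shared
# variable instead of hardcoding `opendes` in each request.
@DATA_PARTITION = opendes

### Example request using the variable
GET https://{{OSDU_HOST}}/api/storage/v2/info
data-partition-id: {{DATA_PARTITION}}
Authorization: Bearer {{access_token}}
```

With every request reading `{{DATA_PARTITION}}`, changing the partition id at the top of the script changes it everywhere at once.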
One of the hardcoded locations: https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/blob/master/tools/rest/check.http#L202

Milestone: M7 - Release 0.10

## [Feature] Airflow Monitoring Dashboards
https://community.opengroup.org/osdu/platform/deployment-and-operations/infra-azure-provisioning/-/issues/185 (Mayank Saggar [Microsoft], 2021-07-28)

For monitoring of Airflow and its services, three dashboards will be deployed as part of the Monitoring resources: one for the Airflow infrastructure, one for the Airflow service, and one for the Airflow DAGs. The infra and service dashboards will be viewable at the data-partition level, whereas the DAGs dashboard will be viewable at the data-partition and DAG level.

Milestone: M7 - Release 0.10