Azure: Onboarding Well Delivery DDMS
Well Delivery DDMS
The following steps must be completed for a service to onboard with OSDU on Azure. Additionally, please add the
Service Onboarding tag to this issue when it is created.
Required Documentation for Service Approval (link or provide info here)
- What entity types does this service manage: See slide 6 here.
- Functional Swagger Link: here
- Instructions on how to run and test the service locally: here
- Instructions for creating ADO pipeline for the service: here
- What are the entitlements required to call the different endpoints for this service:
To query data:
To create data:
To soft delete:
To hard delete data or data version:
- What infrastructure is deployed for this service: A Cosmos database well-delivery in the data partition Cosmos account, with containers for each data type.
- How is the infrastructure for this service deployed: Currently within the service code
- What is the default tier/scaling for the service-specific infrastructure: 1200 RUs/container * 11 containers = 13,200 RUs total, or approximately $700/month
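As a sanity check on the figure above, the cost can be sketched from the provisioned throughput. The per-RU rate below is an assumption based on public list pricing (roughly $0.008 per 100 RU/s per hour); verify it against the current Azure Cosmos DB price sheet, since the exact rate determines whether the total lands at ~$700/month.

```python
# Rough monthly-cost estimate for the provisioned Cosmos DB throughput above.
# PRICE_PER_100RU_PER_HOUR is an assumed list price, not taken from this doc.
RU_PER_CONTAINER = 1200
CONTAINERS = 11
PRICE_PER_100RU_PER_HOUR = 0.008  # USD; assumption, check current pricing
HOURS_PER_MONTH = 730             # Azure's standard billing month

total_rus = RU_PER_CONTAINER * CONTAINERS          # 13,200 RUs
monthly_cost = total_rus / 100 * PRICE_PER_100RU_PER_HOUR * HOURS_PER_MONTH
print(total_rus, round(monthly_cost, 2))
```

At the assumed rate this comes out near $770/month, the same ballpark as the ~$700/month figure quoted above.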
- Are the schemas included in WKS [YES/NO]. If not, how will customers load the schemas: Schemas not currently in WKS but are projected to be there before release
- Link to a postman collection or VS Code .http file where a customer can find an end-to-end workflow for the service (not required):
Infrastructure and Initial Requirements
Create helm charts for the service. The charts for each service are located in the devops/azure directory. You can use the charts from other services as a model; they will be nearly identical except for the environment variables, values, etc. that each service needs to run.
- Service is deployed into its own namespace (currently optional).
- Service has its own ingress (currently optional).
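To illustrate the "nearly identical except for environment variables and values" point above, a minimal values.yaml sketch might look like the following. The key names and image path here are hypothetical, not the actual shared-chart schema; copy the real structure from a sibling service's chart.

```yaml
# Hypothetical values.yaml sketch; real key names come from the shared chart templates.
global:
  namespace: well-delivery          # service deployed into its own namespace
image:
  repository: community.azurecr.io/well-delivery   # illustrative image path
  tag: latest
env:
  - name: cosmos_database           # service-specific environment variables
    value: well-delivery
```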
- If there are new entitlements for this DDMS, add them to the list of groups used to bootstrap a data partition that can be found here.
- If there are new entitlements for this DDMS, add them to the list of groups used to bootstrap the opendes data partition found here.
Create an Istio auth policy for the service in the devops/azure/chart/templates directory. Here is an example of a generic Istio auth policy that can be reused by other services: Link.
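For orientation, a generic policy of this kind commonly denies any request that does not carry a valid JWT. The sketch below is an assumption about the linked example, with hypothetical names; the actual template parameterizes these values via Helm.

```yaml
# Hypothetical sketch of a generic Istio auth policy; names are illustrative.
apiVersion: security.istio.io/v1beta1
kind: AuthorizationPolicy
metadata:
  name: well-delivery-jwt-authz
  namespace: well-delivery
spec:
  action: DENY
  rules:
    - from:
        - source:
            notRequestPrincipals: ["*"]   # deny requests without a valid JWT
```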
- Add any test data that is required for the service integration tests. Link.
- Run upload-data.py to upload any new test data files you created. Link.
- Verify that the README for the Azure provider correctly and clearly describes how to run and test the service.
- Create an environment variable script that generates .yaml files (if the service is in Java) to be used with the IntelliJ EnvFile plugin, and .envrc files to be used with direnv. Link.
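The script above emits the same variables in two formats: key: value YAML for the EnvFile plugin and export statements for direnv. A minimal sketch, with purely illustrative variable names and placeholder values (the real script pulls actual secrets and endpoints):

```python
# Hypothetical sketch of the environment-variable script described above.
# Variable names and values are illustrative placeholders only.
from pathlib import Path

env = {
    "AZURE_TENANT_ID": "<tenant-id>",
    "COSMOS_ENDPOINT": "<cosmos-endpoint>",
}

# IntelliJ's EnvFile plugin consumes simple key: value YAML.
Path("environment.yaml").write_text(
    "".join(f"{k}: {v}\n" for k, v in env.items())
)

# direnv consumes export statements from .envrc.
Path(".envrc").write_text(
    "".join(f"export {k}={v}\n" for k, v in env.items())
)
```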
- Community PR requirements are enforced for the repo by default.
Development and Demo Azure Devops Pipelines
Create a development ADO pipeline at devops/azure/development-pipeline.yml in the service repo.
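A development pipeline of this kind typically builds the service, runs unit tests, deploys to a development environment, and runs the integration tests. The skeleton below is an assumption, not the actual file: real OSDU pipelines extend shared pipeline templates, so the trigger, stage names, and jobs here are illustrative only.

```yaml
# Hypothetical skeleton for devops/azure/development-pipeline.yml.
# The real pipeline extends shared OSDU templates; structure is illustrative.
trigger:
  branches:
    include: [master]
stages:
  - stage: Build
    jobs:
      - job: build_and_unit_test
  - stage: Deploy
    jobs:
      - job: deploy_and_integration_test
```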
- Verify development pipeline passes in ADO.
- Create documentation on how to add this service to an existing ADO project.
- Service Swagger is reachable.
- Service passes all integration tests.
- Service is able to pass any end-to-end workflows that have been defined.
- Release helm charts created.
- Release helm charts validated.
- Release helm charts uploaded to Azure ACR (work with Manish Kumar for this).
- Release documentation for helm-charts-azure written and verified.