Open Subsurface Data Universe Software / Platform / Deployment and Operations / infra-azure-provisioning

Commit 287a3a00: "Dataset pipeline azure"
Authored Aug 26, 2021 by Vivek Ojha; committed by MANISH KUMAR, Aug 26, 2021. Parent: 7fdec0e3. 8 files changed.

charts/README.md
```diff
@@ -313,6 +313,7 @@ airflow:
   AIRFLOW_VAR_CORE__SERVICE__STORAGE__URL: "http://storage.osdu.svc.cluster.local/api/storage/v2/records"
   AIRFLOW_VAR_CORE__SERVICE__FILE__HOST: "http://file.osdu.svc.cluster.local/api/file/v2"
   AIRFLOW_VAR_CORE__SERVICE__WORKFLOW__HOST: "http://ingestion-workflow.osdu.svc.cluster.local/api/workflow"
+  AIRFLOW_VAR_CORE__SERVICE__DATASET__HOST: "http://dataset.osdu.svc.cluster.local/api/dataset/v1"
   AIRFLOW__WEBSERVER__WORKERS: 15
   AIRFLOW__WEBSERVER__WORKER_REFRESH_BATCH_SIZE: 0
   AIRFLOW__CORE__STORE_SERIALIZED_DAGS: True  # This flag decides whether to serialise DAGs and persist them in the DB
```
```diff
@@ -405,6 +406,7 @@ git clone https://community.opengroup.org/osdu/platform/system/reference/crs-cat
 git clone https://community.opengroup.org/osdu/platform/system/reference/crs-conversion-service.git $SRC_DIR/crs-conversion-service
 git clone https://community.opengroup.org/osdu/platform/system/notification.git $SRC_DIR/notification
 git clone https://community.opengroup.org/osdu/platform/data-flow/enrichment/wks.git $SRC_DIR/wks
+git clone https://community.opengroup.org/osdu/platform/system/dataset.git $SRC_DIR/dataset
 git clone https://community.opengroup.org/osdu/platform/system/register.git $SRC_DIR/register
 git clone https://community.opengroup.org/osdu/platform/system/schema-service.git $SRC_DIR/schema-service
 git clone https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow.git $SRC_DIR/ingestion-workflow
```
```diff
@@ -515,7 +517,8 @@ SERVICE_LIST="infra-azure-provisioning \
   notification \
   schema-service \
   ingestion-workflow \
-  ingestion-service"
+  ingestion-service \
+  dataset"

 for SERVICE in $SERVICE_LIST; do
```
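The loop above relies on unquoted word splitting: the trailing backslashes join `SERVICE_LIST` into one whitespace-separated string, and the unquoted `$SERVICE_LIST` then yields one token per service. A minimal sketch of the pattern (shortened list for illustration):

```shell
#!/usr/bin/env bash
# Backslash-newline inside double quotes is a line continuation, so the
# list collapses into a single whitespace-separated string.
SERVICE_LIST="infra-azure-provisioning \
  ingestion-workflow \
  ingestion-service \
  dataset"

for SERVICE in $SERVICE_LIST; do   # intentionally unquoted: split on whitespace
  echo "configuring $SERVICE"
done
```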
charts/airflow/helm-config.yaml

```diff
@@ -254,6 +254,7 @@ airflow:
   AIRFLOW_VAR_CORE__SERVICE__STORAGE__URL: "http://storage.osdu.svc.cluster.local/api/storage/v2/records"
   AIRFLOW_VAR_CORE__SERVICE__FILE__HOST: "http://file.osdu.svc.cluster.local/api/file/v2"
   AIRFLOW_VAR_CORE__SERVICE__WORKFLOW__HOST: "http://ingestion-workflow.osdu.svc.cluster.local/api/workflow"
+  AIRFLOW_VAR_CORE__SERVICE__DATASET__HOST: "http://dataset.osdu.svc.cluster.local/api/dataset/v1"
   AIRFLOW_VAR_CORE__SERVICE__SEARCH_WITH_CURSOR__URL: "http://search-service.osdu.svc.cluster.local/api/search/v2/query_with_cursor"
   AIRFLOW__WEBSERVER__WORKERS: 8
   AIRFLOW__WEBSERVER__WORKER_REFRESH_BATCH_SIZE: 0
```
charts/osdu-common/templates/appgw-ingress.yaml

```diff
@@ -117,3 +117,7 @@ spec:
           serviceName: policy-service
           servicePort: 80
         path: /api/policy/*
+      - backend:
+          serviceName: dataset
+          servicePort: 80
+        path: /api/dataset/v1/*
```
docs/code-mirroring.md

````diff
@@ -28,6 +28,7 @@ Empty repositories need to be created that will be used by a pipeline to mirror
 | seismic-store-service | https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service.git |
 | wellbore-domain-services | https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/wellbore/wellbore-domain-services.git |
 | ingestion-service | https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-service.git |
+| dataset | https://community.opengroup.org/osdu/platform/system/dataset.git |
 | policy | https://community.opengroup.org/osdu/platform/security-and-compliance/policy.git |

 ```bash
 export ADO_ORGANIZATION=<organization_name>
````
```diff
@@ -58,6 +59,7 @@ SERVICE_LIST="infra-azure-provisioning \
   seismic-store-service \
   wellbore-domain-services \
   ingestion-service \
+  dataset \
   policy"
```
```diff
@@ -99,6 +101,7 @@ Variable Group Name: `Mirror Variables`
 | SEISMIC_STORE_SERVICE_REPO | https://dev.azure.com/osdu-demo/osdu/_git/seismic-store-service |
 | WELLBORE_DOMAIN_SERVICSE_REPO | https://dev.azure.com/osdu-demo/osdu/_git/wellbore-domain-services |
 | INGESTION_SERVICE_REPO | https://dev.azure.com/osdu-demo/osdu/_git/ingestion-service |
+| DATASET_REPO | https://dev.azure.com/osdu-demo/osdu/_git/dataset |
 | POLICY_REPO | https://dev.azure.com/osdu-demo/osdu/_git/policy |
 | ACCESS_TOKEN | <your_personal_access_token> |
```
```diff
@@ -135,6 +138,7 @@ az pipelines variable-group create \
   SEISMIC_STORE_SERVICE_REPO=https://dev.azure.com/${ADO_ORGANIZATION}/$ADO_PROJECT/_git/seismic-store-service \
   WELLBORE_DOMAIN_SERVICSE_REPO=https://dev.azure.com/${ADO_ORGANIZATION}/$ADO_PROJECT/_git/wellbore-domain-services \
   INGESTION_SERVICE_REPO=https://dev.azure.com/${ADO_ORGANIZATION}/$ADO_PROJECT/_git/ingestion-service \
+  DATASET_REPO=https://dev.azure.com/${ADO_ORGANIZATION}/$ADO_PROJECT/_git/dataset \
   POLICY_REPO=https://dev.azure.com/${ADO_ORGANIZATION}/$ADO_PROJECT/_git/policy \
   ACCESS_TOKEN=$ACCESS_TOKEN \
   -ojson
```
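Each `*_REPO` value is just the ADO organization and project interpolated into a fixed `_git/<repo>` URL; a quick sketch of the construction (the organization and project values here are hypothetical examples):

```shell
#!/usr/bin/env bash
# Hypothetical example values; substitute your own organization and project.
ADO_ORGANIZATION="osdu-demo"
ADO_PROJECT="osdu"

# Same interpolation the variable-group command above relies on.
DATASET_REPO="https://dev.azure.com/${ADO_ORGANIZATION}/${ADO_PROJECT}/_git/dataset"
echo "$DATASET_REPO"
```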
```diff
@@ -337,6 +341,13 @@ jobs:
     sourceGitRepositoryUri: 'https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-service.git'
     destinationGitRepositoryUri: '$(INGESTION_SERVICE_REPO)'
     destinationGitRepositoryPersonalAccessToken: $(ACCESS_TOKEN)
+- task: swellaby.mirror-git-repository.mirror-git-repository-vsts-task.mirror-git-repository-vsts-task@1
+  displayName: 'dataset'
+  inputs:
+    sourceGitRepositoryUri: 'https://community.opengroup.org/osdu/platform/system/dataset.git'
+    destinationGitRepositoryUri: '$(DATASET_REPO)'
+    destinationGitRepositoryPersonalAccessToken: $(ACCESS_TOKEN)
 - task: swellaby.mirror-git-repository.mirror-git-repository-vsts-task.mirror-git-repository-vsts-task@1
   displayName: 'policy'
```
docs/service-automation.md

```diff
@@ -33,6 +33,7 @@ This variable group will be used to hold the common values for the services to b
 | UNIT_URL | `https://<your_fqdn>/api/unit/v2` |
 | CRS_CATALOG_URL | `https://<your_fqdn>/api/crs/catalog/v2/` |
 | CRS_CONVERSION_URL | `https://<your_fqdn>/api/crs/converter/v2/` |
+| DATASET_URL | `https://<your_fqdn>/api/dataset/v1` |
 | REGISTER_BASE_URL | `https://<your_fqdn>/` |
 | ACL_OWNERS | `data.test1` |
 | ACL_VIEWERS | `data.test1` |
```
```diff
@@ -86,6 +87,7 @@ az pipelines variable-group create \
   UNIT_URL="https://${DNS_HOST}/api/unit/v2/" \
   CRS_CATALOG_URL="https://${DNS_HOST}/api/crs/catalog/v2/" \
   CRS_CONVERSION_URL="https://${DNS_HOST}/api/crs/converter/v2/" \
+  DATASET_URL="https://${DNS_HOST}/api/dataset/v1" \
   REGISTER_BASE_URL="https://${DNS_HOST}/" \
   ACL_OWNERS="data.test1" \
   ACL_VIEWERS="data.test1" \
```
````diff
@@ -673,6 +675,28 @@ az pipelines variable-group create \
   -ojson
 ```
+
+__Setup and Configure the ADO Library `Azure Service Release - dataset`__
+
+This variable group is the service specific variables necessary for testing and deploying the `dataset` service.
+
+| Variable | Value |
+|----------|-------|
+| MAVEN_DEPLOY_POM_FILE_PATH | `drop/provider/dataset-azure` |
+| MAVEN_INTEGRATION_TEST_OPTIONS | `-DDATASET_HOST=$(DATASET_URL) -DAZURE_AD_TENANT_ID=$(AZURE_TENANT_ID) -DINTEGRATION_TESTER=$(INTEGRATION_TESTER) -DTESTER_SERVICEPRINCIPAL_SECRET=$(AZURE_TESTER_SERVICEPRINCIPAL_SECRET) -DAZURE_AD_APP_RESOURCE_ID=$(AZURE_AD_APP_RESOURCE_ID) -DSTAGING_CONTAINER_NAME=dataset-staging-area -DNO_DATA_ACCESS_TESTER=$(NO_DATA_ACCESS_TESTER) -DNO_DATA_ACCESS_TESTER_SERVICEPRINCIPAL_SECRET=$(NO_DATA_ACCESS_TESTER_SERVICEPRINCIPAL_SECRET) -DAZURE_STORAGE_ACCOUNT=$(STORAGE_ACCOUNT) -DUSER_ID=osdu-user -DEXIST_FILE_ID=8900a83f-18c6-4b1d-8f38-309a208779cc -DTIME_ZONE="UTC+0" -DDATA_PARTITION_ID=$(MY_TENANT)` |
+| MAVEN_INTEGRATION_TEST_POM_FILE_PATH | `drop/deploy/testing/dataset-test-azure` |
+| SERVICE_RESOURCE_NAME | `$(AZURE_DATASET_NAME)` |
+
+```bash
+az pipelines variable-group create \
+  --name "Azure Service Release - dataset" \
+  --authorize true \
+  --variables \
+  MAVEN_DEPLOY_POM_FILE_PATH="drop/provider/dataset-azure" \
+  MAVEN_INTEGRATION_TEST_OPTIONS='-DDATASET_HOST=$(DATASET_URL) -DAZURE_AD_TENANT_ID=$(AZURE_TENANT_ID) -DINTEGRATION_TESTER=$(INTEGRATION_TESTER) -DTESTER_SERVICEPRINCIPAL_SECRET=$(AZURE_TESTER_SERVICEPRINCIPAL_SECRET) -DAZURE_AD_APP_RESOURCE_ID=$(AZURE_AD_APP_RESOURCE_ID) -DSTAGING_CONTAINER_NAME=dataset-staging-area -DNO_DATA_ACCESS_TESTER=$(NO_DATA_ACCESS_TESTER) -DNO_DATA_ACCESS_TESTER_SERVICEPRINCIPAL_SECRET=$(NO_DATA_ACCESS_TESTER_SERVICEPRINCIPAL_SECRET) -DAZURE_STORAGE_ACCOUNT=$(STORAGE_ACCOUNT) -DUSER_ID=osdu-user -DEXIST_FILE_ID=8900a83f-18c6-4b1d-8f38-309a208779cc -DTIME_ZONE="UTC+0" -DDATA_PARTITION_ID=$(MY_TENANT)' \
+  MAVEN_INTEGRATION_TEST_POM_FILE_PATH="drop/deploy/testing/dataset-test-azure" \
+  SERVICE_RESOURCE_NAME='$(AZURE_DATASET_NAME)' \
+  -ojson
+```
+
 __Setup and Configure the ADO Library `Azure Service Release - seismic-store-service`__

 This variable group is the service specific variables necessary for testing and deploying the `seismic-store-service` service.
````
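Note the single quotes around `MAVEN_INTEGRATION_TEST_OPTIONS` and `SERVICE_RESOURCE_NAME` in the `az` command: they keep the `$(...)` macro syntax literal so Azure DevOps, not the local shell, performs the substitution at pipeline time. A quick sketch of the difference:

```shell
#!/usr/bin/env bash
# Single quotes keep the ADO macro literal; in double quotes the local
# shell would treat $(AZURE_DATASET_NAME) as a command substitution.
SERVICE_RESOURCE_NAME='$(AZURE_DATASET_NAME)'
echo "$SERVICE_RESOURCE_NAME"
```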
````diff
@@ -1180,8 +1204,23 @@ az pipelines create \
   --yaml-path /devops/azure/pipeline.yml \
   -ojson
 ```
+
+22. Add a Pipeline for __dataset__ to deploy the Dataset Service.
+
+    _Repo:_ `dataset`
+    _Path:_ `/devops/azure/pipeline.yml`
+    _Validate:_ https://<your_dns_name>/api/dataset/v1/swagger-ui.html is alive.
+
+    ```bash
+    az pipelines create \
+      --name 'dataset' \
+      --repository dataset \
+      --branch master \
+      --repository-type tfsgit \
+      --yaml-path /devops/azure/pipeline.yml \
+      -ojson
+    ```
+
-21. Add a Pipeline for __policy-service__ to deploy the Policy Service.
+23. Add a Pipeline for __policy-service__ to deploy the Policy Service.

     _Repo:_ `policy`
     _Path:_ `/devops/azure/pipeline.yml`
````
tools/rest/dataset.http (new file)

```
# ------- HTTP REST CLIENT -------
# https://marketplace.visualstudio.com/items?itemName=humao.rest-client
## This script provides a few samples for calling the dataset service.

# -----------------------
# Service entitlements
# -----------------------
# The following entitlements are required by the authenticated user to call this service:
#
# 1. service.dataset.viewer

# -----------------------
# OAUTH (Variables)
# -----------------------
###
@login_base = login.microsoftonline.com/{{TENANT_ID}}
@oauth_token_host = {{login_base}}/oauth2/v2.0/token
@scopes = {{CLIENT_ID}}/.default openid profile offline_access

# -----------------------
# OAUTH refresh_token
# -----------------------
###
# @name refresh
POST https://{{oauth_token_host}} HTTP/1.1
Content-Type: application/x-www-form-urlencoded

grant_type=refresh_token
&client_id={{CLIENT_ID}}
&client_secret={{CLIENT_SECRET}}
&refresh_token={{INITIAL_TOKEN}}
&scope={{scopes}}

# -----------------------
# API (Variables)
# -----------------------
###
@access_token = {{refresh.response.body.access_token}}
@ENDPOINT = https://{{OSDU_HOST}}
@DATASET_HOST = {{ENDPOINT}}/api/dataset/v1
@data_partition_id = opendes

# -----------------------
# API: dataset
# -----------------------
###
# @name getStorageInstructions
GET {{DATASET_HOST}}/getStorageInstructions?kindSubType=file
Authorization: Bearer {{access_token}}
Accept: application/json
Content-Type: application/json
data-partition-id: {{data_partition_id}}
```
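Outside VS Code the same request can be issued with curl. A hedged sketch with hypothetical host and token values; the block only prints the composed command so nothing is sent, drop the leading `echo` to call a real endpoint:

```shell
#!/usr/bin/env bash
# Hypothetical values for illustration only.
OSDU_HOST="osdu.example.com"
ACCESS_TOKEN="<your_access_token>"
DATASET_HOST="https://${OSDU_HOST}/api/dataset/v1"

# Mirrors the REST Client sample above; echoed rather than executed.
echo curl -s "${DATASET_HOST}/getStorageInstructions?kindSubType=file" \
  -H "Authorization: Bearer ${ACCESS_TOKEN}" \
  -H "Accept: application/json" \
  -H "data-partition-id: opendes"
```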
tools/test_data/user_info_1.json

```diff
@@ -27,6 +27,8 @@
       "notification.pubsub",
       "service.file.viewers",
       "service.file.editors",
+      "service.dataset.viewers",
+      "service.dataset.editors",
       "service.delivery.viewer"
     ]
   },
@@ -58,6 +60,8 @@
       "notification.pubsub",
       "service.file.viewers",
       "service.file.editors",
+      "service.dataset.viewers",
+      "service.dataset.editors",
       "service.delivery.viewer"
     ]
   }
```
tools/variables/dataset.sh (new file)

```bash
#!/usr/bin/env bash
#
#  Purpose: Create the Developer Environment Variables.
#  Usage:
#    dataset.sh

###############################
## ARGUMENT INPUT            ##
###############################

usage() { echo "Usage: DNS_HOST=<your_host> INVALID_JWT=<your_token> dataset.sh " 1>&2; exit 1; }

SERVICE="dataset"

if [ -z "$UNIQUE" ]; then
  tput setaf 1; echo 'ERROR: UNIQUE not provided'; tput sgr0
  usage;
fi

if [ -z "$DNS_HOST" ]; then
  tput setaf 1; echo 'ERROR: DNS_HOST not provided'; tput sgr0
  usage;
fi

if [ -z "$COMMON_VAULT" ]; then
  tput setaf 1; echo 'ERROR: COMMON_VAULT not provided'; tput sgr0
  usage;
fi

if [ -z "$INVALID_JWT" ]; then
  tput setaf 1; echo 'ERROR: INVALID_JWT not provided'; tput sgr0
  usage;
fi

if [ -f ./settings_common.env ]; then
  source ./settings_common.env;
else
  tput setaf 1; echo 'ERROR: settings_common.env not found'; tput sgr0
fi

if [ -f ./settings_environment.env ]; then
  source ./settings_environment.env;
else
  tput setaf 1; echo 'ERROR: settings_environment.env not found'; tput sgr0
fi

if [ ! -d "$UNIQUE" ]; then mkdir "$UNIQUE"; fi

# ------------------------------------------------------------------------------------------------------
# LocalHost Run Settings
# ------------------------------------------------------------------------------------------------------
LOG_PREFIX="dataset"
AZURE_TENANT_ID="${TENANT_ID}"
AZURE_CLIENT_ID="${ENV_PRINCIPAL_ID}"
AZURE_CLIENT_SECRET="${ENV_PRINCIPAL_SECRET}"
KEYVAULT_URI="${ENV_KEYVAULT}"
appinsights_key="${ENV_APPINSIGHTS_KEY}"
cosmosdb_database="${COSMOS_DB_NAME}"
AZURE_AD_APP_RESOURCE_ID="${ENV_APP_ID}"
entitlements_service_endpoint="https://${ENV_HOST}/api/entitlements/v2"
entitlements_app_key="${API_KEY}"
storage_service_endpoint="https://${ENV_HOST}/api/storage/v2/"
file_service_endpoint="https://${ENV_HOST}/api/file/v2/files"
aad_client_id="${ENV_APP_ID}"
partition_service_endpoint="https://${ENV_HOST}/api/partition/v1"
schema_service_endpoint="https://${ENV_HOST}/api/schema-service/v1"
azure_istioauth_enabled="false"
server_port="8089"

# ------------------------------------------------------------------------------------------------------
# Integration Test Settings
# ------------------------------------------------------------------------------------------------------
DATASET_BASE_URL="http://localhost:${server_port}/api/dataset/v1/"
DATASET_HOST="http://localhost:${server_port}/api/dataset/v1/"
DATASET_HOST_REMOTE="https://${ENV_HOST}/api/dataset/v1/"
STORAGE_HOST="https://${ENV_HOST}/api/storage/v2/"
LEGAL_HOST="https://${ENV_HOST}/api/legal/v1/"
AZURE_STORAGE_ACCOUNT="${ENV_STORAGE}"  # also used for testing
DATA_PARTITION_ID="opendes"
INTEGRATION_TESTER="${ENV_PRINCIPAL_ID}"
TESTER_SERVICEPRINCIPAL_SECRET="${ENV_PRINCIPAL_SECRET}"
AZURE_AD_TENANT_ID="${TENANT_ID}"
AZURE_AD_APP_RESOURCE_ID="${ENV_APP_ID}"
NO_DATA_ACCESS_TESTER="${NO_ACCESS_ID}"
NO_DATA_ACCESS_TESTER_SERVICEPRINCIPAL_SECRET="${NO_ACCESS_SECRET}"
USER_ID="osdu-user"
TENANT_NAME="opendes"
DOMAIN="${COMPANY_DOMAIN}"
DEPLOY_ENV="empty"

cat > ${UNIQUE}/${SERVICE}.envrc << LOCALENV
# ------------------------------------------------------------------------------------------------------
# Common Settings
# ------------------------------------------------------------------------------------------------------
export OSDU_TENANT=$OSDU_TENANT
export OSDU_TENANT2=$OSDU_TENANT2
export OSDU_TENANT3=$OSDU_TENANT3
export COMPANY_DOMAIN=$COMPANY_DOMAIN
export COSMOS_DB_NAME=$COSMOS_DB_NAME
export LEGAL_SERVICE_BUS_TOPIC=$LEGAL_SERVICE_BUS_TOPIC
export RECORD_SERVICE_BUS_TOPIC=$RECORD_SERVICE_BUS_TOPIC
export LEGAL_STORAGE_CONTAINER=$LEGAL_STORAGE_CONTAINER
export TENANT_ID=$TENANT_ID
export INVALID_JWT=$INVALID_JWT
export NO_ACCESS_ID=$NO_ACCESS_ID
export NO_ACCESS_SECRET=$NO_ACCESS_SECRET
export OTHER_APP_ID=$OTHER_APP_ID
export OTHER_APP_OID=$OTHER_APP_OID
export AD_USER_EMAIL=$AD_USER_EMAIL
export AD_USER_OID=$AD_USER_OID
export AD_GUEST_EMAIL=$AD_GUEST_EMAIL
export AD_GUEST_OID=$AD_GUEST_OID

# ------------------------------------------------------------------------------------------------------
# Environment Settings
# ------------------------------------------------------------------------------------------------------
export ENV_SUBSCRIPTION_NAME=$ENV_SUBSCRIPTION_NAME
export ENV_APP_ID=$ENV_APP_ID
export ENV_PRINCIPAL_ID=$ENV_PRINCIPAL_ID
export ENV_PRINCIPAL_SECRET=$ENV_PRINCIPAL_SECRET
export ENV_APPINSIGHTS_KEY=$ENV_APPINSIGHTS_KEY
export ENV_REGISTRY=$ENV_REGISTRY
export ENV_STORAGE=$ENV_STORAGE
export ENV_STORAGE_KEY=$ENV_STORAGE_KEY
export ENV_STORAGE_CONNECTION=$ENV_STORAGE_CONNECTION
export ENV_COSMOSDB_HOST=$ENV_COSMOSDB_HOST
export ENV_COSMOSDB_KEY=$ENV_COSMOSDB_KEY
export ENV_SERVICEBUS_NAMESPACE=$ENV_SERVICEBUS_NAMESPACE
export ENV_SERVICEBUS_CONNECTION=$ENV_SERVICEBUS_CONNECTION
export ENV_KEYVAULT=$ENV_KEYVAULT
export ENV_HOST=$ENV_HOST
export ENV_REGION=$ENV_REGION
export ENV_ELASTIC_HOST=$ENV_ELASTIC_HOST
export ENV_ELASTIC_PORT=$ENV_ELASTIC_PORT
export ENV_ELASTIC_USERNAME=$ENV_ELASTIC_USERNAME
export ENV_ELASTIC_PASSWORD=$ENV_ELASTIC_PASSWORD

# ------------------------------------------------------------------------------------------------------
# LocalHost Run Settings
# ------------------------------------------------------------------------------------------------------
export LOG_PREFIX="${LOG_PREFIX}"
export AZURE_TENANT_ID="${AZURE_TENANT_ID}"
export AZURE_CLIENT_ID="${AZURE_CLIENT_ID}"
export AZURE_CLIENT_SECRET="${AZURE_CLIENT_SECRET}"
export keyvault_url="${keyvault_url}"
export appinsights_key="${appinsights_key}"
export cosmosdb_database="${cosmosdb_database}"
export AZURE_AD_APP_RESOURCE_ID="${AZURE_AD_APP_RESOURCE_ID}"
export osdu_entitlements_url="${osdu_entitlements_url}"
export osdu_entitlements_app_key="${osdu_entitlements_app_key}"
export osdu_storage_url="${osdu_storage_url}"
export AZURE_STORAGE_ACCOUNT="${AZURE_STORAGE_ACCOUNT}"
export aad_client_id="${aad_client_id}"
export storage_account="${storage_account}"
export server_port="${server_port}"
export azure_istioauth_enabled="${azure_istioauth_enabled}"

# ------------------------------------------------------------------------------------------------------
# Integration Test Settings
# ------------------------------------------------------------------------------------------------------
export DATASET_HOST="${DATASET_HOST}"
export DATA_PARTITION_ID="${DATA_PARTITION_ID}"
export INTEGRATION_TESTER="${INTEGRATION_TESTER}"
export TESTER_SERVICEPRINCIPAL_SECRET="${TESTER_SERVICEPRINCIPAL_SECRET}"
export AZURE_AD_TENANT_ID="${AZURE_AD_TENANT_ID}"
export AZURE_AD_APP_RESOURCE_ID="${AZURE_AD_APP_RESOURCE_ID}"
export NO_DATA_ACCESS_TESTER="${NO_DATA_ACCESS_TESTER}"
export NO_DATA_ACCESS_TESTER_SERVICEPRINCIPAL_SECRET="${NO_DATA_ACCESS_TESTER_SERVICEPRINCIPAL_SECRET}"
export AZURE_STORAGE_ACCOUNT="${AZURE_STORAGE_ACCOUNT}"
export DOMAIN="${COMPANY_DOMAIN}"
export USER_ID="${USER_ID}"
export TIME_ZONE="${TIME_ZONE}"
export STAGING_CONTAINER_NAME="${STAGING_CONTAINER_NAME}"
LOCALENV

cat > ${UNIQUE}/${SERVICE}_local.yaml << LOCALRUN
LOG_PREFIX: "${LOG_PREFIX}"
AZURE_TENANT_ID: "${AZURE_TENANT_ID}"
AZURE_CLIENT_ID: "${AZURE_CLIENT_ID}"
AZURE_CLIENT_SECRET: "${AZURE_CLIENT_SECRET}"
keyvault_url: "${keyvault_url}"
appinsights_key: "${appinsights_key}"
cosmosdb_database: "${cosmosdb_database}"
AZURE_AD_APP_RESOURCE_ID: "${AZURE_AD_APP_RESOURCE_ID}"
osdu_entitlements_url: "${osdu_entitlements_url}"
osdu_entitlements_app_key: "${osdu_entitlements_app_key}"
osdu_storage_url: "${osdu_storage_url}"
AZURE_STORAGE_ACCOUNT: "${AZURE_STORAGE_ACCOUNT}"
aad_client_id: "${aad_client_id}"
storage_account: "${storage_account}"
server_port: "${server_port}"
azure_istioauth_enabled: "${azure_istioauth_enabled}"
LOCALRUN

cat > ${UNIQUE}/${SERVICE}_local_test.yaml << LOCALTEST
DATASET_HOST: "${DATASET_HOST}"
DATA_PARTITION_ID: "${DATA_PARTITION_ID}"
INTEGRATION_TESTER: "${INTEGRATION_TESTER}"
TESTER_SERVICEPRINCIPAL_SECRET: "${TESTER_SERVICEPRINCIPAL_SECRET}"
AZURE_AD_TENANT_ID: "${AZURE_AD_TENANT_ID}"
AZURE_AD_APP_RESOURCE_ID: "${AZURE_AD_APP_RESOURCE_ID}"
NO_DATA_ACCESS_TESTER: "${NO_DATA_ACCESS_TESTER}"
NO_DATA_ACCESS_TESTER_SERVICEPRINCIPAL_SECRET: "${NO_DATA_ACCESS_TESTER_SERVICEPRINCIPAL_SECRET}"
AZURE_STORAGE_ACCOUNT: "${AZURE_STORAGE_ACCOUNT}"
USER_ID: "${USER_ID}"
TIME_ZONE: "${TIME_ZONE}"
STAGING_CONTAINER_NAME: "${STAGING_CONTAINER_NAME}"
LOCALTEST

cat > ${UNIQUE}/${SERVICE}_test.yaml << DEVTEST
DATASET_HOST: "${DATASET_HOST_REMOTE}"
DATA_PARTITION_ID: "${DATA_PARTITION_ID}"
INTEGRATION_TESTER: "${INTEGRATION_TESTER}"
TESTER_SERVICEPRINCIPAL_SECRET: "${TESTER_SERVICEPRINCIPAL_SECRET}"
AZURE_AD_TENANT_ID: "${AZURE_AD_TENANT_ID}"
AZURE_AD_APP_RESOURCE_ID: "${AZURE_AD_APP_RESOURCE_ID}"
NO_DATA_ACCESS_TESTER: "${NO_DATA_ACCESS_TESTER}"
NO_DATA_ACCESS_TESTER_SERVICEPRINCIPAL_SECRET: "${NO_DATA_ACCESS_TESTER_SERVICEPRINCIPAL_SECRET}"
AZURE_STORAGE_ACCOUNT: "${AZURE_STORAGE_ACCOUNT}"
USER_ID: "${USER_ID}"
TIME_ZONE: "${TIME_ZONE}"
STAGING_CONTAINER_NAME: "${STAGING_CONTAINER_NAME}"
DEVTEST
```
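The script writes all of its outputs under `$UNIQUE`. The core mechanism is an unquoted heredoc delimiter, which lets the shell expand variables into the generated file; a minimal, self-contained sketch of that pattern (workspace name is a hypothetical example):

```shell
#!/usr/bin/env bash
# Minimal sketch of the pattern dataset.sh uses: with an unquoted delimiter
# (<< LOCALENV), the shell expands ${...} while writing the heredoc body.
UNIQUE="demo"        # hypothetical workspace directory
SERVICE="dataset"
server_port="8089"
mkdir -p "$UNIQUE"

cat > "${UNIQUE}/${SERVICE}.envrc" << LOCALENV
export server_port="${server_port}"
export DATASET_HOST="http://localhost:${server_port}/api/dataset/v1/"
LOCALENV

# Sourcing the generated file brings the settings into the current shell.
source "${UNIQUE}/${SERVICE}.envrc"
echo "$DATASET_HOST"
```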