Project and Workflow

Introduction

This is the repository for the Project & Workflow Service (P&WFS). The service aims to support more efficient and better decision making on large-scale capital investments. This capability group is aiming for a step change in collaboration, workflow processes, and the information recorded throughout projects and decision gates, enabling enhanced assurance, rich contextual information capture, and the pursuit of excellence in project decision-making.

Project structure

app
 |__ api
 |__ client
 |__ core
 |__ data

Project Tutorial

API Specs

Project Startup

Configuration

The settings module can read environment variables as long as they are declared as fields of the Settings class (see app/core/settings/base.py).

Alternatively, they can be provided in a .env or prod.env file.
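How this lookup works can be sketched with the standard library alone (a simplified, hypothetical stand-in — the real Settings class lives in app/core/settings/base.py, and any field name here other than ENVIRONMENT is purely illustrative):

```python
import os
from dataclasses import dataclass


@dataclass
class Settings:
    """Toy stand-in for the project's Settings class: every declared
    field can be overridden by an environment variable of the same name."""

    ENVIRONMENT: str = "dev"
    BUILD_DATE: str = ""  # illustrative optional field

    def __post_init__(self) -> None:
        # Environment variables take precedence over the declared defaults.
        for name in self.__dataclass_fields__:
            if name in os.environ:
                setattr(self, name, os.environ[name])


os.environ["ENVIRONMENT"] = "test"  # simulate a value coming from .env
settings = Settings()
```

Undeclared variables are ignored, which is why a variable must first be added as a field before the settings module can read it.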

Python Version

Project & Workflow runs on Python 3.11.

Local settings

Add a .env file with:

ENVIRONMENT="dev"

# Optionally add any other env var (BUILD_DATE, etc.)

Cache settings

NOTE: The Redis cache is a non-functional requirement (NFR); it is used to improve resiliency and performance.

Add the following variables to .env to set up the cache layer:

CACHE_ENABLE=True
CACHE_BACKEND="app.core.helpers.cache.backends.redis_cache.RedisCacheBackend"

The CACHE_ENABLE variable enables or disables caching for the whole app. Additionally, to disable caching for a single request, set the "Cache-Control" request header to "no-store" or "no-cache".
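For example, a client can opt out of the cache for a single call by attaching that header (a sketch using the standard library; the URL is a placeholder and the request is only constructed here, not sent):

```python
import urllib.request

# Build a request that asks the service to skip its cache layer.
req = urllib.request.Request(
    "http://localhost:8080/",  # placeholder endpoint
    headers={"Cache-Control": "no-store"},
)
```

Note that urllib normalizes header names with str.capitalize, so the stored key is "Cache-control".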

The CACHE_BACKEND variable is the import path of the backend class used to initialize FastAPICache.

You can use one of the already prepared backends:

  • app.core.helpers.cache.backends.redis_cache.RedisCacheBackend
  • app.core.helpers.cache.backends.inmemory_cache.InMemoryCacheBackend

You can also implement and use your own backend; a custom backend is supposed to subclass the BaseCacheBackend class.
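A minimal custom backend might look like this (the get/set signatures are an assumption for illustration — check the real BaseCacheBackend under app/core/helpers/cache for the actual interface):

```python
import asyncio
from abc import ABC, abstractmethod
from typing import Any, Optional


class BaseCacheBackend(ABC):
    """Hypothetical sketch of the backend interface."""

    @abstractmethod
    async def get(self, key: str) -> Optional[Any]: ...

    @abstractmethod
    async def set(self, key: str, value: Any, ttl: int = 60) -> None: ...


class DictCacheBackend(BaseCacheBackend):
    """Toy custom backend backed by a plain dict (no TTL eviction)."""

    def __init__(self) -> None:
        self._store: dict[str, Any] = {}

    async def get(self, key: str) -> Optional[Any]:
        return self._store.get(key)

    async def set(self, key: str, value: Any, ttl: int = 60) -> None:
        self._store[key] = value


backend = DictCacheBackend()
asyncio.run(backend.set("answer", 42))
```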

Setting up a custom backend may require additional optional variables in .env. For example, for RedisCacheBackend we use:

REDIS_HOSTNAME=xxxxxx.redis.cache.windows.net
REDIS_PASSWORD=<redis-key>
REDIS_DATABASE=13
REDIS_SSL=True
REDIS_PORT=6380

By default, a TTL of 60 seconds is used. A different default can be set through the CACHE_DEFAULT_TTL variable. It is also possible to set the TTL manually for a specific endpoint directly at the place where @cache is used.
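The per-endpoint TTL idea can be illustrated with a minimal decorator (a toy stand-in for the real @cache decorator; the expire parameter name follows FastAPICache conventions but is an assumption here):

```python
import time
from functools import wraps


def cache(expire: int = 60):
    """Toy TTL cache: a result is reused until `expire` seconds pass.
    (A None result counts as a miss in this simplified version.)"""

    def decorator(func):
        store = {}

        @wraps(func)
        def wrapper(*args):
            hit = store.get(args)
            if hit is not None and time.monotonic() - hit[1] < expire:
                return hit[0]  # still fresh: serve the cached value
            value = func(*args)
            store[args] = (value, time.monotonic())
            return value

        return wrapper

    return decorator


calls = []


@cache(expire=120)  # endpoint-specific TTL instead of the 60 s default
def slow_lookup(x: int) -> int:
    calls.append(x)  # record real invocations to make cache hits visible
    return x * 2
```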

Run with Docker

docker-compose is meant for local development and testing, not for production. Be aware that docker-compose runs with higher privileges for development and testing purposes, so we would not recommend using the docker-compose file in production; it is only there so developers can make and test changes locally, including unit/integration tests, without having to install all the dependencies. We use a distroless image as the default production Docker image for improved security.

Docker Flavor Versions (docker-compose)

  • Distroless - Dockerfile is the default image to run with docker-compose. It is a balanced, production-ready Dockerfile that can be used both to deploy the application and to develop on your local machine. It is a lean Dockerfile (the best approach for production systems due to the lack of unneeded binaries and libraries), which is ideal for addressing security CVEs and keeping a lean version of Project and Workflow.
    • The distroless build is around ~400MB; you can test it with docker-compose --profile distroless up distroless
  • tests - Dockerfile_test is a Dockerfile definition that installs all dependencies needed for unit/integration tests.

Quickly run docker-compose

WARNING: docker-compose is meant for quick development/testing, not for production.

docker-compose up

# Simple test
curl localhost:8080/

Run with Dockerfile [MacOS / Linux]

export BUILD_DATE=$(date -u +"%Y-%m-%dT%H:%M:%SZ")
export COMMIT_ID=$(git rev-parse HEAD)
export COMMIT_BRANCH=$(git rev-parse --abbrev-ref HEAD)
export COMMIT_MESSAGE=$(git log -1 --pretty=%B)

docker build -f "./Dockerfile" \
    --build-arg build_date="$BUILD_DATE" \
    --build-arg commit_id="$COMMIT_ID" \
    --build-arg commit_branch="$COMMIT_BRANCH" \
    --build-arg commit_message="$COMMIT_MESSAGE" \
    -t pws:latest .

docker run pws:latest

Run with Dockerfile [Windows]

winpty docker run --rm -it -p <host_port>:<container_port> pws

Run with Uvicorn

uvicorn app.main:app --port <LOCAL_PORT>

To see the OpenAPI page, follow the link:

http://127.0.0.1:<LOCAL_PORT>/<OPENAPI_PREFIX>/docs

Use the OPENAPI_PREFIX value from the project settings.
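Assembling that link can be expressed as a tiny helper (illustrative only — docs_url is not part of the project code):

```python
def docs_url(openapi_prefix: str, port: int = 8000) -> str:
    """Build the Swagger UI URL from OPENAPI_PREFIX and the local port."""
    prefix = openapi_prefix.strip("/")
    base = f"http://127.0.0.1:{port}"
    # An empty prefix means the docs sit directly under the root.
    return f"{base}/{prefix}/docs" if prefix else f"{base}/docs"
```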

Testing

Run Unit Tests

WARNING: It is recommended to use the Dockerfile image for production systems. The Dockerfile.tests image contains extra dependencies needed for testing and is not intended for production use.

Unit tests

docker-compose run tests
docker-compose run --rm tests flake8 / --config=/setup.cfg
docker-compose run --rm tests flake8 /app --select T1

Run Local Integration Tests

This uses Docker's environment-variable substitution from the shell.

# Start docker-compose 
docker-compose up pws
# Export needed envs for testing
# Internal hostname in docker-compose (app)
# Alternatively you can choose to test in remote env
export PWS_BASE_URL=http://pws:8080
export ACCESS_TOKEN=<access_token>
export PARTITION=<partition>
export URL_PREFIX=/api
export CLOUD_PROVIDER=<cloud-provider>
# Run test
docker-compose build tests
docker-compose --profile tests run --rm integration

# (Optional) Cleanup
docker-compose down

Local running

  1. Create and activate a virtual environment if needed

     python -m venv integration_env
     source integration_env/bin/activate
  2. Install requirements

    pip install -r requirements.txt
    pip install -r requirements-tests.in
  • Run via Terminal:
    pytest -n auto tests/integration/tests --ddms-base-url {DDMS_BASE_URL} --url-prefix {URL_PREFIX} --partition {PARTITION} --bearer-token {TOKEN} --cloud-provider {CLOUD_PROVIDER}
  • Run/Debug via PyCharm:
    1. Open "Run/Debug Configurations" in the upper right corner.
    2. Click "Edit configuration templates..." in the bottom left corner of the opened window.
    3. Find and expand "Python tests", click "pytest"
    4. Choose "Working directory" as root directory of the project
    5. Fill "Additional Arguments" with:
       -n auto --ddms-base-url={DDMS_BASE_URL} --url-prefix={URL_PREFIX} --partition={PARTITION} --bearer-token={token}
    6. Apply and OK
    7. Use the green arrow next to the test name to run or debug it.

Integration tests structure

app

Run Postman Tests

Contribution

Setting up development environment

This project uses Poetry to manage virtual environments and dependencies. If you don't have it installed, check Poetry's official documentation to install it on your local machine. At the time of writing, the Poetry version in use is 1.7.1.

This project uses the black formatter, isort to sort imports, flake8 for logical and styling checks, mypy for static type checks, and pylint for static code analysis. If you're using Visual Studio Code, you can install the corresponding extensions to autoformat the Python code you work on.

Preparing virtual environment

To create your virtual environment you just have to run:

poetry install

After that, make sure Visual Studio Code, or whatever IDE you're using, uses the Python interpreter generated at the virtual environment's location.

Manage package dependencies

If you want to add new dependencies, just run poetry add [PACKAGE'S NAME], or poetry add --group=dev [PACKAGE'S NAME] if it's a dependency not required in production.

Manage package dependencies - Tests

If you want to add new dependencies for tests, just run poetry add --group=dev [PACKAGE'S NAME].

Code Style Check

A number of linters and formatters are used in this project for code style checks, primarily flake8, pylint, and black.

The full list of checkers can be found in the pre-commit config file. Configuration for the checkers is in the pyproject.toml, .pylintrc.toml, and .flake8 files.

Pre-commit Hooks Installation

Install pre-commit.

Pre-commit hooks are supposed to be installed locally, and the pre-commit tool should be run inside the project's virtual environment: commit from the command line with the virtual environment previously generated by Poetry activated.

Steps after requirements installation:

pre-commit install
pre-commit install --hook-type commit-msg

The second step is needed for gitlint to work correctly.

Pre-commit Hooks Run

Command to run pre-commit manually for all files:

pre-commit run --all-files

Otherwise, all checks run automatically on commit.

Deployment

License

Licensed under Apache License Version 2.0; details can be found in LICENSE.