    # Project and Workflow
    
    
This is the repository for the Project & Workflow Service (P&WFS). The P&WFS capability delivers a service that supports more efficient and better decision making on large-scale capital investments. This capability group is aiming for a step change in collaboration, workflow processes, and the information recorded throughout projects and decision gates, enabling enhanced assurance, rich contextual information capture, and the pursuit of excellence in project decision-making.
    
    ## Project structure
    
        app
         |__ api
         |__ client
         |__ core
         |__ data
    
    ## Project Tutorial
    
    The settings module can read environment variables as long as they are declared as fields of the
    Settings class (see app/core/settings/base.py).
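
As a hedged illustration (the real fields live in `app/core/settings/base.py`, and the field names below are made up), declaring a field on the Settings class is enough for it to be read from the environment, assuming the class is built on pydantic's `BaseSettings`:

```python
# Hypothetical sketch -- the actual Settings class is defined in
# app/core/settings/base.py; field names here are examples only.
from pydantic import BaseSettings  # pydantic v1-style settings


class Settings(BaseSettings):
    ENVIRONMENT: str = "dev"   # populated from the ENVIRONMENT env var if present
    BUILD_DATE: str = ""       # populated from BUILD_DATE if present

    class Config:
        env_file = ".env"      # values may also be loaded from a .env file


settings = Settings()
```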
    
Alternatively, they can be provided in a `.env` or `prod.env` file.
    
    #### Python Version
    Project & Workflow runs in **Python 3.11**.
    
    #### Local settings
    
Add a `.env` file with:

```
ENVIRONMENT="dev"

# Optionally add any other env var (BUILD_DATE, etc.)
```
    
    
> NOTE: The Redis cache is a non-functional requirement (NFR); it is used to improve resiliency and performance.
    
Add the following variables to `.env` to set up the cache layer:
    
    ```
    CACHE_ENABLE=True
    CACHE_BACKEND="app.core.helpers.cache.backends.redis_cache.RedisCacheBackend"
    ```
    
The `CACHE_ENABLE` variable enables or disables the cache for the whole app.
To bypass caching for a single request, set the `Cache-Control` request header to `no-store` or `no-cache`.
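
As a hypothetical illustration (the endpoint path below is made up), a client can skip the cache layer for one call like this:

```python
# Hypothetical example: bypass the cache for a single request by sending a
# Cache-Control header. The endpoint path is illustrative only.
import requests

response = requests.get(
    "http://127.0.0.1:8080/api/projects",      # made-up endpoint
    headers={"Cache-Control": "no-cache"},     # or "no-store"
)
print(response.status_code)
```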
    
The `CACHE_BACKEND` variable is the dotted path to your backend class,
which is used to initialize FastAPICache as shown [here](https://github.com/long2ice/fastapi-cache#quick-start).
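
A hedged sketch of how a dotted `CACHE_BACKEND` path could be resolved and used to initialize FastAPICache at startup (the project's actual wiring and constructor arguments may differ):

```python
# Illustrative only: import the backend class from its dotted path and hand an
# instance to FastAPICache, mirroring the fastapi-cache quick start.
import importlib

from fastapi_cache import FastAPICache


def init_cache(backend_path: str) -> None:
    module_name, _, class_name = backend_path.rpartition(".")
    backend_cls = getattr(importlib.import_module(module_name), class_name)
    # Real backends (e.g. RedisCacheBackend) will likely need connection
    # settings passed in; this sketch assumes a no-argument constructor.
    FastAPICache.init(backend_cls(), prefix="pws-cache")  # prefix is made up
```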
    
You can use one of the already prepared backends:
 * `app.core.helpers.cache.backends.redis_cache.RedisCacheBackend`
 * `app.core.helpers.cache.backends.inmemory_cache.InMemoryCacheBackend`

You can also customize and use your own backend. A custom backend should be based on the [BaseCacheBackend class](/app/core/helpers/cache/backends/base_cache.py).
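
A hypothetical sketch of a custom backend (the method names below simply mirror the typical fastapi-cache backend shape; the exact interface required by `BaseCacheBackend` is defined in `app/core/helpers/cache/backends/base_cache.py`):

```python
# Illustration only: a toy backend that keeps values in a plain dict.
# The real required methods and signatures come from BaseCacheBackend.
from app.core.helpers.cache.backends.base_cache import BaseCacheBackend


class DictCacheBackend(BaseCacheBackend):
    """Toy in-process backend; not suitable for production use."""

    def __init__(self) -> None:
        self._store: dict[str, bytes] = {}

    async def get(self, key: str) -> bytes | None:
        return self._store.get(key)

    async def set(self, key: str, value: bytes, expire: int | None = None) -> None:
        # A real backend would honour `expire`; this sketch ignores it.
        self._store[key] = value
```

Point `CACHE_BACKEND` at the dotted path of your class to use it.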
    
To set up a custom backend you may need to add optional variables to `.env`. For example, for `RedisCacheBackend` we use:
    ```
    REDIS_HOSTNAME=xxxxxx.redis.cache.windows.net
    REDIS_PASSWORD=<redis-key>
    REDIS_DATABASE=13
    REDIS_SSL=True
    REDIS_PORT=6380
    ```
    
By default `ttl = 60` seconds is used. A different default TTL can be set through the `CACHE_DEFAULT_TTL` variable.
It is also possible to set the TTL manually for a specific request directly at the place where `@cache` is used.
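
A hedged sketch of overriding the TTL on a single endpoint, assuming `@cache` is fastapi-cache's `cache` decorator with its `expire` argument (the route and function names are made up):

```python
# Illustrative endpoint: expire=120 overrides the default 60-second TTL
# for this route only.
from fastapi import APIRouter
from fastapi_cache.decorator import cache

router = APIRouter()


@router.get("/projects")   # made-up route
@cache(expire=120)         # per-endpoint TTL of 120 seconds
async def list_projects() -> list[dict]:
    return [{"id": 1, "name": "example"}]
```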
    
Docker-compose is meant for local development and testing, not for production. Be aware that docker-compose uses higher privileges for development and testing purposes, so we do not recommend using the [docker-compose](./docker-compose.yml) file in production; it is intended only for developers to make and test changes locally, including unit/integration tests, without having to install all the dependencies.
We are using a distroless image as the default production Docker image for improved security.
    
    #### Docker Flavor Versions (docker-compose)
    
* [Distroless - Dockerfile](./Dockerfile) is the default image run with [docker-compose](./docker-compose.yml). It is a balanced, production-ready Dockerfile that can be used to deploy the application as well as to develop on your local machine. It is lean (the best approach for production systems, since it lacks unneeded binaries and libraries), which makes it ideal for addressing security CVEs and keeping a lean version of Project and Workflow.
  * The distroless build is around ~400 MB; you can test it with `docker-compose --profile distroless up distroless`.
* [tests - Dockerfile.tests](./Dockerfile.tests) is a Dockerfile definition that installs all dependencies needed for unit/integration tests.
    
    #### Quickly run docker-compose
    
**WARNING** Docker-compose is meant for quick development/testing, not for production.
    
```bash
# Start the app service defined in docker-compose.yml
docker-compose up pws

# Simple test
curl localhost:8080/
```
    
    #### Run with Dockerfile [MacOS / Linux]
    ```bash
    
    export BUILD_DATE=$(date -u +"%Y-%m-%dT%H:%M:%SZ")
    export COMMIT_ID=$(git rev-parse HEAD)
    export COMMIT_BRANCH=$(git rev-parse --abbrev-ref HEAD)
    export COMMIT_MESSAGE=$(git log -1 --pretty=%B)
    
docker build -f "./Dockerfile" \
    --build-arg build_date="$BUILD_DATE" \
    --build-arg commit_id="$COMMIT_ID" \
    --build-arg commit_branch="$COMMIT_BRANCH" \
    --build-arg commit_message="$COMMIT_MESSAGE" \
    -t pws:latest .
    
    docker run pws:latest
    ```
    
    #### Run with Dockerfile [Windows]
    ```bash
winpty docker run --rm -it -p <host_port>:<container_port> pws
    ```
    
    
To run the app locally without Docker:

```
uvicorn app.main:app --port <LOCAL_PORT>
```
    
    To see the OpenAPI page, follow the link:
    
    `http://127.0.0.1:<LOCAL_PORT>/<OPENAPI_PREFIX>/docs`
    
Use the `OPENAPI_PREFIX` value from the project settings.
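For example, if `OPENAPI_PREFIX` were set to `/api` (matching the `URL_PREFIX` used by the integration tests below), the docs would be served at `http://127.0.0.1:<LOCAL_PORT>/api/docs`.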
    
    ### Run Unit Tests
    
**WARNING:** It is recommended to use the [Dockerfile](./Dockerfile) image for production systems. The [Dockerfile.tests](./Dockerfile.tests) image contains extra dependencies needed for testing purposes and is not intended for production use.

Run the unit tests and lint checks:
    
    ```shell
    docker-compose run tests
    docker-compose run --rm tests flake8 / --config=/setup.cfg
    docker-compose run --rm tests flake8 /app --select T1
    ```
    
    ### Run Local Integration Tests
    
    Using Docker [env substitution from shell](https://docs.docker.com/compose/environment-variables/set-environment-variables/#substitute-from-the-shell) capabilities.
    
    ```shell
    # Start docker-compose 
    docker-compose up pws
    # Export needed envs for testing
    # Internal hostname in docker-compose (app)
    # Alternatively you can choose to test in remote env
    export PWS_BASE_URL=http://pws:8080
    export ACCESS_TOKEN=<access_token>
    export PARTITION=<partition>
    export URL_PREFIX=/api
    export CLOUD_PROVIDER=<cloud-provider>
    # Run test
    docker-compose build tests
    docker-compose --profile tests run --rm integration
    
    # (Optional) Cleanup
    docker-compose down
    ```
    
    ### Local running
1. Create a virtual environment if needed
       ```
        python -m venv integration_env
        source integration_env/bin/activate
       ```
2. Install requirements
    ```
    pip install -r requirements.txt
    pip install -r requirements-tests.in
    ```

3. Run the integration tests, either from the terminal or from PyCharm:
    - Run via Terminal:
        ```
        pytest -n auto tests/integration/tests --ddms-base-url {DDMS_BASE_URL} --url-prefix {URL_PREFIX} --partition {PARTITION} --bearer-token {TOKEN} --cloud-provider {CLOUD_PROVIDER}
        ```
    - Run/Debug via PyCharm:
      1. Open "Run/Debug Configurations" in the upper right corner.
      2. Click "Edit configuration templates..." in the bottom left corner of the opened window.
      3. Find and expand "Python tests", click "pytest"
      4. Choose "Working directory" as root directory of the project
      5. Fill "Additional Arguments" with -n auto  
         ```
         -n auto --ddms-base-url={DDMS_BASE_URL} --url-prefix={URL_PREFIX} --partition={PARTITION} --bearer-token={token}
         ```
      6. Apply and OK
      7. Use the green arrow next to the test name to run or debug it.
    
    ### Integration tests structure
    
        app
    
    
    ### Run Postman Tests
    
    ## Contribution
    
    ### Setting up development environment
    
    This project uses [Poetry](https://python-poetry.org/) to manage virtual environments and dependencies. In case you don't have it installed, make sure to check out [Poetry's official documentation](https://python-poetry.org/docs/) to have it installed on your local machine. At the moment of this writing, the Poetry version being used is `1.7.1`.
    
This project uses the `black` formatter, `isort` to sort imports, `flake8` for logical and styling checks, `mypy` for static type checks, and `pylint` for static code analysis. If you're using Visual Studio Code, you can install these extensions to autoformat the Python code you work on:
    
    - [Black Formatter](https://marketplace.visualstudio.com/items?itemName=ms-python.black-formatter)
    - [isort](https://marketplace.visualstudio.com/items?itemName=ms-python.isort)
    - [Pylint](https://marketplace.visualstudio.com/items?itemName=ms-python.pylint)
    - [Mypy Type Checker](https://marketplace.visualstudio.com/items?itemName=ms-python.mypy-type-checker)
    - [Flake8](https://marketplace.visualstudio.com/items?itemName=ms-python.flake8)
    
    ### Preparing virtual environment
    
    To create your virtual environment you just have to run:
    
    ```shell
    poetry install
    ```
    
    After that, make sure Visual Studio Code, or whatever IDE you're using, uses the Python interpreter generated at the virtual environment's location.
    
    ### Manage package dependencies
    
    If you want to add new dependencies, just run `poetry add [PACKAGE'S NAME]`, or `poetry add --group=dev [PACKAGE'S NAME]` if it's a dependency not required in production.
    
    ### Manage package dependencies - Tests
    
    If you want to add new dependencies for tests, just run `poetry add --group=dev [PACKAGE'S NAME]`.
    
    ### Code Style Check
Several linters and formatters are used in this project for code style checks, mainly [flake8](https://flake8.pycqa.org/en/latest/), pylint, and black.

The full list of checkers can be found in the [pre-commit config](.pre-commit-config.yaml) file. Configuration for the checkers lives in the [pyproject.toml](pyproject.toml), [.pylintrc.toml](.pylintrc.toml), and [.flake8](.flake8) files.
    
    #### Pre-commit Hooks Installation
    
Install `pre-commit`. Pre-commit hooks are supposed to be installed locally.

> The pre-commit tool should be used inside the virtual environment previously generated by Poetry, so commit from the command line with that environment activated.
    
    Steps after requirements installation:
    
    ```shell
    pre-commit install
    pre-commit install --hook-type commit-msg
    ```
    
The second command is needed for gitlint to work correctly.
    
    #### Pre-commit Hooks Run
    Command to run pre-commit manually for all files:
    
    ```shell
    pre-commit run --all-files
    ```
    
    In other cases, all checks run automatically on commit.
    
    ## Deployment
    
    * [Main deployment page](./devops/)
    
    ## License
    Licensed under Apache License Version 2.0; details can be found in [LICENSE](./LICENSE).