## Introduction
This is the repository for the Project & Workflow Service (P&WFS). The P&WFS capability delivers a service that supports more efficient and better-informed decision making on large-scale capital investments. The capability group is aiming for a step change in collaboration, workflow processes, and the information recorded throughout projects and decision gates, enabling enhanced assurance, rich contextual information capture, and the pursuit of excellence in project decision-making.
## Project structure
```
app
|__ api
|__ client
|__ core
|__ data
```
## Project Tutorial
## Project Startup
### Configuration
The settings module can read environment variables as long as they are declared as fields of the
`Settings` class (see `app/core/settings/base.py`).
Alternatively, they can be provided in a `.env` or `prod.env` file.
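For illustration, here is a minimal sketch of such a settings class. It assumes a pydantic v1-style `BaseSettings`, and the field names shown are examples only; the real fields and configuration live in `app/core/settings/base.py`.

```python
# Illustrative only: assumes a pydantic v1-style BaseSettings class.
# The actual fields and configuration live in app/core/settings/base.py.
from pydantic import BaseSettings


class Settings(BaseSettings):
    environment: str = "dev"   # read from the ENVIRONMENT variable
    build_date: str = ""       # read from the BUILD_DATE variable

    class Config:
        env_file = ".env"      # values can also come from a .env file


settings = Settings()
print(settings.environment)
```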
#### Python Version
Project & Workflow runs on **Python 3.11**.
#### Local settings
Add a `.env` file with:
```
ENVIRONMENT="dev"
# Optionally add any other env var (BUILD_DATE, etc.)
```
#### Cache settings
> NOTE: Redis Cache is an NFR (non-functional requirement); it is used to improve resiliency and performance.

Add the following variables to `.env` to set up the cache layer:
```
CACHE_ENABLE=True
CACHE_BACKEND="app.core.helpers.cache.backends.redis_cache.RedisCacheBackend"
```
The `CACHE_ENABLE` variable enables or disables caching for the whole app.
To disable caching for a single request, set the `Cache-Control` request header to `no-store` or `no-cache`.
The `CACHE_BACKEND` variable is the import path of your backend class,
which is used to initialize FastAPICache as shown [here](https://github.com/long2ice/fastapi-cache#quick-start).
You can use one of the already prepared backends:
* `app.core.helpers.cache.backends.redis_cache.RedisCacheBackend`
* `app.core.helpers.cache.backends.inmemory_cache.InMemoryCacheBackend`
You can also implement and use your own backend.
A custom backend should be based on the [BaseCacheBackend class](/app/core/helpers/cache/backends/base_cache.py), as in the sketch below.
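The following is a minimal sketch of a custom backend. It assumes `BaseCacheBackend` exposes an async get/set/clear interface similar to fastapi-cache's backends; the actual abstract methods in `app/core/helpers/cache/backends/base_cache.py` may differ, so treat this only as an outline.

```python
# Illustrative sketch only: the method names assume an interface similar to
# fastapi-cache backends (async get/set/clear); check base_cache.py for the
# real abstract methods before implementing.
from typing import Optional

from app.core.helpers.cache.backends.base_cache import BaseCacheBackend


class DictCacheBackend(BaseCacheBackend):
    """Toy backend that keeps cached responses in a plain dict (no eviction)."""

    def __init__(self) -> None:
        self._store: dict[str, bytes] = {}

    async def get(self, key: str) -> Optional[bytes]:
        return self._store.get(key)

    async def set(self, key: str, value: bytes, expire: Optional[int] = None) -> None:
        # expire is ignored in this toy example
        self._store[key] = value

    async def clear(self, namespace: Optional[str] = None, key: Optional[str] = None) -> int:
        if key is not None:
            return int(self._store.pop(key, None) is not None)
        count = len(self._store)
        self._store.clear()
        return count
```

With a hypothetical class like the one above, `CACHE_BACKEND` would be set to its import path (for example `app.core.helpers.cache.backends.dict_cache.DictCacheBackend`, assuming that is where the module is placed).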
To set up a backend you may need to add optional variables to `.env`. For example, for `RedisCacheBackend` we use:
```
REDIS_HOSTNAME=xxxxxx.redis.cache.windows.net
REDIS_PASSWORD=<redis-key>
REDIS_DATABASE=13
REDIS_SSL=True
REDIS_PORT=6380
```
By default, `ttl = 60` seconds is used. A different default can be set through the `CACHE_DEFAULT_TTL` variable.
It is also possible to set the ttl for a specific request directly at the place where `@cache` is used, as in the sketch below.
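As a sketch only: the endpoint is hypothetical, and the decorator shown is fastapi-cache's `cache(expire=...)`; this project's own `@cache` helper may live under a different import path or use a different parameter name, so check its signature.

```python
# Illustrative endpoint: overrides the default 60 s TTL for this route only.
# The decorator shown is fastapi-cache's `cache`; this project's own @cache
# helper may use a different import path or parameter name.
from fastapi import APIRouter
from fastapi_cache.decorator import cache

router = APIRouter()


@router.get("/health-cached")
@cache(expire=120)  # cache this response for 120 seconds instead of the default
async def health_cached() -> dict:
    return {"status": "ok"}
```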
### Run with Docker
Docker Compose is meant to be used for local development and testing, not for production. Be aware that the [docker-compose](./docker-compose.yml) file runs with higher privileges for development and testing purposes, so we do not recommend using it in production; it is intended only for developers to make and test changes locally and to run unit/integration tests without having to install all the dependencies.
We are using a distroless image as the default production Docker image for improved security.
#### Docker Flavor Versions (docker-compose)
* [Distroless - Dockerfile](./Dockerfile) is the default image to run with [docker-compose](./docker-compose.yml). It is a balanced, production-ready Dockerfile that can be used both to deploy the application and to develop on your local machine. It is a lean Dockerfile (a good fit for production systems because it lacks unneeded binaries and libraries), which makes it well suited for addressing security CVEs and keeping a lean version of Project & Workflow.
  * The distroless build is around ~400 MB; you can test it with `docker-compose --profile distroless up distroless`.
* [tests - Dockerfile.tests](./Dockerfile.tests) is a Dockerfile definition that installs all dependencies needed for unit/integration tests.
#### Quickly run docker-compose
**WARNING** Docker Compose is meant to be used for quick development/testing, not for production.
```shell
docker-compose up
# Simple test
curl localhost:8080/
```
#### Run with Dockerfile [MacOS / Linux]
```bash
export BUILD_DATE=$(date -u +"%Y-%m-%dT%H:%M:%SZ")
export COMMIT_ID=$(git rev-parse HEAD)
export COMMIT_BRANCH=$(git rev-parse --abbrev-ref HEAD)
export COMMIT_MESSAGE=$(git log -1 --pretty=%B)
docker build -f "./Dockerfile" \
--build-arg build_date="$BUILD_DATE" \
--build-arg commit_id="$COMMIT_ID" \
--build-arg commit_branch="$COMMIT_BRANCH" \
--build-arg commit_message="$COMMIT_MESSAGE" . \
-t pws:latest
docker run pws:latest
```
#### Run with Dockerfile [Windows]
```bash
winpty docker run --rm -it -p <host_port>:<container_port> pws
```
### Run with Uvicorn
```shell
uvicorn app.main:app --port <LOCAL_PORT>
```
To see the OpenAPI page, follow the link:
`http://127.0.0.1:<LOCAL_PORT>/<OPENAPI_PREFIX>/docs`
Use the `OPENAPI_PREFIX` value from the project settings.
### Run Unit Tests
**WARNING:** It is recommended to use the [Dockerfile](./Dockerfile) image for production systems. The [Dockerfile.tests](./Dockerfile.tests) image contains extra dependencies needed for testing purposes and is not intended for production use.
Unit tests:
```shell
docker-compose run tests
docker-compose run --rm tests flake8 / --config=/setup.cfg
docker-compose run --rm tests flake8 /app --select T1
```
### Run Local Integration Tests
Using Docker [env substitution from shell](https://docs.docker.com/compose/environment-variables/set-environment-variables/#substitute-from-the-shell) capabilities.
```shell
# Start docker-compose
docker-compose up pws
# Export needed envs for testing
# Internal hostname in docker-compose (app)
# Alternatively you can choose to test in remote env
export PWS_BASE_URL=http://pws:8080
export ACCESS_TOKEN=<access_token>
export PARTITION=<partition>
export URL_PREFIX=/api
export CLOUD_PROVIDER=<cloud-provider>
# Run test
docker-compose build tests
docker-compose --profile tests run --rm integration
# (Optional) Cleanup
docker-compose down
```
### Local running
1. Create a virtual environment if needed
```
python -m venv integration_env
source integration_env/bin/activate
```
2. Install requirements
```
pip install -r requirements.txt
pip install -r requirements-tests.in
```
3. Run the tests:
- Run via Terminal:
```
pytest -n auto tests/integration/tests --ddms-base-url {DDMS_BASE_URL} --url-prefix {URL_PREFIX} --partition {PARTITION} --bearer-token {TOKEN} --cloud-provider {CLOUD_PROVIDER}
```
- Run/Debug via PyCharm:
1. Open "Run/Debug Configurations" in the upper right corner.
2. Click "Edit configuration templates..." in the bottom left corner of the opened window.
3. Find and expand "Python tests", then click "pytest".
4. Choose the project's root directory as the "Working directory".
5. Fill "Additional Arguments" with:
```
-n auto --ddms-base-url={DDMS_BASE_URL} --url-prefix={URL_PREFIX} --partition={PARTITION} --bearer-token={token}
```
6. Click "Apply", then "OK".
7. Use the green arrow next to the test name to run or debug it.
### Integration tests structure
app
### Run Postman Tests
## Contribution
### Setting up development environment
This project uses [Poetry](https://python-poetry.org/) to manage virtual environments and dependencies. In case you don't have it installed, make sure to check out [Poetry's official documentation](https://python-poetry.org/docs/) to have it installed on your local machine. At the moment of this writing, the Poetry version being used is `1.7.1`.
This project uses the `black` formatter, `isort` to sort imports, `flake8` for logical and styling checks, `mypy` for static type checks, and `pylint` for static code analysis. If you're using Visual Studio Code, you can install these extensions to autoformat the Python code you work on:
- [Black Formatter](https://marketplace.visualstudio.com/items?itemName=ms-python.black-formatter)
- [isort](https://marketplace.visualstudio.com/items?itemName=ms-python.isort)
- [Pylint](https://marketplace.visualstudio.com/items?itemName=ms-python.pylint)
- [Mypy Type Checker](https://marketplace.visualstudio.com/items?itemName=ms-python.mypy-type-checker)
- [Flake8](https://marketplace.visualstudio.com/items?itemName=ms-python.flake8)
### Preparing virtual environment
To create your virtual environment you just have to run:
```shell
poetry install
```
After that, make sure Visual Studio Code, or whatever IDE you're using, uses the Python interpreter generated at the virtual environment's location.
### Manage package dependencies
If you want to add new dependencies, just run `poetry add [PACKAGE'S NAME]`, or `poetry add --group=dev [PACKAGE'S NAME]` if it's a dependency not required in production.
### Manage package dependencies - Tests
If you want to add new dependencies for tests, just run `poetry add --group=dev [PACKAGE'S NAME]`.
### Code Style Check
A set of linters and formatters is used in this project for code style checks,
mainly [flake8](https://flake8.pycqa.org/en/latest/), pylint, and black.
The full list of checkers can be found in the [pre-commit config](.pre-commit-config.yaml) file. Configuration for the checkers is in the [pyproject.toml](pyproject.toml), [.pylintrc.toml](.pylintrc.toml), and [.flake8](.flake8) files.
#### Pre-commit Hooks Installation
Install `pre-commit`.
Pre-commit hooks are supposed to be installed locally.
> The pre-commit tool should be used inside the project environment: commit from the command line with the Poetry-generated virtual environment activated.
Steps after requirements installation:
```shell
pre-commit install
pre-commit install --hook-type commit-msg
```
The second step is needed for gitlint to work correctly.
#### Pre-commit Hooks Run
Command to run pre-commit manually for all files:
```shell
pre-commit run --all-files
```
Otherwise, all checks run automatically on commit.
## Deployment
* [Main deployment page](./devops/)
## License
Licensed under Apache License Version 2.0; details can be found in [LICENSE](./LICENSE).