Ingestion Workflow issues
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/142
Communication between calling program and a launched run of manifest-based Ingestion process (Debasis Chatterjee, 2023-07-05)

As you can imagine, it is common practice for software vendors to offer UI-based insert/update/delete capability for metadata.
Such a program has to interact with a (human) user to gather information about, for example, a new Wellbore and then, behind the scenes, build the JSON load/manifest that actually populates the OSDU Data Platform by creating the new Wellbore.
Ideally, such a program needs to report back to the user almost right away whether the attempt to create a new Wellbore in the OSDU Data Platform has succeeded or not.
It is not enough to tell the user, "Here is the RunID, go and check the status in the Airflow console."
Linked to osdu/platform/system/home#80
Perhaps a solution through Notification Service?
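Until a push-style answer exists (for example through the Notification Service), a calling program can approximate immediate feedback by polling the workflow-run status endpoint. A minimal sketch, assuming the v1 Workflow API path, header names, and status strings shown below; these are illustrative, not a confirmed contract:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Hypothetical poller a vendor UI could run right after triggering a manifest
// ingestion, so the human user gets a success/failure answer without opening
// the Airflow console. Host, status strings, and paths are assumptions.
public class WorkflowRunPoller {

  public static String waitForCompletion(String workflowName, String runId,
                                          String token, String partition) throws Exception {
    String url = "https://<osdu-host>/api/workflow/v1/workflow/"
        + workflowName + "/workflowRun/" + runId;                 // assumed v1 run-status path
    HttpClient client = HttpClient.newHttpClient();
    HttpRequest request = HttpRequest.newBuilder()
        .uri(URI.create(url))
        .header("Authorization", "Bearer " + token)
        .header("data-partition-id", partition)
        .GET()
        .build();

    for (int attempt = 0; attempt < 60; attempt++) {              // ~5 minutes at 5 s intervals
      String body = client.send(request, HttpResponse.BodyHandlers.ofString()).body();
      // Crude status extraction; a real client would parse the JSON response.
      if (body.contains("\"finished\"") || body.contains("\"success\"")) return "succeeded";
      if (body.contains("\"failed\"")) return "failed";
      Thread.sleep(5_000);
    }
    return "still running";
  }
}
```

The UI could call this immediately after triggering the run and translate the result into a plain success/failure message for the user.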
This also impacts the EDS fetch-and-ingest workflow. cc @jrougeau (for information)
cc @lasscock.b, @Kateryna_Kurach (for information)

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/71
ADR: Workflow Service - R3 Improvements (Dmitriy Rudko, 2021-04-15)

## Context
While working with different streams, we identified several critical design issues with the Workflow service that need to be addressed in R3:
* The Workflow service is not just an `abstraction` over the orchestration engine (Airflow); it also contains OSDU-specific logic (`DataType`, `WorkflowType`, `UserType`). This logic should be moved to the Ingestion service.
* The Workflow service does not respect Data Partitions; users can potentially trigger any Workflow in the system.
* There is no functionality to register a new Workflow.
## Scope
- Add functionality to register new Workflows
- Add support of Data Partitions
- Remove OSDU specific workflow functionality (`DataType`, `WorkflowType`, `UserType`) from Workflow Service.
- Allow OSDU clients to trigger registered Workflows directly, without the Ingestion service.
- Update the API to reflect the [Google REST API Design Guide](https://cloud.google.com/apis/design). Please see the [OpenAPI Spec](https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/blob/refactoring_workflow/docs/api/openapi.workflow.yaml) for details.
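To illustrate the intended usage once the scope above is delivered, a rough sketch of registering a workflow and then triggering it directly via the refactored v1 API. The host, header names, and payload fields are assumptions loosely based on the linked OpenAPI draft, not the final contract:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Illustration of the two calls the refactored service is expected to expose:
// registering a workflow and triggering a run of it directly (no Ingestion service).
public class WorkflowV1Example {
  public static void main(String[] args) throws Exception {
    String base = "https://<osdu-host>/api/workflow/v1";   // assumed deployment URL
    String auth = "Bearer <token>";
    String partition = "opendes";                           // example data partition

    HttpClient client = HttpClient.newHttpClient();

    // 1. Register a workflow backed by an existing Airflow DAG.
    String registration = "{\"workflowName\":\"my_ingest_dag\","
        + "\"description\":\"Example manifest ingestion\","
        + "\"registrationInstructions\":{\"dagName\":\"my_ingest_dag\"}}";
    HttpRequest register = HttpRequest.newBuilder()
        .uri(URI.create(base + "/workflow"))
        .header("Authorization", auth)
        .header("data-partition-id", partition)
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(registration))
        .build();
    System.out.println(client.send(register, HttpResponse.BodyHandlers.ofString()).body());

    // 2. Trigger a run of the registered workflow in the caller's data partition.
    String trigger = "{\"executionContext\":{\"dataPartitionId\":\"" + partition + "\"}}";
    HttpRequest run = HttpRequest.newBuilder()
        .uri(URI.create(base + "/workflow/my_ingest_dag/workflowRun"))
        .header("Authorization", auth)
        .header("data-partition-id", partition)
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(trigger))
        .build();
    System.out.println(client.send(run, HttpResponse.BodyHandlers.ofString()).body());
  }
}
```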
## Decision
- Accept API changes as a part of R3
- Accept Workflow > Core changes as a part of R3
- Deprecate the existing Workflow API (startWorkflow, etc.)
## Rationale
- Registration of workflows is required for E2E R3 Ingestion.
- The API spec is on the critical path for CSV Ingestion.
## Consequences
- Most of the Core logic changes will be implemented by GCP
- Will require support from CSPs, as the SPI layer will be touched.
## When to revisit
- Post R3
## Technical details:
![R3_Workflow_-_L3__Target](/uploads/75f02f3ec73ee85a95bb668dc7426df2/R3_Workflow_-_L3__Target.png)
![R3_Workflow_-_L4__Target](/uploads/03429b8474b61049b4327ae920969374/R3_Workflow_-_L4__Target.png)
### SPI Layer:
- `IWorkflowEngineService` - **Has default implementation.** Abstraction over orchestration engine. By default we have implementation for Airflow.
- `IWorkflowManagerService` - **Has default implementation.** Implements CRUD over Workflow entity.
- `IWorkflowRunService` - **Has default implementation.** Implements CRUD over the Workflow Run entity.
- `IWorkflowMetadataRepository` - **Should be implemented by the CSP.** Repository for the Workflow entity.
- `IWorkflowRunRepository` - **Should be implemented by the CSP.** Repository for the Workflow Run entity.
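To make the SPI split more tangible, here is a rough Java sketch of how the two CSP-implemented repositories might be shaped. The method names and the `WorkflowMetadata`/`WorkflowRun` records are illustrative assumptions, not the actual OSDU signatures:

```java
import java.util.List;

// Illustrative-only shapes for the CSP-implemented SPI repositories described above.
// Types and method names are assumptions for the sketch, not the real OSDU interfaces.
interface IWorkflowMetadataRepository {
  WorkflowMetadata createWorkflow(WorkflowMetadata workflowMetadata);
  WorkflowMetadata getWorkflow(String workflowName);
  void deleteWorkflow(String workflowName);
  List<WorkflowMetadata> getAllWorkflowsForTenant(String prefix);
}

interface IWorkflowRunRepository {
  WorkflowRun saveWorkflowRun(WorkflowRun workflowRun);
  WorkflowRun getWorkflowRun(String workflowName, String runId);
  WorkflowRun updateWorkflowRun(WorkflowRun workflowRun);
  List<WorkflowRun> getAllRunInstancesOfWorkflow(String workflowName);
}

// Minimal placeholder entities so the sketch is self-contained.
record WorkflowMetadata(String workflowName, String description, String createdBy) {}
record WorkflowRun(String workflowName, String runId, String status, long submittedAt) {}
```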
Milestone: M1 - Release 0.1

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/118
ADR: New API to handle System workflows (preeti singh [Microsoft], 2023-12-15)

**Context:**
===
System workflows are workflows that are available to all data partitions. Any System workflow can be triggered and retrieved by any tenant, but it can be created, updated, or deleted only by a user with a special privilege (say, a system role).
This mainly concerns the workflows or DAGs that OSDU ships with.
**How it's done today:**
===
- There is no concept of system workflows.
- The workflow metadata is stored in a partition-specific Cosmos collection.
**Issue with current design:**
===
- The behavior of the create API endpoint would change and could confuse users if the same endpoint were used for both system and private workflows. Users might end up unknowingly creating a system workflow by passing the data-partition-id of the special partition.
- It is difficult to manage updates coming from the OSDU community if we copy or replicate the information across all customer partitions.
**Proposal:**
===
There are two types of workflows in the system: System workflows and Private workflows. The proposal is to create a new API endpoint to register System workflows.
- The new API shall be exposed at `workflow/system`.
- To **create/update/delete** System workflows - `/workflow/system` endpoint shall be used
- To **Get/Trigger** System workflows, existing workflow service endpoint must be used.
- The authorization of the new endpoint shall differ from the existing groups; service-principal-based authorization will be used.
- The new API shall not accept data-partition-id as a header; the service will know where System workflows are stored.
- This API should interact *only* with System workflows. It should not have access to other workflows.
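A minimal sketch of how a privileged deployer might call the proposed endpoint. The exact path, payload fields, and token handling are assumptions derived from the proposal bullets above, not the finalized API; note that no data-partition-id header is sent:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Hypothetical call to the proposed /workflow/system endpoint. Per the proposal,
// the request carries no data-partition-id header: system workflows are partition-agnostic.
public class SystemWorkflowRegistration {
  public static void main(String[] args) throws Exception {
    String body = "{\"workflowName\":\"Osdu_ingest\","
        + "\"description\":\"Community-provided manifest ingestion DAG\","
        + "\"registrationInstructions\":{\"dagName\":\"Osdu_ingest\"}}";

    HttpRequest request = HttpRequest.newBuilder()
        .uri(URI.create("https://<osdu-host>/api/workflow/v1/workflow/system")) // assumed path
        .header("Authorization", "Bearer <service-principal-token>")            // privileged caller
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(body))
        .build();

    HttpResponse<String> response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString());
    System.out.println(response.statusCode() + " " + response.body());
  }
}
```

Reads and triggers of the registered system workflow would still go through the existing per-partition endpoints, as stated above.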
**Sequence Diagram for createWorkflow**
![createWorkflow](/uploads/da01a45cf14062aad0a5cfc48bd51c3d/createWorkflow.png)
**Sequence Diagram for getallWorkflows**
![getallWorkflows](/uploads/f767cc09ea1b27baeb6137e6dbdd9959/getallWorkflows.png)
**Sequence Diagram for getWorkflow option 1** (this option was finalized)
![getWorkflowOption1](/uploads/55dcd0f4ece4c0df83f5bef057c1cfbb/getWorkflowOption1.png)
**Sequence Diagram for getWorkflow option 2**
![getWorkflowOption2](/uploads/5ee667cb7226a62a05a03a8effa27988/getWorkflowOption2.png)

Milestone: M9 - Release 0.12

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/158
A custom header 'x-user-id' is used in core part (Riabokon Stanislav (EPAM) [GCP], 2023-11-08)
I wanted to bring to your attention an issue that was identified by our GC Team while they were in the process of addressing https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/157.
org.opengroup.osdu.workflow.service.WorkflowRunServiceImpl#addUserId
```java
  private Map<String, Object> addUserId(String workflowName, TriggerWorkflowRequest request) {
    final Map<String, Object> executionContext = request.getExecutionContext();
    if (executionContext.get(KEY_USER_ID) != null) {
      String errorMessage = String.format("Request to trigger workflow with name %s failed because execution context contains reserved key 'userId'", workflowName);
      throw new AppException(400, "Failed to trigger workflow run", errorMessage);
    }
    String userId = dpsHeaders.getUserId();
    log.debug("putting user id: " + userId + " in execution context");
    executionContext.put(KEY_USER_ID, userId);
    return executionContext;
  }
```
The current logic relies on a custom header that is primarily intended for use at an infrastructural level, as outlined in https://community.opengroup.org/osdu/platform/data-flow/ingestion/home/-/issues/52. The GC team approved an ADR with the understanding that this custom header would not be utilized within the core codebase.
However, as indicated in https://community.opengroup.org/osdu/platform/deployment-and-operations/helm-charts-azure/-/merge_requests/366, a header named 'x-user-id' is populated with data from 'x-on-behalf-of' using a specific rule. This mechanism aligns with the requirements of the CSP provider but may not be entirely suitable for the Core Part of the Workflow Service.
```lua
if (jwt_authn[msft_issuer]["appid"] == serviceAccountClientId and on_behalf_of_header ~= nil and on_behalf_of_header ~= '') then
request_handle:headers():add("x-user-id", request_handle:headers():get("x-on-behalf-of"))
else
request_handle:headers():add("x-user-id", jwt_authn[msft_issuer]["appid"])
end
```
This logic introduces **three key issues**:
- The core part of the Workflow service depends on a CSP-specific custom header to build the execution context, which may not be in alignment with the intended architecture.
- The Workflow service may not operate correctly without Istio and the accompanying special rule, potentially limiting its usability.
- There is a security concern: 'x-user-id' is not currently validated on the backend side, so any user can set it to act under another identity.
_As for the third problem_, consider this test case:
1. A user is authorized within the Workflow Service.
1. This user sends 'x-user-id' with another user's name, and the workflow is triggered under the identity of that other user.
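One way to close that gap is to validate the header against the authenticated principal in the core code. A minimal sketch, assuming the token subject can be resolved from the `Authorization` header; the class and helper methods are hypothetical, not existing OSDU code:

```java
// Hypothetical guard for the third issue: do not trust 'x-user-id' blindly.
// 'resolveTokenSubject' and 'isTrustedServiceAccount' are illustrative helpers,
// not existing OSDU methods.
public class UserIdResolver {

  public String resolveUserId(String xUserIdHeader, String authorizationHeader) {
    String tokenSubject = resolveTokenSubject(authorizationHeader); // who actually authenticated
    if (xUserIdHeader == null || xUserIdHeader.isBlank()) {
      return tokenSubject;
    }
    // Only a trusted service identity may act on behalf of another user.
    if (isTrustedServiceAccount(tokenSubject)) {
      return xUserIdHeader;
    }
    if (!xUserIdHeader.equals(tokenSubject)) {
      throw new IllegalArgumentException("x-user-id does not match the authenticated principal");
    }
    return tokenSubject;
  }

  private String resolveTokenSubject(String authorizationHeader) {
    // Placeholder: decode the JWT and return its 'sub'/'appid' claim here.
    return "token-subject";
  }

  private boolean isTrustedServiceAccount(String subject) {
    // Placeholder: compare against the configured service principal id.
    return false;
  }
}
```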
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/92
Workflow property validation should be done at the API level (Matt Wise, 2021-07-27)

Currently, it seems it is left up to each provider implementation to properly validate fields like 'WorkflowName'.
This validation should instead be done at the API level, in the core code.
Example: WorkflowName regex check in provider code
![image](/uploads/af09db72f6882c102853ab8751a70873/image.png)
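A common way to move such a check into the core API layer is bean validation on the request model, for example (the regex, message, and class name are illustrative assumptions, not the actual OSDU constraint):

```java
import javax.validation.constraints.NotBlank;
import javax.validation.constraints.Pattern;

// Illustrative request model: the WorkflowName rule is enforced once at the API
// layer instead of being re-implemented in each provider module.
public class CreateWorkflowRequest {

  @NotBlank
  @Pattern(regexp = "^[a-zA-Z0-9._-]{1,64}$",
           message = "workflowName must be 1-64 characters: letters, digits, '.', '_' or '-'")
  private String workflowName;

  private String description;

  public String getWorkflowName() { return workflowName; }
  public void setWorkflowName(String workflowName) { this.workflowName = workflowName; }
  public String getDescription() { return description; }
  public void setDescription(String description) { this.description = description; }
}
```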
Milestone: M7 - Release 0.10

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/74
ADR: Workflow Service Environment Standardization (Alan Henson, 2021-03-23)

## Context
Providing consistent workflow runtime environments enables DAGs (Directed Acyclic Graphs) to be written once and run across any standardized workflow service environment. There are some differences in the Workflow Service environments built for R3, so we must agree on the versions of the major components of the Workflow Service to achieve standardization.
## Scope
- All Workflow Service implementations should operate with the same `major.minor` version of Airflow.
- All Workflow Service implementations should operate with the same `major.minor` Python version within Airflow.
- All Workflow Service DAG Operators should be authored to run with the same `major.minor` Python version within Airflow.
## Decision
Standardize on the following Workflow Service component versions
| Component | Version |
| --------- | ------- |
| Airflow | 1.10.x |
| Airflow Python Runtime | 3.6.x |
| DAG Operator Python Development Version | 3.6.x |
## Rationale
- Workflows (DAGs) written against the standard will be portable to all standardized Workflow Service runtime environments.
## Consequences
- Workflow Service implementers may have to change Airflow and Python versions and re-test developed workflows (DAGs).

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/159
ADR: Implement Airflow facade endpoint (Riabokon Stanislav (EPAM) [GCP], 2024-01-08)
# Context
OSDU Platform uses Apache Airflow for orchestration of various data ingestion and processing jobs.
# Problem statement
Currently the OSDU Airflow component does not support data isolation for multi-tenant deployments. The Airflow administrative UI is available to all users and makes it possible to observe the processing data of all existing tenants, which may cause data leaks and security issues.
# Proposal of the solution
It is proposed to introduce a facade that replaces the Airflow admin UI and collects job execution information (namely the resulting XCom variables) in a tenant-specific way via the Airflow REST API. To do this, we need to add a new endpoint to the Workflow service API that collects the details of a DAG run using the existing Airflow 2 REST API.
The new API endpoint `/v1/workflow/{workflow_name}/workflowRun/{runId}/lastInfo` should implement the following business logic (a rough sketch follows the list):
![image-2023-10-18_17-48-20](/uploads/44f53a3de410b8dff0276b127387f29a/image-2023-10-18_17-48-20.png)
- Get the internal workflow run entity with getWorkflowRunByName and check that submittedBy corresponds to the user supplied in the header; otherwise return 401 NOT_AUTHORIZED.
- Get the list of all task instances with `/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances`, where dag_id is workflow_name and dag_run_id is runId.
- Select the task instance with the maximal end_date.
- Using the task_id of the selected task instance, get the list of XCom entry keys from `/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/xcomEntries`.
- Obtain the XCom values for these keys using `/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/xcomEntries/{xcom_key}`.
- Return the task instance details from step 3 combined with the XCom values map in a single JSON response.
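A rough sketch of the facade logic following the steps above. The Airflow base URL, authentication, and JSON handling are assumptions, and parsing is left as placeholders; the class name is illustrative:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Illustrative facade steps: fetch task instances and XCom entries for one DAG run.
public class AirflowLastInfoFacade {

  private final HttpClient http = HttpClient.newHttpClient();
  private final String airflowBase; // e.g. https://<airflow-host>/api/v1 (assumption)

  public AirflowLastInfoFacade(String airflowBase) {
    this.airflowBase = airflowBase;
  }

  public String getLastInfo(String workflowName, String runId) throws Exception {
    // Step 2: list task instances for the run (dag_id = workflowName, dag_run_id = runId).
    String taskInstances = get("/dags/" + workflowName + "/dagRuns/" + runId + "/taskInstances");

    // Step 3: pick the task with the maximal end_date (JSON parsing omitted here).
    String lastTaskId = pickTaskWithMaxEndDate(taskInstances);

    // Steps 4-5: list the XCom keys of that task, then fetch each value.
    String xcomEntries = get("/dags/" + workflowName + "/dagRuns/" + runId
        + "/taskInstances/" + lastTaskId + "/xcomEntries");

    // Step 6: combine task details and XCom values into one JSON response (simplified).
    return "{\"taskId\":\"" + lastTaskId + "\",\"xcomEntries\":" + xcomEntries + "}";
  }

  private String get(String path) throws Exception {
    HttpRequest request = HttpRequest.newBuilder()
        .uri(URI.create(airflowBase + path))
        .header("Authorization", "Basic <credentials>") // assumption: basic auth to Airflow
        .GET()
        .build();
    return http.send(request, HttpResponse.BodyHandlers.ofString()).body();
  }

  private String pickTaskWithMaxEndDate(String taskInstancesJson) {
    // Placeholder: parse the JSON and return the task_id with the latest end_date.
    return "last_task";
  }
}
```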
Milestone: M23 - Release 0.26

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/157
Pass workflow user ID to the Airflow as part of payload (Riabokon Stanislav (EPAM) [GCP], 2023-11-08)

This issue was discovered by GC Team when the QA Team was testing a platform.
It revolves around triggering workflows and the addition of the User ID into the execution context through the 'x-user-id' header.
Upon further investigation, we came across the MR https://community.opengroup.org/osdu/platform/deployment-and-operations/helm-charts-azure/-/merge_requests/366, which appears to implement this logic with a dependency at the infrastructure level.
However, we have to add some kind of validation or additional logic in the core code to use a 'user' header (a possible approach is sketched after the code excerpt below). This adjustment is essential because we might want to run the service without a service mesh or similar infrastructure.
org.opengroup.osdu.workflow.service.WorkflowRunServiceImpl#addUserId
```java
  private Map<String, Object> addUserId(String workflowName, TriggerWorkflowRequest request) {
    final Map<String, Object> executionContext = request.getExecutionContext();
    if (executionContext.get(KEY_USER_ID) != null) {
      String errorMessage = String.format("Request to trigger workflow with name %s failed because execution context contains reserved key 'userId'", workflowName);
      throw new AppException(400, "Failed to trigger workflow run", errorMessage);
    }
    String userId = dpsHeaders.getUserId();
    log.debug("putting user id: " + userId + " in execution context");
    executionContext.put(KEY_USER_ID, userId);
    return executionContext;
  }
```
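A possible direction for the mesh-independent behavior requested above: resolve the user from the verified access token when no 'x-user-id' header has been injected. This is only a sketch; the `Headers` interface stands in for the real DpsHeaders accessor and the JWT helper is a placeholder:

```java
// Sketch of mesh-independent user resolution for the execution context.
public class ExecutionUserResolver {

  interface Headers {
    String getUserId();        // value of 'x-user-id', if the platform injected it
    String getAuthorization(); // raw Authorization header
  }

  public String resolveExecutionUser(Headers headers) {
    String headerUserId = headers.getUserId();
    if (headerUserId != null && !headerUserId.isBlank()) {
      return headerUserId; // still subject to the validation discussed in issue 158
    }
    // Fallback when no service mesh sets 'x-user-id': use the verified token subject.
    return subjectFromAccessToken(headers.getAuthorization());
  }

  private String subjectFromAccessToken(String authorizationHeader) {
    // Placeholder: decode the JWT and return its 'sub' (or equivalent) claim.
    return "jwt-subject";
  }
}
```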
Milestone: M21 - Release 0.24

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/155
Workflow Run API - Providing incorrect data partition Id in payload does not trigger workflow but gives 200 response (Surabhi Seth, 2023-08-03)

API: Workflow Service API \> Workflow Run /workflow/{workflow_name}/workflowRun
![image.png](/uploads/46a0c21db8f5f38c5fd62c233cfff6f9/image.png){width=916 height=473}
If an incorrect dataPartitionId is passed in the payload, the workflow trigger does not happen.
Actual:
The status code of the trigger API is nevertheless 200.
Expected:
A non-200 status code (4xx or 5xx) should be returned.

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/147
Update Status end point is publishing INGESTOR stage GSM messages (devesh bajpai, 2022-08-11)

The update status endpoint in the Workflow service is publishing "INGESTOR"-stage GSM messages. The update status endpoint can be called by a user, and in that case the "INGESTOR" stage in the published GSM message does not seem valid. The Workflow service could instead publish GSM messages with a "WORKFLOW" stage to give a clear distinction regarding the source of the GSM message.

Milestone: M13 - Release 0.16

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/146
Azure - One Service IT Test is flaky (harshit aggarwal, 2022-05-31)
TestWorkflowRunV3Integration.updateWorkflowRunStatus_should_returnSuccess_when_givenValidRequest_StatusRunning()
This test seems to be flaky and fails occasionally; see:
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/jobs/1041557

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/145
workflow_id instead of dag_name (Riabokon Stanislav (EPAM) [GCP], 2022-05-20)

The Workflow Service passes _workflow_id_ instead of _dag_name_ when it tries to get the dag_run_status.

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/144
WhiteSource update (Maksim Malkov, 2022-08-23)

Update the `core` and `azure` modules according to the WhiteSource reports.

Milestone: M12 - Release 0.15

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/143
Status publisher incorrectly sets status to FAILED (Morris Estepa, 2022-04-20)

The ingestion workflow is incorrectly publishing the status of a CSV ingestion as FAILED when a DAG attempts to set the workflow status to FINISHED. The problem occurs because "org.opengroup.osdu.workflow.service.WorkflowRunServiceImpl" calls the publishStatusWithUnexpectedErrors method when it receives "finished" as the workflow status in its logUpdatedStatus method.
Replication steps:
1) Subscribe to ingestion workflow status topic (SNS topic in AWS) to receive status messages.
2) Ingest a CSV file.
Ingestion workflow will publish 3 messages with the following statuses:
* SUBMITTED
* IN_PROGRESS
* FAILED
The third status should have been SUCCESS.
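A possible shape of the fix, routing the publication on the reported status instead of always taking the error path. Only publishStatusWithUnexpectedErrors is named in the report; the success-path method, enum, and class are placeholders:

```java
// Hypothetical routing of GSM status publication based on the reported workflow status.
public class StatusPublisherRouter {

  enum WorkflowStatus { SUBMITTED, RUNNING, FINISHED, FAILED }

  interface GsmPublisher {
    void publishStatusSucceeded(String runId);
    void publishStatusWithUnexpectedErrors(String runId, String reason);
  }

  void logUpdatedStatus(GsmPublisher publisher, String runId, WorkflowStatus status) {
    switch (status) {
      case FINISHED -> publisher.publishStatusSucceeded(runId);           // previously reported FAILED
      case FAILED   -> publisher.publishStatusWithUnexpectedErrors(runId, "DAG reported failure");
      default       -> { /* SUBMITTED / RUNNING: intermediate states, no terminal message */ }
    }
  }
}
```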
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/141
Incorrect Info endpoint [GONRG-4528] (Denis Karpenok (EPAM), 2022-03-31)

```
GET 'https://preship.gcp.gnrg-osdu.projects.epam.com/api/workflow/v1/info' - 404, "Not Found",
GET 'https://preship.gcp.gnrg-osdu.projects.epam.com/api/workflow/info' - works well
```
Expected:
`GET 'https://preship.gcp.gnrg-osdu.projects.epam.com/api/workflow/v1/info' - works well`

Milestone: M12 - Release 0.15

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/140
Airflow 2.0. A new dag state 'queued' (Riabokon Stanislav (EPAM) [GCP], 2022-02-18)
Response from https://airflow.apache.org/api/v1/dags/{dag_id}/dagRuns with Airflow 2.0 contains
a new dag state 'queued'
`Changed in version 2.1.3: 'queued' is added as a possible value.`
https://airflow.apache.org/docs/apache-airflow/stable/stable-rest-api-ref.html#operation/post_dag_run
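If the service maps Airflow DAG-run states to workflow run statuses, the new 'queued' value needs an explicit mapping. A sketch under that assumption (the enum names are illustrative, not the actual OSDU types):

```java
// Illustrative mapping of Airflow 2.x dag-run states to workflow run statuses,
// including the 'queued' state added in Airflow 2.1.3. Enum names are assumptions.
public class DagRunStateMapper {

  enum WorkflowRunStatus { SUBMITTED, RUNNING, FINISHED, FAILED }

  static WorkflowRunStatus fromAirflowState(String airflowState) {
    switch (airflowState) {
      case "queued":  return WorkflowRunStatus.SUBMITTED; // new in Airflow 2.1.3
      case "running": return WorkflowRunStatus.RUNNING;
      case "success": return WorkflowRunStatus.FINISHED;
      case "failed":  return WorkflowRunStatus.FAILED;
      default:
        throw new IllegalArgumentException("Unknown Airflow dag run state: " + airflowState);
    }
  }
}
```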
Milestone: M11 - Release 0.14

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/135
Upgrade to Log4J 2.17 (David Diederich, 2021-12-21)

The Apache Foundation released another Log4j 2 update, version 2.17, which addresses a denial-of-service vulnerability.
This issue tracks progress to upgrade this dependency for this project.

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/134
Log4J Expedient Updates and Patches (David Diederich, 2021-12-16)

This issue associates MRs that were applied to this project quickly to get a patched version ready as soon as possible. The intent is to provide a reference point for later, more thoughtful analysis.

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/133
Update dependencies according to WhiteSource reports [SLB] (Maksim Malkov, 2022-05-10)

This is just a regular update raised by the WhiteSource check we have conducted on the SLB side.
Dependencies updates for:
* root pom
* core module pom
* azure module pom

Milestone: M10 - Release 0.13

https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/132
Add a version of Airflow into an endpoint 'info' for Workflow Service [GONRG-3777] (Kateryna Kurach (EPAM), 2021-12-15)
Add a version of Airflow into an endpoint 'info' for Workflow Service
Add v1 into /api/workflow/info
Expected path:
{workflow}
/api/workflow/v1/info

Milestone: M10 - Release 0.13