diff --git a/README.md b/README.md
index 12c4e2afcb79262f432f52082f1c03521d449b2e..dea7c79c669046c2aebd56e7afd63e22407c5e09 100644
--- a/README.md
+++ b/README.md
@@ -29,17 +29,40 @@
 
 ## Introduction
 
-The OSDU R2 Prototype includes a Workflow Engine, an implementation of Apache Airflow, to orchestrate business
-processes. In particular, the Workflow Engine handles ingestion of opaque and well log .las files in OSDU R2.
-
-The Workflow Engine encompasses the following components:
-
-* Opaque Ingestion DAG
-* OSDU Ingestion DAG
-* Workflow Status Operator
-* Stale Jobs Scheduler
-* Workflow Finished Sensor Operator
+This project is a set of Apache Airflow DAG implementations that orchestrate data ingestion within the OSDU platform.
+The following DAGs are implemented:
 
+* Osdu_ingest - R3 Manifest Ingestion DAG
+* Osdu_ingest_r2 - R2 Manifest Ingestion DAG
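+
+For illustration, a manifest ingestion DAG has roughly the shape sketched below. This is a minimal, hypothetical example: the operator choice, task logic, and all names are assumptions, not the project's actual implementation.
+
+```python
+# Illustrative sketch only; not the real Osdu_ingest code.
+from datetime import datetime
+
+from airflow import DAG
+from airflow.operators.python_operator import PythonOperator  # Airflow 1.10.x import path
+
+
+def process_manifest(**context):
+    # A real ingestion task would validate the manifest and call the OSDU
+    # services; here we only read it from the dag_run configuration.
+    manifest = context["dag_run"].conf.get("manifest")
+    print(f"Received manifest: {manifest}")
+
+
+with DAG(
+    dag_id="Osdu_ingest_example",     # hypothetical id
+    start_date=datetime(2021, 1, 1),
+    schedule_interval=None,           # triggered per ingestion request
+) as dag:
+    PythonOperator(
+        task_id="process_manifest",
+        python_callable=process_manifest,
+        provide_context=True,         # required on Airflow 1.10.x
+    )
+```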
 
 ## Deployment
 
@@ -48,15 +42,54 @@ GCP provides Cloud Composer a fully managed workflow orchestration service built
 
 To deploy the Ingestion DAGs on GCP Cloud Composer, upload the files from the */src* folder into *DAGS_FOLDER* and *PLUGINS_FOLDER* respectively, inside the DAG bucket provided by the Composer environment. [More info in the documentation.](https://cloud.google.com/composer/docs/quickstart#uploading_the_dag_to)
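+
+The upload can also be scripted; a minimal sketch using the *google-cloud-storage* client is shown below. The bucket name and local paths are placeholders, not values from this project:
+
+```python
+# Sketch: copy local DAG and plugin files into the Composer environment bucket.
+from pathlib import Path
+
+from google.cloud import storage  # pip install google-cloud-storage
+
+
+def upload(bucket_name: str, src_dir: str, dst_prefix: str) -> None:
+    bucket = storage.Client().bucket(bucket_name)
+    for path in Path(src_dir).rglob("*.py"):
+        # Composer picks up DAGs under dags/ and plugins under plugins/.
+        blob = bucket.blob(f"{dst_prefix}/{path.relative_to(src_dir)}")
+        blob.upload_from_filename(str(path))
+
+
+upload("us-central1-example-bucket", "src/dags", "dags")        # hypothetical layout
+upload("us-central1-example-bucket", "src/plugins", "plugins")
+```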
 
-*DAGS_FOLDER* and *FLUGINS_FOLDER* are setting up in airflow.cfg file.
+*DAGS_FOLDER* and *PLUGINS_FOLDER* are set up by Composer itself.
 
 According to the [DAG implementation details](#dag-implementation-details), the [osdu_api] directory must be placed into the *DAGS_FOLDER*. Moreover, all required variables have to be set in the Airflow meta store through the Variables mechanism. [List of the required variables](#required-variables).
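+
+Inside a DAG, such variables are read through the standard Airflow Variables API. The key below is a placeholder, not one of the project's actual required variables:
+
+```python
+# Sketch: reading a required setting from the Airflow meta store.
+from airflow.models import Variable
+
+# "example_required_variable" is a placeholder key; see the list of
+# required variables for the real keys.
+value = Variable.get("example_required_variable", default_var=None)
+```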
 
-### Installing Python Dependencies
+#### Installing Python Dependencies
 Environment dependencies can be installed in several ways:
-1. Installing a Python dependency from PyPI. Cloud Composer picks up *requirements.txt* file from the DAGs bucket.
-2. Setting up an environment into the Cloud Composer Console.
-3. Installing local Python library. Put your dependencies into *DAG_FOLDER/libs* directory. Airflow automatically adds *DAG_FOLDER* and *PLUGINS_FOLDER* to the *PATH*.
+1. Setting up the environment in the Cloud Composer Console.
+2. Installing a local Python library: put your dependencies into the *DAG_FOLDER/libs* directory. Airflow automatically adds *DAG_FOLDER* and *PLUGINS_FOLDER* to the Python path, as shown in the sketch below.
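+
+Because *DAG_FOLDER* is on the Python path, a library vendored under *DAG_FOLDER/libs* can be imported directly from DAG code. The module name here is hypothetical:
+
+```python
+# Sketch: importing a vendored dependency from DAG_FOLDER/libs.
+# "manifest_utils" is a hypothetical module placed there by hand.
+from libs import manifest_utils
+```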
 
 
 ## DAG Implementation Details