# Data Ingestion issues
https://community.opengroup.org/groups/osdu/platform/data-flow/ingestion/-/issues

## [Parsers] Integrate developed RESQML Parser into Ingestion Framework
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/63
Kateryna Kurach (EPAM), 2021-06-23

The RESQML Parser is being developed by Energistics. This user story tracks the effort needed to support integration of the RESQML Parser into the Ingestion Framework.

## [Parsers] Preparing contract and Requirements for Parser Integration
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/50
Kateryna Kurach (EPAM), 2021-06-23

Lessons learned from the Energistics demo.
We need to prepare a contract and requirements for a generic Parser integration.
The following issues have to be covered:
1. Technology stack
2. Workflow execution context
3. Responsibilities and functionality to implement
4. Testing guidelines
5. Requirements for parsers (e.g., a parser should work on file content, not a file path)
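As an illustration of requirement 5, a generic parser contract could take raw file content rather than a path. This is only a sketch under assumptions: the names `Parser`, `parse`, and `CsvParser` are hypothetical and not the actual OSDU contract.

```python
from typing import Protocol


class Parser(Protocol):
    """Hypothetical contract for an ingestion parser.

    Requirement 5 above says a parser must work on file content,
    not a file path, so the entry point takes raw bytes.
    """

    # Declared runtime so the framework can schedule it (requirement 1)
    technology: str

    def parse(self, content: bytes, context: dict) -> list:
        """Parse raw file content into derived records.

        `context` carries workflow execution details (requirement 2);
        the return value is a list of record dicts to be stored.
        """
        ...


class CsvParser:
    """Toy implementation satisfying the hypothetical contract."""

    technology = "python"

    def parse(self, content: bytes, context: dict) -> list:
        # Decode the content in memory; no filesystem access needed
        lines = content.decode("utf-8").splitlines()
        header = lines[0].split(",")
        return [dict(zip(header, row.split(","))) for row in lines[1:]]
```

Because the contract never touches the filesystem, the same parser can be fed from object storage, a message payload, or a test fixture without change.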
As a result we should have:
- contract
- documentation with requirements for OSDU parsers
- "How-to" integration documentaionhttps://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/33Ingestion Job Notification2021-03-10T22:34:23ZKateryna Kurach (EPAM)Ingestion Job NotificationThis user story covers the following business scenarios:
- Ability to produce a notification regarding the status of Ingestion Job execution or / and it's part
- Ability to notify a user via email, text etc - Requirements have to be crea...This user story covers the following business scenarios:
- Ability to produce a notification on the status of an Ingestion Job execution and/or its parts
- Ability to notify a user via email, text message, etc. Requirements have to be created here.
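One possible shape for such a status notification, sketched as a minimal serializable payload; the class and field names are assumptions for illustration, not an agreed OSDU schema.

```python
import json
from dataclasses import dataclass, asdict
from typing import Optional


@dataclass
class IngestionJobNotification:
    """Hypothetical notification payload for an Ingestion Job."""

    job_id: str
    status: str            # e.g. "RUNNING", "FINISHED", "FAILED"
    step: Optional[str]    # which part of the job this refers to, if any
    message: str

    def to_json(self) -> str:
        # Serialized form that could be published to a topic,
        # emailed, or sent as a text message by a delivery channel
        return json.dumps(asdict(self))
```

A delivery mechanism (the Notification Service referenced below, email, SMS) would then only need to consume this one payload format.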
This functionality can be achieved by implementing a solution using the Notification Service described in https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/16, or by implementing a totally different approach.

Assignee: Dmitriy Rudko

## [Parsers] Develop LAS 2.0 - Well Log Parser
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/23
Kateryna Kurach (EPAM), 2021-06-23

## [Parsers] Develop LAS Well Log Parser
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-workflow/-/issues/22
Kateryna Kurach (EPAM), 2021-06-23

## Ingestion Framework Guiding Principles
https://community.opengroup.org/osdu/platform/data-flow/ingestion/home/-/issues/18
Stephen Whitley (Invited Expert), 2023-03-09
These guiding principles are shared by all the SLB-authored user stories reflecting the capabilities of OpenDES:
* Simple, Dedicated and Efficient APIs as ingestion entry points for each data format
* Store original high-fidelity data as is. Data in its original form must land first in the most appropriate store: for example, a DLIS file must land in the File DMS and a ZGY seismic survey in the Seismic DMS. Once the data lands in the appropriate DMS, Parser/Scanner/Enrichment processes can be applied to it; the original data is kept as-is, and these processes output derived entities.
* The framework should inherently enforce the data lifecycle: Original -> Well Known Structure -> Well Known Entity
* Extensibility of Parsers/Scanner/Enrichment processes through Configurations/Registrations
* Track the flow and lifecycle of data in the data platform

Milestone: M1 - Release 0.1
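The routing and lifecycle principles above can be sketched as a small configuration-driven registry. Everything here is an illustrative assumption: the registry keys, store names, and stage labels mirror the examples in the principles but are not the actual OpenDES configuration schema.

```python
# Hypothetical registration table mapping a data format to the store
# where the original must land and the parser/scanner that applies.
REGISTRY = {
    "dlis": {"landing_store": "File DMS", "parser": "dlis-parser"},
    "zgy": {"landing_store": "Seismic DMS", "parser": "zgy-scanner"},
    "las": {"landing_store": "File DMS", "parser": "las-well-log-parser"},
}

# The lifecycle the framework should inherently enforce
LIFECYCLE = ["Original", "Well Known Structure", "Well Known Entity"]


def route(file_format: str) -> dict:
    """Look up where original data must land and which parser applies."""
    try:
        return REGISTRY[file_format]
    except KeyError:
        raise ValueError(f"no registration for format {file_format!r}")


def next_stage(current: str) -> str:
    """Enforce forward-only movement through the data lifecycle."""
    i = LIFECYCLE.index(current)
    if i == len(LIFECYCLE) - 1:
        raise ValueError("already at the final lifecycle stage")
    return LIFECYCLE[i + 1]
```

New formats are supported by adding a registration entry rather than changing framework code, which is the extensibility-through-configuration idea in the fourth principle.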