OSDU Software issues (feed updated 2021-12-10T14:24:29Z)
https://community.opengroup.org/groups/osdu/-/issues
https://community.opengroup.org/osdu/platform/data-flow/real-time/streams/stream-admin-service/-/issues/6
Admin / Stop Stream (2021-12-10T14:24:29Z, Dmitry Kniazev)
Implement the GET /stream/{id}/stop method of the streaming API in the [controller](https://community.opengroup.org/osdu/platform/data-flow/real-time/streams/stream-admin-service/-/blob/develop/src/main/java/org/opengroup/osdu/streaming/api/StreamingAdminControllerImpl.java) using the DeploymentAdminService (to be created):
- [ ] get stream definition record from the storage service using the stream id (input parameter)
- [ ] extract deployment id from ExtensionProperties
- [ ] call DeploymentAdminService::stopDeployment method passing deployment id
- [ ] handle exceptions and return the appropriate HTTP code
- [ ] create tests for each possible return code
Stephen Nimmo
https://community.opengroup.org/osdu/platform/data-flow/real-time/streams/stream-admin-service/-/issues/8
Source / Fake Messages Producer (2021-10-22T14:49:11Z, Sunil Garg)
Read the Stream Setup and initialize resources (containers need to be initialized) to make sure that the source stream can be received by the parser and the parser can further feed the messages to the Kafka source topic.
----
This should be a Python app **packaged as a Docker container** that takes its input parameters as env variables and uses them to do the following:
```
bootstrap.servers - list of brokers to bootstrap kafka connection
OSDU_STREAMS_SUBSCRIBEIDS - the list of message keys to monitor in the source topic and route to sink topic
OSDU_STREAMS_SOURCEBINDINGS - the list of source topics to read messages from
OSDU_STREAMS_SINKBINDINGS - the list of sink topics to write the messages to
```
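A minimal sketch of how these variables might be read at start-up. The variable names and defaults come from the list above; the helper function itself is a hypothetical illustration, not part of the service:

```python
import os

def read_stream_config(environ=os.environ):
    """Read the streaming parameters from env variables.

    Comma-separated values are split into lists. The defaults mirror the
    example values in this issue and are assumptions, not fixed values.
    """
    def as_list(name, default):
        return [v.strip() for v in environ.get(name, default).split(",") if v.strip()]

    return {
        "bootstrap.servers": environ.get("bootstrap.servers", "localhost:9092"),
        "subscribe_ids": as_list(
            "OSDU_STREAMS_SUBSCRIBEIDS",
            "opendes:work-product-component--WellLog:be54a691c0384182944d71c6b2b6f699"),
        "source_bindings": as_list("OSDU_STREAMS_SOURCEBINDINGS", "wss://localhost:8080"),
        "sink_bindings": as_list(
            "OSDU_STREAMS_SINKBINDINGS",
            "opendes_wks_work-product-component--WellLog_1.0.0"),
    }
```

Passing an explicit dict instead of `os.environ` keeps the parsing logic easy to unit-test.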
- [x] On start-up extract the parameters from env variables:
- [x] bootstrap.servers = localhost:9092 (list of brokers to bootstrap Kafka connection)
- [x] OSDU_STREAMS_SUBSCRIBEIDS = "opendes:work-product-component--WellLog:be54a691c0384182944d71c6b2b6f699" (list of parent entities to be used to key messages)
- [x] OSDU_STREAMS_SOURCEBINDINGS = wss://localhost:8080 (not used in the fake producer; will be the connection string to the remote ETP server)
- [x] OSDU_STREAMS_SINKBINDINGS = "opendes_wks_work-product-component--WellLog_1.0.0" (list of topics to write messages to)
- [x] establish connection with Kafka using bootstrap servers and topics information
- [x] run an infinite loop generating fake well-logging data: serialize each measurement to [Avro](https://avro.apache.org/docs/current/gettingstartedpython.html), assign the SubscribeID value as the message key, and push it to the SinkBinding topic.
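The body of that loop could be sketched as below. The field names are placeholders, since the message schema is still being discussed in this issue; a real producer would serialize the value with Avro and publish the (key, value) pair to the SinkBinding topic via a Kafka client:

```python
import json
import random
import time

def fake_welllog_measurement(subscribe_id):
    """Generate one fake well-log measurement keyed by the SubscribeID.

    The field names are placeholders (the actual message schema is TBD).
    A real producer would serialize the value with Avro instead of JSON.
    """
    value = {
        "timestamp": time.time(),
        "depth_m": round(random.uniform(1000.0, 5000.0), 2),
        "gamma_ray_api": round(random.uniform(0.0, 200.0), 2),
    }
    # Kafka messages are (key, value) pairs; keying by SubscribeID keeps
    # all measurements for one parent entity in the same partition.
    return subscribe_id, json.dumps(value).encode("utf-8")
```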
**Schema of the message is to be discussed!**
Douglas Dohmeyer
https://community.opengroup.org/osdu/platform/data-flow/ingestion/ingestion-dags/-/issues/92
Manifest-based Ingestion - add persistable reference information by using content from Reference data (2022-12-09T10:18:24Z, Debasis Chatterjee)
Current approach: specify the persistable reference inside the load manifest (JSON file).
We propose that the Manifest Ingestion process get this information from Reference data and add that row "on the fly" just before using the Storage Service to create the new record. This way, the Data Loader does not need to add this information manually.
cc - @Keith_Wall , @ChrisZhang , @epeysson , @Kateryna_Kurach, @chad , @blasscoc (for information)
```
"meta": [
{
"kind": "Unit",
"name": "ms",
"persistableReference": "{\"abcd\":{\"a\":0.0,\"b\":0.001,\"c\":1.0,\"d\":0.0},\"symbol\":\"ms\",\"baseMeasurement\":{\"ancestry\":\"T\",\"type\":\"UM\"},\"type\":\"UAD\"}",
"unitOfMeasureID": "{{data-partition-id}}:reference-data--UnitOfMeasure:ms:",
"propertyNames": [
"RecordLength"
]
},
```
For example, see this retrieval query.
Body
```
{
"kind": "{{data-partition-id}}:wks:reference-data--UnitOfMeasure:1.0.0",
"limit":5000,
"query": "id: \"{{data-partition-id}}:reference-data--UnitOfMeasure:ms\"",
"returnedFields": ["*"]
}
```
Response
```
{
"results": [
{
"data": {
"AttributionPublication": "Energistics Unit of Measure Dictionary V1.0",
"PersistableReference": "{\"abcd\":{\"a\":0.0,\"b\":0.001,\"c\":1.0,\"d\":0.0},\"symbol\":\"ms\",\"baseMeasurement\":{\"ancestry\":\"T\",\"type\":\"UM\"},\"type\":\"UAD\"}",
"InactiveIndicator": false,
"UnitDimensionCode": "T",
"UnitQuantityID": "odesprod:reference-data--UnitQuantity:T:",
"Code": "ms",
"Source": "Workbook Published/UnitOfMeasure.1.0.0.xlsx; commit SHA c1d72417.",
"Name": "millisecond",
"AttributionAuthority": "Energistics",
"IsBaseUnit": false,
"AttributionRevision": "1.0",
"ID": "ms",
"CoefficientC": 1.0,
"UnitDimensionName": "time",
"CoefficientD": 0.0,
"CoefficientA": 0.0,
"CoefficientB": 0.001
},
"kind": "odesprod:wks:reference-data--UnitOfMeasure:1.0.0",
"source": "wks",
"acl": {
"viewers": [
"data.default.owners@odesprod.osdu-gcp.go3-nrg.projects.epam.com"
],
"owners": [
"data.default.owners@odesprod.osdu-gcp.go3-nrg.projects.epam.com"
]
},
"type": "reference-data--UnitOfMeasure",
"version": 1617286097841123,
"createTime": "2021-04-01T14:08:17.885Z",
"authority": "odesprod",
"namespace": "odesprod:wks",
"legal": {
"legaltags": [
"odesprod-demo-legaltag"
],
"otherRelevantDataCountries": [
"US"
],
"status": "compliant"
},
"createUser": "osdu-sa-airflow-composer@osdu-service-prod.iam.gserviceaccount.com",
"id": "odesprod:reference-data--UnitOfMeasure:ms"
}
],
"totalCount": 1
}
```
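The proposal could be sketched as a pure function that builds the `meta` entry from a UnitOfMeasure record shaped like the search response above. The helper name is hypothetical; the real manifest-ingestion step would call the Search/Storage services and merge the entry just before record creation:

```python
def build_meta_entry(uom_record, partition_id, property_names):
    """Build a frame-of-reference 'meta' entry from a UnitOfMeasure
    reference-data record (shape as in the search response above).

    Hypothetical helper for illustration only.
    """
    data = uom_record["data"]
    return {
        "kind": "Unit",
        "name": data["Code"],
        "persistableReference": data["PersistableReference"],
        # Reference IDs end with a trailing colon, as in the manifest example.
        "unitOfMeasureID": "{}:reference-data--UnitOfMeasure:{}:".format(
            partition_id, data["Code"]),
        "propertyNames": list(property_names),
    }
```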
Enclosed is an end-to-end example of Unit Conversion (Frame of Reference) in GCP, R3M8, using the manifest-based Ingestion process.
[OSDU_PTP_M8_TeamA_GCP-Manifest-Ingestion-unit-convert-Debasis-Naufal.txt](/uploads/73b4e9e586c9c7649d8581fd84e86100/OSDU_PTP_M8_TeamA_GCP-Manifest-Ingestion-unit-convert-Debasis-Naufal.txt)
https://community.opengroup.org/osdu/platform/data-flow/ingestion/segy-to-vds-conversion/-/issues/6
Provide new DAG to show overview information from freshly created VDS file (after conversion) (2021-10-05T18:49:50Z, Debasis Chatterjee)
We can think of two alternatives.
This can be run automatically at the end of successful conversion.
Or there can be an independent DAG performing this step.
cc : @mstormo , @ChrisZhang
https://community.opengroup.org/osdu/platform/data-flow/ingestion/segy-to-vds-conversion/-/issues/7
Provide new DAG to perform SegY export, on demand (2021-10-05T18:52:46Z, Debasis Chatterjee)
This will allow a round trip of the data: a SegY file was imported and a VDS file was created; this step would allow the Data Loader to recreate the SegY file.
In addition, the program could accept user input to subset by inline and crossline ranges. This would allow the user to get a partial export from the original dataset.
@mstormo , @ChrisZhang for information
https://community.opengroup.org/osdu/platform/deployment-and-operations/audit-and-metrics/-/issues/37
SeismicDMS API's Error in Azure Preshipping Environment r3m8 (2021-10-06T18:55:19Z, Ananth T)
@manishk @todaiks As per the recent discussion (with Kamlesh) and further execution/analysis, we are unable to move forward with the SeismicDMS APIs in the Azure Preshipping environment r3m8. Only the Legal Tag call (new seismic store) returns a response; the subsequent API calls end up with 404 errors.
The details of the results have been enclosed. Please kindly review the PPT and revert with a possible solution.
[Azure_Preshipr3m8_Errors_in_SeismicDMS_Collection_1021.pdf](/uploads/e0a6660ba2718c71c0cf47b6eee9af56/Azure_Preshipr3m8_Errors_in_SeismicDMS_Collection_1021.pdf)
MANISH KUMAR
https://community.opengroup.org/osdu/platform/security-and-compliance/entitlements/-/issues/86
List Group API returns direct parent of the given group ID (2021-10-06T23:50:28Z, An Ngo)
The current List Group API returns the direct parent of a given group, which we did not intend.
This was an accidental feature.
This needs to be done separately with a new API.
https://community.opengroup.org/osdu/platform/data-flow/ingestion/energistics/witsml-parser/-/issues/44
OSS ETP Server Java Contribution (2021-10-07T02:12:45Z, Welly Tambunan, coach@wellytambunan.com)
Hi All,
I'm trying to contribute etp-java; I'm not sure if this repo is the right place to do so, as it specifically focuses on RESQML etc.
Is there a place where I can discuss an ETP contribution? Thanks.
Here's the initial repo; it's just a skeleton for now.
https://github.com/weltam/etp-java
cc @debasisc @Siarhei_Khaletski
Cheers
https://community.opengroup.org/osdu/platform/pre-shipping/-/issues/106
Loading of 50,000 records using Osdu_ingest DAG for IBM platform - job did not finish, was left running for more than 48 hours (2022-08-23T12:49:24Z, Kamlesh Todai)
The job submitted to ingest 50,000 records did not complete after being allowed to run for more than 48 hours. Looking at the Airflow log, it appeared to be saving the records, but it never returned a status of finished, failed, or success. It always returned a status of running, and one could see the Airflow log getting updated.
https://community.opengroup.org/osdu/platform/data-flow/real-time/streams/stream-admin-service/-/issues/9
Processor / Messages Router (2021-10-25T16:53:20Z, Dmitry Kniazev)
This is a simple Kafka Streams app (Java) that reads messages from a source topic and routes them (with no change) to a sink topic.
Input parameters are passed as env variables:
```
bootstrap.servers - list of brokers to bootstrap kafka connection
OSDU_STREAMS_SUBSCRIBEIDS - the list of message keys to monitor in the source topic and route to sink topic
OSDU_STREAMS_SOURCEBINDINGS - the list of source topics to read messages from
OSDU_STREAMS_SINKBINDINGS - the list of sink topics to write the messages to
```
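The actual app is a Java Kafka Streams topology, but the routing rule implied by the parameters above can be sketched language-neutrally: forward a (key, value) message unchanged to the sink topic when its key is one of the monitored SubscribeIDs, otherwise drop it. The function name is a hypothetical illustration:

```python
def route(message, subscribe_ids, sink_topic):
    """Routing rule for the messages router.

    Forwards a (key, value) message unchanged to the sink topic when its
    key is one of the monitored SubscribeIDs; returns None (drop) otherwise.
    In the Java Kafka Streams app this would be a filter on the key
    followed by writing to the sink topic.
    """
    key, value = message
    if key in subscribe_ids:
        return sink_topic, key, value  # payload passes through unchanged
    return None
```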
- [ ] On start-up extract the parameters from env variables:
- [ ] bootstrap.servers = localhost:9092 (list of brokers to bootstrap Kafka connection)
- [ ] OSDU_STREAMS_SUBSCRIBEIDS = "opendes:work-product-component--WellLog:be54a691c0384182944d71c6b2b6f699"
- [ ] OSDU_STREAMS_SOURCEBINDINGS = "opendes_wks_work-product-component--WellLog_1.0.0"
- [ ] OSDU_STREAMS_SINKBINDINGS = "opendes_myApp--WellLog_1.0.0"
- [ ] establish connection with Kafka using bootstrap servers and topics information
- [ ] create the topology to route messages from source topic to sink topic with no changes
- [ ] create tests for the Kafka Streams app
Dmitry Kniazev
https://community.opengroup.org/osdu/platform/data-flow/real-time/streams/stream-admin-service/-/issues/10
Sink / Web Messages Consumer (2021-10-29T16:40:48Z, Dmitry Kniazev)
Write a simple web app (Python|Node|Java) **packaged as a Docker container** that connects to Kafka, consumes messages from a topic, and visualizes this information on a web page.
Input parameters are passed as env variables:
```
bootstrap.servers - list of brokers to bootstrap kafka connection
OSDU_STREAMS_SUBSCRIBEIDS - the list of message keys to monitor in the source topic and route to sink topic
OSDU_STREAMS_SOURCEBINDINGS - the list of source topics to read messages from
OSDU_STREAMS_SINKBINDINGS - the list of sink topics to write the messages to
```
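On the consumer side, turning one consumed payload into a display row might look like the following sketch. A JSON payload and placeholder field names are assumed here for illustration; the actual messages are Avro-encoded per the producer issue, so a real consumer would decode Avro against the agreed schema:

```python
import json

def format_measurement(payload):
    """Turn one consumed message payload into a display row for the web page.

    Assumes a JSON payload with placeholder field names; a real consumer
    would decode Avro per the agreed message schema.
    """
    m = json.loads(payload.decode("utf-8"))
    return "depth {:.1f} m | GR {:.1f} API".format(m["depth_m"], m["gamma_ray_api"])
```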
- [ ] On start-up extract the parameters from env variables:
- [ ] bootstrap.servers = localhost:9092 (list of brokers to bootstrap Kafka connection)
- [ ] OSDU_STREAMS_SUBSCRIBEIDS = "opendes:work-product-component--WellLog:be54a691c0384182944d71c6b2b6f699"
- [ ] OSDU_STREAMS_SOURCEBINDINGS = "opendes_myApp--WellLog_1.0.0"
- [ ] OSDU_STREAMS_SINKBINDINGS = "http://localhost:8080" (not used in this application)
- [ ] establish connection with Kafka using bootstrap servers and topics information
- [ ] run an infinite loop reading messages from the source topic, extracting the measurements from each Avro message and visualizing them on the page
- [ ] create tests for the app
Douglas Dohmeyer
https://community.opengroup.org/osdu/platform/data-flow/real-time/streams/stream-admin-service/-/issues/12
Admin / Info Stream (2021-11-12T15:02:24Z, Dmitry Kniazev)
Implement the GET /stream/{id}/info method of the streaming API in the [controller](https://community.opengroup.org/osdu/platform/data-flow/real-time/streams/stream-admin-service/-/blob/develop/src/main/java/org/opengroup/osdu/streaming/api/StreamingAdminControllerImpl.java) using the DeploymentAdminService (to be created):
- [ ] get stream definition record from the storage service using the stream id (input parameter)
- [ ] extract deployment id from ExtensionProperties
__needs brainstorming: how to collect the status of the deployment__
- [ ] handle exceptions and return the appropriate HTTP code
- [ ] create tests for each possible return code
Stephen Nimmo
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/home/-/issues/19
sdutil help to show recursive listing "ls -r" (2022-07-27T10:22:38Z, Debasis Chatterjee)
When working in the GCP Preship environment, @Yan_Sushchynski showed me this possibility.
I think this could be useful for "Help" to show as an advanced option.
```
> python sdutil [command]
available commands:
* app : application authorization utilities
* auth : authentication utilities
* config : manage the utility configuration
* cp : copy data to(upload)/from(download)/in(copy) seismic store
* ls : list subprojects and datasets
* mk : create a subproject resource
* mv : move a dataset in seismic store
* patch : patch a seismic store subproject or dataset
* rm : delete a subproject or a space separated list of datasets
* stat : print information like size, creation date, legal tag(admin) for a space separated list of tenants, subprojects or datasets
* unlock : remove a lock on a seismic store dataset
* user : user authorization utilities
* version : print the sdutil version
(sdutilenv) C:\seismic-store-sdutil-master>
```
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-sdutil/-/issues/12
[GCP] Can't download data from Seismic Store if it consists of more than one file (2023-03-30T16:54:59Z, Yan Sushchynski (EPAM))
After uploading an oVDS dataset to Seismic Store with the SEGY->oVDS converter, I want to download the result to my local machine.
But I got this error:
![image](/uploads/ebb3058d89b9dd27ddf937bd259b8125/image.png)
As I understand it, `sdutil` uses the `gcsurl` of the dataset and attempts to download it directly from the bucket, but it cannot do this if the dataset consists of more than one file.
This is how the oVDS dataset looks in the bucket
![image](/uploads/361a48c79afa88e2542dd269d4cb94b8/image.png)
https://community.opengroup.org/osdu/platform/system/indexer-service/-/issues/39
Enhance documentation to explain how one can troubleshoot issues in Indexer (Normalizer) (2023-03-09T18:15:53Z, Debasis Chatterjee)
Please see related issue #32.
Suggest adding suitable notes in Core Services (Indexer) documentation.
https://community.opengroup.org/osdu/documentation/-/wikis/Core-Services-Overview
cc - @nthakur for information
https://community.opengroup.org/osdu/platform/data-flow/real-time/streams/stream-admin-service/-/issues/13
Source / Fake ETP Producer (2021-12-10T14:23:57Z, Dmitry Kniazev)
Tahir Ali
https://community.opengroup.org/osdu/platform/deployment-and-operations/audit-and-metrics/-/issues/38
Data Ingestion Status (2021-10-18T11:41:34Z, Stephen Whitley (Invited Expert))
# Breakdown of Data Ingestion by Status
Error: violates an ingestion rule.
Failure: the workflow fails.
Success: the ingestion job completes correctly.
## SLI Group
- [ ] Data Access & Egress
- [ ] Data Access Rights
- [ ] Data Governance
- [X] Data Ingest
- [ ] Data Quality
- [x] Data Volume
- [ ] Delivery
- [ ] Platform Performance
- [ ] Platform Traction
- [ ] Search
- [ ] Other
## How Measured
Ingestion Logs
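A sketch of how this SLI might be computed from the ingestion logs. The log entry shape (a dict with a `status` field) is an assumption; the exact shape depends on the platform's ingestion logging:

```python
from collections import Counter

def ingestion_status_breakdown(log_entries):
    """Count ingestion runs per status (Error / Failure / Success).

    Assumes each log entry is a dict with a 'status' field. Returns
    {status: (count, fraction_of_total)}.
    """
    counts = Counter(e.get("status", "Unknown") for e in log_entries)
    total = sum(counts.values())
    return {s: (n, n / total if total else 0.0) for s, n in counts.items()}
```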
## Dependencies
https://community.opengroup.org/osdu/platform/deployment-and-operations/audit-and-metrics/-/issues/39
Elasticity: Autoscaling (2021-10-18T11:43:23Z, Stephen Whitley (Invited Expert))
# SLI Title
## SLI Group
- [ ] Data Access & Egress
- [ ] Data Access Rights
- [ ] Data Governance
- [ ] Data Ingest
- [ ] Data Quality
- [ ] Data Volume
- [ ] Delivery
- [X] Platform Performance
- [X] Platform Cost
- [ ] Platform Traction
- [ ] Search
- [ ] Other
## How Measured
Capture Autoscaling events (scale up/scale down)
## Dependencies
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/wellbore/wellbore-domain-services/-/issues/26
memory leak with data chunking (2023-03-16T20:34:29Z, Yunhua Koglin)
Hello, when we use Dask for the data chunking features, memory is not released.
![image__1_](/uploads/7e3da10faa83fe89d243992cddfad3d7/image__1_.png)
https://community.opengroup.org/osdu/platform/domain-data-mgmt-services/seismic/seismic-dms-suite/seismic-store-service/-/issues/37
Support auth with access_token (2023-03-27T19:17:39Z, Aleksandr Spivakov (EPAM))
Currently the service supports only id_token for authorization. It would be good to also support access_token.