Source / Fake Messages Producer
Read the Stream Setup and initialize resources (containers need to be initialized) so that the source stream can be received by the parser, and the parser can in turn feed the messages into the Kafka source topic.
This should be a Python app packaged as a Docker container that takes its input parameters as environment variables and uses them to do the following.

Input parameters passed as env variables:
- bootstrap.servers - list of brokers to bootstrap the Kafka connection
- OSDU_STREAMS_SUBSCRIBEIDS - the list of message keys to monitor in the source topic and route to the sink topic
- OSDU_STREAMS_SOURCEBINDINGS - the list of source topics to read messages from
- OSDU_STREAMS_SINKBINDINGS - the list of sink topics to write messages to
On start-up, extract the parameters from the env variables:
- bootstrap.servers = localhost:9092 (list of brokers to bootstrap the Kafka connection)
- OSDU_STREAMS_SUBSCRIBEIDS = "opendes:work-product-component--WellLog:be54a691c0384182944d71c6b2b6f699" (list of parent entities to be used to key messages)
- OSDU_STREAMS_SOURCEBINDINGS = wss://localhost:8080 (not used in the fake producer; will be the connection string to the remote ETP server)
- OSDU_STREAMS_SINKBINDINGS = "opendes_wks_work-product-component--WellLog_1.0.0" (list of topics to write messages to)
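A minimal sketch of the start-up parsing described above. Two assumptions are labeled in the code: the bootstrap.servers parameter is assumed to be exposed under the env var name BOOTSTRAP_SERVERS (dots are awkward in shell env names), and the list-valued variables are assumed to be comma-separated:

```python
import os

def load_config(env=os.environ) -> dict:
    """Read the producer's input parameters from environment variables.

    Assumptions: bootstrap.servers arrives as BOOTSTRAP_SERVERS, and the
    list-valued parameters are comma-separated strings.
    """
    return {
        "bootstrap.servers": env.get("BOOTSTRAP_SERVERS", "localhost:9092"),
        "subscribe_ids": env.get("OSDU_STREAMS_SUBSCRIBEIDS", "").split(","),
        "source_bindings": env.get("OSDU_STREAMS_SOURCEBINDINGS", "").split(","),
        "sink_bindings": env.get("OSDU_STREAMS_SINKBINDINGS", "").split(","),
    }
```

Passing a plain dict instead of os.environ makes the function easy to unit-test without touching the real environment.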
Then:
- Establish a connection to Kafka using the bootstrap servers and topic information.
- Run an infinite loop that generates fake well-logging data, serializes each measurement to Avro, assigns the SubscribeID value as the message key, and pushes the messages to the SinkBinding topic.
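The loop above could be sketched as follows. This assumes confluent-kafka as the client library, and the measurement fields (depth, gammaRay, timestamp) are placeholders since the message schema is still to be discussed; JSON serialization stands in for Avro until that schema exists:

```python
import json
import random
import time

def fake_measurement(depth_m: float) -> dict:
    """Generate one fake well-log measurement.

    Field names are placeholders; the real message schema is TBD.
    """
    return {
        "depth": round(depth_m, 2),
        "gammaRay": round(random.uniform(0.0, 150.0), 3),
        "timestamp": int(time.time() * 1000),
    }

def run_producer(bootstrap: str, subscribe_id: str, sink_topic: str) -> None:
    """Infinite loop: generate fake data, key each message with the
    SubscribeID value, and push it to the SinkBinding topic."""
    from confluent_kafka import Producer  # assumed client library

    producer = Producer({"bootstrap.servers": bootstrap})
    depth = 0.0
    while True:
        record = fake_measurement(depth)
        # TODO: serialize to Avro once the schema is agreed;
        # JSON is a stand-in for this sketch.
        producer.produce(
            sink_topic,
            key=subscribe_id,
            value=json.dumps(record).encode("utf-8"),
        )
        producer.poll(0)  # serve delivery callbacks
        depth += 0.1
        time.sleep(1.0)   # arbitrary rate: one measurement per second
```

A call like run_producer(cfg["bootstrap.servers"], cfg["subscribe_ids"][0], cfg["sink_bindings"][0]) would wire this to the parsed env parameters; handling multiple SubscribeIDs/sink topics is left open.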
The schema of the message is still to be discussed!