The Sales Kafka Mega Blog
You can find the whole source code over here.
Up to Part 2 we have seen how the sales-publisher-app produces data for Kafka. In this part we will consume the data that was produced. Let's first look at the project structure of the app, and then move on.
We have already discussed in the previous blog what RootConfig does. Here, with a small tweak, i.e. adding configuration for the consumer, we will write a config file.
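As a rough illustration of what that consumer-side tweak looks like, here is a minimal sketch of the consumer properties, assuming a local broker on the default port and the `sales-101` group mentioned later in this post. The exact keys and values in the real RootConfig may differ; refer to the GitHub code for the actual file.

```java
import java.util.Properties;

// Hypothetical consumer configuration, analogous to the producer-side RootConfig.
// Broker address and deserializer choices are assumptions; adjust to your setup.
public class SalesConsumerConfig {
    public static Properties consumerProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // default local broker
        props.put("group.id", "sales-101");               // consumer group used in this blog
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("auto.offset.reset", "earliest");       // read from the start on first run
        return props;
    }
}
```

These properties are what you would hand to a Kafka consumer factory when wiring the config into the main application.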
Next we will import it in the main application. Refer to the GitHub code linked above for the details.
Once that is done, we will write the KafkaConsumer. The code is largely self-explanatory; let's first look at it.
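To keep this post self-contained, here is a stdlib-only sketch of the listener logic. In the actual app this method body would live inside a method annotated with spring-kafka's `@KafkaListener` (with the topic and `groupId = "sales-101"` supplied on the annotation); the class and method names below are illustrative, not the real ones from the repo.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the consumer's message handler. The Spring/Kafka wiring is omitted;
// only the per-record handling logic is shown.
public class SalesListener {
    private final List<String> received = new ArrayList<>();

    // Called once per consumed record value.
    public void onMessage(String value) {
        // In the real consumer this is where the payload is mapped to a model
        // and stored or processed; here we just capture and log it.
        received.add(value);
        System.out.println("Consumed: " + value);
    }

    public List<String> received() {
        return received;
    }
}
```

The real listener does the same thing at its core: receive a record value, turn it into something useful, and hand it onward.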
This piece of code simply consumes the data arriving on the sales-publisher topic under the consumer group sales-101. Kafka's architecture is a big topic in its own right, so if you are not very familiar with it, either assume for now that we are given a Java client API for Kafka and move forward, or learn the Kafka architecture first and come back.
With this small change I think we are done with the consumer part. As soon as we run the app, it starts consuming messages. At the end we capture each incoming message in a model, which we can then store in a table or use live in any way we want.
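For a concrete picture of "capturing the incoming message in a model", here is a hypothetical `Sale` model with a tiny parser. The field names and the comma-separated payload format are assumptions for illustration; the real app may well carry the sale as JSON and map it with a library such as Jackson instead.

```java
// Illustrative model for an incoming sales message.
public class Sale {
    final String orderId;
    final double amount;

    Sale(String orderId, double amount) {
        this.orderId = orderId;
        this.amount = amount;
    }

    // Parse a simple "orderId,amount" payload into the model.
    static Sale fromMessage(String message) {
        String[] parts = message.split(",");
        return new Sale(parts[0], Double.parseDouble(parts[1]));
    }
}
```

Once the message lives in a model like this, persisting it to a table or feeding it to a live dashboard is just ordinary application code.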
Here are snapshots of the whole app in action:
Starting the Zookeeper server
Starting the Kafka server
Starting the order-maker app
Starting the sales-publisher (producer)
Starting the consumer to receive the data
Well, try it yourself and let us know in the comments. Bye for now.