...
- Check the system for dependencies such as Maven, Java, Docker, and docker-compose
- Package the demo producer and consumer applications and build their Docker images
- Start the Docker containers in the same Docker network with docker-compose
- Once Strimzi Kafka is up and running, the user can run ./runproducer.sh and ./runconsumer.sh manually in separate shells, or use demo.sh to start the producer and consumer together
- The script registers the information type type1, which is predefined in the demo application, with ICS
- The script registers the producer info with ICS
- The script registers the consumer job info with ICS
- ICS triggers the demo application through its callbacks
- The demo application produces data
- The script displays the Docker logs of the producer callback function of ICS
- The script displays the Docker logs of the demo applications
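The dependency check in the first step above can be sketched as a small shell loop. The command names (mvn, java, docker, docker-compose) are assumptions derived from the dependency list; the actual script may probe versions differently:

```shell
#!/bin/sh
# Sketch of a dependency check, assuming the tools listed above are
# invoked as mvn, java, docker, and docker-compose on this system.
for cmd in mvn java docker docker-compose; do
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "$cmd: found"
  else
    echo "$cmd: MISSING"
  fi
done
# A real script would exit non-zero on a missing tool so the demo stops early.
```

Each tool is reported individually, so a user can see at a glance which prerequisite is missing before the Docker images are built.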
Terminology:
- Information Type: Represents the types of data that can be produced by data producers and consumed by data consumers.
- Information Job: Represents an active data subscription by a data consumer, specifying the type of data to be produced and additional parameters for filtering.
- Data Consumer: Represents entities that consume data and manage data subscription jobs.
- Data Producer: Represents entities that produce data.
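To illustrate how these terms fit together, an Information Job subscription typically carries a small JSON body naming the Information Type, the owning Data Consumer, and a delivery target. The field names and values below are assumptions for illustration, modeled on the ICS data-consumer API rather than taken from the demo itself:

```shell
# Hypothetical Information Job body for the ICS data-consumer API;
# field names and values are assumptions for illustration only.
cat <<'EOF'
{
  "info_type_id": "type1",
  "job_owner": "demo-consumer",
  "job_definition": {},
  "job_result_uri": "http://consumer:8080/info"
}
EOF
```

Here type1 is the Information Type registered by the demo, and job_definition would hold any additional filtering parameters the consumer wants applied.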
APIs offered by ICS:
- A1-EI API: Data consumer EI job registration and EI job status callbacks (EI = Enrichment Information)
- Data producer API: Registration of data producers and the information types they can produce
- Data consumer API: Registration of information jobs and discovery of available information types
- Service status API: Returns the current status of the ICS service
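The REST surface a script would exercise for these APIs can be sketched as follows. The base URL and paths are assumptions modeled on the NONRTRIC ICS REST API and may differ in a given deployment:

```shell
#!/bin/sh
# Hypothetical ICS endpoints for the APIs listed above; the base URL
# and path layout are assumptions, not taken from the demo scripts.
ICS="http://localhost:8083"
echo "Producer API: PUT $ICS/data-producer/v1/info-types/type1"
echo "Producer API: PUT $ICS/data-producer/v1/info-producers/demo-producer"
echo "Consumer API: PUT $ICS/data-consumer/v1/info-jobs/job1"
echo "Status API:   GET $ICS/status"
```

In the demo flow described earlier, the first three calls correspond to registering the type, the producer, and the consumer job with ICS before data starts flowing.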