Requirements
Maven 3
Java 17
Docker and Docker Compose
The Code, which can be cloned here
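Before building, it can help to confirm the required tools are on your PATH. A minimal sketch (this helper is not part of the TEIV repository, and it only checks availability, not versions):

```shell
# Hypothetical helper, not part of the TEIV repo: reports whether a
# required tool is available on PATH.
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "found: $1"
  else
    echo "missing: $1"
  fi
}

# Check the tools listed in the Requirements section.
for t in mvn java docker docker-compose; do
  check_tool "$t"
done
```

Remember to also confirm versions manually (e.g. `java -version` should report 17, `mvn -v` should report Maven 3).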
Building
Once the code is cloned and the requirements are installed, you can start building TEIV and its components.
To build TEIV and its components with tests, run the following command in the top-level /teiv directory:
mvn clean install
Building with tests should create the following docker images:
o-ran-sc/smo-teiv-ingestion:latest
o-ran-sc/smo-teiv-exposure:latest
To build without tests (significantly faster), run the following command in the top-level /teiv directory. Use this when building with custom YANG models:
mvn clean install -Dmaven.test.skip=true
Building without tests should create the following docker images:
o-ran-sc/smo-teiv-ingestion:latest
o-ran-sc/smo-teiv-exposure:latest
Running
Only TEIV itself is needed to run TEIV, but the pgsql-schema-generator is provided to generate PostgreSQL schemas from the YANG models, which can then be used in TEIV.
TEIV
In the /teiv/docker-compose directory, run the following command to bring TEIV up:
docker-compose up
Running 'docker ps -a' should show the following docker containers:
kafka-producer confluentinc/cp-kafka:7.6.1
kafka confluentinc/cp-kafka:7.6.1
kafka2 confluentinc/cp-kafka:7.6.1
kafka3 confluentinc/cp-kafka:7.6.1
topology-ingestion-inventory o-ran-sc/smo-teiv-ingestion:latest
topology-exposure-inventory o-ran-sc/smo-teiv-exposure:latest
zookeeper confluentinc/cp-zookeeper:6.2.1
dbpostgresql postgis/postgis:13-3.4-alpine
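One way to check that the key containers came up is to filter the container names. The sketch below runs against a sample listing so it is self-contained; on a live host, replace the `names` variable with the output of `docker ps --format '{{.Names}}'`:

```shell
# Sample of the container names 'docker ps' is expected to show
# (taken from the table above); on a live host use:
#   names=$(docker ps --format '{{.Names}}')
names='kafka-producer
kafka
kafka2
kafka3
topology-ingestion-inventory
topology-exposure-inventory
zookeeper
dbpostgresql'

# Report whether each essential container appears in the listing.
for c in topology-ingestion-inventory topology-exposure-inventory dbpostgresql; do
  if printf '%s\n' "$names" | grep -qx "$c"; then
    echo "up: $c"
  else
    echo "MISSING: $c"
  fi
done
```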
Once TEIV is running, there are some sample queries to try at Sample TEIV Queries.
The /teiv/docker-compose directory has everything needed to get TEIV up and running, consisting of the following files and directories:
docker-compose.yml - contains all the services needed to run TEIV
sql_scripts - contains the SQL scripts produced by the pgsql-schema-generator
cloudEventProducer - contains the cloudEventProducer script, which produces Kafka events to populate TEIV when run locally, and cloudEventProducerForDockerCompose, which is used by the kafka-producer service in docker-compose.yml; the actual events are in the events directory
copySqlSchemaFromPgsqlGenerator.sh - copies the SQL schema generated by the pgsql-schema-generator to the sql_scripts directory, renaming files and replacing placeholders
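The copy-rename-replace step performed by copySqlSchemaFromPgsqlGenerator.sh can be pictured with a small sketch. Everything here (the file name, the `__SCHEMA_NAME__` placeholder, the `00_` prefix) is illustrative only, not what the actual script uses:

```shell
# Illustrative only: mimic a copy/rename/placeholder-replace step in
# temporary directories. The real script's names and placeholders differ.
src=$(mktemp -d)
dst=$(mktemp -d)

# A stand-in for a generated schema file containing a placeholder.
printf 'CREATE SCHEMA %s;\n' '__SCHEMA_NAME__' > "$src/model.sql"

for f in "$src"/*.sql; do
  # Replace the placeholder and copy under a new (prefixed) name.
  sed 's/__SCHEMA_NAME__/teiv_data/' "$f" > "$dst/00_$(basename "$f")"
done

cat "$dst/00_model.sql"
```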
docker-compose.yml includes an optional kafka-producer service that automatically runs the cloudEventProducerForDockerCompose script to populate TEIV. Comment it out if this is not desired.
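Disabling the service looks roughly like this (the exact service definition in docker-compose.yml may differ; this is only a sketch, with the image name taken from the container table above):

```yaml
services:
  # Commented out so no events are produced automatically:
  # kafka-producer:
  #   image: confluentinc/cp-kafka:7.6.1
  #   ...
  kafka:
    image: confluentinc/cp-kafka:7.6.1
```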
pgsql-schema-generator
The default SQL scripts in sql_scripts were built from TEIV's default YANG models, provided in /teiv/teiv/src/main/resources/models, using the pgsql-schema-generator.
To run the pgsql-schema-generator yourself, with the default YANG models or your own, copy the YANG models into the /teiv/pgsql-schema-generator/src/main/resources/generate-defaults directory and run the following from /teiv/pgsql-schema-generator/:
mvn exec:java -Dexec.mainClass="org.oran.smo.teiv.pgsqlgenerator.DatabaseSchemaGeneratorApplication"
Once it has run successfully, the SQL schemas should be present in /teiv/pgsql-schema-generator/target (highlighted in red below), and graphs representing the entities and relationships of the YANG models should be present in /teiv/pgsql-schema-generator/target/graphs (highlighted in blue below):
There is a script in /teiv/docker-compose, copySqlSchemaFromPgsqlGenerator.sh, that will copy the generated SQL schemas to /teiv/docker-compose/sql_scripts, rename them, and replace the placeholders.
NB: if you generate your own SQL schemas from YANG models that differ from the default YANG, the example events will not work, so the kafka-producer service must be commented out in docker-compose.yml.