Release G: Running Python SDK based Clamp K8s Tests


The it/dep repository (smo-install/test/pythonsdk) contains the Python SDK and the O-RAN tests. The O-RAN tests should be executed against the full ONAP and NONRTRIC installation.

Two kinds of tests are available in this repository:

  • Unit tests - These test the Python SDK itself. Python and tox are sufficient to run them.
  • O-RAN tests - These tests are based on the O-RAN use cases. Each test requires specific components to be up and running.

Prerequisites for running the tests

The tests for the O-RAN use cases require specific components to be up and running. It is better to have all the ONAP and NONRTRIC related components deployed, as this covers all possible test scenarios.

The ONAP and NONRTRIC related components can be installed using the installation instructions found in smo-install/README.md.

The folder smo-install/helm-override contains different configurations for the ONAP and NONRTRIC deployments.

To bring up the setup for the Python tests, the smo-install/helm-override/pythonsdk-tests configuration should be used.

The installation command to bring up the setup (ONAP and NONRTRIC) is as follows:

./dep/smo-install/scripts/layer-2/install-oran.sh pythonsdk-tests

After the installation, all the containers in the ONAP and NONRTRIC namespaces should be up and running.
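A quick way to verify this is to list the pods in both namespaces (assuming the default onap and nonrtric namespace names used by smo-install):

kubectl get pods -n onap
kubectl get pods -n nonrtric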

Running the tests

Tox is used to run the test cases; the detailed steps can be found in smo-install/test/pythonsdk/README.md.
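If tox is not yet available on the machine running the tests, it can usually be installed with pip; this is an assumption about the environment rather than a step from the README:

pip3 install tox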

How Tox finds the test files

Tox considers a file to be a test file when its filename starts with "test_". The same pattern based matching applies to the individual test cases (functions) as well.

If you want to disable a test file or test case, rename it so that it no longer starts with "test_".
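For example, to temporarily disable the a1sim O-RAN test file (the filename is inferred from the test report further below, and the new name is only an illustration):

cd smo-install/test/pythonsdk/src/orantests
mv test_a1sim.py disabled_a1sim.py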

Example of running the unit tests:
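A minimal sketch of the command, assuming the tox environment is called unit-tests (the name is inferred from the report path reports/junit/unit-tests.xml) and that it is run from the top of the test folder:

cd smo-install/test/pythonsdk
tox -e unit-tests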


Example of running the O-RAN tests (a1sim and clamp k8s):
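A similar sketch for the O-RAN tests, assuming the tox environment is called oran-tests (inferred from the report path reports/junit/oran-tests.xml) and that it is run from the orantests folder, where the debug log and report are generated:

cd smo-install/test/pythonsdk/src/orantests
tox -e oran-tests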

Viewing the logs

The unit tests do not generate any logs at the time of writing this document. The O-RAN tests generate their logs in smo-install/test/pythonsdk/src/orantests/pythonsdk.debug.log.

This file (pythonsdk.debug.log) contains all the logs from the test preparation as well as from the tests themselves.
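To follow the log while the O-RAN tests are running, a standard tail on that file is enough:

tail -f smo-install/test/pythonsdk/src/orantests/pythonsdk.debug.log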

Test reports

Test reports are generated as part of each test execution. These reports are available in the following locations:

  • Unit tests - smo-install/test/pythonsdk/reports/junit/unit-tests.xml
  • O-RAN tests - smo-install/test/pythonsdk/src/orantests/reports/junit/oran-tests.xml
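To check the overall result without opening the whole report, the testsuite summary line can be filtered out, assuming the reports are formatted as in the samples below:

grep "<testsuite " smo-install/test/pythonsdk/src/orantests/reports/junit/oran-tests.xml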

Example of unit test report

unit-tests.xml
<?xml version="1.0" encoding="utf-8"?>
<testsuites>
    <testsuite name="pytest" errors="0" failures="0" skipped="0" tests="51" time="1.735" timestamp="2022-11-18T13:03:14.836752" hostname="kubeflow-poc-vm-1">
        <testcase classname="unit-tests.test_a1policymanagement" name="test_initialization" time="0.005" />
        <testcase classname="unit-tests.test_a1policymanagement" name="test_check_status" time="0.003" />
        <testcase classname="unit-tests.test_a1policymanagement" name="test_get_policy_types" time="0.003" />
        <testcase classname="unit-tests.test_a1policymanagement" name="test_get_policy_type_agent" time="0.003" />
        <testcase classname="unit-tests.test_a1policymanagement" name="test_get_policy" time="0.003" />
        <testcase classname="unit-tests.test_a1policymanagement" name="test_create_service" time="0.002" />
        <testcase classname="unit-tests.test_a1policymanagement" name="test_create_policy" time="0.004" />
        <testcase classname="unit-tests.test_a1sim" name="test_initialization" time="0.001" />
        <testcase classname="unit-tests.test_a1sim" name="test_check_version" time="0.002" />
        <testcase classname="unit-tests.test_a1sim" name="test_check_status" time="0.003" />
        <testcase classname="unit-tests.test_a1sim" name="test_get_policy_number" time="0.005" />
        <testcase classname="unit-tests.test_a1sim" name="test_create_policy_type" time="0.003" />
        <testcase classname="unit-tests.test_clamp" name="test_initialization" time="0.001" />
        <testcase classname="unit-tests.test_clamp" name="test_get_template_instance" time="0.006" />
        <testcase classname="unit-tests.test_clamp" name="test_upload_commission" time="0.030" />
        <testcase classname="unit-tests.test_clamp" name="test_create_instance" time="0.004" />
        <testcase classname="unit-tests.test_clamp" name="test_get_template_instance_status" time="0.003" />
        <testcase classname="unit-tests.test_clamp" name="test_delete_template_instance" time="0.003" />
        <testcase classname="unit-tests.test_clamp" name="test_decommission_template" time="0.003" />
        <testcase classname="unit-tests.test_dmaap" name="test_initialization" time="0.001" />
        <testcase classname="unit-tests.test_dmaap" name="test_create_topic" time="0.002" />
        <testcase classname="unit-tests.test_dmaap" name="test_create_service" time="0.002" />
        <testcase classname="unit-tests.test_dmaap" name="test_send_link_failure_event" time="0.002" />
        <testcase classname="unit-tests.test_dmaap" name="test_get_result" time="0.002" />
        <testcase classname="unit-tests.test_dmaap" name="test_get_all_topics" time="0.001" />
        <testcase classname="unit-tests.test_enrichmentservice" name="test_initialization" time="0.001" />
        <testcase classname="unit-tests.test_enrichmentservice" name="test_check_status" time="0.002" />
        <testcase classname="unit-tests.test_enrichmentservice" name="test_get_eitypes" time="0.002" />
        <testcase classname="unit-tests.test_enrichmentservice" name="test_get_eitype_individual" time="0.002" />
        <testcase classname="unit-tests.test_enrichmentservice" name="test_get_eiproducers" time="0.002" />
        <testcase classname="unit-tests.test_enrichmentservice" name="test_get_eiproducer_individual" time="0.002" />
        <testcase classname="unit-tests.test_enrichmentservice" name="test_get_eiproducer_status" time="0.003" />
        <testcase classname="unit-tests.test_enrichmentservice" name="test_get_eijobs" time="0.003" />
        <testcase classname="unit-tests.test_enrichmentservice" name="test_get_eijob_individual" time="0.002" />
        <testcase classname="unit-tests.test_enrichmentservice" name="test_create_eitype" time="0.002" />
        <testcase classname="unit-tests.test_enrichmentservice" name="test_create_eiproducer" time="0.003" />
        <testcase classname="unit-tests.test_enrichmentservice" name="test_create_eijob" time="0.002" />
        <testcase classname="unit-tests.test_policy" name="test_initialization" time="0.001" />
        <testcase classname="unit-tests.test_policy" name="test_get_components_status" time="0.002" />
        <testcase classname="unit-tests.test_policy" name="test_get_policy_status" time="0.002" />
        <testcase classname="unit-tests.test_policy" name="test_get_policy" time="0.002" />
        <testcase classname="unit-tests.test_policy" name="test_create_policy" time="0.002" />
        <testcase classname="unit-tests.test_policy" name="test_deploy_policy" time="0.002" />
        <testcase classname="unit-tests.test_policy" name="test_undeploy_policy" time="0.002" />
        <testcase classname="unit-tests.test_policy" name="test_delete_policy" time="0.002" />
        <testcase classname="unit-tests.test_sdnc" name="test_initialization" time="0.001" />
        <testcase classname="unit-tests.test_sdnc" name="test_get_status" time="0.002" />
        <testcase classname="unit-tests.test_sdnc" name="test_get_odu_oru_status" time="0.002" />
        <testcase classname="unit-tests.test_sdnc" name="test_get_devices" time="0.002" />
        <testcase classname="unit-tests.test_sdnc" name="test_get_events" time="0.002" />
        <testcase classname="unit-tests.test_version" name="test_version" time="0.001" />
    </testsuite>
</testsuites>

Example of an O-RAN test report

oran-tests.xml
<?xml version="1.0" encoding="utf-8"?>
<testsuites>
    <testsuite name="pytest" errors="0" failures="0" skipped="0" tests="4" time="449.638" timestamp="2022-11-18T12:49:08.034715" hostname="kubeflow-poc-vm-1">
        <testcase classname="src.orantests.test_a1sim" name="test_a1sim" time="0.936" />
        <testcase classname="src.orantests.test_cl_k8s" name="test_cl_oru_app_deploy" time="194.274" />
        <testcase classname="src.orantests.test_cl_k8s" name="test_cl_odu_app_smo_deploy" time="136.291" />
        <testcase classname="src.orantests.test_cl_k8s" name="test_cl_odu_app_ics_deploy" time="114.686" />
    </testsuite>
</testsuites>

Clamp ACM K8s Tests

Policy clamp contains multiple participants. Creating an ACM instance with the proper configuration can trigger each participant to act on its part of the composition.

Policy clamp includes the k8sparticipant, which can take a helm chart as input in the commissioning configuration and manage the lifecycle of the resulting helm deployment.

At the time of writing this document, the Policy clamp k8s based tests bring up 3 rApps through ACM instance creation.

These tests configure the ONAP environment for testing and clean up the environment once the tests are completed. The Clamp based tests are available in the file smo-install/test/pythonsdk/src/orantests/test_cl_k8s.py.
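While test_cl_k8s.py is running, the rApp lifecycle can be observed from a second terminal by watching the pods; the nonrtric namespace used below is an assumption about where the rApps are deployed:

kubectl get pods -n nonrtric -w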

Steps involved in each ACM k8s test

The steps below assume that the ONAP and NONRTRIC containers are already up and running.

How the k8sparticipant whitelists the chart museum

The k8sparticipant has to be configured with the chart museum details so that the chart repository is whitelisted.

The installation with the "pythonsdk-tests" configuration whitelists the chart museum server that runs in the test namespace.
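During a test run, once the test preparation has deployed the chart museum server, it can be checked in the test namespace; the name filter below is an assumption (any pod name containing "chartmuseum"):

kubectl get pods -n test | grep -i chartmuseum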

Each ACM test consists of the following steps:

  1. Deploy the chart museum server in the test namespace in k8s
  2. Upload the charts required for the tests to the chart museum server
  3. Find the k8s participant pod running in the ONAP namespace and add the chart museum configuration to it
  4. Prepare the tosca commissioning payload with the chart details
  5. Upload the tosca to policy clamp
  6. Create the ACM instance in policy clamp; it gets created in the UNINITIALIZED state
  7. Request an ACM instance state update to PASSIVE. This should deploy the chart through the k8sparticipant
  8. Verify that the rApp container is up and running (see the verification sketch after this list)
  9. Request an ACM instance state update to UNINITIALIZED. This should undeploy the chart through the k8sparticipant
  10. Verify that the rApp container is down and removed
  11. Find the k8s participant pod running in the ONAP namespace and remove the chart museum configuration from it
  12. Undeploy the chart museum server
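A minimal way to perform the verification of steps 8 and 10 manually is to filter the pods for the rApp in question; both the nonrtric namespace and the oru-app name (matching the test case test_cl_oru_app_deploy) are assumptions:

# Step 8: the rApp pod should appear and reach the Running state
# Step 10: the same command should eventually return nothing
kubectl get pods -n nonrtric | grep -i oru-app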

Each of the above steps can be traced in the debug logs (pythonsdk.debug.log).
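For example, the chart museum related preparation steps can be filtered out of the log with a simple search; the search term is only an illustration and the actual log messages may differ:

grep -i chart smo-install/test/pythonsdk/src/orantests/pythonsdk.debug.log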