712 Views · 5 Helpful · 4 Replies

Testing of NSO service package's YANG module and template

Priya Chaturvedi
Cisco Employee

Hi. I am new to NSO and am looking for ways to validate YANG modules and their related service templates. The use case is to let network engineers develop these for different service package deployments, with a mechanism to validate the YANG module and then check whether the XML template conforms to that module.

 

Are there any tools that I can utilise for this?

 

Also, how do network engineers check whether their YANG module and template are ready to be deployed?

 

Thanks!


4 Replies 4

u.avsec
Spotlight

Hey.

 

As you are at Cisco, look into CXTA for repetitive test execution. It is a test framework based on Robot Framework. Another name to look into is pyATS.

 

Everything has to be set up and developed a little. Once that is done, network engineers just add test cases with minimal to no coding.

 

How to check if things are good?

1) Does it compile and reload into NSO?

2) Does it accept expected parameters and refuse unwanted values? (YANG)

3) Is the correct device configuration generated? (templates)
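At the framework level, check 2 usually comes down to feeding both valid and invalid inputs and asserting that the bad ones are refused. A tiny illustration of the idea, validated locally rather than through NSO; the `vlan-id`/`site-name` leaves and their constraints are hypothetical stand-ins for whatever your YANG model declares:

```python
import re

# Hypothetical constraints mirroring a YANG model such as:
#   leaf vlan-id   { type uint16 { range "1..4094"; } }
#   leaf site-name { type string { pattern "[a-z][a-z0-9-]*"; } }
def validate_service_input(vlan_id: int, site_name: str) -> list:
    """Return a list of validation errors; empty list means the input is accepted."""
    errors = []
    if not (1 <= vlan_id <= 4094):
        errors.append(f"vlan-id {vlan_id} out of range 1..4094")
    if not re.fullmatch(r"[a-z][a-z0-9-]*", site_name):
        errors.append(f"site-name {site_name!r} does not match pattern")
    return errors

# Positive and negative cases, the way a test framework would drive them:
print(validate_service_input(100, "site-a"))   # accepted -> no errors
print(validate_service_input(5000, "Site_A"))  # refused  -> two errors
```

In a real setup the same positive/negative inputs would be pushed through NSO itself, and the test asserts that NSO's YANG layer rejects the invalid ones.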

 

All three can be done either manually or automated. How deep the validation goes depends on how thorough you want your tests to be. Typically the first level of checks is done on netsim devices (development stage, early testing) with a lot of commit dry-run-ing. Later, a lab of actual devices can be used for 100% accuracy (acceptance tests or similar).
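One lightweight way to automate check 3 is to diff the config a dry run produced against an agreed "golden" file. A minimal sketch, assuming you have already captured the dry-run output (e.g. from `commit dry-run outformat native`) as text; the file content here is an illustrative stand-in:

```python
import difflib
import tempfile
from pathlib import Path

def diff_against_golden(actual_text, golden_path):
    """Unified diff between the config a dry run produced and the agreed
    'golden' file; an empty list means the template generated what was expected."""
    golden = Path(golden_path).read_text().splitlines()
    actual = actual_text.splitlines()
    return list(difflib.unified_diff(golden, actual,
                                     fromfile="golden", tofile="dry-run",
                                     lineterm=""))

# Demo with an inline golden file standing in for agreed-upon device config.
with tempfile.NamedTemporaryFile("w", suffix=".cfg", delete=False) as f:
    f.write("interface GigabitEthernet0/1\n ip address 10.0.0.1 255.255.255.252\n")
    golden_file = f.name

dry_run_output = "interface GigabitEthernet0/1\n ip address 10.0.0.1 255.255.255.252"
print("PASS" if not diff_against_golden(dry_run_output, golden_file) else "FAIL")
```

The same comparison works whether the "actual" side comes from a netsim device or a real lab device.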

 

Hey, thanks for your answer. 

 

Could you explain, in the case of templates, how they check whether the correct device config has been generated? I am guessing this is where all the dry-run work comes into the picture, but does an automated way exist to do this?

 

Automated test frameworks do what you want.

 

You define a test case input file and test case expected output file. Everything is tied together with test case code.

When the framework runs, the test case code:

  1. reads the input file
  2. applies the inputs to NSO
  3. commits and sends config towards netsim/southbound-locked devices/lab devices
  4. fetches the results (get-modifications or whatever is needed)
  5. compares the fetched results with the expected output file
  6. fails/succeeds the test case

Whether step 4 is done against NSO or against actual lab devices depends on how completely the test framework is designed and developed.
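The six steps above can be sketched as a small driver. Everything here is illustrative: the input/expected files and the `apply_inputs`/`fetch_results` hooks are hypothetical stand-ins for whatever your framework (CXTA, pyATS, Robot) actually does against NSO or the devices:

```python
import json
import os
import tempfile
from pathlib import Path

def run_test_case(input_file, expected_file, apply_inputs, fetch_results):
    """Steps 1-6: read the inputs, apply/commit them via the supplied hook,
    fetch the resulting config, and compare it with the expected output."""
    inputs = json.loads(Path(input_file).read_text())       # 1. read input file
    apply_inputs(inputs)                                    # 2-3. apply + commit
    actual = fetch_results()                                # 4. fetch results
    expected = json.loads(Path(expected_file).read_text())  # 5. compare
    return actual == expected                               # 6. pass/fail

# Demo with stub hooks standing in for NSO / netsim.
device_db = {}

def apply_stub(inputs):
    device_db.update(inputs)   # pretend this is a commit towards netsim

def fetch_stub():
    return device_db           # pretend this is get-modifications

tmp = tempfile.mkdtemp()
inp = os.path.join(tmp, "case1-input.json")
exp = os.path.join(tmp, "case1-expected.json")
Path(inp).write_text(json.dumps({"vlan": 100}))
Path(exp).write_text(json.dumps({"vlan": 100}))
print(run_test_case(inp, exp, apply_stub, fetch_stub))  # True when config matches
```

Swapping the stub hooks for real NSO calls (or lab device reads) is the "how complete the framework is" question from the previous paragraph.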

Whatever it might be, in your particular case you get people together to agree on the expected output files. Once everyone is happy with how the config should look, that is it.

Of course, at this point in NSO development it is expected that the NEDs have already been inspected and that their device configuration management is acceptable.

Priya Chaturvedi
Cisco Employee

Thanks @u.avsec ! I need to look further into how to get started with creating such tests, but your replies cleared up a lot of things for me.