tdc - Adding test cases for tdc

Author: Lucas Bates - lucasb@mojatatu.com

ADDING TEST CASES
-----------------

User-defined tests should be added by defining a separate JSON file.  This
will help prevent conflicts when updating the repository. Refer to
template.json for the required JSON format for test cases.

Include the 'id' field, but do not assign a value. Running tdc with the -i
option will generate a unique ID for that test case.
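
For illustration, a brand new test case can simply carry an empty value:

            "id": "",

Running tdc with the -i option will then replace the empty value with a
generated unique ID.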

tdc will recursively search the 'tc-tests' subdirectory (or the
directories named with the -D option) for .json files.  Any test case
files you create in these directories will automatically be included.
If you wish to store your custom test cases elsewhere, be sure to run
tdc with the -f argument and the path to your file, or the -D argument
and the path to your directory(ies).

Be aware of required escape characters in the JSON data - particularly
when defining the match pattern. Refer to the supplied JSON test files
for examples when in doubt.  The match pattern is written in JSON but
consumed by Python, so it is a Python regular expression that must be
expressed using JSON string syntax.
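
For instance, the regular expression token \d must be written with a doubled
backslash inside the JSON string (the pattern below is only a hypothetical
illustration):

        "matchPattern": "action order \\d+: gact action pass.*index \\d+ ref",

When Python loads this JSON value it receives the single-backslash form, which
the re module then interprets as a normal regular expression.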


TEST CASE STRUCTURE
-------------------

Each test case has required data (a complete example follows the list):

id:           A unique alphanumeric value to identify a particular test case
name:         Descriptive name that explains the command under test
skip:         An optional key. If the corresponding value is "yes", tdc will
              not execute the test case in question; the test case will still
              appear in the results output, but marked as skipped. This key
              can be placed anywhere inside the test case at the top level.
category:     A list of single-word descriptions covering what the command
              under test exercises. Example: filter, actions, u32, gact, etc.
setup:        The list of commands required to ensure the command under test
              succeeds. For example: if testing a filter, the command to create
              the qdisc would appear here.
              This list can be empty.
              Each command can be a string to be executed, or a list whose
              first element is the command string, followed by one or more
              acceptable exit codes for that command.
              If only a string is given for the command, then an exit code of 0
              will be expected.
cmdUnderTest: The tc command being tested.
expExitCode:  The exit code expected from the command under test upon its
              termination. tdc will compare this value against the actual
              exit code.
verifyCmd:    The tc command to be run to verify successful execution.
              For example: if the command under test creates a gact action,
              verifyCmd should be "$TC actions show action gact".
matchPattern: A regular expression to be applied against the output of the
              verifyCmd to prove the command under test succeeded. This pattern
              should be as specific as possible so that a false positive is not
              matched.
matchCount:   How many times the regex in matchPattern should match. A value
              of 0 is acceptable.
teardown:     The list of commands to clean up after the test is completed.
              The environment should be returned to the same state as when
              this test was started: qdiscs deleted, actions flushed, etc.
              This list can be empty.
              Each command can be a string to be executed, or a list whose
              first element is the command string, followed by one or more
              acceptable exit codes for that command.
              If only a string is given for the command, then an exit code of 0
              will be expected.
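
Putting these fields together, a complete test case might look like the
following sketch. The id, action parameters, and match pattern are purely
illustrative; refer to template.json and the files under tc-tests for
authoritative examples of the exact formatting used by the existing tests.

        {
            "id": "a1b2",
            "name": "Add gact pass action with index 8",
            "category": [
                "actions",
                "gact"
            ],
            "setup": [
                [
                    "$TC actions flush action gact",
                    0,
                    1,
                    255
                ]
            ],
            "cmdUnderTest": "$TC actions add action pass index 8",
            "expExitCode": "0",
            "verifyCmd": "$TC actions list action gact",
            "matchPattern": "action order [0-9]+: gact action pass.*index 8 ref",
            "matchCount": "1",
            "teardown": [
                "$TC actions flush action gact"
            ]
        }

The setup entry is written as a list so the flush command may return exit
code 0, 1 or 255 without aborting the test; see SETUP/TEARDOWN ERRORS below.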


SETUP/TEARDOWN ERRORS
---------------------

If an error is detected during the setup/teardown process, execution of the
tests will immediately stop with an error message and the namespace in which
the tests are run will be destroyed. This is to prevent inaccurate results
in the test cases.  tdc will output a series of TAP results for the skipped
tests.

Repeated failures of the setup/teardown may indicate a problem with the test
case, or possibly even a bug in one of the commands that are not being tested.

Acceptable exit codes can be included with a setup/teardown command so that
the script is not halted by an error that doesn't matter. Turn the individual
command into a list, with the command first, followed by all acceptable exit
codes for that command.

Example:

A pair of setup commands.  The first can have exit code 0, 1 or 255; the
second must have exit code 0.

        "setup": [
            [
                "$TC actions flush action gact",
                0,
                1,
                255
            ],
            "$TC actions add action reclassify index 65536"
        ],