2.7. model¶
2.7.1. ExportLog suite¶
Module with keywords that publish metric and other log events.
-
resources.libraries.python.model.ExportLog.export_ssh_command(host, port, command)¶
Add a log item about SSH command execution starting.
The log item is present only in raw output. Result arrives in a separate log item. Log level is always DEBUG.
The command is stored as “data” (not “msg”) as in some cases the command can be too long to act as a message.
The host is added to the info set of hosts.
- Parameters
host (str) – Node “host” attribute, usually its IPv4 address.
port (int) – SSH port number to use when connecting to the host.
command (str) – Serialized bash command to execute.
-
resources.libraries.python.model.ExportLog.export_ssh_result(host, port, code, stdout, stderr, duration)¶
Add a log item about SSH execution result.
Only for raw output log.
There is no easy way to pair with the corresponding command, but usually there is only one SSH session for given host and port. The duration value may give a hint if that is not the case.
Message is empty, data has fields “rc”, “stdout”, “stderr” and “duration”. Log level is always DEBUG.
The host is NOT added to the info set of hosts, as each result comes after a command.
TODO: Do not require duration, find preceding SSH command in log. Reason: Pylint complains about too many arguments. Alternative: Define a type for SSH endpoint (and use that instead of host+port).
- Parameters
host (str) – Node “host” attribute, usually its IPv4 address.
port (int) – SSH port number to use when connecting to the host.
code (int) – Bash return code, e.g. 0 for success.
stdout (str) – Captured standard output of the command execution.
stderr (str) – Captured error output of the command execution.
duration (float) – How long has the command been executing, in seconds.
-
resources.libraries.python.model.ExportLog.export_ssh_timeout(host, port, stdout, stderr, duration)¶
Add a log item about SSH execution timing out.
Only for debug log.
There is no easy way to pair with the corresponding command, but usually there is only one SSH session for given host and port. The duration value may give a hint if that is not the case.
Message is empty, data has fields “stdout”, “stderr” and “duration”. Log level is always DEBUG.
The host is NOT added to the info set of hosts, as each timeout comes after a command.
- Parameters
host (str) – Node “host” attribute, usually its IPv4 address.
port (int) – SSH port number to use when connecting to the host.
stdout (str) – Captured standard output of the command execution so far.
stderr (str) – Captured error output of the command execution so far.
duration (float) – How long has the command been executing, in seconds.
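The three keywords above are meant to bracket a single SSH invocation: the command item is exported before execution and exactly one result or timeout item follows it. Below is a minimal sketch of that pairing, assuming the CSIT repository root is on the Python path; the ssh_session object and its exec_command method are hypothetical placeholders, only the export_* calls follow the documented signatures.

    import time

    from resources.libraries.python.model.ExportLog import (
        export_ssh_command, export_ssh_result, export_ssh_timeout
    )


    def run_and_export(ssh_session, host, port, command, timeout=30.0):
        """Run a command over a hypothetical SSH session, exporting log items."""
        export_ssh_command(host, port, command)
        start = time.monotonic()
        try:
            code, stdout, stderr = ssh_session.exec_command(command, timeout=timeout)
        except TimeoutError:
            duration = time.monotonic() - start
            # Output captured so far is not available from this placeholder session.
            export_ssh_timeout(host, port, "", "", duration)
            raise
        duration = time.monotonic() - start
        export_ssh_result(host, port, code, stdout, stderr, duration)
        return code, stdout, stderr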
2.7.2. ExportResult suite¶
Module with keywords that publish parts of result structure.
-
resources.libraries.python.model.ExportResult.append_mrr_value(mrr_value, unit)¶
Store the MRR value in the proper place so it is dumped into JSON.
The value is appended only when unit is not empty.
- Parameters
mrr_value (float) – Forwarding rate from MRR trial.
unit (str) – Unit of measurement for the rate.
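A short sketch of feeding MRR trial measurements to this keyword, assuming the CSIT repository root is on the Python path and a running Robot context; the trial rates are made-up placeholder numbers.

    from resources.libraries.python.model.ExportResult import append_mrr_value

    # Placeholder forwarding rates from individual MRR trials, in packets per second.
    trial_rates = [12345678.0, 12400100.0, 12390000.0]

    for rate in trial_rates:
        # Each value is appended to the result structure; an empty unit would
        # cause the keyword to skip the append.
        append_mrr_value(rate, "pps")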
-
resources.libraries.python.model.ExportResult.export_dut_type_and_version(dut_type='unknown', dut_version='unknown')¶
Export the arguments as DUT type and version.
Robot tends to convert “none” into None, hence the unusual default values.
If either argument is missing, the value from the corresponding Robot variable is used. If an argument is present, the value is also stored to the Robot suite variable.
- Parameters
dut_type (Optional[str]) – DUT type, e.g. VPP or DPDK.
dut_version (Optional[str]) – DUT version as determined by the caller.
- Raises
RuntimeError – If the value is neither in the argument nor in the Robot variable.
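A minimal sketch for a suite setup, assuming the CSIT repository root is on the Python path and a running Robot context (needed for the suite variable handling); the version string is a placeholder.

    from resources.libraries.python.model.ExportResult import (
        export_dut_type_and_version
    )

    # Explicit arguments are exported and also stored as Robot suite variables.
    export_dut_type_and_version(dut_type="VPP", dut_version="23.02-rc0")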
-
resources.libraries.python.model.ExportResult.export_ndrpdr_latency(text, latency)¶
Store NDRPDR hdrh latency data.
If “latency” node does not exist, it is created. If a previous value exists, it is overwritten silently.
Text is used to determine what percentage of PDR the load is, as the Robot caller has the information only there.
Reverse data may be missing; in that case we assume the test was unidirectional.
- Parameters
text (str) – Info from Robot caller to determine load.
latency (1-tuple or 2-tuple of str) – Output from TRex utility, min/avg/max/hdrh.
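A sketch of exporting latency for one load, assuming the CSIT repository root is on the Python path and a running Robot context. The latency strings and the exact wording of the text argument are illustrative assumptions; the tuple follows the documented min/avg/max/hdrh format, with the reverse item optional.

    from resources.libraries.python.model.ExportResult import export_ndrpdr_latency

    latency = (
        "6/10/32/HISTFAAAA...",  # forward direction, hdrh blob truncated here
        "7/11/35/HISTFAAAA...",  # reverse direction, may be absent (unidirectional)
    )
    # The text lets the keyword determine which PDR percentage this load is.
    export_ndrpdr_latency("Latency at 50% PDR:", latency)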
-
resources.libraries.python.model.ExportResult.export_search_bound(text, value, unit, bandwidth=None)¶
Store bound value and unit.
This function works for both NDRPDR and SOAK, decided by text.
If a node does not exist, it is created. If a previous value exists, it is overwritten silently. Result type is set (overwritten) to ndrpdr (or soak).
Text is used to determine whether it is ndr or pdr, upper or lower bound, as the Robot caller has the information only there.
- Parameters
text (str) – Info from Robot caller to determine bound type.
value (float) – The bound value in packets (or connections) per second.
unit (str) – Rate unit the bound is measured (or estimated) in.
bandwidth (Optional[float]) – The same value recomputed into L1 bits per second.
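A sketch of exporting one lower bound from an NDRPDR search, assuming the CSIT repository root is on the Python path and a running Robot context; the values and the exact text wording are illustrative assumptions.

    from resources.libraries.python.model.ExportResult import export_search_bound

    ndr_lower_pps = 11900000.0
    # Recompute into L1 bits per second for 64B frames (20B of L1 overhead per frame).
    ndr_lower_bps = ndr_lower_pps * (64 + 20) * 8

    export_search_bound(
        text="NDR_LOWER",
        value=ndr_lower_pps,
        unit="pps",
        bandwidth=ndr_lower_bps,
    )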
-
resources.libraries.python.model.ExportResult.export_tg_type_and_version(tg_type='unknown', tg_version='unknown')¶
Export the arguments as TG type and version.
Robot tends to convert “none” into None, hence the unusual default values.
If either argument is missing, the value from the corresponding Robot variable is used. If an argument is present, the value is also stored to the Robot suite variable.
- Parameters
tg_type (Optional[str]) – TG type, e.g. TREX.
tg_version (Optional[str]) – TG version as determined by the caller.
- Raises
RuntimeError – If the value is neither in the argument nor in the Robot variable.
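The TG variant mirrors the DUT keyword above; a one-line sketch under the same assumptions, with a placeholder version string.

    from resources.libraries.python.model.ExportResult import export_tg_type_and_version

    export_tg_type_and_version(tg_type="TREX", tg_version="2.88")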
2.7.3. export_json suite¶
Module tracking json in-memory data and saving it to files.
The current implementation tracks data for raw output, and info output is created from raw output on disk (see the raw2info module). The raw file contains all log items but no derived quantities; the info file contains only the important log items but also the derived quantities. The overlap between the two files is big.
Each test case, suite setup (hierarchical) and teardown has its own file pair.
Validation is performed for output files with an available JSON schema. Validation is performed on data deserialized from disk, as serialization might have introduced subtle errors.
-
class resources.libraries.python.model.export_json.export_json¶
Bases: object
Class handling the json data setting and export.
-
ROBOT_LIBRARY_SCOPE = 'GLOBAL'¶
-
export_pending_data()¶
Write the accumulated data to disk.
Create missing directories. Reset both file path and data to avoid writing multiple times.
Functions which finalize content for a given file call this, so make sure each test and non-empty suite setup or teardown calls this as its last keyword.
If no file path is set, do not write anything, as that is the failsafe behavior when called from an unexpected place. Also do not write anything when the EXPORT_JSON constant is false.
Regardless of whether data was written, it is cleared.
-
finalize_suite_setup_export()¶
Add the missing fields to data. Do not write yet.
Should be run at the end of suite setup. The write is done at next start (or at the end of global teardown).
-
finalize_suite_teardown_export()¶
Add the missing fields to data. Do not write yet.
Should be run at the end of suite teardown (but before the explicit write in the global suite teardown). The write is done at next start (or explicitly for global teardown).
-
finalize_test_export()¶
Add the missing fields to data. Do not write yet.
Should be run at the end of test teardown, as the implementation reads various Robot variables, some of them only available at teardown.
The write is done at next start (or at the end of global teardown).
-
start_suite_setup_export()¶
Set new file path, initialize data for the suite setup.
This has to be called explicitly at start of suite setup, otherwise Robot likes to postpone initialization until first call by a data-adding keyword.
File path is set based on suite.
-
start_suite_teardown_export()¶
Set new file path, initialize data for the suite teardown.
This has to be called explicitly at start of suite teardown, otherwise Robot likes to postpone initialization until first call by a data-adding keyword.
File path is set based on suite.
-
start_test_export()¶
Set new file path, initialize data to minimal tree for the test case.
It is assumed Robot variables DUT_TYPE and DUT_VERSION are already set (in suite setup) to correct values.
This function has to be called explicitly at the start of test setup, otherwise Robot likes to postpone initialization until first call by a data-adding keyword.
File path is set based on suite and test.
-
warn_on_bad_export()¶
If bad state is detected, log a warning and clean up state.
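The intended calling order of the keywords above, written as a plain Python sketch. In practice they are invoked from Robot suite setups, test setups and teardowns (the library scope is GLOBAL and several methods read Robot variables), so this only illustrates the ordering.

    from resources.libraries.python.model.export_json import export_json

    export = export_json()

    # Suite setup: start, let data-adding keywords run, then finalize.
    # The actual write happens at the next start (or at the end of global teardown).
    export.start_suite_setup_export()
    # ... data-adding keywords ...
    export.finalize_suite_setup_export()

    # Each test case gets its own file pair.
    export.start_test_export()
    # ... test body and data-adding keywords ...
    export.finalize_test_export()

    # Suite teardown, followed by the explicit write in the global suite teardown.
    export.start_suite_teardown_export()
    export.finalize_suite_teardown_export()
    export.export_pending_data()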
2.7.4. mem2raw suite¶
Module for converting in-memory data into raw JSON output.
CSIT and VPP PAPI use custom data types that are not directly serializable into JSON.
Thus, before writing the raw output onto disk, the data is recursively converted to equivalent serializable types, in extreme cases replaced by their string representation.
Validation is outside the scope of this module, as it should use the JSON data read from disk.
-
resources.libraries.python.model.mem2raw.write_raw_output(raw_file_path, raw_data)¶
Prepare data for serialization and dump into a file.
Ancestor directories are created if needed.
- Parameters
raw_file_path (str) – Local filesystem path, including the file name, to write to.
raw_data (dict) – The raw data to prepare and serialize into the file.
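A minimal sketch, assuming the CSIT repository root is on the Python path; the path and data are placeholders.

    from resources.libraries.python.model.mem2raw import write_raw_output

    raw_data = {"version": "placeholder", "log": []}
    # Missing ancestor directories are created automatically.
    write_raw_output("/tmp/csit_model_example/example.raw.json", raw_data)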
2.7.5. raw2info suite¶
Module facilitating conversion from raw outputs into info outputs.
-
resources.libraries.python.model.raw2info.convert_content_to_info(from_raw_path)¶
Read raw output, perform filtering, add derivatives, write info output.
Directory path is created if missing.
When processing a teardown, also create the suite output using the setup info.
- Parameters
from_raw_path (str) – Local filesystem path to read raw JSON data from.
- Returns
Local filesystem path to written info JSON file.
- Return type
str
- Raises
RuntimeError – If path or content do not match expectations.
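A minimal sketch, assuming the CSIT repository root is on the Python path and that the raw file was written earlier (placeholder path).

    from resources.libraries.python.model.raw2info import convert_content_to_info

    info_path = convert_content_to_info("/tmp/csit_model_example/example.raw.json")
    print(info_path)  # Local path of the derived info JSON file.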
2.7.6. util suite¶
Module hosting a few utility functions useful when dealing with modelled data.
This is for storing varied utility functions, which are too short and diverse to be put into more descriptive modules.
-
resources.libraries.python.model.util.descend(parent_node, key, default_factory=None)¶
Return a sub-node, create and insert it when it does not exist.
- Without this function:
child_node = parent_node.get(key, dict())
parent_node[key] = child_node
- With this function:
child_node = descend(parent_node, key)
New code is shorter and avoids the need to type key and parent_node twice.
- Parameters
parent_node (dict) – Reference to inner node of a larger structure we want to descend from.
key (str) – Key of the maybe existing child node.
default_factory (Optional[Callable[[], object]]) – If the key does not exist, call this to create a new value to be inserted under the key. None means dict. The other popular option is list.
- Returns
The reference to (maybe just created) child node.
- Return type
object
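A usage sketch assuming the CSIT repository root is on the Python path; the keys and the appended item are illustrative placeholders.

    from resources.libraries.python.model.util import descend

    result = dict()

    # Create (or reuse) the nested dict nodes result["latency"]["forward"].
    forward = descend(descend(result, "latency"), "forward")
    forward["pdr_50"] = "6/10/32/HISTF..."

    # Use a list as the child node type via default_factory.
    log_items = descend(result, "log", default_factory=list)
    log_items.append({"msg_type": "example"})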
-
resources.libraries.python.model.util.get_export_data()¶
Return the raw_data member of the export_json library instance.
This assumes the data has been initialized already. Return None if Robot is not running.
- Returns
Current library instance’s raw data field.
- Return type
Optional[dict]
- Raises
AttributeError – If library is not imported yet.
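A sketch of the typical pattern inside a data-adding keyword, assuming a Robot run where the export_json library has already been imported and started; the "log" key and the item fields are illustrative assumptions about the tracked structure.

    from resources.libraries.python.model.util import descend, get_export_data

    data = get_export_data()
    if data is not None:
        # Append a log item to the in-memory raw data.
        descend(data, "log", default_factory=list).append(
            {"source_type": "example", "msg_type": "example", "data": "..."}
        )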
2.7.7. validate suite¶
Module for validating JSON instances against schemas.
This is a short module currently, as we validate only testcase info outputs. The structure will probably change when we start validating more file types.
-
resources.libraries.python.model.validate.get_validators()¶
Return mapping from file types to validator instances.
Uses hardcoded file types and paths to schemas on disk.
- Returns
Validators, currently just for tc_info_output.
- Return type
Mapping[str, jsonschema.validators.Validator]
- Raises
RuntimeError – If schemas are not readable or not valid.
-
resources.libraries.python.model.validate.validate(file_path, validator)¶
Load data from disk, use validator to validate it.
- Parameters
file_path (str) – Local filesystem path including the file name to load.
validator (jsonschema.validators.Validator) – Validator instance to use for validation.
- Raises
RuntimeError – If schema validation fails.
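A sketch combining the two functions, assuming the CSIT repository root is on the Python path and the schemas are present on disk; the file path is a placeholder, the tc_info_output key comes from the docstring above.

    from resources.libraries.python.model.validate import get_validators, validate

    validators = get_validators()
    # Raises RuntimeError if the placeholder info file does not match the schema.
    validate(
        "/tmp/csit_model_example/example.info.json",
        validators["tc_info_output"],
    )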