@@ -8,50 +8,26 @@ The validation framework uses pytest to organize and execute tests across variou

## Test Framework Structure

- ```plaintext
- tests/validation/
- ├── common/                  # Shared utilities for tests
- │   ├── ffmpeg_handler/      # FFmpeg integration utilities
- │   ├── integrity/           # Data integrity verification tools
- │   └── nicctl.py            # Network interface control
- ├── configs/                 # Test configuration files
- │   ├── test_config.yaml     # Test environment settings
- │   └── topology_config.yaml # Network topology configuration
- ├── create_pcap_file/        # Tools for packet capture file creation
- ├── mtl_engine/              # Core test framework components
- │   ├── execute.py           # Test execution management
- │   ├── RxTxApp.py           # RX/TX application interface
- │   ├── GstreamerApp.py      # GStreamer integration
- │   ├── ffmpeg_app.py        # FFmpeg integration
- │   ├── csv_report.py        # Test result reporting
- │   └── ramdisk.py           # RAM disk management
- ├── tests/                   # Test modules
- │   ├── single/              # Single-flow test scenarios
- │   │   ├── dma/             # DMA tests
- │   │   ├── ffmpeg/          # FFmpeg integration tests
- │   │   ├── gstreamer/       # GStreamer integration tests
- │   │   ├── kernel_socket/   # Kernel socket tests
- │   │   ├── performance/     # Performance benchmarking
- │   │   ├── ptp/             # Precision Time Protocol tests
- │   │   ├── st20p/           # ST2110-20 video tests
- │   │   ├── st22p/           # ST2110-22 compressed video tests
- │   │   ├── st30p/           # ST2110-30 audio tests
- │   │   └── st41/            # ST2110-40 ancillary data tests
- │   ├── dual/                # Dual-flow test scenarios
- │   └── invalid/             # Error handling and negative test cases
- ├── conftest.py              # pytest configuration and fixtures
- ├── pytest.ini               # pytest settings
- └── requirements.txt         # Python dependencies
- ```
+ The validation framework is organized into the following main components:
+
+ - **common/**: Shared utilities for test functionality, including FFmpeg handlers, integrity verification tools, and network interface control
+ - **configs/**: Configuration files for test environment and network topology
+ - **mtl_engine/**: Core test framework components that manage test execution, application interfaces, and result reporting
+ - **tests/**: Test modules organized by scenario type:
+   - **single/**: Single-flow test scenarios for various protocols (ST2110-20/22/30/40), backends, and integrations
+   - **dual/**: Tests for multiple simultaneous flows
+   - **invalid/**: Error handling and negative test cases

## Setup and Installation

### Prerequisites

- Python 3.9 or higher
- Media Transport Library built and installed
- - Network interfaces configured for testing
- - Sufficient permissions for network management
+ - Test media files (currently maintained on NFS)
+ - Network interfaces as specified in MTL's run.md document (VFs will be created automatically)
+ - Root privileges or equivalent (sudo) for network operations done by script/nicctl.sh
+ - FFmpeg and GStreamer plugins installed (required for integration tests)

### Environment Setup

@@ -110,6 +86,12 @@ Run specific test modules:
python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml tests/single/st20p/test_st20p_rx.py
```

+ Run specific test cases with parameters:
+
+ ```bash
+ python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml "tests/single/st20p/fps/test_fps.py::test_fps[|fps = p60|-ParkJoy_1080p]"
+ ```
+

### Test Categories

The tests are categorized with markers that can be used to run specific test groups:
@@ -126,12 +108,19 @@ The tests are categorized with markers that can be used to run specific test gro

### Generating HTML Reports

- You can generate HTML reports for test results:
+ You can generate comprehensive HTML reports for test results that include test status, execution time, and detailed logs:

```bash
python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m smoke --template=html/index.html --report=report.html
```

+ The generated report (report.html) provides:
+ - Test execution summary and statistics
+ - Detailed pass/fail status for each test
+ - Execution time and performance metrics
+ - Error logs and tracebacks for failed tests
+ - System information for better debugging context
+

### Test Output and Reports

- Logs are written to `pytest.log`
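The pytest invocations this diff documents can be composed from shell variables. A minimal sketch, not part of the MTL repo: the config paths, the `smoke` marker, and the test ID are copied from the hunks above, and `echo` is used so the sketch runs without pytest or the test tree installed.

```shell
#!/bin/sh
# Sketch: compose the pytest invocations shown in the README diff above.
TOPO=configs/topology_config.yaml
CFG=configs/test_config.yaml

# Smoke run with an HTML report, as in the "Generating HTML Reports" hunk:
echo python3 -m pytest --topology_config="$TOPO" --test_config="$CFG" \
    -m smoke --template=html/index.html --report=report.html

# A single parametrized case: the test ID contains '[', ']', '|' and spaces,
# so it must stay quoted or the shell would glob, pipe, and word-split it.
TEST_ID='tests/single/st20p/fps/test_fps.py::test_fps[|fps = p60|-ParkJoy_1080p]'
echo python3 -m pytest --topology_config="$TOPO" --test_config="$CFG" "$TEST_ID"
```

The single quotes around the test ID are the important detail: double quotes would still be safe here, but unquoted, `[|fps = p60|-ParkJoy_1080p]` is a glob pattern and `|` starts a pipeline.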