
Commit f64812c

Docs: Revise README.md for clarity and structure in validation framework overview
1 parent 6f3e4fb commit f64812c


1 file changed: +27 additions, −38 deletions


tests/validation/README.md

@@ -8,50 +8,26 @@ The validation framework uses pytest to organize and execute tests across variou
 
 ## Test Framework Structure
 
-```plaintext
-tests/validation/
-├── common/                  # Shared utilities for tests
-│   ├── ffmpeg_handler/      # FFmpeg integration utilities
-│   ├── integrity/           # Data integrity verification tools
-│   └── nicctl.py            # Network interface control
-├── configs/                 # Test configuration files
-│   ├── test_config.yaml     # Test environment settings
-│   └── topology_config.yaml # Network topology configuration
-├── create_pcap_file/        # Tools for packet capture file creation
-├── mtl_engine/              # Core test framework components
-│   ├── execute.py           # Test execution management
-│   ├── RxTxApp.py           # RX/TX application interface
-│   ├── GstreamerApp.py      # GStreamer integration
-│   ├── ffmpeg_app.py        # FFmpeg integration
-│   ├── csv_report.py        # Test result reporting
-│   └── ramdisk.py           # RAM disk management
-├── tests/                   # Test modules
-│   ├── single/              # Single-flow test scenarios
-│   │   ├── dma/             # DMA tests
-│   │   ├── ffmpeg/          # FFmpeg integration tests
-│   │   ├── gstreamer/       # GStreamer integration tests
-│   │   ├── kernel_socket/   # Kernel socket tests
-│   │   ├── performance/     # Performance benchmarking
-│   │   ├── ptp/             # Precision Time Protocol tests
-│   │   ├── st20p/           # ST2110-20 video tests
-│   │   ├── st22p/           # ST2110-22 compressed video tests
-│   │   ├── st30p/           # ST2110-30 audio tests
-│   │   └── st41/            # ST2110-40 ancillary data tests
-│   ├── dual/                # Dual-flow test scenarios
-│   └── invalid/             # Error handling and negative test cases
-├── conftest.py              # pytest configuration and fixtures
-├── pytest.ini               # pytest settings
-└── requirements.txt         # Python dependencies
-```
+The validation framework is organized into the following main components:
+
+- **common/**: Shared utilities for test functionality, including FFmpeg handlers, integrity verification tools, and network interface control
+- **configs/**: Configuration files for test environment and network topology
+- **mtl_engine/**: Core test framework components that manage test execution, application interfaces, and result reporting
+- **tests/**: Test modules organized by scenario type:
+  - **single/**: Single-flow test scenarios for various protocols (ST2110-20/22/30/40), backends, and integrations
+  - **dual/**: Tests for multiple simultaneous flows
+  - **invalid/**: Error handling and negative test cases
 
 ## Setup and Installation
 
 ### Prerequisites
 
 - Python 3.9 or higher
 - Media Transport Library built and installed
-- Network interfaces configured for testing
-- Sufficient permissions for network management
+- Test media files (currently maintained on NFS)
+- Network interfaces as specified in MTL's run.md document (VFs will be created automatically)
+- Root privileges or equivalent (sudo) for network operations done by script/nicctl.sh
+- FFmpeg and GStreamer plugins installed (required for integration tests)
 
 ### Environment Setup
 
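As a hedged sketch of the environment setup (the steps and names here are assumptions, not taken from MTL's own docs), the framework's Python dependencies could be installed into an isolated virtual environment:

```shell
# Hypothetical setup sketch; run from tests/validation/ in a real checkout.
python3 -m venv .venv        # create an isolated virtual environment
. .venv/bin/activate         # activate it in the current shell
python -m pip --version      # sanity-check that pip is available
# then install the dependencies listed in requirements.txt:
# python -m pip install -r requirements.txt
```

The install line is left commented because it only works inside the repository checkout; the virtual environment keeps the framework's dependencies separate from system Python packages.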
@@ -110,6 +86,12 @@ Run specific test modules:
 python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml tests/single/st20p/test_st20p_rx.py
 ```
 
+Run specific test cases with parameters:
+
+```bash
+python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml "tests/single/st20p/fps/test_fps.py::test_fps[|fps = p60|-ParkJoy_1080p]"
+```
+
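Because parametrized node IDs like the one above contain spaces and `|` characters, the full ID must reach pytest as a single quoted argument. A minimal quoting sketch (node ID copied from the example above):

```shell
# Single quotes keep the whole node ID as one word, so the shell does not
# split it on spaces or treat '|' as a pipe.
TEST_ID='tests/single/st20p/fps/test_fps.py::test_fps[|fps = p60|-ParkJoy_1080p]'
# illustration only; in a real run, pass "$TEST_ID" to pytest unchanged:
echo "python3 -m pytest ... \"$TEST_ID\""
```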
 ### Test Categories
 
 The tests are categorized with markers that can be used to run specific test groups:
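For example, a marker-filtered run can reuse the same configuration options. This sketch only composes and prints the command; the `smoke` marker appears later in this README, while any other marker names would be assumptions:

```shell
# Shared options; the config paths come from the examples in this README.
OPTS="--topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml"
# run only tests carrying the 'smoke' marker (printed here, not executed):
echo "python3 -m pytest $OPTS -m smoke"
```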
@@ -126,12 +108,19 @@ The tests are categorized with markers that can be used to run specific test gro
 
 ### Generating HTML Reports
 
-You can generate HTML reports for test results:
+You can generate comprehensive HTML reports for test results that include test status, execution time, and detailed logs:
 
 ```bash
 python3 -m pytest --topology_config=configs/topology_config.yaml --test_config=configs/test_config.yaml -m smoke --template=html/index.html --report=report.html
 ```
 
+The generated report (report.html) provides:
+- Test execution summary and statistics
+- Detailed pass/fail status for each test
+- Execution time and performance metrics
+- Error logs and tracebacks for failed tests
+- System information for better debugging context
+
 ### Test Output and Reports
 
 - Logs are written to `pytest.log`
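After a run, a quick check can confirm that the artifacts named in this README were produced. A purely illustrative sketch:

```shell
# Report whether each expected artifact from this README exists in the
# current directory (illustrative; adjust names to your own run).
for f in pytest.log report.html; do
  if [ -f "$f" ]; then
    echo "found:   $f"
  else
    echo "missing: $f"
  fi
done
```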
