Automated hostapd/wpa_supplicant testing with mac80211_hwsim
------------------------------------------------------------

This directory contains testing infrastructure and test cases to run
automated tests of full hostapd and wpa_supplicant functionality. This
testing is done with the help of mac80211_hwsim, a Linux kernel driver
that simulates IEEE 802.11 radios without requiring any additional
hardware. This setup allows most of the hostapd and wpa_supplicant
functionality (and large parts of the Linux cfg80211 and mac80211
functionality, for that matter) to be tested.

mac80211_hwsim is loaded with five simulated radios to allow different
device combinations to be tested. wlantest is used to analyze raw
packets captured through the hwsim0 monitor interface, which sees all
frames sent on all channels. tcpdump is used to store the frames for
analysis. Three wpa_supplicant processes are used to control three of
the virtual radios and one hostapd process is used to dynamically
control the other two virtual radios. hwsim_test is used to verify that
the data connection (both unicast and broadcast) works between two
netdevs.

The python scripts and tools in this directory control test case
execution. They interact with wpa_supplicant and hostapd through their
control interfaces to perform the operations. In addition, wlantest_cli
and hwsim_test are used to verify that operations have been performed
correctly and that the network connection works in the expected way.

These test cases are run automatically against hostap.git commits for
regression testing and to help keep the hostap.git master branch in a
stable state. Results from these tests are available here:
http://buildbot.w1.fi:8010/waterfall


Building binaries for testing
-----------------------------

You will need to build (or use already built) components to be
tested. These are available in the hostap.git repository and can be
built for example as follows:

cd ../../wpa_supplicant
cp ../tests/hwsim/example-wpa_supplicant.config .config
make clean
make
cd ../hostapd
cp ../tests/hwsim/example-hostapd.config .config
make clean
make hostapd hlr_auc_gw
cd ../wlantest
make clean
make
cd ../mac80211_hwsim/tools
make

The test scripts can find the binaries in the locations where they were
built. It is also possible to install hwsim_test and wlantest_cli
somewhere on the search path to use pre-built tools.


wpaspy
------

The python scripts use wpaspy.py to interact with the wpa_supplicant
control interface, but the run-tests.py script adds the (relative)
path into the environment so it doesn't need to be installed.
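
As a quick illustration, wpaspy can also be used interactively; this
minimal sketch assumes wpa_supplicant is already running with a control
interface socket at /var/run/wpa_supplicant/wlan0 (the path is an
example and may differ on your system):

# minimal wpaspy sketch; the control socket path below is an example
from wpaspy import Ctrl

ctrl = Ctrl('/var/run/wpa_supplicant/wlan0')
print ctrl.request('PING')    # should return "PONG" if the interface is up
print ctrl.request('STATUS')  # dump the current connection status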


mac80211_hwsim
--------------

The mac80211_hwsim kernel module is available in the upstream Linux
kernel. Some Linux distributions enable it by default. If that's not
the case, you can either enable it in the kernel configuration
(CONFIG_MAC80211_HWSIM=m) and rebuild your kernel, or use Backports
with CPTCFG_MAC80211_HWSIM=m to replace the wireless LAN components in
the base kernel.
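
For example, the module could be loaded manually as follows (during
normal test runs, start.sh takes care of loading it with the parameters
the scripts expect):

# load mac80211_hwsim with five simulated radios
sudo modprobe mac80211_hwsim radios=5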


sudo
----

Some parts of the testing process require root privileges. The test
scripts currently use sudo to achieve this. To be able to run the
tests, you will probably want to configure sudo with a long enough
credential timeout so that the cached password entry does not expire
too quickly. For example, use this in the sudoers file:

Defaults        env_reset,timestamp_timeout=180

Or on a dedicated test system, you could even disable password prompting
with this in sudoers:

%sudo   ALL=NOPASSWD: ALL


Other network interfaces
------------------------

Some of the test scripts still use hardcoded interface names, so the
easiest way of making things work is to avoid other network devices
that may use conflicting interface names. For example, unload any
wireless LAN driver before running the tests and make sure that
wlan0..wlan4 get assigned as the interface names for the mac80211_hwsim
radios. It may also be possible to modify the interface names expected
by run-tests.py to allow other names to be used.


Running tests
-------------

The simplest way to run a full set of the test cases is to run
run-all.sh in the tests/hwsim directory. This will use start.sh to load
the mac80211_hwsim module and to start wpa_supplicant, hostapd, and
various test tools. run-tests.py is then used to run through all the
defined test cases and stop.sh to stop the programs and unload the
kernel module.

run-all.sh can be used to run the same test cases under different
conditions:

# run normal test cases
./run-all.sh

# run normal test cases under valgrind
./run-all.sh valgrind

# run normal test cases with Linux tracing
./run-all.sh trace

run-all.sh directs debug logs into the logs subdirectory (or $LOGDIR if
present in the environment). Log file names include the current UNIX
timestamp and a postfix to identify the specific log:
- *.log0 = wpa_supplicant debug log for the first radio
- *.log1 = wpa_supplicant debug log for the second radio
- *.log2 = wpa_supplicant debug log for the third radio
- *.hostapd = hostapd debug log
- hwsim0 = wlantest debug log
- hwsim0.pcapng = capture with all frames exchanged during the tests
- *.log = debug prints from the test scripts
- trace.dat = Linux tracing record (if enabled)
- hlr_auc_gw = hlr_auc_gw (EAP-SIM/AKA/AKA' authentication) log
- auth_serv = hostapd (RADIUS authentication server) log


For manual testing, ./start.sh can be used to initialize interfaces and
programs, and run-tests.py to execute one or more test
cases. run-tests.py output verbosity can be controlled with -d (more
verbose debug output) and -q (less verbose output) on the command
line. "-f <module name>" (pointing to the file test_<module name>.py)
can be used to run all test cases from a single file. A test name given
as the last command line argument specifies that a single test case is
to be run (e.g., "./run-tests.py ap_pmf_required").
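
For example (the module and test case names below are just examples
taken from the files in this directory):

# run all test cases from test_ap_wps.py with extra debug output
./run-tests.py -d -f ap_wps

# run a single test case
./run-tests.py ap_pmf_required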


Adding/modifying test cases
---------------------------

All the test cases are defined in the test_*.py files. These are python
scripts that can use the local helper classes to interact with the test
components. While various python constructs can be used in the scripts,
only a minimal level of python knowledge should really be needed to
modify and add new test cases. The easiest starting point for this is
likely to take a look at some of the example scripts. When working on a
new test, run-tests.py with -d and the test case name on the command
line is a convenient way of verifying functionality.

run-tests.py will automatically import all test cases from the test_*.py
files in this directory. All functions starting with the "test_" prefix
in these files are assumed to be test cases. Each test case is named by
the function name following the "test_" prefix.
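
As a rough sketch (not an existing test case), a new test case could
look something like the following; the helper calls used below
(hostapd.add_ap(), dev[i].connect(), hwsim_utils.test_connectivity())
follow patterns seen in the existing test_*.py files, so check those
files for the exact current APIs:

import hostapd
import hwsim_utils

def test_example_open(dev, apdev):
    """Example: open association between a station and a hostapd AP"""
    # start an open AP on the first hostapd-controlled radio
    hostapd.add_ap(apdev[0]['ifname'], {"ssid": "test-open"})
    # connect the first wpa_supplicant-controlled station to that AP
    dev[0].connect("test-open", key_mgmt="NONE", scan_freq="2412")
    # verify that data connectivity works over the new association
    hwsim_utils.test_connectivity(dev[0].ifname, apdev[0]['ifname'])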


Results database
----------------

run-tests.py can be requested to write the results from the execution
of each test case into an sqlite database. The "-S <path to database>"
and "-b <build id>" command line arguments can be used for that. The
database must have been prepared beforehand, e.g., with the following:

cat | sqlite3 /tmp/example.db <<EOF
CREATE TABLE results (test,result,run,time,duration,build,commitid);
CREATE INDEX results_idx ON results (test);
CREATE INDEX results_idx2 ON results (run);
CREATE TABLE tests (test,description);
EOF
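
After that, the results could be recorded with something like the
following (the database path and build id below are arbitrary examples):

./run-tests.py -S /tmp/example.db -b my-build-1 ap_pmf_required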