This repository contains scripts to simulate the effect of channel and CFO variations on wireless fingerprinting using complex-valued CNNs. It also includes a simulation-based dataset built from models of typical circuit-level nonlinearities, along with augmentation techniques, estimation techniques, and combinations of the two. The repository consists of four folders, namely cxnn, preproc, tests, and data, plus a number of scripts in the main folder.
We have created a simulation-based WiFi dataset using models of typical nonlinearities, implementing two kinds of circuit-level impairments: I/Q imbalance and power amplifier nonlinearity. The training set consists of 200 signals per device for 19 devices (classes); the validation and test sets contain 100 signals per device each. Overall, the dataset contains 3800 signals for training, 1900 for validation, and 1900 for testing. The dataset can be downloaded as an npz file from this Box link and needs to be copied into the data subdirectory.
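As a quick sanity check after downloading, the archive can be inspected with NumPy (a minimal sketch; the key names in the commented-out lines are placeholders, so list the actual keys first):

```python
import numpy as np

# Load the simulated WiFi dataset (path assumes the file was copied into data/).
archive = np.load('data/simulations.npz')
print(archive.files)  # lists the actual array names stored in the archive

# Hypothetical key names for illustration only -- replace with the names printed above.
# x_train, y_train = archive['x_train'], archive['y_train']
# print(x_train.shape, y_train.shape)  # expect 3800 training signals across 19 classes
```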
Further details can be found in our paper.
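For intuition, below is a minimal NumPy sketch of textbook versions of the two impairments (illustrative only; the parameter values and the exact models used to generate the dataset are described in the paper):

```python
import numpy as np

def iq_imbalance(x, gain_db=0.5, phase_deg=2.0):
    """Textbook I/Q imbalance on a complex baseband signal: y = mu*x + nu*conj(x)."""
    g = 10.0 ** (gain_db / 20.0)
    phi = np.deg2rad(phase_deg)
    mu = (1 + g * np.exp(-1j * phi)) / 2
    nu = (1 - g * np.exp(1j * phi)) / 2
    return mu * x + nu * np.conj(x)

def pa_am_am(x, alpha=2.0, beta=1.0):
    """Saleh-style AM/AM power amplifier compression (AM/PM distortion omitted)."""
    r = np.abs(x)
    return (alpha / (1 + beta * r ** 2)) * x
```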
This repo can be cloned with the following command:
```bash
git clone https://github.com/metehancekic/wireless-fingerprinting.git
```
Since the complex-valued neural networks are implemented in Keras with a Theano backend, the modules listed in requirements.txt must be installed before running experiments. We strongly recommend installing Miniconda, creating a virtual environment, and running the following commands, which build the environment needed to run the code in this repository.
```bash
conda create -n cxnn python=2.7
conda activate cxnn
pip install -r requirements.txt
conda install mkl-service
conda install -c conda-forge resampy
```
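After installing the requirements, it is worth confirming that Keras actually picks up the Theano backend (a minimal check; run it with `KERAS_BACKEND=theano` set, as in the commands further below):

```python
# Verify the active Keras backend; this should print 'theano', not 'tensorflow'.
from keras import backend as K
print(K.backend())
```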
For GPU usage:
```bash
conda install -c anaconda pygpu
```
The following CUDA and cuDNN versions were used:

CUDA 9.0.176, cuDNN 7.3.1
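A minimal way to confirm that Theano is configured for the GPU (this assumes the device has been selected via `THEANO_FLAGS=device=cuda` or an equivalent `.theanorc` entry):

```python
# Prints the device Theano is configured to use; expect 'cuda' (or 'cuda0') for GPU runs.
import theano
print(theano.config.device)
```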
The code can be run with default parameters (i.e., without channel and CFO effects) using:
```bash
KERAS_BACKEND=theano python cfo_channel_training_simulations.py
KERAS_BACKEND=theano python cfo_channel_testing_simulations.py
```
Controlled experiments emulating the effects of frequency drift and channel variations are set up via "experiment_setup.py", or the corresponding options can be passed explicitly on the command line. All hyper-parameters for these experiments are in "configs_train.json" and "configs_test.json" for the training and testing code, respectively.
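The JSON files can be edited directly; a minimal sketch of loading one for inspection (the hyper-parameter names inside are defined by the repository, so consult the files themselves for the available options):

```python
import json

# Load the training hyper-parameters; configs_test.json follows the same pattern.
with open('configs_train.json') as f:
    train_config = json.load(f)
print(sorted(train_config.keys()))
```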
For detailed information about the arguments, use the following command:
```bash
KERAS_BACKEND=theano python cfo_channel_training_simulations.py --help
```
```
usage: cfo_channel_testing_simulations.py [-h]
       [-a {reim,reim2x,reimsqrt2x,magnitude,phase,re,im,modrelu,crelu}]
       [-phy_ch] [-phy_cfo] [-comp_cfo] [-eq_tr] [-aug_ch] [-aug_cfo] [-res]
       [-eq_test] [-comp_cfo_test] [-aug_ch_test] [-aug_cfo_test]

optional arguments:
  -h, --help                             show this help message and exit
  -a {reim,reim2x,reimsqrt2x,magnitude,phase,re,im,modrelu,crelu}, --architecture {reim,reim2x,reimsqrt2x,magnitude,phase,re,im,modrelu,crelu}
                                         Architecture

setup: Setup for experiments
  -phy_ch, --physical_channel            Emulate the effect of channel variations, default = False
  -phy_cfo, --physical_cfo               Emulate the effect of frequency variations, default = False
  -comp_cfo, --compensate_cfo            Compensate frequency of training set, default = False
  -eq_tr, --equalize_train               Equalize training set, default = False
  -aug_ch, --augment_channel             Augment channel for training set, default = False
  -aug_cfo, --augment_cfo                Augment cfo for training set, default = False
  -res, --obtain_residuals               Obtain residuals for both train and test set, default = False
  -comp_cfo_test, --compensate_cfo_test  Compensate frequency of test set, default = False

test setup: Test Setup for experiments
  -eq_test, --equalize_test              Equalize test set, default = False
  -aug_ch_test, --augment_channel_test   Augment channel for test set, default = False
  -aug_cfo_test, --augment_cfo_test      Augment cfo for test set, default = False
```
To run the code with the different-day scenario (channel and CFO variations):
```bash
KERAS_BACKEND=theano python cfo_channel_training_simulations.py --physical_channel --physical_cfo --augment_channel --augment_cfo
KERAS_BACKEND=theano python cfo_channel_testing_simulations.py --physical_channel --physical_cfo --augment_channel --augment_cfo --augment_channel_test --augment_cfo_test
```
```
project
│   README.md
│   cfo_channel_training_simulations.py   Training code for all the experiments
│   cfo_channel_testing_simulations.py    Testing code from checkpoints
│   configs_train.json                    All hyper-parameters for training
│   configs_test.json                     All hyper-parameters for testing
│   simulators.py                         All simulations (CFO, channel, residuals, etc.) as functions
│   experiment_setup.py                   Experiment setup (use different-day channel/CFO effects or not)
│
└───cxnn
│   │   models.py                         Neural network architectures
│   │   train.py                          Training function
│   │   train_network_reim_mag.py         Training function for real and complex networks
│   │
│   └───complexnn
│           Complex-valued neural network implementation code
│           ...
│
└───preproc
│   │   fading_model.py                   Signal processing tools (fading models, etc.)
│   │   preproc_wifi                      Preprocessing tools (equalization, etc.)
│
└───tests
│   │   test_aug_analysis.py
│   │   visualize_offset.py
│
└───data
        simulations.npz                   Simulated WiFi dataset
```
Complex layers are from Trabelsi et al., "Deep Complex Networks", ICLR 2018 (MIT license), with the addition of cxnn/complexnn/activations.py, which contains the ModReLU activation function.
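The ModReLU activation applies a ReLU to the magnitude of each complex activation while preserving its phase: modReLU(z) = ReLU(|z| + b) * z / |z|, with a learnable bias b. A minimal NumPy sketch of this formula (illustrative; the actual layer in cxnn/complexnn/activations.py is implemented for Keras):

```python
import numpy as np

def modrelu(z, b, eps=1e-9):
    """ModReLU: relu(|z| + b) * z / |z|, applied elementwise to a complex array z."""
    mag = np.abs(z)
    return np.maximum(mag + b, 0.0) * (z / (mag + eps))
```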
```
@inproceedings{fingerprinting2019globecom,
  author    = {Gopalakrishnan, Soorya and Cekic, Metehan and Madhow, Upamanyu},
  booktitle = {IEEE Global Communications Conference (Globecom)},
  location  = {Waikoloa, Hawaii},
  title     = {Robust Wireless Fingerprinting via Complex-Valued Neural Networks},
  year      = {2019}
}

@inproceedings{9723393,
  author    = {Cekic, Metehan and Gopalakrishnan, Soorya and Madhow, Upamanyu},
  booktitle = {2021 55th Asilomar Conference on Signals, Systems, and Computers},
  title     = {Wireless Fingerprinting via Deep Learning: The Impact of Confounding Factors},
  year      = {2021},
  pages     = {677-684},
  doi       = {10.1109/IEEECONF53345.2021.9723393}
}
```