Accelerating In-Vehicle Network Intrusion Detection System Using Binarized Neural Network

Journal Article
2022-01-0156
ISSN: 2641-9645, e-ISSN: 2641-9645
Published March 29, 2022 by SAE International in United States
Citation: Zhang, L., Yan, X., and Ma, D., "Accelerating In-Vehicle Network Intrusion Detection System Using Binarized Neural Network," SAE Int. J. Adv. & Curr. Prac. in Mobility 4(6):2037-2050, 2022, https://doi.org/10.4271/2022-01-0156.
Language: English

Abstract:

Controller Area Network (CAN), the de facto standard for in-vehicle networks, has insufficient security features and thus is inherently vulnerable to various attacks. To protect the CAN bus from attacks, intrusion detection systems (IDSs) based on advanced deep learning methods, such as Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs), have been proposed to detect intrusions. However, those models generally introduce high latency, require considerable memory, and often result in high energy consumption. To accelerate intrusion detection and reduce memory requirements, we exploit the use of a Binarized Neural Network (BNN) and hardware-based acceleration for intrusion detection in in-vehicle networks. Because a BNN uses binary values for activations and weights rather than full-precision values, it usually achieves faster computation, lower memory cost, and lower energy consumption than full-precision models. Moreover, unlike other deep learning methods, a BNN can be further accelerated with Field-Programmable Gate Arrays (FPGAs), since binarization reduces the hardware resources required. We design our BNN model to suit CAN traffic data and exploit sequential features of the CAN traffic instead of individual messages. We evaluate the proposed IDS with four real vehicle datasets. Our experimental results show that the proposed BNN-based IDS reduces detection latency on the same CPU (3 times faster) while maintaining acceptable detection rates compared to full-precision models. We also implement the proposed IDS on FPGA hardware to further reduce latency and accelerate intrusion detection. Our experiments on multiple platforms demonstrate that the FPGA implementation dramatically reduces detection latency (128 times faster) with lower power consumption compared to an embedded CPU.
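For illustration only (this is not the authors' implementation), the core idea of a BNN layer is to replace full-precision weights and activations with their signs in the forward pass while using a straight-through estimator for gradients during training. The PyTorch sketch below assumes hypothetical module names (`BinarizeSTE`, `BinarizedLinear`) and an example input size for a flattened window of CAN frames; the actual architecture and feature encoding used in the paper may differ.

```python
import torch
import torch.nn as nn


class BinarizeSTE(torch.autograd.Function):
    """Sign binarization with a straight-through estimator (STE) backward pass."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.sign(x)  # values in {-1, 0, +1}; 0 only for exact zeros

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        # Pass gradients through only where |x| <= 1 (hard-tanh clipping).
        return grad_output * (x.abs() <= 1).float()


class BinarizedLinear(nn.Linear):
    """Linear layer whose inputs and weights are binarized in the forward pass."""

    def forward(self, x):
        binary_weight = BinarizeSTE.apply(self.weight)
        binary_input = BinarizeSTE.apply(x)
        return nn.functional.linear(binary_input, binary_weight, self.bias)


# Hypothetical example: classify a window of consecutive CAN messages
# (sequential features) as normal or attack traffic.
model = nn.Sequential(
    BinarizedLinear(29 * 29, 128),  # e.g. a flattened 29x29 bit window of CAN IDs (assumed size)
    nn.BatchNorm1d(128),
    BinarizedLinear(128, 2),
)
```

In a deployed BNN, the binarized weights and activations can be packed into bit vectors so that multiply-accumulate operations reduce to XNOR and popcount, which is what makes the FPGA acceleration described in the abstract attractive.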