Deep RNN-Based Network Traffic Classification Scheme in Edge Computing System


Kwihoon Kim, Joohyung Lee, Hyun-Kyo Lim, Se Won Oh, Youn-Hee Han




This paper proposes a deep recurrent neural network (RNN)-based traffic classification scheme (deep RNN-TCS) for classifying applications from traffic patterns in a hybrid edge computing and cloud computing architecture. Although traffic can also be classified on a cloud server, transferring packets to the server introduces a time delay; classification can therefore run in near real time only when it is performed on edge computing nodes. Training, however, is time-consuming and requires substantial computing resources to learn traffic patterns. It is thus efficient to perform training on the cloud server and serving (inference) on the edge computing nodes. In the proposed scheme, the cloud server collects and stores application packets together with their corresponding output labels, trains on these data, and generates inferred functions; an edge computing node then receives the inferred functions and executes the classification. Compared to deep packet inspection (DPI), which requires periodic verification of existing signatures and updated application information (e.g., versions adding new features), the proposed scheme classifies applications in an automated manner, since deep learning can construct traffic classifiers automatically when sufficient data are available. Specifically, the input features and output labels for classification are defined as traffic packets and target applications, respectively, with the packets encoded as two-dimensional images. Traffic packets measured at Universitat Politecnica de Catalunya Barcelonatech were used as the training data, and the proposed deep RNN-TCS is implemented using a deep long short-term memory (LSTM) network. Extensive simulation-based experiments verify that the proposed deep RNN-TCS achieves an almost 5% improvement in accuracy (96% accuracy) while operating 500 times faster (in elapsed time) than the conventional scheme.
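As a rough illustration of the pipeline described above, the following sketch shows how packet bytes might be arranged as a two-dimensional image and fed row by row to a stacked (deep) LSTM classifier. This is a minimal sketch under stated assumptions, not the authors' implementation: the image dimensions (28x28 bytes per packet), layer sizes, number of target applications, and all names such as `packet_to_image` are illustrative choices, not values taken from the paper.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

# Illustrative assumptions (not from the paper): each packet's first
# 784 payload bytes are normalized and reshaped into a 28x28 "image";
# the LSTM reads the image one row (28 bytes) per time step.
ROWS, COLS = 28, 28
NUM_APPS = 10  # number of target application labels (hypothetical)

def packet_to_image(payload: bytes) -> np.ndarray:
    """Truncate/zero-pad a packet payload to ROWS*COLS bytes and
    scale it to [0, 1], yielding a 2-D input for the RNN."""
    buf = np.zeros(ROWS * COLS, dtype=np.float32)
    data = np.frombuffer(payload[:ROWS * COLS], dtype=np.uint8)
    buf[:len(data)] = data / 255.0
    return buf.reshape(ROWS, COLS)

# A deep (stacked) LSTM classifier, trained on the cloud server.
model = models.Sequential([
    layers.Input(shape=(ROWS, COLS)),          # one row of bytes per time step
    layers.LSTM(128, return_sequences=True),   # first recurrent layer
    layers.LSTM(128),                          # second recurrent layer
    layers.Dense(NUM_APPS, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# After training, the saved model (the "inferred function") would be
# shipped to edge nodes, which run only the lightweight forward pass:
# edge_model = tf.keras.models.load_model("rnn_tcs.keras")
# app_id = edge_model.predict(image_batch).argmax(axis=-1)
```

This split mirrors the division of labor the abstract describes: the expensive training step stays in the cloud, while the edge node only loads the trained model and performs near-real-time inference on arriving packets.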