Paper Title

Semi-WTC: A Practical Semi-supervised Framework for Attack Categorization through Weight-Task Consistency

Paper Authors

Zihan Li, Wentao Chen, Zhiqing Wei, Xingqi Luo, Bing Su

Paper Abstract

Supervised learning has been widely used for attack categorization, requiring high-quality data and labels. However, the data is often imbalanced and it is difficult to obtain sufficient annotations. Moreover, supervised models are subject to real-world deployment issues, such as defending against unseen artificial attacks. To tackle these challenges, we propose a semi-supervised fine-grained attack categorization framework consisting of an encoder and a two-branch structure, and this framework can be generalized to different supervised models. A multilayer perceptron with residual connections is used as the encoder to extract features and reduce complexity. The Recurrent Prototype Module (RPM) is proposed to train the encoder effectively in a semi-supervised manner. To alleviate the data imbalance problem, we introduce Weight-Task Consistency (WTC) into the iterative process of RPM by assigning larger weights to classes with fewer samples in the loss function. In addition, to cope with new attacks in real-world deployment, we propose an Active Adaption Resampling (AAR) method, which can better discover the distribution of unseen sample data and adapt the parameters of the encoder. Experimental results show that our model outperforms state-of-the-art semi-supervised attack detection methods with a 3% improvement in classification accuracy and a 90% reduction in training time.
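Two of the mechanisms named in the abstract can be illustrated concretely: an MLP encoder with a residual connection, and a loss that assigns larger weights to classes with fewer labeled samples. Below is a minimal sketch assuming PyTorch; the class names, layer sizes, and the inverse-frequency weighting formula are illustrative assumptions, not the authors' released implementation of RPM, WTC, or AAR.

```python
# Illustrative sketch only (assumes PyTorch); not the paper's official code.
# Shows: (1) an MLP encoder with a residual connection, and
# (2) a class-weighted loss that up-weights classes with fewer samples.
import torch
import torch.nn as nn

class ResidualMLPEncoder(nn.Module):
    """MLP encoder with a residual (skip) connection, as described in the abstract."""
    def __init__(self, in_dim, hidden_dim, num_classes):
        super().__init__()
        self.proj = nn.Linear(in_dim, hidden_dim)
        self.block = nn.Sequential(
            nn.Linear(hidden_dim, hidden_dim),
            nn.ReLU(),
            nn.Linear(hidden_dim, hidden_dim),
        )
        self.head = nn.Linear(hidden_dim, num_classes)

    def forward(self, x):
        h = torch.relu(self.proj(x))
        h = h + self.block(h)          # residual connection
        return self.head(h)

def inverse_frequency_weights(class_counts):
    """Larger weights for classes with fewer labeled samples (one plausible
    reading of the WTC weighting; the exact formula in the paper may differ)."""
    counts = torch.tensor(class_counts, dtype=torch.float)
    return counts.sum() / (len(counts) * counts)

# Hypothetical usage: 10 attack classes with highly imbalanced label counts.
encoder = ResidualMLPEncoder(in_dim=64, hidden_dim=128, num_classes=10)
weights = inverse_frequency_weights([5000, 1200, 300, 80, 40, 25, 20, 10, 8, 5])
criterion = nn.CrossEntropyLoss(weight=weights)  # rare classes contribute more to the loss

x = torch.randn(32, 64)                  # dummy batch of flow features
y = torch.randint(0, 10, (32,))          # dummy labels
loss = criterion(encoder(x), y)
loss.backward()
```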
