Sequential Safe Static and Dynamic Screening Rule for Accelerating Support Tensor Machine

With the continuous advancement of data acquisition technology, it has become very easy to obtain large amounts of high-dimensional data containing multiple features, such as image and vision data. However, traditional machine learning methods, especially those based on vectors and matrices, face challenges such as the curse of dimensionality, increased computational complexity, and model overfitting. To address these issues, tensors, as a more flexible form of multi-dimensional array representation than vectors and matrices, handle high-dimensional data better. Consequently, tensor-based machine learning methods are gradually becoming a focal point of academic research.

Support Tensor Machine (STM) is an effective tensor classification strategy inspired by Support Vector Machine (SVM), alternating projection techniques, and multilinear algebra operations. STM handles complex tensor data by finding a maximum-margin classification hyperplane, and it performs strongly in classification tasks. Although a series of improved STM variants based on different tensor decomposition methods has been proposed recently, such as the Higher-order Support Tensor Machine (HSTM), the Support Tucker Machine (STuM), and the Support Tensor Train Machine (STTM), the traditional STM offers shorter computational time at comparable accuracy. Nevertheless, making STM efficient at scale remains a significant challenge, because its alternating scheme must repeatedly solve SVM-like sub-models.

The article “Sequential Safe Static and Dynamic Screening Rule for Accelerating Support Tensor Machine,” jointly written by Hongmei Wang from the School of Business, Shandong Normal University; Kun Jiang from the School of Mathematics and Artificial Intelligence, Qilu University of Technology (Shandong Academy of Sciences); Xiao Li from the College of Information and Electrical Engineering, China Agricultural University; and Yitian Xu from the College of Science, China Agricultural University, aims to address the aforementioned issues. This study proposes an efficient Sequential Safe Static and Dynamic Screening Rule (SS-SDSR) to accelerate STM. The paper will be published in the journal Neural Networks.

Research Background and Significance

With the gradual progress of data acquisition technology, obtaining high-dimensional data containing numerous features, such as image and vision data, has become increasingly easy. However, traditional machine learning methods based on vectors and matrices face challenges such as the curse of dimensionality, increased computational complexity, and model overfitting. To address these issues, tensors, as a more flexible form of data representation than vectors and matrices, provide a more effective way to handle high-dimensional data. Consequently, tensor-based machine learning methods have become an important area of academic research.

Support Tensor Machine (STM) is an effective tensor classification strategy that originates from Support Vector Machine (SVM) and relies on alternating projection techniques and multilinear algebra operations. STM handles complex tensor data by finding a maximum-margin classification hyperplane and demonstrates excellent performance in classification tasks. However, the traditional STM's alternating projection iterations are very time-consuming. To overcome this drawback, this study proposes an efficient Sequential Safe Static and Dynamic Screening Rule (SS-SDSR) to accelerate STM. The main idea is to shrink each projection sub-model by identifying and removing redundant variables before and during training, without sacrificing accuracy.
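As background for how the alternating projection scheme works, here is a minimal sketch for the rank-1, matrix-data case (an illustrative simplification, not the authors' implementation; the helper names `hinge_svm` and `stm_rank1` are hypothetical): the weight tensor is the outer product of two vectors u and v, and each alternating step projects every sample along one mode and solves an ordinary linear SVM sub-problem, here with a plain subgradient solver.

```python
import numpy as np

def hinge_svm(X, y, lam=0.1, lr=0.01, epochs=200):
    """Plain subgradient solver for a linear SVM sub-problem
    (a stand-in for whatever SVM solver STM calls internally)."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        viol = y * (X @ w + b) < 1                   # margin violators
        gw = lam * w - (y[viol, None] * X[viol]).sum(axis=0) / n
        gb = -y[viol].sum() / n
        w -= lr * gw
        b -= lr * gb
    return w, b

def stm_rank1(Xs, y, iters=5):
    """Alternating projection for a rank-1 STM on matrix samples:
    decision function f(X) = u^T X v + b."""
    m, n = Xs[0].shape
    u = np.ones(m) / np.sqrt(m)
    v = np.ones(n) / np.sqrt(n)
    b = 0.0
    for _ in range(iters):
        # fix v: each sample becomes the vector X v, solve an SVM for u
        u, b = hinge_svm(np.stack([X @ v for X in Xs]), y)
        # fix u: each sample becomes the vector X^T u, solve an SVM for v
        v, b = hinge_svm(np.stack([X.T @ u for X in Xs]), y)
    return u, v, b

# toy usage: two classes of 4x3 matrices with shifted means
rng = np.random.default_rng(0)
y = np.repeat([1.0, -1.0], 20)
Xs = [rng.normal(size=(4, 3)) + c for c in y]
u, v, b = stm_rank1(Xs, y)
acc = float(np.mean(np.sign([u @ X @ v + b for X in Xs]) == y))
```

Each outer iteration solves two small SVMs, which is exactly why repeated iterations become expensive at scale and why shrinking each sub-problem pays off.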

Research Origin

This paper was jointly written by Hongmei Wang from the School of Business, Shandong Normal University; Kun Jiang from the School of Mathematics and Artificial Intelligence, Qilu University of Technology (Shandong Academy of Sciences); Xiao Li from the College of Information and Electrical Engineering, China Agricultural University; and Yitian Xu from the College of Science, China Agricultural University. The article will be published in the journal Neural Networks. The manuscript was received on October 25, 2023, revised on March 31, 2024, and accepted on May 21, 2024.

Research Workflow


The study proposes a Sequential Safe Static and Dynamic Screening Rule (SS-SDSR) to accelerate STM. It primarily includes the following steps:

  1. Static Screening Rule (SSR): Construct a static screening rule based on a Variational Inequality (VI) to filter out most redundant features/samples before training begins.
  2. Dynamic Screening Rule (DSR): Construct a dynamic screening rule based on the duality gap to continuously filter redundant features/samples during training.
  3. Sequential Screening Process: Combine SSR and DSR: at each parameter adjustment, first use SSR to remove most of the useless variables, then use DSR to filter the remaining redundant variables as training proceeds.
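The sequential process above can be sketched in code. This is a hedged illustration on a plain linear SVM sub-problem without intercept, not the paper's exact STM sub-models: before training at each parameter C, the warm-started solution is screened once (the static step), and during dual coordinate descent a gap-based screen is reapplied periodically (the dynamic step). The screening rule used here is the standard gap-safe rule for SVMs, which may differ in detail from the paper's VI-based SSR; the function names are hypothetical.

```python
import numpy as np

def duality_gap(w, alpha, X, y, C):
    """Primal-dual gap of the (intercept-free) linear SVM sub-problem."""
    hinge = np.maximum(0.0, 1.0 - y * (X @ w))
    P = 0.5 * w @ w + C * hinge.sum()
    D = alpha.sum() - 0.5 * w @ w       # valid because w = sum_i alpha_i y_i x_i
    return max(P - D, 0.0)

def gap_screen(w, alpha, X, y, C, active):
    """Gap-safe screen: certify alpha_i* = 0 or = C and fix those coordinates.
    Mutates `alpha`; returns the updated (active, w)."""
    r = np.sqrt(2.0 * duality_gap(w, alpha, X, y, C))   # ||w* - w|| <= r
    scores = y * (X @ w)
    norms = np.linalg.norm(X, axis=1)
    zero = active & (scores - r * norms > 1)   # margin certainly satisfied
    full = active & (scores + r * norms < 1)   # margin certainly violated
    for i in np.flatnonzero(zero | full):
        new = C if full[i] else 0.0
        w = w + (new - alpha[i]) * y[i] * X[i]
        alpha[i] = new
    return active & ~zero & ~full, w

def train_path(X, y, Cs, epochs=50, screen_every=10):
    """Sequential screening along a C-path: static screen on the warm start,
    then dual coordinate descent with periodic dynamic screening."""
    n, _ = X.shape
    alpha = np.zeros(n)
    sqn = (X ** 2).sum(axis=1)
    active_sizes = []
    for C in Cs:
        alpha = np.clip(alpha, 0.0, C)               # warm start from previous C
        w = X.T @ (alpha * y)
        active, w = gap_screen(w, alpha, X, y, C, np.ones(n, bool))  # static step
        for ep in range(epochs):
            for i in np.flatnonzero(active):         # coordinate descent on survivors
                g = 1.0 - y[i] * (X[i] @ w)
                new = np.clip(alpha[i] + g / sqn[i], 0.0, C)
                w += (new - alpha[i]) * y[i] * X[i]
                alpha[i] = new
            if (ep + 1) % screen_every == 0:
                active, w = gap_screen(w, alpha, X, y, C, active)    # dynamic step
        active_sizes.append(int(active.sum()))
    return w, alpha, active_sizes

# toy usage: labels driven by the first feature
rng = np.random.default_rng(1)
X = rng.normal(size=(80, 5))
y = np.where(X[:, 0] + 0.1 * rng.normal(size=80) > 0, 1.0, -1.0)
w, alpha, active_sizes = train_path(X, y, Cs=[0.1, 1.0, 10.0])
acc = float(np.mean(np.sign(X @ w) == y))
```

The "safe" property comes from the fact that screened coordinates are fixed only when their optimal value is certified by the gap bound, so the reduced problem has the same solution as the full one.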

In the experimental part, the research team used artificial datasets to verify how different parameter intervals, screening frequencies, and data forms affect the method. The results showed that the method works effectively regardless of the data form, especially when the parameter interval is small and the screening frequency is appropriate. In addition, numerical experiments were conducted on eleven vector datasets and six tensor datasets, and the results were compared with five other algorithms. The experiments demonstrated that the method outperforms the other algorithms in terms of efficiency and safety.

Main Results

Static Screening Rule (SSR)

Using variational inequalities, the research team constructed a static screening rule for STM. SSR filters out most redundant variables before training, reducing the problem's size and speeding up the solution process.

Dynamic Screening Rule (DSR)

Based on the duality gap, the research team proposed a dynamic screening rule. DSR continuously filters redundant variables during training and is embedded directly into the solving algorithm. Experiments showed that this effectively accelerates model training, with substantial screening rates.
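The kind of duality-gap bound that underlies such a rule can be stated for a linear SVM sub-problem. The following is the standard gap-safe argument, given for illustration; the paper's exact rule may differ:

```latex
% Primal and dual of the (intercept-free) linear SVM sub-problem
P(w) = \tfrac{1}{2}\|w\|^2 + C\sum_{i=1}^{n}\max\bigl(0,\,1-y_i\langle x_i,w\rangle\bigr),
\qquad
D(\alpha) = \sum_{i=1}^{n}\alpha_i - \tfrac{1}{2}\Bigl\|\sum_{i=1}^{n}\alpha_i y_i x_i\Bigr\|^2 .

% P is 1-strongly convex and D(\alpha) \le P(w^*), so any feasible
% primal-dual pair (w, \alpha) localizes the optimum w^*:
\|w^* - w\| \le \sqrt{2\bigl(P(w)-D(\alpha)\bigr)} =: r .

% Gap-safe screening of sample i:
y_i\langle x_i,w\rangle - r\|x_i\| > 1 \;\Rightarrow\; \alpha_i^* = 0,
\qquad
y_i\langle x_i,w\rangle + r\|x_i\| < 1 \;\Rightarrow\; \alpha_i^* = C .
```

Because the gap shrinks as training converges, the radius r shrinks too, so more and more variables can be certified and removed as the iterations proceed.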

Experimental Results

On vector datasets, the method significantly reduced training time while maintaining the accuracy of the original algorithm. For example, SS-SDSR was 2.35 times faster than the original STM on the spambase dataset and 3.02 times faster on the htru dataset.

On tensor datasets, SS-SDSR also performed well. On the mnist dataset, the method reduced training time to 11.35% of the original STM's while maintaining the same accuracy. Experiments on six tensor datasets further demonstrated that SS-SDSR effectively filters out redundant variables and achieves significant acceleration.

Research Conclusions

The research team summarized the advantages of this method, noting that SS-SDSR accelerates the training of Support Tensor Machine without sacrificing accuracy by removing redundant variables. This provides an effective tool for handling large-scale tensor data, with broad potential applications.

Finally, the research team pointed out that although SS-SDSR performs well both theoretically and experimentally, there is still room for further research on extending it to more complex models and achieving further acceleration. Their future work will focus on constructing additional SS-SDSR variants to address these issues.

Highlights Summary

  1. Innovation: First to extend the safe screening rule to tensor space, accelerating the training process of Support Tensor Machine.
  2. Theoretical Assurance: Guaranteed the safety and effectiveness of the method through rigorous optimality conditions and sparsity theory.
  3. Experimental Verification: Conducted validation on multiple datasets, demonstrating the method’s efficiency and accuracy.

These research findings not only have significant scientific value but also provide new methods and tools for practical applications. This paper will further promote the development of tensor-based machine learning methods and offer new insights into solving high-dimensional data problems. The research team hopes to further refine this method and apply it to more complex machine learning models.