Abstract: An adaptive step size is introduced into the Stochastic Variance Reduced Gradient (SVRG) algorithm, and the numerical performance of the algorithm is improved on this basis. First, the step size of the SVRG algorithm is computed by a Barzilai–Borwein (BB) step size with two-dimensional quadratic termination. Then, a stopping criterion and negative momentum are introduced into the inner loop of SVRG to accelerate convergence. Numerical experiments on the proposed algorithm were carried out in MATLAB and its performance was observed. The experimental results show that the algorithm's performance is comparable to that of SVRG with the best hand-tuned step size. In addition, the new algorithm is insensitive to the choice of the initial step size and automatically generates a suitable step size.
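To make the ingredients concrete, the following is a minimal sketch of SVRG combined with a BB step size and a simple negative-momentum term, on a toy least-squares problem. It is an illustration under stated assumptions, not the paper's exact method: the specific BB rule with two-dimensional quadratic termination, the inner-loop stopping criterion, and the momentum coefficient `beta` are not specified in the abstract, so the versions used here (the standard SVRG-BB update and a subtracted momentum term) are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy least-squares problem: f(x) = (1/2n) * sum_i (a_i^T x - b_i)^2
n, d = 200, 10
A = rng.standard_normal((n, d))
x_star = rng.standard_normal(d)
b = A @ x_star  # noiseless, so x_star is the exact minimizer

def full_grad(x):
    return A.T @ (A @ x - b) / n

def stoch_grad(x, i):
    return A[i] * (A[i] @ x - b[i])

def svrg_bb(x0, eta0=0.01, m=2 * n, epochs=40, beta=0.2):
    """SVRG with a BB step size and a negative-momentum inner update.

    This is a sketch: the BB step uses the standard SVRG-BB rule
    eta_k = ||s||^2 / (m * s^T y); the paper's two-dimensional quadratic
    termination and stopping criterion are not reproduced here.
    """
    x = x0.copy()
    prev_x, prev_g = None, None
    eta = eta0
    for _ in range(epochs):
        g = full_grad(x)                      # full gradient at the snapshot
        if prev_x is not None:
            s, y = x - prev_x, g - prev_g
            denom = m * (s @ y)
            if abs(denom) > 1e-12:
                eta = (s @ s) / denom         # BB step size (assumed SVRG-BB rule)
        prev_x, prev_g = x.copy(), g
        w, w_old = x.copy(), x.copy()
        for _ in range(m):
            i = rng.integers(n)
            # Variance-reduced stochastic gradient (control variate at snapshot x)
            v = stoch_grad(w, i) - stoch_grad(x, i) + g
            # Negative momentum: the history term is subtracted, not added
            w_new = w - eta * v - beta * (w - w_old)
            w_old, w = w, w_new
        x = w                                 # new snapshot = last inner iterate
    return x

x_hat = svrg_bb(np.zeros(d))
```

Note that the initial step size `eta0` only affects the first epoch; from the second epoch on, the BB rule overwrites it, which is one way the method becomes insensitive to the initial choice.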