In neural network training, the gradient descent method suffers from two drawbacks: its parameters must be set manually, and it easily falls into local minima. This paper proposes an improved gradient descent method with adaptive parameter setting and global search capability. The proposed adaptive algorithm requires no prior knowledge to obtain suitable parameters; parameters such as the learning rate are adjusted according to the training stage without human intervention. The structure of the gradient descent method is improved with evolutionary computation, and a lightweight population is introduced into the method for global search and fast convergence. In addition, a meta-heuristic strategy makes the population more intelligent and improves robustness. Experimental analysis shows that the proposed algorithm achieves strong global search and generalization capabilities.
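The abstract does not specify the exact update rules, so the following is only an illustrative sketch of the two ideas it combines: a self-adjusting learning rate (here, a simple "bold driver" rule that grows the rate on improvement and shrinks it otherwise) and a lightweight population with evolutionary selection for global search. The function `population_gd`, the Rastrigin test objective, and all hyperparameters are assumptions for this demo, not the paper's method.

```python
import math
import random

def rastrigin(x):
    """1-D Rastrigin function: many local minima, global minimum f(0) = 0."""
    return 10.0 + x * x - 10.0 * math.cos(2.0 * math.pi * x)

def grad(f, x, h=1e-6):
    """Central-difference numerical gradient."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

def population_gd(f, pop_size=30, steps=200, select_every=20, seed=0):
    """Population-based gradient descent with per-individual adaptive
    learning rates and periodic evolutionary selection (illustrative only)."""
    rng = random.Random(seed)
    # Each individual carries its own position and learning rate,
    # so no single learning rate has to be hand-tuned in advance.
    pop = [[rng.uniform(-5.0, 5.0), rng.uniform(1e-3, 1e-1)]
           for _ in range(pop_size)]
    best_x, best_f = min(((x, f(x)) for x, _ in pop), key=lambda t: t[1])
    for step in range(1, steps + 1):
        for ind in pop:
            x, lr = ind
            x_new = x - lr * grad(f, x)
            # Adaptive rule (assumed): grow the learning rate on improvement,
            # otherwise reject the step and shrink the rate.
            if f(x_new) < f(x):
                ind[0], ind[1] = x_new, lr * 1.05
            else:
                ind[1] = lr * 0.5
            if f(ind[0]) < best_f:
                best_x, best_f = ind[0], f(ind[0])
        if step % select_every == 0:
            # Evolutionary selection: the worst half restarts near the best
            # half, with perturbed positions and learning rates, which lets
            # the population escape poor basins and search globally.
            pop.sort(key=lambda ind: f(ind[0]))
            half = len(pop) // 2
            for i in range(half, len(pop)):
                parent = pop[i - half]
                pop[i] = [parent[0] + rng.gauss(0.0, 0.1),
                          parent[1] * rng.uniform(0.5, 2.0)]
    return best_x, best_f
```

Because each individual adapts its own step size and selection concentrates the population around the best basins found so far, a single hand-picked learning rate is unnecessary, and the search is less likely to stall in one local minimum than plain gradient descent.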