This paper proposes a modified natural gradient descent that incorporates fractional derivatives of the Riemann-Liouville, Caputo, and Grunwald-Letnikov types. The approach belongs to the family of information-geometric optimization methods, which take into account not only gradient directions or momentum terms but also the curvature of the objective function. Compared with second-order optimization algorithms, this technique achieves a higher rate of convergence. Fractional-order derivatives make it possible to adjust the descent trajectory toward the neighborhood of the global minimum. In experiments, we demonstrate improved image-recognition accuracy on MNIST and CIFAR10 when training with the proposed optimization algorithm.
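To make the fractional-derivative ingredient concrete, the sketch below shows a toy fractional-order gradient step built from a first-order truncation of the Caputo derivative with the previous iterate as the lower terminal, a formulation commonly used in the fractional-gradient literature; it is an illustrative assumption, not necessarily the exact scheme proposed in this paper, and for brevity it omits the Fisher-matrix preconditioning of natural gradient descent. The function names and parameters (`fractional_gd`, `alpha`, `lr`, `eps`) are hypothetical.

```python
import numpy as np
from math import gamma

def fractional_gd(grad_fn, theta0, alpha=0.9, lr=0.1, steps=200, eps=1e-8):
    """Toy fractional-order gradient descent (illustrative sketch).

    Uses a first-order truncation of the Caputo derivative with the previous
    iterate as the lower terminal:
        theta_{t+1} = theta_t
                      - lr * grad(theta_t) * |theta_t - theta_{t-1}|^(1-alpha) / Gamma(2-alpha)
    For alpha = 1 this reduces to ordinary gradient descent.
    """
    theta = np.asarray(theta0, dtype=float)
    prev = theta.copy()
    c = gamma(2.0 - alpha)
    for _ in range(steps):
        # Fractional modulation of the step size; eps avoids a zero step
        # when two consecutive iterates coincide.
        step_mod = (np.abs(theta - prev) + eps) ** (1.0 - alpha) / c
        prev = theta.copy()
        theta = theta - lr * grad_fn(theta) * step_mod
    return theta

# Example: minimize f(x) = (x - 3)^2, whose gradient is 2(x - 3).
print(fractional_gd(lambda x: 2.0 * (x - 3.0), theta0=[0.0], alpha=0.9))
```

In a full natural-gradient variant, the gradient in this update would additionally be preconditioned by the inverse Fisher information matrix, while the fractional order alpha controls how strongly the step is modulated by the memory of the previous iterate.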