The Transformer architecture underlying modern generative AI proves highly successful at predicting the distribution of Riemann zeta zero counts on consecutive Gram intervals. We achieve an accuracy of 0.998 when predicting sequences of ten consecutive zero counts. We tested on two ranges of Riemann zeta zeros, near t = 10^12 and t = 10^28. With additional training targeted at rare events, prediction becomes essentially complete. This suggests that applying the technique to more complex problems holds great promise. Our experiments used minimal computational resources compared to typical models in language applications; with access to greater resources, substantially harder problems could be attacked.
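For readers unfamiliar with the setup: the Gram points g_n are defined by theta(g_n) = n*pi, where theta is the Riemann-Siegel theta function, and each Gram interval [g_n, g_{n+1}) typically contains exactly one zeta zero, with 0 or 2 as the common exceptions. The sketch below (not the authors' pipeline; the helper name gram_interval_counts is ours) illustrates how such a sequence of zero counts can be constructed with mpmath. It is feasible only at small heights, far below the t = 10^12 and t = 10^28 ranges studied here, where specialized methods are required.

```python
# A minimal sketch, assuming mpmath is available: count zeta zeros falling in
# each of a run of consecutive Gram intervals. This is illustrative training
# data only, not the authors' actual data pipeline.
from mpmath import grampoint, zetazero

def gram_interval_counts(first_gram: int, num_intervals: int) -> list[int]:
    """Number of zeta zeros in each of num_intervals consecutive Gram
    intervals [g_n, g_{n+1}), starting at the Gram point g_{first_gram}."""
    # Gram points bounding the intervals of interest.
    gram = [grampoint(n) for n in range(first_gram, first_gram + num_intervals + 1)]
    counts = [0] * num_intervals
    k = 1
    # Walk up the critical line zero by zero (zeros are ordered by height),
    # binning each zero's ordinate into the Gram interval containing it.
    while True:
        t = zetazero(k).imag
        if t >= gram[-1]:
            break
        if t >= gram[0]:
            i = max(j for j in range(num_intervals) if gram[j] <= t)
            counts[i] += 1
        k += 1
    return counts

if __name__ == "__main__":
    # Mostly 1s, with occasional 0s and 2s; these count sequences are the
    # kind of data a Transformer can be trained to predict.
    print(gram_interval_counts(0, 20))
```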