Dan Wang et al.

This paper reports a novel and general machine learning (ML) approach to stabilizing a many-input, many-output system, one that can learn to correlate patterns while the system is unstable or in operation. We apply it to coherent laser combination, stabilizing against optical phase drift. In this application, the ML algorithm finds the correspondence between output interference patterns and phase errors, in both temporal pulse combination and spatial beam combination, and can be scaled to large numbers (of order 100) of beams or pulses.

This is significant because our approach enables ML to learn and deterministically correct phase drift in a high-channel-count, low-repetition-rate system designed for high energy, where stochastic, single-output control algorithms (e.g., stochastic parallel gradient descent, SPGD) become too slow. The advantage of our approach is that the amount of measured information scales with the number of channels, enabling high-bandwidth control of coherent combination at arbitrarily high channel counts and hence lasers with both high energy and high average power.

Our group has experimentally demonstrated this approach in a diffractive, 8-beam combination system, achieving better than one percent stability [Opt. Express 30, 12639-12653 (2022)]. This manuscript expands on that work by exploring the characteristics of the algorithm in simulation, analyzing its combining efficiency and stability, and generalizing the approach to different types of coherently combined lasers with different learned patterns. It documents ongoing progress in using ML as a deterministic error detector, and contributes to current trends of advanced understanding and novel application in photonic systems.
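The core idea, learning a deterministic map from a single-shot interference pattern to the per-channel phase errors, can be sketched in a toy simulation. The snippet below is illustrative only and is not the authors' implementation: it stands in a random fixed combiner geometry for the diffractive optic and a simple linear regression for the learned model (the actual work uses a neural network and real detector data); all names and parameters are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n_beams = 8   # channel count (small demo; the approach targets ~100 channels)
n_pix = 64    # detector pixels sampling the interference pattern

# Fixed random "optics": each pixel sees a superposition of all beams with
# fixed geometric phases (a stand-in for the diffractive combiner geometry).
geom = rng.uniform(0.0, 2.0 * np.pi, (n_pix, n_beams))

def pattern(phase_err):
    """Detected intensity pattern for a given vector of per-beam phase errors."""
    field = np.exp(1j * (geom + phase_err)).sum(axis=1)
    return np.abs(field) ** 2

# Training set: small random phase errors paired with their measured patterns.
train_phases = rng.normal(0.0, 0.1, (2000, n_beams))
X = np.array([pattern(p) for p in train_phases])

# Learn a linear map pattern -> phase error (valid for small errors); the
# paper's learned model is a neural network, this is the simplest stand-in.
mu = X.mean(axis=0)
W, *_ = np.linalg.lstsq(X - mu, train_phases, rcond=None)

# Deterministic correction: one measurement yields the full error vector,
# so the information gathered per shot scales with the channel count.
true_err = rng.normal(0.0, 0.1, n_beams)
pred = (pattern(true_err) - mu) @ W
residual = true_err - pred  # only relative phases matter: the global
                            # piston phase is unobservable in the pattern
rel = lambda v: v - v.mean()
print("rms phase error before/after:",
      np.std(rel(true_err)), np.std(rel(residual)))
```

The contrast with SPGD is visible in the structure: SPGD dithers and reads back a single combined-power scalar per iteration, whereas here a single pattern measurement determines all `n_beams` errors at once, which is what keeps the control bandwidth independent of channel count.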