This paper presents an approach to enhancing underwater forensic exploration through advances in soft robotics, adaptive evolution, and artificial intelligence. Focusing on self-adaptive materials with integrated sensing and actuation capabilities, this research introduces the Evolutionary Algorithm Reinforcement Learning (EARL) system. Inspired by the adaptability, reliability, and robustness of biological systems in unstructured environments, EARL employs a forward-looking adaptive mechanism that responds to environmental changes through partial reinitialization, exploiting similarities between environments for real-time adaptation. Our computational model not only navigates the design space for underwater robotics but also offers insights into biological adaptation strategies. By analyzing the effectiveness of design choices and exploring state-action probabilities, the study underscores the importance of tuning material properties, such as stiffness and buoyancy, to varying environmental conditions. Experimental outcomes demonstrate the system's capability to adapt through partial reinitialization, an advance over existing AI-based methods in soft robotics. This work lays a foundation for future innovations in the field, suggesting directions for improving design and policy strategies to create adaptable, robust robots for complex underwater exploration tasks.
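The partial-reinitialization mechanism can be illustrated with a minimal sketch, assuming a generic evolutionary loop: when the environment changes, the fittest individuals are retained (since a similar environment may still favor them) while the remainder of the population is replaced with fresh random individuals. The function `partial_reinitialize`, the scalar genome, and the toy fitness landscape below are illustrative assumptions, not the paper's actual EARL implementation.

```python
import random

def partial_reinitialize(population, fraction, init_fn, fitness_fn):
    """On an environment change, keep the top (1 - fraction) of the
    population, which may remain useful if the new environment is
    similar, and replace the rest with freshly initialized individuals."""
    ranked = sorted(population, key=fitness_fn, reverse=True)
    n_keep = max(1, int(round(len(ranked) * (1 - fraction))))
    fresh = [init_fn() for _ in range(len(ranked) - n_keep)]
    return ranked[:n_keep] + fresh

# Toy setting: individuals are scalars, fitness peaks at a target
# that shifts when the environment changes.
random.seed(0)
target = 2.0
fitness = lambda x: -(x - target) ** 2
pop = [random.uniform(-5, 5) for _ in range(10)]

target = -3.0  # environment change: the optimum moves
pop = partial_reinitialize(pop, fraction=0.5,
                           init_fn=lambda: random.uniform(-5, 5),
                           fitness_fn=fitness)
```

The `fraction` parameter trades off retained knowledge against exploration: a small fraction exploits similarity between the old and new environments, while a large fraction approaches a full restart.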