This paper presents a precise two-robot collaborative method for 3D self-localization that relies on a single rotating camera and onboard accelerometers measuring the tilt of the robots. The method enables localization in GPS-denied environments, in the presence of magnetic interference, and in partially or totally dark, unstructured, and unmarked locations. At each step, one robot moves forward while the other remains stationary. The tilt angles of the robots obtained from the accelerometers and the rotational angle of the camera turret, combined with the video analysis, make it possible to continuously compute the location of each robot. We describe the hardware setup used for the experiments and give a detailed description of the algorithm, which fuses the accelerometer and camera data and runs in real time on onboard micro-computers. Finally, we present 2D and 3D experimental results showing that the system achieves an accuracy of 2% of the total travelled distance.
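
The sketch below illustrates the kind of update such a leapfrog scheme performs: converting the stationary robot's tilt-corrected azimuth/elevation observation of its moving partner into a 3D position. The function names, the roll/pitch rotation order, and the availability of a range estimate are illustrative assumptions for this sketch, not the paper's exact algorithm.

```python
import numpy as np

def rotation_from_tilt(roll, pitch):
    """Rotation that levels a tilted body frame (roll about x, pitch about y).
    The rotation order is an assumption made for this illustration."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    return Ry @ Rx

def locate_moving_robot(p_stationary, azimuth, elevation, rng, roll, pitch):
    """Estimate the moving robot's 3D position from the stationary robot's view.

    azimuth     -- turret rotation angle toward the moving robot (rad)
    elevation   -- vertical angle obtained from the video analysis (rad)
    rng         -- range to the moving robot; assumed available here
                   (e.g. from the image scale of a known target)
    roll, pitch -- stationary robot tilt from its accelerometer (rad)
    """
    # Line-of-sight vector in the stationary robot's (tilted) body frame.
    d_body = rng * np.array([
        np.cos(elevation) * np.cos(azimuth),
        np.cos(elevation) * np.sin(azimuth),
        np.sin(elevation),
    ])
    # Tilt-compensate with the accelerometer angles, then offset from the
    # stationary robot's known position.
    return p_stationary + rotation_from_tilt(roll, pitch) @ d_body

# Example leapfrog step: the stationary robot at the origin observes its partner.
p_new = locate_moving_robot(np.zeros(3), azimuth=np.deg2rad(30),
                            elevation=np.deg2rad(5), rng=2.0,
                            roll=np.deg2rad(1.5), pitch=np.deg2rad(-0.8))
print(p_new)
```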