Micro air vehicles can be used for a wide range of applications, including drone light shows for cultural or marketing purposes or as a replacement for fireworks. However, widespread use of this technology is hindered by a usability gap between choreography design and deployment on the target drones. Seamless deployment of micro air vehicle choreographies requires computational interfaces that derive drone trajectories from more conventional media, such as digital images. In this paper, we propose CrazyKhoreia, a low-cost, computationally efficient approach to unmanned aerial vehicle (UAV) choreography design, consisting of a robotic perception system that takes a digital image, applies boundary tracing to it, and produces a safely traversable and visually accurate waypoint matrix. We validate our trajectory generation system in two modes of operation: light painting, in which a single UAV flies through all of the waypoints, and multi-drone formation, in which multiple UAVs are arranged to mimic the original image. In light painting mode, the trajectories flown by physical robots are compared against the ground truth using a data association tool developed by the Computer Vision Group at TUM, achieving an average RMSE of 0.4531 metres; in multi-drone formation mode, the observed positions are estimated from photographs of actual formations, and the system reaches a mean RMSE of 0.136 metres.
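
To make the image-to-waypoint step concrete, the sketch below illustrates one plausible realisation of boundary tracing on a digital image, assuming an OpenCV-based pipeline; the function name, threshold value, and metre-scaling scheme are illustrative assumptions and not the paper's actual implementation.

```python
# Minimal sketch: trace object boundaries in an image and convert the
# pixel contour into an (N x 3) waypoint matrix in metres.
# Assumptions (not from the paper): a binary threshold of 127, a square
# flight area, and a fixed flight altitude.
import cv2
import numpy as np


def image_to_waypoints(image_path, flight_area=2.0, altitude=1.0):
    """Return a waypoint matrix (one row per waypoint, columns x, y, z)."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    _, binary = cv2.threshold(img, 127, 255, cv2.THRESH_BINARY_INV)

    # Boundary tracing: external contours approximate the image silhouette.
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    points = np.vstack([c.reshape(-1, 2) for c in contours]).astype(float)

    # Scale pixel coordinates to the flight area and flip the y-axis,
    # since image rows grow downwards while world y grows upwards.
    h, w = img.shape
    scale = flight_area / max(h, w)
    x = points[:, 0] * scale
    y = (h - points[:, 1]) * scale
    z = np.full_like(x, altitude)

    return np.column_stack((x, y, z))


if __name__ == "__main__":
    waypoints = image_to_waypoints("silhouette.png")  # hypothetical input image
    print(waypoints.shape, waypoints[:3])
```

In the light painting mode described above, a single vehicle would visit these rows in sequence; in the multi-drone formation mode, a subsampled set of rows could instead be assigned as hover positions for individual vehicles.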