Sensor data is essential for training autonomous driving methods such as object detection, odometry, and SLAM. MEMS LiDAR sensors can be very valuable for autonomous vehicles because they are less prone to shock and wear than motorized optomechanical LiDAR sensors. However, recording real-world data is complicated and expensive. Simulated data is an alternative, but no publicly available software exists to simulate MEMS LiDAR sensors. In this paper, we introduce a method to simulate data recorded by MEMS LiDAR sensors such as the Blickfeld Cube~1. Our method builds on the open-source autonomous driving simulation environment CARLA and is available online\footnote[1]{https://github.com/BerensRWU/MEMS-LiDAR-Generator}. We compare our synthetic point clouds with real-world point clouds and evaluate their similarity. Moreover, we demonstrate the application of our method to the problem of finding an optimal sensor configuration.
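To illustrate the general idea, the sketch below shows one way a MEMS-like point cloud could be generated in CARLA; it is not the authors' implementation (which is available in the repository linked above). The scan-pattern parameterization (a simplified Lissajous-style sweep) and all numeric parameters are illustrative assumptions, not the Blickfeld Cube~1 specification; depth is queried with CARLA's \texttt{world.cast\_ray} (available in CARLA $\geq$ 0.9.11).
\begin{verbatim}
# Hedged sketch, not the paper's method: approximate a MEMS scan
# pattern by a set of azimuth/elevation angles and cast one ray per
# sample with CARLA's world.cast_ray. All pattern parameters are
# illustrative placeholders.
import math
import numpy as np
import carla

def mems_scan_pattern(n_lines=40, pts_per_line=200,
                      h_fov_deg=70.0, v_fov_deg=30.0):
    """(azimuth, elevation) angles [rad] of a simplified,
    Lissajous-like MEMS scan pattern (illustrative only)."""
    t = np.linspace(0.0, 1.0, n_lines * pts_per_line)
    az = 0.5 * math.radians(h_fov_deg) * np.sin(2 * np.pi * n_lines * t)
    el = 0.5 * math.radians(v_fov_deg) * np.sin(2 * np.pi * t)
    return az, el

def scan(world, origin, max_range=75.0):
    """Cast one ray per pattern sample and keep the closest hit."""
    az, el = mems_scan_pattern()
    points = []
    for a, e in zip(az, el):
        target = carla.Location(
            x=origin.x + max_range * math.cos(e) * math.cos(a),
            y=origin.y + max_range * math.cos(e) * math.sin(a),
            z=origin.z + max_range * math.sin(e))
        hits = world.cast_ray(origin, target)  # list of LabelledPoint
        if hits:
            closest = min(hits,
                          key=lambda p: origin.distance(p.location))
            points.append(closest.location)
    return points

if __name__ == "__main__":
    client = carla.Client("localhost", 2000)
    client.set_timeout(10.0)
    world = client.get_world()
    cloud = scan(world, carla.Location(x=0.0, y=0.0, z=2.0))
    print("Simulated point cloud with %d points" % len(cloud))
\end{verbatim}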