Authors: Ori Einy, Gleb Merkulov
Supervisor: Prof. Tal Shima
Develop a simulator-based augmentation of the experimental framework to improve the demonstration capabilities and extend the possible set of scenarios.
Experiments in the areas of guidance and task assignment may require the vehicle (or a group of vehicles) to perform tasks that are hard to represent physically. For instance, physically intercepting a maneuvering target with a quadcopter may damage the vehicle. Another example is 3D trajectory following, where the reference path cannot be seen in the actual lab space.
In this project, we aim to augment the real experiment with a simulator-based framework to improve the demonstration capabilities and extend the possible set of experimental scenarios.
Our approach is schematically shown in the Figure. The motion of the vehicles is tracked via the motion capture system (OptiTrack + Motive) and is processed on a Linux computer running ROS. To visualize the motion, the actual motion data is transferred to the Gazebo simulator, where the environment is augmented in real time with experiment-specific data. The simulator environment may later be projected into the lab space to augment the physical reality.
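One practical detail in such a bridge is the coordinate-frame conversion: Motive streams poses in a Y-up right-handed frame by default, whereas ROS and Gazebo use a Z-up right-handed frame, so each tracked pose must be remapped before it is republished to the simulator. The sketch below illustrates this remap only; the function names are illustrative and not part of the project's actual code, and it assumes Motive's default Y-up streaming convention.

```python
# Illustrative helpers for an OptiTrack (Motive) -> ROS/Gazebo bridge.
# Assumption: Motive streams in its default Y-up right-handed frame
# (x right, y up, z toward the viewer); ROS/Gazebo expect Z-up.

def motive_to_ros_position(x, y, z):
    """Remap a Y-up Motive point into the Z-up ROS frame.
    The permutation (x, -z, y) preserves right-handedness."""
    return (x, -z, y)

def motive_to_ros_quaternion(qx, qy, qz, qw):
    """Apply the same axis permutation to the vector part of the
    orientation quaternion; the scalar part is unchanged."""
    return (qx, -qz, qy, qw)

if __name__ == "__main__":
    # A point one meter above the Motive origin ends up at +z in ROS.
    print(motive_to_ros_position(0.0, 1.0, 0.0))
```

In a live setup, the remapped pose would then be packed into a `gazebo_msgs/ModelState` message and published so the simulated vehicle mirrors the real one in real time.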