Explaining rifle shooting factors through multi-sensor body tracking
Using transformers and attention to mine actionable patterns from skeleton graphs
We apply deep learning to mine actionable patterns from body-tracking data in rifle shooting tasks. Our focus is on explainable AI (XAI): making the model's predictions interpretable so that we can understand the reasoning behind them, rather than treating the model as a black box.
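As a minimal sketch of the idea in the subtitle, the snippet below runs a single attention layer over a sequence of skeleton frames and reads out the attention weights as a crude per-frame importance signal. It assumes PyTorch; the joint count, window length, and shot-score regression target are illustrative assumptions, not the project's actual configuration.

```python
import torch
import torch.nn as nn

NUM_JOINTS = 25   # assumed skeleton joint count (Kinect-style)
FRAMES = 64       # assumed frames per pre-shot window
D_MODEL = 64

class SkeletonAttentionModel(nn.Module):
    """One transformer-style attention layer over skeleton frames.

    Each frame (all joint coordinates) is one token, so the attention
    map indicates which moments before the shot the model relied on.
    """
    def __init__(self):
        super().__init__()
        self.embed = nn.Linear(NUM_JOINTS * 3, D_MODEL)  # x, y, z per joint
        self.attn = nn.MultiheadAttention(D_MODEL, num_heads=4,
                                          batch_first=True)
        self.head = nn.Linear(D_MODEL, 1)                # predicted shot score

    def forward(self, frames):
        # frames: (batch, FRAMES, NUM_JOINTS * 3)
        x = self.embed(frames)
        # need_weights=True also returns the attention map used for explanation
        out, weights = self.attn(x, x, x, need_weights=True)
        score = self.head(out.mean(dim=1))
        return score, weights  # weights: (batch, FRAMES, FRAMES)

model = SkeletonAttentionModel()
clip = torch.randn(1, FRAMES, NUM_JOINTS * 3)  # stand-in for real tracking data
score, weights = model(clip)
# Averaging over query positions gives how much each frame was attended overall
frame_importance = weights.mean(dim=1).squeeze(0)
print(score.item(), frame_importance.argmax().item())
```

Attention maps are only one of several XAI techniques one could apply here; the point of the sketch is that the same weights the model uses for prediction can be surfaced as an explanation.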
Sport shooting, and in particular the effect of body movements on shooting performance, is a largely unexplored research area. There are many doctrines for educating novice shooters, most of which are built on the experience of shooting instructors but lack significant statistical backing. Together with Saab AB, Training and Simulation, we have examined how body movements and postures affect shooting performance by combining body-tracking sensors with deep learning algorithms.
We gathered data in a dynamic live rifle shooting scenario involving body movement. The setup combined multiple sensors: several body-tracking sensors placed around the participant, together with detection of the shot moment and the shot's location on the target.
Key challenges included synchronising the different sensors, as well as merging their streams and structuring the result in a form that machine learning algorithms can consume, as sketched below. The gathered data can then be processed and used in deep learning tasks with explainable models to extract patterns that could aid in live training situations.
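A minimal sketch of the synchronisation step: each sensor stream is resampled onto a shared timeline by linear interpolation, the streams are stacked into one feature matrix, and detected shot moments are mapped to frames on that timeline. The stream names, rates, and the 30 Hz target timeline are illustrative assumptions, not the actual recording setup.

```python
import numpy as np

def resample(timestamps, values, common_t):
    """Linearly interpolate one sensor stream onto a shared timeline."""
    return np.interp(common_t, timestamps, values)

# Two hypothetical streams with different clocks and sample rates
cam_t = np.linspace(0.00, 10.0, 900)      # body-tracking camera, ~90 Hz
cam_x = np.sin(cam_t)                     # e.g. one joint coordinate
imu_t = np.linspace(0.05, 10.0, 2000)     # inertial sensor, ~200 Hz, offset clock
imu_a = np.cos(imu_t)

common_t = np.arange(0.1, 9.9, 1 / 30)    # shared 30 Hz timeline
merged = np.stack([
    resample(cam_t, cam_x, common_t),
    resample(imu_t, imu_a, common_t),
], axis=1)                                # (frames, features) for the model

# Attach detected shot moments as indices on the shared timeline
shot_times = np.array([3.2, 7.8])
shot_frames = np.searchsorted(common_t, shot_times)
print(merged.shape, shot_frames)
```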
The video below shows the reconstructed skeleton pose of one participant performing the rifle shooting scenario.