The client's motivation for launching the project: every smartphone has inertial sensors that measure the phone's acceleration and rotation. In theory, these sensors alone can give a complete picture of the smartphone's movement. That movement can tell a lot: where the phone is carried (in a bag, pocket, hand, etc.), how its carrier moves (walking, standing, running, climbing stairs) and, most interestingly, what trajectory the phone traces in 3D. An algorithm that recognizes positions, activities and trajectories can help a person navigate inside a building and reveal how they behave with their smartphone. The goal of our team was to build such an algorithm.
What we had initially:
Project goals: build an algorithm that classifies the activity and carrying position and estimates a person's trajectory in 3D, for any activity and any position of the phone. The algorithm combines deep-learning components with classical approaches to inertial navigation.
MIL Team's solution:
- Time series segmentation of sensor data with a neural network to model parts of trajectories.
- Classification of time series segments with a neural network for activity and position recognition.
- Activity-change detection to identify elevator and escalator activities.
- Several classical and proprietary data preprocessing algorithms.
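To make the segmentation and activity-change steps concrete, here is a minimal sketch (not the team's actual implementation): it slices a raw IMU stream into overlapping fixed-length windows, the standard preprocessing before feeding segments to a classifier, and flags activity changes where the variance of the acceleration magnitude jumps between adjacent windows. All function names, window sizes and thresholds are illustrative assumptions.

```python
import numpy as np

def segment_windows(signal, window=128, stride=64):
    """Split a (T, C) sensor stream into overlapping windows of shape (N, window, C).

    Hypothetical preprocessing step: such windows would be the inputs
    to a segment classifier in a pipeline like the one described above.
    """
    starts = range(0, len(signal) - window + 1, stride)
    return np.stack([signal[s:s + window] for s in starts])

def detect_activity_change(accel, window=128, stride=64, ratio=3.0):
    """Flag window indices where acceleration-magnitude variance jumps
    relative to the previous window (a simple change-point heuristic)."""
    mag = np.linalg.norm(accel, axis=1)                      # per-sample magnitude
    wins = segment_windows(mag[:, None], window, stride)[..., 0]
    var = wins.var(axis=1) + 1e-9                            # avoid division by zero
    return [i for i in range(1, len(var))
            if var[i] / var[i - 1] > ratio or var[i - 1] / var[i] > ratio]

# Synthetic example: 4 s of standing still followed by 4 s of walking at 100 Hz.
rng = np.random.default_rng(0)
rest = rng.normal(0.0, 0.01, size=(400, 3))    # near-static accelerometer noise
walk = rng.normal(0.0, 1.0, size=(400, 3))     # much larger motion variance
stream = np.vstack([rest, walk]) + np.array([0.0, 0.0, 9.81])  # add gravity on z

changes = detect_activity_change(stream)       # window indices near the transition
```

A real pipeline would replace the variance heuristic with a learned model, but the windowing convention (overlapping fixed-length segments) is the same one a neural segment classifier would consume.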