The client's motivation for launching the project: every smartphone has inertial sensors that measure the phone's acceleration and rotation. In theory, these sensors can give a complete picture of the phone's movement. That movement can tell a lot: where the phone is carried (in a bag, pocket, hand, etc.), how its owner moves (walking, standing, running, climbing stairs) and, most interestingly, what trajectory the phone traces in 3D. An algorithm that recognizes carrying positions, activities and trajectories can help a person navigate inside a building and shed light on how they behave with the smartphone. Our team's goal is to build such an algorithm.
What we had initially:
- a set of annotated data from inertial sensors; the annotation includes the person's positions, activities and trajectory over time;
- the sensor data is very noisy, so the trajectory cannot be estimated by directly integrating the physical equations of motion;
- when a person rides an elevator or escalator, the inertial sensors read the same as when the person is standing still, even though movement occurs.
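The second point above is worth unpacking: even a small, uncalibrated accelerometer bias blows up under double integration. The sketch below (illustrative only, not project code, with assumed noise and bias magnitudes) simulates a stationary phone and dead-reckons its position from noisy accelerometer readings:

```python
import numpy as np

# Illustrative sketch: why naive double integration of accelerometer data
# fails. We simulate a *stationary* phone whose accelerometer reports zero
# true acceleration plus Gaussian noise and a small constant bias.
rng = np.random.default_rng(0)
dt = 0.01                    # 100 Hz sampling, a typical smartphone rate
t = np.arange(0.0, 60.0, dt)                 # one minute of data
bias = 0.05                                  # m/s^2, modest uncalibrated bias
noise = rng.normal(0.0, 0.1, size=t.size)    # m/s^2 sensor noise
accel = bias + noise                         # measured; true value is 0

# Dead reckoning: integrate acceleration to velocity, velocity to position.
velocity = np.cumsum(accel) * dt
position = np.cumsum(velocity) * dt

# The bias term alone contributes ~0.5 * bias * t^2 of drift:
# after 60 s that is about 0.5 * 0.05 * 60^2 = 90 m for a phone
# that never moved, which is why a learned model is needed instead.
print(f"estimated displacement after 60 s: {position[-1]:.1f} m")
```

The quadratic growth of the bias term is the core problem: longer recordings make the physical estimate worse, not better.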
Project goals: build an algorithm that classifies the activity and phone position and estimates the person's 3D trajectory for any activity and any position of the phone. The algorithm combines a deep learning part with classical approaches to inertial navigation.
MIL Team's solution:
- time series segmentation of the sensor data with a neural network that models parts of the trajectory;
- classification of time series segments by a neural network for activity and position recognition;
- activity change detection to identify elevator and escalator rides;
- several classical and proprietary data preprocessing algorithms.
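To make the classification step concrete, here is a minimal sketch (not the actual MIL Team model) of the kind of network that classifies fixed-length IMU windows into activity classes. The shapes are assumptions: 6 input channels (3-axis accelerometer plus 3-axis gyroscope), 200-sample windows (~2 s at 100 Hz), and 4 illustrative activity classes.

```python
import torch
import torch.nn as nn

class ActivityClassifier(nn.Module):
    """Toy 1D-CNN over IMU windows; channel/class counts are assumptions."""

    def __init__(self, in_channels: int = 6, num_classes: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),  # global pooling: window-length agnostic
        )
        self.head = nn.Linear(64, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time) -> (batch, num_classes) logits
        return self.head(self.features(x).squeeze(-1))

model = ActivityClassifier()
logits = model(torch.randn(8, 6, 200))  # a batch of 8 sensor windows
print(logits.shape)  # -> torch.Size([8, 4])
```

The same windowed-classification pattern extends to position recognition; the elevator/escalator cases, which look identical to standing in the raw signals, are the ones handled by the separate change-detection step.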
Tools for building the model:
- the open dataset RuDaCop, assembled by the client;
- the open dataset RoNIN;
- a dataset collected by MIL Team.
The model results: under NDA
Client: under NDA
Technological stack: Python (PyTorch, quaternion), wandb for experiment tracking
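Quaternions appear in the stack because inertial navigation constantly rotates sensor readings from the phone's device frame into a fixed world frame. The sketch below shows that operation with plain NumPy for self-containedness; the `quaternion` package listed above provides the same functionality ready-made.

```python
import numpy as np

def quat_multiply(q: np.ndarray, r: np.ndarray) -> np.ndarray:
    """Hamilton product of two quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ])

def rotate(q: np.ndarray, v: np.ndarray) -> np.ndarray:
    """Rotate 3-vector v by unit quaternion q via q * (0, v) * q^-1."""
    qv = np.concatenate(([0.0], v))
    q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
    return quat_multiply(quat_multiply(q, qv), q_conj)[1:]

# A 90-degree rotation about the z-axis: (cos 45°, 0, 0, sin 45°).
q = np.array([np.cos(np.pi / 4), 0.0, 0.0, np.sin(np.pi / 4)])
print(rotate(q, np.array([1.0, 0.0, 0.0])))  # ≈ [0, 1, 0]
```

Representing orientation as a unit quaternion avoids the gimbal lock of Euler angles and composes cheaply, which matters when orientation is updated at every sensor sample.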