Maintaining an active lifestyle is essential for our overall well-being. However, accurately understanding and tracking our activities can be challenging. Have you ever wondered whether you are performing your yoga poses and exercises correctly? If so, then this use case is for you.

The yoga pose recognition use case classifies a set of classes corresponding to different yoga poses. Whether it is a cobra pose, a tree pose, or one of many others, the algorithm can recognize it, all with very low power consumption thanks to the Machine Learning Core (MLC) available in ST MEMS sensors.

Approach

  • Data logs can be acquired using the wireless SensorTile.box kit and the STBLESensor app on Android or iOS.
  • We used MEMS-Studio to generate and configure a decision tree model based on three different features that classifies the considered yoga poses.
  • The decision tree has around 20 nodes and takes the accelerometer's X, Y and Z axes as input data.
  • The MLC runs at 104 Hz and features are computed within a window of 52 samples.
  • The configuration generates an interrupt every time the recognized class is updated (a register-level sketch of loading this configuration is shown after this list).
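For reference, the decision tree configuration exported by MEMS-Studio is a plain list of register address/value pairs (a .ucf file) that the host writes to the sensor at start-up. The sketch below only illustrates that flow under stated assumptions: the sensor_write_reg() bus helper and the placeholder configuration table are hypothetical, not part of the official package.

#include <stdint.h>
#include <stddef.h>

/* Hypothetical platform helper: write one byte to an LSM6DSOX register
 * over I2C or SPI. Replace with your own bus driver. */
extern int sensor_write_reg(uint8_t reg, uint8_t val);

/* A .ucf file exported by MEMS-Studio is a sequence of address/value
 * pairs; the table below is a placeholder for the generated content. */
typedef struct {
    uint8_t addr;
    uint8_t val;
} ucf_line_t;

static const ucf_line_t yoga_mlc_ucf[] = {
    /* placeholder entry; replace with the pairs generated by MEMS-Studio */
    { 0x01, 0x80 },
};

/* Write the whole configuration to the sensor. After this, the MLC runs
 * autonomously and raises an interrupt whenever the recognized class changes. */
int load_yoga_mlc_config(void)
{
    for (size_t i = 0; i < sizeof(yoga_mlc_ucf) / sizeof(yoga_mlc_ucf[0]); i++) {
        if (sensor_write_reg(yoga_mlc_ucf[i].addr, yoga_mlc_ucf[i].val) != 0)
            return -1;
    }
    return 0;
}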


You can find the complete step-by-step guide with all the hardware and software used here.

Sensor

6-axis IMU (inertial measurement unit) with embedded AI: always-on 3-axis accelerometer and 3-axis gyroscope (reference: LSM6DSOX).

Data

The accelerometer is configured with 2 g full scale and 104 Hz output data rate.
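On the LSM6DSOX this setting corresponds to a single write to CTRL1_XL (ODR_XL = 104 Hz, FS_XL = ±2 g). The one-liner below is purely illustrative and reuses the hypothetical sensor_write_reg() helper from the earlier sketch; in practice the .ucf exported by MEMS-Studio already contains this write.

#include <stdint.h>

extern int sensor_write_reg(uint8_t reg, uint8_t val);  /* hypothetical bus helper */

#define LSM6DSOX_CTRL1_XL  0x10U  /* accelerometer control register */

/* ODR_XL[7:4] = 0100b -> 104 Hz, FS_XL[3:2] = 00b -> ±2 g full scale. */
int configure_accelerometer(void)
{
    return sensor_write_reg(LSM6DSOX_CTRL1_XL, 0x40);
}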

The SensorTile.box is placed on the left leg. Other devices can be used as well, provided that the orientation of the sensor axes is correct (a quick orientation check is sketched after the list below).
  • X axis parallel to the leg, pointing up
  • Y axis perpendicular to the leg, pointing to the inside
  • Z axis pointing forward
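One simple way to verify the mounting orientation is to read the raw accelerometer output while standing still: the X axis should report roughly +1 g and the other two close to 0 g. The sketch below is only a suggestion; the sensor_read_regs() burst-read helper is hypothetical, and the conversion assumes the ±2 g sensitivity of 0.061 mg/LSB.

#include <stdint.h>
#include <stdio.h>

extern int sensor_read_regs(uint8_t reg, uint8_t *buf, uint8_t len);  /* hypothetical bus helper */

#define LSM6DSOX_OUTX_L_A  0x28U              /* first accelerometer output register */
#define LSB_TO_G_FS2G      (0.061f / 1000.0f) /* sensitivity at ±2 g: 0.061 mg/LSB   */

/* With the board on the left leg and the user standing still,
 * X should read close to +1 g and Y/Z close to 0 g. */
void print_orientation_check(void)
{
    uint8_t raw[6];

    if (sensor_read_regs(LSM6DSOX_OUTX_L_A, raw, 6) != 0)
        return;

    int16_t x = (int16_t)((raw[1] << 8) | raw[0]);
    int16_t y = (int16_t)((raw[3] << 8) | raw[2]);
    int16_t z = (int16_t)((raw[5] << 8) | raw[4]);

    printf("X %.2f g, Y %.2f g, Z %.2f g\n",
           x * LSB_TO_G_FS2G, y * LSB_TO_G_FS2G, z * LSB_TO_G_FS2G);
}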

Results

Power consumption (sensor + algorithm): 175 µA

The decision tree classifier detects 14 different classes corresponding to 12 different yoga positions and 2 non-yoga positions (standing still and in motion).

The output of the decision tree classifier is stored in the register MLC0_SRC (address 70h). A minimal sketch showing how to read and decode this register follows the class list below.

  • 0 = Boat Pose
  • 1 = Bow Pose
  • 2 = Bridge
  • 3 = Child's Pose
  • 4 = Cobra's Pose
  • 5 = Downward-Facing Dog
  • 6 = Meditation Pose
  • 7 = Plank
  • 8 = Seated Forward Bend
  • 9 = Standing in Motion
  • 10 = Standing Still
  • 11 = The Extended Side Angle
  • 12 = The Tree
  • 13 = Upward Plank
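As a minimal sketch of how the host could consume the result, the snippet below reads MLC0_SRC and maps its value to the class names listed above. It reuses the hypothetical bus helpers from the earlier sketches; note that MLC0_SRC sits in the embedded-functions register bank, which is selected through FUNC_CFG_ACCESS before the read.

#include <stdint.h>

extern int sensor_write_reg(uint8_t reg, uint8_t val);               /* hypothetical bus helper */
extern int sensor_read_regs(uint8_t reg, uint8_t *buf, uint8_t len); /* hypothetical bus helper */

#define LSM6DSOX_FUNC_CFG_ACCESS  0x01U  /* register-bank selection    */
#define LSM6DSOX_MLC0_SRC         0x70U  /* decision tree output (70h) */

static const char *const yoga_class_names[14] = {
    "Boat Pose", "Bow Pose", "Bridge", "Child's Pose", "Cobra's Pose",
    "Downward-Facing Dog", "Meditation Pose", "Plank", "Seated Forward Bend",
    "Standing in Motion", "Standing Still", "The Extended Side Angle",
    "The Tree", "Upward Plank"
};

/* Typically called from the interrupt handler fired when the class changes. */
const char *read_yoga_pose(void)
{
    uint8_t cls = 0xFF;

    sensor_write_reg(LSM6DSOX_FUNC_CFG_ACCESS, 0x80);  /* select embedded-functions bank */
    sensor_read_regs(LSM6DSOX_MLC0_SRC, &cls, 1);
    sensor_write_reg(LSM6DSOX_FUNC_CFG_ACCESS, 0x00);  /* back to the user bank          */

    return (cls < 14) ? yoga_class_names[cls] : "Unknown";
}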
Resources

Model created with MEMS-Studio

A complete software solution for desktops to enable AI features on smart sensors. It allows users to analyze data, evaluate embedded libraries, and design no-code algorithms for the entire portfolio of MEMS sensors.


Compatible with LSM6DSOX

Smart sensors capable of directly processing the data they capture and delivering meaningful insights to the host device. By processing data locally, smart sensors reduce transmitted data and cloud processing requirements, thus lowering power consumption at the system level.

You might also be interested in

Customer | STM32Cube.AI | Current sensor | Accelerometer | Predictive maintenance | Transportation

How Panasonic makes e-bikes smarter with AI

Tire pressure monitoring solution to improve rider safety and convenience.

Tutorial | Demo | MEMS MLC | Gyroscope | Accelerometer | Predictive maintenance | Wearables

Recognize head gestures in wearable devices with ultra low power sensors

Recognize head gestures such as nodding, shaking, and other general head movements through the Machine Learning Core available in MEMS sensors.

Tutorial | Demo | MEMS MLC | Accelerometer | Industrial | Predictive maintenance

How to monitor and classify fan-coil systems with STWIN.box

Monitor and classify the behavior of a fan (e.g. on HVAC units) through the Machine Learning Core available in MEMS sensors.