MM-Fit Dataset

The first dataset of full-body physical exercises captured by multiple time-synchronised wearable sensing devices.

Description: A substantial collection of inertial sensor data from smartphones, smartwatches and earbuds worn by participants while performing full-body workouts, together with time-synchronised multi-viewpoint RGB-D video and 2D and 3D pose estimates.

Devices and sensors:
Device                                Modality                         Frequency (Hz)
camera: Orbbec Astra Pro              RGB                              30
                                      Depth                            30
earbud: eSense (Nokia Bell Labs)      Accelerometer                    90
                                      Gyroscope                        90
2x smartwatches: Mobvoi TicWatch Pro  Accelerometer                    100
                                      Gyroscope                        100
                                      Heart rate (beats per minute)    1
smartphone: Samsung S7                Accelerometer                    210
                                      Gyroscope                        210
                                      Magnetometer                     100
smartphone: Huawei P20                Accelerometer                    500
                                      Gyroscope                        500
                                      Magnetometer                     65


>> DOWNLOAD <<

The code in this repository serves as an example of how to use the MM-Fit dataset.
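
As a quick illustration of the kind of access code involved, the snippet below is a minimal loading sketch, assuming (hypothetically) that each workout is a folder of per-device, per-modality NumPy arrays plus a CSV of exercise labels. The folder layout and file names are placeholders, not the guaranteed structure of the released data, so prefer the repository's own loading utilities.

    # Minimal loading sketch (illustrative only). Paths and file names below
    # are assumptions, not the official layout of the released dataset.
    import csv
    import numpy as np

    def load_modality(path):
        # Load one sensor stream stored as a NumPy array; None if the file is absent.
        try:
            return np.load(path)
        except FileNotFoundError:
            return None

    workout_dir = "mm-fit/w00"                                # hypothetical workout folder
    acc = load_modality(f"{workout_dir}/w00_sw_l_acc.npy")    # e.g. left-wrist smartwatch accelerometer
    pose3d = load_modality(f"{workout_dir}/w00_pose_3d.npy")  # e.g. 3D pose estimates

    with open(f"{workout_dir}/w00_labels.csv") as f:          # e.g. rows of (start frame, end frame, reps, exercise)
        labels = [row for row in csv.reader(f)]

    print(None if acc is None else acc.shape, len(labels))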

Over 800 minutes of multimodal data from 5 time-synchronised sensing devices.
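
Because the devices sample at very different rates (from 30 Hz camera frames to 500 Hz inertial streams, per the table above), a common preprocessing step is to resample the streams onto a shared timeline before fusing them. The sketch below is illustrative only and not taken from the MM-Fit code; the rates, window length and channel layout are assumptions.

    import numpy as np

    def resample_stream(timestamps, values, target_times):
        # Linearly interpolate each sensor channel onto the target timeline.
        return np.stack(
            [np.interp(target_times, timestamps, values[:, c]) for c in range(values.shape[1])],
            axis=1,
        )

    # Example: a 100 Hz smartwatch accelerometer and a 90 Hz earbud gyroscope,
    # both resampled onto 30 Hz camera frame times (dummy data).
    duration = 10.0  # seconds
    watch_t = np.arange(0.0, duration, 1 / 100)
    watch_acc = np.random.randn(len(watch_t), 3)   # x, y, z channels
    earbud_t = np.arange(0.0, duration, 1 / 90)
    earbud_gyr = np.random.randn(len(earbud_t), 3)
    frame_t = np.arange(0.0, duration, 1 / 30)

    watch_acc_30hz = resample_stream(watch_t, watch_acc, frame_t)
    earbud_gyr_30hz = resample_stream(earbud_t, earbud_gyr, frame_t)
    print(watch_acc_30hz.shape, earbud_gyr_30hz.shape)  # (300, 3) (300, 3)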

Exercises

Squats

The body is lowered at the hips from a standing position and then raised back up to complete a repetition. The hands are pushed out in front for balance.

Push-ups

The legs are extended back, balancing the straight body on the hands and toes. The arms are flexed to lower and raise the body. Repetitions are counted when the body returns to the starting position.

Dumbbell shoulder presses

From a sitting position, the weights are pressed upwards until the arms are straight and the weights touch above the head.

Lunges

One leg is positioned forward with knee bent and foot flat on the ground while the other leg is positioned behind. The position of the legs is repeatedly swapped.

Standing dumbbell rows

The knees are slightly bent, the hips pushed back, and the chest and head kept up. With the elbows at a 60-degree angle, the dumbbells are raised using the back muscles.

Sit-ups

Abdominal exercise done by lying on the back and lifting the torso with arms behind the head.

Dumbbell tricep extensions

The weight is brought overhead, extending the arms straight. Keeping the shoulders still, the elbows are slowly bent, lowering the weight behind the head until the arms are just below 90 degrees, with the elbows pointing forward.

Bicep curls

Performed with weights, the arms alternate in lifting the weight while the rest of the body remains still.

Sitting dumbbell lateral raises

Sitting with a dumbbell in each hand and a straight back, the weights are slowly lifted out to the sides until the arms are parallel with the floor.

Jumping jacks

Starting with the arms at the sides and the legs together, a jump into the air simultaneously spreads the legs and raises the hands to touch overhead. A repetition is completed with another jump returning to the starting position.

Publications

    Figure: An overview of the proposed activity segmentation and exercise recognition approach.
    Stage 1: Learn modality-specific representations using a separate autoencoder for each device and modality. The layers of the inertial 1D convolutional autoencoder block are used to illustrate that separate autoencoders are used for each device and modality.
    Stage 2: Flatten and concatenate the modality-specific representations output by the encoder component of each unimodal autoencoder. Learn a shared cross-modal representation using a fully-connected multimodal autoencoder that attempts to reconstruct the original inputs from the shared representation. The output vector of the multimodal autoencoder is split along the concatenation indices and fed to the decoder component of the corresponding unimodal autoencoder to reconstruct the original input and backpropagate the reconstruction loss.
    Stage 3: A fully-connected classifier is attached to the learnt shared cross-modal representation. The entire network is trained for the task of activity segmentation and exercise recognition, with the pretrained unimodal and multimodal autoencoder weights being fine-tuned.
    (A minimal code sketch of this three-stage pipeline is given after the reference below.)
  • [IMWUT'20] MM-Fit: Multimodal Deep Learning for Automatic Exercise Logging Across Sensing Devices
    David Strömbäck, Sangxia Huang, Valentin Radu
    In Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT), Volume 4, Issue 4, December 2020.
    [DOI] [PDF] [bib]
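
For readers who prefer code to prose, the following is a condensed PyTorch sketch of the three-stage pipeline described in the figure caption above. It is not the authors' implementation; layer sizes, window length, the number of modalities and the class count are assumptions chosen only to make the example runnable.

    import torch
    import torch.nn as nn

    class ConvAutoencoder(nn.Module):
        # Stage 1: modality-specific 1D convolutional autoencoder (one per device and modality).
        def __init__(self, channels=3, hidden=32):
            super().__init__()
            self.encoder = nn.Sequential(
                nn.Conv1d(channels, hidden, kernel_size=5, padding=2), nn.ReLU(),
                nn.Conv1d(hidden, hidden, kernel_size=5, padding=2), nn.ReLU(),
            )
            self.decoder = nn.Sequential(
                nn.Conv1d(hidden, hidden, kernel_size=5, padding=2), nn.ReLU(),
                nn.Conv1d(hidden, channels, kernel_size=5, padding=2),
            )

        def forward(self, x):
            z = self.encoder(x)
            return self.decoder(z), z

    class MultimodalAutoencoder(nn.Module):
        # Stage 2: fully-connected autoencoder over the flattened, concatenated unimodal codes.
        def __init__(self, in_dim, shared_dim=128):
            super().__init__()
            self.encoder = nn.Sequential(nn.Linear(in_dim, shared_dim), nn.ReLU())
            self.decoder = nn.Linear(shared_dim, in_dim)

        def forward(self, z_cat):
            shared = self.encoder(z_cat)
            return self.decoder(shared), shared

    class ExerciseClassifier(nn.Module):
        # Stage 3: classifier attached to the shared cross-modal representation.
        def __init__(self, shared_dim=128, n_classes=11):  # assumed: 10 exercises + a non-exercise class
            super().__init__()
            self.net = nn.Sequential(nn.Linear(shared_dim, 64), nn.ReLU(), nn.Linear(64, n_classes))

        def forward(self, shared):
            return self.net(shared)

    # Dummy forward pass with two inertial streams (batch of 8, 100-sample windows).
    acc_ae, gyr_ae = ConvAutoencoder(), ConvAutoencoder()
    acc, gyr = torch.randn(8, 3, 100), torch.randn(8, 3, 100)
    _, z_acc = acc_ae(acc)
    _, z_gyr = gyr_ae(gyr)
    z_cat = torch.cat([z_acc.flatten(1), z_gyr.flatten(1)], dim=1)   # flatten and concatenate

    mm_ae = MultimodalAutoencoder(in_dim=z_cat.shape[1])
    recon, shared = mm_ae(z_cat)            # recon would be split per modality for the reconstruction losses
    logits = ExerciseClassifier()(shared)   # fine-tuned end to end in stage 3
    print(logits.shape)                     # torch.Size([8, 11])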