A simple “Hello World”-style demo, the “Slide Demo” illustrates a real-time endpoint AI classification model (called a “Knowledge Pack”) built using the SensiML Toolkit. It also serves as a simple dataset for understanding the workflow used to build and deploy such Knowledge Packs to embedded IoT devices.
The Slide Demo uses six axes of accelerometer and gyroscope sensor data to classify among three motion events generated by sliding or lifting the development board from a flat tabletop surface.
Motion Event Types:
- Stationary – Board is resting on a desk surface
- Horizontal – Board is slid back and forth (left to right or forward to back) on a flat surface in a repetitive rhythmic motion
- Vertical – Board is lifted up and down in the air in a repetitive rhythmic motion
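To make the three classes concrete, the toy heuristic below separates them by where the motion energy appears: little energy anywhere means Stationary, energy dominated by the in-plane axes suggests Horizontal, and energy dominated by the vertical axis suggests Vertical. This is only an illustration of how the classes differ, not the model a SensiML Knowledge Pack actually builds, and the threshold value is made up:

```python
import math

def axis_energy(samples):
    """Mean squared deviation from the mean -- a crude motion-energy measure."""
    mean = sum(samples) / len(samples)
    return sum((s - mean) ** 2 for s in samples) / len(samples)

def classify_window(in_plane, vertical, threshold=0.05):
    """Toy 3-class heuristic (NOT the Knowledge Pack algorithm):
    stationary if little energy on any axis, otherwise pick the
    class whose axis carries more motion energy."""
    e_h = axis_energy(in_plane)
    e_v = axis_energy(vertical)
    if e_h < threshold and e_v < threshold:
        return "Stationary"
    return "Horizontal" if e_h > e_v else "Vertical"

# Synthetic one-second windows at roughly 104 Hz:
flat = [0.0] * 104                                        # board at rest
wave = [math.sin(2 * math.pi * 2 * t / 104) for t in range(104)]  # rhythmic motion
```

A real Knowledge Pack learns its features and decision logic from the labeled dataset rather than using hand-tuned thresholds like this.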
The source sensor data for the Slide Demo consists of 3-axis accelerometer and 3-axis gyroscope data collected from the on-board inertial measurement unit (IMU) found on each platform's evaluation board. Sample rates are 104 Hz for QuickAI and SensorTile (using the STMicro LSM6DSL and LSM6DSM IMU sensors, respectively) and 100 Hz for the Nordic Thingy (InvenSense/TDK MPU-9250 IMU sensor).
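As a rough sketch of working with such recordings, the snippet below parses a small inline excerpt in the six-channel layout described above and relates sample counts to recording duration at the SensorTile's 104 Hz rate. The column names here are assumptions for illustration; the actual headers in the dataset CSVs may differ:

```python
import csv
import io

# Hypothetical two-row excerpt with assumed column names; the real
# dataset files may label the six IMU channels differently.
sample_csv = """\
AccX,AccY,AccZ,GyroX,GyroY,GyroZ
12,-3,1020,5,-8,2
15,-1,1018,7,-6,3
"""

rows = list(csv.DictReader(io.StringIO(sample_csv)))
accel_z = [int(r["AccZ"]) for r in rows]

def duration_seconds(num_samples, sample_rate_hz):
    """Recording length implied by a sample count at a fixed rate."""
    return num_samples / sample_rate_hz

# 104 samples at 104 Hz (SensorTile) or 100 samples at 100 Hz
# (Nordic Thingy) both span one second of motion.
```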
Below is a representative plot of such IMU sensor data (taken from the SensorTile file User001_Vertical_03.csv) that is typical of the files found within the dataset: