Tiny_Gesture

Table of contents

  • Introduction
  • Technologies
  • Libraries used
  • Goals
  • Mentors
  • Setup

Introduction

Machine learning on microcontrollers is an efficient, inexpensive, and reliable way to develop edge computing applications with low energy consumption. In this project, an Arduino Nano 33 BLE Sense board is used to capture motion data and classify five human gestures: “Squat”, “Jump”, “Walk”, “Run”, and “Other”. A deep learning model is built from the captured gesture data to perform the classification.
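The full model-building workflow lives in the notebook under /src; the sketch below is only a minimal illustration, with hypothetical window length, layer sizes, and file names, of how a small Keras classifier for the five gestures could be defined and converted to TensorFlow Lite for deployment on the board.

```python
import tensorflow as tf

# Assumptions (not taken from this repository): each gesture example is a
# fixed-length window of IMU readings, flattened into a single feature vector.
WINDOW = 119       # samples per gesture window (illustrative)
CHANNELS = 6       # ax, ay, az, gx, gy, gz from the on-board LSM9DS1 IMU
GESTURES = ["Squat", "Jump", "Walk", "Run", "Other"]

# A small fully connected network; layer sizes are illustrative only.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW * CHANNELS,)),
    tf.keras.layers.Dense(50, activation="relu"),
    tf.keras.layers.Dense(15, activation="relu"),
    tf.keras.layers.Dense(len(GESTURES), activation="softmax"),
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",   # one-hot encoded labels
              metrics=["accuracy"])

# model.fit(x_train, y_train, epochs=..., validation_data=(x_val, y_val))

# Convert the trained Keras model to TensorFlow Lite for the microcontroller.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()
with open("gesture_model.tflite", "wb") as f:
    f.write(tflite_model)
```

The resulting .tflite file would then typically be embedded in the classify sketch as a C byte array (for example with `xxd -i`) so it can run on the Nano 33 BLE Sense.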

Technologies

  • Python
  • C

Libraries used

  • TensorFlow Lite
  • ArduinoBLE
  • Arduino_LSM9DS1
  • BLEAK (see the capture sketch after this list)
  • pandas
  • numpy
  • matplotlib
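
As a rough illustration of how the Python libraries fit together, the sketch below uses BLEAK to subscribe to IMU notifications from the board and pandas to store the readings as a CSV. The device address, characteristic UUID, and packet layout are placeholders; the real values are defined by the Arduino capture code in /src.

```python
import asyncio
import struct

import pandas as pd
from bleak import BleakClient

# Placeholder values -- the actual address and characteristic UUID come from
# the Arduino capture sketch, which streams the IMU data over BLE.
DEVICE_ADDRESS = "XX:XX:XX:XX:XX:XX"
IMU_CHAR_UUID = "00002101-0000-1000-8000-00805f9b34fb"

samples = []

def on_notify(_sender, data: bytearray):
    # Assumes each notification packs six little-endian floats:
    # ax, ay, az, gx, gy, gz (24 bytes).
    samples.append(struct.unpack("<6f", data))

async def main():
    async with BleakClient(DEVICE_ADDRESS) as client:
        await client.start_notify(IMU_CHAR_UUID, on_notify)
        await asyncio.sleep(10.0)            # record for ten seconds
        await client.stop_notify(IMU_CHAR_UUID)

    df = pd.DataFrame(samples, columns=["ax", "ay", "az", "gx", "gy", "gz"])
    df.to_csv("gesture_capture.csv", index=False)

asyncio.run(main())
```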

Goals

To showcase the following benefits of developing edge computing applications, without sacrificing accuracy, on microcontrollers such as the Arduino Nano 33 BLE Sense board:

  • Cost Effective
  • Low Energy Consumption
  • Data Privacy
  • Tiny form factor
  • Flexibility

Mentors

  • Dr. Somya Mohanty

Setup

The /src folder contains the Jupyter notebook, the trained models, and the Arduino code (capture and classify sketches); the /data folder contains the datasets used for this project. Detailed information and documentation for this project is included in the report under the /docs folder of this repository.
