ESE5160_T05-Fusion-Maverick_Magic-Wand

a14g-final-submission

* Team Number: T05
* Team Name: Fusion Maverick
* Team Members: Qingyang Xu, Ruizhe Wang, Xinyi Wang
* Github Repository URL: https://github.com/ese5160/a14g-final-submission-s25-t05-fusion-maverick.git
* Description of test hardware: (development boards, sensors, actuators, laptop + OS, etc) 

1. Video Presentation

Watch the video



2. Project Summary

Device Description

We designed a magic wand for IoT-based environments, capable of remotely controlling electronic devices through gesture recognition. In our prototype, we used a motor and an LCD screen as actuators. The system also features an “echo back” mechanism, providing distinct haptic feedback via a haptic driver and vibration motor to confirm command execution.

The project is inspired by current smart home systems. Instead of the usual mobile control via smartphone, we wanted to make the interaction more playful. The magic wand could also be programmed as a laser cat teaser or as a toy wand for children's entertainment.


Device Functionality

  1. Design of the Internet-Connected Device:

    The system is composed of two custom-designed PCB modules: the Wand Module and the Actuator Module.

    The Wand Module detects hand gestures using an onboard IMU and activates only when the force-sensitive resistor (FSR) is pressed, minimizing unintended gesture recognition.

    Once a gesture is recognized, the microcontroller processes the data and transmits the corresponding command to the cloud via Wi-Fi.

    The Actuator Module, which maintains an active Wi-Fi Internet connection, receives this command and performs the appropriate action using a motor and an LCD screen.

    After execution, the actuator sends feedback to confirm task completion, which is communicated back to the user via a vibration motor on the wand.


  2. Sensors, Actuators, and Key Components:

    • Magic Wand PCB:

      • Slide Switch – Powers the wand on/off.
      • Force-Sensitive Resistor (FSR) – Enables gesture recognition only with intentional input.
      • Inertial Measurement Unit (IMU) – Detects motion and captures gesture patterns.
      • NeoPixel LED Strip – Provides visual animation feedback upon command transmission.
      • Vibration Motor – Delivers haptic feedback when a task is confirmed as complete.
    • Actuator PCB:

      • LCD Screen – Displays animations or task-related visuals.
      • Motor – Executes the command received from the wand module; its speed is adjustable.
      • State LED – Indicates task activity status (on during task execution, off when idle); also serves as a debugging tool.




Challenges

The difficulties we encountered lie in two areas: the MQTT communication between the two devices through Node-RED, and the stack/memory allocation among the different threads/tasks.

For the MQTT communication problem, we aimed to support both end-to-end/device-to-device control and cloud control, so we had to resolve the potential conflicts between the two paths. We modified the Node-RED flow (specifically, a function node with a multiple-button architecture) so that both control schemes work without interfering with each other.

For the stack/memory allocation challenge, we used the debugger and Percepio Tracealyzer to measure each thread/task's CPU usage and its share of overall execution, and then allocated an adequate stack/memory budget to each thread/task.


Prototype Learnings


Next Steps & Takeaways




3. Hardware & Software Requirements




4. Project Photos & Screenshots











Codebase