
Project Overview

This project was inspired by the integration of wearable IoT technology with robotic systems. I chose the ST SensorTile development kit as the core motion-detection device. This tiny board is packed with high-performance MEMS sensors capable of real-time kinematic calculations. The feature I use here is its ability to transmit three orientation angles, Yaw (left-right rotation), Pitch (up-down tilt), and Roll (side-to-side tilt), so the robotic arm can follow our hand movements.

For those interested in detailed information about this development kit, you can learn more here: ST SensorTile Evaluation Tool


Step 1: Structural Design and Basic Control

I started by creating the mechanical part by downloading 3D printing files from Thingiverse: 3D Printing File

In this project, I decided to reduce the original design's Degrees of Freedom (DOF) by one to better suit control via the MEMS sensor. For the drive system, I chose the MG90s servo motor, a very cost-effective metal-gear micro servo. Its metal gears are more durable than the common plastic versions, and it provides sufficient torque for a small robotic arm structure.

Firmware Control: I chose Arduino Nano as the main Microcontroller due to its small size and sufficient I/O ports to drive multiple servo motors simultaneously. I used the Arduino Servo Library to manage PWM (Pulse Width Modulation) signals to control the robotic arm's position.

Initially, I wrote code to create a command interface via UART (Universal Asynchronous Receiver-Transmitter). This allowed me to send numerical commands through the Serial Monitor for precise basic pick-and-place testing before moving to wireless connectivity in the next step. It took only about an hour to code this UART input interface, which later also serves as the channel through which the Bluetooth module sends its commands.


Step 2: Sensor User Interface Development

Once the basic system was stable, the next challenge was to make the robot "mimic" human movements instead of typing commands via a keyboard. I installed the ST SensorTile to function as an IMU (Inertial Measurement Unit).

Logic of Operation:

  1. Data Acquisition: The ST SensorTile detects acceleration (Accelerometer) and angular velocity (Gyroscope).
  2. Sensor Fusion: Internal algorithms within the SensorTile process raw data into Quaternion or Euler Angles (Yaw, Pitch, Roll) values.
  3. Data Mapping: The angular values from the sensor are transmitted via Bluetooth LE and converted into a degree range (0-180 degrees) understandable by each servo motor.

To enable natural robotic arm movement, calibration is required to find the sensor's center point (offset), and the response values must be smoothed as much as possible to reduce motor jitter. The result is a robotic arm that responds precisely to our wrist movements.

The small plastic box (it looks like a watch) is the ST SensorTile. It transmits the current yaw and pitch to my BT LE module, which in turn sends commands to the Arduino Nano to control the servos.


Step 3: Bluetooth Low Energy (BLE) Wireless Communication System

The core of data transmission in this project is Bluetooth LE technology, which differs from classic Bluetooth. In this system, I assigned the roles of the devices as follows:

  • ST SensorTile (Server/Peripheral Mode): Acts as the data transmitter (similar to general Wearable devices), broadcasting directional data.
  • BT LE Module - BlueNRG-1 (Client/Central Mode): I used a module featuring the ST BlueNRG-1 chip, which has the capability to be configured as a "Client" to retrieve data from the SensorTile, then forward that data to the Arduino Nano via Serial protocol.

Limitations and Further Development: Currently, this robotic arm uses 4 servo motors, divided into:

  1. Shoulder
  2. Arm (Upper Arm)
  3. Forearm (Lower Arm)
  4. Gripper

Although the SensorTile can transmit data for all three axes, the current robotic arm structure lacks a servo at the wrist, so the data from the third axis is not yet fully utilized. My future plan is to develop an Android application to act as a more sophisticated intermediary. This will let me incorporate the remaining axis of movement and add the ability to program complex movement sequences directly from a smartphone screen.

Original Frontmatter

apps:
  - "1x Arduino IDE"
author: "Teddy99"
category: "Motors & Robotics"
components:
  - "1x ST Bluenrg-132"
  - "1x ST Sensor Tile"
  - "1x Arduino Nano R3"
description: "Robotic arm mimics human arm movement with MEMS and e-compass based and transmitted thru Bluetooth."
difficulty: "Intermediate"
documentationLinks: []
downloadableFiles:
  - "https://projects.arduinocontent.cc/5ae1ffbb-cdfa-489e-8940-0d317876124f.h"
heroImage: "https://cdn.jsdelivr.net/gh/bigboxthailand/arduino-assets@main/images/projects/gesture-controlled-robotic-arm-e9da31_cover.jpg"
lang: "en"
likes: 8
price: 2450
seoDescription: "Gesture Controlled Robotic Arm mimicking human arm movement using MEMS and e-compass via Bluetooth."
tags:
  - "remote control"
  - "robots"
title: "Gesture Controlled Robotic Arm"
tools: []
videoLinks:
  - "https://www.youtube.com/embed/KNsIQForp9A"
  - "https://www.youtube.com/embed/rLkIezAgzzw"
views: 11857