Project Overview
"Emo-Bot" is a high-utility exploration into Multimodal Sensor Fusion and Embedded Character Design. Unlike static displays, Emo-Bot utilizes a suite of environmental sensors to construct a reactive persona. By synthesizing data from motion (PIR), light (LDR), and spatial proximity (Ultrasonic), the bot manages its power state and visual expressions through a centralized I2C Bus Architecture, providing an immersive Human-Computer Interaction (HCI) experience on a standard 16x2 LCD.
Technical Deep-Dive
- Custom Character HMI Design:
  - Bitmask Definition: To create expressive "Eyes," the system does not use standard ASCII. It uses the createChar() function to store custom 5x8 bitmasks in the LCD's CGRAM. By toggling between different bitmasks (e.g., Open, Closed, Squinting), the bot simulates organic facial movements.
  - I2C Framing Efficiency: Using the LiquidCrystal_I2C library, the Uno transmits display data over just two wires (SDA/SCL). This frees the remaining digital and analog pins for the high-density sensor array, maximizing the bot's "Perceptual Bandwidth."
- Multimodal Sensor Fusion Logic:
- PIR-Triggered Power Management: The PIR sensor acts as a wake-word trigger. When no movement is detected, the bot enters a "Hibernate" state (backlight OFF), drastically reducing current consumption.
- Ultrasonic Eye-Tracking: Dual HC-SR04 sensors measure the distance to objects in the Left and Right quadrants. The firmware performs real-time comparison; if the Right sensor returns a lower ToF (Time-of-Flight), the custom LCD characters "Move" to the right columns, simulating visual tracking.
- Ambient Light Diagnostics:
- Night-Mode Actuation: The LDR monitors the ambient lux level. If the environment drops below a calibrated threshold, the bot automatically engages its high-brightness LED arrays, serving as a functional night-light while adjusting its LCD contrast for optimal visibility.
Engineering & Implementation
- System Calibration Hierarchy:
- Step 1: Motion Engagement. PIR signal duration is tuned to maintain the "Active" state during brief pauses in movement.
- Step 2: Proximity Interpolation. Ultrasonic distance data is filtered to prevent "Jitter" in the eye movements, ensuring smooth transitions between LCD columns.
- Step 3: Lux Thresholding. The LDR voltage divider is calibrated against its 10 kΩ reference resistor to ensure consistent LED actuation across varying low-light conditions.
- I2C Bus Addressing:
  - The project highlights the importance of the I2C Address (typically 0x27 or 0x3F, depending on the backpack's I/O expander). Correct address mapping is essential for communication between the Uno's A4/A5 (SDA/SCL) pins and the LCD backpack.
Conclusion
Emo-Bot demonstrates the transition from simple Data Display to Interactive HMI Persona. By mastering Custom Character Generation and Multimodal Sensor Fusion, developers can create desktop companions that feel "Alive," responding intelligently to the complexities of their physical environment.