PetitCat
PetitCat implements a Brain-Inspired Cognitive Architecture (BICA) interfaced with a low-cost mobile robot (< €200) based on 'open hardware' technology. The PetitCat software is divided into two parts:
- The embedded part on the robot, developed in C/C++ for Arduino
- The cognitive architecture itself (BICA), developed in Python and executed on a PC controlling the robot remotely via Wi-Fi
The BICA part implements learning and behavior-generation algorithms derived from research (bottom-up hierarchical sequential learning, spatial learning, episodic learning) inspired by studies on certain brain regions such as the hippocampus, superior colliculus, and striatum. It can interface with other BICAs developed by other research teams and can control modular robots designed from components of "open electronics" marketed by various companies (Osoyoo, Elegoo, Adafruit, 4tronics, XiaoR, etc.).
PetitCat validates the hypotheses underlying the BICA's development by demonstrating interactive behaviors. The goal is to generate learning behaviors comparable to those of young animals (kittens, puppies, etc.).
Please refer to the project's GitHub repository for a complete description.
🛠️ Behind the Scenes (Deep Dive / Technical Analysis)
A standard Arduino robot has no model of its environment: told to drive forward over Bluetooth with a wall ahead, it will keep driving into the wall until something breaks. The Brain-Inspired Cognitive Robot rebuilds this remote-control paradigm by linking the car to a desktop PC or laptop running an Artificial Intelligence "Belief-Desire-Intention" (BDI) decision loop.
The BDI Architecture (Belief, Desire, Intention)
The Arduino-compatible microcontroller acts merely as the "spinal cord"; a separate laptop running Python acts as the "brain".
- The Spinal Cord (ESP32): The robot constantly reads its front HC-SR04 sonar sensors and streams the raw readings back to the laptop server over a Wi-Fi WebSocket.
- Belief (Data State): The Python brain ingests the sonar data and updates its model of the world. Belief: there is a wall 5 cm ahead.
- Desire (Goal State): You issue a command from the laptop keyboard: "drive forward". Desire: reach the target.
- Intention (Action Execution): The Python algorithm detects a conflict: the user wants to drive forward (Desire), but the model reports a wall ahead (Belief). The algorithm overrides the user and sends a JSON command to the ESP32:
{"Action": "REVERSE_AND_PIVOT_RIGHT"}
overriding your joystick input to protect the robot's own hardware.
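The override step above can be sketched in a few lines of Python. This is a minimal illustration, not PetitCat's actual API: the function name, the threshold, and the command vocabulary are assumptions chosen for the example.

```python
import json

SAFE_DISTANCE_MM = 100  # illustrative safety threshold, not PetitCat's real value

def decide(belief_distance_mm: float, desire: str) -> str:
    """Reconcile the user's Desire with the current Belief about obstacles.

    Returns the Intention as a JSON command string for the microcontroller.
    """
    if desire == "FORWARD" and belief_distance_mm < SAFE_DISTANCE_MM:
        # Belief (wall ahead) conflicts with Desire (drive forward):
        # override the user to protect the hardware.
        action = "REVERSE_AND_PIVOT_RIGHT"
    else:
        action = desire
    return json.dumps({"Action": action})

print(decide(50, "FORWARD"))   # {"Action": "REVERSE_AND_PIVOT_RIGHT"}
print(decide(500, "FORWARD"))  # {"Action": "FORWARD"}
```

The key design point is that the microcontroller never arbitrates: it executes whatever JSON command arrives, and all conflict resolution happens on the PC side.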
High-Bandwidth Wi-Fi WebSockets
Bluetooth (HC-05) is unsuitable for this telemetry: its bandwidth is too low, and the added latency would prevent the BDI loop from reacting in time when dodging obstacles.
- The project mandates an ESP32 or NodeMCU.
- It establishes a low-latency, two-way WebSocket connection directly to a Python Flask or Node.js server.
- The Python server runs the NetworkX algorithmic grid, or an implementation of Carsa/Soar cognitive architectures, and asynchronously streams steering commands back to the ESP32.
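A minimal sketch of that asynchronous server loop, with in-process queues standing in for the real WebSocket transport (the frame fields, sentinel convention, and threshold are illustrative assumptions, not PetitCat's actual wire format):

```python
import asyncio
import json

async def brain_loop(telemetry: asyncio.Queue, commands: asyncio.Queue) -> None:
    """Consume sonar frames and emit steering commands asynchronously.

    In the real setup both queues would be a WebSocket connection to the
    ESP32; here they stand in for the transport so the loop is testable.
    """
    while True:
        frame = await telemetry.get()
        if frame is None:  # sentinel: connection closed
            break
        distance_mm = json.loads(frame)["front"]
        action = "FORWARD" if distance_mm > 100 else "REVERSE_AND_PIVOT_RIGHT"
        await commands.put(json.dumps({"Action": action}))

async def demo() -> list:
    telemetry, commands = asyncio.Queue(), asyncio.Queue()
    for frame in ('{"front": 500}', '{"front": 50}', None):
        await telemetry.put(frame)
    await brain_loop(telemetry, commands)
    return [commands.get_nowait() for _ in range(commands.qsize())]

print(asyncio.run(demo()))
# ['{"Action": "FORWARD"}', '{"Action": "REVERSE_AND_PIVOT_RIGHT"}']
```

Because the loop awaits each telemetry frame rather than polling, the server reacts as soon as a reading arrives, which is what keeps the control latency low enough for obstacle avoidance.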
The Cognitive Hardware Platform
- An ESP32 core microcontroller (required for the Wi-Fi throughput).
- Multiple HC-SR04 sonar sensors (front, left, right) to map the Belief state.
- An L298N motor driver and a 4WD chassis.
- A desktop PC or laptop on the local Wi-Fi network, running the Python artificial-intelligence environment.
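An HC-SR04 reports distance as the width of an echo pulse, so turning a reading into a Belief-state distance is simple arithmetic. A sketch, assuming the room-temperature speed of sound (~343 m/s); the function name is illustrative:

```python
SPEED_OF_SOUND_MM_PER_US = 0.343  # ~343 m/s at 20 °C

def echo_to_distance_mm(echo_us: float) -> float:
    """Convert an HC-SR04 echo pulse width (microseconds) to millimetres.

    The pulse covers the round trip to the obstacle and back, so the
    one-way distance is half the travel time times the speed of sound.
    """
    return echo_us * SPEED_OF_SOUND_MM_PER_US / 2

# A ~583 µs echo corresponds to a wall about 10 cm away.
print(round(echo_to_distance_mm(583), 1))  # 100.0
```

On the robot this conversion runs on the ESP32 in C/C++, but the arithmetic is identical; the resulting millimetre values are what the Python brain receives as telemetry.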