Concept
In my previous blog post, I introduced the idea of creating a Human Following Robot inspired by sci-fi movies like WALL-E. The concept revolves around a robot companion that autonomously follows a person, adjusting its position based on their movements. This project aims to bring a touch of cinematic magic into real life by using sensors as the robot’s eyes and ears, allowing it to perceive the presence and actions of the person it follows.
Key Components
1. Ultrasonic Sensor:
Role: Measures the distance between the robot and a person.
Functionality: Enables the robot to make decisions about its movement based on the person’s proximity (a minimal distance-reading sketch appears after this list).
2. DC Motors and Motor Driver:
Role: Facilitates the robot’s movement.
Functionality: Allows the robot to move forward, backward, and turn, responding to the person’s position.
3. Servo Motor:
Role: Controls the direction in which the robot is facing.
Functionality: Adjusts its angle based on the horizontal position of the person, ensuring the robot follows the person accurately.
4. P5.js Integration:
Role: Provides visualization and user interface interaction.
Functionality: Creates a graphical representation of the robot’s perspective and the person’s position. Visualizes the direction of the servo motor, offering a clear indicator of the robot’s orientation.
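To make the sensing side concrete, here is a minimal Arduino sketch, assuming an HC-SR04-style ultrasonic sensor and a standard hobby servo. All pin numbers are placeholder choices of mine, not the wiring from the actual build:

```cpp
#include <Servo.h>

const int TRIG_PIN = 9;   // assumed HC-SR04 trigger pin
const int ECHO_PIN = 10;  // assumed HC-SR04 echo pin
const int SERVO_PIN = 6;  // assumed servo signal pin

Servo headServo;          // turns the direction the robot is "facing"

void setup() {
  Serial.begin(9600);
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  headServo.attach(SERVO_PIN);
  headServo.write(90);    // start centered; steering by the person's
                          // horizontal position would update this angle
}

// Fire a 10 µs trigger pulse and convert the echo time to centimeters.
long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long duration = pulseIn(ECHO_PIN, HIGH, 30000); // 30 ms timeout
  return duration / 58;   // ~58 µs of round-trip echo per centimeter
}

void loop() {
  Serial.println(readDistanceCm()); // stream distance for the p5.js sketch
  delay(100);
}
```

Streaming one reading per line keeps the serial protocol simple enough for the p5.js side to parse later.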
Finalized Concept
Building upon the initial concept, the finalized idea is to implement a real-time human-following mechanism: the robot uses the ultrasonic sensor’s readings to determine the person’s distance and adjusts its movement accordingly, while the servo motor keeps the robot facing the person as they move.
Arduino Program
Inputs: Ultrasonic sensor readings for distance measurement.
Outputs: Control signals for the DC motors based on the person’s proximity.
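As a hedged sketch of that output side, the loop below maps the measured distance to motor commands, assuming an L298N-style driver with two direction pins per motor and reusing the readDistanceCm() helper from the sketch above. The pin numbers and the 20-40 cm comfort band are illustrative assumptions, not measured values from the project:

```cpp
// Assumed L298N-style wiring: two direction pins per motor.
const int LEFT_FWD = 2, LEFT_BWD = 3;
const int RIGHT_FWD = 4, RIGHT_BWD = 5;

void setup() {
  for (int pin = 2; pin <= 5; pin++) pinMode(pin, OUTPUT);
}

// Set all four direction pins in one call.
void drive(int lf, int lb, int rf, int rb) {
  digitalWrite(LEFT_FWD, lf);
  digitalWrite(LEFT_BWD, lb);
  digitalWrite(RIGHT_FWD, rf);
  digitalWrite(RIGHT_BWD, rb);
}

// Keep the person inside a comfort band: advance when they drift away,
// back off when they come too close, hold still in between.
void followPerson(long cm) {
  if (cm > 40)      drive(HIGH, LOW, HIGH, LOW);  // too far: move forward
  else if (cm < 20) drive(LOW, HIGH, LOW, HIGH);  // too close: back up
  else              drive(LOW, LOW, LOW, LOW);    // in range: stop
}

void loop() {
  followPerson(readDistanceCm()); // helper from the earlier sketch
  delay(100);
}
```

The dead band between the two thresholds stops the robot from twitching back and forth when the person stands still near a single cutoff distance.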
Video 1
P5 Program
The P5 program will focus on creating a visual representation of the robot’s perspective and the person’s position. It will receive data from the Arduino about the robot’s orientation and the person’s horizontal position, updating the graphical interface accordingly. I’m thinking of drawing an ellipse whose size tracks distance: the closer you are to the sensor, the smaller the ellipse; the farther away you are, the larger it grows. I have also tried to implement a clickable control in p5 that moves the robot forwards or backwards when you click the right or left side of the sketch; a sketch of both ideas follows below.
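Here is a minimal p5.js sketch of those two ideas, assuming the p5.serialport library (with its companion serial server) handles the Arduino connection. The port name, the 10-200 cm mapping range, and the 'F'/'B' command characters are all placeholder assumptions:

```javascript
let serial;
let latestDistance = 100; // last distance (cm) reported by the Arduino

function setup() {
  createCanvas(400, 400);
  serial = new p5.SerialPort();
  serial.open('/dev/tty.usbmodem1101'); // placeholder port name
  serial.on('data', gotData);
}

function gotData() {
  // The Arduino prints one distance reading per line.
  const line = serial.readLine();
  if (line.length > 0) latestDistance = float(line);
}

function draw() {
  background(20);
  // The closer the person, the smaller the ellipse; the farther, the larger.
  const diameter = map(latestDistance, 10, 200, 30, 300, true);
  ellipse(width / 2, height / 2, diameter);
}

// Clickable control: the right half drives the robot forwards,
// the left half drives it backwards. 'F'/'B' is an assumed protocol.
function mousePressed() {
  serial.write(mouseX > width / 2 ? 'F' : 'B');
}
```

For this to work end to end, the Arduino side would need a matching Serial.read() handler that maps 'F' and 'B' to the forward and backward motor routines.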
Video 2
Fantastic!