Final User Testing Update

Group: Majid, Hassan, Aaron

Concept

This project aims to create a small-scale self-driving experience. It uses a machine-learning module that captures a user's hand gestures for steering, together with a foot pedal for accelerating, braking, and stopping. The user controls the car by imitating real-life steering-wheel movements in front of a computer camera while engaging the pedal with a foot. The pedal, however, is not implemented yet.

Implementation

To achieve a near-real-life driving experience, we implemented a machine-learning module that uses a camera to capture a user's hand gestures. This module is built in p5.js and is meant to sync with the pedal and the robot over Bluetooth; that link is not implemented yet. For now, a network connects the robot and the pedals for user control.

From working with the DC motor code, the maximum attainable speed setting appears to be 500, which is relatively slow. We also implemented code that prevents the robotic car from crashing: when a user accelerates toward an obstacle, the robot stops, scans 180° away from the obstacle, and backs up. In some cases, it scans and then moves in a direction free of obstacles.

We also installed an ultrasonic sensor to help avoid obstacles. The distance sensor is mounted on a servo motor so it can rotate freely. This was particularly important because our algorithm scans the surrounding area for obstacles and steers the robotic car toward a clear direction.

Displayed here is the code defining the pins, the DC motor speed, and the other dependencies.

#include <Servo.h>
#include <AFMotor.h>
#define Echo A0    // ultrasonic sensor echo pin
#define Trig A1    // ultrasonic sensor trigger pin
#define motor 10   // servo signal pin
#define Speed 500  // DC motor speed
#define spoint 103 // servo center position (degrees)

Displayed here is the obstacle-avoidance code.

void Obstacle() {
  distance = ultrasonic();   // current distance ahead, in cm
  if (distance <= 12) {      // obstacle closer than 12 cm
    Stop();
    backward();              // back away briefly
    delay(100);
    Stop();
    L = leftsee();           // distance to the left
    servo.write(spoint);     // re-center the servo
    delay(800);
    R = rightsee();          // distance to the right
    servo.write(spoint);
    if (L < R) {             // more room on the right
      right();
      delay(500);
      Stop();
      delay(200);
    } else if (L > R) {      // more room on the left
      left();
      delay(500);
      Stop();
      delay(200);
    }
  } else {
    forward();               // path is clear
  }
}

Aspects of the Project we’re proud of.

Our proudest achievement so far is the implementation of the machine learning module.

The algorithm that intelligently checks for obstacles and prevents users from crashing is another feat we're proud of. We believe that implementing this in real life could help prevent vehicular accidents and save lives.

Aspects we can improve on in the future.

The aspect of the project we believe would have the greatest impact if developed further is the obstacle sensor that prevents the car from crashing. Scaled up, it could be built into cars to help reduce road accidents.

We also believe we can reverse the obstacle-sensing algorithm to detect cliffs, potholes, and dangerous empty spaces on roads, further reducing road accidents.

User Testing
