Drive-by-Wire Development

While the ultimate goal of the project is to enable the vehicle to operate autonomously, it is beneficial to also have a means of manually maneuvering the vehicle remotely. This is often referred to as developing a drive-by-wire system. The goal is to develop a simple piece of software where the user inputs the desired direction for the vehicle to drive. This also enables the user to drive the vehicle in new environments, collecting sample data with the onboard sensors that can be used for testing the algorithms later on.

In this section, a series of ROS packages is used to take Xbox controller inputs from the user and output desired translational and rotational velocities for the autonomous car. Next, the section details an Arduino sketch that receives these desired speeds from the Raspberry Pi and converts them into motor commands. Lastly, the raspicam node is used to view and record video.

First, ensure that the vehicle can operate remotely. This requires the Raspberry Pi to have its own battery source. The Solar Powerbank MP-S6000 is used for this purpose. Additionally, the camera is installed on the Raspberry Pi. The camera, Raspberry Pi, and power source are all mounted on the vehicle as securely as possible. When powered on, the user can execute commands and programs on the Raspberry Pi via an SSH connection. The complete system is shown below.

DIY Autonomous Vehicle Elegoo RaspberryPi3 Assembly
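For reference, connecting to the Pi from a laptop on the same network typically looks like the following; the username and hostname here assume a default Raspbian configuration and may differ on your setup.

$ ssh pi@raspberrypi.local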

Drive-by-Wire Development

This section assumes the Xbox controller drivers, the diy_driverless_car_ROS packages, and the ROS joy package are installed. If not, see the Raspberry Pi Integration section for instructions on how to install that software. This section also assumes a basic understanding of ROS. For a refresher on some ROS basics, please review the ROS overview related to my quadrotor research.

The rosgraph below provides an overview of the core nodes used to manually drive the Elegoo vehicle.

DIY Autonomous Vehicle Drive By Wire ROSGraph

First, the joy_node publishes the values of all the buttons and joysticks on the Xbox controller. The next node, rover_joy_teleop, reads in the values from the Xbox controller and converts them into desired translational and rotational speeds. This node subscribes to Joy messages and outputs both Twist and TwistStamped messages. The TwistStamped messages are useful for training machine learning models since they include the time stamp of the command.
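As a quick sanity check, you can confirm the controller inputs are being received by echoing the Joy messages (/joy is the joy_node default topic name):

$ rostopic echo /joy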

The user can modify the maximum speeds by changing these values in rover_joy_teleop.cpp:

float max_linear_vel = 0.2;      // maximum forward speed, in m/s
float max_angular_vel = 1.5707;  // maximum yaw rate, in rad/s (about 90 deg/s)

The callback function details the assignment of the Xbox controller joystick values to the associated desired speed commands:

void callBack(const sensor_msgs::JoyConstPtr& joy)
{
  // Scale the joystick axes (typically the left stick) by the maximum speeds
  geometry_msgs::Twist vel;
  vel.angular.z = max_angular_vel*joy->axes[0];  // horizontal axis -> yaw rate
  vel.linear.x = max_linear_vel*joy->axes[1];    // vertical axis -> forward speed
  pub.publish(vel);

  // Publish the same command with a time stamp, useful later as training data
  geometry_msgs::TwistStamped velStamped;
  ros::Time current_time = ros::Time::now();

  velStamped.header.stamp.sec = current_time.sec;
  velStamped.header.stamp.nsec = current_time.nsec;

  velStamped.twist = vel;
  pubStamped.publish(velStamped);
}

Both of these nodes are configured and launched in the rover_teleop.launch file.
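A minimal sketch of what that launch file contains is shown below; the rover_joy_teleop node type and package names are assumptions based on the diy_driverless_car_ROS repository layout and may differ from the actual file.

<launch>
  <!-- Publish the Xbox controller state as sensor_msgs/Joy messages -->
  <node pkg="joy" type="joy_node" name="joy_node"/>

  <!-- Convert Joy messages into Twist/TwistStamped velocity commands -->
  <node pkg="rover_teleop" type="rover_joy_teleop" name="rover_joy_teleop"/>
</launch>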

Since the autonomous car can potentially receive multiple command inputs at the same time, the yocs_cmd_vel_mux package is used as a multiplexer for command velocity inputs. This package receives incoming cmd_vel messages from several topics, allowing one topic at a time to command the robot based on priority. All topics, together with their priorities and timeouts, are configured through a YAML file. The node is configured and launched by the rover_control.launch or rover_control_gazebo.launch file, depending on whether the operation is real-world or simulated. Make sure the topics listed in the .yaml configuration file, for example rover_cmd_vel_mux.yaml, match the topics being published by the rover_joy_teleop node. Ultimately, the yocs_cmd_vel_mux node outputs a single cmd_vel topic as defined in the configuration file. For this project, that topic is /rover_velocity_controller/cmd_vel.
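For illustration, a configuration in the format the yocs_cmd_vel_mux package expects might look like the following; the input topic name, timeout, and priority here are assumptions, not the actual contents of rover_cmd_vel_mux.yaml.

subscribers:
  - name:     "Joystick teleop"
    topic:    "/teleop/cmd_vel"   # assumed rover_joy_teleop output topic
    timeout:  0.5
    priority: 1
publisher:    "/rover_velocity_controller/cmd_vel"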

The next task is to take the desired velocity commands, send them to the Arduino, and have the Arduino control the motors. The rosserial_arduino package enables ROS to interface directly with the Arduino IDE. The rosserial package provides a ROS communication protocol that works over the Arduino's UART. It allows the Arduino to be a ROS node which can directly publish and subscribe to ROS messages, publish TF transforms, and get the ROS system time.

The ROS wiki has a tutorial for getting the Arduino set up with ROS. The tutorial starts by installing rosserial for Arduino. Since the autonomous car is running on Raspbian, refer to the package installations in the Raspberry Pi Integration section to install the packages, either using rosinstall in the primary ROS core packages workspace or by cloning the source files from the repository into the overlay workspace.

Once rosserial is installed, the ros_lib files are created in the <workspace>/install directory. These files are the ROS bindings implemented as an Arduino library. This library must be copied into the Arduino build environment to enable Arduino programs to interact with ROS. Once successfully installed, you can see the ros_lib library in the list of examples in the Arduino IDE.
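Following the rosserial_arduino tutorial, the library can also be generated directly into the Arduino libraries folder; the sketchbook path below is an assumption for a typical install and should be adjusted to match your environment.

$ cd ~/Arduino/libraries
$ rm -rf ros_lib
$ rosrun rosserial_arduino make_libraries.py .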

The Arduino onboard the autonomous car is loaded with the rover_motors.ino which receives ROS messages containing the desired motor speeds from the Raspberry Pi and converts them into PWM motor commands.

The sketch begins by defining the logic control output pins corresponding to the four motor input channels and two channel enable output pins.

#define ENA 5
#define ENB 11
#define IN1 6
#define IN2 7
#define IN3 8
#define IN4 9

IN1 and IN2 are used to control motor 1, and IN3 and IN4 are used to control motor 2. ENA and ENB control the speeds of motor 1 and motor 2, respectively, via the PWM signals sent to each enable pin.

The motor direction is controlled by sending a HIGH or LOW signal to the driver for each motor (or channel). For example, for motor 1, a HIGH on IN1 and a LOW on IN2 will cause it to turn in one direction, while a LOW and a HIGH will cause it to turn in the other direction.

Next, the Arduino subscribes to the "/rover_velocity_controller/cmd_vel" topic that is the output of the yocs_cmd_vel_mux node.

ros::Subscriber<geometry_msgs::Twist> subCmdVel("/rover_velocity_controller/cmd_vel", cmdVelCB);

When a message is received on that topic, the cmdVelCB callback function is run. The output PWM command signals from the Arduino are expected to be between -255 and +255, with the motors stopping at 0. Based on the translational and rotational velocity limits, a gain is used to prevent the wheel commands from exceeding that threshold. The left and right wheel speeds, left_wheel_data and right_wheel_data, are computed as functions of the gain, the linear and angular velocity commands, and the estimated distance between the motors, defined as L.
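As a quick check of the scaling: a full forward stick commands 0.2 m/s, which maps to a wheel value of 700 × 0.2 = 140. The remaining headroom up to 255 is consumed by the turning term, 700 × 1.5707 × L ≈ 1100 × L, so the estimated motor separation L needs to be roughly 0.1 m or less for combined commands to stay within range.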

void cmdVelCB( const geometry_msgs::Twist& twist)
{
  // Scale the commanded velocities into the -255 to +255 PWM range
  int gain = 700;
  // Differential drive: offset each wheel by the yaw rate times the
  // estimated motor separation L (defined elsewhere in the sketch)
  float left_wheel_data = gain*(twist.linear.x + twist.angular.z*L);
  float right_wheel_data = gain*(twist.linear.x - twist.angular.z*L);

  // Left motor: the sign selects the direction via IN1/IN2, while the
  // magnitude sets the PWM duty cycle on the enable pin
  if (left_wheel_data >= 0)
  {
    analogWrite(ENA, abs(left_wheel_data));
    digitalWrite(IN1, HIGH);
    digitalWrite(IN2, LOW);
  }
  else
  {
    analogWrite(ENA, abs(left_wheel_data));
    digitalWrite(IN1, LOW);
    digitalWrite(IN2, HIGH);
  }
  // Right motor: note the IN3/IN4 polarity is mirrored to match the wiring
  if (right_wheel_data >= 0)
  {
    analogWrite(ENB, abs(right_wheel_data));
    digitalWrite(IN3, LOW);
    digitalWrite(IN4, HIGH);
  }
  else
  {
    analogWrite(ENB, abs(right_wheel_data));
    digitalWrite(IN3, HIGH);
    digitalWrite(IN4, LOW);
  }
}

The motor input channels are set based on the intended direction of travel. The computed desired motor speeds are used as inputs to the enable pins. These are converted to the PWM commands that are sent to the motors.

The setup function initializes the ROS node and subscriber. The defined pins are then configured as outputs.

void setup()
{
  // Configure all of the motor driver pins as outputs
  pinMode(IN1,OUTPUT);
  pinMode(IN2,OUTPUT);
  pinMode(IN3,OUTPUT);
  pinMode(IN4,OUTPUT);
  pinMode(ENA,OUTPUT);
  pinMode(ENB,OUTPUT);

  // Set an initial direction on each channel; with the enable pins
  // at 0, the motors remain stopped until a command arrives
  analogWrite(ENA, 0);
  analogWrite(ENB, 0);
  digitalWrite(IN1, LOW);
  digitalWrite(IN2, HIGH);
  digitalWrite(IN3, LOW);
  digitalWrite(IN4, HIGH);

  // Initialize the ROS node handle and register the cmd_vel subscriber
  nh.initNode();
  nh.subscribe(subCmdVel);
}
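Although not shown above, a rosserial sketch also needs its loop function to service the ROS communication so that incoming messages trigger the callback; a minimal sketch of what rover_motors.ino presumably contains:

void loop()
{
  // Process any pending ROS messages (runs cmdVelCB when a command arrives)
  nh.spinOnce();
  delay(1);
}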

All of these nodes are executed by calling the rover_dbw.launch file. This file calls rover_teleop.launch to launch the joy and rover_joy_teleop nodes and rover_control.launch to launch the yocs_cmd_vel_mux node. Lastly, assuming the Arduino is running the rover_motors.ino sketch, the file launches serial_node.py from the rosserial_python package. This package is a Python implementation of the host-side rosserial connection that automatically handles setup, publishing, and subscribing for a connected rosserial-enabled device (the Arduino in this case). The serial_node.py node serves as the interface between ROS and the Arduino.
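For reference, launching serial_node.py typically looks like the snippet below; the serial port value is an assumption and depends on how the Arduino enumerates on the Raspberry Pi.

<!-- Bridge between ROS and the rosserial-enabled Arduino -->
<node pkg="rosserial_python" type="serial_node.py" name="serial_node">
  <param name="port" value="/dev/ttyUSB0"/>
  <param name="baud" value="57600"/>
</node>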

Putting it all together, running the following command will enable remote teleoperation of the Elegoo vehicle.

$ roslaunch rover_teleop rover_dbw.launch

The image below was taken while remotely controlling the vehicle.

Drive-by-Wire Development Remote Navigation

Video Capture

The primary means of sensing the environment for the autonomous car is the RaspberryPi camera. This allows the user to manually drive the car beyond the line of sight. It will also be useful for implementing computer vision and machine learning algorithms to enable the car to navigate autonomously.

The rosgraph below outlines the topics and nodes involved. The fundamental drive-by-wire nodes are present, as well as the raspicam_node and republish node.

DIY Autonomous Vehicle Drive By Wire ROSGraph

Ubiquity Robotics maintains a ROS node for interfacing with the RaspberryPi camera. The installation instructions can be found on their GitHub repository. It is recommended that you download the source files into the overlay catkin workspace instead of the ROS core catkin workspace.

It is recommended to make a few changes to the configuration of the camera. This can be done by creating the camerav2_320x240.launch file and the camerav2_320x240.yaml file. In the camerav2_320x240.yaml file, the resolution is set to 320×240 pixels.
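The camerav2_320x240.yaml file follows the standard ROS camera_info calibration format; the sketch below uses illustrative placeholder values rather than the actual calibration for this camera.

image_width: 320
image_height: 240
camera_name: camerav2_320x240
camera_matrix:
  rows: 3
  cols: 3
  data: [250.0, 0.0, 160.0, 0.0, 250.0, 120.0, 0.0, 0.0, 1.0]
distortion_model: plumb_bob
distortion_coefficients:
  rows: 1
  cols: 5
  data: [0.0, 0.0, 0.0, 0.0, 0.0]
rectification_matrix:
  rows: 3
  cols: 3
  data: [1.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 1.0]
projection_matrix:
  rows: 3
  cols: 4
  data: [250.0, 0.0, 160.0, 0.0, 0.0, 250.0, 120.0, 0.0, 0.0, 0.0, 1.0, 0.0]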

The launch file loads the .yaml configuration file. It also defines some other parameters such as the resolution, image quality, and framerate. This configuration increases the speed at which the images can be processed and analyzed. Due to the way the RaspberryPi camera is mounted, the image needed to be flipped as well.

<launch>
  <node type="raspicam_node" pkg="raspicam_node" name="raspicam_node" output="screen">
    <param name="camera_info_url" value="package://raspicam_node/camera_info/camerav2_320x240.yaml"/>
    <param name="height" value="240"/>
    <param name="width" value="320"/>
    <param name="vFlip" value="True"/>
    <param name="hFlip" value="True"/>
    <param name="quality" value="30"/>
    <param name="framerate" value="15"/>
  </node>
</launch>

To manually drive the car while running the RaspberryPi camera, the rover_dbw_cam.launch file is used. This file begins by launching the previously discussed joystick, controller, and serial communication nodes. Next, the RaspberryPi camera node is launched.

<!-- Spawn Camera Driver node --> 
<include file="$(find raspicam_node)/launch/camerav2_320x240.launch"/>

By default, the raspicam_node publishes compressed images as JPEGs using the CompressedImage message format. For some applications, the images need to be in the raw Image message format. This is done using the republish node from the image_transport package. This package provides support for transporting images in low-bandwidth compressed formats such as JPEG, PNG, and Theora-encoded video. The launch file creates a node to convert the images from CompressedImage format to raw Image format, publishing on the "/raspicam_node/image/image_raw" topic.

<!-- Spawn Image Transport node -->
<node name="republish" pkg="image_transport" type="republish" output="screen"
      args="compressed in:=/raspicam_node/image raw out:=/raspicam_node/image/image_raw"/>

Lastly, ROS makes it convenient to record all of the data. This can support troubleshooting but is also useful for testing computer vision algorithms and machine learning models on real-world data recorded from the camera with the corresponding joystick commands. The following section of the launch file records all of the data, if the record_cam flag is set.

<!-- Record data  -->
<node pkg="rosbag" type="record" name="rosbag_record_all"
    args="-a -o $(find rover_teleop)/data/rover"
    if="$(arg record_cam)" />

Run the command below to manually drive the vehicle while running the onboard camera:

$ roslaunch rover_teleop rover_dbw_cam.launch

The camera output can be visualized using rqt_image_view.
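For example, pointing it at the raw image topic produced by the republish node (the topic can be passed on the command line):

$ rosrun rqt_image_view rqt_image_view /raspicam_node/image/image_raw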

DIY Autonomous Vehicle RasPiCam Image Output

Drive-by-Wire in Gazebo

ROS provides an excellent simulation environment known as Gazebo. This simulation environment is great for prototyping computer vision, machine learning, or control algorithms before implementing them on the autonomous car in the real world. Gazebo also enables you to simulate sensors such as the Kinect camera. Due to the modular nature of ROS, the software developed in Gazebo can be easily transitioned for use on the real vehicle.

The primary model used to simulate the Elegoo vehicle is based on the Clearpath Robotics Husky vehicle model. Clearpath Robotics is a hardware and software manufacturer that provides unmanned vehicles, systems, and solutions to industry leaders in over 35 countries around the world. They heavily support the open-source community and integrate ROS into the design of their robotic platforms. For detailed instructions on installing the Clearpath packages, refer to the FormulaPi ROS Simulation Environment section. Some of the parameters of the original Husky model were modified to resemble the Elegoo vehicle. The resulting model files are available for review and download in the rover_description and rover_gazebo packages in the diy_driverless_car_ROS repository.

While the Elegoo model based on the Husky is consistent throughout the simulations, Gazebo offers multiple world files as simulation environments. For this project, one of the worlds used is the Conde world file developed by Valter Costa for his paper, "Autonomous Driving Simulator for Educational Purposes." The package for the Conde model is available here, and should be saved in the default Gazebo model directory ~/.gazebo/models. You can verify that the model loads correctly by loading it in Gazebo as shown below.

DIY Autonomous Vehicle ROS Gazebo Simulation

The launch file rover_dbw_cam_conde.launch begins by launching the conde_world.world and spawning the rover with the spawn_rover_conde.launch file.

<!-- We resume the logic in empty_world.launch, changing only the name of the world to be launched -->
<!-- world_name is wrt GAZEBO_RESOURCE_PATH environment variable -->
<include file="$(find gazebo_ros)/launch/empty_world.launch">
  <arg name="world_name" value="$(find rover_gazebo)/worlds/conde_world.world"/>  -->
  <arg name="paused" value="true"/>
  <arg name="use_sim_time" value="true"/>
  <arg name="gui" value="true"/>
  <arg name="headless" value="false"/>
  <arg name="debug" value="false"/>
</include>

<!-- Spawn DIY Rover Model -->
<include file="$(find rover_gazebo)/launch/spawn/spawn_rover_conde.launch"> 
   <arg name="camera_enabled" value="$(arg camera_enabled)"/>
   <arg name="laser_enabled" value="$(arg laser_enabled)"/>
   <arg name="ur5_enabled" value="$(arg ur5_enabled)"/>
   <arg name="kinect_enabled" value="$(arg kinect_enabled)"/>
</include>

<!-- Given the published joint states, publish tf for the robot links -->
<node name="robot_state_publisher" pkg="robot_state_publisher" type="robot_state_publisher" />

The remaining nodes are the same joystick and control nodes used for maneuvering the real vehicle as well as the rosbag node to record the data for later use.
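Assuming the launch file lives in the rover_gazebo package (the package that provides the world and spawn files above), the simulation can be started with:

$ roslaunch rover_gazebo rover_dbw_cam_conde.launch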

DIY Autonomous Vehicle ROS Gazebo Simulation Verification