Workshop 5
In this final workshop, we introduce the remaining robot sensors and components.
Table of Contents
Workshop 5: Sensors
Table of Contents
1. Time of Flight sensor
2. Inertial Measurement Unit sensor
3. Button
4. Display
1. Time of Flight sensor
The ToF sensor, located in the robot's front bumper, is an Adafruit VL53L0X Time of Flight
Micro-LIDAR Distance Sensor (Datasheet). It can accurately measure objects within a narrow
cone at distances between 50 and 1200 mm. Using it in conjunction with the camera and the
motor encoders can be valuable for precise pose estimation. As with the other sensors, its
drivers and main Python script are located in the testScripts directory. Looking into the
main.py script, we can see that the sensor is instantiated through the VL53L0X class, which
is defined in the tofDriver.py file.
As the driver code shows, communication with the sensor is established through the SMBus.
To initialize the connection, it sends 8-bit unsigned values to the bus. Lines 16-23 set the
I2C standard mode, and lines 26-38 apply the recommended settings described in the
datasheet. After initialization, the sensor data are read from the bus every 10 ms.
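As a minimal sketch of what a single read involves, the snippet below shows how the two result bytes returned by the sensor are assembled into a distance in millimetres. On the robot the bytes would come from SMBus reads through the tofDriver; here the bus access is only indicated in comments, and the example values are illustrative.

```python
# Sketch of assembling a VL53L0X range reading from two bus bytes.
# On the robot the bytes would come from SMBus reads, e.g.
#   data = bus.read_i2c_block_data(address, result_register, 2)
# The register details live in tofDriver.py; only the byte assembly
# is shown here.

def raw_to_mm(high: int, low: int) -> int:
    """Combine the two 8-bit result bytes into a distance in millimetres."""
    return (high << 8) | low

# Example: result bytes 0x01 and 0x2C correspond to 300 mm.
print(raw_to_mm(0x01, 0x2C))  # 300
```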
We can verify the ToF sensor's functionality by running the main.py file:
As we can see, we get a reading of the distance between the ToF and its nearest facing object.
2. Inertial Measurement Unit sensor
In the imu_reader_node.py script, we create the publisher node. During sensor initialization,
we can choose the IMU publishing frequency, the corresponding I2C bus, and the IMU address.
We can also set angular velocity and acceleration offsets to calibrate the IMU. However,
since the IMU is mounted parallel to the horizontal plane and the robot is a rover type, we
can disregard these parameters. The node then tries to locate the sensor using its bus and
address IDs and tests its output. It then zeroes out any offset reading to calibrate itself.
The IMU's fastest rate is 40 Hz, but we can set the publisher node's frequency to any value
slower than that, either with a frequency divider or with an arbitrary rate.
Finally, the published message contains the sensor's linear acceleration and angular
velocity. The computations performed convert the IMU's raw register buffers into physical
measurements.
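The conversion from raw buffers to physical units can be sketched as follows. The raw registers are 16-bit two's-complement values that must be scaled by the configured full-scale range; the ±2 g and ±250 deg/s ranges used below are assumptions typical of small MEMS IMUs, so check the actual configuration in imu_reader_node.py.

```python
import math

# Hedged sketch: converting raw 16-bit IMU register values into physical
# units. The full-scale ranges (±2 g, ±250 deg/s) are assumptions; the
# real scale factors depend on how the IMU is configured on the robot.

ACCEL_SCALE = 2 * 9.81 / 32768.0           # m/s^2 per LSB at ±2 g
GYRO_SCALE = math.radians(250) / 32768.0   # rad/s per LSB at ±250 deg/s

def to_signed16(raw: int) -> int:
    """Interpret a 16-bit register value as two's-complement."""
    return raw - 65536 if raw >= 32768 else raw

def accel_ms2(raw: int) -> float:
    """Raw accelerometer register -> linear acceleration in m/s^2."""
    return to_signed16(raw) * ACCEL_SCALE

def gyro_rads(raw: int) -> float:
    """Raw gyroscope register -> angular velocity in rad/s."""
    return to_signed16(raw) * GYRO_SCALE

# A raw value of 16384 at ±2 g full scale corresponds to exactly 1 g.
print(round(accel_ms2(16384), 2))  # 9.81
```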
On the other hand, the subscriber node simply prints the incoming messages:
We can test these two nodes by building the catkin environment as usual and launching
them:
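The build-and-launch sequence could look like the following. The workspace path, package name, and node file names here are assumptions for illustration; substitute the actual names used in your catkin workspace.

```shell
# Hedged example: package and workspace names are placeholders.
cd ~/catkin_ws
catkin_make                  # build the workspace as usual
source devel/setup.bash

# In one terminal, start the publisher:
rosrun <your_package> imu_reader_node.py
# In a second terminal, start the subscriber that prints the messages:
rosrun <your_package> imu_subscriber_node.py
```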
3. Button
The button was mentioned in the previous workshop as a way to safely shut down the Jetson
and the battery. However, it also has a dedicated driver and main script in the testScripts
directory. Opening the main.py script, we can see that the button class is defined by calling
the button driver with the GPIO addresses provided for its signal and LED pins. It services a
callback when an interrupt occurs and distinguishes between two states: button pressed and
button released.
ButtonClass is initialized by setting the Jetson's GPIO mode and creating the LED object.
Finally, it creates an event that executes its callback whenever there is activity on the
button's GPIO pin. When the callback fires, ledClass' set function turns the button's LED on
or off. As we can see from the ledClass object, other functionalities are also available,
such as turning the LED on, turning it off, and even blinking it.
These functions can be used by students for liveness features in their project, such as
selecting robot operation modes or moving through different panes on the LCD display.
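To make the described interface concrete, here is a minimal sketch of an LED helper with the on/off/set behaviour used by the button callback. The class and method names mirror the description above but are assumptions about the actual driver; the real GPIO calls are stubbed out in comments.

```python
# Hedged sketch of an LED helper like the ledClass described above.
# On the robot, the commented GPIO calls would go through Jetson.GPIO;
# here we only track the logical state.

class LedClass:
    def __init__(self):
        self.state = False          # False = LED off

    def on(self):
        self.state = True           # GPIO.output(pin, GPIO.HIGH) on the robot

    def off(self):
        self.state = False          # GPIO.output(pin, GPIO.LOW) on the robot

    def set(self, value: bool):
        """Helper used by the button callback to switch the LED."""
        self.state = bool(value)

led = LedClass()
led.on()
print(led.state)  # True
```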
We can run the main.py script to see the button's status messages. Be careful, however, not
to press the button continuously for more than 3 seconds, as this will trigger the battery
shutdown daemon.
By pressing the button, we can see printed statements of the button's states as well as the
LED turning on and off. Furthermore, we can observe that if we do not press the button
firmly, multiple press or release events are recorded. Therefore, be careful when designing
button functionalities. It is important to note, though, that this does not occur when
pressing the integrated button located on the HUT instead.
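One common way to deal with these spurious events is software debouncing: events that arrive too soon after the previous accepted event are treated as contact bounce and discarded. The sketch below illustrates the idea on a list of timestamps; the 50 ms window is an assumption to be tuned against the actual button.

```python
# Hedged sketch of software debouncing for the button: events arriving
# within `debounce_s` seconds of the previously accepted event are
# treated as contact bounce and dropped. The 50 ms window is an
# assumption, not a value from the robot's driver.

def debounce(event_times, debounce_s=0.05):
    """Filter a sorted list of event timestamps (in seconds)."""
    accepted = []
    last = None
    for t in event_times:
        if last is None or t - last >= debounce_s:
            accepted.append(t)
            last = t
    return accepted

# A press at t=0 with bounce at 2 ms and 8 ms, then a clean release at
# t=0.5 s, yields just two events.
print(debounce([0.0, 0.002, 0.008, 0.5]))  # [0.0, 0.5]
```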
4. Display
Finally, the last component on the Jetson board is the display. The LCD display used is the
SSD1306 model (Datasheet). It can be freely used to display any information or messages the
user wants. For example, in testScripts/displayScripts there is a main.py file that prints a
static message about the battery, connection, and ToF status. It uses the SSD1306_128_64
driver to initialize the display object along with the I2C bus and address information. We
use the PIL library to set the image, the font, and the drawing object, and then project it
to the screen.
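Composing such a frame with PIL can be sketched as below. The message strings are examples, and the final disp.image()/disp.display() calls to the Adafruit driver are only indicated in comments, since they require the actual hardware.

```python
from PIL import Image, ImageDraw, ImageFont

# Sketch of composing a status frame for the 128x64 display with PIL,
# in the style of displayScripts/main.py. The message text is an example.
WIDTH, HEIGHT = 128, 64

image = Image.new("1", (WIDTH, HEIGHT))   # 1-bit image: the display is monochrome
draw = ImageDraw.Draw(image)
font = ImageFont.load_default()

draw.text((0, 0),  "Battery: OK",  font=font, fill=255)
draw.text((0, 16), "Conn: OK",     font=font, fill=255)
draw.text((0, 32), "ToF: 300 mm",  font=font, fill=255)

# On the robot, disp.image(image) followed by disp.display() would push
# this frame to the screen.
print(image.size)  # (128, 64)
```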
The display driver initially writes the following values to its GPIO address to set it up. Then it
resets all its pixels and clears the contents of the image buffer.
Then, by executing display(), the driver shows the values of the emptied data buffer on the
screen.
Finally, after creating the message in main.py, calling image() and display() again writes the
content of the PIL image to the buffer and then displays it.
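For intuition about what image() writes, the SSD1306 buffer is page-addressed: the screen is divided into 8-pixel-tall pages, and each buffer byte holds one column of 8 vertical pixels. The sketch below illustrates this packing for individual pixels; it is a simplified model, not the driver's actual implementation.

```python
# Hedged sketch of SSD1306-style buffer packing: the 128x64 screen is
# split into eight 8-pixel-tall "pages", and each buffer byte encodes
# one column of 8 vertical pixels. Simplified illustration only.

WIDTH, HEIGHT = 128, 64

def pack_buffer(pixels):
    """pixels: dict {(x, y): 1} of lit pixels; returns the display buffer."""
    buf = [0] * (WIDTH * HEIGHT // 8)
    for (x, y), on in pixels.items():
        if on:
            page = y // 8
            buf[page * WIDTH + x] |= 1 << (y % 8)
    return buf

# Lighting pixel (0, 0) sets bit 0 of the very first buffer byte.
print(pack_buffer({(0, 0): 1})[0])  # 1
```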
We can run main.py to see the message on the screen with:
python EVC/testScripts/displayScripts/main.py