Camera calibration: How 6RS knows robots are seeing straight
For an autonomous mobile robot, keeping its cameras accurately calibrated is vital to operation and requires a regular, reliable calibration system. In real-world robotics operations, the pose of the RGBD cameras (given by X, Y, Z, yaw, pitch and roll) tends to change: mechanical distortion, varying load on the robot, rough terrain, and even manufacturing inconsistencies all contribute to tiny shifts in the camera's pose. Over time, these slight changes can lead to incorrect obstacle detection and estimation, causing false alarms and unnecessary stops that lower the robot's average speed and, consequently, slow down the whole operation.
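To make the six-parameter pose concrete, here is a minimal sketch (not 6RS production code) of how X, Y, Z, yaw, pitch and roll map a point from the camera frame into the robot frame; the Z-Y-X rotation order is an assumption for illustration:

```python
import math

def extrinsic_transform(point, x, y, z, yaw, pitch, roll):
    """Map a point from the camera frame to the robot frame using a
    six-parameter extrinsic. The rotation order (Z-Y-X: yaw, then
    pitch, then roll) is an illustrative assumption."""
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cr, sr = math.cos(roll), math.sin(roll)
    # Rows of R = Rz(yaw) @ Ry(pitch) @ Rx(roll)
    R = [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]
    px, py, pz = point
    return (
        R[0][0] * px + R[0][1] * py + R[0][2] * pz + x,
        R[1][0] * px + R[1][1] * py + R[1][2] * pz + y,
        R[2][0] * px + R[2][1] * py + R[2][2] * pz + z,
    )

# With zero rotation, the offsets simply shift the point:
print(extrinsic_transform((1.0, 0.0, 0.0), 0.5, 0.0, 0.2, 0.0, 0.0, 0.0))
```

A small drift in any of these six values shifts or rotates every point the camera reports, which is exactly why stale extrinsics distort obstacle estimates.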
At 6 River Systems, we mitigate calibration drift by running an auto-calibration protocol whenever the robot travels to the auto-charger. However, if the amount of drift becomes too great, a reliability engineer must intervene by remotely logging in and manually calibrating the camera. This process used to take approximately two hours during which the robot had to be pulled out of commission, resulting in productivity loss for both the customer (the out-of-operation robot) and the engineer (scheduling and completing a two-hour manual intervention). For our engineering team and for our customers, the time required to fix out-of-calibration cameras was a significant pain point.
To solve this problem, we developed an interactive tool using Google Colab which can be used by anyone on the reliability or operations teams. The whole process is completed offline and any adjustments are wirelessly updated on the robot in just a few minutes without halting the robot’s operation at the customer site. Because the tool does not interact with the robot until the last step of updating calibration values, it can be used if the robot is disconnected from its network connection or even turned off.
How? Data snapshots
If the tool does not interact with the robot at all, where do the data (point clouds) come from? On the 6RS perception team, we have developed a system called Snapshots (which deserves a blog post in itself): an event-driven data buffering and recording software module.
Imagine you are diagnosing a recurring problem with a robot. You would want to capture and analyze data from when the problem occurred, as well as from just before the issue manifested. We need this capability whenever we work on diagnostics, improvements and enhancements across a range of algorithms. Last year we developed the Snapshot module to provide this data buffering and recording for any configured event.
In the case of calibration, we configured our Snapshot system to record point cloud and depth image data for a few seconds as the robot prepares to dock with the auto-charger. Now, every time a robot approaches the charger, a ROSbag is created on the robot's disk and uploaded to the cloud, enabling the fully offline calibration optimization performed by the tool discussed in this post.
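A minimal sketch of the event-driven buffering idea, assuming a simple ring buffer flushed on a trigger (the real Snapshot module records ROSbags and supports arbitrary configured events):

```python
from collections import deque

class SnapshotBuffer:
    """Minimal sketch of an event-driven data buffer: messages are kept
    in a fixed-size ring buffer, and a triggering event (e.g. 'docking
    with the auto-charger') flushes the buffered window for recording.
    The real Snapshot module writes ROSbags; here we just return a list."""

    def __init__(self, max_messages=500):
        self.buffer = deque(maxlen=max_messages)  # oldest messages fall off

    def on_message(self, msg):
        self.buffer.append(msg)

    def on_event(self):
        # Snapshot the last few seconds of data for upload, then reset.
        recording = list(self.buffer)
        self.buffer.clear()
        return recording

buf = SnapshotBuffer(max_messages=3)
for i in range(5):
    buf.on_message(f"cloud_{i}")
print(buf.on_event())  # → ['cloud_2', 'cloud_3', 'cloud_4']
```

The ring buffer is what makes "what happened just before the event" available without recording continuously.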
How the calibration tool works
The user selects parameters like date, site and robot, and the tool downloads the data accordingly. All the dropdowns and fields update dynamically depending on the preceding selection. The whole user experience resembles a web application, yet it is written entirely in Python with the help of the "form" functionality in Google Colab and IPython widgets.
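The dependent-dropdown behavior can be sketched roughly as follows; the site and robot names, the catalog, and the `robot_options` helper are all made up for illustration:

```python
# Hypothetical site → robot catalog; in the real tool these values come
# from cloud queries, not a hard-coded dict.
CATALOG = {
    "site_a": ["robot_101", "robot_102"],
    "site_b": ["robot_201"],
}

def robot_options(site):
    """Return the robot dropdown options for a chosen site."""
    return CATALOG.get(site, [])

try:
    # ipywidgets is preinstalled in Colab; skip the UI wiring elsewhere.
    import ipywidgets as widgets

    site_dd = widgets.Dropdown(options=list(CATALOG), description="Site")
    robot_dd = widgets.Dropdown(options=robot_options(site_dd.value),
                                description="Robot")

    def _on_site_change(change):
        # Re-populate the robot dropdown whenever the site changes.
        robot_dd.options = robot_options(change["new"])

    site_dd.observe(_on_site_change, names="value")
except ImportError:
    pass
```

The `observe` callback is what gives the form its web-app feel: each selection rewrites the options of the dropdowns downstream of it.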
As we described in a previous blog post, the calibration offset analytics are stored in Google BigQuery, while the point cloud data is recorded and uploaded to cloud storage by the Snapshot module.
In the Google Colab environment, we can triage any robot’s calibration status from any of our customer sites by accessing their most recent Snapshot. This information is fetched using Google’s Python API for BigQuery. Upon selecting a particular robot from the dropdown input, BigQuery fetches calibration data and Google Colab downloads the ROSbag associated with that instance of calibration.
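A hedged sketch of that fetch step; the table and column names are illustrative, not 6RS's actual BigQuery schema, and production code should use query parameters rather than string interpolation:

```python
def latest_snapshot_query(project, dataset, site, robot):
    """Build the SQL used to look up the most recent snapshot row for a
    robot. Table and column names are hypothetical."""
    return (
        f"SELECT bag_uri, recorded_at, calibration_offsets "
        f"FROM `{project}.{dataset}.snapshots` "
        f"WHERE site = '{site}' AND robot = '{robot}' "
        f"ORDER BY recorded_at DESC LIMIT 1"
    )

def fetch_latest_snapshot(project, dataset, site, robot):
    # Lazy import so the sketch loads even without google-cloud-bigquery.
    from google.cloud import bigquery
    client = bigquery.Client(project=project)
    rows = client.query(
        latest_snapshot_query(project, dataset, site, robot)).result()
    return next(iter(rows), None)

print(latest_snapshot_query("my-project", "robot_data", "site_a", "robot_101"))
```

The row returned this way carries the cloud URI of the ROSbag, which Colab then downloads into the notebook environment.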
The user adjusts the calibration offsets using the corresponding fields, then computes the transformation and visualizes the point cloud with a single click. We use Matplotlib to visualize the point cloud messages from the ROSbags after converting them to NumPy arrays.
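A minimal sketch of that visualization path, assuming the ROSbag messages have already been parsed into (x, y, z) tuples; the `downsample` helper is hypothetical and simply keeps plotting responsive for large clouds:

```python
def downsample(points, max_points=5000):
    """Subsample a large cloud so the 3D scatter stays interactive.
    `points` is a list of (x, y, z) tuples parsed from the ROSbag."""
    step = max(1, len(points) // max_points)
    return points[::step]

def plot_cloud(points):
    # Lazy import: matplotlib is preinstalled in Colab.
    import matplotlib.pyplot as plt
    xs, ys, zs = zip(*downsample(points))
    ax = plt.figure().add_subplot(projection="3d")
    ax.scatter(xs, ys, zs, s=1)
    ax.set_xlabel("x [m]")
    ax.set_ylabel("y [m]")
    ax.set_zlabel("z [m]")
    plt.show()
```

Each time the offsets change, the cloud is re-transformed and `plot_cloud` redraws it, which is what lets the user see the effect of an adjustment immediately.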
Once all of the data are fetched and parsed, the engineer can review a visual representation of what the robot "sees" as well as a table of its camera's calibration values. As the engineer adjusts the UI elements in the calibration tool, the visual representation of the points updates to illustrate the new values. Using this process, a user can make additional corrections to the offsets while visualizing the point clouds.
Optimization and smart offsets suggestions
GIF showing 3D interactive visualization of the point cloud inside the notebook.
We analyzed a variety of point clouds collected from the snapshot bags and devised performance metrics that indicate whether the calibration offsets are good enough to produce smooth movement. Using data from robots with ideal movement, we established optimal values for these metrics. Once the optimizer has gone over all the point clouds in the ROSbag, the calibration tool uses this information to suggest optimal manual offsets for the robot's camera, along with a visual representation of the adjustment.
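As one example of what such a metric could look like (an illustrative assumption, not necessarily one of 6RS's actual metrics): after applying candidate offsets, points on the floor should sit near z = 0, so their RMS deviation can score a candidate calibration, and the optimizer keeps the offsets that minimize it:

```python
import math

def floor_flatness_rms(points, floor_band=0.10):
    """Score a calibration candidate: points presumed to be floor
    (|z| < floor_band after the transform) should sit at z = 0, so a
    well-calibrated cloud has a small RMS z-deviation. This metric is
    illustrative; a real tool would combine several such statistics."""
    floor = [z for (_, _, z) in points if abs(z) < floor_band]
    if not floor:
        return float("inf")  # no floor visible: cannot score this cloud
    return math.sqrt(sum(z * z for z in floor) / len(floor))

flat = [(x * 0.1, 0.0, 0.0) for x in range(50)]        # well-calibrated floor
tilted = [(x * 0.1, 0.0, x * 0.01) for x in range(10)]  # pitch drift
print(floor_flatness_rms(flat) < floor_flatness_rms(tilted))  # → True
```

Sweeping candidate offsets and scoring each transformed cloud this way turns "does the robot see straight?" into a simple minimization problem.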
Uploading calibration data back to the robot
After the calibration work is complete, the file is delivered to the robot when it connects to the network to receive its next instruction.
Abstraction of details in the tool
All the UI elements were built with either the IPython widgets library or Google Colab's form elements. No code is visible to the end user, and most of the Python functions live in a Python file that is downloaded from the cloud when the tool starts. Dropdowns, sliders and input boxes are used in the simplest way possible, so that anyone can adjust the calibration of any robot in the fleet, provided a snapshot bag is available in the cloud.
Using a similar setup integrating Google BigQuery, Cloud Storage and Colab, we have developed data annotation and machine learning model training tools. For example, a ROSbag is parsed to collect all the images inside it, which are then presented to the user for annotation. The user annotates the data using UI elements and finally exports the data and labels with the click of a button.
Since the robots regularly store their analytics and ROSbags in the cloud, the tools developed in Colab have helped us with data analysis, model training and testing without manually searching for the data.
Following the calibration optimization tool, we developed a tool that optimizes the calibration of all the robots at a given site and generates a PDF report with the suggested calibration offsets and plots of the point clouds transformed using those offsets. This has saved a great deal of support time and prevented robot downtime.
Fully automated in-cloud optimization, requiring no manual intervention at all, is in progress. Various metrics will be monitored regularly and the values adjusted to ensure optimal performance.
At 6 River Systems, we use diverse thinking to apply solutions to multiple problems. We build on tools like Google Cloud Platform and open source software, and we take the time to develop intelligent, impactful solutions. If this sounds exciting to you, check out our careers page: 6river.com/jobs
About the Author
Arpit Gupta is a robotics software engineer on the perception team, building safety and perception features to make the robots safer and more reliable.
Arpit has been with 6RS for one and a half years, since graduating with a Master's degree in Robotics from Worcester Polytechnic Institute (WPI).