Obtaining a reliable and accurate camera calibration typically requires a significant amount of expertise as users must manually capture a set of calibration images that sufficiently constrain all of the parameters in the camera model. As a result, vision sensors prove challenging to work with, especially for novices who may lack intuition for how to collect such a calibration image set.
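The "parameters in the camera model" mentioned above are typically the intrinsics of a pinhole camera model (focal lengths, principal point, and lens distortion coefficients). The sketch below, using NumPy with hypothetical parameter values, shows how these parameters map a 3D point to pixel coordinates; calibration is the inverse problem of recovering them from images, which is why the image set must constrain every parameter.

```python
import numpy as np

# Hypothetical pinhole intrinsics: focal lengths (fx, fy) and
# principal point (cx, cy) are among the parameters a calibration
# must estimate; lens distortion coefficients add several more.
fx, fy, cx, cy = 800.0, 800.0, 320.0, 240.0
K = np.array([[fx, 0.0, cx],
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])

def project(point_3d):
    """Project a 3D point (camera coordinates) to pixel coordinates."""
    p = K @ point_3d
    return p[:2] / p[2]  # perspective divide by depth

# A point 2 m in front of the camera, offset 0.5 m to the right,
# lands at pixel u = fx * 0.5 / 2 + cx = 520, v = cy = 240.
u, v = project(np.array([0.5, 0.0, 2.0]))
print(u, v)  # 520.0 240.0
```

Calibration images that leave some of these parameters weakly constrained (for example, a target that never appears near the image corners, where distortion is largest) produce a model that projects points inaccurately, which is the failure mode AprilCal's guided image collection is designed to prevent.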
Prof. Edwin Olson and two of his former students, Johannes Strom and Andrew Richardson, have recently been awarded a United States Patent for their work in the development of AprilCal, an interactive camera calibration tool that automates the challenging task of calibration image acquisition. This innovative software guides users through the process of collecting a set of images of a calibration target. The AprilCal tool provides live feedback to users on the state of the calibration and suggests future target positions to maximally constrain the camera model parameters. In a series of human trials, this system has been shown to yield more accurate camera calibrations than standard tools while simultaneously decreasing the knowledge burden on users. AprilCal promises to lower the barrier-to-entry for applying vision sensors to robotics, virtual reality, and more.
Prof. Edwin Olson is the director of the APRIL robotics lab, which studies Autonomy, Perception, Robotics, Interfaces, and Learning. His active research projects include applications to explosive ordnance disposal, search and rescue, multi-robot communication, railway safety, and automobile autonomy and safety.
Prof. Olson received a PhD from the Massachusetts Institute of Technology in 2008. In 2010, he led the winning entry in the MAGIC 2010 competition, developing a team of 14 robots that semi-autonomously explored and mapped a large-scale urban environment; for the win, the U.S. Department of Defense awarded him $750,000. He was named one of Popular Science's "Brilliant Ten" in 2012. In 2013, he was awarded a DARPA Young Faculty Award.