The Michigan Engineer News Center

Patented camera calibration tool automates calibration target acquisition

This innovative software guides users through the process of collecting a set of images of a calibration target.

Obtaining a reliable and accurate camera calibration typically requires a significant amount of expertise as users must manually capture a set of calibration images that sufficiently constrain all of the parameters in the camera model. As a result, vision sensors prove challenging to work with, especially for novices who may lack intuition for how to collect such a calibration image set.
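For context, the "parameters in the camera model" that a calibration image set must constrain typically include focal lengths, a principal point, and lens distortion coefficients. The sketch below, a minimal pinhole model with radial distortion, is illustrative only; the parameter names (fx, fy, cx, cy, k1, k2) are conventional in the computer vision literature, not drawn from AprilCal itself:

```python
# Minimal sketch of a pinhole camera model with radial distortion --
# the kind of parameters a calibration image set must constrain.
# Illustrative only; not AprilCal's actual model or code.

def project(point_3d, fx, fy, cx, cy, k1=0.0, k2=0.0):
    """Project a 3D point in the camera frame to pixel coordinates."""
    X, Y, Z = point_3d
    # Normalize by depth to get ideal image-plane coordinates
    x, y = X / Z, Y / Z
    # Apply radial distortion (two-coefficient polynomial model)
    r2 = x * x + y * y
    d = 1.0 + k1 * r2 + k2 * r2 * r2
    x, y = x * d, y * d
    # Apply intrinsics: focal lengths and principal point
    return fx * x + cx, fy * y + cy

u, v = project((0.1, -0.05, 2.0), fx=800, fy=800, cx=320, cy=240)
```

Calibration is the inverse problem: given many observed pixel locations of known target points, solve for the parameters above, which is why the images must jointly constrain all of them.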

Prof. Edwin Olson and two of his former students, Johannes Strom and Andrew Richardson, have recently been awarded a United States Patent for their work in the development of AprilCal, an interactive camera calibration tool that automates the challenging task of calibration image acquisition. This innovative software guides users through the process of collecting a set of images of a calibration target. The AprilCal tool provides live feedback to users on the state of the calibration and suggests future target positions to maximally constrain the camera model parameters. In a series of human trials, this system has been shown to yield more accurate camera calibrations than standard tools while simultaneously decreasing the knowledge burden on users. AprilCal promises to lower the barrier-to-entry for applying vision sensors to robotics, virtual reality, and more.

Prof. Edwin Olson is the director of the APRIL robotics lab, which studies Autonomy, Perception, Robotics, Interfaces, and Learning. His active research projects include applications to explosive ordnance disposal, search and rescue, multi-robot communication, railway safety, and automobile autonomy and safety.

Prof. Olson received a PhD from the Massachusetts Institute of Technology in 2008. In 2010, he led the winning team in the MAGIC 2010 competition, developing a team of 14 robots that semi-autonomously explored and mapped a large-scale urban environment; the U.S. Department of Defense awarded him $750,000 for the win. He was named one of Popular Science’s “Brilliant Ten” in 2012. In 2013, he was awarded a DARPA Young Faculty Award.

Steve Crang
CSE Marketing and Communications Manager

Michigan Engineering
