We developed a fully automated approach to calibrating multiple cameras whose fields of view may not all overlap. The only manual intervention required is waving an arbitrary textured planar pattern in front of the cameras. The pattern is then automatically detected in the frames where it is visible and used to simultaneously recover the geometric and photometric calibration parameters of every camera. In short, even a novice user can use our system to extract all the information required to insert virtual 3D objects into the scene and light them convincingly, which makes it well suited to Augmented Reality applications.
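The page does not spell out the underlying algorithm, but a standard building block of planar-pattern calibration is estimating the homography between the pattern plane and each camera image from detected point correspondences. The sketch below shows this step via the Direct Linear Transform (DLT) in NumPy; the function names are our own and the snippet is only a minimal illustration, not our full pipeline (which also recovers photometric parameters and handles multiple cameras).

```python
import numpy as np

def estimate_homography(pattern_pts, image_pts):
    """Estimate the 3x3 homography mapping pattern-plane coordinates to
    image coordinates via the Direct Linear Transform (DLT).

    pattern_pts, image_pts: (N, 2) arrays of corresponding points, N >= 4.
    """
    A = []
    for (x, y), (u, v) in zip(pattern_pts, image_pts):
        # Each correspondence contributes two linear constraints on H.
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null-space direction of A,
    # i.e. the right singular vector of the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # fix the scale ambiguity

def project(H, pts):
    """Apply homography H to (N, 2) points (homogeneous divide included)."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])
    mapped = pts_h @ H.T
    return mapped[:, :2] / mapped[:, 2:3]
```

With homographies from several views of the same plane, the camera intrinsics can then be recovered with Zhang-style plane-based calibration, and the pose of the pattern in each frame follows from decomposing each homography.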
multi_teapot.avi (23.9 MB): An example of multi-camera augmented reality, calibrated and augmented by our system.
pasteboard.avi (1.6 MB): An augmented piece of pasteboard. The same pasteboard served as the calibration pattern for the geometric and photometric calibration needed for this augmentation.