Having stumbled into a topic so broad and so deep, I relied heavily on many generous folks to educate me directly and steer me in the right direction for further investigation. Some of them, such as Arlen Chase, John Degnan, Brian Edwards, Gary Guenther, Bill Krabill, Tom Painter, Sanjiv Singh, and Red Whittaker, were notable players in the narrative. Others, including Jeff Deems, David Harding, Susan Jones, Paul McManamon, Michael Raphael, Johannes Riegl Jr., Xiaoli Sun, John Weishampel, Jim Yungel, and John Zayhowski, provided me far more help and guidance than their brief mentions in this book indicate.
Others to whom I'm indebted got no mention at all: Don Neilson and Rebecca Michals of SRI, Michele Durant at HRL, Audrey Lengel and Allison Rein of the Niels Bohr Library and Archives at the American Institute of Physics, Paul LaRocque of Teledyne Optech, David Murphy of Near Earth Autonomy, Nora Kazour at Carnegie Mellon, Jeff Bennett, Don Carswell, Bob Samson, David Smullin, Ron Swonger, and Guido Visconti. And I'm sure I'm forgetting someone.
The University of Colorado Denver's Auraria Library kindly allowed me guest researcher access, helping enable the depth of reporting I was hoping for. UCHealth continued to let me tell its stories while working on this book. Jacqueline May Parkison did a masterful job editing the manuscript. I'm grateful for the support of literary agent John Willig and of the team at Prometheus Books. Finally, I thank my wife and daughters for their patience.
Flat-screen monitors display pointillist renderings of mountains, rivers, roads, bridges, buildings, and power lines with little regard for the usual relationships between actual pigmentation and color. In place of those are schemes generally favoring deep blues at the base that then loosen into greens, yellows, oranges, and reds with increasing elevation. There are enough screens with enough color that one has the sense of being courted by many amorous peacocks. The screens belong to exhibit hall booths, which, for the duration of the 2017 International Lidar Mapping Forum at the Hyatt Regency Denver, belong to dozens of data providers, point cloud software developers, and makers of very pricey laser and other hardware. Many have sci-fi names: Quantum Spatial, Network Geomatics, Terrasolid. Many have sci-fi products too, though they happen to be real.
Drones resembling paper airplanes, stealth fighters, helicopters, and propellered crabs have alighted on many tables and stands. I pause to consider the Pulse Aerospace Vapor 15, a black chopper with its rotors folded back like the wings of some immense cricket. If it were a carnivore, I would be lunch.
I just had lunch, standing over a ham and cheese sandwich from the buffet with a semiretired guy named Brad Weigle. He grows bananas and mangoes in his Florida backyard but won't be there much over the next couple of years. He's moving to Hawaii to map the National Tropical Botanical Gardens in high resolution using lidar that flies on drones like the Pulse Aerospace Vapor 15. As we talked, Chuyen Nguyen, a graduate student from Texas A&M–Corpus Christi whom Weigle had met earlier, stopped to say hi and somehow mistook me for someone capable of grokking something called multiscale voxel segmentation for terrestrial lidar data as pertains to swamp mapping, the subject of her PhD work. I smiled and nodded pleasantly and said things like Wow and Amazing at what seemed to be appropriate intervals. When she moved on, Weigle assured me: That stuff is cutting edge. It's going to be a really big deal.
Over at the Velodyne Lidar booth, a sales engineer named Jeff Wuendry tells me that self-driving vehicles that constantly lidar-scan their environments (so they, for example, don't smash into things) could, if that data were crowdsourced and stored in the cloud, obviate the need for lidar mapping drones, at least in a lot of urban environments. Wuendry also tells me that he used to work for a German company called SICK, now a big name in lidar. It started out with light curtains so metal-stamping machines wouldn't inadvertently crush the hands of their minders, he says.
I walk the few steps over to the Harris Corporation booth, where Blake Burns, a senior sales engineer, explains that one of Harris's multimillion-dollar Geiger-mode flash lidar rigs, flown on an aircraft at 330 miles an hour and an altitude of twenty-seven thousand feet, can render a three-mile swath of whatever lies below in photographic detail (rainbow coloration notwithstanding), not to mention, he adds, three-dimensional exactitude to about four inches in any direction. The US military has been using similar Harris systems for two decades, he says, though this was only declassified a couple of years back. Just three of the civilian units exist, and they have better things to do than hang out in an exhibition hall.
Right next door, so to speak, Katie Fitzsimmons of Leica Geosystems tells me that Leica's single-photon lidar (another multimillion-dollar black box, an example of which is on hand and, were it turned on, would be collecting breathtakingly detailed imagery and elevation data of the Hyatt carpet) operates like the equivalent of one hundred typical lidars at once, firing off six million laser pulses each second. The company is in the process of elevation mapping the continental United States and Western Europe to an accuracy of a foot or less with these machines, Fitzsimmons adds. She speaks with enthusiasm, but also more matter-of-factly than the situation seems to warrant, like someone reciting for the umpteenth time the technical specs of a space-time portal.
Her colleague Josh Rayburn describes the company's Pegasus:Backpack, a twenty-eight-pound carbon-fiber-reinforced wonder with five cameras and two spinning Velodyne lidars so someone can walk around and capture detailed 3D renderings of indoor or outdoor spaces. It costs about as much as a Ferrari. It's right about now that my brain starts to hurt.
It's not only being overwhelmed by all the shiny objects, preening flat screens, and jargon; it's also the weight of knowing that the lidar mapping that this entire conference is dedicated to represents just one of many realms in which people are using lidar. That knowledge both reinforces and seriously complicates my desire to make the technology and its evolution clear to people who, like me, might otherwise suspect a voxel to be a Toyota subcompact.
Lidar is a technology capable of measuring continental drift, determining the composition of Earth's atmosphere, discovering lost cities, tracking the biomass in forests, assessing flood risk and hurricane damage, measuring the melting of glaciers, detecting submerged explosives and the level of the seas in which they lurk, listening to stinkbug conversations, guiding missiles and self-driving cars, and a whole lot more. I have become fascinated with this most striking macroscopic application of quantum physics, which, were it not for its milquetoast name, would surely have soaked up a lot more popular love by now, considering its growing importance to science, industry, and government. I have decided it's time to tell lidar's story, starting at the beginning and hanging tough until a sort of techno-Darwinian explosion sent the technology radiating into more niches than any reasonable tome might possibly hold. Then I will have to pick and choose.
The beginning is straightforward, though. The extraordinary technologies arrayed at this technical conference in the Mile High City share an improbable common ancestor: lidar's first flicker occurred between the ears of a self-educated, mentally troubled Irish savant who lived with his parents on the outskirts of Dublin and corresponded with Albert Einstein. He dreamed up ways to see the impossibly tiny and the impossibly distant, to be realized long after he was gone.