The most detailed three-dimensional map of New York City is in the works, courtesy of early-morning flyovers from a plane last month that beamed lasers from 3,500 feet over the city that never sleeps.
The $450,000 project, part of Mayor Michael Bloomberg's environmental initiative PlaNYC, will generate maps with a resolution down to just a few inches. These digital recreations will show the Big Apple's elevation, vegetation and the geometric rise of its thousands of buildings.
This information will be used to create a public solar map showing residents, city planners and utility companies the best places to install solar panels on rooftops by "calculating the solar potential of every NYC rooftop," said Tria Case, director of sustainability for the City University of New York (CUNY), which is involved in the project.
Other applications of the data include identifying flood-prone areas and taking stock of the city's tree cover and remaining wetlands.
Aerial laser scans
The mapping effort is made possible by Lidar (light detection and ranging), a technology similar to radar and sonar but that uses light pulses instead of radio and sound waves, respectively. (In the case of the New York City runs, the light was in the near-infrared range and thus invisible to humans.)
A Shrike Commander plane beamed these light pulses – about 75,000 of them per second – at the metropolis during nine post-midnight flyovers from April 14 to April 30.
Sensors onboard the plane recorded the amount of time it took for these lasers to reflect off the city's various surfaces and bounce back to the plane.
"We're basically sending out a pulse of light and measuring its return," said Richard Vincent, general manager of operations at Sanborn, the Colorado-based company hired to do the flights with its aircraft and equipment, as well as process the reams of data on the back end.
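The timing Vincent describes converts to distance with a simple formula: the pulse travels at the speed of light, out and back, so the range is the round-trip time multiplied by light speed and divided by two. A minimal sketch of the idea (not Sanborn's actual software; the function name and sample timing are illustrative):

```python
# Illustrative lidar time-of-flight calculation.
C = 299_792_458.0  # speed of light, meters per second

def range_from_roundtrip(t_seconds):
    """Distance to the reflecting surface, in meters.

    The pulse travels out and back, so the one-way range is
    half the total path the light covers in t_seconds.
    """
    return C * t_seconds / 2.0

# A surface about 1,000 meters below the sensor returns the pulse
# in roughly 6.67 microseconds.
print(range_from_roundtrip(6.671e-6))
```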
The plane made methodical sweeps over the city, rather like someone mowing a lawn or painting a wall. During such a sortie, "the plane is flying down the center of Broadway, say, and you're painting back and forth" in strokes a few hundred meters in width, Vincent said.
Sanborn knew the exact location of its plane in the sky via a Global Positioning System (GPS) and a so-called inertial measurement unit. The varying roundtrip times for the laser beam revealed the distance between the ground below and the sensors above, in turn providing accurate heights and contours for the sprawling cityscape.
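Combining those pieces can be pictured in a few lines: the GPS fixes the sensor's altitude, the inertial measurement unit supplies the beam's pointing angle, and subtracting the vertical component of the measured range yields the surface elevation. This is a simplified sketch under an assumed flat vertical datum, not Sanborn's processing pipeline, and the sample numbers are illustrative:

```python
import math

def ground_elevation(aircraft_altitude_m, slant_range_m, off_nadir_deg=0.0):
    """Elevation of the reflecting surface above the vertical datum.

    The IMU supplies the beam's off-nadir angle; the vertical
    component of the slant range is range * cos(angle).
    """
    vertical = slant_range_m * math.cos(math.radians(off_nadir_deg))
    return aircraft_altitude_m - vertical

# Plane at about 1,067 m (3,500 feet); a pulse fired straight down
# returns a 1,000 m range, so the surface sits 67 m above the datum.
print(ground_elevation(1067.0, 1000.0))
```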
Points of light
The raw result of all these laser shots is a massive data set consisting of points, with about 100 points collected per square meter (roughly 11 square feet) on average, said Vincent.
This pointillism is then converted into a detailed impression of the real world by software and Sanborn employees. "In post-processing, we determine if it’s a building, a tree or a rock, and that requires specialized software and training," Vincent said.
Water absorbs the laser beams, so "opaque" areas in the lidar-generated map with no signal return indicate water (though lily pads, suspended particles and shallow water will often send a signal back to the plane's sensors).
Trees, with their layers of canopy, present a more dappled appearance, Vincent explained.
As Sanborn employees make these classifications, they must also account for smaller items like cars and people.
Because these transient objects get in the way of establishing a baseline elevation for a city region (needed for flood plain management, for instance), Sanborn places them in a digital layer that can be toggled on and off.
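One way to picture that toggleable layer: each point carries a classification code (the ASPRS LAS format commonly used for lidar data works this way), and transient classes are simply excluded when computing a bare-earth baseline. A hypothetical sketch with made-up points and an illustrative code for transient objects:

```python
# Illustrative classification codes; 2 (ground) follows the ASPRS LAS
# convention, while TRANSIENT is an invented code for this sketch.
GROUND, TRANSIENT = 2, 64

points = [
    {"z": 10.2, "cls": GROUND},
    {"z": 11.0, "cls": GROUND},
    {"z": 13.1, "cls": TRANSIENT},  # e.g. a parked car
]

def baseline_elevation(points, excluded=frozenset({TRANSIENT})):
    """Mean elevation of the points whose class is not toggled off."""
    kept = [p["z"] for p in points if p["cls"] not in excluded]
    return sum(kept) / len(kept)

print(baseline_elevation(points))  # averages only the non-transient returns
```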
Given the time of the flyovers, however, the problem of crowds of people thronging Manhattan streets did not come up. "We only got late night partiers," Vincent joked.