By 2020, satellite sensors will capture so many images of Earth that every one of the 8 million people who live in New York City would have to be glued to a computer screen 24 hours a day for human eyes to view every image, according to Jeff Stein, Orbital Insight’s vice president for business development.

That statistic illustrates the massive amount of Earth observation data being provided by industry veterans like DigitalGlobe, with its constellation of four large, high-resolution satellites, and more recent entrants like Planet Labs, which has launched more than 100 cubesats to provide ubiquitous, lower-resolution imagery.

In the past, government and industry customers often relied on space-based cameras to obtain a close-up view of a distant location. Increasingly, companies are pairing that imagery with many other types of geospatial data.

“The big challenge is bringing all of that information together and delivering it to users so they can make decisions,” said Erik Grant, technical director for Raytheon Intelligence and Information Systems of Dulles, Virginia. “How do we integrate it, fuse it and make sense of it so you can take the full value of the tsunami of data that’s coming?”

The San Francisco-based startup Spaceknow, for example, monitors activity at 6,000 industrial sites in China by marrying satellite imagery with street maps and spectral data that can identify specific materials present at each location, including metal, wood and concrete. Investors are hungry for the type of information Spaceknow provides through its China Satellite Manufacturing Index, which is featured on Bloomberg Terminals, said Pavel Machalek, Spaceknow co-founder.
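As a rough illustration of that kind of fusion, the Python sketch below assigns crude material labels to the pixels of a hypothetical multispectral image. The band ordering, thresholds and the classify_materials function are illustrative assumptions made for this sketch, not Spaceknow's actual pipeline, which draws on library spectra and many more data sources.

```python
# Illustrative sketch only -- not Spaceknow's actual method.
# Assumes a multispectral image loaded as a NumPy array with one band
# per spectral channel, reflectance values scaled to [0, 1].
import numpy as np

def classify_materials(bands: np.ndarray) -> np.ndarray:
    """Toy per-pixel material labeling from band statistics and ratios.

    bands: array of shape (n_bands, height, width).
    Returns a label map: 0 = other, 1 = concrete-like, 2 = metal-like.
    Band order and thresholds are placeholders, not calibrated values.
    """
    nir, swir = bands[3], bands[4]                    # hypothetical band order
    labels = np.zeros(bands.shape[1:], dtype=np.uint8)

    # Bright, spectrally flat pixels are treated as concrete-like surfaces.
    brightness = bands.mean(axis=0)
    flatness = bands.std(axis=0)
    labels[(brightness > 0.3) & (flatness < 0.05)] = 1

    # A strong SWIR response relative to NIR stands in crudely for metal
    # roofing; operational systems match pixels against library spectra.
    labels[(swir / (nir + 1e-6)) > 1.4] = 2
    return labels
```

In practice, a label map like this would then be intersected with street maps or site polygons to tally what materials appear at each industrial location over time.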

Merging those diverse datasets is not easy. With its 2015 purchase of Deimos Imaging of Spain, Canada’s UrtheCast Corp. gained two multispectral satellites to complement the imagery the firm already was obtaining from its still and video cameras mounted on the International Space Station. Each sensor provides a unique dataset. Building a platform to merge the data was “a mind-bending experience,” said Dan Lopez, UrtheCast platform and analytics vice president.

The sheer volume of data creates additional problems. DigitalGlobe’s satellites capture between 50 and 60 terabytes of imagery every day, and the company maintains a data library of more than 80 petabytes.

To solve some of the data processing and storage problems, many firms are turning to cloud computing services like Amazon Web Services. Companies also are using sophisticated algorithms to identify patterns within images. The algorithms have improved to the point that computers produce results nearly as accurate as those of people looking at imagery, Stein said. “We turn pixels into numbers and turn that into insight,” he said.
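A minimal sketch of the “pixels into numbers” idea appears below, using classical thresholding and connected-component counting rather than the learned detectors that production systems rely on. The count_bright_objects function, threshold and minimum-size values are assumptions made for illustration, not Orbital Insight's method.

```python
# Conceptual sketch of turning pixels into a number: count bright
# objects (e.g., parked vehicles) in a single-band image chip.
import numpy as np
from scipy import ndimage

def count_bright_objects(chip: np.ndarray, threshold: float = 0.6,
                         min_pixels: int = 4) -> int:
    """chip: 2-D array of reflectance values scaled to [0, 1]."""
    mask = chip > threshold                       # pixels brighter than background
    labeled, n = ndimage.label(mask)              # group touching pixels into blobs
    sizes = ndimage.sum(mask, labeled, range(1, n + 1))
    return int(np.sum(sizes >= min_pixels))       # drop single-pixel noise
```

Run over thousands of image chips per day, counts like this become the time series that analysts and investors actually consume.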

Plus, computers are continually improving their own performance by learning from past errors. Advances in deep learning and artificial intelligence are enabling companies to derive far more value from imagery than they ever could before, said Shay Har-Noy, DigitalGlobe Geospatial Big Data Platform vice president and general manager.
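As a toy stand-in for that learning-from-errors loop, the sketch below performs one gradient-descent update of a logistic classifier, nudging its weights in proportion to each misclassification. The train_step function and learning rate are illustrative assumptions, far simpler than the deep networks Har-Noy describes.

```python
# Minimal illustration of learning from past errors: a logistic
# classifier whose weights are corrected in proportion to each mistake.
import numpy as np

def train_step(weights: np.ndarray, features: np.ndarray,
               labels: np.ndarray, lr: float = 0.1) -> np.ndarray:
    """One gradient-descent update on a batch of labeled image features."""
    preds = 1.0 / (1.0 + np.exp(-features @ weights))  # predicted probabilities
    errors = preds - labels                            # where the model was wrong
    gradient = features.T @ errors / len(labels)
    return weights - lr * gradient                     # adjust toward the labels
```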

Companies still need to work on making their geospatial data easier for people outside their industry to access and understand. “To use this wealth of data, you need to be smart about cloud computing, satellite imagery, atmospheric conditions, computer vision, deep learning and data science,” said Har-Noy. “Can we take some of these barriers down?”