
The Black Hole and the Future of Decentralized Data Processing: Explained

April 11, 2019
Engineer and computer scientist Katie Bouman, PhD, with the stacks of hard drives containing the M87 supermassive black hole image data

How did a telescope that looks more like a dispersed series of satellite dishes extract an image of the heretofore “unseeable”? The answer is data — masses and masses of it — and, more importantly, its processing and interpretation. Each node in the decentralized Event Horizon Telescope produced so much data that analyzing it required flying half a ton of hard drives to a central supercomputer at the MIT Haystack Observatory.
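For a rough sense of why flying drives beat the network, here is a back-of-envelope sketch in Python. The ~5 petabyte data volume (a figure widely reported for the 2017 EHT campaign) and the 1 Gbit/s link speed are assumptions for illustration, not numbers from this article:

```python
# Rough comparison: flying hard drives vs. streaming the data over a network link.
# Assumes ~5 petabytes of raw EHT data (a commonly reported figure, not stated in
# this article) and an illustrative 1 Gbit/s sustained uplink from a remote site.

DATA_BYTES = 5e15                 # ~5 PB of recorded signal data (assumption)
LINK_BITS_PER_SEC = 1e9           # hypothetical 1 Gbit/s uplink

transfer_seconds = (DATA_BYTES * 8) / LINK_BITS_PER_SEC
transfer_days = transfer_seconds / 86_400

print(f"Streaming ~5 PB at 1 Gbit/s would take about {transfer_days:.0f} days")
# => roughly 460 days of continuous transfer -- which is why crates of
#    hard drives went on a plane instead.
```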

Paradigm-shifting scientific discoveries, like the first-ever image of a supermassive black hole, entail the production and analysis of increasing amounts of data, often utilizing decentralized data collection as in the case of the Event Horizon Telescope (EHT). And yet, data processing is still carried out in a centralized manner, requiring impractical transportation and potentially insecure information storage at a single location. How can scientific discovery continue to progress if we don’t find a better way?

Hypernet technology could catalyze a shift from the current centralized/cloud processing model to a code-to-data paradigm for data analysis. In the future, the Hypernet protocol could facilitate the coordinated processing of data in situ, in the field, harnessing local compute at field sites to create a globe-spanning supercomputer to match a globe-spanning telescope.
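As a purely conceptual sketch of the code-to-data pattern (Hypernet's actual protocol and API are not described here, so every name and function below is hypothetical), the idea is to ship the analysis routine to each site that holds data and collect only the small results:

```python
# Conceptual sketch of a code-to-data workflow -- all names are hypothetical,
# not Hypernet's actual protocol. Each observatory site runs the analysis
# locally and returns only its (much smaller) summary.

from concurrent.futures import ThreadPoolExecutor

# Stand-ins for compute nodes located where the data was recorded.
SITES = ["alma", "apex", "iram_30m", "jcmt", "lmt", "smt", "sma"]

def run_at_site(site: str, task):
    """Pretend to ship `task` to `site` and execute it next to the local data."""
    local_data = f"raw measurements stored at {site}"   # placeholder for on-site storage
    return task(site, local_data)

def partial_reduce(site, data):
    """The analysis code that travels to the data (e.g., calibration, averaging)."""
    return {"site": site, "summary": f"reduced({data})"}

# Dispatch the same function to every site in parallel; only summaries come back.
with ThreadPoolExecutor() as pool:
    partial_results = list(pool.map(lambda s: run_at_site(s, partial_reduce), SITES))

print(f"Combined {len(partial_results)} site-level results centrally")
```

The point of the pattern is that the raw measurements never leave the site where they were recorded; only compact, site-level results travel over the network for final combination.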

To understand the relationship between the “Eye of Sauron” picture and data processing, we should first consider what an image actually is. To begin with, it’s fairly inaccurate to say that this image — or any image — was “captured.” Images are not things that exist in nature, separate from human interpretation. Our eyes and our brains work together to gather, process, and reconstitute data to create what human beings understand as images. Similarly, cameras are devices for gathering and processing information, which is then reconstituted into recognizable image form.

The Event Horizon Telescope is a decentralized means of gathering information in the form of radio waves emanating from the ring of hot, infalling material that surrounds the black hole. Massive amounts of data, collected at seven different observatories around the world, were required to correlate the signals recorded at each site and translate the resulting interference pattern into image form. Coordinating many widely separated radio telescopes provided angular resolution that would otherwise have been impossible, but it also meant the data had to be combined and analyzed before it could be reconstructed as an image. Processing relied on imaging algorithms, among them CHIRP (Continuous High-resolution Image Reconstruction using Patch priors), developed under the leadership of engineer and computer scientist Katie Bouman in 2016.
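A quick back-of-envelope calculation illustrates why the wide separation matters: the diffraction-limited angular resolution of an interferometer scales roughly as the observing wavelength divided by the longest baseline, theta ≈ λ/D. The ~1.3 mm observing wavelength and Earth-sized baseline used below are standard figures for the 2017 observations, not numbers stated above:

```python
import math

# Diffraction-limited angular resolution: theta ≈ wavelength / baseline.
WAVELENGTH_M = 1.3e-3        # EHT observes at ~1.3 mm (about 230 GHz)
BASELINE_M = 1.27e7          # Earth-diameter baseline, ~12,700 km

theta_rad = WAVELENGTH_M / BASELINE_M
theta_microarcsec = math.degrees(theta_rad) * 3600 * 1e6

print(f"Angular resolution ~{theta_microarcsec:.0f} microarcseconds")
# => ~21 microarcseconds, fine enough to resolve the roughly
#    40-microarcsecond shadow of M87's black hole.
```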

The EHT collected the radio wave data in April 2017; after transport and processing, the image was presented to the public in April 2019. A non-trivial amount of time was lost because hard drives from the South Pole were stranded by weather. But what if there were a way to bring the algorithm to the data and synchronize the processing, not just the collection? That is Hypernet’s vision of the future.