The Origins of Hypernet and Galileo


Ivan Ravlich, ABD in Aeronautics and Astronautics at Stanford, bridges the divide between several scientific and technological disciplines (chemical engineering, materials science, aerospace engineering, theoretical physics). While preparing to run large-scale simulations on extended theories of gravity for his doctoral research, he realized that the bottleneck to innovation in nearly every scientific field is the lack of access to parallel computation.

At the same time, Todd Chapman, PhD in Aeronautics and Astronautics, was working with his graduate advisor to make it possible to run serious aerospace simulations on an iPhone 5. Todd was fascinated by edge computing and the idea of carrying out meaningful physics calculations on a handheld device in the field, where the data was being generated. He ultimately found that a handheld device could reproduce cutting-edge physics work from the late 1990s and early 2000s, work that at the time required massive machines costing multiple millions of dollars.

It was not computing that brought Ivan and Todd together, however, but rather their shared love of playing music, especially “pretty terrible rip-offs of 80s hair metal.” They eventually realized that they had both been confronting a universal and widespread problem across scientific fields and that they might have ideas for solving it.

Researchers and engineers across all industries can spend 70-80% of their time trying to get their computing tools to work and only 20-30% on the meat of their projects. At the same time, computing power is all around us, including in the consumer electronics Todd and Ivan encountered all over Stanford’s campus. This inspired them to create Hypernet, a blockchain-enabled marketplace with the potential to harness devices around the globe and make their computing power easily accessible. The goal was to facilitate serious data analytics, scientific simulations, and all other compute-intensive workloads.

They also recognized that computing tools must be easy to use, not just accessible, so that subject-matter experts can focus on their science instead of their tools. This is why they built Galileo, a portal interface that provides just the right degree of simplicity to free the user from implementation concerns while still allowing for advanced usage. In terms of accessibility and usability, Hypernet and Galileo could do for cloud tools and distributed computing what the desktop and mouse did for the operating system and PC revolution.

Watch the video to hear the origin story!

Introducing: Galileo, the Portal to Hypernet


After much anticipation, the Hypernet team is today launching Galileo, the universal solution for distributed computing. Galileo is the portal to Hypernet, through which the world’s computing power will be accessed.

The Galileo app advances the Hypernet mission of democratizing the means of discovery and technological progress, as Kyle Wiggers so clearly explains in this piece for VentureBeat. It is a revolutionary tool for engineers, researchers, and data analysts, who can now simply drag and drop their code to access remote computing resources. It helps to automate DevOps by eliminating the need for extensive, complicated, and time-consuming cloud setup. With Galileo, Hypernet enables brilliant minds, wherever they may be found, to focus on solving the world’s greatest problems.

We invite Hypernet supporters, and the engineers and data scientists in their networks, to sign up and try it out themselves, for free.

Ivan Ravlich, our project lead and co-founder at Hypernet Labs, explains, “With Galileo we want to immediately upskill your entire team by eliminating the need for specialized knowledge to access compute power through a fast and easy-to-use platform. This allows people to spend minutes, rather than weeks, accessing required computing power. In terms of accessibility and usability, Galileo and Hypernet do for cloud tools and distributed computing what the desktop and mouse did for the operating system and PC revolution.”

Whereas the Magellan application was designed only for our supply-side supporters, Galileo encompasses both the demand and supply sides of the Hypernet network. Our next step is to connect Galileo with the marketplace features our blockchain engineers have developed for our testnet. The public release of the testnet will be preceded by testing within Project Magellan.

The concept of a global decentralized supercomputer was introduced with the advent of Ethereum, and Hypernet is making this a functional reality for all compute-intensive work (data analysis, AI/ML, simulation, rendering). As an extremely useful technical innovation, Galileo helps us drive widespread adoption of blockchain technology.

A sampling of Galileo’s first real-world applications

Dam breach analysis for dam safety: Engineers in hydrology and hydraulics have used Galileo for hundreds of hours of runs.

Bio Life Sciences: A researcher used Galileo to render 60GB of data and produce 3D imaging of the simulated effects of air pollution inside lungs.

Market Analysis: A market analyst built an algorithm to predict currency markets and used Galileo for the data analytics, achieving 60% accuracy.

Social Sciences: A legal scholar and anthropologist used Galileo for quantitative comparative country studies of the effects of specific laws.

Space Exploration / Video Rendering: A plasma rocket company used Galileo to process 3D 4K renderings of their proof-of-concept ships.

Galileo Benefits

Scale up your compute resources in minutes with no setup

  • Got a large project with a short deadline? Access more powerful machines quickly and easily.
  • Stop wasting weeks or months configuring your cloud setup. Enable engineers and researchers to focus on their areas of expertise, not cloud infrastructure.

Easily access remote compute machines, on- and off-premises

  • Simply drag and drop project folders onto office workstations or cloud machines you control, and start running remotely immediately
  • View all jobs in progress on each machine
  • Improvement over Remote Desktop and SSH: multiple users can now run different jobs on one machine simultaneously

Automate deployment of compute jobs and script against the Galileo engine

  • Parallel computational workloads on one or many machines
  • Easy-to-use SDK to deploy hundreds or thousands of runs for sensitivity analysis or full-blown Monte Carlo simulations (see the sketch below)
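
To make this concrete, here is a minimal sketch of what scripting a batch of Monte Carlo runs against such an SDK could look like. Note that the `galileo_sdk` module, the `GalileoClient` class, and its methods are hypothetical stand-ins used for illustration only, not the actual Galileo API.

```python
# Hypothetical sketch only: the module, class, and method names below are
# illustrative stand-ins, not the actual Galileo SDK interface.
import random

from galileo_sdk import GalileoClient  # hypothetical module and class

client = GalileoClient(api_key="YOUR_API_KEY")  # hypothetical authentication

# Deploy several hundred Monte Carlo runs, each with randomly drawn inputs.
jobs = []
for seed in range(500):
    viscosity = random.uniform(0.8, 1.2)
    job = client.submit_job(                      # hypothetical method
        project_dir="./my-simulation",
        command=f"python run.py --seed {seed} --viscosity {viscosity}",
    )
    jobs.append(job)

# Gather results as the runs complete.
results = [job.wait_for_result() for job in jobs]  # hypothetical method
```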

Run securely and privately

  • By default, no one can run on your machine. If you want friends or colleagues to be able to run on your machine, you can invite them and set permissions.
  • Secure communications: HTTPS, WSS. All communications hashed (256-bit SHA3) and signed (2048-bit RSA, RSASSA-PSS)
  • End-to-end encryption: AES-CTR (256-bit key, 128-bit unique counter block); see the sketch below
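
For readers who want to see what these primitives look like in practice, here is a minimal sketch using the open-source Python `cryptography` package. It demonstrates the named algorithms generically (SHA3-256 hashing, 2048-bit RSASSA-PSS signing, and AES-CTR with a 256-bit key and 128-bit counter block); it is not Hypernet’s actual implementation.

```python
# Generic demonstration of the primitives named above, using the open-source
# Python "cryptography" package. This is not Hypernet's implementation.
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

message = b"compute job payload"

# 256-bit SHA3 hash of the message
digest = hashes.Hash(hashes.SHA3_256())
digest.update(message)
message_hash = digest.finalize()

# 2048-bit RSA key and an RSASSA-PSS signature over the message
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
signature = private_key.sign(
    message,
    padding.PSS(mgf=padding.MGF1(hashes.SHA3_256()),
                salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA3_256(),
)

# AES-CTR end-to-end encryption: 256-bit key, 128-bit unique counter block
key = os.urandom(32)            # 256-bit key
counter_block = os.urandom(16)  # 128-bit unique counter block
encryptor = Cipher(algorithms.AES(key), modes.CTR(counter_block)).encryptor()
ciphertext = encryptor.update(message) + encryptor.finalize()
```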

Work smarter with collaborators — shared project folders

  • Share data sets, models, and results using any network drive or major cloud storage provider.
  • Easy management of large datasets: avoid transferring and copying large project and results files on every deployment, and start running remotely immediately.

Sign up to try it out yourself!

September 2019 Roadmap Update


We are steadily moving forward in our product roadmap, and exciting times are ahead. We are now in the “Powered Ascent” phase of our roadmap, carrying out closed beta testing of the Hypernet access portal and internal testing of our blockchain payment channels and mechanics.

We’ve been working hard recruiting selected users who are eager to test drive the Hypernet testnet for their scientific computing workloads. To be clear, scientific computing is a necessary component of all research and engineering, carried out across industries in the private sector, the public sector, and academia. It includes data analysis and all work involving simulations. It’s clear that we are poised to solve critical pain points for a broad range of scientific computing users. As testing continues, the engineering team is solving problems quickly and prioritizing the most in-demand features.

While we’ve been working with test users on both the demand and supply sides of the network, through several phases of the development process, we’re now happy to announce that researchers are actively using the beta Hypernet portal application every day to carry out their professional computational work!

“The great news is that I’m so used to using the Hypernet portal (it’s really easy and convenient) that when it’s down I feel much less productive! Point is, it’s super useful.” — Active test user in the energy sector

A major feature of the access portal is its ease of use. Currently, it is common for researchers and engineers to feel that they spend more time building their tools from scratch than working in their areas of expertise. This is true whether they are using their own machines, computers in a lab or office they can access, or the cloud. Hypernet simplifies the entire equation. The goal of the Hypernet portal is for users to be able to purchase extra compute power through a blockchain-enabled marketplace and to manage access to their own machines, even if they will not be offering those machines on the supply side of Hypernet.

Testing deployment on embedded devices

As you know, members of our community have already taken part in supply-side testing through Project Magellan. As projected in the roadmap plans, we’ve now begun to merge supply- and demand-side testing with a select few Project Magellan members, and we will be onboarding more of them over the next few weeks in a controlled way.

After we move forward with the merge, the coming steps for Project Magellan will be to begin testing the blockchain payment channels and network governance features, which are currently undergoing rigorous internal testing and scrutiny. We look forward to this next phase in the roadmap.

Roadmap Update


The new Hypernet roadmap condenses previously completed milestones and looks ahead in more detail to everything that is quickly approaching.

As you know from previous updates, we are currently in “liftoff” — closed beta and internal blockchain testing. While previous steps mostly involved infrastructure construction, we are entering a more dynamic phase of product testing and aggressive, targeted user recruitment in preparation for mainnet launch.

The next few months promise to be intense and exhilarating. Get ready!

Note that the phases of the roadmap do not represent equal spans of time.

Project Magellan: We Have Lift-Off


Hypernet is on a mission to accelerate the entire range of scientific and high-performance computing in both academic research and industry, creating an unprecedentedly seamless experience on the blockchain. We intend to do this by providing a one-stop tool that allows anyone, working with any kind of scientific workload, to seamlessly run their code on any machine they choose — without the need to set up a VM or waste time configuring remote login and installation.

Last Wednesday, with the help of our Project Magellan test group, our supply-side portal to the Hypernet testnet went live.

Ivan Ravlich sending jobs and live chatting with the Project Magellan group.

Our engineers have already completed over 200 compute jobs using supporters’ machines, located all over the world. This success is a significant step towards implementation of the full Hypernet vision.

The Hyperoffice during the live test.

Leveraging our supporters’ unique expertise from their time in the blockchain space, Project Magellan is an invaluable part of helping the Hypernet team craft the buyer-seller interaction and user experience. By allowing us to run the first iteration of tests with myriad workloads on a variety of machines, the Magellan users are helping us to identify and correct critical bugs and inefficiencies.

Project Magellan will continue into the summer as we test new functionalities and bring the various pieces of the Hypernet vision together for the big launch. Users’ input will directly contribute to the improvements we release on a regular basis during this time.

Dan Luo, Elliot Shohet, Conrad Bailey, Todd Chapman, and Steven Ingram.

Our dream and goal is to enable thinkers and creators by removing the barriers of access to compute power around the world. Welcome to the digital frontier.

Join the Project Magellan waitlist here.

Project Magellan: Into Hyperspace


Magellan Powered by Hypernet is here and ready for download on June 12th!

Through Magellan, a select group of explorer-adventurers will be testing the very first portal to Hypernet’s initial testnet. We’ve already demonstrated the Hypernet protocol’s capacity to harness 1000 virtual machines across the globe to cooperatively solve a machine learning problem. It is now time to connect the devices of this inaugural group of ‘everyday people’ to conduct further experimentation and development of the testnet.

Members of the Project Magellan group are builders and self-starters who are taking initiative with Hypernet to address daunting societal challenges, including: 1) our ever-increasing and desperate need for compute power and data processing, 2) resource depletion and climate change resulting from attempts to meet this need, and 3) barriers to accessing compute power (the means of scientific discovery) for many would-be innovators.

They join the Hypernet team as we build a new computing infrastructure that will more efficiently draw upon our collective computing resources, which are shared across billions of devices around the globe.

Practically speaking, Project Magellan acts as the supply side of the blockchain-enabled marketplace for compute power now under construction. Accessing the testnet is extremely simple. Members of the group will download and install a desktop app, allowing their machines to run test compute jobs sent through Hypernet, directly from the Hypernet team. That’s it! This simplicity will remain once Hypernet launches in its fully developed form, and suppliers will be able to earn passive income by renting out the use of their machines. We are committed to delivering a network that places users first. Working with users, on both the supply and demand sides of the network, is therefore at the center of our product development process. This is where Project Magellan fits in.

Next Steps:
The Hypernet team is scheduling private early Alpha testing sessions with the Project Magellan group starting next week. Test jobs will be launched using the testnet on an intermittent basis for the next 2-4 weeks. Project Magellan will serve as a testing ground for added features, as we fit the various technical pieces of the Hypernet vision together.

Ultimately, tech is built by humans, for humans. We’re excited to embark on this journey toward enhancing both our technology and our user experience, alongside supporters from our community.

Team Building 2019

Hypernet team, Palo Alto, March 2019

Spring 2019 at Hypernet has seen the successful test of the computing protocol across 1000 machines around the world as well as the release of our first Alpha product for the compute demand side of our network. The team in California also moved from our house close to Cupertino (and the original Apple Garage) into an office in Palo Alto. We outgrew our original location as we began to scale up in order to bring full utility to the network, create a groundbreaking B2C product, and build our user base. While we are keeping our burn rate low during this phase of product development, the team has grown considerably over the past several months and we’d like to introduce you to our newest members!

John Drexler

John Drexler, Hypernet’s Chief of Staff, is focused on company operations and communication, team building, and product-market strategy. John joined us from the venture capital fund Sovereign’s Capital, where he was a Senior Associate working on the due-diligence/investment team and assisting with portfolio management. He previously served as a Management Associate at Siloam Hospitals (IDX:SILO) and as Marketing Manager on Summit Healthcare’s founding team in Jakarta. He studied economics at Covenant College. John has been absolutely critical to the reorganization of intra-company communication. Synergy between engineering, product development, business development, and marketing has been incredibly strong over the past months as a result of his efforts.

Conrad Bailey

Conrad Bailey is a full-stack engineer focused on Hypernet’s first Alpha product, the demand-side portal to the network. He arrived directly after earning his B.S. in computer science from the University of Notre Dame, but this wasn’t his first trip to Silicon Valley. In 2017 he was one of only ten students selected for the Silicon Valley Semester program, interning in Palo Alto while maintaining a full academic course load. During his internship he explored finely tuned, parallel solutions for large-scale numerical methods and wrote a linear algebra library for Prolog. At Notre Dame, he worked with the Cooperative Computing Laboratory to bring cloud functionality to WorkQueue, a framework for master-worker applications that can span thousands of cores. Conrad’s work with the engineering team greatly accelerated our progress towards the release of our first Beta, which we anticipate very soon.

Elliot Shohet

Elliot Shohet joined the team in February as a Blockchain Engineer. He hit the ground running and has been helpful on a number of different fronts. Elliot started programming at the age of 12, earned a computer science degree from UC Davis, and joined the blockchain space in 2013 when he first discovered Bitcoin. In 2017, he started smart contract development for the Ethereum blockchain, and he has served as a developer and advisor for blockchain companies. Elliot brings a variety of skills and experiences to the team. We’re thrilled to have him on board as we build the blockchain scheduler, a crucial component of the Hypernet marketplace for compute power.

Antoine Estienne

Antoine Estienne is an outstanding blockchain engineer who is, along with Elliot, helping to architect the blockchain piece of Hypernet. Antoine is passionate about decentralization and peer-to-peer technologies, as well as their implications for our lives. He discovered the power of blockchain while finishing his Master of Science in Paris and exploring crowdfunding technologies for renewable energy projects. Thanks to his dual American and French citizenship, he was able to work in both countries to acquire skills in this quickly growing domain.

Jennifer Hudson

Jennifer Hudson joined the team in February as our Communication and Market Strategist. She brings a social science background to Hypernet, with a PhD in political science from Columbia University, specializing in political theory and international relations. She is experienced with social-scientific tools for understanding market power dynamics and social climate, which helps her to effectively and strategically nurture relationships between Hypernet, its stakeholders, and its users. As communication lead, Jennifer is articulating and helping to implement a vision of the future in which our business model is aligned with users’ intentions, needs, values, and vision for their lives.

Iain Carson

Iain Carson joined Hypernet in March as Marketing Manager and Director of Content. Iain comes to the team with over 10 years of experience in film and television production, including work for CBS and Lionsgate, and over 5 years of marketing experience with The Walt Disney Company. He is excited to bring engaging multimedia experiences to the Hypernet community in the coming months. Iain will be developing visual content and working with other members of the marketing and communication team to manage digital marketing.

Dan Luo

Dan Luo is our most recent hire, whom we were happy to welcome to the engineering team this month. Dan studied computer science and Japanese at UCSB and then learned web development at the App Academy bootcamp. Dan is an outstanding front-end developer and will be focused on our Alpha product. He has hit the ground running and is already delivering major value to the team.

As you can see, our engineering, communication, and marketing teams are taking shape, and we’re extremely excited to have more capacity to build and successfully launch the product! Thank you to all of our supporters who have believed in us and our mission to democratize computing resources and make strides in responsible data science.

May 2019 Hypernet Product Update


Thanks to all of you who believe in our mission to democratize computing resources and make strides in responsible data science. To bring full utility to the network and provide a complete product, we are currently focusing on three main activities: technical development, compute buyer and seller acquisition, and the creation of a B2C product that will utilize the technology we have built so far. As planned, we are releasing our technology in stages. Q1 2019 saw the Alpha testing of our first application, the gateway to Hypernet for scientists and engineers seeking to access more compute power.

Alpha Product Development

  • Initial product, launched to Alpha testers in Q1 2019
  • Demand-side portal for the blockchain-enabled marketplace for compute power
  • As of May, user-centric product development ongoing, Alpha user base growing

The Alpha product serves as the launching pad for testing the Hypernet protocol before release and as a vehicle to onboard users with an unmet need: easily deploying code to another machine.

For those of you who have never attempted to spin up more compute power in AWS, Azure, or Google Cloud, we recommend you attempt it once so you can experience, firsthand, the complexity involved. Once you gain access to a cloud instance (or even someone’s computer), your code dependencies need to be installed on the new machine before you can send your script to be executed remotely. Hypernet’s Alpha product simplifies all of this.
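
To give a flavor of that complexity, here is a rough sketch of just the first stage of such a manual cloud workflow, launching a bare EC2 instance with AWS’s standard boto3 SDK. The AMI, key pair, and security group IDs below are placeholders, and everything after launch (installing dependencies, copying code over, retrieving results) still has to be scripted by hand.

```python
# Sketch of only the first stage of a manual cloud workflow, using AWS's
# standard boto3 SDK. The AMI, key pair, and security group IDs below are
# placeholders, not working values.
import boto3

ec2 = boto3.resource("ec2", region_name="us-west-2")

instances = ec2.create_instances(
    ImageId="ami-xxxxxxxxxxxx",            # placeholder machine image
    InstanceType="c5.4xlarge",
    MinCount=1,
    MaxCount=1,
    KeyName="my-key-pair",                 # placeholder SSH key pair
    SecurityGroupIds=["sg-xxxxxxxxxxxx"],  # placeholder security group
)
instances[0].wait_until_running()

# Even now the job isn't running: you still have to SSH in, install every
# code dependency, copy the project over, execute it, and copy results back.
```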

Business Development, Communication & Marketing

As development continues, the product team and engineers are prioritizing the features that will deliver the most value to our users.

Our user-centric, ease-of-use focus is one of the major differentiators between Hypernet and other blockchain-enabled computational projects.

Since users, their values, and their needs are at the core of our process, we’ve focused a lot of attention on user interviews. We are learning more and more about the fundamental problems our users need to solve, even before they consider remote code deployment as a possible solution. This research has been crucial for the purposes of product development, business development, and marketing / communication.

In April, we began to systematically categorize different user profiles so that we can become increasingly surgical in our approach to product development and feature prioritization. The BD team has made great strides in identifying which segments derive the most value from our product, as well as ways we can better serve them and locate more users with similar needs. User profiles, derived from in-depth interview data and associated market research, have also been critical for the generation of targeted communication and marketing. The marketing team has been working closely with BD to target these segments and onboard Alpha users. The close feedback loop between user interaction and product implementation is key to successfully bringing a technology to market.

Future Updates

Project Magellan, the future supply-side portal and the current gateway to the Hypernet testnet, is set to launch with our exclusive group of testers in the coming weeks. Stay tuned for updates on that front. Thank you for joining us on this journey!

The Black Hole and the Future of Decentralized Data Processing: Explained

Engineer and computer scientist Katie Bouman, PhD, and copious hard drives containing M87’s supermassive black hole image data

How did a telescope that looks more like a dispersed series of satellite dishes extract an image of the heretofore “unseeable”? The answer is data — masses and masses of it — and, more importantly, its processing and interpretation. Each node in the decentralized Event Horizon Telescope produced so much data that its analysis required the physical transportation, by plane, of half a ton of hard drives to a central supercomputer at the MIT Haystack Observatory.

Paradigm-shifting scientific discoveries, like the first-ever image of a supermassive black hole, entail the production and analysis of increasing amounts of data, often utilizing decentralized data collection as in the case of the Event Horizon Telescope (EHT). And yet, data processing is still carried out in a centralized manner, requiring impractical transportation and potentially insecure information storage at a single location. How can scientific discovery continue to progress if we don’t find a better way?

Hypernet technology could catalyze a shift from the current centralized/cloud processing model to a code-to-data paradigm for data analysis. In the future, the Hypernet protocol could facilitate the coordinated processing of data in situ, in the field, harnessing local compute at field sites to create a globe-spanning supercomputer to match a globe-spanning telescope.

To understand the relationship between the Eye of Sauron picture and data processing, we should first consider what an image actually is. To begin with, it’s fairly inaccurate to say that this image — or any image — was “captured.” Images are not things that exist in nature, separate from human interpretation. Our eyes and our brains work together to gather, process, and reconstitute data to create things that human beings understand as images. Similarly, cameras are devices for gathering and processing information, which is then reconstituted into recognizable image form.

The Event Horizon Telescope is a decentralized means for gathering information in the form of radio waves, emanating from the ring of infalling material surrounding the black hole. The collection of massive amounts of data, from 7 different observatories around the world, was required to sort out the interactions between the radio waves and translate the resulting pattern into image form. Coordination of many far apart radio telescopes provided the necessary magnification and resolution that would have otherwise been impossible, but it also meant the data needed to be combined and analyzed before it could be reconstructed as an image. Processing relied on an algorithm, CHIRP (Continuous High-resolution Image Reconstruction using Patch Priors), developed under the leadership of engineer and computer scientist Katie Bouman in 2016.

The EHT collected the radio wave data in April 2017, after which it was transported and processed, and the image was presented to the public in April 2019. A non-trivial amount of time was lost because hard drives from the South Pole were stranded due to weather. But what if there were a way to bring the algorithm to the data and synchronize the processing, and not just the collection? This is Hypernet’s vision of the future.

We’re back from outer space…


All of us in the blockchain space are familiar with the boom and bust that has played out since 2015, when Ethereum first facilitated and unleashed the ICO mania that eventually hit the wall — hard — in 2018. Many technology and economics experts, and not just the skeptics, saw this as a predictable outcome, but their analytic frameworks also suggest that there’s good news ahead for companies that make productive use of blockchain technology. This is excellent news for the Hypernet community. We will further discuss the economic context and theories below, but first, we have an important update and announcement!

As you know, first and foremost, Hypernet is on a mission to create a world-changing product. This product-centric thinking is the major determining factor in all of our decision making, communication, information gathering, and idea building. Our community has been an excellent source of beneficial ideas, and we’re now in a position to solicit your feedback on the actual products we are striving to build via the creation of a private test group. We are extremely excited to begin sharing more details and insight, as well as an eventual test product for you to download.

Welcome to Project Magellan.

Here’s an edited screenshot of our first test product, which we expect to place in your hands later this spring:

Project Magellan membership will give you access to exclusive information regarding the product pictured above, as well as the implications of Hypernet technology for researchers and data scientists around the world. Prospective members should complete this application as soon as possible to secure a spot.

Candidates will receive confirmation upon application submission and receipt by the Hypernet team. The application will remain open for two weeks (until March 11th). The open application period will be followed by three days for review. All approved members will be added to Hypernet’s Project Magellan group on March 14th. As we already noted, group members will have access to privileged content, eventually including the prototype portal to Hypernet’s test net, pictured above.

Note that space is limited. For now, it is important that the minimum necessary number of people handles the product as we roll out new features and stress test the network. We will be adding more applicants from the wait list as their devices and user profiles become applicable and essential for new tests, so you will want to submit your application as soon as possible.

Thank you for standing by us over the past months and trusting that sometimes ‘no news is good news.’ The Hypernet team has been hard at work, making strides in all facets of our operation. Much of this work must take place behind the scenes in order to ensure a successful product launch and token distribution. Strategic timing is crucial.

Now back to theory… The Gartner Hype Cycle for Emerging Technologies, first articulated in 1995, describes a now-familiar storyline and helps to contextualize this moment in blockchain and Hypernet history:

If the 2008 Bitcoin whitepaper marks the beginning of the blockchain cycle, the jury is still out as to whether we’ve reached the dreaded “trough of disillusionment.” At the same time, there’s a certain amount of hope built into this model. Running parallel to the “hype cycle” logic, former Stanford researcher Roy Amara’s “law” states, “We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run.” It makes perfect sense that initial vague pronouncements about the impact of any new technology would drive overestimation and trigger disappointment. Underneath the hype, though, engineers would be using the new technology as the basis for real products and infrastructure, which might only become available post-disillusionment. This dynamic almost guarantees that the eventual, effective implementation of the original breakthrough will be underestimated.

The turmoil has a positive externality, however, and that is natural selection. In Capitalism, Socialism, and Democracy (1942), Austrian economist and political theorist Joseph Schumpeter described the progressive evolution of the capitalist economy as a “perennial gale of creative destruction.” The idea is that a society cannot reap the benefits of technological advancement without accepting the fact that there will be some losers. The weak must be left by the wayside as the most productive actors contribute to the growth of the system as a whole. This is the place of Hypernet in the blockchain space — we are building a product that ultimately strengthens the entire crypto community because it allows society to harness the potential of blockchain technology. Creative destruction describes the process of separating the projects that are pure hype from the real technical progress that is left standing strong after the dust settles. And real technical progress requires real, sustained, less-than-glamorous hard work.

We are excited to embark upon this journey with our Project Magellan members and our wider community. We value you as the original true believers in Hypernet, and we want you to be among the first to pioneer our cutting edge technology. Together we will circumnavigate the globe with compute power.

Apply here: https://goo.gl/forms/hQzIa2fh2ifV4suo2

The Future of AI: A Discussion with Dr. Kai-Fu Lee, Dr. Erik Brynjolfsson, and Dr. Susan Athey


The future of AI is unclear, but one thing is certain: companies that perfect the use and implementation of AI software will become the most valuable companies in the world. This is according to Dr. Kai-Fu Lee, founder of Sinovation Ventures.

Dr. Lee joined world-renowned MIT researcher Dr. Erik Brynjolfsson and Stanford economist Susan Athey for a discussion at the Stanford Graduate School of Business last week. Hypernet attends these events to explore partnership opportunities, source clients and talent, and ensure that we stay on the cutting edge of research and technology developments. This event was exceptionally fruitful in each of these areas.

The conversation between these scholars was deep and insightful. Although they touched on many subjects, three points stood out:

  1. The field of AI is not nearly as mature as most people believe.
  2. China could easily overtake the US’ dominant position in AI, and technology in general.
  3. With AI displacing more jobs than it creates, companies will have a greater responsibility to maintain societal welfare.

The field of AI is not nearly as mature as most people believe
Both Lee and Brynjolfsson agreed that popular media outlets (and, as a result, most people) have a poor understanding of where we are in terms of AI development. When people talk about AI, they are typically referring to AGI, or Artificial General Intelligence. This is the type of science-fiction AI that is capable of many tasks using different skills. Although this is where the field is heading, we are currently very far from that vision. Brynjolfsson was exceptionally antagonistic toward the idea that AI is nearing human intelligence. He views AI’s current state as an increasingly sophisticated ability to recognize patterns and predict outcomes, a far cry from humanoid robots that can think like we do.

China could easily overtake the US’ dominant position in AI, and technology in general.

Lee’s company, Sinovation Ventures, has invested in over a dozen Chinese unicorns, and 300 companies overall. He believes that while the US is operating at full capacity in terms of AI development, China is just getting started. Chinese companies like Alibaba, Baidu, and WeChat have enormous growth potential, given a market of 1B+ people. These companies operate very similarly to their US counterparts (Amazon, Google, and Facebook, respectively) but manage to avoid R&D costs by copying successful business models. Furthermore, to many people’s surprise, China has more AI developers than the United States. Lee admits that all of the best programmers are in the US. But, he argues, companies don’t need the best programmers — they need a larger quantity of sufficient programmers in order to move their business forward. If this is true, then companies like Alibaba, Baidu, and WeChat may overtake their American counterparts in the next 10–20 years.

With AI displacing more jobs than it creates, companies will have a greater responsibility to maintain societal welfare.

Of course, one reason that companies are investing so heavily in AI is that it can be used to replace human workers. Although companies frequently claim that AI will create more new jobs, each of the panelists agreed that AI will displace many more jobs than it creates. This means that society will have to find new ways to deal with higher levels of unemployment, and the solutions may need to come from the companies themselves. This issue seemed to have the least clear solution of all the topics discussed. Ideas like low-income corporate housing, universal basic income, and heavier taxation have all been lightly explored, but until this becomes a bigger and more immediate issue, companies are unlikely to experiment with innovative solutions.

Hypernet is a team of computer scientists out of Stanford University developing a groundbreaking new parallel programming model for AI applications. We are attending, hosting, and speaking at new events every month! Be sure to subscribe to our Telegram group chat and email newsletter to stay up to date with our progress.

– Your HyperTeam
Palo Alto, CA

2018 North American Blockchain Expo Recap


The 2018 North American Blockchain Expo made three points very clear:

1. Blockchain is here to stay

2. The field is evolving quickly

3. The field is still young and has a lot of growth potential

With over 13,000 attendees and nearly 400 exhibitors, the 2018 North American Blockchain Expo was the largest blockchain expo of the year. As an epicenter of corporate innovation, the conference attracts some of the world’s largest companies. This year, there were representatives from IBM, Cisco, Intel, Google, Tesla, BP, Disney, Boeing, GE, Target, Mercedes-Benz, and many more. These companies come to see the new ideas being generated by startups — and new ideas were everywhere.

The trendiest topic at the expo this year seemed to be low-energy sensing and analytics. The chip manufacturer ARM was showcasing its new low-power operating system for use in distributed smart devices. Another project was exploring how to move an OS across smart devices. And PsiKick, one of Hypernet’s Silicon Valley neighbors, is producing batteryless sensors that can be deployed worry-free to collect data over long time periods in remote places.

These low-energy plays all indicate an industry-wide shift towards IoT technology and analytics as companies strive to collect ever more data in a cost-effective manner. Hypernet is well-positioned to capitalize on this industry shift by providing a uniquely robust method for analyzing the tidal wave of data that will be produced by these low-energy IoT devices.

As expected, the expo also featured many companies that were still defining their value and differentiation in the market. It seemed like you could arrange the words AI, neural networks, deep learning, and blockchain in any order and there was a company there with that name. Other companies were clearly doing their best to rise to the top of an overcrowded sector. Blockchains backed by precious assets (gold, diamonds, silver…) were around every corner of the expo auditorium. While the value proposition of this niche is clear and intuitive, realistically, this sector can only support a handful of companies. This market saturation is a good sign for the industry as a whole, because it indicates that there is a lot of potential for blockchain technology.

In any industry, intense competition produces winners and losers. In fifty years, most of the startups featured at the 2018 Blockchain Expo will no longer exist. However, a select few will likely stick around. These are the projects currently focusing on product development. Creating a fantastic, user-friendly product that solves a real-world issue is the only long-term path to success. It is exciting to think that those companies — the ones currently in the throes of product development and going to market — might become the next Google, Amazon, or Facebook, and usher in the new technological paradigm made possible by blockchain technology.

Technology and Trust: Conversations at Stanford University


Members of the Hypernet team recently attended an event at Stanford University focused on the intersection of trust and technology. The event was led by Janine Zacharia, former Middle East correspondent for the Washington Post. Zacharia was accompanied by Dr. Jeffrey Hancock, a researcher who studies lies and deception, and Dr. Sharad Goel, whose background is in software and statistical analysis. While the discussion was highly interdisciplinary, there were great insights regarding the role blockchain could play in society, and how it is a natural evolution of trust-based interactions with technology.

It’s hard to overstate the significance of the “trust-revolution” in technology over the past 20 years. In 2007, when Facebook first became popular, many pundits claimed it would never succeed because people would never allow their personal information to be put online for anybody to see. Now, it seems the pendulum has swung in the other direction. Contemporary social media users do everything they can to attract friends and followers, displaying full trust in the Facebook platform.

With eBay and PayPal there were similar arguments. In the early days of eCommerce, purchasing an object online with a credit card was perceived as a high-risk activity. At the time, we did not trust those systems or organizations with our personal financial information. Contrast that with Singles Day 2018, when nearly $25 billion in transactions was conducted on Alibaba alone.

More recently, societal expectations of trust have adapted to accommodate “the sharing economy.” Airbnb and Uber each faced the same criticisms as Facebook and PayPal. Way back in 2010, shortly after Airbnb was founded, it seemed impossible that normal people would turn their homes into hotels where online strangers could enter and leave as they pleased. And growing up we are all told never to get into a stranger’s car — so how could Uber ever succeed? Thoughts like these seem oddly quaint and absurd today, but that is how most of us thought less than 8 years ago.

Now, we are once again at an inflection point as we learn to trust a new type of technology: blockchain. One of blockchain’s most attractive features is the ability to verify transactions in a trustless manner. However, users must still become comfortable psychologically while using the technology — even if the technology is more secure than the systems we already use every day.

Fortunately, previous technologies built on trust (Facebook, PayPal, Airbnb, Uber, etc.) are great case studies to learn from. Hypernet is already researching how to build a trusted platform that helps users become comfortable with our technology. This will be an important component of our business strategy for adoption and growth of the network. This, along with our core IP and friendly user interface, will make Hypernet the first choice for peer-to-peer computation.

Hypernet & ODX: Using Blockchain for Social Impact


Last week, Hypernet outlined what a healthy strategic partnership looks like by exploring four main principles. Partnerships should:

  1. Address a real problem
  2. Create clear strategic value
  3. Promote social good
  4. Focus on the end-user experience and adoption

After exploring these ideas, we announced our partnership with ODX. ODX is bringing sustainable decentralized internet to developing countries. With millions of devices running ODX’s software, this partnership gives Hypernet access to those devices as well. In simple terms, our vision is for device owners to be able to connect their devices to Hypernetwork through the ODX app. The end result is that device owners will earn internet time for connecting to Hypernetwork, and the protocol will be adopted in a strategic part of the world. The symbiotic nature of this partnership is clear, but we are also excited for another reason.

At Hypernet, we believe that blockchain technology has an incredible capacity for promoting social good. The decentralized vision of blockchain is also a moral stance against the monopolization (and monetization) of information.

However, any vision for the future of blockchain is currently attainable only by people with access to the internet. Internet access in developing countries is prohibitively expensive and out of reach for most populations. One of the qualities we find most attractive in ODX is their sustainable model for providing internet to those who otherwise couldn’t afford it. By making the internet accessible to new populations, ODX and Hypernetwork provide people with new hopes, dreams, and opportunities. We’re all familiar with the saying, “Give a man a fish, feed him for a day; teach a man to fish, feed him for life.” Access to the internet is the 21st-century version of fishing.

For this reason, Hypernet is doing what it can to help get people online, and realize the promises of blockchain technology in a healthy, socially beneficial, and decentralized manner. If you also believe in this vision, we invite you to partner with us as we continue to make blockchain a force for good, and use this technology to improve the lives of millions of people. Together, we can use blockchain to make the world a better place for all of us.

-Your HyperTeam
Palo Alto, CA

Have thoughts to share? Continue the conversation in our online group at: t.me/hypernettoken

Strong Partnerships for Good Causes


Decentralized, distributed computing and cryptocurrency blockchain projects share the grand goal of breaking up monopolies in a variety of sectors in order to enable the free sharing of ideas and unleash boundless creativity on the part of individuals. Of course, all of this rides on the assumption that people will use the technology, which so far remains an open question. Also worth considering is the degree to which individuals can be empowered in this way when only slightly more than half of the world’s population has access to the internet in the first place.

The spread of mobile phone infrastructure, and cheap phones and plans, has recently been the most powerful driver of increased internet access in many parts of the world. It seems logical, then, that any plan to address the so-called global “digital divide” should also take into account the particular way in which mobile phone usage might shape the internet. As Hypernet works to change the world through parallel distributed computing, the spread of mobile phones represents an extraordinary opportunity. Hypernet’s technology is built to harness the latent computing power newly available on millions of mobile phones all over the world.

At the same time, certain other actors are taking advantage of the global mobile phone revolution: the tech giants that offer free data usage on mobile phones in several countries. With net neutrality top of mind, this should be a source of worry and consternation for everyone in the crypto space and, more generally, for everyone who cares about freedom of information. Many individuals in developing countries know of the internet only through the corporate brand that provides it to them in exchange for their personal data. If you care about the free market of ideas, this should immediately set off alarm bells.

The way in which tech platforms can shape the way we understand new information has been a major topic of discussion in the United States since 2016, when it was argued that large tech companies could actually skew democratic elections. The problem may be getting worse rather than better, and it’s not just Americans who need to worry.

Duterte in the Philippines may owe his controversial success to his online media campaign. It’s not just about winning elections, either. His critics charge him with spreading false information online in order to influence policy, or more specifically, to “fuel” a violent drug war.

Myanmar is another country in which free access served to bring an entire population online, suddenly and all at once, with the result that a single service provider has total power over information delivery. Instead of a drug war, this configuration has contributed to large-scale human rights violations and what many are calling acts of ethnic cleansing against the Rohingya Muslim population.

In short, it’s an understatement to say that we should be concerned about the incredible concentration of online information distribution. Ever since Marshall McLuhan declared “the medium is the message” in the 1960s, media theorists have recognized the importance of the delivery platform in the shaping of the very meaning of content. This point takes on absolutely crucial significance, though, when one organization or delivery platform usurps monopoly power over the dissemination of news.

The only solution to the problem created by concentrated data control is making the internet, and the flow of information and communication that goes with it, indiscriminately accessible. The free flow of ideas and information is a democratic ideal of the highest order. Since the internet is today’s most widely used channel of communication, we should work to protect it from distortion and bad actors, to ensure that it remains neutral.

Given that mobile phones are the most commonly used devices for accessing the internet worldwide, Hypernet would be perfectly suited to partner with other organizations seeking to provide internet in a fair, uncensored, and equitably accessible manner.

Imagine an organization built on the explicit mission of democratizing internet access for everyone on the planet, of offering the opportunity and possibility of the internet without the harmful effects of the single provider system. This would be the pinnacle of open and equal information sharing. Now imagine that this organization is already reaching millions of users through software on millions of devices. Hypernet would be a natural and beneficial addition to an already symbiotic relationship. Hypernet would be able to provide the necessary computational component for a decentralized internet, and the owners of mobile devices would gain the capacity to monetize their devices, in addition to accessing the internet.

These are precisely the types of partnerships that Hypernet is interested in forming, partnerships that create value for everyone involved while preserving the values and vision of a decentralized internet and the free circulation of ideas.

Technology for Freedom, Against Weaponized Data


AI purports to know you better than you know yourself, but an algorithmically generated picture, while claiming to be an accurate reflection, can easily become a self-fulfilling prophecy. Political commentators lamenting the effects of Facebook echo chambers identify various causes of the problem: the fact that the extremists can easily find and validate each other online, the lack of heavier “policing” of content, etc. These effects are clearly amplified by a technology that uses your data to categorize you and then uses psychology to push you further in that same direction. In this sense, the problem isn’t simply privacy but also freedom — the freedom to decide who you are.

One way to address the issue is through law, a strategy that Apple CEO Tim Cook recently lauded at the 40th International Conference of Data Protection and Privacy Commissioners, held at the European Parliament in Brussels Oct. 22–26. Another path would be to change the nature of the technology itself. The development of freedom-enhancing technology, which would weaken the death grip of Amazon and Google on data storage, is a central goal of the distributed computing revolution. But only Hypernet has built a radically new computing architecture that can fully liberate us from corporate-controlled data centers, because it is the only distributed computing project that does not ultimately rely on the cloud, and thus on data centers, but rather on a decentralized network of individuals, all the way down.

Cook argued for the prioritization of four key principles, cited by TechCrunch as data minimization, transparency, the right to access, and security. Hypernet offers a key technological advance in the area of minimization: companies will be able to use data that is de-identified from the customer and that is never actually collected and held.

While the blockchain and cryptocurrency space, in general, rightly lays claim to the moniker of freedom-enhancing technology, it has done so mainly by offering an alternative to the traditional centralized financial system and its players. The result is a somewhat navel-gazing obsession with the technological basis of currency, without further interrogation of its purpose or use. This, arguably, set off the great crypto collapse of 2018, when an over-inflated and solipsistic market was finally forced to reckon with the fact that it was built upon sand. Against this trend, Hypernet matches the promise of financial decentralization with a corresponding deliverance from data monopolies.

Pushing back against those who claim that restrictive privacy laws rob tech of the capacity to reach its true potential, Cook countered that our true technological potential can only be reached by working with, rather than against, the trust of our users and of humanity. Hypernet responds to this call to action by offering new solutions that actually correspond to human values and needs.

Ultimately, technology is a tool, built by humans, for humans. To claim that technology “needs” to be liberated from human concerns and values in order to achieve its potential is nonsensical and lazy. Instead, as creators, we need to develop the technological methods to navigate competing societal priorities and to correct course following the revelation of negative externalities. Done properly, this should not compromise technological progress and, more importantly, user experience, but only enhance it.

Hypernet Update

The mobile Hypernet demo being prepared for a conference.

Where we are now —

The past few months at Hypernet have been incredibly exciting. The team was able to focus almost exclusively on meeting key technological development goals, as set forth in the roadmap white paper. We are now happy to announce that we have more than met our expectations, having produced functional first versions of the Hypernet blockchain scheduler, API, and marketplace lobby.

Milestones:

1. Blockchain Scheduler MVP; Status: COMPLETE
Completing the blockchain scheduler required building the contract issuer and the seller filter, and carrying out allocation testing. This was completed in the spring, ahead of schedule.

2. Consensus API MVP; Status: COMPLETE
Completing the API required building the P2P layer, the topology layer, and the consensus protocol. This was completed in early summer, ahead of schedule.

Milestones 1 and 2 can be seen in action in our demo video, which can be viewed here: https://youtu.be/cLBH0kOz5hc?t=35

3. Compute Supplier Lobby MVP; Status: COMPLETE
This lobby is where machines are rewarded for making themselves available for use on the network. This was completed on time, in late August.

While the pendulum of crypto hype and hate continues to swing, Hypernet persists in following our strategic plan toward the creation of a real product, for use by real people. Blockchain’s popularity and success were built on the promise of greater freedom from the concentration of wealth and power in traditional financial markets, but its reputation founders on the speculative craze that arrived in its wake. Speculation is parasitic on the free market, the real purpose of which is to provide a framework for the production and distribution of goods and services that meet real human needs. The Hypernet blockchain scheduler serves an authentic, non-speculative purpose as the connective tissue between buyers and sellers of computing power.

If the scheduler brings together buyers and sellers in the marketplace, the Hypernet API is responsible for producing the goods. This is the revolutionary technological foundation that allows for distributed, truly parallel computing, and it now exists in the real world. As you can observe in our preliminary public demo, the scheduler generates smart contracts according to the specified needs of the buyers and then matches available sellers with current demand. The API facilitates the completion of parallel distributed computing tasks.
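To make the matching step concrete, here is a toy sketch of the kind of pass a scheduler might perform. This is our own illustration, not Hypernet’s actual scheduler: the job specs, seller attributes, and the best-paying-first rule are all invented for the example.

```python
def match(buyers, sellers):
    """Toy matching pass: each job is paired with available sellers that
    meet its minimum specs, until its requested node count is filled."""
    contracts = []
    available = list(sellers)
    for job in sorted(buyers, key=lambda j: -j["price"]):  # best-paying jobs first
        chosen = [s for s in available if s["cores"] >= job["min_cores"]][: job["nodes"]]
        if len(chosen) == job["nodes"]:
            contracts.append({"job": job["name"], "sellers": [s["id"] for s in chosen]})
            available = [s for s in available if s not in chosen]
    return contracts

buyers = [{"name": "sim_run", "price": 8.0, "min_cores": 4, "nodes": 2}]
sellers = [{"id": "s1", "cores": 8}, {"id": "s2", "cores": 2}, {"id": "s3", "cores": 4}]
print(match(buyers, sellers))
# [{'job': 'sim_run', 'sellers': ['s1', 's3']}] -- s2 lacks the required cores.
```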

While other blockchain projects may claim to deliver distributed computing, they are hobbled by limitations resulting from their programming models. Crucially, they can only solve a restricted set of computing problems, and they are too rigid to tolerate the fluctuation of machines dropping in and out of the network. They thus rely on the old, familiar data-center setup and are most accurately described as makeshift workarounds rather than evidence of a paradigm shift.

Instead of tinkering in the margins, the Hypernet team reconceived and built, from the ground up, an entirely new computing architecture — the Hypernet API, based on the principle of Distributed Average Consensus (DAC). Instead of covering over old server farms in the shiny veneer of blockchain, Hypernet engineers really grappled with the promise of consensus-based computing and brought it to its logical conclusion. The two key layers of Hypernet technology, the (on-chain) scheduler and the (off-chain) API, mirror each other. Both operationalize the contemporary ideal of networked power, generated via dynamic circulation and consensus, and untethered from the stationary and inflexible external referent of the data center.
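For intuition, here is a minimal sketch of the general idea behind distributed average consensus. This is the textbook form of the technique, not Hypernet’s proprietary implementation: each peer repeatedly nudges its local value toward those of its neighbors, and the whole network converges on the global average with no central coordinator.

```python
def distributed_average_consensus(values, neighbors, rounds=100, step=0.3):
    """Toy gossip-style averaging: every peer repeatedly moves its local
    value toward its neighbors' values. On a connected network this
    converges to the global average with no central server involved."""
    x = dict(values)  # peer id -> current local value
    for _ in range(rounds):
        x = {
            peer: val + step * sum(x[n] - val for n in neighbors[peer]) / len(neighbors[peer])
            for peer, val in x.items()
        }
    return x

# Four peers in a ring, each holding a private starting value.
values = {"a": 10.0, "b": 2.0, "c": 6.0, "d": 4.0}
neighbors = {"a": ["b", "d"], "b": ["a", "c"], "c": ["b", "d"], "d": ["a", "c"]}
print(distributed_average_consensus(values, neighbors))
# Every peer ends up near 5.5 (the global average) having only ever
# exchanged values with its immediate neighbors.
```

Because no step depends on a fixed global topology, peers can join or leave between rounds without faulting the computation, which is what makes this family of algorithms a natural fit for a fluid network.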

Clearly, a major source of fuel for crypto-skepticism is the lack, thus far, of the perfect use-case. With only several hundred daily users, CryptoKitties is the top consumer application on the Ethereum blockchain. Given this state of affairs, a healthy dose of skepticism seems warranted. At the same time, the history of technology and economic development is littered with examples of inventions that sat dormant for years before appropriate use-cases and complementary technologies were found. When electricity replaced steam as an industrial power source, it took thirty years before the reverberations could be felt in the form of productivity gains. This is because industrial architecture, centered around a steam-powered axle, was only gradually replaced by new configurations that could exploit the advantages of electric power.

Hypernet matches revolutionary technology with real user need by providing the technological architecture necessary to harness the innovation. It is the result of years of research, testing, and optimization. And with the on-schedule arrival of all planned deliverables over the past three months, it’s not just a thrilling idea; it’s concrete reality.

Hypernet Represented at CDAO Conference

By | Uncategorized

Recently, Hypernet’s Head of Business Development, Fernando Fuentes, and Head of Strategy, Samy Biyadi, attended an exclusive conference for Chief Data Officers and Chief Analytics Officers of Fortune 1000 companies. While there, they built relationships with CDOs and CAOs of large corporations and continued to establish Hypernet as a leader in corporate data analytics. Here is Fernando’s recap of the event:

Hypernet’s Business Development and Strategy leads recently engaged in in-depth conversations with chief executives of Fortune 1000 companies at the Chief Data & Analytics Officer (CDAO) Exchange in Chicago, Illinois. The Exchange is an invitation-only event put together by the International Quality & Productivity Center (IQPC); it is not your standard corporate conference, but rather an intimate gathering of key decision makers in the field of data analytics.

Two key differences from your standard conference or expo are:

  1. Petit comité: fifty carefully selected chief executives mingle with twenty invited vendors. Standard conferences and expos typically attract hundreds or even thousands of participants. The more intimate and direct setting of the petit comité is a great fit for Hypernet’s current position in the startup life cycle.
  2. Caliber of attendants: The CDAO Exchange brought together key stakeholders reporting directly to the Chief Executive Officer, or one step removed, at many of the world’s largest companies, including AXA, Allstate, Charles Schwab, Comcast, Equifax, Ford, GE, Hulu, Morgan Stanley, Nike, Northwestern Medicine, Principal, Prudential, State Farm, T-Mobile, Disney, and more. A key attendance requirement was having at least $1M USD in budget under direct control to spend on data & analytics projects. Additionally, these executives and their teams spent considerable time ahead of the conference thoroughly describing their main data, analytics, and IT infrastructure pain points, reducing the time to insights when conversing with vendors like Hypernet. Needless to say, the content of the different speaking engagements at the Exchange was enriched by these executives’ decades of domain-specific experience and their bird’s-eye view of the interplay between data, analytics, IT infrastructure, human resources, and corporate politics.

In this post, the team would also like to share learnings from past roles that were reaffirmed at the Exchange, along with completely new learnings arising from intimate one-on-one meetings, speaker sessions, roundtable discussions, and informal conversations. These learnings generally fall into two categories:

(1) Data, analytics, and IT infrastructure as organizational functions

(2) The human factor: the people and social groups driving innovation in these areas + change management

Category (1) — Learnings from the Data, Analytics, & IT Infrastructure functions at these organizations:

  • Value is hard to quantify in the data & analytics industry. Vendors that can quantify value, and develop metrics to track this, will stand out from the competition.
  • Upfront investments in technology infrastructure and personnel are considerable, but longer-term returns on investment are difficult to quantify.
  • The industry and the media love to focus on exciting new technical solutions, like AI, and tend to forget that these large organizations have less exciting, but equally (if not more) valuable needs. For example, ‘replatforming’ is a common main challenge for large corporations nowadays. ‘Replatforming’ can mean many things: old to new DMS or on-premises to cloud IT infrastructure.
  • There are multiple challenges upstream in the data journey, even before getting to the more talked about analytics challenges, like data management, curation, ETL processes, etc.
  • A hot topic at the event was integrating geospatial and social media data into predictive analytics and customer 360 models.
  • All executives in the healthcare industry expressed that a main data management challenge is that data comes from disparate sources and in heterogeneous formats (image vs text, digital vs analog).

Category (2) — The Human Factor of Driving Data, Analytics, & IT infrastructure change:

  • Data & analytics teams at large organizations are organized in one of two ways, each with its advantages and disadvantages:
    (1) Hub & spoke: a central data & analytics function on which all other departments/functions rely for reporting and insights.
    (2) A data & analytics function cutting across all businesses: the disadvantage is role redundancy; the advantage is that this model typically creates more data & analytics value and keeps it top of mind across the company, rather than an afterthought.
  • Challenges managing change (inside the D&A team but also organization-wide) with new data & analytics initiatives.
  • Data scientists lacking business mindset & knowledge on how organizations operate.
  • Tough for data science teams to show how valuable their conclusions are and translate them to action items in the company’s agenda that result in tangible business impact ($ in additional revenue, $ saved).

Sending our Business Development & Strategy leads to represent Hypernet at the Exchange is in line with the bigger picture we have designed for these business functions over the next couple of quarters. Hypernet’s competitive advantage is that it has the best programming protocol for decentralized compute in the world. Reputable stakeholders of different kinds have gone under the hood and emerged excited for the future ahead of us. It is important to emphasize two things: (1) our core intellectual property is independent of the blockchain scheduler, can be implemented in private, and is base-layer agnostic (i.e., it can be deployed on Ethereum or another blockchain); and (2) this core IP surpasses, on most criteria, competing programming architectures developed at blockchain startups, traditional startups, and corporate innovation labs. In other posts, we have extensively addressed these key differentiating criteria. In this post, we hope we were able to shed some light on what the Strategic Business Development team here at Hypernet has been working on, and will continue working on, until the next major corporate planning event.

-Fernando Fuentes de la Parra,
Hypernet Head of Business Development

Planning for Success

By | Uncategorized

Launching a startup (a new protocol or traditional company) involves a lot of moving parts. This is especially true in the blockchain space, where teams must deal with all of the normal startup tasks in addition to conducting a token sale. It’s tempting to focus on the glamorous parts of a token sale, and to use those as the basis for determining the strength of a project — “Does this project have enough influencers supporting it? Did it receive a high token metrics score from my preferred source?” — and many projects indeed expend a lot of resources chasing these vanity metrics, mistakenly thinking that these metrics constitute the foundation of a successful and sustainable protocol.

However, the top three challenges for growing startups have little to do with these types of issues. The top three challenges are: efficient organization, sufficient planning, and effective managing. Somewhere between 75 and 90 percent of all startups fail, most of them due to shortcomings in these areas. Normally, startups only recognize these issues when it is too late, when a small crack in a startup’s foundation suddenly splits the whole organization apart. In the blockchain world of glitz, glamour, and cash, we believe it will be the conservative and focused startups, like Hypernet, that have the best opportunity to succeed in the long term.

Hypernet is focused on the fundamentals of running a successful startup. These fundamentals do not change just because a team is operating in the blockchain space. In fact, they become even more important. At Hypernet, we are actively planning and organizing ourselves to execute our vision and combat the top challenges that handicap the growth of early-stage startups.

We’d like to share what this looks like with our community. It is important to us that you have a sense of what is going on in the background at Hypernet and that you have a better idea of what to look for in other token projects. It is our goal to help move the space forward by supporting protocols that focus on sound business fundamentals and a fantastic long-term vision.

As a startup grows, continually optimizing the organization of the startup is critical. Efficient organization can turn 50 man-hours into the equivalent of 100 man-hours. This is accomplished in a couple of different ways. One is by further defining roles of employees within the startup as it grows. At the beginning, startup employees all wear different hats. This is a necessary phase, but it is important to quickly move past it. A lack of defined roles leads to neglected responsibilities, and it makes hiring and scaling difficult and disorganized. By defining roles and responsibilities, a startup is also helping its different branches to collaborate in unison. For Hypernet, this means aligning our tech, business development, marketing, corporate organization, and executive teams on vision, goals, and plans for execution. It’s important to organize early, and organize often, because after a certain point, no amount of funding or technical development can fix a disorganized and inefficient organization. Increased efficiency is also a product of well-defined communication and management pipelines. Installing effective protocols for routing communications inside and outside Hypernet is a great investment in our own future and will minimize the growing pains of scaling up.

Strategic planning is another element often overlooked by startups. In most failed startups, strategizing is taken for granted or viewed as a natural part of the development process, rather than understood as a focused task that deserves research and preparation. One key component of strategic planning at Hypernet is the ongoing process of market exploration. We have developed a remarkable piece of technology, DAC, which enables huge technological advancements in many fields. Market exploration helps us to narrow down the long list of possible use-cases and pick out the ones which are most strategic. Taking the time to find the perfect first use-case is what sets startups on the path to success; skipping it leads to delays, false starts, and blind pivots from one failure to the next. At Hypernet, we always do our due diligence to ensure that we remain on the right path.

Marketing is similar to market exploration, in that strategy leads to efficiency. Attracting a concrete base of customers and users will help the product see long-term and widespread adoption. The blockchain community can provide a great initial boost to promising projects, but that is only the first small group of people who will eventually become involved with successful projects. The value of marketing lies in attracting real users to the network, who are using it to solve real-world problems.

Overall, applying sound business fundamentals is what will continue to move the crypto space forward, and help us all to enjoy a distributed future. It sometimes seems as though “blockchain companies” have constituted a business category of their own, with its own norms and expectations. Hypernet is choosing to put itself in a different category, the category of teams that make ethical business decisions, based on sound and well thought-through strategy, in order to build lasting solutions that strive to do more every day, year after year. This is a huge undertaking that will stretch out over a long period of time, but with the solid foundations we construct today, we are confident in our ability to deliver on our dreams for tomorrow.

-Your Hyperteam,
Palo Alto, CA

Token Sale Updates

By | Uncategorized

Friends and supporters of Hypernet,

Thank you for your continued enthusiasm for our project. We appreciate the community’s strong interest in our token sale and whitelisting procedures. As you know, we continue to work hard to turn Hypernet into a success and to keep you, the community, up to date as we progress. The HyperToken is very important to us; it represents the means to unlock the next wave of peer-to-peer, in-network analytics. Ensuring that we launch the tokens when they will see maximum adoption and utility is our top priority.

We know you are all waiting to receive news about the specific dates of Hypernet’s public token sale. However, after listening to the community, gathering partner input, and reflecting carefully, we have determined that in the current environment it is not beneficial for the community, nor for the adoption of the protocol, to initiate a public token sale or whitelist at this time.

While we had contemplated holding our token sale this month, we will proactively postpone this date. Although we legally cannot guarantee that we will have a public token sale, it remains our intention at this time. We will carry out a token sale as soon as it is in the best interest of the protocol and its stakeholders, and this decision was made with the interests of everyone in our community in mind.

That being said, we greatly appreciate the support leading up to this announcement, and would like to provide you with some information that we believe you will find helpful and interesting. Please read below to see select details on our token metrics, KYC overview, and plans regarding how to begin using Hypernet and acquiring HyperToken.


Long Term Focus of Hypernet: We believe delaying our sale is another example of Hypernet’s desire to make decisions in the long-term interest of the protocol and token holders. Other examples of this long-term focus include:

1. We have been selective with token purchasers to date. 
Only 4% of applicants have been granted allocations in the private sale. We oversubscribed our softcap, while purposefully leaving room for the community. Although there have been many more requests for allocations than we could accommodate, Hypernet has a strict due diligence and KYC process, and even stricter internal criteria for choosing strategic purchasers.

2. The private sale had very long lock-ups, indicating intrinsic support from purchasers.
Our goal was to design a structure that was fair and transparent. As a result, we determined that priced tiers would be best.

Our private sale had a tiered pricing format in which the price per token was inversely correlated with the length of lock-up. The long lock-up for private sale purchasers points to the fact that their interests are aligned with the long-term vision of Hypernet and the ecosystem we are working to create.

3. We are focused on building strong corporate foundations.
Hypernet has been working closely with the law firm of choice for some of the largest technology companies in the world to ensure that we are in an optimal position to thrive in complex and rapidly developing regulatory environments. This will always be foundational for Hypernet, as we think regulation will be a large barrier to success for many otherwise promising projects.

4. Token metrics are fair and balanced (lockups for all in the private sale, in order to be fair to all).

All that said, we are happy to release a selection of our token metrics below and give you a sense of where things are at.


– TOKEN METRICS –

Total tokens: 100 million tokens

Hardcap: $15m

Team lock-up: 4 year lockup with 1 year cliff, then linear release

While the date and terms of our public token sale are yet to be released, we can at least share some details of the sale’s structure, along with items to keep in mind regarding whitelisting and KYC:

It’s important to know in advance that our sale process will be more rigorous than what you are perhaps used to. Our long-term vision requires that great attention be paid to regulatory compliance. Because of this, our token sale process cannot be fully automated, as it is with some other projects. Our jurisdictional compliance officers must manually approve each successful application, a process which can take up to a week. Only purchasers who meet our due diligence and regulatory criteria may participate in the sale. Here are our suggestions on a couple of documents you might want to prepare in advance:

1. A clear image of your driver’s license or passport.
2. A certified copy of a recent utility bill showing your name and address.

However, everyone will have the opportunity to participate in the Hypernet Lobby where you can earn uptime rewards. The Lobby is where computers are held on standby while waiting to be matched with a job. While they are waiting, they are rewarded with an amount of HyperToken just for being available on the network. The Lobby will slowly be opened to participants alongside the token sale. The first participants in the Lobby will have the easiest time earning HyperToken, and there will be a system to determine who will be the first people to participate in the Lobby. The details of this system will be released closer to the token sale and launch of the Lobby.

We would like to reiterate our appreciation for your support, and our confidence that holding off on the token sale will benefit everyone involved. Meanwhile, the core of Hypernet (our technology, product, and business development) continues at full speed. Look out for announcements soon regarding lobby participation, partnerships, app downloads, and more!

Thank you, and please visit t.me/hypernetwork to see the latest updates as they are released.

-Ivan Ravlich, Founder of Hypernet
Palo Alto, CA

Hypernet Advisor Bio: Tony Reeves

By | Uncategorized

“Unlimited opportunity”

That’s how advisor Tony Reeves describes Hypernet.

Tony is CFO of Experian Global Technologies. He has decades of experience with acquisitions, portfolio management, business planning, risk management, forecasting, and strategy. Although professionally he works on the business side of emerging technologies, he has always had an innate interest in new and promising innovations, and that interest has brought him to where he is now. As CFO of Global Technologies at Experian, he spends his days identifying and acquiring promising new technology. His team is always searching for improved data processing and security, since assigning credit is largely based on those variables.

Tony describes Hypernet the way one would describe the early days of the internet: there are so many use cases, it’s almost overwhelming. This makes it hard for many people to grasp just how incredible Hypernet could be. Just as almost no one in 1990 saw the potential and versatility of the internet, Hypernet seems to be in that same early phase. For Tony, though, there are four main traits which attracted him to Hypernet:

1. Raw data processing capability
2. Data privacy
3. Societal benefits
4. The team

Tony sees that many of the established tech companies haven’t yet figured out the new distributed data and computation paradigm, and are now faltering because of it. Instead of investing in innovation, they are doubling down on old technologies and trying to scale them up with things like cloud computing — which doesn’t solve the fundamental limitations of current computational models. In that sense, says Tony, Hypernet is favorably positioned for the next wave of computational innovation: “It could literally unwind and change a lot of the existing infrastructure… In this big data world, we have data that’s structured, and it’s like we’ve been waiting for the compute to catch up to do with the information what we’ve been talking about for years, and it’s exciting when it shows up… With Hypernet we can get better models, with more information, with fewer people.”

The second important component of processing data is making sure it stays secure. We are all familiar now with the Equifax credit breach, and the terrible consequences it had for the company and the 150 million individuals whose personal information was stolen. Hypernet’s unique ability to maintain data privacy through on-device computation is very attractive to any company whose financial well-being depends on data-security (which is most companies).

Tony is also immensely excited about the societal benefits of Hypernet’s new computation model. He sees this as being revolutionary in poorer parts of the world, where current computation is limited or inaccessible. Providing the people in these places with a new way to connect and compute could drastically raise living standards, as well as open a path to things like education, industrialization, and digitization. They will truly be able to participate in the world in the same way we are able to right now. “There’s as much social good that could come out of this as business opportunity. You know, IBM creates a mainframe, but I’m not sure how quickly or directly that is translated to helping others.”

The last thing that attracted Tony to Hypernet is the team. He has been in several negotiations with companies that had compelling technology and business models, but chose to walk away because of the team. In Hypernet, he sees a group of bright, pragmatic, and hard-working individuals who are driven to develop a great product.

You can learn more about Tony by checking out his LinkedIn profile.

Be sure to check out Hypernetwork.io to view all of our advisors. Also, sign up for our newsletter and online community to learn about new advisors as we announce them!

-Your HyperTeam

Quantum Computing vs Hypercomputing: Which is more powerful?

By | Uncategorized

Since its inception, digital computation has been based on a binary language of ones and zeros. These ones and zeros are indicated by transistors, which are either in an on state (a one) or an off state (a zero). As computers have advanced, those transistors have become smaller and smaller. A computer with a million transistors used to fill an entire warehouse. Now, it fits in your pocket. However, we may have reached the limits of binary computing. With transistors now approaching the size of atoms, the time is ripe for an evolutionary leap in computing.

Quantum computing seems to be that evolutionary next step. While quantum computing does not yet seem to be capable of replacing classical computing, it clearly holds promise for surpassing classical computing in terms of raw computational power. As Bernard Marr explains, “We are entering a big data world in which the information we need to store is growing; there is a need for more ones and zeroes and transistors to process them. For the most part, classical computers are limited to doing one thing at a time, so the more complex the problem, the longer it takes. A problem that requires more power and time than today’s computers can accommodate is called an intractable problem. These are the problems that quantum computers are predicted to solve.”

Already, quantum computers are showing great potential. Google has shared insightful details about its quantum computing project, which it runs in partnership with NASA. Currently, its D-Wave 2X quantum computer has been running at a pace about 100,000,000 times faster than a traditional computer chip.

A key question for Hypernet then, and a question we often hear from the community, is:

“Will ultra-powerful quantum computing ever make Hypernet obsolete?”

It’s a valid question, and fortunately it has an easy answer: no. This is because the two perform separate functions. Quantum computing is a different type of computation, while Hypernet is a protocol for parallelizing multiple computation nodes.

In fact, the opposite is true. As quantum computing gets stronger, so does Hypernet. A quantum computer could easily be appended to the network, alongside traditional computers. Hypernet’s proprietary DAC algorithm can still be leveraged to synthesize the results of many quantum machines running in parallel.

Although the future is impossible to predict, we at Hypernet feel well positioned to lead the way in parallel distributed supercomputing — quantum or otherwise.

To learn more about Hypernet’s technology, you can check out our website. Then, be sure to join our Telegram group, where you can ask our team questions. For those of you already in the group, you can also now subscribe to our new announcement channel.

Hypernet Advisor Bio: Joe Urgo

By | Uncategorized

Hypernet’s first advisor in the blockchain space is Joe Urgo. Joe has been active in crypto projects for several years, after leaving the finance industry, where he was a derivatives analyst. His financial background gave him a unique perspective on Bitcoin’s early price fluctuations, and he quickly recognized blockchain as a technological game changer. He worked at Coinbase for several years before starting his own company, Sourcerers.io. Now, he spends most of his time working on his other project, District0x, and advising blockchain companies, including Republic, Keep, Aragon, Bloom, Status, and more.

Joe was originally attracted to Hypernet due to its high-caliber team. He knew some of the founding members, and when they explained what Hypernet was doing, Joe was immediately on board with the vision. He knew of other distributed computation projects such as Golem, Sonm, and iExec, and he also recognized their limited potential — rendering is great, but not that many people need it. Hypernet’s proprietary DAC algorithm seemed much more generalizable and, overall, much more effective. Furthermore, since most of the underlying research had already been completed, the project wouldn’t be a non-starter. Given Joe’s penchant for picking good projects to support, and his own success in the space, the Hypernet team was happy to have him join as an advisor.

Over the past several months, Joe has been really excited by the growth of Hypernet and its focus on fostering a good community. With many projects neglecting the communities that make them possible, he’s happy that Hypernet admins are in the Telegram answering questions and welcoming new community members. He also sees Hypernet laying the foundations for long-term success. With so many projects building up lots of hype with little substance, he sees value in educating community members about Hypernet’s technology and engaging in marketing in a more holistic and positive manner.

To find out more about Joe, you can check out his LinkedIn profile.

Who Uses Hypernet and Why: Demand Side

By | Uncategorized

Earlier in the week we looked at the people who will power Hypernet, and why they would want to connect their devices to the network. This week, we will take a look at the types of people who will use Hypernet’s computing capabilities — the purchasers.

Hypernet was born out of a need for a specific type of computational power which didn’t exist in the marketplace at an affordable cost. Ivan Ravlich and Todd Chapman, Hypernet’s CEO and CTO respectively, experienced this dearth of power very acutely while conducting research at Stanford University. This is what spurred them to research new methods of computation to enable data-intensive simulations. Along the way, they discovered many other use cases for the novel Hypernetwork protocol, and found many other parties interested in utilizing it.

Researchers:
The first type of user will be scientific researchers. Researchers generally have big dreams but small budgets. Their computing options are limited to government machines, where the waitlist can stretch for months or years, or time purchased on Amazon or Google servers, which is prohibitively expensive. Hypernet provides a solution to both issues.

Corporations:
Corporations also use large amounts of computational power. For example, modeling market trends, running financial simulations, and even things like behavior analytics all take massive amounts of computing power. The world’s largest corporations sometimes have in-house computation options, but most do not. For them, Hypernet could cut computation costs by as much as 80% compared to cloud computing.

Organizations with sensitive information:
Imagine you run a hospital. You have billions of data-points on thousands of patients. You could use this data to discover new markers for diseases and predictively analyze who is at risk for certain health issues. However, due to HIPAA and other healthcare information privacy laws, you can’t efficiently analyze this wealth of life-saving information. Hypernet’s unique algorithm allows sensitive data to be analyzed and learned from in an anonymous and secure way. This applies to anyone with sensitive data who wants to learn from that data; Hypernet could enable a new wave of cyber security and information privacy.
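To illustrate the flavor of privacy-preserving analysis, here is a generic secure-aggregation sketch of our own (not Hypernet’s specific algorithm): several hospitals learn a population-wide statistic without any of them revealing its own number, let alone raw patient records.

```python
import random

def secure_sum(local_stats):
    """Toy secure aggregation: each pair of sites agrees on a random mask
    that one adds and the other subtracts. Any single masked submission
    looks like noise, but the masks cancel in the network-wide total."""
    sites = list(local_stats)
    masked = dict(local_stats)
    for i, a in enumerate(sites):
        for b in sites[i + 1:]:
            mask = random.uniform(-100.0, 100.0)
            masked[a] += mask
            masked[b] -= mask
    return sum(masked.values())  # exact up to floating-point rounding

# Each hospital computes one aggregate locally; patient rows never leave.
local_rates = {"hospital_a": 0.12, "hospital_b": 0.08, "hospital_c": 0.10}
print(secure_sum(local_rates) / len(local_rates))  # population rate ~ 0.10
```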

Individuals far away from computing resources:
Since computation is conducted off the device, Hypernet enables data analysis in remote locations. This means one only needs an internet connection to open up unlimited computation possibilities. Imagine you are the Red Cross, trying to find efficient evacuation routes for a remote village after a volcanic eruption. Hypernet could efficiently find those routes using nothing more than a satellite phone with a data connection.

Realistically, Hypernet can be used by anyone who wants computational power and has a connection to the internet. These are only a few profiles of the types of people and organizations interested in purchasing computational power. Within each of these profiles, the user also gains distinct advantages by using Hypernet as opposed to services offered by other organizations. To learn more, you can subscribe to our blog and newsletter, and join our growing online community where you can ask our team questions and talk to other people interested in Hypernet’s potential.

-Your Hyperteam
Palo Alto, California

Who Will Use Hypernet: Supply Side

By | Uncategorized

In order for Hypernet to work, we need users to join the network. The more users the network accrues, the more valuable the network becomes. Although we plan to seed the network with a few industrial computation partners, we believe the long-term value of Hypernet will come from average users — even the power from industrial computing companies is chump change compared to the combined FLOPS of all the smartphones in the world!

A key question then is: Will users join the network?

Let’s answer this question by first looking at individuals who already lend their processing power to large organizations. The Search for Extraterrestrial Intelligence (SETI) project out of UC Berkeley uses a low-tech program developed in the 1990s to link together 1.7 million individuals’ home computers to search for possible life on other planets. How much do these people get paid to loan out their processors to SETI? Nothing. These altruistic individuals simply believe in the project so much that they offer their processors to UC Berkeley for free.

The Folding@home project out of Stanford University does the same thing, but instead of searching for aliens, it folds simulated proteins in order to find new cures for cancer and other diseases. Again, the users (including 15 million PS3 owners) do this free of charge.

What will happen when users can power the same type of research projects and get paid for it? With much of the world living below the poverty line but still owning a smartphone or computer, many individuals would be ecstatic to earn extra income from their idle, otherwise unused devices. The ability to put that computational power toward something as cool as finding aliens, or fighting cancer, is just icing on the cake.

Now, let’s look at a couple of other intriguing technologies: CryptoKitties and Farmville. What do these two online games have to do with Hypernet? The answer is gamification. While we expect the primary motivator for joining the network to be earning HyperTokens, we will also be crafting the community and network participation so that it is engaging and fun to be a part of. Games like CryptoKitties and Farmville offer no financial incentive to participate (some may argue they offer a financial disincentive), showing that community and social benefits are strong motivators for participating in large online groups.

To summarize, people have three main motivators for participating in online groups: financial, social, and altruistic. Over time, Hypernet will incorporate aspects of all three of these proven approaches as it attracts users.

But before any of those users even hear about Hypernet, we already have our early adopter community; we have you. We hope that you will be one of the first people to pledge your device to the Hypernetwork and help kickstart a revolution in supercomputing. Although we have not yet launched the mainnet, we are getting closer each day. To keep up with the latest announcements, and to be the first to know when the network launches, please join our Telegram community and sign up for our newsletter. We’ll see you there!

-Your HyperTeam
Palo Alto, California

Hypernet Announces First Round of Advisors

By | Uncategorized

The rumors are true… Hypernet has announced its first three advisors!

Thousands of people have already shown their support for Hypernet, and we are excited to introduce you to three industry leaders who want to make sure we experience long-term success. Although these three individuals are only part of Hypernet’s full advisory board, they represent our desire to seek out partners across a variety of strategic industries who can help Hypernet expand and grow.

Without further ado, Hypernet would like to introduce:

  • Randall Kaplan, Co-Founder of Akamai Technologies
  • Joseph Urgo, Co-Founder District0x
  • Tony Reeves, CFO of Global Technology at Experian Plc

RANDALL KAPLAN

Randall is a co-founder of Akamai Technologies, the global leader in Content Delivery Network (CDN) services. Akamai serves nearly 25% of the world’s web traffic, is a member of the S&P 500, employs more than 7,600 people across 64 offices in 28 countries, and had $2.4 billion in 2017 revenue, making it one of only nine U.S.-based software and services companies exceeding $2 billion in annualized revenue. Randall is also the founder and CEO of JUMP Investors, a venture capital firm that also functions as his family office (notable investments include Google and Seagate). Over the past 20 years, Randall has been an advisor to and investor in many successful traditional and crypto companies. He has served on the boards of directors of multiple companies, has been an active public speaker, and has mentored more than 100 students through JUMP’s internships.

JOSEPH URGO

Joseph is the co-founder of District0x, an Ethereum dApp decentralizing the world’s marketplaces. Prior to this, Joseph founded Sourcerers.io, a consultancy supporting leading Ethereum-based projects. Before blazing his own path, Joe spent three years as an Operations Manager for Coinbase, and before that he was a derivatives trader for Three Arrows Capital, an international hedge fund based in Singapore.

TONY REEVES

Tony Reeves is the Chief Financial Officer of Global Technology for Experian plc. As one of his many areas of responsibility, he leads their cloud infrastructure finance strategy. Experian has a revenue of $4.6B, with a presence in 37 countries. Tony has over 35 years of financial experience with a broad array of advanced technologies, from TRW Space and Defense to Experian credit bureau data centers and modernization efforts. His current responsibilities at Experian include financial integrity of enterprise technology investments, and Tony plays a key role in the strategic direction of the technology road map to modernize global credit bureaus. Over the course of his career, Tony has been heavily involved in over 50 acquisitions that have provided global expansion, critical technology capabilities, and new data sources.

What Happens When the Cloud Runs Out?

By | Uncategorized

Exciting new technologies like autonomous vehicles, artificial intelligence, and innovative life-saving medical treatments have at least one critical thing in common — they require massive amounts of computational power to continue to evolve.

Satya Nadella, the CEO of Microsoft, recently declared that the world is quickly “running out of computing capacity.” Research teams working on groundbreaking innovations feel this pain every day. The more computational power they have, the faster they can run calculations and process data in order to engineer solutions to the world’s biggest problems. Despite many advances in recent years, gaining access to massive amounts of computational resources is still difficult and expensive.

Before the cloud:

In the days before cloud computing, teams working on major new technologies would have to build their own data centers to acquire the computational power they needed. This was slow and very, very expensive. The only groups that could afford to do this were huge companies, major universities, and governments.

Cloud computing changed this, making supercomputational power much less expensive and easier to access, thus opening it up to a much wider range of actors. This is one of the most important computing innovations that directly impacts and improves our lives on a daily basis.

Data Centers that power the cloud are still really expensive:

But there’s a catch… The data centers powering the cloud can’t keep up with demand. Creating a single new data center costs hundreds of millions of dollars, requires extended construction time, and has an enormous environmental impact. The big companies that build these data centers aren’t able to make them quickly, cheaply, or efficiently enough to continue enabling the type of innovation we need to solve the world’s biggest problems.

Even though the cloud is much less expensive than the old stand-alone proprietary infrastructure, the costs of creating and maintaining the cloud are still passed on to the customers. The traditional data-center model for providing computational power is becoming outdated, and it’s holding innovation back. This is exactly the challenge we are tackling at Hypernet.

A More Cost Effective, Scalable Approach:

Instead of a future filled with billions of dollars of capital investment, and years of construction, we believe the answer to the computational resource crisis is in the palm of your hand, in your kitchen, in your living room, and on your desk.

Hypernet is a distributed computational network that enables any device (from your mobile phone to your smart refrigerator) to contribute computational power to researchers who need it. We are surrounded by billions of devices that have an incredible amount of unused capacity, and Hypernet is capable of leveraging it.

In the past, attempts to build distributed networks were hampered by roadblocks like latency and poor network resilience. The Hypernet team has spent years developing a proprietary Distributed Average Consensus (DAC) algorithmic framework that has overcome these challenges.

With no massive capital expenditures or long construction times, Hypernet is opening up a virtually unlimited reserve of low cost, infinitely expandable computational power that will fuel the next generation of innovation.

Learn more about Hypernet:

If you’re as excited as we are about changing the world, or want to ask a question about Hypernet, then we invite you to join our Telegram community for the latest information.

Until next time,
Your HyperTeam
-Palo Alto, CA

How Hypernet Compares to Golem, Sonm, iExec, etc.

By | Uncategorized

At first glance, Hypernet may appear similar to other blockchain projects, such as Golem, Sonm, and iExec. Indeed, there are many problems which all four groups are able to solve equally well. But when you pop open the hood, you’ll see that the engine driving Hypernet is fundamentally new, with a brand new programming model — a programming model that is more powerful and versatile than the traditional architecture used by other blockchain computer projects.

This programming model is what makes Hypernet different. Existing models are not well suited to solve general computing problems on distributed networks, which constantly have computers popping in and out of a computation. And simply throwing blockchain at the problem doesn’t solve this fundamental constraint. Hypernet has architected and implemented a new programming model beneath the blockchain layer to handle distributed computation problems which require interprocess communication. This is not off-the-shelf tech. We created it ourselves, from the ground up.

Hypernet is based on the principle of Distributed Average Consensus (DAC), and it is the result of years of research, testing, and optimization. DAC+Blockchain allows for the efficient distribution of compute jobs and effectively manages computers dropping on and off the network. Furthermore, it creates a secure backbone where buyers and providers of computational power can engage with confidence. The on-chain (scheduler) and off-chain (DAC) technology layers of Hypernet fit together hand in glove, and both are driven by consensus.

Now, if we examine Golem, Sonm, and iExec in more depth, we can see that they have built (or plan to build) their technical foundations on traditional computing architectures. These architectures were originally developed specifically for use in data centers, and each group has bolted this data center architecture onto a blockchain network (albeit in different manners).

These data center architectures have two unavoidable consequences when applied to a distributed network:

1. The amount of network communication and data transfer overhead is very high.

2. These architectures do not tolerate computers randomly dropping in and out of the network.

These problems arise because in an orderly data center you know the exact topology of the network, and exactly how packets of information flow from processor to processor. Data center architectures are optimized for one topology, and one topology only. This is problematic on a distributed network, where you don’t know the network topology and it’s impossible to know the state and availability of every machine, so the topology constantly changes. This means that if you attempt to carry out a parallel compute job requiring any sort of back-and-forth communication between computers, the data reductions will fail and the program will fault. The data reductions are computationally fragile. This fragility can be mitigated, but hardening the reductions increases communication overhead. Traditional data center algorithms are therefore expensive to employ over a distributed network, due to the need to transport terabytes upon terabytes of data. And again, they can only handle certain types of tasks to begin with.
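A toy contrast makes the point. This is our own illustrative sketch, not code from any of these projects: a reduction wired to a fixed topology faults the moment one machine disappears, while neighbor-based averaging simply carries on over whoever is still online.

```python
def tree_reduce(values, alive):
    """Fixed-topology reduction: every node in the predetermined tree must
    answer, so a single dropout faults the entire job."""
    for node in values:
        if node not in alive:
            raise RuntimeError(f"{node} dropped out; reduction failed")
    return sum(values.values())

def gossip_average(values, alive, rounds=200, step=0.3):
    """Consensus-style averaging over the survivors: whoever is still
    online keeps exchanging with neighbors and converges regardless."""
    ids = [n for n in values if n in alive]
    live = {n: values[n] for n in ids}
    for _ in range(rounds):
        live = {
            node: live[node] + step * sum(
                live[nbr] - live[node]
                for nbr in (ids[(i - 1) % len(ids)], ids[(i + 1) % len(ids)])
            ) / 2
            for i, node in enumerate(ids)
        }
    return live

values = {"n1": 1.0, "n2": 3.0, "n3": 5.0, "n4": 7.0}
alive = {"n1", "n2", "n4"}            # n3 vanished mid-computation
print(gossip_average(values, alive))  # survivors converge near 3.67
try:
    tree_reduce(values, alive)
except RuntimeError as err:
    print(err)                        # the rigid reduction simply faults
```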

This is perhaps why Golem, Sonm, and iExec currently seem to be focused on solving very specific issues. Golem is heavily focused on rendering, a great application that pairs well with its grid computing architecture. Sonm is primarily adapting existing hub-and-spoke architectures, with an emphasis on server hosting, to distributed networks. And iExec is focused on decentralized cloud computing, for specific use in certain research applications.

Hypernet wants to do more, though. It was originally developed by researchers who did not have the computational availability to solve some of humanity’s most nagging problems. So, they set out to rethink and redesign computation from the ground up. Hypernet was specifically created to solve problems which were previously unsolvable, to enable machine learning in a more efficient and sustainable manner, and to support the community of users who will help make it all happen.

As Albert Einstein famously said, “The significant problems we face cannot be solved at the same level of thinking we were at when we created them.” Hypernet is innovating the process of problem solving through parallel distributed computation, in order to create new and effective solutions to the world’s greatest challenges — and we can’t wait to see what problems we can solve when we all work together.

If you’re interested in finding out more, we are always available to answer questions in our Telegram early supporter community. We hope to see you there!

-The HyperTeam
Palo Alto, California

Why Hypernet is Built on the Blockchain

By | Uncategorized

One of the most common questions we get in our online community is, “Why does Hypernet need to be built on the blockchain?”

It’s a fair question. There is such a frenzy surrounding blockchain technology that all you have to do is tack “blockchain” or “crypto” onto something else and people go crazy. Many blockchain projects have proven to be a lot of talk with little substance. We want to address this head-on and be clear about why Hypernet is built on the blockchain, and how it leverages the technology.

The world is running out of computational capacity, which powers innovation in the fields of climate modeling, cancer research, autonomous vehicle development, and many other technologies that will improve people’s lives. We want to ensure everyone has the computing power they need to innovate.

Without blockchain technology it would be nearly impossible to do this effectively. Here are five critical ways blockchain enables Hypernet:

1) Collateral to Protect Against Bad Actors

In order to discourage malicious behavior, both buyers and sellers must stake collateral to join a job contract. Buyers submitting a job to the network post the full cost of the job to the smart contract when it is created. Sellers joining a job contract must also stake collateral, in an amount that can be set by the buyer. This economically incentivizes reliable performance; otherwise, sellers risk losing their collateral.
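As a back-of-the-envelope illustration of this incentive structure, here is a toy model written for this post (not the actual Hypernet smart contract; the payout split is our own simplification):

```python
class JobContract:
    """Toy escrow model of the collateral scheme described above: the buyer
    posts the full job cost up front; each seller stakes collateral that is
    returned on success and forfeited otherwise."""

    def __init__(self, buyer, job_cost, seller_collateral):
        self.buyer = buyer
        self.escrow = job_cost                   # buyer's payment, locked at creation
        self.required_stake = seller_collateral  # minimum stake, set by the buyer
        self.stakes = {}                         # seller -> locked collateral

    def join(self, seller, stake):
        if stake < self.required_stake:
            raise ValueError("stake below the buyer's collateral requirement")
        self.stakes[seller] = stake

    def settle(self, completed_ok):
        """Pay out the escrow and return (or forfeit) collateral."""
        share = self.escrow / len(self.stakes)
        return {
            seller: (share + stake) if seller in completed_ok else 0.0
            for seller, stake in self.stakes.items()
        }

contract = JobContract(buyer="lab", job_cost=90.0, seller_collateral=10.0)
for s in ("s1", "s2", "s3"):
    contract.join(s, stake=10.0)
print(contract.settle(completed_ok={"s1", "s2"}))
# s1 and s2 each receive 30.0 plus their 10.0 stake back;
# s3 forfeits its stake for dropping the job.
```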

2) Reputation System

A public ledger provides the accountability and transparency needed to make the Hypernet marketplace healthy. Since transactions are recorded in a smart contract on an open ledger, buyers and sellers are empowered to build reputational capital that will allow them to thrive in the marketplace.

3) Network Currency

HyperTokens will serve as the mechanism for buyers to pay sellers for their compute time. Market dynamics will then govern the prioritization of projects on the network. A compute job is prioritized according to the HyperToken price agreed upon by buyers and sellers in the marketplace.
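In practice this amounts to a price-ordered queue. A minimal sketch, with job names and prices made up for illustration:

```python
import heapq

# Jobs whose buyers and sellers agreed on a higher HyperToken price are
# dispatched first; a max-heap via negated prices does the trick.
jobs = [("climate_model", 12.0), ("render_batch", 4.5), ("ml_training", 9.0)]
queue = [(-price, name) for name, price in jobs]
heapq.heapify(queue)
while queue:
    neg_price, name = heapq.heappop(queue)
    print(f"dispatch {name} at {-neg_price} HyperToken")
# climate_model (12.0) runs first; render_batch (4.5) runs last.
```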

4) Voting

In the event that the Hypernet protocol must be changed in the future due to technical demands, the revision will be subject to a voting process. The number of votes wielded by any peer will be proportional to the sum of HyperTokens associated with their account.
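A sketch of how such a tally could work. This is illustrative only; the balances and voting mechanics here are placeholders, not the finalized protocol:

```python
from collections import defaultdict

def tally(votes, balances):
    """Toy token-weighted vote: each peer's voting power is proportional
    to the HyperTokens associated with their account."""
    totals = defaultdict(float)
    for peer, choice in votes.items():
        totals[choice] += balances[peer]
    return max(totals, key=totals.get), dict(totals)

balances = {"p1": 600, "p2": 300, "p3": 200}
votes = {"p1": "adopt_revision", "p2": "reject", "p3": "reject"}
print(tally(votes, balances))
# ('adopt_revision', {'adopt_revision': 600.0, 'reject': 500.0})
```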

5) Availability Rewards

There may be times when the number of available devices exceeds the number of active compute jobs. In this case, devices can still earn tokens by waiting in the Hypernet “lobby” and remaining online. Availability will be verified by other “lobby” occupants who ping each other to confirm that the devices are actually online, and ready to receive a job. Members who fail the challenge forfeit their collateral to the challenger.
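A simplified sketch of the lobby’s challenge game (our own toy model; the reward and forfeit amounts are placeholders):

```python
import random

def run_lobby_round(occupants, online, reward=1.0, forfeit=5.0):
    """Toy availability check: each occupant is pinged by a randomly
    chosen peer. Responsive devices earn an uptime reward; devices that
    fail the challenge forfeit collateral to their challenger."""
    balances = {device: 0.0 for device in occupants}
    for device in occupants:
        challenger = random.choice([d for d in occupants if d != device])
        if device in online:
            balances[device] += reward       # answered the ping
        else:
            balances[device] -= forfeit      # silent: forfeits collateral...
            balances[challenger] += forfeit  # ...which goes to the challenger
    return balances

occupants = ["d1", "d2", "d3", "d4"]
print(run_lobby_round(occupants, online={"d1", "d2", "d3"}))
```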

We built Hypernet with the goal of solving problems and making the world a better place. Blockchain technology is not just a shiny accessory, but rather a critical and necessary component of accomplishing this mission.

If you have questions, or want to learn more, join the conversation going on right now in our Telegram community. And if you want to learn more about the technical details of Hypernet, check out our whitepaper, too.

-Your HyperTeam