Technology for Freedom, Against Weaponized Data


AI purports to know you better than you know yourself, but an algorithmically generated picture, while claiming to be an accurate reflection, can easily become a self-fulfilling prophecy. Political commentators lamenting the effects of Facebook echo chambers identify various causes of the problem: extremists can easily find and validate each other online, content is not "policed" heavily enough, and so on. These effects are clearly amplified by a technology that uses your data to categorize you and then uses psychology to push you further in that same direction. In this sense, the problem isn’t simply privacy but also freedom — the freedom to decide who you are.

One way to address the issue is through law, a strategy that Apple CEO Tim Cook recently lauded at the 40th International Conference of Data Protection and Privacy Commissioners, held at the European Parliament in Brussels Oct. 22–26. Another path would be to change the nature of the technology itself. The development of freedom-enhancing technology, which would weaken the death grip of Amazon and Google on data storage, is a central goal of the distributed computing revolution. But only Hypernet has built a radically new computing architecture that can fully liberate us from corporate-controlled data centers. This is because it is the only distributed computing project that does not ultimately rely on the cloud, and thus on data centers, but rather on a decentralized network of individuals, all the way down.

Cook argued for the prioritization of four key principles, cited by TechCrunch as data minimization, transparency, the right to access, and security. Hypernet offers a key technological advance in the area of minimization: companies will be able to work with data that remains de-identified from the customer and that is never actually collected and held centrally.

While the blockchain and cryptocurrency space, in general, rightly lays claim to the moniker of freedom-enhancing technology, it has done so chiefly by offering an alternative to the traditional centralized financial system and its players. The result is a somewhat navel-gazing obsession with the technological basis of currency, without further interrogation of its purpose or use. This, arguably, set off the great crypto collapse of 2018, when an over-inflated and solipsistic market was finally forced to reckon with the fact that it was built upon sand. Against this trend, Hypernet matches the promise of financial decentralization with a corresponding deliverance from data monopolies.

Pushing back against those who claim that restrictive privacy laws rob tech of the capacity to reach its true potential, Cook countered that our true technological potential can only be reached by working with, rather than against, the trust of our users and of humanity. Hypernet responds to this call to action by offering new solutions that actually correspond to human values and needs.

Ultimately, technology is a tool, built by humans, for humans. To claim that technology “needs” to be liberated from human concerns and values in order to achieve its potential is nonsensical and lazy. Instead, as creators, we need to develop the technological methods to navigate competing societal priorities and to correct course following the revelation of negative externalities. Done properly, this should not compromise technological progress and, more importantly, user experience, but only enhance it.

Hypernet Update

The mobile Hypernet demo being prepared for a conference.

Where we are now —

The past few months at Hypernet have been incredibly exciting. The team was able to focus almost exclusively on meeting key technological development goals, as set forth in the roadmap white paper. We are now happy to announce that we have more than met our expectations, having produced functional first versions of the Hypernet blockchain scheduler, API, and marketplace lobby.


1. Blockchain Scheduler MVP; Status: COMPLETE
Completing the blockchain scheduler required building the contract issuer, the seller filter, and allocation testing. This was completed in the spring, ahead of schedule.

2. Consensus API MVP; Status: COMPLETE
Completing the API required the P2P layer, the topology layer, and the consensus protocol. This was completed in early summer, ahead of schedule.

Milestones 1 and 2 can be seen in action in our demo video, which can be viewed here:

3. Compute Supplier Lobby MVP; Status: COMPLETE
This lobby is where machines are rewarded for making themselves available for use on the network. This was completed on time, in late August.

While the opinion-based pendulum of crypto hype and hate continues to swing, Hypernet persists in following our strategic plan toward the creation of a real product, for use by real people. Blockchain’s popularity and success were built on the promise of greater freedom from the concentration of wealth and power in traditional financial markets, but its reputation founders on the speculative craze that arrived in its wake. Speculation is parasitic on the free market, whose real purpose is to provide a framework for the production and distribution of goods and services that meet real human needs. The Hypernet blockchain scheduler serves an authentic, non-speculative purpose as the connective tissue between buyers and sellers of computing power.

If the scheduler brings together buyers and sellers in the marketplace, the Hypernet API is responsible for producing the goods. This is the revolutionary technological foundation that allows for distributed, truly parallel computing, and it now exists in the real world. As you can observe in our preliminary public demo, the scheduler generates smart contracts according to the specified needs of the buyers and then matches available sellers with current demand. The API facilitates the completion of parallel distributed computing tasks.
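As a rough illustration of that matching step, here is a minimal sketch of how a scheduler might pair buyer demand with seller capacity. The data shapes, field names, and greedy cheapest-first rule are our assumptions for exposition, not Hypernet’s actual contract logic:

```python
from dataclasses import dataclass

@dataclass
class Job:
    job_id: str
    cores_needed: int
    max_price: float   # buyer's price ceiling, per core-hour

@dataclass
class Seller:
    seller_id: str
    cores_free: int
    ask_price: float   # seller's asking price, per core-hour

def match(jobs, sellers):
    # Greedy matching: fill each job from the cheapest sellers whose
    # asks fit under the buyer's price ceiling, until demand is met.
    assignments = []
    for job in jobs:
        need = job.cores_needed
        for s in sorted(sellers, key=lambda s: s.ask_price):
            if need == 0:
                break
            if s.ask_price > job.max_price or s.cores_free == 0:
                continue
            take = min(need, s.cores_free)
            s.cores_free -= take
            need -= take
            assignments.append((job.job_id, s.seller_id, take))
    return assignments
```

A real scheduler would record each assignment in an on-chain contract and handle churn and partial fills; this only shows the pairing idea.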

While other blockchain projects may claim to deliver distributed computing, they are hobbled by the limitations of their programming models. Crucially, they can only solve a restricted set of computing problems, and they are too rigid to tolerate machines dropping in and out of the network. They thus rely on the old, familiar data-center setup and are most accurately described as makeshift workarounds rather than evidence of a paradigm shift.

Instead of tinkering in the margins, the Hypernet team reconceived and built, from the ground up, an entirely new computing architecture — the Hypernet API, based on the principle of Distributed Average Consensus (DAC). Instead of covering over old server farms in the shiny veneer of blockchain, Hypernet engineers really grappled with the promise of consensus-based computing and brought it to its logical conclusion. The two key layers of Hypernet technology, the (on-chain) scheduler and the (off-chain) API, mirror each other. Both operationalize the contemporary ideal of networked power, generated via dynamic circulation and consensus, and untethered from the stationary and inflexible external referent of the data center.
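To give a feel for the principle behind the API, here is a toy simulation of distributed average consensus: each node repeatedly nudges its local value toward its neighbors’ values, and every node converges to the global average with no central coordinator. This is the standard textbook update rule, not Hypernet’s actual DAC implementation; the step size and round count are illustrative choices.

```python
def dac_step(x, neighbors, alpha=0.2):
    # One synchronous consensus round: each node i moves a
    # fraction alpha toward each of its neighbors j.
    return [xi + alpha * sum(x[j] - xi for j in neighbors[i])
            for i, xi in enumerate(x)]

def run_dac(values, neighbors, rounds=200, alpha=0.2):
    # Iterate the local update; for a connected graph with
    # alpha * max_degree < 1, all nodes converge to the mean.
    x = list(values)
    for _ in range(rounds):
        x = dac_step(x, neighbors, alpha)
    return x

# Four nodes in a ring, each knowing only its two neighbors.
ring = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
print(run_dac([1.0, 2.0, 3.0, 4.0], ring))  # every entry approaches 2.5
```

The appeal of this family of algorithms is that no node ever needs a global view of the network, which is why churn (machines joining and leaving) is tolerable in a way it is not for a central coordinator.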

Clearly, a major source of fuel for crypto-skepticism is the lack, thus far, of the perfect use-case. CryptoKitties, with only several hundred daily users, is the top consumer application on the Ethereum blockchain so far. Given this state of affairs, a healthy dose of skepticism seems warranted. At the same time, the history of technology and economic development is littered with examples of inventions that sat dormant for years before appropriate use-cases and complementary technologies were found. When electricity replaced steam as an industrial power source, it took thirty years before the reverberations could be felt in the form of productivity gains, because industrial architecture, centered around a steam-powered axle, was only gradually replaced by new configurations that could exploit the advantages of electric power.

Hypernet matches revolutionary technology with real user need by providing the technological architecture necessary to harness the innovation. It is the result of years of research, testing, and optimization. And with the on-schedule arrival of all planned deliverables over the past three months, it’s not just a thrilling idea; it’s concrete reality.

Hypernet Represented at CDAO Conference


Recently, Hypernet’s Head of Business Development, Fernando Fuentes, and Head of Strategy, Samy Biyadi, attended an exclusive conference for Chief Data Officers and Chief Analytics Officers of Fortune 1000 companies. While there, they were able to build relationships with CDOs and CAOs of large corporations, and continue to establish Hypernet as a leader in corporate data analytics. Here is Fernando’s recap of the event:

Hypernet’s Business Development and Strategy leads recently engaged in in-depth conversations with chief executives of Fortune 1000 companies at the Chief Data & Analytics Officer (CDAO) Exchange in Chicago, Illinois. The Exchange is an invitation-only event organized by the International Quality & Productivity Center (IQPC). It is not your standard corporate conference, but rather an intimate gathering of key decision makers in the field of data analytics.

Two key differences from your standard conference or expo are:

  1. Petit comité: fifty carefully selected chief executives mingle with twenty invited vendors, whereas standard conferences and expos typically attract hundreds or even thousands of participants. The more intimate and direct setting of the petit comité is a great fit for Hypernet’s current position in the startup life cycle.
  2. Caliber of attendants: The CDAO Exchange brought together key stakeholders reporting directly to the Chief Executive Officer, or one step removed, at many of the world’s largest companies, including AXA, Allstate, Charles Schwab, Comcast, Equifax, Ford, GE, Hulu, Morgan Stanley, Nike, Northwestern Medicine, Principal, Prudential, State Farm, T-Mobile, Disney, and more. A key attendance requirement was having at least $1M USD in budget under direct control to spend on data & analytics projects. Additionally, these executives and their teams spent considerable time ahead of the conference thoroughly describing their main data, analytics, and IT infrastructure pain points, reducing the time to insights when conversing with vendors like Hypernet. Needless to say, the content of the different speaking engagements at the Exchange was enriched by these executives’ decades of domain-specific experience and their bird’s-eye view of the interplay between data, analytics, IT infrastructure, human resources, and corporate politics.

In this post, the team would like to share learnings from the Exchange, some reaffirming lessons from past roles and others entirely new, drawn from intimate 1-on-1 meetings, speaker sessions, roundtable discussions, and informal conversations. These learnings fall into two categories:

(1) Data, analytics, and IT infrastructure as organizational functions

(2) The human factor: the people and social groups driving innovation in these areas + change management

Category (1) — Learnings from the Data, Analytics, & IT Infrastructure functions at these organizations:

  • Value is hard to quantify in the data & analytics industry. Vendors that can quantify value, and develop metrics to track this, will stand out from the competition.
  • Upfront investments in technology infrastructure and personnel are considerable, but longer-term returns on investment are difficult to quantify.
  • The industry and the media love to focus on exciting new technical solutions, like AI, and tend to forget that these large organizations have less exciting, but equally (if not more) valuable needs. For example, ‘replatforming’ is a common main challenge for large corporations nowadays. ‘Replatforming’ can mean many things: old to new DMS or on-premises to cloud IT infrastructure.
  • There are multiple upstream challenges in the data journey, like data management, curation, and ETL processes, even before getting to the more talked-about analytics challenges.
  • A hot topic at the event was integrating geospatial and social media data into predictive analytics and customer 360 models.
  • All executives in the healthcare industry expressed that a main data management challenge is that data comes from disparate sources and in heterogeneous formats (image vs text, digital vs analog).

Category (2) — The Human Factor of Driving Data, Analytics, & IT infrastructure change:

  • Data & analytics teams at large organizations are organized in one of two ways, each with its advantages and disadvantages:
  • (1) Hub & spoke: a central data & analytics function on which all other departments/functions rely for reporting and insights.
  • (2) A data & analytics function cutting across all businesses: the disadvantage is role redundancy; the advantage is that this model typically creates more data & analytics value and keeps it top of mind across the company, rather than an afterthought.
  • Challenges managing change (inside the D&A team but also organization-wide) with new data & analytics initiatives.
  • Data scientists lacking business mindset & knowledge on how organizations operate.
  • Tough for data science teams to show how valuable their conclusions are and translate them to action items in the company’s agenda that result in tangible business impact ($ in additional revenue, $ saved).

Sending our Business Development & Strategy leads to represent Hypernet at the Exchange is in line with the bigger picture we have designed for these business functions over the next couple of quarters. Hypernet’s competitive advantage is that it has the best programming protocol for decentralized compute in the world. Reputable stakeholders of different kinds have gone under the hood and emerged excited for the future ahead of us. It is important to emphasize two things: (1) our core intellectual property is independent of the blockchain scheduler, can be implemented in private, and is base-layer agnostic (i.e., it can be deployed on Ethereum or another blockchain); and (2) this core IP surpasses, on most criteria, competing programming architectures developed at blockchain startups, traditional startups, and corporate innovation labs. In other posts, we have extensively addressed these key differentiating criteria. In this post we hope we were able to shed some light on what the Strategic Business Development team here at Hypernet has been working on, and will continue working on until the next major corporate planning event.

-Fernando Fuentes de la Parra,
Hypernet Head of Business Development

Planning for Success


Launching a startup (a new protocol or traditional company) involves a lot of moving parts. This is especially true in the blockchain space, where teams must deal with all of the normal startup tasks in addition to conducting a token sale. It’s tempting to focus on the glamorous parts of a token sale, and to use those as the basis for determining the strength of a project — “Does this project have enough influencers supporting it? Did it receive a high token metrics score from my preferred source?” — and many projects indeed expend a lot of resources chasing these vanity metrics, mistakenly thinking that these metrics constitute the foundation of a successful and sustainable protocol.

However, the top three challenges for growing startups have little to do with these types of issues. They are: efficient organization, sufficient planning, and effective management. Somewhere between 75 and 90 percent of all startups fail, most of them due to shortcomings in these areas. Normally, startups only recognize these issues when it is too late, when a small crack in the foundation suddenly splits the whole organization apart. In the blockchain world of glitz, glamour, and cash, we believe it will be the conservative and focused startups, like Hypernet, that have the best opportunity to succeed in the long term.

Hypernet is focused on the fundamentals of running a successful startup. These fundamentals do not change just because a team is operating in the blockchain space. In fact, they become even more important. At Hypernet, we are actively planning and organizing ourselves to execute our vision and combat the top challenges that handicap the growth of early-stage startups.

We’d like to share what this looks like with our community. It is important to us that you have a sense of what is going on in the background at Hypernet and that you have a better idea of what to look for in other token projects. It is our goal to help move the space forward by supporting protocols that focus on sound business fundamentals and a fantastic long-term vision.

As a startup grows, continually optimizing the organization of the startup is critical. Efficient organization can turn 50 man-hours into the equivalent of 100 man-hours. This is accomplished in a couple of different ways. One is by further defining roles of employees within the startup as it grows. At the beginning, startup employees all wear different hats. This is a necessary phase, but it is important to quickly move past it. A lack of defined roles leads to neglected responsibilities, and it makes hiring and scaling difficult and disorganized. By defining roles and responsibilities, a startup is also helping its different branches to collaborate in unison. For Hypernet, this means aligning our tech, business development, marketing, corporate organization, and executive teams on vision, goals, and plans for execution. It’s important to organize early, and organize often, because after a certain point, no amount of funding or technical development can fix a disorganized and inefficient organization. Increased efficiency is also a product of well-defined communication and management pipelines. Installing effective protocols for routing communications inside and outside Hypernet is a great investment in our own future and will minimize the growing pains of scaling up.

Strategic planning is another element often overlooked by startups. In most failed startups, strategizing is taken for granted or viewed as a natural part of the development process, not understood as a focused task that deserves research and preparation. One key component of strategic planning at Hypernet is the ongoing process of market exploration. We have developed a remarkable piece of technology, DAC, which enables huge technological advancements in many fields. Market exploration helps us to narrow down the long list of possible use-cases and pick out the ones which are most strategic. Taking the time to find the perfect first use-case is what sets startups on the path to success, or, alternatively, on the path to delays, false starts, and blind pivots from one failure to the next. At Hypernet, we always do our due diligence to ensure that we remain on the right path.

Marketing is similar to market exploration, in that strategy leads to efficiency. Attracting a concrete base of customers and users will help the product see long-term and widespread adoption. The blockchain community can provide a great initial boost to promising projects, but that is only the first small group of people who will eventually become involved with successful projects. The value of marketing lies in attracting real users to the network, who are using it to solve real-world problems.

Overall, applying sound business fundamentals is what will continue to move the crypto space forward and help us all to enjoy a distributed future. It sometimes seems as though “blockchain companies” constitute a business category of their own, with its own norms and expectations. Hypernet is choosing to put itself in a different category: teams that make ethical business decisions, based on sound and well-thought-through strategy, in order to build lasting solutions that strive to do more every day, year after year. This is a huge undertaking that will stretch out over a long period of time, but with the solid foundations we construct today, we are confident in our ability to deliver on our dreams for tomorrow.

-Your Hyperteam,
Palo Alto, CA

Token Sale Updates


Friends and supporters of Hypernet,

Thank you for your continued enthusiasm for our project. We appreciate the community’s strong interest in our token sale and whitelisting procedures. As you know, we continue to work hard to turn Hypernet into a success and keep you, the community, up to date as we progress. The HyperToken is very important to us; it represents the method to unlock the next wave of peer-to-peer, in-network analytics. Ensuring that we launch the tokens when they will see maximum adoption and utility is our top priority.

We know you are all waiting for news about the specific dates of Hypernet’s public token sale. However, after community feedback, partner input, and thoughtful reflection, we have determined that in the current environment it is beneficial neither for the community nor for the adoption of the protocol to initiate a public token sale or whitelist at this time.

While we had contemplated having our token sale this month, we will proactively postpone this date. Although we legally cannot guarantee that we will have a public token sale, this is our intention at this time. It is important to carry out a token sale as soon as it is in the best interest of the protocol and its stakeholders, and this decision is made with the interests of everyone in our community in mind.

That being said, we greatly appreciate the support leading up to this announcement, and would like to provide you with some information that we believe you will find helpful and interesting. Please read below to see select details on our token metrics, KYC overview, and plans regarding how to begin using Hypernet and acquiring HyperToken.

Long Term Focus of Hypernet: We believe delaying our sale is another example of Hypernet’s desire to make decisions in the long-term interest of the protocol and token holders. Other examples of this long-term focus include:

1. We have been selective with token purchasers to date. 
Only 4% of applicants have been granted allocations in the private sale. We oversubscribed our softcap, while purposefully leaving room for the community. Although there have been many more requests for allocations than we could accommodate, Hypernet has a strict due diligence and KYC process, and even stricter internal criteria for choosing strategic purchasers.

2. The private sale had very long lock-ups, indicating intrinsic support from purchasers.
Our goal was to design a structure that was fair and transparent. As a result, we determined that priced tiers would be best.

Our private sale used a tiered pricing format in which price per token was inversely correlated with the length of lock-up (refer to the table below). The long lock-up for private sale purchasers points to the fact that their interests are aligned with the long-term vision of Hypernet and the ecosystem we are working to create.

3. We are focused on building strong corporate foundations.
Hypernet has been working closely with the law firm of choice for some of the largest technology companies in the world to ensure that we are in an optimal position to thrive in complex and rapidly developing regulatory environments. This will always be foundational for Hypernet, as we think regulation will be a large barrier to success for many otherwise promising projects.

4. Token metrics are fair and balanced (lockups for all in the private sale, in order to be fair to all).

All that said, we are happy to release a selection of our token metrics below to give you a sense of where things stand.


Total tokens: 100 million

Hardcap: $15M

Team lock-up: 4 years, with a 1-year cliff, then linear release

While the date and terms of our public token sale are yet to be released, we can at least share some details of the sale’s structure, and items to have in mind when considering how we think about whitelisting and KYC:

It’s important to know in advance that our sale process will be more rigorous than what you are perhaps used to. Our long-term vision requires that great attention be paid to regulatory compliance. As a result, our token sale process cannot be fully automated, as it is with some other projects. Our jurisdictional compliance officers must manually approve each successful application, a process which can take up to a week. Only purchasers who meet our due diligence and regulatory criteria may participate in the sale. Here are our suggestions on a couple of documents you might want to prepare in advance:

1. A clear image of your driver’s license or passport.
2. A certified copy of a recent utility bill showing your name and address.

However, everyone will have the opportunity to participate in the Hypernet Lobby, where you can earn uptime rewards. The Lobby is where computers are held on standby while waiting to be matched with a job. While they wait, they are rewarded with HyperToken simply for being available on the network. The Lobby will be opened gradually to participants alongside the token sale. The first participants will have the easiest time earning HyperToken, and a system will determine who joins the Lobby first; its details will be released closer to the token sale and the launch of the Lobby.
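The uptime-reward idea can be sketched in a few lines. This is a minimal illustration only: the reward rate, epoch length, and heartbeat scheme here are our assumptions, not published Lobby parameters.

```python
def accrue_uptime_rewards(heartbeats, reward_per_epoch=1.0):
    # heartbeats maps node_id -> list of booleans, one per epoch,
    # True if the node answered that epoch's availability check.
    # Each answered epoch earns a fixed amount of HyperToken.
    return {node: reward_per_epoch * sum(beats)
            for node, beats in heartbeats.items()}

print(accrue_uptime_rewards({"alice": [True, True, False],
                             "bob": [True, True, True]}))
# {'alice': 2.0, 'bob': 3.0}
```

The key property is that a node is paid for availability, not for completed work, which is what makes standing by in the Lobby worthwhile even before a job arrives.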

We would like to reiterate our appreciation for your support, and our confidence that holding off on the token sale will benefit everyone involved. Meanwhile, the core of Hypernet, our technology, product, and business development, continues at full speed. Look out for announcements soon regarding lobby participation, partnerships, app downloads, and more!

Thank you, and please visit to see the latest updates as they are released.

-Ivan Ravlich, Founder of Hypernet
Palo Alto, CA

Hypernet Advisor Bio: Tony Reeves


“Unlimited opportunity”

That’s how advisor Tony Reeves describes Hypernet.

Tony is CFO of Experian Global Technologies. He has decades of experience with acquisitions, portfolio management, business planning, risk management, forecasting, and strategy. Although professionally he works on the business side of emerging technologies, he has always had an innate interest in new and promising innovations, and that interest has brought him to where he is now. As CFO of Global Technologies at Experian, his life revolves around identifying and acquiring promising new technology. Experian is always searching for improved data processing and security, since assigning credit is largely based on those variables.

Tony describes Hypernet the way one might describe the early days of the internet: there are so many use cases, it’s almost overwhelming. This makes it hard for many people to grasp the full magnitude of what Hypernet could become. Just as few people in 1990 saw the potential and versatility of the internet, Hypernet seems to be in that same early phase. For Tony, though, four main traits attracted him to Hypernet:

1. Raw data processing capability
2. Data privacy
3. Societal benefits
4. The team

Tony sees that many of the established tech companies haven’t yet figured out the new distributed data and computation paradigm, and are now faltering because of it. Instead of investing in innovation, they are doubling down on old technologies and trying to scale them up with things like cloud computing — which doesn’t solve the fundamental limitations of current computational models. In that sense, says Tony, Hypernet has a favorable position on the next wave of computational innovation: “It could literally unwind and change a lot of the existing infrastructure… In this big data world, we have data that’s structured, and it’s like we’ve been waiting for the compute to catch up to do with the information what we’ve been talking about for years, and it’s exciting when it shows up… With Hypernet we can get better models, with more information, with fewer people.”

The second important component of processing data is making sure it stays secure. We are all familiar now with the Equifax credit breach, and the terrible consequences it had for the company and the 150 million individuals whose personal information was stolen. Hypernet’s unique ability to maintain data privacy through on-device computation is very attractive to any company whose financial well-being depends on data-security (which is most companies).

Tony is also immensely excited for the societal benefits of Hypernet’s new computation model. He sees this as being revolutionary in poorer parts of the world, where current computation is limited or inaccessible. Providing the people in these places with a new way to connect and compute could drastically raise living standards, as well as make a path to things like education, industrialization, and digitization. They will truly be able to participate in the world in the same way we are able to right now. “There’s as much social good that could come out of this as business opportunity. You know, IBM creates a mainframe, but I’m not sure how quickly or directly that is translated to helping others.”

The last thing that attracted Tony to Hypernet is the team. He has been in several negotiations with companies which had compelling technology and business models, but he chose to walk out because of the team. In Hypernet, he sees a group of bright, pragmatic, and hard-working individuals who are driven to develop a great product.

You can learn more about Tony by checking out his LinkedIn profile.

Be sure to check out to view all of our advisors. Also, sign up for our newsletter and online community to learn about new advisors as we announce them!

-Your HyperTeam

Quantum Computing vs Hypercomputing: Which is more powerful?


Since its inception, digital computation has been based on a binary language of ones and zeros. These ones and zeros are represented by transistors, which are either in an on state (a one) or an off state (a zero). As computers have advanced, those transistors have become smaller and smaller. A computer with a million transistors used to fill an entire warehouse; now, it fits in your pocket. However, we may be reaching the limits of binary computing. With transistors approaching the size of individual atoms, the time is ripe for an evolutionary leap in computing.

Quantum computing seems to be that evolutionary next step. While quantum computing does not yet seem to be capable of replacing classical computing, it clearly holds promise for surpassing classical computing in terms of raw computational power. As Bernard Marr explains, “We are entering a big data world in which the information we need to store is growing; there is a need for more ones and zeroes and transistors to process them. For the most part, classical computers are limited to doing one thing at a time, so the more complex the problem, the longer it takes. A problem that requires more power and time than today’s computers can accommodate is called an intractable problem. These are the problems that quantum computers are predicted to solve.”

Already, quantum computers are showing great potential. Google has shared insightful details about its quantum computing project, which it runs in partnership with NASA. Its D-Wave 2X quantum computer has reportedly run certain specially constructed problems about 100,000,000 times faster than a conventional single-core chip.

A key question for Hypernet then, and a question we often hear from the community, is:

“Will ultra-powerful quantum computing ever make Hypernet obsolete?”

It’s a valid question, and fortunately it has an easy answer: no. The two perform separate functions: quantum computing is a different type of computation, while Hypernet is a protocol for parallelizing multiple computation nodes.

In fact, the opposite is true. As quantum computing gets stronger, so does Hypernet. A quantum computer could easily be appended to the network, alongside traditional computers. Hypernet’s proprietary DAC algorithm can still be leveraged to synthesize the results of many quantum machines running in parallel.
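As a toy illustration of that last point (our own sketch, not Hypernet's proprietary DAC algorithm), partial results reported by many workers, quantum or classical alike, can be synthesized into a single answer by simple averaging:

```python
# Toy sketch (not Hypernet's actual implementation): independent workers --
# quantum or classical -- each report a partial estimate of the same quantity,
# and the network combines them into one result.
def combine_worker_results(partial_results):
    """Average the partial estimates reported by independent workers."""
    if not partial_results:
        raise ValueError("no worker results to combine")
    return sum(partial_results) / len(partial_results)

# e.g. four workers, each estimating the same quantity with different noise
estimates = [3.9, 4.1, 4.0, 4.0]
combined = combine_worker_results(estimates)
```

The point is only that the aggregation layer is agnostic to what kind of machine produced each partial result.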

Although the future is impossible to predict, we at Hypernet feel well positioned to lead the way in parallel distributed supercomputing — quantum or otherwise.

To learn more about Hypernet’s technology, you can check out our website. Then, be sure to join our Telegram group where you can ask our team questions. For those of you already in there, you can also now subscribe to our new announcement channel.

Hypernet Advisor Bio: Joe Urgo

By | Uncategorized

Hypernet’s first advisor in the blockchain space is Joe Urgo. Joe has been very active in crypto projects for several years after leaving the finance industry, where he was a derivatives analyst. His financial background gave him a unique perspective on Bitcoin’s early price fluctuations, and he quickly recognized blockchain as a technological game changer. He worked at Coinbase for several years before starting his own company. Now, he spends most of his time working on his other project, District0x, and advising blockchain companies, including Republic, Keep, Aragon, Bloom, Status, and more.

Joe was originally attracted to Hypernet due to its high-caliber team. He knew some of the founding members, and when they explained what Hypernet was doing, Joe was immediately on board with the vision. He knew of other distributed computation projects such as Golem, Sonm, and iExec, and he also recognized their limited potential — rendering is great, but not that many people need it. Hypernet’s proprietary DAC algorithm seemed much more generalizable, and overall, much more effective. Furthermore, the fact that most of the research had already been done on the technology meant that this wouldn’t be a non-starter project, since most of the leg-work had already been completed. Given Joe’s penchant for picking good projects to support, and his own success in the space, the Hypernet team was happy to have him join as an advisor.

Over the past several months, Joe has been really excited by the growth of Hypernet, and its focus on fostering a good community. With many projects neglecting the communities that make them possible, he’s happy that Hypernet admins are in the Telegram answering questions and welcoming new community members. He also sees Hypernet laying the foundations for long-term success. With so many projects building up lots of hype with little substance, he sees value in educating community members about Hypernet’s technology, and engaging in marketing in a more holistic and positive manner.

To find out more about Joe, you can check out his LinkedIn profile.

Who Uses Hypernet and Why: Demand Side

By | Uncategorized

Earlier in the week we looked at the people who will power Hypernet, and why they would want to connect their devices to the network. This week, we will take a look at the types of people who will use Hypernet’s computing capabilities — the purchasers.

Hypernet was borne out of a need for a specific type of computational power which didn’t exist in the marketplace at an affordable cost. Ivan Ravlich and Todd Chapman, Hypernet’s CEO and CTO respectively, experienced this dearth of power very acutely while conducting research at Stanford University. This is what spurred them to research new methods of computation to enable data-intensive simulations. Along the way, they have discovered many other use cases for the novel Hypernetwork protocol, and found many other parties interested in utilizing it.

The first type of user will be scientific researchers. Researchers generally have big dreams, but small budgets. Their computing options are limited: they can wait months or years for time on government machines, or purchase time on Amazon or Google servers at prohibitive expense. Hypernet provides a solution to both issues.

Corporations also use large amounts of computational power. For example, modeling market trends, running financial simulations, and even behavior analytics all take massive amounts of computing power. The world’s largest corporations sometimes have in-house computation options, but most companies do not. For those that rely on the cloud, Hypernet could cut computation costs by as much as 80%.

Organizations with sensitive information:
Imagine you run a hospital. You have billions of data-points on thousands of patients. You could use this data to discover new markers for diseases, and predictively analyze who is at risk for certain health issues. However, due to HIPAA, and other healthcare information privacy laws, you can’t efficiently analyze this wealth of life-saving information. Hypernet’s unique algorithm allows for sensitive data to be analyzed and learned from, but in an anonymous and secure way. This applies to anyone with sensitive data who wants to learn from that data; Hypernet could enable a new wave of cyber security and information privacy.
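As a hedged sketch of the idea (the function names and data here are our own illustration, not Hypernet's API), each site can compute a de-identified local aggregate so that raw patient records never leave the premises:

```python
# Illustrative only: each data holder computes a local, de-identified
# aggregate, and only that aggregate -- never the raw patient records --
# is shared for the global computation.
def local_aggregate(records):
    """Summarize raw records locally as (sum, count), with no identifiers."""
    values = [r["value"] for r in records]
    return (sum(values), len(values))

def global_mean(aggregates):
    """Combine per-site (sum, count) aggregates into one global statistic."""
    total = sum(s for s, _ in aggregates)
    count = sum(n for _, n in aggregates)
    return total / count

site_a = [{"patient_id": "a1", "value": 120}, {"patient_id": "a2", "value": 140}]
site_b = [{"patient_id": "b1", "value": 130}]

# Only the (sum, count) pairs leave each site; patient_id values stay local.
mean = global_mean([local_aggregate(site_a), local_aggregate(site_b)])
```

The real protocol would of course need stronger guarantees than this, but the shape of the computation (analyze locally, share only aggregates) is the key idea.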

Individuals far away from computing resources:
Since computation is conducted off the device, Hypernet enables data analysis in remote locations. This means one only needs an internet connection to open up unlimited computation possibilities. Imagine you are the Red Cross, trying to find efficient evacuation routes for a remote village after an eruption. Hypernet could efficiently find those routes using nothing more than a satellite phone with a data connection.

Realistically, Hypernet can be used by anyone who wants computational power and has a connection to the internet. These are only a few profiles of the types of people and organizations interested in purchasing computational power. Within each of these profiles, the user also gains distinct advantages by using Hypernet, as opposed to services offered by other organizations. To learn more, you can subscribe to our blog and newsletter, and join our growing online community where you can ask our team questions and talk to other people interested in Hypernet’s potential.

-Your Hyperteam
Palo Alto, California

Who Will Use Hypernet: Supply Side

By | Uncategorized

In order for Hypernet to work, we need users to join the network. The more users the network accrues, the more powerful it becomes. Although we plan to seed the network with a few industrial computation partners, we believe the long-term strength of Hypernet will come from average users; even the power of industrial computing companies is chump change compared to the combined FLOPS of all the smartphones in the world!

A key question then is: Will users join the network?

Let’s answer this question by first looking at individuals who already lend their processing power to large organizations. The Search for Extraterrestrial Intelligence (SETI) project out of UC Berkeley uses a low-tech program developed in the ’90s to link together 1.7 million individuals’ home computers to search for possible life on other planets. How much do these people get paid to loan out their processors to SETI? Nothing. These altruistic individuals simply believe in the project so much that they offer their processors to UC Berkeley for free.

The Folding@home project out of Stanford University does the same thing, but instead of finding aliens, it folds simulated proteins in order to find new cures for cancer and other diseases. Again, the users (including 15 million PS3 owners) do this free of charge.

What will happen when users can power the same types of research projects, but get paid for it? With much of the world living below the poverty line yet still owning a smartphone or computer, many individuals are ecstatic about the possibility of earning extra income from their latent, otherwise unused devices. The ability to put that computational power toward something as cool as finding aliens, or fighting cancer, is just icing on the cake.

Now, let’s look at a couple of other intriguing technologies: CryptoKitties and Farmville. What do these two online games have to do with Hypernet? The answer is gamification. While we expect the primary motivator for joining the network to be earning HyperTokens, we will also craft the community and network participation to be engaging and fun to be a part of. Games like CryptoKitties and Farmville offer no financial incentive to participate (some may argue they carry a financial disincentive), showing that community and social benefits are strong motivators for participation in large online groups.

To summarize, people have three main motivators for participating in online groups: financial, social, and altruistic. Over time, Hypernet will incorporate aspects of all three of these proven approaches as it grows its user base.

But before any of those users even hear about Hypernet, we already have our early adopter community; we have you. We hope that you will be one of the first people to pledge their device to the Hypernetwork, and help kickstart a revolution in supercomputing. Although we have not yet launched the main net, we are getting closer each day. To keep up with the latest announcements, and to be the first to know when the network launches, please join our Telegram community and sign up for our newsletter. We’ll see you there!

-Your HyperTeam
Palo Alto, California

Hypernet Announces First Round of Advisors

By | Uncategorized

The rumors are true… Hypernet has announced its first three advisors!

Thousands of people have already shown their support for Hypernet, and we are excited to introduce you to three industry leaders who want to make sure we experience long-term success. Although these three individuals are only a part of Hypernet’s full advisory board, they represent our desire to seek out partners across a variety of strategic industries who can help Hypernet expand and grow.

Without further ado, Hypernet would like to introduce:

  • Randall Kaplan, Co-Founder of Akamai Technologies
  • Joseph Urgo, Co-Founder of District0x
  • Tony Reeves, CFO of Global Technology at Experian Plc


Randall is a co-founder of Akamai Technologies, the global leader in Content Delivery Network (CDN) services. Akamai serves nearly 25% of the world’s web traffic, is a member of the S&P 500, employs more than 7,600 people with 64 offices in 28 countries, and had $2.4 billion in 2017 revenues, making it one of only nine U.S.-based software and services companies that exceed $2 billion in annualized revenue. Randall is also the founder and CEO of JUMP Investors, a venture capital firm that also functions as his family office (notable investments include Google and Seagate). Over the past 20 years, Randall has been an advisor to and investor in many successful traditional and crypto companies. He has served on the board of directors of multiple companies, has been an active public speaker, and has mentored more than 100 students through JUMP’s internships.


Joseph is the co-founder of District0x, an Ethereum dApp decentralizing the world’s marketplaces. Prior to this, Joseph founded a consultancy supporting leading Ethereum-based projects. Before blazing his own path, Joe spent three years as an Operations Manager for Coinbase. Prior to Coinbase, he was a derivatives trader for Three Arrows Capital, an international hedge fund based in Singapore.


Tony Reeves is the Chief Financial Officer of Global Technology for Experian plc. As one of his many areas of responsibility, he leads their cloud infrastructure finance strategy. Experian has a revenue of $4.6B, with a presence in 37 countries. Tony has over 35 years of financial experience with a broad array of advanced technologies, from TRW Space and Defense to Experian credit bureau data centers and modernization efforts. His current responsibilities at Experian include financial integrity of enterprise technology investments, and Tony plays a key role in the strategic direction of the technology road map to modernize global credit bureaus. Over the course of his career, Tony has been heavily involved in over 50 acquisitions that have provided global expansion, critical technology capabilities, and new data sources.

What Happens When the Cloud Runs Out?

By | Uncategorized

Exciting new technologies like autonomous vehicles, artificial intelligence, and innovative life-saving medical treatments have at least one critical thing in common — they require massive amounts of computational power to continue to evolve.

Satya Nadella, the CEO of Microsoft, recently declared that the world is quickly “running out of computing capacity.” Research teams working on groundbreaking innovations feel this pain every day. The more computational power they have, the faster they can run calculations and process data in order to engineer solutions to the world’s biggest problems. Despite many advances in recent years, gaining access to massive amounts of computational resources is still difficult and expensive.

Before the cloud:

In the days before cloud computing, teams working on major new technologies would have to build their own data centers to acquire the computational power they needed. This was slow and very, very expensive. The only groups that could afford to do this were huge companies, major universities, and governments.

Cloud computing changed this, making supercomputational power much less expensive and easier to access, thus opening it up to a much wider range of actors. This is one of the most important computing innovations that directly impacts and improves our lives on a daily basis.

Data Centers that power the cloud are still really expensive:

But there’s a catch… The data centers powering the cloud can’t keep up with demand. Creating a single new data center costs hundreds of millions of dollars, requires extended construction time, and has an enormous environmental impact. The big companies that build these data centers aren’t able to make them quickly, cheaply, or efficiently enough to continue enabling the type of innovation we need to solve the world’s biggest problems.

Even though the cloud is much less expensive than the old stand-alone proprietary infrastructure, the costs of creating and maintaining the cloud are still passed on to the customers. The traditional data-center model for providing computational power is becoming outdated, and it’s holding innovation back. This is exactly the challenge we are tackling at Hypernet.

A More Cost Effective, Scalable Approach:

Instead of a future filled with billions of dollars of capital investment, and years of construction, we believe the answer to the computational resource crisis is in the palm of your hand, in your kitchen, in your living room, and on your desk.

Hypernet is a distributed computational network that enables any device (from your mobile phone to your smart refrigerator) to contribute computational power to researchers who need it. We are surrounded by billions of devices that have an incredible amount of unused capacity, and Hypernet is capable of leveraging it.

In the past, attempts to build distributed networks were hampered by roadblocks like latency and network resilience. The Hypernet team has spent years developing a proprietary Distributed Average Consensus (DAC) algorithm framework that overcomes these challenges.

With no massive capital expenditures or long construction times, Hypernet is opening up a virtually unlimited reserve of low cost, infinitely expandable computational power that will fuel the next generation of innovation.

Learn more about Hypernet:

If you’re as excited as we are about changing the world, or want to ask a question about Hypernet, then we invite you to join our Telegram community for the latest information.

Until next time,
Your HyperTeam
-Palo Alto, CA

How Hypernet Compares to Golem, Sonm, iExec, etc..

By | Uncategorized

At first glance, Hypernet may appear similar to other blockchain projects such as Golem, Sonm, and iExec. Indeed, there are many problems which all four projects can solve equally well. But when you pop open the hood, you’ll see that the engine driving Hypernet is fundamentally new, with a brand-new programming model — a programming model more powerful and versatile than the traditional architecture used by other blockchain computing projects.

This programming model is what makes Hypernet different. Existing models are not well suited to solve general computing problems on distributed networks, which constantly have computers popping in and out of a computation. And simply throwing blockchain at the problem doesn’t solve this fundamental constraint. Hypernet has architected and implemented a new programming model beneath the blockchain layer to handle distributed computation problems which require interprocess communication. This is not off-the-shelf tech. We created it ourselves, from the ground up.

Hypernet is based on the principle of Distributed Average Consensus (DAC), and it is the result of years of research, testing, and optimization. DAC+Blockchain allows for the efficient distribution of compute jobs, and effectively manages computers dropping on and off the network. Furthermore, it creates a secure backbone where buyers and providers of computational power can engage with confidence. Both the on-chain (scheduled) and off-chain (DAC) technology layers of Hypernet fit together hand in glove, and are both driven by consensus.
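The core idea of distributed average consensus can be sketched in a few lines (a toy model of ours; Hypernet's actual DAC implementation is proprietary and far more sophisticated): each node repeatedly averages its value with its neighbors' values, and the whole network converges to the global mean without any central coordinator.

```python
# Toy sketch of distributed average consensus (our illustration, not
# Hypernet's proprietary algorithm): nodes only talk to their neighbors,
# yet every node converges to the network-wide average.
def dac_step(values, neighbors):
    """One synchronous round: each node averages itself with its neighbors."""
    return [
        sum(values[j] for j in neighbors[i] | {i}) / (len(neighbors[i]) + 1)
        for i in range(len(values))
    ]

def run_dac(values, neighbors, rounds=100):
    for _ in range(rounds):
        values = dac_step(values, neighbors)
    return values

# A ring of four nodes, each holding a different local result.
neighbors = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {2, 0}}
final = run_dac([1.0, 2.0, 3.0, 4.0], neighbors, rounds=200)
# Every node ends up near the global mean (2.5) with no central server.
```

Because no single machine holds the full picture, a scheme like this tolerates a shifting network far better than a reduction routed through one master node.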

Now, if we examine Golem, Sonm, and iExec in more depth, we can see that they have built (or plan to build) their technical foundations on traditional computing architectures. These architectures were originally developed specifically for use in data centers, and each group has bolted this data center architecture onto a blockchain network (albeit in different manners).

These data center architectures have two unavoidable consequences when applied to a distributed network:

1. The amount of network communication and data transfer overhead is very high.

2. These architectures do not tolerate computers randomly dropping in and out of the network.

These problems arise because in an orderly data center you know the exact topology of the network, and exactly how packets of information flow from processor to processor. Data center architectures are optimized for one topology, and one topology only. This is problematic on a distributed network, where you don’t know the network topology and it’s impossible to know the state and availability of every machine, so the topology constantly changes. If you attempt a parallel compute job that requires any back-and-forth communication between computers, the data reductions will fail and the program will fault. The data reductions are computationally fragile; they can be made more robust, but only at the cost of increased communication overhead. Traditional data center algorithms are therefore expensive to employ over a distributed network, due to the need to transport terabytes upon terabytes of data. And again, they can only handle certain types of tasks to begin with.
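A toy example (entirely our own simplification, not any project's real code) shows why a reduction built for a fixed worker set faults the moment membership changes:

```python
# Toy illustration: a data-center-style reduction assumes a fixed set of
# workers. If any expected worker drops mid-job, the reduction faults.
def fixed_topology_reduce(results_by_worker, expected_workers):
    """Sum partial results, requiring every expected worker to report in."""
    missing = [w for w in expected_workers if w not in results_by_worker]
    if missing:
        raise RuntimeError(f"reduction fault: workers {missing} dropped out")
    return sum(results_by_worker[w] for w in expected_workers)

# All workers present: the reduction succeeds.
ok = fixed_topology_reduce({"w0": 1, "w1": 2, "w2": 3}, ["w0", "w1", "w2"])

# Worker w2 drops mid-job: the very same reduction faults.
try:
    fixed_topology_reduce({"w0": 1, "w1": 2}, ["w0", "w1", "w2"])
    faulted = False
except RuntimeError:
    faulted = True
```

A consensus-based approach sidesteps this by never assuming a fixed roster of participants in the first place.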

This is perhaps why Golem, Sonm, and iExec seem to currently be focused on solving very specific issues. Golem is currently heavily focused on rendering, which is a great application that pairs well with their grid computing architecture. Sonm is primarily adapting existing hub and spoke architectures, with an emphasis on server hosting, to distributed networks. And iExec is focused on decentralized cloud computing, for specific use in certain research applications.

Hypernet wants to do more though. It was originally developed by researchers who did not have the computational availability to solve some of humanity’s most nagging problems. So, they set out to rethink and redesign computation from the ground up. Hypernet was specifically created in order to solve problems which were previously unsolvable, to enable machine learning in a more efficient and sustainable manner, and to support the community of users who will help make it all happen.

As Albert Einstein famously said, “The significant problems we face cannot be solved at the same level of thinking we were at when we created them.” Hypernet is innovating the process of problem solving through parallel distributed computation, in order to create new and effective solutions to the world’s greatest challenges — and we can’t wait to see what problems we can solve when we all work together.

If you’re interested in finding out more, we are always available to answer questions in our Telegram early supporter community. We hope to see you there!

-The HyperTeam
Palo Alto, California

Why Hypernet is Built on the Blockchain

By | Uncategorized

One of the most common questions we get in our online community is, “Why does Hypernet need to be built on the blockchain?”

It’s a fair question. There is such a frenzy surrounding blockchain technology that all you have to do is tack “blockchain” or “crypto” onto something else and people go crazy. Many blockchain projects have proven to be a lot of talk with little substance. We want to address this head-on and be clear about why Hypernet is built on the blockchain, and how it leverages the technology.

The world is running out of computational capacity, which powers innovation in the fields of climate modeling, cancer research, autonomous vehicle development, and many other technologies that will improve people’s lives. We want to ensure everyone has the computing power they need to innovate.

Without blockchain technology it would be nearly impossible to do this effectively. Here are five critical ways blockchain enables Hypernet:

1) Collateral to Protect Against Bad Actors

In order to discourage malicious behavior, both buyers and sellers must stake collateral to join a job contract. Buyers submitting a job to the network post the full cost of the job to the smart contract when it is created. Sellers joining a job contract must also stake collateral, in an amount the buyer can set. This economically incentivizes reliable performance; sellers who fail to perform risk losing their collateral.
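The staking flow above can be sketched in a few lines (our simplification in Python, not Hypernet's actual on-chain contract; all names and the single-seller settlement are illustrative assumptions):

```python
# Hedged sketch of the collateral scheme: the buyer escrows the full job
# cost, sellers stake collateral set by the buyer, and a seller who fails
# to perform forfeits that stake.
class JobContract:
    def __init__(self, job_cost, seller_collateral):
        self.escrow = job_cost              # buyer posts full cost up front
        self.seller_collateral = seller_collateral
        self.stakes = {}                    # seller -> staked amount

    def join(self, seller, stake):
        """A seller joins by posting at least the required collateral."""
        if stake < self.seller_collateral:
            raise ValueError("stake below required collateral")
        self.stakes[seller] = stake

    def settle(self, seller, performed):
        """Pay out on success; slash the stake on failure."""
        stake = self.stakes.pop(seller)
        if performed:
            return stake + self.escrow      # collateral back plus payment
        return 0                            # stake forfeited

contract = JobContract(job_cost=100, seller_collateral=10)
contract.join("seller_a", stake=10)
payout = contract.settle("seller_a", performed=True)
```

A real contract would split escrow among many sellers and handle disputes; the sketch only shows why both sides have skin in the game.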

2) Reputation System

A public ledger provides the accountability and transparency needed to make the Hypernet marketplace healthy. Since transactions are recorded in a smart contract on an open ledger, buyers and sellers are empowered to build reputational capital that will allow them to thrive in the marketplace.
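As an illustration only (not Hypernet's actual reputation mechanics; the ledger layout is our assumption), an open record of completed jobs lets anyone compute a seller's track record:

```python
# Our illustrative sketch: a public ledger of completed jobs lets any
# participant compute a seller's success rate independently.
def reputation(ledger, seller):
    """Fraction of a seller's recorded jobs completed successfully."""
    jobs = [tx for tx in ledger if tx["seller"] == seller]
    if not jobs:
        return 0.0
    return sum(tx["success"] for tx in jobs) / len(jobs)

ledger = [
    {"seller": "node_a", "success": True},
    {"seller": "node_a", "success": True},
    {"seller": "node_a", "success": False},
    {"seller": "node_b", "success": True},
]
score = reputation(ledger, "node_a")   # 2 of 3 jobs succeeded
```

Because the ledger is open, no central authority has to be trusted to keep score honestly.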

3) Network Currency

HyperTokens will serve as a mechanism for buyers to pay sellers for their compute time. Market dynamics will then govern the prioritization of projects on the network. A compute job is prioritized according to its HyperToken price agreed upon by buyers and sellers in the marketplace.
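A minimal sketch of price-based prioritization (our illustration; the job names and prices are invented): jobs with a higher agreed HyperToken price are scheduled first.

```python
import heapq

# Illustrative only: the marketplace prioritizes compute jobs by the
# HyperToken price agreed between buyer and seller -- highest price first.
def schedule(jobs):
    """Return job ids in priority order (highest agreed price first)."""
    heap = [(-price, job_id) for job_id, price in jobs]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[1] for _ in range(len(heap))]

order = schedule([("render", 5), ("genome", 12), ("weather", 8)])
# -> ["genome", "weather", "render"]
```

This is just the standard max-priority-queue pattern; the market sets the priorities, and the scheduler merely respects them.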

4) Voting

In the event that the Hypernet protocol must be changed in the future due to technical demands, the revision will be subject to a voting process. The number of votes wielded by any peer will be proportional to the sum of HyperTokens associated with their account.
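Token-weighted voting can be sketched as follows (our simplification; the peer names and balances are invented):

```python
# Sketch of token-weighted voting as described above: each peer's vote
# counts in proportion to the HyperTokens associated with their account.
def tally(votes, balances):
    """votes: peer -> 'yes'/'no'; balances: peer -> token holdings."""
    totals = {"yes": 0, "no": 0}
    for peer, choice in votes.items():
        totals[choice] += balances[peer]
    return totals

result = tally(
    {"alice": "yes", "bob": "no", "carol": "yes"},
    {"alice": 50, "bob": 120, "carol": 30},
)
# result == {"yes": 80, "no": 120}: bob's larger holdings outweigh two peers
```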

5) Availability Rewards

There may be times when the number of available devices exceeds the number of active compute jobs. In this case, devices can still earn tokens by waiting in the Hypernet “lobby” and remaining online. Availability will be verified by other “lobby” occupants who ping each other to confirm that the devices are actually online, and ready to receive a job. Members who fail the challenge forfeit their collateral to the challenger.
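The lobby liveness check might look something like this (a hypothetical sketch; the function names and the nonce-echo scheme are our assumptions, not the protocol's actual challenge format):

```python
import secrets

# Hypothetical sketch of the lobby availability check: a peer challenges
# another with a random nonce, and a correct echo proves the device is
# online and ready to receive a job.
def challenge(respond):
    """Issue a random nonce; the peer proves liveness by echoing it back."""
    nonce = secrets.token_hex(8)
    return respond(nonce) == nonce

online_peer = lambda nonce: nonce     # echoes correctly -> passes
offline_peer = lambda nonce: None     # never responds properly -> fails

alive = challenge(online_peer)        # peer keeps its collateral
dead = challenge(offline_peer)        # collateral forfeited to challenger
```

Using a fresh random nonce for each challenge prevents a peer from pre-recording a valid response and going offline.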

We built Hypernet with the goal of solving problems and making the world a better place. Blockchain technology is not just a shiny accessory, but rather a critical and necessary component of accomplishing this mission.

If you have questions, or want to learn more, join the conversation going on right now in our Telegram community. And if you want to learn more about the technical details of Hypernet, check out our whitepaper, too.

-Your HyperTeam