
OpenNebula vs. OpenStack: User Needs vs. Vendor Driven

How Do You Compare OpenNebula with OpenStack?

This post is a reprint of a post at OpenNebula.org

We've crafted this post to answer a recurring question we've been hearing lately, especially from organizations planning to build their own private cloud:

How do you compare OpenNebula with OpenStack?...

This is indeed a complex question. There is no single answer, because open-source projects and technologies can be compared along several dimensions. But we are not afraid to answer it: the short, tl;dr version is that they represent two different open-source models. While OpenNebula is an open-source effort focused on user needs, OpenStack is a vendor-driven effort.

This is not a question of one being better than the other; they simply represent different approaches. Let us compare both open-source options based on the following criteria: internal organization, governance model, roadmap definition, contributor profile, target user, product, and market competition. Obviously this comparison is biased (there is no way around that), but we have tried to be as neutral as possible.

A. Different Open-Source Project Models
Both projects release code under the liberal Apache 2.0 license, follow a transparent development process with a public roadmap, and use the same license agreement for new contributions. They present significant differences, though, especially in:

  • Internal Organization. While OpenStack comprises many different subprojects (14 at the time of writing this post) aimed at building the different subsystems in a cloud infrastructure, OpenNebula offers a single integrated, comprehensive management platform for all cloud subsystems.
  • Governance Model. The main difference between the two projects is their governance model, mainly regarding the definition of the architecture, the release cycle and the roadmap. While OpenStack is controlled by a Foundation driven by vendors, OpenNebula follows a centralized, "Benevolent Dictator" approach. OpenNebula is managed by a single organization that focuses on the interest of the project and strategically leads it to ensure that it meets users' needs.
  • Roadmap Definition. The OpenNebula roadmap is driven entirely by user needs, with features that meet real demands rather than features that result from an agreement among the different vendors sitting on the project's management board.
  • Contributor Profile. While in OpenNebula most contributions come from users of the software, the contributors to OpenStack are mostly vendors building their own OpenStack-based cloud products. Since we started OpenNebula six years ago, we have wanted users to have a voice in the project and have never privileged contributors over users.

Now the question is,

Why is OpenNebula following a "Benevolent Dictator" management model?

In our view, OpenStack is governed by a consortium of competitors, each trying to create its own product or to provide compatibility for its particular device. This mixture of vendor motivations makes it increasingly difficult for a foundation to meet both the needs of the project and the monetization goals of each vendor. It is also worth noting that many of these vendors offer commercial products that directly compete with OpenStack components.

Traditionally, multi-vendor industrial consortiums are the best approach to commoditize a core component in the long term, mainly when solid base software already exists, but not to bring a complete enterprise-ready solution to market from scratch in the short term. In these situations the addition of more developers and members slows the project down, and the well-known Brooks' law (The Mythical Man-Month) applies at both the development and governance levels. OpenStack is reaching a point where the consensus-based approach has limited the competitiveness of the project.

We believe that a centralized model with strong individual leadership is the best way to quickly build a production-ready, enterprise-class open-source product, mainly in the early stages of a fast-growing market. This is not about being control freaks; we follow this model because we want to create a great product, take responsibility for the entire product, and be responsive to our users. Benevolent-dictator governance is the model followed by other successful open-source projects like Android or the Linux kernel, and, in our view, it is the most effective way to focus on engineering quality, to prioritize user needs, and to ensure long-term support.

The above reasons are the foundation of this claim: OpenNebula is made for users by users, while OpenStack is made for vendors by vendors. This may seem like a daring statement, but we have been following this path for years and haven't observed anything that proves it wrong.

B. Different Cloud Models
Although there are as many ways to understand cloud computing as there are organizations planning to build a cloud, they mostly fall between two extreme cloud models:

  • Enterprise Cloud Model (Datacenter Virtualization): On one side, there are businesses that understand cloud as an extension of virtualization in the datacenter; hence looking for a VMware vCloud-like infrastructure automation tool to orchestrate and simplify the management of the virtualized resources.
  • Public Cloud Model (Infrastructure Provision): On the other side, there are businesses that understand cloud as an AWS-like cloud on-premise; hence looking for a provisioning tool to supply virtualized resources on-demand.

[Figure: Cloud models]

Although OpenStack now tries to be everything for everyone, it was created as an open-source effort to compete against Amazon Web Services (AWS). Therefore, while OpenStack addresses the Infrastructure Provision segment, OpenNebula better meets the needs of the Enterprise Cloud model. Since both tools enable infrastructure cloud computing, there is some overlap in the features they provide. However, each cloud model presents different architectural constraints and requires specialized interfaces, management capabilities and integration support. OpenNebula and OpenStack serve different needs and implement completely different philosophies.

C. Different Product Views
OpenNebula is a single enterprise-ready open-source product, easy to install and operate, with a single installation and upgrade process, a one-stop community, and long-term commercial support. Any organization can use the open-source distribution to build a production cloud and receive best-effort support through the community mailing list. Additionally, any organization can purchase commercial support directly from the developers. The important point is that we do not deliver enterprise editions of the software; we commercially support the community software.
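To give a feel for what "single integrated platform" means in practice, here is a minimal sketch of talking to an OpenNebula front-end through its XML-RPC API, which exposes every subsystem (hosts, VMs, networks, images, users) behind one endpoint. The host name, credentials and the exact shape of the reply below are illustrative assumptions; consult the OpenNebula documentation for your version.

```python
# Minimal sketch: listing VMs through OpenNebula's single XML-RPC endpoint.
# Assumptions: front-end on localhost, default port 2633, hypothetical
# "oneadmin:somepassword" credentials.
import xmlrpc.client

ONE_ENDPOINT = "http://localhost:2633/RPC2"   # one endpoint for the whole cloud
SESSION = "oneadmin:somepassword"             # "user:password" session string

server = xmlrpc.client.ServerProxy(ONE_ENDPOINT)

# one.vmpool.info(session, filter_flag, range_start, range_end, vm_state)
# -2 = all resources, -1/-1 = no range limit, -1 = any VM state
reply = server.one.vmpool.info(SESSION, -2, -1, -1, -1)

success, body = reply[0], reply[1]
if success:
    print(body)                 # XML document describing the VM pool
else:
    print("OpenNebula error:", body)
```

The same session and endpoint are used for managing hosts, networks, images and users, which is what makes a single install and upgrade path possible.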

[Figure: Architectures]

On the other hand, OpenStack comprises many subprojects with different levels of maturity that require complex integration to achieve a functional cloud infrastructure. The growing number of components and subprojects makes their integration and coordination, and the delivery of a single coherent solution, increasingly difficult. No upgrade path is provided if you want to install a new version, and there is no commercial support. Any organization interested in using OpenStack, and requiring commercial support and enterprise maturity, is recommended (by the vendors running the project) to deploy one of the several enterprise distributions.
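For contrast, the sketch below shows the typical flow against an OpenStack cloud of this era: authenticate against the Identity service (Keystone), then use the returned service catalog to reach the Compute service (Nova), which is only one of several separately installed and versioned endpoints (Glance, Neutron, Cinder, and so on). The host name, tenant and credentials are hypothetical, and API details vary between OpenStack releases.

```python
# Minimal sketch: listing servers on an OpenStack cloud by first obtaining a
# token from Keystone (Identity) and then calling Nova (Compute).
# Assumptions: Keystone v2.0 API on host "controller" port 5000, a "demo"
# tenant, hypothetical credentials; details differ across releases.
import json
import urllib.request

KEYSTONE_URL = "http://controller:5000/v2.0/tokens"

auth = json.dumps({
    "auth": {
        "tenantName": "demo",
        "passwordCredentials": {"username": "demo", "password": "secret"},
    }
}).encode()

req = urllib.request.Request(KEYSTONE_URL, data=auth,
                             headers={"Content-Type": "application/json"})
access = json.loads(urllib.request.urlopen(req).read())["access"]
token = access["token"]["id"]

# The service catalog lists the separate per-service endpoints; pick the
# Compute (Nova) endpoint and list servers with the issued token.
compute = next(svc for svc in access["serviceCatalog"]
               if svc["type"] == "compute")["endpoints"][0]["publicURL"]

req = urllib.request.Request(compute + "/servers",
                             headers={"X-Auth-Token": token})
print(json.loads(urllib.request.urlopen(req).read()))
```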

[Figure: Value chain]

From a business perspective, OpenNebula does not compete with OpenStack itself but with the many existing vendor "stacks" based on OpenStack, mainly those from HP, Red Hat and IBM. These enterprise-grade distributions incorporate different versions of the OpenStack components with extended features, custom enhancements and integrations that may erode their compatibility and interoperability. Moreover, many of them include proprietary components and exhibit significant differences in the implementation of critical underlying functionality.

So an organization that chooses OpenStack is actually using proprietary software based on OpenStack, and is locked into that specific distribution, given that the vendor only supports its own stack, not the community version. Even worse, there is no way to migrate to another vendor's distribution. In other words, these distributions do not offer the main benefits of open source: low cost, no lock-in, flexibility and interoperability.

D. A Look to the Future
We expect OpenStack to further fragment into more vendor-specific "stacks" with narrow test matrices and extended proprietary features that lock customers in and do not interoperate well. OpenStack's biggest success is marketing. These vendor "stacks" and cloud providers will continue marketing "OpenStack" as their primary and, in most cases, only differentiator.

However, OpenStack's market penetration is relatively small compared with the investment made by vendors and VCs. These vendor-specific "stacks" are not only competing with OpenNebula, with other open-source cloud management platforms like CloudStack and Eucalyptus, and with proprietary incumbents; they are also competing with one another and with the open-source community itself. All vendors claim to be the OpenStack leader because it is a winner-take-all game: only one of the OpenStack distributions will gain critical mass in public and private clouds. Red Hat, now the dominant contributor to OpenStack, is in our view the only plausible winner.

Don't get us wrong: OpenStack is an open-source project with excellent developers, and some of its components are great from a technology point of view. Because a single cloud management platform cannot be all things to all people, we will see an open-source cloud space with several offerings focused on different environments and/or industries. This is the natural evolution; the same has happened in other markets. OpenNebula and OpenStack will coexist and, in some cases, work together in a broad open cloud ecosystem. In the meantime, we will continue to focus on solving real user needs in innovative ways and on getting our users involved in a fully vendor-agnostic project.

About the Author

Dr. Llorente is Director of the OpenNebula Project and CEO & co-founder at C12G Labs. He is an entrepreneur and researcher in the field of cloud and distributed computing, having managed several international projects and initiatives on Cloud Computing, and authored many articles in the leading journals and proceedings books. Dr. Llorente is one of the pioneers and world's leading authorities on Cloud Computing. He has held several appointments as independent expert and consultant for the European Commission and several companies and national governments. He has given many keynotes and invited talks in the main international events in cloud computing, has served on several Groups of Experts on Cloud Computing convened by international organizations, such as the European Commission and the World Economic Forum, and has contributed to several Cloud Computing panels and roadmaps. He founded and co-chaired the Open Grid Forum Working Group on Open Cloud Computing Interface, and has participated in the main European projects in Cloud Computing. Llorente holds a Ph.D in Computer Science (UCM) and an Executive MBA (IE Business School), and is a Full Professor (Catedratico) and the Head of the Distributed Systems Architecture Group at UCM.
