At VMworld this year, both in San Francisco and Barcelona, VMware CEO Pat Gelsinger introduced the concept of the Software-Defined Datacenter (SDDC). The idea builds on a simple premise: as more and more of the Data Center becomes virtualized (servers, desktops), delivering greater cost savings and agility to customers, software-defined automation and functionality (network, security, storage, backup) become the next logical steps to help IT deliver greater value to the business.
As with any new technology or vision, there are many questions about how it will impact the market and how it will affect IT organizations. Wikibon did a nice job providing their view on "Software-led Infrastructure". It's one of many attempts I've seen to put a scope around this concept. Some portions are broadly agreed upon, while others are creating headaches.
I created this short FAQ to help answer some of those questions:
1. VMware is using a new term, "Software-Defined Datacenter" (SDDC), at the center of the 2012 conference. What is Software-Defined Datacenter? [Steve Herrod blog]. Software Defined Data Center is VMware's vision that greater business value can be created from IT when intelligent software is abstracted from standardized hardware. In the simplest technical definition, it is the separation (or abstraction) of the "control plane" (configuration, topology awareness, management, operations) from the "data plane" (moving data, storing data).
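To make the control-plane/data-plane split more concrete, here is a minimal, hypothetical sketch (the class names and route format are my own illustration, not any vendor's API): the control plane holds topology awareness and policy and computes forwarding state, while the data plane only executes that state as fast as possible.

```python
# Hypothetical sketch of control-plane / data-plane separation.
# The control plane decides *where* traffic should go; the data
# plane only executes those decisions.

class ControlPlane:
    """Holds topology awareness and policy; computes forwarding state."""
    def __init__(self):
        self.routes = {}  # destination -> next hop

    def add_route(self, destination, next_hop):
        self.routes[destination] = next_hop

    def push_state(self, data_plane):
        # Programs the data plane with the computed forwarding table.
        data_plane.forwarding_table = dict(self.routes)

class DataPlane:
    """Moves packets using the state pushed down by the control plane."""
    def __init__(self):
        self.forwarding_table = {}

    def forward(self, packet):
        # Pure lookup-and-move; no policy decisions happen here.
        return self.forwarding_table.get(packet["dst"])  # None = drop

cp, dp = ControlPlane(), DataPlane()
cp.add_route("10.0.0.0/24", "leaf-switch-1")
cp.push_state(dp)
print(dp.forward({"dst": "10.0.0.0/24"}))  # -> leaf-switch-1
```

The point of the separation is that the control plane is just software, so it can run anywhere and evolve quickly, while the data plane can run on standardized, mass-produced hardware.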
1a. Is there a clear spelling of this term?
Meh. Maybe, but it will have at least 3-5 variations in 2013. Just call it "SDDC" and save yourself a lot of auto-correct headaches.
2. Is there a clear, agreed upon definition (or standard) for Software-Defined Datacenter at this time?
Software-Defined Datacenter is not defined by an existing standards body (e.g. IETF, ITU, NIST); rather, it is a vision for how Data Center environments will evolve to become more flexible in responding to business demands. SDDC builds upon the abstraction that server virtualization has created and extends it to broader elements of the Data Center (e.g. network, storage), as well as expanding the role that automation will play in the future.
3. How is "Software-Defined Datacenter" different than "Cloud"?
Cloud (or Cloud Computing) is fundamentally a new operational model for IT, where resources are delivered on-demand. While Cloud uses technologies such as virtualization or converged infrastructure, it's primarily about the shift in delivery and consumption of IT services. Software Defined Data Center is the next evolution of the underlying technology, where software delivers greater levels of intelligence and value, on top of standardized hardware.
4. Does Software-Defined Datacenter eliminate the need for traditional Data Center hardware?
No. There will still be a need for physical servers (CPU, memory), network devices to connect ports and deliver bandwidth, and devices that can store data on flash/disk/tape. But the trend in the industry is that these devices are becoming more standardized around x86 chips, mass-produced memory/disks and mass-produced ASICs. This trend should allow faster, more simplified "fabrics" (interconnecting servers, networks and storage) to be built, with the intelligence for policy, security and operations continuing to move into software, which is faster to develop and adapt to changing business requirements. Leading companies have been shifting their product strategies to embrace this trend for the last few years.
5. Which market segments does Software-Defined Datacenter target, or which use cases?
Software-Defined Datacenter technologies are applicable to markets of all sizes (Enterprise, Mid-Market, Service Provider), but the initial adopters have been large Service Providers attempting to solve the challenges of large-scale Data Centers. As competition for Public and Hybrid Cloud services increases (Amazon, Google, Rackspace, Microsoft, Cloud Service Providers), the need to drive greater operational efficiency, lower costs and faster time-to-market is pushing them to solve problems in new software-centric ways.
As more Enterprise and Mid-Market customers adopt Private Cloud and deliver IT-as-a-Service, I also expect SDDC technologies to evolve to solve challenges at different scales, as well as user-centric challenges such as BYOD.
6. How will Software-Defined Datacenter impact IT organizations?
More than ever, the current era of IT is defined by rapid change: new devices (smartphones, tablets), new application consumption models (PaaS, SaaS), and converging technology silos (virtualization, converged infrastructure). Software-Defined Datacenter is the next step in converging functional areas, while attempting to give IT the ability to respond to business challenges faster.
7. Is Software-Defined Datacenter a competitive threat to traditional hardware companies?
As mentioned above, Software-Defined Datacenter does not eliminate the need for physical hardware within the Data Center. Rather, it is a vision to enable customers to better take advantage of the trend toward delivering software intelligence on standardized hardware. As with many technology transitions, there are opportunities to evolve technology portfolios, evolve business models and unlock new partnership opportunities.
8. Is Software-Defined Datacenter explicitly linked with open-source technologies such as OpenStack, OpenFlow or Open vSwitch?
While there are open-source projects today that will influence Software-Defined Datacenters, they are by no means the only delivery mechanism for customers to obtain the technology needed for this evolution. A few examples:
OpenFlow is a standards-based protocol for network virtualization that can be implemented by any vendor, in either open-source or commercial products (a simplified sketch of the flow-table model follows below).
"Project Razor" is an open-source project that was jointly created by EMC and Puppet Labs to deliver advanced server and application automation for Data Center and Cloud environments. The software can be used with either commercial products (eg. VMware vSphere, Cisco UCS, etc.) or open-source projects (OpenStack, KVM, CloudFoundry)
About Brian Gracely A 19-year technology veteran, Brian Gracely is VP of Solutions Marketing at Virtustream. He holds CCIE #3077 and an MBA from Wake Forest University.
Throughout his career Brian has led Cisco, NetApp, EMC and Virtustream into emerging markets and through technology transitions. An active participant in the virtualization and cloud computing communities, his industry viewpoints and writing can also be found on Twitter @bgracely, on his blog Clouds of Change and his podcast The Cloudcast (.net). He is a VMware vExpert and was named a "Top 100" Cloud Computing blogger by Cloud Computing Journal.