At VMworld this year, both in San Francisco and Barcelona, VMware CEO Pat Gelsinger introduced the concept of the Software-Defined Datacenter (SDDC). This builds on the concept that as more and more of the Data Center becomes virtualized (servers, desktops), delivering greater cost-savings and agility to customers, software-defined automation and functionality (network, security, storage, backup) become the next logical steps to help IT deliver greater value to the business.
As with any new technology or vision, there are many questions about how it will impact the market and how it will affect IT organizations. Wikibon did a nice job providing its view on "Software-led Infrastructure". It's one of many attempts I've seen to put a scope around this concept. Some portions are agreed upon, while others are creating some headaches.
I created this short FAQ to help answer some of those questions:
1. VMware is using a new term, "Software-Defined Datacenter" (SDDC), at the center of the 2012 conference. What is Software-Defined Datacenter? [Steve Herrod blog]
Software-Defined Datacenter is VMware's vision that greater business value can be created from IT when intelligent software is abstracted from standardized hardware. In the simplest technical definition, it is the separation (or abstraction) of the "control plane" (configuration, topology awareness, management, operations) from the "data plane" (moving data, storing data).
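The control-plane/data-plane split described above can be sketched in a few lines of code. This is a purely conceptual toy, not any vendor's API; all class and route names are hypothetical. The point is that policy decisions live in one layer, while the other layer only executes a table handed down to it:

```python
# Toy illustration of control-plane / data-plane separation.
# The control plane computes policy (a forwarding table); the data
# plane only applies that table to move traffic.

class ControlPlane:
    """Holds topology awareness and computes forwarding decisions."""
    def __init__(self):
        self.topology = {}  # destination -> output port

    def add_route(self, destination, port):
        self.topology[destination] = port

    def build_forwarding_table(self):
        # A real SDN controller would run a path-computation algorithm
        # here; this sketch just hands the table down unchanged.
        return dict(self.topology)

class DataPlane:
    """Moves packets using a table pushed down by the control plane."""
    def __init__(self, forwarding_table):
        self.table = forwarding_table

    def forward(self, packet_destination):
        # Pure lookup: no policy decisions are made in this layer.
        return self.table.get(packet_destination, "drop")

control = ControlPlane()
control.add_route("10.0.0.5", "port-1")
control.add_route("10.0.0.9", "port-2")

data = DataPlane(control.build_forwarding_table())
print(data.forward("10.0.0.5"))    # port-1
print(data.forward("172.16.0.1"))  # drop (no policy for this destination)
```

Because the data plane is a dumb, fast lookup, it can run on standardized hardware, while the control plane (the software intelligence) can evolve independently.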
1a. Is there a clear spelling of this term?
Meh. Maybe, but it will have at least 3-5 variations in 2013. Just call it "SDDC" and save yourself a lot of auto-correct headaches.
2. Is there a clear, agreed upon definition (or standard) for Software-Defined Datacenter at this time?
Software-Defined Datacenter is not defined by an existing standards body (e.g. IETF, ITU, NIST); rather, it is a vision for how Data Center environments will become more flexible in responding to business demands. SDDC builds upon the abstraction that server virtualization has created and extends it to broader elements of the Data Center (e.g. network, storage), as well as expanding the role that automation will play in the future.
3. How is "Software-Defined Datacenter" different than "Cloud"?
Cloud (or Cloud Computing) is fundamentally a new operational model for IT, where resources are delivered on-demand. While Cloud uses technologies such as virtualization or converged infrastructure, it's primarily about the shift in delivery and consumption of IT services. Software Defined Data Center is the next evolution of the underlying technology, where software delivers greater levels of intelligence and value, on top of standardized hardware.
4. Does Software-Defined Datacenter eliminate the need for traditional Data Center hardware?
No. There will still be a need for physical servers (CPU, memory), network devices to connect ports and deliver bandwidth, and devices that can store data on flash/disk/tape. But the trend in the industry is that these devices are becoming more standardized on x86 chips, mass-produced memory/disks and mass-produced ASICs. This trend should allow faster, more simplified "fabrics" (interconnecting servers, networks and storage) to be built, with the intelligence for policy, security and operations continuing to move into software, which is faster to develop and adapt to changing business requirements. Leading companies have been shifting their product strategies to embrace this trend for the last few years.
5. Which market segments does Software-Defined Datacenter target, or which use cases?
Software-Defined Datacenter technologies are applicable to markets of all sizes (Enterprise, Mid-Market, Service Provider), but the initial adopters have been large Service Providers attempting to solve challenges with large-scale Data Centers. As the competition for Public and Hybrid Cloud services increases (Amazon, Google, Rackspace, Microsoft, Cloud Service Providers), the need to drive greater operational efficiency, and to reduce associated costs and time-to-market, is pushing them to solve problems in new software-centric ways.
As more Enterprise and Mid-Market customers adopt Private Cloud and deliver IT-as-a-Service, I also expect SDDC technologies to evolve to solve challenges at different scales, as well as user-centric challenges such as BYOD.
6. How will Software-Defined Datacenter impact IT organizations?
Even more than ever, the current era of IT is ultimately defined by rapid change, in terms of new devices (smartphones, tablets), new application consumption models (PaaS, SaaS), or converging technology silos (virtualization, converged infrastructure). Software-Defined Datacenter is the next step in converging functional areas, while attempting to give IT the ability to respond to business challenges faster.
7. Is Software-Defined Datacenter a competitive threat to traditional hardware companies?
As mentioned above, Software-Defined Datacenter does not eliminate the need for physical hardware within the Data Center. Rather it is a vision to enable customers to better take advantage of the trend towards delivering software intelligence on standardized hardware. As with many technology transitions, there are opportunities to evolve technology portfolios, evolve business models and unlock new partnership opportunities.
8. Is Software-Defined Datacenter explicitly linked with open-source technologies such as OpenStack, OpenFlow or Open vSwitch?
While there are open-source projects today that will have an influence on Software-Defined Datacenters, open source is by no means the only delivery mechanism for the technology customers will need for this evolution. A few examples:
OpenFlow is a standards-based protocol for network virtualization that can be implemented by any vendor, for either open-source or commercial products.
"Project Razor" is an open-source project that was jointly created by EMC and Puppet Labs to deliver advanced server and application automation for Data Center and Cloud environments. The software can be used with either commercial products (e.g. VMware vSphere, Cisco UCS) or open-source projects (OpenStack, KVM, CloudFoundry).
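The OpenFlow model mentioned above works by letting a controller program match-action flow entries into switches, which then evaluate packets against those entries in priority order. A minimal sketch of that lookup logic (simplified, hypothetical field names, not the actual OpenFlow wire protocol):

```python
# Sketch of an OpenFlow-style match-action flow table. Entries are
# evaluated in descending priority; the first match decides the action.
# A default (priority 0, empty match) entry punts unknown traffic to
# the controller, which can then install a new flow entry.

flow_table = [
    # (priority, match fields, action)
    (200, {"dst_ip": "10.0.0.5", "tcp_port": 80}, "output:port-1"),
    (100, {"dst_ip": "10.0.0.5"},                 "output:port-2"),
    (0,   {},                                     "send-to-controller"),
]

def lookup(packet):
    """Return the action of the highest-priority matching entry."""
    for _, match, action in sorted(flow_table, key=lambda e: -e[0]):
        if all(packet.get(k) == v for k, v in match.items()):
            return action
    return "drop"

print(lookup({"dst_ip": "10.0.0.5", "tcp_port": 80}))  # output:port-1
print(lookup({"dst_ip": "10.0.0.5", "tcp_port": 22}))  # output:port-2
print(lookup({"dst_ip": "8.8.8.8"}))                   # send-to-controller
```

The same match-action idea underlies both open-source switches (Open vSwitch) and commercial OpenFlow implementations, which is why the protocol itself is vendor-neutral.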
About Brian Gracely A 20-year technology veteran, Brian Gracely is VP of product management at Virtustream. He holds CCIE #3077 and an MBA from Wake Forest University.
Throughout his career Brian has led Cisco, NetApp, EMC and Virtustream into emerging markets and through technology transitions. An active participant in the virtualization and cloud computing communities, his industry viewpoints and writing can also be found on Twitter @bgracely, on his blog Clouds of Change and his podcast The Cloudcast (.net). He is a VMware vExpert and was named a "Top 100" Cloud Computing blogger by Cloud Computing Journal.