From NASA retiring its 45,000 sq ft data center to the U.S. Federal government saving more than $2.8 billion, many of the biggest data center owners in the world are making a long-term effort to scale their data centers more efficiently by consolidating existing hardware on an ongoing basis. Figuring out whether there is more “juice” in a densely deployed virtual environment may be one of the most challenging, yet expected, progressions in IT operations. The ever-growing demand for more efficient computing power and storage space makes a scalable, elastic data center a necessity.
Navigating Through the Uncertainty Between Public Cloud and Hybrid Cloud
In part 1 of this blog, we explored the basic history of the public cloud. An important question that still needs to be put under the spotlight is “the Why” of public cloud: why do we need it in the first place, and what are the real benefits and risks when we sign up for it? The uncertainty of running everything in the public cloud is not only valid but very real. When applications are virtualized, there is always a trade-off between performance and efficiency, and working through the uncertainty of offloading everything to the cloud is an exciting but complicated process. From SLAs to orchestration tools, the decision to build a public, private, or hybrid cloud has never been more complex.
Topics: Cloud
Virtualization is one of the most important concepts introduced since computers were invented. If you think about it, from cloud computing to crowdsourcing apps, the idea of “shared resources” applies to most hardware and software innovations the IT industry has brought to market since the creation of the Internet.
Topics: Features
Private clouds are great: letting developers self-serve the resources they need instantly has accelerated the development process. One of the challenges with private clouds that we hear about most often is the unpredictable demand that results. In 5.5, Turbonomic has developed an integration with vRealize Automation so that any workload can be deployed in the best way possible at any time.
Topics: Features
The Big Decision of Adopting OpenStack as a Private Cloud
Since VMTurbo began participating in the open source community and supporting the OpenStack private cloud platform, much has changed in the virtualization industry. From innovative foundational hardware such as hyperconverged infrastructure from Nutanix to OpenStack’s projected $3.3 billion market revenue by 2018, there has never been a better time to implement OpenStack for a company or organization seeking a private cloud solution that is reliable, agile, and secure. However, that flexibility comes with complexity, and adopting OpenStack poses significant challenges to the IT organizations that take on a technology designed for people who want absolute control over their private cloud infrastructure.
Topics: Cloud
Why Mitigating Storage and Network Latency Should Be Like Self-driving Automobiles
In today’s Internet age, we implement new technology and build new products to achieve one goal: to run applications faster, more smoothly, and with less “lag”. However, even when we do have the capacity to power our applications, IT admins and developers often battle the elusive frustration of storage and network latency [for some reason compute gets a pass :-) ].
Topics: Networking and SDN, Storage