Is Metal As A Service The Next Big Thing For The Cloud?

Guest Author: This week’s blog was brought to us by Graeme Caldwell — Graeme works as an inbound marketer for InterWorx, a revolutionary web hosting control panel for hosts who need scalability and reliability. Follow InterWorx on Twitter at @interworx, Like them on Facebook and check out their blog, http://www.interworx.com/community.

We’re accustomed to thinking of cloud platforms as being irrevocably tied to virtualization. Virtualization — the software representation of hardware — is what has allowed us to build infrastructure and software platforms of exquisite controllability and almost limitless flexibility. In fact, if we’re to believe the cloud’s foundational myth — which is probably just that, a myth — the cloud came about as a way to put virtualization to use in soaking up underutilized server resources.

But really, the cloud is not so much a particular technology as a set of capabilities: on-demand scaling, fast deployment, API control, metered pricing, and so on. You can have the cloud and its service modalities, including Infrastructure-as-a-Service, without the virtualization layer so long as you have an alternative technology that provides many of the same capabilities — or at least enough of them to fulfill the needs of the market while offering a benefit that existing technologies don’t.

Over the last few years we’ve seen the rise of containers, particularly Docker, as a replacement for hypervisor virtualization. Containers are a great replacement for or improvement to Platform-as-a-Service products, but they can’t really replace Infrastructure-as-a-Service. A technology that can replace IaaS in many of its most important roles for a large segment of the user base, and especially for those building private clouds, is the bare metal cloud, which can be used to provide Metal-as-a-Service functionality.

Bare metal clouds are probably best thought of as an enhancement of traditional server clusters. A cluster controller takes care of scalability — new servers can be added to the cluster at will. API control exists in much the same way as with virtualized Infrastructure-as-a-Service. As for on-demand pricing, that’s really a function of how platforms are designed and sold rather than of any specific technology, and it’s not essential for most purposes where long-term hardware stability is more important than fast elastic scaling.
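To make the comparison concrete, here is a minimal, purely illustrative sketch of the cluster-controller pattern described above (all names are hypothetical, not any vendor’s actual API): physical machines are registered with a controller and then allocated and released on demand, much as a virtualized IaaS API hands out instances.

```python
# Illustrative sketch of a bare metal cluster controller (hypothetical names).
# The control surface mirrors virtualized IaaS: register capacity, allocate
# it to tenants on demand, release it back to the pool.

class ClusterController:
    def __init__(self):
        self.servers = {}  # hostname -> "ready" or "allocated"

    def add_server(self, hostname):
        """Register a new physical machine with the cluster."""
        self.servers[hostname] = "ready"

    def allocate(self):
        """Hand a ready machine to a tenant, as an IaaS-style API would."""
        for hostname, status in self.servers.items():
            if status == "ready":
                self.servers[hostname] = "allocated"
                return hostname
        raise RuntimeError("no ready servers; add hardware to scale out")

    def release(self, hostname):
        """Return a machine to the ready pool."""
        self.servers[hostname] = "ready"

controller = ClusterController()
controller.add_server("metal-01")  # scaling out = racking + registering hardware
controller.add_server("metal-02")
first = controller.allocate()
print(first)  # metal-01
```

A real bare metal platform wraps this pattern in a REST API and adds provisioning steps such as network-booting an image onto the allocated machine, but the control surface is essentially the same.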

The most important advantage of bare metal clouds is performance. As the name suggests, operating systems and applications run directly on the hardware without a virtualization layer, or in lightweight containers that offer easy deployment and migration without the overhead of virtualization.

In short, for most applications short of massive scaling on very short timeframes, bare metal clouds and Metal-as-a-Service offerings are likely to be the superior solution for companies that need to extract optimal performance from their hardware without sacrificing flexibility.

Metal-as-a-Service has until now largely been associated with Canonical’s offering of the same name, but the concept has a much wider application, and vendors are entering the bare metal arena from two directions: virtualized cloud providers like IBM, and more traditional server management and clustering solution providers like InterWorx. Companies like France’s Online Labs are leveraging low-powered ARM server clusters to provide Metal-as-a-Service platforms.

Virtualization has always been a stop-gap technology: one that provides capabilities we need, but at a cost in performance and complexity. The move back to bare metal without sacrificing those capabilities is one that will pick up speed in the years to come.

5 Mistakes Most Businesses Make with the Cloud

Guest Author: This week’s blog was brought to us by William Hayles – Will is a technical writer and blogger for Outscale, a leading cloud hosting provider in the USA and France.


The cloud’s awesome, but only if it’s properly implemented. Ask yourself: is your organization using the cloud effectively, or has it committed one of these common mistakes?

As the services and platforms that comprise cloud computing become more widespread, more and more businesses are looking at it as a viable option. And really, why shouldn’t they be? In the right hands, it’s an incredibly powerful technology, allowing for better collaboration, faster development/deployment, and reduced costs all across the board.

Of course, like any technology, the cloud’s only effective if you use it properly. Improperly implemented, a cloud computing solution could actually end up increasing your overall spending, to say nothing of the potential for a data breach associated with an unsecured cloud network.

Today, we’re going to go over a few of the most common mistakes made by first-time cloud adopters – and more importantly, how your organization can avoid making them.

Failure To Understand The Cloud (And Your Needs)

By and large, the most frequent – and most significant – error on the part of cloud adopters is a simple lack of understanding. Perhaps thanks to the culture of buzzwords that’s grown up around the tech industry, many businesses see the cloud in only the vaguest sense. Treating the cloud as a single service leads them to adopt a cloud model that’s ill-suited to their needs.

The truth is, “cloud computing” is a lot more complex than one might expect. It’s a catch-all term, one that covers a wide spectrum of services, from Infrastructure-as-a-Service through Platform-as-a-Service to Software-as-a-Service. It’s thus important that you know the different cloud models available to you, as well as which one best suits your organization’s needs – including capacity requirements.

Thinking Exclusively In The Short-Term

Far too many professionals think only in the short term – what action will make them the greatest profit in the shortest possible time? Approaching the cloud with such a stance is asking for failure. You can’t simply focus on what the cloud can do for you in the immediate future; properly implementing a cloud service model in an organization requires careful planning and a long-term roadmap.

Not Implementing Proper Security

One of the most common arguments against the cloud is that it’s inherently less secure than more traditional computing models. In unskilled hands, that can actually be true. Before settling on a cloud service provider, make sure you understand what areas of security they’re responsible for – and which fall under your purview.

“Security is an afterthought in a lot of scenarios for companies because traditional applications have been hosted behind a firewall,” explained Riverbed Technology’s Technical Director Steve Riley at a recent ITEXPO West panel. “But it no longer can be an afterthought; it has to be part of the deployment and design.”

Taking On Way Too Much At Once

Cloud computing is incredible, as is its potential to improve your organization. Given how much money it can save – and how efficient it can make your business – it can be tempting to try to replace your business’s infrastructure overnight. Don’t do it.

Especially if your business maintains a large network of legacy infrastructure, the cloud needs to be adopted gradually. Start slow, and test small-scale changes before you implement anything major.

Making Foolish Assumptions

There are two assumptions you should never make about the cloud:

  1. That it will instantly solve all your problems.
  2. That your entire organization will be on board with the idea the second you pitch it.

Before you try to bring the cloud into your business, make sure you have a clear idea of what problems you want to address with it. It’s also vital that you discuss the matter with your IT department – not everyone is going to like the idea of a large-scale switch.

“Many existing enterprise organizations, both within their current IT team and across other departments, may not perceive the value of a move to the cloud,” writes Ken Christensen of Datalink. “Be prepared for the culture to push back against the notion of the cloud. In some cases, you may even face active opposition.”

In order to pitch the idea effectively, Christensen advises making your case both specific and measurable. You need to give a clear, concrete demonstration of the value cloud computing holds for your business. Otherwise, you may as well scrap the idea altogether.

Get Your Head In The Cloud

Like any tool, the cloud’s only functional if you know how to use it. It’s not something you can implement halfway, nor can you utilize it without fully understanding what it does. If you try to use the cloud without knowing your organization’s requirements and culture – as well as the underlying technology – then you’re simply asking for trouble.

So, I ask again – is your organization using the cloud effectively? Hopefully now you know the answer.

The Benefits of Virtualization

Last week, we reviewed the basics of virtualization in order to provide an understanding of the technology and how businesses are utilizing it. The buzz surrounding virtualization is only getting stronger as IT professionals look for tools to maximize their budget and productivity. Server virtualization has been a game-changing technology for IT, providing efficiencies and capabilities that just aren’t possible when constrained to a physical environment. There are several benefits to IT professionals and businesses that choose to implement virtualization, including:

Cost Savings: With virtualization, businesses can cut their hardware maintenance costs by lowering the number of physical servers. Several companies are also using virtualization as a means to simplify the ownership and administration of their existing IT servers. In doing so, businesses can decrease operational costs such as staffing, power, backup, and hardware and software maintenance.

Storage Management: By implementing a server consolidation strategy through the use of virtualization, companies can increase their space utilization. The use of external storage has enabled new forms of backup by performing block-level copies of the hard disk drive. And since the storage is centrally arranged, de-duplication of data is now a mainstream technology – either in-line for all data, or in backup to reduce backup sizes.

Traditionally, one physical server runs only one operating system, but virtualization offers the capability to run multiple operating systems on a single server. So if a company has multiple servers each performing a small but vital task, virtualization provides the ability to collapse those servers into a very high-density cluster that uses much less space.
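The space and cost argument is easy to quantify. The figures below are purely hypothetical, but the arithmetic is the standard back-of-the-envelope consolidation estimate:

```python
import math

# Back-of-the-envelope server consolidation estimate (hypothetical figures).
# Many one-task-per-box servers running at low utilization can be collapsed
# onto far fewer virtualization hosts.

physical_servers = 40        # existing lightly loaded boxes
avg_utilization = 0.10       # each box is only 10% busy on average
host_capacity = 0.70         # target: load each virtualization host to 70%

total_load = physical_servers * avg_utilization        # ~4 "server-equivalents"
hosts_needed = math.ceil(total_load / host_capacity)   # round up to whole hosts

print(hosts_needed)  # 6
```

In this sketch, forty lightly loaded boxes collapse onto six virtualization hosts; real capacity planning would also account for peak load, memory, and failover headroom.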

Business Continuity: I could talk about the importance of business continuity until I’m blue in the face – it seems to be underrated by companies until they experience an outage or disruption of service. By that time, it’s too late, and the data has been lost. To ensure this doesn’t happen to your company, use virtualization as your “safety net”.

When you run things in a physical server environment, you generally need to buy an identical set of hardware for your disaster recovery site. If your company is running virtually, there is no need to purchase expensive duplicate equipment as backup. Additionally, virtualization simplifies disaster recovery because there are fewer servers to deal with.

These are only three of the benefits offered by virtualization. Others include increased uptime, better operations through automation, easier scaling, and reduced environmental impact. The growing popularity of virtualization has yet to slow down, with more and more businesses realizing its benefits every day.


Blog Author: Vanessa Hartung

TeraGo Networks Presents: Back to Basics – What is Virtualization?

The term “virtualization” has been generating some buzz in the technology community as IT professionals look for ways to maximize their resources. But what exactly is virtualization? And how can it benefit your business? This blog post breaks down the history and functionality of virtualization.

What is virtualization?

Virtualization refers to technologies designed to provide a layer of abstraction between computer hardware and the software running on it. Because virtualization presents a logical view of computing resources rather than a physical one, it can make a group of servers appear to an operating system as a single pool of computing resources. Virtualization also allows you to run multiple operating systems simultaneously on a single machine.


At its roots, virtualization is essentially partitioning, which divides a single physical server into multiple logical servers. Once the physical server has been divided, each logical server can run an operating system and applications independently.
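As a purely illustrative sketch (the resource figures are invented), partitioning can be modeled as dividing one physical server’s resources into slices, each of which behaves as an independent logical server:

```python
# Illustrative model of partitioning (hypothetical numbers): one physical
# server divided into logical servers, each with its own slice of resources.

physical = {"cpu_cores": 16, "ram_gb": 64}

def partition(server, count):
    """Divide a physical server's resources evenly into `count` logical servers."""
    share = {resource: amount // count for resource, amount in server.items()}
    return [dict(share, name=f"logical-{i}") for i in range(count)]

logical_servers = partition(physical, 4)
for s in logical_servers:
    # each logical server gets 4 cores and 16 GB, and can run its own OS
    print(s["name"], s["cpu_cores"], "cores,", s["ram_gb"], "GB")
```

A real hypervisor is far more flexible than this even split – slices can be sized unevenly and even overcommitted – but the underlying idea is exactly this division of one machine into many.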

Virtualization has actually been around for decades. It was first used in the 1960s as a way to partition large mainframe hardware. Back then, engineers faced some of the same problems we face today, such as underutilized servers. The team at IBM pioneered virtualization by giving engineers the capability to partition mainframes, allowing a single machine to multitask.

After its popularity faded for a long period, virtualization experienced a rebirth in the 1990s, when server virtualization on the Intel-based x86 platform was developed, primarily by VMware. Many other companies have since entered the x86 hardware and software virtualization market, but it was VMware that developed the first hypervisor for the x86 architecture, planting the seeds for the recent virtualization boom.

So what exactly is x86? It’s the generic name for the processor architecture descended from Intel’s original 8086, with the “x” standing in for the varying numbers of its successors. If a computer’s technical specifications state that it’s based on the x86 architecture, it means it uses a processor compatible with that instruction set – whether made by Intel or AMD – rather than an architecture like PowerPC.

One of the factors driving the increased popularity of virtualization is the shrinking availability of data center space. Many companies are also using virtualization as a money-saving initiative: by reducing the number and types of servers that support their business applications, companies can realize significant cost savings.

Next week, we will discuss the benefits and features of virtualization.


Blog Author: Vanessa Hartung
