<img alt="" src="https://secure.bomb5mild.com/193737.png" style="display:none;">

Turbonomic Blog

So, what’s the deal with Serverless?

Posted by Eric Wright on May 30, 2016 11:45:09 AM

#Serverless is the new #Docker according to many. The unfortunate thing about the seemingly sudden thrust of event-driven, on-demand computing is that it is being wrapped into a buzzword as if that is all there is to it. Before you dismiss the serverless movement, let’s take a few minutes together and talk about what it is, and what it isn’t.

Serverless Runs on Servers

Spoiler alert! While the pundits and contrarians will jump all over the choice of naming this style of computing as serverless, it does in fact run on servers. Of course it does!! That isn’t the point.

What we are seeing with the shift towards cloud computing is the need to deal with event-driven applications that are given processing time when, and only when, they are used. In other words, don’t leave an application running constantly when it is effectively saying “Anyone there? Anyone there? Anyone there? Anyone there?….” and only springing into action when someone answers.

Think of the simplest analogy. Do you go to your front door every few minutes and look through the peephole or open it up to see if anyone is outside? No, of course not, you have a doorbell. Think of the doorbell as your foray into the concepts of serverless computing. When the doorbell rings, you take action by going to the door to check who is there and opening the door to greet them.

That is event-driven computing at a human level.
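
To put the doorbell analogy in code, here is a minimal sketch in Python contrasting the two models. The check_for_visitor and greet_visitor helpers, the polling interval, and the event payload are all invented for illustration; the (event, context) handler signature follows the convention used by AWS Lambda-style platforms.

    import random
    import time

    # Hypothetical helpers standing in for "look through the peephole"
    # and "greet whoever is there".
    def check_for_visitor():
        return "a neighbor" if random.random() < 0.01 else None

    def greet_visitor(visitor):
        print(f"Hello, {visitor}!")

    # The "Anyone there?" model: an always-on process burning cycles
    # checking for work that usually is not there.
    def polling_listener():
        while True:
            visitor = check_for_visitor()
            if visitor:
                greet_visitor(visitor)
            time.sleep(5)  # idle almost all day, but the server is billed all day

    # The "doorbell" model: nothing runs until the platform invokes this
    # function with an event, and you pay only for that invocation.
    def handler(event, context):
        visitor = event.get("visitor", "someone at the door")
        return {"greeting": f"Hello, {visitor}!"}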

Computing is Expensive Still

There are literally millions upon millions of physical and virtual compute nodes sitting at less than 10% CPU constantly because they need to be on just in case the application starts doing something. There are absolutely valid application architectures that require always-on computing, and that isn’t the issue.

What does become an issue is an event listener running on a virtual server that is entirely idle except when an event occurs, potentially sitting idle for hours a day. Those idle hours are consuming power, cooling, data center space, monitoring resources, and, even more importantly, your hard-earned money.

If you run a t2.nano instance on AWS to host an event-driven application, you are still incurring around $60 per year to run the smallest available server in the Amazon Web Services environment, plus whatever other infrastructure is wrapped around it. While $60 a year doesn’t sound like much, it adds up quickly across the millions of mostly idle instances out there.
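
As a rough back-of-envelope illustration, using the ~$60-per-year figure above and an assumed workload (the event rate and per-event runtime below are invented purely for the example), here is how little of that always-on time an event-driven workload actually uses:

    # Back-of-envelope comparison, using the ~$60/year always-on figure
    # from the text. Event rate and per-event runtime are assumptions.
    HOURS_PER_YEAR = 24 * 365            # 8,760 hours
    always_on_cost = 60.00               # ~$60/year for the idle nano instance

    events_per_day = 100                 # assumed workload
    seconds_per_event = 0.5              # assumed runtime per event
    busy_hours = events_per_day * 365 * seconds_per_event / 3600

    utilization = busy_hours / HOURS_PER_YEAR
    print(f"Busy hours per year: {busy_hours:.1f}")
    print(f"Utilization of the always-on server: {utilization:.4%}")
    print(f"Dollars spent per busy hour: ${always_on_cost / busy_hours:.2f}")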

Serverless is Cool, but It’s Not Replacing Servers Per Se

Serverless computing is about event-driven applications. Continuously available applications are still going to be the majority of computing for quite some time. Just like containers, serverless computing is not a be-all and end-all solution that will unseat traditional computing.

Serverless computing will open the door to reducing costs and increasing the efficiency with which we program and architect our infrastructure.

I was very pleased to be able to talk with Eric Windisch on the GC On-Demand podcast recently on this very subject. Eric’s been on the leading edge of application architecture in a few different ventures, including his startup IOpipe.

Thank you to Eric Windisch for a great chat, and let’s all keep our eyes on the serverless computing architecture to see just what it brings. Like containers, it has an opportunity to shift the way we design and deploy many applications. It won’t remove the need for traditional cloud and virtual machine infrastructure entirely, but it will open the door to saving us all some money, keeping the data center a little greener, and making the cloud a little more on-demand in new ways.

Topics: Servers and Hardware
