Utility Computing Delivery Model

According to TechCrunch, Amazon is jumping into the utility computing space. We have been working on our own clustering/grid offering, and the question of “how to price” computing capacity has been a hot topic. We have been focusing on a “50% of the price of the other guys” model, but that will only work for a limited time. The Amazon news starts to define the market. Their pricing is as follows:

  • 10 cents per instance hour
  • 20 cents per GB of bandwidth used
  • 15 cents per GB of storage
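
Plugging those published rates into a quick back-of-the-envelope calculation shows how a monthly bill would add up. This is just a sketch: the workload figures (instance count, hours, bandwidth, storage) are made-up example inputs, not anything from Amazon.

```python
# Back-of-the-envelope monthly EC2 bill at the launch pricing above.
# The workload numbers in the example call are hypothetical.

INSTANCE_HOUR_RATE = 0.10  # dollars per instance-hour
BANDWIDTH_RATE = 0.20      # dollars per GB transferred
STORAGE_RATE = 0.15        # dollars per GB stored

def monthly_cost(instances, hours, bandwidth_gb, storage_gb):
    """Total monthly bill in dollars."""
    return (instances * hours * INSTANCE_HOUR_RATE
            + bandwidth_gb * BANDWIDTH_RATE
            + storage_gb * STORAGE_RATE)

# Example: one instance running 24x7 for a 30-day month,
# moving 50 GB of bandwidth and storing 20 GB.
print(monthly_cost(1, 24 * 30, 50, 20))  # 85.0
```

At these rates, a single always-on instance with modest bandwidth and storage comes in under $100 a month, which is the striking part of the announcement.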

Amazon calls their new service the Elastic Compute Cloud (EC2). Here is how they define it:

“Elastic: Amazon EC2 enables you to increase or decrease capacity within minutes, not hours or days. You can commission one, hundreds or even thousands of server instances simultaneously. Of course, because this is all controlled with web service APIs, your application can automatically scale itself up and down depending on its needs.”

The terms and conditions of the service include some interesting limits: a) no more than one call per second per IP address, b) files sent may not exceed 40K, and c) Amazon cannot be held accountable for service delivery (i.e., you release them from any liability for the service you pay for).

Sun has a competing “pay as you go” computing offering: $1 per CPU-hour for compute, plus $1 per gigabyte per month for storage. IBM offers similar pay-per-use computing at 50 cents per hour.
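
Putting the headline hourly rates side by side makes the gap obvious. A rough sketch only: the units are not strictly comparable (Amazon bills per instance-hour, Sun per CPU-hour), and it ignores bandwidth and storage charges.

```python
# Cost of 1,000 compute hours at each vendor's headline hourly rate.
# Units differ (instance-hour vs. CPU-hour), so treat this as rough.
hourly_rates = {
    "Amazon EC2": 0.10,  # per instance-hour
    "IBM": 0.50,         # per hour
    "Sun Grid": 1.00,    # per CPU-hour
}

HOURS = 1000
for vendor, rate in sorted(hourly_rates.items(), key=lambda kv: kv[1]):
    print(f"{vendor}: ${rate * HOURS:,.2f} for {HOURS} hours")
```

On raw hourly price alone, Amazon undercuts IBM by 5x and Sun by 10x, which is why this announcement starts to define the market.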

http://hilbert.math.uni-mannheim.de/~seiler/cray.jpg

Cool Cray supercomputer on display (note: this has nothing to do with the post, just thought it looked cool). The equivalent computing power doesn’t look half as good as it used to!
