Beta Brain - What’s Latency And What Do You Eat It With?

By Paula, September 13, 2012



OK, here is how I see it. Everyone gets tons of information about bandwidth and how important it is. Business owners will walk around telling anyone willing to listen how high their company's bandwidth is. If bandwidth could be captured in a photo, they would certainly carry it in their wallets and show it off proudly, just like a grandmother with her grandchildren's pictures. Those of you who are in the IT arena have probably experienced this many times: you walk into a client's office and the first thing they want to talk about is their bandwidth. Is it enough? Should we get more? Is it better than our neighbors'? Better than our competitors'? They make it very clear that, whatever you do, you must increase their bandwidth to the maximum available (within a reasonable budget, of course). Those of you who are business owners are probably constantly bombarded by telecommunications companies and equipment manufacturers trying to get you to increase your bandwidth. Apparently, bandwidth to businesses is like shoes to women: they never have enough!
 
However, bandwidth is not all there is to the perceived speed of a network. There is also the forgotten, never-to-be-mentioned, ugly cousin: latency. Network latency is the amount of time it takes for a packet to travel from the originating device to the destination device. End-to-end latency is the cumulative effect of the individual latencies of all the devices along the network path (e.g., workstation, router, ISP link, ISP network). I am not going to try to explain all the technical details because, I must confess, plenty of other people have already done so far better than I ever could, so I will simply refer you to an excellent article written by Stuart Cheshire: “It’s the Latency, Stupid” (http://rescomp.stanford.edu/~cheshire/rants/Latency.html). As he states: “No matter how small the amount of data, for any particular network device there’s always a minimum time that you can never beat. That’s called the latency of the device. For a typical Ethernet connection the latency is usually about 0.3ms (milliseconds — thousandths of a second). For a typical modem link the latency is usually about 100ms, about 300 times worse than Ethernet”.
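
If you want to get a feel for the latency between your office and some remote server, a quick-and-dirty sketch like the one below will do. It is plain Python and simply times a TCP connection setup, which is dominated by one network round trip; the host and port are placeholders I made up for illustration, so point it at whatever you actually care about (your ISP gateway, a prospective cloud data center, and so on).

import socket
import time

# Placeholder target; swap in any host you care about.
# Port 443 is used because it is almost always reachable.
HOST, PORT = "example.com", 443

def connect_latency_ms(host: str, port: int, samples: int = 5) -> float:
    # Rough round-trip latency estimate: time a TCP handshake, in milliseconds.
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=5):
            pass  # we only care about how long the handshake took
        timings.append((time.perf_counter() - start) * 1000)
    return min(timings)  # best case filters out transient noise

if __name__ == "__main__":
    print(f"~{connect_latency_ms(HOST, PORT):.1f} ms round trip to {HOST}")

Run it from your office and then imagine every small request your application makes paying that toll.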
 
You are probably thinking: “And you are telling me this because…?” I promise it is for a good cause. For every person who reads this, absolutely no money will be donated to a low-budget tech film. It can, however, save a small business from a costly mistake. All of this is especially important if you are considering moving your applications to the cloud or moving your existing servers to an external data center. One of the lessons we have learned throughout our years in the IT industry is that the advantages of cloud computing do not apply to every business. It is not a “one size fits all” solution. If your business runs mission-critical applications that require intense disk input/output and low latency, then you may not be a good candidate for the cloud revolution. Move those workloads anyway and your end users could suffer from slower response times due to the increased latency, even if you add all the bandwidth you can afford. Make sure you ask your IT team to do a complete evaluation before proposing a move to the cloud.
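
To see why latency, and not bandwidth, often decides how fast a cloud-hosted application feels, here is a back-of-envelope calculation with hypothetical numbers chosen purely for illustration: an application screen that issues 200 small queries one after another spends about a tenth of a second waiting on a LAN with 0.5 ms round trips, but a full six seconds once the server sits in a data center 30 ms away, and no amount of extra bandwidth changes that.

# Hypothetical numbers, for illustration only.
ROUND_TRIPS_PER_SCREEN = 200   # sequential small queries issued by one screen
LAN_RTT_MS = 0.5               # typical round trip on a local network
CLOUD_RTT_MS = 30.0            # assumed round trip to a remote data center

def wait_time_seconds(round_trips: int, rtt_ms: float) -> float:
    # Time spent purely waiting on the network, ignoring bandwidth entirely.
    return round_trips * rtt_ms / 1000

print(f"On the LAN:   {wait_time_seconds(ROUND_TRIPS_PER_SCREEN, LAN_RTT_MS):.1f} s of waiting")
print(f"In the cloud: {wait_time_seconds(ROUND_TRIPS_PER_SCREEN, CLOUD_RTT_MS):.1f} s of waiting")

The numbers are invented, but the pattern is real: chatty applications pay for every round trip, which is exactly what your IT team should measure before any move.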
