Dynamic Infrastructure: Networking Industry’s Biggest Hope

I found this fairly technical article on the exciting potential of Infrastructure 2.0 (anyone? No? It was the first I’d heard of it too).

It does look like a big change is on the way, and I for one can’t wait. If you provide Platform as a Service (PaaS) or Infrastructure as a Service (IaaS) solutions, this article gives plenty of food for thought.

This was originally posted by Gregory Ness over at seekingalpha.com, but I found it on another blog, so to give credit, that’s the one I’m linking to 🙂

Here’s an excerpt:

“Dynamic infrastructure will unleash new potentials in the network, from connectivity intelligence (dynamic links and reporting between networks, endpoints and applications) to the rise of IT automation on a scale that few have anticipated. It will unleash new consolidation potentials for virtualized data centres and various forms of cloud computing. It will enable networks to ultimately keep up with increasing change velocities and complexity without a concomitant rise in network management expenses and manual labour risks.

Further down the road there will be even more capabilities emerging from Infrastructure 2.0 as virtualization and cloud payoffs put more pressure on brittle Infrastructure 1.0 networks. The evolution of cloud (James Urquhart calls it a maturity model in his recent CNET piece) will drive new demands on the network for automation.

Cisco is looking to address end-to-end IT automation and virtualization with a combination of partner technologies from the likes of VMware (VMW), and our own successes in the Catalyst and Nexus lines (e.g. the Nexus 1000v). Stay tuned on that front for some eye raising announcements.
– James Urquhart, Cisco, December 7, 2008

Without dynamic infrastructure enabled by automation, the payoff of virtualization and cloud initiatives will be muted in the same way that static security muted the virtualization payoff into a multitude of hypervisor VLANs. Think static pools of dynamic processing power that will eventually be consolidated into ever larger pools, enabling greater consolidation, greater efficiency and bigger payoffs free of the churn and risk of on-going manual intervention. This is the vision of Infrastructure 2.0.



New Year. New President. Any new ideas?

Barack gave his inauguration speech today, and it was very impressive. Unfortunately, it left me feeling depressed about the state of the global economy and the bleak future awaiting us over the coming months. I guess it’s because I’m not an American, and am therefore missing that ‘Yeah, we can do anything’ gene that seems to have been handed out as they disembarked from the Mayflower. It’s a character trait that the rest of the world is both envious of and sickened by. Maybe it’s a jealousy thing.

What I do know is that even if Pres. Obama manages to turn the US economy around, it won’t happen overnight. Most of us are already feeling the effects of recession. At best it’s affecting our spending decisions for holidays, new cars, and gadgets, and at worst people are losing their jobs, and their homes.

So he asked for new ideas. Any ideas. It started me thinking about ways in which we should change our behaviour, practices and decision making in my industry, IT. IT has traditionally been one of the driving forces of the economic boom. The healthy race for technological advances has made everything ever smaller, yet more powerful. For most businesses this technological progression has not gone unnoticed, but it has also failed to deliver any startling benefits. A PC that cost £400 five years ago would have been a fairly good mid-market model, suitable for basic office use. The equivalent PC today still costs around £400, so where are the benefits? OK, so we have nice 19″ LCD displays instead of 17″ CRT monitors, but a PC is still a PC.

The same can be said for server-class computers from vendors like HP and IBM. Five years ago, a company would spend £10,000 on a new database server, plus a further £20,000 to licence the software to run on it. Today, the same purchases are being made, with remarkably similar budgets.

The problem is more to do with the way people expect to use the technology. Ten years ago, you needed a separate server for each application you wanted to run. That old rule often no longer applies, and yet IT departments continue to hold on to the model. The IT teams that have been paying attention to the technology available have already identified that a quad-core CPU (now common even in PCs) is vastly over-powered for most traditional server tasks. These ‘Adaptive Thinkers’ have been quietly deploying virtualization solutions from firms like VMware and Microsoft: hypervisor-based server platforms that can harness the power of these smaller, faster machines in ways that traditional server environments cannot.
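The consolidation case can be sketched with some back-of-envelope arithmetic. Every figure below is an illustrative assumption of mine (host size, per-workload demands, headroom), not vendor data:

```python
# Rough consolidation estimate: how many typical server workloads
# could share one modern virtualization host. All figures are
# illustrative assumptions, not measurements.

def vms_per_host(host_cores, host_ram_gb, vm_cores, vm_ram_gb, headroom=0.8):
    """Return how many VMs fit, keeping 20% of capacity in reserve."""
    by_cpu = int(host_cores * headroom / vm_cores)
    by_ram = int(host_ram_gb * headroom / vm_ram_gb)
    return min(by_cpu, by_ram)  # the tighter resource is the limit

# A quad-core host with 16 GB RAM, versus a light workload that a
# whole physical server used to be dedicated to.
ratio = vms_per_host(host_cores=4, host_ram_gb=16, vm_cores=0.5, vm_ram_gb=2)
print(ratio)  # 6 -> six old physical boxes collapse onto one host
```

Even with deliberately cautious numbers, the ratio shows why a dedicated physical server per application wastes most of a modern CPU.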

If you haven’t already virtualized your IT systems, you’re behind the times. Unfortunately, if you have virtualized, you’re probably still behind the times too. Virtualization is reinventing itself again with a service focus through IaaS (Infrastructure as a Service). VMware vCloud and Microsoft’s Azure cloud platform refocus IT consolidation efforts into the data centre. By providing the environment on a service/rental basis, firms no longer have to look after their own virtualization platforms. That can reduce training costs, support costs and, obviously, capital costs.
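The own-versus-rent trade-off is easy to sanity-check with a quick sketch. The figures here are made-up assumptions for illustration (plug in your own quotes), not real pricing from any provider:

```python
# Back-of-envelope comparison: buying and supporting a server
# yourself versus renting equivalent IaaS capacity. All prices
# are hypothetical placeholders, not vendor quotes.

def owned_cost(capex, support_per_year, years):
    """Total cost of ownership: up-front purchase plus annual support."""
    return capex + support_per_year * years

def rented_cost(monthly_fee, years):
    """Total rental cost: a flat monthly service fee."""
    return monthly_fee * 12 * years

own = owned_cost(capex=10_000, support_per_year=2_000, years=3)
rent = rented_cost(monthly_fee=350, years=3)
print(own, rent, own - rent)  # 16000 12600 3400
```

The point is less the specific saving than the shape of it: rental turns a big capital outlay into a predictable operating expense, which matters in a downturn.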

Given the economic uncertainty ahead, it surely makes sense to take Barack’s advice regarding new ideas and rethink our approach to traditional computing if we are to survive this approaching storm.
