With an article that begins with "Cloud computing apps are for suckers. If there is an alternative that runs locally on your own machine, it will always be better," John C. Dvorak seems to be going from "baiting Mac users" to "baiting Google users."
But let's take the argument at face value. Some of the points he makes are good ones – specifically, the ones about performance.
Dvorak writes: "I don't care if you have 30-megabit-per-second service, you'll get flaky performance from most online apps, especially if they're popular. Always remember that your online speed is only as good as the speed at which data is coming at you: The application server may be swamped, and the various nodes along the route could become clogged, too. Nothing is ever as fast as the machine sitting on top of (or beneath) your own desk."

Your desktop is faster than the cloud – that's true – but is your car? Information stored in the cloud can be accessed from any place with a Net connection. Information stored locally can only be accessed locally – well, unless you connect through a VPN or set up a VNC server. But even for those of us who know how to do it, a VNC server is a hassle, and a security risk unless you set it up exactly right. Ninety minutes is horrendous downtime for an enterprise application, and Dvorak is right insofar as any application for which 90 minutes of downtime is unacceptable shouldn't be put in the cloud.
But there are plenty of applications – and for small-to-medium companies, e-mail is one of them – where the losses incurred from 90 minutes of downtime are less than the cost of having a dedicated in-house application installed and maintained on the network. (If the opposite is true, don't use cloud computing; use the in-house application, and keep an eye on how it performs.)
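That tradeoff can be put on the back of an envelope. The sketch below compares expected annual downtime losses against the cost of hosting in-house; every figure in it is a hypothetical placeholder I picked for illustration, not data from the article, so plug in your own numbers.

```python
# Back-of-envelope: cloud downtime losses vs. in-house hosting cost.
# All figures below are hypothetical assumptions for illustration only.

employees = 50
loaded_hourly_rate = 40.0   # assumed $/hour of lost productivity per employee
outages_per_year = 4        # assumed number of 90-minute outages per year
outage_hours = 1.5

# Worst-case expected annual loss: everyone fully idled during each outage.
downtime_loss = employees * loaded_hourly_rate * outage_hours * outages_per_year

# Assumed annual in-house cost: amortized server hardware plus a quarter
# of a $60,000/year administrator's time.
in_house_cost = 3000 + 0.25 * 60000

print(f"Downtime loss: ${downtime_loss:,.0f}/yr")
print(f"In-house cost: ${in_house_cost:,.0f}/yr")
print("Cloud wins" if downtime_loss < in_house_cost else "In-house wins")
```

With these (made-up) numbers the cloud comes out ahead; a larger company or a flakier provider flips the result, which is exactly the "keep an eye on how it performs" point.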
Dvorak also points out that your data is at the mercy of the service provider, and that if the service is cut off, for whatever reason, so is your data. That's true, but if you don't back up your data, it can just as easily be lost to a hard drive crash. In my experience, both are about equally likely to happen.
To Dvorak, "People tend to forget that software is NOT a service; the whole cloud scheme is a scam to lock users into a single product and somehow extract more money from them." There is some element of vendor lock-in, but mostly cloud computing is a way to provide an application at low startup cost in exchange for revenue over time – whether through advertising, in the case of Google's apps, or through a subscription model. Yes, it is very much "renting" rather than "owning," but renting can make good financial sense in many cases.
After that, the arguments get a bit silly.
Dvorak asks: "What happens if the net is attacked and your entire cloud world is gone for days and days? It just happened in the Republic of Georgia, and it can probably happen anywhere."

If the Russians start bombing us, John, I'm sure that the boss will give us a few days off.
And then: "Ask yourself why the heck will we need six-core, high-performance chips if the cloud takes over everything?"

Why do we need six-core, high-performance chips now? On a virtualized server, certainly, we'll need power to spare, but unless you're doing video editing or animation rendering, a six-core chip is probably overkill. And if the big iron stops going into the data centers of big companies (very unlikely), it will pop up in the data centers of the SaaS providers.
When it comes to performance and scalability, absolutely, standard client-server IT applications and local programs are going to have SaaS beat. Final Cut Pro is not going to the cloud. Photoshop isn't going to the cloud (though Photoshop Elements is...). But the key advantage of cloud computing isn't performance or scalability – it's portability. This is the same reason people will pay twice as much for a laptop with the same specs as a desktop computer: mobility matters.