“Big Data” and the “Cloud” are Rip-offs, Moore & Metcalfe should sue!

In the midst of the hyperactive dialogue about the incredible technology revolution, new ideas are rare and rip-offs are common. I’m not against new models and jargon – just ask my colleagues. But it discredits today’s technology thought leaders when they coin a new term that is a pure derivative and fail to deliver proper attribution. Students get expelled for such omissions. Luckily, tech visionaries are held to lower standards.

“Big Data” and the “Cloud” are being promoted everywhere as dramatic new models when in fact they are the exact outcomes suggested by Moore’s law (1965) and Metcalfe’s law (1980). By failing to make this attribution, the biggest insights about Big Data and the Cloud are lost.

Gordon Moore’s observation about semiconductor circuit density seems as forgotten as Comdex – perhaps for the same reason. The PC revolution made Moore’s law personal and inescapable. Each year we waited breathlessly for the latest CPU so we could run Lotus 1-2-3, run a GUI, run two programs at once, play a game, play a game with graphics, or even get Windows to boot in less than an hour! We rushed to Comdex to see the miracle of the latest CPU and the real effect it would have on our lives. Today we have passed the point on Moore’s curve where the CPU defines the personal experience – that evolution is now driven by many other system components. With no direct personal effect of Moore’s law, we stopped watching.

But Moore’s law is still with us: the declining cost of CPU cycles continues unabated, with impacts far more profound than personal electronics. Big Data is only relevant because we can afford the CPU cycles to crunch it.
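Both laws are, at bottom, compounding arithmetic. Here is a back-of-the-envelope sketch – the two-year doubling period and the node counts are illustrative simplifications, not figures from this post:

```python
# Back-of-the-envelope arithmetic behind both laws.
# Assumptions (simplifications, not claims from the post):
#   - Moore: a fixed amount of computation halves in cost every ~2 years.
#   - Metcalfe: a network's value scales with its possible links, n*(n-1)/2.

def moore_cost_factor(years, doubling_period=2.0):
    """Relative cost of a fixed workload after `years`, versus today."""
    return 0.5 ** (years / doubling_period)

def metcalfe_links(n):
    """Distinct pairwise connections among n nodes (grows like n**2)."""
    return n * (n - 1) // 2

# Forty years of Moore's law: roughly a millionfold cost drop.
print(f"Cost after 40 years: {moore_cost_factor(40):.1e}x")  # ~9.5e-07

# Doubling the nodes roughly quadruples the connections.
for n in (2, 4, 8, 16):
    print(n, metcalfe_links(n))  # 1, 6, 28, 120
```

That millionfold drop is exactly why the “which piles can we afford to crunch?” question below now has a different answer, and the quadratic link count is the “one, two, four, eight… infinity” progression Metcalfe described.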
We have had Big Data since we invented computers – the only question was, “which piles of data can we afford to crunch?” Forty years ago, we could only afford to crunch the big piles of data required to understand whether Apollo 13 would return to Earth or whether you were cheating the IRS out of some money. Today we can afford to crunch 100% of the tweets in real time during the Super Bowl to figure out whether affluent 30-year-old women like the color of the new Diet Coke can. Thanks to Moore, we can now afford to ask almost any question and expect to find an answer.

And how did we collect all this Big Data and feed it to the CPUs? Hipsters say, “the Social Cloud, dude.” I say, thank you, Robert Metcalfe, for describing the Cloud and valuing Facebook 30 years ago. Virtually unlimited free CPU cycles connected with effectively unlimited free bandwidth to every single data source on the globe – stationary or mobile. Metcalfe told us where we were going: first one connection, then two, then four, then eight… then infinity.

Over three decades ago, Metcalfe and Moore told us it was going to get intense and profound – and we shouldn’t forget their insight when trying to describe the future. You might rise up in defense of the careless Twitter star (whose Klout is off the charts) and claim, “the world is changing so quickly, we can’t constrain our ‘genius du jour’ with such cumbersome discipline!” If so, then shame on you. Searching prior work has never been easier – that Google thing is pretty cool; you should try it.

This isn’t just about giving credit where credit is due – it is about holding our thought leaders to a higher standard of insight. Pretending that the “Cloud” and “Big Data” sprang up from nowhere may be good for getting CIOs, conference attendees and VCs all lathered up – but this creationist approach inhibits a deeper, more thoughtful dialogue about where this evolution is taking all of us.

Think I’m dithering?
In my next piece, I’ll prove that Moore and Metcalfe caused the financial meltdown of 2008… and their insight suggests that there’s more to come.

Posted on February 7, 2012 in Insights, Technology Industry