The New IT Age
We have a bit of a "New IT Age" - or is that "New Age IT"? - going on. I don't mean that like Charles Araujo's Quantum Age of IT, as a major transformation of how we do IT. I mean it in the same way as "New Age" thinking: i.e. a total abdication of rational or critical thought. Peace, love, dope, and brown rice ... and bad science. We haven't seen crystal IT or aromatherapy IT or holistic pyramid IT, but some days I feel as if it is just a matter of time.
The rising understanding that "IT is the people" is a good thing, and about time. Unfortunately that rising tide of awareness spawns some West Coast group-hug silliness, but I can get over that. My real concern is what can only be called superstition. Arthur C. Clarke said that any sufficiently advanced technology is indistinguishable from magic. Technology is advancing so fast now that people start ascribing magical powers to it, alarmingly reminiscent of the New Age airhead hippies. It usually happens when the technology advances beyond their knowledge of pure science or engineering (just like the New Agers, who can be perfectly intelligent people whose beliefs are driven by ignorance). IT folk can be alarmingly narrow in their knowledge outside of IT: many have a liberal arts education or no tertiary education at all.
Here are three recent examples:
This post on robots, Why Human Competition is the Least You Have to Fear, was on a website dedicated to "Marketing. Social Media. Humanity." which is ringing deafening alarm bells already.
It is an unreasoned mish-mash of pop culture and factoids, i.e. perfect New IT Age material. Its lack of structured content or logical argument actually makes it quite hard to refute (also a common New Age attribute). The post said:
The robot revolution won’t come in the form of Terminators with German-Austrian accents that annihilate you. Noooo. It will quietly creep into your life, eventually replacing you if you’re unwilling to adapt and make your work emotionally-essential. Maybe it’s already happening. Prove that you’re human and make “art” work that only you can make.
The evidence of this advancing enemy? Driverless cars, automatically generated e-books, actual robots in actual industry, and forecasts for "the end of this century" (I sincerely hope I am still working then). Let's pick them off, shall we?
- We've had driverless commercial aircraft and trains in operation for a decade. And we've had driverless cars too, but nobody in their right mind will let them on the roads. The Google car is just another step.
- Anybody can write a script to rip a public-domain PDF, wrap a title page and colophon around it, and upload it to Amazon. BFD.
- We've had robots in the workforce for two decades.
- Anybody who extrapolates a simple curve into the future gets what they deserve.
There will always be jobs for people. Those jobs change, just as they did in the transitions to agricultural, then industrial, and now service economies. Only those who don't grasp basic economics will fail to see that there is always a value exchange between people, whatever form it may take.
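The curve-extrapolation fallacy is easy to demonstrate. Here is a minimal sketch with invented numbers - the "adoption" figures and the ninety-year horizon are my own illustration, not anyone's real forecast:

```python
# A toy illustration of the extrapolation fallacy: fit an exponential
# growth curve to a few early data points, then project it forward.
# All input figures below are invented for the sake of the example.

years = [0, 1, 2, 3, 4]
values = [100, 210, 430, 880, 1790]  # roughly doubling each year

# Fit a growth rate from the first and last points: v = v0 * r**t
r = (values[-1] / values[0]) ** (1 / (years[-1] - years[0]))

def extrapolate(t):
    """Project the fitted curve t years out - the dubious step."""
    return values[0] * r ** t

# Over the fitted range the curve looks tame and matches by construction...
print(round(extrapolate(4)))     # 1790

# ...but projected to "the end of the century" it explodes into nonsense:
# nothing in the fit knows about saturation, resource limits, or any of
# the real-world brakes that flatten every growth curve eventually.
print(f"{extrapolate(90):.3e}")
```

The point of the sketch: the arithmetic is flawless and the conclusion is absurd, because a fitted curve encodes nothing about the limits that eventually bend it.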
Extrapolating curves into the future brings me to my second example: Ray Kurzweil, who has had media mileage again lately, which is no surprise as Kurzweil is more showman than scientist. As I said in another post:
Even a blind squirrel finds a few nuts. But Ray Kurzweil missed the Cloud, Facebook, Twitter, SaaS, phone apps, and Angry Birds.
In case you have been in a cave for a while, Ray Kurzweil has joined Google. According to Ray “In 1999, I said that in about a decade we would see technologies such as self-driving cars and mobile phones that could answer your questions, and people criticized these predictions as unrealistic."
There already were self-driving cars in 1999, so Kurzweil can't have been predicting the concept. He must have been predicting they would be mainstream. Which they aren't...
[And] I have never ever once seen anyone ask their phone a question in my life. If those are Ray's best two predictions, they're lame ones.
Third example of the New IT Age: there is early scientific speculation about using DNA as a computing medium. Right away there is confusion about the use of DNA for computing among the New IT Agers, whose scientific education appears to have come from Star Trek. Let's be clear: it is an encoding material, used in a computer in place of semiconductor transistor gates. That doesn't mean the DNA will be embedded in living creatures. It won't; it will be inside a device, held in a matrix of some sort. (No, a matrix in the normal sense of the word - no leather coats.) It is pure coincidence that DNA occurs in living cells, just as silicon occurs in both computer chips and Coke bottles. A DNA-based computer bears the same relationship to a living thing that your current laptop does to a car windscreen: same material, is all.
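To make the "encoding material" point concrete, here is a toy sketch. The base-to-bit mapping is my own invention for illustration, not any real DNA-storage scheme; it just shows that DNA in this context is an alphabet for data, nothing more:

```python
# Toy sketch of DNA as a storage alphabet: each of the four bases can
# encode two bits, so any byte string maps to a strand of A/C/G/T.
# The particular mapping below is arbitrary - chosen for illustration.
BASES = "ACGT"  # indices 0-3 stand for the two-bit values 00, 01, 10, 11

def encode(data: bytes) -> str:
    """Encode bytes as a string of DNA letters, four bases per byte."""
    strand = []
    for byte in data:
        for shift in (6, 4, 2, 0):            # most significant pair first
            strand.append(BASES[(byte >> shift) & 0b11])
    return "".join(strand)

def decode(strand: str) -> bytes:
    """Reverse the mapping: every four bases back into one byte."""
    out = bytearray()
    for i in range(0, len(strand), 4):
        byte = 0
        for ch in strand[i:i + 4]:
            byte = (byte << 2) | BASES.index(ch)
        out.append(byte)
    return bytes(out)

print(encode(b"IT"))                     # prints 'CAGCCCCA'
assert decode(encode(b"IT")) == b"IT"    # round-trips cleanly
```

The strand 'CAGCCCCA' is no more alive than the hex dump '4954' is. It is data in a four-letter alphabet, which is the whole idea.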
All these examples sound like a bunch of dope-smoking teenagers. "Wow man, imagine if..." "Far out! You're freaking me out man. It could be real!" If schools taught a bit less whale rights or creative dissonance, and a bit more hard science, we'd have fewer of these problems. Beware the New IT Agers: just because you can imagine it (chemically assisted or otherwise) doesn't make it real, nor does it make it become real in the future.