Is IT Dematerializing?
We like to joke about our progressively sleeker and smaller devices, but the truth is that the technology supporting them isn’t shrinking so much as it is vanishing altogether — that’s the idea of dematerialization. Have IT systems already dematerialized, and is that a good thing?
If we take a moment to think back even ten years ago, the world might seem virtually unrecognizable — smartphones were nowhere to be seen back then, and now you can’t be seen without one.
What’s not so easy to see is all the technology that’s become invisible, but which is no less important. For example, electricity (which is actually invisible, most of the time) is a fundamental requirement for everything that we do, but because we typically can’t see it, we don’t think about it. What else do we not consider?
Technologies that now seem revolutionary — VR headsets, wearables, and smartphones — will serve as mere touch-points for the vast quantities of data that they collect. This is dematerialization: the merging of the physical and digital worlds, and it’s rapidly gaining speed in the world of tech.
As we enter into the new year, many of us are probably thinking about where we are in relation to our childhood conceptions of “the future” — are we already there? Is IT reaping the benefits? How far do we need to progress for that to happen, and what consequences will an increasingly immaterial world bring?
As Gartner reported last year, dematerialization is one of three key trends contributing to the “virtuous circle of smart things.” The falling costs of technology both enable more connectivity among a wider variety of digital devices and cause these devices to multiply in number, dramatically increasing their usefulness. The result is a “data of everything” that will eventually prove more valuable than the devices themselves (and the services that they replace).
As Gartner observes elsewhere, such a connected, virtual network will enable CIOs to “seize effective control of the physical product through control of the virtual service.”
To translate that into plain English: for IT, this means that optimizing capacity, solving system problems, and forecasting future needs will become completely automated. IT workers will no longer need to track individual workloads because algorithms will do it for them, making the process invisible.
There are many huge upsides to this shift. For one, as advanced software replaces older, less efficient processes, businesses will be able to cut out many costs of IT entirely (again, data being more valuable than devices). Secondly, IT will soon be able to clearly articulate its value to the business, as well as discover new efficiencies — for instance, it could predict future workload spikes while eliminating overspending.
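To make that workload-spike prediction a little more concrete, here is a minimal, purely illustrative sketch in Python. It fits a linear trend to recent utilization samples and flags whether a capacity threshold will be crossed a few steps ahead — the function name, sample data, and threshold are all invented for illustration, not taken from any particular product.

```python
def predict_spike(samples, steps_ahead, threshold):
    """Least-squares linear extrapolation over evenly spaced samples.

    Returns the projected value `steps_ahead` intervals past the last
    sample, and whether it meets or exceeds the capacity threshold.
    """
    n = len(samples)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(samples) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, samples)) / \
            sum((x - mean_x) ** 2 for x in xs)
    intercept = mean_y - slope * mean_x
    projected = intercept + slope * (n - 1 + steps_ahead)
    return projected, projected >= threshold

# Hypothetical CPU-utilization readings (percent), trending upward.
load = [40, 44, 47, 52, 55, 59]
projected, spike = predict_spike(load, steps_ahead=3, threshold=70)
print(round(projected, 1), spike)  # → 70.4 True
```

Real predictive-analytics tools use far richer models than a straight line, of course, but the shape of the win is the same: the system warns about the spike before a human would have noticed it.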
There are also some (manageable) downsides. As BBVA Innovation observes, the underlying processes of IT will become relatively unknown to workers, which means that they won’t be able to “take it apart and put it back together,” as the heroes of computer science lore do. And as more information is hosted digitally, concerns about privacy and the potential for data abuse will grow.
Realistically, we’re still a ways off from total dematerialization (ten years out? Twenty? Next week?), and it will happen in fits and starts. As a sponsored Wired article notes, there’s a “reverse economies of scale” effect at work here: as physical systems slowly dwindle, they become much more expensive to maintain.
However, in other ways, we’re already there. Automated and predictive analytics have made waves among early adopters, simplifying IT data for business and IT leaders, improving the accuracy and actionability of data insights, and removing human errors from the process.
But will the seams of IT ever truly vanish? One third of wearables, Gartner predicts, will be “inconspicuous to the eye” by 2017 — given rates of change like this, we’re not going to rule out the possibility. For the next few years, then, try not to blink.