Most people, I think, are aware of the Internet of Things (the capitals are compulsory, apparently): the idea that, for example, your heating can be connected to the internet so you can switch it on via your smartphone, or now your Apple Watch, on the way home. Given the weather we’ve had in the last few months, I can certainly see the attraction of that sort of system.
Now, it appears we must get used to the Internet of Everything. According to the world power that is Cisco, this is defined as ‘the business of connecting people, processes, data and devices without the need for human interaction, to create a more efficient digital world’ (my italics).
In fact, the idea of the Internet of Everything has been around for some considerable time. Google the phrase and you’ll find that it’s been used for several years, and although the idea of the Internet of Things is only now being discussed in ‘normal’ newspapers, the IT industry has been talking about ‘connected devices’ since the 1990s. What we are seeing now is the popular market starting to catch up with the cutting edge of IT. It will certainly take time for this to become a surge of demand, but in the meantime the digital world has a golden opportunity to design, develop and market the ‘Things of Internet’, neatly inverting the phrase to reflect technological and commercial reality.
And what Things they will be. Literally, and this is where Cisco is right, (almost) anything and everything can be brought into this world.
The various online articles on this subject tend to focus on devices that can be connected to tablets and phones (the Internet of Things) rather than the bigger concept defined by Cisco. The example of central heating that is remotely switched on by human intervention is now yesterday’s news. Instead, we’ll have fridges that monitor dairy products and place an online order for replacements, without any need for a human being to sniff the milk or put an egg in a pan of water to see if it is ‘off’. In Africa, and elsewhere, digital tagging devices could be used to monitor elephants, rhinos and other animals that are under such threat from poachers. Healthcare companies already produce wearable technology that monitors various vital functions, and the UK government wants to nearly double the number of people currently monitored by ‘telecare’ systems, to three million by 2017.
All of this, and more, creates opportunity for the IT industry. It reinforces the need for real societal and governmental pressure on the education system to produce more of the talented people we need. It will also mean more integration between the computing industry and other areas of manufacturing. We’re already used to seeing computers in cars, but in the USA last autumn AT&T opened a foundry that provides an environment where carmakers can devise the next generation of ‘smart’ cars (and I don’t mean those diddy ones you can park sideways).
Technology has, of course, been increasingly used in recruitment for many years, with video interviewing, applicant tracking systems and the like now standard fare. It has not, as yet, replaced the human element in recruitment, and, as far as I know, is unlikely to be able to do so for some time. There will always be some things that a human can do better than a computer – won’t there?
Andrew Finlayson, Associate Director, Be-IT Resourcing