Artificial Intelligence: What’s Hype and What’s Not
So what is hype and what is real? Let's examine some of the facts:

1. The term artificial intelligence was coined by the computer scientist John McCarthy in 1956, so it's not a new concept. McCarthy believed that "every aspect of learning or any other feature of intelligence can in principle be so precisely described that a machine can be made to simulate it."

2. The cost of memory has plummeted in recent years while computer processing power has skyrocketed. Memory has declined from over $12 a gigabyte in 2000 to approximately 0.4 of a cent in 2017, a 3,000-fold decrease in that short interval. Over the same period, there has been a 10,000-fold increase in processing power. These technical factors are enabling a vast range of outcomes, including AI. The acceleration of change cannot be denied.

3. The invention of the smartphone, coupled with the above technological improvements, has allowed some of the theory that was taught in 1982 to be brought to life at the consumer level for the first time, e.g. Siri and Google Home voice technology. Remember, however, that natural language understanding and voice technology were not invented this year or last year, as the technology "futurists" would have us believe. What has happened is that research already underway in 1982, and for many years beforehand, has now been commercialised and made available at a personal level.

4. We need to distinguish "artificial intelligence" from what is just "smart computing". Computers can now do things faster than ever before, but speed alone is not intelligence. Being able to store large numbers of facts about you in a database and personalise offers at the point of sale is smart, but it's not AI. Almost every startup nowadays proclaims that it uses machine learning and artificial intelligence, but my contention is that most actually don't.

5. Artificial intelligence can be said to be at play when a computer cannot be programmed with all the data needed for certainty, but can use some kind of heuristic to generate an answer that works for practical purposes. A heuristic technique (or heuristic) is an approach to problem solving, learning, or discovery which generates an approximate outcome that isn't guaranteed to be perfect, but is sufficient for the immediate goal. Human intelligence makes use of heuristics, and computers can be programmed to use them too. Take a simple example: you need to paint your house. The painter comes to provide a quote, walks from room to room, and says, "Well, the price to paint the house will be $xx." Why did the painter not measure each room down to the centimetre and calculate the price exactly? Because she uses a heuristic, a "rule of thumb" – a best guess that, in her experience, will be close enough to cover her costs. And over the years, as she paints more and more houses, her best guess becomes even better!
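The painter's rule of thumb can be sketched in a few lines of code. This is a minimal, illustrative example – the class name, the starting rate and the job figures are all invented here, not taken from the article – showing a quote based on a price-per-square-metre guess that is refined into a running average as jobs are completed:

```python
# A sketch of the painter's heuristic: quote from a rough rate per square
# metre, then update that rate after every finished job so the best guess
# improves with experience. All numbers are illustrative.

class PainterHeuristic:
    def __init__(self, rate_per_sqm=15.0):
        self.rate = rate_per_sqm   # current best-guess rate (a prior)
        self.jobs_done = 0

    def quote(self, floor_area_sqm):
        # A best guess, not a centimetre-perfect measurement.
        return round(self.rate * floor_area_sqm, 2)

    def learn(self, floor_area_sqm, actual_cost):
        # Nudge the rate toward what the job really cost per square metre:
        # a running average over completed jobs.
        observed_rate = actual_cost / floor_area_sqm
        self.jobs_done += 1
        self.rate += (observed_rate - self.rate) / self.jobs_done

painter = PainterHeuristic()
estimate = painter.quote(120)          # quick quote for a 120 m² house
painter.learn(120, 2100)               # that job actually cost $2,100
better_estimate = painter.quote(120)   # the next quote reflects experience
```

The point is not the arithmetic but the shape of the idea: no exhaustive measurement, just an approximate rule that gets closer to reality the more it is used.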
Machine learning and artificial intelligence make use of heuristics to generate a best guess that is good enough for real-world problems. A good example is face recognition. When the immigration officer holds your passport near your face to validate your identity at an airport, he is in fact using his own rule of thumb and best guess, since he doesn't have a set of all possible facial images in his brain with which to validate that the passport picture really is a picture of you and not someone else. If we try to have a computer match a passport photo to a traveller, it is impossible to build a database of every face on the planet, and the computer would take too long finding a perfect match anyway. However, it is possible to program heuristics so that the best guess, over time, as more and more faces are matched, comes as close to certainty as possible (but not, in fact, certainty). The same goes for voice recognition.
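The face-matching heuristic can be sketched as follows. This assumes some upstream model has already turned each photo into a numeric feature vector (an "embedding") – the vectors and the 0.9 threshold below are invented for illustration, not a real system's values:

```python
# A sketch of heuristic face matching: compare two feature vectors with
# cosine similarity and accept the match above a threshold. No database
# of every face on the planet, just a best guess with a cut-off.

import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def same_person(passport_vec, camera_vec, threshold=0.9):
    # Above the threshold we accept the match: close to certainty,
    # but never actually certainty.
    return cosine_similarity(passport_vec, camera_vec) >= threshold

passport = [0.12, 0.80, 0.55, 0.21]
camera = [0.10, 0.78, 0.57, 0.25]    # same traveller, slightly different shot
stranger = [0.90, 0.05, 0.10, 0.70]

same_person(passport, camera)    # True: the vectors point the same way
same_person(passport, stranger)  # False: very different vectors
```

Tuning the threshold is exactly the heuristic trade-off described above: too strict and genuine travellers are rejected; too loose and impostors get through. Neither setting gives certainty.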
So, if AI can imitate human judgement using heuristics, will it ravage our workforce? Some doomsayers clearly think so:

"No question, the impact of artificial intelligence and automation will be profound… we need to prepare for a future in which job losses reach 99%" – Calum McClelland, 2016

Seriously? 99%? New technology, all the way back to stone tools and the wheel, has always made some work redundant. A changing workforce as a result of technological change is not new, especially since the advent of the personal computer and the internet. Look at the way industries such as photography, accounting, travel, accommodation, healthcare and financial services have been seriously impacted by technology and automation. Many jobs have been lost – and many new ones that we could not have imagined have been created.

With respect to AI, there will doubtless be many individuals who will have to negotiate the instability and loss that result from machines having human-like capacities for judgement based on heuristics. But on a societal level, the adoption of increasingly sophisticated technology will, as it always has done, open up unimaginable new ventures and activities, with a vast number of new work opportunities.

What do you think? Is the future with AI bright, or is it just hype? Let me know your thoughts via Twitter or LinkedIn.