Rise of the machines: Why AI in 2017 is going nowhere
Tuesday, January 10, 2017
Although AI was the big buzzword of 2016, few enterprises delivered on it, and 2017 will likely be no different. Machine learning, however, is on the charge…
One of the greatest buzzword fails of 2016 has to have been AI. Although widely promised and mentioned continuously in the media, it produced very few visible successes. While many companies were keen to be associated with cutting-edge AI concepts, building and demonstrating working human-like intelligence has proved problematic.
Of course, the same wasn't true of machine learning (ML); quite the contrary. There were many ML firsts in 2016, including Google DeepMind beating the human world champion at the subtle and complex game of Go, while most tech market followers will have spotted the rise of 'smart' home gadgets such as Google Home and Amazon Echo.
So what does 2017 hold for AI? As 2016 showed, building true AI machines is not going to happen in this 12-month cycle, with expert estimates ranging from an optimistic five years out to a relative eternity of 20 years.
As Ilia Kolochenko, CEO of High-Tech Bridge, has pointed out before: “It’s difficult to find a start-up that doesn’t claim to use it in some way. We should not over-estimate it [machine learning] – we need to understand that it can’t replace humans; instead it can significantly aid human tasks.”
However, machine learning, the process of creating an Artificial Neural Network (ANN) by training a computer on a massive ocean of sample data, is set to make a bigger impact in 2017 and beyond. High-Tech Bridge has been using the technology in its ImmuniWeb web security testing platform for some years, albeit in combination with advanced human augmentation, to deliver intelligent automation of vulnerability scanning. It’s fair to say that competition is set to be intense: one US analyst report puts the overall cognitive systems market at a healthy CAGR of 55.57 per cent over the period 2016-2020.
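To make the idea of "training a computer with sample data" concrete, here is a deliberately tiny sketch: a single artificial neuron learning the logical AND function by gradient descent. This is purely illustrative of the general principle, not a depiction of the ImmuniWeb platform or any production system.

```python
import math

# Toy sample data: inputs and labels for the logical AND function.
samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]  # weights, one per input
b = 0.0         # bias
lr = 0.1        # learning rate

def predict(x):
    """Weighted sum of inputs, squashed to (0, 1) by a sigmoid."""
    s = w[0] * x[0] + w[1] * x[1] + b
    return 1.0 / (1.0 + math.exp(-s))

# Training: repeatedly nudge the weights toward the sample labels
# (gradient descent on the logistic loss).
for _ in range(5000):
    for x, y in samples:
        err = predict(x) - y
        w[0] -= lr * err * x[0]
        w[1] -= lr * err * x[1]
        b -= lr * err

print([round(predict(x)) for x, _ in samples])  # → [0, 0, 0, 1]
```

Real ANNs stack thousands of such units in layers and train on vastly larger datasets, but the loop above is the same basic mechanism: show examples, measure error, adjust weights.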
Several interesting developments look set to help generate this massive growth, one particularly important area being the ever-present hardware barriers. Although training of deep neural networks is currently performed mainly on NVIDIA GPUs, running a ‘trained’ network can be done more efficiently on FPGAs or specialised ASICs, which are seeing rocketing investment and consequent leaps in efficiency and power. As Google posted last year in a blog about its own project to build custom accelerators for machine learning: “The result is called a Tensor Processing Unit (TPU), a custom ASIC we built specifically for machine learning. We’ve been running TPUs inside our data centers for more than a year, and have found them to deliver an order of magnitude better-optimized performance per watt for machine learning. This is roughly equivalent to fast-forwarding technology about seven years into the future (three generations of Moore’s Law).
“TPU is tailored to machine learning applications, allowing the chip to be more tolerant of reduced computational precision, which means it requires fewer transistors per operation. Because of this, we can squeeze more operations per second into the silicon, use more sophisticated and powerful machine learning models and apply these models more quickly, so users get more intelligent results more rapidly.”
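The "reduced computational precision" Google describes can be illustrated with a simple quantization sketch: storing weights as 8-bit integers instead of 32-bit floats, then rescaling at inference time. This is an assumption about the general technique the quote alludes to, not Google's actual TPU design.

```python
import numpy as np

rng = np.random.default_rng(0)
weights = rng.normal(size=1000).astype(np.float32)  # trained float32 weights
inputs = rng.normal(size=1000).astype(np.float32)

# Quantize: map the weight range [-max, max] onto int8's [-127, 127].
scale = np.abs(weights).max() / 127.0
q_weights = np.round(weights / scale).astype(np.int8)  # 4x fewer bits per weight

# Inference with the quantized weights, rescaled back at the end.
approx = (q_weights.astype(np.float32) * scale) @ inputs
exact = weights @ inputs

print(exact, approx, abs(approx - exact))
```

The dot product with quantized weights lands very close to the full-precision result, which is why tolerating lower precision lets hardware pack more (simpler) arithmetic units into the same silicon.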
So it’s highly likely that while 2017 will see considerable advances in the use of ML and ANNs in very specific verticals, such as niche applications in the financial services and healthcare industries, general enterprise adoption will lag somewhat.
However, the security industry is set to be one of ML’s biggest success stories: a recent survey by the IBM Institute for Business Value (IBV) found that only seven per cent of businesses surveyed are working on implementing cognitive-enabled security solutions to improve cybersecurity risk preparedness, but that figure triples to 21 per cent over the next two to three years.
Ironically, 45 per cent of survey respondents list the top adoption challenges as a lack of readiness from a competency perspective and a shortage of the internal skills needed to implement. In short, it could be said that the big challenge for cognitive technologies in 2017 is likely to be HR...