Thursday, May 24

Firms will double their use of machine learning in 2018, says Deloitte


Among large and medium-sized firms, machine learning implementations and pilot projects will double in 2018 compared with 2017, and double again by 2020, according to a Deloitte Global report.

Further, with enabling technologies such as machine learning application programming interfaces (APIs) and specialized hardware available in the cloud, these advances will be within reach of small companies as well as large ones.
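As a rough illustration of how low that barrier has become, here is a minimal Python sketch of calling a cloud-hosted ML API. The endpoint URL, key, and response field are hypothetical stand-ins for whichever commercial service a firm subscribes to, not any specific vendor's API.

```python
# Minimal sketch of calling a cloud-hosted ML API for sentiment scoring.
# The endpoint URL, API key, and response field below are hypothetical
# stand-ins for whichever commercial ML service a firm subscribes to.
import requests

API_KEY = "your-api-key"  # issued by the cloud provider
ENDPOINT = "https://ml.example.com/v1/sentiment"  # hypothetical endpoint

def score_sentiment(text: str) -> float:
    """Send text to the hosted model and return a sentiment score."""
    response = requests.post(
        ENDPOINT,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"text": text},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()["score"]  # hypothetical response field

if __name__ == "__main__":
    print(score_sentiment("Quarterly earnings beat expectations."))
```

A few lines of HTTP is the whole integration: no model training, no specialized hardware on premises.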

 

Read about APIs for machine learning in finance

 

From hype to reality in five vectors

Deloitte noted that despite the excitement over ML and cognitive technologies, and the aggressive forecasts for investment in these technologies, most enterprises using ML have only a handful of deployments and pilots underway.

The firm has identified five “vectors” of progress that may change all that: automating data science, reducing the need for training data, accelerating training, explaining results, and deploying models locally.
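These vectors are directions of research rather than a single product, but the first, automating data science, already has simple off-the-shelf approximations. As a sketch, the following uses scikit-learn's GridSearchCV to search model configurations automatically, a modest stand-in for the fuller AutoML tooling the report alludes to.

```python
# Rough illustration of "automating data science": automated hyperparameter
# search with scikit-learn, a simple stand-in for full AutoML tooling.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = load_breast_cancer(return_X_y=True)

# The search explores model configurations automatically instead of a
# data scientist tuning each one by hand.
search = GridSearchCV(
    RandomForestClassifier(random_state=0),
    param_grid={"n_estimators": [50, 100], "max_depth": [4, 8, None]},
    cv=5,
)
search.fit(X, y)
print("best params:", search.best_params_)
print("best CV accuracy:", round(search.best_score_, 3))
```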

“Progress along these vectors should lead to greater investment in ML and more intensive use within enterprises,” said the report.

This should cause enterprises to double the number of ML pilots and deployments by the end of 2018.

By then, over two-thirds of large companies working with machine learning may have 10 or more implementations and a similar number of pilots, Deloitte predicted.

Interpretability as a vector

One of the big criticisms of machine learning in finance is its lack of interpretability: it is often not possible to explain with confidence how an algorithm arrives at its decisions.

 

Read a MarketBrains interview on the “black box” problem with Paul Walker, formerly of Goldman Sachs’ technology division

 

In the US, the financial services industry adheres to supervisory guidance on model risk management (notably the Federal Reserve’s SR 11-7) that, among other things, requires that model behavior can be explained.

There’s been some headway: MIT researchers have demonstrated a method of training a neural network that delivered accurate predictions and the rationales for those predictions.
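The MIT technique itself is research work, but simpler interpretability methods are already easy to try. The sketch below uses permutation importance from scikit-learn, a different and much simpler technique that estimates which inputs a trained model actually relies on; it is illustrative of the category, not a reproduction of the MIT approach.

```python
# Sketch of one mainstream interpretability technique: permutation
# importance, which measures how much a trained model's accuracy drops
# when each input feature is shuffled. A simpler technique than the MIT
# rationale work described above, not a reproduction of it.
from sklearn.datasets import load_wine
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)

# Shuffle each feature in turn and record the drop in test accuracy.
result = permutation_importance(
    model, X_test, y_test, n_repeats=10, random_state=0
)
for idx in result.importances_mean.argsort()[::-1][:5]:
    print(f"feature {idx}: importance {result.importances_mean[idx]:.3f}")
```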

Companies should “explore state-of-the-art techniques for improving interpretability that may not yet be in the commercial mainstream, as interpretability of ML is still in its early days,” the report noted.

FPGAs and ASICs could blow ML wide open

Big changes in the chips that power machine learning are likely to cause big changes in the industry.

When the industry moved from CPU-only to CPU-plus-GPU solutions, machine learning exploded in usefulness and ubiquity; chips that are 10 to 50 times better at the job tend to do that, said Deloitte in a separate report.
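The scale of that jump is easy to demonstrate. Below is a micro-benchmark sketch in PyTorch timing the same matrix multiplication on CPU and GPU; the exact speedup depends heavily on the hardware and workload, so treat the numbers as illustrative only.

```python
# Micro-benchmark sketch: the same matrix multiplication on CPU vs. GPU.
# Actual speedups vary widely with hardware and workload; this only
# illustrates the kind of gap the CPU-to-GPU transition opened up.
import time
import torch

N = 4096
a = torch.randn(N, N)
b = torch.randn(N, N)

start = time.perf_counter()
torch.matmul(a, b)
cpu_secs = time.perf_counter() - start
print(f"CPU: {cpu_secs:.3f} s")

if torch.cuda.is_available():
    a_gpu, b_gpu = a.cuda(), b.cuda()
    torch.cuda.synchronize()  # finish the copies before timing
    start = time.perf_counter()
    torch.matmul(a_gpu, b_gpu)
    torch.cuda.synchronize()  # GPU kernels launch asynchronously
    gpu_secs = time.perf_counter() - start
    print(f"GPU: {gpu_secs:.3f} s  ({cpu_secs / gpu_secs:.0f}x faster)")
```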

Enter FPGAs (field-programmable gate arrays), which AWS and Baidu are using for machine learning in data centers, and ASICs (application-specific integrated circuits), such as Google’s Tensor Processing Unit.

If FPGA and ASIC chips do what GPUs did, delivering order-of-magnitude improvements in processing speed, efficiency, price, or some combination thereof, a similar explosion in utility and adoption seems probable, the firm added.

Deloitte Global predicts that by the end of 2018, over 25 percent of all chips used to accelerate machine learning in the data center will be FPGAs and ASICs.

But the gain will come from machine learning getting cheaper to run rather than from better or more accurate results, and that alone is enough to give a major boost to adoption of these advanced technologies.

“If the only accomplishment of these new chips is to make machine learning 10, 100 or 1,000 times less expensive, that could be more revolutionary than it seems,” the report noted.
