Machine Learning Catch-Up


Dataconomy » Machine Learning Newsletter

Big Data News, Events, and Expert Opinion

Excerpts:

How Twitter Handles Your Data

Whenever someone needs to cite an example of the staggering volume of data generated today, Twitter is called upon. It’s no wonder this is the case: every single minute, 649 new users sign up to the service, and an astonishing 342,000 tweets are created. Handling this vast ocean of data, which includes not only text but also hashtags, mentions, videos, links and images, is a daunting task to say the least. This is where Jake Mannix comes in. Formerly a principal software engineer at LinkedIn, Mannix joined Twitter as an Applied Machine Learning Engineer back in 2010. We recently caught up with Mannix to discuss his work with two of the biggest social media networks in the world, as well as how Twitter uses data to identify influencers...
Read on »

Uber Now Knows Where You’re Going Before You Step into the Cab

Uber’s latest predictive model will deduce where you’re headed, and it will be accurate 74% of the time. Uber’s data team used classic Bayesian statistics to set up a model that infers “where people ultimately want to go, rather than where people may specify where they want to be dropped off, in order to get there,” explains Uber’s Ren Lu in a blog post. The post explains the basic method of the research: We took the riding patterns of over 3000 unique riders in San Francisco earlier in 2014 (anonymizing the data to protect privacy). Each of these trips had been “tagged” by the rider: when requesting an Uber, the rider had filled in the destination field. We assumed that this represented the...
Read on »
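
The post doesn’t reproduce Uber’s actual model, but the shape of this kind of Bayesian destination inference is easy to sketch. The Python snippet below is a hypothetical illustration, not Uber’s code: it builds a prior from the destinations a rider has tagged before, multiplies it by a toy hour-of-day likelihood, and ranks candidate destinations by the resulting posterior. All names and numbers are illustrative assumptions.

# Hypothetical sketch of Bayesian destination inference, loosely in the spirit
# of the approach described in the post. One rider, hour of day as the only feature.
from collections import Counter
import math

# Toy trip history for one rider: (hour_of_day, tagged_destination)
history = [
    (8, "office"), (9, "office"), (8, "office"),
    (19, "home"), (20, "home"),
    (22, "bar"), (23, "bar"),
]

def predict_destination(history, current_hour, smoothing=1.0, sigma=2.0):
    """Rank destinations by posterior probability.

    Prior: how often the rider tagged each destination before (Laplace-smoothed).
    Likelihood: a Gaussian over hour of day, centred on the mean hour at which
    the rider previously travelled to that destination.
    """
    counts = Counter(dest for _, dest in history)
    total = sum(counts.values())
    posterior = {}
    for dest, count in counts.items():
        prior = (count + smoothing) / (total + smoothing * len(counts))
        hours = [h for h, d in history if d == dest]
        mean_hour = sum(hours) / len(hours)
        # Toy likelihood; a real model would handle wrap-around at midnight,
        # pickup location, day of week, and so on.
        likelihood = math.exp(-((current_hour - mean_hour) ** 2) / (2 * sigma ** 2))
        posterior[dest] = prior * likelihood
    norm = sum(posterior.values())
    return sorted(((d, p / norm) for d, p in posterior.items()),
                  key=lambda item: item[1], reverse=True)

print(predict_destination(history, current_hour=8))   # "office" should rank first
print(predict_destination(history, current_hour=21))  # "home" and "bar" compete

In a production system the prior and likelihood would presumably be learned from far richer signals, but the basic structure, a posterior proportional to prior times likelihood over candidate destinations, is the same.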

The Future of Machine Learning, According to Cloudera’s Sean Owen

In the first part of our interview with Sean Owen, Cloudera’s Director of Data Science, we discussed the relationship between machine learning and Hadoop, the future of Apache Mahout and why machine learning has become such hot property. In this part of our discussion, we delved into the future of deep learning and neural networks, and how Owen foresees the relationship between machine learning and enterprise evolving. What do you think are some of the main trends in machine learning right now? To be honest, I think machine learning is still an advanced topic for enterprises. The infrastructures of most enterprises are built around reporting and retroactive analytics, and predictive analytics is still considered difficult and expensive. There is some truth to this, but at least we’re finding tools and...
Read on »

Baidu is Readying the World’s Largest Deep Learning System

Baidu is getting closer to finishing the world’s largest computer cluster for deep learning. The system is reported to have 100 billion digitally simulated neural connections, powered by heavy use of graphics processing units (GPUs). To put things in perspective, the human brain has around a hundred trillion neural connections. Five years from now, voice- and picture-based online searches will make up more than half of all queries, overtaking text, owing to the need for easier ways of seeking information through mobile devices, explained Baidu Chief Executive Officer Robin Li. Andrew Ng, the chief scientist at China’s biggest search engine company, Baidu, previously worked on the “Google Brain” project as well as at the Stanford Artificial Intelligence Laboratory before joining Baidu in May. He believes that the ‘Baidu system will be...
Read on »

How Mimicking Brain Function is Revolutionising NLP

Since Microsoft began working with deep learning neural networks in 2009, we’ve seen huge improvements in the way algorithms can process our language and dialogue. IBM have continued to pour money and resources into the development of Watson; Apple have moved the development of Siri in-house to improve its NLP capabilities; and we’ll soon see a version of Skype which can translate the spoken word on the fly. But Francisco Webber, co-founder of cortical.io, noticed a grey area in the realm of natural language processing: most of it is heavily based on statistical analysis. “The problem with statistics,” he says, “is that it’s always right in principle, but it’s not right in the sense that you can’t use it to create an NLP performance that is even close to what a...
Read on »

Former Kaggle Chief Scientist Rolls Out Enlitic to Revolutionise Diagnostic Healthcare with Deep Learning

Last week saw the launch of Enlitic, a company that is using deep learning to revolutionize diagnostic healthcare. With the intention of making the enormous trove of medical data available today accessible to physicians, Enlitic will work with data in the form of “medical images, doctors’ notes, and structured lab tests,” including X-rays, MRIs, and CT scans. “Medical diagnostics is, at its heart, a data problem – turning images, lab tests, patient histories, and so forth into a diagnosis and proposed intervention,” explains Enlitic founder and CEO Jeremy Howard. “Recent applied machine learning breakthroughs, especially using deep learning, have shown that computers can rapidly turn large amounts of data of this kind into deep insights, and find subtle patterns. This is the biggest opportunity for positive impact using data...
Read on »
