Facebook recently announced in a blog post that it will begin using a deep learning engine, developed by its AI research team, to further analyze users’ posts. DeepText is “a deep learning-based text understanding engine that can understand with near-human accuracy the textual content of several thousands of posts per second, spanning more than 20 languages,” the post said. The theory is that precise ad targeting and laser-focused search results will help users find and consume content more effectively.
In other words, Facebook is investing in its search capabilities again. The social media platform says DeepText will be able to handle slang and perform word-sense disambiguation. So if someone uses the word “squash” in a post, Facebook’s AI could determine whether the user means the gourd or the act of crushing something. Facebook says DeepText will also do a better job of weeding out spam. DeepText isn’t too different from Google’s Parsey McParseface (yes, really), a program designed to teach computers to understand human language.
At its core, DeepText is an artificial intelligence-based search engine designed to analyze and understand text, photos and videos. One of the ways it could be extremely useful, Facebook says, is for people looking to buy and sell on the platform. In an initial use case, Facebook gathered data from for-sale posts. So, for example, if you search Facebook for a specific product, you might get recommendations and reviews from your Facebook friends, plus ads from companies offering that product.
But There’s a Downside to Everything
So, with all of this being said, could Facebook be on its way to becoming the new Google? No, and here’s why: Unlike Google, Facebook has never been a leader in the search space, and social media may not even be the ideal platform for such a pursuit. It simply isn’t where users go to shop or to get answers. That doesn’t mean Facebook can’t catch up, but it’s rather unlikely: Google has long led the field in search AI, with a library of research on the topic dating back to 2001, when the company was just a few years old.
No matter how smart and undeniably useful technology becomes, consumers remain lukewarm at best when it comes to information sharing. The fear of a computerized “big brother,” robots that not only read everything you publish but can also understand it on a deeper level, could be a challenge for some users. It’s hard to get consumers behind metadata analysis, especially when it’s obvious that the data is being used to sell to them. Plenty of users already admit that they find Google’s highly targeted ads slightly creepy. Indeed, there’s still something unsettling about Google crawling your e-mails in order to feed you the right ads at the right time, even if it’s sometimes helpful.
Plus, it’s important to keep in mind that DeepText draws only on your own broader personal network, interests and connections to generate results. It will be interesting to see how Facebook uses this technology to improve the user experience.