What I am about to explain may seem more science fiction than science fact but indulge me for a second.
In the not-too-distant future, the internet will be able to read your mind.
I haven’t just eaten a large slice of ‘space cake’, and I’m not even that mad. But if you follow search and social patents with any degree of interest, you’ll know there is a lot of activity right now around how to better understand emotional interaction as the ultimate ‘metric’ for measuring the value and quality of content.
And while understanding your reaction to what you see may seem pie-in-the-sky, it’s actually not as far away as you might think.
That transformational tech has monumental repercussions for how search and social algorithms could work in the future.
Over at MIT in the USA, a team of neurologists and tech experts is already beginning to figure it out. Led by scientist Rosalind Picard, the Affective Computing Research Group has developed technology already being used by major brands to test the emotional connection of new advertising with potential viewers and consumers.
Using inbuilt cameras and a new kind of sensor that picks up on ‘electrodermal response’, their tech reads facial expressions and connects them to brain responses to work out what we really think, or feel, about what we are seeing. Clearly this has huge potential benefits for agencies creating content…
The Facebook Like button is one of the most celebrated pieces of the Facebook success story. And it gives the Palo Alto company mountains of personalization data from which it can serve you more of the good stuff and less of the people and content you don’t like so much. But in the context of real-world emotional understanding, it is the equivalent of trying to perform keyhole brain surgery with a mallet.
As humans we don’t simply press a button to inform those around us of our emotional state, do we? Instead we use those telltale and universal facial expressions to share our feelings.
It is not difficult, therefore, to see how the world would change were our computers and mobile devices equipped with technology that can read those subtle signs and inform our content choices as a result.
A smile would tell a search engine or social network that we should probably be seeing more of whatever it is we are looking at.
The implications are immense. That level of data would take personalization to a new level, and would also mean the death of an icon – that Like button we mentioned earlier.
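To make the idea concrete, here is a minimal, purely hypothetical sketch of how an emotion signal could be blended into a personalized ranking. The `smile_score` field, the example items and the weighting are all invented for illustration – this is not any real Facebook or Google mechanism.

```python
# Hypothetical sketch: re-ranking a content feed by an observed emotion signal.
# "smile_score" stands in for whatever a camera/sensor pipeline might report;
# the values below are invented for illustration.

def rerank(items, weight=0.5):
    """Blend each item's base relevance with its emotion signal and sort."""
    return sorted(
        items,
        key=lambda item: item["relevance"] + weight * item["smile_score"],
        reverse=True,
    )

feed = [
    {"title": "Quarterly report", "relevance": 0.9, "smile_score": 0.1},
    {"title": "Cat meme",         "relevance": 0.6, "smile_score": 0.9},
]

for item in rerank(feed):
    print(item["title"])  # the meme overtakes the report once smiles count
```

With the emotion weight set to zero the ranking falls back to plain relevance, which is roughly where search and social algorithms sit today; raising the weight is what would make a smile start to outrank a link.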
For Facebook it would turbocharge EdgeRank and may put the company in pole position in the race to provide a unique and personalized content experience for everyone. It would mean the Palo Alto company owned not just the biggest audience online but also the tools to bring that audience the content that matters.
As recently as early April, Facebook announced the first ‘live’ step in this long process by introducing ‘emoticon-based status updates’ to its main feed.
While this is pretty clunky right now, it proves that Facebook is very interested in adding another layer to its ever-expanding data pile: that of how we feel. Why? Because ad dollars will flock to that like ducks to water, and as marketers we can use such information too, creating content to fit moods as well as audience types.
Imagine a semantic search engine powered by personalization technology that works not on links and the document-retrieval system Google currently uses, but on facial expressions and real emotion.
Semantic association and co-occurrence are already well documented as a future Google direction, and those who follow the company may also be aware that it has made a considerable hire in its quest to develop what it describes as a ‘computer brain’.
Ray Kurzweil, an artificial intelligence expert, joined the business to help it develop the capability to understand human emotion, among other things.
His input was crucial in a recent ‘test’ that attempted to simulate the processes of a human brain (yep, that really happened!).
In that test, Google scientists exposed a giant web of interconnected computers to YouTube over a number of days to see what it might begin to recognize.
The answer was, of course, CATS!
Picking up on emotion is a key component of that research, and as Google described a ‘deep learning’ algorithm used within the test, there can be little doubt of its intention to use this kind of AI in the core algorithm of the future for ranking our websites, content and information.
Google patent expert Bill Slawski believes the company’s recent acquisition of Behav.io is the next step in its quest to accelerate intelligent data sorting.
He explained: “Google has released a couple of patent filings about collecting location-based data from mobile devices, quarantining that data, and finding ways to scrub personalized data out of it. Google’s work with predictive algorithms for Google Now and similar technologies is going to be enhanced considerably with ideas from the people working on Behav.io. They are spending a lot of time and resource in building up a team capable of really moving this on quickly now.”
Being able to personalize content based on emotional reaction could open up a whole new area of opportunity for content planning and creation.
Without doubt such a system would skew content creation towards the humorous or emotional, and stories that play on the strongest human emotions would certainly take precedence in any plan.
A funny animated GIF, a meme, a jaw-dropping photograph or a real-life story about a fight against cancer would seemingly gain more traction and visibility within such a system, so there would have to be controls on how content is surfaced to prevent emotive spamming becoming the new black-hat tactic of the moment.
It does, however, also offer great data insight: content, website designs or whole ad campaigns could be tested prior to release to gauge their emotive value and ensure a hit.
Whatever emotional intelligence has in store, the one thing we do know is that it is not going away. The guys and girls at MIT, and those looking to commercialize their work, will make sure of that. And while planning for such a day may seem pie-in-the-sky, so did the personal mobile internet just a few years ago.
Expect to see this integrated into social first, where it makes much more sense – as a key component of EdgeRank, for instance – since social should be driven more by emotional connection.
For Google the challenge is to be able to use AI at scale and across the web. While emotional signals will really help some niches (think food, automotive and other passion points), they will be limited in others (such as engineering and other less ‘sexy’ B2B markets).
In many ways it is the same challenge the search giant faces in integrating social signals generally: while it would work well in niches where sharing is common, it would die a death quickly in others.
I asked a number of respected industry experts for their opinions on how such technology may affect digital marketing in the future. Here is their take on the issue:
“If search ever becomes truly intelligent, marketers likely will have far less opportunity to appear in a list of results like they do today.
“Your company/brand/product/content will either be THE answer to a query, or it won’t appear at all. This will mean a rise in importance of non-search channels for discovery and a fall in the opportunity that search provides.”
“I love Kurzweil and believe we are headed that direction. Marketing is always a mixture of art and science…so if the technology gives marketers greater hints at how to develop better messaging, then I’m all for it.”
“Remember that the best salespeople have always been social humans, because of their ability to understand and adapt to the social signals of the people around them.”
“As computers become better at interpreting those signals, they’ll be able to do more of the things that human salespeople have always done, at lower costs. And that means increasing automation in marketing, and shifting the role of the marketer to psychologist and artist.”
“On the one hand, marketers will increasingly be in charge of actually understanding how people think at a basic, scientific level, in order to correct the computer automation. And on the other, marketers will need to be increasingly creative about using those psychological learnings.”
“At a macro level, this probably means a shift from demand generation (in the form of appealing ads, for example) to demand service (making it increasingly easy to act on authentic human desires). Which in turn means less advertising and more lead gen – the good marketer will be the one who can make the actual experience of doing something the most rewarding.”
“The capabilities that computers have in the areas of deep learning, and making more and better connections with data, have been growing, but it’s probably going to be a while before we see some really big impacts.”
“Applying deep learning approaches to that kind of wide-spread sensor-based data, and using it with predictive analytics is interesting, and may help identify potential health risks in our environment, how illnesses spread, which apps people are likely to download based upon where they live (mentioned as an example in one of the MIT patent filings regarding mobile sensor data), and much more.”
“More ‘aware’ computing may not just be a matter of more computing power and better algorithms, but also of providing more senses (more ways to collect data) to computing systems, as well as protecting privacy and information about individuals.”
“Marketers around the world struggle with determining the online sentiment of their campaigns. There are lots of SaaS tools out there that take a stab at sentiment based on big-data keyword mining, but many marketers are still forced to do a visual inspection of tweets, Facebook posts, blog comments, etc. because the available solutions are too inaccurate.”
“Marketers have to manually assign sentiment by clicking the smiley, neutral or frowning face icon in one particular solution. This process is very time-consuming and inefficient. The end result is that the data harvested from these tools can be considered untrustworthy, or sketchy at best.”
“Facebook’s new emoticons remove the need for big-data sentiment keyword mining and eliminate the need for manual interpretation of sentiment. They empower the content creator to define their own emotional intent. In this regard, they make a marketer’s job much easier.”
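The ‘keyword mining’ approach those quotes criticize can be sketched in a few lines. The word lists here are invented for illustration, not taken from any real tool, and the last example shows exactly the kind of mistake that forces marketers back to manual inspection.

```python
# Minimal sketch of lexicon-based sentiment scoring, the "keyword mining"
# approach described above. The word lists are invented for illustration.

POSITIVE = {"great", "love", "awesome", "good"}
NEGATIVE = {"bad", "hate", "awful", "broken"}

def sentiment(text):
    """Count positive vs negative words and return an overall label."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(sentiment("I love this campaign"))         # positive
print(sentiment("The checkout flow is broken"))  # negative
print(sentiment("not good at all"))              # positive -- wrong: no negation handling
```

The first two calls look fine, but ‘not good at all’ is scored positive because the word ‘good’ appears and the negation is ignored – which is why a self-declared signal like an emoticon status, where the author states their own mood, sidesteps the problem entirely.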