Identifying trends and even emotions on social media and the web is not new; emotion detection from text has been done in various ways for some time using data mining and text mining. A recent article on R&D Mag online, however, describes an interesting innovation that brings emotion detection to Twitter in real time. Two elements of this story caught my attention: data representation and the nuances of the lexicon used on Twitter. Both echo our experience at EMOSpeech when designing an enterprise application around speech emotion recognition technology: first, we can gather a large amount of information, but we face the challenge of presenting it in a simple way that is relevant to Quality Assurance supervisors and LOB managers. Second, our technology does not rely on language, but there are always nuances for which we must make minor adjustments.
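To make the lexicon point concrete, here is a minimal, illustrative sketch of lexicon-based emotion detection from tweet text. The emotion categories and word lists are hypothetical, not any real product's lexicon; the tokenizer hints at Twitter-specific nuances such as hashtags and emoticons.

```python
# Illustrative sketch: lexicon-based emotion detection for tweets.
# The lexicon below is a toy example, not a real emotion lexicon.
import re
from collections import Counter

EMOTION_LEXICON = {
    "joy": {"happy", "great", "love", "win", "awesome", ":)"},
    "anger": {"angry", "hate", "terrible", "worst", ">:("},
    "sadness": {"sad", "lost", "miss", "cry", ":("},
}

def tokenize(tweet: str) -> list:
    # Lowercase, keep simple emoticons, and strip '#' from hashtags
    # so hashtag words can match lexicon entries.
    tweet = tweet.lower()
    tokens = re.findall(r"[:;>]+[)(]|#?\w+", tweet)
    return [t.lstrip("#") for t in tokens]

def detect_emotions(tweet: str) -> Counter:
    # Count how many tokens fall into each emotion category.
    counts = Counter()
    for token in tokenize(tweet):
        for emotion, words in EMOTION_LEXICON.items():
            if token in words:
                counts[emotion] += 1
    return counts
```

Real systems go far beyond keyword matching, but even this sketch shows why the Twitter lexicon matters: emoticons, hashtags, and slang need their own handling before any classifier sees the text.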
Understanding online conversations in real time is an area that companies like Recorded Future have developed; I believe EmotionWatch offers unique value, with great potential for marketing, PR, the social sciences, and other applications. We expect emotion recognition to continue gaining ground, with more applications launched in the next two years.
Read the article online: “Sochi: Our tweeted emotions to be decrypted in real time”