Emotional Mobile Marketing

Posted by michael MEIRESONNE on September 11th, 2018

Has anyone noticed we are starting to see data measured in petabytes? The facial emotion recognition software in your phone needs to store your facial data in the cloud, on a huge server, where anyone can analyze your face for pennies per sales transaction. Emotional mobile marketing is on its way to a phone near you!

When most people think ‘facial recognition,’ their minds probably drift to the latest iPhone security feature, which lets you unlock your phone by looking at it (and, in theory, makes it impossible for anyone but you to unlock it). But facial recognition appeared long before it debuted on iPhones, and its applications are now far wider, and potentially more controversial.

Think back to the time you logged into Facebook and learned that it would now automatically ‘suggest’ tags in photos uploaded to the site. It was an uncanny, giddy moment: “How does it know that’s so-and-so?!” And it did it consistently, quickly, and with scary accuracy.

The new-tech euphoria quickly wore off as you realized that, somewhere in Silicon Valley, there were developers, apps, and computers that could quickly scan a photo and determine exactly who was in it.

And therein lies the rub. The same technology that enables one convenient iPhone feature and saves you a few seconds tagging photos on Facebook (if people still do that these days) is also being rolled out in full Orwellian force, giving governments, security agencies, and other entities interested in tracking and managing people the ability to pick your face out of a crowd using the video cameras that already cover nearly every angle and inch of big cities.

Facial recognition, as the name implies, involves a computer’s ability to recognize an individual face. When you stop to think about how you recognize your friends, you probably list some standard traits: height, hair color, or a certain feature or two that stand out to you.

Computers have to learn how to recognize individual faces in images, then compare those faces to a database or to other images in order to “recognize” a given face. This is one task at which humans are still far faster and more reliable than computers.

Our innate ability to recognize people is almost indescribable and usually instantaneous; computers have to treat faces like math problems and then try to ‘solve’ them against other equations until they find a match.
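
To make that idea concrete, here is a minimal sketch, not any vendor’s actual pipeline: each known face is reduced to a numeric feature vector, and an unknown face is matched to whichever stored vector it sits closest to. All of the names and numbers below are invented for illustration.

    import math

    # A toy "database": each known person is reduced to a short numeric feature vector.
    # Real systems use vectors with hundreds of dimensions; these values are invented.
    known_faces = {
        "alice": [64.2, 31.5, 27.8, 12.1, 88.0],
        "bob":   [58.9, 34.0, 25.3, 14.7, 92.4],
    }

    def distance(a, b):
        # Euclidean distance between two feature vectors; smaller means more similar.
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def identify(unknown, database, threshold=5.0):
        # Return the closest known face, or None if nothing is close enough.
        best_name, best_dist = None, float("inf")
        for name, vector in database.items():
            d = distance(unknown, vector)
            if d < best_dist:
                best_name, best_dist = name, d
        return best_name if best_dist <= threshold else None

    # Measurements taken from a new photo; here they land closest to "alice".
    print(identify([63.8, 31.9, 27.5, 12.0, 87.6], known_faces))

The hard part of the “math problem” is producing feature vectors that stay stable across lighting, angles, and expressions, which is what the measurements described next aim at.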

Still, recent iterations of facial recognition work better than ever before. iPhones are becoming shockingly good at unlocking when they see their owners’ faces and governments now rely on facial recognition in certain circumstances.

Facial recognition software uses a two-step process: first it recognizes that something within the frame is indeed a face, then it measures various features of that face to determine whose face it is.

Every face has distinguishable ‘landmarks,’ a series of ‘peaks and valleys’ that a computer can measure. Some software companies call these peaks and valleys nodal points, and the industry consensus is that the human face has an average of 80 nodal points.

Here are some of the primary nodal points computers use to recognize human faces (a rough sketch of how they might be turned into numbers follows the list):

  • Distance between the eyes
  • Width of the nose
  • Depth of the eye sockets
  • The shape of the cheekbones
  • The length of the jaw line
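
As an illustration only, the sketch below assumes a face detector has already located a face and returned pixel coordinates for a handful of hypothetical landmarks; it then turns a few of the measurements above into the kind of feature vector the matching step compares.

    import math

    # Hypothetical landmark coordinates (x, y) in pixels, as a face detector might report.
    landmarks = {
        "left_eye":        (120, 140),
        "right_eye":       (184, 142),
        "nose_left":       (138, 180),
        "nose_right":      (166, 181),
        "left_cheekbone":  (104, 168),
        "right_cheekbone": (200, 170),
        "jaw_left":        (110, 210),
        "jaw_right":       (194, 212),
        "chin":            (152, 248),
    }

    def dist(a, b):
        # Straight-line distance between two landmark points.
        return math.hypot(a[0] - b[0], a[1] - b[1])

    # A few of the nodal-point measurements described above.
    features = [
        dist(landmarks["left_eye"], landmarks["right_eye"]),              # distance between the eyes
        dist(landmarks["nose_left"], landmarks["nose_right"]),            # width of the nose
        dist(landmarks["left_cheekbone"], landmarks["right_cheekbone"]),  # cheekbone span
        dist(landmarks["jaw_left"], landmarks["chin"])
        + dist(landmarks["chin"], landmarks["jaw_right"]),                # rough jaw-line length
    ]

    print(features)  # this vector is what gets compared against a database of known faces

Notice that a measurement like the depth of the eye sockets cannot be read from flat pixel coordinates at all, which hints at why the 2D approach described next runs into trouble.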

Early iterations of facial recognition relied on 2D imaging techniques to evaluate nodal points, which understandably could be thrown off when faces were turned at different angles or obscured by shadows or bright light.

The latest facial recognition software uses 3D imaging to more accurately detect and identify faces, even when they are partially obscured, turned at extreme angles, or otherwise lack landmarks which computers could previously identify.

Facebook’s latest software, dubbed DeepFace, boasts a 97.25% success rate at identifying human faces in photos; human beings average 97.53% on the same test. The technology that made the occasional silly gaffe a few years ago has quickly closed the gap to within a fraction of a percentage point of human performance.

Some companies are taking these identification techniques several steps further with biometric methods that treat facial texture and wrinkles much like fingerprints: each line, mark, and variation in tone is unique to an individual, and in some instances can be measured and compared more accurately than nodal points.
