EXAMINING FACIAL RECOGNITION

Amazon has recently been at the center of the latest controversy over facial recognition and its application to the security world.

The technological evolution we have experienced in the last decade has been dizzying. Every so often we are surprised by new terms referring to new technologies: IoT, blockchain, augmented reality, etc. Facial recognition appears in the news every day as something that is coming to our homes, and we are constantly reminded of it by big technology companies like Amazon, Apple and Microsoft; they all want to be the first to adopt facial recognition successfully. But we cannot forget that we are still far from having a mature, stable technology that guarantees reliable and precise results.

Our test reinforces the theory that facial recognition is not a safe technology for governmental adoption

Jacob Snow (technology and civil rights lawyer, ACLU Foundation of Northern California)

One of the latest controversies about facial recognition involves Amazon and its controversial Rekognition program, image-identification software the company sells to some governments. The ACLU (American Civil Liberties Union) conducted a study to demonstrate the unreliability of this type of software, cross-referencing a database of 25,000 mugshots of arrested people with public photos of every member of the United States Congress. The result: 28 members of Congress were incorrectly identified as criminals. The application matched different people's faces to one another, labelling them as similar, and mistook congressmen and congresswomen for people who had committed some type of crime.

Concern over the use of this type of program quickly reached the streets, with the public asking Amazon's CEO to stop selling it to government entities and police forces. Critics argue that the consequences could be serious if security forces use such programs to find matches against criminal records, or to conduct searches based on false identifications. This demonstrates how poorly even the most advanced technologies can be applied when they are not fully developed.

Amazon’s Rekognition program was created to identify objects, text, scenes or people in a private environment. The goal of the program is for the software to identify objects pictured in photos, such as animals, furniture or food. Using this type of program in areas where the consequences of a misjudgement can be very serious – such as controlling access to your company or home – is a big mistake.

Another concern surrounding this case is the use of this technology without professional technical support. The ACLU study was conducted with the program’s default confidence threshold, which is 80%. According to the ACLU: “Although 80% confidence is an acceptable threshold for pictures of hot dogs, chairs, animals or other social-network use cases, it would not be appropriate for identifying individuals with a reasonable level of certainty”.

However, the program’s initial configuration neither recommends modifying this threshold nor requires installation by a trained technician to ensure the viability of the system.
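To make the threshold issue concrete, here is a minimal sketch of how a confidence threshold filters face-match candidates. The candidate IDs and similarity scores are hypothetical, standing in for the percentages a face-matching API would return (in Amazon’s AWS SDK for Python, for example, `compare_faces` accepts a `SimilarityThreshold` parameter for this purpose):

```python
# Hypothetical similarity scores (percent) between one probe photo and
# a gallery of mugshots, as a face-matching API might return them.
candidate_scores = {
    "mugshot_0412": 83.5,   # borderline score: passes a lax threshold
    "mugshot_1097": 79.2,
    "mugshot_2240": 91.8,
    "mugshot_3051": 64.0,
}

def matches_above(scores, threshold):
    """Return the candidate IDs whose similarity meets the threshold."""
    return sorted(cid for cid, score in scores.items() if score >= threshold)

# At the 80% default threshold, the borderline 83.5% candidate is
# reported as a match alongside the strong 91.8% one.
print(matches_above(candidate_scores, 80.0))  # ['mugshot_0412', 'mugshot_2240']

# At a much stricter threshold, only near-certain matches survive.
print(matches_above(candidate_scores, 99.0))  # []
```

The design point is that the threshold is a policy decision, not a technical detail: the same scores yield two "criminal matches" or none depending on a single parameter that the default configuration leaves at a value suited to object tagging, not to identifying people.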

When we talk about security, we cannot speak lightly. For security applications and technologies to be used correctly, it is essential that companies rely on solutions that guarantee the quality and operational functioning of the installation, and that are backed by professional technical support.

The Amazon case is not an isolated one. There is plenty of low-cost equipment on the market whose reliability is not what it should be: incorrect identifications from 2D pictures of the enrolled person, because the device cannot capture the depth of the face; terminals that fail to recognise people as they change over time (wrinkles, scars, signs of age, swelling, …); or devices that perform incorrect validations in cases of people with similar facial features, young children, twins, fake gel masks, etc.

Although facial recognition has clearly evolved a great deal in recent years, when implementing the technology for security purposes we must ensure its stability and robustness, and confirm that it is fully functional before using it safely in companies, businesses, homes and public institutions.

Therefore, to guarantee maximum security, experts continue to recommend biometric recognition by means of fingerprint reading. At NÜO we are pioneers in biometrics, we have been working on identification technologies for almost 20 years and we use different biometric technologies: capacitive, optical and multispectral.

Choose the latest in leading edge technology, design and high security.

NÜO

nuoplanet.com

Our solutions and products are the result of research and innovation, and are designed to adapt to your specific needs. We have the perfect alternative to guarantee security in your project or building.

Do you want us to help you?


For security reasons and to prevent spam, a valid email address must be entered in order to comment on the entries in the BY Tech blog. This information will never be used for commercial purposes, nor for any purpose other than validating system security.
