Belgian Newsagents Scan Customers’ Faces When They Want to Buy Tobacco Products

© Jan Van der Perre

Two hundred newsagents in Belgium have been using AI age scanners since March 18 to prevent customers who are too young from buying tobacco, alcohol or lottery games. The system will soon be rolled out nationwide, and then across the rest of the EU. A good idea? “There is a risk that they will have a discriminatory effect,” says criminologist Rosamunde Van Brakel (VUB).

How do these newsagents want to use AI scanners?

“The newsagents are going to install AI systems that automatically scan the faces of young customers. The system then estimates whether the buyer is too young to buy tobacco or alcohol, for example. The goal is of course noble: to protect young people from the harmful effects of gambling, alcohol or tobacco. But I wonder whether this is the right way to protect them.”

“I have my doubts about this being rolled out to two hundred shops on the basis of a pilot project, without knowing exactly what the results were. Did the AI system lead to more minors being detected? How many false positives were there? That information is missing, while at the very least a debate should be held about it before we install these systems everywhere.”

How accurate are such AI systems?

“There are still few independent studies on the use of such systems in stores. The algorithms behind them are trained to recognize patterns in millions of photos. But we do know from scientific research into facial recognition that such systems are generally more accurate for men and for white people.”

“There is then a risk, for example, that this AI system will more often estimate young Black people to be too young, and that it will therefore have a discriminatory effect.”


Isn’t the risk of false negatives greater?

“In a sense, yes, because if such a system estimates a customer to be too young, the salesperson will check that person’s ID. It is a good thing that this external check exists.”

“But the reverse does not hold. If this AI system thinks a young person is old enough, the salesperson no longer has to check them. Young people who look older than they are may therefore find it easier to buy alcohol or tobacco.”

“The reflex sellers have to ask for proof of identity when in doubt may gradually disappear as a result, because we assume that such an AI system will be right, while in practice it often turns out to be anything but flawless.”

These AI scanners would not film customers continuously, would not be connected to the internet, and would delete a customer’s data immediately after the age estimate. Should we be concerned about our privacy?

“If all that is true, I do wonder how they are going to check or improve the effectiveness of those systems. It seems important to me to be able to indicate whether such a model has made the correct assessment or not. And we know from previous pilot projects with facial recognition that such systems often do store data for a short period of time.”


Could such a system lead to fewer conflicts, because it feels less arbitrary than a seller who personally decides which customers to check?

“That is indeed possible. But I can also imagine that many young people do not like the fact that they are increasingly being photographed or filmed. In Australia, for example, they are already experimenting with allowing young people to log in to social media using facial recognition.”

“This example fits into a broader trend in which, without much debate, we are installing cameras with facial recognition everywhere. The new EU AI regulation, for example, even makes it possible to deploy cameras with real-time facial recognition in public spaces. This all happens quietly, but we are gradually growing into a total surveillance society. You only have to look at the US to see what happens if you handle this irresponsibly.”
