Three QUT researchers are part of an international research team that has identified new ways for retailers to use artificial intelligence in concert with in-store cameras to better understand consumer behavior and adapt store layouts to maximize sales.
In research published in Artificial Intelligence Review, the team proposes an AI-powered store layout design framework that lets retailers make the most of recent advances in AI and its subfields of computer vision and deep learning to monitor the physical shopping behavior of their customers.
Any shopper who has picked up milk from the farthest corner of a store knows well that an effective store layout presents merchandise so as to draw the customer’s attention to items they did not intend to buy, increase browsing time, and group related or viable alternative products together where they are easy to find.
A well-thought-out layout has been shown to be positively correlated with increased sales and customer satisfaction. It is one of the most effective in-store marketing tactics that can directly influence customer decisions to increase profitability.
QUT researchers Dr. Kien Nguyen and Professor Clinton Fookes from the School of Electrical Engineering & Robotics, and Professor Brett Martin from the QUT Business School, teamed up with Dr. Minh Le from Ho Chi Minh City University of Economics, Vietnam, and Professor Ibrahim Cil from Sakarya University, Serdivan, Turkey, to conduct a comprehensive review of existing approaches to store layout design.
According to Dr. Nguyen, improving supermarket layout design – through understanding and predicting customer behavior – is a key tactic for improving customer satisfaction and increasing sales.
“Most importantly, this paper provides a comprehensive and novel framework for applying new AI techniques alongside existing CCTV camera data to interpret and better understand customers and their in-store behavior,” said Dr. Nguyen.
“CCTV offers information on how shoppers move through the store: the route they take and the sections where they spend more time. This research proposes to go further, noting that people express emotions through observable facial expressions such as raising an eyebrow, opening their eyes, or smiling.
Understanding customer emotions as they browse could provide marketers and managers with a valuable tool to understand customer reactions to the products they sell.
“Emotion recognition algorithms work by using computer vision techniques to locate the face and identify key landmarks on the face, such as the corners of the eyebrows, the tip of the nose, and the corners of the mouth,” said Dr. Nguyen.
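The geometric idea behind these landmark-based cues can be illustrated with a toy sketch. Everything here – the landmark names, the coordinates, and the smile heuristic – is illustrative and not taken from the paper; a real system would obtain landmarks from a trained face-detection and landmark model.

```python
# Toy illustration of landmark-based expression cues: given 2D facial
# landmarks (as a real landmark detector would output), derive a simple
# geometric signal such as mouth-corner elevation. All landmark names,
# coordinates, and thresholds here are illustrative.

def smile_score(landmarks):
    """Positive when the mouth corners sit above the mouth centre (a smile cue)."""
    left = landmarks["mouth_corner_left"]
    right = landmarks["mouth_corner_right"]
    centre = landmarks["mouth_centre"]
    # y grows downward in image coordinates, so raised corners have smaller y
    return centre[1] - (left[1] + right[1]) / 2.0

# Example landmark positions (pixel coordinates) for a smiling face
face = {
    "mouth_corner_left": (40.0, 98.0),
    "mouth_corner_right": (80.0, 98.0),
    "mouth_centre": (60.0, 104.0),
}

print(smile_score(face))  # 6.0 – positive, consistent with raised mouth corners
```

Real emotion-recognition pipelines combine many such landmark relationships (eyebrow height, eye openness, mouth shape) and feed them to a classifier rather than a single hand-written rule.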
“Understanding customer behavior is the ultimate goal of business intelligence. Obvious actions such as picking up products, putting products in the cart, and putting products back on the shelf are of great interest to smart retailers.
“Other behaviors like looking at a product and reading a product box are a gold mine for marketing to understand customer interest in a product,” Dr. Nguyen said.
In addition to reading emotions from facial cues and characterizing customers, layout managers could use heatmap analysis, human trajectory tracking, and customer action recognition to inform their decisions. This knowledge can be extracted directly from video and is useful for understanding customer behavior at the store level without needing to know individual identities.
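A dwell-time heatmap of the kind mentioned above can be built by binning tracked positions into a floor grid. This is a minimal sketch assuming trajectories are already extracted from video as (x, y) coordinates sampled at a fixed rate; the coordinates and cell size below are made up.

```python
from collections import Counter

def dwell_heatmap(trajectory, cell=2.0):
    """Bin a trajectory [(x, y), ...] (metres) into grid cells of size
    `cell` and count samples per cell: more samples = more dwell time."""
    counts = Counter()
    for x, y in trajectory:
        counts[(int(x // cell), int(y // cell))] += 1
    return counts

# One anonymised track sampled once per second (illustrative coordinates)
track = [(1.0, 1.0), (1.5, 1.2), (1.8, 1.1), (5.0, 5.0), (5.1, 5.2)]
heat = dwell_heatmap(track)
print(heat)  # cell (0, 0) dominates: the shopper lingered near that zone
```

Summing such counts over many anonymous tracks yields the store-level heatmap without retaining any individual identity.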
Professor Clinton Fookes said the team came up with the Sense-Think-Act-Learn (STAL) framework for retailers.
“First of all, ‘Sense’ involves collecting raw data, for example from video footage from a store’s CCTV cameras, for processing and analysis. Store managers routinely do this with their own eyes; however, new approaches allow us to automate this aspect of detection and perform it across the store,” Prof Fookes said.
“Second, ‘Think’ involves processing the collected data through advanced AI, data analytics and deep learning techniques, much as humans use their brains to process incoming data.
“Thirdly, ‘Act’ is about using the knowledge and insights from the second phase to improve and optimize the layout of the supermarket. The process works as a continuous learning cycle.
“A benefit of this framework is that it allows retailers to assess store design predictions such as traffic flow and behavior when customers enter a store, or the popularity of store displays placed in different areas of the store,” Professor Fookes said.
“Stores like Woolworths and Coles already routinely use AI-based algorithms to better serve customers’ interests and desires, and to provide personalized recommendations. This is especially true at the point-of-sale system and program level. This is just another example of using AI to deliver better data-driven store layouts and designs, and to better understand customer behavior in physical spaces.”
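The Sense-Think-Act-Learn cycle described above can be sketched as a simple loop. The function names and data shapes below are illustrative placeholders under the framework's three active phases, not an API from the paper.

```python
# Minimal skeleton of the Sense-Think-Act-Learn (STAL) cycle.
# All function names and data values are illustrative.

def sense():
    """Sense: collect raw observations, e.g. detections from CCTV footage."""
    return [{"zone": "dairy", "dwell_s": 40}, {"zone": "snacks", "dwell_s": 5}]

def think(observations):
    """Think: turn raw observations into an insight (here: the busiest zone)."""
    return max(observations, key=lambda o: o["dwell_s"])["zone"]

def act(insight, layout):
    """Act: adjust the layout based on the insight (here: flag for promotion)."""
    layout[insight] = "promote"
    return layout

layout = {}
for _ in range(2):  # Learn: the cycle repeats, refining the layout continuously
    layout = act(think(sense()), layout)
print(layout)  # {'dairy': 'promote'}
```

In practice each phase is far richer – ‘Sense’ is whole-store video capture, ‘Think’ is deep-learning analytics – but the control flow is this same continuous loop.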
Dr Nguyen said the data could be filtered and cleaned to improve quality and privacy, and transformed into a structured form. With privacy a major concern for customers, the data could be anonymized, for example by analyzing customers at an aggregated level.
“Since CCTV cameras produce an intense data stream, a cloud-based system is an appropriate approach for processing and storing the video data used in supermarket layout analysis,” he said.
“The intelligent video analysis layer in the THINK phase plays the key role in interpreting the content of images and videos.”
Dr Nguyen said layout managers could consider store design variables (e.g. space design, POS displays, product placement, cashier placement), employees (e.g. number, placement) and customers (e.g. crowding, length of visit, impulse purchases, use of furniture, queue formation, receptivity to product presentations).
The article, “When AI meets store layout design: a review”, is published in Artificial Intelligence Review.