
Food authenticity: new frontiers but just as many challenges ahead

Posted: 12 November 2024

This week Professor Chris Elliott reflects on the need for accurate food authenticity testing and for benchmarks that both industry and the public can rely on

I have just attended the 11th Recent Advances in Food Analysis (RAFA) conference, held in Prague. For many years this has been my favourite food science conference, as so many experts attend and present their cutting-edge research. It’s also a great opportunity to view new devices and software from a wide range of vendors, who hold a series of workshops to explain the new features and applications of their systems. In short, RAFA delivers important information on new technologies and their applications in food analysis.


Food authenticity analysis: a rising priority

As I looked through the conference programme and attended many of the sessions, several trends became very obvious. The first was that food authenticity analysis is now one of the most important topics, with many sessions specifically dedicated to it, and it even dominated the more general ‘food analysis’ sessions. I suppose this shouldn’t be overly surprising: estimates put the value of the global food authenticity testing market at close to $8 billion in 2023, with projections of more than $12 billion by the end of this decade. To my mind this reflects the growing concern within many food businesses and government agencies that food fraud will grow in both scale and severity, driven mainly by climate change and geopolitics.


Emerging trends in food authenticity testing

Within the food authenticity domain there are also some very clear trends. The movement towards ‘fingerprinting techniques’, or ‘untargeted analysis’ as many scientists prefer to call it, is unmistakable. Artificial intelligence (AI) is playing an ever more prominent role in interpreting the massive datasets captured and in building models of authentic and inauthentic samples. Furthermore, these databases are increasingly being relied upon to make important decisions. The practice of ‘data fusion’, ie taking results from different testing methods, combining them and applying AI to provide extremely sensitive and reliable testing systems for authenticity analysis, was also evident. My team and I are rather proud that we were one of the first research groups in the world to combine all of these advances into a single authenticity application (salmon), which proved to be the direction of travel needed to distinguish accurately what is genuine from what is fake.
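To make the idea of data fusion a little more concrete, here is a minimal sketch of its simplest form: concatenating the fingerprint features produced by two different instruments for the same samples and training one model on the combined block. Everything in it is an illustrative assumption (synthetic data, feature sizes, a scikit-learn random forest standing in for the ‘AI’); it is not the method used in the salmon study.

```python
# Minimal sketch of 'data fusion' for authenticity classification.
# The two "instrument" outputs below are simulated with random numbers; in
# practice they would be measured fingerprint features for each sample.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(42)
n_samples = 200

# Simulated fingerprints from two hypothetical techniques (illustrative only).
spectra_a = rng.normal(size=(n_samples, 50))   # e.g. NMR-style features
spectra_b = rng.normal(size=(n_samples, 30))   # e.g. MS-style features

# Labels: 1 = authentic, 0 = not authentic (synthetic, for illustration only).
labels = rng.integers(0, 2, size=n_samples)

# "Low-level" data fusion: concatenate the feature blocks for each sample.
fused = np.hstack([spectra_a, spectra_b])

X_train, X_test, y_train, y_test = train_test_split(
    fused, labels, test_size=0.25, random_state=0
)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```

In a real laboratory the simulated arrays would of course be replaced with genuine spectral measurements, and the model would be validated far more rigorously than with a single train/test split.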


Challenges and questions in food authenticity testing

As I listened to the many great presentations and discussed their importance with friends and colleagues, we all agreed that the path being taken is the right one and highly promising; nevertheless, quite a few questions remain to be answered and difficulties to be overcome. How reliable are the methods when applied in the real world? In most cases, this relates directly to the quality of the databases that have been generated.

How many samples need to be tested to provide assurance that the models actually reflect real-life circumstances, such as changes in weather patterns and agronomic practices, to mention just a couple of the variables that can affect food fingerprints? The number of samples used in test development and validation varied from just a few to many thousands. It may seem boring to hardcore researchers, but guidelines and international standards that set out minimum requirements for food authenticity databases are desperately needed.
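One way to probe that question, at least for the seasonal variable, is to hold out whole seasons during validation rather than random samples. The sketch below uses leave-one-group-out cross-validation on synthetic data; the feature counts, season labels and model are hypothetical placeholders, offered only to illustrate the kind of check a database standard might require.

```python
# Minimal sketch of testing whether an authenticity model generalises across
# harvest seasons using leave-one-group-out cross-validation. All data here
# are synthetic placeholders; the 'season' labels, feature counts and model
# are illustrative assumptions only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)
n_samples, n_features = 300, 40

X = rng.normal(size=(n_samples, n_features))          # fingerprint features
y = rng.integers(0, 2, size=n_samples)                # authentic vs not
seasons = rng.integers(2019, 2024, size=n_samples)    # hypothetical seasons

# Each fold trains on all seasons except one and tests on the held-out season,
# which mimics asking: "does the database still work for a year it has not seen?"
scores = cross_val_score(
    LogisticRegression(max_iter=1000), X, y,
    groups=seasons, cv=LeaveOneGroupOut()
)
print("held-out season accuracies:", np.round(scores, 2))
```

If performance collapses on a held-out season, the database almost certainly does not yet contain enough samples, or enough years, to be trusted in a real-world testing programme.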


Progress toward a harmonised approach for food authenticity validation

I think an important first step was taken recently in a project led by Dr Stephane Bayen from McGill University in Canada, which considered how a harmonised approach to the validation and accreditation of food authenticity markers should be developed. I was very happy to be part of this group, and we are now planning our next exercise: to ascertain how many samples and how many seasons are needed to produce reliable databases that can be exploited in real-world testing programmes.


Case studies and the need for reliable authenticity testing methods

The conference presented some great case studies in authenticity testing – far too many to include in this article. But one really caught my attention: ‘the weird case of the vegan sausage’. It was reported in Italy that numerous vegan meat-alternative products were found to contain chicken, beef and pork. Whether this was due to really poor manufacturing processes or deliberate fraud was not confirmed; either way, the data presented certainly made me extremely suspicious of deliberate contamination. Even more concerning, if that is possible, is that African Swine Fever virus DNA was also found in some of these products, as well as in some other sausage products that were labelled as containing meat. In the past I have warned about some significant food safety issues in alternative protein products. I now add fraud to the list of concerns.


Building trust in food authenticity testing through robust validation

I have no doubt that more and more fingerprinting techniques will find their way into routine testing laboratories. Some are already in use, mainly isotope fingerprinting methods, and these seem to throw up as many problems as solutions when it comes to data interpretation.

One major food company recently told me how much time and resource they waste following up on isotopic test results that indicate something might be wrong, only for significant investigative effort by staff to show the products are perfectly fine. We need to move past this impasse and have total trust in the methods that are, and will be, used to determine the authenticity of the food we all eat. I believe this can be achieved by putting the scientific breakthroughs outlined above through harmonised and robust validation protocols.

