One of the most significant changes in decision-making processes is the deployment of artificial intelligence (AI). The use of this technology for chemical risk assessment was one of several hot topics at the EFSA ONE conference (#OneEU2022) held in Brussels from 21 to 24 June 2022. One of the advantages of AI is its capacity to process large data sets in a short time, enabling prompt decision-making. The panelists discussed the critical points of AI processes and frameworks that can lead to faulty and biased results. The impact of faulty AI systems can be far-reaching and amplify quickly. Several examples were mentioned in this context. One was the AI recruiting tool used by Amazon in 2018, which favoured the selection of men. Another was the use of AI to identify the best possible treatment approach for a patient, who did not respond to the treatment as expected and died. Careful attention needs to be paid to the scope of AI tools, i.e. their fitness-for-purpose. Extrapolation to other areas of application may not work as expected; for example, an AI tool developed with data from a specific socio-geographical area may not be suitable for predictions in other locations, which have their own particularities.

Several of these "buts" were eye-openers for us. We knew that the quality and bias of the data used to feed AI are critical, but we were unaware of some of the aspects that need to be considered and tested during the development of AI tools and applications before release, in order to avoid programmed bias and minimize undesirable consequences. We must not forget that AI applications are made by humans, and humans make mistakes; mistakes made during AI development can end up being amplified.

Coming from the food safety and food analysis fields, we could draw many parallels with AI-based tools, e.g. the validation and testing of systems to ensure that they work as expected for a defined context (the matrix, if we talk about food analysis). Carmen Diaz-Amigo and Bert Popping of FOCOS engaged in the discussions with the experts.