Supervised Learning for Food Recognition
The colourful foods project at LiveSmart was built so that users could upload an image of their food plate and receive an analysis of their diet. The analysis proceeds in several steps, starting with object detection: a YOLO model splits the food plate into the individual objects (food items) it contains. Then we perform image classification on every detected object using a pretrained Inception model and assign each one a food category.
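The detect-then-classify pipeline above can be sketched as follows. This is a minimal structural sketch, not LiveSmart's implementation: the `fake_detect` and `fake_classify` functions are placeholders standing in for YOLO inference and the retrained Inception classifier, and the `Detection` type and box format are assumptions for illustration.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

Box = Tuple[int, int, int, int]  # assumed (x, y, width, height) in pixels

@dataclass
class Detection:
    box: Box      # where the food item sits on the plate
    crop: object  # the cropped image region passed to the classifier

def analyse_plate(image: object,
                  detect: Callable[[object], List[Detection]],
                  classify: Callable[[object], str]) -> List[Tuple[Box, str]]:
    """Detect food items on the plate, then assign each crop a food category."""
    return [(d.box, classify(d.crop)) for d in detect(image)]

# Stand-ins for the real models: detect() would wrap YOLO inference,
# classify() would wrap the Inception-based food classifier.
def fake_detect(image):
    return [Detection((0, 0, 50, 50), "crop-a"), Detection((60, 0, 40, 40), "crop-b")]

def fake_classify(crop):
    return {"crop-a": "broccoli", "crop-b": "rice"}[crop]

print(analyse_plate("plate.jpg", fake_detect, fake_classify))
# [((0, 0, 50, 50), 'broccoli'), ((60, 0, 40, 40), 'rice')]
```

Keeping the detector and classifier behind plain callables like this makes each model easy to retrain or swap independently.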
These two models are retrained on our 35K custom food images, which were previously cropped by hand and labelled with one of 21 food categories. Finally, we perform a spatial analysis, a colour analysis and a nutritional analysis to calculate the area, colour and nutritional content of every food item on the plate. These results tell us how much variety the plate contains and how colourful it is, both of which are evidence of a healthy diet.
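The variety and colourfulness scores could be computed along these lines. This is a hedged sketch, not LiveSmart's actual metric: the coarse RGB quantisation, the bin count as a colourfulness score, and the category count as a variety score are all illustrative assumptions.

```python
from collections import Counter
from typing import Iterable, List, Tuple

Pixel = Tuple[int, int, int]  # an RGB triple in 0..255

def colour_bins(pixels: Iterable[Pixel], levels: int = 4) -> Counter:
    """Quantise each RGB pixel into a coarse bin and count bin occupancy."""
    step = 256 // levels
    return Counter((r // step, g // step, b // step) for r, g, b in pixels)

def colourfulness(pixels: List[Pixel], levels: int = 4) -> int:
    """Illustrative score: number of distinct coarse colour bins on the plate."""
    return len(colour_bins(pixels, levels))

def variety(categories: List[str]) -> int:
    """Illustrative score: number of distinct food categories detected."""
    return len(set(categories))

# Dummy plate: a red region, a green region and an off-white region.
plate_pixels = [(200, 30, 30)] * 10 + [(30, 180, 40)] * 10 + [(240, 240, 230)] * 5
print(colourfulness(plate_pixels))  # 3
print(variety(["tomato", "broccoli", "rice", "broccoli"]))  # 3
```

In practice the pixels would come from the detected crops, so each food item's area (pixel count) and dominant colour fall out of the same pass.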
Laura Palacio García is a data scientist at LiveSmart, a health startup that provides an integrated solution to empower employees to optimise physical and mental wellbeing at work. She studied Biomedical Engineering in Barcelona, Spain, and completed an MSc in Human and Biological Robotics at Imperial College London before joining LiveSmart. She has previously worked as a data scientist at a biotechnology institute and a tech startup.