Android app that recognises food items and displays nutrition info, built using Teachable Machine

Overview

This Android application recognises food items from the camera input and links to nutritional info for the recognised item.

This is an example application for TensorFlow Lite on Android. It uses image classification to continuously classify whatever it sees from the device's back camera. Inference is performed using the TensorFlow Lite Java API. The demo app classifies frames in real time, displaying the most probable classifications. It allows the user to choose between a floating point or quantized model, select the thread count, and decide whether to run on the CPU, on the GPU, or via NNAPI.
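
As a rough illustration of how those options map onto the TensorFlow Lite Java API, the sketch below builds an `Interpreter` with a chosen thread count and an optional GPU or NNAPI delegate. The class and method names here are illustrative placeholders, not code taken from this project.

```java
import org.tensorflow.lite.Interpreter;
import org.tensorflow.lite.gpu.GpuDelegate;
import org.tensorflow.lite.nnapi.NnApiDelegate;

import java.nio.MappedByteBuffer;

/** Illustrative helper: builds an Interpreter with the options exposed in the app's UI. */
public class InterpreterFactory {

  public static Interpreter create(MappedByteBuffer model,
                                   int numThreads,
                                   boolean useGpu,
                                   boolean useNnApi) {
    Interpreter.Options options = new Interpreter.Options();
    options.setNumThreads(numThreads);            // thread count selected by the user
    if (useGpu) {
      options.addDelegate(new GpuDelegate());     // run supported ops on the GPU
    } else if (useNnApi) {
      options.addDelegate(new NnApiDelegate());   // delegate inference to NNAPI
    }
    // With no delegate added, inference runs on the CPU.
    return new Interpreter(model, options);
  }
}
```

Note that `GpuDelegate` lives in the separate `tensorflow-lite-gpu` dependency, so whether this compiles as-is depends on the dependencies declared in `build.gradle`.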

These instructions walk you through building and running the demo on an Android device. For an explanation of the source, see TensorFlow Lite Android image classification example.

Model

The model was created by training on images scraped from Google. After scraping, the images were labelled and the model was trained via https://teachablemachine.withgoogle.com/train

Requirements

  • Android Studio (installed on a Linux, Mac or Windows machine)

  • Android device in developer mode with USB debugging enabled

  • USB cable (to connect Android device to your computer)

Build and run

Step 1. Clone the source code

Clone the GitHub repository to your computer to get the demo application:

https://github.com/Ravi-Teja-konda/trackMyNutrition.git

Open the source code in Android Studio. To do this, open Android Studio, select Open an existing project, and set the folder to examples/lite/examples/image_classification/android

Step 2. Build the Android Studio project

Select Build -> Make Project and check that the project builds successfully. You will need the Android SDK configured in the Android Studio settings, with at least SDK version 23. The build.gradle file will prompt you to download any missing libraries.

The file download.gradle directs Gradle to download the two models used in the example, placing them into assets.
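
For reference, loading one of those downloaded models from assets typically looks like the sketch below, which memory-maps the file for the TensorFlow Lite interpreter. The asset name is a placeholder; the actual filenames are defined in download.gradle.

```java
import android.app.Activity;
import android.content.res.AssetFileDescriptor;

import java.io.FileInputStream;
import java.io.IOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;

/** Illustrative sketch: memory-maps a model file from assets for the TFLite interpreter. */
public class ModelLoader {

  public static MappedByteBuffer loadModelFile(Activity activity, String assetName)
      throws IOException {
    // "assetName" is a placeholder, e.g. the .tflite file that download.gradle fetched.
    AssetFileDescriptor fd = activity.getAssets().openFd(assetName);
    try (FileInputStream inputStream = new FileInputStream(fd.getFileDescriptor())) {
      FileChannel channel = inputStream.getChannel();
      return channel.map(FileChannel.MapMode.READ_ONLY,
          fd.getStartOffset(), fd.getDeclaredLength());
    }
  }
}
```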

Note:

`build.gradle` is configured to use TensorFlow Lite's nightly build.

If you see a build error related to compatibility with TensorFlow Lite's Java API (for example, `method X is undefined for type Interpreter`), there has likely been a backwards-incompatible change to the API. You will need to run `git pull` in the examples repo to obtain a version that is compatible with the nightly build.

Step 3. Install and run the app

Connect the Android device to the computer and be sure to approve any ADB permission prompts that appear on your phone. Select Run -> Run app, then choose your connected device as the deployment target. This will install the app on the device.

To test the app, open the app called TFL Classify on your device. The first time you run the app, it will request permission to access the camera. Re-installing the app may require you to uninstall the previous installation first.
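
The camera permission request on first launch follows the standard Android runtime-permission flow; a minimal sketch is shown below. The helper class and request code are illustrative, not taken from this project's source.

```java
import android.Manifest;
import android.content.pm.PackageManager;

import androidx.appcompat.app.AppCompatActivity;
import androidx.core.app.ActivityCompat;
import androidx.core.content.ContextCompat;

/** Illustrative sketch of a runtime camera-permission check; names are placeholders. */
public class CameraPermissionHelper {

  private static final int REQUEST_CAMERA = 1; // arbitrary request code

  public static void requestCameraIfNeeded(AppCompatActivity activity) {
    if (ContextCompat.checkSelfPermission(activity, Manifest.permission.CAMERA)
        != PackageManager.PERMISSION_GRANTED) {
      // Triggers the system permission dialog the first time the app runs.
      ActivityCompat.requestPermissions(
          activity, new String[] {Manifest.permission.CAMERA}, REQUEST_CAMERA);
    }
  }
}
```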

Assets folder

Do not delete the contents of the assets folder. If you do delete the files, choose Build -> Rebuild Project to re-download the deleted model files into the assets folder.

References:

Note: This project is for educational purposes only. All the links provided for nutritional info are for demo purposes only.