Integrating TensorFlow Lite with Flutter Using Google Teachable Machine: A Complete Guide
Machine learning is increasingly being used to create intelligent mobile applications, and TensorFlow Lite (TFLite) is a popular solution for deploying machine learning models on mobile and edge devices. In this article, we’ll explore how to integrate TensorFlow Lite with Flutter, enabling you to create a powerful app that performs image recognition using a model trained with Google Teachable Machine. This guide includes an example project to make the process easier to understand.
What is TensorFlow Lite?
TensorFlow Lite is a lightweight version of TensorFlow, optimized for mobile and embedded devices. It enables developers to run machine learning models efficiently in low-power, low-latency environments, making it ideal for mobile apps built with Flutter.
What is Google Teachable Machine?
Google Teachable Machine is a web-based tool that allows anyone to create machine learning models without requiring deep knowledge of ML or coding. You can train models for image, audio, or pose classification in a simple and visual way. Once trained, the models can be exported to TensorFlow Lite format and used in mobile applications.
Tools and Technologies Used
- Flutter: For building the UI.
- TensorFlow Lite: For running the machine learning model.
- TFLite Plugin: Flutter plugin to use TensorFlow Lite.
- Camera Plugin: To capture images for analysis.
- Google Teachable Machine: For creating and training the image recognition model.
Prerequisites
- Basic understanding of Flutter development.
- A pre-trained TensorFlow Lite model using Google Teachable Machine.
Step 1: Training the Model with Google Teachable Machine
- Open Google Teachable Machine: Visit teachablemachine.withgoogle.com and select the Image Project option.
- Add Classes: Add the classes you want to recognize, for example, “Cat”, “Dog”, and “Bird”.
- Upload Images: Upload images or use your webcam to provide training data for each class.
- Train the Model: Click on the Train Model button to train your image classification model.
- Export the Model: Once the training is complete, click on Export Model, select TensorFlow Lite, and download the .tflite model file along with the labels.txt file.
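For reference, the labels.txt that Teachable Machine exports alongside the model typically pairs a class index with the class name on each line. With the example classes above it would look roughly like this (your class names will differ):

0 Cat
1 Dog
2 Bird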
Step 2: Setting Up the Project
First, create a new Flutter project using the command line:
flutter create tflite_example_app
Navigate into the project directory:
cd tflite_example_app
Then, add the necessary dependencies to your pubspec.yaml file:
dependencies:
  flutter:
    sdk: flutter
  tflite: latest_version
  camera: latest_version
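After saving pubspec.yaml, fetch the packages:

flutter pub get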
Step 3: Adding the TensorFlow Lite Model
To use TensorFlow Lite, you need a .tflite model file. For this example, we will use the model created with Google Teachable Machine: we uploaded our dataset to Teachable Machine and downloaded the resulting .tflite file (Kaggle is an easy place to source training data). Place the downloaded .tflite model and labels.txt in the assets folder of your Flutter project.
Update the pubspec.yaml file to include the assets:
flutter:
  assets:
    - assets/your_model.tflite
    - assets/labels.txt
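The relevant part of the project layout should then look roughly like this (only the files that matter here are shown, and your_model.tflite stands in for whatever you named your exported model):

tflite_example_app/
  assets/
    your_model.tflite
    labels.txt
  pubspec.yaml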
Step 4: Setting Up Permissions
Since we’ll be using the camera, update the AndroidManifest.xml to include the necessary permissions:
<uses-permission android:name="android.permission.CAMERA" />
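If you also plan to run the app on iOS, the camera plugin requires a camera usage description in ios/Runner/Info.plist; the description string below is just an example:

<key>NSCameraUsageDescription</key>
<string>This app uses the camera to capture images for classification.</string>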
Step 5: Building the App UI
Now, let’s create the UI for the application, which will allow users to take a picture and use TFLite to recognize the object.
Create a new file called home_screen.dart and add the following code:
import 'package:flutter/material.dart';
import 'package:camera/camera.dart';
import 'package:tflite/tflite.dart';

class HomeScreen extends StatefulWidget {
  @override
  _HomeScreenState createState() => _HomeScreenState();
}

class _HomeScreenState extends State<HomeScreen> {
  late CameraController _cameraController;
  bool _isInitialized = false;
  String _prediction = '';

  @override
  void initState() {
    super.initState();
    _initializeCamera();
    _loadModel();
  }

  // Set up the first available camera and start the preview.
  Future<void> _initializeCamera() async {
    final cameras = await availableCameras();
    _cameraController = CameraController(cameras[0], ResolutionPreset.medium);
    await _cameraController.initialize();
    setState(() {
      _isInitialized = true;
    });
  }

  // Load the TFLite model and labels bundled as assets.
  Future<void> _loadModel() async {
    String? res = await Tflite.loadModel(
      model: 'assets/your_model.tflite',
      labels: 'assets/labels.txt',
    );
    print('Model loaded: $res');
  }

  // Capture a photo and run the classifier on it.
  Future<void> _predictImage() async {
    try {
      final image = await _cameraController.takePicture();
      var recognitions = await Tflite.runModelOnImage(
        path: image.path,
        numResults: 3,
        threshold: 0.5,
      );
      setState(() {
        _prediction = recognitions != null && recognitions.isNotEmpty
            ? recognitions[0]['label']
            : 'No recognizable object found';
      });
    } catch (e) {
      print('Error capturing or recognizing image: $e');
    }
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(
        title: Text('TFLite Flutter Example'),
      ),
      body: Column(
        mainAxisAlignment: MainAxisAlignment.center,
        children: [
          if (_isInitialized)
            AspectRatio(
              aspectRatio: _cameraController.value.aspectRatio,
              child: CameraPreview(_cameraController),
            ),
          ElevatedButton(
            onPressed: _predictImage,
            child: Text('Capture and Identify'),
          ),
          if (_prediction.isNotEmpty) ...[
            Padding(
              padding: const EdgeInsets.all(16.0),
              child: Text('Prediction: $_prediction',
                  style: TextStyle(fontSize: 18, fontWeight: FontWeight.bold)),
            ),
          ]
        ],
      ),
    );
  }

  @override
  void dispose() {
    _cameraController.dispose();
    Tflite.close(); // Release the TFLite interpreter along with the camera.
    super.dispose();
  }
}
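The article does not show main.dart, so here is a minimal sketch of how HomeScreen could be wired into the app; the exact structure is an assumption, not part of the original project:

import 'package:flutter/material.dart';
import 'home_screen.dart';

void main() {
  // Required before using plugins such as camera during startup.
  WidgetsFlutterBinding.ensureInitialized();
  runApp(MaterialApp(
    title: 'TFLite Flutter Example',
    home: HomeScreen(),
  ));
}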
Step 6: Running the Model
In the code above, the _loadModel() method loads the TFLite model when the app starts. The _predictImage() method is triggered when the user presses the "Capture and Identify" button. It takes a picture using the camera, processes it using the TensorFlow Lite model, and displays the most probable label for the recognized object.
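If you also want to surface the confidence score, each entry returned by Tflite.runModelOnImage is typically a map containing index, label, and confidence. The helper below is a small sketch (formatPrediction is a hypothetical name, not part of the code above) showing how the prediction text could include it:

// Builds a display string from the top recognition returned by
// Tflite.runModelOnImage. Each entry is typically a map such as:
// {index: 0, label: 'Cat', confidence: 0.92}.
String formatPrediction(List<dynamic> recognitions) {
  if (recognitions.isEmpty) return 'No recognizable object found';
  final best = recognitions[0];
  final confidence = ((best['confidence'] as num?) ?? 0) * 100;
  return '${best['label']} (${confidence.toStringAsFixed(1)}%)';
}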
Step 7: Testing the App
To test the app, connect a physical device and run the application. Due to the camera plugin and TensorFlow Lite processing, testing on an emulator might not yield the best results.
- Launch the app and tap the “Capture and Identify” button.
- The camera will capture an image, and the app will display the predicted label (plus the confidence score, if you surface it as sketched in Step 6).
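With a device connected, launching the app is the standard Flutter workflow from the project directory:

flutter run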
Conclusion
By integrating TensorFlow Lite with Flutter using a model trained with Google Teachable Machine, you can create intelligent and interactive apps that provide on-device machine learning capabilities. Whether it’s for educational purposes, object recognition, or other smart applications, using TFLite is a great way to leverage AI in your Flutter projects.
The example we built is just the beginning — TensorFlow Lite can also be used for text classification, pose detection, and many more use cases. Explore its potential to create more advanced and interactive Flutter applications!