This is a camera app that continuously segments the objects in the frames seen by your device's camera.
FastSAM by Ultralytics is a real-time, CNN-based model designed to segment any object in an image with minimal computational resources. It builds on YOLOv8-seg and is tailored for high-speed and efficient segmentation across various tasks.
This project uses the FastSAM_s.pt variant!
- Real-time segmentation using CNNs
- Efficient instance segmentation via prompt-guided selection (not used in this Android demo)
- Built on YOLOv8-seg for fast and accurate performance
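In the YOLOv8-seg design that FastSAM builds on, the network emits a shared bank of prototype masks plus a small vector of coefficients per detection; instance masks are a linear combination of the prototypes pushed through a sigmoid. The following is a minimal NumPy sketch of that assembly step — the tensor shapes, names, and threshold are illustrative assumptions, not the model's exact values:

```python
import numpy as np

# Illustrative shapes: YOLOv8-seg style heads emit 32 mask
# coefficients per detection plus a bank of 32 prototype masks.
num_dets, num_protos, mh, mw = 3, 32, 160, 160

rng = np.random.default_rng(0)
coeffs = rng.standard_normal((num_dets, num_protos)).astype(np.float32)
protos = rng.standard_normal((num_protos, mh, mw)).astype(np.float32)

def assemble_masks(coeffs, protos, threshold=0.5):
    """Combine per-detection coefficients with shared prototypes.

    mask_logits[i] = sum_k coeffs[i, k] * protos[k]; a sigmoid and a
    threshold turn the logits into binary instance masks.
    """
    logits = np.einsum("nk,khw->nhw", coeffs, protos)
    probs = 1.0 / (1.0 + np.exp(-logits))
    return probs > threshold

masks = assemble_masks(coeffs, protos)
print(masks.shape)  # (3, 160, 160) — one binary mask per detection
```

Sharing one prototype bank across all detections is what keeps this fast: per-instance cost is a single matrix product rather than a full per-instance decoder pass.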
git clone https://github.com/CASIA-IVA-Lab/FastSAM.git
cd FastSAM
pip install -r requirements.txt
- Android Studio IDE: Tested on Android Studio Dolphin.
- Physical Android Device: Minimum OS version SDK 24 (Android 7.0 - Nougat) with developer mode enabled.
- Open Android Studio and select Open an existing Android Studio project.
- Navigate to ./Fast_SAM-android and click OK.
- If prompted for Gradle Sync, click OK.
- Connect your Android device, enable developer mode, and click the green Run arrow in Android Studio.
This project implements LiteRT for a segmentation task using the FastSAM model by Ultralytics.
Google has rebranded TensorFlow Lite as LiteRT, which is the runtime used in this project. Despite the new name, LiteRT remains the same high-performance on-device AI runtime, now with an expanded vision for multi-framework support, including models built in TensorFlow, PyTorch, JAX, and Keras. This change aims to make deploying machine learning models easier and more efficient across Android, iOS, and embedded devices. The name reflects Google's commitment to a lightweight, multi-framework AI future.
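Before a camera frame reaches the LiteRT interpreter, the app has to turn it into the tensor shape the model expects. A typical pipeline resizes the RGB frame, scales pixel values to [0, 1], and adds a batch dimension. The sketch below shows that preprocessing in NumPy; the 640x640 input size, the nearest-neighbor resize, and the [0, 1] scaling are assumptions for illustration, not necessarily this app's exact values:

```python
import numpy as np

def resize_nearest(img, size):
    """Nearest-neighbor resize, a stand-in for the app's image scaling."""
    h, w = img.shape[:2]
    rows = (np.arange(size) * h) // size
    cols = (np.arange(size) * w) // size
    return img[rows][:, cols]

def preprocess_frame(frame_rgb, size=640):
    """Resize, scale to [0, 1], and add a batch dimension, producing a
    [1, size, size, 3] float32 tensor of the kind a LiteRT interpreter
    would consume (shape and normalization are assumed here)."""
    x = resize_nearest(frame_rgb, size).astype(np.float32) / 255.0
    return np.expand_dims(x, axis=0)

frame = np.full((720, 1280, 3), 128, dtype=np.uint8)  # stand-in camera frame
tensor = preprocess_frame(frame)
print(tensor.shape, tensor.dtype)  # (1, 640, 640, 3) float32
```

On Android the same steps would run in Kotlin against the camera buffer before calling the interpreter; the check that matters is that the tensor's shape, dtype, and value range match what the converted .tflite model declares for its input.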
Google Cloud credits are provided for this project for the #AISprint September 2024.