# TensorFlow Lite

A library for running TensorFlow Lite models.

- once you've trained a model in TensorFlow, you can convert it to TF Lite for production use
- inspect TF Lite models using https://netron.app
- some good TF models for object detection are available (they need conversion)
- also see the project documentation
## Installation

1. Add the dependency to your `shard.yml`:

   ```yaml
   dependencies:
     tensorflow_lite:
       github: spider-gazelle/tensorflow_lite
   ```

2. Run `shards install`
## Usage

See the specs for basic usage, or have a look at imagine.

```crystal
require "tensorflow_lite"
```

You can use the example metadata extractor to obtain the metadata for TF Lite models downloaded from tfhub.dev.
### Edge TPUs

To use an Edge TPU, such as a Coral USB device:

```crystal
require "tensorflow_lite/edge_tpu"
```
To install the Edge TPU delegate:

```shell
# Add Google Cloud public key
wget -q -O - https://packages.cloud.google.com/apt/doc/apt-key.gpg | gpg --dearmor | sudo tee /etc/apt/trusted.gpg.d/coral-edgetpu.gpg > /dev/null

# Add Coral packages repository
echo "deb [signed-by=/etc/apt/trusted.gpg.d/coral-edgetpu.gpg] https://packages.cloud.google.com/apt coral-edgetpu-stable main" | sudo tee /etc/apt/sources.list.d/coral-edgetpu.list

# Install the lib
sudo apt update
sudo apt install libedgetpu-dev
```
To install the Coral USB drivers:

```shell
sudo apt install libedgetpu1-std

# OR for max frequency
sudo apt install libedgetpu1-max

# unplug and re-plug the Coral, or run this
sudo systemctl restart udev
```
NOTE:: when using a Coral and running `lsusb`, you need to look for either:

- Global Unichip Corp.
- Google Inc.

After running something on the chip, its identity changes to Google Inc., and you need to include the Google identity version in any docker files.
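As an illustration, the device enumerates under one of two USB IDs depending on whether an inference has been run on it yet. The specific vendor:product IDs below (`1a6e:089a` and `18d1:9302`) are the commonly reported ones for the Coral USB Accelerator and should be treated as an assumption; check your own `lsusb` output.

```shell
# Sample lsusb output: the Coral first enumerates as 1a6e:089a
# (Global Unichip Corp.), then re-enumerates as 18d1:9302 (Google Inc.)
# after the first inference runs on it. IDs here are illustrative.
sample='Bus 002 Device 003: ID 1a6e:089a Global Unichip Corp.
Bus 002 Device 004: ID 18d1:9302 Google Inc.'

# In practice you would run: lsusb | grep -E "Global Unichip|Google Inc."
echo "$sample" | grep -Ec "Global Unichip|Google Inc."  # prints: 2
```

Because the identity changes at runtime, docker device mappings that match only the Global Unichip identity will lose the device after first use.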
## Development

To update the TensorFlow Lite bindings, run `./generate_bindings.sh`.

The Dockerfile is used to build compatible TensorFlow Lite libraries for the target platforms.
There is a pre-built image available: `docker pull stakach/tensorflowlite:latest`

To build an image, run:

```shell
docker buildx build --progress=plain --platform linux/arm64,linux/amd64 -t stakach/tensorflowlite:latest --push .
```
To extract the libraries:

```shell
mkdir -p ./ext
docker pull stakach/tensorflowlite:latest
docker create --name tflite_tmp stakach/tensorflowlite:latest true
docker cp tflite_tmp:/usr/local/lib/libedgetpu.so ./ext/libedgetpu.so
docker cp tflite_tmp:/usr/local/lib/libtensorflowlite_c.so ./ext/libtensorflowlite_c.so
docker cp tflite_tmp:/usr/local/lib/libtensorflowlite_gpu_delegate.so ./ext/libtensorflowlite_gpu_delegate.so
docker rm tflite_tmp
```

This operation is performed post-install by this library.
Requires the TensorFlow Lite C library to be installed; this is handled automatically by `./build_tensorflowlite.sh`.

- there is a guide to building it manually
- you can use `./build_tensorflowlite.sh` to automate this
- it then requires `export LD_LIBRARY_PATH=/usr/local/lib` to run
- to test if it installed successfully, run `crystal ./src/tensorflow_lite.cr`
- this will output `Launching with tensorflow lite vx.x.x`

NOTE:: the lib is installed for local use via a postinstall script.
Make sure to distribute `libtensorflowlite_c.so` with your production app.
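A bare `export LD_LIBRARY_PATH=/usr/local/lib` overwrites any existing value. A minimal sketch of a safer form, assuming the libraries were installed to `/usr/local/lib`:

```shell
# Append /usr/local/lib to the loader search path without clobbering any
# existing value (adjust the path if you installed the libs elsewhere)
export LD_LIBRARY_PATH="/usr/local/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"

# Confirm the path is present in the search list
echo "$LD_LIBRARY_PATH" | tr ':' '\n' | grep -c '^/usr/local/lib$'
```

This form is convenient in CI or systemd unit files where an inherited `LD_LIBRARY_PATH` may already be set.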
## Contributing

1. Fork it (https://github.com/your-github-user/tensorflow_lite/fork)
2. Create your feature branch (`git checkout -b my-new-feature`)
3. Commit your changes (`git commit -am 'Add some feature'`)
4. Push to the branch (`git push origorigin my-new-feature`)
5. Create a new Pull Request
## Contributors

- Stephen von Takach - creator and maintainer