

Prepare Presentation on Building
TensorFlow Lite Models


Microproject Co-ordinator: Mrs. U. G. Sonwane

Team Members

Sr No  Name             Roll No.
1      Madhur Madane    32
2      Kalpesh Mahale   33
3      Hrishikesh Mali  34
4      Samarth Pawar    46
CONTENTS

What is a TensorFlow Lite model?
Uses of TensorFlow Lite models
How to build TensorFlow Lite models

What is a TensorFlow Lite Model?
TensorFlow Lite, often referred to as
TFLite, is an open-source library
developed by Google for deploying
machine learning models to edge
devices. Examples of edge deployments
include mobile (iOS/Android), embedded,
and other on-device targets.
Uses of TensorFlow Lite Models

Pattern recognition
Pattern recognition is a data-analysis method that uses machine learning algorithms to automatically recognize patterns and regularities in data.

Natural language processing
Is TensorFlow used for natural language processing? Natural Language Processing with TensorFlow brings TensorFlow and NLP together, giving you invaluable tools to work with the immense volume of unstructured data.

Object detection
An object detection model can identify which of a known set of objects might be present and provide information about their positions within the image.

Invoke inference
The term inference refers to the process of executing a TensorFlow Lite model on-device in order to make predictions based on input data.

Read output tensor values
In classic TensorFlow 1.x, the easiest way to evaluate the actual value of a Tensor object is to pass it to Session.run(); in TensorFlow Lite, you read the values from the interpreter's output tensor after invoking inference.
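The invoke-inference and read-output steps above can be sketched with the tf.lite.Interpreter Python API. This is a minimal sketch assuming TensorFlow is installed; the tiny untrained Keras model here is only a stand-in for a real trained model.

```python
import numpy as np
import tensorflow as tf

# Stand-in model: a single Dense layer (a real project would load a trained model).
model = tf.keras.Sequential([tf.keras.layers.Dense(2, input_shape=(4,))])

# Convert the Keras model to TFLite bytes so the example is self-contained.
tflite_bytes = tf.lite.TFLiteConverter.from_keras_model(model).convert()

# Load the model into the TFLite interpreter and allocate its tensors.
interpreter = tf.lite.Interpreter(model_content=tflite_bytes)
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed one input sample, invoke inference, then read the output tensor.
sample = np.random.rand(1, 4).astype(np.float32)
interpreter.set_tensor(input_details[0]["index"], sample)
interpreter.invoke()
output = interpreter.get_tensor(output_details[0]["index"])
print(output.shape)  # (1, 2)
```

On a device you would instead construct the interpreter with model_path pointing at a .tflite file shipped with the app.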
How to Build TensorFlow Lite Models

Steps to build a TFLite model

Step 1: Create or train a TensorFlow model using any of the TensorFlow APIs, such as the TensorFlow Python API, TensorFlow.js, or TensorFlow Java.
Step 2: Once you have your TensorFlow model, convert it to a TensorFlow Lite model using the TensorFlow Lite Converter.
Step 3: After converting your TensorFlow model to TensorFlow Lite, you may want to optimize the model (for example, by quantization) to make it more efficient for deployment on mobile or edge devices. This is done through the converter's optimization options or the TensorFlow Model Optimization Toolkit.
Step 4: Finally, deploy the optimized TensorFlow Lite model to mobile or edge devices, where it runs through the TensorFlow Lite Interpreter.
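Steps 2 and 3 above can be sketched as follows. A minimal sketch assuming TensorFlow is installed; the in-memory Keras model stands in for your own trained model (for a SavedModel on disk you would use tf.lite.TFLiteConverter.from_saved_model instead).

```python
import tensorflow as tf

# Step 1 stand-in: a tiny untrained Keras model.
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(3,))])

# Step 2: convert the TensorFlow model with the TensorFlow Lite Converter.
converter = tf.lite.TFLiteConverter.from_keras_model(model)

# Step 3: request default optimizations (e.g. dynamic-range quantization).
converter.optimizations = [tf.lite.Optimize.DEFAULT]

tflite_model = converter.convert()  # serialized FlatBuffer bytes

# Step 4 would ship this file to a device running the TFLite Interpreter.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```

Optimize.DEFAULT applies post-training dynamic-range quantization, which typically shrinks the model roughly 4x at a small accuracy cost.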
Thank you