Converting your inference graph file to a TensorFlow Lite (.tflite) file

python object_detection/export_tflite_ssd_graph.py \
    --input_type=image_tensor \
    --pipeline_config_path=object_detection/testing_models/pipeline.config \
    --trained_checkpoint_prefix=object_detection/testing_models/ripper/model.ckpt-17463 \
    --output_directory=object_detection/testing_models/tflite/ \
    --add_postprocessing_op=true

tflite_convert \
    --graph_def_file=/Users/sivakumarswaminathan/anaconda3/lib/python3.7/site-packages/tensorflow/models/research/object_detection/testing_models/tflite/tflite_graph.pb \
    --output_file=/Users/sivakumarswaminathan/anaconda3/lib/python3.7/site-packages/tensorflow/models/research/object_detection/testing_models/tflite/detect.tflite \
    --output_format=TFLITE \
    --input_shapes=1,300,300,3 \
    --input_arrays=normalized_input_image_tensor \
    --output_arrays='TFLite_Detection_PostProcess','TFLite_Detection_PostProcess:1','TFLite_Detection_PostProcess:2','TFLite_Detection_PostProcess:3' \
    --inference_type=QUANTIZED_UINT8 \
    --mean_values=128 \
    --std_dev_values=127 \
    --change_concat_input_ranges=false \
    --allow_custom_ops \
    --default_ranges_min=0 \
    --default_ranges_max=255
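The --mean_values and --std_dev_values flags define how the converter maps uint8 pixel values back to the real-valued input range the model expects: real_value = (quantized_value - mean) / std. A minimal sketch of that mapping in plain Python, using the values from the command above (128 and 127, so uint8 inputs land roughly in [-1, 1]):

```python
def dequantize(q, mean=128.0, std=127.0):
    """Map a uint8 quantized pixel back to the model's real-valued input range."""
    return (q - mean) / std

def quantize(r, mean=128.0, std=127.0):
    """Map a real value into the uint8 range the quantized model consumes."""
    q = int(round(r * std + mean))
    return max(0, min(255, q))  # clamp to valid uint8

# With mean=128, std=127: pixel 128 maps to 0.0, pixel 255 maps to ~1.0.
print(dequantize(128))  # 0.0
print(quantize(1.0))    # 255
```

This is only an illustration of the scaling the flags imply, not code the converter runs; the actual quantization happens inside tflite_convert.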
