Machine Learning Benchmark Tool (ML Bench) (AI Benchmark Tool)
Supported models:
- MobileNet v1
- MobileNet v2
- Inception v3
- Resnet v2 50
- SSD Mobilenet v1 (Object Detection)
Supported runtimes:
- Tensorflow Lite
- Tensorflow Mobile
- Android NN
- SNPE (for Qualcomm)
Sideload support:
How to sideload your model:
1. Convert your model to tflite (using toco) or dlc (using the SNPE conversion tool).
2. On your local machine, create a [Model Name] directory.
3. Copy your model file into the directory created in step 2.
4. Create a file called meta-data.json in the [Model Name] directory.
Example meta-data.json:
{
  "xres": 299,
  "yres": 299,
  "depth": 3,
  "input_type": "float",
  "output_type": "float",
  "input_name": "input:0",
  "output_name": "InceptionV3/Predictions/Reshape_1:0",
  "image_mean": 0,
  "image_std": 0,
  "accelerator": "dsp"
}
5. Push the [Model Name] directory to the target device using the command below:
adb push ./[Model Name] /sdcard/Android/data/com.etinum.mlbench/files/models/
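Steps 2-4 above can also be scripted. A minimal Python sketch that builds the sideload directory and writes meta-data.json (the directory name and field values here simply mirror the Inception v3 example above; substitute your own model's values):

```python
import json
import os

# Hypothetical example: package an Inception v3 model for sideloading.
# Values mirror the sample meta-data.json above; adjust for your model.
model_name = "InceptionV3"

# Metadata fields as described in step 4.
meta = {
    "xres": 299,           # input image width
    "yres": 299,           # input image height
    "depth": 3,            # color channels
    "input_type": "float",
    "output_type": "float",
    "input_name": "input:0",
    "output_name": "InceptionV3/Predictions/Reshape_1:0",
    "image_mean": 0,
    "image_std": 0,
    "accelerator": "dsp",
}

# Create the [Model Name] directory and write meta-data.json into it.
os.makedirs(model_name, exist_ok=True)
with open(os.path.join(model_name, "meta-data.json"), "w") as f:
    json.dump(meta, f, indent=2)
```

After copying your .tflite or .dlc file into the same directory, push the whole directory with the adb command from step 5.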