A minimal PaddlePaddle example: loading a model and running inference through the Python API

Import the fluid Python interface:

import paddle.fluid as fluid

Image handling:

from PIL import Image

Array operations:

import numpy as np
exe = fluid.Executor(fluid.CPUPlace())
# model_path is a string variable holding the directory of the saved inference model
[inference_program, feed_target_names, fetch_targets] = fluid.io.load_inference_model(model_path, exe)
# feed is the input dict; multiple inputs are supported
with fluid.program_guard(inference_program):
    results = exe.run(inference_program,
                      feed={feed_target_names[0]: np_img, feed_target_names[1]: sz},
                      fetch_list=fetch_targets,
                      return_numpy=False)
# results holds the outputs; a model can return multiple outputs
res = np.array(results[0]).flatten()
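
The snippet above assumes that np_img and sz have already been prepared. A minimal end-to-end sketch is shown below; the model directory ./infer_model, the test image name, the 608x608 input size, the normalization, and the assumption that the model expects an image tensor plus an image-shape tensor (as some detection models do) are illustrative placeholders, not part of the original example.

import numpy as np
import paddle.fluid as fluid
from PIL import Image

model_path = "./infer_model"   # directory written by fluid.io.save_inference_model (assumed)
exe = fluid.Executor(fluid.CPUPlace())
[inference_program, feed_target_names, fetch_targets] = fluid.io.load_inference_model(model_path, exe)

# Preprocess: read the image, resize, convert to a float32 NCHW batch of size 1
img = Image.open("test.jpg").convert("RGB").resize((608, 608))
np_img = np.array(img).astype("float32").transpose((2, 0, 1))   # HWC -> CHW
np_img = np_img[np.newaxis, :] / 255.0                          # add batch dim, scale to [0, 1]

# Second input: an image-shape tensor, assuming the exported model expects one
sz = np.array([[608, 608]]).astype("int32")

with fluid.program_guard(inference_program):
    results = exe.run(inference_program,
                      feed={feed_target_names[0]: np_img,
                            feed_target_names[1]: sz},
                      fetch_list=fetch_targets,
                      return_numpy=False)

res = np.array(results[0]).flatten()
print(res[:10])   # first few output values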

Note: installing the paddle Python API

pip install paddlepaddle
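
To verify the installation, the 1.x Fluid releases ship a small self-test (for GPU builds the package is paddlepaddle-gpu instead):

import paddle.fluid as fluid
fluid.install_check.run_check()   # prints a success message if Fluid is installed correctly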
