Introduction to the Online Service Code

The complete inference service code is located at https://github.com/ucloud/uai-sdk/tree/master/examples/mxnet/inference/mnist. The inference service itself is implemented in mnist_inference.py; we also provide conf.json and the model checkpoint_dir.
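
For orientation, conf.json tells the UAI serving framework which entry class to load and where the MXNet checkpoint lives. The conf.json shipped in the repository is authoritative; the sketch below is only an illustration, and every key and value in it (http_server, exec, main_class, main_file, mxnet, model_dir, model_name, num_epoch) should be treated as an assumption to be checked against the real file.

{
    "http_server": {
        "exec": {
            "main_class": "MnistModel",
            "main_file": "mnist_inference"
        },
        "mxnet": {
            "model_dir": "./checkpoint_dir",
            "model_name": "mnist-model",
            "num_epoch": 10
        }
    }
}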

mnist_inference.py

mnist_inference.py implements two functions: load_model and execute.

Creating the MnistModel Class

mnist_inference.py first needs to implement an online service class that inherits from MXNetAiUcloudModel (the MXNet online serving base class):

""" A very simple MNIST inferencer.
"""
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import numpy as np
import mxnet as mx
from PIL import Image
from collections import namedtuple
from uai.arch.mxnet_model import MXNetAiUcloudModel

class MnistModel(MXNetAiUcloudModel):
    """ Mnist example model
    """
    def __init__(self, conf):
        super(MnistModel, self).__init__(conf)

Implementing load_model

def load_model(self):
    sym, self.arg_params, self.aux_params = mx.model.load_checkpoint(self.model_prefix, self.num_epoch)
    self.model = mx.mod.Module(symbol=sym, context=mx.cpu())
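
As background, mx.model.load_checkpoint(prefix, epoch) restores a standard MXNet checkpoint: it reads the network definition from <prefix>-symbol.json and the weights for the given epoch from <prefix>-%04d.params, and returns the symbol together with arg_params and aux_params. Below is a minimal sketch of the same two calls outside the service; the prefix mnist-model and epoch 10 are illustrative, while in the service self.model_prefix and self.num_epoch are filled in by the base class from conf.json.

import mxnet as mx

# Illustrative prefix and epoch; load_checkpoint will read
#   mnist-model-symbol.json   (the network graph)
#   mnist-model-0010.params   (the weights saved at epoch 10)
sym, arg_params, aux_params = mx.model.load_checkpoint("mnist-model", 10)

# The Module only wraps the symbol here; the parameters are attached later
# with set_params, exactly as execute does below.
model = mx.mod.Module(symbol=sym, context=mx.cpu())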

Implementing execute

The implementation of execute consists of four steps:

  • Bind the input data shapes via self.model.bind
  • Load the model parameters via self.model.set_params
  • Run the inference request via self.model.forward
  • Convert each prediction result to a string and collect them into results (results is also a list, with a one-to-one correspondence to the data list)

def execute(self, data, batch_size):
    # 1. Bind the input and label shapes; for_training=False since this is inference only
    BATCH = namedtuple('BATCH', ['data', 'label'])
    self.model.bind(data_shapes=[('data', (batch_size, 1, 28, 28))],
                    label_shapes=[('softmax_label', (batch_size, 10))],
                    for_training=False)

    # 2. Load the model parameters restored in load_model
    self.model.set_params(self.arg_params, self.aux_params)

    # 3. Run inference: preprocess each input image and forward it through the model
    ret = []
    for i in range(batch_size):
        im = Image.open(data[i]).resize((28, 28))
        im = np.array(im) / 255.0
        im = im.reshape(-1, 1, 28, 28)
        self.model.forward(BATCH([mx.nd.array(im)], None))

        # 4. Convert each prediction to a string and append it to the results list
        predict_values = self.model.get_outputs()[0].asnumpy()
        val = predict_values[0]
        ret_val = np.array_str(np.argmax(val)) + '\n'
        ret.append(ret_val)
    return ret
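
To make the data flow concrete, here is a rough local-test sketch that drives the class directly, outside the UAI serving container. It assumes conf.json can be parsed with json.load and passed straight to MnistModel, and that 7.png is a 28x28 grayscale digit image; in production the platform builds the model from conf.json and feeds execute with the request bodies instead.

import json

# Hypothetical local test; the file names and conf handling are assumptions.
with open('conf.json') as f:
    conf = json.load(f)

model = MnistModel(conf)
model.load_model()

# data is a list of inputs that Image.open can read (paths or file objects).
# The result is a list of strings such as "7\n", one per input, in the same order.
results = model.execute(['7.png'], 1)
print(results)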