This example uses AOTInductor to compile Resnet50 into a `.so` file which is then executed using libtorch.
The handler C++ source code for this example can be found [here](src).

### Setup
1. Follow the instructions in [README.md](../../../../cpp/README.md) to build the TorchServe C++ backend.

```
cd serve/cpp
./build.sh
```

The build script will create the necessary artifacts for this example.
To recreate these by hand you can follow the `prepare_test_files` function of the [build.sh](../../../../cpp/build.sh) script; a rough sketch of the export step is also shown below.
We will need the handler `.so` file as well as the `resnet50_pt2.so` file containing the model and weights.
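For reference, the export step can be reproduced in Python roughly as follows. This is a hedged sketch, not the exact code in `build.sh`: it assumes a PyTorch version where `torch._export.aot_compile` and the `aot_inductor.output_path` option are available, and it exports with a fixed batch size for simplicity.

```python
# Hedged sketch: compile ResNet50 into an AOTInductor shared library (.so).
# The exact options used by cpp/build.sh may differ from this.
import os

import torch
from torchvision.models import ResNet50_Weights, resnet50

model = resnet50(weights=ResNet50_Weights.DEFAULT)
model.eval()

with torch.no_grad():
    # Fixed example input; batch size 2 mirrors batchSize in model-config.yaml.
    example_inputs = (torch.randn(2, 3, 224, 224),)
    so_path = torch._export.aot_compile(
        model,
        example_inputs,
        options={"aot_inductor.output_path": os.path.join(os.getcwd(), "resnet50_pt2.so")},
    )

print(f"AOTInductor artifact written to {so_path}")
```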

2. Create a [model-config.yaml](model-config.yaml)

```yaml
minWorkers: 1
maxWorkers: 1
batchSize: 2

handler:
  model_so_path: "resnet50_pt2.so"
  mapping: "index_to_name.json"
```

### Generate MAR file

Now let's generate the MAR file

```bash
torch-model-archiver --model-name resnetcppaot --version 1.0 --handler ../../../../cpp/_build/test/resources/examples/aot_inductor/resnet_handler/libresnet_handler:ResnetCppHandler --runtime LSP --extra-files index_to_name.json,../../../../cpp/_build/test/resources/examples/aot_inductor/resnet_handler/resnet50_pt2.so --config-file model-config.yaml --archive-format no-archive
```

Create a model store directory and move the generated model folder into it (with `--archive-format no-archive` the archiver produces a directory named `resnetcppaot` rather than a `.mar` file)

```
mkdir model_store
mv resnetcppaot model_store/
```

### Inference

Start TorchServe using the following command

```
torchserve --ncs --model-store model_store/
```

Register the model using the following command

```
curl -v -X POST "http://localhost:8081/models?initial_workers=1&url=resnetcppaot&batch_size=2&max_batch_delay=5000"

{
  "status": "Model \"resnetcppaot\" Version: 1.0 registered with 1 initial workers"
}
```

Run inference using the following command (see the sketch after the output for one way to create an input file like `0_png.pt`)

```
curl http://localhost:8080/predictions/resnetcppaot -T ../../../../cpp/test/resources/examples/aot_inductor/resnet_handler/0_png.pt
{
  "lens_cap": 0.0022578993812203407,
  "lynx": 0.0032067005522549152,
  "Egyptian_cat": 0.046274684369564056,
  "tiger_cat": 0.13740436732769012,
  "tabby": 0.2724998891353607
}
```
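The exact preprocessing and serialization the handler expects should be taken from the handler source in [src](src); as a rough, hedged sketch, an input file similar to `0_png.pt` could be produced from your own image like this (`kitten.png` and `my_input.pt` are placeholder names):

```python
# Hedged sketch: preprocess an image and save it as a tensor file for the curl call above.
# The normalization values and the torch.save format are assumptions; verify against
# the C++ handler's preprocess implementation before relying on this.
import torch
from PIL import Image
from torchvision import transforms

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

image = Image.open("kitten.png").convert("RGB")   # placeholder input image
tensor = preprocess(image)                        # shape: [3, 224, 224]
torch.save(tensor, "my_input.pt")                 # then: curl .../predictions/resnetcppaot -T my_input.pt
```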