
Commit f5b024c

example for using torchinductor caching
1 parent f6d8476 commit f5b024c

File tree

1 file changed (+6, -3 lines)


examples/pt2/README.md

@@ -56,14 +56,17 @@ torchserve takes care of 4 and 5 for you while the remaining steps are your responsibility
 
 ### Note
 
-`torch.compile()` is a JIT compiler and JIT compilers generally have a startup cost. If that's an issue for you make sure to populate these two environment variables to improve your warm starts.
+`torch.compile()` is a JIT compiler and JIT compilers generally have a startup cost. To reduce the warm-up time, `TorchInductor` already makes use of caching in `/tmp/torchinductor_USERID` on your machine.
+
+To persist this cache and/or to make use of the additional experimental caching feature, set the following:
 
 ```
 import os
-os.environ["TORCHINDUCTOR_CACHE_DIR"] = "1"
-os.environ["TORCHINDUCTOR_FX_GRAPH_CACHE"] = "/path/to/directory" # replace with your desired path
+os.environ["TORCHINDUCTOR_CACHE_DIR"] = "/path/to/directory" # replace with your desired path
+os.environ["TORCHINDUCTOR_FX_GRAPH_CACHE"] = "1"
 ```
+An example of how to use these with TorchServe is shown [here](./torch_inductor_caching/)
 
 ## torch.export.export
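A minimal sketch of how the two environment variables from this diff might be wired up in practice. The helper name `enable_inductor_caching` and the path `/tmp/my_inductor_cache` are illustrative, not part of the commit; the key point is that both variables need to be set before `torch` is imported so TorchInductor reads them at compile time.

```python
import os

def enable_inductor_caching(cache_dir: str) -> None:
    """Point TorchInductor's cache at a persistent directory.

    Call this before `import torch` so the settings take effect.
    """
    # Where TorchInductor writes its compiled artifacts (instead of
    # the default /tmp/torchinductor_USERID, which may not persist).
    os.environ["TORCHINDUCTOR_CACHE_DIR"] = cache_dir
    # Opt in to the experimental FX graph cache.
    os.environ["TORCHINDUCTOR_FX_GRAPH_CACHE"] = "1"

enable_inductor_caching("/tmp/my_inductor_cache")
print(os.environ["TORCHINDUCTOR_CACHE_DIR"])
```

After this setup, any subsequent `torch.compile()` call in the process can reuse cached compilation results across warm starts.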
