- Use a memory profiler to check which line of code is leaking memory (see the sketch after this list).
- Use `.detach()` and `.copy()` every time you want to save a copy of a tensor.
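As a rough sketch of both tips (the model, the sizes, and the `log_mem` helper are illustrative, not from the original setup), one can checkpoint allocated GPU memory around suspect lines and store losses as detached copies; here `.detach().clone()` stands in for the copy step:

```python
import torch
import torch.nn as nn

def log_mem(tag):
    # Hypothetical helper: print currently allocated CUDA memory in MB.
    if torch.cuda.is_available():
        print(f"{tag}: {torch.cuda.memory_allocated() / 1024**2:.1f} MB")

device = "cuda" if torch.cuda.is_available() else "cpu"
model = nn.Sequential(nn.Linear(512, 512), nn.ReLU()).to(device)
opt = torch.optim.SGD(model.parameters(), lr=0.01)

losses = []
for step in range(100):
    x = torch.randn(64, 512, device=device)
    log_mem("before forward")
    loss = model(x).mean()
    log_mem("after forward")
    opt.zero_grad()
    loss.backward()
    opt.step()
    # Save a detached copy; appending `loss` itself would keep the whole
    # autograd graph (including the Linear/ReLU activations) alive.
    losses.append(loss.detach().clone())
```

If the memory counter keeps climbing across iterations, comparing the printouts narrows the leak down to the lines between the two checkpoints.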
And what’s so weird is that the memory leak happened every time after calling the Linear and ReLU modules. This is also addressed here: Memory Leak with Linear and ReLU layer
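One hedged way to check whether Linear and ReLU are actually leaking, or whether some other reference is keeping their activations alive, is a small experiment like the one below (layer sizes and step count are arbitrary, and this is not the reproduction from the linked thread):

```python
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"
layer = nn.Sequential(nn.Linear(1024, 1024), nn.ReLU()).to(device)

kept = []  # references that keep each forward pass's graph alive
for step in range(5):
    x = torch.randn(256, 1024, device=device, requires_grad=True)
    out = layer(x)
    kept.append(out)  # comment this out and allocated memory stops growing
    if device == "cuda":
        torch.cuda.synchronize()
        print(step, torch.cuda.memory_allocated() // 2**20, "MB")
```

If memory only grows while the outputs are being retained, the modules themselves are fine and the fix is the `.detach()` advice above.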