Predicting memory usage? #5188
Unanswered
JeremyRand asked this question in Q&A
Replies: 0
If I have a given bin+param file for a model, and I know the shape of the input, is there a straightforward way to compute (or at least get a close estimate) how much memory the inference will consume, without actually executing the inference? If so, is there also a straightforward way to compute the memory consumption impact of enabling/disabling Winograd convolution (or other optimizations that are configurable in ncnn)?
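For context, here is the kind of back-of-the-envelope estimate I mean: sum the weight storage plus per-layer activation storage at FP32, with a correction factor when Winograd is enabled. This is a minimal sketch under my own assumptions (FP32 tensors, and that Winograd F(2,3) pre-transforms each 3x3 kernel into a 4x4 tile, growing weight storage by 16/9), not a claim about how ncnn's allocator actually behaves:

```python
def conv_memory_bytes(in_c, out_c, k, out_h, out_w,
                      winograd=False, dtype_bytes=4):
    """Rough memory estimate for one convolution layer.

    Hypothetical model for illustration only; ncnn's real allocation
    strategy (blob reuse, workspace buffers, packing) will differ.
    """
    if winograd and k == 3:
        # Assumed Winograd F(2,3): each 3x3 kernel becomes a 4x4
        # transformed tile, so weight storage grows by 16/9.
        weight = in_c * out_c * 4 * 4 * dtype_bytes
    else:
        weight = in_c * out_c * k * k * dtype_bytes
    # Output activation blob for this layer.
    activation = out_c * out_h * out_w * dtype_bytes
    return weight + activation

# Example: a 3x3 conv, 64 -> 64 channels, 56x56 output.
plain = conv_memory_bytes(64, 64, 3, 56, 56)
wino = conv_memory_bytes(64, 64, 3, 56, 56, winograd=True)
```

Summing this over every layer in the param file would give a lower bound, but I don't know how to account for ncnn's internal workspace and blob-reuse behavior, which is why I'm asking whether a supported way to compute this exists.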