We tried to measure memory growth by gathering memory-usage statistics while doing inference. Each time we run a single inference, we record how much memory it allocated. We found that:

("max memory allocated by a single inference" - "average memory allocated by a single inference") / ("max memory allocated by a single inference") = 0.46

which means the variation is very large: the per-inference allocation varies from about 700 MB to 1500 MB (with a max of roughly 1500 MB, a ratio of 0.46 implies an average of roughly 810 MB). Why does it vary so much?
Attached are the simple.java file and the test.sh script to reproduce this; one probably needs to modify the directory in test.sh accordingly.
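The attached simple.java is not reproduced here, but below is a minimal sketch of how per-inference allocation could be sampled to get the max/average figures described above. It assumes a HotSpot JVM (it uses the com.sun.management.ThreadMXBean extension), and runInference() is a hypothetical placeholder for whatever inference call the real simple.java makes.

```java
import java.lang.management.ManagementFactory;

// Minimal sketch: sample heap bytes allocated by the current thread around
// each inference call, then report max, average, and (max - avg) / max.
public class MemoryStatsSketch {
    public static void main(String[] args) {
        // HotSpot-specific extension that exposes per-thread allocation counters.
        com.sun.management.ThreadMXBean bean =
                (com.sun.management.ThreadMXBean) ManagementFactory.getThreadMXBean();
        long tid = Thread.currentThread().getId();

        int runs = 100;
        long max = 0;
        long total = 0;
        for (int i = 0; i < runs; i++) {
            long before = bean.getThreadAllocatedBytes(tid);
            runInference();                                  // hypothetical inference call
            long allocated = bean.getThreadAllocatedBytes(tid) - before;
            max = Math.max(max, allocated);
            total += allocated;
        }

        double avg = (double) total / runs;
        // The ratio discussed above: (max - avg) / max
        System.out.printf("max=%.1f MB, avg=%.1f MB, (max-avg)/max=%.2f%n",
                max / 1e6, avg / 1e6, (max - avg) / max);
    }

    // Placeholder so the sketch runs stand-alone; replace with the real model call.
    private static void runInference() {
        byte[] dummy = new byte[8 * 1024 * 1024];
        dummy[0] = 1;
    }
}
```

Per-thread allocation counters only see allocations made on the measuring thread; if the inference library allocates on worker threads or off-heap (native buffers), those bytes will not show up here, which is one possible source of the discrepancy being asked about.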