
Commit 82e9318

Ilia Cherniavskii authored and facebook-github-bot committed on Jun 25, 2020
Adjust CUDA memory leak test (pytorch#40504)
Summary: Pull Request resolved: pytorch#40504

Make the CUDA mem leak test not flaky.

Test Plan: python test/test_profiler.py

Differential Revision: D22215527

Pulled By: ilia-cher

fbshipit-source-id: 5f1051896342ac50cd3a21ea86ce7487b5f82a19
1 parent 85b87df commit 82e9318

1 file changed: +6 -4 lines changed

 

test/test_profiler.py
+6 -4
@@ -34,13 +34,15 @@ def test_mem_leak(self):
             torch.cuda.empty_cache()
             last_rss.append(p.memory_info().rss)
 
+        # with CUDA events leaking the increase in memory was ~7 MB between
+        # profiler invocations above
+        is_increasing = all(
+            [last_rss[idx] > last_rss[idx - 1] for idx in range(1, len(last_rss))])
         max_diff = -1
         for idx in range(1, len(last_rss)):
             max_diff = max(max_diff, last_rss[idx] - last_rss[idx - 1])
-
-        # with CUDA events leaking the increase in memory was ~7 MB,
-        # using much smaller threshold but not zero to reduce flakiness
-        self.assertTrue(max_diff < 100 * 1024)
+        self.assertTrue(not (is_increasing and max_diff > 100 * 1024),
+                        msg='memory usage is increasing, {}'.format(str(last_rss)))
 
 if __name__ == '__main__':
     run_tests()
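
For context on the new check: instead of requiring every per-iteration RSS delta to stay under the threshold, the updated test only fails when RSS grows on every sample AND at least one single-step increase exceeds 100 KB, which tolerates benign allocator noise while still catching a steady leak (the leaking CUDA events previously grew memory by roughly 7 MB per profiler invocation). Below is a minimal standalone sketch of that heuristic, not the test itself; psutil is what the test uses to read RSS, and run_profiled_workload() is a hypothetical placeholder for the profiled CUDA workload.

import os

import psutil  # the test reads RSS via psutil Process.memory_info().rss


def rss_is_leaking(last_rss, threshold=100 * 1024):
    # Flag a leak only when both signals agree:
    #   1) RSS increased on every consecutive sample, and
    #   2) some single-step increase exceeded the threshold (100 KB).
    is_increasing = all(
        last_rss[idx] > last_rss[idx - 1] for idx in range(1, len(last_rss)))
    max_diff = max(
        (last_rss[idx] - last_rss[idx - 1] for idx in range(1, len(last_rss))),
        default=-1)
    return is_increasing and max_diff > threshold


p = psutil.Process(os.getpid())
last_rss = []
for _ in range(5):
    # run_profiled_workload()  # hypothetical: run the profiled CUDA ops here
    last_rss.append(p.memory_info().rss)
assert not rss_is_leaking(last_rss), 'memory usage is increasing, {}'.format(last_rss)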
