fix(diff/implicit): fix memory leak of OOP APIs #113
Conversation
Codecov Report

Base: 70.34% // Head: 70.26% // Decreases project coverage by -0.08%.

Additional details and impacted files:

```diff
@@            Coverage Diff             @@
##             main     #113      +/-   ##
==========================================
- Coverage   70.34%   70.26%   -0.08%
==========================================
  Files          71       72       +1
  Lines        2981     3000      +19
==========================================
+ Hits         2097     2108      +11
- Misses        884      892       +8
```

Flags with carried forward coverage won't be shown.
Hi @XuehaiPan @JieRen98, I see that this was just merged, and I seem to be running into a memory leak of my own.

EDIT: Do you also know when the release containing this fix will happen?

EDIT 2: Apparently my memory leak was not fixed by using the latest fix, so my guess is that it's not the same issue.
Hi @zaccharieramzi, you could reference the changes in examples/iMAML/imaml_omniglot.py (5cb7b6). We previously created the inner network inside the for-loop, so a new inner network was created on every iteration. That was the cause of the memory leak:

```python
for task_id in range(num_tasks):
    inner_net = InnerNet(meta_params, ...)
    inner_net.solve(data, ...)
    ...
    gc.collect()
```

Now we create the inner networks once, outside the loop, and reuse them across iterations:

```python
inner_nets = [InnerNet(meta_params, ...) for task_id in range(num_tasks)]

for task_id in range(num_tasks):
    inner_net = inner_nets[task_id]
    inner_net.reset_parameters(...)
    inner_net.solve(data, ...)
    ...
```

If this solution does not resolve your issue, you can open a new issue for it.
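For readers landing here, the snippets above can be condensed into a minimal, self-contained sketch in plain PyTorch. Note that `InnerNet`, `reset_parameters`, and `solve` below are illustrative stand-ins modelled on the imaml_omniglot example, not TorchOpt's actual OOP API; the point is only that the per-task modules are allocated once and then reset in place inside the loop instead of being re-created.

```python
# Minimal sketch of the "allocate once, reset per iteration" pattern in plain
# PyTorch. InnerNet, reset_parameters, and solve are illustrative stand-ins
# for the objects used in examples/iMAML/imaml_omniglot.py.
import torch
import torch.nn as nn


class InnerNet(nn.Module):
    def __init__(self, meta_params: torch.Tensor) -> None:
        super().__init__()
        self.meta_params = meta_params  # shared meta parameters, never re-created
        self.fc = nn.Linear(meta_params.numel(), 1)  # task-specific weights

    def reset_parameters(self) -> None:
        # Re-initialize the task-specific weights in place instead of
        # constructing a brand-new module on every iteration.
        self.fc.reset_parameters()

    def solve(self, data: torch.Tensor) -> torch.Tensor:
        # Placeholder for the inner-loop optimization on one task; just a
        # forward pass touching both the task weights and the meta parameters.
        return self.fc(data).sum() + self.meta_params.sum()


def run(meta_params: torch.Tensor, tasks: list[torch.Tensor]) -> None:
    num_tasks = len(tasks)
    # Create the inner networks once, outside the task loop ...
    inner_nets = [InnerNet(meta_params) for _ in range(num_tasks)]

    for task_id, data in enumerate(tasks):
        # ... and reuse them, resetting parameters instead of re-allocating.
        inner_net = inner_nets[task_id]
        inner_net.reset_parameters()
        loss = inner_net.solve(data)
        loss.backward()


if __name__ == '__main__':
    meta_params = torch.randn(8, requires_grad=True)
    tasks = [torch.randn(4, 8) for _ in range(3)]
    run(meta_params, tasks)
```

Reusing the preallocated modules keeps the number of live module objects constant across iterations, which is the behavior the merged fix restores.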
I solved my issue by getting rid of […]. I think the problem might be that in the […].