Fix severe node performance issue due to buffer allocation bug, fix undefined behavior which breaks node > 18.7.0 #82

Open · wants to merge 3 commits into base: latest

Changes from 1 commit
Change to using std::min() when allocating Buffers on the C++ side. This results in Buffers being allocated at the requested size, up to kMaxLength (0x3fffffff), instead of the current behavior of always allocating kMaxLength. Since these buffers are allocated in Node's external memory (limited to 64M), the old behavior would very quickly cause massive GC pressure, because the GC is triggered on every new Buffer allocation once the external memory limit is reached. It appears the old behavior was not intentional and should always have been std::min(). This should fix #72.
doggkruse authored and afreer committed Sep 19, 2023
commit 03254fe899c0fbec5606b8892c4be5790c1b4d9a
src/binding.cc (2 changes: 1 addition & 1 deletion)
@@ -141,7 +141,7 @@ class InstanceData final : public RefNapi::Instance {
 ab = it->second.ab.Value();

 if (ab.IsEmpty()) {
-  length = std::max<size_t>(length, kMaxLength);
+  length = std::min<size_t>(length, kMaxLength);
   ab = Buffer<char>::New(env, ptr, length, [this](Env env, char* ptr) {
     UnregisterArrayBuffer(ptr);
   }).ArrayBuffer();
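
For context, a minimal standalone sketch (hypothetical values, not the module's actual allocation path) of why this one-character change matters: std::max() effectively ignores the requested size and always yields the cap, while std::min() keeps the requested size and only clamps oversized requests.

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdio>

// Illustrative constant mirroring the 0x3fffffff cap from the commit message.
constexpr std::size_t kMaxLength = 0x3fffffff;

int main() {
  std::size_t requested = 4096;  // a typical small Buffer request

  // Old behavior: std::max() discards the requested size, so every
  // allocation is sized at kMaxLength bytes (~1 GiB).
  std::size_t old_size = std::max<std::size_t>(requested, kMaxLength);

  // Fixed behavior: std::min() keeps the requested size and only clamps
  // requests that exceed kMaxLength.
  std::size_t new_size = std::min<std::size_t>(requested, kMaxLength);

  std::printf("old: %zu bytes, new: %zu bytes\n", old_size, new_size);
  return 0;
}
```

Under the old clamp, each Buffer counted roughly 1 GiB against the external memory limit described in the commit message, so the limit was exceeded almost immediately and a GC ran on every subsequent allocation.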