Description
- **Version**: v14.8.0
- **Platform**: Linux 5.3.7-050307-generic #201910180652 SMP Fri Oct 18 10:56:47 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux
- **Subsystem**: Buffer
What steps will reproduce the bug?
There is an inconsistency between calling `Buffer.alloc(size, fill)` and calling `let b = Buffer.alloc(size)` followed by `b.fill(fill)`. According to the docs, `size` can be any integer up to `buffer.constants.MAX_LENGTH`, which on my machine is just under 4 GB.
When allocating a buffer larger than 2^31 - 1 bytes using `Buffer.alloc(size, fill)`, it throws `ERR_OUT_OF_RANGE`:

```js
let b = Buffer.alloc(2200000000,1)
Uncaught: RangeError [ERR_OUT_OF_RANGE]: The value of "end" is out of range. It must be >= 0 && <= 2200000000. Received 2200000000
```
But when split into two lines it doesn't throw:

```js
let b = Buffer.alloc(2200000000)
b.fill(1)
```
Also, if `offset` and `end` are provided to `fill`, the same error occurs:

```js
let b = Buffer.alloc(2200000000)
b.fill(1,0,b.length)
Uncaught: RangeError [ERR_OUT_OF_RANGE]: The value of "end" is out of range. It must be >= 0 && <= 2200000000. Received 2200000000
```
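A minimal sketch that exercises both call styles side by side. `SIZE` is kept small here so the snippet is cheap to run; raising it above 2^31 - 1 (e.g. to 2200000000, with enough free memory) reproduces the divergence on v14.8.0:

```js
// Compare the one-step and two-step ways of allocating a filled buffer.
// SIZE is small here; set it above 2**31 - 1 to reproduce the reported bug,
// where the one-step path throws ERR_OUT_OF_RANGE on v14.8.0 while the
// two-step path succeeds.
const SIZE = 8;

function allocOneStep(size, fill) {
  try {
    return { buf: Buffer.alloc(size, fill) };
  } catch (err) {
    return { err };
  }
}

function allocTwoStep(size, fill) {
  try {
    const buf = Buffer.alloc(size);
    buf.fill(fill);
    return { buf };
  } catch (err) {
    return { err };
  }
}

const a = allocOneStep(SIZE, 1);
const b = allocTwoStep(SIZE, 1);

console.log('one-step:', a.err ? a.err.code : 'ok');
console.log('two-step:', b.err ? b.err.code : 'ok');
```

For sizes at or below 2^31 - 1 the two paths agree, which is what makes the behavior above surprising.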
How often does it reproduce? Is there a required condition?
Always
What is the expected behavior?
At a minimum, the two call styles should behave consistently. If the maximum buffer size can be larger than 2^31 - 1, then it should also be possible to fill a buffer of that size.