Limit all memory allocations to configurable values. #2704
Conversation
// BMPs with negative-or-zero width are invalid. Also, reject extremely wide images
// to keep the math sane. And reject int.MinValue as a height because you can't
// get its absolute value (because -int.MinValue is one more than int.MaxValue).
const int k64KWidth = 65535;
Should this not be a configurable value? I can see use cases where someone might want to load very large images; they would be responsible for ensuring they have enough memory. This is a sensible default, but we should allow a user to bypass it where they understand the danger. I can also see use cases in the ImageSharp.Web world where we might want even lower limits to prevent bad actors from blowing out memory.
We might even want to consider a more generic max width/max height set of rules that can be applied to all image formats uniformly.
True, updated.
I've now added configuration to
Most importantly, we need good evidence for the default buffer limits we define.
Instead of defining limits for 1D buffers and 2D buffers, I would prefer to define them against contiguous buffers and memory groups. Each should have its limit defined in bytes. I don't see value in dimension limits: an 8 x 2 image needs the same buffer size as a 4 x 4 image. In practice this means that the 2D limit logic has to be pushed down to the memory group allocation code. I think the resulting code will be cleaner.
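To make the byte-based proposal concrete, here is a minimal sketch of what such options could look like; the type name, property names, and default values are assumptions for illustration, not the actual ImageSharp API.

// Hypothetical options type; names and defaults are illustrative assumptions.
public class AllocationLimitOptions
{
    // Maximum size of a single contiguous buffer, in bytes.
    public long MaxContiguousBufferSizeInBytes { get; set; } = 64L * 1024 * 1024;

    // Maximum total size of a (possibly discontiguous) memory group, in bytes.
    public long MaxMemoryGroupSizeInBytes { get; set; } = 4L * 1024 * 1024 * 1024;
}

// With byte-based limits the shape of the image is irrelevant: an 8 x 2 buffer
// and a 4 x 4 buffer of the same pixel type both occupy 16 * sizeof(TPixel)
// bytes and are treated identically.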
/// Defaults to <value><see cref="DefaultMaxAllocatableSize2DInBytes"/>.</value>
/// </summary>
public Size MaxAllocatableSize2D { get; set; } = new Size(ushort.MaxValue, ushort.MaxValue);
public Size MaxAllocatableSize2DInBytes { get; set; } = DefaultMaxAllocatableSize2DInBytes;
Instead of defining 2D dimension limits, we should define a memory group limit in bytes. Also see my comment on GetDefaultMaxAllocatableSize2DInBytes.
We need it to somehow reflect the actual dimensions of a buffer (to the user, an image). A memory group is an implementation detail, but if I can say "don't allow a total allocation of more than X, Y" then the groups are covered.
I would prefer to see it the other way around. If you define a limit for a discontiguous allocation, it will cover the image limit.
A memory group is an implementation detail
Not entirely; we have documented that our buffers are discontiguous, and IMemoryGroup<T> is public. Regardless, we don't need to involve those concepts when defining the limits. We can just document that the maximum in-memory image size is X (mega/giga/tera)bytes. It is much better to reason about from a memory management perspective.
If we believe we need dimension limits (e.g. because various formats have them anyway), we can apply them separately, either in Image<T> or in Buffer2D<T> code, without involving sizeof(TPixel). But for me that concern is orthogonal to the issue this PR is aimed at fixing.
The issue directly stems from an attempt to allocate an image that is too large. Fixing the additional allocation limits was something we needed to do in addition to that public-facing allocation.
By placing a limit at the point of the allocation call, before we even start looking at memory groups, I avoid the possibility of over-allocation in all cases. The limits are not dimensional limits; they reflect dimensional limits. We can now say we've defaulted to an image of X*Y at this pixel format bit depth. It's a starting point that users can understand and update when required.
Whether our buffers are discontiguous doesn't really change things. We've already established that attempting to allocate an image of overly large dimensions will cause issues.
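As a rough sketch of the idea described above (the names and exception type are assumptions, not the PR's actual code), the configured value can stay dimensional while the check performed at the allocation entry point compares byte counts:

using System;

internal static class AllocationGuard
{
    // Illustrative guard: the configured default is dimensional (e.g. 65535 x 65535),
    // but what is enforced is the byte count those dimensions imply for the pixel format.
    public static void ThrowIfOverLimit(int width, int height, int bytesPerPixel, int maxWidth, int maxHeight)
    {
        long requestedBytes = (long)width * height * bytesPerPixel;
        long limitBytes = (long)maxWidth * maxHeight * bytesPerPixel;

        if (requestedBytes > limitBytes)
        {
            throw new InvalidOperationException(
                $"Requested allocation of {requestedBytes} bytes exceeds the configured limit of {limitBytes} bytes.");
        }
    }
}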
int maxLengthInRgba32Bytes = maxLength * Unsafe.SizeOf<Rgba32>();
return new(maxLengthInRgba32Bytes, maxLengthInRgba32Bytes);
The largest memory group that will pass the check against this limit is of size width * sizeof(T) * height * sizeof(T) = 65535^2 * sizeof(Rgba32)^2, while we want it to be 65535^2 * sizeof(Rgba32).
We want 65535 * 4 in each direction. That's what I'm returning.
No we don't... We want 65535 * 4 * 65535. Will update.
That doesn't make sense mathematically.
If we want the image area to be no larger than 65535 * 65535 Rgba pixels, that area is 65535 * 65535 * sizeof(Rgba) ≈ 16 GiB. The logic in the PR will allow ≈ 64 GiB instead.
Yeah, that's what I said in my follow-up.
The area is 65535 rows of 65535 Rgba32, so (65535 * 4) * 65535.
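For reference, a quick worked calculation of the two readings, assuming 4 bytes per Rgba32 pixel:

long pixels = 65535L * 65535L;                         // 4,294,836,225 pixels
long areaBytes = pixels * 4;                           // ~17.2 GB (~16 GiB): 65535 rows of 65535 Rgba32
long squaredFactorBytes = (65535L * 4) * (65535L * 4); // ~68.7 GB (~64 GiB): what a width * height check
                                                       // against Size(65535 * 4, 65535 * 4) would allow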
/// Defaults to <value>1073741823.</value>
public int MaxAllocatableSize1D { get; set; } = int.MaxValue / 2;
/// Defaults to <value><see cref="GetDefaultMaxAllocatableSize1DInBytes"/>.</value>
public int MaxAllocatableSize1DInBytes { get; set; } = DefaultMaxAllocatableSize1DInBytes;
This is in fact a limit for a single contiguous buffer.
Well, yes, but you have a different internal value for that, which is used for contiguous chunks within a 2D buffer. I had to create something new.
throw new InvalidMemoryOperationException($"Parameter \"{paramName}\" must be between or equal to {min} and {max}, was {value}");
throw new InvalidMemoryOperationException($"Parameter \"{paramName}\" must be between or equal to {min} and {max}, was {valueInBytes}");
For value > max, it would be nice to have an exception message that is more indicative of a huge allocation attempt.
I dunno, they get a stack trace, they know it's from an allocation attempt.
=> MemoryGroup<T>.Allocate(this, totalLength, bufferAlignment, options);

internal static void MemoryGuardMustBeBetweenOrEqualTo(int value, int min, int max, string paramName)
internal static void MemoryGuardMustBeBetweenOrEqualTo<T>(int value, int min, int max, string paramName)
min is 0 at each call of this method. I think it would be better to redefine this method as a buffer length validator, and throw a different exception (or use a different exception message) for length < 0. Also, length == 0 calls are suspicious, but maybe there are valid cases (when not, we should disallow them at higher levels).
We allow 0 for some reason, returning an empty buffer. Don't ask me why 🤷
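A minimal sketch of the validator shape suggested above, assuming hypothetical names and using a plain InvalidOperationException in place of the library's InvalidMemoryOperationException; zero-length requests are deliberately allowed, matching the current behaviour of returning an empty buffer:

using System;

internal static class BufferLengthGuard
{
    public static void Validate(long lengthInBytes, long maxInBytes, string paramName)
    {
        if (lengthInBytes < 0)
        {
            // A negative length is a caller bug, not a memory-limit problem.
            throw new ArgumentOutOfRangeException(paramName, lengthInBytes, "Buffer length cannot be negative.");
        }

        if (lengthInBytes > maxInBytes)
        {
            // Distinct message so an over-sized allocation attempt is obvious from the exception alone.
            throw new InvalidOperationException(
                $"Attempted to allocate {lengthInBytes} bytes for \"{paramName}\", exceeding the configured maximum of {maxInBytes} bytes.");
        }

        // lengthInBytes == 0 falls through: the allocator returns an empty buffer.
    }
}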
@antonfirsov I really don't understand what you mean by this.
It is at the very root of the memory allocation code. We limit any and all allocations from within.
65535*65535 is the maximum valid Jpeg dimension. 32767x32767 is half of that and what System.Drawing limits you to on all platforms. The 1D buffer allocations could possibly be lower than 64/32MB, but I chose those values because we already attempt to allocate 64MB in our memory tests.
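Putting the other cited defaults into byte terms (assuming 4 bytes per Rgba32 pixel; the ~17 GB figure for 65535 x 65535 was worked out above):

long systemDrawingStyleLimit = 32767L * 32767L * 4; // ~4.3 GB for a 32767 x 32767 Rgba32 image
long oneDBufferDefault = 64L * 1024 * 1024;         // 64 MB, matching the allocation size already used in the memory tests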
@antonfirsov @tocsoft Can you please review at your earliest opportunity? I have something in place now that will enforce a limit while maintaining a sane, configurable API.
@JimBobSquarePants I decided to communicate my ideas in an alternative PR #2706, let me know what you think.
Your PR Rulez, mine droolz.
Closed in favor of #2706
Prerequisites
Description
While the specification does not impose a limit on BMP dimensions, on some machines attempting to decode a malformed or extremely large BMP can lead to OOM exceptions.
This change limits the dimensions to match the default set by browsers such as Firefox.
Fixes #2696