V12.1.2 check maximum uncompressed file size #799
This looks good to me, and yes, I'd add a max file limit as well.
- Jim
On 5/30/20 6:49 AM, Elar Lang wrote:
V12.1.2
<https://github.com/OWASP/ASVS/blob/master/4.0/en/0x20-V12-Files-Resources.md#v121-file-upload-requirements>
Current requirement:
V12.1.2 Verify that compressed files are checked for "zip bombs" -
small input files that will decompress into huge files thus
exhausting file storage limits.
Problems:
* "zip bombs" is just a name, but still leads to zip format. Problem
is general for "compressed files", including docx, xlsx etc.
* "... thus exhausting file storage limits" - this is only valid,
when server unpack compressed files on the server side. Uploading
"zip bombs" for other users to download gives also attack vector
against site visitors/users.
Proposal ideas (but not wording):
* "Verify that the application checks compressed files against
maximum allowed uncompressed size"
* We still should mention "ZIP bomb" as an example attack
Question:
* Should we address also "maximum allowed amount of files in
compressed container" limit?
|
@elarlang I agree with the "Proposal", please provide proposed wording. |
Situation: a ZIP (or whatever container) file is uploaded; the file contains the "max amount of allowed files in ZIP" container files, and those get unpacked into one folder. Then a few more files of this kind arrive. From some moment it may take too many resources to get a file listing from that folder, and that may cause a DoS attack. |
Ok, happy to see a proposed wording. Separate requirement? |
Proposals for discussion:
Plain:
Verify that the application checks compressed files against a maximum allowed uncompressed size and a maximum number of files inside the container.
Extensions:
Verify that the application checks compressed files (e.g. zip, gz, docx, odt) against a maximum allowed uncompressed size and a maximum number of files inside the container.
Explanations:
Verify that the application checks compressed files (e.g. zip, gz, docx, odt) against a maximum allowed uncompressed size and a maximum number of files inside the container, to avoid "ZIP bomb" attacks against the application and its users.
|
PS: there is no way to verify the zip output size without walking the entire directory first; it's a pain! I typically unzip into a folder with a quota and let the operating system fail the write if the contents are too big.
Point being, it’s a pain.
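A minimal sketch of that quota approach, assuming a Linux host with tmpfs available; the 100 MB limit, paths, and file name are all illustrative:

# Extract into a freshly created, size-capped mount so the OS enforces the
# quota; unzip fails with a write error once the limit is hit, regardless
# of what the archive's own headers claim.
DEST=$(mktemp -d)
sudo mount -t tmpfs -o size=100m,nosuid,nodev tmpfs "$DEST"
if ! unzip -q untrusted.zip -d "$DEST"; then
    echo "rejected: archive exceeded the extraction quota" >&2
fi
sudo umount "$DEST"
rmdir "$DEST"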
On Jun 10, 2020, at 5:40 AM, Elar Lang ***@***.*** wrote:
Proposals for discussion:
Plain:
Verify that the application checks compressed files against a maximum allowed uncompressed size and a maximum number of files inside the container.
Extensions:
Verify that the application checks compressed files (e.g. zip, gz, docx, odt) against a maximum allowed uncompressed size and a maximum number of files inside the container.
Explanations:
Verify that the application checks compressed files (e.g. zip, gz, docx, odt) against a maximum allowed uncompressed size and a maximum number of files inside the container, to avoid "ZIP bomb" attacks against the application and its users.
|
By running unzip you have already accepted the attack (even if it does not fill your disk quota, it uses resources for unpacking). What about zipinfo and exiftool? Both report the declared uncompressed size without extracting anything; see the outputs in the quoted reply below.
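A rough sketch of such a pre-check built on the zipinfo totals line; the limits and file name are illustrative, and since these numbers come from attacker-controlled archive metadata, an OS-level quota on the extraction directory is still a sensible backstop:

# zipinfo -t prints only the totals line, e.g.
# "1 file, 104857600 bytes uncompressed, 101774 bytes compressed: 99.9%"
MAX_FILES=1000
MAX_BYTES=$((100 * 1024 * 1024))   # 100 MB uncompressed
set -- $(zipinfo -t test.zip)
ENTRIES=$1
UNCOMPRESSED=$3
if [ "$ENTRIES" -gt "$MAX_FILES" ] || [ "$UNCOMPRESSED" -gt "$MAX_BYTES" ]; then
    echo "rejected: $ENTRIES entries, $UNCOMPRESSED bytes uncompressed" >&2
    exit 1
fi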
|
Those utilities walk the entire zip tree. It's a brute-force calculation. The zip standard does not reveal its unzipped size.
Try building a zip with millions of small compressible files and see how much CPU those utilities consume to answer the question.
On Jun 10, 2020, at 8:51 AM, Elar Lang ***@***.*** wrote:
With running unzip you already got the attack.
What about:
$ zipinfo test.zip
Archive: test.zip
Zip file size: 101940 bytes, number of entries: 1
-rw-r--r-- 3.0 unx 104857600 tx defX 20-Mar-20 08:59 test.pdf
1 file, 104857600 bytes uncompressed, 101774 bytes compressed: 99.9%
and
$ exiftool test.zip
ExifTool Version Number : 10.80
File Name : test.zip
Directory : .
File Size : 100 kB
File Modification Date/Time : 2020:03:20 08:59:58+02:00
File Access Date/Time : 2020:06:10 15:48:45+03:00
File Inode Change Date/Time : 2020:03:23 10:41:51+02:00
File Permissions : rw-r--r--
File Type : ZIP
File Type Extension : zip
MIME Type : application/zip
Zip Required Version : 20
Zip Bit Flag : 0x0002
Zip Compression : Deflated
Zip Modify Date : 2020:03:20 08:59:29
Zip CRC : 0xba4c670c
Zip Compressed Size : 101774
Zip Uncompressed Size : 104857600
Zip File Name : test.pdf
|
In practice, adding a lot of (empty) files increases the output zip file size quite fast:
* 100 000 files = 8.4 MB, generated in 6 sec
* 500 000 files = 43.7 MB, generated in 30 sec
It's possible to decrease the file sizes a bit, but it gives some idea. An example (random) solution for detecting the amount of files: yes, it goes through the listing, but it shows that the time needed is not something you need a calendar for:

$ time zipinfo -1 alotoffiles_500000.zip | wc -l
500000

real	0m1,724s
user	0m1,171s
sys	0m1,951s

Practical advice could be: allow a smaller "max upload file size" for archives from untrusted sources. So I would say that within a relatively small "max upload file size" you will not have a "huge, catastrophically problematic amount of files in the container" from a detection perspective. If you keep unpacking them constantly into the same folder, then you will have one. This is my knowledge at the moment, based on quick research. |
OK, this is awesome :) I like real data, and I am starting to see your
point. Point being, if there is just a reasonable zip file size limit
before unzipping, then we should be OK against CPU exhaustion attacks.
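A tiny sketch of that pre-unzip cap on the compressed file itself; the 10 MB limit and file name are illustrative, and stat -c is GNU coreutils (BSD/macOS would use stat -f %z):

# Cap the compressed upload before doing any zip processing at all.
MAX_ZIP_BYTES=$((10 * 1024 * 1024))   # 10 MB
SIZE=$(stat -c %s untrusted.zip)
if [ "$SIZE" -gt "$MAX_ZIP_BYTES" ]; then
    echo "rejected: compressed file is $SIZE bytes" >&2
    exit 1
fi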
Can you try 500 000 000 files, if you have the machine to handle it?
Also, we still need to be wary of zip bombs: a 42 KB zip file that unzips
into petabytes. https://en.wikipedia.org/wiki/Zip_bomb
- Jim
On 6/10/20 1:40 PM, Elar Lang wrote:
In practice, adding a lot of (empty) files increases the output zip file size quite fast:
* 100 000 files = 8.4 MB, generated in 6 sec
* 500 000 files = 43.7 MB, generated in 30 sec
It's possible to decrease the file sizes a bit, but it gives some idea.
An example (random) solution for detecting the amount of files: yes, it
goes through the listing, but it shows that the time needed is not
something you need a calendar for:

$ time zipinfo -1 alotoffiles_500000.zip | wc -l
500000

real	0m1,724s
user	0m1,171s
sys	0m1,951s

Practical advice could be: allow a smaller "max upload file size" for
archives from untrusted sources.
So I would say that within a relatively small "max upload file size" you
will not have a "huge, catastrophically problematic amount of files in
the container".
This is my knowledge at the moment, based on quick research.
|
Regarding trying 500 000 000 files: at the moment it could take too much time (to provide quick answers), but I'll definitely investigate it further.
Regarding the 42 KB zip bomb: this one requires recursive unpacking. Another tricky topic: should we warn about nested zips and avoid recursive unpacking, or disallow zips inside zips?
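One illustrative way to enforce a "no zip inside zip" rule: list the entry names and reject the upload if any of them looks like another archive, so nothing is ever unpacked recursively. The file name and extension list are examples, not exhaustive:

# Reject archives that contain further archives, instead of recursing.
if zipinfo -1 untrusted.zip | grep -Eiq '\.(zip|gz|bz2|xz|7z|rar)$'; then
    echo "rejected: nested archive inside the upload" >&2
    exit 1
fi
|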
There are zip bombs that do not require recursive unzipping. A small zip
file can, in one operation, still unzip to a dangerously large size.
https://portswigger.net/daily-swig/ancient-zip-bomb-attack-given-new-lease-of-life
The solution here is some kind of quota set up BEFORE you unzip. I tend to
unzip into a dynamically created folder with quota limits set at the OS
level.
While we are on the subject of zip, the other issue is a file path that
unzips into an unexpected directory using path traversal, known as the
zip slip attack.
https://nakedsecurity.sophos.com/2018/06/06/the-zip-slip-vulnerability-what-you-need-to-know/
Again, the quota-limited directory and OS configuration of your unzip
command help. I limit my unzip utility to only unzip into one directory
with OS permissions, and the unzip command fails when quota or filepath
limitations are broken.
...
There are other ways to address these zip problems, but at scale I
leaned on the OS.
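A sketch of a pre-extraction zip slip check along those lines, assuming Info-ZIP's zipinfo; it rejects any entry whose path is absolute or contains a ".." component before unzip is ever run (file name is illustrative):

# Refuse entries that could escape the target directory via path traversal.
if zipinfo -1 untrusted.zip | grep -Eq '(^/|(^|/)\.\.(/|$))'; then
    echo "rejected: entry path escapes the extraction directory" >&2
    exit 1
fi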
- Jim
On 6/11/20 4:55 AM, Elar Lang wrote:
Can you try 500 000 000 files if you have the machine to handle it?
at the moment it could take too much time (to provide quick answers),
but I'll definitely investigate it further.
Also, we still need to be wary of zip bombs: a 42 KB zip file that
unzips into petabytes https://en.wikipedia.org/wiki/Zip_bomb
This one requires recursive unpacking. Another tricky topic: should we
warn about nested zips and avoid recursive unpacking, or disallow zips
inside zips?
|
... and based on all that, any proposals to my proposals? :) |
@jmanico I'll take this through to completion if you don't mind. I'm working on backporting these changes to 4.0.2 |
OK, all yours Andrew
I think this is going to end up too complicated for 4.0.2 |
My proposal: 12.1.2: Verify that the application checks compressed files (e.g. zip, gz, docx, odt) against a maximum allowed uncompressed size and a maximum number of files before uncompressing the file. |