
GHSA-2c2j-9gv5-cj73: Starlette has possible denial-of-service vector when parsing large files in multipart forms

Summary

When parsing a multipart form with large files (greater than the default max spool size), Starlette blocks the main thread to roll the file over to disk. Because this happens on the event loop, the server cannot accept new connections while the write is in progress.
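To make the failure mode concrete, here is a small self-contained sketch (not Starlette code; the names and timings are illustrative assumptions) showing how a synchronous disk rollover on the event loop stalls every other coroutine:

```python
import asyncio
import time

async def heartbeat(ticks: list) -> None:
    # stands in for other requests being served on the same event loop
    for _ in range(5):
        ticks.append(time.monotonic())
        await asyncio.sleep(0.05)

def blocking_rollover() -> None:
    # stands in for SpooledTemporaryFile.rollover() writing a large file to disk
    time.sleep(0.3)

async def main() -> float:
    ticks: list = []
    hb = asyncio.create_task(heartbeat(ticks))
    await asyncio.sleep(0.05)
    blocking_rollover()  # runs on the loop itself: the heartbeat stalls
    # offloading instead keeps the loop responsive:
    # await asyncio.to_thread(blocking_rollover)
    await hb
    gaps = [b - a for a, b in zip(ticks, ticks[1:])]
    return max(gaps)  # jumps from ~0.05 s to ~0.3 s because of the block

print(f"worst gap between heartbeats: {asyncio.run(main()):.2f}s")
```

While `blocking_rollover()` runs, the heartbeat coroutine cannot be scheduled at all; in a server, that gap is time during which no connection is accepted.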

Details

Please see this discussion for details: https://github.com/encode/starlette/discussions/2927#discussioncomment-13721403. In summary, the following UploadFile code (copied from starlette/datastructures.py) has a subtle bug: instead of checking only self._in_memory, it should also check whether the additional bytes will cause a rollover.


    @property
    def _in_memory(self) -> bool:
        # check for SpooledTemporaryFile._rolled
        rolled_to_disk = getattr(self.file, "_rolled", True)
        return not rolled_to_disk

    async def write(self, data: bytes) -> None:
        if self.size is not None:
            self.size += len(data)

        if self._in_memory:
            # BUG: this only checks whether the file is currently in memory;
            # it does not check whether writing `data` will push it past the
            # spool size, in which case the rollover to disk happens
            # synchronously on the event loop
            self.file.write(data)
        else:
            await run_in_threadpool(self.file.write, data)

I have already created a PR which fixes the problem: https://github.com/encode/starlette/pull/2962
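The shape of the fix can be sketched roughly as follows. This is a minimal illustration, not the code from the PR: the `MAX_SPOOL_SIZE` value, the `_will_roll` helper, and the use of `asyncio.to_thread` (in place of Starlette's `run_in_threadpool`) are assumptions made for the sketch.

```python
import asyncio
import tempfile

MAX_SPOOL_SIZE = 1024 * 1024  # assumed spool threshold for this sketch

class UploadFile:
    """Minimal stand-in for starlette.datastructures.UploadFile."""

    def __init__(self, spool_size: int = MAX_SPOOL_SIZE) -> None:
        self.spool_size = spool_size
        self.file = tempfile.SpooledTemporaryFile(max_size=spool_size)
        self.size = 0

    @property
    def _in_memory(self) -> bool:
        # check for SpooledTemporaryFile._rolled
        rolled_to_disk = getattr(self.file, "_rolled", True)
        return not rolled_to_disk

    def _will_roll(self, data: bytes) -> bool:
        # hypothetical helper: would writing `data` push the buffer past
        # the spool threshold and trigger a blocking rollover to disk?
        return self.file.tell() + len(data) > self.spool_size

    async def write(self, data: bytes) -> None:
        self.size += len(data)
        if self._in_memory and not self._will_roll(data):
            self.file.write(data)  # stays in memory: cheap, safe on the loop
        else:
            # rollover imminent (or already on disk): offload the blocking
            # write to a worker thread instead of stalling the event loop
            await asyncio.to_thread(self.file.write, data)
```

Small writes still happen inline; only the write that would spill the buffer to disk (and any write after the spill) is moved off the event loop.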

PoC

See the discussion linked above for steps to reproduce.

Impact

Honestly, very low; not many users will be impacted. Parsing large forms is already CPU-intensive, so the additional I/O blocking doesn't slow Starlette down much on systems with modern HDDs/SSDs. If someone is running on tape they might see a greater impact.

References

  • GHSA-2c2j-9gv5-cj73
  • encode/starlette@9f7ec2e
  • https://github.com/encode/starlette/blob/fa5355442753f794965ae1af0f87f9fec1b9a3de/starlette/datastructures.py#L436C5-L447C14
  • https://github.com/encode/starlette/discussions/2927#discussioncomment-13721403
