A critical vulnerability has been unearthed in Apache Uniffle, the remote shuffle service that powers data movement for large distributed computing engines. Tracked as CVE-2025-68637, the flaw carries a CVSS score of 9.1, signaling an immediate danger to big data environments running Apache Spark, Hadoop MapReduce, and Tez.
The vulnerability stems from an insecure default configuration in the Uniffle HTTP client, effectively rolling out the red carpet for Man-in-the-Middle (MITM) attacks.
Apache Uniffle is a critical component in modern data infrastructure. It handles “shuffling”—the complex process of redistributing data across a cluster during computation. By offloading this task to a remote service, Uniffle allows for “disaggregated storage deployment” and supports super-large jobs with high elasticity.
However, moving that much data requires secure channels, and that is exactly where the software failed.
According to the disclosure, the issue lies in how the Uniffle HTTP client handles SSL/TLS connections. By default, the client was configured to trust all SSL certificates and disable hostname verification.
This flaw exposes all REST API communication between the Uniffle CLI/client and the Uniffle Coordinator service to interception. An attacker positioned on the network could present a forged certificate and then intercept or manipulate the command-and-control traffic without the client ever raising an alarm.
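The advisory does not reproduce Uniffle's client code, but the anti-pattern it describes is a well-known one in Java's `javax.net.ssl` API: a `TrustManager` that accepts every certificate chain paired with a `HostnameVerifier` that approves any host. A minimal sketch (class and field names here are illustrative, not Uniffle's actual identifiers):

```java
import javax.net.ssl.HostnameVerifier;
import javax.net.ssl.SSLContext;
import javax.net.ssl.TrustManager;
import javax.net.ssl.X509TrustManager;
import java.security.SecureRandom;
import java.security.cert.X509Certificate;

public class TrustAllExample {
    // Anti-pattern: a TrustManager whose check methods are no-ops,
    // so ANY certificate chain -- including an attacker's -- passes.
    static final X509TrustManager TRUST_ALL = new X509TrustManager() {
        @Override public void checkClientTrusted(X509Certificate[] chain, String authType) { /* no-op */ }
        @Override public void checkServerTrusted(X509Certificate[] chain, String authType) { /* no-op */ }
        @Override public X509Certificate[] getAcceptedIssuers() { return new X509Certificate[0]; }
    };

    // Anti-pattern: a HostnameVerifier that approves every hostname,
    // so a certificate for attacker.example is accepted for the Coordinator.
    static final HostnameVerifier ALLOW_ALL = (hostname, session) -> true;

    public static void main(String[] args) throws Exception {
        SSLContext ctx = SSLContext.getInstance("TLS");
        ctx.init(null, new TrustManager[]{TRUST_ALL}, new SecureRandom());
        // Both checks now pass for any peer, enabling a silent MITM:
        System.out.println(ALLOW_ALL.verify("attacker.example", null));
    }
}
```

With a context like this installed on the HTTP client, the TLS handshake still "succeeds", which is precisely why the interception raises no alarm on the client side.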
The vulnerability affects all versions of Apache Uniffle prior to 0.10.0.
The Apache Uniffle team has released version 0.10.0 to address this critical security gap. This update enforces strict certificate validation and hostname verification, ensuring that clients only talk to the legitimate Coordinator service.
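The 0.10.0 patch itself is not quoted in the disclosure, but the standard way to get the behavior it describes in Java is simply to rely on the JDK's defaults: initialize the trust machinery from the platform CA store and leave the default hostname verifier in place. A hedged sketch of that approach (method names are illustrative):

```java
import javax.net.ssl.SSLContext;
import javax.net.ssl.TrustManagerFactory;
import javax.net.ssl.X509TrustManager;
import java.security.KeyStore;

public class StrictTlsExample {
    // Build an SSLContext backed by the JDK's default trust store (cacerts).
    // Passing a null KeyStore tells the factory to load the bundled CA set,
    // so certificate chains are validated against real trusted issuers.
    public static SSLContext strictContext() throws Exception {
        TrustManagerFactory tmf =
            TrustManagerFactory.getInstance(TrustManagerFactory.getDefaultAlgorithm());
        tmf.init((KeyStore) null);

        SSLContext ctx = SSLContext.getInstance("TLS");
        ctx.init(null, tmf.getTrustManagers(), null);
        // Note: do NOT install a custom HostnameVerifier; the JDK's default
        // verifier checks the certificate's names against the target host.
        return ctx;
    }

    public static void main(String[] args) throws Exception {
        SSLContext ctx = strictContext();
        TrustManagerFactory tmf =
            TrustManagerFactory.getInstance(TrustManagerFactory.getDefaultAlgorithm());
        tmf.init((KeyStore) null);
        X509TrustManager tm = (X509TrustManager) tmf.getTrustManagers()[0];
        // The default trust manager knows many real CA issuers, unlike the
        // empty list returned by a trust-all stub.
        System.out.println(tm.getAcceptedIssuers().length > 0);
    }
}
```

With defaults like these, a forged certificate fails chain validation, and a valid certificate for the wrong host fails hostname verification, which is exactly the guarantee the fixed release is described as enforcing.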
Administrators running Uniffle clusters are strongly advised to upgrade immediately to prevent potential data interception in their distributed computing pipelines.