At AWS’s annual re:Invent 2025 conference in Las Vegas, the company unveiled a major series of enhancements to its cloud-based contact center platform, Amazon Connect—most notably the introduction of “Agentic AI,” a new generation of autonomous AI agents powered by the advanced Nova Sonic voice model. These agents can not only converse naturally, but also reason independently and execute tasks on behalf of human operators.
In a rare move on the infrastructure front, AWS also announced a partnership with Google Cloud to launch the preview version of AWS Interconnect – Multicloud, a solution aimed at addressing the long-standing complexities enterprises face when managing multi-cloud network connectivity.
At the heart of the Amazon Connect update is the integration of Agentic AI’s self-service capabilities. Through the use of the Nova Sonic model, the new AI agents can speak with a tone, rhythm, and accent that closely mimic human conversation, process multilingual input, and respond fluidly to customer needs.
Moreover, AWS emphasizes that these AI agents possess both reasoning and action-taking abilities. While interacting with customers, the agents can analyze semantics and emotional cues and proactively assist human staff by preparing documents, handling routine workflows, and performing background tasks. The result is genuine human–machine collaboration: AI handles tedious backend processes, freeing human agents to focus on customer relationships and more complex cases.
For customers who already rely on third-party voice technologies, Amazon Connect now also integrates solutions from providers such as Deepgram and ElevenLabs.
To strengthen auditability and regulatory compliance, AWS has simultaneously launched Observability features for AI agents. Enterprises can view precisely what the AI understood, which tools it invoked, and how it arrived at its decisions—enhancing performance optimization and fostering user trust.
At the same time, voice-AI vendor Deepgram announced that its advanced speech technologies are entering the AWS ecosystem. Its real-time speech-to-text (STT), text-to-speech (TTS), and voice-agent capabilities will be integrated directly into Amazon SageMaker AI, Amazon Connect, and Amazon Lex.
This integration enables enterprises to build real-time voice applications with sub-second latency within AWS’s secure environment, significantly improving the responsiveness and fluidity of automated interactions.
On the cloud infrastructure side, AWS and Google Cloud jointly introduced the preview of AWS Interconnect – Multicloud and unveiled a new open standard for cross-cloud network interoperability.
The solution merges AWS Interconnect with Google Cloud’s Cross-Cloud Interconnect, enabling enterprises to establish dedicated, high-bandwidth private connections between the two cloud providers. Historically, businesses connecting workloads across clouds faced a dilemma: relying on the public internet without bandwidth guarantees, or dealing with the complexity of building private links.
With this fully managed service, enterprises can rapidly configure cloud-to-cloud connections through the AWS console or API, dramatically reducing the overhead associated with managing physical equipment or virtual routers. AWS has also published the open API suite on GitHub, encouraging more service providers to join the interconnect ecosystem.
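As a rough illustration of what API-driven provisioning replaces, the sketch below models a cross-cloud connection request as plain data. The types, parameters, and states are hypothetical stand-ins, not AWS's published API:

```python
from dataclasses import dataclass
from enum import Enum

class LinkState(Enum):
    PENDING = "pending"
    ACTIVE = "active"

@dataclass
class CrossCloudLink:
    """Hypothetical model of a managed cloud-to-cloud connection request."""
    source_region: str    # e.g. an AWS region such as "us-east-1"
    target_provider: str  # e.g. "google-cloud"
    target_region: str    # e.g. "us-central1"
    bandwidth_gbps: int   # dedicated bandwidth tier for the private link
    state: LinkState = LinkState.PENDING

def provision(link: CrossCloudLink) -> CrossCloudLink:
    """Stand-in for the managed service completing the connection.

    In a real managed offering, this step is where the provider handles the
    physical cross-connects and routing that customers previously had to
    build and operate themselves.
    """
    link.state = LinkState.ACTIVE
    return link

link = provision(
    CrossCloudLink("us-east-1", "google-cloud", "us-central1", bandwidth_gbps=10)
)
print(link.state.value)  # prints "active"
```

The point of the managed model is that the customer only declares the endpoints and bandwidth; the state transition from pending to active is the provider's problem.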
This is not the first time Google Cloud has collaborated with AWS on cross-cloud integration. As early as 2020, it introduced BigQuery Omni, enabling cross-cloud queries across Google Cloud and AWS, with later expansion to Microsoft Azure.
Amid the current AI-driven landscape, Google Cloud this year also introduced the A2A (Agent2Agent) protocol, enabling its AI agent services to operate across cloud boundaries. Microsoft later joined the standard, and AWS recently announced support as well, paving the way for AI systems to interact seamlessly across multiple clouds.
With AWS now deepening its cross-cloud partnership with Google Cloud, both providers are expected to accelerate the development of additional cross-cloud capabilities in response to the rapidly evolving AI era.