Thoma Bravo inks multiyear deal with Google Cloud to accelerate AI adoption.
Thoma Bravo has signed a multiyear agreement with Google Cloud to speed AI adoption across its portfolio, Bloomberg reports. The private‑equity firm will leverage Google’s cloud services to embed generative AI tools in its portfolio companies.
Key Facts
- Key company: Google Cloud
The agreement grants Thoma Bravo’s portfolio firms direct access to Google Cloud’s Gemini family of large language models, including the newly announced Gemini Enterprise tier, which promises dedicated hardware isolation and customizable safety controls. According to Bloomberg, the partnership will let the private‑equity group “embed generative AI tools” across its holdings, which collectively exceed $300 billion in enterprise value. Gemini Enterprise is built on Google’s Pathways architecture, which enables a single model to serve multiple tasks without retraining, a capability that could streamline the integration of AI‑driven features into the legacy software stacks typical of Thoma Bravo’s investments.
From a deployment standpoint, the deal leverages Google Cloud’s Vertex AI platform, which provides a managed environment for model training, fine‑tuning, and inference at scale. Bloomberg notes that the portfolio companies will be able to “improve product offerings” by tapping into Vertex’s pre‑built pipelines for data preprocessing, feature engineering, and model monitoring. This infrastructure reduces the engineering overhead required to operationalize large‑scale generative models, allowing firms to focus on domain‑specific applications such as automated code generation for development tools or AI‑augmented analytics for cybersecurity products.
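The pattern described above, chaining preprocessing, inference, and monitoring into one managed workflow, can be illustrated in plain Python. The sketch below is a minimal stand-in for that pipeline shape; all class, stage, and function names are hypothetical and are not drawn from the actual Vertex AI SDK (`google-cloud-aiplatform`).

```python
# Illustrative sketch of a preprocess -> infer -> monitor pipeline,
# the pattern a managed platform like Vertex AI handles for its users.
# All names here are hypothetical, not real SDK identifiers.

from dataclasses import dataclass, field
from typing import Callable, List, Tuple


@dataclass
class Pipeline:
    """Chains named stages and records simple monitoring counters."""
    stages: List[Tuple[str, Callable]] = field(default_factory=list)
    metrics: dict = field(default_factory=dict)

    def add_stage(self, name: str, fn: Callable) -> "Pipeline":
        self.stages.append((name, fn))
        return self

    def run(self, record):
        for name, fn in self.stages:
            record = fn(record)
            # Monitoring hook: count how often each stage has run.
            self.metrics[name] = self.metrics.get(name, 0) + 1
        return record


def preprocess(text: str) -> str:
    # Stand-in for data cleaning / feature preparation.
    return text.strip().lower()


def infer(text: str) -> str:
    # Stand-in for a generative-model inference call.
    return f"summary({text})"


pipeline = (Pipeline()
            .add_stage("preprocess", preprocess)
            .add_stage("infer", infer))

print(pipeline.run("  Quarterly Report  "))  # summary(quarterly report)
print(pipeline.metrics)
```

The point of the managed version is that these stages, plus scaling and model monitoring, are provided rather than hand-built, which is the engineering overhead the article says the deal removes.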
Security and compliance are addressed through Google’s Confidential Computing framework, which encrypts data in use within trusted execution environments (TEEs). The Bloomberg report highlights that Gemini Enterprise includes “customizable safety controls,” implying that Thoma Bravo’s portfolio can enforce content filters and usage policies at the model layer, mitigating the risk of disallowed outputs in regulated industries like finance and healthcare. Additionally, Google Cloud’s multi‑regional data residency options enable portfolio companies to meet jurisdictional data‑sovereignty requirements without sacrificing latency, a critical factor for latency‑sensitive workloads such as real‑time fraud detection.
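Enforcing usage policies "at the model layer," as the report puts it, amounts to checking each response against per-category thresholds before it is released. The sketch below shows that idea in plain Python; the category names, scores, and threshold values are illustrative assumptions, not the actual Gemini Enterprise configuration surface.

```python
# Minimal sketch of model-layer safety controls: a configurable policy
# that blocks outputs whose per-category risk scores exceed a threshold.
# Categories and thresholds here are hypothetical examples.

from dataclasses import dataclass


@dataclass
class SafetyPolicy:
    # Per-category maximum allowed score, from 0.0 (block all) to 1.0 (allow all).
    thresholds: dict

    def allows(self, scores: dict) -> bool:
        """Return True only if every scored category stays under its threshold."""
        return all(score <= self.thresholds.get(cat, 0.5)
                   for cat, score in scores.items())


# A stricter policy, as a regulated deployment (e.g. finance) might configure.
policy = SafetyPolicy(thresholds={"financial_advice": 0.2, "pii": 0.1})

print(policy.allows({"financial_advice": 0.1, "pii": 0.05}))  # True
print(policy.allows({"financial_advice": 0.6, "pii": 0.05}))  # False
```

Making the thresholds per-deployment configuration rather than hard-coded is what lets one model family serve both lenient and strictly regulated portfolio companies.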
Cost management is built into the arrangement via Google Cloud’s committed‑use contracts and sustained‑use discounts, which Bloomberg says are part of the multiyear deal’s financial structure. By locking in pricing for compute resources—particularly TPU v4 pods that power Gemini’s inference workloads—Thoma Bravo can predict AI‑related expenditures across its portfolio, a notable advantage given the historically volatile pricing of on‑demand cloud AI services. The partnership also includes access to Google’s AI‑optimized storage solutions, such as Vertex AI Feature Store, which centralizes feature data for consistent model serving across disparate applications.
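The budgeting advantage of committed pricing comes down to simple arithmetic: a fixed discounted rate makes monthly spend a known quantity. The figures below are purely illustrative, neither the $10/hour rate nor the 30% discount is a published Google Cloud price.

```python
# Back-of-envelope comparison of on-demand vs. committed-use pricing.
# The hourly rate and the 30% discount are illustrative assumptions,
# not actual Google Cloud list prices.

HOURS_PER_MONTH = 730  # average hours in a month


def monthly_cost(hourly_rate: float, hours: int = HOURS_PER_MONTH,
                 discount: float = 0.0) -> float:
    """Cost of running one accelerator instance continuously for a month."""
    return hourly_rate * hours * (1 - discount)


on_demand = monthly_cost(hourly_rate=10.0)                  # no discount
committed = monthly_cost(hourly_rate=10.0, discount=0.30)   # committed rate

print(f"on-demand: ${on_demand:,.2f}, committed: ${committed:,.2f}, "
      f"savings: ${on_demand - committed:,.2f}")
```

Multiplied across dozens of portfolio companies, that fixed per-unit rate is what turns a volatile line item into a forecastable one.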
Finally, the collaboration opens a channel for joint research and development between Google’s AI research teams and Thoma Bravo’s portfolio engineers. While Bloomberg does not detail specific R&D milestones, the inclusion of Gemini’s latest model iterations suggests that portfolio companies will receive early access to model updates and experimental features. This pipeline could accelerate the rollout of cutting‑edge capabilities—such as multimodal reasoning and retrieval‑augmented generation—into products that traditionally lag behind the AI frontier, thereby sharpening the competitive edge of Thoma Bravo’s holdings in sectors ranging from enterprise software to infrastructure monitoring.
Sources
Reporting based on Bloomberg’s coverage of the agreement.