S&P Global used its appearance at the Raymond James Institutional Investor Conference to deliver a clear message: artificial intelligence is not a sideline experiment but a central engine for growth, margin expansion, and product innovation across the company’s data, workflow and ratings franchises. Executives reiterated the company’s road map to harvest AI-driven product monetization while simultaneously re‑engineering internal operations—pinning a target of 50–75 basis points of annual adjusted operating margin expansion, committing to ~20% run‑rate savings inside its Enterprise Data Office by the end of 2027, and pointing to early commercial traction (notably, nearly 20% opt‑in for an iLEVEL automated‑ingestion add‑on within six months). That combination—aggressive productivity programs paired with new, AI‑enabled monetization channels—frames S&P Global’s pitch: AI will be both a cost lever and a revenue lifter. What remains to be seen is whether execution, legal and competitive dynamics will let this thesis play out as cleanly as management projects.
Background: why this matters for financial data and workflow markets
S&P Global is a sprawling information-services company whose business model rests on three durable pillars: ratings, indices/benchmarks, and market data & workflow tools. Those pillars historically generate recurring, high‑quality revenue and give S&P a familiar and defensible moat: proprietary datasets, rigorous methodologies, and deep client integrations. Management says those core businesses deliver roughly three‑quarters of operating income, while workflow tools account for a meaningful, fast‑growing slice of revenue.

Two secular forces are reshaping the competitive landscape. First, investors and corporates are shifting capital toward private markets and alternative assets, creating demand for new types of benchmarks, reporting and data infrastructure. Second, the rapid rise of large language models and generative AI has created both an opportunity—and a headache—for data providers. AI increases the value of high‑quality, structured data (because models need grounded, authoritative sources), but it also raises the specter that raw datasets could be reused, regurgitated, or even repurposed to train competing models unless vendors erect robust technical and contractual protections.
S&P’s Raymond James presentation makes it plain: the firm intends to lean into both trends by buying private‑markets capabilities, embedding AI in its workflow products, and building permissioning and telemetry tools so its data remains a paid, controlled input—not an unpaid training corpus.
The company’s AI playbook: productization, permissioning, and telemetry
At the heart of S&P Global’s argument is a three‑part operational playbook that turns AI into monetizable product features and a productivity engine.

1) Productization: embedding AI inside workflow tools
S&P is integrating generative AI into flagship workflows—products like iLEVEL, WSO, Platts Connect, and its Market Intelligence suites. The firm has rolled out paid add‑ons such as automated data ingestion for iLEVEL and energy content delivery via Microsoft Copilot and Copilot Studio, touting early adoption metrics that demonstrate willingness to pay.

- Customer-facing benefits: faster ingestion, automated reporting, natural‑language queries, contextual research summaries.
- Commercial benefits: paid entitlements, higher retention, increased data consumption and upsell opportunities.
2) Permissioning and grounding: controlling data flows into LLMs
S&P is deploying engineering controls—the MCP connector, grounding agents, and throttling mechanisms—to gate how partner LLMs or third‑party AI platforms access S&P datasets. Management emphasized that access is permissioned and monitored; S&P asserts it will not make its proprietary data available for LLM training without explicit contracts.

Key controls described:
- Permissioning switches to “turn on” dataset access for sanctioned endpoints.
- Grounding agents that answer queries without exposing entire datasets (row‑level throttling).
- Telemetry to measure query frequency, coverage and usage patterns for contractual billing and upsell.
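To make the pattern concrete, here is a minimal sketch of how these three controls might compose in code: an entitlement check ("permissioning switch"), a per-query row cap (throttling), and a telemetry log recording every request. All names (`DataGateway`, `max_rows`, the event fields) are illustrative assumptions, not S&P's actual implementation.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DataGateway:
    # endpoint -> set of datasets that endpoint is entitled to ("permissioning switch")
    entitlements: dict
    # per-query row cap (row-level throttling)
    max_rows: int = 100
    # telemetry log of every query, usable later for billing and upsell analysis
    telemetry: list = field(default_factory=list)

    def query(self, endpoint: str, dataset: str, rows: list) -> list:
        allowed = dataset in self.entitlements.get(endpoint, set())
        # Every attempt is logged, whether or not it was permitted.
        self.telemetry.append({
            "endpoint": endpoint,
            "dataset": dataset,
            "allowed": allowed,
            "rows_requested": len(rows),
            "ts": datetime.now(timezone.utc).isoformat(),
        })
        if not allowed:
            raise PermissionError(f"{endpoint} is not entitled to {dataset}")
        # Throttled response: never more than max_rows, never the whole dataset.
        return rows[: self.max_rows]

gw = DataGateway(entitlements={"copilot-prod": {"energy_prices"}}, max_rows=2)
result = gw.query("copilot-prod", "energy_prices", [{"p": 1}, {"p": 2}, {"p": 3}])
# result is capped at max_rows entries; gw.telemetry records the query
```

The design point is that the gate sits between the dataset and any AI endpoint, so protection and measurement happen in one place rather than per product.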
3) Telemetry and data economics: monetization through usage insights
S&P collects detailed telemetry on how customers interact with content—what queries are run, how frequently, and the breadth of coverage used. That telemetry informs commercial conversations and creates levers for new pricing models: add‑ons, consumption tiers, and targeted upsells.

Commercial outcomes management cites:
- Add‑on uptake (iLEVEL automated ingestion).
- Higher retention and ARR growth among customers who adopt Copilot integrations.
- New data sales driven by observed usage during AI interactions.
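A hedged sketch of how such telemetry could be turned into upsell signals: aggregate query events into per-customer frequency and coverage metrics, then flag customers whose breadth of use suggests a higher entitlement tier. The event shape and the threshold are invented for illustration.

```python
from collections import defaultdict

def usage_summary(events):
    """Aggregate telemetry events ({"customer", "dataset"} dicts) per customer."""
    datasets = defaultdict(set)
    counts = defaultdict(int)
    for e in events:
        datasets[e["customer"]].add(e["dataset"])
        counts[e["customer"]] += 1
    return {
        c: {"queries": counts[c], "coverage": len(datasets[c])}
        for c in counts
    }

def upsell_candidates(summary, min_coverage=3):
    # Customers touching many distinct datasets are candidates for a
    # broader entitlement tier or a consumption-based plan.
    return [c for c, m in summary.items() if m["coverage"] >= min_coverage]

events = [
    {"customer": "fundA", "dataset": "ilevel"},
    {"customer": "fundA", "dataset": "wso"},
    {"customer": "fundA", "dataset": "platts"},
    {"customer": "fundB", "dataset": "ilevel"},
]
summary = usage_summary(events)
# fundA touched three distinct datasets, so it surfaces as an upsell candidate
```

The same aggregation can back consumption-tier billing, since query counts and coverage are exactly the dimensions a usage-based contract would price on.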
Concrete early metrics and the With Intelligence acquisition
Investors and analysts were given specific datapoints: approximately 80 customers are using the MCP connector (primarily Market Intelligence), ~60 energy customers have bought Copilot‑based integrations, and the iLEVEL ingestion add‑on reached nearly 20% penetration within months of launch. Those are not mere product demos—management framed them as proof points for monetization and stickiness.

Parallel to product rollouts, S&P completed a major strategic purchase: the acquisition of With Intelligence, a private‑markets data and analytics provider. The deal is explicitly intended to accelerate S&P’s private‑markets capabilities—adding proprietary datasets, fund‑level intelligence, benchmarks and distribution relationships across limited partners, general partners and fund managers. In context, the acquisition reinforces S&P’s expectation that private markets will be a high‑growth runway for new subscription and workflow revenue.
Operational efficiency: Enterprise Data Office and the productivity thesis
S&P has publicly identified four large pools of headcount and cost where AI and process reengineering can drive leverage: the Enterprise Data Office (EDO), software development, researcher/analyst teams, and sales/operations. The EDO is the most mature program—management says it covers roughly 9,000 employees and about $0.5 billion of expense on a company cost base of roughly $7.5 billion. The plan is to reduce EDO run‑rate costs by ~20% by the end of 2027 via automation, standardization and tooling.

S&P’s productivity logic is straightforward: many of the workflows that sustain the data business are manual, repetitive, and amenable to automation. By redesigning processes, standardizing stacks, and deploying ML/LLM tools as productivity agents, the company expects to:
- Reduce manual data processing and eliminate redundant applications.
- Improve developer productivity (management cites current coding productivity gains and suggests AI could double or triple that benefit).
- Free staff to focus on higher‑value analytics and product development.
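For a flavor of the manual data processing such programs target, here is a small sketch: normalizing differently labeled source files into one canonical record shape, a task otherwise done by analysts by hand. The column aliases and field names are invented for illustration; real pipelines would layer validation and lineage tracking on top.

```python
import csv
import io

# Hypothetical alias map: different vendors label the same fields differently.
FIELD_ALIASES = {"Ticker": "ticker", "symbol": "ticker", "Px": "price", "price": "price"}

def ingest(raw_csv: str) -> list:
    """Normalize a heterogeneous CSV feed into canonical records."""
    rows = []
    for record in csv.DictReader(io.StringIO(raw_csv)):
        # Keep only recognized columns, mapped to canonical names.
        normalized = {FIELD_ALIASES[k]: v for k, v in record.items() if k in FIELD_ALIASES}
        normalized["price"] = float(normalized["price"])
        rows.append(normalized)
    return rows

sample = "symbol,Px\nAAPL,189.5\nMSFT,410.2\n"
records = ingest(sample)
# Both rows now share one schema: {"ticker": ..., "price": ...}
```

Automating thousands of such small normalization steps, rather than any single dramatic model, is where run-rate savings of the kind management describes would plausibly accumulate.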
Strengths: why S&P’s thesis is credible
- Deep, proprietary data assets and client lock‑in: S&P’s datasets are embedded in client workflows that are mission‑critical—risk models, regulatory reporting, valuation processes. That makes simple substitution difficult.
- Early tech investments and partnerships: acquiring Kensho in 2018 and building connectors like MCP, plus close partnerships with hyperscalers and AI vendors, give S&P privileged access and integration pathways other vendors may lack.
- Commercial telemetry advantage: tracking usage inside AI workflows creates real signals for upsells and dynamic pricing, a new form of monetizable telemetry that few legacy data vendors can match today.
- Strategic M&A to shore up content: the With Intelligence deal adds rare private‑markets coverage at scale, addressing a major market opportunity as alternative assets grow.
- Explicit IP protection controls: grounding agents, row‑level throttling, and contractual guardrails are concrete, technical defenses against unrestricted copying and training uses.
Risks and blind spots: what could derail the plan
While the strategy is clear, several execution and structural risks could blunt outcomes.

1) The enforceability of "no training" promises
S&P says it will not allow its proprietary datasets to be used to train third‑party LLMs without permission. In practice, proving and policing how models were trained is hard. Contracts, API controls and technical guardrails help—but they are not foolproof. Third parties may argue their outputs are transformative or otherwise not directly traceable to a single source. Legal and regulatory regimes around model training remain unsettled and could evolve in ways that weaken vendor protections.

2) Competition from AI‑native entrants and hyperscalers
Large tech companies and niche AI startups are building verticalized models with attractive UX and low switching friction. If these players bundle datasets and models into turnkey offerings, S&P could face downward pricing pressure or have to accept smaller margins on distribution through third‑party platforms. Hyperscalers could choose to internalize key data relationships, or platforms might demand rates and terms that compress data provider economics.

3) Integration and execution complexity
Delivering EDO savings and higher developer productivity at scale is nontrivial. Process reengineering across thousands of roles demands cultural change, careful change management and continuous governance. Overestimating productivity gains or underestimating integration costs (including With Intelligence and other acquisitions) could widen the gap between aspiration and realized savings.

4) Macro and cyclical reliance
Some of S&P’s revenue (for example, ratings revenue tied to issuance volumes) remains cyclical. Management’s margin targets assume steady organic growth and that AI benefits compound over time. A sharp downturn in issuance or client budgets could force increased near‑term investment or slow the rollout of paid features—temporarily compressing margins while the headwinds persist.

5) Regulatory scrutiny and compliance risk
Financial data and workflow vendors operate in highly regulated spaces. Any feature that touches regulated reporting, audit trails, or compliance (e.g., DORA readiness in the EU) will be judged not only on function but on governance, auditability and resiliency. New AI features will attract extra regulatory scrutiny; compliance failures could be costly reputationally and financially.

Technical considerations: grounding agents, throttling and data governance
S&P’s technical defenses are sensible—and necessary—but they face edge cases that require continuous hardening.

- Grounding agents: providing only the minimal data necessary to answer a query reduces leakage, but designing accurate and performant grounding while avoiding hallucination is technically demanding.
- Throttling by row/volume: limiting the amount of returned data can protect IP, but it may degrade the user experience or make some AI features less useful. There is a delicate tradeoff between protection and product value.
- Telemetry and audit trails: collecting usage data is a commercial boon, but also creates privacy and governance obligations. Telemetry systems must be secured and transparent, especially when clients rely on them for billing or compliance.
- Interoperability and latency: clients want low‑latency, reliable responses from AI integrations. Ensuring consistency between S&P’s canonical datasets and AI outputs—while preserving performance—requires robust engineering investments.
Competitive landscape and market dynamics
S&P’s push into private markets and AI places it at the center of several competitive dynamics:

- Private markets consolidation: the market for private‑markets data has seen high activity and strategic acquisitions. Firms that control critical fund‑level datasets can capture durable premium pricing—hence S&P’s With Intelligence purchase.
- Traditional peers vs. new entrants: legacy index and data providers face competition both from specialized data vendors and from AI‑first startups that bundle models and data into vertical solutions.
- Platform dynamics: partnerships with Microsoft, Anthropic, and other platform players can amplify reach but also create dependency. The balance between direct distribution and platform distribution is strategic and delicate.
What investors and customers should watch next
For those tracking the story, a handful of measurable signals will indicate whether the AI thesis is translating into durable outcomes:

- EDO progress reports: quarterly disclosures about run‑rate savings and automation metrics—especially whether the 20% target is on track.
- Monetization metrics: revenue growth from AI add‑ons, expansion rates among Copilot and MCP connector customers, and ARR contribution from With Intelligence.
- Adoption curves: whether the iLEVEL add‑on sustains growth beyond initial early‑adopter cohorts.
- Telemetry‑based upsells: evidence that usage telemetry is producing meaningful new data sales or price renegotiations.
- Regulatory and legal developments: any rulings or guidelines on model training and data licensing that could affect IP protections.
- Customer retention and win/loss trends: whether AI features materially improve retention or enable wins against competitors.
Conclusion: realistic optimism with caveats
S&P Global’s Raymond James presentation articulated a coherent strategy: marry proprietary data and entrenched workflows with AI features, monetize usage and entitlements, and capture productivity gains on the cost side. The plan checks many boxes—strong assets, early productization, telemetry‑driven commerce, and a sizable private‑markets acquisition that expands TAM.

Yet the strategy runs through several narrow straits. The legal and technical effort required to prevent unwanted dataset use for model training will be ongoing and politically charged. Competition from AI‑native vendors and platform owners will test pricing power. Delivering promised EDO savings and productivity increases across tens of thousands of employees is a substantial operational lift. Finally, the macro backdrop and issuance cycles add an overlay of uncertainty.
For CIOs, data teams and investors, the sensible posture is measured: S&P has real advantages and early evidence of monetization, but the next 12–24 months will be pivotal. Watch adoption curves, telemetry monetization, and EDO execution closely. If S&P can convert the early signals described at Raymond James into predictable, scalable revenue and margin moves, the company’s AI thesis will have passed a demanding market test. If not, the same forces that make AI powerful—ease of distribution, rapid model innovation, and platformization—could limit upside or compress margins more quickly than expected. Either way, S&P’s actions at the intersection of finance, data and AI will be among the most consequential developments in enterprise software and financial information services in the coming years.
Source: Investing.com UK S&P Global at Raymond James Conference: AI as a Growth Catalyst By Investing.com