Maximizing alpha: Harnessing data, technology & AI in quant investing

Bloomberg Professional Services

KEY TAKEAWAYS

  • The source of alpha is shifting from capital access to how effectively firms can structure, connect, and act on data.
  • Data engineering remains the primary challenge; cloud storage is now standard, but point-in-time interoperability is the real differentiator.
  • AI is currently delivering its greatest impact through productivity and efficiency gains rather than acting as a direct generator of alpha.
  • Institutional investors require full transparency; models must be explainable and data inputs must be fully auditable and traceable.

The source of alpha is evolving. Where advantage once depended on access to capital, markets or information, it is now increasingly shaped by how effectively firms can structure, connect and act on data. The edge is in making it usable, timely and decision-ready.

In quantitative investing, this shift is redefining the relationship between data, technology and investment strategy. Infrastructure choices, data engineering discipline and model transparency are becoming as critical as the signals themselves.

These dynamics were a central focus of a recent Bloomberg Enterprise Data discussion, where practitioners across quantitative investing, engineering and cloud technology shared how their approaches are adapting in practice.

Data is foundational, but precision matters

As the volume of available data continues to expand, the challenge is no longer access but control.

Quantitative firms are working with increasingly complex datasets, where small inconsistencies can materially distort outcomes. The ability to accurately map, align and validate data has become a prerequisite for generating reliable signals.

“From a financial perspective, details really matter,” says Aidan Wilmott, Principal Data Engineer at Polynomial Partners. “You cannot join a security to another security if it’s not exactly the same thing.”

Understanding how datasets are constructed and how they behave once integrated is critical. Without that rigor, scale can amplify errors.
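As a concrete illustration of the kind of validation Wilmott describes, the sketch below checks that two datasets agree on which security an identifier refers to before joining them. The identifiers, field names, and sample values are hypothetical and purely illustrative, not any firm's actual pipeline:

```python
# Minimal sketch: confirm two datasets resolve a shared identifier to the
# same security before joining. Identifiers and fields are hypothetical.

def validate_mapping(left, right, key="figi"):
    """Return joinable rows plus a list of identifier mismatches.

    `left` and `right` are lists of dicts; `key` is the identifier
    column both datasets claim to share.
    """
    right_by_key = {}
    for row in right:
        if row[key] in right_by_key:
            raise ValueError(f"duplicate identifier in right dataset: {row[key]}")
        right_by_key[row[key]] = row

    joined, mismatches = [], []
    for row in left:
        match = right_by_key.get(row[key])
        if match is None:
            mismatches.append((row[key], "missing from right dataset"))
        elif row.get("ticker") != match.get("ticker"):
            # Same identifier but a different security description:
            # refuse the join rather than silently merge two instruments.
            mismatches.append((row[key], "ticker disagreement"))
        else:
            joined.append({**row, **match})
    return joined, mismatches


prices = [{"figi": "BBG000B9XRY4", "ticker": "AAPL", "px": 189.70}]
fundamentals = [{"figi": "BBG000B9XRY4", "ticker": "AAPL", "pe": 29.1}]
rows, issues = validate_mapping(prices, fundamentals)
```

The point of the pre-join check is exactly Wilmott's: a record is only joined when both sides demonstrably describe the same instrument, and every disagreement is surfaced rather than merged away.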

The engineering reality behind alpha

Generative AI may dominate headlines, but the day-to-day reality of quant investing remains grounded in data engineering.

Cloud technology has largely addressed the issue of storing and processing large datasets. The focus is now on ensuring that data can be connected, standardized and analyzed in a consistent, point-in-time framework.

“From an infrastructure standpoint… [the] cloud is just designed to handle large volumes of data,” Wilmott explains.

What remains complex is the underlying architecture. Firms must reconcile datasets from multiple sources, often in inconsistent formats, while avoiding issues such as look-ahead bias.

In this context, interoperability is not just a technical goal. It directly affects how quickly and reliably firms can move from raw data to usable signals.
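One concrete way to picture the point-in-time problem: a backtest must only see data as it was published at the time, never a later restatement. The sketch below is a minimal illustration under that assumption, with hypothetical dates, figures, and field names:

```python
from datetime import date

# Minimal sketch of a point-in-time lookup: for a given as-of date,
# return the most recent record that had actually been published by
# that date. This is the guard against look-ahead bias, where a
# backtest "sees" restated or not-yet-released figures.
# All dates, periods, and values are hypothetical.

def as_of(records, asof_date):
    """Latest record with published <= asof_date, or None."""
    visible = [r for r in records if r["published"] <= asof_date]
    return max(visible, key=lambda r: r["published"]) if visible else None


earnings = [
    # A quarterly figure first released in January, then restated in March.
    {"period": "2023Q4", "eps": 1.10, "published": date(2024, 1, 25)},
    {"period": "2023Q4", "eps": 1.05, "published": date(2024, 3, 10)},
]

# A backtest dated mid-February must see the original 1.10; the
# restatement only becomes visible from March onward.
print(as_of(earnings, date(2024, 2, 15))["eps"])  # 1.1
```

Keeping every dataset queryable this way, by what was known when, is what makes backtests across multiple reconciled sources trustworthy.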

Speed, scale & cloud agility

As data and model complexity increase, so does the need for computational flexibility.

Rather than maintaining fixed infrastructure, firms are prioritizing the ability to scale resources dynamically in response to market conditions. This allows them to process large volumes of data when needed, without carrying constant overhead.

“For example, if I react to a market event, can I actually quickly analyze if it’s fake news that maybe modifies a certain scenario people buy into, even something that they really shouldn’t?” asks Olivier Klein, Chief Technologist at AWS.

Speed also supports validation. The ability to rapidly test scenarios or verify information is becoming increasingly important in volatile markets.

At the same time, the underlying economics of infrastructure are shifting. Building and maintaining proprietary systems is becoming less attractive relative to more flexible, scalable approaches. To keep pace with the scale demanded by AI workloads, AWS is now building its own custom silicon. According to Klein, the company deploys more of this custom silicon in its data centers each day than it does processors from traditional chipmakers such as Intel.

Productivity over prediction

Despite advances in AI, its role in quant investing remains pragmatic.

Rather than acting as a direct generator of alpha, AI is currently delivering the greatest impact through efficiency gains, enhancing how quickly firms can process data, test hypotheses and move strategies into production.

“It’s not about you chasing the next signal… magically delivering another 20% alpha out of nowhere,” says Shan Jiang, Co-head of Quantitative Equity Strategies at HSBC Asset Management.

Instead, AI is compressing timelines. Tasks that once took weeks, such as building bespoke portfolios aligned to specific risk or ESG constraints, can now be completed in days.

This shift is less about discovering entirely new signals and more about accelerating the path from research to execution.

Transparency remains critical

As AI becomes more embedded in investment workflows, transparency is becoming a defining requirement.

Institutional investors require full visibility into how decisions are made. Models must be explainable, and data inputs must be traceable.

Jiang emphasizes that every decision must be auditable. Firms need to understand not only what a model outputs, but why.

This places greater importance on data lineage, governance and point-in-time accuracy. Without these foundations, even the most sophisticated models risk losing credibility.
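A lightweight way to make the "why" auditable is to carry provenance with every model output. The sketch below is a hypothetical structure, not a vendor schema or any firm's actual system: each signal records which inputs produced it, under which model version, and when:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Minimal sketch of decision lineage: a signal carries the identifiers
# of the data it consumed, the model version, and a timestamp, so any
# output can be traced back to its inputs. Field names are hypothetical.

@dataclass
class Signal:
    name: str
    value: float
    inputs: list            # identifiers of the datasets consumed
    model_version: str
    computed_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

    def audit_record(self):
        """Flat, loggable record answering what, from what, and when."""
        return {
            "signal": self.name,
            "value": self.value,
            "inputs": list(self.inputs),
            "model_version": self.model_version,
            "computed_at": self.computed_at.isoformat(),
        }


s = Signal("momentum_12m", 0.42,
           inputs=["prices:v7", "corp_actions:v3"],
           model_version="mom-2.1")
record = s.audit_record()
```

Persisting records like this for every decision is what lets a firm answer an allocator's "why did the model do that?" with evidence rather than assertion.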

At the same time, human judgment remains central. AI is increasingly viewed as a tool to augment analysis, not replace it.

Looking ahead

Data, technology and investment strategy are converging, and the distinctions between them are becoming harder to draw.

What is changing is not just the scale of data or the sophistication of models, but the expectations placed on both. Accuracy, speed and transparency are all moving higher on the priority list.

Firms that can align these elements, combining strong data foundations with flexible infrastructure and disciplined AI deployment, are better positioned to translate information into consistent performance.
