Equity research analysts spend 60% or more of their time on data gathering: reading SEC filings, scanning news, tracking sentiment, and compiling notes. That is time not spent generating investment insights and alpha.
A typical 10-K filing is 100-150 pages. An analyst covering 20 companies reads 2,000-3,000 pages of filings per quarter, plus earnings transcripts, 8-Ks, and proxy statements.
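The reading load above is simple arithmetic: 20 companies times one 100-150 page 10-K each. A quick back-of-envelope check (the constants are the figures from the text, not measured data):

```python
# Back-of-envelope quarterly reading load for one analyst,
# using the filing-length range cited above.
COMPANIES = 20
PAGES_PER_FILING = (100, 150)  # typical 10-K length range

low = COMPANIES * PAGES_PER_FILING[0]   # 2,000 pages
high = COMPANIES * PAGES_PER_FILING[1]  # 3,000 pages
print(f"{low}-{high} pages per quarter from primary filings alone")
```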
All data lands in Delta Lake with full governance via Unity Catalog.
The research pipeline runs entirely on Databricks:
- ai_summarize and ai_extract process filings to extract revenue trends, margin analysis, risk factors, management commentary, and forward-looking statements. For full 10-K analysis, ai_query routes to a long-context model via AI Gateway.
- ai_analyze_sentiment scores articles. The agent produces sentiment signals with evidence citations.
- ai_gen produces structured research briefs with investment signals, peer comparisons, and confidence-weighted recommendations.
- Agent Bricks coordinates the pipeline. All models are accessed via AI Gateway on Databricks Model Serving.
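To make "confidence-weighted recommendations" concrete, here is a minimal pure-Python sketch of that final aggregation step. Everything here is hypothetical illustration: the Signal class, field names, and weighting scheme are assumptions, not Databricks APIs; the real brief generation runs inside ai_gen under Agent Bricks.

```python
from dataclasses import dataclass

@dataclass
class Signal:
    # One per-document investment signal, as might be produced
    # upstream by the extraction and sentiment stages.
    direction: float   # -1.0 (bearish) .. +1.0 (bullish)
    confidence: float  # 0.0 .. 1.0, model-reported

def weighted_recommendation(signals: list[Signal]) -> float:
    """Confidence-weighted average of per-document signals.

    Hypothetical aggregation for illustration only: high-confidence
    signals dominate, low-confidence ones barely move the result.
    """
    total_weight = sum(s.confidence for s in signals)
    if total_weight == 0:
        return 0.0
    return sum(s.direction * s.confidence for s in signals) / total_weight

signals = [Signal(0.8, 0.9), Signal(-0.2, 0.5), Signal(0.5, 0.7)]
print(round(weighted_recommendation(signals), 3))  # prints 0.462
```

The design point is that a weighted average degrades gracefully: a single noisy low-confidence article cannot flip a recommendation backed by several high-confidence filing extracts.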
Research prep time drops by 70%. Analysts get structured briefs with sources instead of raw documents. Sentiment signals arrive in real time instead of periodically. Coverage capacity increases because the data-gathering bottleneck is removed.
The analyst still makes the investment decision. The agents handle the research grunt work.