We don't just use AI — we build the connective tissue between foundation models, real-time market data, and proprietary trade intelligence. Here's how the stack works.
We don't lock into a single model. Multi-model orchestration — matching the right model to each research task — is itself a competitive advantage.
Primary reasoning engine. Long-context processing for financial reports, causal chain analysis, thesis stress-testing, and bilingual cross-referencing. The backbone of our research workflow.
Complementary model for multimodal tasks — chart parsing, scanned document analysis, and complex quantitative reasoning where specialized capabilities matter.
Efficient preprocessing and triage engine. Initial news filtering, routine summarization, metadata extraction — reserving budget for the analytical core by offloading routine work to cost-effective models.
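The routing idea above can be sketched in a few lines. This is a minimal illustration, not our production orchestrator: the model names, tier labels, and task types below are all hypothetical placeholders.

```python
# Hypothetical model tiers; identifiers are illustrative placeholders.
MODEL_TIERS = {
    "reasoning": "primary-llm",    # long-context causal analysis, stress-testing
    "multimodal": "vision-llm",    # chart parsing, scanned documents
    "triage": "small-llm",         # news filtering, routine summarization
}

# Map each research task type to a tier; unknown tasks default to cheap triage.
TASK_ROUTES = {
    "thesis_stress_test": "reasoning",
    "filing_analysis": "reasoning",
    "chart_parsing": "multimodal",
    "news_filter": "triage",
    "summarize": "triage",
}

def route(task_type: str) -> str:
    """Pick a model for a research task; fall back to the cost-effective tier."""
    tier = TASK_ROUTES.get(task_type, "triage")
    return MODEL_TIERS[tier]
```

The fallback-to-triage default encodes the budget discipline described above: only tasks explicitly flagged as analytical reach the expensive reasoning engine.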
Through Model Context Protocol (MCP), our AI talks directly to data sources — turning "human queries data, feeds AI" into "AI queries data autonomously."
Global equities, fixed income, derivatives, ESG data. The foundation for cross-border valuation comparisons and global fundamental analysis.
China market data core — A-shares, HK equities, futures, funds, macro indicators. Open-source, API-native, purpose-built for AI integration.
Research ingestion pipeline — auto-collecting sell-side reports, news, papers, social signals. AI extracts key views and maps consensus vs. our internal thesis.
Workflow orchestration — connecting models, data sources, and tools into repeatable automated research pipelines. The glue layer of our entire stack.
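To make the "AI queries data autonomously" point concrete: MCP frames tool invocations as JSON-RPC 2.0 requests, so a model-initiated data pull is just a `tools/call` message sent to an MCP server. A minimal sketch of building such a request — the tool name `get_quote` and its arguments are hypothetical, standing in for whichever market-data tool a server exposes:

```python
import json

def mcp_tool_call(tool: str, arguments: dict, request_id: int = 1) -> str:
    """Serialize an MCP tools/call request using JSON-RPC 2.0 framing."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical tool and arguments for illustration only.
req = mcp_tool_call("get_quote", {"symbol": "0700.HK", "fields": ["px_last"]})
```

In practice the model emits this call itself mid-reasoning and the orchestration layer transports it; the human never sits in the query loop.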
Cross-border finance demands specific cloud architecture: data sovereignty, compliance, latency optimization, and multi-region consistency.
Global infrastructure backbone. Bedrock for Claude deployment, Data Exchange for market feeds, cross-region data synchronization.
Azure OpenAI Service for enterprise-grade model deployment. Microsoft 365 ecosystem integration for research workflow automation.
BigQuery for large-scale market data analysis. Vertex AI for custom model training. Global network infrastructure for latency-sensitive processing.
An end-to-end research pipeline that connects signal detection, context assembly, multi-model analysis, and three-lens review.
Automated monitoring scans market data, news, social media, and trade flows. Pre-defined thresholds trigger alerts.
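A threshold check of this kind reduces to a small rule table. The sketch below is illustrative only — the threshold names, values, and input fields are assumptions, not our actual monitoring rules:

```python
# Hypothetical alert thresholds; values are illustrative.
THRESHOLDS = {
    "abs_return": 0.03,     # |1-day return| at or above 3%
    "volume_ratio": 2.0,    # volume at or above 2x the 20-day average
}

def check_signal(tick: dict) -> list[str]:
    """Return the names of any thresholds this observation breaches."""
    alerts = []
    if abs(tick.get("ret_1d", 0.0)) >= THRESHOLDS["abs_return"]:
        alerts.append("abs_return")
    if tick.get("volume_ratio", 0.0) >= THRESHOLDS["volume_ratio"]:
        alerts.append("volume_ratio")
    return alerts
```

Each breached rule name travels with the alert, so the downstream context-assembly step knows which signal it is explaining.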
AI via MCP automatically pulls historical data, sell-side consensus, related assets, and macro context into a unified analysis workspace.
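The assembly step amounts to fanning out per-source pulls and merging them into one workspace object. A minimal sketch under assumed names — the source keys and stub fetchers below are hypothetical stand-ins for MCP-backed tools:

```python
def assemble_context(symbol: str, sources: dict) -> dict:
    """Pull each data source for a symbol and merge into one workspace."""
    return {
        "symbol": symbol,
        "history": sources["history"](symbol),     # historical data
        "consensus": sources["consensus"](symbol), # sell-side consensus
        "related": sources["related"](symbol),     # related assets
        "macro": sources["macro"](),               # macro context
    }

# Illustrative stub fetchers standing in for MCP tool calls.
stubs = {
    "history": lambda s: f"{s}: 5y price history",
    "consensus": lambda s: f"{s}: sell-side consensus",
    "related": lambda s: f"{s}: peers and supply chain",
    "macro": lambda: "rates, FX, PMI snapshot",
}
workspace = assemble_context("0700.HK", stubs)
```

Keeping the workspace a single structured object is what lets the next stage hand the full context to whichever model the router selects.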
The primary LLM handles core reasoning (causal chains, thesis stress-testing). Supporting models handle data processing and quantitative analysis.
Fundamental analysis checks thesis quality. Risk assessment evaluates exposure. Market structure analysis assesses dynamics. AI assists the first two lenses; a human makes the final call.
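The review gate can be expressed as a simple conjunction: all three lenses must pass, and human sign-off is a separate, mandatory field rather than something AI can supply. A minimal sketch with assumed field names:

```python
from dataclasses import dataclass

@dataclass
class Review:
    fundamental_ok: bool    # AI-assisted thesis-quality check
    risk_ok: bool           # AI-assisted exposure check
    structure_ok: bool      # market-structure check
    human_approved: bool    # final call is always human

def passes(review: Review) -> bool:
    """All three lenses plus explicit human sign-off."""
    return all([review.fundamental_ok, review.risk_ok,
                review.structure_ok, review.human_approved])
```

Modeling human approval as its own boolean makes the human-in-the-loop constraint structural: no combination of AI-assisted checks can clear the gate alone.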
Structured research memo generated. Monitoring rules updated. Bayesian priors adjusted for the next cycle.
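For the prior-adjustment step, one standard form is a conjugate Beta-Bernoulli update on a thesis hit rate: each cycle's outcome increments one side of the Beta prior. This is a generic sketch of the technique, not our calibration model; the starting prior of Beta(2, 2) is an illustrative weakly informative choice.

```python
def update_beta_prior(alpha: float, beta: float, hit: bool) -> tuple[float, float]:
    """Conjugate Beta-Bernoulli update: one observed outcome shifts the prior."""
    return (alpha + 1, beta) if hit else (alpha, beta + 1)

# Fold one cycle's outcome into a weakly informative Beta(2, 2) prior.
alpha, beta = 2.0, 2.0
alpha, beta = update_beta_prior(alpha, beta, hit=True)
mean = alpha / (alpha + beta)  # posterior mean probability the thesis plays out
```

The posterior mean then feeds the next cycle's monitoring thresholds, closing the loop the pipeline describes.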