Over four months we led the AI initiative at Sunday Capital — building eight production tools and a standalone integration, while the team kept doing their actual jobs in Slack, Excel, Outlook, and the CRM they already used. Nothing here asked anyone to learn a new platform.
These aren't engineering choices so much as a stance: meet operators inside their existing software, put the AI in the seams between systems, and treat compliance as a starting constraint rather than a finishing tax.
A full loan-lifecycle platform that ingests deals from email, takes status updates over Slack in plain English, syncs with the CRM, and files documents into the team's existing OneDrive structure.
The firm's hand-tuned Excel underwriting model — years of accumulated logic — re-implemented as an interactive web app with AI research and document parsing baked in.
| | 5.50% | 5.75% | 6.00% | 6.25% |
|---|---|---|---|---|
| +4% | 28.1% | 24.6% | 21.4% | 18.5% |
| +3% | 25.2% | 22.4% | 19.1% | 16.0% |
| +2% | 22.0% | 19.4% | 16.3% | 13.4% |
| +1% | 18.4% | 16.0% | 13.0% | 10.2% |
A purpose-built investor relationship and pipeline tool for an active $50M fundraise. Built because the off-the-shelf options either over-served or under-served — and the team was already tracking everything in spreadsheets.
Letters of Intent get drafted, posted to a Slack channel as a PDF, and need sign-off from the partners. The bot reads the PDF, drafts an approval thread, and tallies approvals as they come in — entirely inside Slack, where the conversation already happens.
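The tallying step reduces to a small pure function. A sketch of the logic such a bot might run, assuming Slack's reaction-object shape; the partner user IDs and the two-approval threshold are illustrative, not the firm's actual configuration:

```python
# Hypothetical sketch of the approval tally. Partner IDs and the
# threshold are assumptions; the real bot reads reactions via the Slack API.
PARTNERS = {"U_ALICE", "U_BOB", "U_CAROL"}  # Slack user IDs of approvers

def tally(reactions: list[dict], required: int = 2) -> dict:
    """Count approval reactions from partners on the LOI thread.

    `reactions` mimics Slack's reaction objects:
    [{"name": "white_check_mark", "users": ["U_ALICE", ...]}, ...]
    """
    approvers: set[str] = set()
    for r in reactions:
        if r["name"] == "white_check_mark":
            approvers.update(u for u in r["users"] if u in PARTNERS)
    return {
        "approved_by": sorted(approvers),
        "status": "approved" if len(approvers) >= required else "pending",
    }
```

Reactions from non-partners are ignored, so a well-meaning 👍 from elsewhere in the channel can't tip an LOI into approved.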
A Model Context Protocol server that exposes the firm's loan pipeline and investor CRM directly to LLM clients. Anyone on the team can ask their assistant questions like "how many loans have we done with [capital partner]?" and get a real answer pulled from the live database.
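Under the hood, each MCP tool is just a typed function over the database. A sketch of what the handler behind a question like the one above might look like, with a hypothetical table schema and demo data so it runs end to end; the real server wires this through the MCP SDK and the live pipeline database:

```python
import sqlite3

def count_loans_with_partner(conn: sqlite3.Connection, partner: str) -> int:
    """Handler an MCP tool such as `loans_with_partner` might call.
    Schema is hypothetical -- the real pipeline database differs."""
    row = conn.execute(
        "SELECT COUNT(*) FROM loans WHERE capital_partner = ?", (partner,)
    ).fetchone()
    return row[0]

# In-memory demo data so the sketch is runnable.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE loans (id INTEGER PRIMARY KEY, capital_partner TEXT)")
conn.executemany(
    "INSERT INTO loans (capital_partner) VALUES (?)",
    [("Acme Capital",), ("Acme Capital",), ("Other LP",)],
)
```

The LLM client never writes SQL; it calls the named tool with a partner argument, and the parameterized query keeps the database safe from whatever string the model passes in.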
Some of these are full applications. One is a hundred lines of Worker code. The right size for the work, not a uniform template.
Two things that aren't products themselves but shaped how the work landed.
Eighty-six deduplicated automation requests from seven team members, organized into twelve themes, each tagged with a multi-select of requesters so demand intensity stayed visible. Hosted in Airtable and shared with the external AI consultant. Sourced from an internal interview process we ran across the team, so the build queue reflected what people actually wanted, not what was easiest to ship.
The discipline applies to our own work too. A daily launchd agent runs LLM extraction over yesterday's meeting transcripts and appends action items to an Obsidian dashboard. The same MCP server above, connected to a personal assistant, lets us query and update the loan pipeline conversationally during meetings — same pattern, different operator.
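The append step of that daily agent is small enough to sketch. In production an LLM call does the extraction; here a trivial stand-in heuristic keeps the sketch runnable, and the vault path, file name, and `TODO:` convention are all assumptions:

```python
from datetime import date
from pathlib import Path

def extract_action_items(transcript: str) -> list[str]:
    """Stand-in for the LLM extraction step. The real agent sends the
    transcript to a model; this heuristic just keeps the sketch testable."""
    return [line.removeprefix("TODO:").strip()
            for line in transcript.splitlines() if line.startswith("TODO:")]

def append_to_dashboard(vault: Path, transcript: str, day: date) -> Path:
    """Append extracted items to the Obsidian dashboard as unchecked tasks."""
    dashboard = vault / "Dashboard.md"  # hypothetical file name
    with dashboard.open("a") as f:
        f.write(f"\n## {day.isoformat()}\n")
        for item in extract_action_items(transcript):
            f.write(f"- [ ] {item}\n")
    return dashboard
```

Because the output is plain Markdown checkboxes, Obsidian renders the items as a task list with no plugin required.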
Slack, Excel, Outlook, the CRM. A new platform appears only when nothing existing can carry the job, and even then it's wired into the existing tools, not in place of them.
Email ingestion, plain-English Slack updates, doc parsing, deal research. The team doesn't "use AI" — they communicate normally and the model translates.
A standalone Worker for one integration. A full multi-user platform for the multi-tenant tool. Right-size to what the work actually needs — don't reach for the platform every time.
Every new surface had an Excel import on day one. The team's existing data wasn't a migration problem — it was the starting point.
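The shape of each importer is the same: validate the headers, then coerce every row into a record. A sketch of that step, using CSV and the standard library for brevity where the real importers read `.xlsx`; the column names are hypothetical, mapped from whatever headers the team's actual spreadsheets use:

```python
import csv
import io

# Hypothetical column names -- each real importer maps the team's actual headers.
REQUIRED = ["Deal Name", "Loan Amount", "Status"]

def import_rows(csv_text: str) -> list[dict]:
    """Validate headers, then coerce each spreadsheet row into a pipeline record."""
    reader = csv.DictReader(io.StringIO(csv_text))
    missing = [c for c in REQUIRED if c not in (reader.fieldnames or [])]
    if missing:
        raise ValueError(f"missing columns: {missing}")
    return [
        {
            "name": row["Deal Name"].strip(),
            # Tolerate the formatting people actually type: "$1,250,000"
            "amount": float(row["Loan Amount"].replace("$", "").replace(",", "")),
            "status": row["Status"].strip().lower(),
        }
        for row in reader
    ]
```

Failing loudly on missing columns matters more than it looks: a silent partial import is exactly the kind of data drift that makes a team stop trusting the new surface.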
SSO-locked to the company domain. Auth on every API route. Audit trails on the writes that matter. Not retrofitted at the end.
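The domain lock itself reduces to a check run on every request's verified identity. A minimal sketch, assuming an email claim from the SSO provider; the domain string is illustrative, and production relies on the provider's token verification, not string parsing alone:

```python
COMPANY_DOMAIN = "sundaycapital.com"  # assumption: illustrative domain

def is_authorized(email: str) -> bool:
    """Allow only addresses on the company domain (post-SSO verification)."""
    local, sep, domain = email.strip().lower().rpartition("@")
    return bool(local) and sep == "@" and domain == COMPANY_DOMAIN
```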
351 tests on the underwriting engine, validated to within $1 / 0.001% / 0.005 IRR against the source workbook. When the calculations drive real money, "looks right" isn't a standard.
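Those tolerances translate directly into assertions. A sketch of the shape one of those checks might take; the field names and numbers are illustrative, not the firm's actual workbook values:

```python
def check_against_workbook(app: dict, workbook: dict) -> None:
    """Assert the web app matches the Excel model within tolerance:
    $1 on dollar figures, 0.001 points on percentages, 0.005 on IRR.
    Field names are hypothetical."""
    assert abs(app["loan_amount"] - workbook["loan_amount"]) <= 1.00
    assert abs(app["ltv_pct"] - workbook["ltv_pct"]) <= 0.001
    assert abs(app["irr"] - workbook["irr"]) <= 0.005
```

Pinning each output to the source workbook, rather than to hand-picked expected values, means the re-implementation is judged against the model the partners already trust.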
We take on a small number of engagements at a time — usually multi-month builds for operating companies that have real data, real workflows, and a team that doesn't want to learn a new tool.
josh@frontera-ventures.com