Signal
Case Study

AI tools for a $1B+ real estate lender, built where the team already worked.

Over four months we led the AI initiative at Sunday Capital — building eight production tools and a standalone integration, while the team kept doing their actual jobs in Slack, Excel, Outlook, and the CRM they already used. Nothing here asked anyone to learn a new platform.

Role
AI Initiative Lead
Timeframe
Jan – Apr 2026
Build
~260–350 hours
Agency Equivalent
$175k – $300k+

Three threads run through every project.

They aren't engineering choices so much as a stance: meet operators inside their existing software, put the AI in the seams between systems, and treat compliance as a starting constraint rather than a finishing tax.

01
Inside the tools the team already used.
Slack stayed Slack. Excel stayed Excel. Outlook, OneDrive, and the CRM stayed too. New surfaces only appeared when nothing existing could carry the job — and even then, they were wired into what was already running.
02
AI in the seams, not the surface.
The team doesn't "use AI." They send emails and Slack messages like they always did. The model handles the translation underneath — email ingestion, plain-English updates, document parsing, deal research.
03
Compliance-shaped from day one.
SSO-locked to the company domain. Auth on every API route. Audit trails on the writes that matter. Built so a regulated lender could actually deploy it — not retrofitted later.
8
Production tools shipped
Plus one standalone Cloudflare Worker.
~300
Hours, four months
Jan – Apr 2026.
351
Tests on the underwriting engine
Validated to $1 / 0.001% / 0.005 IRR.
0
New tools the team had to learn
Everything met them where they were.
Project 01

Pulse · Active loan tracker

A full loan-lifecycle platform that ingests deals from email, takes status updates over Slack in plain English, syncs with the CRM, and files documents into the team's existing OneDrive structure.

The problem

  • Loan updates lived in scattered Slack threads and spreadsheets.
  • New deals arrived as forwarded email PDFs to the underwriting alias and got typed in by hand.
  • Closing docs got dragged into OneDrive folders that may or may not exist yet.
  • The CRM was the system of record but rarely up-to-date.

The approach

  • AI deal ingest — incoming emails to the underwriting alias get parsed by a frontier LLM. Borrower, address, terms, capital source — each field confidence-scored. Team reviews and one-clicks to create a loan. Duplicate detection by borrower name and property address.
  • Plain-English Slack bot — team posts updates in a deals channel ("450 Oak Ave funded yesterday at the higher rate"). Fuzzy loan matching via pg_trgm handles typos and abbreviations. Status changes auto-notify the channel.
  • CRM sync — OAuth client with auto-refresh, incremental sync every fifteen minutes. Conflict policy: CRM wins for origination, platform wins for operational fields.
  • OneDrive auto-filing — email attachments land in Pipeline / [Borrower] – [Address] / automatically. Docs link out to OneDrive when available, fall back to platform storage when not.
  • Document room per loan — uploads, version history, activity log with source badges (manual / Slack / email).
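The fuzzy matching above happens inside Postgres via the pg_trgm extension; the app never re-implements it. As a rough illustration of the idea only — not the production query — trigram similarity can be sketched in TypeScript (the padding here approximates, rather than exactly matches, pg_trgm's per-word rules):

```typescript
// Illustrative trigram similarity, approximating what Postgres's pg_trgm
// extension computes natively with similarity(). Hypothetical sketch —
// the production system runs this as SQL, not app code.

// Split a string into 3-character "trigrams", space-padded at the ends.
function trigrams(s: string): Set<string> {
  const padded = `  ${s.toLowerCase().replace(/[^a-z0-9 ]/g, "")} `;
  const grams = new Set<string>();
  for (let i = 0; i <= padded.length - 3; i++) {
    grams.add(padded.slice(i, i + 3));
  }
  return grams;
}

// Similarity = shared trigrams / distinct trigrams overall (Jaccard),
// which is how pg_trgm scores a match.
function similarity(a: string, b: string): number {
  const ta = trigrams(a);
  const tb = trigrams(b);
  let shared = 0;
  for (const g of ta) if (tb.has(g)) shared++;
  const union = ta.size + tb.size - shared;
  return union === 0 ? 0 : shared / union;
}

// "450 oak ave" scores far above pg_trgm's default 0.3 threshold against
// "450 Oak Avenue", and near zero against an unrelated loan — which is
// why typos and abbreviations still resolve to the right record.
console.log(similarity("450 oak ave", "450 Oak Avenue") > 0.3);      // true
console.log(similarity("450 oak ave", "Riverbend Apartments") < 0.3); // true
```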

Stack

Next.js 16 · Supabase · Microsoft Graph · Slack API · CRM API · Frontier LLM

Outcome

Deal intake collapsed from ~15 minutes of typing to a one-click review.
Loan status now lives in one place — driven by how the team already communicates.
Five phases shipped; phase six polish in progress.
#loan-updates · Today
Avery · 9:42 AM
450 oak ave funded yesterday at the higher rate. construction draw also released 1.2m
Pulse · bot: Updated 450 Oak Avenue — status → Funded, rate locked at 9.25%. Construction draw $1.2M logged.
Reese · 10:11 AM
riverbend extension approved — 60 days
Pulse · bot: Updated Riverbend Apartments — maturity extended 60 days. Capital partner notified.
Plain-English Slack updates → structured loan changes, with confirmations posted back in-thread.
Project 02

Underwriting Platform · Excel proforma, re-implemented

The firm's hand-tuned Excel underwriting model — years of accumulated logic — re-implemented as an interactive web app with AI research and document parsing baked in.

The problem

  • Underwriting lived in a dense, fragile Excel workbook. One analyst owned the master copy.
  • Property research (rent comps, market trends, sponsor track record) was a manual web-search slog.
  • Financial docs (T-12s, rent rolls, OMs) got typed into the model by hand.

The approach

  • Pure-function calculation engine — 13 modules, 351 unit tests, validated against a real 18-unit multifamily fixture to within $1 / 0.001% / 0.005 IRR. EOMONTH, PMT, XIRR, A/B-note schedules, DSCR, IRR/MOIC, sensitivity, sources & uses, scorecard.
  • AI property research — eight-section auto-generated dossier (market, comps, sponsor, demographics) via a frontier LLM with web search. Renders as collapsible prose in a dedicated Report tab.
  • Document parsing — drag a PDF or XLSX in, the model extracts fields, the analyst selectively merges them. Every field human-confirmed before it lands.
  • Six-tab deal workspace — Deal Setup, Rent Roll, Loan Terms, Returns Dashboard, Scorecard, Export.
  • XLSX round-trip — ~512 cells across 14 sheets exported back into the firm's template, so anyone outside the platform still gets the familiar Excel.
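Pure functions — dates and cashflows in, numbers out, no hidden state — are what make 351-test validation against the workbook practical. A minimal sketch of one such function, an XIRR solved by bisection (illustrative only; the production module and its tolerances may differ):

```typescript
// Minimal XIRR sketch: find the rate at which the NPV of dated cashflows
// is zero, Excel-style (Actual/365 day-count). A simplified stand-in for
// the validated engine module, not the real implementation.
type Cashflow = { date: Date; amount: number };

function npv(rate: number, flows: Cashflow[]): number {
  const t0 = flows[0].date.getTime();
  return flows.reduce((sum, cf) => {
    const years = (cf.date.getTime() - t0) / (365 * 24 * 3600 * 1000);
    return sum + cf.amount / Math.pow(1 + rate, years);
  }, 0);
}

// Bisection: slower than Newton's method but robust and derivative-free.
// Assumes NPV changes sign somewhere in (-0.99, 10), which holds for
// conventional lend-then-collect cashflow patterns.
function xirr(flows: Cashflow[]): number {
  let lo = -0.99;
  let hi = 10;
  for (let i = 0; i < 200; i++) {
    const mid = (lo + hi) / 2;
    if (npv(lo, flows) * npv(mid, flows) <= 0) hi = mid;
    else lo = mid;
  }
  return (lo + hi) / 2;
}

// $1M out, $1.1M back exactly one year later → 10% IRR.
const flows: Cashflow[] = [
  { date: new Date("2026-01-01"), amount: -1_000_000 },
  { date: new Date("2027-01-01"), amount: 1_100_000 },
];
console.log(xirr(flows).toFixed(4)); // "0.1000"
```

Because the function is pure, a fixture like the firm's 18-unit multifamily deal can assert on its output to tight tolerances without mocking anything.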

Stack

Next.js 16 · TypeScript · Vitest · Frontier LLM · PDF / XLSX parsing

Outcome

Underwriting moved from one analyst's laptop to a multi-user platform with auth.
Hours of manual research per deal collapsed into minutes — with citations.
Engine has 351 tests; calculations are auditable and refactor-safe.
IRR sensitivity · Exit cap × Rent growth

          5.50%    5.75%    6.00%    6.25%
+4%       28.1%    24.6%    21.4%    18.5%
+3%       25.2%    22.4%    19.1%    16.0%
+2%       22.0%    19.4%    16.3%    13.4%
+1%       18.4%    16.0%    13.0%    10.2%

351 tests · last run passed. Engine validated to within $1 / 0.001% / 0.005 IRR against the source workbook.
Returns dashboard with sensitivity grid. Numbers match the source workbook to the cent.
Project 03

Flow · Investor CRM for an active fundraise

A purpose-built investor relationship and pipeline tool for an active $50M fundraise. Built because the off-the-shelf options either over-served or under-served — and the team was already tracking everything in spreadsheets.

The problem

  • Fundraise pipeline lived in a shared Excel sheet that decayed weekly.
  • No way to visualize stage progression or commitment progress.
  • Importing from existing broker lists meant manual copy-paste.

The approach

  • Excel import wizard — upload the existing spreadsheet, map columns, preview, import. Met the team where their data already was.
  • Pipeline kanban — drag investors across stages — cold → warm → meeting → soft commit → hard commit / disqualified — with optimistic updates.
  • Commitment flows — soft commit / hard commit dialogs with amount capture; disqualify and requalify with reason tracking.
  • Dashboard — fundraise progress bar, pipeline funnel, stage breakdown, metric cards.
  • Custom design system — Tailwind v4 theme, full component library — no off-the-shelf dashboard look.

Stack

Next.js · Supabase · Zustand · Tailwind v4

Outcome

Replaced the shared spreadsheet without forcing the team into a generic CRM.
Pipeline stage and commitment progress are now glanceable, not assembled.
Fund I capital raise · $14.5M raised / $50M

Cold (38): Investor A $500K · Family Office B
Warm (22, $8.4M): Family Office C $2M · Investor D $750K
Meeting (14, $11.2M): Fund E $3M
Soft commit (9, $18.7M): Holdings F $5M · Family G $2M
Hard commit (6, $14.5M): Partners H $4M
Pipeline kanban. Stage totals and commitment progress visible at a glance, drag-to-reassign for stage changes.
Project 04

LOI Approvals · Slack-native approval workflow

Letters of Intent get drafted, posted to a Slack channel as a PDF, and need sign-off from the partners. The bot reads the PDF, drafts an approval thread, and tallies approvals as they come in — entirely inside Slack, where the conversation already happens.

The problem

  • LOI sign-offs were a chase: post the PDF, ping each partner, lose track of who had approved.
  • No structured record of who approved what, when — and no easy way to know what was still pending.

The approach

  • PDF-aware bot — when a partner drops an LOI PDF in the approvals channel, the bot reads it, extracts the property, amount, and terms, and posts a structured tracker thread.
  • Inline approve / decline — one-click buttons in the thread; approvals are timestamped per partner and the running tally updates in place.
  • Conversation-aware — objections and back-and-forth ("I thought we were doing a 9.75 rate") stay in-thread, alongside the approval state.
  • Audit trail — every approval is recorded with name, time, and the LOI version it applied to — exportable on demand.

Stack

Slack API · PDF parsing · Frontier LLM · Supabase

Outcome

LOI approval moved from a manual chase to a glanceable thread.
Partners approve from their phones, in the channel they already check.
Compliance gets a clean audit trail without the team filing one.
Thread · approvals · 5 replies
PDF · 1200 Sample St — LOI 04.27.2026.pdf
Posted by Partner One · 12:01 PM
1200 Sample St, Sample City, ST 00000
LOI 04.27.2026 · Drafted by Partner One
Approvals · 4 of 5
Partner One · approved 12:36 PM
Partner Two · pending
Partner Three · approved 12:22 PM
Partner Four · approved 2:05 PM
Partner Five · approved 1:49 PM
A live approval thread in the approvals channel. PDF in, structured tracker out, approvals collected inline.
Project 05

Custom MCP Server · Natural-language access to the loan pipeline

A Model Context Protocol server that exposes the firm's loan pipeline and investor CRM directly to LLM clients. Anyone on the team can ask their assistant questions like "how many loans have we done with [capital partner]?" and get a real answer pulled from the live database.

The problem

  • The platform had the data. People wanted to ask it questions in plain language without clicking through dashboards.

The approach

  • Eleven tools spanning loans, contacts, capital sources, and pipeline summaries.
  • Search, count, aggregate, update — including batch loan-status updates from a single conversational request.
  • HTTP transport — connects from Claude Code, Claude Desktop, or any MCP-compatible client with one config line.
  • Auto-deployed from GitHub on every push.
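An MCP tool is, at its core, a named, schema-described function the assistant can call with structured arguments it extracts from a plain-language question. Stripped of the SDK wiring, the shape of one of the eleven tools might look like this — tool name aside, every field and record here is hypothetical, and the real server queries the live Supabase database rather than an in-memory array:

```typescript
// Hypothetical sketch of an MCP-style tool handler. The real server
// registers tools through the MCP TypeScript SDK and reads Supabase;
// this stand-in shows only the request/response shape.
type Loan = {
  borrower: string;
  capitalPartner: string;
  amount: number;
  year: number;
};

// Stand-in data — illustrative records, not real deals.
const loans: Loan[] = [
  { borrower: "Maple Ridge", capitalPartner: "Capital Partner X", amount: 24_000_000, year: 2026 },
  { borrower: "Riverbend",   capitalPartner: "Capital Partner X", amount: 9_500_000,  year: 2026 },
  { borrower: "Oak Avenue",  capitalPartner: "Partner Y",         amount: 4_200_000,  year: 2025 },
];

// search_loans: the assistant supplies whichever filters the question
// implied; omitted filters match everything.
function searchLoans(args: { capital_partner?: string; year?: number }) {
  const hits = loans.filter(
    (l) =>
      (args.capital_partner === undefined || l.capitalPartner === args.capital_partner) &&
      (args.year === undefined || l.year === args.year)
  );
  return {
    count: hits.length,
    total: hits.reduce((sum, l) => sum + l.amount, 0),
    records: hits,
  };
}

const res = searchLoans({ capital_partner: "Capital Partner X", year: 2026 });
console.log(res.count, res.total); // 2 33500000
```

The assistant turns the structured result back into prose — which is the whole trick: the database stays authoritative, and the model only translates at the edges.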

Stack

MCP TypeScript SDK · Vercel · Supabase

Outcome

Natural-language access to the entire loan pipeline.
Same data the platform UI shows — but conversational, inside the assistant the team is already using.
Assistant · internal-tools MCP
You
How many loans have we closed with Capital Partner X so far this year?
→ tool_call: search_loans
{ "capital_partner": "Capital Partner X", "year": 2026 }
← result: 4 records · 18ms
Assistant
Four loans, $58.2M total. Three are funded; one (Maple Ridge, $24M) is in closing. Average rate 9.1%, average term 32 months. Want a breakdown by asset class?
Question in. Tool call against the live database. Real numbers back. Same data as the dashboard, conversational.

Four more, sized to what each job actually needed.

Some of these are full applications. One is a hundred lines of Worker code. The right size for the work, not a uniform template.

Project 06
Task Board
Kanban with a Slack intake channel
A small, fast LLM parses messages in the tasks channel, identifies the assignee, creates the task, and posts a confirmation in-thread. Team-wide kanban, drag-and-drop reassign. Daily 9 AM ET cron posts a completion summary back to the channel.
Next.js · Supabase · Slack API · Small LLM
Project 07
Deal Teaser Generator
Excel + Python pipeline → one web tool
15-section accordion form, Excel-model import, one-click import from Pulse, client-side PDF generation matching the established teaser layout. localStorage drafts; saved-deals list. Replaced two scripts and a macro with one tool the credit team can run themselves.
Next.js · @react-pdf/renderer · Supabase Storage
Project 08
Console
Internal ops dashboard
Per-request token logging with a daily area chart of LLM spend. In-app bug submission with screenshot upload routed to a dedicated Slack channel. /bugs Slack slash command for filing without leaving Slack. Manual sync controls with last_sync_at displays.
Next.js · Supabase · Slack API · shadcn charts
Project 09
CRM Auto-Pilot
Standalone Cloudflare Worker
Built deliberately outside the main platform. Single-purpose Worker — Slack webhook in, CRM API out. Company-owned credentials stored as Worker secrets, no dependency on any individual's account. Slack confirm / edit / cancel buttons for human-in-the-loop deal intake.
Cloudflare Workers · Wrangler · Slack API · CRM API
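The whole Worker is on the order of a hundred lines, and its core is a pure translation: a confirmed Slack payload in, a CRM API request out. A hedged sketch of that seam — the field names, the amount format, and the endpoint shape are all hypothetical, not the real CRM's API:

```typescript
// Hypothetical sketch of the Worker's core seam. Field names and the
// CRM endpoint are illustrative; the real Worker authenticates with
// company-owned credentials stored as Worker secrets.
type SlackDealConfirm = { borrower: string; address: string; amount: string };

// Normalize "$1.2m" / "450k"-style Slack shorthand to whole dollars.
function parseAmount(raw: string): number {
  const m = raw.toLowerCase().replace(/[$,\s]/g, "").match(/^([\d.]+)(m|k)?$/);
  if (!m) throw new Error(`unparseable amount: ${raw}`);
  const mult = m[2] === "m" ? 1_000_000 : m[2] === "k" ? 1_000 : 1;
  return Math.round(Number(m[1]) * mult);
}

// Translate the human-confirmed Slack payload into a CRM deal record.
function toCrmDeal(p: SlackDealConfirm) {
  return {
    name: `${p.borrower} · ${p.address}`,
    value: parseAmount(p.amount),
    source: "slack-intake",
  };
}

// Inside the Worker's fetch handler, the record would then be POSTed
// out with a secret-stored credential, roughly:
//   await fetch(`${env.CRM_URL}/deals`, {
//     method: "POST",
//     headers: { Authorization: `Bearer ${env.CRM_TOKEN}` },
//     body: JSON.stringify(deal),
//   });
console.log(parseAmount("$1.2m")); // 1200000
```

Keeping the translation pure means the human-in-the-loop step (Slack's confirm / edit / cancel buttons) always sees exactly what will be written before anything touches the CRM.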

The operating system, not just the tools.

Two things that aren't products themselves but shaped how the work landed.

AI Initiative Tracker

Eighty-six deduped automation requests collected from seven team members across twelve themes, each tagged with a multi-select of requesters so demand intensity stayed visible. Hosted in Airtable, shared with the external AI consultant. Sourced from an internal interview process we ran across the team — so the build queue reflected what people actually wanted, not what was easiest to ship.

Personal automations

The discipline applies to our own work too. A daily launchd agent runs LLM extraction over yesterday's meeting transcripts and appends action items to an Obsidian dashboard. The same MCP server described above, connected to a personal assistant, lets us query and update the loan pipeline conversationally during meetings — same pattern, different operator.

Six principles, applied across every project on this page.

01

Meet operators where they already work.

Slack, Excel, Outlook, the CRM. The new platform appears only when nothing existing can carry the job — and even then, it's wired into the existing tools, not in place of them.

02

Put the AI in the seams.

Email ingestion, plain-English Slack updates, doc parsing, deal research. The team doesn't "use AI" — they communicate normally and the model translates.

03

Build the lightest thing that does the job.

A standalone Worker for one integration. A full multi-user platform where many people share one dataset. Right-size to what the work actually needs — don't reach for the platform every time.

04

Replace spreadsheets without insulting them.

Every new surface had an Excel import on day one. The team's existing data wasn't a migration problem — it was the starting point.

05

Compliance is a starting constraint.

SSO-locked to the company domain. Auth on every API route. Audit trails on the writes that matter. Not retrofitted at the end.

06

Test the math.

351 tests on the underwriting engine, validated to within $1 / 0.001% / 0.005 IRR against the source workbook. When the calculations drive real money, "looks right" isn't a standard.

Get in touch

Working on something similar? Let's talk.

We take on a small number of engagements at a time — usually multi-month builds for operating companies that have real data, real workflows, and a team that doesn't want to learn a new tool.