Hackathon Full-Stack

Undercut: Building a Production-Grade AI Car Platform

Jan 2026
6 min read

For my second hackathon, my teammates Soroush Baraouf, Parisa Taherizadeh, Ariyan Azami, and Mahan Ghaffarianmayouni and I took on an ambitious challenge: building a full-stack, production-ready platform that uses AI and quantitative analysis to help people find underpriced used cars in the Toronto market.

The Result: Meet "Undercut" 🚗

The used car market is notoriously opaque. Buyers lack tools to determine if a listing is fairly priced or a potential scam. Undercut levels the playing field by combining high-frequency web scraping, quantitative pricing algorithms, and Google Gemini AI analysis to give buyers "unfair" market intelligence.

My project role:

  • Scraper Developer: Engineered a high-performance scraper in Go with Playwright to extract real-time data from thousands of GTA car listings.
  • Backend Architecture: Designed and built the FastAPI backend with SQLAlchemy ORM.
  • AI Integration: Implemented Google Gemini 1.5 Flash for listing analysis and negotiation scripts.
  • Quantitative Engine: Developed Fair Market Value (FMV) calculation and deal grading algorithms.

The Challenge: Building a Complete Ecosystem

Unlike a typical hackathon project, Undercut required building three interconnected systems:

1. The Scraper ("The Hunter") - Go

Using Playwright for Go, I developed a scraper that navigates AutoTrader.ca to extract car listings. The main challenge was bypassing anti-bot protections on a JavaScript-heavy site. I implemented stealth techniques and headless browser automation to scrape thousands of listings within a 100 km radius of Toronto (the GTA).
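Stealth techniques aside, a scraper like this also has to back off gracefully when the target site throttles it. A minimal sketch of the kind of retry-with-exponential-backoff helper such a scraper typically relies on (the function name and parameters are illustrative, not from the actual codebase; shown in Python for consistency with the backend examples below):

```python
import random
import time

def fetch_with_backoff(fetch, max_retries=5, base_delay=1.0):
    """Call `fetch()` until it succeeds, sleeping exponentially longer
    (plus random jitter) after each failure to avoid hammering the site."""
    for attempt in range(max_retries):
        try:
            return fetch()
        except ConnectionError:
            if attempt == max_retries - 1:
                raise  # out of retries: surface the error to the caller
            # Exponential backoff with jitter: ~1s, ~2s, ~4s, ...
            time.sleep(base_delay * 2 ** attempt + random.uniform(0, 0.5))
```

Jitter matters here: without it, many retrying workers would all hit the site at the same instants, making the throttling worse.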

2. The Backend ("The Controller") - Python FastAPI

The backend is the brain of Undercut. I architected a FastAPI application with:

  • SQLAlchemy ORM for database interactions with PostgreSQL
  • Pydantic schemas for type-safe request/response validation
  • Rate limiting using SlowAPI to prevent abuse
  • Comprehensive REST API with 15+ endpoints for cars, users, and alerts
  • Quantitative services for FMV calculation and deal grading (S/A/B/C/F tiers)

3. The Frontend - Next.js 14

My teammate built a modern React interface using Next.js 14 with the App Router, featuring:

  • Server-side rendering for optimal performance
  • Radix UI components for accessibility
  • Framer Motion animations for smooth interactions
  • TanStack Query for efficient data fetching

The AI Integration: Gemini as "The Mechanic"

I integrated Google Gemini 1.5 Flash to power two critical features:

Red Flag Detection

The AI scans seller descriptions for concerning keywords like "rebuilt title," "needs work," or "running rough." It acts as a virtual expert mechanic, providing instant verdicts on whether a deal is worth pursuing.
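Before a description ever reaches Gemini, a cheap keyword pre-filter can catch the obvious cases. A hypothetical sketch of such a pre-filter (the real verdicts come from the model; phrases and wording here are illustrative):

```python
# Map of red-flag phrases to human-readable warnings (illustrative set).
RED_FLAGS = {
    "rebuilt title": "Title was rebuilt after a write-off",
    "needs work": "Seller admits outstanding repairs",
    "running rough": "Possible engine trouble",
}

def scan_description(description: str) -> list[str]:
    """Return a warning for each red-flag phrase found in the listing text."""
    text = description.lower()
    return [reason for phrase, reason in RED_FLAGS.items() if phrase in text]
```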

Negotiation Script Generator

Once a user finds a car, Gemini generates a personalized negotiation script. It includes:

  • Specific data points from our FMV analysis
  • A suggested opening offer price
  • Key leverage points based on market data
  • A walk-away threshold

The prompt engineering was critical. I designed the system prompt to make Gemini act as a "ruthless expert car flipper," ensuring blunt, profit-focused analysis instead of generic politeness.
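The persona and the FMV-grounded structure above can be sketched as a prompt builder. This is an illustrative reconstruction, not the production prompt; the function name, dict keys, and exact wording are assumptions:

```python
def build_negotiation_prompt(car: dict, fmv: float) -> str:
    """Assemble a Gemini prompt from listing data and the FMV estimate."""
    discount = fmv - car["price"]  # positive = priced below our estimate
    return (
        "You are a ruthless expert car flipper negotiating on the buyer's behalf.\n"
        f"Listing: {car['year']} {car['make']} {car['model']} at ${car['price']:,}.\n"
        f"Our fair market value estimate is ${fmv:,.0f} "
        f"({'below' if discount > 0 else 'above'} asking by ${abs(discount):,.0f}).\n"
        "Write a negotiation script with: an opening offer, key leverage points "
        "from the pricing data, and a firm walk-away threshold."
    )
```

Grounding the prompt in concrete numbers from the FMV engine is what keeps the model's script specific rather than generic.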

The Quantitative Engine

I built a custom pricing algorithm that calculates Fair Market Value (FMV) for each listing. The system then assigns a deal grade:

  • S-Tier: 10%+ below FMV (exceptional opportunity)
  • A-Tier: 5-10% below FMV (great deal)
  • B-Tier: Within 5% of FMV (fair price)
  • C-Tier: 5-10% above FMV (overpriced)
  • F-Tier: 10%+ above FMV (avoid)
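The tier thresholds above translate directly into a small grading function. A sketch of that logic (boundary handling at exactly ±5%/±10% is my assumption):

```python
def grade_deal(price: float, fmv: float) -> str:
    """Map a listing price to a deal grade by its deviation from FMV.
    Thresholds follow the S/A/B/C/F tiers described above."""
    deviation = (price - fmv) / fmv  # negative = priced below FMV
    if deviation <= -0.10:
        return "S"  # 10%+ below FMV: exceptional opportunity
    if deviation <= -0.05:
        return "A"  # 5-10% below FMV: great deal
    if deviation < 0.05:
        return "B"  # within 5% of FMV: fair price
    if deviation < 0.10:
        return "C"  # 5-10% above FMV: overpriced
    return "F"      # 10%+ above FMV: avoid
```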

Additionally, I implemented a Total Cost of Ownership (TCO) calculator that factors in fuel costs, depreciation, insurance, and maintenance to show users the real monthly cost of ownership.
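A TCO calculation along those lines can be sketched as below. All default figures (fuel consumption, depreciation rate, insurance, maintenance) are illustrative placeholders, not the platform's actual assumptions:

```python
def monthly_tco(price: float, *, annual_km: float = 15_000,
                fuel_l_per_100km: float = 8.0, fuel_price_per_l: float = 1.60,
                annual_depreciation_rate: float = 0.15,
                monthly_insurance: float = 180.0,
                monthly_maintenance: float = 75.0) -> float:
    """Estimate the real monthly cost of owning a car: fuel, depreciation,
    insurance, and maintenance combined. Defaults are placeholder values."""
    fuel = annual_km / 100 * fuel_l_per_100km * fuel_price_per_l / 12
    depreciation = price * annual_depreciation_rate / 12
    return round(fuel + depreciation + monthly_insurance + monthly_maintenance, 2)
```

For a $24,000 car under these placeholder assumptions, the sticker price hides roughly $700/month of ongoing cost, which is exactly the gap the calculator is meant to expose.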

Database Design & Architecture

The application uses PostgreSQL hosted on Supabase. I designed a normalized schema with four main tables:

  • cars: Stores all scraped listings with VIN indexing
  • users: User profiles with onboarding completion tracking
  • alerts: "Sniper alerts" for instant notifications on new deals
  • saved_cars: Junction table for user watchlists
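The shape of that schema can be sketched in DDL. Production runs on PostgreSQL via SQLAlchemy; stdlib sqlite3 is used here only to keep the example self-contained, and the columns are simplified guesses at the real ones:

```python
import sqlite3

schema = """
CREATE TABLE cars (
    id INTEGER PRIMARY KEY,
    vin TEXT UNIQUE,              -- VIN lookups served by the UNIQUE index
    make TEXT, model TEXT, year INTEGER, price REAL
);
CREATE TABLE users (
    id INTEGER PRIMARY KEY,
    email TEXT UNIQUE,
    onboarding_complete INTEGER DEFAULT 0
);
CREATE TABLE alerts (             -- "sniper alerts" on new matching deals
    id INTEGER PRIMARY KEY,
    user_id INTEGER REFERENCES users(id),
    max_price REAL, make TEXT, model TEXT
);
CREATE TABLE saved_cars (         -- junction table for user watchlists
    user_id INTEGER REFERENCES users(id),
    car_id INTEGER REFERENCES cars(id),
    PRIMARY KEY (user_id, car_id)
);
"""
conn = sqlite3.connect(":memory:")
conn.executescript(schema)
```

The composite primary key on `saved_cars` is the standard way to model a many-to-many watchlist without duplicate entries.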

Testing & Quality Assurance

I wrote a comprehensive test suite using pytest, covering:

  • Unit tests for all CRUD endpoints
  • Service layer tests for FMV, TCO, and deal grading
  • Integration tests for search and trending features
  • Mock AI responses to avoid API costs during testing
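The mocking pattern in the last bullet can be sketched with `unittest.mock.patch`: the function that actually calls Gemini is swapped out so tests never spend tokens. The function names here are hypothetical stand-ins, not the real service layer:

```python
from unittest.mock import patch

def call_gemini(prompt: str) -> str:
    """Stand-in for the real Gemini API call (hypothetical name)."""
    raise RuntimeError("real API call - must be mocked in tests")

def analyze_listing(description: str) -> str:
    """Service under test: builds a prompt and delegates to the model."""
    return call_gemini(f"Assess this listing: {description}")

# Patch the API boundary so the test exercises our logic, not the network:
with patch(f"{__name__}.call_gemini", return_value="VERDICT: avoid") as mock:
    verdict = analyze_listing("rebuilt title, running rough")
    assert verdict == "VERDICT: avoid"
    assert mock.called
```

Patching at the API boundary keeps the tests fast and deterministic while still exercising the prompt-building and response-handling code around the model.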

Deployment & DevOps

The entire stack is containerized with Docker and orchestrated using Docker Compose. The frontend is deployed on Vercel, while the backend and scraper run on managed cloud infrastructure.


This hackathon taught me the importance of system design, API architecture, and building production-grade features under pressure. A huge thanks to my teammates Soroush Baraouf, Parisa Taherizadeh, Ariyan Azami, and Mahan Ghaffarianmayouni for the incredible collaboration. Building a full-stack AI platform from scratch was the most challenging and rewarding coding experience I've had.