Back to Projects

RAMifications — DDR5 Cross-Retailer Price Tracker

A DDR5 cross-retailer price tracker. Paste a Newegg / B&H / Micro Center link and see the same SKU at every retailer, with a Claude AI verdict on whether to buy now or wait.

2026-04

Key Highlights

  • Paste any Newegg / B&H / Micro Center URL → live cross-retailer chart of the same DDR5 SKU
  • Claude Haiku 4.5 buy-signal verdict (BUY/WAIT/AVOID) with confidence + reasoning, 1h cached
  • Retroactive overpay detector: mark a past purchase, get the price-match deep link
  • Real-Chrome Playwright scraper bypasses Cloudflare on B&H via channel=chrome + webdriver masking
  • Built end-to-end in 90 minutes for the progsu Claude Code workshop

Overview

RAMifications is a cross-retailer DDR5 price tracker. One pasted product URL resolves to a canonical SKU and surfaces live prices from Newegg, B&H, Best Buy, and Micro Center side-by-side, plotted on a multi-line chart, with a Claude-powered buy/wait verdict on top.

Problem

DDR5 prices have been whiplashing since the HBM allocation pull on DRAM capacity in 2025. Cross-retailer spreads on the exact same SKU routinely exceed $100 on a given day — but no shopper checks four tabs every time. The spread is invisible at the moment of purchase, which is exactly when it matters.

Solution

Track each SKU once across every retailer that stocks it, plot the price history on a single chart, and let a Claude Haiku 4.5 verdict translate the chart into one of three actions: BUY, WAIT, or AVOID — with a short reason and a confidence score. For people who already bought, a retroactive overpay detector links straight to the retailer's price-match form (Newegg honors 30 days; Best Buy honors 14).
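The verdict layer can be sketched as a small parse-and-memoize wrapper around the model call. This is an illustrative sketch, not the project's actual code: `parse_verdict`, `VerdictCache`, and the JSON reply shape are all assumptions.

```python
import json
import time

VERDICTS = {"BUY", "WAIT", "AVOID"}

def parse_verdict(raw: str) -> dict:
    """Parse the model's JSON reply into {action, confidence, reason};
    fall back to a low-confidence WAIT if the reply is malformed."""
    try:
        data = json.loads(raw)
        action = str(data.get("action", "")).upper()
        if action not in VERDICTS:
            raise ValueError(action)
        return {
            "action": action,
            "confidence": float(data.get("confidence", 0.0)),
            "reason": str(data.get("reason", "")),
        }
    except (json.JSONDecodeError, ValueError, TypeError):
        return {"action": "WAIT", "confidence": 0.0, "reason": "unparseable reply"}

class VerdictCache:
    """Per-SKU memo with a 1-hour TTL, matching the '1h cached' behavior
    described above."""

    def __init__(self, ttl_seconds: int = 3600):
        self.ttl = ttl_seconds
        self._store: dict[str, tuple[float, dict]] = {}

    def get_or_compute(self, sku: str, compute) -> dict:
        now = time.monotonic()
        hit = self._store.get(sku)
        if hit and now - hit[0] < self.ttl:
            return hit[1]          # fresh memo: skip the model call
        verdict = compute(sku)     # e.g. an Anthropic API call + parse_verdict
        self._store[sku] = (now, verdict)
        return verdict
```

In the real app the `compute` callable would wrap the Anthropic SDK call; here it is left abstract so the caching behavior is visible on its own.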

My Contributions

  • Designed and built the full stack: Next.js 15 / React 19 / Tailwind 4 frontend on Vercel, FastAPI + SQLAlchemy 2 async backend, Playwright scraper, Anthropic SDK integration
  • Built a 4-strategy scrape chain (`jsonld → og → site-specific → regex`) so retailer markup changes degrade gracefully instead of failing
  • Wrote a real-Chrome Playwright stealth path to bypass Cloudflare on B&H — `channel=chrome`, `navigator.webdriver` masking, and a homepage warm-up to acquire the `cf_clearance` cookie
  • Implemented the AI buy-signal with prompt caching on the system message and a 1-hour memo per SKU
  • Built the affiliate-tag rewriter so every outbound buy button is monetized at response time via env-driven tags — no DB column, no auth, no Stripe
  • Shipped the full app end-to-end in 90 minutes during the progsu Claude Code workshop on 2026-04-27
Stack

  • Frontend: Next.js 15, React 19, TypeScript, Tailwind 4, Recharts, deployed to Vercel
  • Backend: FastAPI, Python 3.11, SQLAlchemy 2 async, aiosqlite, managed by `uv`
  • Scraper: Playwright Python with headless real Chrome
  • AI: Anthropic SDK, `claude-haiku-4-5-20251001` with ephemeral prompt caching
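The 4-strategy scrape chain from the contributions above (`jsonld → og → site-specific → regex`) can be approximated like this. The regexes and function names are illustrative only; the real scraper presumably uses proper HTML parsing rather than regex-on-HTML.

```python
import json
import re
from typing import Callable, Optional

def from_jsonld(html: str) -> Optional[float]:
    """Strategy 1: schema.org Product JSON-LD blocks."""
    for block in re.findall(
        r'<script[^>]*type="application/ld\+json"[^>]*>(.*?)</script>', html, re.S
    ):
        try:
            data = json.loads(block)
        except json.JSONDecodeError:
            continue
        offers = data.get("offers") if isinstance(data, dict) else None
        if isinstance(offers, dict) and offers.get("price"):
            return float(offers["price"])
    return None

def from_og(html: str) -> Optional[float]:
    """Strategy 2: OpenGraph / product price meta tag."""
    m = re.search(
        r'<meta[^>]*property="(?:og|product):price:amount"[^>]*content="([\d.]+)"', html
    )
    return float(m.group(1)) if m else None

def from_regex(html: str) -> Optional[float]:
    """Strategy 4: last-resort dollar-amount regex."""
    m = re.search(r"\$\s?(\d{2,4}(?:\.\d{2})?)", html)
    return float(m.group(1)) if m else None

def extract_price(html: str, site_specific: Optional[Callable] = None) -> Optional[float]:
    """Run strategies in order; the first non-None price wins, so a retailer
    markup change degrades to the next strategy instead of failing outright."""
    chain = [from_jsonld, from_og]
    if site_specific:
        chain.append(site_specific)  # strategy 3: per-retailer selector logic
    chain.append(from_regex)
    for strategy in chain:
        price = strategy(html)
        if price is not None:
            return price
    return None
```

The ordering matters: structured sources (JSON-LD, OpenGraph) are the most reliable, so the brittle regex fallback only runs when everything richer has failed.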

Challenges & Tradeoffs

Challenge: B&H is behind Cloudflare and blocks bundled Chromium on sight, so the standard Playwright install couldn't see prices.

Solution: Switched to `channel=chrome` (real installed Chrome), masked `navigator.webdriver`, and added a homepage warm-up navigation to acquire the `cf_clearance` cookie before hitting product pages. The scraper now sees the same DOM a real user does.
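A minimal sketch of that stealth path using Playwright's sync API. The price selector is a placeholder, not B&H's real markup, and the function body is an approximation of the approach rather than the project's code.

```python
# Init script that hides the standard automation tell before any page JS runs.
MASK_WEBDRIVER = (
    "Object.defineProperty(navigator, 'webdriver', {get: () => undefined})"
)

def scrape_bh_price(product_url: str) -> str:
    """Fetch a B&H product page through the real installed Chrome.

    channel="chrome" launches system Chrome instead of Playwright's bundled
    Chromium; the homepage warm-up gives Cloudflare a chance to set the
    cf_clearance cookie before the product page is requested.
    """
    # Imported inside the function so the sketch loads without Playwright installed.
    from playwright.sync_api import sync_playwright

    with sync_playwright() as p:
        browser = p.chromium.launch(channel="chrome", headless=True)
        context = browser.new_context()
        context.add_init_script(MASK_WEBDRIVER)  # runs before any page script
        page = context.new_page()
        # Warm-up: let Cloudflare issue cf_clearance on the homepage first.
        page.goto("https://www.bhphotovideo.com/", wait_until="domcontentloaded")
        page.goto(product_url, wait_until="domcontentloaded")
        price = page.inner_text(".price-placeholder")  # placeholder selector
        browser.close()
        return price
```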

Challenge: "Same SKU" across retailers isn't actually the same — Newegg lists the CL30 variant, B&H lists CL36, both labeled "Corsair Vengeance 32GB DDR5-6000."

Tradeoff: For v1, treat them as one canonical SKU and surface the spread anyway — the user benefit (visible $100+ delta) outweighs the precision loss. Variant clustering with separate buy-signals per CL rating is on the v2 list.

Tradeoff: The backend stays local in production — only the frontend is deployed to Vercel. Workshop scope made that the right call; Fly.io for the Playwright runtime + Neon Postgres in place of SQLite is the obvious v2 deploy.