Case Study — Construction Industry

50 Hours.
5 Automated Systems.
One Transformation.

How Evan's List built a full-stack AI automation suite for a Texas construction company — from lead scraping to AI voice agents — turning manual processes into a hands-free growth engine.

50
Hours
5
Systems Built
1,867+
Leads Analyzed
7
Counties Covered

Before & After Evan's List

A side-by-side look at how operations changed in under two weeks.

Before

  • Manual lead discovery from county websites
  • No centralized CRM — leads in spreadsheets
  • Facebook messages going unanswered for days
  • No lead enrichment or contact data
  • Duplicate & messy CRM records
  • Zero automation — everything was manual
  • No performance analytics or visibility

After Evan's List

  • 7-county permit scraper runs daily at 6:15 AM
  • GoHighLevel CRM with automated lead pipeline
  • AI auto-responder handles FB messages 24/7
  • BatchData skip-trace enriches every lead
  • CRM cleaned: 125 bad records removed, 0 errors
  • End-to-end pipeline: scrape → enrich → CRM → call
  • Interactive dashboards with real-time metrics

The Automation Pipeline

Data flows automatically from government permit records to qualified phone conversations.

🏛️ County Permits (7 Texas Counties)
→ 🔍 Web Scraper (Daily @ 6:15 AM)
→ 📋 Data Enrichment (Skip-Trace API)
→ 📊 CRM Pipeline (GoHighLevel)
→ 🤖 AI Voice Agent (Automated Calls)
→ Qualified Lead (Ready for Quote)

5 Production Systems, 50 Hours

Every system is production-ready, documented, and running autonomously.

🏗️

7-County Lead Pipeline

Automated scraper pulls fresh permit applications from 7 Central Texas counties daily. Leads are cleaned, deduplicated, and pushed into the CRM with full contact enrichment via skip-trace APIs.

7 Counties
Runs Daily
3-Stage Pipeline
💬

Messenger Lead Intelligence

Deep analysis engine that processes Facebook Messenger conversations to extract leads, score engagement quality, map conversion funnels, and identify high-value message patterns that drive the most conversions.

1,867 Leads Analyzed
0–100 Engagement Score
8 KPI Dashboards
🤖

AI Auto-Responder

Intelligent chatbot monitors incoming Facebook messages 24/7 and runs a multi-turn conversation flow. Uses data-driven message templates derived from the highest-converting phrases found in lead analysis.

24/7 Active
4-Nudge Sequence
8 Smart Templates
📞

AI Voice Agent Server

Webhook middleware that connects the CRM to AI voice calling agents. When new permit leads arrive, the system queues them and has an AI agent call to gauge interest — updating pipeline stages automatically.

M–F Call Window
Auto Stage Updates
REST API + Admin
🧹

CRM Data Cleanup

Audited 1,269 CRM contacts, identified 7 data issues, fixed 4 pipeline bugs, and executed a multi-step cleanup — removing duplicates, fixing naming conventions, and tagging records for enrichment retry.

1,269 Contacts Audited
125 Records Cleaned
0 Errors

Real Results, Real Impact

Every metric comes directly from production systems we built and deployed.

⏱️
50
Hours to Delivery
From kickoff to production
📊
1,867
Leads Analyzed
FB Messenger conversations
🧹
1,269
CRM Records Audited
125 cleaned, 0 errors
🗺️
7
Counties Automated
Central Texas coverage
🔁
5
Systems Integrated
All running autonomously
🐛
4
Pipeline Bugs Fixed
In production codebase
🤖
24/7
AI Responder Uptime
Automated follow-up engine
📈
119
Duplicates Eliminated
Clean CRM, clean pipeline

Engagement Timeline

A week-by-week breakdown of what was built.

Week 1 — Discovery & Lead Intelligence

🔍 Messenger Lead Analysis Engine

Built a comprehensive analysis system that processed the client's full Facebook Messenger history — extracting leads, scoring engagement, identifying conversion patterns, and cross-referencing with CRM data.

Python · NLP Analysis · 1,867 Leads
Key deliverables:
  • Interactive HTML dashboard with conversion funnels, monthly ad efficiency trends, and lead-level filtering
  • Engagement scoring system (0–100) based on message volume, response cadence, and contact capture
  • CRM cross-reference identifying leads missing from GHL with phone/email (ready for import)
  • Message phrase analysis — identified top-converting opening lines with statistical significance
  • CRM-ready CSV exports with GHL match status and engagement scores
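The 0–100 engagement score described above can be sketched as a simple weighted blend. The weights, caps, and inputs below are illustrative assumptions for this case study, not the client's actual formula:

```python
def engagement_score(msg_count, avg_response_hours, has_phone, has_email):
    """Blend message volume, response cadence, and contact capture
    into a 0-100 score. All weights here are hypothetical."""
    volume = min(msg_count / 20, 1.0) * 40                 # up to 40 pts
    cadence = max(0.0, 1 - avg_response_hours / 48) * 35   # up to 35 pts
    contact = (15 if has_phone else 0) + (10 if has_email else 0)  # up to 25 pts
    return round(volume + cadence + contact)

engagement_score(10, 12.0, True, True)  # → 71
```

Because each component is capped, the score never exceeds 100, so downstream filters can rely on fixed thresholds when sorting hot leads from cold ones.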
Week 1–2 — Lead Pipeline

🏗️ 7-County Permit Scraper & Pipeline

Designed and built an end-to-end lead generation pipeline — scraping permit applications from 7 Texas counties, enriching leads with skip-trace data, and pushing to the CRM automatically.

Python · REST APIs · CI/CD
Key deliverables:
  • Multi-county web scraper pulling fresh permit applications from government databases
  • Skip-trace enrichment layer adding owner phone numbers and contact data to raw permits
  • CRM push script with intelligent deduplication (upsert by phone, email, or application number)
  • GitHub Actions CI running the full pipeline daily at 6:15 AM CST
  • Baseline diff system to only process net-new leads — no duplicates, no wasted API calls
  • Pipeline run summaries with full audit trail (JSON artifacts for every run)
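The upsert-style deduplication in the CRM push step works roughly like the sketch below. The key precedence (phone, then email, then application number) comes from the bullet above; the in-memory dictionary stands in for the real GoHighLevel lookup, which is an assumption of this sketch:

```python
def upsert_key(lead):
    """Pick the first available identifier, in priority order."""
    for field in ("phone", "email", "application_number"):
        if lead.get(field):
            return f"{field}:{lead[field]}"
    return None  # lead has no usable identifier

def dedupe(existing, fresh):
    """Return only net-new leads; repeats and unkeyable records are skipped."""
    net_new = []
    for lead in fresh:
        key = upsert_key(lead)
        if key is None or key in existing:
            continue  # already in the CRM (or no key to match on)
        existing[key] = lead
        net_new.append(lead)
    return net_new
```

The baseline diff follows the same idea: persist the set of seen keys between runs so each 6:15 AM job only processes leads it has never enriched before.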
Week 2 — AI & Automation

🤖 AI Auto-Responder & Voice Agent

Built a data-driven AI auto-responder for Facebook Messenger with a 4-nudge follow-up sequence, plus a webhook server for AI voice agents to call new leads during business hours.

AI/ML · Node.js · Python
Auto-Responder highlights:
  • 8 message templates modeled from the highest-converting phrases in the lead analysis
  • Data-driven nudge timing: 4hr → Day 2 → Day 5 → Day 7, backed by response window analysis
  • Smart tagging system (bot-active / auto-qualified / auto-no-reply) so team knows lead status at a glance
  • Yes/No/Unclear keyword classification routes leads through different qualification paths
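The Yes/No/Unclear routing can be sketched as a small keyword classifier. The keyword sets below are assumptions for illustration; the production responder's actual lists and tag names are not shown here:

```python
import re

YES_WORDS = {"yes", "yeah", "sure", "interested", "ok", "okay"}
NO_WORDS = {"no", "nope", "stop", "unsubscribe"}

def classify_reply(text):
    """Route a Messenger reply down the yes / no / unclear path."""
    tokens = set(re.findall(r"[a-z]+", text.lower()))
    if "not" in tokens and "interested" in tokens:
        return "no"  # "not interested" must override the YES keyword
    if tokens & YES_WORDS:
        return "yes"
    if tokens & NO_WORDS:
        return "no"
    return "unclear"

classify_reply("Yes, please!")    # → "yes"
classify_reply("Not interested")  # → "no"
```

Anything classified as unclear falls through to the nudge sequence rather than being tagged qualified or closed.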
Voice Agent Server highlights:
  • Express.js middleware with SQLite queue — leads auto-queued when they hit the CRM
  • Smart scheduling: calls M–F, 9 AM–5 PM Central only
  • Adapter pattern for swappable voice AI providers (no code changes to switch)
  • Admin endpoints for queue monitoring, call history, and live stats
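The smart-scheduling rule amounts to a guard like the one below. The production server is Express.js; this Python sketch only mirrors its Mon–Fri, 9 AM–5 PM Central logic:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

CENTRAL = ZoneInfo("America/Chicago")

def in_call_window(now=None):
    """True only Monday-Friday, 9:00-16:59 US Central time."""
    now = now or datetime.now(CENTRAL)
    local = now.astimezone(CENTRAL)  # normalize whatever zone we were given
    return local.weekday() < 5 and 9 <= local.hour < 17

# Tue 2024-01-02 at 10 AM Central falls inside the window
in_call_window(datetime(2024, 1, 2, 10, tzinfo=CENTRAL))  # → True
```

One design implication: leads that arrive outside the window can simply wait in the SQLite queue until it reopens, so nothing is dropped overnight or on weekends.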
Week 2 — Data Quality & Cleanup

🧹 CRM Audit & Data Cleanup

Full audit of 1,269 CRM contacts, identifying 7 data quality issues. Built and executed a safe, step-by-step cleanup script with dry-run support — zero errors across all operations.

0 Errors · Python · CRM API
Operations performed:
  • Tagged 154 unenriched contacts for enrichment retry (154/154, 0 errors)
  • Deleted 6 test/example contacts (6/6, 0 errors)
  • Renamed 154 placeholder contacts to proper naming convention (154/154, 0 errors)
  • Deduplicated contacts by address — removed 119 duplicates, kept 25 best records (0 errors)
  • Fixed 4 pipeline bugs causing duplicates and inconsistent tagging
  • Every step had dry-run verification before live execution
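The dry-run-then-execute pattern behind every cleanup step fits in a few lines. `apply_fn` is a placeholder for the real CRM API call, which is not shown:

```python
def run_cleanup(ops, apply_fn, dry_run=True):
    """Log each planned operation; mutate only when dry_run is False."""
    results = []
    for op in ops:
        if dry_run:
            results.append(("DRY-RUN", op))  # preview only, no side effects
        else:
            apply_fn(op)                     # the real (irreversible) call
            results.append(("APPLIED", op))
    return results
```

Running with `dry_run=True` first produces a reviewable plan; the live run then replays the identical operation list, which is what makes a zero-error execution verifiable in advance.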
Week 2 — Dashboards & Handoff

📊 Interactive Dashboards & Documentation

Built client-facing analytics dashboards with dark/light themes, animated system diagrams, interactive county maps, and comprehensive documentation for every system.

HTML/CSS/JS · SVG Maps · Documentation
Dashboard features:
  • Executive overview dashboard with system status indicators
  • Permit lead automation dashboard with animated pipeline flow diagram
  • Interactive SVG map of all 7 monitored Texas counties
  • Messenger lead analysis report with sortable tables and conversion funnels
  • Dark/light theme toggle with brand color matching
  • Self-contained HTML — zero build step, zero external dependencies
  • Full handoff documentation for every project with resume checklists

Tech Stack

Modern, maintainable tooling chosen for reliability and ease of handoff.

🐍 Python 3 · 🟢 Node.js / Express.js · 🗄️ SQLite · 📊 GoHighLevel API · 🔍 BatchData API · 🏛️ MGO Connect API · 🤖 AI Voice APIs · ⚙️ GitHub Actions · ☁️ Cloudflare Pages · 📱 Facebook Messenger API · 📋 REST / Webhooks · 📈 HTML/CSS/JS Dashboards · 🧠 Claude Opus 4.6 · 💻 VS Code · 🔒 Environment-based Secrets

Why This Worked

This engagement delivered 5 production systems in 50 hours because we prioritize building things that run themselves. Every pipeline, responder, and scraper was designed to work autonomously from day one — not as a demo or prototype, but as production infrastructure the team uses every single day.

— The Evan's List Approach


What 50 Hours Included

📋

Full Source Code

Every script, server, and dashboard deployed to the client's own GitHub repo. No vendor lock-in — they own everything.

📖

Complete Documentation

README files, handoff guides, and resume checklists for every project. Any developer can pick up where we left off.

🚀

CI/CD Deployment

GitHub Actions running daily, webhook servers deployed, dashboards published. Everything is live, not just development-ready.

🔧

Bug Fixes & QA

Found and fixed 4 production bugs in the existing pipeline. Every cleanup operation verified with dry-runs before execution.

Ready to automate your business?

Evan's List pairs businesses with elite AI engineers who ship production systems — not slide decks.