Mar 25, 2026 | 5 Minute Read

We Built an AI-First School by Treating Lovable as a Co-Developer — Here's What Actually Happened

On February 18, 2026, we shipped progressionschool.com. Twenty-plus pages. Razorpay payment processing with EMI subscriptions. A Zoho CRM pipeline that converts leads to contacts to deals without a single manual step. An AI chatbot that scrapes its own website nightly so it never gives stale answers. A voice agent that handles inbound sales calls. SEO and GEO infrastructure that serves Google’s crawler and ChatGPT’s indexer from the same codebase.

The entire thing was built on Lovable. Every line of code — React, TypeScript, Tailwind, edge functions, database migrations — was written by AI inside a single platform. No external IDE. No terminal. No deployment pipeline. No engineering team.

That last part is the one that matters. The person who built this is a marketer. Not a marketer who dabbles in code — a marketer who had never opened a terminal before this project. And what he shipped isn’t a prototype. It’s a production system processing real money.

This post is about what that experience actually looked like from an architecture perspective — what worked, what broke, and what it means for how we think about building software.


Why Lovable Won the Evaluation

We looked at several AI-coding and no-code platforms. The decision came down to four things Lovable offered simultaneously that nothing else did.

First, real code output. React plus TypeScript plus Tailwind — no proprietary runtime, no lock-in. The entire codebase is exportable at any point. Second, a full backend out of the box: database, authentication, serverless edge functions, file storage, and secrets management, all under what they call Lovable Cloud. Third, instant preview — every change renders in real time. Fourth, one-click publish with automatic backend deployment.

The mental model is simple: describe what you want, Lovable writes the code, see it live. But the surprising part was how far this model stretches before it breaks. We pushed it through payment webhook cryptographic verification, CRM API orchestration, and multi-tier course enrollment logic. It held.


The Architecture That Emerged

The system has three layers. A React frontend with 20-plus lazy-loaded routes. A Lovable Cloud backend running 10-plus edge functions with a production database. And integrations with Razorpay, Zoho CRM, Thinkific, ElevenLabs, and Firecrawl.

The frontend is a single-page application — great for UX, historically terrible for SEO. Rather than migrating to a server-rendered framework, we built a static content fallback shell embedded in index.html. Hidden from sighted users and screen readers, but immediately visible to crawlers that parse raw HTML. This resolved Google Search Console’s “low text-HTML ratio” warnings overnight and gave us a dual-indexing strategy: traditional SEO through Schema.org JSON-LD, and generative engine optimisation through a purpose-built llms.txt file that AI models can parse.

The backend is where Lovable was most surprising. Edge functions handle payment order creation, subscription management, webhook processing, form submissions, lead lifecycle syncing, the AI chatbot, voice call logging, and a knowledge base refresher that auto-scrapes the site nightly. Every one of these was written by describing business logic in natural language. The prompt for the payment webhook was: “When Razorpay sends a payment.captured event, verify the signature, find the lead in Zoho CRM, convert it to a Contact, create a Deal, and enroll them in Thinkific.” Lovable wrote the complete function — HMAC verification, API calls, error handling, all of it.


What Broke

The first real failure was hosting. Lovable provides a default deployment domain, but we needed the site on our custom domain, progressionschool.com. The domain was registered with GoDaddy and proxied through Cloudflare DNS. Lovable's hosting couldn't resolve the domain correctly through the Cloudflare proxy layer, a DNS conflict that neither Lovable's support nor standard troubleshooting could resolve cleanly. We migrated hosting to Render, kept Lovable as the development environment, and pointed the domain there. An afternoon of yak-shaving for what should have been a one-click operation.

The second failure was more interesting. Razorpay sends three separate webhook events for a single EMI payment: payment.authorized, payment.captured, and subscription.charged. Our first implementation processed all three, creating triple CRM entries for every installment. The fix required understanding Razorpay’s event model deeply enough to implement a deduplication guard — checking for subscription_id and invoice_id on payment.captured events and deferring to subscription.charged as the canonical event. When we described the problem to Lovable (“each EMI payment is creating three deals instead of one”), it analysed the event model and wrote the deduplication logic correctly.
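The dedup rule itself reduces to a small decision function: ignore authorization events outright, and skip any captured payment that carries subscription identifiers, since the same installment will also arrive as subscription.charged. A sketch of that logic (interface and function names are ours; field names follow Razorpay's payment entity shape):

```typescript
// One EMI installment triggers three webhook events:
// payment.authorized, payment.captured, and subscription.charged.
// We want exactly one CRM action per installment.
interface WebhookPayment {
  event: string;
  subscription_id?: string;
  invoice_id?: string;
}

export function shouldProcess(p: WebhookPayment): boolean {
  // Authorization alone means nothing was charged yet -- never act on it.
  if (p.event === "payment.authorized") return false;
  // A captured payment carrying subscription/invoice IDs will also arrive
  // as subscription.charged; defer to that canonical event.
  if (p.event === "payment.captured" && (p.subscription_id || p.invoice_id)) {
    return false;
  }
  // One-time captures and subscription.charged events each fire once.
  return p.event === "payment.captured" || p.event === "subscription.charged";
}
```

The effect: a one-time purchase is handled on payment.captured, while every EMI installment is handled exactly once, on subscription.charged.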

The third was an identity resolution edge case. A student’s parent pays via UPI. The payment entity carries the parent’s bank-registered phone and email — but the student’s details are in the Razorpay subscription notes. Without handling this, the CRM would track the wrong person. Lovable implemented an email priority chain — trusting subscription notes over payment entity data for subscription events, and disabling phone-based CRM lookup entirely for subscription payments.

These aren’t toy problems. They’re the kind of production edge cases that typically surface weeks after launch and require senior engineering time to diagnose. In each case, we described the business impact in plain language, and the AI partner produced the correct technical fix.


What This Actually Means for Architects

There is a specific claim floating around the industry right now: AI tools let non-engineers build software. The implication is that this is either revolutionary or dangerous, depending on who you ask.

Having watched this happen up close, I think both readings are wrong. What actually happened is more nuanced.

The person who built Progression School didn’t become an engineer. He became something more like an engineering manager who happens to have an infinitely patient, extremely fast direct report. He made architecture decisions — “prices should never live on the frontend,” “the CRM should have one Contact record per person across all interactions,” “the AI chatbot should refresh its own knowledge base so it never goes stale.” He defined the business logic, the edge cases, the data model. Lovable handled the implementation.

The quality of the output was directly proportional to the quality of the instructions. When the prompts were vague (“add a form”), the code was generic. When they were specific (“this form submits to a Zoho iframe, and on completion we detect the reload and trigger a payment flow”), the code was precise and production-ready.

This has a concrete implication for how we think about team composition. The bottleneck in most software projects isn’t writing code — it’s translating business intent into technical specification. If that translation can happen in natural language with an AI partner, the constraint shifts. You need people who understand the domain deeply enough to specify behaviour precisely. You need people who can evaluate whether the output is correct. You don’t necessarily need those people to write the code themselves.


The Honest Accounting

What took a traditional team of four or five specialists several months, one person built in weeks. But the quality bar was different too. There’s no test suite. There’s no CI/CD pipeline. Error handling exists where we thought to ask for it, but there’s no systematic coverage. The codebase is clean because Lovable writes clean code by default, but it wasn’t designed with long-term maintainability as a primary constraint.

For a product at this stage — validating market fit, processing initial revenue, iterating on the offering — this tradeoff is correct. The system works, it handles edge cases we’ve encountered, and it can be extended by continuing the same conversation-driven workflow.

For a system that needs to scale to thousands of concurrent users, handle compliance requirements, or survive a team handoff to engineers who didn’t build it — you’d need to layer in the engineering discipline that was intentionally skipped. Lovable gives you a running start. It doesn’t give you a finish line.


The Takeaway

The interesting question isn’t whether AI can write code. It obviously can. The interesting question is what happens to the people closest to the problem when the implementation barrier drops to near zero.

In our case, the person who understood the curriculum, the pricing psychology, the student journey, and the sales process — the person who would normally write a requirements doc and wait — built the system himself. The feedback loop between “I want this” and “it exists” collapsed from weeks to hours. That speed advantage compounds. Every insight about user behaviour could be acted on immediately, not queued in a backlog.

Lovable isn’t a replacement for engineering. It’s a redistribution of who gets to build. And for teams willing to rethink who “the builder” is, that’s a more significant shift than any framework migration or cloud platform switch.

We shipped. It works. That’s the whole story.

About the Author
Hardik Kumar Patel, Senior Software Engineer

Hardik is a big fan of sports and video games. Away from work, he spends time with his son, works out at the gym, hangs out with friends, and plays cricket. Bring up PUBG, PS4, or Call of Duty in conversation, and he'll be all ears.

