When Thuan Pham joined Uber in 2013, the company ran on a Node.js monolith that failed multiple times a week. Seven years later it operated across continents with thousands of services and one of the largest real‑time infrastructures ever built. In conversation with Gergely Orosz, Uber’s first CTO described how the engineering organization evolved—from constant outages to predictable global scale—and how similar lessons now guide his teams at Faire, where AI is reshaping development itself.
From Monolith to Microservices
When Pham arrived, Uber handled about 30,000 rides per day. The dispatch system—the core that matched drivers and riders—was single‑threaded and vertically scaled until even the fastest hardware could no longer keep up. Pham’s first requirement for the rewrite was brutally simple: a city must run on multiple boxes, and a box must serve multiple cities. That minimal constraint made horizontal scaling possible. Uber rebuilt dispatch in months, narrowly avoiding collapse.
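Taken literally, that constraint is easy to sketch. The toy Python below is an illustration, not Uber’s dispatch code (the city cells and worker names are invented): it splits each city into dispatch cells, spreads the cells round‑robin across a worker pool, and checks both halves of the requirement.

```python
# Illustrative sketch (not Uber's code) of the two-way scalability constraint:
# one city spans multiple workers, and one worker serves multiple cities.

from collections import defaultdict

def assign_cells(cities: dict[str, list[str]], workers: list[str]) -> dict[str, str]:
    """Round-robin every city's dispatch cells across the worker pool."""
    assignment = {}
    i = 0
    for city, cells in cities.items():
        for cell in cells:
            assignment[cell] = workers[i % len(workers)]
            i += 1
    return assignment

cities = {
    "sf": ["sf-1", "sf-2", "sf-3"],
    "nyc": ["nyc-1", "nyc-2", "nyc-3"],
}
workers = ["box-a", "box-b"]
placement = assign_cells(cities, workers)

# A city must run on multiple boxes...
sf_boxes = {placement[cell] for cell in cities["sf"]}
assert len(sf_boxes) > 1

# ...and a box must serve multiple cities.
cities_per_box = defaultdict(set)
for cell, box in placement.items():
    cities_per_box[box].add(cell.split("-")[0])
assert all(len(served) > 1 for served in cities_per_box.values())
```

Any placement scheme that preserves those two assertions lets capacity grow by adding boxes rather than by buying ever-bigger ones.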
The same survival pattern repeated across the stack. Each scale bottleneck—databases, API monolith, messaging—triggered a rewrite before the next growth wave hit. Time pressure, not architectural ideals, led the company toward microservices. Within two years, thousands of independently deployable components replaced the monolith, enabling teams to move at the speed the business demanded. Later, when growth stabilized, Uber consolidated again under projects such as ARC, grouping related services by domain to regain operational simplicity.
The Platform-and-Program Split
Long before microservices, Uber restructured its engineering organization. Functional silos—backend, mobile, frontend—were slowing delivery even at 100 engineers. Pham, Travis Kalanick, and Jeff Holden reorganized everything on a whiteboard with color‑coded sticky notes.
Each program team became cross‑functional and owned a business‑facing feature end‑to‑end. Platform teams built the shared infrastructure that let programs ship independently, in effect exposing internal APIs long before Uber’s systems offered external ones.
The split also created a professional trajectory: infrastructure specialists could dive deep, while product engineers iterated quickly. It was one of Uber’s earliest moves toward engineering leverage at scale.
Internal Tools by Necessity
Why did Uber build so many in‑house frameworks—Schemaless, TChannel, Ringpop, M3, Jaeger, and hundreds more? Because in 2013‑2015, few external systems could sustain that rate of change. PostgreSQL crashed under unpredictable load; support was nonexistent. Lacking vendors able to debug kernel‑level faults, Uber replaced core data layers with custom software under its own control.
This pattern wasn’t about not‑invented‑here syndrome—it was about deterministic operation. When every extra millisecond mapped to real‑world waiting time, owning the full stack became an existential requirement.
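Several of those tools—Ringpop most directly—are built on consistent hashing, which keeps ownership of keys deterministic as machines join and leave. A minimal sketch of the technique follows (illustrative only; real Ringpop layers SWIM membership, replica points, and request forwarding on top of the ring):

```python
# Minimal consistent-hash ring: the core idea behind Ringpop-style sharding.
# When a node joins, only a fraction of keys change owner.

import bisect
import hashlib

class Ring:
    def __init__(self, nodes, vnodes=64):
        # Each node gets many virtual points on the ring for even spread.
        self._points = []
        for node in nodes:
            for v in range(vnodes):
                self._points.append((self._hash(f"{node}#{v}"), node))
        self._points.sort()

    @staticmethod
    def _hash(key: str) -> int:
        return int.from_bytes(hashlib.md5(key.encode()).digest()[:8], "big")

    def owner(self, key: str) -> str:
        # The owner is the first virtual point at or after the key's hash.
        h = self._hash(key)
        idx = bisect.bisect(self._points, (h, "")) % len(self._points)
        return self._points[idx][1]

ring = Ring(["node-a", "node-b", "node-c"])
before = {k: ring.owner(k) for k in (f"trip-{i}" for i in range(1000))}

bigger = Ring(["node-a", "node-b", "node-c", "node-d"])
moved = sum(1 for k, old in before.items() if bigger.owner(k) != old)

# Adding one node reassigns roughly 1/4 of keys, not all of them.
assert moved < 500
```

That minimal-reshuffling property is what makes the approach workable for stateful services: a membership change does not force a cluster-wide data migration.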
Scaling People and Process
Technical architecture mirrored organizational design. Naming conventions and engineering culture matured as quickly as code. Pham once emailed the company reminding teams to ditch whimsical service names—“we are not a Mickey Mouse shop”—so newcomers could navigate thousands of repositories safely.
He also re‑graded senior levels, splitting “Senior Engineer” into L5A and L5B to smooth the path toward Staff. And when engineers complained that transfers were harder inside Uber than externally, he removed manager approval. Anyone could apply internally; leaders were incentivized to retain talent through growth, not policy.
For Pham, the CTO’s main job was twofold: maintain high talent density and “see around corners” 18‑24 months ahead. While teams solved near‑term issues, he anticipated the next systemic constraints—organizational, architectural, or both.
Lessons in Scale and Risk
Some of Uber’s wildest projects tested that forward‑looking discipline. Launching in China required a new data‑sovereign infrastructure within months. Pham’s team delivered by partitioning the architecture for strict data separation and phasing city launches—beginning with the hardest, Chengdu. Doing the toughest part first made everything else predictable.
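The pattern behind such a launch can be reduced to a few lines: pin each user’s records to a home region and refuse writes that would cross the boundary. This is an assumed simplification, not Uber’s actual China architecture; `RegionalRouter` and its in-memory stores are invented for illustration.

```python
# Assumed sketch of data-sovereign partitioning: records are pinned to a
# home region, and writes from other regions are rejected outright.

class RegionalRouter:
    def __init__(self, regions):
        self._stores = {r: {} for r in regions}   # region -> key/value store
        self._home = {}                           # user -> pinned home region

    def write(self, user: str, region: str, data: dict) -> None:
        # First write pins the user; later writes must match the pin.
        pinned = self._home.setdefault(user, region)
        if pinned != region:
            raise PermissionError(f"{user} data is pinned to {pinned}")
        self._stores[pinned][user] = data

    def read(self, user: str) -> dict:
        return self._stores[self._home[user]][user]

router = RegionalRouter(["cn", "global"])
router.write("rider-1", "cn", {"city": "chengdu"})

assert router.read("rider-1") == {"city": "chengdu"}
assert "rider-1" not in router._stores["global"]   # never leaves the region
```

Enforcing the boundary at the routing layer, rather than trusting every service to do the right thing, is what makes per-city phased launches tractable.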
The Helix mobile rewrite followed the same rhythm: a complete overhaul of the user experience, backend protocols, and real‑time pipeline done in less than a year. These projects established a cadence where “survive first, optimize later” became a rational engineering strategy.
Engineering in the Age of AI
Today at Faire, Pham applies those scaling instincts to AI‑driven development. His teams use orchestrated agent systems—“swarm coding”—to automate large code transformations and assist with feature work. Early adopters have doubled effective output within three months.
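Faire has not published how its swarm systems work, but the orchestration skeleton such a system needs is familiar: fan independent transformation tasks out across a repository, collect the results, and verify invariants before accepting the change set. In the hypothetical sketch below, a mechanical rename stands in for the agent call; `transform_file` and the repo contents are invented.

```python
# Hypothetical fan-out skeleton for large code transformations. In a real
# agent swarm, transform_file would invoke a model with the file contents
# and a task prompt; here a simple rename stands in for that call.

from concurrent.futures import ThreadPoolExecutor

def transform_file(name: str, source: str) -> tuple[str, str]:
    # Stand-in for one agent's unit of work on one file.
    return name, source.replace("old_api", "new_api")

repo = {
    "billing.py": "from lib import old_api\nold_api()\n",
    "trips.py": "x = 1\n",
}

with ThreadPoolExecutor(max_workers=8) as pool:
    results = dict(pool.map(lambda item: transform_file(*item), repo.items()))

# Verify invariants before accepting the change set.
assert all("old_api" not in src for src in results.values())
assert results["trips.py"] == repo["trips.py"]   # untouched files unchanged
```

The hard part Pham describes is not this fan-out but the verification step: proving that millions of lines of transformed production code still behave as before.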
The challenge, he says, isn’t generating green‑field prototypes but evolving legacy systems: applying AI safely to millions of lines of existing production code while preserving behavior and dependencies. AI raises the baseline—allowing non‑programmers to build usable software—but great engineers still stand out through the same traits as before AI: curiosity, fearlessness, and relentless learning. “Complacency is death,” Pham notes; the tools change, the mindset stays constant.
What Makes Great Teams Endure
Whether rewriting dispatch or rethinking AI workflows, Pham’s approach centers on people. Great engineers attract peers of similar caliber; assembling enough of them creates self‑sustaining culture. His worldwide network—from VMware to Uber to Faire—formed not through networking tactics but through consistent reputation: doing excellent work, treating colleagues well, and being someone others trust to call when systems are on fire.
For technical leaders and system architects, Uber’s evolution remains a case study in scaling under pressure: build minimal viable abstractions, restructure for autonomy, invent tools only when no alternative exists, and invest in culture as early as you invest in infrastructure.
AI may automate code, but it cannot automate judgment—the kind of judgment forged by shipping critical systems when failure wasn’t theoretical but hourly.