The Complete History of iOS Development (So Far)
I downloaded the first iPhone SDK the day it was available. March 6, 2008. There was no App Store yet - that wouldn't come until July. There was no documentation worth reading, no Stack Overflow answers, no tutorials, no community. I don't think I'd ever even heard of Objective-C before this. But none of that mattered. I just needed to be part of whatever this new thing was going to become.
What I didn't know - what none of us knew - was how many times Apple would tear up the foundation and make us rebuild. Not once. Not twice. Every few years, sometimes more often, Apple ships something that fundamentally changes how you build iOS apps. Sometimes it's a new language. Sometimes it's a new framework. Sometimes it's a design philosophy that renders every existing app obsolete overnight. And every single time, you adapt or you're done.
I've adapted every single time. Here's what that actually felt like.
Retain, Release, and Pray
The original SDK was Objective-C with manual reference counting. If you've only ever written Swift, you cannot appreciate how much of your brain was consumed by memory management. Every object you created, you owned. Every object you owned, you had to release. Every object someone else owned that you wanted to keep, you had to retain. Get it wrong and your app leaked memory until it was killed. Get it wrong the other way and you'd access a deallocated object - a zombie - and crash with a stack trace that told you nothing useful.
You'd spend hours in Instruments hunting for the one retain that didn't have a matching release. You'd write dealloc methods longer than the actual logic of the class. This was just the tax you paid for writing any code at all.
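Swift still exposes that old ownership vocabulary through the standard library's `Unmanaged` wrapper, which makes it a convenient way to demonstrate the rules ARC now enforces for you. This is a sketch, not how you'd write Swift today; the `Connection` class is purely illustrative:

```swift
// A stand-in class for illustration.
final class Connection {
    let name: String
    init(name: String) { self.name = name }
}

// The old rules, spelled out with Unmanaged:
let owned = Unmanaged.passRetained(Connection(name: "db"))  // +1: we created it, we own it
let keeper = owned.retain()        // +2: wanting to keep it meant retaining it
keeper.release()                   // every retain needed a matching release
let connection = owned.takeRetainedValue()  // hand our remaining +1 back to ARC
assert(connection.name == "db")
```

Miss the `release` and the object leaks; release one time too many and you get the zombie.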
I got good at it. We all did, the ones who stuck around. You developed an instinct for retain cycles, for ownership semantics, for when to copy versus when to retain. It became second nature. And then Apple automated it.
ARC Changed Everything and Nothing
Automatic Reference Counting shipped with iOS 5 in 2011, and the reaction was split down the middle. Half of us were relieved. The other half were insulted.
I'd spent years mastering manual memory management. I could trace a retain cycle in my head. I'd earned that skill through pain and now it was being handed to everyone for free. But here's the thing - and this is a pattern that repeats with every paradigm shift - the skill didn't become worthless. It became context. I still understood what ARC was doing under the hood. I could still debug the retain cycles between closures and delegates that ARC couldn't magically solve. The people who'd never learned manual memory management just knew the compiler handled it now, and when it didn't, they were lost.
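The kind of cycle ARC can't break on its own is an object owning a closure that captures the object. A minimal sketch, with a hypothetical `Downloader` class; the `[weak self]` capture is the manual fix that still falls to you:

```swift
final class Downloader {
    static var deallocated = false
    var onFinish: (() -> Void)?

    deinit { Downloader.deallocated = true }

    func start() {
        // Without [weak self], self owns onFinish and onFinish owns self:
        // neither reference count ever reaches zero, and ARC can't help.
        onFinish = { [weak self] in
            guard let self = self else { return }
            _ = self  // use self safely here
        }
    }
}

var downloader: Downloader? = Downloader()
downloader?.start()
downloader = nil  // deinit runs only because the capture was weak
assert(Downloader.deallocated)
```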
Storyboards, or How Apple Tried to Kill My Workflow
I need to talk about Storyboards because my hatred for them is deeply personal.
Apple introduced Storyboards in iOS 5 and spent years aggressively pushing them as the default. Every WWDC sample project used them. Every tutorial used them. The project templates defaulted to them.
The problem is that Storyboards are opaque XML files that are impossible to review in a pull request, create merge conflicts that will ruin your afternoon, can't be meaningfully diffed, and encourage a monolithic approach to view controller management that does not scale. I watched teams spend more time resolving Storyboard merge conflicts than actually building features. I watched apps where the entire navigation flow was crammed into a single Storyboard file that crashed Interface Builder if you looked at it wrong.
I went back to nibs - individual xib files paired with code. It was the ideal middle ground: visual interface design without the monolithic nightmare. You could design a view visually, keep it scoped to a single screen or component, and still review your pull requests without decoding XML soup. Most experienced iOS developers came to the same conclusion eventually. Storyboards were a tool for demos, not for teams.
iOS 7: Flat App Theory
iOS 7 was the most visually jarring update Apple has ever shipped - until Liquid Glass, anyway. Jony Ive killed skeuomorphism overnight. The green felt of Game Center, the leather stitching in Calendar, the wood grain in Newsstand - gone, replaced by flat surfaces, thin fonts, and translucency.
And it wasn't just cosmetic. iOS 7 changed how navigation bars worked - content now scrolled underneath them by default, which broke the layout of essentially every existing app. You weren't just updating colors and icons. You were reworking view hierarchies, edge insets, and scroll behavior across entire codebases.
I reworked multiple apps simultaneously, against a deadline that Apple set and Apple didn't care if you met. That was my first experience with what I'd later recognize as Apple's standard operating procedure: announce a massive change at WWDC in June, ship it in September, and good luck.
Swift: A New Language, Whether You Were Ready or Not
Apple announced Swift at WWDC 2014. A new programming language. After decades of Objective-C.
My first reaction was resentment. Objective-C wasn't just a language I used - it was a language I thought in. The message passing, the dynamic runtime, the bracket syntax that looked insane to outsiders but felt like home. I'd spent years getting fluent in it, and now Apple was telling me that investment was effectively over. A lot of us felt that way. We'd done exactly what Apple asked - learned their platform, mastered their tools - and now the ground was shifting again.
And Swift wasn't ready. Not even close. Swift 1.0 was a proof of concept, not a production language. The tooling was buggy, the compile times were brutal, and the language was changing between point releases. Nobody serious was shipping Swift in production that first year. It wasn't until Swift 2.0 that it started to feel like something you could actually build on.
But once it got there, the truth was hard to argue with. Optionals forced you to think about nil in ways that Objective-C let you ignore. The type system caught entire categories of bugs at compile time that Objective-C would have let slip through to runtime. The resentment faded because the language was genuinely better, and I'd been doing this long enough to know that fighting the obvious is a waste of energy.
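The difference is easy to show. Objective-C let you message nil and silently get nothing back; Swift makes the possibility of absence part of the type, so the caller has to deal with it at compile time. A small sketch with a hypothetical port parser:

```swift
// Returning Int? forces every caller to confront the failure case.
func parsePort(_ text: String) -> Int? {
    guard let value = Int(text), (1...65535).contains(value) else {
        return nil  // no silent nil-messaging; the absence is in the type
    }
    return value
}

assert(parsePort("8080") == 8080)
assert(parsePort("99999") == nil)      // out of range
assert(parsePort("not a port") == nil) // not a number
```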
The Swift 3 Apocalypse, or, The Great Renaming
And then Swift 3 happened.
If you weren't writing Swift in 2016, you missed the single most painful migration in the history of iOS development. Swift 3 was a source-breaking release. Not "some things changed." The API naming conventions were completely overhauled. Method signatures you'd been writing for two years were renamed. The entire standard library was reorganized. Apple's migration tool handled maybe sixty percent of the changes. The other forty percent was you, manually fixing thousands of compiler errors.
I had projects where the migrator generated more errors than the original code had lines. And Swift was still changing after that. Swift 4 broke some more, and it wasn't until Swift 5 achieved ABI stability that you could finally write Swift with confidence that it wouldn't be obsolete in twelve months. The early adopters paid the price for Swift's evolution in blood and billable hours.
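The flavor of the Great Renaming is easy to reconstruct. Each pair below shows a Swift 2 spelling as a comment and its Swift 3 replacement; these are representative examples, not an exhaustive catalog:

```swift
import Foundation

var names = ["Ada"]
// Swift 2:  names.insert("Grace", atIndex: 0)
names.insert("Grace", at: 0)  // prepositions moved into argument labels
assert(names == ["Grace", "Ada"])

// Swift 2:  NSDate().timeIntervalSinceDate(start)
let start = Date()
let elapsed = Date().timeIntervalSince(start)  // redundant type names dropped
assert(elapsed >= 0)

// Swift 2:  "hello".stringByAppendingString(" world")
let greeting = "hello" + " world"  // NSString detours became native operations
assert(greeting == "hello world")
```

Multiply every pair like this across a real codebase and you get the thousands of compiler errors the migrator left behind.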
The Dependency Management Wars
This one doesn't get talked about enough.
CocoaPods was first. It worked, mostly, if you were willing to hand over your Xcode project file to a Ruby gem that would modify it in ways you couldn't predict and occasionally couldn't undo. When it broke during major Xcode updates - which it did pretty much every time - you were debugging Ruby internals instead of building your app.
Carthage came along as the "we won't touch your project file" alternative. Philosophically pure, practically miserable. Build times were brutal, framework search paths were a nightmare, and when Apple changed how frameworks worked, Carthage was always playing catch-up.
Swift Package Manager was Apple's answer - years late and initially undercooked, but it's the one that won because it's built into Xcode. SPM isn't perfect, but it works, it's integrated, and you don't need Ruby installed to use it.
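The whole configuration is one Swift file that Xcode reads directly. A minimal sketch of a `Package.swift` manifest, with a hypothetical app target; `swift-collections` is just a stand-in dependency:

```swift
// swift-tools-version:5.9
import PackageDescription

let package = Package(
    name: "MyApp",
    platforms: [.iOS(.v16)],
    dependencies: [
        // SPM resolves this against tagged versions; no Ruby,
        // no project-file surgery, no framework search paths.
        .package(url: "https://github.com/apple/swift-collections.git", from: "1.0.0"),
    ],
    targets: [
        .target(
            name: "MyApp",
            dependencies: [.product(name: "Collections", package: "swift-collections")]
        ),
    ]
)
```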
I've used all three in production. I've migrated projects between all three. I've watched developers who started after SPM became default have no idea how good they have it.
SwiftUI: The Great Reset
Apple announced SwiftUI at WWDC 2019 and the promise was enormous: declarative UI, live previews, one framework across all Apple platforms. The demos were beautiful. The reality was humbler.
And here was the resentment again, right on schedule. I'd spent years becoming a UIKit expert. Not just competent - expert. I knew every esoteric trick, every workaround, every undocumented behavior. I could fix layout bugs that made other developers give up and start over. I could make UIKit do things Apple never intended, and I'd earned every bit of that knowledge through years of shipping real apps. And now Apple was doing the same thing they did with Objective-C - taking the thing I'd mastered and telling me it was time to move on.
SwiftUI in its first year did not make that any easier. It was not production-ready for anything complex. Navigation was broken. Lists were slow. The data flow model changed between betas.
But the deeper problem wasn't the bugs. It was that SwiftUI required a completely different way of thinking. If you'd never done reactive programming before, it was like learning to be an iOS developer all over again. All those years of UIKit expertise, all that muscle memory for how views work, how layout works, how data flows through a screen - none of it transferred directly. The reactive paradigm is a fundamentally different mental model, and switching to it felt less like learning a new framework and more like switching platforms entirely.
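The shift can be sketched without SwiftUI at all. In the snippet below, which uses purely illustrative names, `render` plays the role of a SwiftUI `body` and `count` plays the role of `@State`: you never find a view and mutate it, you re-derive the whole description from state and let the framework diff it.

```swift
// A view "description" instead of a view object you mutate.
struct ViewDescription: Equatable {
    let text: String
    let buttonTitle: String
}

// A pure function of state - the declarative core of the new model.
func render(count: Int) -> ViewDescription {
    ViewDescription(text: "Count: \(count)", buttonTitle: "Increment")
}

var count = 0
assert(render(count: count).text == "Count: 0")
count += 1  // the "event handler" touches state, never the view
assert(render(count: count).text == "Count: 1")
```

In UIKit terms, the second assert would have been a `label.text = ...` somewhere; in the declarative model, the label's text is simply a consequence of `count`.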
And honestly? UIKit was and still is easier to work with in a lot of ways. You had direct control over your UI. You told a view what to do and it did it. With SwiftUI, you're describing what you want and hoping it figures out the right way to get there, and when it doesn't, the hoops you have to jump through to course-correct are absurd. A lot of us would have happily stayed in UIKit if the industry hadn't decided to move on.
But the industry did move on, because Apple said so. And when Apple says so, that's the end of the conversation. So I learned it. Not because it was obviously better the way Swift was obviously better. Because it was next, and staying current on this platform means going where Apple goes, whether you agree with the direction or not.
And SwiftUI is still only as good as the version you're allowed to deploy, which is tied directly to the minimum iOS version your product team will let you ship. If your app still supports iOS 15 or even 16, you're not writing modern SwiftUI - you're writing SwiftUI with one hand tied behind your back. The framework gets meaningfully better every year, but you only get to use those improvements when your deployment target catches up. It took five years from the announcement before SwiftUI was mature enough for a full production app without caveats. Apple doesn't tell you that at the keynote.
The Async Saga
Asynchronous code on iOS has been reinvented so many times that its history reads as a series of paradigm shifts in its own right.
In the beginning, you had NSThread and performSelectorOnMainThread: and a prayer. Then Grand Central Dispatch arrived and blocks changed everything - suddenly you could dispatch work to background queues without managing threads directly. GCD was elegant and powerful and also the single easiest way to create bugs that only reproduced one time in fifty. Race conditions, deadlocks, priority inversions - all the classic concurrency nightmares, now available in a convenient closure-based API.
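GCD's classic shape looked like this: fan work out to a background queue, then funnel results through one serial queue so the shared state isn't a data race. A minimal sketch; the serial `resultQueue` is the discipline that kept examples like this correct, and dropping it is exactly how those one-in-fifty bugs happened:

```swift
import Dispatch

let resultQueue = DispatchQueue(label: "results")  // serial: our lock substitute
let group = DispatchGroup()
var total = 0

for i in 1...5 {
    DispatchQueue.global().async(group: group) {
        let square = i * i  // "background work"
        resultQueue.async(group: group) {
            total += square  // serialized write; unsynchronized, this is a race
        }
    }
}

group.wait()  // blocks until every enqueued block, inner and outer, has run
assert(total == 55)
```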
Combine showed up alongside SwiftUI in 2019 as Apple's reactive framework. Publishers, subscribers, operators, cancellables - a whole new mental model for handling asynchronous data streams. It was powerful for the right problems and wildly overengineered for the wrong ones. I watched teams wrap a single network call in a Combine pipeline with six operators when a completion handler would have been three lines.
Then Swift 5.5 introduced async/await and structured concurrency, and suddenly Combine looked like a transitional technology. Actors, sendable types, task groups - the language itself now had opinions about how concurrency should work. The model was genuinely good. The migration was genuinely painful. Rewriting callback-based code to async/await isn't just a syntax change - it's rethinking control flow.
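The same fan-out that took a queue, a group, and a serialization discipline under GCD collapses into a task group where the results come back typed and the control flow reads top to bottom. A sketch, with a hypothetical `square` standing in for real work:

```swift
func square(_ n: Int) async -> Int { n * n }

func sumOfSquares(upTo limit: Int) async -> Int {
    await withTaskGroup(of: Int.self) { group in
        for i in 1...limit {
            group.addTask { await square(i) }  // structured child tasks
        }
        // No shared mutable state, no manual synchronization:
        // results are consumed as a typed async sequence.
        return await group.reduce(0, +)
    }
}

let total = await sumOfSquares(upTo: 5)
assert(total == 55)
```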
And now Swift 6 strict concurrency is here, and it's a mess. The compiler is right about everything and helpful about nothing. You turn on strict concurrency checking and your project lights up with warnings about sendability violations that are technically correct but practically incomprehensible. The error messages read like they were written for the compiler team, not for the developer staring at them. And let's not even get into "approachable concurrency" - which is about as approachable as a porcupine dipped in poison. Half the community has turned strict concurrency on and is fighting through it. The other half is waiting for the dust to settle. I don't blame either camp. Apple shipped the right idea with the wrong developer experience, and they'll probably fix it in two years, the same way they fix everything - slowly, and after we've already done the hard part ourselves.
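The typical shape of those fixes, once you decode the diagnostics, is replacing shared mutable state with an actor that serializes access for you. A minimal sketch with a hypothetical cache; under strict checking, the plain-class version of this is exactly the kind of thing the compiler flags:

```swift
// An actor instead of a class with a dictionary and a lock (or no lock).
// The compiler now knows access is serialized, so the sendability
// diagnostics go away - at the cost of every call site becoming async.
actor Cache {
    private var storage: [String: Int] = [:]

    func set(_ value: Int, for key: String) { storage[key] = value }
    func value(for key: String) -> Int? { storage[key] }
}

let cache = Cache()
await cache.set(42, for: "answer")
let hit = await cache.value(for: "answer")
assert(hit == 42)
```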
Liquid Glass: Here We Go Again
And then at WWDC 2025, Apple did it again. Liquid Glass is the biggest visual overhaul since iOS 7, maybe bigger. Every surface, every control, every navigation pattern has a new material language. It's beautiful and it means every custom UI component you've built needs to be reconsidered.
I watched the keynote and had the same feeling I had in 2013. Here we go again. Every app is going to look dated. Every client is going to want an update. The timeline is going to be aggressive because Apple's timeline is always aggressive.
But this time I'm not anxious about it. I've done this before. I've done this so many times that the process is familiar: watch the sessions, download the beta, start a branch, figure out what breaks, figure out what's new, figure out what's better. Same cycle, different details.
And After All of That
Here's what people don't talk about when they talk about iOS development: it demands continuing education in a way that most platforms don't. A web developer can write JavaScript the same way they wrote it five years ago and still ship. A backend developer can run the same framework for a decade and never be forced to change. iOS doesn't work like that. On this platform, what's current isn't determined by you, or by the community, or by market trends. It's determined by Apple. And when Apple moves, you move, or your skills start expiring. It's the closest thing in software to a doctor's obligation to stay current - except instead of a medical board enforcing it, it's Cupertino.
I've met that obligation every time. I've manually managed memory, survived three major language transitions, rebuilt UIs for two complete design overhauls, navigated the dependency management wars, relearned async patterns three times, and shipped production apps on every version of every framework Apple has offered. Not because I had to. Because this is what I chose, and I keep choosing it.
I've literally forgotten more about iOS development than most current iOS developers know. That's not a boast - it's just math. When you've been doing something for this long, through this many changes, the sheer volume of knowledge you've accumulated and then replaced is enormous.
Every paradigm shift felt like the end of something. And every single time, it wasn't the end. It was a layer. The new thing built on the bones of the old thing, and understanding the old thing made you better at the new thing.
That's what experience is. Not just knowing the current tools, but knowing every tool that came before them and why they were replaced.
That's not something you can shortcut. It's not something you can tutorial your way into. It's what happens when you show up, every year, for nearly two decades, and do the work.