On Concurrency

Think of a common task across the iOS ecosystem: a network call. The user launches your app and you want to show them the current weather. The following is a naïve, linear implementation.
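The original listing isn’t reproduced here, but a minimal sketch of such a synchronous implementation might look like the following (the URL, the `Weather` type, and the function name are placeholders, and error handling is omitted for brevity):

```swift
import Foundation

struct Weather: Decodable {
    let temperature: Double
}

// 1. App Launch kicks this off
func showCurrentWeather() {
    // 2. Create URL request
    let url = URL(string: "https://example.com/weather")!

    // 3–4. Send the request and wait for the response.
    // This call BLOCKS the current thread until the server answers.
    let data = try! Data(contentsOf: url)

    // 5. Parse the response
    let weather = try! JSONDecoder().decode(Weather.self, from: data)

    // 6. Display It
    print("Current temperature: \(weather.temperature)")
}
```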

This code is linear and easy to follow. For example, in order to get to line 25 (the print statement), all the previous lines need to finish executing. It’s like a hallway with no doors – you just move forward.

If we were to draw the process, it would look something like this:

Linear execution is a relay race that runs from beginning to end and back again.

These people represent your algorithm, and each one needs something from the next one in order to proceed. A real-life example would be App Launch trying to bake a cake, but they need flour from Create URL request. Create URL request wants to give App Launch flour, but they need money to go and buy it from the store, so they ask Send request to server for money… you get the idea. App Launch has a chain of dependencies, so its total runtime will be equal to its own runtime + Create URL request’s runtime + all the other subdependencies’ runtimes.

Now, the people represent our naïve code implementation, but code needs to be executed somewhere. That somewhere is a Thread. Think of a Thread as the way your computer represents your processor’s cores: just like a String represents written characters, a Thread represents a processor’s core.

App Launch begins executing and they keep going until they realize they need flour. At that point, Create URL request begins executing, so where does App Launch go? They take a break and wait in the Stack. The Stack is sort of a history of what we’ve been doing.

And so our code continues executing until Display It begins executing. By that point, the Stack is filled with our previous, incomplete steps.

Everyone in the stack is waiting for Display It to finish. Once Display It finishes, Parse the response goes back to executing until it finishes, which frees up Receive response data from server. Repeat until App Launch is back on the Thread and finishes executing.

Doesn’t look like anything to me…

This would work perfectly in the real world if it weren’t for one simple detail: your app is waiting for everyone to finish… and somebody else is waiting too.

To complicate matters greatly, the UI is drawn 60 times per second. Imagine the blue person trying to get to their destination and not being able to because the road is blocked. This is the #1 sin in mobile development: freezing your UI.

Even if we let them pass, in between short steps they’re going to keep coming back again and again and again 60 times per second, so when we get to a long step like Receive response data from server, they’re going to have to wait way too long.

If we are to keep the UI constantly updated, we must make sure that every time Draw UI comes up, they have a Thread available to them.

Simple enough, right? How does that look in code?

Your good, old, run of the mill networking call.
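The listing isn’t embedded here, but a typical completion-handler version might look like this sketch (the URL and the `Weather` type are placeholders):

```swift
import Foundation

struct Weather: Decodable {
    let temperature: Double
}

func fetchWeather() {
    // Create URL request: quick, runs on the main thread
    let url = URL(string: "https://example.com/weather")!

    // Send request to server: also quick. The trailing closure holds
    // the slow steps and runs later, on a background queue.
    URLSession.shared.dataTask(with: url) { data, _, error in
        // Receive response data from server
        guard let data = data, error == nil else { return }

        // Parse the response
        guard let weather = try? JSONDecoder().decode(Weather.self, from: data) else { return }

        // Display It: anything that touches the UI goes back to the main queue
        DispatchQueue.main.async {
            print("Current temperature: \(weather.temperature)")
        }
    }.resume()

    // Execution continues here immediately; Draw UI stays unblocked.
}
```

Notice that `fetchWeather` returns right away: the closure runs later, when (and if) the server responds.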

Concurrency is hard

Before learning concurrency, you were absolutely certain that to get to line 25, all previous lines must have finished execution. Now that we’ve introduced Concurrency, this is no longer true.

In the above example, line 22 (the print statement) will execute after line 26.

To be honest, the rules never changed; you just learned a simplified version. It’s just like Newtonian physics: a great way to describe our world until you start approaching the speed of light, at which point it breaks down and stops making sense.

In this case, what’s happening is that we’re breaking our algorithm in two. Instead of having a single file of six people, we break the process up as needed. The people who can immediately help are split from those who need some time to finish their tasks.

The image above now separates each task based on how quickly it may finish. Blue tasks will finish quickly, so they go on the Main Thread. Green tasks may take a bit longer, so they go to the background thread.

As you can see, now that we have two threads, we need someone to help us with the queueing of tasks. In the Apple ecosystem, that someone is Grand Central Dispatch.

“It’s not that hard” – famous last words

GCD attempts to simplify the queueing of tasks by introducing exactly that: a queue. Instead of having you, the developer, handle moving tasks from one Thread to another, GCD does that for you. All you need to do is be explicit about the order of your tasks.
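As a minimal sketch of that idea (the `qos` level and the stand-in computation are arbitrary choices):

```swift
import Dispatch

// A stand-in for a slow step, such as waiting on the network.
func expensiveComputation() -> Int {
    (1...1_000_000).reduce(0, +)
}

// Hand the slow work to a background queue…
DispatchQueue.global(qos: .userInitiated).async {
    let result = expensiveComputation()

    // …then come back to the main queue for UI-facing work.
    DispatchQueue.main.async {
        print("Done: \(result)")
    }
}
```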

We’re ready to launch.

If we go back to the code, the order of immediate tasks should be clear. I’ve added a Draw UI at the end of it to pretend it’s our 1/60th of a second call. Let’s see what happens next.

App Launch goes to GCD and GCD sends it to the main thread

App Launch is coming directly from the OS. It means the user just tapped on your icon. Since it’s a Very Important™ task, GCD decides to execute it on the Main Thread.

Not very different from before.

We haven’t introduced Concurrency yet, so we’ll follow the steps of our linear example and send App Launch to wait in the Stack.

Now, Send request to server is where it gets interesting. Sending the request is very quick; waiting for the server to respond is not. Here’s where Concurrency comes into play. Send request to server uses GCD to enqueue its next steps by adding them to the background queue, all as a group.
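In code, that hand-off might look something like this sketch; `receiveResponseData` is a hypothetical helper named after the diagram’s steps:

```swift
import Foundation

func sendRequestToServer(_ request: URLRequest) {
    // Sending is quick. The closure below bundles the *next* steps
    // (Receive response data, Parse the response, Display It) as a
    // single block that GCD will run on a background queue once the
    // server replies.
    URLSession.shared.dataTask(with: request) { data, _, error in
        guard let data = data, error == nil else { return }
        receiveResponseData(data)
    }.resume()
    // Returns immediately: App Launch and Draw UI keep going.
}

// Hypothetical next step, named after the article’s diagram.
func receiveResponseData(_ data: Data) {
    // Parse the response, then Display It on the main queue…
}
```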

And then something funny happens… now that Send request to server is done, Create URL request and App Launch can keep going and Draw UI is free to execute. At the same time, Receive response data from server is now waiting for GCD to give the signal for it to go. Once the server responds, GCD will move it from the background queue to the background thread.

The Main Thread is now clear for as many Draw UI code blocks as needed.

When the server returns the data, then Receive response data from server begins executing…

And now everyone’s happy. Receive response data from server will continue executing, then Parse the response, then Display It. None of them will block Draw UI. We’ve managed to request information from a server while at the same time keeping the road free so we can always draw the UI.

If we could highlight our code in blue and green just like we did with the example above, it’d look like this.
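The highlighted listing isn’t reproduced here, but in plain text the split might look like this sketch, with BLUE/GREEN comments standing in for the colors (the URL and the `Weather` type are placeholders):

```swift
import Foundation

struct Weather: Decodable { let temperature: Double }

func appLaunch() {
    // BLUE: quick tasks, main thread
    let url = URL(string: "https://example.com/weather")!

    URLSession.shared.dataTask(with: url) { data, _, _ in
        // GREEN: slow tasks, background thread, run when the server responds
        guard let data = data,
              let weather = try? JSONDecoder().decode(Weather.self, from: data)
        else { return }

        DispatchQueue.main.async {
            // BLUE again: Display It happens back on the main thread
            print("Current temperature: \(weather.temperature)")
        }
    }.resume()

    // BLUE: Draw UI is free to run 60 times per second
}
```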

Hopefully now it becomes obvious why line 23 will execute after line 25. Blue tasks take precedence, so lines 1–7 will execute, then the process will skip to line 24 and finish executing on the main thread on line 29.

When (and if) the server returns a response, then the enqueued part of line 7 will begin executing and lines 7-23 will execute linearly.

Abandon hope all ye who enter here

Concurrency is an extremely simple concept to grasp: just do two things at the same time. However, it’s also a perfect example of a rabbit hole. Follow the White Rabbit down its hole into the strange world of concurrent programming and you’ll find yourself in a land full of bizarre execution issues: deadlocks, livelocks, crashes, and corrupted data. It can get confusing very quickly, because all code is executed linearly, just not with respect to other code.
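As one tiny example of how quickly things can go wrong, calling `sync` on a serial queue from a block that is already running on that same queue is a classic deadlock:

```swift
import Dispatch

let queue = DispatchQueue(label: "com.example.serial") // serial by default

queue.sync {
    print("first block runs fine")

    // Uncommenting the next line deadlocks: the outer block waits for
    // the inner one, and the serial queue waits for the outer block to
    // finish before it will start the inner one.
    // queue.sync { print("never runs") }
}
```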

Concurrency is also at the heart of iOS development. Knowing how to use Concurrency means you get to do things while the user keeps using the app: saving to disk, fetching data from an API, running a long operation. It is an absolutely essential skill that you must learn to get a job.

If you liked this article, please consider subscribing! I write a weekly newsletter with a 15-minute coding exercise, and additional sections like interviews from members of our community.