
Reflections on WWDC08

Now that WWDC08 is over, I thought I'd take a little time to look back at what happened from my point of view in terms of new information. And no, I won't go into detail about how, free Odwalla aside, the food provided there was frankly quite shitty.

Garbage Collection
I can't remember a single session I attended at WWDC08 that didn't have at least a slide or a mention of GC – it was everywhere. Apple seems to be pushing this big time in the yet-to-be-released Snow Leopard and wants developers to adopt it. However, GC is not the panacea here that it kind of is in Java.

The first factor to consider is the "opt-in" nature of GC on OS X: any application can choose to run in GC or non-GC mode, and regardless of which mode it chooses, any frameworks it uses and any plug-ins that hook into it must also be capable of running in either mode as required. This means that framework and plug-in writers have to deliver a single code base – and a single binary – that supports both GC and non-GC. The Apple frameworks already do this.
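To make that concrete, here's a sketch of my own (not lifted from any session) of how a dual-mode framework class might cope – compiled with -fobjc-gc so it can load into either kind of host, and asking Foundation's NSGarbageCollector at runtime which mode it ended up in. MyCache and its ivar are made-up names:

    #import <Foundation/Foundation.h>

    // A framework class built with -fobjc-gc (dual mode) that needs to know,
    // at runtime, which mode the host application chose.
    @interface MyCache : NSObject {
        id _cachedObject;
    }
    - (void)cacheObject:(id)object;
    @end

    @implementation MyCache
    - (void)cacheObject:(id)object
    {
        if ([NSGarbageCollector defaultCollector] != nil) {
            // Collected host: just keep the reference, the collector
            // takes care of the old value.
            _cachedObject = object;
        } else {
            // Retain/release host: classic ownership rules still apply.
            [_cachedObject release];
            _cachedObject = [object retain];
        }
    }
    - (void)dealloc
    {
        // Only ever called in the non-GC case; GC uses -finalize instead.
        [_cachedObject release];
        [super dealloc];
    }
    @end

Under GC, retain and release are no-ops anyway, so the non-GC branch is mostly about keeping the ownership rules honest for retain/release hosts – the point being that one binary has to behave correctly in both worlds.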

The second factor to consider is that you can't simply take every create-something-that-needs-to-be-freed-later call and omit the corresponding 'free'. GC applications do eliminate a lot of tedious memory management, but they aren't simply programs with the memory management calls deleted – or at least not in Objective-C 2 – and getting an existing application moved over to GC might not be without its bugs (there can, ironically, even be memory leaks with GC on where there weren't any with it off).
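Here's one example of the kind of porting bug I mean – my own illustration, not something Apple showed: the collector only scans memory it knows about, so stashing the only reference to an object in a plain malloc'd struct, which was perfectly safe under retain/release, no longer is.

    #import <Foundation/Foundation.h>
    #include <stdlib.h>

    typedef struct {
        id object;   // an Objective-C object pointer stored in a C struct
    } Holder;

    Holder *MakeHolderBuggy(void)
    {
        Holder *h = malloc(sizeof(Holder));
        // Fine under retain/release: alloc gives us an owned reference.
        // Buggy under GC: the collector does not scan malloc'd memory, so once
        // no scanned location points at the string, it can be collected even
        // though the struct still "holds" it.
        h->object = [[NSMutableString alloc] initWithString:@"oops"];
        return h;
    }

    Holder *MakeHolderCollected(void)
    {
        // One fix: allocate the struct itself from the collected, scanned heap
        // so the collector can see the pointer stored inside it.
        Holder *h = NSAllocateCollectable(sizeof(Holder), NSScannedOption);
        h->object = [[NSMutableString alloc] initWithString:@"better"];
        return h;   // no free() needed either; the collector reclaims the struct
    }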

Thirdly, even Apple admits that there are certain applications which should not be GC'd because they need the assurance that nothing else is running that could slow them down for even a millisecond, and should therefore manage their own memory in their own complicated ways. In fact, Apple itself hasn't yet ported most of its applications to GC; one of the few it has is Xcode 3.

Finally, GC in Objective-C 2.0 is optimized for Objective-C objects, which means that, unlike porting pure Objective-C code over to GC, porting a C/C++ code base might prove impractical or even impossible. Even in GC mode, for example, any memory allocated with malloc still needs to be released with free, and every CFRetain call needs to be matched with a CFRelease – the application maintains a separate malloc zone alongside the GC zone on the heap.
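A quick illustration of my own of what stays manual even with the collector running (the function name and strings are just for show):

    #import <Foundation/Foundation.h>
    #include <stdlib.h>

    void StillManualEvenUnderGC(void)
    {
        // Plain C allocations live in the malloc zone, not the GC zone,
        // so free() is still mandatory whether GC is on or off.
        char *buffer = malloc(1024);
        /* ... use buffer ... */
        free(buffer);

        // Core Foundation objects still follow the CF ownership rules:
        // every Create/CFRetain must be balanced by a CFRelease.
        CFStringRef name = CFStringCreateWithCString(kCFAllocatorDefault,
                                                     "WWDC08",
                                                     kCFStringEncodingUTF8);
        /* ... use name ... */
        CFRelease(name);

        // Or hand a CF object over to the collector, after which it behaves
        // like any other collected Objective-C object (GC builds only).
        NSString *collected =
            NSMakeCollectable(CFStringCreateWithCString(kCFAllocatorDefault,
                                                        "Snow Leopard",
                                                        kCFStringEncodingUTF8));
        NSLog(@"%lu characters", (unsigned long)[collected length]);
    }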

However, while this probably keeps large developers with giant applications (such as Adobe or Microsoft) from switching to GC for many years to come (hence Apple making it opt-in, I feel), GC is a great boon both to new developers, who don't need to worry about memory management from the get-go, and to existing developers with comparatively smaller code bases, who can transition to GC in their next release cycle. And Apple is providing some magnificent incentives to do so: GC in Snow Leopard is going to be six to ten times faster, not only compared to Leopard's GC but also to traditional malloc. That's a concrete incentive to move code over, on top of the cleaner, more manageable code you get. From my point of view, this is as good as it gets, because I get to continue along the path of not writing memory management code while my already-fast application receives a free speed boost when Snow Leopard ships some time in 2009.

Multithreading
Spawning multiple threads to get work done in parallel is something programmers have been able to do for many years now. However, to achieve it they have to significantly alter the structure of their existing single-threaded code base, and endure much toil while doing so. What is Apple doing about this? Apple is not providing a magic pill for one of the very basic problems of multithreading, i.e., deadlocks; it is, however, making multithreading a lot easier and more accessible in Snow Leopard, where developers can designate certain bits of code to run inside "blocks". By putting some code inside a block, the developer tells the OS that that particular piece of code can be run on its own thread if the need arises. Then, when the application runs, Apple's new "Grand Central Dispatch" runtime decides whether it's economically viable to spawn new threads to run that code (based on system load and the number of cores on the machine); if it is, it creates the new threads, and it cuts them back when it needs to. This basically means that developers never have to bother with manually spawning threads and can instead just make their code thread-compatible and let the OS take care of squeezing the most performance out of it.
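The dispatch calls below are my reconstruction of what was shown in the sessions – Snow Leopard is still pre-release, so treat the exact spellings as a guess rather than gospel – but the shape of it is roughly this:

    #include <dispatch/dispatch.h>
    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        // The block marks this chunk of work as safe to run off the main
        // thread; whether a new thread is actually spawned, or an existing
        // one reused, is Grand Central's call, based on load and core count.
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            printf("crunching away on whatever thread GCD handed us\n");
            exit(0);    // a real app would dispatch back to the main queue instead
        });

        dispatch_main();    // park the main thread and let GCD run the block
        return 0;
    }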

Rosy picture aside, the two major caveats with blocks seem to be that (a) you still have to take care of deadlocks and the like in your own code, and (b) the syntax for declaring and using blocks looks ugly as fuck. If you've seen function pointer syntax in C++ and shudder at the thought of ever writing another one, think that, except worse.
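To show what I mean, here's a plain C function pointer next to the equivalent block – a trivial example of my own, nothing Apple-specific beyond the caret:

    #include <stdio.h>

    static int add(int a, int b) { return a + b; }

    int main(void)
    {
        // A plain C function pointer: takes two ints, returns an int.
        int (*add_fp)(int, int) = add;

        // The equivalent block: same shape, ^ instead of *, except the body
        // is written inline and can capture surrounding variables (like bonus).
        int bonus = 10;
        int (^add_blk)(int, int) = ^(int a, int b) { return a + b + bonus; };

        printf("%d %d\n", add_fp(1, 2), add_blk(1, 2));  // prints "3 13"
        return 0;
    }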

Deeper into Objective-C
This is just personal experience from going to WWDC for the first time, but it's like I've discovered a whole hierarchy of programming interfaces beyond the "NS" frameworks that were basically my whole world until last Sunday. I find that I now understand a lot better how the Mac OS X programming stack works, how many layers there are, and where the boundaries lie between Cocoa animation, Core Animation, OpenGL, etc. Even though I won't be using much of the API beyond the NS-level classes, I'm more at peace knowing what's going on underneath. Also, my definition of what counts as "easy and straightforward" has been tickled a little, because people at WWDC throw more into that category than I normally would.
