ARC vs. GC
What is Life Cycle Management?
Simply put, life cycle management is the part of a language that keeps track of how long an object will stick around ("be alive") in memory before it gets destroyed and its memory is released to be used by other objects. Keeping track of this, and efficiently disposing of objects that are no longer needed, is a crucial task: memory is a precious resource (on some systems, such as mobile devices, more than on others), and if too many objects stick around longer than necessary, the application (and eventually the entire computing system) will run out of memory to perform further operations.
The Olden Days
Before the introduction of modern object life cycle management, developers had to keep track of all the objects they created themselves, and make sure to explicitly release them when done. This could lead to unnecessary plumbing code at best, and to hard-to-maintain class structures at worst.
In Delphi, for example, all created objects need to be freed by explicitly calling their Free method. In many cases, this means writing try/finally blocks just to free local objects, and thinking hard (and documenting well) about ownership of objects returned from methods or contained in the fields of a class. In Objective-C, before the introduction of ARC, manual calls to retain and release were necessary to keep track of object ownership.
Both GC and ARC aim to take this burden off the developer, so that you no longer need to track reference counts, think about ownership or, indeed, manually free objects as they become unused. Both techniques do this in a rather transparent way that works similarly enough at the language level that you just do not need to think about object life cycle management at all when writing day-to-day code.
What Keeps an Object Alive
Object life cycle management is really about keeping track of whether an object is still needed by the application. Both GC and ARC simply do this by defining that an object is considered needed as long as there are references to it. Simply put: as long as some piece of code, any piece, is holding on to the object (and thus potentially is able to perform tasks with the object), the object is still needed. Once this ceases to be the case, the object can be released.
There are a few scenarios to consider:
- an object stored inside another object's field or property,
- local objects created within the current method (including those defined outside, but used inside, an anonymous method),
- objects passed into or out of method and function calls,
as well as, of course, combinations of the three.
In any of these three scenarios, the compiler, together with GC or ARC, will make sure that the object in question is kept around as long as it is needed. For example, if you store an object inside a field (or a property), that object will stick around, and the field will contain a valid reference to it, until the field is overwritten with a different reference or the object containing the field is freed itself. Similarly, if you declare a local variable inside a method and assign an object to it, you can be sure that the referenced object will be around for as long as the variable is in scope.
Of course all of these rules combine, so if the same object is stored in both a field and a local variable, it cannot be considered for release until both the field and the local variable have let go of the reference.
What this boils down to is that you can pretty much just take for granted that your objects stick around as long as you can access them, and that they will automatically be freed once no part of your code is using them anymore. The implementation details for how this is achieved with GC vs. ARC vary greatly though.
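This reachability rule can be observed in miniature without any of the platforms discussed here: CPython happens to manage object lifetimes via reference counting, so the following sketch (all class and variable names are made up for illustration) shows an object staying alive while a "field" reference exists, and going away as soon as the last reference is dropped.

```python
import weakref

class Widget:
    """A stand-in object whose lifetime we want to observe."""
    pass

def make_probe():
    w = Widget()
    probe = weakref.ref(w)       # observes w without keeping it alive
    holder = {"field": w}        # simulates a field holding a reference
    del w                        # the "local variable" reference goes away
    assert probe() is not None   # the field alone still keeps the object alive
    holder["field"] = None       # the last reference is dropped
    return probe

probe = make_probe()
```

In CPython (which, like ARC, counts references), `probe()` returns None immediately after the last reference is dropped; under a purely tracing collector, the object would linger until the next collection.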
Garbage Collection
Garbage Collection (or GC for short) is the technique used for life cycle management on the .NET and Java platforms. The way GC works is that the runtime (either the Common Language Runtime for .NET or the Java Runtime) has infrastructure in place that detects unused objects and object graphs in the background.
This happens at indeterminate intervals (either after a certain amount of time has passed, or when the runtime sees available memory getting low), so objects are not necessarily released at the exact moment they are no longer used.
Advantages of Garbage Collection
- GC can clean up entire object graphs, including retain cycles.
- GC happens in the background, so less memory management work is done as part of the regular application flow.
Disadvantages of Garbage Collection
- Because GC happens in the background, the exact time frame for object releases is undetermined.
- When a GC happens, other threads in the application may be temporarily put on hold.
Automatic Reference Counting
Automatic Reference Counting (ARC for short), as used on Cocoa, takes a different approach. Rather than having the runtime look for and dispose of unused objects in the background, the compiler will inject code into the executable that keeps track of object reference counts and will release objects as necessary, automatically. In essence, if you were to disassemble an executable compiled with ARC, it would look (conceptually) as if the developer had spent a lot of time meticulously keeping track of object life cycles when writing the code, except that all that hard work was done by the compiler.
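Conceptually, the bookkeeping the compiler injects looks something like the following toy sketch (all names here are hypothetical, and real ARC operates on raw memory, not on a wrapper class): every time a new reference is stored, a retain is emitted; every time a reference goes away, a release; and when the count hits zero, the object is destroyed on the spot.

```python
class RefCounted:
    """Toy model of the retain/release bookkeeping an ARC compiler injects."""
    def __init__(self, payload):
        self.payload = payload
        self.count = 1           # the creating reference
        self.freed = False

    def retain(self):            # injected where a new reference is stored
        self.count += 1

    def release(self):           # injected where a reference goes away
        self.count -= 1
        if self.count == 0:
            self.freed = True    # deterministic destruction, right here

obj = RefCounted("data")
obj.retain()     # e.g. the object is assigned to a field
obj.release()    # e.g. the field is overwritten later
assert not obj.freed             # the original reference still exists
obj.release()    # e.g. the original local goes out of scope
assert obj.freed                 # count reached zero: freed immediately
```

Note how destruction happens at an exact, predictable point in the code, which is precisely the determinism listed as ARC's main advantage above.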
Advantages of Automatic Reference Counting
- Real-time, deterministic destruction of objects as they become unused.
- No background processing, which makes it more efficient on lower-power systems, such as mobile devices.
Disadvantages of Automatic Reference Counting
- Cannot cope with retain cycles.
A so-called retain cycle happens when two (or more) objects reference each other, essentially keeping each other alive even after all external references to the objects have gone out of scope. Because Garbage Collection works by looking at "reachable" objects, it can handle retain cycles fine, and will discard entire object graphs that reference each other, if it detects that no outside references exist.
Because Automatic Reference Counting works on a lower level and manages life cycles based on reference counts, it cannot handle retain cycles automatically, and a retain cycle will cause objects to stay in memory, essentially causing the application to "leak" memory.
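Both behaviors can be demonstrated in one place, since CPython layers a tracing cycle collector on top of reference counting (the `Node` class below is made up for illustration): reference counting alone cannot reclaim the cycle, but a tracing collection sweeps it away.

```python
import gc
import weakref

class Node:
    def __init__(self):
        self.other = None        # will point back at the partner node

a, b = Node(), Node()
a.other, b.other = b, a          # a retain cycle: each keeps the other alive
probe = weakref.ref(a)

del a, b                         # all external references are gone
assert probe() is not None       # reference counting alone cannot free the cycle

gc.collect()                     # a tracing collector sees nothing reachable...
assert probe() is None           # ...and discards the whole object graph
```

Under pure ARC there is no equivalent of that `gc.collect()` step, which is why a retain cycle there leaks until the developer breaks it explicitly.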
ARC provides a mechanism to avoid retain cycles, but it does require some explicit thought and design by the developer. To achieve this, ARC introduces Storage Modifiers that can be applied to object references (such as fields or properties) to specify how the reference will behave. By default, references are strong, which means that they will behave as described above, and storing an object reference will force the object to stay alive until the reference is removed. Alternatively, a reference can be marked as weak. In this case, the reference will not keep the object alive; instead, if all other references to the stored object go away, the object will indeed be freed, and the reference will automatically be set to nil.
A common scenario is to establish a well-defined parent/child or owner/owned relationship between two objects that would otherwise introduce a retain cycle. The parent/owner will maintain a regular (strong) reference to the child, while the child or owned object will merely get a weak reference to the parent. This way, the parent can control the (minimum) lifetime of the child, but once the parent object becomes eligible to be freed, the weak references from the children won't keep it alive.
Of course the children or owned objects need to be implemented in a way that enables them to cope with the parent reference going nil (which would, for example, happen if an external direct reference to the child kept it alive, while the parent is destroyed). It would be up to the developer to determine how to handle such a scenario, depending on whether the child object is able to function without the parent or not.
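The pattern looks roughly like this sketch, which borrows Python's `weakref` as a stand-in for the weak storage modifier (the `Parent`/`Child` classes are invented for illustration; on Cocoa the back-reference would simply be declared weak):

```python
import weakref

class Child:
    def __init__(self, parent):
        # weak back-reference: does not keep the parent alive,
        # and reads as None once the parent has been freed
        self._parent = weakref.ref(parent)

    @property
    def parent(self):
        return self._parent()

class Parent:
    def __init__(self):
        self.child = Child(self)   # strong reference: the parent owns the child

p = Parent()
c = p.child
assert c.parent is p               # back-reference works while the parent lives
del p                              # the owner goes away; no cycle keeps it alive
assert c.parent is None            # the child sees a "nil" parent and must cope
```

The final line is exactly the scenario described above: the child outlives its parent, so its code has to handle a nil parent reference gracefully.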
The Storage Modifiers are only supported on Cocoa.
IDisposable & Finalizers
The .NET and Java frameworks provide the "Disposable" pattern that lets specific classes work around the non-deterministic deallocation of objects.
While for most classes deterministic deallocation is not crucial, there are some cases where it is, such as with classes that represent so-called "unmanaged resources", i.e. resources outside of the scope of the garbage collector. For example, a class might contain an open exclusive file handle, or a network connection. If such a class is no longer used, it is commonly desirable to have the unmanaged resource released immediately, e.g. have the file closed and its handle released, or the network connection shut down.
Because we cannot rely on the exact time for when an object will be deallocated under GC, the Disposable pattern provides a well-defined interface and method that can be called on an object to "dispose" it deterministically. Calling this method will not actually release the object (the GC will do that, as it does for all objects), but it will give the object a chance to "clean up" after itself, release any unmanaged resources and (typically) set an internal flag to indicate that it has been disposed.
On .NET, the interface for this is called IDisposable, and the single method is called Dispose. On Java, the pattern uses the Closeable interface, with a close method to be called.
For both platforms, Elements provides a using statement to work with an object and then have it closed/disposed as the block ends. (On Cocoa, the using statement works as a no-op, simply creating a local variable and letting ARC collect the object at the end of the statement. This way, the syntax can be used in a cross-platform fashion in all three editions.)
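The mechanics of such a scoped-cleanup statement can be sketched with Python's closely analogous with statement (the `Resource` class is invented for illustration; on .NET the equivalent hooks would be IDisposable.Dispose, on Java Closeable.close):

```python
class Resource:
    """Models a class owning an unmanaged resource, e.g. a file handle."""
    def __init__(self):
        self.closed = False

    def close(self):                # the Dispose/close equivalent
        self.closed = True

    def __enter__(self):            # entering the scoped block
        return self

    def __exit__(self, *exc_info):  # leaving the block, even on an exception
        self.close()
        return False                # do not swallow exceptions

with Resource() as r:
    assert not r.closed             # still usable inside the block
assert r.closed                     # deterministically closed as the block ended
```

The key property is the same as with the using statement: cleanup runs at a well-defined point (the end of the block), regardless of when the garbage collector eventually reclaims the object itself.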
On both .NET and Cocoa, finalizer methods, defined with the finalizer keyword, can be provided to perform additional cleanup.
- On .NET, finalizers are a last resort, and should serve only as a backup in case the user of a class "forgot" to use the Disposable pattern and call Dispose properly. They are costly to the garbage collector, and should not be declared on objects without sufficient reason.
- On Cocoa, finalizers are a regular part of an object's cleanup, and they will be called deterministically when the object is released.
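This difference in determinism can again be mirrored in CPython, whose `__del__` hook runs immediately when the last reference disappears, much like a Cocoa finalizer under ARC (the `Logger` class is made up for illustration; under a tracing GC, as on .NET or Java, the same hook would run at some indeterminate later time):

```python
class Logger:
    """Records when instances are finalized."""
    finalized = []

    def __del__(self):                    # runs when the last reference goes away
        Logger.finalized.append(id(self))

obj = Logger()
assert Logger.finalized == []             # nothing finalized yet
del obj                                   # under reference counting, __del__ runs now
assert len(Logger.finalized) == 1         # deterministic, ARC-style cleanup
```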