Re: Is Apple's singleton sample code correct?
- Subject: Re: Is Apple's singleton sample code correct?
- From: David Gimeno Gost <email@hidden>
- Date: Sun, 27 Nov 2005 21:19:15 +0100
On 27 Nov 2005, at 01:23, mmalcolm crawford wrote:
> The current pattern has been used (as far as I can tell) without issue
> for almost two decades now, and it's still not clear to me what the
> problem is?
The problem is that your implementation of the pattern strives to
prevent the singleton from ever being deallocated. If we were talking
C++ I would say that the problem is that your implementation strives to
prevent the singleton destructor from ever being called. Nobody in
his/her right mind would ever suggest such an implementation in C++.
Why it is considered good design in Objective-C escapes me.
Yes, "similar" implementations in languages other than Objective-C are
the norm. But they are similar only in that, once created, the
singleton remains alive for the rest of the program. In those
languages, the singleton is usually designed such that when the program
exits the singleton is deallocated, its destructor gets called, and the
resources it manages are properly disposed of. The language runtime
system and supporting libraries allow for this to be done transparently
and that may lead to the illusion that the singleton is never
deallocated. But it is deallocated; it just happens that the singleton
can be written and used in such a way that this is automatically taken
care of by the runtime system.
I have no problems with such implementations. It is appropriate in most
cases to assume that the singleton will remain alive until the end of
the program. In languages with supporting mechanisms for automatically
and transparently handling proper deallocation/destruction, that
assumption simplifies the design, so let it be.
No such supporting mechanisms exist for Cocoa/Objective-C AFAIK. In
Objective-C, to properly handle singleton destruction at the end of the
program, someone has to take ownership of the shared instance and send
it a -release message... assuming the singleton hasn't been designed to
prevent this method from actually doing anything useful, that is.
On 27 Nov 2005, at 01:44, mmalcolm crawford wrote:
> > You have completely missed my point.
> Yes, and I suspect I continue to. It's becoming increasingly unclear
> exactly what the problem is -- the point seems to change with every
> post...
The point is that, up until now, the only "rationale" I've been given
for designing the singleton such that it can never be deallocated is
that "that's the whole point of the pattern" or something along the
lines of "that's the way things have always been done". With such bogus
(in)justifications, I'm forced to make it clear that the lifetime of
the singleton is not part of the pattern, that nothing in the pattern
as it is widely known says the singleton should never be deallocated.
The point is that removing those unnecessary constraints makes it
possible to design a singleton class that allows its instance to be
properly deallocated while at the same time greatly reducing the number
of methods that must be overridden.
> I don't see what's difficult to understand about the current model
That it strives to prevent the singleton from ever being deallocated
for no apparent technical reason.
> how it is difficult to understand or use, or how it is unsafe?
It requires overriding several methods to accomplish something that
shouldn't be a constraint in the first place, it hides possible memory
management bugs in client code, it makes it harder to properly dispose
of the resources managed by the singleton.
> The pattern as described is trivial to implement, and even easier to
> use
It would be easier if it didn't require so many methods to be
overridden, if the complexity of the methods that must be overridden
remained the same, and if it allowed the resources managed by the
singleton to be properly disposed of.
> Consider also that you already overlooked an important bug in your
> first implementation. It's not clear now what your preferred
> implementation would be?
I didn't have an implementation, I still don't have one. I've just been
trying to prove the concept and to understand why things are the way
they are. If I had to choose an implementation now, I would choose the
one proposed by Uli Kusterer, if for no other reason than that it
resolves the repeated -init invocation problem that has been bugging me
since the beginning.
Regards.
_______________________________________________
Cocoa-dev mailing list (email@hidden)