Re: Optionals? A better option!
- Subject: Re: Optionals? A better option!
- From: Jens Alfke <email@hidden>
- Date: Thu, 14 May 2015 11:19:02 -0700
With all due respect, I think you’re falling into the common engineer pitfall of jumping to the conclusion that there’s a trivial solution without first understanding the problem. (Sometimes expressed as “any bug in your program is trivial; any bug I have to fix is intractable.”) Which is to say: if you really want to engage in productive debate or offer alternatives, you should spend some time learning the theory behind languages, and also look at some non-C-like languages, especially functional ones. (Apologies if you already have that experience, but based on your answer I’m guessing you don’t.)
I’m not terribly qualified here myself; I dabbled in language and compiler design in the distant past. Not enough to make me an expert, but enough to teach me that even simple language features carry a lot of hidden difficulties and ramifications.
Optionals come out of a long line of thinking in functional programming languages. The broader idea is that if a value can have multiple mutually exclusive states (in this case “has a value” vs. “has no value”), those states should be tagged, and selecting between them should require explicit action. That’s basically what a Swift enum is, and the same concept shows up in a lot of other languages: Haskell, Erlang, Scala, Rust…
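To make that concrete, here’s a minimal sketch in Swift. The “Maybe” type below is just illustrative, but as far as I know Swift’s real Optional is declared in essentially the same way:

// Illustrative "Maybe" type (not the real Optional): the two states are
// tagged cases of an enum, and the compiler makes you pick one explicitly
// before you can touch the wrapped value.
enum Maybe<T> {
    case none
    case some(T)
}

let m: Maybe<Int> = .some(42)
switch m {
case .none:
    print("no value")
case .some(let n):
    print("the value is \(n)")
}

// Swift's Optional works the same way, plus syntactic sugar:
let o: Int? = 42     // shorthand for Optional<Int>.some(42)
if let n = o {       // explicit action to select the "has a value" case
    print("the value is \(n)")
}

The point is that there’s no way to accidentally treat the “no value” state as a value; the type system forces the choice out into the open.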
There’s a school of thought that null pointers are harmful; optionals are a reaction to that. I just looked up the source: Tony Hoare gave a presentation in which he formally apologized for inventing the null reference in 1965, as part of ALGOL W:
"I call it my billion-dollar mistake. It was the invention of the null reference in 1965. At that time, I was designing the first comprehensive type system for references in an object oriented language (ALGOL W). My goal was to ensure that all use of references should be absolutely safe, with checking performed automatically by the compiler. But I couldn't resist the temptation to put in a null reference, simply because it was so easy to implement. This has led to innumerable errors, vulnerabilities, and system crashes, which have probably caused a billion dollars of pain and damage in the last forty years.”
— http://en.wikipedia.org/wiki/Tony_Hoare#Apologies_and_retractions
It’s a great quote, but I don’t think that was the first appearance of null. LISP dates back to the late ’50s and has always had nil references (right?).
—Jens