Re: swift and objective-c
- Subject: Re: swift and objective-c
- From: Ron Hunsinger <email@hidden>
- Date: Thu, 05 Jun 2014 12:40:45 -0700
On Jun 4, 2014, at 12:07 PM, Rainer Brockerhoff <email@hidden> wrote:
> On 6/4/14, 12:51, email@hidden wrote:
>> Date: Tue, 03 Jun 2014 20:19:22 -0700
>> From: Jens Alfke <email@hidden>
>>
>> I’ll bet the Burroughs B5000 got it right, though. As we all know here, the B5000 did everything right. ;-)
>
> Let me preemptively ;-) make a remark about that - well, not the B5000 but the B6700 I actually worked on. Besides being (sorta) the Mac 128 of its day - built by a small team, odd architecture, no assembler available - it had two related high-level languages: Algol for writing compilers and applications, Espol for writing the basic OS.
>
> Perhaps we can see a far-off day when most iOS/OSX stuff will be written in Swift (including things where C++ is used today), and mostly the low-level stuff will be in C with embedded assembly. Sounds good to me!
The B6700 also did it "wrong", but in hindsight I see why. The Burroughs Large System computers did all arithmetic in floating point, with integers simply being the special case where the exponent was zero (putting the decimal point at the _end_ of the mantissa, which was allowed to be un-normalized).
When a and b are floating point, interpreting a % b as a remainder (with the sign of a) rather than a modulus (with the sign of b) has the important advantage that it's always exact. There is never a rounding error. Never mind that in Algol the expression would be written A MOD B; the operation was actually a remainder. The modulus operation can have a devastating loss of significance. For example, -1e-50 mod 1.0 == -3e-50 mod 1.0 == 1.0: all bits of the left-hand argument get rounded away. But -1e-50 rem 1.0 == -1e-50. No bits are lost.
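To make the precision point concrete, here's a small sketch using present-day Swift spelling (the floating-point remainder is now written truncatingRemainder(dividingBy:); the floor-style modulus isn't a built-in operator, so it's computed by hand as a - b*floor(a/b)):

    let a = -1e-50

    // Truncating remainder keeps the sign of the dividend and is always exact:
    let rem = a.truncatingRemainder(dividingBy: 1.0)   // -1e-50, no bits lost

    // Floor-style modulus (sign of the divisor): exactly 1.0 - 1e-50,
    // which rounds to 1.0 in Double, so every bit of a is rounded away.
    let mod = a - 1.0 * (a / 1.0).rounded(.down)       // 1.0

    print(rem, mod)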
In Swift, it makes sense that the % operator should have the same semantics for any primitive type. If % should be remainder for floats, it should be remainder for integers as well.
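And indeed Swift's integer % is a remainder, with the result taking the sign of the dividend; a quick check (using today's spelling for the floating-point operation, which in 2014-era Swift was simply % on Double):

    // Integer % is a remainder: the result takes the sign of the dividend.
    print(-7 % 3)    // -1 (a modulus would give 2)
    print(7 % -3)    //  1 (a modulus would give -2)

    // The floating-point operation agrees:
    print((-7.0).truncatingRemainder(dividingBy: 3.0))   // -1.0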
Espol and its successor NPL were really both Algol with a few extensions to facilitate system programming. Similarly, DMAlgol was Algol with extensions for implementing databases, and DCAlgol was Algol with extensions for implementing data communications. One way or another, all "system" code was written in some dialect of Algol. All code, whether "system" or other, was written in a high level language. There was never an assembler.
There's no reason to assume that Swift will not also be used for the low-level stuff in the future. Multics was originally planned to be mostly PL/1, with the low-level stuff written in assembler. As time went on, more and more of the assembler code was replaced with PL/1, for efficiency. It turns out compilers can optimize much better than humans can.
A Swift-everywhere future looks good to me, too, but I cannot see Apple justifying the man-hours it would take to replace millions of lines of working C code with Swift. They could reduce the time by writing a compiler that "compiled" C and Objective-C into Swift, but the man-hours to do even that might be hard to justify, especially since the only gain would be to satisfy the esthetics of a few purists who will never be satisfied anyway. And there's all that C++ code, which I suspect would be harder to convert automatically.
Probably what we'll get is that all new code will be written in Swift. Which means Swift has to be open-sourced if new code is ever to be added to Darwin.
-Ron Hunsinger