MemoryMapping Large Files - ???
- Subject: MemoryMapping Large Files - ???
- From: Christopher Holland <email@hidden>
- Date: Sat, 12 Jan 2002 05:52:51 -0600
Hello everyone,
I am having a bit of a problem mapping a rather "large" file to an NSData
object. I can map smaller files without a problem, but the larger one pukes
with the following error:
-[NSConcreteData initWithBytes:length:copy:freeWhenDone:bytesAreVM:]:
absurd length: 1866240000
Of course, the file is almost 2 gigs long, so I realized it might cause
problems before I even got started. OSX is supposed to be able to handle
files larger than 2 gigs, correct?
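For what it's worth, here's roughly how I'd check the raw size at the BSD
level (the path is just a placeholder; I'm assuming stat() and a 64-bit
off_t on OSX):

#include <sys/stat.h>
#include <stdio.h>

int main(void)
{
    struct stat sb;

    /* placeholder path; substitute the real data file */
    if (stat("/path/to/bigfile.dat", &sb) == -1) {
        perror("stat");
        return 1;
    }

    /* st_size is an off_t, which is 64-bit on OSX, so sizes past 2 gigs
       should print correctly even if mapping them is another story */
    printf("size: %lld bytes\n", (long long)sb.st_size);
    return 0;
}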
I'm using the following code:
bigData = [[NSData alloc] initWithContentsOfMappedFile:bigDataPath];
I've also tried 'initWithContentsOfFile:', just to see whether the memory
mapping was the cause... no go there.
Should I use the BSD 'mmap' function instead of the 'NSData' methods
above? I'm no coding wizard, but I've worked with this data and written a
similar program for SGIs. I'm just starting to work on OSX programming and
I'm excited by the possibilities.
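To make the question concrete, here's a rough sketch of what I mean by
going straight to mmap (read-only, whole file mapped at once; the path is
a placeholder and I haven't settled on the flags):

#include <sys/mman.h>
#include <sys/stat.h>
#include <fcntl.h>
#include <unistd.h>
#include <stdio.h>

int main(void)
{
    /* placeholder path; substitute the real data file */
    int fd = open("/path/to/bigfile.dat", O_RDONLY);
    if (fd == -1) {
        perror("open");
        return 1;
    }

    struct stat sb;
    if (fstat(fd, &sb) == -1) {
        perror("fstat");
        close(fd);
        return 1;
    }

    /* map the whole file read-only in one shot */
    void *base = mmap(NULL, (size_t)sb.st_size, PROT_READ,
                      MAP_FILE | MAP_SHARED, fd, 0);
    if (base == MAP_FAILED) {
        /* likely ENOMEM if the mapping won't fit in the address space */
        perror("mmap");
        close(fd);
        return 1;
    }

    /* ... read from base[0 .. sb.st_size - 1] here ... */

    munmap(base, (size_t)sb.st_size);
    close(fd);
    return 0;
}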
Any insight into the mmap problem would be appreciated.
Thanks,
Christopher Holland