Re: MemoryMapping Large Files - ???
- Subject: Re: MemoryMapping Large Files - ???
- From: Jeff Galyan <email@hidden>
- Date: Sat, 12 Jan 2002 11:59:51 -0700
You can only map an entire 2GB+ file into memory on 64-bit systems. Mac OS X
is still 32-bit.
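If you only need part of the data at a time, one workaround on 32-bit is to map the file in windows with BSD mmap() rather than pulling the whole thing into one NSData. A rough, untested sketch (the window size and offset handling here are placeholders, not a finished implementation):

    #include <sys/mman.h>
    #include <fcntl.h>
    #include <unistd.h>

    int fd = open([bigDataPath fileSystemRepresentation], O_RDONLY);
    if (fd < 0) { /* handle the error */ }

    size_t windowSize = 64 * 1024 * 1024;  /* e.g. a 64 MB window */
    off_t offset = 0;                      /* must be a multiple of the page size */
    void *bytes = mmap(NULL, windowSize, PROT_READ, MAP_SHARED, fd, offset);
    if (bytes == MAP_FAILED) { /* handle the error */ }

    /* ... read directly from 'bytes' ... */

    munmap(bytes, windowSize);
    close(fd);

Only the mapped window occupies address space, so you stay well under the limits of a 32-bit process; remap at a new offset as you work through the file.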
--Jeff
On 1/12/02 4:52 AM, "Christopher Holland" <email@hidden> wrote:
> Hello everyone,
>
> I am having a bit of a problem mapping a rather "large" file to an NSData
> object. I can map smaller files without problem, but the larger one pukes
> with the following error:
>
> -[NSConcreteData initWithBytes:length:copy:freeWhenDone:bytesAreVM:]:
> absurd length: 1866240000
>
> Of course the length of the file is almost 2 gigs, so I realized that it
> might have problems before I got started. OSX is supposed to be able to
> handle files larger than 2 gigs, correct?
>
> I'm using the following code:
>
> bigData = [[NSData alloc] initWithContentsOfMappedFile:bigDataPath];
>
> I've tried using 'initWithContentsOfFile' also, just to see if it was the
> memory mapping doing it... no go there.
>
> Should I use the BSD 'mmap' function instead of the 'NSData' methods above?
> I'm no coding wizard, but I've worked with this data and written a similar
> program for SGIs. I'm just starting to work on OSX programming and I'm
> excited by the possibilities.
>
> Any insight into the mmap problem would be appreciated.
>
> Thanks,
>
> Christopher Holland
>
> Sent using the Entourage X Test Drive.