Re: Files larger than 2.9 gigs
- Subject: Re: Files larger than 2.9 gigs
- From: Shawn Erickson <email@hidden>
- Date: Wed, 21 Dec 2005 11:06:08 -0800
As an FYI, a 32-bit application should have over 3 GB of address space
available to it (around 3.5 GB, IIRC, on 10.3+), but the largest
contiguous block of virtual memory is under 3 GB. The exact values
depend on which frameworks, etc. are loaded into the application,
since those can partition and consume the virtual memory space.
Anyway, Mark, you should read from and write to the large files in
blocks, since you don't have enough virtual address space to map the
whole file. There are many ways to do this... it's hard to say which
you should use without knowing more about how you need to work with
the files.
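As a rough sketch of one block-based approach (the paths, block size,
and function name here are only placeholders), you can stream through
the file with NSFileHandle so that no more than one block is resident
at a time:

#import <Foundation/Foundation.h>

void copyLargeFileInBlocks(NSString *srcPath, NSString *dstPath)
{
    NSFileHandle *inFile = [NSFileHandle fileHandleForReadingAtPath:srcPath];
    [[NSFileManager defaultManager] createFileAtPath:dstPath
                                            contents:nil
                                          attributes:nil];
    NSFileHandle *outFile = [NSFileHandle fileHandleForWritingAtPath:dstPath];

    const unsigned blockSize = 16 * 1024 * 1024;   /* 16 MB per pass */
    while (1) {
        /* Drain autoreleased blocks each pass so memory use stays bounded. */
        NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
        NSData *block = [inFile readDataOfLength:blockSize];
        if ([block length] == 0) {                 /* end of file */
            [pool release];
            break;
        }
        /* ...transform the block here if needed... */
        [outFile writeData:block];
        [pool release];
    }
    [inFile closeFile];
    [outFile closeFile];
}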
-Shawn
On Dec 21, 2005, at 10:53 AM, John Stiles wrote:
[myData bytes] and pwrite will do what you're asking, but you can't
allocate more than ~2GB in your application's memory space (maybe
2.9GB, I haven't tested it). Even if the machine has more memory than
that, each process is limited to a 32-bit address space.
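For what it's worth, a minimal sketch of that approach (the file
descriptor handling and the writeChunkAtOffset name are just
illustrative), assuming the data is produced or read one chunk at a
time; off_t is 64-bit on Mac OS X, so the offset itself can go well
past 2.9 GB:

#import <Foundation/Foundation.h>
#include <fcntl.h>
#include <unistd.h>

/* Write one NSData chunk at an absolute 64-bit offset in an open file. */
BOOL writeChunkAtOffset(int fd, NSData *chunk, off_t offset)
{
    ssize_t written = pwrite(fd, [chunk bytes], [chunk length], offset);
    return written == (ssize_t)[chunk length];
}

/* Usage (placeholder path and offset):
     int fd = open("/tmp/huge.dat", O_WRONLY | O_CREAT, 0644);
     writeChunkAtOffset(fd, myChunk, 3LL * 1024 * 1024 * 1024);
     close(fd);
*/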
On Dec 21, 2005, at 10:52 AM, <email@hidden> wrote:
I am trying to deal with files larger than 2.9 gigs, but the Cocoa
writeToFile:atomically: and NSFileHandle methods simply will not
write beyond the 2.9 gig limit. How do I deal with these large files?
Should I be using the C primitives? And if so, how do I get the
NSData into a char array so I can write it?
Thanks for any help
Mark.