Re: Files larger than 2.9 gigs
- Subject: Re: Files larger than 2.9 gigs
- From: <email@hidden>
- Date: Wed, 21 Dec 2005 14:23:28 -0500
- Bounce-to: <email@hidden>
Well, all I need to do is read files in parts from one place and write
them into a single file on disk: basically taking file1, file2, file3, etc.
and putting them all into a single file.

I tried this, but it does not work:
    while ((filepart = [iter nextObject])) {
        FILE *dest = fopen([[panel filename] fileSystemRepresentation], "a");
        NSData *tmp = [NSData dataWithContentsOfFile:[filepart objectForKey:@"Path"]];
        char tdata[] = [tmp bytes];

but the compiler errors out, telling me that the char array initializer is
invalid. So I do char tdata = [tmp bytes]; and then

        fwrite(tdata, 1, [tmp length], dest);

but the compiler tells me that tdata is an incompatible pointer type.
Could I do fwrite([tmp bytes], 1, [tmp length], dest) instead?

        fclose(dest);
    } // end while
Thanks, Mark
On 12/21/2005, "Shawn Erickson" <email@hidden> wrote:

> As an FYI, a 32-bit application should have over 3 GB available to it
> (around 3.5 GB IIRC on 10.3+), but the largest contiguous block of
> virtual memory is under 3 GB. The exact values depend on what
> frameworks, etc. are loaded into the application, since those can
> partition and consume the virtual memory space.
>
> Anyway, Mark, you should work on reading and writing the large files
> in blocks, since you don't have enough virtual address space to map
> the whole of the file. There are many ways to do this... it's hard to
> say which you should use without knowing much about how you need to
> work with the files.
>
> -Shawn
>
> On Dec 21, 2005, at 10:53 AM, John Stiles wrote:
>
>> [myData bytes] and pwrite will do what you're asking, but you can't
>> allocate more than ~2GB in your application's memory space (maybe
>> 2.9GB, I haven't tested it). Even if you have more memory than that,
>> there is a 32-bit limit for each process.
>>
>> On Dec 21, 2005, at 10:52 AM, <email@hidden>
>> <email@hidden> wrote:
>>
>>> I am trying to deal with files larger than 2.9 gigs, but the Cocoa
>>> writeToFile:atomically: and NSFileHandle methods simply will not
>>> write beyond the 2.9 gig limit. How do I deal with these large
>>> files? Should I be using the C primitives? And if so, how do I get
>>> the NSData into a char array so I can write it?
>>>
>>> Thanks for any help
>>> Mark
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Cocoa-dev mailing list (email@hidden)
Help/Unsubscribe/Update your Subscription:
This email sent to email@hidden