Re: Working with large files and Memory
- Subject: Re: Working with large files and Memory
- From: Jean-Daniel Dupas <email@hidden>
- Date: Tue, 11 Mar 2008 18:18:31 +0100
On 11 Mar 2008, at 17:54, Carl E. McIntosh wrote:
Can you please give advice about handling large data files with
memory management techniques? I am attempting to read three large
files (1 GB, 208 MB, 725 MB) sequentially and place the data into
arrays for processing. Here is my pseudocode:
1) Import a file into NSString.
NSString *aFileString = [NSString stringWithContentsOfFile:fileLocation]; // read the whole file at this path into a string
2) Use NSScanner to pull out integers and floats.
NSScanner *aFileScanner = [[NSScanner alloc] initWithString:aFileString];
3) Store values into arrays.
float myFloats[100000][2000]; // or
int myInts[100000][2000];
4) Repeat for each of the three files.
This algorithm works for smaller files but chokes on the larger ones, and I get malloc errors. I've tried using NSZones, with the same failure.
I have 4 GB of RAM and can devote 2-3 GB to the process. I don't
know how to explicitly allocate real memory, and I'd rather not rely on
virtual memory. Any references or examples would be appreciated.
The first piece of advice I can give you is: do not load the whole file
into memory. Use a read stream to read chunks of data and process them
as you go (see NSInputStream or NSFileHandle).
Other people on this list may have further advice, too.
_______________________________________________
Cocoa-dev mailing list (email@hidden)