
Working with large files, efficiently/fast?


  • Subject: Working with large files, efficiently/fast?
  • From: Juan Pablo Pertierra <email@hidden>
  • Date: Thu, 28 Apr 2005 21:30:02 -0500

Hello,

My Cocoa program has to deal with files larger than 2GB and write them to disk as fast as possible, using data acquired through an I/O port. In the past people have suggested that write()/fwrite() works fine for this, so that is what I've been using. However, whenever I try to work with files larger than 2GB, the standard functions such as ftell() only use long data types, which are not wide enough to hold the file size or offsets.
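(As a point of reference, a minimal sketch assuming plain C stdio on Mac OS X, where off_t is 64 bits: fseeko()/ftello() take and return off_t, so they can address offsets past 2GB where the long-based fseek()/ftell() cannot. The helper name file_size_64 is made up for illustration.)

    #include <stdio.h>
    #include <sys/types.h>

    /* file_size_64 is a hypothetical helper: fseeko()/ftello() work on
     * off_t, which is 64-bit on Mac OS X, so offsets past 2GB are fine. */
    off_t file_size_64(const char *path)
    {
        FILE *fp = fopen(path, "rb");
        if (fp == NULL)
            return -1;

        if (fseeko(fp, 0, SEEK_END) != 0) {   /* 64-bit seek to end */
            fclose(fp);
            return -1;
        }

        off_t size = ftello(fp);              /* 64-bit file position */
        fclose(fp);
        return size;
    }

(Printing the result with %lld and a (long long) cast avoids truncation.)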

Is there any class within Cocoa I should be using, or is there another way around this? Beyond being able to access the large files at all, my main concern is being able to write data as fast as the system can support it; I am unsure whether that depends on the library I use to do the writing.
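(For the throughput side, a hedged sketch assuming raw POSIX I/O rather than any particular Cocoa class: issuing write() in large chunks and looping on short writes generally keeps a sequential stream moving. write_all is a made-up name, and the chunk size you feed it is your choice, not something from this message.)

    #include <unistd.h>

    /* write_all is a hypothetical helper: write() may accept fewer bytes
     * than requested, so loop until the whole chunk has been handed off. */
    ssize_t write_all(int fd, const void *buf, size_t len)
    {
        const char *p = buf;
        size_t done = 0;

        while (done < len) {
            ssize_t n = write(fd, p + done, len - done);
            if (n < 0)
                return -1;        /* caller inspects errno */
            done += (size_t)n;
        }
        return (ssize_t)done;
    }

(Calling it with, say, 1MB buffers amortizes the per-call overhead; that figure is only an assumption for illustration.)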

Thanks,
Juan


