Re: Reading a file as it fills up by another program


  • Subject: Re: Reading a file as it fills up by another program
  • From: Christopher Nebel <email@hidden>
  • Date: Sun, 14 Dec 2003 12:13:30 -0800

On Dec 13, 2003, at 6:32 AM, Harald E Brandt wrote:

Yes, that's a nice test. 'curl' DOES write to stderr, but since it is redirected to stdout and piped through tr (or something similar), it is "reverted" to stdout behavior. ... In this case, the behaviour is that of stdout, even though we are writing to stderr from perl. Maybe I should re-redirect it back to stderr and then redirect it again to stdout?? That sounds absurd, though!

Ah, yes. Because you're piping through a process that's doing all its writing to stdout, you wind up with block buffering again. Redirecting through stderr might work, though; I expect the phrase "1>&2" would be involved somewhere.

Now to a solution that I have found myself: it is a bit brute force - not very elegant, but it actually works! ... The technique is to repeatedly open and close the file in a loop, thereby forcing it to really write to the file. Like this:

Since you're using perl, a better solution would be to alter the buffering of the output stream, such as by using $|. (See man perlvar for details. I expect there's a way to change the buffering of an arbitrary filehandle, but I don't know it offhand.) You'll probably still need to open and close the file on the reading side, though. (Hmm. I wonder how tail -f does what it does?)
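To make that concrete, a minimal sketch might look like the following (the filehandle and the 'progress.log' path are made up for illustration): $| unbuffers the currently selected filehandle, and the autoflush() method from IO::Handle does the same for any other handle.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use IO::Handle;    # supplies the autoflush() method on filehandles

    $| = 1;            # unbuffer the currently selected filehandle (STDOUT by default)

    # 'progress.log' is just an illustrative path.
    open(my $log, '>', 'progress.log') or die "can't open progress.log: $!";
    $log->autoflush(1);    # same effect for an arbitrary filehandle;
                           # the old-school spelling is select((select($log), $| = 1)[0]);

    for my $i (1 .. 5) {
        print $log "step $i\n";    # each line hits the file immediately
        sleep 1;
    }
    close($log);

With autoflush on, each print is flushed right away instead of sitting in a stdio block until the buffer fills.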

Yet another method I contemplated is to pipe everything through 'cat -u' before writing to the file. That would force it to be unbuffered, but such things are considered dangerous - especially with disk activity, and I don't even know if it would work.

I've never heard of unbuffered output being considered dangerous before. Slow, maybe, but not dangerous.

Most of all, of course, one would like to read the buffer instead, without involving any disk activity at all! But is that really possible?

If it is, then the OS will do it for you -- this is what Unified Buffer Caches are all about.
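Along those lines, the reading side doesn't need anything exotic either. A minimal Perl sketch in the spirit of tail -f might look like this (the file name and the one-second poll interval are just placeholders); since the writer has only just pushed the data into the buffer cache, these reads will normally be satisfied from memory rather than the disk.

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $path = 'progress.log';    # hypothetical file being filled by another program

    open(my $fh, '<', $path) or die "can't open $path: $!";

    for (;;) {
        while (my $line = <$fh>) {
            print "new: $line";    # handle each freshly appended line
        }
        sleep 1;                   # wait for the writer to add more
        seek($fh, 0, 1);           # no-op seek clears the EOF flag so reading can resume
    }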


--Chris Nebel
AppleScript Engineering
_______________________________________________
applescript-users mailing list | email@hidden
Help/Unsubscribe/Archives: http://www.lists.apple.com/mailman/listinfo/applescript-users
Do not post admin requests to the list. They will be ignored.

  • Follow-Ups:
    • Re: Reading a file as it fills up by another program
      • From: Harald E Brandt <email@hidden>
    • Re: Reading a file as it fills up by another program
      • From: Walter Ian Kaye <email@hidden>
  • References:
    • Reading a file as it fills up by another program (From: Harald E Brandt <email@hidden>)
    • Re: Reading a file as it fills up by another program (From: Christopher Stone <email@hidden>)
    • Re: Reading a file as it fills up by another program (From: Harald E Brandt <email@hidden>)
    • Re: Reading a file as it fills up by another program (From: Christopher Nebel <email@hidden>)
    • Re: Reading a file as it fills up by another program (From: Harald E Brandt <email@hidden>)
