Re: NSTask/NSPipe STDIN hangs on large data, sometimes...
- Subject: Re: NSTask/NSPipe STDIN hangs on large data, sometimes...
- From: "Joe Pezzillo" <email@hidden>
- Date: Fri, 17 Jan 2003 11:22:38 -0700
Bill-
Thanks for the prompt reply, that looks like some very useful code, too...I
hadn't yet considered (or desired) porting, but it's good to know that it
can be done!
Sadly, yes, I am already doing readInBackgroundAndNotify, at least on the
asynch version. The synchronous version uses readDataToEndOfFile.
But remember that my problem is that writeData hangs as part of "launching"
the command (not the specific [task launch] message itself, but the setup
needed to make the command do anything, i.e. piping it some STDIN to chew on
after it's been launched). So in the synchronous version, I never get a
chance to readDataToEndOfFile; it just hangs.
Similarly, the asynch version posts a notification request, launches, and
then tells the task's stdOut fileHandleforReading to
readInBackgroundAndNotify. Then, once the task is launched I write the data
to stdIn, but since it hangs there, I never get any notifications of data
coming back.
I looked for a "writeDataInBackgroundAndNotify" or anything else related to
asynch writing in the NSFileHandle header file but I didn't find anything
new.
Since the previous post, I've tried synchronizeFile before writing; that
didn't work (NSFileHandleOperationException: invalid argument).
Nor did trying to get NSFileHandle to give me fileHandleWithStandardInput
and then write to that (bad file descriptor).
Also note that this only seems to affect a few UNIX commands so far,
/usr/bin/grep (or egrep), and /usr/bin/uniq. Other commands (specifically:
tail, wc, sort) work just fine with large data written to STDIN using the
same code. Doesn't mean it's not still my fault, but it does prove to me
that, as long as I don't use grep or uniq with more than 32K of data, I've
got a working implementation.
I also tried making a method that chunks the write operations into blocks of
less than 32K each, but that didn't help either...as soon as a cumulative
32K of data has been written to the file handle, even in smaller chunks, it
hangs...but only with grep/egrep or uniq (so far, those are the cmds that I
have tested and found this problem). I tested the new chunking write method
with other commands (tail, wc, sort) and they work fine.
I guess I'll start getting the project ready to post...
Thanks for your help!
Joe
email@hidden
On 1/17/03 9:25 AM, "Bill Bumgarner" <email@hidden> wrote:

> On Thursday, Jan 16, 2003, at 21:11 US/Eastern, email@hidden wrote:
>
>> Has anyone else experienced this problem and solved it or am I a lone
>> corner case? I googled a little and have looked at some of the Cocoa
>> sites (and this list), but nothing jumped out at me that was addressing
>> or solving this issue.
>>
>> Is this a problem with my code (I'll post a spike of the app if needed)
>> or the frameworks or Darwin, OR?
>
> Are you using the 'readInBackgroundAndNotify' mode on NSFileHandle? If
> not, that is the cause of your problem as the buffering in NSFileHandle
> will end up blocking on itself. This isn't so much a bug as fallout
> from the way the system works (and one for which an easy workaround
> exists -- readInBackgroundAndNotify).
>
> Also, availableData, readDataToEndOfFile, and -- I think --
> readDataOfLength: can all cause the NSFileHandle instance to block
> without readInBackgroundAndNotify having been activated.
>
> [...]
_______________________________________________
cocoa-dev mailing list | email@hidden
Help/Unsubscribe/Archives:
http://www.lists.apple.com/mailman/listinfo/cocoa-dev
Do not post admin requests to the list. They will be ignored.