Re: Problem using .zerofill / -segaddr to create very large segments
- Subject: Re: Problem using .zerofill / -segaddr to create very large segments
- From: Jay Reynolds Freeman <email@hidden>
- Date: Tue, 22 Sep 2009 13:50:28 -0700
Terry Lambert gave good advice, including:
> That said, unless you have a metric ton of memory, mapping 320GB
> into your address space will take a lot of wired pages to represent
> the pmap and other data structures needed to provide the mapping
> for that many pages ...
It is 160 GByte, not 320, I am sorry if something in my posts was
off-by-two.
Terry, I think you came in late on this: the problem is that I am
using mmap, and I need a big block of memory at the *same* virtual
address in each of several processes, because that block contains
pointers to locations within itself.
The processes are 64-bit applications that run in Snow Leopard. They
can use all the memory you can put in a Mac Pro, and then some. I have
a parallel Lisp system with a dual-memory scheme for garbage
collection; the memories are most of what goes in the shared segment.
I want to allow each of those memories to be as large as the most RAM
you can have in a Mac, which I believe is 64 GByte today.
I am sure I can find plenty of room in any one process's address
space, but the interprocess coordination required to make *certain*
that all of the interested processes can find a block at the *same*
address is a mess.
I feel strongly that I should seek a simpler solution.
The solution I have been using -- which was recommended by Apple,
actually -- is to use a .s file with .zerofill directives in it to
create a large region of empty memory, and then use the linker's
-segaddr flag to tell it where to put that segment. That makes sense
to me; that kind of thing is what linkers are for.
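For anyone following along, the scheme looks roughly like this. The
segment, section, and symbol names below are invented examples of my
own, and the sizes are purely illustrative:

```
# bigseg.s -- reserve a large zero-filled (no space in the binary)
# segment; the last argument is the alignment as a power of two.
        .zerofill __BIGSEG, __bigsect, _shared_block, 0xA00000000, 12

# Then tell the linker where to place that segment, e.g.:
#   cc main.c bigseg.s -Wl,-segaddr,__BIGSEG,0x200000000 -o myapp
```

Every executable linked this way gets the segment at the same address,
so no run-time coordination between the processes is needed.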
That has so far worked fine, but I seem to be up against a limit of
160 GByte on the size of my segment, independent of the address I
provide with -segaddr. Ask for any more memory and, although the
build and link run fine, I get run-time errors about classes that
cannot be loaded.
I can make do with 160 GByte for a while, but I will certainly want
more in the near future.
At the very least, I would like some better error messages; I ought to
get some warning that I am painting myself into a corner where a class
won't load, and perhaps a clue what to do about it.
-- Jay Reynolds Freeman
---------------------
email@hidden
http://web.mac.com/jay_reynolds_freeman (personal web site)
_______________________________________________
Darwin-dev mailing list (email@hidden)