
Re: more question about porting/gcc



On Thu, 19 Mar 1998, Bernd Ebert wrote:

> I do own a Hades with 64MB of EDO-RAM, but this is still no real solution.
> I own that much RAM (a lot, relatively speaking, for an Atari) because I
> scan and work with images; it is definitely no solution for me to have
> twice the RAM I really need.
> Besides that: with a third big program the problem would come back....

Well, I thought I had observed that gcc uses almost all the available
memory for the entire time it is running. If so, then no matter how long
you give gcc to do an Mshrink, it is not going to make any difference!
The only ways to avoid this would be to limit the RAM gcc can use, or to
fix gcc itself. Is this not the case?
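
(For anyone following along: under GEMDOS a program is started owning the
whole TPA and is expected to give back what it doesn't need. Roughly, this
is what the startup code of every well-behaved program does before main()
is reached. A sketch only, assuming the osbind.h/basepage.h bindings and
the _base pointer from the MiNT library; the 64KB stack reserve is an
arbitrary figure for illustration:)

    #include <osbind.h>
    #include <basepage.h>

    #define STACK_RESERVE 65536L   /* arbitrary stack reserve for this sketch */

    static void shrink_to_fit(void)
    {
        BASEPAGE *bp = _base;      /* basepage pointer from the MiNT library */

        /* keep the basepage (0x100 bytes), text, data, bss and some stack */
        long need = 0x100L + bp->p_tlen + bp->p_dlen + bp->p_blen
                    + STACK_RESERVE;

        /* return the rest of the TPA to GEMDOS; the real crt0 also moves
           the stack pointer into the kept region before shrinking */
        Mshrink(bp, need);
    }

So the window between program start and the Mshrink call is normally tiny;
the gcc problem is different, because gcc apparently keeps the memory.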

Yes, I agree that it would be a nuisance to have twice the RAM you need.
It was just an idea. But then there is another possibility... I don't know
whether the RAM limit affects how much RAM a program can allocate with
Malloc(), or only how much RAM the program receives initially. If it is
the latter, then you should still be able to scan big images with no
problems, even after limiting the RAM (hopefully). On my 40MB TT, RAM
fragmentation has not been a big problem, but it does start to get
worrying that the largest block is down to about 10MB once the machine has
been on for a few days (and a couple of thousand processes have run during
that time). I might try this method myself and see if it works. In fact,
in this case the fragmentation would limit the size of a picture I can
load much more than a memory usage limit would anyway.
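
(By the way, the 10MB figure is easy to check yourself: GEMDOS reports the
size of the largest free block when you ask Malloc for -1 bytes. A quick
sketch, again assuming the osbind.h bindings:)

    #include <osbind.h>
    #include <stdio.h>

    int main(void)
    {
        /* Malloc(-1L) does not allocate anything; GEMDOS returns the
           size of the largest free memory block instead */
        long largest = (long)Malloc(-1L);

        printf("Largest free block: %ld bytes\n", largest);
        return 0;
    }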

Really, as far as I can see, there are two problems here: one is that your
memory slowly breaks up into more and more small blocks; the other is that
gcc likes to use up all the RAM. There is also the problem of starting up
one app, then starting another before the first calls Mshrink, but I
really can't see that being a problem in real life, since a program calls
Mshrink so soon after starting. I covered the gcc problem above; as for
the fragmentation problem, it seems to me that this could be difficult to
avoid unless the amount of memory given to a process is limited. To do
this the system needs to know how much RAM a process will need before the
process is started. Then the whole Mshrink problem is irrelevant anyway,
since the process won't have to Mshrink at all. The next best solution is
the memory limit in mint.cnf, as I explained above (assuming it doesn't
limit Malloc as well, of course).
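
(In case it helps: if I remember the directive's name correctly, the limit
I mean is MAXMEM in mint.cnf, which caps the memory MiNT grants to each
process; the value below is just an example. Whether it also caps later
Malloc() calls is exactly the open question above.)

    # in mint.cnf: cap each process at 8MB (value is in kilobytes)
    MAXMEM=8192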

Am I making any sense here?

-- 
Mario Becroft                        Auckland, New Zealand
mb@tos.pl.net                    http://www.pl.net/~mario/      |\__/,|   (`\
Tariland, Atari Support in New Zealand                        _.|o o  |_   ) )
tariland@tos.pl.net     http://www.pl.net/~mario/tariland/ --(((---(((--------