Mikro, do you see a difference in the generated code size (both the .o
files and the final binary)? Currently, with the cross-compiler, I see
something like +10-15% in generated code size between gcc 3.x and 4.x
for the same optimisation settings (-O2 -m68020 when I compiled
Doom, for example).
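A quick way to compare, assuming the cross binutils are installed with
the usual m68k-atari-mint- prefix (the build directories and binary
names below are just placeholders):

    # per-object text/data/bss sizes from both compilers
    m68k-atari-mint-size build-gcc3/*.o > sizes-gcc3.txt
    m68k-atari-mint-size build-gcc4/*.o > sizes-gcc4.txt
    diff -y sizes-gcc3.txt sizes-gcc4.txt

    # and the final binaries
    m68k-atari-mint-size doom-gcc3 doom-gcc4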
Well, yes, I admit my comment was quite pessimistic; I even changed my
mind and now use it for my daily development (with -O0 -g ;-)). As a
proof that it works, I compiled Ocean Machine with -O3 -m68020-60 and
it worked quite nicely. (Argh, I must look into that Quake issue one
day -- I'm really not happy with its random behaviour on various
compilers.)
Btw, this GNU stuff really rules. I spent an afternoon compiling
various Linux packages (I took Linux From Scratch as the base). In 80%
of cases, all it took to get the newest stuff working was to set
--target=m68k-atari-mint or, more directly, CC=m68k-atari-mint-gcc and
CFLAGS=-m68020-60, and away we go. Quite cool. The only bad thing
about it is that there's no packaging -- that's probably the most
boring part of the whole process of releasing new packages: take some
RPM from Fedora, for example, "configure" it for the FreeMiNT target,
test it, fix it, test again, etc...
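In practice that usually looks something like this (the package name
and flags are just an example; note that for ordinary autoconf
packages the cross flag is --host, while --target mainly matters for
toolchains):

    # hypothetical cross-configure of a typical autoconf package
    tar xzf foo-1.0.tar.gz && cd foo-1.0
    ./configure --host=m68k-atari-mint \
        CC=m68k-atari-mint-gcc CFLAGS="-O2 -m68020-60"
    make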
I even played with the evil idea of making a distro based just on
.tar.gz packages with some simple uninstall information (i.e. storing
the paths of the installed files in some text file), plus maybe some
unified way of storing patches for the source "packages". Or maybe
directly adopting Slackware-type packages. Who needs dependency
checking in the FreeMiNT world anyway? We don't have shared libs, so
it's only about development packages, and every developer knows what a
"missing SDL.h file" error means... I know, bad, bad guy :)