
Re: [MiNT] XaAES palette colour handling

On Tue, 17 Feb 2004, Odd Skancke wrote:

> On Tue, 17 Feb 2004, Standa Opichal wrote:
> > Hi!
> >
> > On Tue, 17 Feb 2004, Odd Skancke wrote:
> >
> > > > those colors may legally be used by the bitmap IMHO. The way it is used in
> > > > XaAES it doesn't harm in any way so why to ban it?
> > >
> > >  Yes, exactly. The palette is used when transforming the CICON bitmap in
> > > TC/HC modes. If the CICON is to be a "standard" CICON, it will contain
> > > pens 0 - 16, and as such the 'system' colors should be used here. If you
> > > set your own colors for all CICONs using pens 0 - 16, there is no way a
> > > user can change the colors of "standard" icons. This is wrong. If this
> > > continues, we'll end up with a situation where GEM's look is totally
> > > unconfigurable, and looks are left to individual applications. Not good.
> > > A good designer always uses pens above the first 16 to create an image
> > > or special icons that are application-specific.
> >
> > I didn't know that the CICON bitmaps are to be affected by a "desktop"
> > palette. Is this something the Atari Compendium, for example, says
> > somewhere? Or where did the "standard" come from?
>  Ok, this "standard" comes from Atari themselves. I know I have read this
> in the original Atari developer documentation, but I don't have that handy
> right now. However, read what the Atari Compendium says under the "Using
> Color" heading in section 7.8 (at least that's where I find it in my book).
> If you have the .HYP file, look at "Using Color" in the VDI section.

Hehe.. I'm one of the authors of the .HYP ;) so... looking into the .stg

=== Cut ===
The first 16 VDI color registers are used by the operating system and
should be avoided. If your application must change them, they should be
restored when no longer needed.
=== End ===

We are speaking about the OS, right?

> > Hmm, so can we disable the CLUT modes until the nearest color computation
> > is implemented, and turn off the palette "consistency" check (the 0,0,0
> > RGB check for colors 16 and above)?
>  We can disable it until we have color calcs for CLUT modes.
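For the record, the "nearest color computation" being discussed could be as
simple as a squared-distance search over the CLUT entries. A minimal sketch
in C (the function names are mine, not XaAES's, and I assume the VDI
0-1000 scale per RGB component):

```c
#include <assert.h>

/* VDI-style RGB triple, components in 0..1000 */
typedef struct { int r, g, b; } RGB;

/* Squared Euclidean distance between two colours. */
static long rgb_dist2(RGB a, RGB b)
{
    long dr = a.r - b.r, dg = a.g - b.g, db = a.b - b.b;
    return dr * dr + dg * dg + db * db;
}

/* Return the index of the palette entry closest to 'want'. */
int nearest_pen(const RGB *pal, int npens, RGB want)
{
    int best = 0;
    long best_d = rgb_dist2(pal[0], want);
    for (int i = 1; i < npens; i++) {
        long d = rgb_dist2(pal[i], want);
        if (d < best_d) { best_d = d; best = i; }
    }
    return best;
}
```

A weighted distance (eyes are more sensitive to green) would do better, but
even the plain version above is enough to let CLUT modes work again.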


> The system pens (first 16) are never to be changed.

Still not convinced ;)

> The palette "consistency" check should never go away.

I see this as unnecessary, and it fails when the palette is set that way
intentionally for some strange reason :)
It amounts to just more code in the AES implementation that is not really
needed, in code that is already not easy to read and understand.

> If a resource file has no palette, the current system palette should be
> used in any case, no matter whether we're doing color calcs or not. An
> empty palette could be thought of as "use default colors" for pens above
> 16. This is not documented anywhere, but since we are about to implement
> something new, we need to think things through. And that's my "suggestion".

OK, something new.

We are defining the OS behaviour, right? So why not allow the lower 16
colors to be changed in the theme RSC (which XaAES would adopt as system
colors), and then also allow applications to use these pens in their
private RSCs without affecting the system look in any way (by computing the
color distance for CLUTs). See more below...

> > The theme may be done exactly by using a different RSC palette inside a
> > theme RSC file. This way the AES may adopt that palette as the system
> > one (set it into the HW registers) and my code would work as expected,
> > not caring about the first 16 pens. What is wrong with that?
>  How themes are implemented is beside the point.

No it is not.

> The point is that we do not change things that belong to the system, which
> your code does on an application basis,


> unless the user wants to. Like using themes to change system look.


> Your code would most likely be perfect in an "aes_settheme()" call or something.

No. aes_settheme() would set the theme RSC palette to the system vwk as
system colors IMO.

Once more:

Theme RSC (OS feature):
  Set the system palette and use the RSC for widgets, for example.
App RSC (using the OS):
  Compute the CLUT transformation (or use the palette directly for CICONs
  in non-CLUT modes) and leave the display's system palette unchanged.
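The app RSC path above could be sketched like this: when an application RSC
is loaded in a CLUT mode, a pen mapping table is computed once (e.g. by a
nearest-color search against the system palette), and the icon bitmaps are
remapped through it. Pens 0-15 pass through untouched, so the hardware
palette is never changed on behalf of an application. This is my sketch of
the idea, not existing XaAES code, and the helper name is hypothetical:

```c
#include <assert.h>

#define SYS_PENS 16  /* the first 16 pens belong to the system */

/* Remap the pen indices of an app icon's pixel data for a CLUT mode.
 * clut_map[i] holds the system pen whose colour is nearest to the
 * app palette's colour i, computed once when the RSC is loaded.
 * Pens 0..15 are system pens and are left alone, so the display
 * palette stays under the user's (theme's) control. */
void remap_icon_pens(unsigned char *pixels, long npix,
                     const unsigned char *clut_map)
{
    for (long i = 0; i < npix; i++)
        if (pixels[i] >= SYS_PENS)
            pixels[i] = clut_map[pixels[i]];
}
```

With this split, the only code that ever writes the hardware colour
registers is the theme loader; applications get a best-effort rendering of
their private colours for free.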

> > IMHO the 16-pen rule makes more trouble than the theme RSC palette taken
> > as the system one.
>  Why? Please explain.

Because once we make the OS rules easier for application writers to adopt,
they would have a free hand and could use any pen in the CICON bitmap.
Just imagine how complicated it is to tell some bitmap-processing
application "do not use the lower 16 colors for this picture" just to be
able to use it with some silly AES which is part of some silly OS.
Do you get it?

The 16-pen rule makes things complicated; without it, everything is easier.
The OS would handle the 16-pen protection on its own, without giving user
applications any way to change it.

best regards