
[MiNT] Am I an idiot?



OK - someone PLEASE explain to me how this is supposed to work with 16
bit scancodes!

On Fri, 2005-04-01 at 08:03 +0200, Adam Kłobukowski wrote:
> As a compromise we could return two words: a scancode in the first,
> ASCII + modifiers in the 2nd, how does that sound?

short mt_evnt_multi (short Type, short Clicks, short WhichButton,
                     short WhichState, short EnterExit1, short In1X,
                     short In1Y, short In1W, short In1H,
                     short EnterExit2, short In2X, short In2Y,
                     short In2W, short In2H, short MesagBuf[],
                     unsigned long Interval, short *OutX, short *OutY,
                     short *ButtonState, short *KeyState, short *Key,
                     short *ReturnCount, short *global_aes);

Ok ... if I understand right, for the new KEY-UP/KEY-DOWN stuff, we're
to have the scancode as 16 bits in "Key", and the ASCII code presumably
in the upper 8 bits of "KeyState" (so a shift is needed to use it) ...
since I see no other place for it and that seems to be what was
suggested - please tell me where the new 16 bit scancode goes!  Or is
evnt_multi() itself being replaced?  Either separate events for KEY-UP
and KEY-DOWN, or a bit in the lower 8 bits of KeyState to say whether
the key is up or down - either is fine.  Likewise, the state of
CONTROL/ALT/SHIFT can be marked in "KeyState" in addition to reporting
the scancode as above, again with either the separate Key Up/Down
events or an extra bit in KeyState (so a CONTROL press might set 2 bits
in KeyState under the extra-bit approach).
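
Just to make that concrete, here is roughly what client code would have
to look like under my reading of the proposal.  None of these mask
names or bit positions are defined anywhere yet - they are my own
guesses at what was suggested:

/* Illustrative only: my reading of the proposed encoding.  Both masks
 * below are hypothetical. */
#define KS_ASCII_MASK  0xFF00   /* ASCII moved to KeyState's upper byte */
#define KS_KEYUP       0x0010   /* made-up up/down flag in the low byte */

void handle_proposed_keybd(short key, short kstate)
{
    unsigned short scan = (unsigned short)key;   /* full 16 bit scancode */
    unsigned char ascii =                        /* shift now required   */
        (unsigned char)(((unsigned short)kstate & KS_ASCII_MASK) >> 8);
    int key_up = (kstate & KS_KEYUP) != 0;       /* key up or key down?  */

    /* ... */
}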

What I take issue with is that for regular MU_KEYBD events, "Key" has
the ASCII value in the lower 8 bits, and the scan-code in the upper 8
bits.  This means applications get the ASCII value of the key
translated for them by the OS, with no additional work - no
keyboard-table lookup is needed to use the value in a string.  You only
have to look at the less-portable scancode when the ASCII value is 0.
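
For comparison, this is all a client needs today (standard GEM
behaviour; the K_* modifier masks are the usual AES header
definitions):

/* Decoding a classic MU_KEYBD report: ASCII in the low byte of Key,
 * single-byte scan-code in the high byte, modifiers in KeyState. */
void handle_keybd(short key, short kstate)
{
    unsigned char ascii = (unsigned char)(key & 0xFF);        /* translated char */
    unsigned char scan  = (unsigned char)((key >> 8) & 0xFF); /* 8 bit scan-code */

    if (ascii != 0)
    {
        /* printable: drop it straight into a string, no lookup */
    }
    else
    {
        /* cursor/function key etc.: now consult scan, plus the
         * K_LSHIFT/K_RSHIFT/K_CTRL/K_ALT bits in kstate */
    }
}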

Breaking this convention doesn't seem like a good idea to me,
especially since in Bconin(), Cconin(), Keytbl(), KEYTBL.TBL,
evnt_keybd(), MU_KEYBD, and just about everywhere else in the OS, the
scan-code is a single BYTE.  So, I want to know if we plan on changing
all these calls for consistency, right down to having the keyboard
driver send 16 bit scan-codes, or on adding a handful of new extensions
that everyone must use to get the new 16 bit value.  Neither seems like
a good idea to me when the required functionality can be had with
minimal changes that don't break existing conventions.
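
To illustrate how far the single-byte convention reaches: even down at
the BIOS, Bconin() packs its result the same way - ASCII in the low
byte, one BYTE of scan-code in bits 16-23 (and the shift state in bits
24-31 if the conterm bit for it is set).  Widening the scan-code means
touching all of this too:

#include <osbind.h>   /* Bconin() binding, header name as in the usual
                       * TOS C libraries */

void read_console_key(void)
{
    long c = Bconin(2);   /* device 2 = console (CON:) */
    unsigned char ascii = (unsigned char)(c & 0xFF);
    unsigned char scan  = (unsigned char)((c >> 16) & 0xFF);

    /* bits 24-31 may hold the shift state when conterm bit 3 is set */
}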

If I'm an idiot and missing something, please explain it to me in
detail!

Thank you,
Evan