Wednesday, June 3, 2009

Buttons in Physical and Virtual Environments

While using the self-service postage machine at the Post Office today, I noticed what appeared to be a pretty high latency between when I punched a button on the keypad (a real keypad, not touchscreen) and when my action was acknowledged and processed. The feedback came in the form of a "chck" sound, much like an old typewriter.

Eventually, I realized that the "latency" arose because button presses were not registered until my finger came up, as opposed to when the key was depressed. The system was behaving exactly how it was designed, but it presented a false sense of latency and sluggishness.

Why?

Event handling on the automated postage machine was that of a virtual button, where click events are not fired until the button is released (think: onmouseup). Conversely, a physical button's click event is fired when it's pressed (think: onmousedown). Don't believe me? Open up TextEdit/Notepad and press a key on your keyboard. Imagine if the keypress didn't register until the button popped up; wouldn't that be annoying as hell?

Here's an example. The following button uses the virtual button event paradigm:
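(The live demo button originally embedded here doesn't translate to plain text, so here's a rough sketch of its behavior as a tiny state machine. Function and event-handler names are illustrative, not from the original post.)

```javascript
// Virtual-button paradigm: the click is committed only on release
// (think onmouseup), and dragging off the button before releasing
// cancels the press entirely.
function makeVirtualButton(onClick) {
  let armed = false; // true while the pointer is down and still over the button
  return {
    mousedown() { armed = true; },    // pressing arms the button
    mouseleave() { armed = false; },  // moving off the button cancels the press
    mouseup() {                       // releasing fires the click, if still armed
      if (armed) {
        armed = false;
        onClick();
      }
    },
  };
}
```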



Notice how you can move the mouse off of the button before releasing it, and the event is cancelled.

This button uses the physical button event paradigm:
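(Again, the embedded demo is gone, so here's the contrasting sketch. Names are illustrative.)

```javascript
// Physical-button paradigm: the click fires the moment the button is
// pressed (think onmousedown), so there is no way to back out.
function makePhysicalButton(onClick) {
  return {
    mousedown() { onClick(); }, // fires immediately on press
    mouseup() {},               // release is irrelevant
    mouseleave() {},            // moving away changes nothing
  };
}
```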



Kind of annoying, huh?

Using the physical paradigm in a virtual environment doesn't give the user a way to change their mind. Using the virtual paradigm in a physical environment creates a false impression of latency.

Why do I think about stuff like this? I'm weird, I guess.
