I’ve recently ordered yet another gadget – an oversized Android tablet. The intent is to fill a role somewhere between my laptop, e-reader and phone, but also to provide more work area (the main reason I always want higher resolution displays). There just isn’t enough room on two monitors, and I’ll never quite understand why Dell saw fit to put seven video outputs but only two display controllers in the laptop.
My initial, simple idea for how to do this is to use a larger framebuffer and VNC to display an off-screen section of it over the network. Or perhaps distributed multi-head X. I might have to tweak my window manager’s idea of what screens are a bit, but it should fit neatly into the existing Xinerama support. That should cover getting a picture up as a third screen.
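As a minimal sketch of that first idea (assuming an X server with RandR support and x11vnc available; the geometries below are placeholders, not my real monitor sizes), the framebuffer can be grown past the physical outputs and only the invisible strip served to the tablet’s VNC client:

    # Grow the virtual framebuffer beyond the two physical heads
    xrandr --fb 5120x1080

    # Export only the off-screen 1280x800 strip; -clip takes WxH+X+Y
    x11vnc -clip 1280x800+3840+0 -forever -shared

Whether panning behaves itself with a framebuffer larger than the visible heads is something I’d have to test before trusting it for real work.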
It doesn’t quite cover another issue that has been bothering me: the controls of my windowing system aren’t aging well. With the Alphagrip I’m already finding the super-shift-digit binding for moving windows impossible, and the tablet won’t have any keys at all when I’m using it away from the work terminal itself (unless it’s in a dock). So it’s time to look at other schemes, like tagging windows and using gestures.
A few programs have their own gesture support, such as Xmonad, Blender, gschem, Epiphany and Firefox (some of those only through extensions). But we can do better, and I believe I shall try easystroke – a gesture recognition tool that can send custom commands to other programs. It’s not proper TUIO control (which would support multitouch), but it’s a start.
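The actual easystroke configuration lives in its GUI, but the commands a stroke would fire could be ordinary one-liners, which is roughly what I have in mind for replacing those super-shift-digit bindings (the workspace numbers here are only illustrative):

    # Stroke "up-right": send the focused window to workspace 2 (wmctrl counts from 0)
    wmctrl -r :ACTIVE: -t 1

    # Stroke "down": emulate the old keybinding instead, for things that handle it themselves
    xdotool key super+shift+2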