How we think about code (updated)

Darren was showing off his FingerWorks TouchStream keyboard today, and it sparked a discussion about how we think about code. Basically his keyboard is a giant touchpad that does the work of both a keyboard and a mouse, plus it detects complex finger movements and interprets them (e.g. as cursor movements, as cut / copy / paste, or as file-open / file-save).
This ability to perform gestures as well as type is what the discussion revolved around. Some people couldn't see what the fuss was about: to them code was just a stream of characters in a file, and no expensive gadget would change that. But others immediately started raving about how gesturing at their code (or even 'in the code') let them interact with it in a way that felt natural. It was as if their internal 3D model of the codebase had been brought to life, e.g. performing an extract method refactoring by reaching into one method and physically pulling part of it out.

The discussion moved on to programming with data gloves and other sorts of input devices, but to me the most interesting thing was hearing developers talk about visualising code, as if they can translate O-O language concepts directly into imaginary solid things.
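For anyone who hasn't met the refactoring mentioned above, here's a minimal sketch of extract method in Java; the class and method names (OrderPrinter, printOwing, printBanner, printDetails) are invented purely for illustration, not taken from the discussion:

    // Before: printOwing mixes banner printing with the details logic.
    class OrderPrinter {
        void printOwing(double amount) {
            System.out.println("*****************");
            System.out.println("**** Customer ****");
            System.out.println("*****************");
            System.out.println("amount: " + amount);
        }
    }

    // After: the details logic has been "pulled out" into methods of its own,
    // which is roughly the gesture people were describing.
    class RefactoredOrderPrinter {
        void printOwing(double amount) {
            printBanner();
            printDetails(amount);
        }

        void printBanner() {
            System.out.println("*****************");
            System.out.println("**** Customer ****");
            System.out.println("*****************");
        }

        void printDetails(double amount) {
            System.out.println("amount: " + amount);
        }
    }

In an editor this is a menu command or keyboard shortcut; the appeal of the gesture version was that the same transformation felt like grabbing a solid object.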