Ambient informatics through the rearview mirror

In 1998 I was nearing completion of the grad program at Georgia Tech in Information Design and Technology (now called Digital Media), cutting my teeth on the theory and practice I use to this day. But some of it, like the project below for a course in Human-Computer Interaction taught by Greg Abowd (basically this class), only seems really meaningful nearly 12 years on.

Sonopticon was a team project to build a prototype of an automobile-based ambient sensing and heads-up display. We didn’t have to build a car that knew its surroundings — this was HCI, after all — but we did have to explore the issues of what it would be like from a driver’s perspective.

My wife and I took the car out one day (this is how you do anything in Atlanta) and filmed scenarios for later editing in After Effects. The RealVideo files (!) are gone, but some screenshots still exist, which I have strung together below. The quality of the overlays is laughable, really, but they convey some interesting concepts that are only now becoming technically feasible. If the city of data really is coming into being, this is part of it.

And just because I’m channeling 1998 I’m gonna lay this out in one big honkin’ table. Take that, CSS absolute positioning! (Best viewed in Netscape 3.0.)

ignition.jpg: Ignition
activated.jpg: Sonopticon activated
allclear.jpg: Mirror check
backup_caution.jpg: Caution avoidance alert
enter85.jpg: Entering I-85
cancel.jpg: Active noise cancellation
emerg1.jpg: Emergency vehicle detected
emerg2.jpg: Visual confirmation
emerg3.jpg: Vehicle passes
construction.jpg: Upcoming construction
blind_clear.jpg: Blind spot check
blind_alert.jpg: Vehicle moves into blind spot
truck.jpg: Visual confirmation
satisfied_user.jpg: A satisfied user

What’s funny to me all these years on is how my focus has shifted so decidedly away from augmenting the automobile toward enabling an informatics of the human-scale city, pretty much the opposite of what the car has done to our metro regions. Though I suppose making cars more aware of their surroundings is one step toward this vision.

The full project write-up is here, if you are so inclined. I think we got an A.

(By the way, the car used in this demo is the one-and-only MySweetRide.)