I watched the N-Judah Muni bus whiz by last night in San Francisco’s Sunset district. I could easily have caught it if I’d broken into a quick jog, but with two buddies behind me, I figured it wasn’t worth the collective effort.
Still waiting 25 minutes later in a chilly, damp fog, I regretted my decision.
If I could have frozen time at that “make or break” moment, I would have:
- Pulled out my iPhone
- Tapped my NextBus bookmark and navigated through no fewer than four links to find the right stop and check the arrival time of the next bus (the Routesy iPhone app might be faster)
- Convinced my two friends to run!
In reality, that would have taken at least two minutes, and the bus would have been long gone. So…we needed an ambient computing technology that understood my intention to catch that bus and delivered the “run or don’t run” verdict in a split second. Complex stuff, but a reasonable guess could be made from these tidbits of information:
- I took public transport earlier in the day to the Bluegrass festival in Golden Gate Park
- I just finished dinner and at 10:30 PM, was likely heading home (in fact, I told that to my wife on the phone just minutes before)
- Neither of my two friends had a car
- We were walking towards a popular bus route
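The tidbits above amount to a simple inference problem: combine a few context signals into a confidence that we intended to board. As a rough illustration, here is a minimal sketch in Python; the signal names, weights, and scoring are all my own illustrative assumptions, not any real system’s API.

```python
from dataclasses import dataclass

# Hypothetical context signals an ambient system might observe.
# Field names and weights are illustrative assumptions only.
@dataclass
class Context:
    rode_transit_today: bool    # e.g., the earlier trip to Golden Gate Park
    likely_heading_home: bool   # late hour, "heading home" said on the phone
    group_has_car: bool         # does anyone in the group have a car nearby?
    walking_toward_route: bool  # trajectory points at a known transit route

def intent_to_catch_bus(ctx: Context) -> float:
    """Rough confidence (0 to 1) that the group intends to board the bus."""
    score = 0.0
    if ctx.rode_transit_today:
        score += 0.2
    if ctx.likely_heading_home:
        score += 0.3
    if not ctx.group_has_car:
        score += 0.2
    if ctx.walking_toward_route:
        score += 0.3
    return score

# The evening described above: every signal fires, so confidence is high
# enough to justify checking live arrival times.
ctx = Context(rode_transit_today=True, likely_heading_home=True,
              group_has_car=False, walking_toward_route=True)
print(intent_to_catch_bus(ctx))  # high confidence
```

A real system would learn these weights from behavior rather than hard-code them, but the point stands: none of these signals is exotic, and together they make the intent nearly unambiguous.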
I’m not sure what this ambient technology might look like, but it probably involves my phone, some communication protocol with the incoming bus, and maybe some supplied context on my part. Probably not through high-effort keyboard input, but maybe a quick voice command like “taking the N-Judah home”.
I’d hope for a response like “Run now! The next bus won’t arrive for 30 minutes”, or maybe an orb-like display, color-coded green for “take your time” and red for “run now!”.
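Once the system knows your distance to the stop and the bus’s predicted arrival, the green/red call is just arithmetic over walking and running speeds. A hedged sketch, with speeds, the safety buffer, and the function name all chosen by me for illustration (a real version would pull the ETA from a live arrival-prediction feed like NextBus):

```python
def run_or_wait(distance_m: float, bus_eta_s: float,
                walk_speed_mps: float = 1.4, run_speed_mps: float = 3.0,
                buffer_s: float = 10.0) -> str:
    """Decide the orb color: can we make the bus walking, only by
    running, or not at all? Speeds and buffer are assumed defaults."""
    if distance_m / walk_speed_mps + buffer_s <= bus_eta_s:
        return "take your time"      # green orb
    if distance_m / run_speed_mps + buffer_s <= bus_eta_s:
        return "run now!"            # red orb
    return "wait for the next one"   # the moment I lived through

# 150 m from the stop:
print(run_or_wait(150, 180))  # walkable in ~107 s -> "take your time"
print(run_or_wait(150, 70))   # jogging takes ~50 s -> "run now!"
print(run_or_wait(150, 40))   # not even a sprint saves you
```

The hard part isn’t this calculation; it’s everything upstream, inferring the intent and fetching the prediction fast enough that the answer arrives at the “make or break” moment instead of two minutes later.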