When I was a student, and later faculty, at the MIT Architecture Machine Group, we did a lot of things that were considered crazy and irresponsible at the time, such as giving powerful computers to mere undergraduates and building displays that allocated an entire bit of memory to a single pixel on the screen. Moore's Law is now accepted wisdom and the personal computer a part of everyday life, but the spirit of innovation lives on at what the Arch Mach became, the Media Lab, which celebrated its 25th anniversary last week.
What started as a small band of researchers on the fifth floor of Building 9 is now more than 500 faculty, staff, and students filling two buildings at the east end of the campus. While the original mission of building tools for architects expanded into the general field of human-computer interaction, in some ways the Media Lab has returned to its roots: building computation into the three-dimensional environment in which we live, and doing so in a three-dimensional space that encourages interaction between multiple disciplines and research organizations. While the original building housed researchers in the kind of dark, enclosed spaces they needed for viewing dimly lit computer screens, the new building is full of light and glass - more like an artist's studio than a computer lab. Much of the research has moved beyond the two-dimensional computer screen to engage the physical world in form and motion, much as the Arch Mach's original Seek project did in 1970. (photo at right)
Among the projects on display:
- A camera that can see around corners.
- An inexpensive piece of plastic that turns an ordinary mobile phone into a device for measuring refractive errors in the eye.
- A "bathroom mirror" and related video technology that can measure pulse rate, detect smiles, and measure moods.
- Socially intelligent personal robots.
- Endless forms of tangible media and uses of materials to interact with computers through light, sound, touch, clothing, and crafts of all types.
I also got a chance to chat with John Moore, an MD now working in the New Media Medicine Group building the next generation of tools to involve patients in their own health care. I saw more of what he presented at Health 2.0 a few weeks ago, which combines data visualization with communication tools to give patients a better idea of the consequences of their actions. For example, instead of hectoring patients to take their medication, it shows an animation of what's going on in their blood, displayed on anything from a PC to a Furby. It will be really interesting to see how this works with the sensing technology coming from the Affective Computing group downstairs from them.
Some photos of the event are here.