For Induction, Computing students took part in a Carnival style fun day, where they collected tickets and won prizes!
Sam Pugh, Damon Stock, Daniel O'Neil, Glynn Merryweather, Olivia Tuppen, April Gwynne, Joe Maynard, Alice Perkins - Games Design
Toby Farrier, Dan George, Oliver Osei-Ofosu, Jason Farrier - Forensic Computing
Jade Byrne, Stuart Carter, Bradley Warren, Kane Whelan - Multimedia Web Design
Kieran Scott, Luke Cutuan, Thomas Jaggs - Product Design
Liam Harris, Jack Mills, Emmanuel Tresor Siebadji- Computing
Sepideh - Cyber Security and Chris Zielazny - Business IT (all model release forms signed - in folder)
Hosted in collaboration with Google's CS4HS initiative, the MIT Creative Computing 2012 workshop was held at the MIT Media Lab, August 8-11, 2012.
Students from the School of Computing showed off their innovative ideas during the annual Creative Computing Showcase in the Biosciences Complex on April 6. (Photo by Stephen Wild) More on Flickr…
Sharir is Professor of Dance and Technology at the University of Texas, and Yei has collaborated with him for the past two years; previously Sharir collaborated with a team of architects in Salt Lake City.
Sharir and Yei demonstrated three iterations of wearable computing suits which they have developed over the past two years. Sharir said: "Issues relating to human/virtual interactions have been around for a long time, but we have never mastered the technology." He moved on to show video footage from a project using the first wearable suit developed with Wei Yei, entitled The Automated Body Project. This featured the pair's cyber-suit, "which collected data from the wearer, including EEG information, talked to a 'mothership' and returned, via radio-frequency communications, an image representing the data. You could move your eyes and the image would move, or extend an arm and the image would extend - the images were projected on a transparent screen. A dataglove also let me manipulate that material in real-time."