27 May 2011

Get Pregnant at SIGGRAPH 2011



Well, not literally, but you certainly can experience what it feels like to be pregnant at SIGGRAPH 2011 this August in Vancouver. "Mommy Tummy: A Pregnancy Experience System Simulating Fetal Movement" is one of 23 interactive Emerging Technologies that will be featured at SIGGRAPH.


This interactive system simulates pregnancy through a Mommy Tummy jacket that conveys the fetus's temperature, movement, and heartbeat. By rubbing the jacket, the wearer can communicate with the fetus. Within a few minutes, the jacket's weight and size change, simulating fetal growth over nine months. An auxiliary screen displays the condition of both the fetus and the mother in each simulated month with a 3D model of the fetus. The fetus's moods and activity are displayed in a natural manner to simulate normal fetal development.


There are two types of fetal movement: kicking and wiggling. According to the inventors, simulating a kick is easier than simulating a wiggle. The wiggling sensation was achieved with a technique inspired by phantom sensation (PhS), the tactile funneling illusion first described by Georg von Békésy, in which simultaneous vibration or electrical stimulation at two points is perceived as a single sensation at a point between them. Using PhS, Mommy Tummy simulates wiggling by driving multiple air actuators continuously with small temporal offsets.
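The funneling illusion described above can be sketched in a few lines. This is a hypothetical illustration, not the researchers' code: it assumes simple linear amplitude panning between two actuators, which shifts the perceived phantom point toward the stronger one, and sweeps that point back and forth to suggest a wiggle.

```python
import math


def phantom_amplitudes(p, a_max=1.0):
    """Amplitudes for two actuators that place a phantom vibration
    at normalized position p in [0, 1] between them. Linear amplitude
    panning: the perceived point shifts toward the stronger actuator
    (the tactile funneling illusion)."""
    return a_max * (1.0 - p), a_max * p


def wiggle_path(n_steps):
    """Sweep the phantom point sinusoidally across the actuator pair
    to suggest a continuous wiggling motion."""
    return [0.5 + 0.5 * math.sin(2 * math.pi * t / n_steps)
            for t in range(n_steps)]
```

In a real system each entry of `wiggle_path` would be fed to `phantom_amplitudes` at successive time steps, producing the temporally displaced actuation the inventors describe.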


When the user moves violently, the fetus enters a bad mood state and makes intense, jerky movements. Conversely, when the user caresses the abdomen, the fetus enters a good mood state and makes steady movements. Heavy physical exercise is discouraged until a stable period is reached. We wonder if a birth simulation is next up for these researchers - who are listed below.


For a preview video of many of these intriguing Emerging Technologies, click here.

Presented by:

Takayuki Kosaka, Kanagawa Institute of Technology; Takuya Iwamoto, Japan Advanced Institute of Science and Technology; Robert Songer, Kanazawa Technical College; Junichi Akita, Kanazawa University; Hajime Misumi, Kanagawa Institute of Technology

26 May 2011

SIGGRAPH 2011 Emerging Technologies: Interacting with the Future

The SIGGRAPH Emerging Technologies program is home to the latest developments in technology, including haptics, displays, robotics, and artificial intelligence. This year will feature 22 of the latest innovations selected by a jury of industry experts from more than 100 submissions. Topics range from displays and input devices to collaborative environments. SIGGRAPH 2011 takes place 7-11 August at the Vancouver Convention Centre.

A preview video of the SIGGRAPH 2011 Emerging Technologies program is available here.


“The SIGGRAPH Emerging Technologies program is unique in its interactive approach that allows people to experience the most cutting-edge developments first-hand,” said Cole Krumbholz, SIGGRAPH 2011 Emerging Technologies Chair and co-founder of Koduco Games. “This year, conference attendees will experience the latest achievements from industry and university research labs.”

Featured highlights from the SIGGRAPH 2011 Emerging Technologies:





Face-to-Avatar
Hiroaki Tobita and Shigeaki Maruyama; Sony Computer Science Laboratories, Inc.

This floating avatar system integrates a blimp with a virtual avatar to create a unique form of telepresence. The blimp carries output equipment, including a projector and a speaker: users communicate with others by presenting their facial images through the projector and their voices through the speaker. A camera and microphone attached to the blimp provide the input and support the user's operation from a distance.

The user’s presence is more dramatic than a conventional virtual avatar (CG and image) because the avatar is a physical object and moves freely in the real world. In addition, the user’s senses are augmented because the blimp detects dynamic information in the real world. For example, the camera provides a special floating view to the user, and the microphone collects a wide variety of sounds such as conversations and environmental noise.

Potential Future Use: Allows the user to have a moveable, physical presence in the real world from even the most remote location.



Volumetric Display Based on Vibrating Mylar Beam Splitter and LED Backlit LCD
Lanny Smoot, Quinn Smithwick, and Daniel Reetz; Disney Research

This new volumetric display produces full-color, high-spatial-resolution aerial images in front of the apparatus. It is based on a new optical element: the large, tunable-resonance, edge-driven, varifocal beam splitter.

This new display technology uses a circular Mylar beam splitter with a tension-adjusting metal hoop pressed against its surface. The beam splitter is tuned, with high Q, to a specific resonance frequency. Three rim-mounted impulse drivers apply a low-amplitude sinusoidal drive; because of the high Q, the diaphragm's sympathetic vibration is large. The beam splitter folds the optical path, and the system includes a fixed-curvature concave mirror to create real images that appear out in front of the apparatus.

It produces high-quality 3D images that occupy a one-third-meter cube positioned one-third meter in front of the apparatus, viewable over a 30-degree viewing angle.
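The principle behind a vibrating varifocal display can be sketched simply: as the element vibrates sinusoidally, the focal depth sweeps through the volume, and the display must show the depth slice that corresponds to the instantaneous phase. The mapping below is an illustrative guess at such a scheduler, not Disney Research's implementation.

```python
import math


def slice_for_phase(phase, n_slices):
    """For a varifocal element vibrating sinusoidally, focal depth
    sweeps as sin(phase). Pick which of n_slices depth slices to
    display at this instant so each slice lands at its proper depth.
    Depth is normalized to [0, 1] across the display volume."""
    depth = 0.5 * (1.0 + math.sin(phase))  # 0 = nearest, 1 = farthest
    return min(int(depth * n_slices), n_slices - 1)
```

Synchronizing a fast LED-backlit LCD to this schedule paints each slice at a different apparent depth within one vibration cycle, which the eye fuses into a volume.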

Potential Future Use: Advancements in 3D displays will impact many fields from medical research to gaming.



A Medical Mirror for Non-Contact Health Monitoring
Ming-Zher Poh, Harvard-MIT Division of Health Sciences and Technology; Daniel McDuff and Rosalind Picard, MIT Media Lab

Digital medical devices promise to transform the future of medicine with their ability to produce exquisitely detailed individual physiological data. As ordinary people gain access to and control over their own physiological data, they can play a more active role in diagnosing and managing their health. This revolution must take place in our everyday lives, not just in the doctor’s office or research lab. This project starts in the home environment by transforming everyday objects into health-sensing technology.

The Medical Mirror is a novel interactive interface that tracks and displays a user’s heart rate in real time without the need for external sensors. Currently, collection of physiological information requires users to strap on bulky sensors, chest straps, or sticky electrodes. The Medical Mirror allows contact-free measurements of physiological information using a basic imaging device. When a user looks into the mirror, an image sensor detects and tracks the location of his or her face over time. By combining techniques in computer vision and advanced signal processing, the user’s heart rate is then computed from the optical signal reflected off the face. The user’s heart rate is displayed on the mirror, allowing visualization of both the user’s physical appearance and physiological state.
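The measurement idea above (remote photoplethysmography) can be sketched minimally: average the green channel over the tracked face region each frame, then find the dominant spectral peak in the plausible cardiac band. This is a simplified illustration; the published system applies more advanced signal processing than a single-channel spectral peak.

```python
import numpy as np


def heart_rate_bpm(green_means, fps):
    """Estimate heart rate from the mean green-channel intensity of
    the tracked face region over time. Remove the DC component, then
    take the dominant spectral peak in the cardiac band 0.75-4 Hz
    (45-240 bpm)."""
    x = np.asarray(green_means, dtype=float)
    x = x - x.mean()                          # remove DC offset
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    band = (freqs >= 0.75) & (freqs <= 4.0)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return 60.0 * peak_hz
```

Ten seconds of 30 fps video gives 0.1 Hz (6 bpm) spectral resolution, which is why longer observation windows yield steadier readings.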

This project illustrates an innovative approach to pervasive health monitoring based on state-of-the-art technology. The Medical Mirror fits seamlessly into the ambient home environment, blending the data collection process into the course of daily routines. It is intended to provide a convenient way for people to track their daily health when they use the mirror for shaving, brushing teeth, etc.

Potential Future Use: This device allows for easy and much more sophisticated everyday health monitoring.



Telenoid: Tele-Presence Android for Communication
Kohei Ogawa, Shuichi Nishio, Kensuke Koda, Koichi Taura, Takashi Minato, Carlos Toshinori Ishii, Hiroshi Ishiguro; ATR Intelligent Robotics and Communication

This new system of telecommunication focuses on the idea of transferring human “presence”. Telenoid's minimal human form conveys the impression of human existence at first glance, but it suggests nothing about personal features such as gender or age. That minimal appearance allows people to use Telenoid to transfer their presence to distant places regardless of their own personal features.

Telenoid's tele-operation system is simple and intuitive. It can be controlled by even novice users. Its face-tracking system automatically captures the operator’s facial movements and expressions. Field tests revealed that most users tended to have a strange and negative impression of Telenoid in the beginning, but eventually they became comfortable. Elderly people had very positive feelings about Telenoid at first sight.

Potential Future Use: Telenoid provides a much more interactive and intimate experience than technology that is available today, such as Skype.



True 3D Display
Hidei Kimura and Akira Asano, Burton Inc.; Issei Fujishiro and Ayaka Nakatani, Keio University

This research team was the first to use laser-plasma technology for a true-3D display device that allows users to draw 3D images in midair. Now the team has developed a much more compact and precise display, called SRV (Super Real Vision)-5000, based on advanced laser technology. One remarkable feature of the new device is its enhanced drawing rate: from 300 points per second to 50,000 points per second. It displays 3D objects more faithfully in real time and broadens the range of possible applications.

Potential Future Use: Advancements in 3D displays will impact many fields from medical research to gaming.

A complete listing is available here.

SIGGRAPH Technical Paper Getting Media Attention

SIGGRAPH Conferences have long been the home to advancements in computer graphics and interactive techniques. SIGGRAPH 2011 - set in wonderful Vancouver - appears to be no different.

Check out the media buzz that one Technical Paper, "Depixelizing Pixel Art," is already creating:

20 May 2011

Highlights from SIGGRAPH 2011 Art Gallery: Tracing Home

The SIGGRAPH 2011 Art Gallery: Tracing Home presents exceptional digital and technologically mediated artworks that explore issues related to the concept of home in the networked age. From more than 300 submissions, the Art Gallery jury selected 16 pieces to be featured at SIGGRAPH 2011. These include 2D images, audio, and video, as well as novel data-driven and mixed-media installations. They explore “home” as both a conceptual category and a physical reality, often blurring the boundaries between the two.

"The interplay of physical and virtual within our lived experience enables simultaneous and discontinuous realities at the touch of a button, echo of a voice, or nudge of a sensor,” said Mona Kasra, SIGGRAPH 2011 Juried Art Chair. “This year’s Art Gallery connects attendees with a unique interactive approach to art that explores this new era of technology, from a platform that captures the story of current world disasters through tweets and stock exchange information to a computer system that allows attendees to remotely control physical aspects of a house in foreclosure.”

The Art Gallery jury includes a wide range of artists, designers, technologists, and critics hailing from academia, industry, and the independent art world. Works exhibited in the Art Gallery are published in a special issue of Leonardo, the Journal of the International Society of Arts, Sciences and Technology. Peer-reviewed SIGGRAPH 2011 Art Papers will also be published in this special issue, which coincides with SIGGRAPH 2011 in August.

SIGGRAPH 2011 Art Gallery Highlights include:

MOSTON - Anya Belkina, Emerson College

A 12-foot-tall suspended inflatable sculpture, MOSTON conjures a technology-driven amalgamation of Moscow and Boston with its three-dimensional form of mutated Russian nesting dolls and two-dimensional surface design of printed artwork and documentary footage projection.


tele-present wind - David Bowen, University of Minnesota

This installation consists of a field of x/y tilting devices connected to thin dried plant stalks installed in the gallery, and a dried plant stalk connected to an accelerometer outdoors. When the wind blows, it causes the stalk outside to sway. The accelerometer detects and transmits this movement in real time to the grouping of devices in the gallery.

Open House - Patrick LeMieux, Duke University; Jack Stenner, University of Florida

An installation that allows visitors to telematically squat in an actual Florida home undergoing foreclosure after the United States housing collapse. Virtual markets transformed this otherwise livable property into a ghost house. Prior to the collapse, movements of global capital seemed like a distant reality, but it was imaginary systems of value, not bricks and mortar, that asserted ultimate authority. Open House temporarily resists eviction by mirroring the market and creating hybrid subjects who occupy both virtual and physical space. Cross the threshold, open the door, flicker the lights, and rattle the shutters.


The Garden of Error and Decay - Michael Bielicky and Kamila B. Richter, HFG/ZKM Karlsruhe; programmer: Dirk Reinbold; sound: Lorenz Schwarz

A poetic visualization of real-time world catastrophes. In this data-driven narrative of current world disasters, the artist, Twitter users, and stock-exchange information all influence the storytelling.


Wait - Julie Andreyev and Simon Overstall, Emily Carr University of Art + Design

This interactive video installation interrogates human and canine communication methods. Taking cues from the visitor's movements, the video's interactions point to the relationship of control between human and animal.

View the entire Art Gallery online. Contact media@siggraph.org for high-resolution images. A preview video of the content will be available and posted here in the next few days.

18 May 2011

SIGGRAPH 2011 Technical Papers Preview Video is Live

SIGGRAPH just released the 2011 Technical Papers preview video. To view the video on YouTube, click here. A preview video of the Art Gallery and Emerging Technologies will be available in the next week or two.

04 May 2011

New Content Unveiled at SIGGRAPH 2011 Demonstrates Optimal Furniture Arrangement

Presenters from UCLA and the Hong Kong University of Science and Technology will unveil a new program that automatically designs the most desirable layout for pieces in various rooms based on pre-defined rules, such as "couches should face the television for optimal viewing" or "furniture should allow doors to open inwards". The system extracts spatial relationships among the objects and then optimizes the layout against those rules through a series of optimization steps; several different optimized final arrangements are possible.
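A layout optimization of this kind is typically driven by a stochastic search over a rule-based cost function. The toy loop below illustrates the general idea with simulated annealing; the cost function, move model, and schedule here are all illustrative guesses, not the authors' implementation.

```python
import math
import random


def optimize_layout(positions, cost, n_iters=5000, temp=1.0, cooling=0.999):
    """Toy simulated-annealing search over furniture positions.
    `positions` is a list of (x, y) tuples; `cost` scores a layout
    against pre-defined rules (viewing angles, clearances, etc.).
    Lower cost is better."""
    best = current = list(positions)
    best_c = current_c = cost(current)
    for _ in range(n_iters):
        cand = list(current)
        i = random.randrange(len(cand))          # perturb one item
        x, y = cand[i]
        cand[i] = (x + random.gauss(0, 0.1), y + random.gauss(0, 0.1))
        c = cost(cand)
        # Accept improvements always; worse moves with Boltzmann probability
        if c < current_c or random.random() < math.exp((current_c - c) / temp):
            current, current_c = cand, c
            if c < best_c:
                best, best_c = cand, c
        temp *= cooling                          # cool the schedule
    return best, best_c
```

Because the accept-worse step depends on temperature, different random runs settle into different low-cost layouts, which matches the observation that multiple optimized final outcomes are possible.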

The group has used the program to optimally arrange everything from a living room to a sewing factory to a Chinese restaurant. They creatively titled their paper "Make it Home: Automatic Optimization of Furniture Arrangement."

Here's an example for an ideal layout given the equipment and furniture located inside a sewing factory.

Before:

After:

Image Credit: UCLA and the Hong Kong University of Science and Technology.


To view their YouTube video, click here.