06 July 2011

SIGGRAPH Gaming Content Revealed: Part Two

Following is the second entry of a three-part series focusing on gaming content that will be featured at SIGGRAPH 2011. This is a conversation between Jason RM Smith (pictured at the right) and ACM SIGGRAPH Chapter Reporter Mariebeth Aquino.

After successfully leading the SIGGRAPH 2010 Real-Time Live! program, Jason RM Smith continues his legacy as Chair and Director for 2011 and 2012. In his life outside SIGGRAPH, he is Digital Production Supervisor at LucasArts San Francisco.

Background details: This year's program features a substantial amount of game development content: game papers, courses, talks (technical, studio, and exhibitor talks), workshops, and technical papers. In addition, game content can be found in the Computer Animation Festival, where Real-Time Live! showcases the latest trends and techniques in games. And for those wanting a hands-on experience, the Sandbox provides the ideal environment to test drive the latest in interactive entertainment.

A Conversation with Jason RM Smith, SIGGRAPH 2011 & 2012 Real-Time Live! Director and Chair


What are your responsibilities as Director and Chair of Real-Time Live!?
It’s actually a combination of things, which is what makes the role so interesting.

Firstly, I’m accountable for ensuring Real-Time Live! and the Sandbox are each a success in their own right at SIGGRAPH 2011. Fortunately I get to work alongside a bunch of super smart people who know how to organize an amazing conference, which makes my job much easier.

Secondly, I get to move the two events forward in terms of their placement and recognition within SIGGRAPH; they are both still relatively new compared to many other areas of the show, so a goal for 2011 was to establish them in their own right, building on the great work done by Evan Hirsch in previous years.

The last major goal was to broaden the content (and hopefully the interest) in RTL! Computer graphics has such a great community that it was relatively easy to establish a subcommittee who provided outreach across a wide range of industries and groups that are pushing the boundaries of real-time graphics.

The subcommittee was almost too successful – we ended up with so many strong submissions deserving of the stage that it was a real challenge to keep the show within its allocated time slot.

What is the Sandbox? What was the vision behind it and what role does it play at SIGGRAPH?

The vision behind the Sandbox is very straightforward. Real-Time Live! is all about live presentations from the artists and engineers responsible for the work, and the Sandbox was set up to allow SIGGRAPH attendees to try these projects hands-on, for themselves. (I’m also using it to squeeze in additional content because there were so many great submissions; be sure to check out the handful of projects in the Sandbox you won’t find anywhere else at the show.)
It’s also a really nice change of pace; after a few hours of lectures, it’s fun to exercise the physical reflexes a little. Whether you want to race cars or interact with virtual jellyfish, the Sandbox has something for everyone.

With the rapid evolution and advancements in technologies such as real-time rendering, are games using top-notch technologies as impressive to gamers as they are to game developers? Are users more aware or appreciative of what goes on behind the scenes to create their games of choice?

You know, I’m constantly amazed by the number of gamers and communities who are extremely savvy about real-time technology purely through their interest in the medium. They represent a minority of players, but their depth of knowledge is just incredible.

Looking at the gaming audience in general, I think it’s much more a question of how the technology contributes. If developers use breakthroughs to drive a more holistic, compelling experience overall, there is no doubt the work involved is appreciated, even where the exact areas of improvement aren’t obvious to most people playing the game. Ideally, we want to focus 100 percent of the player’s attention on the situation we’re presenting on screen. In many ways, if the technology behind the scenes is obvious or demands attention as an element, then we have failed to deliver the most immersive experience possible.

In-game cinematics temporarily take gamers out of the active play role and place them in a more passive one. With large game development studios placing high importance on the quality of these cinematics, do you foresee a shift towards maintaining player involvement by making these segments feel more active?

Yes; there’s no doubt that developers are becoming more comfortable delivering meaningful action and narrative to garner an emotional response from the player using real-time sequences. As this becomes standard, our understanding of cinematics will be challenged as the boundaries between interactive and non-interactive disappear.

As an industry, we’re just beginning to understand how to balance multiple levels of interactivity to deliver a rewarding experience that sits comfortably between storytelling and retaining an immersive experience. The trend has started; it’s interesting to see where it will lead over the next couple of years.
Which is the stronger driving force:
- Video game hardware technology or video game software technology?
- The artist's craft to create video game content or the programmer's knowledge of the latest render technology?

None of them; or all of them. It’s a trap!

What makes a creative industry successful can be broken down into many component pieces, but ultimately the one thing that drives all of those areas is innovation. It’s not always about doing something new; using the same pieces in a new way can be just as progressive. It’s the underlying creativity that makes the difference, whether it affects one or all of the areas above.

