Sunday, December 18, 2011

The Gathering StormCLOUD

It's funny how resistant we usually appear when it comes to change, yet how quickly we adapt when said change slips in under the radar.  I remember when music was undergoing the shift from physical media to downloadable.  I asked one of my college classes what they thought about this trend.  A full 80 percent of the class indicated they would never download their music because they really liked (among other reasons) owning a physical copy.  The following year I asked the same question and the number of die-hard supporters of CDs dropped.  By this time, iTunes was beginning to take hold.   Downloading music was suddenly safe, simple, and not terribly expensive.

When games began being offered as downloads instead of discs, I was met with the same resistance.  My students would prefer to own a physical copy, or the amount of data to download was too large.  Once Steam worked out all of its kinks and began to offer a huge selection of games at competitive prices, my students eagerly jumped on the bandwagon.

Here we are now at the onset of a cloud revolution where games, music, video and more are being made available via a stream.  You don't need to wait for something to download, you just log in and go.  While my students were quick to embrace the likes of Netflix, cloud gaming services like OnLive remain in question.  The reason?  You guessed it.  My students prefer "owning" their digital copies.  They feel safer knowing their game is located on their hard drive and not "in the cloud."

So how long will it take for us to adapt to this new way of consuming media?  My guess is, if OnLive can make a compelling case for ease of use and put games at a competitive price point, people will make the jump.  It would speed things up if Microsoft, Sony, and Nintendo stepped up with their own cloud services.  I don't know what they are planning, but they would be nuts not to move in this direction.

Sunday, November 20, 2011

You suck at watching television

Dear Hardcore Gamers,

You know I love you guys.  We have had more than our fair share of laughs at the expense of the hordes of Dark Elves and Space Marines who have stood between us and our goal.  We've discussed at length the wonders of games like Shadow of the Colossus, Portal, and Civilization, all the while wondering why more people, non-gamers especially, haven't cared to play them or even heard of them.  I've got a fair amount of respect for you, my brethren.  But listen, there's something you need to know.  You positively suck at watching television.

I'm not saying it to be cruel. Really I'm not.  I just think that perhaps all that game playing has clouded your perspective of what other media do well.  Let's take a recent example of gamer nerd rage against a popular television series: The Walking Dead.  I'm not here to proclaim that this show is the greatest thing ever, but it's fairly obvious to any moderately intelligent consumer of TV what The Walking Dead is and what it isn't.

The Walking Dead is a character drama about a group of survivors during a zombie apocalypse.  It's about how the people on the show react when everything they know and understand is taken away.  Walking Dead is not actually about the zombies, and it is most certainly not about how a group of people will devise some clever way to dispatch said zombies each and every episode.  Even the graphic novel the show is based on focuses less on the zombies themselves and more on the seemingly directionless travels of the survivors.  It's almost like the survivors are actually *GASP* the titular Walking Dead.  Their life is over.  They just don't seem to know it, yet.

It ain't rocket science, people.  So why the continued criticism that the show isn't delivering the requisite number of zombie encounters per episode?  Why do people scoff when the survivors don't act rationally and fail to execute a perfect set of plans?  After all, if we encountered a zombie apocalypse, we'd totally have our shit together, right?  Right?

I think what's going on here is a contingent of genre-loving nerds have forgotten that linear drama doesn't play out like a video game.  In a game, players are expected to strategize and execute their plans perfectly in order to progress.  If you fail, it's no big deal.  You just start over.  By the time you've gotten well into the game, you understand the rules and the gamespace so well that you can handle anything the game throws at you.  And if the game hasn't prepared you for the challenge and you have to try to figure it out, it's called a bad game.  When a TV show doesn't prepare the characters for the challenge and they have to try to figure it out, it's called drama.

So when gamers watch a show like Walking Dead, they are first looking for concrete examples of how the world works so that they may exploit what they know in order to win the scenario.  Are the zombies slow?  Can they use tools?  Can they climb?  Hey, they weren't fast last episode, but some of them are running in this one.  THAT'S NOT FAIR!  THIS SHOW SUCKS!   Actually, it's completely fair.  Yes, TV shows need a certain amount of internal logic to function, but only so much as needed for the characters to get on with what they are doing.  Everything does not need to be spelled out for the viewers from the get-go.  In fact, doing this would be considered very bad television.  The rules will be explained on a need-to-know basis.

Gamers also tend to dislike characters who don't behave the way they imagine they themselves (as the hero of the story/game) would behave in the same situation.  When Rick's son is shot right in front of him, his reaction is entirely believable precisely because he did not act like a strong heroic type.  He was scared and confused.  He made bad choices.  He abdicated his responsibility to the group.  You could say that up to this point, season two has been about characters who are all about following rules (Rick was a sheriff before the ZA) learning how to live in a world where the rules no longer apply.  So when an episode or two go by with characters wrestling with these issues, that's not boring, my friends, that's riveting television.  That's good character drama.  That's what TV is good at.
Games, on the other hand, are horrible at character development, but decent at plot progression.  It's no wonder gamers can't stomach all that "soap opera" stuff.  They have grown accustomed to moving from set piece to set piece with the belief that all the character stuff in between is complete rubbish.  Well, if they are talking about video games, I would be inclined to agree.  Most characterization in games IS rubbish and you're probably better off skipping whatever the hell Solid Snake is pondering in Metal Gear Solid.

But television is different.  It's those moments where characters wrestle with their reality that carry the real weight.  Yeah, it's fun when the zombies arrive and rip someone apart.   But can you imagine that happening every single episode?  Can you imagine the survivors going from place to place, mowing down hundreds of undead while scarcely muttering more than a phrase or two?  I can.  It's called Left 4 Dead and it's an amazing video game.  But it would make a horrible TV show.

Wednesday, November 2, 2011

MIGS 2011: Day One

After fortifying my body and soul with poutine (whoa, my spellcheck doesn't know what poutine is) and beer, it was time to tackle the 2011 Montreal International Games Summit.   I go up about every other year with a group of my students from Champlain College.  Our game development program believes it's important to expose our students to professional events like MIGS, and sure enough there were plenty of things to chew on.

I attended several talks during the first day and while I haven't had a lot of time to process everything, here are some highlights.

Keynote with Richard Lemarchand from Naughty Dog on how the culture and practices at their company led to the successful Uncharted series.
Friendly enough speaker.  Most of the keynote felt like a giant advertisement for the Uncharted series.  I would have liked to hear more about what makes their company tick.  To be fair, Lemarchand did spend time talking about their relatively flat hierarchy and how constant communication between the people on the team is key.  But is that really any big revelation in this day and age?

Brain Dump (7 or 8 rapid-fire talks/rants)
A pretty funny way to start the rest of the day after the keynote.  Not all of them were direct hits, but I really liked Raphael Van Lierop's premise that game designers really need to read more books.  Drawing on nothing but other video games for your inspiration leads to crap games.  While he did sound a little too alarmist (Seriously, dude.  Print is not dead) his position supports my belief that you can probably make a game out of anything.  Why aren't we trying harder to do this?  I mean, I love Space Marines and Elves, but come on.

A short Interlude...
I did notice a couple of themes emerging from the talks.  One of them was the Waterfall vs. Agile debate.  There's not really much of a debate anymore, as most developers feel that Agile is the production method that works best.  But there was definitely a feeling that developers should never adhere to one specific methodology because teams are their own organic thing.  A good lead has to figure out the way their team works best and not try to impose something just because it's the "best way to do it."  Sometimes waterfall might be the way to go.

The Great Divide, Cord Smith, Director of Marketing for Square-Enix on Developer/Publisher relations.
This was a great talk on how each side sees the other.  Smith comes from both backgrounds and he believes that both sides could learn a lot from one another.   Marketing folks should learn some game development tools and 3D art tools (hell, even a little Photoshop couldn't hurt) to get a feel for just what it takes to make a game.   Game devs should shadow a marketing team to see what it is they do all day.   Above all else, both sides should show respect and, most importantly, empathy for the other.  No one wants their game to suck.  Really.

The last talk of note was from Lee Perry of Epic Games on prototyping gameplay for the Gears of War series.
An awesome and inspiring talk that reinforced my belief that game designers really need to roll up their sleeves and get their hands dirty.   With today's tools, there is no reason a game designer can't sit down and build working prototypes of gameplay.   It used to be that a designer would write up a spec for the feature (which no one would read) and then expect some artists and programmers to accurately interpret what he or she wanted.   Perry said designers need to be more like Chefs and less like Food Critics.  Don't just order up a meal and criticize what's put in front of you.  Get your ass in the kitchen and whip up that souffle yourself.  He showed a bunch of prototypes of features he cooked up for the Gears of War games.  Some of them made it into the full game.  Many of them were cut.  The point was that the team was able to play his ideas within a day or two of him coming up with them, instead of the weeks it would take had he written a ponderous design spec.  Loved this talk.

Day Two Awaits...

Friday, October 28, 2011

The Gaming Digital Divide Narrows

Over the last 10 years or so, I have believed that the digital divide between people who play interactive games and those who don't was an ever-widening, insurmountable gap.  Where at one time games (yes, even video games) were a plaything of the masses, things took a sickening turn for the worse.

Simple arcade-style games, designed by their very nature to be easy to pick up and play (yet maddeningly difficult to master), gave way to deeper, more complex experiences.  Put the blame wherever you want: home computers, increased visual and aural fidelity, the game designer geek ghetto.  The end result was that in order to satiate the small but dedicated group of game fanatics, designers had to create more elaborate challenges and experiences.  Unless you kept up with how to address these challenges, you were quickly left out in the cold.

It's no surprise then to see children and young people who have an abundance of free time and a willingness to sit down and master these things remain an important demographic.  Any adult picking up an Xbox 360 controller would likely not understand how to play anything beyond the most simple of games.
I mean, just look at this thing!
But there is hope and it comes from a variety of likely and unlikely places.   Smart phones have given masses of non-game players a piece of easy-to-use hardware that is perfect for playing games.   Interestingly, Apple for the longest time actively discouraged games on their platforms (look at the robust Mac gaming library for proof).  If anything the hunger for games on the iPhone is an indication that "normal" people really do want to play video games, and the inherent simplicity mandated by the platform gives people an easy way in.

Facebook is another place where lapsed players and first timers can get their game on.  There's a lot to be said about the current crappiness of the Facebook gaming scene which appears more interested in draining wallets using psychological trickery rather than providing an engaging experience.  There is hope.  It appears that the "ville" games are trending down, shedding their users, and are ripe for reinvention.

Even Microsoft (with a grudging nod to Nintendo Wii) has entered the fray with Kinect.  No experience necessary, just stand in front of your TV and perform the moves naturally... well mostly naturally.

So where does that leave the hardcore gamer dude and dudette?  No worries.  Deep, challenging experiences are still available, though perhaps not quite as plentiful as before.  But think of it this way.  Someone playing an iPhone game or a Facebook game is really just another game player.  We don't all need to be FPS experts, just like not everyone who watches TV needs to watch Boardwalk Empire (although they should!).  Games have diversified and reclaimed an audience I thought we had lost forever.  And the game industry is all the better for it.

Sunday, October 16, 2011

Jammin' with Jager

Attending a talk with Michael Jager is a bit like being in an information blender set to "liquify".  It's a constant barrage of ideas and concepts that, while never boring, is nonetheless difficult to parse.

Although they have a varied clientele, JDK is perhaps best known for their work with Burton snowboards and Microsoft's Xbox.  It was their work with Xbox that I find the most interesting, but not because of the design and marketing work they did.

Side Note:  I never liked the Burton spots, but then again I am so not the market for Burton.

I find it intriguing because Microsoft has a brand that is loathed by so many.  Micro$oft, Microshaft, Windoze, etc.  The list goes on and on.  It is considered a faceless corporate monolith only interested in delivering bad products and services, taking your money, and maintaining a stranglehold on innovation.  So why work with a company so universally despised?

I think Microsoft doesn't always get a fair shake.  I'm coming at this from the perspective of an avid game player, but in the earlier days of computer gaming, it was the openness of their operating system (DOS) that allowed game developers to flourish, experiment, and push the technological boundaries of the PC.  Meanwhile, Apple's closed system actively sought to stifle games.  Expansion was difficult if not impossible.

When Windows 3.1 launched, it was notoriously bad for games.  Developers continued to make games in DOS.  Microsoft saw what was happening, got the best gaming minds together, and asked them what Windows needed to do to run games better.  The result of this outreach was DirectX, which is the building block of nearly all PC games.

Getting back to Michael Jager and Xbox, I think he saw past the bullshit and connected with the Xbox team on a personal level.  He saw what they were doing and was able to help them craft a message that resonated with the public.  For a relative newcomer to home video game consoles, Microsoft has done remarkably well capturing their target (heh) market and becoming a household name.  Even Microsoft detractors have to admit, they knocked this one out of the park.

Tuesday, October 11, 2011

Champlain College FunSpot Trip: A Picture Journal

What do you call a mass of 70 college students converging on the largest collection of classic arcade games in the world?  You call it awesome, because that's what it was.  Teaching video game history is no easy task.  Unlike other forms of media, how you experience games, especially older games, is unique.  While you certainly can play classic arcade games via emulation, nearly all of the context gets lost.  Providing that context to students who have no practical knowledge of that era has been next to impossible.

Thank god for FunSpot and the American Classic Arcade Museum.  With over 300 working games and pins (AKA Pinball) covering the early 1970s up through 1989, the ACAM let our students experience more in a few hours than days of lecture and discussion could accomplish.  Here are some highlights:

Computer Space (1971)
Galaga (1981)
Duck Hunt and Hogan's Alley (1984) - Gangsta Style.
Rows and rows of old machines.
With 70 students, the place really felt as vibrant as the arcades of yesteryear.
4-Player Gauntlet (1985)
Rockin' the old Pinball Machines.  Those yellow cups are filled with tokens.
Stun Runner (1989)
Donkey Kong (1981) - Very obscure game :)
Xenophobe (1987) - Sark and Tron look on.

Later that day we met with Bob Lawton, the owner of FunSpot, and Gary Vincent, the president of the American Classic Arcade Museum. They were a fountain of knowledge about games and the state of the industry back in the 70s and 80s. After hearing about our Game Development Program, we were invited to the repair shop where we saw some of the machines they are trying to get up and running. In addition to the 300 working machines on the floor, Gary has around 100 in a warehouse in various states of disrepair. Locating parts and material is painstaking. Many items aren't manufactured anymore. Once a vintage vector display blows, it's gone for good.

Meeting Bob Lawton.
Space Fury (1981) - Being restored
Gary Vincent explains the hazards of old monitors.  Turns out they catch fire.
Missile Command (1981) - Rare sit-down version.
Gary shows us restored control panel artwork for the rare game Flower (1986) 

Parts and materials for old arcade games.
They even have old consoles and computers.
All tuckered out, we climbed back into our buses and made our way back to Burlington, Vermont.

Thursday, October 6, 2011

Do you speak game? Part 1

If we are to agree that games (specifically video games) are a kind of medium to communicate thoughts and ideas, then what language do they use?  It has become increasingly apparent that games are at a crossroads of sorts.  Interaction and exploration of the system have always been the heart and soul of the gameplay experience, but as technology advanced and computers became able to do more than display simple graphics and produce primitive bleeps and bloops, an interesting, yet entirely predictable thing happened.  Game designers began to look to film and television for inspiration on how games should be perceived.

Ms. Pac-Man: Act 1
Pac-Man (1980) introduced the concept of a non-interactive sequence designed to tell a story (more of a skit in this case).  This was quickly followed by Ms. Pac-Man, where the sequences began with a movie-style clapboard introducing a three-act narrative arc over the course of the game.

Night Trap (1992)
From that point on, the notion that a game had to have some kind of story took hold.  The advent of CD-ROM technology, with close to 700MB of data, gave game developers literally more space than they knew what to do with.  Instead of making their games larger or their systems more complex, many developers loaded their games with CD-quality music, pretty pictures, and full-motion video.

This trend has continued up through today, with some popular games, such as the Metal Gear Solid series, sporting 70% non-interactive sequences.  Again, hardly surprising.  New and dominant forms of media have a history of mimicking their predecessors.  Early film resembled little more than a stage play captured by the camera.  Early TV programs owe much of their content to vaudeville and radio dramas.  Games, in this case, appear to be no different in their behavior.  However, over the last few years there has been an interesting development.  Games appear to be developing a language of their own.  The next post on this topic will highlight some areas where games are finding their own voice.

Friday, September 16, 2011

Games communicate what now?

A constant theme in my game design classes is the idea of intent.  What do you, the designer, intend to say with your work?  Do you have a point?  Even if "making a kick-ass game with space orcs" was the entirety of your thought process, doesn't that point to your intent?

Looking at a slightly more mature medium like television or film, we see in even the most banal content an attempt to communicate to an audience, not just entertain.  Take Happy Days, a popular American sitcom that ran from the mid-seventies through the early eighties.  While primarily a comedy, the show was known for trying to tackle some sensitive issues.  In the fifth-season episode "Richie Almost Dies," Richie does indeed almost die.  Due to the popularity of most of the Happy Days characters, especially Fonzie, the episode allowed the audience to explore the themes of inner strength, faith, and loss... well, almost loss.  After all, Richie doesn't die, he almost dies.  Says so right in the title.

Yeah, it's pretty cheesy by today's standards, but addressing dramatic themes in a comedy show was not common in that era, although there are some exceptional standouts.  See M*A*S*H or Norman Lear's body of work.

I guess my point is that all creators of media need to understand the power they wield.  Even the purveyors of pedestrian fare like 70s and 80s sitcoms understood that in their own little way, they could try to do some good.

So whither game design?  In future posts, I hope to address where we're making progress and where I think we're falling short.

Tuesday, February 8, 2011

Goin' with Agon

Between job, school, my wife’s job and school, and our two kids, there’s not a lot of room to spontaneously create a bubble of play.

The one place where it naturally develops is with our kids, of course, and usually this is within the context of established games.  My son, Jack, loves Mario and is really quite decent at Super Mario Bros. on the Wii. 

Together (over the course of probably a year of playing on and off) we managed to beat the game.  We finished every level.  However, there’s a secret world that you can only access if you find special coins on each level in the game.  These are usually quite difficult to obtain.  Jack would be happy to keep banging his head against that particular wall for hours, but I have imposed a strict 3 level limit to our play time.  He gets to choose the levels we play.  Occasionally he’ll opt for one of the small bonus levels on the map and once completed he’ll turn to me and say, “But that’s not really a level.”   Clearly his own way of trying to keep the bubble of play open.  Once we play three “real” levels, he accepts that we’re done and moves on to other things.

My only free time this week was during the Super Bowl.  We were invited over to my cousin's house for food, drink, and football.  I figured this was the best time to try to get a competition going, so I looked for any reasonable opening outside of outright discussion of the game… I figured that would be too easy.

It started with martinis.  My cousin makes a mean martini, but he had run out of olives, so he used pickled banana peppers instead.  I casually suggested that without the olive, it really can’t be considered a martini and must be named something proper.  The only real rule was that the name couldn’t have “ini” at the end.  The winner would get another drink.  The loser… would get another drink.  So much for high stakes.   So we started rattling off different names:  Ginanna, Pepprigo, Pepson (like a Gibson but with a pepper), etc.   My cousin finally won by naming it after himself (“The Pasley”).  We mutually decided that was best and most appropriate name.  I considered coming up with some other game to play, but after two martinis (excuse me, Pasleys) it just didn’t seem all that important.

Monday, January 31, 2011

Interactive Interactions

While we are certainly not the only animals on the planet to do so, human beings are perhaps best known as users of tools.  It is this evolving relationship with the devices we have incorporated into our lives that helps to define us.  How we choose to interact with the world around us colors our cultural perspective.  What we choose to interact with is also clearly dependent on our environment.  In short, we are what we do (…apologies to Jung for pulling that quote completely out of context).
In the last 20 years, computers and the internet have become invaluable tools.  They are nearly inextricable from our daily lives.  They are also head-scratchingly difficult to use.  Part of this is due to the open nature of what computing is capable of in a world where nearly all media has been touched by digitization.  But it's also because computers haven't been around long enough to enjoy the benefits of numerous iterative design cycles.  Consider how many revisions mundane objects like toothbrushes and toasters have gone through.  Compared to the lowly toothbrush, computers have been with us for an insignificant speck of time.  And for most of that time, computers have been the stomping grounds of engineers and technicians.
Until recently, little thought has gone into how normal everyday people would interact with computers.  If we can agree that the notion of "interaction" is akin to a kind of conversation with the tools we use, and "interface" is the method by which we convey our message, then working with a computer is a bit like an English person trying to talk to a French person through a Japanese interpreter.  It's the interface that's getting in the way.
Fortunately, we have been making great strides over the last 5 years or so.  The smartphone revolution is a great example of that.  Earlier mobile phones had tons of functionality shrouded by a mostly impenetrable interface.  Sure, my Dad could probably figure out how to create and maintain a contact list on his old phone, but the learning curve was not worth the effort needed to achieve a desirable result.  His old analog address book worked just as well.  However, the moment I handed him my iPhone, he almost instinctively understood how to use it.  Interactions with the device nearly always provided instant and logical feedback.  He was learning how to talk to the device, but it was so painless he didn't even realize he was learning.  It just seemed natural.
Although we’ve come a long way in a short time, there are still computerized devices in our homes that are inexplicably complicated to use.  Tivo is the most easy and elegant DVR interface ever invented.  Why is my Comcast DVR counterintuitive in nearly every way?  Is it because it’s just barely good enough to be functional for 90% of the consumers?  I would predict that in the near future there will be some interface standardization initiated by what we’ve seen in smartphones.  Devices will adopt context sensitive screens instead of modal buttons.
More importantly, I suspect there will be huge changes in interaction, and by that I mean the flow of the conversation.  A lot of our interactions with these machines are what I would categorize as "call and response".  We tell the machine to do something; it does it, spits out the result, and waits for the next input.  We are already seeing devices anticipating our needs based on past interactions.  This kind of adaptive interaction will become more prominent in the future.  Our dialogue with our tools will become more participatory and less authoritarian.
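For the programmers in the audience, the distinction can be made concrete with a little sketch.  This is purely illustrative, in Python, with class and method names I made up for the occasion; it is not how any real device works, just the shape of the two conversation styles:

```python
# "Call and response": every request is handled in isolation.
# The tool does exactly what it's told and then forgets about it.
class CallAndResponseTool:
    def handle(self, command):
        return f"Executing: {command}"

# Adaptive interaction: the tool remembers past requests and
# starts volunteering suggestions based on what you usually ask for.
class AdaptiveTool:
    def __init__(self):
        self.history = {}  # command -> how many times it's been requested

    def handle(self, command):
        self.history[command] = self.history.get(command, 0) + 1
        return f"Executing: {command}"

    def suggest(self):
        # Anticipate the user's need from their most frequent request.
        if not self.history:
            return None
        return max(self.history, key=self.history.get)

tool = AdaptiveTool()
tool.handle("play jazz playlist")
tool.handle("play jazz playlist")
tool.handle("set alarm")
print(tool.suggest())  # the device now volunteers "play jazz playlist"
```

The first class only ever answers; the second one starts participating in the conversation, which is the shift I'm describing.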