Culture and Media Archive

Game Over. Start Again? – Part 2

Posted November 2, 2003 By Earl Green

No, Really – It Can Be A Computer

With the debut of broadband internet multiplayer gaming on consoles such as the PS2, Xbox and GameCube, and the introduction of keyboard and mouse peripherals edging these game machines closer to being full-fledged computers, is it possible that the video game industry has finally made good on the long-standing pledge of convergence?

Don’t count on it.

The add-on keyboard and internet service may have caught the public eye when Sega brought them to the Dreamcast in 2000, but it wasn’t the first time, and almost certainly won’t be the last. The first add-on keyboards and game-to-computer upgrades actually date back to the underpowered Atari 2600. No fewer than three such peripherals were put on the market toward the end of that console’s life span, and other manufacturers tried to follow suit. Almost from the beginning, Mattel Electronics made “forward-looking statements” about the Intellivision becoming the core of a full-fledged personal computer system, but kept delaying the bulky keyboard module that would literally engulf the company’s game console. Before long, consumer complaints and truth-in-advertising advocates caught up with Mattel, and questions were asked in Congress and the halls of the Federal Trade Commission. To avoid hefty fines, Mattel rushed the Intellivision Computer Module to a somewhat limited market – and it did almost nothing that had been promised. But with a product at last delivered, Mattel was off the hook.

Another attempt by Mattel to make good on these promises was its purchase of a low-end computer/game console called Aquarius from a Taiwanese manufacturer. Aquarius burst onto the scene in 1983 with bright blue rubber keys, a casing that resembled a game console more than anything, a licensed-but-lobotomized version of Microsoft BASIC and a processor that had one of its software designers openly calling it “a computer for the 70s!” – not an auspicious start. Mattel added a peripheral of its own, the Mini-Expander, allowing Aquarius to play specially-programmed versions of such Intellivision games as Utopia and Tron Deadly Discs, though the graphics and game play of those titles paled in comparison even to their Intellivision versions. Aquarius tanked, and tanked hard. Mattel wound up begging out of its contract and selling its stock of the machine and related peripherals and software back to the Taiwanese manufacturer.

Coleco, which had won nothing but favorable press for its high-end Colecovision console (which included a near-perfect version of the arcade hit Donkey Kong with every machine), made a similar misstep with the introduction of its Adam computer. Arriving as both a stand-alone product and an expansion module so one could turn one’s Colecovision into a full-fledged computer, Adam was, like Aquarius, initially sold on the basis that it could play bigger, faster, better-looking games. A large software library was promised, but there was one problem – a large user base never quite materialized. Worse yet, many of the Adam computers that were sold turned out to be defective, and like Aquarius, the machine hit the stores just in time to be engulfed in the video game crash of 1983-84. Coleco hastily exited the game and computer business, finding great success in its next major product: Cabbage Patch Kids dolls.

Some companies did successfully launch companion lines of computer products, most notably the Atari 400 and 800 computers introduced in 1979, though they were incompatible with the 2600. But those computers came only after Nolan Bushnell sold Atari to Warner Communications for millions. Earlier than that, Bushnell had listened to a pitch from one of his young employees; with his brilliant-minded engineer friend, this Atari worker had invented a new computer, and they thought Bushnell should buy the design from them and launch a line of Atari computers. Bushnell passed on the idea, instead pointing the two youthful inventors toward a source of venture capital – and together, Steve Jobs and Steve Wozniak launched Apple Computer, the meteoric success story that sparked the personal computer industry. Barely three years after turning down their pitch, Nolan Bushnell was kicked off the Atari board under the new Warner-owned corporate regime, and new CEO Ray Kassar ordered the creation of Atari’s first computers in an unsuccessful attempt to catch up with Apple.

Online gaming isn’t new either. Sega marketed modems for the Genesis console, but even before that there was talk of Nintendo turning the NES into the hub of its own network: imagine trading stocks, sending e-mail and ordering products online with the same “joypad” controller you just used to play a round of Super Mario Bros. 2! Those plans failed to materialize.

But even before that, an early precursor to online gaming hit the market for the Atari 2600. A memory cartridge and modem all in one, the Gameline peripheral would connect Atari owners to an online database of downloadable software – all for a monthly fee, of course. The only drawback was that the downloaded game held in the Gameline cartridge’s memory would be wiped the moment the console was powered down. Gameline met with minimal success, and its creators later turned toward the burgeoning home computer market, whose users, they felt, would be more familiar with modems and downloads. Gameline’s parent company later underwent a series of rather drastic changes in both name and structure, eventually becoming an upstart telecommunications service called America Online.

Sex and Violence

It seems like video games only make the news these days if there’s a major hardware innovation or if someone’s decided they’re corrupting the youth of the world. Blame for the Columbine High School massacre in Littleton, Colorado, was all but laid at the feet of The Matrix and id Software’s highly customizable first-person PC gun battle, Doom. Not long before that incident, games such as Mortal Kombat had also been singled out for violent content, and Night Trap, a full-motion video “interactive movie” released by Sega, had received bad press for its violence and for featuring women in skimpy clothing or lingerie. More recently, Grand Theft Auto: Vice City has been given a great deal of negative press for similar themes.

And once again, neither argument is new to the industry. The first protests against violent video games came during the infancy of the art form, thanks to 1976’s Death Race, an arcade driving game by Exidy which required one or two players to guide a pair of cars around a closed arena, running down as many stick figures as possible. Each figure “killed” would be marked by an indestructible grave marker afterward. Now, according to the game, these stick figures were zombies urgently in need of squishing. But it didn’t take long for Exidy to begin receiving complaints about the game, and Death Race even earned coverage in print news outlets and a report on 60 Minutes. Exidy wound up cutting the manufacturing run of Death Race short, and only 1,000 machines were produced. Atari founder Nolan Bushnell promptly made appearances, declaring that no such games promoting violence against human beings would ever be produced by Atari; in some ways, Bushnell’s timely play for press attention presaged similar maneuvers that Nintendo and Sega would use against one another during the vicious, backbiting Congressional hearings that eventually led to the modern video game rating system.

It took longer for the issue of sex in video games to appear, but it finally did in 1982. American Multiple Industries – actually an adult video vendor operating under a somewhat more respectable-sounding name – released a series of adult-themed cartridges for the Atari 2600, starting with one which raised the most controversy of all: Custer’s Revenge. Players guided a naked General Custer across the screen, trying to avoid a stream of arrows, to reach an equally naked Native American woman tied to a pole on the other side of the screen; when the woman was reached, players hit the action button on the joystick to – as the documentation so euphemistically put it – “score.” The graphics were primitive enough that they were more laughable than erotic, but in this case it was the thought that counted – and both American Multiple Industries and Atari received numerous protests from the National Organization for Women, several Native American groups, and rape victims’ advocacy groups. Atari did its best at damage control, pointing out that it had no control over its licensees’ game content. American Multiple Industries released a few more games, though none of them had quite the one-two punch of controversy that Custer’s Revenge carried, and shuttered its video game operations a year later when the home video game industry suffered a massive shake-out.

Crash

If the video game industry is cyclical in nature, with warm periods of public acceptance and cold gaps of public scorn, why did the crash only happen once? From a business standpoint, it only needed to happen once.

In 1983 and 1984, the home video game industry entered a steady, gradual decline. There were too many competing hardware platforms on the market, some of them from the same manufacturer: Atari software designers were frustrated that the company had refused to retire its biggest cash cow, the Atari 2600, on schedule. The 5200 was to be the next-generation machine, but two major problems stifled it. First, months passed between the initial release of the console and the arrival of an adapter that would allow new 5200 owners to play their old library of 2600 games; Colecovision hit the stores with such a peripheral already available, making it a no-brainer for Atari 2600 owners wishing to trade up. Second, the 5200 was undercut by the still-profitable market for Atari 2600 games – very few third-party manufacturers bothered to create 5200 games, and why should they when the 2600 had a far larger user base? Other consoles such as Intellivision and Odyssey 2 hung on for dear life, but they were niche systems whose user base consisted primarily of early adopters; few customers looking for a new game system bothered with machines that had less market penetration, and fewer third-party software makers bothered to create games for them.

But the third-party game makers were more than happy to milk the Atari 2600 for all it was worth, hoping to ride the venerable workhorse of the video game industry straight to the bank. And this is where it all started to go horribly, horribly wrong.

Atari lost a landmark legal battle against the world’s first third-party video game software company, Activision, which had been founded in 1979 by four ex-Atari employees. Initially, Atari had sued Activision’s founders for misappropriation of trade secrets, but that case was lost, and Atari ultimately failed to prove in court that Activision’s very existence was detrimental to Atari’s sales. At most, Atari was granted a license provision that sent small royalties its way anytime a third-party game company mentioned in packaging or sales materials that its games would run on Atari’s hardware – but that was all. The floodgates were open, and anyone could make games for the Atari 2600 so long as the royalty was paid. Even if the games, to put it bluntly, sucked. Even if they were Custer’s Revenge.

To make a long story short, dozens of companies that had never shown any previous interest in the home video game business piled onto the 2600 bandwagon: 20th Century Fox, CBS, board game gurus Avalon Hill, Children’s Television Workshop (the makers of Sesame Street), and even companies hoping to tie up niche markets like Christian-themed gaming. Sega’s first home video game products were Atari 2600 cartridges. These dozens of companies produced hundreds of games, and to be charitable, many of them were substandard. (It should also be noted, however, that some of Atari’s in-house titles weren’t much better, as demonstrated by miserable adaptations of arcade games like Pac-Man and of movies – namely E.T., five million unsold copies of which are buried in a New Mexico landfill.) As the consumer base shifted its weight toward home computers and high-end systems like Colecovision, Atari’s hangers-on began dumping games at bargain-basement prices. Atari itself swore not to do the same, but eventually it had to – and when it happened, the bottom dropped out of the stock value of virtually every video game company. Even arcade game manufacturers were caught in the undertow and went out of business.

Activision barely survived the wash-out of third-party software houses; its next closest competitor, Imagic, abandoned plans for an IPO, sold its intellectual property rights, and quietly folded. Many talented game designers found themselves out of work in a fledgling IT job market that thumbed its nose at “mere” game designers. As former Odyssey 2 programmer Bob Harris said, “I made a conscious decision to get out of video games because I would have to move to California to stay with it, which I couldn’t afford, and I was scared by the negative view game work was given when I interviewed outside the game industry. I started looking for other jobs, I remember that the general attitude I seemed to run into was ‘so, you’ve been programming games, and now you want a real job.’ It came as a little slap in the face, because the real-time aspects of game programming were more challenging than most other programming jobs I’ve had since.”

Somewhat eerily predicting the dot-com boom and bust, the video game market was brought to its knees. Programmers were out of jobs en masse. Speculators hoping to hop aboard for a sure thing were left in bankruptcy proceedings. Hardware and software for the Atari 2600 and other systems were dumped onto the market at fire-sale prices, if stores could be convinced to keep selling them at all. For all intents and purposes, the American video game industry was dead.

Coming up in part three: the video game industry resurrects itself with a different set of rules, and comes full circle. Thought the classics were old hat? Game over. Start again.

        

Game Over. Start Again? – Part 1

Posted June 1, 2003 By Earl Green

It began with two squares and one straight line. From those inauspicious beginnings grew an entire industry that would eclipse the music and movie industries in revenue and define whole generations. Video games are nothing new – in fact, they’re probably older than you think. And most of the controversy surrounding the games and their marketing is also older than you think – almost as old as the industry itself.

The story begins at Brookhaven National Laboratory, where the need for a user-friendly public demonstration of a massive new room-filling computer system led physicist Willy Higinbotham to create a simple video ping pong game called Tennis For Two. Played not on a television monitor but on an oscilloscope, this early precursor to Pong was created in 1958 – truly the first video game, even though it was part of a free public display and not for sale. In 1961, several budding hackers at MIT, led by Steve Russell, created a game called Spacewar on the somewhat less massive PDP-1 minicomputer. Still a hulking machine by modern standards, the PDP-1 was manufactured and sold to many colleges, and Spacewar became a kind of killer-app demonstration of the machine’s abilities, distributed free of charge. Read the remainder of this entry »

        

Everybody’s a Critic

Posted May 1, 2003 By Dave Thomer

Readers of this site are probably aware of my fondness for Pete Yorn’s musicforthemorningafter. So you can probably imagine that my hackles were pretty well raised by Kevin Canfield’s recent Salon article “This Year’s Model,” in which Canfield says, among other things, “In a way, Yorn is everything that is wrong with contemporary rock and pop.” Obviously, I disagree, but I was prepared to write it off as a disagreement of taste until Canfield explains exactly what way he means: “The epitome of a mediocre (but carefully packaged) soulful white boy who looks good in publicity photos, he represents something very bad and very annoying about the relationship between labels and the rock writers who make careers of regurgitating press releases.”

In other words, Yorn isn’t bad just because his music isn’t very good. He’s bad because Columbia Records has gone to great lengths to say that Yorn is good, and because rock critics like Rolling Stone’s Arion Berger (who gave musicforthemorningafter a four-star review) have blindly gone along with the label to perpetuate this obvious falsehood. But all is not lost, Canfield says. Rock critics aren’t dumb sycophants all of the time, although he adds, “You might be excused for thinking that, though, if you look at all the ink still devoted to a band like Pearl Jam, all the praise heaped on Springsteen’s simply awful The Rising, or the fact that some critics are actually paying any attention at all to Lisa Marie Presley.” Occasionally, though, they show some sense and agree with Canfield. Read the remainder of this entry »

        

Swimming Up Mainstream

Posted March 1, 2003 By Dave Thomer

A few weeks ago I was channel surfing and came across a showing of Braveheart. This is not all that surprising; the movie’s been on cable for a while now. What was more than a little surprising was that the movie was playing on the Sci Fi Channel. Now, I’ve heard that Braveheart plays fast and loose with historical fact, but that alone doesn’t qualify something as science fiction, does it?

As it turns out, the Braveheart showing was just part of a general strategy by Sci Fi to try to broaden its programming. Alleged psychics, dream interpreters, and scary hoax-meisters have all been added to the network’s schedule over the last few years, and network head Bonnie Hammer has suggested in numerous interviews that the network wants to be less oriented toward spaceships, technology and other things that people might think of as science fiction. That change in direction has probably contributed to the cancellation of (what used to be) Sci Fi’s flagship series Farscape, along with Sci Fi’s decision to pass on two possible series by Babylon 5 creator J. Michael Straczynski. Needless to say, there are quite a few disgruntled fans out there who wonder why a network called the Sci Fi Channel would abandon science fiction in order to pursue the mainstream.

Me, I have another question: how did we ever get to the point where science fiction isn’t considered the mainstream? Read the remainder of this entry »

        

We Shall Call It – The Alan Parsons Project

Posted February 1, 2003 By Earl Green

Originally intended to be the name of a single album and not an ongoing band, the Alan Parsons Project was a bold nod to early 70s art-rock and progressive rock, fusing the expansive (and often lengthy) compositions of such acts as Yes with the conceptual cohesion of Pink Floyd and Emerson, Lake & Palmer. And ironically, the idea behind the Project (for the purposes of that first album) was to dispense with the focus on the performers and place the emphasis entirely on the concept. Little did Parsons – whose experience had included engineering major hits with Pink Floyd, The Hollies and Paul McCartney – realize that the Project would become one of the most enduring lineups of the 70s, 80s and 90s.

The Project was the brainchild of Parsons – acting as producer and musician – and Eric Woolfson, a musician, songwriter and vocalist in his own right who was serving as Parsons’ manager in 1975. Woolfson and Parsons, with the help of orchestral arranger Andrew Powell (whose contributions to the Project would span the next two decades), devised a musical suite based on the works of Edgar Allan Poe. With several solid rock songs, and almost half of the album written as a purely orchestral work, Tales Of Mystery And Imagination was quite unlike anything else. The Moody Blues, The Beatles and Electric Light Orchestra had fused classical instrumentation with rock numbers, but none of them had given virtually an entire side of an LP over to a session orchestra. The rock numbers were skillfully executed by members of Ambrosia and Pilot (whose most recent album Parsons had been involved with), among them guitar virtuoso Ian Bairnson, who would also stay with Parsons through the end of the Project’s existence and beyond. Read the remainder of this entry »

        

Lies, Damn Lies, and Sample Size

Posted September 1, 2002 By Pattie Gillett

Hey there. Yeah, you. What magazines do you read? What programs do you watch on television? What radio stations do you listen to? What web sites do you visit? No, I haven’t started channeling John Ashcroft; I’m asking the kind of questions decent marketers ask – if they want to keep their jobs.

In marketing, it’s all about knowing your audience. Or, rather, knowing what a statistically representative sample of your given audience would do under a given set of parameters. Let’s face it, only the federal government is crazy enough to try to count every single human being in America, and even they are trying to get out of the business. Even spaced at ten-year intervals, that can be a pain. Read the remainder of this entry »

        

Finnaticism

Posted March 1, 2002 By Earl Green

Rock music is rife with siblings, ranging from the Everly Brothers to Heart’s Wilson sisters to the Kemp brothers of Spandau Ballet. And then there’s the enduring, if somewhat more obscure, legacy of New Zealand’s Finn brothers, veterans of such acts as Split Enz, Crowded House and – finally – just themselves.

Neil and Tim Finn have carved out their own little niche in the pantheon of singer/songwriters, each turning out music with a very distinct character, and each gathering a loyal fan base. Tim’s music often bounces along with a wistful, whimsical flavor, but he’s also turned out some quite interesting, refreshingly un-clichéd ballads since launching his solo career. Neil’s music shows its Lennon & McCartney-inspired roots vividly, with unexpected chord changes, harmonies and experimental instrumentation aplenty; yet the younger Finn seldom turns out anything that sounds like Beatles pastiche, with literate lyrics that manage to be at once heartfelt and stream-of-consciousness, and a way with angst-heavy ballads that no one else has been able to match.

Both brothers’ solo careers are proceeding along almost parallel tracks – they have nearly instant name recognition in Australia, New Zealand, Canada, the U.K. and Europe, and yet they’ve all but stalled in gaining U.S. recognition, their latest solo releases appearing on tiny indie labels. This is surprising, given that Neil Finn was the man responsible for Crowded House’s “Don’t Dream It’s Over,” a 1986 hit that climbed to #2 on the Billboard charts. The uninitiated might be tempted to write that off as a one-hit wonder, but the real story behind the Finn brothers stretches back 30 years – and at least as many hits. They’re just hits that American radio hasn’t taken on board. Read the remainder of this entry »

        

Be Mused

Posted February 1, 2002 By Earl Green

Editor’s Note: This essay was originally written and published in January 1998 as part of a series. Visit theLogBook.com for part one, and then bug Earl here on the forums for the still-unpublished part three.

From time to time I’ll run into someone who asks me what I do. Simple – I write and produce promos and commercials. Scriptwriting, filmmaking, and occasionally a smidgeon of voice acting all in one. Not a bad package. Until this hypothetical someone – not entirely hypothetical, though – says “Oh, so basically, you sold out.”

You betcha.

Be prepared to hear this line of total B.S. if you plan on extending your creativity into your professional life. Trust me – you will hear it at least once.

Aside from my perhaps too-pragmatic belief that the satisfaction of a full stomach beats the romance of a bohemian freelance-artist lifestyle any day of the week, let’s get one thing out in the open, friends. And this may very well be the moment at which you decide to keep traveling down this road…or back up to take the other exit you just passed.

The moment you make the arts or the media your profession, you are being exploited.

Go back. Read it again. I’m not joking.

You are being exploited. You are allowing yourself to be exploited. It is what you do.

Think about our society. Teachers struggle to keep food on the table, their meager reward for teaching our children. Members of the military, in return for waiting for opportunities to protect our borders – opportunities which may never actually arise in their lifetimes – are lucky to break even. Police officers try to make ends meet on paltry compensation for a profession that may be called upon to make sure that the rest of us do not meet our end.

Where is the money in western society?

Recently, NBC renewed the Warner Bros. hour-long hospital drama ER for a staggering, history-making (and, I fear, history-breaking) $13 million an episode.

Until now, million-dollar-per-job fees have only been commanded by major film stars. Now, there’s a good chance they will become de rigueur for television actors as well.

These people aren’t being paid millions to educate your kids. They’re not leaping in front of you to keep a random bullet from kissing you. They’re not even poised for action should a hostile international power try to challenge territorial rights or common sense.

These people are sitting around, eating free catered food, smoking a lot, and pretending to be teachers, police officers, soldiers, and, yes, doctors.

In our society, despite the NEA’s gloomy prediction of what will happen without federal grants, we throw our money – liberally – in the direction of the arts. Think about how much movies and music cost. Books are rising in price. Artwork, even when reproduced for mass consumption, isn’t getting any cheaper. That’s where the money is. (Well, there is medicine and law, but this isn’t a series of essays about medicine and law, is it?) Read the remainder of this entry »

        

Dancing About Architecture

Posted January 1, 2002 By Kevin Ott

Writing about music, said Elvis Costello, is like dancing about architecture. Extra points go to whoever can figure out what the hell that means, and further points go to whoever can get me a sample of the clearly potent and very likely lysergic acid-soaked hashish Costello was smoking just before he said it.

Here’s the best guess: Costello is implying that it’s impossible to use one art form to convey the messages of another; just as one can’t execute a pirouette reminiscent of, say, the Sears Tower, one can’t write an essay or an article or a review that adequately conveys the nuances of tone and lyric and scale. If you think this makes sense, remember that Costello also once thought putting Daryl Hall and John Oates in one of his videos would be the epitome of hipness.

If it’s not already too obvious, I should note that I don’t agree with the fatalistic, don’t-bother nature of what Costello said (he said it as part of a greater speech I can’t recall right now verbatim). If we assume that writing is art just as music, dance and architecture are, we can assume that writing can draw its inspiration from similar sources. And while each art form represents that inspiration differently, I think it’s possible to convey similar emotions and ideas from one form to another.

“But Kev,” you’re saying, “isn’t this supposed to be a music article and not a philosophy article? Also, shouldn’t you have some pants on?”

As to the first question: Yes, and I’m about to get on with it. As to the second: Mind your own damn business. Read the remainder of this entry »

        

Racism on the Front Page

Posted December 1, 2001 By Kevin Ott

Pick up your local newspaper and flip through it. Look at all the photos. Look closely. Notice all the local pictures of nonwhites and other minorities?

Not really, huh? Well, don’t be surprised. Real-life, everyday portrayals of blacks, Latinos, Asians, gays, lesbians and just about every other type of person who isn’t straight or white have been missing from the newspapers we read for quite a while now. It’s one of those things that’s nobody’s fault per se, but that we all have a responsibility to deal with.

It’s called gatekeeping, and it’s as old as newspapers themselves.

In 1968, in the wake of bloody race riots in several major cities, a federal commission led by Illinois Governor Otto Kerner discovered what America’s black population already knew: that blacks were woefully underrepresented in the press, often portrayed in an unflattering light, if they were featured at all. The commission’s report urged newspaper editors and publishers to diversify both their newsrooms and their coverage of minority communities.

In the intervening three and a half decades, many newspapers have performed admirably in their efforts to recruit African American reporters and present complete, balanced coverage of their communities’ black populations. Unfortunately, these efforts haven’t brought newspapers within range of their projected goals, according to the American Society of Newspaper Editors (ASNE), and within the past year, the percentage of minorities in the newsroom has actually decreased from 11.85 percent to 11.64 percent. Read the remainder of this entry »