
The High Definition Video War Part 2 January 30, 2012

Posted by Maniac in HD Format War, Histories.

HD-DVD released first, and being first to market helped quite a bit. An initial selection of high-definition movies from both exclusive and non-exclusive studios gave HD-DVD an impressive catalog very quickly. A brand new HD-DVD player cost the consumer about $500US at release, with models supporting higher resolutions and more features costing more. The HDTV-owning public was eager to finally have some native content to show off their new TVs, and they bought the players right up.

Blu-Ray launched towards the end of 2006, around the release of the Playstation 3. I would like to say it had just as strong a launch as HD-DVD did, but it didn’t. The initial price of a standalone Blu-Ray Disc player was around $1000US. The alternative, a brand new Playstation 3 with full Blu-Ray support, was much cheaper, but still expensive compared to a Nintendo Wii or Xbox 360. Sony was also bringing out its catalog of recent movies from its studios, the majority of which were critical bombs that no one was interested in seeing, let alone seeing in HD. Warner Bros, meanwhile, held back several movies it had already released on HD-DVD because they used features the launch version of Blu-Ray did not support, like picture-in-picture commentary, choosing to wait until Blu-Ray caught up to HD-DVD’s specifications.

That update for Blu-Ray players would indeed come, but it would not make a lot of early Blu-Ray adopters happy. When Sony announced it was working on Profile 1.1, the first major Blu-Ray system update, it was clear that plenty of existing standalone players would not be able to meet the new specification. The hardware required for a second video stream, essential for picture-in-picture playback, was not part of the initial BD specification (it was a requirement of HD-DVD, however), so even with a firmware update many players could not use the feature. The Playstation 3, however, would have no problem meeting the specification.

Around the time the Playstation 3 launched, Microsoft released an HD-DVD drive for the Xbox 360. Using the Xbox 360’s processor to do all the work, the drive connected externally through the console’s rear USB port, and included its own rear USB hub for the peripherals that normally used that port, like the Xbox Live Vision Camera and the WiFi Adapter. Priced at $200US as a cheaper alternative to a standalone HD-DVD player for people who already owned a 360, the external drive sold quite well and grew HD-DVD’s user base. It also had 256MB of internal flash memory to meet the format’s specification for downloadable content, and could automatically update its firmware through Xbox Live.

Then there was an interesting little development. The porn industry threw its entire weight behind the HD-DVD format. This may seem like a sidebar included for hilarity’s sake, but in reality it was taken quite seriously. The industry claimed that HD-DVD was the better format to work with, and that, as with Betamax, Sony was not receptive to having porn on Blu-Ray. That claim could be taken with a grain of salt, but it had certainly been true during the VHS/Beta war years prior, and many have credited the porn industry’s support of VHS as the reason that format beat Beta.

Then came another development that put the format war into a heavy stalemate. Just as Blu-Ray was picking up steam, Paramount Pictures announced it would become an HD-DVD exclusive studio, joining the ranks of Universal. Paramount already had a fairly extensive movie collection out on Blu-Ray, but it had become clear that because of HD-DVD’s more mature software, some discs, in particular Mission: Impossible III, were superior on HD-DVD to their Blu-Ray counterparts. The move also meant that a movie Paramount had already manufactured and shipped to retailers, Blades of Glory, would have its Blu-Ray version pulled from shelves before release.

This put the format war into a stalemate through the 2007 holidays. Both Sony and Toshiba ran promotions to win customers during the holiday season, ranging from free movies through rebate incentives (which the Playstation 3 was also a part of) to temporary price cuts on new standalone players.

Then Blu-Ray dealt HD-DVD a devastating blow it could not recover from: Warner Bros announced it would become Blu-Ray Disc exclusive. Why did WB cross over? Quite simply, it looked at the movies it had released for both platforms and saw it was clearly selling more Blu-Ray Disc versions than HD-DVD versions. Paramount, notably, had made its HD-DVD exclusivity conditional on Warner Bros not going Blu-Ray exclusive; it knew Warner Bros’s weight alone would turn the tide of the war, and it did not want to find itself exclusive to the eventual losing format for longer than it needed to be.

With the release of Profile 2.0 (BD-Live), Blu-Ray players could finally meet the specifications HD-DVD had had since launch. All new Blu-Ray players were now mandated to allow internet access from their discs and to ship with 1GB of storage for downloaded content. Once again the Playstation 3 had no problem meeting the specification, but the current wave of standalone Blu-Ray players could not.

Toshiba put all its final efforts into one last commercial (which, ironically, wasn’t even in high definition) to air during the Super Bowl, but it was too late. The format was running on fumes and fizzling out. Shortly after the Super Bowl, Toshiba called it quits with HD-DVD. Blu-Ray Disc had won the HD format war.

So how did HD-DVD, with its fantastic launch and superior software, lose to Sony, with its initially disappointing exclusive lineup, required hardware upgrades, and more expensive players? It all came down to the elephant in the room. Sony was right in assuming that putting Blu-Ray hardware inside the Playstation 3 would give it the edge in hardware install base. When Warner Bros looked at its sales figures for each platform, it could see the Blu-Ray versions of its movies were selling better. In my opinion, that is probably because anyone with a Playstation 3, already owning the player, was buying the Blu-Ray versions of new movies over their DVD counterparts. The Playstation 3 integration, plus the install base of standalone players, gave Sony the sales edge in multiplatform movies. And with the majority of early Blu-Ray adopters choosing the PS3 as their player of choice, the blowback from the hardware upgrade requirements of the new profile specifications was minor.

With the release of Profile 2.0 (BD-Live), Sony brought Blu-Ray Disc up to the software capabilities HD-DVD launched with. Warner Bros used that to re-release previously exclusive HD-DVD movies that took advantage of those capabilities, like Constantine, Batman Begins, and The Matrix Trilogy. Now, with 3D support, the Playstation 3 continues to support every major hardware update Sony has brought to Blu-Ray (internet-connected firmware update required), and BD-Live players have become as common and inexpensive as the DVD players they were meant to replace.

So if you happen to be in a secondhand movie store, see a new movie in a red box, and wonder, “Gosh, what’s HD-DVD?” now you know. Now put it back on the shelf.

The High Definition Video War Part 1 January 29, 2012

Posted by Maniac in HD Format War, Histories.

Are you guys ready for another history story?  Back in 2005, the world was itching to bring High Definition TV into the mainstream.  While PCs had enjoyed HD content for years prior, it was finally being adopted into the home theater market.  The problem was that DVD was not a sufficient technology for HD video, and the movie studios knew it.  With its red laser, DVD simply did not have the capacity or bandwidth to support high-definition playback.

While HD televisions were finally being sold to the mass market, the install base was still quite low; in 2006 only about 20% of homes had an HDTV, and very few of those households had any native content to watch on it.  Not all of the next-generation consoles had even been released yet.  The time had come for the successor to DVD: a new video format capable of delivering movies in high-definition video and audio and showing HDTV owners the full potential of their new TVs.

The problem was that, as with any large group, the manufacturers couldn’t agree on how to go about making this new format, or on who would set the standard.

Sony had a product in mind to meet this demand.  It had a new optical disc format with a storage capacity of around fifty gigabytes.  Dubbed Blu-Ray Disc (BD for short) after the blue laser the player would use to read the disc, Sony believed the extra storage capacity would be perfect for holding large series, loads of special features, and completely uncompressed surround sound audio tracks.  The problem was that it was so dissimilar to what was on the market that new facilities would have to be built to mass produce the discs.

Toshiba had its own ideas.  It had a disc format in mind that, while it did not have nearly as much storage capacity as Blu-Ray, came with a full list of technical features and software ready to go that Sony’s didn’t.  At launch, the format would support picture-in-picture commentary, and would let any user with internet access download new content to their player.  Toshiba called the format HD-DVD, and unlike Blu-Ray it could be mass produced in existing facilities.

Each side believed it had the superior product, but everyone knew there was a major elephant in the room.  Sony was producing its next-generation console alongside its new high-definition format.  The Playstation 3 was the third console from Sony’s highly successful gaming division, which had won the previous two console wars by a landslide.  The decision to include DVD playback in the PS2 at launch had been an enormous boost to its initial sales, letting Sony pitch it as not just a gaming platform but an entertainment device, and Sony was banking that including a Blu-Ray player inside the PS3 would be just as big an advantage as it had been the previous generation.  None of the other consoles would support Blu-Ray out of the box, but there were musings that Microsoft might release an adapter to allow HD-DVD playback on the Xbox 360 after that console’s launch.

The sides were chosen.  Universal Studios would be an HD-DVD exclusive provider.  20th Century Fox and Disney both decided to join Sony Pictures and exclusively support Blu-Ray Disc.  However, not all the studios were willing to choose a side just yet.  Paramount and Warner Bros, who between them probably had the biggest catalogs of all, would remain neutral and support both platforms.

However, regardless of who won or lost this format war, the true loser was going to be the home consumer.  With only a twenty percent install base to fight over, Sony and Toshiba were each going after a small portion of a niche market, which was probably one of the worst business decisions anyone could make.  Studio exclusivity also meant there would be movies that never saw a release on the other format; a consumer who chose a Blu-Ray player would be forced to buy the DVD of anything released exclusively for HD-DVD, and vice versa.  And whoever ended up buying the eventual losing format would be forced to buy the winning format after the fact, and possibly rebuy their movie collection.

There were rumors of an 11th hour meeting of the minds to stop the format war before it started, but it fell through.  The war, it seemed, would be decided by the consumers.

Know Your Gaming History: Where Did the Term Game Gods Come From? September 30, 2011

Posted by Maniac in Histories.

I think there are a lot of people out there who want to know more about the history of gaming but don’t know where to begin. I do not believe I would be any good to anyone without a full history of the industry I’ve been covering on this site for over the past year, nor do I think anyone should dare put a finger to the keyboard without being fully versed in what they’re writing about. As someone who has been following the history of gaming for the past ten years (and sharing some of that information with all of you), I would like to share a few of my favorite sources for gaming history.

Unfortunately, there is a lot of disinformation out there (even from normally reliable sources). We live in a world where I saw a documentary on The History Channel call the original Playstation the first compact disc game console, which is completely incorrect. Compact discs had been used in game consoles since the days of the CD-i, 3DO, and Sega CD, all of which came out around 1991 (some first in Japan), whereas the original Playstation launched in December 1994 in Japan.

So where can one find good information about gaming history, and where did gaming start? Well, I don’t want to give a whole lecture about the history of games in general (that might be for another day) but I would like to float out some great sources I’ve found over the course of my life which still hold up.

Anyone who visits the site regularly knows I throw around the term “God” as a nickname for a select few game developers, and I’m sure some of you wonder why. That’s because of a published list of the developers who at the time were considered the best in the field (many of whom are still fondly remembered), called the “Game Gods.” The list came from an article written ten years ago for PC Gamer magazine, which compiled a pretty lengthy roster of the best game makers, working on some of the best PC games of the day, and called them the “Game Gods.” Guys like Warren Spector, John Carmack, Tim Schafer, Sid Meier, and Will Wright made that original list, which is why whenever I reference them in my work they carry the nickname of God. A year later the list was expanded with what the magazine called the “next” generation of game gods, focusing on younger developers it believed showed a lot of promise; that list included American McGee, CliffyB, and Stevie Case.

There was later a G4 show called “Game Gods,” but it profiled guys like John Romero, which was kind of odd, since it interviewed genuine game gods (Carmack and Spector) to talk about someone who lost his title after making Daikatana. Although, to G4’s credit, according to Masters of Doom, Romero had been openly calling himself “God” at the high points of his career. While I would not consider him a God (I don’t think he was featured in the original article), he may be the origin of the term.

There’s no question that while some of those developers have retired from game development, most of the guys who made the original list are still at the top of their field and have released tons of hits since the article was published. The next generation, on the other hand, was a bit more hit or miss. While CliffyB has since graduated to full God status with the Gears of War franchise, Stevie Case left the gaming business for many years, and American McGee never again received the same high praise for a game after the release of Alice.

I don’t think PC Gamer ever made another list after the next generation. I cancelled my subscription about a year ago after a ten-year run. I don’t know, the magazine had some pretty awful redesigns over the years and lost some of its charm for me, but there are some older issues I’ll still go back to. Nowadays the term gaming refers to more than just the PC, and I think the list should be rewritten to take into account console developers and the people who created the foundation for gaming as it is today, like Gunpei Yokoi and Shigeru Miyamoto. Plenty of new people have also stepped up over the past ten years to take a swing at that title, and have probably earned it, including David Jaffe, Sam Lake, and Hideo Kojima.

I think for this new millennium a new list must be made. Any takers or do I have to do everything myself?

History of the N-Gage August 1, 2011

Posted by Maniac in Histories.

Does anyone else remember the Nokia N-Gage? When the cell phone boom began and everyone under the sun started getting personal phones for business or pleasure, Nokia did very well for itself as one of the major cell phone manufacturers. To appeal to younger people, its phones were fully customizable with replaceable phone bodies and programmable ringtones, and even featured a pretty addictive Snake game. They sold like crazy.

Six years ago, Nokia decided to run an experiment and throw its hat into the gaming ring. As one of the biggest personal device manufacturers on the planet, it figured it could compete with the likes of Nintendo in the handheld gaming market, because its handheld was going to be more than just a gaming device: it was going to be a personal phone as well.

Called the N-Gage, it was a revolutionary idea that the technology of the time just could not deliver well or affordably. The device was far too expensive for a cell phone or a handheld gaming device of its day, and unlike the modern smartphones of today, it was designed for only those two tasks. It did not have a varied library of productivity software; you could only use it to make calls or play games. The design was pretty bad too: you had to take out the battery to swap game cards (yes, it had physical games, which came on memory cards), and you had to hold it sideways like a taco to make a phone call. And the launch price was not set to compete with the Game Boy; it was set to compete with the current generation of home consoles.

But what I think killed it most of all was Nokia’s attitude. Early on, Nokia told the press it didn’t expect its customer to be the kind of person who would pull out a Game Boy in public. Well, who do you think Nokia’s potential market was? The gamers! This offended a lot of the gaming population, the product’s primary market. On top of that, the only place you could get one at launch was a dedicated game retailer like GameStop, which cut out the potential impulse buyer who needed a new cell phone and might have been impressed by the phone’s capabilities. When it launched, revolutionary as it was, reviews were abysmal, and people were not willing to gamble that kind of money on it, no matter what was promised.

After the initial negative reviews of the original model, Nokia released an updated version of the phone called the N-Gage QD. The redesign was a step in the right direction. You no longer needed to hold the phone on its side to make a call, and you didn’t have to remove the battery to change games. In a lot of ways it was a better device, but more than that, this is the device Nokia should’ve RELEASED IN THE FIRST PLACE.

There’s a cool article about the phone online that I got a look at this weekend, and it brought back all my memories of the device. I never owned one personally, but a cousin of mine knew a guy who owned an independent Cingular store. When the redesigned N-Gage QD hit shelves and was finally for sale at cell phone retailers, he loaned me one for a week, and I barely used it. By that point I already had a Sony PSP, which, while it wasn’t a phone, already had a great library of games and movies that I was enjoying a lot more than a generic golf game on a phone.

Interesting postscript to that story: one of the original media proponents of the N-Gage was former G4 hottie Laura “Thug” Foy, who now does a weekly video series on Xbox Live about the newest software releases for Windows Phone 7.

PS3 HDMI Controversy Clarified and the History of HDCP July 13, 2011

Posted by Maniac in Game News, Histories.

Recently the internet has been in a bit of an uproar over a leaked internal GameStop memo saying staff needed to start informing customers that a new hardware revision of the PS3 would only allow HD playback of any kind on an HDMI-equipped HDTV.  Anyone using a legacy HDTV without HDMI support (like those using component cables) would have to buy a new TV or be forced to play their games and watch their movies in standard definition (considered by many online a fate worse than death).

When I first read the news I dismissed it, because I already KNEW this was coming, and GameStop had made a small error in the memo.  Today GameStop announced that it had been misinformed, or had at least misunderstood the new specifications for the PS3, and corrected its mistake, clarifying what would actually be happening with the Playstation 3’s HD output.  The internet is still in an uproar, of course, but it’s a little late to the party: this is something the movie studios have wanted for years, and it was announced at the beginning of this year that it would finally be happening in 2013, whether we liked it or not.

When Blu-Ray and its dead rival HD-DVD were announced, the movie industry was more overzealous than ever about protecting its movies from copying.  Whereas DVD’s copy protection was broken quickly and could never be changed, with the advent of upgradable player firmware the studios could simply change the encryption keys as new discs were released.  Of course, home owners would need to be told to plug their players into a dedicated high-speed internet connection and continually check for updates, and so far the studios haven’t done a good job of that.  That’s why there’s a leaflet in pretty much every Paramount disc saying, “Update your player’s firmware for this to work,” instead of a chapter listing insert.

The protection in question is called HDCP, and it’s been bad news since pretty much the start of the HD format war.  It’s also why all your HDMI-equipped HDTVs, receivers, computer monitors, and cables have the HDCP certified sticker on them; without it they’d be pretty much worthless.  HDCP-capable HDMI equipment has been standard since pretty much the day the connector was introduced, so if you’ve bought cables or an HDTV recently, don’t worry that your movies will eventually stop playing.
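For the curious, the reason every certified device can talk to every other certified device without ever transmitting a secret comes down to some clever (if famously weak) linear algebra in HDCP 1.x. Here is a toy Python sketch of that key-agreement structure; the real parameters and device keys are secret, so the matrix, KSVs, and numbers below are made up purely for illustration:

```python
# Toy sketch of the linear key agreement behind HDCP 1.x (illustrative
# only -- not the real keys or parameters). Each licensed device gets a
# public Key Selection Vector (KSV) and 40 secret keys derived from the
# licensing authority's secret symmetric matrix. Because the matrix is
# symmetric, source and display compute the same shared key.

import random

MOD = 2 ** 56          # HDCP device keys are 56-bit values
NUM_KEYS = 40          # each device holds 40 secret device keys

def make_ksv(rng):
    """A KSV is 40 bits, exactly 20 of which are set."""
    bits = [1] * 20 + [0] * 20
    rng.shuffle(bits)
    return bits

def make_device_keys(master, ksv):
    """Derive a device's 40 secret keys from the authority's matrix."""
    return [sum(master[i][j] * ksv[j] for j in range(NUM_KEYS)) % MOD
            for i in range(NUM_KEYS)]

def shared_key(my_keys, their_ksv):
    """Sum your secret keys selected by the *other* side's KSV bits."""
    return sum(k for k, bit in zip(my_keys, their_ksv) if bit) % MOD

rng = random.Random(42)
# Symmetric secret matrix held only by the licensing authority.
master = [[0] * NUM_KEYS for _ in range(NUM_KEYS)]
for i in range(NUM_KEYS):
    for j in range(i, NUM_KEYS):
        master[i][j] = master[j][i] = rng.randrange(MOD)

player_ksv, tv_ksv = make_ksv(rng), make_ksv(rng)
player_keys = make_device_keys(master, player_ksv)
tv_keys = make_device_keys(master, tv_ksv)

# Both ends arrive at the same key: ksv_tv . M . ksv_player is the same
# number as ksv_player . M . ksv_tv because M is symmetric.
assert shared_key(player_keys, tv_ksv) == shared_key(tv_keys, player_ksv)
```

This is also why uncertified gear is useless: without keys derived from the authority's matrix, a device simply cannot arrive at the matching key, and the source refuses to send it a usable HD signal.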

The problem is that the copy protection does NOT work over component cables!  Anyone who accesses an HD stream through those analog cables is getting a completely unprotected video output, and that has terrified every movie studio for the past five years, because it can be copied.  When the studios realized this back when HD-DVD and Blu-Ray Disc launched in 2006, they actually demanded that disc manufacturers not allow any HD output of the discs through component and provide HD output only through HDMI.  At the time, this was universally considered the dumbest idea ever.

HDTVs had a 20% install base at the time of the format launch, and HDMI was only then becoming a standard port on new TVs.  HDMI-equipped HDTVs were rare and extremely expensive, and the HDTVs people did have, while they lacked HDMI, worked just fine and displayed a picture with negligible quality difference.  Five years ago people were still just as cheap as they are today (sarcasm) and just didn’t want to buy new TVs.  Since these people were ALSO the immediate customers for the new HD formats, and since a war was about to be fought over that 20%, the studios did a rare sane thing at launch and did not restrict the component HD stream of the movies.

But the day has come when this status quo will cease to be.  Starting in 2013, all new Blu-Ray players will be made without any component connections at all.  Current players will be fine and continue to run, although a new firmware update might be coming for them.  Current discs will not be modified and their HD streams will continue to play through component, but future discs will not allow HD playback over component.  Discs with this restriction will be marked on the package.  If you don’t have an HDMI-equipped HDTV and you want to enjoy new Blu-Ray movies in HD, you’re going to need to buy one.

Now this brings me back to the PS3, which I’ve kind of neglected during this history lesson.  As a Blu-Ray player as well as a gaming console, the PS3 is just as subject to this change as the dedicated players, and what GameStop was informing its staff of is essentially what I just said in the previous paragraph.  This was the decision of the big movie studios, not of Sony or the Playstation teams; if it had been left to the studios, it would have happened sooner.  It’s an unfortunate predicament, but for those who have followed the format for some time, an expected one.

You’ll still be able to play all your PS3 games in HD if your HDTV supports the resolutions through component, but future Blu-Ray Discs will require HDMI playback for full HD support.  Look for the label to know for sure, and if you don’t have an HDMI equipped HDTV, I recommend you start your shopping for a new HDTV.

The Life and Death of E for All (Part 3) December 10, 2010

Posted by Maniac in Histories, The Life and Death of E For All.

After it was over, the press was not kind to E for All that first year.  Some were still bitter from the PAX fiasco, but the negativity was not without merit.  E for All was NOT what people were expecting.  When the organizers made promises like, “It’s E3, but the public can get in too!” the public was disappointed to discover that was not what they got.  The 2007 show was far too barren to rival what E3 was in its prime.  While what was shown got plenty of positive press, especially the Metal Gear Solid 4 and Super Smash Bros Brawl showings, E for All got a major negative mark across the board from attendees, including myself, who had been former E3 attendees.  We knew it could’ve been a much better show.

What was really telling was that many of the developers who attended the first year, specifically Nintendo, announced no plans to appear at the next one.  Many felt Nintendo was the unquestionable ruler of the first E4, and if it wasn’t going to show in 2008 it must’ve been because it didn’t feel the show was worth it.  Penn and Teller did tape an episode of their show there, however.  I feel bad I missed the opportunity to be on Showtime; I’ve already been on HBO (but that’s a different story which has nothing to do with gaming).

In 2008 things were not looking good for the ESA.  The smaller E3s were just not getting the attention the older ones used to, and gamers were disappointed by the downsizing.  The disappointment of 2008’s E3 seemed like a cry to bring the old show back.  Many attendees of the smaller E3 lamented that all those empty halls and booths were begging to be filled once again with games.

The ESA was also losing membership and funding because of the cancellation of the full show.  Game developers were dropping their memberships left and right.  Because E3 had been its major revenue stream for the year, the ESA had to raise its membership rates, and without E3 as a reason to belong, many developers just didn’t care to spend the extra money to remain in the organization.

There was only one logical solution: bring E3 back for 2009, and that was the decision the ESA made.  E3 would return to its full glory as everyone remembered it.  With E3 back, E4 was history, and not too many were upset over the loss.  E3’s triumphant return would mark one of the best gaming years of this generation, and the tradition continues to this day.  E4 will likely be remembered as a footnote, a dark time in the history of trade shows.

But for all its faults, E4 did have the right idea.  Security concerns aside, why doesn’t E3 allow the public into the show?  Almost all of its rival trade shows offer at least one day of public admission.  Heck, PAX was created to give the public the chance to see these games early, and its security is more than adequate to handle the crowds.  Sure, the occasional person acts up, but they are promptly arrested without much ruckus being caused.

Open your show to the public E3, it’s for your own good.  You don’t want another imitator to go for your crown again do you?

The Life and Death of E for All (Part 2) December 9, 2010

Posted by Maniac in Histories, The Life and Death of E For All.

Before E for All let a single gamer through the doors of the LA Convention Center, it had already found itself in the middle of a controversy.  The yearly Penny Arcade Expo (PAX) was taking place in Seattle, WA some time before the inaugural E4.  PAX was (at the time) the biggest gaming expo open to the public and had been established for years.  To solicit the same gamers, or perhaps to capture some of PAX’s magic, E4 promotional people handed out trinkets and swag bearing the E for All logo outside the convention center where PAX was being held.  They did this without permission from PAX’s organizers, and many deemed it outright illegal.  The gamers they solicited were so angry that they defaced a lot of the E for All merchandise and threw much of it in the trash.  It was a very negative start for an organization seeking legitimacy in an already established hierarchy.

The first show drew closer.  Badges would cost around $100US for an early three-day pass.  The organizers promised it would be exactly like E3, only open to the public.  E3 had previously filled the LA Convention Center halls to capacity and then some, and the gamers who were not put off by the PAX debacle and wanted to finally experience the magic of E3 willingly put their money down to get in.

The first thing a gamer noticed when walking into the south hall was that it was empty, VERY empty.  Just one hall of the Convention Center was being used, and there were barely any booths to fill it.  To take up space, the organizers set up a food court and a “Gamer’s Lounge” in the hall, but anyone who had previously attended an E3 could see just how much empty space there was.

Only one of the major console manufacturers, Nintendo, agreed to appear, as it had a huge lineup coming to its hardware for the Christmas season.  Super Smash Bros Brawl, the sequel to Melee, the biggest game on the GameCube, was there in full force and attracted a crowd big enough to fill a stadium seating rig.

Without Sony and Microsoft showing, it was up to the major publishers and smaller developers to fill up the hall space.

Intel and HP had booths set up but very little to actually show.  HP had Gears of War for PC and Unreal Tournament III on hand to play, but Gears of War had launched on the 360 the year prior and, with a 360 controller, played identically on PC, and Unreal Tournament III already had a demo out by the time of the show.  Intel had a race car simulator on hand, and was green-screening dancers in exchange for free 1GB USB drives.  This was my video.

Smaller developers like Telltale Games, who were showing off the second season of Sam and Max, got some press, and there were PLENTY of samples of Five Hour Energy being passed around.

Nintendo aside, if there was a reason to attend E4, Guitar Hero III was it.  Entire stages were set up with stadium seating just to hold Guitar Hero II competitions.  The winners (chosen by independent judges) would get time on demo units of Guitar Hero III with early wired Les Paul controllers.  Demo kiosks for the game were set up along the back, giving attendees their first chance to rock on songs like “Even Flow” and “The Metal”.

But EA was there to compete with Guitar Hero.  The line to demo the first Rock Band wrapped around the big rig trailer brought in to play it on.  All positions were available: guitar, bass, and, for the first time, drums and vocals.  An attendee’s whole day could be spent just waiting to play it.

But let’s not forget the unquestionable reason for the first E4: Konami’s first playable demo of Metal Gear Solid 4 was at their booth on the show floor.  Konami’s developers were on hand to brief attendees on the new SIXAXIS control scheme and answer questions while gamers played.  I had been waiting for MGS4 for years at this point, and had already bought a PS3 just to play it.  Ryan Payton himself answered some of my questions as he kicked us out of the demo so the next group could play.

After three days, the first year of E4 was over, and it would return to the same venue the following year.

The Life and Death of E for All (Part 1) December 8, 2010

Posted by Maniac in Histories, The Life and Death of E For All.
add a comment

The year was 2006 and the Electronic Entertainment Expo (E3), the biggest gaming trade show in the world, was starting to get a little long in the tooth.  Formerly an essential trade show where companies could court retailers on what products would be coming out in the next year, by 2006 the show had become a media circus full of loud noises, giant screens, and scantily clad women.  Oh boy, was it heaven on earth.

The suits were starting to complain.  It was becoming harder and harder for the corporate end of gaming to operate.  The backroom deals that were essential for game promotion and sales were taking a backseat to the media circus outside.  They didn’t like that it was getting harder to navigate the show floor from one scheduled meeting to another.

The gamers were starting to complain.  What was originally a venue for game developers to announce major news, show new footage, and premiere new games was getting wrapped up in its own hype machine.  Those things were still being done, but in order to stand out in the organized chaos, developers and publishers competed over who could have the loudest music, the biggest screen, and the prettiest supermodels, attracting press coverage through spectacle instead of through the merits of how well a game played.  What used to be a show of substance had become a show of hype and little else.  What’s more, new trailers barely had any actual gameplay footage in them, and playable demos were rarely provided unless a game was very close to release.

The ESA, which ran E3, decided enough was enough and it was time for a change.  They discontinued E3 as we had known it, shifting instead to a smaller, more manageable show.  A power vacuum formed among the game trade show organizations.  E3 had been the unquestioned king of trade shows, and with it gone, another show, either new or existing, would have to take the crown.  TGS, PAX, GAME and GDC were ready to ramp up their shows, and developers were happy to increase their spending at those shows to make up for E3’s loss.

But a new trade show was ready to step in, planning to take E3’s crown by being everything E3 was and more.  It would be held in the exact same place, the Los Angeles Convention Center, and its organizers promised that game developers would be there, ready to show their games to the gamers.  The improvement over E3 was that their show would be open to the public.  Their name, picked in an internet contest, would fittingly be E for All (E4), and they were ready for their first show in October 2007.

Little did the attendees know that what they were anticipating would end up being a massive disappointment.

The Laserdisc Revolution November 26, 2010

Posted by Maniac in Histories.
1 comment so far

A few dots appear on a screen and maybe a few dashes, some of which may or may not have color to them. Beeps and bloops come out of the speaker. Insert twenty-five cents please. In the early 1980s, that’s what you would call a game.

But gamers were about to get something more.  A new storage medium, one capable of displaying full motion video, was on gaming’s horizon, and it made most other games look like antiques in comparison. These games used laserdiscs, a new technology for home video that at the time was light years ahead of what VHS and Beta were capable of. Any individual second of a movie could be called up from the disc with minimal seek time and no need to rewind or fast forward. Programmer Rick Dyer had been trying for years to find a format capable of producing an interactive game. He started with cash register paper on a mechanical machine, a great idea but one that couldn’t be mass-produced easily. The laserdisc format was perfect for the next generation of arcade games. All that was needed was a piece of hardware attached to the player, programmed to take a user’s response and compare it to the correct solution to get the player through each level.

The machines themselves consisted of a built-in controller with a stick capable of eight possible directions and one attack button, making for a total of nine possible input options at any danger moment. A standard definition monitor displayed the images off the laserdisc, and sound was output in 2-channel stereo. If the user responded correctly, they would survive the sequence and move on to the next. If they couldn’t, they would fail and have to replay it.
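The branching logic described above is simple enough to sketch in a few lines of code. This is purely illustrative; the scene names, data layout, and function are my own invention, not anything taken from an actual laserdisc game's hardware or ROM:

```python
# Hypothetical sketch of a laserdisc "danger moment" check.
# Eight joystick directions plus one attack button = nine possible inputs.
JOYSTICK_DIRECTIONS = ["up", "down", "left", "right",
                       "up-left", "up-right", "down-left", "down-right"]
ATTACK = "sword"
INPUTS = JOYSTICK_DIRECTIONS + [ATTACK]

def play_sequence(scene, get_player_input):
    """Compare the player's response against the one correct move for this scene.

    `scene` is an invented structure holding the correct move plus which
    laserdisc sequence to seek to on success or failure.
    """
    move = get_player_input()           # one of the nine INPUTS (or None)
    if move == scene["correct_move"]:
        return scene["next_scene"]      # seek the disc to the next sequence
    return scene["death_scene"]         # play the death clip; player must retry
```

The hardware's whole job amounted to this comparison plus telling the laserdisc player which frame range to seek to next, which is why one generic cabinet could host future games like Space Ace with only a disc and data swap.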

Dragon’s Lair, directed by film veteran Don Bluth, would become one of the first arcade games to make use of this technology. Released in 1983, its arcade cabinet was also designed to be easily adaptable to any future game release. In no time at all, Dragon’s Lair became a cult classic. While arcade games usually took months to recoup their cost for the arcade owner, Dragon’s Lair paid for itself in a matter of weeks, unheard of in the arcade market. The fascinating graphics and challenge of Dragon’s Lair gave players the incentive to keep playing.

Built on top of Dragon’s Lair’s hardware, Bluth’s and Dyer’s studios created the next game to take advantage of the laserdisc arcade machines, Space Ace, and released it around a year later.  Fortunately for arcade owners, Space Ace could be bought in full or as a much less expensive upgrade package that converted an existing Dragon’s Lair cabinet. It was almost like what a home player could do by just swapping out the cartridge in their Atari 2600!

But there was one thing that not even Dragon’s Lair and Space Ace could outrun: the gaming collapse they had found themselves in the middle of. By 1983, the greatest crash in gaming history had begun. The flooded market could no longer sustain itself, and even the biggest companies were going out of business. While Space Ace and Dragon’s Lair were selling well, the arcades displaying them and the publishers funding them were going bankrupt and closing down. The problem was that the two games were single-handedly holding up a market that was not just declining but ready to burst. By the time their third game was nearly complete, the money had evaporated and the company was forced to lay off all its employees. It was expected to be a temporary inconvenience, but it lasted for years.

Dragon’s Lair II would finally find a new investor five years after the entire staff had been laid off and the Bluth Studios had moved to a different country. The arcade market had become an entirely different beast in those years, and a new champion, Nintendo, reigned supreme over the home gaming market. Graphics were now better, and since more games had used the technology in the meantime, the initial wow factor of a laserdisc game was gone. The arcades would need to evolve again or lose to the home video game market.

Technology may have improved, but the games have stayed the same. The resolution may have changed, but it still takes the same moves to progress through sequence 13 and save Princess Daphne. The games have stood the test of time. Dragon’s Lair has been ported to over a dozen formats over the past 25 years, ranging from early PCs to the iPhone to Blu-Ray Disc. In many cases the players are the very same people who put down 50 cents back in 1983 just to try to make it to the last screen. Playing it again brings them back to a time when the Arcade was King and the Princess was not in another castle.

Console Wars V – Part II October 20, 2010

Posted by Maniac in Console War, Histories.
add a comment

Microsoft launched the Xbox 360 without even a Halo game announced for launch or internal Blu-Ray support.  A year ahead of the competition, the Xbox 360 struck first and hard.  At a release price of $399 with a hard drive, or $299 without one, the Xbox 360 was the biggest Christmas item of 2005.  The number one game for it?  Halo 2 for the original Xbox, the first game Microsoft made backwards compatible with the new console, given an HD facelift with a native 720p resolution and the texture pop removed.  This made Xbox fans see the 360 as a logical upgrade, exactly what Microsoft wanted.  Of course, there were also native 360 games like Call of Duty 2, the biggest 360-exclusive seller of the Christmas season; Perfect Dark Zero by Rare, who made the original Perfect Dark, considered by many the best game on the N64; and Dead or Alive 4, the first HD entry in the popular Dead or Alive fighting series.

Sony released the Playstation 3 a year later, and just as with the Playstation 2’s launch, they could not meet their original shipment numbers for the console.  Unlike with the Playstation 2, however, nobody cared how many Playstation 3s were on shelves; they weren’t going to buy them.  The high price and the lack of any high-profile games failed to convince early adopters to buy the system immediately.  Blu-Ray, seen as the killer feature of the new console, found itself in the middle of a format war at launch.  Sony’s competition in the HD war had launched HD-DVD months earlier, with several exclusive studios and a strong early lineup of movies.  Sony also had exclusive studios, including Disney, 20th Century Fox and their own, but those studios had made very bad movies the previous year.  When these bad movies got their BD releases in the initial run, nobody cared about them, and Blu-Ray was not the killer function Sony was banking on to sell the Playstation 3 in its early months.  Even the lack of vibration in the new controllers made Playstation 1 and 2 owners think twice about treating the Playstation 3 as an HD upgrade for all their current games, keeping their current Playstations instead.  But nothing was more damning for Sony than the high price point, over $200 more than their chief rival, and the poor showing of launch titles at that year’s E3.  Very quickly, supplies of Playstation 3s made it to retail shelves, where they just sat and collected dust.

But then something happened, something Sony and Microsoft had not even considered.  The Nintendo Wii released shortly after the Playstation 3 and was selling out everywhere.  Even with a larger launch supply than the Playstation 3 had, retailers could not keep it on shelves.  While sellouts were a common occurrence with a new console (the Sony PS2 could not be found on shelves for over half a year after release), the Nintendo Wii, the cheapest and lowest-powered console of this generation, was sold out for over a year.

Despite all the processing power and HD graphics of the Xbox 360 and Playstation 3, the Nintendo Wii has become the clear leader in this generation’s console war based on sales.

But how did the Wii, if it did everything wrong, still come out on top? Its graphics were nowhere near as good as the other consoles’, but with low system specs came a low price for the unit. The low price point at launch was a big initial draw. On top of that, even at that low price Nintendo was still making a profit on every unit sold; Sony, even with their high price, was losing a fortune on each Playstation 3.

What about the lack of DVD? Well, the truth is that by this point everyone already had a standalone DVD player, making DVD playback not the must-have feature it had been the previous generation. Also, Blu-Ray, which Sony was banking on to sell PS3s just as DVD had sold PS2s, simply was not taking off as quickly as Sony expected, due to the low install base of HDTVs at the time (only about 20 percent of homes had one, and most of those only supported 720p or 1080i) and the poor initial lineup of movies.

The Wii also came with the best game for the system packaged inside the box, just as the original NES had come with Super Mario Bros./Duck Hunt. By being bundled with the console, Wii Sports became one of the highest-selling games of all time, with over 50 million copies sold as of this writing.

The war was just beginning, but the lines were immediately drawn, the positions were immediately cast, and the fanboys were completely fanatical about their sides.  More games would come, prices would drop, and new technology was on the horizon.  Will the console war shift?  We’ll find out soon.