
What Happened to Quinni-Con? May 7, 2018

Posted by Maniac in Histories.

In 2012, a good friend of mine told me about a free fan convention that was going to be held at Quinnipiac University. It was called Quinni-Con, and it focused on anime, cosplay and video games. To someone who had spent his late teen years repeatedly traveling to the other side of the country to attend fan conventions, a local convention seemed like a pipe dream. So my friend and I packed up my car and drove to the Quinnipiac York Hill Campus.

The inaugural Quinni-Con was held in the Rocky Top Student Center. The venue not only looked cool, it had plenty of space for all of the attendees. The cafeteria area was perfect for vendors and an artists’ alley. The high-tech classrooms on the second floor were perfect for screening rooms and Q&A panels. There was even a maid cafe in the back offering tea and small cakes free of charge. What followed were two days of fandom bliss. We had a blast.

I wasn’t much of an anime fan in my youth, but the person who went with me was the biggest anime fan I know. Because of my lack of anime experience, I was a little lost when deciding which panels we should attend. Thankfully, I did recognize something on the list: Pokémon. Q&A panels for a web series called Pokémon ‘Bridged were being held throughout the weekend, hosted by 1KidsEntertainment and Nowacking, two of the three creators of the series. I used to watch episodes from the early seasons of Pokémon back when they aired on Kids WB!, and Pokémon ‘Bridged took those exact same episodes I remembered and redubbed them with hilarious results. I had such a nostalgic blast watching their series that I became an instant fan! Ironically, my anime friend was not familiar with Pokémon, but I was able to explain it well enough for him to understand the panels. We also enjoyed the Q&A panels hosted by voice actress Lisle Wilkerson. She and my friend were both fans of some animated shows that aired in Japan during the 70s, and they had a spirited discussion about them. Overall, it was a great first year, and it looked like the second would be even better.

Quinni-Con 2013 was even better. Before the con started, I bought my first Nintendo handheld in twenty years, the 3DS XL, and made the decision to get into Pokémon games. OneKids and Nowacking were returning along with a new guest, voice actor/director Chris Cason. Walking onto the York Hill Campus that second year was like stepping into a real-life Pokémon Center.

Yep, that’s what it looked like!

Voice actor and director Chris Cason hosted a few Q&A panels during the day. Some of them were more formal and focused on his work and others were more laid back, where he got to know the attendees at the show.

Chris was a funny guy with a great personality, and listening to him talk about what it’s like to be in a recording booth directing people like Briana Garcia was amazing. Overall it was an incredible experience, and certainly better than the previous year. The next event could not come soon enough!

Quinni-Con 2014 was held about a month earlier than usual, but it was very welcome. The cherry blossoms were in bloom at the Rocky Top Student Center, making it the perfect photo spot for tons of cosplayers in kimonos. The complete cast of the now named Elite 3 was in attendance for the first time, and they had an all-new episode of Pokémon ‘Bridged to show us. It was great to finally meet xJerry64x. Quinni-Con even held a real-life Pokémon Center panel, where attendees were encouraged to use the time to trade, StreetPass and battle each other with their Nintendo handhelds.

However, I would be remiss if I didn’t mention that Quinni-Con was not entirely without problems. A quick search of the hashtag #quinnicon on Twitter will bring up posts by a few angry Quinnipiac students resentful of the people hanging out in costumes at the Rocky Top Student Center. Some convention attendees responded kindly; others were not so kind with their responses. By the end of Quinni-Con 2014, word started to circulate that the organizers wanted to expand the con, and they started taking ideas from the attendees.

I had previously done video reviews of the convention, so I attended the idea panel. I spoke honestly about what I liked about the event and how some panels could be improved in the future. Apparently, some of the people in the panel room were aware of my videos and told me they appreciated them. My favorite bit of advice remains the request to put something up on the projector during the Pokémon Center panel. The organizers loved that idea and said that if they couldn’t run their own live Pokémon feed on the projector, they would be fine showing a Pokémon Let’s Play instead. I made it clear that overall I thought the con was fantastic the way it was, and that part of the reason it was so great was because it was local and free. I begged the higher-ups not to move forward with their plans for expansion.

There was no Quinni-Con 2015, even though there were plans for it to happen. The event organizers sent out a mass email to former attendees and stated their expectations for the next convention were grand and required at least another year of planning. I was disheartened to hear that, but I understood their perspective. The reality, it seemed, was worse than I expected.

It turned out new management had taken over the con. By itself, this is understandable, and it is actually more common than you might think with a student-run convention. Students graduate, and organizations are meant to shift to new student managers each cycle. The problem is that the new managers may not be up to the task of running an elaborate event the way the previous ones were.

Quinni-Con 2016 was announced, and it would be held off-campus for a fee of $15. The con organizers planned to host the event at a hotel/waterpark/convention center in Waterbury, CT. While this sounds like a fine place to hold an event on paper, it was the worst choice of venue the organizers could have made. Geographically, Waterbury is pretty far from Quinnipiac University, and getting there requires at least an hour’s drive on Connecticut highways that are always jammed. The hotel they chose had a two-star rating online, and even worse, it was set to close the day after the con ended. What incentive did the cleaning crew have to actually sanitize a hotel that could get dirty very easily if they were all about to be let go? The Elite 3 were invited to come, and they were willing to tough it out despite the cleanliness concerns, but only if they were booked at another hotel. The con agreed, but it turned out to be all for naught.

Six days before the event was to be held, Quinni-Con 2016 was cancelled in the worst way possible. No official announcement of the event’s cancellation was ever posted on the convention’s official website (which is now defunct) or Twitter feed. The ONLY place on the entire internet that mentioned the event’s cancellation was on their Facebook page…which at the time had a tendency to lock out non-Facebook users from even viewing it.

It’s all quite a shame. In its prime, Quinni-Con was a well-run fan event. It brought locals together in a way this part of the country sorely needed. I’ve been a fan of Pokémon ‘Bridged since finding out about the series at the first Quinni-Con, and I consider it a huge contributor to my return to Pokémon fandom. It would be nice to see the event return to the Rocky Top Student Center some day, and when that day comes, I’ll be there.

Console War VI Part 2 May 3, 2016

Posted by Maniac in Console War, Histories.

This is the second part of our ongoing history on this generation’s console war. If you’d like to read the first part of this article, please click here.

If there was anything that gamers could take away from E3 2014, it was that Microsoft was going to do everything they could to try and regain their lost customer base.  Without telling their developers, Microsoft announced they were no longer bundling the Kinect sensor with every Xbox One, effectively dropping the price of the console by $100 US.  As for new titles, while Microsoft had its own games coming to the Xbox One including Halo 5, they had bought exclusive console rights to many third-party published games hoping gamers would buy an Xbox One to play one of their exclusives.

Nintendo meanwhile was having their own problems with the Wii U. While the system sold decently at its launch, Wii U sales stagnated after the launch of the PS4 and Xbox One. When a console’s hardware sales slump, sales of third-party games on that console slump as well. Once they saw their games were not selling well on the Wii U, third-party publishers slowly pulled their support from the console, leaving Nintendo alone to develop and publish the vast majority of Wii U games. Nintendo announced they were cutting back on E3 plans in 2014 and decided to focus on showing their next major title, Super Smash Bros for Wii U, directly to fans by making E3 demo units playable to the public at Best Buy retailers.  While the general public could only preview the game for a few hours, Nintendo also hosted an enormous Super Smash Bros tournament in Los Angeles during E3. This epic tournament was not only open to the public, it was streamed live on Twitch.TV to a massive viewership.

When Super Smash Bros was released on the Wii U a few months later, it became a huge seller, but it wasn’t the only hit Nintendo had. Around the same time Nintendo was promoting Super Smash Bros, they announced a new technology coming to the Wii U called Amiibo. Amiibos were collectible figurines that made use of the Wii U controller’s NFC transmitter. They were designed with the likenesses of various popular Nintendo characters, and Nintendo promised that anyone who used them could bring their Amiibos into their games, and that the figures would become more effective the more they were used. When Amiibos launched alongside Super Smash Bros on the Wii U, retailers could not keep them on shelves.

The PS4 was still selling very well throughout 2014, but even looking back at events with the perspective of time, it is difficult to precisely determine why. Microsoft’s aggressive third-party buyouts ensured many popular titles like Dead Rising 3, Sunset Overdrive and Titanfall would remain exclusive to the Xbox and/or PC platforms, but gamers saw these exclusives as corporate pandering, refused to buy the Xbox One on principle, and voiced constant complaints to the publishers. While Sony only had their first-party developers to rely on for exclusive PS4 titles, they boasted that the PS4 would offer superior graphics and performance over the Xbox One when it came to multiplatform games, and many independent testers confirmed this was the case. The PS4 did get great exclusive titles like inFAMOUS: Second Son and Until Dawn, but their releases were widely spaced out. In fact, a vast majority of the system’s exclusive titles were ports of popular PS3 games like The Last of Us and God of War III. The strength of the multiplatform titles and the promise of great upcoming exclusives like Uncharted 4: A Thief’s End made the PS4 the highest-selling console of the year.

Instead of hosting a live press event during E3 2015, Nintendo once again chose to host live demonstrations of their games at retail stores across the country. While they had great success with titles like Mario Kart 8 and the Wii U version of Super Smash Bros, their next major Legend of Zelda game was suffering delays in development. Nintendo had also planned some incredible fan events, including the first Nintendo World Championships in over a decade! While the gaming press didn’t understand what Nintendo was doing, the fans who attended these events sure enjoyed themselves. Nintendo had also been tremendously successful with their line of Amiibo figures, which were the perfect combination of collectible and game accessory. Limited-run figures would sell out immediately. Heck, they were even being purchased by people who didn’t own a Wii U. Everyone was buying them like crazy, so Nintendo started integrating Amiibo support into many of their new games, including Super Mario Maker.

By E3 2015, Microsoft had gotten desperate. The exclusive titles they had bought were not selling as many consoles as expected, and third-party publishers were becoming wise to the fact that Xbox One exclusivity meant an unacceptable drop in projected sales. Public negativity towards the Xbox One had harmed Microsoft’s image, and even though Microsoft had stopped bundling the Kinect, price-matched their competition and pledged not to engage in anti-consumer resale restrictions, gamers were still choosing the PlayStation 4 over their console. At E3 2015, Microsoft announced they were bringing Backwards Compatibility (BC) for select Xbox 360 games to the Xbox One. Similar to how the Xbox 360 could play only specific original Xbox games, Microsoft vowed that with a simple update and installation, gamers could play select Xbox 360 games natively on the Xbox One, whether they downloaded them or bought them at retail!

Microsoft’s Xbox One Backwards Compatibility announcement got a mixed reaction from the mainstream gaming press. While it was undoubtedly great news for consumers, it couldn’t guarantee console sales this late in the Console War. The Wii U had full Backwards Compatibility with Wii games and hardware, and it hadn’t helped convince most Wii owners to upgrade. Nintendo went to a lot of trouble to make it easy for gamers to transfer all of their purchases, saves and DLC from their old system to their new one, but most consumers weren’t even aware of it. While Xbox One compatibility with popular Xbox 360 games was a great show of goodwill on Microsoft’s part, there was just no way to know if this would be the decision that changed gamers’ minds about the Xbox One.

Meanwhile, Sony had no direct response to Microsoft’s Backwards Compatibility announcements, since the PS4’s hardware was completely incompatible with the PS3’s. If players wanted to play PS3 games on the PS4, they would have to wait for Sony to port them, and if Sony happened to port a major retail title, consumers would usually have to either rebuy it or pay an upgrade fee.

In all, the console war was starting to look like a repeat of the PS2/GameCube/Xbox era.  Sony was on top, Microsoft was trailing behind Sony and Nintendo was in last place making almost all the games on their own system.

As the Console War hit its halfway point, Sony announced they would be entering the Virtual Reality market. That, dear readers, is a story for next time.

Console War VI Part 1 August 25, 2015

Posted by Maniac in Console War, Histories.

In 2011, Nintendo would be the first to enter a new generation of the console war. Fueled by the tremendous success of the Nintendo Wii and Nintendo DS, Nintendo was ready to take another gamble to try to repeat the Wii’s unbelievable success. They planned to create a low-powered, reasonably priced console that would reinvent the controller in a way only they were capable of. Taking inspiration from their successful handheld lineup and the increasing popularity of personal tablet computers, Nintendo created a console based entirely around a controller equipped with both motion controls…and a touchscreen capable of displaying its own video feed. The Nintendo Wii U was officially announced at E3 2011 to incredible fanfare, along with a wide variety of first- and third-party games Nintendo was preparing for the console’s launch.

Sony and Microsoft said nothing about the Wii U’s announcement, and they were not concerned about Nintendo launching the next console war first. They knew their consoles would have at least one more year on the market before they would be considered technically obsolete, and they were not ready to reveal what they were working on just yet. The mainstream gaming press gave Nintendo a lot of positive praise for the Wii U, but many were wary. The console’s graphics were basically on par with what the Xbox 360 and PS3 were already capable of, and without the tablet controller, the Wii U was essentially a high-definition-capable Wii. The Nintendo Wii U launched at the end of 2012 with a pretty impressive series of launch games, including New Super Mario Bros. U, Batman: Arkham City Armored Edition, and the most anticipated third-party game in the Wii U’s lineup, ZombiU. To best show off the system’s capabilities, Nintendo bundled the game Nintendo Land with every premium black Wii U model sold, hoping it would bring the same success that bundling Wii Sports with every Wii had brought.

Wii U sales were slow, but the system gained a loyal following. The people who did buy the system opted almost exclusively for the premium black model, so Nintendo eventually stopped manufacturing the cheaper white model. Reviews for the system ranged all over the place; while players loved Nintendo Land and ZombiU, most felt that the games alone did not merit the console’s purchase, even though it was compatible with every Wii game and allowed players to transfer all of their save games, Miis and digital purchases from their Wii to the Wii U. Meanwhile, Sony and Microsoft continued to promote their current platforms but remained tight-lipped about whether they had any plans to replace the PS3 or Xbox 360 with new consoles. Christmas 2012 would be dominated by the Wii U, but would it be alone for long?

In February 2013, Sony announced their successor to the PlayStation 3, the PlayStation 4. They had no demo unit available to show the press, only a controller, a 3D camera, and a select few games. The PS4’s architecture would be a complete departure from what the PlayStation 3 used, making it completely incompatible with any PS1, PS2 or PS3 game. In fact, most of the system’s presentation revolved around Sony’s plans to offer a game streaming service based on David Perry’s Gaikai service. After the platform’s creator unveiled some of the system’s major features, including an impressive standby feature, several games were shown, including a racing game called DriveClub, as well as new entries in the Killzone and inFAMOUS franchises. Third-party developers like Ubisoft also demoed their upcoming games on the PS4, and showed that Watch Dogs would be coming to the platform.

After the presentation concluded, PS4 buzz began almost immediately.  It was undoubtedly a powerful system, but there were still a lot of questions about it.  Since Sony had not included a mock up of what the console was going to look like at its initial presentation and spent so much time going on about the console’s streaming services, players did not know if the PS4 would even include a disc drive until after Sony released the system’s specification sheet.  Also, the lack of backwards compatibility was an issue, especially since Sony was planning to sell new PS3 and PS4 titles over the next year, and Nintendo was able to offer Wii compatibility with the Wii U.  However, the console’s specifications impressed and the games looked incredible.

After Sony’s PS4 announcement wrapped, all eyes were on Microsoft to announce their successor to the Xbox 360, which they did a few months later. At the announcement event, Microsoft unveiled what their next console would look like and its name: the Xbox One…which happened to be exactly what most of the mainstream had been calling the first Xbox console ever since the Xbox 360 launched. To show the audience how revolutionary their new console was, they showed a clip from the popular game show The Price is Right, demonstrating that the console could stream regular television feeds by connecting to mainstream cable/satellite providers’ set-top boxes! That’s right, Microsoft was showing how revolutionary their next-generation console was by demoing gimmick features nobody would make use of. They also announced that a new Halo TV series was in development with the help of Steven Spielberg, but to this day absolutely nothing has come of that project. The first game shown on the system was Remedy’s Quantum Break, a game which has not been released at the time of this writing, but which remains my most anticipated Xbox One game.

To cap the presentation off, Microsoft announced that every Xbox One sold would come bundled with a brand-new Kinect camera enabling full voice control, motion tracking, and video streaming. When Microsoft launched the first Kinect sensor for the Xbox 360 in 2010, a lot of people thought it had a lot of potential, but most game developers were not willing to develop games for such an expensive optional accessory. Now that Microsoft was planning to bundle new Kinect units with every Xbox One sold, developers could take full advantage of everything the Kinect added to the platform.

The Xbox One impressed a few, but a lot of people remained skeptical. With the exception of Quantum Break’s showing, most of the time Microsoft spent unveiling their next-generation games console was used to talk about everything the system could do besides play games. Also, most of the mainstream press had a bad feeling about the things Microsoft was not saying about the new system. The Xbox 360’s Kinect was revolutionary when it was released, but anyone who had one knew it was too unreliable to work as well as a controller did. Plus, with the improved camera, a lot of people expressed major privacy concerns over what they considered should have been an optional accessory. However, the biggest concern the mainstream media had about the new platform was how it would handle used and traded game sales. Several media outlets had heard musings that the Xbox One would deny playability to all resold, rented or traded games, one of the most anti-consumer practices any console maker could have engaged in. On video, Microsoft spokespeople denied these claims, but internally Microsoft had planned for the Xbox One to be one of the most anti-consumer consoles in gaming history.

With the last two major console announcements out of the way, all eyes were on E3 2013. There was no doubt that Microsoft, Sony, and third-party publishers would be showing off more games for the Xbox One and PS4. Microsoft struck first, announcing tons of exclusive titles coming to the Xbox One, including LocoCycle, Killer Instinct, Dead Rising 3, D4, Forza Motorsport 5, and the next main Halo game. As the show concluded, they announced the Xbox One’s price, $499 US, and said all systems would include a controller, headset, 500GB internal hard drive and a Kinect. However, Microsoft said nothing about how the system would handle its games or how disc purchases would be handled. Even after the show wrapped, many were still extremely concerned that the Xbox One would not be usable for players who lacked an internet connection, and that game rentals and used resales would be impossible on the system due to heavy anti-consumer copy protection.

A few hours later, Sony took the stage to show the final version of the PS4 and several of the games that consumers would be able to play day one.  Most of the games shown were multiplatform titles and sadly, Sony had no God of War or Uncharted game to show.  However, near the end of the presentation Sony had a moment that most of the mainstream press considered one of the greatest moments in the history of E3, a “drop the mic” moment if you will.  Sony’s executives made it crystal clear in plain English that the PS4 would ship with absolutely no anti-consumer copy protection and have no problem playing borrowed, resold, and rented game discs.  The system’s final price would be $399 US, a hundred dollars cheaper than the Xbox One’s.  The crowd exploded, and preorders for the PS4 in the US went crazy that night.

Nintendo was the last to present, and they showed off a library of upcoming games for the Wii U including Mario Kart 8, Donkey Kong Country: Tropical Freeze, and a Wii U remake of The Legend of Zelda: The Wind Waker. It impressed Nintendo’s loyalists, but Wii U sales had been slumping and many were concerned that these games would not improve the Wii U’s sales.

After the E3 show concluded, both Sony and Microsoft started looking at their preorder numbers. Sony was happy; Microsoft was not. Oddly enough, Microsoft’s anti-consumer plans for the Xbox One were not resonating with consumers, and the lower price and solid titles offered by the PS4 were more than enough to earn gamers’ trust. Fearing their own decisions would make them lose the console war before it even started, Microsoft scrambled their PR teams to try to fix the debacle before the console’s launch, and they made a public announcement to all of their dedicated retailers that they were changing course and removing the online requirements and rented/resold/borrowed game restrictions from the Xbox One. The console would have an initial online activation requirement at launch (similar to how a smartphone has to be activated in a store before you can take it home), but that would be all. Many consumers, myself included, breathed a sigh of relief over this announcement, but the news was considered too little, too late by many who simply didn’t trust Microsoft and had already planned to buy a PS4.

Fall 2013 came around, and the battle was about to start.  The PS4 launched first and quickly sold out its initial allotment.  Demand for the console was so high many were turned away with their money still in their pockets.  Even though it had no backwards compatibility, few exclusive titles, and a launch lineup of games you could likely get for other systems, new PS4s would not stay on retail shelves for long.  When asked why most players were interested in the system, the mainstream consumer listed price and technical capabilities as their primary reasons for buying a PS4.  They believed the multiplatform games looked and ran better on PS4, and for $399 US, the price was right.

Microsoft launched the Xbox One with a Kinect, a huge lineup of exclusive titles for download and retail release, and a $499 price tag. Aside from the strong initial sales of the console’s Day One edition, non-Day One Xbox One systems sat on shelves collecting dust. The peripheral that Microsoft felt would give the Xbox One a huge leap over Sony’s PlayStation 4 became every conspiracy theorist’s whipping boy. Even though Microsoft had reversed their decision to restrict used game sales and require a persistent online connection, privacy concerns over the Kinect sensor became the reason many gamers refused to pick up the console. In contrast, Sony’s console was such a hot seller that consumers wouldn’t be able to find it on shelves for another three or four months. By E3 2014, Microsoft backtracked on their decision to bundle the Kinect with the Xbox One, and announced the Xbox One would be sold without a Kinect for $399 US.

What came of this decision, and how did it affect the Console War? You’ll have to read that next time!

Gaming History You Should Know: The Half-Life 2 Leak October 16, 2014

Posted by Maniac in Gaming History You Should Know, Histories, Uncategorized.

After producing our recent site video about the long and grueling development of Half-Life 2: Episode 3, I thought I would take another look into Half-Life 2’s past. We all know that Half-Life 2 launched in 2004 and is considered one of the best PC FPS games ever made, but did you know some people just weren’t willing to wait until the game’s release to play it? In 2003, an early build of Half-Life 2 leaked onto the internet. With all the buzz generated from the game’s E3 2003 showing, the Half-Life 2 leak quickly became one of the most illegally shared pieces of content of its time. Now, sit back and join me as I look over a decade into the past at one of the biggest leaks in gaming history. This article is meant to inform gamers about a dark time in the history of gaming, and to serve as a cautionary tale for game developers. Enjoy.

Back in April of 2003, Valve announced that a sequel to their hit PC game Half-Life had secretly been in production for the past five years and would be coming out later that year. Using a cutting-edge, in-house-developed graphics engine and a real-time physics engine, Half-Life 2 was first shown to the public at the Electronic Entertainment Expo in Los Angeles in May 2003. All of the gaming press, myself included, were blown away by the incredible thirty-minute demo shown at the trade show. To cap off the excellent showing, Valve announced Half-Life 2’s release date would be September 30th, 2003. Unfortunately, this wasn’t soon enough for someone.

Half-Life 2’s development was not keeping to Valve’s schedule, and as the weeks ticked by it was becoming more and more likely that the game would not meet its September 30th release date. According to Valve owner Gabe Newell, on or around September 11th, 2003, a person other than himself was accessing his e-mail without his permission. Whoever this individual was, he wasn’t just interested in e-mail. Valve soon discovered virus-like symptoms on their computer systems, including crashes when right-clicking, but strangely their anti-virus software wasn’t picking up on anything unusual. The hacker had used a modified version of a remote access tool called “RemotelyAnywhere”, delivered through an Outlook Express buffer overflow, and since it had been modified it was undetectable by the anti-virus programs Valve was using. Once installed, it in turn installed keystroke loggers on the computers at Valve Software. By September 19th, the attacker had downloaded an unconfirmed amount of uncompiled source code and game resources (including sounds, maps, and textures) for Half-Life 2, as well as the source code and resources for as many as two other games Valve had in development. All of this data found its way onto peer-to-peer sharing networks within hours, and the number of users who downloaded and shared the content was monumental. Gabe Newell was forced to officially confirm the leak on October 2nd, 2003, and asked for the help of the gaming public to bring those responsible to justice.

Even to this day, we don’t have a complete picture of just how much of the game was stolen. Gabe Newell claimed at first that only a minor amount of the final source code was taken, and when the leak first sprang up, only the game’s source code was available on the peer-to-peer networks. As the days ticked on, it became clearer that whoever had stolen the game’s content hadn’t released everything they had just yet. Over the next week, gaming news sites were all over the leak, trying to report the most up-to-the-minute information, and as time went on, more and more copies of the source code were being made by peer-to-peer users.

The content leak may have been fun to mess around with for those who downloaded it, but it meant a lot of headaches for its creators. Because of the leak, Valve was now in violation of a contract with a company called Havok, from whom Valve licensed their real-time physics engine. A stipulation of the contract was that Valve needed to keep Havok’s code safe, but since that code was included with the Half-Life 2 code, Havok stood to lose a lot of business. This was a major problem for Havok, since their income came from licensing their code to game developers. Another issue with having the game’s source code released illegally was the increased danger of multiplayer cheating. Cheating is a common occurrence in multiplayer gaming and can destroy the fun of playing games online. With the source code out before the game’s release, cheat authors had a head start, and could have working cheats ready by the time the game shipped.

On October 7th, 2003, a playable version of the game (dubbed the “playable beta”) was released onto the peer-to-peer networks by the hacker, who referred to himself as the Anonymous Leaker.  The previously released source code was only ninety-four percent compilable, and without the game’s content it was useless.  The leak had to have been either more massive than Valve knew or bigger than they were willing to admit.  Half-Life 2’s original publisher, Vivendi Universal Games, issued a press release pushing the game back to April 2004, citing the leak.

But the most interesting thing about the playable beta was that users who downloaded the illegally released version found a game that was nowhere near completion.  It was missing a lot of content and looked like a barely playable version of the levels shown at E3.  The Leaker may have stolen an early build of the game client, or may have been unable to obtain all of the game’s assets before the content leaked and simply claimed it was a finished build, but it raised the question in a lot of gamers’ minds: was this incomplete title what Valve planned to release on September 30th?

On October 9th, in a questionable move, a forum user claiming to be the Leaker announced that the content released online was indeed Valve’s current work on Half-Life 2 and that the game was nowhere near completion.  Because of this, he pleaded innocence, saying he was not the reason for the delay of the game.  On top of that, he claimed that if Valve continued to blame the leak for the delay, he would release all the stolen content he had.  On October 13th, the source files for the game’s maps were released to the peer-to-peer networks.  As far as I know, that would be the final leaked Half-Life 2 content to make it onto the web.

So who was this mysterious Leaker, and why did he steal the biggest game of its time, just to release it before it was finished?  Well, as it turned out, the man did in fact have a conscience.  The Leaker, whose name will not be posted on this site, would eventually get in personal contact with Gabe Newell via e-mail.  The purpose of the e-mail was not to gloat or threaten, but to apologize.  After the Leaker verified his identity by providing some information that had not gone public, Gabe knew he was speaking to the real deal.  At first, Gabe was furious at him for all the trouble he had caused the company, but he wasn’t planning to let the Leaker know that.  The Leaker said that he was just a big fan of Valve Software, and he had never meant for the content to go public as it did.  He just wanted the chance to play their games, and he begged Gabe for forgiveness.

Gabe played it cool.  He finally knew the Leaker’s motivations, but he still wanted to bring this person to justice.  He decided to set up an elaborate sting operation, with the intention of bringing the Leaker to the United States and having him immediately arrested by the Feds.  Gabe told the Leaker that he was impressed by his actions, and wanted to bring him in to interview as a new “security consultant” for the company.  The Leaker was skeptical, of course, but he didn’t want to pass up the opportunity to work for Valve Software.

After Gabe spent some time setting up this elaborate sting operation, law enforcement in the Leaker’s home country got wind of what he was doing and stopped it, preferring to make the arrest themselves.  Shortly after that, the Leaker was arrested by his own country’s authorities.  He would maintain that he never wanted the stolen content to leak out into the open as widely as it had, and that a peer he had entrusted with a copy of the data was the one who leaked it onto the web.

Half-Life 2 finally released in November 2004 and became the flagship title for Valve’s online distribution system, Steam.  Gamers who purchased the game legally through the service or at retail found a polished masterpiece that won numerous Game of the Year awards in a year which also saw the release of titles like Metal Gear Solid 3: Snake Eater and Halo 2.  The game’s PC release was followed shortly afterwards by a console port to the original Xbox.  Two episodic expansions were produced for the game, and both were included alongside Half-Life 2, Team Fortress 2 and Portal when the game was released on the Xbox 360 and Playstation 3 in 2007.

As for Gordon Freeman? He has not been seen since.

Video Game Handheld War Part 11 September 11, 2014

Posted by Maniac in Histories, Video Game Handheld War.
add a comment

When we last left the Video Game Handheld War, Sony had launched their second dedicated gaming handheld, the Playstation Vita, and it was practically dead on arrival. The system and its peripherals were just too expensive at launch, and many players believed that after Nintendo’s 3DS price drop, Sony would respond with one of their own, so they chose to wait. Even though the handheld’s biggest launch title, Uncharted: Golden Abyss, received favorable reviews from the gaming press, in gamers’ minds it did not yet merit an investment in the platform. Vita stock gathered dust on retail shelves for weeks. Everyone expected Sony to announce a price drop at E3 2012, but strangely it didn’t happen. To further hurt the Vita’s chances, Sony didn’t impress with any new Vita games at the show. The biggest takeaway we got from that show was Sony’s promise that there would be Vita connectivity with future PS3 titles. While none of the mainstream press mentioned it at the time, I had seen a similar tactic years before. Nintendo had brought GameCube connectivity features to the extremely popular Game Boy Advance, hoping to increase sales of the floundering console. It may have sounded like a gimmick at the time, but Nintendo was able to do some pretty creative things with that connectivity in games like Four Swords Adventures and Final Fantasy Crystal Chronicles. In hindsight, Sony was not able to do much with their connectivity plans. They offered some decent features, like cross-platform multiplayer for select games, but that was about it. At least they were willing to offer cross-platform digital purchases, ensuring any digital games purchased on one platform would be playable on any Sony hardware a player owned, without forcing customers to rebuy the same game multiple times. While that policy was certainly consumer friendly (to this day, neither Nintendo nor Microsoft will let software purchased on one platform be played on another without making you rebuy it), it didn’t do much to enhance the multiplatform gaming experience.

Nintendo, meanwhile, had a great E3. To show off how strong their handheld platform had gotten since its price drop, they dedicated a separate live show exclusively to upcoming 3DS games, and Nintendo had a lot of surprises ready for it. Tons of new games were shown at that presentation, hosted on the first day of E3, including New Super Mario Bros 2, Luigi’s Mansion 2: Dark Moon, and even a new Castlevania game. They also announced they were enhancing their digital download service to offer full retail titles for digital purchase. 3DS owners interested in purchasing all their retail games digitally would have done well to buy a new high-capacity SD memory card, because Nintendo was planning to offer New Super Mario Bros 2 as a digital download day-and-date with the game’s retail launch.

However, Nintendo’s handheld release schedule for that year did not revolve entirely around the 3DS; there was one major release coming to the DS by the end of 2012. Well, to be clear, there were actually two major releases for the end of 2012: Pokémon Black 2 and Pokémon White 2. That 2 is not a typo; these were the very first direct sequels to a Pokémon generation Nintendo ever released, and they continued the story of the fifth generation games. Nintendo was also planning to release two digital applications to the 3DS’s online marketplace which would tie into Pokémon Black & White 2‘s release: Pokémon Dream Radar and Pokédex 3D. So while the games would play just fine on the Nintendo DS for all current DS owners, 3DS owners would be able to download some extra applications which could enhance their gaming experience. Heck, the original Pokédex 3D application was totally free.

While the DS was still going strong, the PSP on the other hand was just plain dead. Retail stores, if they had any leftover PSP games in stock, were trying to get rid of them at heavily discounted rates. If you were able to find them, games like The 3rd Birthday and Dissidia 012 could be had at some pretty reasonable prices. It was also a great time to buy last-minute peripherals like spare batteries, earbuds, and TV-out cables for the PSP, because they would not be restocked.

With the PSP on the way out, the Vita needed to step up to the plate to keep retailer confidence. So what was next for the Vita in the form of new exclusive games? A new Resistance title. The Resistance franchise had garnered a dedicated following since Resistance: Fall of Man launched alongside the PS3. In fact, I believed that game was the best PS3 title until Uncharted: Drake’s Fortune was released in 2007. Sadly, the franchise’s creator, Insomniac Games, had moved on, and Sony had put the franchise in the hands of another developer to produce the Vita-exclusive title, Resistance: Burning Skies. Sony put a lot of hype into promising great things from Burning Skies, but when it launched in June 2012, reviews ranged from lukewarm to abysmal, and it was not the system seller Sony desperately needed. The aftermath of the game’s failure was so bad that many worried it may have killed the Resistance franchise.

As the summer progressed, Nintendo had another huge announcement ready to go: they were preparing to release their first hardware revision for the 3DS. That announcement shocked just about everyone, since it hadn’t been that long since the 3DS launched, but Nintendo was ready. The new model would feature a larger screen, making the handheld’s 3D effect much easier to see. It would also have an improved battery for longer gameplay and standby times. Nintendo even got rid of the 3DS’s annoying collapsible stylus; instead, the new model would come with a solid full-sized stylus. The new handheld was called the 3DS XL, and the price would be $199 US, still $50-100 less than what the Vita was selling for. It would even come stock with a 4GB SDHC card for storage, an improvement over the 2GB cards which came standard in the original 3DS. About the only problem gamers had with the XL was that it lacked a second analog stick, and the new form factor made the XL incompatible with the 3DS’s Circle Pad Pro peripheral. However, most 3DS games were designed around a single analog stick, and proponents of the platform didn’t believe this was much of a problem. Current DSi and 3DS owners would even be able to fully transfer all their save data and purchased content to the 3DS XL without much issue, making it a logical upgrade in the minds of many Nintendo fans and gaining the interest of gamers on the fence about investing in the platform.

The 3DS XL launched in August 2012. On the same day, Nintendo released the highly anticipated New Super Mario Bros 2, the sequel to the DS’s biggest-selling game, at retail. Just as promised, Nintendo made it the first 3DS retail game to launch day-and-date online, and gamers were happy to have the option. This was the kickoff point of a revamped 3DS eShop, and more 3DS retail games would be coming to complement the NES, Game Boy, DSi and 3DS downloadable titles the service already offered.

To compete with the launch of the 3DS XL, Sony had…nothing. After the failure of Burning Skies, Sony was unable to bring games to the Vita at the same pace that Nintendo was getting games for the 3DS. Gamers would not adopt a platform without games, and developers weren’t willing to take a risk on a platform with such a low install base. About the only thing in store for the Vita in the immediate future was ports. Meanwhile, Nintendo was swinging hard, with regular releases for the 3DS on the horizon. As 2012 came to an end, not only was it clear that Nintendo was keeping its crown in the Video Game Handheld War, it was possible that the Vita no longer had any chance of being even remotely competitive against Nintendo for the rest of the handheld generation!

However, total dominance in the handheld space wasn’t good enough for Nintendo, and little did they know that as 2013 began, the conditions were right for Pokémania to have a resurgence not seen since the year 2000. All it needed was a little announcement by Pikachu to kick it off. Stay tuned, dear readers. I’ll share that story with you next time.

Video Game Handheld War Part 10 May 12, 2014

Posted by Maniac in Histories, Video Game Handheld War.
add a comment

After a lapse of a few months, we’re back to talking about the Video Game Handheld War.  When we last left the series, Nintendo had once again dominated the handheld space with the Nintendo DS family of portable gaming devices.  Sony, who entered the portable gaming space for the first time with the Playstation Portable, was unable to compete against Nintendo in sales figures.  However, the PSP had its loyalists, and Sony was ready to compete against Nintendo again in the next handheld generation.

Having been on top of the handheld space since they entered it, and being the clear winner of the previous generation’s console war, Nintendo was not going to try to revolutionize the gaming industry in the next handheld war like they had with the Nintendo Wii.  The Nintendo DS family had serious brand power, and their next generation handheld would be more evolutionary than revolutionary.  Since the release of Avatar, 3D, previously a fad, had been returning to the mainstream.  By 2011, analysts predicted 3DTVs were about to become as revolutionary an enhancement to the entertainment experience as HDTV and surround sound had been, and Nintendo was ready to offer a 3D experience you couldn’t get anywhere else.  Nintendo’s next handheld would not only include some of the most impressive graphics they had ever offered on a handheld, on par with what a GameCube or Wii could offer, it would have a 3D screen you wouldn’t need glasses to see.  Nintendo called their next handheld the 3DS.

The price was pretty steep: $249 US.  In comparison, no previously released Nintendo handheld had ever broken the $200 price point, and Nintendo had reaped the benefits.  Sony had launched the original PSP at $249 US back in 2005, but even with all the features Sony included to justify that price point, PSP sales were never close to being competitive with Nintendo’s offerings.

In the meantime, Sony was preparing their own next generation handheld system, which they dubbed the “NGP” for Next Generation Portable.  Like they had with the PSP, Sony was banking on providing the most technologically capable gaming handheld on the planet.  The NGP would feature an OLED screen, for the best possible picture quality, and the most cutting edge graphical hardware, superior to what was possible on consoles at the time!  When an early version of the handheld was first unveiled, the press could not believe Sony’s portable was capable of practically recreating the same kind of graphics gamers had seen on the PS3’s killer app, Uncharted.

We could talk for hours about the NGP’s sheer technical power, but really the biggest advancements the NGP brought to the table came in the form of its new control system.  Taking a page from the Nintendo DS, the NGP would feature a touchscreen on the front and a touchpad on the rear, and like the PS3 controller, it would have motion controls.  The NGP would also have the feature players had requested most on the PSP: a second analog stick.

The Nintendo 3DS launched in spring 2011, a few months before that year’s Electronic Entertainment Expo.  Early sales of the platform were strong.  For a lot of DS players, the 3DS seemed like a logical upgrade.  The 3DS design was very similar to the DS form factor.  It featured all the functionality of the Nintendo DSi family of handhelds, including dual screens, compatibility with DS and DSi games, and wireless access.  In fact, DSi players could transfer all their purchases and savegames to their new handheld by simply downloading a free application to their DSi, increasing the early value of the new system.  Also, the Nintendo DS offered some of the finest exclusive titles of the previous generation, so anyone who had never picked up a DS could finally play all of the classic DS games on the new 3DS.  It was like having two platforms in one.

After a moderately successful launch, Nintendo had a problem on the horizon.  Following those strong early sales, 3DS purchases quickly began to falter.  A price point of $249 was just too high for a dedicated handheld device while top-of-the-line smartphones with greater capabilities were readily available at lower prices.  While DS compatibility was a great feature, the initial allotment of 3DS exclusive titles was just not enough incentive to buy a 3DS immediately.  On top of that, new DS games were still being published on a regular basis, and players decided to stick with their DSi systems for the time being.

In a surprising move that nobody could have seen coming, Nintendo responded to lower-than-expected sales by slashing the price of the 3DS hard!  Almost overnight, the price of the Nintendo 3DS dropped from $249 US to $169 US.  To appease early adopters, anyone who picked up the system before the price drop could register their system’s serial number and become part of the Nintendo Ambassador Program, making them eligible to download a dozen of the most popular Game Boy and Game Boy Advance games ever made, free of charge.  This was no small promotion; to this day, Nintendo has not officially offered a single GBA game, including any of the titles released through the Ambassador Program, on the Nintendo eShop for purchase.

The lower price point helped sales of the Nintendo 3DS pick back up, but analysts didn’t believe Sony would be immune from the same market trends that affected Nintendo.  In fact, many pundits predicted that dedicated handheld gaming devices were no longer practical in today’s smartphone driven economy.

At E3 2011, Sony unveiled the final version of their next portable device along with its final name: the Playstation Vita.  Five playable games were shown at the event, including the game everyone expected to be the Vita’s killer app, Uncharted: Golden Abyss.  The controls were smooth, the OLED screen looked fantastic, and the games shown were a lot of fun to play, but gaming journalists (myself included) walked away from E3 that year without the impression that the Vita would be a sure seller.  Sony did not have the Vita’s battery finalized by the time of E3, and would not answer questions about the system’s battery life.  They also impressed a lot of people by announcing a Vita model with a 3G modem, but actually got booed when they announced an exclusive partnership with AT&T for that 3G access.  Gamers were still angry over AT&T’s poor handling of iPhone traffic on their wireless network going back to the iPhone 3G, and they were not happy they would have to use AT&T’s network for the Vita.

The Playstation Vita launched in December 2011 in Japan and February 2012 in the US at a price of $249 US.  Even with Uncharted: Golden Abyss as a launch title, the platform was nearly dead on arrival.  While the Vita itself was a beautiful piece of hardware capable of everything Sony promised, the high price point and lack of an early allotment of must-have titles hurt the Vita’s sales early on.  The Vita did not use a UMD drive, instead going with a new proprietary Vita Card slot for retail games.  Because of that, the Vita could only play PSP games that players purchased digitally and downloaded from the Playstation Network.  On top of that, unlike the 3DS, which used a pretty common SDHC card for external storage, the Vita had its own proprietary memory card.  The new Vita memory cards were incredibly overpriced in comparison to the SD storage medium, and gamers would have to buy one, since the first version of the Vita had no internal memory of its own.

By early 2012, gamers were left with a choice.  They could either buy Nintendo’s handheld which was much cheaper, could play two generations of retail handheld games (which included some of the finest games Nintendo ever published), and was compatible with a cheap, common storage medium, or they could buy a Vita which had nowhere near the library of quality titles, was incompatible with the previous generation’s retail games, and used an expensive storage medium.  Gamers made their choice, and they chose the Nintendo 3DS.

Stay tuned next time for when Nintendo decided to take their lead and throw it into overdrive.

The Fall of G4TV, Part 4 January 17, 2014

Posted by Maniac in Histories, The Fall of G4.
add a comment

After discovering that Hearst was no longer interested in turning G4 into the Esquire Network, I seriously thought I would never have to revisit this article and create further parts of this series.  I figured this meant that Hearst was interested in breathing new life into a station that they had planned to shut down, and I was optimistic about the future of the station.  Today I noticed that my cable provider has dropped the G4 channel from my channel lineup without informing me.  To quote Doc Brown, “What kind of a future do you call that?”  Given that I’m paying hundreds of dollars a month for HD cable TV, my provider terminating access to the G4 channel without informing me ahead of time was a pretty underhanded trick.  They severed my connection to a channel that had meant so much to me just as I was hoping things would be getting better with it.

If you look in your digital channel lineup, you can find a little station on the list somewhere between MTV2 and Encore called G4 or G4TV. You’ll also note that the schedule for that station will likely only include reruns of Cops and Cheaters.  If you didn’t know any better, you probably wouldn’t believe that station used to be the premier station for gaming coverage.  You’ll probably also wonder why, if they were a station for gaming coverage, they only show reruns of Cops and Cheaters, as well as a few other international shows nobody cares about.  Well, it’s a story that goes back a long time, but I have no problem telling it.  It’s a sad story with a very sad ending, but just like with Halo Reach, even though you know how it ends from the beginning, it’s still a story you want to hear.

You can read the previous parts of this article here.

No matter how bad G4 got over the years, if there was one reason to watch the station, it was for their E3 coverage.  G4 would not only air the official E3 press events, they would get their teams together to analyze all of the major announcements after they were made.  Even when the station was bad, you could always rely on their E3 coverage to be consistent.  By broadcasting E3 press events and doing a decent analysis of all the new content, G4 gave people who were unable to attend gaming’s Super Bowl in person a virtual seat at the events, and saved them the inconvenience of having to stream the broadcasts online.  For at least one week a year, I was watching G4 non-stop.

By 2012, G4 was improving. A great show, G4’s Proving Ground, hit the air. Featuring the late Ryan Dunn and former IGN hottie Jessica Chobot, the two hosts spent each episode trying to replicate the gadgets and technology seen in famous movies or video games. That’s right, Proving Ground actually covered video games, and the chemistry between the hosts made for a great show! Heck, even the writing on X Play and Attack of the Show had gotten better by 2012, and I found myself watching the station again.

But then something weird happened. All of a sudden, G4 was losing its primary talent.  Both Kevin Pereira and Adam Sessler left their respective shows.  While they were quickly replaced by other G4 talent, it seemed curious that they would leave the shows they had been on for so long.  It wouldn’t take us long to find out why.  It turned out that G4 was now owned by the Hearst Corporation, and as G4 had been underperforming for years, they were planning to end it.  That’s right, they actually took the advice I gave them years earlier, when I told them to either bring back their old shows or just shut down. Well, sadly, they chose the latter. The channel was going to be completely replaced by a new channel, the Esquire Network, and an entirely new lineup of shows started production. A lot of people online completely failed to understand Hearst’s reasoning. While G4 was indeed underperforming, it was because they were not offering a consistent lineup of quality gaming programs. Had they used the resources that went into creating the Esquire Network lineup to instead make a new lineup of G4 shows, the dedicated fans who hadn’t watched the station since the Tech TV merger would likely have returned. It’s not like any other gaming-focused stations like G4 existed. Did they honestly think the channel’s target demographic would be more interested in watching Esquire’s content?

The final episodes of X Play and Attack of the Show aired in January 2013. The staff was let go, and no further production was put into place, as an entire station’s worth of new content was being created for the Esquire Network’s planned launch. By February 2013, G4 was only broadcasting reruns of X Play on top of syndicated content like Heroes and Quantum Leap.  The station was practically on autopilot.  However, this wasn’t exactly a bad thing. The last few months of X Play included some of the finest episodes they had ever produced, and while they were being replayed on a regular basis, I found myself watching them every chance I could. On top of that, G4 was re-airing all of the content they had experimented with since the Tech TV merger, like Proving Ground, It’s Effin’ Science, and Web Soup. While none of these shows saw new episodes past their initial run, airing these programs again on a regular basis made for an interesting offering of consistent content, something G4 hadn’t managed in nearly ten years.

The Esquire Network was coming, but it was sure taking its time getting here. At first, the station was expected to launch some time in the spring. Then it was pushed back to the summer. American Ninja Warrior, a show which was intended to broadcast its latest season after G4 had become the Esquire Network, instead aired while the station was still G4-branded, and G4’s ratings reaped the benefits.

By the end of the summer, after being pushed back many times, the Esquire Network had finally set a launch date.  However, a few days before the station was expected to launch, an odd mention of the network made its way into the tabloids.  The article said that Hearst was no longer interested in replacing G4 with the Esquire Network, and even though they still planned to launch Esquire, it would instead replace another, more fitting station: the Style Network, which as far as I knew only served to rebroadcast old E! Network shows.  When Esquire did launch, the Style Network was no more, and G4 stayed in my channel lineup.  Even though the station was still only broadcasting the same content it had been for nearly a year, I now had a little hope left in my heart that G4 would return better than ever.

Apparently, all Hearst did was prolong the channel’s demise; the television providers became the ones to end it.  Last week, very quietly and without ever notifying me of their plans, my cable provider, Comcast Xfinity, dropped the station from my channel lineup.  Quite a dirty trick, as my bill hasn’t gotten any lighter.  The most ironic part is that it was Comcast’s meddling with the station back in 2005 that nearly destroyed the network, and now that they no longer owned it, choosing to stop broadcasting it was a fitting way for them to finish the job they had been unable to complete themselves.

While Comcast and a few other providers are no longer carrying G4, the station still exists and is still broadcasting its syndicated content; I am just no longer able to watch it.  According to the research I’ve been doing, the only television provider still carrying it is AT&T U-Verse.  I have no idea if this will win them new customers, but if this fact alone is making you want to change your provider to AT&T, please post a comment about it.  G4, during its heyday, was THE reason I made my family upgrade their cable package to digital channels.

As there has been no word of any plans to bring new content to G4, and with providers dropping the station, there is no reason left for me to hope the station will return to its former glory.  Other than Spike’s terrible awards show and fantastic GameTrailers.TV show (inconveniently aired super late at night), there is really nothing else out there like G4.  Hopefully, some day soon, some other content provider will see the need to bring a major gaming network back to television.  Just do it well and we will watch.

I would be more than happy to tell you how.

The Fall of G4TV, Part 3 January 12, 2014

Posted by Maniac in Histories, The Fall of G4.
add a comment

After discovering that Hearst was no longer interested in turning G4 into the Esquire Network, I seriously thought I would never have to revisit this article and create a third part of this series.  I figured this meant that Hearst was interested in breathing new life into a station that they had planned to shut down, and I was optimistic about the future of the station.  Today I noticed that my cable provider has dropped the G4 channel from my channel lineup without informing me.  To quote Doc Brown, “What kind of a future do you call that?”  Given that I’m paying hundreds of dollars a month for HD cable TV, my provider terminating access to the G4 channel without informing me ahead of time was a pretty underhanded trick.  They severed my connection to a channel that had meant so much to me just as I was hoping things would be getting better with it.

But I’m getting ahead of myself, so let’s start from the beginning.

If you look in your digital channel lineup, you can find a little station on the list somewhere between MTV2 and Encore called G4 or G4TV. You’ll also note that the schedule for that station will likely only include reruns of Cops and Cheaters.  If you didn’t know any better, you probably wouldn’t believe that station used to be the premier station for gaming coverage.  You’ll probably also wonder why, if they were a station for gaming coverage, they only show reruns of Cops and Cheaters, as well as a few other international shows nobody cares about.  Well, it’s a story that goes back a long time, but I have no problem telling it.  It’s a sad story with a very sad ending, but just like with Halo Reach, even though you know how it ends from the beginning, it’s still a story you want to hear.

I understand that it has been a while since I posted an article in this series on my site, so I would like to take this opportunity to direct any new readers to both of the earlier parts of this series.  To fully understand how disappointed I am with this development, you should read Part 1, which details the station’s best years from its launch to its merger with Tech TV, and Part 2, which details the station’s downfall after corporate meddling nearly destroyed it.

By 2011, the majority of the content G4 ran was syndicated reruns of shows like Cops and Cheaters.  G4 was producing new episodes for only two shows on a regular basis, and neither was part of the station's original lineup: X-Play, the game review show it acquired in the merger with Tech TV, and Attack of the Show, a renamed version of The Screen Savers, another carryover from that merger.  Sadly, neither show was doing as well as it could have been.  While audiences enjoyed the personalities of the hosts, the writing at the time was not winning any awards with viewers.  In fact, it was borderline incompetent.  I can think of a few game reviews with glaring errors, like X-Play's review of Metroid: Other M, which complained that there was no way to dodge incoming projectiles when the game did in fact have a dodge move.  Other bad reviews simply came down to a matter of taste, like the Duke Nukem Forever review, which gave the game a 1 out of 5 purely because of its offensive content.  While the review's writer may have truly been offended by the game's subject matter, the review came off more as a childish lashing out against a game that had been in development for so long than as a professional assessment of the game itself.  To be honest, though, I felt most game reviewers were unfair to that game.  In fact, even Jon St. John, the voice of Duke Nukem himself, mentioned this during my interview with him at ConnectiCon 2013, although he didn't name any reviewers in particular.  So as you can tell, by this point I was no longer watching G4 for its gaming content, as I felt it no longer had any standards I could relate to.

But it's unfair to say that X-Play and Attack of the Show were G4's only shows.  For a few years, the station experimented with different show ideas every couple of months, though they were mostly hit or miss.  Sadly, most of them had nothing to do with gaming (which is odd for a gaming-focused TV station), but not all of them were bad.  Jump City: Seattle was a freerunning athletic competition where athletes from all over the world showed off their skills.  It had nothing to do with gaming, but it was a well-produced show where you could watch athletes like Brian "nosole" Orosco (and his mustache) pull off fantastic moves that defied what we thought was possible for the human body.  Web Soup was basically Ridiculousness with a host who wasn't as funny as Rob Dyrdek.  It's Effin' Science showed how science could be used in cool ways, and tossed in a few explosions for good measure.

While not all of these shows were winners, I found a lot of them enjoyable.  G4 also experimented with different syndicated content: instead of just airing episodes of Cops and Cheaters, it started showing some classic science-fiction series, including one of the greatest shows ever produced, Quantum Leap.  I was too young to appreciate the show when it was new, but by the time it came to G4, I couldn't stop watching it.

Things weren't perfect, but for the first time I started to feel optimistic that G4 could improve.  Things at the station were starting to look up, but little did we all know that by the time it hit its stride, it was doomed.  Stay tuned to this site for Part 4 of this series, where I'm going to detail the end of G4, just as it had started to get good again.

The Video Game Handheld War Part 9 October 6, 2013

Posted by Maniac in Histories, Video Game Handheld War.
add a comment

Times were changing, and while the hardware was still being revised on a regular basis, we were still in the middle of the handheld war between the Nintendo DS and the Sony Playstation Portable.  Previously, Nintendo had launched its highly successful DSi, a camera-equipped DS that allowed for digital game downloads, and Sony had just released the third revision of the Playstation Portable, dubbed the PSP-3000.  Based on sales, the Sony PSP was still trailing the Nintendo DS, but the definition of what made a video game handheld system was about to be completely changed.

After the failure of the UMD Video format, Sony set their sights on digital distribution.  Apple's iTunes Store had seen a major influx of customers after the release of the iPhone.  After the release of the second iPhone, the iPhone 3G, many critics believed that smartphones, not dedicated handheld gaming systems, would be the future of portable gaming.  This was a worrying prospect for everyone, and Sony was determined to take drastic steps to keep the PSP platform from becoming obsolete.  Sony had already established a hugely popular online store for the PS3, offering full game downloads, add-ons, trailers, and more, and they were ready to bring it to the PSP.  Portable storage was also getting cheaper.  By this point, with its low price and large capacity, the PSP's Memory Stick offered players far more options than UMD could, and unlike UMDs, digital downloads cost nothing to manufacture and ran no risk of running out of inventory.  Soon, Nintendo would not be the only handheld maker with an online store.

Sony decided to go further with the idea and designed an entirely digital PSP.  They completely redesigned the PSP's form factor to slide open, making it look and feel more like a modern cell phone.  However, the new system had a lot of downsides.  First, it used entirely new standards for everything, making existing PSP peripherals and external storage incompatible.  The device would have 16 GB of internal storage, expandable via Memory Stick Micro.  It would also feature no UMD drive at all, making it incompatible with all retail PSP games.  Dubbed the PSP Go, it would ship at a price of $249 US…$50 more than the already released PSP-3000, which had nearly all of the PSP Go's features plus a UMD drive.

In 2009, Nintendo released a larger model of the DSi, dubbed the DSi XL.  The intention was to release a handheld with all the functionality of the DSi, including access to the popular DSiWare store, but with larger screens.  The DSi XL would be the final model released in the DS line.  Sadly, Nintendo chose not to bring back the GBA slot, making the entire DSi line incompatible with GBA games and with DS peripherals that used the slot, like the Rumble Pak.  However, this turned out to be a moot issue, as DSi XL owners were buying their systems to play the fantastic library of DS games already available and the DS games still to come.

The PSP Go launched just in time for Christmas 2009, and for all intents and purposes it was dead on arrival.  Most users thought the PSP Go was the stupidest idea Sony had ever had and couldn't believe it actually made it to retail.  Any regular model PSP had access to the digital Playstation Store, and as long as it had a Memory Stick with enough space, it could download all of the same content a PSP Go could.  On top of that, the PSP Go carried an incredibly unfair price premium over the most recent PSP model, yet without a UMD drive it couldn't play any retail PSP game!  If a user didn't have access to a WiFi hotspot with internet access, the PSP Go was useless, unlike an iPhone, which could download content over the wireless phone service.  This made it the most illogical handheld upgrade ever, and current PSP owners decided to stick with their systems.  Gamers knew Sony was expecting them to pay extra money for what was essentially a crippled PSP, and they had no interest in it.

In fact, many retailers worldwide were uneasy about stocking it, not just because of its high price point and the bad word of mouth, but also because brick-and-mortar stores were angry that Sony was planning to cut them out of potential revenue from the system's games.  Sony assured retailers that this was not their intention, and that they would have the opportunity to sell prepaid digital codes for many of the PSP's most popular games at the time of the PSP Go's launch.

Aside from a few curious adopters, the PSP Go did not sell.  Reviews for the device ranged from lukewarm to terrible, as critics believed it did not merit its high price point.  However, while the PSP Go was by all accounts a complete failure and a big black eye for Sony, the online marketplace set up for it was a resounding success.  Playstation Portable owners hoping to get their hands on older games that were no longer in print finally had the chance to download all the games they wanted.  With the low price and high capacity of external Memory Sticks, the storage of a regular PSP could in theory match what a stock PSP Go offered.  No matter how many times Sony dropped the PSP Go's price, no one would buy it, and the system, along with its peripherals, collected dust on retail shelves.  However, the digital download codes Sony offered to retailers sold quite well, since they worked on any model PSP.  There are still plenty of people who don't have credit cards, or are unwilling to use them for digital purchases, and retailers made a smart decision by offering downloadable game codes, as well as Playstation Network gift cards, at retail.

Meanwhile, as Sony floundered with the PSP Go's launch, Nintendo was enjoying unimaginable success with the DSi systems.  Now Nintendo prepared to launch its fifth generation of Pokémon games, the second generation to appear on the DS platform.  Previously, each new Pokémon generation (not counting remakes) had launched on a new Nintendo platform, but the success of the DS was so great that Nintendo chose to release the newest Pokémon games on the same platform as the previous ones.  The fifth-generation games would be called Pokémon Black and Pokémon White.  Normally, a new Pokémon release would go on to become the biggest-selling game on Nintendo's handheld platform, but Pokémon's popularity had started to wane since the release of the DS.  By this point there were so many other great system sellers on the DS that a Pokémon game released so late in the platform's lifespan could not make up those sales, but Black and White sure tried.  When the games were released, they gave a great boost to the franchise's popularity and went on to become staple games for the platform.  In fact, Pokémon Black and Pokémon White were so popular that they remain the first Pokémon games not to receive a special third version in their generation, like Pokémon Yellow or Pokémon Platinum.  They were also the first Pokémon games to get direct sequels, in the form of Pokémon Black 2 and Pokémon White 2, but I'll get into those games at a later time.

In 2011, Sony officially announced that the PSP Go had been discontinued.  Nintendo, on the other hand, was still doing phenomenally with its entire DS line, and the DSi XL can still be found at retail to this day.  In the end, the Sony Playstation Portable could not compete with the overpowering success of the Nintendo DS brand, which, taking into account the combined sales of all the different DS models, became the second best-selling console of all time.  However, Sony's PSP cannot be considered a complete failure.  While UMD never took off the way Sony had hoped when they designed the format, the PSP itself offered a lot of features gamers liked, including MP3 music and MP4 video playback, an internet browser, and support for streaming podcasts.  Its regularly updatable internal software gave gamers the opportunity to receive new features, and to this day no game console ships without some sort of update mechanism.  The PSP Go's online digital network also laid the groundwork for a fantastic digital game marketplace to already be in place by the time Sony's next handheld hit the market, but that's a story that is still being written.

The Video Game Handheld War Part 8 September 25, 2013

Posted by Maniac in Histories, Video Game Handheld War.
add a comment

As we enter this eighth part of our History of the Video Game Handheld War, we're going to continue our discussion of the Sony PSP and Nintendo DS generation.  The reason I've chosen to break this particular generation into so many parts of this ongoing series is that, unlike most of the generations before it, a lot of events transpired during it.  New hardware was getting released regularly, and the arrivals of popular franchises on the platforms were big events.  Each side constantly tried to one-up the other, but as we enter this latest part, the Nintendo DS was still far ahead of the Sony Playstation Portable.

The year was 2007, and Sony had just shipped a brand new model of the Playstation Portable, the PSP-2000, which many simply dubbed the PSP Slim.  For all intents and purposes it was an improvement over the original PSP, and immediately after launch, gaming journalists discovered that games played on the PSP Slim enjoyed much shorter load times.  However, the PSP Slim was not without its problems: some players found issues with the Slim's LCD screen, and complaints of image ghosting started to spread.  That said, the PSP Slim's TV-out feature, compatible with both SD and HD TVs, made the ghosting issue a bit of a moot point.

In 2007, Sony released God of War II, one of their last major first-party titles for the Playstation 2.  The game was the sequel to one of the PS2's most critically acclaimed games, and it became one of the most anticipated releases of the year and one of the best sellers on the PS2 that season.  While the game ended on a cliffhanger, its manual hinted at the possible future of the series: gamers found an advertisement for the next God of War game, making it clear that the series would be coming to the Playstation Portable.

Once again, just like with Grand Theft Auto: Liberty City Stories, instead of jumping for joy that one of Playstation's most iconic franchises was getting a new title on a portable platform, gamers complained like crazy that it wasn't coming to a platform they already owned, like the Playstation 2.  However, when God of War: Chains of Olympus finally launched on the Playstation Portable, it received a huge critical response, with many critics remarking on just how well God of War's core gameplay translated to the Playstation Portable.  While it did not resolve the cliffhanger left at the end of God of War II, it served as an exciting prequel that further fleshed out God of War's characters, and it had a pretty exciting ending to boot.  Players who picked it up were not disappointed, and the game became a hot seller.

In 2008, Nintendo decided the time was right for a new DS revision.  The early buzz was that Nintendo would release a larger DS Lite model, but they chose a different route.  Once again, Nintendo released a smaller and lighter DS with slightly larger dual screens, but that wasn't all.  The new model also featured a dual camera system, giving players the chance to take digital pictures or use the cameras during gameplay, and because of that, the new revision was dubbed the DSi.  The downside was that the DSi would not feature a GBA slot, so any peripherals that took advantage of it (including the required adapter for Guitar Hero: On Tour) would no longer be compatible.  This angered some of the DSi's early adopters, as well as Nintendo loyalists planning to upgrade, but by this point Nintendo was no longer selling Game Boy Advance games, and most retailers were no longer stocking them either.

With the release of the DSi came the end of the long reign of the Game Boy brand, one of the most successful hardware platforms of all time.  The DSi, like the DS Lite and Game Boy Color before it, once again shipped in multiple colors, offering players a small way to personalize their systems.  In Japan, the platform was a huge hit at launch, both with new customers and with existing DS players who wanted to upgrade; nearly all of the launch units sold immediately.  When it finally launched in America, it shipped in two colors, a first for the region.

Reviews of the system were broadly positive.  While the extra cameras wouldn't win the DSi any major awards for technical achievement, the DSi's new online DSiWare store alone made the upgrade worth the price.  While Nintendo chose to release only DSiWare-exclusive content, not full retail games, through the service, it was very successful and gave Nintendo the opportunity to release new DSi content on a regular basis.  While the device shipped with only a finite amount of internal memory, Nintendo included an SD card slot, allowing users to store more content.  The downside was that DSiWare content was region-locked, unlike retail DS game cards, and, like on the Wii, purchases were tied to the individual handheld device.

Unfortunately, everything still wasn't going well for the PSP.  By 2008, the UMD Video bubble had finally burst.  Far too wide a range of videos was getting released, and the PSP's market share was not large enough to absorb all the titles being offered.  With Walmart having ended its support years earlier, the UMD Video market had started to stagnate.  On top of that, UMD was facing heavy competition from media with no physical format at all.  Apple's iTunes Store had been offering digital movie downloads ever since Apple released a color version of its highly successful iPod, and with removable media overtaking the storage capacity of a UMD, gamers decided that downloading multiple movies to a portable device beat carrying around physical media.

Sony released one more incremental hardware revision of the PSP in the form of the PSP-3000.  For all intents and purposes it was another PSP Slim, but it featured a slightly improved screen that lessened the image ghosting many had complained about on the PSP-2000.  It was also compatible with nearly all of the PSP-2000's peripherals, including the battery, Skype headset, and TV-out cables.

However, I would be remiss not to mention the other big elephant that had entered the handheld space by this point: the rise of the smartphone.  By 2009, both Sony and Nintendo had to sit up and take notice.  Smartphones had already hit the market with huge success, and it became clear very early that something like a brand-new iPhone could have just as much gaming capability as a dedicated portable game system.  An iPhone user could wirelessly download anything they wanted to their phone in a matter of minutes.  Sony had previously been one of the largest cell phone manufacturers in the world, and they saw this digital download craze as something they could bring to the Sony PSP in the form of an entirely new PSP hardware revision.  Would they succeed?  That's a story for next time.