Science Check: Heavy Rain, Revised July 8, 2013
Posted by Maniac in Editorials, Science Check.
With the release of Quantic Dream’s newest game BEYOND: Two Souls in October, I decided we should take another look at their last major release, Heavy Rain for the PS3. I previously covered Heavy Rain two years ago in the very second Science Check, where we discussed the ARI glasses worn by the character Norman Jayden. They could show him a deeper layer of detail at crime scenes, keep his clues organized, and even offer minigames during off hours. At the end of that original article, I concluded that the ARI glasses were not theoretically possible to construct with 2012 technology, because they would have needed a GPS, an IR transmitter, night vision, a GPU, a CPU, a cellular modem, and a battery, and would still have had to be cheap enough to manufacture that they could be considered disposable the second a new model came out.
This week on Science Check, we’re taking another look at a game I’ve covered before, Quantic Dream’s Heavy Rain. In that original article I didn’t believe the ARI glasses could work, and I stand by that conclusion as of when it was written. But given some recent advancements in technology, I realize I neglected to mention something Quantic Dream could have theorized as an integral component to make ARI work: Cloud Computing. I also didn’t anticipate that designs similar to ARI might be hitting the market quite soon in the form of Google Glass.
Sometimes you’re forced to make severe leaps of logic as to just how plausible a video game’s grounded reality can be. Some things we’re willing to take for granted, like the idea that enemies will simply carry health and ammunition supplies with them at all times, and that you’ll be able to make immediate use of them.
But sometimes there are moments in gaming which skirt the bounds of reality, and you are forced to ask yourself…COULD THAT REALLY HAPPEN? Fortunately for me, I happen to have a bunch of friends with science backgrounds on speed dial, and when I ask them questions, they have no problem filling me in on just what reality would do in these situations.
So this is Science Check, where I take a look at the leaps and bounds of scientific logic that games have made over the years and check whether they would indeed work, or whether, if you tried them in the real world, you’d be totally screwed.
In the past six years, there has been one huge technological development which has impacted the world in a way I never could have imagined when it was in its infancy, and that’s online streaming of content. You don’t have to look any further than the success of services like YouTube, Netflix, or Amazon to find companies offering customers the capability to instantly stream movies, television, and other videos to their computers, smartphones, or televisions. In 2009, a company called OnLive thought they could do even more with this streaming capability and planned to offer a service that would stream high-end PC games to people’s HDTVs or computers in High-Definition without the need for a high-end PC. I’m talking about streaming entire games, the next logical step in streaming technology powered by the Cloud. With OnLive, a gamer would no longer need to install a game to their PC and render it with the computer’s central processor and graphics card, where the downside is that if the system isn’t very powerful, the game’s performance and detail suffer. Instead, the game would be processed on OnLive’s supercomputers at data centers across the country and streamed in full to your house, with very little processing needed when it arrived. Depending on the speed of your connection to your Internet Service Provider (ISP), you could stream a high-end game at up to 720p HD. All a player would need was a small streaming set-top box for their HDTV, or the OnLive application installed on their desktop or laptop PC.
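To make that division of labor concrete, here’s a minimal toy sketch in Python. Every name in it is made up for illustration; OnLive never published its actual protocol. The point is simply that the game runs and renders in the data center, while the box in your living room only sends inputs and displays whatever compressed frames come back.

```python
"""A toy sketch of the remote-rendering idea behind a service like OnLive."""

import zlib

FRAME_W, FRAME_H = 1280, 720  # 720p, the ceiling OnLive advertised


class GameServer:
    """Stands in for the data-center hardware that runs the actual game."""

    def __init__(self):
        self.player_x = 0

    def step(self, inputs):
        # All game logic happens here, never on the client.
        self.player_x += inputs.get("move", 0)

    def render_frame(self):
        # A real server would rasterize with a GPU and encode with H.264;
        # here we fake a framebuffer and compress it so something travels.
        framebuffer = bytes([self.player_x % 256]) * (FRAME_W * FRAME_H // 1000)
        return zlib.compress(framebuffer)


class ThinClient:
    """Stands in for the set-top box or app: decode and display only."""

    def display(self, packet):
        frame = zlib.decompress(packet)
        print(f"displayed a {len(frame)}-byte frame from a {len(packet)}-byte packet")


if __name__ == "__main__":
    server, client = GameServer(), ThinClient()
    for _ in range(3):                          # three "frames" of the session
        inputs = {"move": 1}                    # the client's controller state
        server.step(inputs)                     # simulation stays server-side
        client.display(server.render_frame())   # only compressed frames come back
```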
Like most gamers, I was skeptical that OnLive could deliver what they promised after they announced their service. Heck, my cable company takes at least thirty seconds to respond to a pause command when I’m watching videos on-demand, so I had no idea if there was enough bandwidth on the planet to let a player play a game like Crysis without heavy latency. However, I was able to check out OnLive’s service when they had a demo station set up in the Press Room at E3 2011, and I was impressed that the system was able to provide such a fluid experience from a non-local source. I figured that if it could handle something as complex as a video game without notable latency, it could handle ARI.
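To put rough numbers on why I was skeptical, here’s a quick back-of-envelope calculation. The figures are my own assumptions, not anything OnLive published, but they show why a game stream is a harder problem than a movie stream: video can buffer ahead, while every button press in a game has to make a full round trip before you see the result.

```python
# Rough, assumed figures for a 720p game stream; not OnLive's published specs.

bitrate_mbps = 5            # a plausible rate for 720p30 H.264 video
frame_time_ms = 1000 / 30   # ~33 ms between frames at 30 fps

# The added delay on top of the game itself: encode, travel, decode, display.
encode_ms, network_rtt_ms, decode_and_display_ms = 15, 40, 15
input_to_photon_ms = encode_ms + network_rtt_ms + decode_and_display_ms

print(f"stream needs roughly {bitrate_mbps} Mbps downstream")
print(f"added latency over a local game: ~{input_to_photon_ms} ms")
# Much past ~100 ms of added delay and a shooter starts to feel sluggish.
```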
That’s the beautiful part of what the Cloud offers. You wouldn’t need a large device if most of the rendering and processing was done elsewhere. The servers available to government organizations would be among the best in the world, and would be very capable of handling the heavy tasks that ARI would require. Without the need for a high-end CPU or GPU, the glasses could get away with being slimmer and cheaper to manufacture. The FBI would need to invest in some serious supercomputers, but it’s reasonable to assume they would have those anyway. The downside is that ARI would NOT work in an area without cellular reception, and when you live in my neck of the woods, that’s a serious issue. ARI would also be better served by a well-established, encrypted cellular band rather than what is offered to regular consumers.
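Here’s a hedged sketch of how that split could look for ARI itself, with every name and number invented for illustration. The glasses do little more than capture, transmit, and draw; the heavy scene analysis happens back on the Bureau’s servers, and the whole thing falls over the moment the connection does.

```python
"""A toy sketch of ARI offloading its heavy lifting to the Cloud (hypothetical)."""


def remote_analyze(frame, location):
    """Stands in for the FBI's server farm: heavy image analysis lives here."""
    # A real backend might run object recognition, forensic databases, etc.
    return [{"label": "blood trace", "x": 0.4, "y": 0.7},
            {"label": "shoe print, size 11", "x": 0.6, "y": 0.8}]


class Glasses:
    """What the hardware on Jayden's face would actually have to do: not much."""

    def __init__(self, has_signal=True):
        self.has_signal = has_signal

    def capture_frame(self):
        return b"raw camera pixels"            # placeholder for a camera capture

    def draw(self, text, x=None, y=None):
        where = f" at ({x}, {y})" if x is not None else ""
        print(f"HUD: {text}{where}")

    def tick(self):
        if not self.has_signal:
            self.draw("NO CONNECTION - ARI OFFLINE")   # the big catch
            return
        # In a real device this would be an encrypted cellular call, not local.
        overlays = remote_analyze(self.capture_frame(), location=(47.6, -122.3))
        for item in overlays:
            self.draw(item["label"], item["x"], item["y"])


if __name__ == "__main__":
    Glasses(has_signal=True).tick()    # markers appear on the lenses
    Glasses(has_signal=False).tick()   # dead zone: nothing works
```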
The problem with OnLive wasn’t their technology but their price point. They were charging the same price as retail for games, and they were considering charging for use of the service itself once it left beta. Gamers chose to continue purchasing their games at retail and playing them on their existing hardware, and I think a few of them were also concerned they would lose access to their games if OnLive was ever shut down. If a gamer is going to spend money on something, most need to be absolutely sure about the long-term use of the product. Because of that, many will decide against products that have an internet requirement without an online component, and nobody had any idea of OnLive’s long-term viability to justify investing in games through the service. Not too long ago, OnLive announced a massive staff reduction to cut their operating costs. I understand that as technology improves, it will always get smaller and it will always get faster, but with Cloud processing like what OnLive offered, the technology in something like the ARI glasses may not need to get smaller or faster at all; the Cloud can take care of that for you.
But whether they could use the Cloud or not, wearable computers are now a very close possibility. Google is currently running an open test of a set of glasses appropriately named Google Glass. They are supposed to offer Augmented Reality (AR) to the wearer, which, in conjunction with a GPS, gyroscopes, and accelerometers, will provide further information on the area around the user just by looking at it. Glass is also equipped with a 720p HD camera which can record video at the user’s request. This is very close to Norman Jayden’s ARI, but it lacks several important features. For one, Glass cannot currently do night vision (IR), nor does it have a cellular modem; from a brief glance at the technical specifications, it looks like Glass uses WiFi and Bluetooth for connectivity. It also has a very small Heads-Up Display (HUD) at about 640×480, which may sound low, but when the screen is that close to the human eye it should work well enough. The biggest difference between Glass and ARI is that Glass’s display doesn’t seem to be part of the lenses themselves the way Jayden’s ARI is. We may eventually be able to produce a lens that works as a combination of viewport and screen and is small enough to fit on a normal pair of glasses, but that will take time.
As I said in my original Science Check, I expected all the processing and rendering that ARI did to help Norman find clues, further investigate them, and interact with his virtual environment to be done by the glasses themselves. But it’s possible the ARI glasses were designed to do all their processing and storage through the Cloud instead of locally. This would save the glass frames from having to include a high-end CPU and GPU in order to process crime scenes and provide a virtual workspace. If you want to virtually feel like your desk is on the surface of Mars or under the sea, you’re going to need both of those things in your hardware, but not if the servers your device has access to are doing all the processing for you.
Just make sure you don’t lose reception.
As a postscript to OnLive’s story, I would just like to add that while it looks like OnLive didn’t do too well financially with their streaming plans, they weren’t the only company planning to offer a streaming service like this. Another company, Gaikai, run by Dave Perry, was trying to do something along similar lines. Gaikai was bought by Sony, and its technology will appear in the PS4 and PS Vita. Time will tell how well the Cloud streaming service Sony will be offering on their newest consoles works, but Sony has promised instant gameplay to anyone using the service to stream their games on the PS4 when it launches. Perhaps by selling its technology to Sony and integrating its service into a next-generation console with a guaranteed install base, Gaikai can succeed where its competition faltered.
Halo 4 Champions Bundle Trailer July 8, 2013
Posted by Maniac in Game News.
The next map pack for Halo 4 has been announced, and it’s going to be offered in a bundle with several new armor sets that have never been seen before, called the Infinity Armor, Steel Skin, and Bullseye Packs. If you want to see the new armor for yourself, check out this new trailer.
The Halo 4 Champions Bundle is coming August 20th, 2013 to the Xbox Live Marketplace for $10 US. Halo 4, out now exclusively on the Xbox 360, will be required to play.