
After a Long Absence, Here’s How Gaming Has Changed

From the time I was 12 years old until the time I left for college, my gaming life was dominated by the Nintendo GameCube. At first, I played more kid-friendly titles like Super Mario Sunshine before seeking out games from the Star Wars franchise, including Jedi Knight II: Jedi Outcast and both Rogue Squadron II & III, as well as other pop culture tie-ins like Spider-Man 2: The Game and The Simpsons Hit & Run. Then, after turning 17, I was allowed access to the console’s more mature titles, like GUN and my beloved Resident Evil series.

So many things in my life, from conversations with friends at the cafeteria table in high school to my devotion to G4, the cable network dedicated to video games and gaming culture, reinforced gaming’s place as the prominent medium du jour. But that started to fade around the time I started college, as video games became a more social activity. Playing something by myself was replaced with Halo parties in my freshman hall and eventual competitions in titles like Big Buck Hunter, Super Smash Bros., and The Beatles: Rock Band with close friends in my dorm room or in communal social areas.

However, I didn’t abandon gaming entirely. My cousin would bring his Xbox on our annual family beach trips, giving me the chance to play zeitgeist-capturing titles like the 2013 reboot of Tomb Raider, Saints Row: The Third, Assassin’s Creed IV: Black Flag, and the first two installments of the Batman: Arkham series, Arkham Asylum and Arkham City. I might’ve watched some friends play Skyrim here and there, and I loved playing competitive games like Mortal Kombat, NBA 2K, and Injustice 2 with friends in my quarantine bubble during COVID-induced lockdowns.

‘The Last Starfighter’ (1984)

But this past Thanksgiving, that same cousin gave me his old Xbox One X, allowing me to immerse myself in serious gaming for the first time in many years. The first titles I sought out were the remakes of Resident Evil 2 and Resident Evil 3, which I had been itching to play. Soon, other titles joined the roster. In keeping with my Star Wars fandom, I picked up the acclaimed Jedi: Fallen Order as well as the more divisive Battlefront II. I also completed two trilogies based on franchises and characters I love: I beat Batman: Arkham Knight, as well as the two most recent Tomb Raider games, Rise of the Tomb Raider and Shadow of the Tomb Raider.

Coming back to traditional single-player video games after such a long hiatus has given me some perspective. The industry has changed dramatically, and so has the experience of playing. Gone are the occasional glitches that could derail a play session, replaced by stunning graphics and immersive gameplay. A lot has stayed the same, but as with everything, change is inevitable. Here are some of the highlights of what I noticed as I took a deep dive back into gaming for the first time in a decade.

Graphics tend to be a bigger selling point

In the old days, you could forgive clunkiness or glitches if the game experience was good enough. Now, however, it’s about delivering as realistic an experience as possible. Every game I’ve played in the past six months has had incredible graphics, and it’s hard to point to a weak one in the bunch. Glitches are rare. The settings of some of these games are lit and framed in a way that feels practically cinematic. Stunning graphics are perhaps the most obvious perk given how much time has passed and how much technology has improved, but I was still surprised by just how spectacular they ultimately turned out to be.

Remastered editions breathe life into older titles

For the first time in my life, I played updated remakes of some of my favorite games. In addition to the previously mentioned Resident Evil remakes, a remastered edition of Ghostbusters: The Video Game proved to be another fun, worthwhile treat. In every case, the differences were drastic: fully rendered characters that look practically human, in place of the polygon-heavy models of the original Resident Evil and the cartoony look of the original Ghostbusters.

There were other aesthetic differences. For example, the older Resident Evil games had a fixed-camera system that restricted the player’s perspective; the remakes swap it for the over-the-shoulder, third-person view that has since become the standard. It’s true that in some cases the weight of expectations might factor in: the original Resident Evil 3 remains one of my favorite games of all time. Its remake, as graphically impressive, fun, and scary as it is, took only a fraction of the time to beat, and I wish there were simply more game there. Even so, my experience shows that remakes can help reinvigorate some of gaming’s most iconic franchises.

Open-world gaming is the norm

The concept of “open-world” video games was just taking off in the sixth generation of home consoles, which includes the GameCube, thanks to more capable hardware and the success of the open-world Grand Theft Auto series. But with technology having advanced so much further, developers can now build far richer, more immersive worlds for the player to explore, and that exploration has become even more central to the video game experience.

In older Tomb Raider games, you could replay individual levels, but there wasn’t much reason to explore your surroundings unless you were the most diehard completionist. Now, as Lara Croft, you can hunt animals, find and clear challenge tombs, take on side missions, or just soak in surroundings as varied as the Siberian tundra and the Inca- and Maya-inspired jungle settlements of South America. Keeping a player engaged even after the main story was something of a rarity in previous console generations; now, it practically feels like a necessity.

Micro-transactions are a thing now

When I played Spider-Man 2 for the GameCube, I had to manually keep track of how close I was to completing various challenges. Now, my Xbox alerts me to completed challenges I wasn’t even aware I was participating in. It makes the player feel like they’re doing more and accomplishing more, and it certainly adds to the overall gaming experience.

Moreover, games now actively want you to “unlock” as much as possible, whether that’s items to customize characters with or entirely new characters and settings. You can always earn points to unlock these things just by playing. But of course, there’s also the option to spend real, actual money on micro-transactions for “loot” and bypass the process entirely, getting straight to the goodies. It doesn’t seem to be as much of a problem for titles I played like Star Wars: Battlefront II or Injustice 2 as it was when they first released, but it is noticeable. For example, when I want to live out a childhood dream of playing as Michael Keaton’s or Christian Bale’s Batman, or driving their respective Batmobiles in Arkham Knight, I have to fork over some change to Microsoft.

***

Video games never really left my life, but for the past decade it felt like I had more or less pressed the pause button on them. This new chapter in my relationship with them has been refreshing as well as educational. I always kept my ear to the ground about developments in the industry and individual breakthrough games, so I feel like I understand more about why some of the titles I’ve played were so successful, as well as some of the criticism thrown at them. I’m excited to learn even more, to play even more, and to stumble onto something new or unexpected.