- cross-posted to:
- games@sh.itjust.works
- gaming@beehaw.org
A three-year fight to support game preservation has come to a sad end today. The US Copyright Office has denied a request for a DMCA exemption that would allow libraries to remotely share digital access to preserved video games.
“For the past three years, the Video Game History Foundation has been working with the Software Preservation Network (SPN) on a petition to allow libraries and archives to remotely share digital access to out-of-print video games in their collections,” VGHF explains in its statement. “Under the current anti-circumvention rules in Section 1201 of the DMCA, libraries and archives are unable to break copy protection on games in order to make them remotely accessible to researchers.”
Essentially, this exemption would open up the possibility of a digital library where historians and researchers could ‘check out’ digital games that run through emulators. The VGHF argues that around 87% of all video games released in the US before 2010 are now out of print, and that the only legal way to access those games now is through the occasionally exorbitant prices and often failing hardware that define the retro gaming market.
I don’t think we’re talking about arcade games at this point though. We’re talking to a large extent about 3rd–6th generation home gaming consoles. For Nintendo, that’s the NES to the GameCube. Sony entered with the PlayStation in the 5th gen, and Xbox came out in the 6th.
I think a lot of people would see this (and to a slightly lesser extent the 7th gen) as the high point, when games came out in a completed state and you paid once and then just enjoyed the game.
Well, no, we’re talking about everything. Everything before 2010, explicitly.
I would guess most people just fill in whatever moment of their childhood when they would buy a thing and enjoy a thing and not worry about it too much.
Me being me (see the old codger self-identification up there), I substitute in the late 80s and 90s, when I would plead and beg for coins to squeeze in another 60-second gaming session and then go on to save for months in order to get a lesser version of that same experience at home for anywhere between 60 and 90 bucks (140–220 adjusted for inflation).
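For anyone checking that math, here’s a rough sketch of the inflation adjustment, assuming illustrative US CPI-U annual averages (the exact multiplier depends on which year and index you pick):

```python
# Rough sketch of the inflation math above. The CPI values are illustrative
# approximations of US CPI-U annual averages, not authoritative figures.
CPI = {1990: 130.7, 2024: 313.7}  # assumed index values

def adjust(price: float, from_year: int = 1990, to_year: int = 2024) -> float:
    """Scale a historical price by the ratio of the two CPI index values."""
    return price * CPI[to_year] / CPI[from_year]

for price in (60, 90):
    print(f"${price} in 1990 is roughly ${adjust(price):.0f} today")
# $60 -> ~$144 and $90 -> ~$216, in line with the 140-220 range quoted above.
```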
In the grand scheme of my memories, the five years after arcades were relevant and before Microsoft started charging a monthly fee to play online and Facebook started a games division are too short a blip to consider a golden age. My nostalgia is for ranting angrily about having to purchase Street Fighter 2 for the fourth time, and for Capcom re-selling the PSOne version of Resident Evil a third time for the privilege of added analogue stick controls.
But an arcade game is a physical object. The preservation needs of arcade games are very different from those of games distributed on cartridge or disk, which is why I suggested that a digital library would focus on home game consoles, especially those released at a time when home gaming was the main way games were experienced (i.e., after arcades had stopped being the most popular way).
I’m assuming that “too short” and the reference to a “golden age” were meant as a refutation of my claim about the 3rd–6th console generations, which lasted from 1983 until 2007. If that’s the claim, I find it absolutely absurd. When we discuss the golden age of TV we’re talking barely one decade, from the mid-to-late oughts to the late 10s.
If you meant something else by that bit, I’m sorry, please disregard the above paragraph. But I don’t know quite what you do mean.
No, I’m arguing that if you’re trying to identify an era where the industry at large was not overmonetizing, that’s your timeframe: from the death of arcades to the birth of modern casual gaming, F2P, and subscription services. By the numbers, that’d be 2001–2005.
Before then you have arcades acting as the first window of monetization, where a whole bunch of console games started and where a lot of the investment went. After that you’re balls deep in modern gaming, with games as a service that are still live today, from World of Warcraft to MapleStory.
That’s a handful of years, at best. Any other interpretation has to ignore huge chunks of the industry that were behaving in the same way that makes people complain today. Either you dismiss arcade gaming despite it being the tentpole of the entire industry, or you dismiss the fact that subscription and MTX games were already dominating big chunks of the space.
So no, it’s not 24 years. It never was 24 years.
And for the record, we knew at the time. We’ve been complaining since the 90s. I wasn’t joking earlier, “Ubisoft greedy” today is a carbon copy of “Capcom greedy” in 1997. I’ve been stuck in nerdrage Groundhog Day for thirty years.
But I don’t think you need to start from the time when arcades were entirely irrelevant, merely from when they were no longer the main driving force. That’s at most the late '90s, with gen 5 consoles and many big, popular or influential game franchises like Quake, Pokemon, Age of Empires, Fallout, Diablo, and Grand Theft Auto (that’s '96 and '97 alone).
And you need to go up until at least the time when few of the largest games were available without cancerous monetisation strategies, not merely until a few games had started using them. So you definitely need to go up to at least the launch of the 7th generation consoles in 2007.
To bring it back to the original point of the conversation, that’s not to say that it isn’t worth preserving games that did have those strategies, of course. It just doesn’t detract from the sense of a period when the majority of gamers’ experience was much better.
And “EA greedy” in 2007. That doesn’t mean that what they were doing then was as bad as what is being done today.
Well, we’ve gone from 24 and 5 to a 10-year compromise, so we can agree to disagree on that basis.
That said, I do disagree. You are underestimating how relevant arcades were in 2001. Soul Calibur may have been an early example of the home version being seen as better than the arcade game in 1999, but it was an arcade game first; I had played the crap out of it by the time it hit the Dreamcast.
And I was certainly aware of MapleStory before it was officially released here. And of course I mentioned WoW as the launch of the GaaS movement, but that’s not strictly accurate; I personally know people who lost a fortune to their extremely expensive Ultima Online addiction in 1997/98.
I am still not convinced that the experience of those gamers was any better or worse, having been there in person. The kids in my life seem perfectly content with their Animal Crossings, Minecrafts and even Robloxes. The millions of people in Fortnite don’t seem mad about it. I sure was angrier about that Resident Evil business at the time than people are about the Resident Evil remakes now. Hell, I got pulled away from a fantastic remake of Silent Hill 2 by an even better JRPG in Metaphor: ReFantazio, and neither of those games features any MTX or service stuff. And that’s not even mentioning the horde of games in the $20–40 range that are way better and more affordable than anything I had access to in the 90s.
People are nostalgic for the nostalgia times, reasonable or not, and time has a way of filtering out the nastiness, especially if you were too young to notice it. I was wired enough to hear the lamentations of the European game development community being washed away by Nintendo and Sega’s hostile takeover of the industry and their aggressive imposition of unaffordable licensing fees. I was aware of the bullshit design principles being deployed to milk kids of their money in arcades. I had strong opinions about expansion packs and cartridge prices. It’s always been a business, and it’s always been run by businessmen.
Best you can do is play the stuff that’s good and ignore the rest.
Second best you can do is be publicly mad at the business driving unreasonable regulations that are meant to do the public a disservice.
Third best you can do is start archiving pirated romsets to privately preserve gaming history, blemishes and all, so we get to keep having this argument when the next generation of gamers are out there claiming that Fortnite used to be cool when it was free and had a bunch of games in there instead of requiring you to sign away your DNA to be cloned for off-planet labor or whatever this is heading towards.
Got time for a short story?
Manna
As a fellow ancient of the game world, I would say 20ish years is not far off, give or take. The Atari 2600 was around in the 70s and the original NES came out in 1985(?). The NES was really the beginning of the end for the arcade scene. True, a lot of the arcade ports were terrible, but the power just wasn’t there to do it in a small box yet. $1 rentals from the local video shop would let you play a game all night or longer, depending on who it was from.
While the online game services from Xbox and co could be seen as a return to a pay-to-play situation, they were never a must-have. You could still play with friends locally without a subscription, and the mass push for DLC buys wasn’t there yet.
I would really put the return to money snatching alongside the rise of mobile games. Buying add-ons and in-game coins to get an advantage really picked up with the ease of always-on connections and purchases made with a simple swipe of the finger. Once that ‘just one more boost will do it’ addictive mechanic was made the norm, it was all over for the concept of a game that you just bought as a complete thing. Now it’s a novel thing to see a game offered that you just buy and play as it is.
See, this is the part where I’m not going to dismiss your experience, because a lot of this was super regional, but I’m going to say my experience was not that at all.
I definitely spent most of the money I put into arcades AFTER I had a 16-bit console. The “arcade>marketing>console port” hype cycle went on for a decade after the NES first became a thing.
And Live Gold is just a sign of something that was going on everywhere. The first free to play hits were happening, WoW was taking over the world and making GaaS mainstream… and yeah, online gaming was becoming a thing on consoles and getting monetized in brand new ways.
But also, I’d say expansions were a thing way before DLC became a dirty word. Because Groundhog Day, I distinctly remember having conversations with angry nerds in the mid-2000s explaining that there wasn’t much of a difference between DLC and a lot of the expansion packs and shovelware content expansions being pushed around all through the late 90s.
And of course there are tons of games you can buy as a complete thing. As I said above, I’ve been playing Silent Hill 2, Metaphor and a bunch of other stuff that is very classic in its structure. Another constant of gaming nerdrage is that people don’t care whether what they like continues to exist; they are mostly clamoring for the things they dislike to stop existing, which I’ve never been on board with.
What counts as DLC vs an expansion has become somewhat blurry. There was StarCraft: Brood War, which was really an entire separate game, to the point that you launched the two separately from the menu. Now things like RimWorld (which I play far too much of) have these expansions/plugins that add new mechanics and features but don’t create a separate game in their own right.
I actually read an article recently about 20 years of Oblivion horse armor or some such. They made an interesting point that a lot of the acceptance of micro buys came from online games letting you show off your new gear to the masses.
Yeah, in some ways I think playable content vs cosmetics is a more functional distinction than DLC (or MTX) vs expansions. The big thing that changed is that games now will sell you visual items for bragging rights, rather than stuff for you to play.
I would suggest that not buying those is a good idea, but clearly a bunch of teens and rich people disagree with me on that one.