Ten Years After “Live Free Or Die Hard”

Can you believe that Live Free or Die Hard was released on June 22nd, 2007?

I mean, damn. I feel old.

Ten years later, and I have such a mixed relationship with this movie. The Hollywood Hacking clichés are offensively straight out of an early 1990s film, but the dialogue is spot on. The action scenes are completely over the top, and yet they are fun as hell to watch.

Ten years later, I still have no idea how I feel about the film.

Ten years later, we are more dependent on computers than ever before, but if anything, the film’s premise of a dedicated group of cyberterrorists being able to shut down the entire national infrastructure seems more distant than it did in 2007. Part of that is that we simply have a better public understanding of how IT works today — one of the benefits of being so uniformly dependent on computers. Another part is that the technology has advanced, and still another part is that the movie’s plot has always been… well, let’s be charitable and call it “highly implausible”.

So why do techies hate this movie? I mean, it’s a dumb but fun summer flick, isn’t it? Well, yes, but it was presented just realistically enough in 2007 that a lot of people believed it was pretty accurate. More than that, though, as I’ve said before when discussing this movie: “You don’t go into an alley to get stabbed, but you still feel the knife when it happens.”

Live Free or Die Hard doesn’t mean to fool people. It doesn’t mean to cause a virtual aneurysm in techies and gadget geeks. It just wants to be a fun movie. Fortunately, for the most part, it is. So the science underpinning the plot is hilariously and offensively wrong. Big whoop. The action scenes are fun to watch and appreciably over the top, even if the previous films felt a bit more grounded in reality. The dialogue is snappy, snarky, and sarcastic at all the right points, with Bruce Willis and Justin Long regularly trading positions as the guy calling out the absurdities of the plot, and there are plenty of those moments to be found.

Ultimately, Live Free Or Die Hard doesn’t play by the rules of reality. It plays by John McClane’s, which, as the films repeatedly go out of their way to point out, don’t ever really make full sense.

Live Free or Die Hard is a great movie. It’s fun, it’s witty, stuff blows up, and Bruce Willis manages to be himself throughout.

I just wish I wasn’t so compelled to scream “OH MY GOD THAT COULDN’T BE MORE WRONG” at every single turn of technobabble.

And fuck that F-35 scene.

Here’s hoping you all get another opportunity to enjoy this underrated classic of American Summer action cinema on this upcoming Fourth of July.

Peace out.


Judas the PlayStation 4 is finally dead!

It’s official.
My old PS4 is dead.
D. E. A. D.

But honestly? After the damned thing cost me internet overages on no fewer than 18 separate billing cycles by unpausing my paused game update files at 4 in the morning while I slept and couldn’t stop it, I had already named it “Judas”. In a not-so-weird way, I’m glad the bitch is finally dead – I’d only tried booting it up at all to watch a Blu-ray after leaving it unplugged for almost a year. It’s very much not worth it to me to buy a new one. So tomorrow I’ll be holding a funeral for Judas the PlayStation.

It’ll be a viking funeral – I can’t wait to set this little motherfucker on fire!

Just a crying shame I can’t take it out THIS way.

And before you bring up the obvious “solution” (turning off auto-updates), let me first be sarcastic at you: I’ve owned Judas since six months after the PS4’s launch. No, turning off auto-updates had never once fucking occurred to me.

I exhausted every option available to me in my multi-year quest to fix this damn thing. I did it all. Eventually, I even unsubscribed from PS+. Not for performance issues, mind, but because my online friends began to stop playing co-op with me altogether, and around the same time, the free monthly games selection began to routinely suck ass.

But none of that appeased Judas.

Judas’ constant insatiable hunger for updates never stopped or slowed. If I ever agreed to download something, Judas wouldn’t rest until it was done, so I just unplugged it when I wasn’t using it (it would even turn itself back on –cold boot itself– otherwise!), and I eventually stopped using it completely. After that, it just sat in a closet corner with a sheet covering it like some horrible forgotten thing for the better part of 2016, only being awakened once during that span to blitz through Uncharted 4 (of which I am honestly not a fan, which I never thought I’d say about a Naughty Dog game).

Meanwhile, my PlayStation Vita has never once had betrayal on its mind, and my Xbox One has never lifted a metaphorical finger to do ANYTHING without my express permission in each instance. And so far, my PC has behaved itself, barring a crazy amount of updates after the initial startup, which honestly I’d expected to happen anyways.

Judas was just pure evil in console form. Add to that Sony botching the American releases of their various Xperia phones during that long but critical period when people like me still cared about the brand; their MASSIVE customer service fuck-up over my Vaio laptop some years before (they refused to repair it despite it being under warranty and made me send it to Best Buy, who broke it even worse. Twice in a fucking row.); and their willful abortion of the Vita before they’d even seriously tried to help it succeed (they gave up entirely after what, two years?). This string of betrayals by Judas was the thing that finally and fully axe-murdered my prior obsessive fanboyish love of Sony products beyond hope of redemption.

Sony done screwed up, and after all this time, I firmly believe they have no one to blame but themselves.

And Judas.

Fuck you very much, my old friend.


Quick notes on Framerates (for Gamer Arguments)

At the dawn of the 20th century, Thomas Edison and his company discovered that the human eye needs, at bare minimum, a framerate of what we would today call 10 FPS before our brains become convinced that a sequence of still images is in fact moving and not just a fast slide show. 10 FPS is the baseline. Edison actually shot a few films at this framerate, including a short film adaptation of Mary Shelley’s Frankenstein (beating Universal to the punch by two decades), but few of those films survive.

24 FPS was the standard of Old Hollywood, and games below this seem choppy and poorly animated to pretty much everyone. This low framerate is part of why restored versions of old films that have been brought up to a higher framerate seem so odd to watch, but it is only part of the reason.

24 FPS actually remains Hollywood’s standard to this day, while 30 FPS (with dips and pushes across the 30-45 range) is the typical console target, as developers consider it the “best of averages” balancing on-screen detail and framerate.

48 FPS is the up-and-coming high framerate standard in Hollywood — for instance, the Hobbit films were shot with special 48 FPS cameras. This ended up making test audiences complain that the films felt “unnaturally smooth”, so most theaters showed them at the traditional 24 FPS, which is how you probably saw them unless you caught a dedicated HFR or IMAX screening. For PC gamers, 60 FPS is often the minimum optimum framerate.

60+ FPS is the desired framerate for PC gaming, especially for games that require pixel-perfect reflex timing, like serious e-sports. It will be a long while before Hollywood or consoles catch up to this.
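If you want actual numbers for these arguments, the useful conversion is from framerate to per-frame time budget — it’s just division. A minimal sketch in Python (the framerates listed are only examples, not an official standard of anything):

```python
# Per-frame time budget: at N frames per second, each frame
# gets 1000 / N milliseconds of rendering time.
def frame_budget_ms(fps: float) -> float:
    """Milliseconds available to render one frame at the given framerate."""
    return 1000.0 / fps

for fps in (24, 30, 60, 144):
    print(f"{fps:>3} FPS -> {frame_budget_ms(fps):6.2f} ms per frame")
```

So jumping from 30 to 60 FPS halves the time a game has to render each frame, which is exactly why developers trade framerate for on-screen detail.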

Here endeth the lesson.

Goddamn I miss the 1.0 Internet

So I was touring the old Space Jam website (as I occasionally do) and reliving a whole bunch of childhood, when I stumbled onto some missing bits that just dead-end.

…No. I wanted to keep going, you monsters!

And while the nostalgia is utterly lost on people who were either too young or didn’t have internet at all in the 1990s, well… web design kind of sucked by modern standards. But we made it work, and while the end results were about as pretty as the mugshot of the Frankenstein Creature, the fact that it worked at all was kind of sexy.

These were the days of Geocities, Tripod, and Angelfire. MySpace wasn’t around to be a meme yet, and above all, America Online seemed like it ruled all. Furthermore, we ALL had dial-up (only folks like Bill Gates and co. could afford broadband in those days, or at least it seemed that way), which was accompanied by the most delightful noise to ever grace your eardrums.

We didn’t have Facebook groups, we had fanlistings and fansites (most of which were hosted on the aforementioned Geocities, Tripod, or Angelfire), and you were a trailblazer if you had one. The opening of Friendster heralded the beginning of the Social Media Era, all the cool kids were using it, and neither Apple nor Google had yet taken over the effing world. Indeed, Google was barely getting started by the close of the 1.0 era.

And I guess most importantly about the Internet 1.0 era was that it was, apparently, NOT “the internet” but “the world wide web”. Oh, and everything felt like it broke every half hour.

That too.

But despite all the troubles that came with it and the frustrations of the limitations of the technology, I miss it a lot.

Not because it was in any way better than what we have now (BY NO MEANS), but because it all still felt infinite, especially to those of us who were kids at the time, and like the best was yet to come.

Now, in more ways than one, it feels sort of like we’re rapidly approaching the end of the internet.

Not literally of course. But the wild west anything-is-possible feel of the digital frontier is now very greatly diminished, and I miss when we had it in abundance.

That’s all.

Some quick thoughts on the “PS4 Pro”

Well, thank you Sony for justifying my switch to the Glorious Golden PC Gaming Master Race, with additional slumming it in Xbox Land.

The whole PS4 hardware gen has been one long, irritating disappointment for me: inconsistent levels of control over auto-updates (which have cost me money in Wi-Fi overages), network issues, PlayStation Plus price hikes, and all with nothing seriously good to show for it.

Xbox by comparison has done so much right this gen.

Boy, what a reversal of the pre-release hype and scandal this gen turned out to be.

Also: it doesn’t even do 4K natively? The shit, Sony?


For shame, Sony.

I guess you really do just have failure ingrained in your DNA now. First PCs, then TVs, then tablets, then phones, and now PlayStation. I guess there really isn’t a single failure you won’t run with scissors straight into these days.

How far you have fallen.

Premium is a bullshit descriptor

When it comes to phones anyway.

It’s kind of sad, but despite all the OEMs saying that a metal build (or even more horrific, a GLASS BACK PANEL!) feels the most “premium”, I still find myself wistfully holding my first gen LG Nexus 5 and first gen Moto X and acknowledging that these phones were the best feeling devices I’ve ever held. The first gen Nexus 5’s shape is still the coolest in my opinion: straight sides and that subtle science fiction curve at either end of the device is just classic and ageless. There’s a reason that shape is still used by so many app advertisements (the only shape used more frequently is that of the iPhone).

Granted, I don’t think that should be the shape of every Android phone, but in an age when many phones still boil down to “screen on a black or white slab”, I think we could stand to make that slab a little more eye catching.

It and the 2013 Moto X were both made of plastic, yet they’re some of the most comfortable devices to hold, and they never once felt cheap. I think that skewers the present ideal among manufacturers about what “premium devices” should feel like (lots of glass and/or metal. Gods help you if you drop it onto a sidewalk and permanently get a jagged scuff on the metal or crack the back glass panel — something I have seen happen to a LOT of iPhones over the years). Plastic takes a drop better, and the 2013 Moto X had a metal frame BENEATH said plastic for extra rigidity; mine has taken countless dives with only some minor scratches on the corners to show for it.

I think we ought to redefine what a “premium” feel means, and I think the secret to that lies in phones made in 2013.

System Exclusivity is dead, and I kind of miss it.

Some friends and I were discussing a frankly broad topic involving PC gaming and the ways that Sony’s latest marquee system has managed to fatally offend me, but eventually one of them brought up system exclusives, saying something along the lines of “it will always be a problem”.

No it won’t. It’s barely a problem now, but as it turns out, it was a problem worth having.

I actually addressed the Console Exclusive Problem-that-is-not-a-problem in my New Year’s op-ed, in which I praised the Wii U for being the only console out there actually innovating on gameplay.

As I wrote there:

System Exclusives seem to have gone the way of the dodo, which means there’s no longer any clear reason to buy one system over another: everything is available on everything, with less than five percent of major releases tied to a given system, and most of those are TIMED exclusives that will eventually land on a competing platform anyway, which generally fails to drive system sales.

Compounding the issue, the PS4 and Xbox One are, in terms of hardware, basically just low-midrange gaming PCs with custom UIs. Then, most damningly of all, not only do the systems get all the same games, but all those games feel more similar than ever before.

Console exclusives are becoming a huge rarity. It’s not like back in 2004, when you had to buy a PlayStation 2 to play one game, a GameCube to play another, or an Xbox to play a third. Look at Amazon listings at some point and you’ll see what I mean.

The PS4 and the Xbox One share the majority of their respective libraries, and virtually all of their major releases, and most of that can be found on PC as well. This is good for the consumer in the short term but bad for Sony and Microsoft in every term, as it forces them to lean on aging tentpole franchises like Halo and Uncharted that come from an era before multiplatform gaming was the default. Most system exclusives tend to be in a franchise that originated pre-2008, and the whole industry is feeling the resulting dearth of identity. In that environment, why go for either when the PC is better at almost every price point, has most of the same games plus a massive number of titles not to be found on any console, and offers things like mods and emulation?

That’s part of why I praised Nintendo for betting big on a library unlike almost any other out there. The Wii U has a phenomenal catalog of games that appears nowhere else. Furthermore, the system is so cheap that it undercuts even the best budget PC builds. Now, I’d still like to have one, but for the time being, it’s all hands on deck as I build my PC gaming monster.

Really, the only solution that makes any logical sense for most gamers in the current release climate is a cheap but powerful budget PC and whatever Nintendo’s got that year. Sure, you’ll miss out on the latest inFamous or Gears of War, but these are franchises that are seriously feeling their age and are only iterative tweaks on prior games. There aren’t many inventive or bold system exclusives left on the console side; PC, by comparison, has TONS. Sure, the Xbox One and PlayStation 4 are more indie friendly than ever. And you know what? I wholly support this. But PC was the primary bastion of these guys to begin with, and still gets the first helpings. Nintendo? They just channel that indie spirit into their own games, and it leaves the Wii U with an incredibly polished, if small, library that mixes mainstay game styles and indie-esque eccentricities in equal measure.

Microsoft and Sony are starting to notice that the homogeneity between their libraries is a problem, and I think gamers are catching on too. I mean, can anyone rightly tell me that Battlefield and Call of Duty are completely different games? Do you really want to lie to yourself that much after all this time?

Honestly, there’s only one system-exclusive franchise that still manages to hold my interest, and that’s Halo. Halo has turned into quite the lovely space opera, with one hell of a story spanning millennia that rivals anything Star Wars, Star Trek, Dune, or Mass Effect has on offer. Halo 5: Guardians was a very mixed bag, but it delivered some effective twists that kept the story interesting.

Halo has remained a worthy exclusive in my eyes, but you know what? It doesn’t sell a system like it used to. We need system sellers more than ever in a period when there are fewer than ever. Despite the unfettered access any given platform now has to gaming at large, without the drive to create a game that proves beyond the shadow of a doubt that you should buy this new Xbox or PlayStation, innovation dries up. Then the sepia shooters move in, and everything turns the color of mud.

Ironically, the game that convinced me to buy an Xbox One wasn’t Halo. It was Sunset Overdrive, and it’s a system exclusive for the Xbox One. I would not have an Xbox One today if this game had been on another platform. I’d have bought it for the thing I already had, and that would have been the end of it. I wouldn’t have given the Xbox a second glance, and given some of the technical issues I have had with the PS4 I think that would have been a travesty.

So while I still work away at my no-holds-barred PC, fittingly (and lengthily) named “Das Übermensch Build — ‘God is Dead,’ also sprach Zarathustra” (or “Nietzsche” for short), I continue to hold a fond desire in my heart that we’ll see a game SO GOOD I’d buy a whole new system just for a chance to play it.

Yeah, system exclusives used to be a problem. They were expensive buy-ins, railroading, and always felt a tad dishonest. But in a world that functionally exists without them, I can honestly say I’d rather go back to having that problem than the homogeneity among games we have today.

It’s worth bringing in this sentiment from Infocom back in the eighties, after they had been asked by the company’s new owners at Activision (yes, they’ve been screwing up your favorite developers for a while now) to design graphical multiplatform games: “A game made for every system cannot take advantage of the strengths of any of them, and must therefore cater to the lowest common hardware denominator.”

Let’s just call it “Infocom’s Law”, and it still holds up today. Assassin’s Creed: Syndicate is a gorgeous console game, but it can’t take proper advantage of the PS4’s hardware gains over the Xbox One, and the PC version is likewise hamstrung by the same.

Infocom’s Law makes the case for system exclusives in clear, observable logic, and explains just what we lost, and why, with the consumer demand that everything play on anything. By necessity, the wider you make the pool of hardware a game is meant to run on, the more the end result suffers.

If things keep going the way they are, we risk the industry becoming creatively stagnant, and then we can expect a fairly literal depression to hit the industry, not just in terms of sinking sales, but in unenthusiastic customers as well. Competition keeps the edge sharp, and Microsoft and Sony just aren’t competing correctly. It’s a duel of clones, Solid Snake vs Liquid Snake, but instead of the threat of global nuclear war, the stakes are simply an ennui-stricken industry that doesn’t see the point in trying anymore.

System exclusives were born out of a desire to produce a “killer app”: a game that would sell systems by virtue of its excellence and unseverable ties to a given platform. If you wanted to partake in the face-melting awesomeness that was Halo: Combat Evolved, well, you had to buy an Xbox. If you wanted to experience the amazingness of the original release of Devil May Cry or Devil May Cry 3: Dante’s Awakening (because screw Devil May Cry 2), sorry bruv, you needed a PlayStation 2 (Xbox fans would finally get the Devil May Cry HD Collection. A decade later.). Without those exclusives vying for your cash and system loyalty, any given system is just as good as any other. And when any system is as good as any other, both console manufacturers and consumers get punished in the long run, as consumers say “well, screw it” and just spring for the best option they can find. Increasingly, that option is a PC. With game-changing boxes like Alienware’s Alpha and the Steam Machines available for around the price of a gaming console, and with more press than PC gaming has had since the mid-1990s, if console manufacturers and game development studios can’t pull together some impressive killer apps in the near future, we could be looking at a very different gaming industry in just a few short years.

So, I hate admitting this, but… COME BACK SYSTEM EXCLUSIVES.

We need you.