Monthly Archives: June 2012

Why I’m not buying Diablo III, or possibly any other Blizzard game again

If you’re an Android Police reader, you may not know it, but I’m a fairly seasoned gamer. From the time I was 5, I partook in every gaming medium I could get my hands on, spending hours with Sonic and early iterations of Madden on my Sega Genesis, Nintendo 64 with Mario Kart, Zelda, and Star Fox, the odd year or two with GameCube, and even a brief stint with Xbox. And yes, there were some dark Pokémon Game Boy years in elementary school thrown in there. During most of that time, I was also a devout PC gamer.

My first real addiction on the PC was probably Duke Nukem (I was a bit late to the game for Wolfenstein 3D / Quake), but I quickly graduated to a wide variety of titles as PC gaming entered its “golden age.” Half-Life (plus Counter-Strike, Day of Defeat, etc.), Age of Empires 2, Diablo I and II (the latter having been quite an addiction), Medal of Honor, Call of Duty 1 and 2 (the first time I got into true competitive gaming), Battlefield 1942 (how I miss that game), Warcraft III, Company of Heroes, and finally World of Warcraft – the crescendo of my time as a gamer.

Dozens of other titles could be added to that list as fleeting interests (Rome: Total War) or poor purchase decisions (ugh, S.T.A.L.K.E.R.: Shadow of Chernobyl). But in the last few years, my relationship with PC gaming has dwindled into much more of an off-and-on fling.

While I long ago admitted that I am no match for the young, raised-with-a-Sixaxis-in-hand FPS junkies, I do still dabble in modern titles when I’m in the mood. Borderlands was a magnificent, if sometimes tedious, piece of art. Fallout 3 was The Elder Scrolls with guns – and actual entertainment value. Aside from the hyper-modern FPSs such as whatever the newest Call of Duty / Crysis / Battlefield DLC-fest is, I like to think I’ve kept up with what’s going on in the industry. On the occasion I visit my family, I’ll even pick up the Xbox gauntlet and dick around with Forza.

So, when Diablo III’s release loomed imminent, I took interest. Diablo II consumed two solid years of my mid-teens, and I loved almost every minute of it (aside from the acne and social ostracization). I remember with fondness the longing I felt for particular “dupes” (rare items duplicated using exploits before Blizzard fixed those glitches), and all the Perfect Skulls I traded for my Amazon’s ultra-rare sapphired Raven Cry (or something like that) bow on the US West server. That was the joy of Diablo II: a logical, if imperfect, proto-MMO economy added infinite replay value to the game. At least until the Lord of Destruction expansion pack, which promptly turned it on its head.

Anyway, recently I began to read reviews of Diablo III, the server-snafu-panic having finally died down to some degree, and I found myself confused. The game is clearly massively popular, and yet literally every major outlet review I read (most of them being very positive) had negative comments posted seemingly into perpetuity. There were more upbeat ones sprinkled throughout, but 90% of the feedback was what I’d call “burning hatred.” While some were still complaining about DRM and technical issues, the vast majority were substantive complaints. Complaints that were surprisingly absent from critics’ reviews.

For a casual gamer, a review from an outlet like Gamespot may be useful, but it’s clear that real “hardcore gamers” were almost universally disappointed with the methodology reviewers used to evaluate the game. I was honestly taken aback – with a MetaScore of 8.9, Diablo III is absolutely slam-dunking it with the media. And yet an extremely vocal group of players have basically exposed gaping flaws in the game.

The auction house system has essentially neutered the loot system that was so loved in Diablo II. In order to advance through the most difficult portions of the game, you are essentially forced to buy items from the auction house. There really is no other way, unless you literally have no life other than sitting on your ass playing Diablo III for 14 hours a day, 5 days a week. The “real money” auction house is also luring those with deep pockets, creating a class warfare of sorts between the “hardcore” base that has traditionally been so key to Diablo’s success and the impulsive, free-spending “casual” gamers that publishers are increasingly trying to target.

The game itself has also apparently been “dumbed down,” requiring less careful customization of characters (if any), and is far more forgiving in regard to re-tuning skillsets with basically no penalty.

In my mind, this goes against my core belief about online games: they should be fun for everyone, regardless of effort or wealth (as far as is practically possible), and they should reward experience and strategy in such a way that the benefits of “cheating the system” are marginalized. Should people be playing video games 14 hours a day? Or 50 hours a week? Probably not – but Blizzard knows that it’s those people who are out there proselytizing the virtues of their games to their friends, families, and random people on the internet. When you upset them, you start risking becoming irrelevant.

Some of these people have claimed that Blizzard has “World of Warcraft-ed” the Diablo franchise, but I think that couldn’t be further from the truth. In World of Warcraft, Blizzard carefully built a balanced economy and actively cracks down on exploiters (though gray market currency is readily available). It also made the game such that the most desirable items (at least when I played last) were only attainable through extremely difficult, group-based dungeons or insane acts of devotion and dedication. The inability to transfer many of those items made this economy much more stable, and kept the game fun. And Blizzard did this on the condition that you gave them around $150 a year – issuing patches regularly, adding content, providing good customer service, and keeping the hardest of the hardcore entertained, while making sure the game was still accessible to beginners. World of Warcraft will probably be remembered as Blizzard’s crowning achievement.

Diablo III has seen World of Warcraft’s auction economy bastardized and made central to end-game activity, which is just absurd. Adding real money into the equation just sullies the reputation of the company (that, until recently, could do no wrong in the eyes of its fans) even more. Blizzard is now treading a dangerous line: appealing to the casual gamers of today at the expense of its most loyal fanbase tomorrow. It’s a risk many game developers are taking, and it puts them at the mercy of a massive market whose whims can alter in an instant.

Perhaps Diablo is destined to become Blizzard’s “casual franchise” from here on out, while Warcraft will remain free of such taint. But I’m not getting my hopes up, and I’m certainly not buying Diablo III.

In fact, I’ve already pre-ordered Torchlight II.

A (really, really late) review of John Carter

In a world where we’re constantly being bombarded with witty, character-driven sci-fi, superhero, and fantasy films, John Carter stands alone.

It’s not particularly clever. I think I chuckled once throughout the entire film, and it was fairly near the beginning. The dialogue isn’t especially thought-provoking – it’s actually quite simple. You have to remember, this movie was made for the whole “PG-13” family, something Disney absolutely nailed with the Pirates of the Caribbean franchise (well, until they beat it into the ground with unending sequels).

John Carter also eschews the modern trend of an intensely character-driven plot. John Carter himself is a moderately interesting and likable rogue with a dark backstory that Disney’s morality assurance department allows the film only to hint at, and this will almost certainly leave those expecting a darker, grittier protagonist disappointed. My advice? Get over it.

Unlike almost every other film in this genre (or genres, I should say) presently, John Carter’s source material has no real (living) “cult following.” It’s based on a serialized novel published in 1917, and while the book, A Princess of Mars, certainly has its place in sci-fi / fantasy canon, most people today don’t know it. In fact, the author Edgar Rice Burroughs’ crowning achievement is Tarzan, leaving little room in modern memory for an earlier, less successful work. While many writers and directors claim the series as a source of inspiration, its fame has undoubtedly dwindled in recent decades. But this is part of what makes it great, in my mind: I had no idea just what I was getting into when I began watching it.

This also, unfortunately, was likely a big part of the reason for the film’s landmark commercial failure. In recent years, America has become obsessed with shamelessly nostalgic rehashes of well-known stories and characters. Whether it be long-standing comic superheroes (movies I am all too happy to see), currently successful book franchises (which I am less happy to see, though Harry Potter grew on me), or questionable TV-sourced throwbacks (The Smurfs, Alvin and the Chipmunks), today it seems all we care about in popular film is what we liked as kids.

That means that to attract an audience to a film in the sci-fi / fantasy genres, you either need to appeal to their sense of nostalgia or to current pop culture obsessions. It’s an annoying and frustrating limitation. One that has, of course, produced great films. Under the guidance of Christopher Nolan, the Batman franchise is having its best, grittiest years. Iron Man, a hero that could barely be sold as a cartoon, has been absolutely brought to life by the charismatic Robert Downey Jr. and a flotilla of clever writers. Lord of the Rings. Harry Potter. X-Men. Need I say more?

For a film with a less recognized brand to prosper, it seems to have been accepted at this point that it must be one of two things: cheap and deep, or flashy and stupidly simple. V for Vendetta, which was critically lauded, had a budget of just $54 million – compared to John Carter’s monumental $350 million. Just think about that – for the cost of John Carter, they could have made V for Vendetta 6 times. With money to spare for a few luxury yachts.

A big budget sci-fi / fantasy film means big expectations. And without a recognized “brand” to back it up, it has to have mass appeal (see: Avatar) and an expensive hype campaign (again, see: Avatar) behind it. Even if the movie isn’t that good (… Avatar).

John Carter has neither. It’s a story about a guy on Mars, wearing a leather, almost bondage-like apparatus on his chest, surrounded by aliens – and nobody had any idea why they would want to see this movie. It was shitty marketing, end of story. That’s what really killed this movie commercially, in my opinion.

Anyway, back on point: the movie. As I said, the main character, John Carter, isn’t particularly deep or developed. Nor is any other character in the film, for that matter. But that’s because this film is about telling a story (it goes for the decreasingly popular “narrative wrap” format). The story is full of anachronistic symbolism and metaphors (the Civil War and turn-of-the-century Native American relations are rather indelicately shoved in your face), but they’re so dated that they don’t feel cheesy or preachy.

John Carter is also rife with character condensing, as is a necessity with book adaptations. It’s most notable when it comes to certain characters essential to the plot, but who have less need for screen time. The primary Martian antagonist, whose name is so forgettable I have, in fact, forgotten it, is utterly featureless. The true alien villain (spoiler alert), whose character goes unnamed (he’s played by the bad guy from the first Sherlock Holmes), is full of ominous one-liners and shadowy intent. It’s enough to make you curious about what’s going to happen next, but he’s there as a mechanism for advancing the plot more than anything (and to set up a sequel).

The thing is, though, I didn’t care. The story is simple, meanders only occasionally, and takes you on a journey through one of the first truly original-looking sci-fi / fantasy worlds I’ve seen in ages. It really is an amazing thing they’ve crafted, worthy of the Disney name adorning this film. Costumes, environments, aliens, ships – it all looks amazing – and original.

The film really doesn’t evoke much in terms of emotion, but that’s to be expected for something based on a sci-fantasy novel written at the turn of the century – it just wasn’t the style of the time. But the conclusion of John Carter will, without a doubt, shamelessly pull at your heartstrings and leave you yearning for a sequel (which has been announced – though funding is a big question mark).

Part of me wants to see John Carter live to fight another day and round out Burroughs’ first three novels in the series. I crave to know what happens next.

But another part, perhaps just as big, wants it to end here, and leave this great film to stand on its own so that, years from now, hopefully, it will be rediscovered and cemented as the classic it deserves to be.

AT&T wants to make unlimited access deals with companies like Netflix, people bitch anyway

People hate wireless carriers in the United States. With a fiery passion. In fact, they’re the least-liked industry in the country, followed closely by big oil and cable/satellite providers. Considering how much money we give them, it’s not hard to understand why. Tiered data, for example, has absolutely enraged a lot of people.

AT&T has heard from many of its customers that tiered data basically makes it prohibitively expensive to do things like stream Netflix, YouTube, Spotify, etc. on a regular basis. And now it’s hearing from some of those companies, like Netflix, that they’d like this to change. So, AT&T wants to have companies like Netflix pay it in exchange for providing unlimited access to those services through its network.

Predictably, a great many people have dragged out the “net neutrality” flag and started waving it frantically.

This isn’t a net neutrality issue in any technical sense – none whatsoever. The carrier is not prioritizing, blocking, throttling, or otherwise physically impeding your access to specific content. Sorry. Like many things in America, it’s an “I don’t want to pay more money for something I don’t think I should have to” issue.

If Netflix strikes a deal with AT&T, and decides it wants you to pay extra for mobile streaming on your phone, that’s Netflix’s choice – at that point it has little to do with AT&T, except in the sense that AT&T is financially incentivizing Netflix to do it. Spotify already charges a premium for mobile access. Hell, Amazon requires you to buy a separate piece of hardware to stream Instant Video on a mobile device.

I think the scenario we’ll see unfold is pretty simple. These “value-added” bonuses on carriers will become incentives to subscribe, while services like Netflix, Hulu, and Spotify will start to specifically monetize and tier mobile subscriptions / access. Carriers will just be forced to compete for your business on a new level, and subscription-based services will want you to pay money to access them from your phone. We’re moving to a service-oriented world on the mobile web, and honestly, I don’t see a problem with it.

This “it’s the 1990s internet all over again” garbage that keeps getting spewed is becoming tedious. This has nothing to do with the “open internet,” or the right to access information – it has to do with people doing what they always do: bitching about the prospect of paying more money for something they want.

Call me when AT&T starts redirecting you from Wikipedia to Bing, then we’ll talk about “net neutrality.” Until then, this is business as usual. Which is to say, business.

Musk and Dragons

It’s somewhat amazing to me that we’ve witnessed the first private spacecraft to go into orbit, dock with an orbital facility, and return successfully to Earth.

Sure, the US, Russia, and other national governments have managed this. They’ve been doing it for decades. And yes, NASA (and thus the US government) has provided around $500 million to make this a reality – but NASA didn’t design the rocket, or the capsule (even if many of its former engineers did). This is a completely new vehicle and propulsion system – designed from the ground up.

And SpaceX did it in 10 years. That’s astounding. From founding to full demonstration (minus human occupants), they did in a single decade what has taken countries like China many times longer than that to develop, and with a much better, more versatile end product.

Say what you will about Elon Musk, about the demise of the shuttle program – this is the future. I can think of no better use for NASA’s budget than seed funding and cargo contracts for an eccentric billionaire who wants to retire on Mars.

A Note On Audio Equalizers, Sound, And Android

H’ok – I’m going to put on my “sound snob” hat for a minute here, because I feel like a lot of people in the Android world are becoming obsessed with EQ’ing (equalizing) their phone and tablet audio, and that some of them are misinformed about what EQ is, its benefits, and what it does to sound.

Equalizing your audio is just that – balancing it out. In a perfect world, where your house was made of egg crate foam and you had access to thousand-dollar-plus speakers, EQ would never be necessary. Everything would pretty much sound right without any kind of adjustment (assuming your media sources are good, as well).

Ever notice how your 5.1/7.1 surround setup in a large room makes it impossible to hear voices during movies unless you absolutely crank the volume? That’s what equalizing (or dynamic volume normalization) is for – it balances the speaker output to the space you’re in. If you had a $10,000 speaker layout, with speakers perfectly spaced and angled, you wouldn’t need EQ or normalization. But most of us don’t – so that’s why your stereo receiver has those awesome EQ settings – to adjust for the failings of the space, or for the inadequacies of the equipment given the space it’s in. You can also just use it for fun, or for various occasions (e.g., more bass for music at a party).

Analog EQ was originally used for recording and live music / performances to make up for the differences and deficiencies of various environments (e.g., amphitheaters, stadiums) and quickly made its way into consumer hi-fi equipment. You can still buy separate, basic graphic analog EQ hardware for your home stereo today.

But what about headphones?

This is where many people are applying EQ today, and it’s largely unnecessary.

First, EQ’ing should almost never be necessary in a headphone application. To say an equalizer is “required” for listening to music on your headphones is like saying you need to adjust your television’s color settings every time you change the channel. If you had a really horrible TV that absolutely could not display CNN (lots of red) after being adjusted for FOX News (lots of blue – funny, right?), there might be an exception.

That is to say, if you’re using $5-15 earbuds (or a phone/tablet with a really horrible headphone amp) then yes, EQ can enhance your listening experience when adjusted for particular genres, because your listening equipment is so bad that it’s probably distorting the crap out of whatever you’re listening to.

The other scenario is something like Pandora set to low quality but very high volume – think of it like a channel on an aerial that comes in ever-so-slightly fuzzy on a big 48″ CRT TV. By adjusting the sharpness, you could reduce the visibility of the interference. It was still there, and in no way was the actual interference lessened; your eyes just noticed it less. Alternatively, you could sit further away from the TV. Adjusting the EQ settings when listening to a really low quality audio file can trick your ears into believing it sounds more like it’s “supposed to.” But it’s still going to sound like crap. You can also, of course, just turn down the volume and achieve a similar effect, with the drawback of quieter sound.

What EQ is not is a set of sound profiles that make your music sound “better.” Setting your EQ to “rock” when you listen to rock does not make the music sound “better.” It adjusts the levels such that, for the “genre” of rock according to whoever wrote the preset, your audio will place an increased emphasis on the frequency ranges that are most important on rock tracks, and de-emphasize, or leave neutral, those that are not.

How can a “rock” EQ preset work for AC/DC, U2, and The Beatles? It can’t. It has no fucking clue what you’re playing. In fact, The Beatles sound awful on most rock EQ settings, because the EQ preset is for much more modern (and loud) rock. No song exists that is perfectly shaped to that preset, because it’s modeled on averages.
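That blindness is easy to demonstrate in code. Here’s a minimal sketch of a graphic-EQ preset as a fixed table of per-band gains – the band centers and dB values are made up for illustration, not taken from any real player:

```python
# Hypothetical "rock" preset: a fixed gain (in dB) for each frequency
# band (in Hz). These numbers are illustrative, not from a real product.
ROCK_PRESET = {60: 4.0, 230: 2.0, 910: 0.0, 3600: 1.5, 14000: 3.0}

def db_to_linear(db):
    """Convert a decibel gain to a linear amplitude multiplier."""
    return 10 ** (db / 20.0)

def apply_preset(band_levels, preset):
    """Scale each band's measured level by the preset's fixed gain.

    Note that the preset never inspects the music itself -- AC/DC and
    The Beatles get exactly the same multipliers.
    """
    return {hz: level * db_to_linear(preset[hz])
            for hz, level in band_levels.items()}

# A perfectly flat input (all bands at 1.0) comes out shaped exactly
# like the preset curve -- the "shape" is the preset, not the song.
flat = {hz: 1.0 for hz in ROCK_PRESET}
shaped = apply_preset(flat, ROCK_PRESET)
```

The point: whatever track you feed through it, the same multipliers apply. The preset is a static curve modeled on an average, which is why it fits no individual song perfectly.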

Custom-setting your EQ to your headphones’ or device’s particular strengths and weaknesses should involve minute adjustments, if any, unless you’re using the garbage ones that came with said device, or something like Beats that over-amplify low frequencies.

The best way to improve the sound of your music? Buy better headphones (or speakers), and use local audio files of at least 192kbps (even 128 is probably fine, blind tests have shown). You can’t make a 12″ CRT TV from 1975 give the color of a 42″ OLED, and you can’t make $5 earbuds sound like a $100 pair of Klipsch, Shure, or Etymotic.

Otherwise, you’re just doing the audio equivalent of adjusting RGB / brightness / contrast levels – a custom arrangement or preset may look better to your eyes, but the same data is coming through regardless of what you change; you just see it differently. Equalization does not enhance sound – it shapes it.

Google’s Google+ “problem” isn’t a problem, even if Googlers say it is

This is a link to an article by a former Google employee:

In it, he has gained notoriety specifically for his derision of Google+ as Google’s new focus, something other Google employees, current and former, have also complained about:

“I think Google+ is an effort that does not deserve the engineering minds at Google. This is mostly a personal bias. I see Google as solving legitimately difficult technological problems, not doing stupid things like cloning Facebook. Google, in my opinion, lost sight of what was important when they went down this rabbit hole.”

I’ll say I was among the first to make fun of Google+ when it came out – and I still think it has a long way to go. But it has also come a long way in a very short time.

The more I listen to Larry Page talk about the future of Google, of the web, and of technology, the more I think those who criticize the company just hate the idea that the web is becoming socially driven, instead of query/tool-driven.

It makes sense – Google has traditionally focused so hard on those products, and now the people working on them (and being made to integrate Google+ with them) feel like they’re being pulled into a project they don’t care about, or don’t want to have to think about.

But every single day the internet becomes more about the people and organizations you know and follow directing you to information you didn’t know you wanted. Social is a legitimate, technically difficult problem – in the sense that making it into something truly useful for all aspects of our use of the web is something no one has accomplished. Social is this giant mass of information about people and the things they do, places they go, their interactions, their friends – leveraging this information is the next “quantum leap” – it’s Web 3.0.

And that’s what Google is doing with Google+. Facebook will never share the user information it has with Google, so Google had to come up with something itself. Google needs this information to push its existing products to the next generation. It’s the exact opposite of “losing sight of what’s important.” Google+ is the means – not the end. Google doesn’t want to be Facebook, it just wants to have the information Facebook does, because that information is unique, and it is invaluable to advancing the usefulness of Google’s core products.

Google, as it always has, is looking to the future. It seems this guy is stuck in a Google that was looking to a future that happens to be our present.

Warm Leftovers

I’m David. I write for Android Police. This is my personal blog, so everything you read here probably wasn’t interesting or relevant enough to post there, but has been stewing around in my brain enough to want to commit it to text. Thrilling, I know.

This is the welcome post. I’m not sure if I’ll continue using this blog in the long-term, but hey, it’s Friday morning, news is slow, and I’m bored. So, join me for some warm leftovers. God, that tagline is horrible. I’ll never use it again, I just felt obligated to plug it for the first post.