Thursday, May 26, 2011

Book Review: Anathem

Very few people are going to like ALL of Neal Stephenson's books. I've read a number of them, and usually, but not always, like them (I loved Snow Crash and The Diamond Age, was ambivalent about Zodiac, disliked Cryptonomicon, and loved The System of the World trilogy).

One thing I've found is that the build-up before the action starts is pretty consistent: it's a quarter of the way in. The problem is that in a book like Anathem - a thousand-page monster - this means it's quite some time before you get to the action.

In this case, it's still worth it. The long, slow build-up does two things. First, the main character is part of a monastically-styled class in the book's world, and this long, deliberate build lets the reader get immersed in the slow, pensive pace at which the 'monks' move and live. Second, it takes the reader on a long and deep journey, so that you find yourself - like the main character - gaping in wonder at how far you've come since the start.

And that journey does go quite a ways. Picture something on the order of Dune or The Lord of the Rings: starting in The Name of the Rose, but ending up in 2001: A Space Odyssey.

The book also serves as a backdrop for commentary on what I believe is Stephenson's distaste for the growing rift between the literati and the 'illiterati' in modern society.

As I said above, it's a long and arduous read. If you can get through long and deep sci-fi reads, though, you'll find a really great adventure here.

Wednesday, May 25, 2011

Does LA Noire represent the next 3 years of AAA game innovation?

I started playing through LA Noire a few nights ago and am digging it thus far. As Red Dead Redemption taught me, given the combination of Rockstar's immense game worlds and my limited gaming time, I likely won't finish it until sometime in 2012. Still, there are a few initial thoughts I thought worth writing up.

The game doesn't have me *loving* it yet. Not in a Portal 2 (which I loved start to finish) kind of way, or even in a Red Dead Redemption (which I had 'moments of swooning' for) kind of way. That said, I think it offers some interesting hints on future direction.

LA Noire is very much a Rockstar game. Like both GTA and RDR, it offers a large 'open' world, many side-quests and missions, and in-game mini-game-style activities. The story is - so far anyway, I'm only five cases into it - more linear than GTA's or RDR's.

Still, there's no reason this game needed to be anything other than 'Grand Detective Auto', just as many people labelled RDR 'Grand Theft Horsie'. It could have been the good-guy version of GTA, with lots of cops-and-robbers shootouts in a late-1940s setting. And they'd have sold tons of it.

Instead, Rockstar chose to take a couple of risks in straying from the usual, trying to make not a cops-and-robbers game, but a detective game.

The first is that the game has an element of point-and-click adventure to it. Searching crime scenes, finding bits of evidence, writing them in your notebook, and then recalling them at the right time later to solve a puzzle - these are all reminiscent of the best of early-'90s LucasArts titles. It's more in keeping with the theme, and more distinctive in today's market versus the plethora of shooters.

The second, and bolder, risk they took was in making interrogation a core element of the game. The much-talked-about facial animation, together with what must have been a massive spend on voice and mocap acting, was needed to add a more innovative - and thus riskier - element to the game. The game asks you to interview witnesses and suspects, and then lets you decide whether to believe, doubt, or accuse them of lying. Doing so requires that you look at the evidence you've got thus far, as well as read the characters' faces and body language. In doing so, it not only bumps into the uncanny valley - it throws the player headlong into it. Is it perfect? No.

I don't like the way this guy avoids eye contact!

The interesting thought to ponder here is where Rockstar chose to invest and/or innovate.

It certainly feels like we're near the asymptote of performance that can be extracted from the current generation of consoles, and the next generation isn't coming until (depending on who you believe) 2014.

That means another three years or more of making AAA titles, during which publishers and developers need to garner interest in their titles - and they can't use screenshots alone to do it.

So, what do you do if you're trying to make money in this space? Well, for one, you focus on efficiency. Aside from that, you invest in content production (like all the voice acting in this case, and the more detailed facial models) and in gameplay innovation (like the lie-detection element, or the revisiting of point-n-click adventure).

I think this is a good indication of how others will be spending their money too. Production quality, art, and innovation in gameplay. I hope we'll see more big budget titles with experimentation along these lines over the next few years. Kudos to Rockstar for taking these risks.

Vaio S laptop

Early in the year, I said that one of the trends we'd see was what I called more 'bespoke' design in our computers & electronics: more looking like items of fine craftsmanship, less prominent branding (or - gasp - no branding at all), simplicity and elegance over LEDs & complexity.

The new Vaio S series is certainly an example in that direction. Unfortunately, it still has the logo/branding, but on the plus side, it has a brown/gold option that immediately made me think of Robin Williams' life-editing 'cutter' laptop in The Final Cut (an awesome sci-fi flick, if you haven't seen it), which I've pointed to before.

Sony Vaio S

A replica of the Final Cut laptop

Williams holding the prop from the movie.

Monday, May 23, 2011

Game Journalism in the Age of Digital Books

Last year I authored a number of posts (like this one) on the future of ebooks, and also did a few book reviews and comments on digital typography (like this one), as well as pointing at the excellent thinking on the subject by guys like Craig Mod and James Bridle.

Suffice it to say that digital books and digital reading are a very exciting area of development, one ripe for many years of innovation. I'm continually surprised at how many people look at the iPad - essentially the industry's second take on a digital reader, after the Kindle - and call it done.

It took us a few thousand years to get print into decent shape. I think we should at least give this one the decade, ok folks?

Anyhow, I was encouraged to see these areas of interest overlap in two different experiments in taking game journalism into the new age of digital print.

The first is The Final Hours of Portal 2, by Geoff Keighley. Keighley spent three years with behind-the-scenes access to Valve and the Portal 2 team, and delivers a fifteen-thousand-word ebook/application. Less a review and more a gushing fan souvenir, it's nevertheless an interesting experiment - taking the lengthy text on the game and embedding video, interactive application elements, etc., to deliver an in-depth experience any fanboy would love.

I'll post a longer review when I'm done getting through it, but regardless of any flaws, I recommend you spend the $2 to download this to get an idea of (some of) what's possible.

The second example I came across is the Kill Screen Review of Infinity Blade. Great use of interactive typography to actually convey a key element of the game - in the text layout itself, not just in the text.

Go check out both of them, as I'm sure you'll be entertained and inspired.

Saturday, May 21, 2011

Book Review: You Are Not A Gadget

I found a few of the ideas in Jaron Lanier's You Are Not a Gadget interesting and provocative. Despite that, I can't recommend the book.

The high level goal of Lanier's manifesto is to call into question the unbridled enthusiasm of the Pro-Internet, Web 2.0 movement, without taking the luddite stance that some do when taking issue with the same technology and progress. Lanier believes in technology's power, he just isn't naive enough to think that only good can come of it.

Some of the concerns he raises include whether we should consider the influence that system architectures have upon us and our culture, the cost of software & hardware design lock-in, and whether open source and open content models can innovate as well or in as many ways as traditional methods.

These are all good topics, worthy of discourse. Unfortunately, I found Lanier's screed to be a long-winded, ranty, and poorly structured attempt at doing so. Your time is better spent reading The Master Switch, or Bill Patry's book, for starters.

Saturday, May 14, 2011

Splitting the iPhone vs DS argument in two

There have been a number of posts about Nintendo vs Apple, the 3DS vs the iPhone, and "99c games vs $40 games". These posts revolve around two different issues that get muddled together and, as a result, cloud the argument. This post is an attempt to break the two arguments apart, such that we might look at each cogently.

Argument A: "Will Apple's 99 cent games destroy the market for Nintendo's $40 games?"

This is indeed a good question, and one to which the answer is less obvious than it seems.

On the 'Yes' side, one has only to look at Apple's recent numbers, the excitement around the iPhone, and the challenges some have had in trying to maintain traction around higher price points in the App Store on that platform.


On the 'No' side, there's the argument that a large development budget can deliver a deeper, larger experience. There's also the fact that Nintendo has continued to have success with some titles on the DS. And there's another version of the 'No' answer along the lines of "if you want Mario, he costs $40" - in other words, Nintendo's IP is something to consider.

The answer is probably somewhere in between. Undoubtedly the bar will be raised for what people expect to get for more than $0.99, so it will be a challenging sell.

The second argument is distinctly different from the first, though. So let us suspend disbelief for a second, and imagine a world in which all iPhone games cost $40 (or one in which all DS games cost $1 - your call, as long as argument A is put on equal footing).

Argument B goes as follows: "Will 'portable game console' join the list of devices that have been subsumed by the Smartphone?"

Once again, there are arguments for and against, but in my opinion, the arguments are more one-sided.

On the 'No' side: Nintendo continues to innovate (dual screens, stylus, stereoscopic 3D screen, etc.) to stay ahead of the capabilities of the phones (or at least different, if you want to take issue with 'ahead'). Also, there's value in the quality, curated experience a closed, vertical platform like a console provides. And once again, there's Nintendo's IP. Finally, while Nintendo doesn't appear to do this currently, as a closed platform they could choose to sell their hardware below cost as a path to customer acquisition. (Of course phone vendors do this, and there's nothing saying the same couldn't hold true for an iPod-touch-like device, provided it was sufficiently tethered to an app store or iTunes-like service.)

On the other hand - the 'Yes' side of the question - the evidence points to an eventual subsuming of the function-specific portable game console by the general-purpose Smartphone, and it doesn't look good for the portables.

When looking at other function-specific devices competing with the general-purpose device, the pile of bodies is pretty high, and growing.

The first category to go was dedicated PDAs. Never more than a niche to begin with, these devices begged for communication. It could be argued that this was less about phones integrating their functionality, and more about the devices lacking the requisite functionality to begin with.

One category of device people started talking about suffering from this phenomenon was the GPS. (See here, here, or here's a graph of a couple of top GPS vendors, where you can see the precipitous drop - even beyond that of the market due to the recession, in green - which lines up with Apple's 3GS launch.)

GPS is still alive as an electronics category, but it seems clear that the device manufacturers are suffering here and running for high ground via either added niche-value functionality or automobile OEM sales. Whether these run out of runway as well is TBD.

Another category of device with some compelling evidence is that of digital cameras. The following post discusses data from the Flickr blog on usage statistics for photos posted to the site, broken down by the camera that took them.

The first graph shows the increase in iPhone 3 and 4 uptake vs photog staples at the high end like the Canon... The SLRs hold their own, but the increase in iPhone usage is clear.

The chart further down the page shows the decrease in leading point-n-shoot camera usage in the same period. There's a pretty clear correlation here. This is a classic case of the "Christensen Effect" at work.

I don't know how this will play out. It certainly will be interesting to watch from the sidelines. Developers should be interested, certainly if they are thinking about targeting either platform.

In trying to game it out, though, it's worth distinguishing what I believe are the two key questions at play. Looking at both together, Nintendo certainly has their work cut out for them. I could speculate about what that might lead to... but that's the subject of another post.

Friday, May 6, 2011

Book Review: The Whuffie Factor

I picked up The Whuffie Factor well over a year ago, got about half-way through it, and then got distracted with other reads. I forced myself to pick it up again and finish it off. I suppose that gives some hint as to what I thought of it.

It's not a bad book, mind you. It's just that, well, let me put it this way: If you are already somewhat versed in the subject matter, if the terms whuffie or social capital don't require explanation - then you already get it. If you don't, then maybe it's useful. My father-in-law, for example, picked it up while visiting and found it interesting.

Furthermore, among those who expound on the social media scene, there's an annoying trait of that tight-knit circle: they all share the same 20 anecdotes about successes and failures (Zappos, Threadless, etc.). If you read Hugh MacLeod's stuff, you get it with a hard edge. If you read Seth Godin, you get it with some inspiration and some good analysis. In Tara Hunt's case, I found it was just the regurgitated story, without much new to add.

So, in short, if you can infer from the title what it's about, then it's probably not for you. If you know someone looking for a crash course, maybe it's for them.

The Whuffie Factor: Using the Power of Social Networks to Build Your Business

Algorithmic Vegetables

A good talk on the downside of personalized search, and what it means when we don't tell ourselves that we need to eat some vegetables before we get our dessert.

In other words, what would Edward R. Murrow have said to those writing our search algorithms, and to us using them?

Tuesday, May 3, 2011

Revolution-era pragmatists

Recently when I reviewed "The Best in Technology Writing 2010", I mentioned that Clay Shirky's piece on the radical revolution/reformation of the newspaper industry, 'Thinking the Unthinkable' was one of my favorites.

I revisited it recently, and really liked the observation made in the following passage:

Revolutions create a curious inversion of perception. In ordinary times, people who do no more than describe the world around them are seen as pragmatists, while those who imagine fabulous alternative futures are viewed as radicals. The last couple of decades haven't been ordinary, however. Inside the papers, the pragmatists were the ones simply looking out the window and noticing that the real world increasingly resembled the unthinkable scenario. These people were treated as if they were barking mad. Meanwhile the people spinning visions of popular walled gardens and enthusiastic micropayment adoption, visions unsupported by reality, were regarded not as charlatans, but saviors.

When reality is labeled unthinkable, it creates a kind of sickness in an industry. Leadership becomes faith-based, while employees who have the temerity to suggest that what seems to be happening is in fact happening are herded into Innovation Departments, where they can be ignored en masse. This shunting aside of the realists in favor of the fabulists has different effects on different industries at different times. One of the effects on the newspapers is that many of their most passionate defenders are unable, even now, to plan for a world in which the industry they knew is visibly going away.

He is writing about newspapers, but the above passage could be speaking about any large incumbent (or group thereof) in any industry undergoing radical change. I've seen it at Intel and Microsoft dozens of times, and I confess to having played the role of accuser and accused at different times.

Monday, May 2, 2011

Book Review: The Ascent of Money

Niall Ferguson's The Ascent of Money is probably best summed up as one part history, one part crash course in finance and economics, and one part treatise on the fallibility and hubris of mankind. Another way to describe it might be as a cross between The Big Short and To Engineer Is Human, stretched out over a window of two thousand years.

The book gives a crash course on the history of finance, looking in turn at the advent - and roles of - currency, credit, banking, the bond market, the stock market, insurance, housing, and of course the intersection between all of these in things like derivatives, mortgage backed securities and the like. Each is examined from birth through to current day, as well as looked at in a global context.

Through it all, Ferguson gives a history of bubbles, from the "first bubble" - in which an enterprising Scot bankrupted all of France in the early 1700s - through to the mortgage-backed mess the USA finds itself in today. Here is where I found a similarity with To Engineer Is Human, in that it is a story of how we inevitably fail to learn from history. Perhaps it's more accurate to say that when we do, time and greed inevitably erode the safeguards we put in place.

This is what the book's closing chapter discusses: whether such cycles are inevitable, and whether we do as much damage as we protect against when we mess with the cycle of creative destruction.

While finance in general can be a bit dry, Ferguson does a good job of making this entertaining. The history of finance has no shortage of colorful characters, and he introduces the reader to many of them. Even if you're not that enthused about the topic, the book will nevertheless give you an entertaining overview of it. It also serves as a reminder that the "current state of things" (i.e. the last 20 years or so, before which people tend to assume different rules applied) is by no means the way things must remain.

Finally, I'll add that the final chapter's coverage of the reasons why people tend to predict the future of financial markets poorly applies equally well to the tech industry. There are lessons to take away here that have less to do with dollars and more to do with our tendency to believe that "this time is different". If history shows us anything, it's that we repeat it.