Thursday, January 28, 2010

Nerd Word of the Week: Econopocalypse

Econopocalypse (n.) -- Slang term for a sudden and catastrophic economic calamity that results in the rise of a dystopian or even post-apocalyptic society. While the term is relatively new, econopocalyptic fiction isn't -- we just usually call it dystopian fiction and ignore its economic bent. George Orwell's 1984, Ayn Rand's Anthem, and Lois Lowry's The Giver are all arguably econopocalyptic fiction, though at least in Lowry's case the economic aspect of the dystopia -- state control of the economy -- is secondary to the state control of human consciousness. This brings up an important point: real-world economic distress often leads to a rise in the creation and consumption of dystopian or apocalyptic fiction, though rarely is that fiction directly econopocalyptic.

I bring it up because: The econopocalypse is the new zombie apocalypse, at least according to Barack Obama's State of the Union Address last night. It was interesting to juxtapose the political crossfire over how to combat the presumed jobless recovery we're staring down after the subprime crisis with Apple's fanboy-entrancing release of the iPad, a computer that is part phone, part laptop, and all status symbol. Clearly, we're not nearing a post-consumerist society, so far as St. Jobs is concerned, but then Steve wouldn't mind being the Big Brother in charge of the new media economy. Maybe that's why the iPad didn't ship with a viewer-facing camera -- too much of a tipoff that Big Steve is watching. In any case, times of economic unrest often inspire dystopian fiction, but whether we stay on the zombie track (as is indicated by AMC's Frank Darabont-helmed option for Robert Kirkman's Walking Dead series) or get a new econopocalypse-styled dystopian breed in line with Jeff Somers's The Electric Church remains to be seen. In either case, a lack of money will probably be good for the downer spec-fic business. Ironic.

Tuesday, January 26, 2010

Truly Trivial: How did Apple infamously deal with its unsold inventory of Lisa computers in the 1980s?

This trivia geek has been under the weather lately, so I'm recklessly going to the bullpen again for this week's Truly Trivial, resurrecting an old (but timely) Geek Trivia column. Apple is set to announce its new tablet (or so we've been led to believe) in the next few hours. The uber-anticipated device is supposed to revolutionize publishing, gaming, Web surfing, commerce, race relations, economic disparity, and possibly even the quantum structure of the universe -- at least according to Apple fanboys. How soon we forget that other Apple products have had similarly anticipated debuts only to fail miserably. No matter how certain you are that the Apple tablet is a can't-miss consumer device, just remember the sad tale of the Apple Lisa:
The Apple Lisa was the first commercially available stand-alone PC to employ both a graphical user interface (GUI) and a mouse. Developed under the direct supervision of Apple cofounder Steve Jobs, Apple intended the Lisa to revolutionize office computing as an all-in-one technical solution.
Beyond the GUI and the mouse, the original Lisa boasted several hardware and software features that were well ahead of their time. ... Too bad the Lisa cost almost $10,000 per unit and suffered woeful performance lags. The exorbitant amount of RAM and other high-end features made it significantly more expensive than IBM PCs and Apple's own Macintosh, which itself ran a faster, leaner GUI.
These drawbacks helped ensure that the Lisa never gained any significant market traction or adoption. After six years of frustrations and failures, Apple finally took a drastic and somewhat poignant measure to rid itself of the last 2,700 Lisa PCs it had in stock in 1989.
HOW DID APPLE UNLOAD ITS FINAL UNSOLD INVENTORY OF THE ORIGINAL LISA COMPUTER?
Read the complete Q&A here.


Thursday, January 21, 2010

Nerd Word of the Week: Uncanny Valley

Uncanny valley (n.) - A phenomenon describing the feeling of discomfort when observing characters, objects, or images that appear almost, but not quite, human. Basically, the kind of icky you feel when catching a glimpse of a weirdly almost-human robot or cartoon character -- something that's almost too human, but not quite there. The explicit term uncanny valley refers to a graph of human reaction to human-like images. As the images become more human, people become more comfortable with them. As the images approach a very near-human status, a valley appears in the reaction curve, one that abates once the images become all but indistinguishable from actual humans. It is generally assumed that the uncanny valley is an evolutionary aversion -- one that arose as a means of dissuading humans from interacting with the ill or recently dead (who often look slightly less than human). That, or it's nature's way of preparing us for the inevitable zombie apocalypse.

I bring it up because: A recent Popular Mechanics article pointed out that there is virtually no scientific basis for the uncanny valley (hat tip to io9), despite the fact that science and science fiction have been casually invoking the term since the 1970s. While there is ample anecdotal evidence of the uncanny valley -- just ask anyone who has seen The Polar Express -- almost no formal research has ever been conducted into the incidence of, or mechanisms behind, the uncanny valley. That's a shame, as computer animation, particularly of the Avatar-esque 3D variety, is going to start pushing character designs right into the supposed uncanny valley. The Japanese also can't seem to stop themselves from building more and more (creepily) human-like robots. If mainstream media and consumer electronics are going to be heading in that direction, it would be nice to know whether the public notion of the uncanny valley is mere conventional wisdom, or perhaps even erroneous pseudoscience. Somebody get the Skepchicks on this, stat!


Tuesday, January 19, 2010

Truly Trivial: When was the first 3D television broadcast, and what program was shown?

For those of us fortunate enough to attend the 2010 Consumer Electronics Show in Las Vegas, it has become glaringly apparent that every major television manufacturer is desperate to shove 3D TV down our throats -- whether the consumer likes it or not. If you're among the millions of movie-goers (or Golden Globes judges) who saw James Cameron's Avatar, you know that Hollywood has also suddenly decided that 3D is the technology that will once again get consumers lining up at cinemas rather than queuing up on BitTorrent. Piling on, ESPN and the Discovery Channel are committed to creating 3D HD television channels this year, and pretty much every major PC video game has a 3D expansion or sequel in the works (you couldn't throw a rock at CES without hitting a 3D version of Batman: Arkham Asylum).

Hope you like wearing dorky 3D glasses for several hours a day.

What's lost in all this sudden 3D hoopla is that 3D photography, motion pictures, and television have been around for decades and that, while each has enjoyed a brief spark of popularity, the public has always swung back to familiar, comfortable two-dimensional media as its preferred viewing format. Some of this has been due to limitations in technology, some of it has been due to the paucity of good 3D content, but for this author's money the problem that killed 3D in the past remains the one that neither Silicon Valley nor Tinseltown has yet solved -- nobody wants to wear 3D glasses to watch TV. (Yes, there are 3D screens that don't require glasses, but those models demand a direct viewing angle; step a few degrees left or right of center and the image blurs, which is equally if not more inconvenient.)

Stereoscopic (3D) photography dates back at least to the 1840s, with Charles Wheatstone and David Brewster inaugurating the technology. A 3D photograph of Queen Victoria was displayed at the Great Exhibition in 1851. In 1855 the Kinematoscope 3D movie camera was produced, and by 1935 3D films had started appearing in theaters. The technology didn't achieve a major commercial groundswell until the 1950s, when classic films like Bwana Devil and the original House of Wax delighted movie-going audiences. But by the 1960s the craze for 3D films had died out, partly because moviehouses couldn't afford, maintain, or properly operate the dual-projector systems required to show the films, and partly because the public fad for 3D movies had simply passed.

That same fad reached television in the 1990s, when major networks offered special 3D episodes of primetime television programs -- including a particularly memorable episode of 3rd Rock from the Sun, "Nightmare on Dick Street" -- but again the craze died out by the end of the decade. This time, the passing couldn't be laid at the feet of the technology, as 3D TV was proven feasible over forty years earlier.

So, when was the first 3D television broadcast, and what program was shown?

Monday, January 18, 2010

The Three Commandments of Web Site Feature Development

Below are the three hard lessons I garnered from years as a product lead for CNET. They're intended for anyone designing a Web site that does more than just serve content, to caution you against investing time, money, and code in features that won't bring in an audience or a return on investment.

Lesson #1: The 90-9-1 Rule - The rule breaks down like this: 90 percent of the readers of any blog or Web site will never leave a comment, nine percent will comment once during their entire tenure of readership, and one percent will do the vast majority of the commenting. If you think that sounds unreasonable, consider any of the radio call-in shows you may have listened to in your life. How many calls do those shows get, as a percentage of their total audience? How many times have you called into those shows? Yet there are regular callers; they're just a tiny percentage of the overall audience. I've found the 90-9-1 rule to be strikingly true online and, if anything, optimistic in its estimate of the percentage of the audience who are regular posters. And that's just for a very basic interactive activity like posting comments. The percentages drop steeply as you get into interactive functions that require more time and effort, like filling out profiles or writing reviews. I took a stab at setting interactivity expectations here, based on what we learned at TechRepublic with some flamingly unsuccessful blog, profile, and social bookmarking projects. I can say with a straight face that getting a decent user-submitted video is literally a one-in-a-million proposition.
Moral: Design for the 90 percent, not the one percent, if you want to actually see an increase in activity, visits, and traffic. Focusing on the power users will almost never move the needle, especially since your usage zealots are already doing almost everything they can or will on your site.
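The arithmetic behind the rule is trivial but sobering when you run real numbers through it. Here's a minimal sketch; the 90/9/1 split comes from the rule itself, while the million-visitor audience is a hypothetical figure, not CNET data:

```python
# Sketch of the 90-9-1 rule applied to a hypothetical monthly audience.
# The 90/9/1 percentages are the rule of thumb; the audience size is
# purely illustrative.

def split_audience(monthly_visitors):
    """Split an audience into lurkers, one-time commenters, and regulars
    per the 90-9-1 rule of thumb."""
    lurkers = int(monthly_visitors * 0.90)     # never comment
    one_timers = int(monthly_visitors * 0.09)  # comment once, ever
    regulars = int(monthly_visitors * 0.01)    # do most of the commenting
    return lurkers, one_timers, regulars

lurkers, one_timers, regulars = split_audience(1_000_000)
print(lurkers, one_timers, regulars)  # 900000 90000 10000
```

Even a million-visitor site, by this math, is betting its "community" features on roughly ten thousand regulars -- and far fewer for anything more demanding than a comment box.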

Lesson #2: Design for an Audience of One - Flickr and YouTube get a lot of hype for how 99.999 percent of their content was acquired for free from users, and how those users employ tagging and groups to create wonderful emergent communities, content, and traffic bursts. What people don't talk about is that those are side effects of YouTube and Flickr's business models and use cases. The vast majority of YouTube users don't give a crap about viral videos or monetizing video content; they just want an easy way to format videos and post them online. Almost all Flickr users don't care about aggregated group feeds or discovering like-minded photogs via tags; they just want an easy way to post and store pictures online. Flickr and YouTube have value to me even if I'm the only guy using them. All those group-dependent features are a result of Flickr and YouTube's scale. You can't start with those features; you tack them on once you're massive.
Moral: Any feature spec that includes the phrase "will be useful once a bunch of people join in" will almost certainly fail because there is no value for the initial users.

Lesson #3: More of the Same is the Only Feature That Matters - Nobody bookmarks a page anymore. They either search Google directly for what they want (and don't care where it comes from) or get it sent to them in an RSS feed from whichever sources they prefer. Again, your zealous users are a tiny percentage of your audience, so any efforts made to be the be-all, end-all of your audience's activity are likely to fail because most of your users aren't devoted to your site and prefer to go elsewhere. That upvoting feature that works just like Digg? They'll use actual Digg instead. Your blog platform? If they wanted to blog, they'd use LiveJournal, Blogger, or any of the other services out there first. A user profile? Maybe you've heard of Facebook. Until you reach the stratospheric heights of traffic, there's no point in trying to create new user behaviors on your site. The best thing you can do is reinforce the existing user behaviors. If they came to you for content, the best thing you can do is show them more content. If they came to you looking for help making a buying decision, help them make a buying decision.
Moral: Any feature spec that includes the phrase "if we can just get the users to do X" will fail, because if they aren't doing X already, they aren't likely to start.


Thursday, January 14, 2010

Nerd Word of the Week: Slacktivism

Slacktivism (n.) -- Pejorative slang term for feel-good measures that have little hope of actually aiding the cause or individuals they claim to support. Particularly used in cases where the "activism" is limited to online activity with little or no investment or sacrifice required of the supporter, such as signing an online petition, joining a Facebook group, or modifying your online avatar in some fashion.

I bring it up because: This week saw two events that brought the slacktivists out in droves: the earthquake in Haiti and Google's threatened withdrawal from China. While many online actions have actually done some good -- such as entreaties to text certain codes that will send a donation to Haiti relief efforts through your cell phone bill -- far too much of the "response" has been limited to Twitter hashtags. How much did turning your avatar green really help the election protesters in Tehran last year? How much did revealing the color of your bra on Facebook really help breast cancer awareness or research? True protest, and true activism, means upsetting the status quo, but by integrating our support into our regular routine of online activities it becomes just more trend-follower noise rather than actual change. Something to think about as we rage against the machine online -- if our words aren't backed up by substantive actions, we're just another part of the machine we rage against.

Tuesday, January 12, 2010

Truly Trivial: What billion-dollar movie franchise did James Cameron NOT create, but want a writing credit for?

So James Cameron's Avatar is on its way to becoming the highest grossing movie in cinema history, displacing the previous record-holder of 12 years, James Cameron's Titanic. That Cameron guy sure has a knack at the box office; factoring in Terminator 2 and True Lies, Cameron's last four major motion pictures have raked in a little over $4 billion -- and that's before you count ancillary merchandising, home video, and television rebroadcast profits. It also ignores the added tally of the original Terminator, The Abyss, and Aliens, all of which were arguably superior movies -- aesthetically speaking, if not financially -- to Cameron's more contemporary cash cows.

What's funny about Cameron's success is that, for all his ability to push the visual envelope and expertly depict even the most pedestrian of storylines (*COUGH*Titanic*COUGH*Avatar*COUGHCOUGH*), he's begun to develop a rep as the guy who steals all his ideas. The most famous example is the Terminator franchise, which owes great steaming piles of mea culpa to one Harlan Ellison, who just happens to be one of the most iconic sci-fi scribes of the 20th century. The Terminator shared a number of explicit plot points and story ideas with a couple of classic Outer Limits episodes written by Ellison: "Demon With a Glass Hand" and "Soldier".

Now, Ellison is infamous for being both contentious and litigious -- he earned as much notoriety for quarrelling with Gene Roddenberry, for whom he wrote arguably the greatest original Star Trek episode ever, "The City on the Edge of Forever", as he did for his actual writing -- so it's little wonder that Cameron found himself taking some heat from Ellison over The Terminator. What's surprising is that Ellison's case was strong enough that Cameron caved, and the Terminator film and franchise now appear with the following phrase in their credits: "Acknowledgment to the works of Harlan Ellison."

If the Ellison affair was an isolated incident, we'd be apt to let it go and probably even chalk it up to Ellison being easier to buy off than fight off. But there have been questions raised about the unacknowledged inspiration for Avatar, too. No, we're not talking about Dances With Wolves, though that parallel is explicit. Poul Anderson's novella Call Me Joe follows the story of a paraplegic who connects his mind to a genetically engineered lifeform to explore a harsh planet and then ends up going native. Sound familiar?

Here's where it gets really funny. There's another billion-dollar movie franchise that Cameron wanted to direct but for which he couldn't secure the legal rights. For once, lack of ownership actually stopped Cameron from making the movie, even if it didn't stop him from writing a script treatment for the property. Moreover, when the movie finally got made -- and became an international phenomenon -- there were some very vague similarities between Cameron's script treatment and the finished product, so much so that Cameron felt "slighted" that his previous work wasn't acknowledged. Ironic, isn't it? So, you gotta ask:


What billion-dollar movie franchise did James Cameron NOT create, but want a writing credit for?

Thursday, January 07, 2010

Nerd Word of the Week: Blobject

Blobject (n.) - A portmanteau of blob and object, a blobject is a household item or device distinguished by its smooth, rounded, almost seamless design. The iPod is a classic blobject, and its popularity has radically popularized the blobject design ethos. Blobjects owe their existence largely to computer-aided design and manufacturing, and you can see early inklings of the aesthetic's association with futurism in early 1970s sci-fi television and movies, where the smooth "plastic fantastic" designs of Logan's Run and its ilk took hold. This aesthetic was mainstreamed, arguably, by Star Trek: The Next Generation, where rounded edges and buttonless interfaces were the norm. Everything was seamless, plastic, and disposable. In some ways, the steampunk movement arose as a repudiation of the blobjectivism of mainstream design, with the individualized, customized, constantly-tinkered-with and constantly maintained bulk and clatter of steampunk tech (and its associated DIY culture) rejecting the upgrade-every-year trendiness and assumed vacuousness of blobject ownership.

I bring it up because: I am presently at the Consumer Electronics Show, and though I wrote this entry before I left (Planning!) I fully expect CES to be dominated both by already-known blobjects (Google's Nexus One) and speculation about possible future blobjects (Apple's iSlate tablet). It's just one more step towards an ability to instantly manufacture anything we can mock up in a CAD program -- hello, 3D printing, which is already scheduled to be demo'd at this year's CES -- which is itself another increment on our journey towards Ray Kurzweil and Vernor Vinge's predicted techno-singularity. Just so long as the future has Wi-Fi, I'm cool.

Tuesday, January 05, 2010

Truly Trivial: What sci-fi novel was the Xbox 360 dev team required to read?

I'm off to the Consumer Electronics Show this week, so I'm shamelessly and indefensibly shirking my trivia responsibilities yet again. To fill the void, here's some tech-toy themed minutia from my old Geek Trivia columns:
While Bill Gates may have a personal wealth that dwarfs the gross national product of many third-world countries, and Microsoft boasts a cash flow that would make some state revenue cabinets envious, jumping headlong into the multibillion-dollar gaming hardware market was still quite a daring leap for a software company. The man who convinced Gates and, perhaps more important, Steve Ballmer to get in the game, so to speak, was Xbox development chief J Allard. ...
Allard drew his inspiration for the Xbox 360 not just from more traditional sources of product development and market research, but also from science fiction—including a noted sci-fi novel that Allard made required reading for his entire Xbox 360 development team.
WHAT SCIENCE-FICTION NOVEL DID XBOX 360 DEVELOPMENT CHIEF J ALLARD REQUIRE HIS TEAM MEMBERS TO READ?
Get the complete Q&A here.