Space Ramblings

Tag Archives: Tech

The Adventures of Tintin movie review, Uncanny Valley and the Limitations of CG Technology

The Adventures of Tintin has all the passion and visual ingenuity missing from Spielberg’s Indiana Jones and the Kingdom of the Crystal Skull. It also has the enthusiasm that has been missing from Spielberg films for too long. Unfortunately it’s also a paint-by-numbers cartoon, and while its combination of motion capture technology and visual style avoids the uncanny valley, the motion capture can’t invest the figures with a soul.

Indiana Jones didn’t work because cliched pulp stories were such a brilliant concept, and it didn’t work on Spielberg’s skills alone. It’s the actors who bring a story to life, and while The Adventures of Tintin manages to invest Captain Haddock with human characteristics, all the technology can’t make him anything more than a one-note character stumbling from one punchline into the next.

The Adventures of Tintin is visually frantic because its characters are so stiff and the world is so flat. Snowy, Tintin and Haddock are constantly rushing and stumbling and flying over things in Bugs Bunny style, but they are Bugs Bunny, one-note characters pretending to be human in a story that was a cliche even when it was written.

The Adventures of Tintin is a passable movie, but it works best for those who want to see the characters from the comics come to life. Remaining audiences see Spielberg doing what Zemeckis did, being seduced into believing that the power of complete control over an environment means unlimited creativity. The Adventures of Tintin is more polished than most of Zemeckis’ efforts; only during the concert scene does the CG look truly tacky. But all that effort is still wasted.

3D cartoons work best when the characters are drawn simply and cartoonishly. The Adventures of Tintin teeters between photorealistic fidelity and the simple lines of a cartoon. Its opening gimmick echoes the one from Team America: World Police, but without the sense of humor. The technology being shown off is impressive, but it never manages to lift The Adventures of Tintin beyond its limitations.

Unlike Zemeckis’ efforts, Tintin’s failure can’t be blamed on unready technology. Tintin is as ready as the technology will ever be. Its characters are a world away from the nightmarish wooden puppets of The Polar Express. But they’re not people and they never will be. The uncanny valley doesn’t make them creepy, just limited.

Cartoonists have always known that simple lines can capture more depth than detail. It’s something even DC and Marvel know, which is why every issue doesn’t look like an Alex Ross painting inside. The comics that do try it, going for photorealistic painted panels, combine badly with a fast-moving story.

The real loss here is that Robert Zemeckis and Steven Spielberg have wasted their time trying to create their ideal movie in 3D CG, instead of making it the way they used to.

From 1980 to 2000, Zemeckis made the Back to the Future movies, Who Framed Roger Rabbit, Death Becomes Her, Cast Away and Forrest Gump. Since then he has made a string of disposable CG movies like The Polar Express, Beowulf and A Christmas Carol. Coming up next is Yellow Submarine.

The loss of Spielberg as an exciting director can’t be completely blamed on CG, but the difference between the first three Indiana Jones movies and the last one is the difference between a director who went places to tell a story and one who went behind the green screen. Tintin isn’t lazy, but that’s because the artists are doing most of the work. It’s visually ingenious, technologically innovative and hollow. There’s action without momentum and visuals without impact. It’s clever without being alive.

Do We Even Need One Steve Jobs Movie?

It’s a reasonable question, because how many hours can you really spend on a guy screaming at his subordinates about getting exactly the right shade of white? But there are two Steve Jobs biopics coming up, because you can never have too many movies about a famous guy who just died. According to Aaron Sorkin, who is more qualified to pick up awards than he is to write scripts about anything involving technology, that’s okay because there’s room for multiple Steve Jobs movies.

“Steve Jobs is a big enough person, big enough life that there’s room for more than one movie.”

Maybe it can even be a trilogy. Or a miniseries. Maybe we can build a Steve Jobs museum on the moon.

According to Sorkin, Steve Jobs is just like the Beatles or something. But make allowances for the man’s age. Back in his day everything important was compared to the Beatles. When JFK was assassinated, when man landed on the moon and when that stupid movie about Facebook got a bunch of awards, it was all just exactly like the Beatles.

You know who else couldn’t be contained in one movie? Bob Dylan, who had to be played by Richard Gere, Cate Blanchett, a little black kid, Heath Ledger and Christian Bale. As an eternal reminder of why this is a bad idea, here’s the trailer for “I’m Not There.”

The only difference between this and whatever Sorkin and his rivals will churn out to commemorate Steve Jobs is that at least “I’m Not There” is almost self-aware enough to know how ridiculous it is. That’s a level of self-awareness that Sorkin could not even begin to aspire to.

Pull back from all the hoopla over Jobs’ death and ask yourself: would Jobs be getting a fraction of this attention if his story had ended in 1997, before Apple’s troubles allowed him to waltz back in, and before the music industry’s and Microsoft’s incompetence allowed him to build a hardware business on an easy way to buy songs and some flashy interfaces? No, no and no.

Jobs pre-1997 would have been kindly remembered as a “pioneer” accompanied by Mac photos, the way Wozniak will be one day. Steve Jobs post-1997 is remembered for being successful. Not for being a genius.

An honest movie would pick up in 1997. It would focus less on his genius and more on the way the incompetence of established industries creates openings for insurgents to revolutionize them. But instead we’ll get an actor delivering rapid-fire dialogue while screaming about product demos and refusing to return calls from his family.

Does anyone actually need that?

Why Cloud Gaming is not the Future

Sure, cloud gaming might be the future. The really distant future. The one where everyone wears jetpacks, sends clones to work and lives in orbit around the Earth. It’s not the 5-minutes-from-now future. Not even the 10-minutes-from-now future, no matter how much NVIDIA keeps beating the digital drums for GRID.


1. Mobile gaming won’t integrate with desktop and console gaming

Not only are mobile games different, because they’re intended for a different type of control mechanism and a different type of environment (kill 5 minutes while waiting to skydive over Hawaii or ride in an elevator to the next meeting), but there’s a built-in hardware bottleneck that leaves the idea that you can walk away from Skyrim or Battlefield 3 on the PC and smoothly pick it up on your iPad exactly that: an idea.

The only way to make this kind of cloud gaming work is to throttle desktop and console gaming graphics and controls down to the level of a pad. Desktop games have already suffered from being throttled to console specs, but even with id’s scalable engine and new chips, the marketplace won’t stand still just so cloud gaming can be a buzzword.


2. The technology isn’t there

I don’t mean whatever NVIDIA is rolling out to impress everyone with; that doesn’t matter. I mean ISPs are not providing the kind of connection that makes regular cloud gaming feasible outside a small group. That’s the group leaving comments about everyone else being backward. And that’s fine. If your target audience for AAA games is now limited to 0.01 percent of the marketplace, go for it. Someone else will pick up the rest.

And mobile? Good luck downloading a 32 gig game on your data plan while waiting in line. Unless providers suddenly find a compelling reason to offer the bandwidth that kind of gaming demands, without tripling everyone’s bill while still staying profitable, you can forget about it.
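To put the 32 gig figure in perspective, here is a rough back-of-the-envelope sketch; the 10 Mbps sustained mobile throughput is an assumption chosen for illustration, not a measured figure:

```python
# Rough estimate: time to download a 32 GB game over a mobile connection.
# The sustained throughput below is an assumed, illustrative number.
GAME_SIZE_GB = 32
THROUGHPUT_MBPS = 10  # assumed sustained mobile bandwidth, megabits/second

size_megabits = GAME_SIZE_GB * 1024 * 8   # GB -> MB -> megabits
seconds = size_megabits / THROUGHPUT_MBPS
hours = seconds / 3600
print(f"~{hours:.1f} hours at {THROUGHPUT_MBPS} Mbps")  # → ~7.3 hours
```

Even under that fairly generous assumption, one download eats most of a day, and that’s before data caps enter the picture.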


3. No one needs it

Sure, Diablo III has made a lot of money. The Auction House will make a lot of money too, once it actually works. Maybe after all the promotional expenses, Activision will use the money to buy another marble palace haunted by demons. Maybe. But is it really worth it?

Other companies were counting on Blizzard and the unstoppable Diablo name to make an unpopular concept workable; instead it arrived broken on delivery, and no amount of arguing that games are supposed to be broken at launch will change that. Diablo III was the test bed, and it blew it. Sure, the sales are there, but are they going to be there for companies without the Blizzard/Diablo brand? If Blizzard had trouble keeping the game running and faced furious feedback, what happens to companies without the fanboy insulation or the online gaming experience?

NVIDIA can pitch GRID, but it’s in the hardware business. It doesn’t have to worry about launching games and when the cloud goes bad, the customers won’t kick its doors in, they’ll rage against the companies they gave their money to.

Blizzard’s fanboy shield can only cover it for so long. BioWare’s shield got it a free pass for Dragon Age II, but broke on Mass Effect 3. Betting on Blizzard’s to survive another of these isn’t a good proposition. And most companies don’t have even one shield.

Sure averting piracy is a priority, but the question is how much do you want to risk doing it? And how much do you want to spend?

Always-online costs money, and sticking auction houses into every 50-60 dollar game will infuriate players even faster. Turning every game into an MMO without the subscription fees is financially scary. And trying to sell people a 60 dollar game with crippled single-player and ordinary multiplayer, then tacking on a subscription fee? I’m not sure even Blizzard could get away with that.

More Copyright Wars

From a book review of an industry-friendly copyright book in the New York Times:

That’s what happened with the music industry, which, spooked by the proliferation of pirated file sharing on Napster, struck a bad deal with iTunes that allowed Apple to replace the sale of $15 albums with 99-cent songs. “Even if they continue to grow,” Levine writes, “those 99-cent-song sales won’t come close to making up for the corresponding decline in CD sales.”

The average album had 10 to 15 songs on it, which comes to about the same money at 99 cents a song. How much did the industry really want people to pay per song? And if you eliminate the cost of actually packaging and manufacturing the albums, then 99 cents a song may even be a better deal. Apple takes its cut, but does it really take more than Wal-Mart and Kmart?
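The arithmetic is worth spelling out; a quick sketch, using the $15 album price from the quoted review and the 10-15 track range of a typical album:

```python
# Compare a $15 album to buying the same tracks at 99 cents each.
ALBUM_PRICE = 15.00   # album price cited in the review
SONG_PRICE = 0.99

totals = {tracks: round(tracks * SONG_PRICE, 2) for tracks in (10, 15)}
for tracks, total in totals.items():
    print(f"{tracks} tracks: ${total:.2f} vs ${ALBUM_PRICE:.2f} album")
# → 10 tracks come to $9.90 and 15 to $14.85, essentially the album price.
```

So per-song pricing only “replaced” $15 albums in the sense that buyers could finally skip the filler tracks.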

The real story is that the industry didn’t want to change its business model of packaging whole albums that people had to buy to get the few songs they wanted. That business model helped encourage piracy. Sure, manufacturers would make more money if they could force people to buy printers with every computer or floor mats with every car. But the internet killed that business model.

Similarly, the best TV shows, like “Mad Men,” are produced by cable channels like AMC that hold back their content from Hulu, a network-owned platform for distributing TV over the Internet. The Hulu model has succeeded on the premise that “if someone was going to make their product available online for nothing, it might as well be them,” as Levine says of the networks.

I don’t know that Mad Men qualifies as one of the best shows on TV, but Cable has been sinking more money into developing shows that appeal to a more upscale audience. Hulu distributes shows that are mostly free to watch on TV already.

Levine says, is an open Internet model of free video that, by denying the networks any revenue to invest in shows like “Mad Men,” would instead produce the likes of the viral video “Charlie Bit My Finger.”

I’m not so sure one is that much worse than the other. Networks get revenue from advertising; cable networks get revenue from advertising and from their gated community. But free-to-watch networks spend money on shows too, and can pay for them with advertising.

Recently, France has begun to revive a business model that thrived in the 19th century: a collective or blanket license that, by adding a fee to Internet connections, would allow the convenient downloading of copyrighted music and divide the money to compensate producers and artists.

A fee-based internet thrived in the 19th century? Wow. That is some revisionist steampunk history right there. Must have been that Babbage-based internet.

Anyone praising a media internet tax is a shameless shill for the entertainment industry. It’s completely indefensible, not least because it asks paying consumers to pay twice, once for what they buy and once as a confiscatory tax for the industry.

Let’s say we have a universal internet tax/fine. Who should get it? Anyone who makes content that is distributed on the internet? Yeah, right. Sorry, we’re not going back to taxing cassette tapes for the music industry.

Germany has laws forbidding the aggressive discounting of books in chain stores, which has preserved independent booksellers while making it harder for Amazon to introduce the Kindle.

While keeping books more expensive. I like independent bookstores, but does this law do anything to promote reading or help writers?

But regardless of your position in the business-of-culture wars, it’s hard to resist Levine’s conclusion that the status quo is much better for tech companies and distributors than for cultural creators and producers. That status quo may benefit consumers in the short term. But if it continues, Levine argues, the Internet will increasingly become an artistic wasteland dominated by amateurs — a world where music, TV and journalism are virtually free, and where all of us get what we pay for.

My own position is skeptical toward both sides, which means I am skeptical that industry advocates care about creators. Creators are collateral damage for both sides. I am even more skeptical of the idea that the industry will stop making professional music, books and movies because of internet piracy. They had plenty of time to stop in the last ten years.

What the review, and probably the book, does not mention is that piracy encourages the industry to target its dumbest consumers even harder, because they are less likely to have the know-how to pirate. The industry has dumbed down its own product, and piracy is only one of the reasons.

The specter of amateurs is not all that horrifying, what is horrifying is that the future will belong to Cory Doctorow and Lana Del Rey, people who have nothing of worth to offer but strike a convenient pose that connects with a demographic. Creators who are much better self-promoters than they are artists.

Microsoft’s Moment

Microsoft is either on its last legs or on the verge of seizing the future by the throat, depending on who you ask. It’s easy to dwell on its legion of failures. A company that could have had iTunes, Android, the iPad and the iPhone is still clumsily playing catchup, asking customers to take another look at yet another mobile product they’re no longer interested in. Its biggest non-legacy asset is still the Xbox.

But Microsoft keeps trying, still plowing money into one thing or another, and occasionally there are signs that it gets it. Windows 8 will either let it muscle its way into the mobile marketplace or turn into another major dud. Either way, it still has the cash flow and the determination not to give up. Some companies are smart and graceful; Microsoft has always been the bull. It may miss its moment, but it will keep pushing forward.

AOL Please Go Away

Please. I don’t know why AOL is still in business. There must be cash flow coming from somewhere, but it’s as if a telegraph company were still around, trying to figure out its business model long after it stopped mattering.

AOL has tried everything, from merging with Time Warner, to putting out a ton of free services, to buying The Huffington Post and putting in charge the crazy right-wing lady who, facing too much competition on that front from Ann Coulter, reinvented herself as the crazy left-wing lady. Now it’s talking about buying Yahoo. Sorry guys, Yahoo may be troubled, but it’s not that troubled. Friendster isn’t that troubled. Somalia isn’t that troubled.

AOL still somehow has money, and like a billionaire hanging off a cliff, it’s using that money to stay alive any way it can before the last of its subscribers figure out they’re still on AOL. But it just needs to go. Nothing it has done has worked. Putting Arianna Huffington in charge has been a disaster. Moving on Yahoo has freaked out Yahoo executives, whose biggest nightmare is becoming the next AOL. Enough. Just turn out the lights and go.

Asking To Be Hacked

“The naked truth: Stars are asking to be hacked”

Does Frazier Moore actually believe this, or does he just write stuff like this because it will pick up hits? Given the sad state of journalism today, it’s probably the latter.

What was going on in Johansson’s pretty head when she, like so many, snapped candid self-portraits

Probably that she was sharing private photos with one person, not the whole world.

This sort of head-in-the-cloud narcissism (or is it head-in-the-iCloud?) fails to acknowledge that, more and more, people live in glass houses — especially famous people, whose houses are bigger and even more transparent than others.

So assuming that your photos won’t be hacked is now narcissism? Does Moore even understand the words he’s using?

In this era of digital snooping, why would any celebrity delude himself or herself that his or her physical seclusion guarantees privacy?

I don’t get it, are celebrities supposed to wear clothes all the time? Are they supposed to live their lives assuming that all their emails will be read?

Let’s go a step further. People know where celebrities live. If someone breaks into their house and steals their things, weren’t they asking for it by assuming that physical seclusion guarantees privacy?

However high the walls surrounding one’s property and however well-staffed one’s security detail, why would any celebrity store nude photos on any electronic device that connects to the Internet — unless, of course, the celeb is a closet exhibitionist and secretly hopes the stuff will go viral.

Clearly. Celebrities who have anything private in their private email accounts must be exhibitionists because they don’t assume that they are doomed to be hacked.

But she didn’t know better than to leave ripe for the picking those photos of her in her birthday suit, as if to dare some hacker to share them with the world. Sure, Johansson is one of many victims of cyber-hacking. Maybe she was also asking for it.

So by leaving photos in a private account whose contents the hacker couldn’t have known before hacking into it… she was really asking the hacker to do it.

I’ve seen rape defenses better thought out than this. Actually, this is every rape defense ever, boiled down, with a celebrity tag added.

Steve Jobs’ Second Chance

Not many visionaries get a second chance the way Jobs did, and if he hadn’t managed to get back to the top at a desperate Apple, his obituary would probably have read “famous Apple co-founder” and mentioned his Pixar connection. But Jobs got his second chance, and he leaves Apple as a company at the top of its game.

Even the underwhelming iPhone 4S launch only helps shore up the image of Apple adrift without Jobs. As Google’s Android does to Apple what Microsoft did to it back in the day, creating a universal open platform that can be used by numerous hardware manufacturers, Apple’s likely decline will be attributed to Jobs’ death, rather than to the business model of the company.

There’s no question that Jobs helped revolutionize the portable market, but he also left Apple at a time when the company is fast approaching its limit. There are only so many other things that can be grafted onto the iPhone or the iPod Touch, the Classic is out to sea, and the Nano is now officially a kids’ toy. There aren’t many other places left to go, and once that evens out, Android will begin gaining even faster. But Jobs leaves as a visionary and a genius, on a high note. It’s not a bad exit.

Studios and Theaters Get the Blame

The Video on Demand battle between studios and theater owners is opening up another posturing front. Both theaters and studios are complaining about how much money they aren’t making. But they aren’t making money because they’re both trying to squeeze each other and moviegoers too aggressively.

Skyrocketing budgets and ad costs on the studio side, and consolidation and renovation investments on the theater side, have hiked up costs for both. And those costs have been passed on to customers several times over. Movie prices have gotten outrageous, and that translates into fewer people showing up. Studios invest more in fewer movies and count on huge box office takes to pay off the investment. It’s a bad business model called putting all your cinematic eggs in one weekend. And when one movie with 150 to 200 million dollars in costs crashes and burns, there’s not much chance of recovery.

Studios have stripped away originality and in cooperation with theater owners are turning theaters into amusement parks, cranking out 3D rides based on some randomly familiar IP. Alvin and the Chipmunks. Airbender. Monopoly. GI Joe. These aren’t movies anymore. They’re overpriced rides. And when a ride fails, it takes everything with it.

Studios and theaters need to revisit their business model. Spending more on less and expecting everyone to pay more doesn’t work. Making more movies for less and cutting theater prices would give better results in a bad economy. The premium model on a national level is a no go. And studios and theaters are destroying the movie to make more money.

Underwhelmed by Skyrim

Why can’t the company that published Brink and acquired id Software trot out an Elder Scrolls game that looks good? The one thing the Skyrim previews show is that the faces have improved, but Bethesda is still way behind on technology. Skyrim promises a lot, but so did Oblivion. And it looks like a slightly shinier version of Oblivion.

Story and gameplay could still save Skyrim. You never know. And it’s hard to imagine them completely blowing the icebound setting. But there’s no real reason for optimism. Not after Oblivion. And not after footage that looks a lot like Oblivion did: boring landscapes, whack-and-smash first-person battles, poor animation, and everything so shiny you could see your reflection in it if the engine were better.

The engine is a major barrier. But so is Bethesda. Oblivion and Fallout 3 were weak in the story and world-building departments. Even 15 minutes with Fallout: New Vegas shows that a great game can be made on a clunky engine. If Obsidian were writing Skyrim, the clunky engine wouldn’t be much of a negative.
