Jim here. Once you get set up with NVIDIA's 3D Vision technology, the 3D world is your oyster. Case in point: Fujifilm has just released their Finepix Real 3D W1 - the world's first 3D point and shoot camera in North America. And it works with 3D Vision!
Yes, the camera is pricey at $600. But for early adopters like a lot of us, it's about time this happened! Better still, you don't need 3D glasses to view the photos - or videos - you take with the camera; the rear LCD displays them in 3D for you. You can also buy Fujifilm's V1 Picture Viewer with an 8 inch screen for $500. Online ordering of actual 3D prints is done directly with Fujifilm for $6.99 each.
Even more good news is that Fujifilm has partnered with NVIDIA to deliver a way to view the photos on a regular screen (120Hz monitor that is) with their 3D Vision technology. So it really is the best of both worlds. With glasses or without.
This looks like perfect timing to me. With 3D movies and gaming absolutely booming right now and with 2010 holding the keys to 3D TVs - Fujifilm seems to be at the right place at the right time with the right product. I am hoping to try one out soon!
Wednesday, September 30, 2009
Buzz Lightyear Returns To Earth After 15 Month Mission In Space! Ready For 3D Premiere of TOY STORY...
Jim here. Yep. Buzz made it. He spent the last 15 months aboard the International Space Station as part of the joint Disney / NASA educational outreach program. Tim Allen probably wishes it was him instead of the toy - but hey, he at least gets to lend his voice once again for TOY STORY 3!
Buzz softly touched down September 11th on the Discovery STS-128 mission.
Mr. Lightyear will be treated to a full ticker tape parade along Main Street USA at Walt Disney World, Florida on October 2 - which also happens to be the premiere of his TOY STORY 1 and TOY STORY 2 double feature in stereoscopic 3D. The double bill is a limited two week engagement (unless demand soars, which it probably will).
The next big 3D release after it will be Disney's A CHRISTMAS CAROL on November 6 - so I would look for the TOY STORY 1 & 2 event to be extended a la Hannah Montana's concert movie.
Will TOY STORY characters once again fill the holiday stockings of kids worldwide AND the coffers of Disney again? You bet. This is as solid a hit as you can imagine. Two movies for the price of one, beloved characters and all in 3D. Kids are pumped up for this!
Look for TOY STORY 3 in theaters June 18, 2010. And ah, yeah - awareness will be through the roof for this one!
Tuesday, September 29, 2009
Michael here. Fantastic Fest will have "exclusive footage" from Avatar at 9:45 PM on Wednesday, September 30 at the Alamo S. Lamar 1 theater.
From Fantastic Fest's website:
One of the highlights of the Fantastic Fest 3D program will be a screening of exclusive footage from James Cameron's AVATAR with producer Jon Landau in person to participate in a Q&A with Harry Knowles of Ain’t It Cool News. AVATAR footage courtesy of Twentieth Century Fox.
There will also be a really cool 3D-focused party (the "Double 3D Dance Party") right after the Avatar presentation:
Following the AVATAR presentation, Fantastic Fest will present a groundbreaking 3-D Film & Interactive Party, presented by RealD and 3DFF. The event will feature multiple 3-D displays and projection screens that will showcase 3-D interactive experiences, in addition to a 3-D Dance Floor Exhibit.
Partygoers will find themselves lost in the 3rd Dimension as they move to the music inside of a stereoscopic 3-D dance floor, which utilizes 3-D projection screens and interactive 3-D televisions. The Fantastic Fest 3-D Film & Interactive Lounge presented by 3DFF and RealD comes equipped with the most technologically advanced dance floor experience ever created for a special event venue.
We would love it if any of you MarketSaw readers attending Fantastic Fest could write in with your experiences of the footage, Landau interview, party, and any other cool things you see!
Jim here. It's zombie night at the 5th Annual Johnny Ramone Tribute in the Hollywood Forever Cemetery on October 3rd. Not only is the world 3D premiere of NIGHT OF THE LIVING DEAD (NOTLD) taking place, but NVIDIA will also be there to allow attendees to dispose of zombies in stereoscopic 3D with their 3D Vision technology and Capcom's RESIDENT EVIL 5!
PassmoreLab did the modern conversion of NOTLD and they are really looking forward to showing their 2D to 3D expertise at the festival.
Rose McGowan (GRINDHOUSE) will be on hand to introduce the movie.
The event is open to the public for a $10 per person minimum donation, with proceeds benefiting the Johnny Ramone research fund at USC Westside Prostate Cancer Center. Gates open at 5:30 pm. Picnic dinners, drinks (including alcoholic ones), pillows, blankets and small chairs are permitted. Attendees will receive free commemorative 3D glasses upon entering.
Resident Evil 5 is the latest PC game to take advantage of NVIDIA 3D Vision technology, which transforms standard games into eye-popping, jump-out-of-your-seat, 3D experiences. Not only did its creator Capcom design "out-of-screen-effects" to scare the daylights out of gamers, they also rendered all of the game's cinematic cut-scenes with the same immersive 3D effect. The critics have raved about the game running on NVIDIA 3D Vision. Said Jeff Haynes of IGN.com, "Resident Evil 5 was designed to fully take advantage of 3D, [with] certain segments of the game piercing the digital 'fourth wall' with a sense of negative depth that is quite incredible."
"Night of the Living Dead was one of Johnny Ramone's top ten horror film favorites of all time," said John Cafiero, chief of staff of the Johnny Ramone Army, an official entity acting on behalf of the seminal punk rocker and his Estate preserving the icon's memory and legacy. "What better way to celebrate the world premiere of this incredible flick in 3D than to give people a virtual experience to fight the living dead in Resident Evil 5 in 3D on NVIDIA's latest gaming hardware."
For more information: 5th Annual Johnny Ramone Tribute, NVIDIA 3D Vision, PassmoreLab.
Monday, September 28, 2009
Jim here. In one of the more anticipated 3D movies of the year - Disney's A CHRISTMAS CAROL - we see Robert Zemeckis delivering his finely tailored performance capture wizardry once again.
Now comes a new trailer showing us some more scenes from the classic tale starring Jim Carrey as, well, just about EVERYONE :-) Check it out...
Jim here. Spoke with one of my top sources of AVATAR material and he has some news regarding the upcoming AVATAR promotional material that is due soon.
Here is what he had to say: "Well Jim there are more trailers than I can count on 2 hands. (TV spots) (featurettes) (Ubisoft promo reels) etc, etc, and yes "exclusive" content will be available on avtr.com".
He also confirmed what we have been thinking for some time now - the next trailer will feature "more than 3 lines of dialogue" :-) and further, Stephen Lang and Sigourney Weaver will have heavier exposure as will the Na'vi.
But out of all the news - this is what I find the most interesting: We will see glimpses of future Earth and with the "Story Trailer" in particular we will see some of the "incredible performance that Sam Worthington gives as Jake". Pandora's night life will also play a role.
He reiterated that of course this could all change on a dime given the track record of getting media pieces out to the masses, but this is what he has seen.
As far as when the next trailer will make its debut: "I have no solid date to give you on the 2nd trailer front other than to say that it is coming *soon*"
We can be pretty sure that this trailer will debut online folks. Keep MarketSaw.com and AVTR.com on your favorite toolbar! This should be GREAT :-)
Sunday, September 27, 2009
Jim here. No surprise here really with this year's success of horror films in stereoscopic 3D. Variety reports that Bob Weinstein is pushing 3D on his aptly named Dimension shingle and pushing it in a big way.
SPY KIDS 4 is being written right now by Robert Rodriguez and will begin lensing in March 2010 in partnership with Disney. If you will remember, SPY KIDS 3D did exceptionally well at the box office before modern 3D took over - they used the terrible old red and blue glasses. SK3 took in $197,011,982 worldwide. Obviously Dimension wants to repeat or better that result.
We now know that Patrick Lussier (MY BLOODY VALENTINE 3D) will be helming HALLOWEEN III and Weinstein has this property high on his list - look for launch in October 2010 - yep most likely head to head with SAW VI (October 23). By the way, SAW VII has been confirmed to be 3D - but we knew that was coming since January didn't we?
So let's see here now, what else is going 3D? How about the HELLRAISER franchise? Pinhead himself will grace the silver screens once again but this time in 3D! This is one series I am really looking forward to - hopefully they land the right director.
SCANNERS will also explode into 3D. No word on whether David Cronenberg will come back to reprise as director - and no, the only 3D headaches will be ON the screen :-)
Weinstein did mention a couple of other reboots that I am assuming could also make it to 3D: SCREAM (Neve Campbell, Courteney Cox and David Arquette have all signed on) and CHILDREN OF THE CORN (the 3D would be amazing hovering over the swaying cornfields at eye level - they HAVE to do this!)
Further, on financing, Weinstein said: "There is no question that financing is readily available to produce and market these films. I am eager to expand our scope in the 3D business."
It remains to be seen whether they shoot in native 3D or go the 2D to 3D conversion route in post production. I suspect that once they choose a path they will stick with it across all of their properties. I have not heard either way on this yet and will post as soon as I hear something.
Oh and in case you were wondering how 3D has been doing lately? CLOUDY WITH A CHANCE OF MEATBALLS will be owning the box office for the second weekend in a row, so far with a stellar $5.6 million on Friday besting SURROGATES. This after a gigantic showing by THE FINAL DESTINATION earlier in the month. See where I am going with this?
Not only is 3D here to stay, it is now the standard for horror and animation. Next up? SciFi. Sometime around December 18th I believe... :-)
Saturday, September 26, 2009
Jim here. And the hits just keep on coming from AVTR.com - AVATAR's viral website. Looks like we have our first glimpse of unobtainium from Pandora - and through its magnetic field and superconductivity, it levitates.
I love the presentation of this! Asks a whole lot of questions in my mind that I hope are answered in AVATAR - guess that's the whole point of a teaser photo isn't it?
This is the mineral that humans are so desperate for - that they would come all the way out to Pandora, mix it up with the flora and fauna just to mine it.
Next up is some concept art found below - the first shot is of Hells Gate and a fleet of Scorpion aircraft. The second is of an RDA hydroelectric generating station on Pandora. Be sure to check out the rest of them on AVTR.com! Click to zoom in.
The thought that went into a simple shot like the unobtainium above sets the bar even higher for me, if that is possible. Sure, I expect revolutionary 3D and effects from AVATAR - but now I am seeing the artistry and use of color too. I am spellbound by it. I have had an interest in geology since I was a kid, combing my grandparents' rocky beach - picking out amethyst and agate among the ancient fossils that dwell along the Bay of Fundy. That, along with my mother being an artist and my intense interest in filmmaking, means I can really appreciate the work that went into this simple but gorgeous shot. Well done guys! Oh and thanks to A@ronW for keeping a vigilant eye on AVTR.com.
Thursday, September 24, 2009
Michael here. Check out this gameplay video of Avatar: The Game from Tokyo Game Show:
Click here to watch it in HD
Also, IGN has some very positive impressions of the demo:
Where Avatar sets itself apart from the pack (aside from the stereoscopic 3D visuals) is in its depth and scope, both leaps and bounds ahead of what most licensed movie games offer. Tons of side quests are up for grabs, a full leveling and skill unlocking system is included, and there's even a little mini-game that plays out a bit like Risk. I played for about a half of an hour spread across two stages and I didn't even get to scratch the surface on any of these features, let alone complete a quest or two.
The levels here are huge. So big, in fact, that teleporters have been included to make the trek across the stages manageable. Activate a couple and you'll be able to quickly zip from one end of a map to the other. This feature is more important than you might initially think. There are roughly 16 stages and enough side quests that you'll want to backtrack and replay levels to do everything.
I can't wait to get my hands on a demo of this game. I'll let you all know as soon as I find out when it will be available.
Wednesday, September 23, 2009
Jim here. Thought you should know that if you are considering a PC purchase and want to get into stereoscopic 3D gaming, Dell and NVIDIA have just joined forces to present a one stop shop for exactly that.
Starting right now, you can buy a Dell XPS 630 or a Dell Studio XPS 8000 with a complete 3D Vision gaming kit, including a Samsung 120Hz monitor (SyncMaster 2233RZ) and a compatible GPU, right off of Dell's website. Further, you can buy just the 3D Vision kit as an accessory on Alienware's site (Alienware is owned by Dell).
I love this marriage. It just makes sense - so many people want S3D gaming but are a bit hesitant, since you want all the parts to just work out of the box without having to figure out compatibilities and how to upgrade your existing PC. Now you can easily do it.
I am building a PC right now around my new S3D hardware, so feel free to ask any questions and I will do my best to answer them. It really isn't all that hard to get S3D gaming at home right now. And the quality is awesome!
Tuesday, September 22, 2009
Jim here. Feels like the holidays around here! Just got an extra special package delivered to my door - a gorgeous 22" Samsung SyncMaster 2233 (yep, 120Hz - and VERY 3D ready!), an EVGA NVIDIA GTX 285 graphics card (a behemoth of a card!), and last but certainly not least, NVIDIA's 3D Vision.
I'll give you ten seconds to figure out what I am going to be doing in the very near term! I guess the title gave it away :-) I have some serious 3D gaming coming up, and I will be posting reviews to help you judge what you want when it comes to stereoscopic 3D gaming. BATMAN: ARKHAM ASYLUM in 3D anyone?
I am now in the process of building a box around these core pieces!
Further, I will be shooting the unboxing of these sexy pieces of 3D hardware shortly. Stay tuned! :-)
Monday, September 21, 2009
Hi, Michael here with a glimpse at what many people consider to be the holy grail of display and entertainment technology: holographic video. Prof. V. Michael Bove, Jr. (of the Object Based Media Lab at the MIT Media Lab) and his team are hard at work creating an inexpensive desktop monitor that displays holographic video images in real time.
Bove, Jr.'s enthusiasm for holovideo is contagious. As he leads me around his Cambridge, Massachusetts lab - bustling with students and packed with all sorts of devices and pieces of future technology - we arrive at a circa-1994 holography table. "We are replacing pretty much everything from here down, except for one moving mirror, with this chip," Bove, Jr. tells me. "That 19 inch rack is being replaced by 4 small boards like this. This gas laser is being replaced with a tiny semiconductor laser. In a lab around the corner, we are working on packaging everything up. The new one is 440 scan lines. We're doing both red and green. One of the problems is that once the display is boxed up, it’s just going to be a black box with a window on the front; it’s not going to be Frankenstein-like, like that [holography table]. So there’s something sexy about a holovideo display that makes noise and has high voltage (laughs). It's like 1920’s TV! Hey, there’s mechanical stuff in there! There’s motors! There’s galvanometers! At the same time, I hate working in there, since it’s so scary in there!"
I didn't get to see the system in action, but from what I learned during my visit to MIT, affordable holographic video will be arriving sooner than you might expect. Read below for details on the innovative tech that has gone into shrinking a dining room table-sized system to something you can fit into your pocket, the exciting possibilities that holovideo enables, and more.
Prof. Bove, Jr. explains how his group got involved with holovideo, and discusses the advantages of holovideo over autostereo technologies:
In about 1989 / 1990 [Stephen Benton’s team] created the world’s first holographic video display, which was called the Mark 1. My group got involved because we were doing hardware and software for processing video and graphics in real time. So we ended up getting involved in some of the electronics and software to drive the electro-optics that Steve’s crew were building. In the mid-90’s they built a 2nd generation display, the Mark 2 display. We were continuing to be involved in the development of that. And there are lots of interesting things that you can do with a holographic video display. It’s not just that it’s no glasses – autostereoscopic – but it’s also that you can, like with any hologram, you can make the object be in front of the physical display – subject to the fact that you’re vignetted by the frame of the screen so you can’t have objects hang out beyond the edges. But subject to that, you can make objects be out in front, which means that you could do haptic interfaces, gestural interfaces, interact directly with things. We had a student do a PhD dissertation here about 8 years ago on haptic interfaces to objects displayed in holovideo, which most volumetric displays don’t let you do, because the view volume is inside a box, or under a dome, or something like that. So we were very interested in the interaction possibilities.
On his decision to turn holovideo into a consumer technology:
When Steve passed away we decided to move the whole holovideo project up into my group because we had been working on it for more than 10 years with Steve. We decided to take a slightly different spin on it. At the time – and this was roughly in the 2004-2005 time frame – I noted that there was going to be a push for 3D in the home, and I said, “OK, let’s try to make holographic video into a consumer technology.” So, what does that mean? It means, obviously, that you need a display technology that’s inexpensive – a few hundred dollars. You need content. You need processing. And you need a distribution mechanism. So what we predicted was that there was going to be a lot of stereo, and even multiview stereo content already available, so we wouldn’t need to deal with that. And there were going to be lots of 3D models. Every video game is represented as a 3D model. And if you can turn a 3D model into a hologram on the receiving end in real time, then the content and distribution is already taken care of – although there isn’t really a standard for transmitting 3D, people who have done online 3D games have demonstrated that it is practical to distribute 3D models in real time of reasonably high quality.
So we decided to concentrate on the processing and display technology. In terms of the processing, we have since about 2004 been using off-the-shelf GPUs where you take a 3D model, and instead of rendering it as an image you render it as a diffraction pattern for the hologram. Our initial results were between a half and 2 frames per second, and we’re now in the 15 to 30 frame per second range for standard definition TV resolution.
Michael: Wow, that’s impressive. I had heard about the University of Arizona hologram, which can show a new image every few minutes...
Prof. Bove, Jr.: What they have is an electrically eraseable photopolymer. Which means that they have to write an image using some technology onto the hologram. Unlike other photopolymer holograms, they can erase theirs and rewrite it.
So there are two different aspects to the refresh. One is the physical display. Like any display, it has an inherent refresh rate – and all the electronic displays we’ve built around here for years have had a 30 Hz refresh rate – which isn’t quite fast enough, but it’s pretty good. We’d like to get it to 60. We can, but you give up something in return for that. So we’d rather have higher resolution at 30 than lower resolution at 60. The other aspect is the computation; the display can be refreshing at full speed, but you might not be changing what it’s displaying. So when I say we have 15 to 30 Hz, I mean that the GPU can generate 15 to 30 holograms per second. And that’s for scenes that are kind of PS2 quality – they’re not super, super high quality graphics – but it’s pretty good.
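A quick aside from me (Michael): to give a flavor of what "render it as a diffraction pattern" means, here is a toy NumPy sketch - my own illustration, not the lab's actual shader code - of the fringe pattern a single scene point contributes to one hologram scan line. The wavelength, pixel pitch, and point position are assumed values chosen for the example.

```python
import numpy as np

# Toy illustration (my own, not the Media Lab's pipeline): the fringe
# pattern one scene point contributes to a single hologram scan line.
wavelength = 532e-9   # assumed green laser, meters
pitch = 0.5e-6        # assumed pixel pitch, on the order of the wavelength
num_pixels = 4096     # one short scan line

# Pixel positions along the scan line, centered at zero
x = (np.arange(num_pixels) - num_pixels / 2) * pitch

# A single scene point sitting 1 cm behind the hologram plane
point_x, point_z = 0.0, 0.01

# Distance from each hologram pixel to the scene point
r = np.sqrt((x - point_x) ** 2 + point_z ** 2)

# Interference of the point's spherical wave with an on-axis reference
# beam gives a cosine fringe; normalize into the 0..1 modulator range
k = 2 * np.pi / wavelength
fringe = 0.5 + 0.5 * np.cos(k * (r - point_z))

print(fringe.shape)
```

A real renderer sums contributions like this from every visible point of the 3D model, for every scan line - which is exactly the kind of embarrassingly parallel workload a GPU shader is good at.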
On how to make the display technology:
Prof. Bove, Jr.: So the issue is: how do you make the display technology? Because we’re starting with something that, when Steve’s group was working on it, was dining room table size, around $50,000 worth of stuff, and a whole 19-inch rack full of electronics, and what we wanted to do was package all that stuff in something like a CRT monitor. Unfortunately, the way we are doing it right now you can’t make it flat. You can make it relatively shallow, but you can’t make it totally flat. So our technology is going to look like a CRT when it’s packaged up. But, it’ll then just have one or two DVI connectors or HDMI connectors going to your PC or game console as the interconnect.
Now, there’s a catch to all of this. The catch is, with most display technologies, you have a fixed-size image. You know, HD is 1920 by 1080. Well, when you make the screen bigger, you just make the pixels bigger. So it’s still 1920 by 1080 on a big screen or a little screen. With a hologram, because you’re using the physics of diffraction to steer the light, the pixels have to stay the same size, and you need more of them as the display gets bigger.
Michael: Even 4k panels are very expensive. So is packing more pixels a cost issue, a technology issue, or both?
Prof. Bove, Jr.: Essentially, realistically, you need pixels that are about the size of the wavelength of light. So that means for every meter of screen width, you need about 2 million pixels [on each horizontal line]. Now, it’s further the case that you need the same pixel pitch vertically to do a full parallax hologram. And that’s really hard to do, because that's A LOT of pixels. You’re talking about rasters [data structures] that are about 2 million by a million, let's say. And just moving that many pixels around – just building the interconnect for something like that is murderous. It's doable, but it’s not something that’s going to be scaled down to consumer costs any time in the immediate future. So what we and some others have been doing with horizontal-parallax-only holograms is to say that it has 2 million pixels per scan line per meter, but it’s the same vertical resolution as an SDTV or HDTV. What that means is that you don’t have vertical parallax. So, if you move your head left and right you can look around objects. If you move your head up and down, the scene doesn’t change.
Michael: Which you usually don’t do anyway when you’re sitting, watching something.
Prof. Bove Jr.: Right. And given that your eyes are arranged side-by-side, unless you are lying on the couch sideways, you don’t really notice the lack of vertical parallax. Now, for things like medical visualization, or certain kinds of engineering design, or CAD [computer aided design], or something else, you really want to be able to look over objects and not just around them, so that might not be a suitable trick to use for those high-end applications. But for low end applications, and for consumer applications, it means that you are actually down to something practical. Now, what do you mean by practical? You mean that you could, with the kinds of interconnects and GPUs we have available, at least do a small – by small I mean postcard-sized – HDTV display driven by one normal GPU right in your PC, over the cables that are coming out of your PC. And so we’re doing proof-of-concept work.
Think of it as post-card-sized SDTV resolution. The actual number of pixels in there is a whole lot more than that. So imagine SDTV with maybe 25 to 30 degrees of look-around. And you can do that on the GPU you’ve already got.
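Another aside from me (Michael): the pixel counts Prof. Bove quotes are easy to sanity-check with a few lines of arithmetic. This is just a sketch; the 500 nm wavelength, the roughly 1 m by 0.5 m full-parallax screen, and the 480-line SDTV height are my assumed values.

```python
# Sanity-checking the interview's numbers: pixels about the size of a
# wavelength of light (~500 nm assumed) means ~2 million per meter.
wavelength_m = 0.5e-6

pixels_per_meter = 1 / wavelength_m   # per scan line, per meter of width

# Full parallax at the same pitch on a screen ~1 m wide by ~0.5 m tall
# gives the "2 million by a million" raster from the interview:
full_parallax = pixels_per_meter * (pixels_per_meter / 2)

# Horizontal-parallax-only at SDTV height (480 visible lines assumed):
hpo_pixels = pixels_per_meter * 480

print(f"{pixels_per_meter:,.0f} pixels per scan line per meter")
print(f"full-parallax raster: {full_parallax:,.0f} pixels")
print(f"HPO raster at SDTV height: {hpo_pixels:,.0f} pixels")
```

Dropping vertical parallax cuts the pixel count by a factor of a few thousand, which is why the horizontal-parallax-only approach brings the problem down to something one GPU can drive.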
Michael: That’s amazing. I can see that being implemented in a next generation video game console or something similar.
Prof. Bove Jr.: Sure. Given that we can do that – we thought the computation was going to be a lot harder. And two researchers in my group – Dr. Quinn Smithwick and graduate student Jim Barabas – have really managed to take advantage of the processing pipelines that are already out there. So we’re doing this – we did it in OpenGL, we’re now using Cg – but we’re just doing shader programming. So it’s just the same kinds of optimizations that gamers are using.
One of the things that we’re doing - we have to make the scan lines really long, but the GPUs tend to have a limit on how long a scan line can be. It’s hardcoded in somewhere, both in the hardware and in the drivers. So we have to make the display treat multiple scan lines as one scan line. The higher end [off the shelf] GPUs basically let you make the rasters bigger, so you want to use one of those, because your frame buffer is just larger.
But it’s not really all that magical, what we’re doing, in terms of the processing. But it turns out that for various coincidental reasons, the kinds of things that people – ATI, NVIDIA – are optimizing GPUs for in regular graphics rendering turn out to be the same kinds of things we need to do in generating holograms. So that’s all good. The display itself then poses a problem, because you need something that has the ability to make *extremely* small pixels. And there are a variety of ways of doing that. People have taken a variety of approaches. One of the things you can do – and some people in Japan who we know quite well have done this – is take the chips for HD projectors or the chips for HD electronic viewfinders – which have really small pixels – and tile a whole bunch of them together.
And you can imagine that the wiring of something like that has to be monstrous. You’ve got a whole bunch of HDTV rasters in something that’s the size of like a paperback book. The other thing is just to use a very, very different kind of technology. One of my grad students, Dan Smalley, is making chips, and they don’t actually look like much, but what this is – there’s a kind of chip that is in, for example, most wireless phones and other things, called a surface acoustic wave chip, or SAW chip, and they are used as filters for very high frequencies. And the way that they work is that if you have a material that is piezoelectric, you can turn an electrical signal into a sound wave effectively that travels across its surface, and for various reasons you can make very cheap and efficient filters that way. Well, there is a similar phenomenon – if you have a material like that that’s transparent, you can make a chip that’s called a “guided wave device”, where you turn the electrical diffraction pattern into a sound wave that travels across the surface. Now it’s a very funny kind of sound wave because it’s a sound wave at a gigahertz. But it’s a sound wave. And if you then run monochromatic light through the material, the signal going across the surface diffracts it.
So, this is a particularly good match to horizontal-parallax-only holography, because effectively what you do is you shine a laser in here and you get out one whole scan line of a hologram. So your display basically requires just a vertical scanner to scan out, say, a rotating mirror which is moving pretty slowly to scan out all the scan lines. And then you just *look* at this thing – and the scan line diverges out when it comes out of here, so this device can be a lot smaller than the physical scan line. So, these are cheap and we can make them here on campus in a lab across the street.
So this is what we have been pursuing over about the last 3 and a half years: how to make these into a commercial display. And there’s one other catch. The catch is that the idea of using diffraction to make a 2D display actually goes back to the 1930s, to a company called Scophony. When CRTs were still expensive and unreliable, Scophony was making diffractive 2D projection TVs. The problem with diffraction is that the sound waves travelling across the surface of this are travelling at the speed of sound, so the *picture* is travelling horizontally at the speed of sound. And up until now, to do something like this you needed a rotating mirror going in the opposite direction at the same speed so that you get a stationary picture. My student Dan has worked out a way to [get rid of the mirror]. That mirror is moving really, really fast because it’s moving at the horizontal scan rate, not the vertical scan rate. He’s figured out a way to make the system so you don’t need the mirror and you’ve still got a stationary image.
Michael: The very idea of a refreshable hologram is very new. Most people still think of bulky tables and still images. Holovideo would strike many people as almost like magic.
Prof. Bove, Jr.: Remind your readers that holographic video is *exactly* the same as any other kind of hologram in that it’s something you look toward. You can’t make the light go across the room like Princess Leia. Because photons do not want to travel halfway across the room and suddenly decide to make a left turn toward your eye, you need to look toward the display, but objects can be out in front of it. And that’s one of the reasons for wanting to scale up holographic video displays, because obviously as the frame gets bigger, you can have objects seemingly hang out farther in front. And of course you’ve got a view volume that goes back into the screen as well. So, now that we can actually render these in real time, we can do a lot of the stuff we’ve talked about for years – like take advantage of the 3D volume of interactive video games.
If you think about it, you don’t just have screen space the way you do in a normal video game – you have a view volume that’s pretty deep. And so you might imagine something as simple as you’re playing tennis and the ball’s coming toward you, AND you have force feedback in the paddle. When the ball hits the paddle, you can feel it. So we’re just starting to prototype some games. I don’t have anything I can show you along any of those lines yet; we’ll probably be showing those off publicly next spring sometime.
The notion of thinking about a volume in which you can interact is something that people really need to start thinking about seriously. I mean it’s hard to do that just with stereoscopic 3D, for a lot of reasons. Including the fact that with stereoscopic 3D you don’t have motion parallax. So you can’t move your head to look around the thing in the front to see the thing in the back. And so that quickly limits your options for laying things out in 3D space.
Michael: Head tracking and motion tracking make more sense for home use. It seemingly wouldn't be effective in a large theater.
Prof. Bove, Jr.: With this kind of thing, you don't have to track heads. There is a company prototyping holovideo displays called SeeReal, a startup out of Germany. They are solving some of the computation and other problems of the display by tracking your eyes. So they have a camera looking out at you that tracks your eyes, and they only generate the part of the hologram that you could see from where your eyes are.
Michael: That’s fascinating. It sounds like it has a lot of potential.
Prof. Bove, Jr.: It's very interesting. And they've produced some lovely demos. But they're not looking to manufacture this technology from what I can tell. My understanding is that they are looking to license it. And they are making several other things as well. We, on the other hand, don't want to go through all that trouble. We just want to generate everything everywhere. That means we need to do more computation, but we have a little less complication in the overall system design. So when I say motion parallax, I'm just saying it's free space. You move your head around and you can look around things [because] it's generating all those parallax views all the time. But you also get some other things. In a normal stereoscopic display you can't use focus cues, because your eyes are converged at one distance and focused at a different distance. In a holographic display, since you're actually creating wavefronts of light, you can have the light be diverging from any point.
Michael: So you can refocus your eyes and avoid eyestrain.
Prof. Bove, Jr.: That's right. So if you have enough depth, you really can focus your eyes on the thing in the background, then focus your eyes on the thing in the foreground. Which adds some computational complexity. We are just starting to explore some of the perceptual and user interface issues, and how much you can get away with on that. But the fact that you do have focus and parallax - as with any hologram - makes viewing these things just more like looking at an object.
Michael: I’m meeting Prof. Raskar later [read our interview with him here]. He gives the example of a hologram of a flower sitting next to a regular flower, and how the hologram wouldn't respond to outside lighting.
Prof. Bove, Jr.: When you're making a synthetic hologram, you can relight it all you want. If you have, particularly, a stationary hologram of a flower, and you have the [real] flower, if the sun moves overhead, the flower's shadow is going to move, and the illumination is going to change, and the hologram is going to look the same. However, if you're computing the hologram on the fly, you can change it. You can put the lighting wherever you want, you can make a synthetic focal length of whatever you want, you can make a synthetic aperture. You can play with depth of field, you can play with all kinds of stuff. Now, in order to make it actually respond to the sun moving overhead, you obviously need some kind of sensor that is going to detect the sun's direction - a camera or something - and then re-render the hologram to do that.
But that's the limitation of static holograms; to interact, you have to be computing the hologram on the fly. A lot of what has been done in holovideo over the years is to compute a single synthetic hologram, store those pixel values in a very high resolution frame buffer, and then just display that image. You can't interact with it. It is not changing dynamically. Indeed, at least the first 10 years of holovideo around here [MIT] were largely precomputation. Gradually, in the late 1990s, we were getting away from doing total precomputation. But even doing that with specialized hardware, which we were doing at the time, was just a whole lot of work. So the fact that GPUs have developed in a direction that allows them to do the kinds of computations we need to do, and that there are other market forces pushing GPUs to get faster and support higher resolutions - that's good news for us.
One of the things that is particularly interesting to us is that a lot of the standardization activities in the regular stereoscopic space are potentially of service to holovideo as well. In particular, now that there's a multiview extension to H.264 AVC, you can send a lot of parallax views, and you can send them efficiently. And there are codecs for dealing with it. And there's a standard for dealing with it. So that means somebody can go out with a camera rig - just a whole bunch of webcams - and shoot for an integral [integral imaging] display, or a lenticular or parallax-barrier autostereo screen.
A holovideo display is just sort of a superset of all of those other parallax displays. So we can take those multiple parallax views, and in real time put them on a holovideo display. There’s a point in our processing pipeline where if you’ve got a bunch of parallax views you can stick them in, and then do the computation downstream to make each parallax view come out at a particular direction. So if there is multiview content floating around, that’s perfect!
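As a rough sketch of that "stick the parallax views in" idea - purely illustrative, with made-up camera names and an assumed linear angle-to-view mapping, not MIT's actual pipeline - the downstream job of any multiview display is to route each captured view so it comes out at a particular direction:

```python
# Illustrative sketch only: maps a viewing angle within the display's
# field of view to the nearest captured parallax view. The camera names,
# field of view, and linear mapping are assumptions for the example.
from typing import Sequence

def view_for_direction(views: Sequence[str], angle_deg: float,
                       fov_deg: float = 30.0) -> str:
    """Pick the parallax view (ordered left to right) for a viewing angle."""
    half = fov_deg / 2.0
    # Clamp the angle to the view volume, then linearly index the view list.
    t = (max(-half, min(half, angle_deg)) + half) / fov_deg
    idx = min(int(t * len(views)), len(views) - 1)
    return views[idx]

cams = [f"webcam_{i}" for i in range(8)]  # "a whole bunch of webcams"
print(view_for_direction(cams, -15.0))  # leftmost view: webcam_0
print(view_for_direction(cams, 15.0))   # rightmost view: webcam_7
```

A real holographic pipeline would then compute, per view, the fringe pattern that diffracts light toward that direction, but the routing idea is the same.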
Michael: What are your optimistic and pessimistic guesses for when holovideo will be on the market for consumers?
Prof. Bove, Jr.: I would say there will probably be some kind of diffractive, holovideo-type display on the market in 5 to 7 years. And it probably won’t be big. [I get this] from projecting what people are doing in displays, and the ability to put that many pixels, using one technology or another, together, and to interface them with the appropriate driver circuitry. That doesn’t require a super-breakthrough.
Michael: It seems Moore’s Law will take care of much of that. We're seeing a push to 4K projectors and panels much faster than expected.
Prof. Bove, Jr.: The other thing that's happened that we are taking advantage of is that you can now get red-green-blue laser triplets relatively cheaply - consumer-grade cheaply - because they are being developed for 2D projectors, particularly pico projectors. And for the particular way we are doing holovideo, we don't actually need good coherence; we just need monochromaticity. And the only reason for that is that different wavelengths diffract at different angles. So if you have a very wide-band source of red light, like what you'd get from a filtered white-light source, you've got a fuzzy picture, because things diverge as they get farther from the screen, so you can't really get a good 3D effect. And even ordinary LEDs aren't quite monochromatic enough for us. But the fact that there are cheap semiconductor lasers at fairly high power, available in R, G, and B, means again that we don't have to solve that problem.
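The wavelength-vs-angle point follows from the grating equation, d sin θ = mλ. A tiny back-of-the-envelope calculation (the 1 µm pixel pitch and the bandwidth figures are my assumptions, not the lab's actual numbers) shows why a broadband source smears the diffracted output while a laser barely does:

```python
import math

def diffraction_angle_deg(wavelength_nm: float, pitch_nm: float,
                          order: int = 1) -> float:
    """First-order diffraction angle from the grating equation d*sin(theta) = m*lambda."""
    s = order * wavelength_nm / pitch_nm
    if abs(s) > 1:
        raise ValueError("no propagating order at this wavelength/pitch")
    return math.degrees(math.asin(s))

PITCH_NM = 1000  # hypothetical 1 micron grating pitch

# A laser is nearly monochromatic; a filtered white-light source spans a band.
for label, (lo_nm, hi_nm) in [("laser, ~2 nm band", (639, 641)),
                              ("filtered lamp, ~40 nm band", (620, 660))]:
    lo = diffraction_angle_deg(lo_nm, PITCH_NM)
    hi = diffraction_angle_deg(hi_nm, PITCH_NM)
    print(f"{label}: {lo:.2f} to {hi:.2f} deg (angular spread {hi - lo:.2f} deg)")
```

The ~40 nm band spreads the diffracted red over roughly twenty times the angular range of the laser band, and that angular blur grows with distance from the screen - the "fuzzy picture" Prof. Bove describes.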
So most of what we’re doing around here right now is monochrome. Just because going to color is pretty simple. I mean, color is just a factor of 3. Which means you either have to do the computations 3 times as fast, or stack 3 of your light modulator chips on top of each other. We’re not showing the latest holovideo display publicly yet, because it’s still dim and it’s still fuzzy. The efficiency needs to go up a bit to get more of the input light out into the hologram.
And with that, Prof. Bove, Jr. showed me around the lab before getting back to work.
A huge thanks to Prof. V. Michael Bove, Jr. for meeting with me in his Cambridge office, and to Alexandra Kahn for arranging the meeting!
Head over to the Spatial Imaging Group site for more information on holographic video.
V. Michael Bove, Jr. holds an SBEE, an MS in visual studies, and a PhD in Media Technology, all from MIT, where he is currently head of the Object-Based Media Group at the Media Laboratory, co-director of the Center for Future Storytelling, and director of the consumer electronics program CELab.
He is the author or co-author of over 60 journal or conference papers on digital television systems, video processing hardware/software design, multimedia, scene modeling, visual display technologies, and optics.
He is co-author, with the late Stephen A. Benton, of Holographic Imaging (Wiley, 2008). He is on the Board of Editors of the Journal of the Society of Motion Picture and Television Engineers, and associate editor of Optical Engineering.
Sunday, September 20, 2009
Michael here. Wow - you never really know where you will get your next glimpse of Avatar, do you? Although a Japanese Panasonic ad makes a good deal of sense given the partnership between Lightstorm and Panasonic. Thanks to reader siweb for pointing us to this!
Anyway, check out the ad (with beautiful unseen-until-now Avatar footage) below:
Click here and click on the "HD" icon to watch the video in HD
Jim here. Wow. I took in CLOUDY WITH A CHANCE OF MEATBALLS last night and let me tell you the movie exceeded my expectations. And yes, Pixar should take note of just how far Sony Pictures Animation has come. Sure a few minor rough edges, but very minor. CLOUDY is a feast for the eyes.
Of course the basic premise is of a young inventor (Flint) who takes it upon himself (not initially on purpose) to save his economically dying town by introducing falling food for tourism. Of course things get out of control and the morsels keep getting bigger and bigger. Talk about fodder for 3D!
And the 3D is done extremely well. In one superb sequence, Flint realizes he must get across town to his lab in order to reverse the oncoming food downpour - and the obstacles that are placed before him are all shot in wondrous 3D. That scene makes the movie in my mind.
Similarly, the Jello scene is memorable. Hilarious really. Cannonballs and belly flops into Jello make for completely different results! :-)
While the movie is made with children in mind, there are plenty of adult gags to laugh at - like when Steve the monkey is fighting Gummy Bears and strikes a karate blow to reach in and pull out a gummy heart - very cool. Kids won't think it is a heart though, so still very kid friendly. The only scenes that could be considered more intense for kids are when Flint goes bonkers running around hitting people in the face with ice cream snowballs (a little overboard) and when roasted chickens start attacking - and EATING the humans alive. That last scene (and some language in the movie) probably earned the PG rating.
So I give CLOUDY an 8 out of 10. There is a great father - son message in it too. Take your kids to it (I would say 8 and over) - you will be glad you did. A great way to introduce 3D to them and keeps you entertained too. Oh and it is easily tops in the box office with an estimated $8.1 million on Friday alone. The ~$100 million production should top $30 million for the weekend.
Thank you to Empire Theatres for the screening.
Jim here. Rumors are flying (Perez Hilton) that the queen of pop Britney Spears will be shooting her Mandalay Bay (my fav LV hotel btw) concerts in stereoscopic 3D. The dates for the shows are September 26 and 27.
Hey - the more concerts in 3D (and done well) the better. The music will be her Circus Tour material which means it will not be half bad.
I will admit listening to her Circus tunes on the elliptical trainer - it gets the job done! No word yet on the crew that will be shooting this - I have a few emails out.
Saturday, September 19, 2009
Hi, Michael here with an exclusive image of Stephen Lang looking badass as Colonel Quaritch.
Those who have seen the Avatar Day footage know that Quaritch views Pandora and the Na'vi as deadly adversaries. Lang gives an incredible performance in that scene - you just know that there will be some very intense moments involving Quaritch later in the movie. And according to Cameron the most difficult-to-shoot scene in the movie (which took months to figure out how to do) is Quaritch's final scene, which involves characters of three size-scales interacting.
I love the design of the gun, which appears designed for smaller game. Also notice the breathing mask with sheathed knife attached to his vest.
Head over to http://www.avtr.com/ and wait a few seconds to see a video of Quaritch speaking to you about the danger of Pandora - "You are not in Kansas anymore. You are on Pandora." The words are largely the same as what we saw on Avatar Day. I can't wait to see what else appears on this site.
Anyone think the origin of the scar will be revealed?
Thanks to Fox for the image!
Friday, September 18, 2009
Jim here. Yep. You read right! RealD will be making designer 3D glasses available BEFORE AVATAR is in theaters. From designers like Gucci. We have known about this for some time now, but it is nice to actually hear some sort of timeline attached to it.
You will not have to submit yourself to the mass-produced glasses associated with 3D movies any longer! Further, for those of you who must wear prescription glasses - you can soon order prescription glasses made for 3D viewing.
I have heard rumors that these glasses will also double as sunglasses by simply adjusting the angle of the polarity, but that remains to be seen.
I would love a pair of Ray Bans for 3D! Or maybe those glasses that Bono designed... :-) Rest assured, we will be making these glasses available for you to purchase - there is only one place for 3D on the planet - MarketSaw.
Jim here. I remain somewhat skeptical but there is a new 3D technology player in town who really isn't all that new to Hollywood. Technicolor. They claim to have a 35mm print solution for 3D and according to exit polls being conducted recently it may even rival some digital technologies.
While the technology is interesting, its reception at the 3D Entertainment Summit has been cool according to Variety. I tend to go along with what they are assuming - Hollywood does not want to go anywhere near another format war and I don't blame them.
Further, the notion of bundling up the new Technicolor solution with the old analog projectors that are being replaced and shipping them overseas where there just may be a market - makes a whole lot of sense.
To be clear, Technicolor is presenting their solution as a stopgap between the current theatrical shortage of 3D screens and the point when sufficient digital 3D deployments have been made. Again, this makes sense to me. AVATAR and other huge 3D properties will need these additional screens if it is possible to wrap them up and deploy them in time. Apparently the Technicolor solution uses polarized glasses - just as RealD does.
HOWEVER, I do NOT wish to see Technicolor creep into the picture by overstaying in the US/Canada scene. Sure, bridge the gap by all means. However, digital is CLEARLY the best solution for Hollywood going forward and from that I will not be swayed. From what I have seen, the digital picture is brighter and clear of residual artifacts and does not age. Further, 3D needs those extra lumens.
So, welcome to the 3D scene Technicolor! You are like a visiting family member bringing freshly baked bread who I missed a great deal. But don't overstay the welcome... :-)
Thursday, September 17, 2009
Michael here. Check out this spiffy new featurette on Ubisoft's Avatar game:
There are some beautiful visuals here. And I love how they are going all-out in terms of promoting Avatar as a world that is to be experienced. Expect to get lost in Pandora well before the movie opens.
Jim here. The reviews have been stellar - live sports broadcasting is certainly here! Variety reports that the ESPN S3D coverage of the USC - Ohio State football game continues to get awesome word of mouth at the 3D Entertainment Summit taking place right now at the Universal Hilton.
The live 3D was handled by the ever present Vince Pace. His cameras are also being used in December's AVATAR (to name one of many projects) and if his schedule is any indication of the future of 3D - it's looking darn good.
From Variety: Random fans from the main floor were invited up to watch the S3D coverage for a few minutes. Most, if not all, were seeing S3D TV for the first time. Some 50 answered an extensive survey on their reaction from ESPN; we peeked at a few, and all of those were very positive. One fan was overheard to say, "It's really like being there," and another noted, "It's addictive."
Significantly, after a few minutes watching S3D, some would notice instantly when the broadcast picked up a flat shot from the 2D feed.
The Skycam, which flies above the action on cables, wasn't used for the coverage. There wasn't time to safety-test an S3D Skycam, but Vince Pace promises, "Next time, watch out."
Even live sports coverage, it turns out, is not immune from gimmick 3D shots. When sideline reporter Ed Cunningham found Super Bowl hero Santonio Holmes on the sideline, he asked Holmes to show his championship ring to the folks at home -- and put it right into the camera. Sure enough, Holmes' hand seemed to extend out past the screen, drawing laughter and cheers from the ESPN Zone crowd. Think "Dr. Tongue's 3D House of Super Bowl Rings."
"I had talked to Ed before about not being afraid to have fun," Pace said. "It's a natural part of a 3D presentation to say hello to the camera that way." And, he notes with some pride, "Technically, it was pulled off perfectly."
Great to see more live sports broadcasts taking place and more than one company backing the technology to do so. 3ality Digital also does live sports broadcasting with one of their last efforts being the BCS Championship Game broadcast during CES in 3D.
I would like to see a ramp up of live 3D sports being pumped into theater screens, especially Sunday afternoons. Perhaps a drinking age crowd? Certainly makes sense for playoff season!
Jim here. It has been a long time coming - I went public about it after there was no movement a year ago - but there is finally a top flight educational institution planning on offering a very much needed Stereoscopic 3D program - USC.
From Variety (emphasis is mine): Hoberman and Scott Fisher, chair of USC's interactive media division, are setting up an interdisciplinary program at the School of Cinematic Arts that will address how the technology can be used in narrative-based production such as movies and scripted television, as well as in gaming and immersive media. According to Fisher, the program should commence next fall.
The program will have a strong research component to complement its classes, Fisher said. USC already collaborates with such industry partners as Sony, HP and EA on S3D and boasts many alums who are its boosters, including George Lucas, Randall Kleiser and faculty member Michael Peyser, who exec-produced last year's "U2 3D" film.
Fisher said while the technology itself is undeniably important, the program will focus on developing its pictorial language.
"We have a good sense of the differences (between 2D and S3D), simple things like not making quick cuts between, say, a nearfield scene and a landscape because it hurts your eyes," Fisher said. However, filmmakers are just beginning to test out such techniques as upping the perception of depth to heighten emotional impact, he said.
"We're getting so many requests from industry to provide them with this kind of background in stereoscopic imaging because they're making more and more films and need that kind of talent to move things forward," he said.
About time. I have been pushing folks for some time now to get this done and it is so gratifying to see a formal program being set up to build that foundation for this generation's filmmakers. Late or not, hats off to USC for ramping this up.
Wednesday, September 16, 2009
Jim here. The last we heard from the NIGHT OF THE LIVING DEAD public domain franchise was the modern colorization and 2D to 3D conversion by PassmoreLab, which I can't wait to see. The world premiere will be Saturday, October 3 at the Hollywood Forever Cemetery.
Now comes word via Heat Vision that Zebediah de Soto is directing a prequel to the series entitled NIGHT OF THE LIVING DEAD: ORIGINS with Simon West producing.
But what is more exciting to me is a new piece of technology that De Soto (and Gus Malliarodaki) have invented known as "the beast". This equipment "allows filmmakers the ability to direct CG performances the same way they would direct real live actors. The aim of the process is to make tennis balls on a stick representing real people or monsters a thing of the past by allowing actors [to] interact with CG elements as if they are tangible."
If this technology is as good as it sounds - it's a home run. More on the project when I get it.
Jim here. Got some photos of THE HOLE's after party at the Toronto International Film Festival's 3D Lounge.
A good time was had by all!!
That's Nathan Gamble beside the movie poster and James Stewart with Joe Dante... Their photo was taken with a Fujifilm Real 3D W1 Camera!
Sponsors included our own RealD and PassmoreLab!
By the way - THE HOLE won Best 3D Movie over at the Venice Film Festival!! Congrats Joe!
Jim here. Check out the latest and greatest - Join the AVTR Program! A new viral website for James Cameron's AVATAR.
CHECK IT OUT! And log in via Twitter to see the site's community come alive...
Let me know if you find anything I haven't! :-)
Tuesday, September 15, 2009
Jim here. Got some rather cool 3D gaming news to share - My friends over at IGN posted their Resident Evil 5 for the PC review and gave it a very nice score of 9.3! This is a higher score than the console version received! The editor who reviewed it loved the graphics, the upgraded mercenaries mode and the new 3D Vision feature.
IGN compared the game in all formats and the PC version came out on top.
Here is what they had to say: ...However, if you do happen to have a 120Hz monitor and you have the Nvidia 3D vision glasses, you're definitely in for a serious treat. RE5 was designed to fully take advantage of 3D, and it doesn't disappoint, rendering Chris and Sheva in a way that provides them with the visual illusion of weight and mass as they move through their environments instead of being flat 2D images onscreen. Objects and enemies also appear to pop out of the screen during certain segments of the game as well, piercing the digital "fourth wall" with a sense of negative depth that is quite incredible. Once you see the slithering worms of the first boss of the game seemingly drip through the screen at your face, you'll know you're seeing something special. Since you're able to tailor the perception of the depth to your personal taste, you can also adjust how much of an impact the 3D has, which allows you to make the experience fully subjective to your tastes. While not everyone with a PC may have the gear to take advantage of the 3D, it's extremely easy to enable, and RE5 provides two benchmark tests to help you configure your experience. If you have the means, definitely check out the game like this – you'll enjoy the experience.
RE5 is actually better looking on the PC, and if you happen to have the equipment, playing in 3D is an excellent experience.
I think that is a pretty clear, ringing endorsement of 3D gaming on the PC through NVIDIA's 3D Vision. Can't wait to get mine set up - should have it shortly!! Let the reviews begin on MarketSaw :-)
Found this clip on YouTube too - Enjoy the preview ...
Jim here. Have a quick look at the new HOW TO TRAIN YOUR DRAGON teaser before Dreamworks Animation asks me to pull it down. Not much is seen, but it gets the blood pumping a bit :-)
HTTYD is due in theaters March 26, 2010 in glorious 3D and uses the vocal talents of Gerard Butler, Jonah Hill, Christopher Mintz-Plasse, Craig Ferguson, and America Ferrera.
Based on Cressida Cowell's first novel, this comedy adventure follows a Viking teenager named Hiccup. Knowing that his offbeat sense of humor doesn't satisfy his tribe chief father, he seizes the chance to prove he has what it takes to be a fighter when he is included in Dragon Training. But when he encounters an injured dragon, his world is flipped upside down. What started out as his one shot to prove himself turns into an opportunity to set a new course for the future of the entire tribe.
Jim here. Got some news from one of my top sources in the Cameron camp and man oh man does BATTLE ANGEL ever sound Kick. Ass.
On what is happening right now for ANGEL:
Make no mistake this is the movie your website will be pursuing in time to come. The tests are "Beautiful" and "Complex" beyond your wildest dreams. Stunningly beautiful, considering the fact that some are old(ish). Imagine this for a second. A cyborg chick "the Angel of Death" taking out "a lot" of other part man, part machine type "people".
We do indeed plan on staying with Cameron believe us! AVATAR is the training ground for BATTLE ANGEL and I can't wait to see the ideas he has for it.
On what those tests contain:
Now picture the "Angel of Death" looking like a child. a 14 year old. Now imagine this, this little girl goes through these guys like butter. Did you see the KICK-ASS trailer? Well imagine that girl via the Matrix via Ghost in the shell, via AI. "Remember the True Lies bathroom brawl?" Well take out Arnold, and "copy and paste" Bruce Lee on acid via the Terminator. as seen through the eyes of a sweet innocent 14 year old cyborg girl. And all of this done via the mind of James Cameron, Holy shit what is not to love? The lights flicker and shit hits the fan. Awesome stuff.
With performance capture and CG anything is possible. That includes using an underage actor in very VERY violent scenes. This would be cutting edge stuff and who better to pull it off?
On Angel's future: You will be hearing the phrase "Angel of death" a lot in time to come. At least judging from the level of involvement now. I bet Angel will go green soon. The only thing I can see setting back Angel is Avatar. And you know why that is don't you.
ANGEL OF DEATH is a potential alternative title as it is what she is also known as. But I think BATTLE ANGEL will stick. AVATAR has potential sequels written all over it of course and that may be the next project for Cameron depending on studio and audience pressure after the massive response to AVATAR is felt. But you can count on BATTLE being made, big time.
Obviously casting for Angel is key. I am curious to know who you think would make a great Battle Angel? Choices are limited as the character is only 14 years old, but I have a few ideas. I had heard rumors that Cameron already has his Angel - but that was some time ago. Haven't heard anything since.
And we know from a MarketSaw exclusive interview with designer James Clyne that there is a VERY real possibility of BATTLE having some gladiatorial aspects of future sports and a wide social gap between the "haves and have-nots". And further that BATTLE ANGEL will have many different looks (new faces, bodies, and outfits) as she evolves in the movie. We learned from yet another MarketSaw exclusive interview with Mark Goerner that the movie follows and even amplifies Kishiro's original vision of BATTLE ANGEL.
Other stereoscopic 3D projects to watch that may closely resemble BATTLE ANGEL are APHRODITE IX and Spielberg's GHOST IN THE SHELL. All of them have young female cyborgs kicking some serious butt. Interesting how they line up after something that catches Cameron's eye.
I am thrilled to learn that Cameron is still doing tests for BATTLE. These tests are very, VERY secure - but we have the best sources on the planet so YOU get the updates! Stay here with MarketSaw for the latest and greatest on all things James Cameron AND 3D.
Second photo courtesy of Allison Rose. The model is Riki Le Cotey.