Andy Maltz 2017

[ 00:00:19 ] It’s been two years since we talked.

[ 00:00:22 ] We were here two years ago to launch ACES version 1.0 to the industry, and it wasn’t particularly new at that time. It had actually been in development for 10 years, but it was finalized as a version 1.0 with the confidence that it wasn’t going to change for a while, and that really helped equipment manufacturers get comfortable with implementing the specifications for color management from camera all the way through to display, with visual effects and on set in the middle. So in the two years we’ve seen widespread adoption, from small indie films and commercials to big tentpole movies; there is one that just opened last weekend that used ACES, one of the superhero movies. And what we’re also finding is that in other markets where you wouldn’t think color management was top of mind, it turns out color management is now top of mind, in particular in video games. Epic’s Unreal Engine now includes ACES support for a more cinematic look, and we are seeing some activity in virtual reality. With any brand-new technology that comes out, the first thing that the companies and the practitioners and the filmmakers and content creators, or in this case the experience creators, try to do is just get a picture up on a display. And that’s largely what that part of the industry has been going through. As you get the basics down, like not dropping frames or getting sufficient spatial resolution, you then work your way down the list to the other, more nuanced parts of the system, which include color management.

[ 00:01:59 ] So we think that the time is right for the various virtual reality display manufacturers to get together and agree on display calibration standards.

[ 00:02:11 ] So then you can have robust color management throughout the system, which any visual storytelling or visual entertainment medium must have, because that’s part of the quality of the picture and the experience for the end user. The value of standards is that you have a common infrastructure that everybody can innovate on top of, the pieces of the system that nobody makes any money on, like file formats. Nobody makes any money on file formats. It’s the content that’s carried around in those file formats, and the various display platforms, where the real value is. So if you don’t have that plumbing in a standardized way, there’s no market to grow. If you look at the motion picture industry, one of the reasons why it grew into a global 40 billion dollar business over 120 years or so is the 24-frames-per-second frame rate standard, the sound standards that came along later, the aspect ratio standards that came along later. There’s plenty of room for creative flexibility within those standards if the standards are done right. But you have to nail down certain aspects of it. We wouldn’t have had the Transcontinental Railroad if the space between the rails was not standardized. Imagine going to Home Depot and buying screws and there’s no such thing as 6-32 or 10-24 or whatever the numbers are. Those are all standards. So in terms of color management and display calibration standards, it’s the same thing in the digital world. You have to have the basics nailed down. Otherwise there’s no room for growth, there’s no room for innovation.

[ 00:03:59 ] Again, it provides the foundation for one to innovate on top of. If you go back to what happened in visual effects, the groundbreaking movie for digital visual effects was Jurassic Park, and the comment from people as they left the theater was, boy, those dinosaurs were so believable, I couldn’t tell the real ones from the fake ones. So the technology had progressed to the point where you could have seamless integration, and you only get that if your platform underneath gives you those tools. So in the case of mixed reality or augmented reality formats, if what you’re really trying to do is blend the fake dinosaurs with the real dinosaurs, then those are the pieces that you have to take care of.

[ 00:04:41 ] Now, you may want your fake dinosaurs to look very different from the real dinosaurs, so you still want that creative control. And that’s the real benefit that these sorts of standards, like ACES, provide to the content or experience creators: control over their experience and their content. I think cinema will always be cinema, and virtual reality is only now beginning to figure out what virtual reality will be. Same thing with augmented reality. In my opinion they are different things. One is content, the other is experience. Those are very, very different things.

[ 00:05:35 ] Well, in any innovative environment where you’re creating something new, there will, just by the very nature of innovation, be new ways of approaching things. So when you talk about color management, there are lots of proprietary color management mechanisms that are unique to the company or the facility working in that environment. And that’s one of the benefits you get with digital technologies: if you know how to write software, if you learn a little bit about math and science, you can roll your own.

You can make up your own way of doing things. That’s largely what happened for many years as we moved away from motion picture film and all the standards that came from that. And the industry flourished under that, but you get to a point where innovation is limited. So now we have new technologies coming online like high dynamic range. Well, everybody has a different opinion about what a good high dynamic range image is, and that’s a necessary dialogue and debate to have right now.

[ 00:06:33 ] But then you move away from the need for specialized color management, because now you’re talking about something different. So everybody needs to get on the same plane in terms of moving the bits around, because that’s already figured out now, and there’s no reason to really differentiate there. There is a reason to differentiate on the new things, because you’re still trying to figure those out. And what quite often happens when you get a new medium or new technology is that it leverages off of what you already have. Think about what happened when television first came along: it was basically shooting stage plays, and it wasn’t until multi-camera shoots came on board that you got some new creative possibilities with it. Go back even further: when motion pictures first started, it was the same thing.

They were just static, blocked-off shots until the camera started to move around, the technology evolved, and sound was added. That’s where that art form really flourished. And I think you’ll see the same thing with these new forms of experience that are coming online.

[ 00:07:44 ] So in the two years since we released ACES 1.0 to the industry, as I mentioned earlier, it’s found widespread use.

It’s established itself as a color management, archiving, and digital image interchange platform sufficiently enough that we can now begin to go to work on taking it to the next level. So in service of that we have some new leadership of the project, which is still under the auspices of the Motion Picture Academy, but we have heads of production technology from Marvel Studios, HBO, and others leading the effort, and we have issued a call for participation to the entire industry, to people using ACES, the experienced people and the people who are new to ACES, to take it to the next level: make some of the things that are a little challenging to do within that infrastructure a little easier, and bring more functionality to it to help these next-generation content experiences.

Just be a little easier to create. If you want to get more involved and you want to put your imprint on the Academy Color Encoding System,

[ 00:08:55 ] we encourage you to go to ACEScentral.com. That’s the easiest portal for everything: for filmmakers, for equipment manufacturers, for facilities people, for engineers. That’s the place to have a discussion. And there are connections from other user groups online into ACEScentral and vice versa. What we’re really trying to do is build the community and involve all of the existing communities that are out there.

[ 00:09:17 ] So we produce standards that are useful to everybody.

Antonio Bolfo

[ 00:00:19 ] I have a background in fine art; I was drawing and painting my whole life and went to art school at RISD, where I majored in animation and film. From there I went into the video game industry to work for a company called Harmonix. We made a lot of interactive music video games, one of them being Guitar Hero, which became a huge franchise, and they eventually went on to develop Rock Band as well.

[ 00:00:45 ] But after that kind of career path I went into journalism and became a photojournalist, bringing my ability to see and my visual background into the journalism space. Because I also had an interactive background, I wanted to tell more interactive visual stories, which I very soon realized was incredibly expensive and time consuming to do. A lot of my colleagues wanted to do the same thing, but the publishing houses just couldn’t afford to spend money on these interactive platforms per story. I did a lot of assignments for the New York Times Magazine, for Time magazine, Newsweek, and many others, and we just constantly saw that we were copying what was on our magazine pages and pasting it onto a computer screen.

[ 00:01:37 ] And I figured there has to be a better way to tell stories online. But there wasn’t a way to do it affordably, and there wasn’t a way to do it in a time-effective way, because it’s news we’re talking about. So I looked around for these platforms, and nothing existed.

[ 00:01:56 ] So I decided to take it upon myself, with my co-founder, who I met at RISD, and we developed Verse. It’s a do-it-yourself interactive video player and video platform that allows anybody, not just journalists but filmmakers, brands, and advertisers, to make these really cool, compelling interactive stories online, the things that we always wanted to do.

[ 00:02:17 ] We can finally do it without the insane cost, without the time needed to code these things from scratch. So we really think, and we are seeing it, that this drives the needle in a huge way in multiple industries: journalism, advertising, the brand space, filmmaking.

[ 00:02:37 ] I’m really excited to see how this actually transforms the video experience.

[ 00:02:42 ] You know, in the future... it’s just as you said, right.

[ 00:02:53 ] Because if you take a step back and look at what film is, what video is: one of my favorite quotes is that omission is the act of creation, right? People in the video space know that sometimes it’s actually harder to tell a shorter story than a longer one. That being said, if you want to tell a long story with all these pathways that you’re talking about, then yeah, you have to create all these different pathways that your audience can choose from. However, we’re always shooting so much; 95 percent of what we shoot, what we make, gets left on the cutting room floor. So yeah, you might have to put a little more editing time in, but that’s a lot less painful than seeing your babies get cut from the final piece, right? And that’s what you constantly see in linear video: you’re like, I have all this amazing footage, he’s a great character, and all of a sudden, boom, all that awesome stuff is cut down to a really short piece, because we’re talking about the Internet here, right? Sure, I love my Amazon and Netflix accounts, my HBO specials.

[ 00:03:55 ] I can sit down and watch an hour-long show on my couch in my living room. But during the day, where most of my waking hours take place, I’m consuming video online.

[ 00:04:07 ] So now I’m incredibly distracted, right? The internet has programmed me to click this, click that, choose this, choose that. And the issue with video is that it’s still incredibly linear.

[ 00:04:20 ] So when I press play, I am forced from being an engaged user to being a passive viewer, and that’s a very tough juxtaposition that people are faced with. We are trained to be active users on the Internet, with everything on the Internet, but we still have to be passive viewers when it comes to video.

[ 00:04:38 ] If you look at a lot of the stuff we actually produce with partners like the Washington Post, The North Face, a lot of ad agencies, and brands like Jaguar, you’ll see that you can take really cool stories and make interactive experiences without having to do a massive amount of shooting. Like the Washington Post: they just won an award for a series they did with us where they took six interviews of voters and just let the audience choose which voter they want to see, right there on the page, in the video player. It’s not rocket science to say that people who actually choose what content they want to see are going to stay and watch that content. So we’re really giving the authority back to the viewers to choose, click, and decide what part of the narrative they want to consume.

[ 00:05:39 ] I love this conversation, because now we’re moving away from technology into creation, right? We just provide the technology and the tools for people to be creative. Now, looking towards the future, I’m really excited for these kinds of interactions. What does the future of interactive video mean? I’m not going to sit here and tell you exactly what it means, because it’s a medium we have not been able to explore: it was too expensive and, again, too time consuming to go through the development. And when the cost is really high you can’t take risks, right? If you’re going to drop $150,000 on a story, you’re going to make sure everything is laid out, the storyboards are perfect, the timeline is perfect. But with something like Verse, you just upload your video assets and you play around with them. If it doesn’t work, you go back to the drawing board and play around with them a little bit more. So I think as a community we really needed something like Verse to allow us to actually take risks, to explore these creative thoughts and processes, right? So to your point, absolutely. You’re adding things, you’re taking things away. Some films and TV shows you want to have alternate endings for, some you don’t. That’s really up to the director or the producer or the marketers. But the point is that now they have the opportunity to actually have these discussions, and I think that’s vital to any evolution of an artistic medium.

[ 00:07:22 ] So think about this new medium that we’re talking about, interactive video.

[ 00:07:28 ] You first need to have access to the tools to develop interactive video in order to start thinking about how to tell interactive video stories, because you can sit in your studio all day and come up with ways to tell interactive stories, but if you don’t actually have the technology to do it, it’s a moot point. The art of storytelling is still the art of storytelling; no technology is going to make you a better storyteller. That is something that comes from experience, from you as a creator, your own experience, what you feel about the world, what you have to say about the content. So I think Verse gives you access to new ways of telling these things from inside of you, telling the stories that you want to tell in new ways. But at the end of the day, storytelling is a human thing.

[ 00:08:30 ] It’s not technology. So Verse is an interactive video platform that allows anybody to make really rich, engaging interactive videos online. Whether you’re a journalist, a filmmaker, a brand, or an advertiser, you can now take all your video assets and your photo assets, put them into Verse, and literally play around with them like Lego blocks to build a really cool, engaging experience for your viewer. No longer do viewers have to click on a video, sit back, and passively watch; they can now choose inside Verse which part of the content they want to watch. I think what’s really cool about Verse is that we’ve been around for only a year and we’ve been able to talk to a lot of these companies who wanted to tell interactive stories for a long time but just couldn’t afford to do it. Some of the companies we work with are the Washington Post, Discovery Communications, the New Yorker, the Atlantic, and many, many others who have been wanting to do stories in this new visual medium but were previously unable to, just because of the cost and the dev time associated with the old way of doing interactive content.

Pat Tubach

Patrick Tubach joined Industrial Light & Magic in 1999 as a Compositor. During his time at the company he has taken on roles of increasing scope and responsibility and now serves as Visual Effects Supervisor for ILM. 

Tubach graduated summa cum laude from Baker University in Baldwin City, KS, with a BA in Communications, and although much of his schooling centered on creative writing and journalism, his lifelong interest in film led him to Los Angeles, where he began his visual effects career at Kodak’s Cinesite. As digital effects grew more sophisticated, Patrick soon made the transition from roto and paint artist to compositing. In that marriage of photography and CG, he found a genuine affinity and passion for the artistry of digital effects. In addition to working as visual effects supervisor on “Star Wars: Episode VII – The Force Awakens,” Patrick was a co-visual effects supervisor on “Transformers: Age of Extinction” and “Star Trek Into Darkness,” and digital production supervisor on “WALL-E” and “Mission: Impossible III.”

Chris Bailey

[ 00:00:19 ] Well, I just did a panel this morning talking about OTT and social media and streaming, and all that content needs to get to the point of streaming somehow. They’re the ones that are going to take over from the physical medium; we’re the ones that are going to get the content through the plumbing faster, to the point where it’s distributed to the consumers. As far as digital versus physical media goes, I’m not the expert in that, but we can get to the point where they create either digital or physical media a lot faster than traditional methods. For traditional methods I like to use the analogy of a swimming pool. If you have a swimming pool you’re filling up with the hose outside your house, you have one bucket: you fill it up, walk over, pour it in, and walk back. That’s traditional file transfer. If your pool is down the street and you still have the one bucket, now you fill it up and walk over, but you’re walking a lot further and you can only carry a little bit of water at a time. Our approach is like having a human chain: you pass the bucket along, fill another one, pass it along. You don’t have to wait for that first bucket to get back, so it doesn’t matter how far away the distance is, the pool is essentially going to get filled up just as fast because you’re dumping water in constantly. So that’s what makes us faster. And the way that we make that accessible to people is we create APIs and desktop applications that watch folders on your desktop and move files across automatically when they see a new file being encoded, for example. And the APIs that we make allow integration with third-party MAM systems and workflow systems.
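To make the watch-folder idea above concrete, here is a minimal, standard-library-only sketch. The folder path, polling interval, and transfer() stub are hypothetical placeholders; a real deployment would call the vendor’s own transfer API or CLI rather than this stub.

```python
# Minimal watch-folder sketch (hypothetical): poll a directory and hand any
# newly completed file to a transfer function, as described in the interview.
import os
import time

WATCH_DIR = "/exports/encoded"   # hypothetical drop folder for finished encodes
POLL_SECONDS = 5

def transfer(path: str) -> None:
    """Stub for the accelerated-transfer call; replace with the real API or CLI."""
    print(f"queueing {path} for transfer")

def file_is_stable(path: str, wait: float = 2.0) -> bool:
    """Treat a file as 'done encoding' once its size stops changing."""
    size = os.path.getsize(path)
    time.sleep(wait)
    return os.path.getsize(path) == size

def watch() -> None:
    seen = set(os.listdir(WATCH_DIR))
    while True:
        current = set(os.listdir(WATCH_DIR))
        for name in sorted(current - seen):
            path = os.path.join(WATCH_DIR, name)
            if os.path.isfile(path) and file_is_stable(path):
                transfer(path)
        seen = current
        time.sleep(POLL_SECONDS)

if __name__ == "__main__":
    watch()
```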

[ 00:02:07 ] Let me use the best-known example that we’re allowed to talk about, and that would be what we’ve done with NBC Olympics. NBC approached us in 2014, and what they wanted to do was leave a significant portion of their editing team back in Stamford instead of taking them along. Traditionally, as in London, they took a large portion of the team with them, and they took a lot of their archive footage along as well. They wanted to leave this stuff behind in Stamford. So what they did in Sochi is, as the events are being recorded and created into a high-res video file, they’re actually creating low-res proxies at the same time. Our software is transferring 80 to 100 of these low-res proxy files back to Stamford, where they get inserted into the Avid system on the other end where the editors are sitting, and the loggers are logging them and inserting them into the MAM. So at any point they can call up a clip, and it’s only delayed by a few seconds to bring back the high-res of what they actually need, and they push that up to social media, push it to the web, and so on. On the flip side, we give them high-speed access to their archives as well; they’ve got a Spectra Logic tape library sitting in the basement. They need to get a clip, it’s all logged in their MAM; someone searches for Michael Phelps winning gold, boom, it gets copied over to a media deck and we can transfer it over to Sochi at high speed. Now, for the fun of it, we decided to take one of these clips and flip the transfer to FTP mode in the middle of it. It was like a 20-gig clip, and the time remaining for the transfer was something like 21 hours. We flipped it back to the FileCatalyst protocol and it was there in under a minute. So those are the kinds of gains, and that’s the kind of leverage they get: they can leave an entire archive at home and still have quick access to it from halfway around the world. And then for Rio they took it to another level, because they saw what they could do in Sochi: they left more at home and increased the scale. And it’s going to be just the same for Pyeongchang coming up next year.
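Here is a short sketch of the proxy-first pattern described above, not NBC’s actual pipeline: generate a small proxy of each freshly recorded clip and send it ahead immediately, while the high-res master stays at the venue until an editor requests it. The ffmpeg settings, paths, and the send() stand-in are assumptions for illustration.

```python
# Sketch of the proxy-first workflow (illustrative only): proxy goes out right
# away; the high-res master is pulled back on demand.
import subprocess
from pathlib import Path

def make_proxy(master: Path, proxy_dir: Path) -> Path:
    """Create a low-bitrate proxy with ffmpeg (assumes ffmpeg is installed)."""
    proxy = proxy_dir / (master.stem + "_proxy.mp4")
    subprocess.run(
        ["ffmpeg", "-y", "-i", str(master),
         "-vf", "scale=640:-2", "-b:v", "1M", "-c:a", "aac", str(proxy)],
        check=True,
    )
    return proxy

def send(path: Path) -> None:
    """Stand-in for the accelerated transfer back to the home facility."""
    print(f"transferring {path.name} ({path.stat().st_size} bytes)")

def on_new_recording(master: Path, proxy_dir: Path) -> None:
    send(make_proxy(master, proxy_dir))  # the low-res proxy is sent immediately
    # the high-res master stays at the venue until an editor requests it
```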

[ 00:04:07 ] With cloud storage, the cost per gigabyte is actually down around tape these days, so people are taking advantage of it and moving stuff into S3 on Amazon,

[ 00:04:18 ] or Azure Blob, and they need mechanisms to get it in there. You have all the different workflow and MAM providers giving access to it through FileCatalyst, because we provide that window into object storage and the same type of acceleration in and out. And recently you see a lot of hybrid cloud and on-prem object stores coming out, because people didn’t like paying 5 cents a gig to pull data out of S3 or out of Azure. So now they’re hosting their own private cloud or object store on premises. You also see companies like Spectra and Quantum providing warm storage in front of their tape that’s object-store based. So yeah, archiving is huge, and being able to access that data fast, from anywhere, is huge, and we play a big role in that. The other day I was talking to someone about this: to add context to a broadcast, you want to be able to pull up information, type in a tag, and bring up stuff that you’ve tagged previously, even for something you’re not necessarily prepared for. Let’s say, you know, I’m a big hockey fan, I’m from Canada. The Bruins and the Senators were playing the other day, and one of the players was leaving the ice after the game, which we won, of course, and a fan reaches down, grabs his hockey stick, and tries to pull it out of the player’s hand. Erik Karlsson took his stick and whacked the fan on the hand. That’s a perfect opportunity to go and look for a tag, like all the instances where fans and players interacted, and build a top-10 list on the fly. As a broadcaster, being able to pull that stuff up is huge value. It was unexpected; you weren’t planning to pull that. Compare that to the Michael Phelps stuff we did with the Olympics: they knew Michael Phelps was going to win gold medals, so they could have that stuff ready. But when the unexpected happens, you need low-latency access to that remote footage. And that’s just another example of the value that FileCatalyst adds.

[ 00:06:12 ] Really.

[ 00:06:15 ] Exactly. And the bigger the geographic distances, the better. We try to create it not just for the Olympics but also for, say, your mobile trucks at a venue: we want you to feel like you have access as if you were parked right at your headquarters, even though you might be on the other side of the country. The beauty of Amazon S3 is that they provide the same interface regardless of whether it’s, you know, Glacier or S3, and a host of other object storage vendors, the on-prem stuff, also provide an S3 interface, so by talking S3 we can actually talk to many, many vendors. Of course Microsoft has their own and Google Cloud has their own, but we’re integrated with those as well. The way that consumers are watching media these days requires not only that it’s available in several form factors, on your mobile device, social media, over-the-top, set-top boxes, but they want it now; they don’t want to wait. So part of what we did with some of our customers gives them the ability to really move ahead of their competitors in terms of providing real-time content, not just stills but clips that have already been transcoded and are available on all the different mediums within a couple of minutes after the events have actually occurred. I mentioned the Olympics, but it’s not just the Olympics and NBC Sports; it’s a lot of other sporting events as well that aren’t necessarily as big. You want the ability to get that content to the consumer, because that keeps them coming back to your social platform or your mobile app or your network. If the other guys don’t have it, viewers are going to switch the channel and go somewhere else.
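To illustrate the “by talking S3 we can talk to many vendors” point above, here is a minimal sketch showing how the same S3 client code can target AWS or an on-prem S3-compatible object store just by changing the endpoint. The endpoint URL, bucket, and key are placeholders, not a real deployment.

```python
# Same S3 client, different endpoints: AWS S3 vs. a hypothetical on-prem store.
import boto3

def make_client(endpoint_url=None):
    # endpoint_url=None targets AWS S3; otherwise an S3-compatible private store
    return boto3.client("s3", endpoint_url=endpoint_url)

aws = make_client()                                               # Amazon S3
on_prem = make_client("https://objectstore.example.local:9000")   # placeholder on-prem endpoint

# Upload a proxy clip and read it back through the same S3 API.
on_prem.upload_file("clip_proxy.mp4", "archive-bucket", "2017/clip_proxy.mp4")
obj = on_prem.get_object(Bucket="archive-bucket", Key="2017/clip_proxy.mp4")
print(obj["ContentLength"])
```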

[ 00:08:13 ] It’s funny, this happened to me yesterday. I was at the Wynn for a meeting and the Sens game was on; it was the end of the third period, going to overtime, and I had just finished a meeting and had to get up to meet a couple of buddies at Planet Hollywood. I had the 18-minute intermission to get a car and get up there, and I was almost there when I called up, where are you at? And he said, oh, by the way, the Sens just scored, and I’m like, great. So I go to my phone and I’m trying to refresh my feed to see if I can see the clip. I didn’t see it, and it took 10 or 15 minutes for that clip to finally appear so I could see what actually happened. So over the next few years I expect the way we can access these clips to become phenomenal. Everything is just going to get faster and faster, and we’re going to move into even bigger content, because we’re going to have higher resolutions and all these new technologies. It’s exploding the size of the media, which is making the problem we’re solving even larger. Speaking of replay, inside a venue you have so many angles and so many feeds now. It’s funny, on the panel I was on this morning, a guy from Fox was talking about having 100 cameras inside an arena; they’re following every individual player on their own feed and automatically clipping and logging it based on those feeds, and it’s incredible. But the amount of content being generated is just massive, and so is how you move that content around.

[ 00:09:50 ] Right, exactly, it’s like NFL RedZone.

[ 00:10:00 ] They have all the different feeds you can see, and that just wasn’t there a few years ago. Being able to supplement it with your social media and keep everybody tuned in to your channels, that’s where things are going. Cameras and microphones, sensors, motion sensors, that sort of thing, all converging to automatically process all this video and put it somewhere. Last summer at the Olympics with NBC we had the Phelps cam. People just wanted to see him backstage with his hood up, listening to his music, going like this, and I’m like, why do people want to see this? But then I’m not a huge swimming fan, I’m a hockey fan. As soon as I see someone win, I want to go on social media; I want to see the clips of disappointed fans on the other team, I want to see the locker room reactions, and I want to see them fast. Like I said, I keep refreshing. I want to see it now.

Guido Voltolina

[ 00:00:19 ] So when people experience virtual reality, the retention is definitely very high, and the core difference is really what we call a sense of presence. A sense of presence in virtual reality is fundamentally tricking your brain, to a certain extent, into making you feel you are right there.

[ 00:00:41 ] As soon as you feel right there, and you don’t feel you are in front of a screen like with a normal TV, your brain registers that experience as pretty much part of your life. So there is an emotion associated with it, and everything is really retained; the emotional aspect is much, much higher. The empathetic element is there and retention is high, because the emotions associated with that experience really register in your brain not as “I saw a video of it” but as “I’ve been there, I’ve been part of it.” And that is the magic element that makes VR fundamentally a new medium. It’s not an extension of video, it’s not an extension of pictures; it’s a medium on its own. The other aspect, from a retention point of view, is that if you watch a video, you’ll watch it one time, maybe two times if you really like it; beyond that, you will tell people, you’ve got to watch this video, but you are not going to watch it again and again yourself. In VR, the dynamics of experiencing the video are different, because I may be watching a certain area and you’ll be watching a different one. Now, when we exchange the social aspect of the experience, you’re going to tell me, oh, you’ve got to watch what happens on the right side, and maybe I was watching the left side.

[ 00:02:12 ] So let’s watch: I’m going to go back and watch it again because I want to see the same thing you saw. And maybe I’ve seen something that we both have not seen.

[ 00:02:21 ] So now, not only the attention but the viewing times are, on average, five or six times, versus the one or two of regular video. Related to suspension of disbelief, the concept of pulling a person completely inside the story so they are part of it: virtual reality allows you not only to be part of the story but to interact with the story.

[ 00:02:58 ] When you’re in front of a screen watching a beautiful movie, of course it is emotional; with a very impactful movie in a theater you see people crying. But you’re passive, you’re watching it. You are a fly on the wall, if you want, or you’re following the story, and of course the storytelling is very entertaining and brings you in emotionally, but you’re not really part of that story. No matter how big the cinema screen is, no matter how wonderful the surround sound is, you feel like you’re behind a window watching that story. In virtual reality, you are in the story. Now, it’s up to the producer and the director how much you can interact with the story or not, but the language of the storytelling is completely different. First of all, you cannot really force people to look in this or that direction, so you need to develop a technique to take them through the story like in real life. Right now we’re facing each other and you’re looking at me; the only reason for me to turn around would be someone calling my name, or you start really looking behind me, which would be a hint to me that something is happening. If none of those events take place, I’ll just keep looking this way. Now, in VR, sound is very, very relevant, much more than in 2D, because you can tell people where to look based on sound, which is very natural for us; it’s the same thing that happens in real life. And then you also have this element of interactivity: you can make things happen differently depending on how people interact with the virtual reality environment. Of course this is more developed in computer graphics than in video, but there are already technologies available for making the video part interactive as well. So now I’m watching, I don’t know, a certain scene happening around me, and depending on how I press buttons or move around, or simply whether I look right or left, different things happen in different ways. All those elements make this completely integrated experience of bringing somebody in much more powerful in VR, really letting people get inside these windows. That piece of glass we used to bump into, now we can jump through it into the scene. You might remember some of the old movies where people jump into the cinema screen and suddenly they are inside the movie and they’re part of the movie. In VR, that is actually happening.
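As a toy illustration of the gaze-driven interactivity described above, and not tied to any particular VR engine or to the speaker’s products, here is a small sketch that picks which branch plays next based on where the viewer is looking at a decision point. The yaw convention and branch names are made up for the example.

```python
# Toy gaze-based branching: at a cue point, the next scene depends on the
# viewer's horizontal look direction (yaw). Branch names are hypothetical.
def pick_branch(head_yaw_degrees: float) -> str:
    """Return a branch id based on gaze direction at the decision point."""
    yaw = head_yaw_degrees % 360
    if 45 <= yaw < 135:
        return "scene_right"      # viewer looked right
    if 225 <= yaw < 315:
        return "scene_left"       # viewer looked left
    return "scene_forward"        # default: keep following the main actor

print(pick_branch(90))    # -> scene_right
print(pick_branch(280))   # -> scene_left
```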

[ 00:06:03 ] Related to the development of the language of storytelling and narrative in VR, you have the opportunity of having multiple threads, but you don’t have the requirement of supplying all of them.

[ 00:06:18 ] So you can definitely develop a linear storyline where the viewer only has the freedom of looking around, and then the storytelling, the sound, and some other elements tell him where to look. Of course, the main actor is the one whose story you’re following, so you instinctively follow where he goes, even if he’s moving around in an environment. So you don’t have to deliver ten stories, but you have the opportunity of having multiple threads, exactly as happens in real life. In real life you have the opportunity of looking around you; you don’t have to. At a certain moment we could both be together, right, and we look in different directions. Your life experience will be different from mine, because you might witness something happening that I’m just not looking at at that particular moment. That’s how life is. So in VR you have the opportunity of doing storytelling with multiple threads, but you don’t have the obligation of doing it. Now, one thing that you can do in VR that is quite compelling, particularly for advertisement or entertainment, or to bring people in, is to tell people to look for things, and they don’t necessarily see them unless they look in the right direction. You can imagine that if you were doing a major brand commercial and you tell people to look for a certain product, and it only appears at a certain moment in a certain location, they might be willing to watch that commercial multiple times to see where and when that thing appears.

[ 00:08:06 ] Same from a storytelling point of view: you can almost develop a way where, if we both watch the same VR video, we may end up with two different stories just because we looked in two different directions. Now, what is fantastic in VR, and it’s a concept I call the time machine, is that I can go back and have the same experience that you had, playing it back and following where you looked. If you tell me, instead of left, look right, OK, I’ll play the video back and look right, and now we can share the experience. And that applies also to news, events, personal occasions. You can imagine people doing a wedding in VR: they are the people getting married, so they don’t know what is happening around them, because they’re there.

[ 00:08:59 ] They’re the main characters, but later they can go back and be part of the audience; later they can go back and be, I don’t know, the priest, or wherever the camera has been positioned. That really opens up a series of experiences that people will be able to relive and share, which was not available before.

[ 00:09:32 ] That concept is new, but at the same time it is, I cannot say infinite, but kind of unlimited, whatever the difference between infinite and unlimited is. And the constraint is really time: how much time you have to explore all those places. It’s very similar to surfing the web in the early days. In the early days there were very few things you would find surfing the web, right? You would put in a word and there were, like, some sites, but not many.

[ 00:10:04 ] Right now there are, I don’t know, thousands, right? So today we’ve already developed our own skill sets in filtering, so we know much better, when we surf the web, how to be more precise and narrow down what we’re looking for. The same will happen in these new environments, generated either by computer graphics or, soon, with video as well. We will develop our own taste, our own filtering, where we start looking at something and say, you know what, it’s not for me, and go back and say, oh, I want something more like this, or more exciting, or more relaxing. Everybody has different needs at different times of the day. There is a moment in the day when you want to be very active, very interactive; there are moments of the day when you don’t want to be interactive, you want to be passive, just relax and enjoy a certain environment. But what is important is that now the journey of going somewhere, whether it is a real place or an imaginary environment, a very interactive or a very passive place, they’re all available, and they’re all available in a very fast way. It’s like web search: before, we could still sort out that information, it just took more time, you know, phone calls, books, and so maybe we weren’t doing it because it was taking too much time.

[ 00:11:29 ] Same here: we might not go to some places because it takes three days to travel to see them, and now instead you just click and you’re in the Amazon. So you’ll become much more efficient in feeding your own need for entertainment. And you can do that together. A lot of people are saying that VR is not social; actually, it’s very social. We can be in the same place at the same time.

[ 00:11:59 ] And the magic aspect of VR is that physical space is not a limitation anymore. In real life I can be the only one next to a person in a two-seat car; in VR, if one of the two seats is a VR camera, I can have a million people in that seat, and they can talk to each other at the same time while having the same experience. So from a social perspective, it is really enabling an interchange of information and connecting people at a level that is much broader than before. From an augmented reality point of view, we also want to extend to what we call MR, mixed reality. So there is VR, AR, MR; ultimately they’re all converging to the same spot. Either you’re talking about overlaying information on top of what you’re seeing in real life at this particular moment, or overlaying and augmenting information on something that was captured earlier but played back time-shifted, or something that is completely computer generated, completely virtual. Those are just different experiences with different ingredients, if you want, but in the end they generate an experience. Now, in terms of augmented reality, what you can start having is supernatural powers, doing things that you cannot do in real life, whereas without augmented reality you don’t have that additional layer. Now I can look at something and know the exact distance; with my vision alone I only know approximately how far it is.

[ 00:13:56 ] But now I can have the details, the exact kind of information that I couldn’t pull up on my own because I don’t have the database: your name, your address, your phone number.

[ 00:14:08 ] And not only generic information about the object I’m looking at, but specifically the information I’m looking for at that particular moment. So with augmented reality I can start, again, being more efficient in my day-to-day life.

[ 00:14:23 ] I can be shopping, and instead of spending time trying to look for the price of something, I can preset it and say I want to see only items that are five to ten dollars, and just look for those.

[ 00:14:34 ] So now I’m in the store, I’m walking through the store, but I’m very rapidly filtering for five-to-ten-dollar items.

[ 00:14:44 ] So it’s really about efficiency, and how that can enable a supernatural experience that ends up being more entertaining, or, for business use, much more efficient in delivering the end product.

[ 00:15:03 ] Really, really. I’m pretty good.

[ 00:15:06 ] So, on the technologies: we are really investing in VR technologies related to the sense of presence.

[ 00:15:22 ] And with that, we’re looking at what tools today’s industry needs in order to capture, deliver, and play back what we call a sense of presence.

[ 00:15:36 ] Today we started with video with spatial audio, with delivery of those experiences live, so people can be at a concert or sports event live, and then with playback, in order to deliver all the elements that will tell your brain you are there. This is the essence of what we call OZO Reality, because it’s not just 360 video, which is fun to look around in but doesn’t make your brain feel like it’s there. And we do believe that as soon as you start delivering the full immersive experience, people not only will remember more strongly what they experienced, but will also enjoy much, much more emotional involvement.
