Apple Vision Pro and the Immersive Experience Opportunity

Memorable VR experiences

AR, VR and XR have been around for years, if not decades. The most memorable VR experience I was involved with had people wearing an immersive headset whilst snorkelling in a pool, so that they felt “weightless” whilst watching an immersive video.

The second most interesting 360 video experience was a ZDF volcanic eruption, where you could watch a volcano erupt as if you were in Pompeii. You could follow the projectiles as they flew by you. You could watch the pigeons take off and fly away.

The third immersive experience of note had us sitting in wheelchairs with a VR headset on, seeing the world through the eyes of an old person in a wheelchair. We could see people “talk” to us, as if we were the main character.

Price as a Barrier to Entry

The Vision Pro headset by Apple is exorbitantly priced. A few years ago the Quest 2 headset cost about 340 CHF and could be used for gaming. The Apple headset costs 3,499 USD. Samsung's solution, in contrast, costs the price of a mobile phone plus an inexpensive headset shell.

The Affordable Options

VR headsets that are mobile friendly cost just 20 to 70 CHF, depending on the size of the phone and the quality you want. Apple's is way outside most people's acceptable price range.

Patent Monopoly

One of the most worrying things about yesterday's Vision Pro demonstration at WWDC was the mention that the headset contains over 5,000 patents. Some might think “Yippee, that's a lot of innovation”, but I see this as monopolistic and destructive: the more indiscriminately they patent, the more they prevent innovation by others. I would own a Quest 3 if Meta didn't own Oculus. I don't trust Meta with immersive experiences, and I don't like how Apple charges exorbitant fees.

Always Worn

From the demonstration of the Vision Pro headset it seems as though Apple wants us to immerse ourselves in XR for extended periods of time, to browse photos, watch videos, and work in AR. They seem to want us to edit video in augmented reality rather than on a screen. They want us to be fully immersed, to the point that when we're talking to someone the display becomes transparent, rather than having us take off the headset.

Film Watching

They demonstrated how the headset fades the background to “nice colours” or some similar kitsch. They also promoted the idea that we can make the screen for movie watching as big as we want. When I heard this I thought “IMAX VR experiences”. With 23 million pixels you could probably watch IMAX-quality content from the comfort of your airline seat, should you have such a desire.


One of the key issues I see with the Vision Pro headset is that it will isolate us from the world we are in, encouraging us to spend more and more time alone in a virtual world rather than in the real one. I see this as great for people who live alone, but awful for family, love and other aspects of life. If you're watching a film with a VR headset then you're alone, unless they build in a way to watch with others.


It's ironic that, just as Meta gave up on VR/AR/XR, Apple brings out a headset that could help get people immersed.

The Mobile Office

The Apple Vision Pro headset was marketed as a remote work tool as well as an office tool. The idea is that you organise your workspace within the immersive VR goggle experience. Video editing in VR could be interesting because, rather than buying a large screen, you simply arrange the timeline, player and programme windows as you want them. In theory you can edit 360 videos whilst immersed in VR. They didn't explore typing and writing, but I did get the impression that they want us to navigate the environment either through gestures, to grow or shrink windows, or through voice, telling Siri and the VR environment what we want to see.

Looking Forward to a Vision SE

Although Apple haven't even started to sell the Vision Pro yet, I look forward to an SE alternative. This version should be more affordable and more interesting for normal people. By normal people I mean people who would buy a fragile glass headset rather than an electric bike. I want the SE option to be affordable, maybe even friendly to existing mobile devices, as we had with the Samsung Gear solutions.

An Increased Demand for Content

The Vision Pro headset should come out at the beginning of 2024, which means that for the rest of this year we should be working on 360 videos and other immersive experiences. Now is the time to make sure that, when the headsets come out, our content is there for people to enjoy. YouTube, Netflix, Amazon Prime and others should now work towards 360 video experiences, as the porn industry already has.

The one drawback to porn on the Vision Pro is that there are plenty of cameras, and if it uses gestures, then self-gratification may not be ideal.

And Finally

With good battery life, simplified typing and good gesture controls, the Vision Pro and similar products could replace laptops and desktops, and if not laptops and desktops, then external screens. In my experience VR is a lot of fun, but it does cause fatigue, and that fatigue means we use it less than we otherwise would.

The act of putting on a VR headset, queueing up content and so on is slow and clunky. If Apple has found a way to make this easier, that will help drive adoption. The Quest 3 was already a good step forward; we have yet to see the leap made by the Vision Pro.

Filming events in 360

We have all seen events covered by photographers and camera operators but how many events have we seen covered with 360 degree videos?

A few weeks ago I filmed the Escalade, wrestling and other events with 360 cameras and it was fun. In some cases it was the opportunity to play with a new format and in other cases it was the opportunity for proof of concept.

The thing to remember about 360 videos is that you place the camera at a point in space and people can look around as if they were standing where the camera stood. Of course, rather than holding a camera, they hold a phone in their fingers or wear a headset on their head.

Turn your head instinctively and you see what you’d expect. Look up, look down, look behind you. You are there in full control. You will see a group of runners run towards you and when they pass you can follow them with your gaze.

It's not just that you follow them with your gaze. In some cases you're right by the action. When I filmed wrestling I got the camera right by the action rather than on the pylon holding up the ropes. You can look up as one wrestler jumps from the ropes onto the opponent below.

Of course I still need to watch this footage and see how effective my experiment was. I suspect it worked well. When I get back to the edit suite I will be able to experiment.

A Continued Interest in 360 Videos

A few weeks ago I was at the Geneva International Film Festival as a volunteer, so I got to try new 360 experiences. Some of these were interesting because we could move around in space, whilst others were interesting because of their length.

A few years ago people thought that 15-20 minutes was long for this content, but during the GIFF it became apparent that content now runs to 30 minutes or more, so another wall has come down. People can now stay in immersive VR goggle experiences for extended periods of time, and this opens quite a few new doors.

It also changed how I see 360 videos. If a film festival can present half-hour immersive video experiences to people coming off the street, then the quality of the experience has progressed. It means that 360 video content, and the people who want to view it, are jumping into the mainstream.

Yesterday I was at a film rehearsal and brought an old 360 camera to film. You can see the director directing the actors, the lights and more, as an immersive experience. This morning I watched it back to see how it felt, and it was good.

Today I experimented with a 5.7K 360 video device and the image quality is much nicer than with previous devices I used. I feel that we are finally getting to see interesting image quality.

One of the reasons I had cooled on 360 videos is that I had tried a few options and been filled with hope, only to have it dashed at playback in VR goggles. Today this issue is resolved: cameras provide the quality that we want and expect.

Sea of Tranquility – Snorkeling VR by Pierre Friquet

During the World XR Forum this year in Crans-Montana I helped Pierre Friquet with his Sea of Tranquility VR experience. This VR experience was unique in that it required you to be in your swimming clothes, your underwear or similar.

This was a VR experience where you went from outside, where the temperature was descending towards winter levels, into a room that felt warm when you were dressed and comfortable when you were wet.

The experience is simple. You changed from your conference clothes into your swimsuit and stepped into the pool. You stood on a ledge at a depth of just 75 centimetres and received a short briefing: “Are you familiar with snorkelling?”, “Are you familiar with VR?”. You then stepped towards the rope and a tethered flotation belt was brought to you. You placed the belt around your hips and, when you were ready, the VR headset was placed on your head, then the snorkel, and then a headlight. “When you're ready, let me know and I will start the video.”

As the video started to play, those experiencing the Sea of Tranquility leaned forward, assumed a horizontal position and floated towards the deep part of the pool. Looking down, they went from being in a room to starting their trip towards the moon.

The audio they could hear was mission audio from Apollo 11. They could turn their heads to look around quickly, but they could also turn their whole body by thrusting with their arms to the left or right. Instead of a swivel chair or a wheelchair, their support was water.

I personally tried the experience once tethered and again untethered, as did two others. The desire to dive down was strong, and two of them did.

In one case I saw that a person was uncomfortable with snorkelling, so I brought her back to the shallow part of the pool, where she enjoyed the experience. Usually this VR experience takes place in 1.2 metres of water; here people were in 2 metres of water.

The torch that people wore on their heads added to the experience. As they watched the video and moved in the water, their heads turned, and as their heads turned you could see that they were looking around. When two people were in the water at once it was quite entertaining.

When some people finished the experience they were so absorbed by the video and the VR experience that they needed to re-acclimatise to reality. In one case it took several minutes.

In several cases people tried the VR experience and then went for a swim. In one case I did several laps with another person, and that was one highlight of the event. This is the third time I have volunteered at this event, and between sleeping in a bomb shelter, helping people experience VR in water and having regular meals, this is the best World XR event I have experienced to date.

As a bonus, some of us got to try two other VR experiences. One of them was floating through the ISS, but the one I really liked was the scuba diving demo. In this VR video you descend a slope and, as you look right and left, you see fish, sharks, a wreck and more. At the end of the slope you reach a wall. You float over the wall and then you're in open water with three whales. As a diver, this sensation was familiar, as was looking around like this.

I really like snorkelling VR experiences and I would love to experience one with scuba equipment. Imagine being underwater, rather than floating on the surface. Imagine being able to dive down and experience the depth, as well as weightlessness.

The Insta360 Nano and Air – A climbing test

The Insta360 Nano and Air are two affordable cameras. The first is designed to fit the current iPhone and also works as a stand-alone device. The Insta360 Air works only when plugged into an Android device. Both are good for specific uses.

Insta360 Air

The Insta360 Air requests a firmware update the first time you use it. This takes a few minutes, and then the device uses the phone's gyroscopes to keep the image stable. On the Via Ferrata I climbed this weekend I used the Insta360 Air and an Xperia Z5 Compact to take one or two landscape pictures. In these images you can look up at the cliff, across at the landscape or down at how far from the ground I am. This is a nice way of giving people a feel for what it is like to practice Via Ferrata. To use this system it is good to have both hands free.

[vrview img=”” width=”500″ height=”500″ ]

Use the mouse/trackpad to rotate the image

Insta360 Nano


The Insta360 Nano is great because it has an SD card slot and can be used as a stand-alone device. With the tripod mount and a selfie stick, interesting images and video are possible. I tested it on a Tyrolean crossing: you attach your pulley to a cable and swing across over a waterfall. With a 360 camera you can look anywhere you like. Image stabilisation is essential for Tyrolean crossings: when you transition from standing on firm ground to swinging across you move a lot, and image stabilisation smooths this out.

Post production

Post production with the Nano is quick and easy: take the SD card, read it with your laptop and share. With the Insta360 Air you're using the phone's microSD card. You can batch edit and export to the Insta360 community sites. I would like to be able to bulk export directly to Google Photos from an Android device.


For the price of a Ricoh Theta S you can have both 360 cameras. The Nano is ideal for monopod use and the Air is ideal for web streaming, once you find the right phone mount for a professional monopod. With image stabilisation the image stays steady wherever the person with a VR headset looks. Without it, Nano footage would give people motion sickness.


A 360° Cooking Show would be interesting to watch.

For a few weeks now I have been thinking about how you could make a 360° cooking show. I would like to see the process from an angle where I can watch the person cooking. I would also like to see all of the ingredients and the preparation at various stages of the recipe. For this you would need an open-plan kitchen where preparation takes place in the middle of the room.

If it were filmed with just one camera, the camera should be lower than eye level, but not by much. I would want to look straight ahead into the eyes of the person preparing the meal and talking. As the person speaks about ingredients, I would like to be able to look down and see all of them. You would need enough room for chopping and marinating. I would also think about a camera above the cooking surfaces so that we can see how the textures change from the start of cooking to when the mix becomes “saucy”, as recipes describe it.

I want the 360 approach to be justified. I want to make it so that the viewer has a reason to wear VR goggles. The video above justifies a 360° video whereas the video below does not. The camera needs to be placed as close to the action as possible. It should be designed so that the viewer has to turn his head to see what is happening.

This video does not justify the 360 approach because the action happens in front of the camera and although you can look around the background is not relevant. The action is too small on screen. You can’t see what they are doing.

The camera operator and the producer should learn the recipe and identify the stages of preparation. They should have a shot list of everything that is relevant and place the camera accordingly. Is an oven or a grill used? Is a sauce prepared? Is something timelapse friendly?

A few years ago we were up in a Swiss chalet with a friend and we decided to cook a pizza over a log fire rather than in an oven. We started the log fire and placed the pizza next to it. As the fire was only on one side, we rotated the pizza to cook it fully. Watching the playback, you see the pizza cook and the logs go from being logs to embers, “melting” downwards as they burn. If the camera can safely be placed close enough to the fire, you could look down at the pizza as the ingredients change appearance and up to see the logs decrease in size.

I believe that 360° cooking shows have a future. In theory you need just one well-placed camera, and editing is cut to a minimum: instead of editing and vision mixing two or three cameras, you need just one. I believe the programme should run no more than 5-10 minutes. People, for now, are still getting used to VR headsets and may find extended viewing tiring.

How long will it take before someone like Jamie Oliver uses VR cooking simulations to teach people to cook in the real world? With enough time they could emulate real cooking times at different heats, so that when you transition from VR to reality you have the right intuition to cook successfully for yourself and others.

A 360 Video of planes landing – An experiment

Waiting for planes to land

I prepared a 360 Video of planes landing at Geneva International Airport yesterday afternoon. Watching planes land, especially when you’re right underneath them just as they’re about to touch down is a lot of fun. You see the lights in the distance and slowly those lights approach. There is a point after which everything speeds up and the sound gets louder. The plane flies right over you and you see the landing gear, the engines and you see the plane correct as the wind buffets it from side to side. You see it skew and then you’re turning around very quickly and you watch it land.

With a conventional camera and tripod this is a challenge to film, because tripods are designed to pan and tilt, but not in this manner. To film landings from where I stood you would need to build a rig that tilts up, swivels when the plane is directly overhead, and then tilts back down. With a conventional tripod this takes practice to get right.

360 cameras are an interesting alternative for camera operators because rather than move the camera it is the viewer that changes the direction in which he is looking. He watches the plane approach, looks up as it flies overhead and then swivels around to see it touch down.

I filmed 18 landings. Some of the aircraft are A320s and others are private jets. I would have liked to catch one of the A340s that land at Geneva, but I was not that lucky. Viewing the footage, I notice that the underside of the aircraft is dark; in person you see a lot of detail that the camera was unable to pick up. I look forward to when we can film 360 videos with full manual control of the device.


The Theta+ Video app is available

Yesterday the Theta+ Video app came out for Android. It allows you to trim video clips and then share them to social networks. This means you no longer need to wait until you get home to prepare content for sharing; you can do it while you sit and have a post-activity hot chocolate or other drink.

[caption id="attachment_3325" align="aligncenter" width="169"]Video options[/caption]

When you select the raw video it is converted into a spherical video. When that process completes you can choose between creating a 360 degree video or a cropped one; the cropped option produces a tiny-world video.

[caption id="attachment_3326" align="aligncenter" width="169"]Filters, Trim and music[/caption]

The next menu gives you three choices: filters, trim or background music. I never bother with filters, and the trim option is fiddly on the Sony Xperia Z5 Compact with a 13-minute video; with a shorter video this would be easier. Saving is not intuitive: first you trim the video, then you go back and save the changes. While saving you need to keep the app open.

The sharing options are Facebook, YouTube and other social networks, varying according to which apps you have on your phone. I like that the first two options are Facebook and YouTube, as these are the networks I usually share to. When I tried to upload to YouTube it failed twice. When I attempted to upload to Facebook it got stuck at 99 percent twice.

This is a great app for trimming videos before sharing and adding some music when required. What I would like to see in future versions is the ability to compile a number of 360 videos into edited sequences. They also need to improve uploading; at the time of writing, all my attempts to share videos had failed.

360 timelapse videos

360 timelapse videos provide us with interesting new opportunities. Imagine, for example, placing the camera out to sea near Weymouth beach and watching as the tide comes towards the camera and then past it towards the town. Imagine watching as the sun rises on one side of the Leukerbad valley and sets on the other. Imagine that BBC Natural History Unit sequence of sand dunes moving across the landscape, one day at a time, for a year.

Timelapse with the Ricoh Theta S

Two days ago I was tempted to try a timelapse video with the Ricoh Theta S. My plan had been to take the camera up to La Barillette and film a timelapse. From this vantage point you can see the whole of Lac Léman, from Geneva to Villeneuve on a good day. With a weather system like the one we have at the moment, you can watch clouds form and dissipate, and see the shadows they cast. With a standard timelapse camera you would see just a small part of the scene. With a 360 timelapse you could look out towards the Alps or around at the cars and hikers. You could look up at the mast and more.

I say that you could do this because there are high winds up there, and you need a heavy tripod to keep the camera from falling and breaking one of its lenses. You also need to find something to do while the camera is working.

Yesterday morning was clement, with clouds and blue sky, so I was able to try a timelapse. I set the camera to take an image every ten seconds for an unlimited amount of time. The settings give you great flexibility with timing: you can set the interval anywhere from every 8 seconds to every 60 minutes and 59 seconds.

You can preview the image either as a spherical image or as an equirectangular image. Once you are happy with the settings you can start capturing. In yesterday's test I got more than 600 images on a single battery charge with the camera set to take a picture every ten seconds.

The obvious limiting factor for timelapses with this camera is battery life. As soon as the camera is plugged into a power source it turns off and starts to charge, so charging and taking pictures at the same time is not possible. There is also the minor issue of the USB charge port sitting right next to the tripod screw; you would need to modify a plate to charge the camera while it is mounted. The battery lasted about 100 minutes.
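To plan a shoot around these limits, a small back-of-the-envelope calculation helps. A minimal sketch (the ten-second interval and roughly 100 minutes of battery life come from my test above; the 25 fps playback rate is an assumption, and the function name is mine):

```python
def timelapse_plan(interval_s, battery_min, playback_fps=25):
    """Estimate stills captured on one charge and the resulting clip length.

    interval_s   -- seconds between stills
    battery_min  -- battery life in minutes
    playback_fps -- frames per second of the finished timelapse (assumed)
    """
    frames = (battery_min * 60) // interval_s  # stills per charge
    clip_seconds = frames / playback_fps       # playback duration in seconds
    return frames, clip_seconds

# My test: one picture every 10 seconds, roughly 100 minutes of battery.
frames, clip = timelapse_plan(interval_s=10, battery_min=100)
print(frames, clip)  # 600 stills -> a 24-second clip at 25 fps
```

Working backwards from the clip length you want is just as easy: a one-minute clip at 25 fps needs 1,500 stills, which at ten-second intervals is more than this battery can deliver.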

Post production

With the Ricoh Theta S and Final Cut Pro X, post production is efficient. You are dealing with images at a resolution of 5,376 × 2,688 pixels, which is beyond UHD. You can import the image sequences from your timelapses straight into FCP X, cutting out the need for other apps. Once the images are imported into your event, you can open a new project at full resolution. I added the image sequence to the timeline, created a compound clip and then used the speed tool to adjust the duration.

I still need to research how to export the edit at full resolution. As I was given an error message, I exported the video at 1920×960 instead, which worked flawlessly. I used the Spatial Media Metadata Injector to add the necessary metadata and then uploaded the injected video to YouTube.
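Whatever resolution you export at, an equirectangular frame should stay 2:1 (width exactly twice the height), or the image will be distorted when it is wrapped back onto a sphere. A quick sanity check, as a sketch (the helper name is mine):

```python
def equirect_ok(width, height):
    """Return True when a frame keeps the 2:1 equirectangular aspect ratio."""
    return width == 2 * height

# The Theta S stills and my downscaled export both keep the 2:1 ratio.
print(equirect_ok(5376, 2688))  # True  (native still resolution)
print(equirect_ok(1920, 960))   # True  (my fallback export)
print(equirect_ok(1920, 1080))  # False (16:9 would distort the sphere)
```

This is why 1920×960 was a safe fallback: it halves both dimensions of a 2:1 frame, so the projection survives the downscale.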

I look forward to finding ideas and projects that take advantage of what 360 timelapse videos have to offer. I feel they provide us with an opportunity to better understand how time and light evolve in a spherical environment.