Just off Wellington's coast, in the harbour, lies the site of the 1968 Wahine disaster, which caused 53 deaths. The Wahine was a ferry that sank, leaving many survivors washed up on both Seatoun and Eastbourne beaches. The accident was caused by atrocious weather conditions and poor decision-making both on the ship and on land.
There were 734 passengers on board the ship, as well as crew. The Cook Strait was rough and choppy, which is common Wellington weather, but the storm that day was Tropical Cyclone Giselle.
At 5.50 am on the 10th of April, the ship entered Wellington Harbour, and in winds of 100 km/h a large wave turned the vessel side-on towards Barrett Reef.
At 11.00 am the boat had been tied to a tow rig, but the line snapped in the treacherous waters and the mission was abandoned, leaving the boat to sink.
Because of the weather, the captain thought it better for the passengers to stay on board for as long as possible. Three lifeboats landed safely on the coasts of the harbour, but many passengers were forced to jump into the freezing water in their life jackets.
By 2.30 pm the Wahine had capsized in 11.6 m of water.
Around 200 survivors reached the coasts, but this was where 51 of the deaths occurred.
Memorial Park
A memorial site was made at Seatoun Beach Park to acknowledge the people who died in the Wahine disaster and those who showed bravery in the rescue. The site gives a brief explanation of the disaster and offers a place to mourn its effects. The spatial qualities of the site refer implicitly to what happened there, and its boldest physical features are the red and white foghorns that surround the memorial. The physical qualities of the memorial connect directly to the disaster: a brass anchor and chain recall the ship that lies beneath the ocean.
I went to this site, read the information, and took a moment to engage with the spatial quality of the memorial, to test what sense of empathy I could draw from the place.
The rounded shapes act as a barrier to the wind, creating a sheltered area where you can sit comfortably and watch the bay where the disaster took place. The memorial does not take much away from the natural quality of the beach, as it is built up from dirt, concrete and stone. On a grey or stormy day, which places you in conditions closer to those of the 10th of April, you could easily imagine the terror of people struggling against the waves for safety. In contrast, on a clear day with blue sky and little wind, the area could easily be overlooked as a memorial to a terrible disaster.
When I visited, the day was bright and windy. I read over the explanations of the Wahine disaster and sat for a while taking in the memorial space. As I sat there, the stone wall created a barrier to my sight, so my main focal point became the ocean ahead of me and the site of the disaster. I found this interesting, as it was my body's reaction to the architecture of the memorial.
I created this video in Premiere Pro. The video is only lightly edited, as I wanted to represent the space as it would appear in real life, without headers or obviously edited aesthetics.
Here is the final Wahine Memorial Park video:
Experience viewing in VR:
Firstly, something I found in viewing the 360 video was that the speed of motion can be quite disorientating: when the camera is moving and your head is also moving around the space, the motion feels unnatural and is difficult to watch. The scenes where the camera was still were the ones I found most successful. I also found these scenes more immersive, because they let me explore the 360 frame more comfortably and willingly. The site held more of my focus when I could explore and review what was being presented in the video. I didn't feel a strong connection to narrative in this film, because there was no context to the video beyond my prior knowledge of the memorial park. This meant my main focus was on my spatial surroundings rather than on a narrative or story being presented. This could be improved with an introduction to the film or a voice-over.
What I found frustrating about these 360 videos was that you have only one control over the spatial element of the film, moving your head around; you are essentially still being carried through the space as if looking at a 2D image. Some would argue that this is still 2D because of the way it is projected from a 2D screen. In relation to embodiment, the space did not feel immersive to me because of these factors.
In my next experiments I would like to explore more 3D spaces, and how similarly lifelike experiences can be created in 3D virtual environments. I would like to give the viewer hands, so that they can use controllers to perceive hands or similar body imagery, enabling them to feel greater agency over the environment they are in.
Things I found in the process of using the Samsung Gear 360:
- Keep video files short; uploading takes about half an hour for each 8-minute video, and editing them in Premiere is even slower.
- Find a thin base for the camera to sit on so that you cannot see a table underneath.
- Propping the camera up to eye level will help the viewer feel like they are inhabiting a human-like form in the space.
- Export all files once imported into Premiere Pro with VR capabilities enabled (export frame, enable VR, reimport frame).
To create this immersive space for communication, I filmed in 360 and downloaded the files to edit in iMovie, which was the only video-editing software I had on my laptop at the time. I then built a 360 viewing platform as an application made with Unity.
To build this, I projected the film onto a sphere and created a material that was flipped onto the inside surface of the sphere, as opposed to the exterior where a material would usually lie. This was done with shader code that I retrieved from GitHub. It meant that when you played the video, the camera sat inside the sphere onto which the video was projected. Combined with the game capabilities that Unity provides, you were able to move the camera around and look around the inside of the sphere.
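The inside-out sphere technique can be sketched as a small Unity shader. This is a minimal sketch of the kind of "flipped" shader commonly shared on GitHub, not necessarily the exact one used here; the shader name and property name are placeholders. `Cull Front` discards the sphere's outward-facing triangles so only the interior is drawn, and mirroring the horizontal texture coordinate stops the equirectangular footage appearing left-right reversed from inside:

```shaderlab
Shader "Custom/InsideOutSphere"
{
    Properties
    {
        // The 360 video frame, e.g. supplied by a Unity Video Player component
        _MainTex ("360 Video Texture", 2D) = "white" {}
    }
    SubShader
    {
        Tags { "RenderType" = "Opaque" }
        // Cull the outside faces and draw the inside ones,
        // so the video is visible from a camera placed inside the sphere
        Cull Front

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;

            struct v2f
            {
                float4 pos : SV_POSITION;
                float2 uv  : TEXCOORD0;
            };

            v2f vert (appdata_base v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                // Mirror the u coordinate so the footage is not
                // left-right reversed when viewed from the inside
                o.uv = float2(1.0 - v.texcoord.x, v.texcoord.y);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                return tex2D(_MainTex, i.uv);
            }
            ENDCG
        }
    }
}
```

Applying a material using this shader to a sphere, rendering the video clip to `_MainTex`, and placing the camera at the sphere's centre reproduces the look-around behaviour described above.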
The app I created through Xcode to present the 360 footage was a bit glitchy and was not easy to focus. This made the process not very helpful, though it did mean that you could view the app without an internet connection.
After this process proved fairly unsuccessful, I used the export compatibility with YouTube to publish the videos that I had edited in Premiere Pro.
This was more successful and less glitchy; however, because of the size of the 360 video files, the export time was very long. Overall, though, this was the simplest way to create 360 videos with an easy viewing platform as well.
When using YouTube, there is a function where you can split the screen so that each eye has its own view to focus on, which makes the video a lot clearer. You can also raise the resolution for a sharper image. Together, these two functions make 360 videos uploaded for VR much clearer and easier to watch.
I found the split-screen function to be the most valuable to the 360 video experience.