This reading looks at new technologies in social spaces, explaining how augmented reality is progressing to create different forms of what human reality means to us and how we interpret human senses.
It describes new advances in cloned avatars and the way they are being created to react similarly to how a person would, using individual expression. New avatars are being built so that they can store information about the way each individual responds to a situation, e.g. heart rate and temperature, to more accurately resemble a person in the 'real world'.
Through newer software created by scientists and engineers, designers and makers are gaining the ability to create and decide the way we use AR experiences in our day to day lives. This will essentially change the way we see the real world, or how we want to use and react to it. In teaching, AR technology is being designed so that the expressions you show while being taught, e.g. confusion or interest, shape how the teaching adapts to your response to the information. If you are confused, the topic will be expanded on and explained to you further.
AR is giving us new eyes. With wearable technology, the way we interpret our surroundings and the happenings around us is becoming more informed and better understood.
Prediction is an important part of the human experience; we use it in our daily lives and learn from each interaction we have. Hawkins points out that the human brain is making constant predictions from how we perceive our environment. We experience the world through patterns that we store and recall. For example, if you open a fridge door and feel a tension, you may predict that something will fall out on the other side. New technology is being created to assist in daily routines. This, however, I think would only be useful to a person with set routines in place.
Papagiannis suggests that smartphones are an extension of ourselves and that, through technology, they will learn to experience the world in a human way. The lines between virtual and real will blur even more as what we understand to be 'human' senses are attributed to devices. As we learn from these devices we will perhaps gain greater insight into our surroundings; for example, it becomes more commonly noted how much exercise is done throughout one's day when a pedometer is checked frequently. An example given in the text is a device which monitors depth of field and is used by people with no sight to feel their surroundings through vibrations.
‘White Christmas’, the Black Mirror episode, is addressed in this reading, with software being produced that can blur out unwanted images from a person's sight. This raised an interesting question from students at the University of Pennsylvania: “what if we lived in a world where consumers were blind to the excesses of corporate branding?” The students built a head-mounted display which they describe as an “adBlock for Real Life”. We already modify our lives and block people and companies from our virtual media; what else can we choose to delete from our realities? Papagiannis hopes that the advances in technology are used to support human connectivity, and even empathy. What if, she questions, we could use mediated reality to create focus and ignore the day to day distractions which stop us interacting with our loved ones? This reminded me of the tradition of the siesta in Spain, a time to take a break from the chaos and stress created in capitalist western environments.
Sensory technology is also being built to help humans communicate with each other using touch software. This application may help to translate human connection through language into felt experience: tactile information that helps people communicate with each other from a distance. Smartstones is a company based in Santa Barbara that is helping friends and loved ones to connect non-verbally; they can send and receive gestures through a pendant or wristband. I wonder how this information is received and acknowledged, and whether the code of tactile vibration is being written and learnt, or if each interaction has its own way of identifying what these vibrations mean. A very interesting idea. An app was created so that gestures could be added to a library of gestures.
This idea expands the way, and the depth at which, we can understand and relate to one another through the use of haptics, translating mood, sentiment and messages in non-verbal ways.
Augmented audio is being used to “help people cut through the often-impenetrable veneer of a place and really feel what it is about”. While listening to a storyteller tell their story you also hear the spatial environment they are in, walking through a neighbourhood or home. A VR film was played to a group of leaders at the World Economic Forum in Davos in January 2015 about a 12-year-old girl living in a Syrian refugee camp whose life would be drastically impacted by a decision being made at the event. The 360 experience and audio of her life make you experience her humanity in a much deeper way. The empathy drawn from this form of information was useful for fully understanding the implications and the situation of the environment being affected. Another example shows people the amount of waste produced in a day as part of the San Francisco Zero Waste Mission.
AR not VR? Papagiannis explains her understanding of aura and a sense of understanding of a place. This reminds me of the Māori term kaitiakitanga, the connection to and guardianship of a place through lineage and tūrangawaewae. It reminds me that to connect with and have a positive influence on a person's reality, we need to consider the meaningful links to our place, reality AND the virtual. This is explained with the term “contextualised presence”.
Papagiannis, Helen. Augmented Human : How Technology Is Shaping the New Reality, O'Reilly Media, Incorporated, 2017. ProQuest Ebook Central, https://ebookcentral.proquest.com/lib/massey/detail.action?docID=4980342.