Monday, March 4, 2024

Conscious Imprinting


COMMENTARY AND OPINION

Publications and articles involving current scientific and technological discoveries are referenced and highlighted in part below. The significance of these events is entirely dependent upon the observer.

An illustration of the design matrix is provided to bring attention to each article's focus, influence and impact upon design consciousness.

* * *

What Apple’s New Vision Pro Headset Might Do to Our Brain 

By Lauren Leffer, Scientific American Newsletter, Feb. 21, 2024. 

Apple’s 1.4-pound goggles use sensors, including a lidar scanner and a camera array, to place people into what’s been called “mixed reality.” 

Mixed reality is neither traditional VR, which completely blocks out the real world, nor AR, which presents a digital overlay on transparent lenses. Instead, a pass-through device translates a digital representation of a person’s environment (their hands and nearby objects, for instance) into a completely virtual space. 

This means the device mediates everything about a user’s experience. It’s “the dream of tech companies because you never [have to] take it off,” says sociocultural anthropologist Lisa Messeri of Yale University, who is author of an upcoming book on virtual reality, In the Land of the Unreal. “They can always have your attention. They always know where you’re looking. They always know what you’re doing.” 

“We don’t know what it means to walk around the world with reduced peripheral vision or visual distortions for hundreds of hours in a month,” says Rabindra Ratan, an associate professor of media and information at Michigan State University and a co-author of the recently published study. “This is purely speculative, but there could be effects on the way your eyes move around in space, and maybe that could make your vision worse,” Ratan suggests. “We don’t really know what that will do to our brain.” 

AR, VR and mixed-reality headsets also frequently cause “simulator sickness,” a suite of uncomfortable symptoms that include nausea, headache, dizziness and eye fatigue. Jeremy Bailenson, a professor of communication at Stanford University, Ratan and their co-authors encountered simulator sickness in the majority of their device sessions, even though the tests generally lasted less than an hour. Enduring even low levels of simulator sickness could impact people’s quality of life, activity level and productivity—which is one reason Bailenson worries that people might try to rely on these devices for their day-to-day work. 

In one 2014 experiment, Frank Steinicke, a professor of human-computer interaction at the University of Hamburg in Germany, spent 24 hours alternating between two-hour bouts of VR use and 10-minute breaks. Throughout the study, Steinicke became unsure of what was real and what wasn’t. “Several times during the experiment the participant was confused about being in the [virtual environment] or in the real world and mixed certain artifacts and events between both worlds.”

“The audio-visual display is getting better and better; therefore, I am pretty sure that virtual and real content will continue to merge,” Steinicke says. 



* * *

Wearable Tech Reads Human Emotions  

JooHyeon Heo, NeuroscienceNews.com, February 23, 2024

Summary: Researchers unveiled a pioneering technology capable of real-time human emotion recognition, promising transformative applications in wearable devices and digital services. 

The system, known as the personalized skin-integrated facial interface (PSiFI), combines verbal and non-verbal cues through a self-powered, stretchable sensor, efficiently processing data for wireless communication.

This breakthrough, supported by machine learning, accurately identifies emotions even under mask-wearing conditions and has been applied in a VR “digital concierge” scenario, showcasing its potential to personalize user experiences in smart environments. The development is a significant stride towards enhancing human-machine interactions by integrating complex emotional data.

Understanding and accurately extracting emotional information has long been a challenge due to the abstract and ambiguous nature of human affects such as emotions, moods, and feelings. 

To address this, the research team has developed a multi-modal human emotion recognition system that combines verbal and non-verbal expression data to efficiently utilize comprehensive emotional information.

Utilizing machine learning algorithms, the developed technology demonstrates accurate and real-time human emotion recognition tasks, even when individuals are wearing masks. 
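
For readers curious what such multi-modal fusion might look like in practice, below is a minimal, purely illustrative sketch in Python. It fuses a synthetic “verbal” feature vector (standing in for voice-vibration readings) with a synthetic “non-verbal” one (standing in for facial-strain readings) and trains a generic classifier. The feature names, dimensions, and model choice are assumptions made for illustration and are not taken from the PSiFI work.

# Illustrative sketch only: a toy multi-modal emotion classifier that fuses
# "verbal" (voice-vibration-like) features with "non-verbal" (facial-strain-like)
# features. All data here are random stand-ins, not real sensor readings.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

EMOTIONS = ["neutral", "happy", "sad", "angry"]  # hypothetical label set
N_SAMPLES = 400

# Synthetic stand-ins for sensor channels.
verbal_features = rng.normal(size=(N_SAMPLES, 8))      # e.g., voice-vibration statistics
nonverbal_features = rng.normal(size=(N_SAMPLES, 12))  # e.g., facial-strain channels
labels = rng.integers(0, len(EMOTIONS), size=N_SAMPLES)

# Early fusion: concatenate both modalities into one feature vector per sample.
fused = np.concatenate([verbal_features, nonverbal_features], axis=1)

X_train, X_test, y_train, y_test = train_test_split(
    fused, labels, test_size=0.25, random_state=0
)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

print("toy accuracy on random data:", accuracy_score(y_test, clf.predict(X_test)))

In an actual system the inputs would of course come from the skin-integrated sensors, and the model would be trained on labeled emotional expressions rather than random numbers; the sketch is only meant to show the basic idea of combining two streams of cues before classification.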



* * *



"To believe is to accept another's truth. 

To know is your own creation."

Anonymous


* * *

Always know what you are dealing with. What you focus upon defines your reality. 

* * *


Edited: 03.04.2024, 04.01.2024

Find your truth. Know your mind. Follow your heart. Love eternal will not be denied. Discernment is an integral part of self-mastery. You may share this post on a non-commercial basis, the author and URL to be included. Please note … posts are continually being edited. All rights reserved. Copyright © 2024 C.G. Garant. AI usage is prohibited. 







