Hey everyone! When I started this video I thought it would be as simple as going out on my kayak, throwing on my headset and recording some of the gameplay. I was so happy with the level of immersion I felt, though, that I had to dig a little deeper. While creating this video I was able to achieve Ready Player One levels of immersion, and I'm happy to share the results with you. I also learned a lot about the science behind virtual reality and how our brains interact with virtual environments compared to the real world. Reading all about this and talking to a couple of different scientists has gotten me incredibly excited for the future of the metaverse and all of the VR tech on its way. Thank you so much for watching this video, I hope you liked it, and for all other VR news, reviews, gameplays, discussions and more please subscribe, I would love to have you here!
Link to Suir Blueway VR (The kayaking game in this video)
https://sidequestvr.com/app/4520/suir...
Full quote from u/ForkliftMasterPsych
"This is an interesting topic since given enough correct stimuli the brain would not be able to tell the difference.
There are a few neurocognitive principles that bind this together.
First, we have the fact that perception is based on interpretations of stimuli, and this is done in a slightly backwards/looping order. The brain interprets the signals received and makes a prediction of what comes next, then receives new signals and updates the prediction. This includes interpretations of people and facial expressions, and is likely the reason for the phenomenon of the "uncanny valley", where artificial faces look real but feel a bit off. So the level of mental immersion would be based on how good the different stimuli are and how well the "real" and virtual stimuli match. Side point: it is also part of humor, since a lot of comedy is based on tricking this part of our function and surprising us. (If you want to read up on this I recommend searching for "embodied cognition" and related topics on Google Scholar and ResearchGate, and also listening to the Brain Science podcast with Ginger Campbell.)
Second, we extend our feeling of our "space", the reach and end of our functional body, with what we interact with. If you hold a coffee cup you know where it ends, so you can put it to your lips or on a table, even though it is not part of your body. The same goes for a hammer, or if you hold a chair. The thing you are interacting with is added to your field of influence. (See Michael Graziano, The Spaces Between Us: A Story of Neuroscience, Evolution, and Human Nature.) This should also be part of the learning curve for prostheses, especially modern ones that pick up nerve signals from the site where the prosthesis is attached. I don't have articles to link regarding this, though, but it should follow the same logic. This becomes extra interesting when we talk about VR and AR, as you as a developer can choose what kind of stimuli and feedback you want the users to receive. I think it might have been Dr Kevin Warwick who implanted an echolocation device in the early 2000s that gave a small neural feedback; with time it granted a "sense" of his surroundings. A less invasive alternative would be to give users different haptic devices to enhance and immerse them in VR. Perhaps a sense of the sun on your skin? But now I'm speculating. Hope this gives you some answers and a few leads to read up on."
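The predict-receive-update loop described in the quote can be sketched in a few lines of code. This is only a toy illustration, not an actual neuroscience model: the stimulus values and the learning rate are made-up assumptions, chosen just to show how prediction error shrinks as an internal model catches up with a consistent stimulus.

```python
# Toy sketch of the brain's predict -> receive -> update loop.
# All numbers here are illustrative assumptions, not measured values.

def predictive_loop(stimuli, learning_rate=0.5):
    """Track a stream of stimuli by predicting each one, measuring the
    prediction error ("surprise"), and nudging the estimate toward the input."""
    prediction = 0.0
    errors = []
    for stimulus in stimuli:
        error = stimulus - prediction          # mismatch between expectation and input
        prediction += learning_rate * error    # update the internal model
        errors.append(abs(error))
    return prediction, errors

# With a steady, consistent stimulus the prediction converges and the
# error shrinks toward zero.
final, errs = predictive_loop([1.0] * 10)
```

The intuition this is meant to mirror: when virtual stimuli are consistent with what the brain predicts, the error stays small and the experience feels seamless, while mismatched stimuli keep the error (and the "off" feeling) alive.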
#vr #virtualreality #oculus #oculusquest2 #AndysVrReviews