My Vision Of The Metaverse — a big gulp of info directly into the brain
For someone who has been thinking about the Metaverse/Cyberspace/Matrix/Oasis for two decades, it’s about time it became hyped! As you know, hype cycles eventually lead to disillusionment, and then finally to normal adoption. I guess we are finally inside the hype cycle. So let’s get serious: the Internet with a capital I is a mass of globally interconnected computers sharing common protocols, and if you are familiar with the OSI model you know that protocols build on top of each other. You don’t even have to download an app! Today’s killer app “is the browser”, because with one of those you can do almost anything, regardless of whether you are on a phone, tablet, laptop, desktop, VR/AR headset, (jokingly) smartwatch, OS, whatever. It probably has some kind of browser, which is really a generalized app for accessing and displaying data/info/etc. (Duh) Now add WebRTC, WebXR, Edge AI, and of course the cloud (another name for massive datacenters full of servers, duh again), and the browser can do things it was never initially designed for, such as controlling IoT/VR/AR/robotics/gaming/BLE/simulation and on and on, even BCI and eventually, with some implants, neural connectivity (a more informed duh).
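To make the browser-as-Metaverse-portal point concrete: the WebXR Device API really does let an ordinary web page ask whether the device it is running on can host an immersive VR session, via `navigator.xr.isSessionSupported`. Here is a minimal sketch of that check; the helper name `supportsImmersiveVR`, and passing the navigator object in as a parameter (so the check can be exercised outside a browser), are my own illustrative choices, not part of any standard.

```javascript
// Minimal WebXR feature-detection sketch.
// `nav` is the navigator-like object to inspect; in a real page you would
// pass the global `navigator`. Taking it as a parameter keeps the helper
// testable in environments that have no XR hardware at all.
async function supportsImmersiveVR(nav) {
  // navigator.xr only exists in XR-capable browsers; its absence means
  // WebXR is simply unavailable, not an error.
  if (!nav || !nav.xr) return false;
  // isSessionSupported resolves to true or false for the requested
  // session mode ('immersive-vr' here; 'immersive-ar' is the AR variant).
  return nav.xr.isSessionSupported('immersive-vr');
}

// Usage in a real page:
//   supportsImmersiveVR(navigator).then(ok => console.log('VR ready?', ok));
```

Nothing in this sketch requires an app store or an install step, which is exactly the "killer app" argument: capability detection, session setup, and rendering all happen inside the same browser sandbox that serves ordinary web pages.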
Technological telepathy. I’m writing in, sort of, not Faulkner’s style, but my new style of no paragraphs . . . information in just one big gulp. Why stop to separate ideas in complex concepts? Because it’s some silly rule? No! Just remember to breathe if reading out loud (again, joking). So the next layer of the Internet will be the “Metaverse”, integrating humans (including smart agents) more naturally into a computing layer that virtualizes human interactions and begins merging them, tightly coupled, into the machine network (a link to my profound duh, really a poem I wrote over a decade ago, now representing my epiphany). The mouse will become archaic. Clicking/selecting will still exist and will be done in many ways, eventually by thought. Humans are primarily visual, auditory, tactile, and olfactory, so look for short-term advances in those areas: UI/UX, BCI, hand gestures, and finger signs. Facial expressions become clues for AI recognition. Hopefully there won’t be too many walled gardens blocking us, and openness and free choice will still exist in some viable form. And finally, I hope AI works primarily to our advantage and doesn’t dumb us down so much that we become mindless sheep herded toward “the cliff of oblivion”. You just have to laugh and cry. If you don’t believe me, or you really want to learn more about what the Metaverse really is and how the idea got started, it’s easy: just do some googling for “metaverse wikipedia”.
I hope you enjoyed a big gulp of info. Please clap me up, be well, and thank you for reading!