I must admit that I have been closely following developments in generative artificial intelligence. So much is changing daily in the way we teach, research, study and work that many of us find it hard to keep fully up to date. And while we have been watching and weighing the implications of generative AI, other technologies have progressed as well.
Digital cloning has advanced, and the metaverse has taken enormous leaps forward. I was most impressed by a couple of adventuresome reporters who produced videos that give us a vision of where we stand with these technologies.
First, consider digital cloning. We are now in the era of deepfakes. As Ian Sample describes them in The Guardian, “The 21st century’s answer to Photoshopping, deepfakes use a form of artificial intelligence called deep learning to make images of fake events, hence the name deepfake.” They are amazingly realistic and, of course, include audio with the video component.
I was particularly taken by a video produced by Joanna Stern of The Wall Street Journal, who tested digital cloning of herself. The result, titled “I Challenged My AI Clone to Replace Me for 24 Hours,” is a short but revealing YouTube video of how she created the clone and tested it in real-life applications: an online interview with an industry leader and a phone conversation with her sister. The clone passed in both cases; neither the industry leader nor her sister could immediately tell it was not the “real” Joanna but rather a digitized version. A phone call to her bank about her account was not challenged by the bank’s voice-recognition software. However, a Google Meet gathering with friends and an attempted TikTok video both met with immediate skepticism.
Now, things have gone much further than mere digital clones. It was just one year ago that Mark Zuckerberg announced and demoed a version of Meta’s metaverse. It was met with less-than-flattering reviews. Maggie Harrison wrote under the headline in the Byte in August 2022, “Wave of criticism hits Zuckerberg’s Metaverse for looking like crap; ‘billions and billions poured into it and this is the result.’”
“There’s no way to sugarcoat it, even if we wanted to. Facebook-turned-Meta’s Metaverse—into which Meta has been pouring billions of dollars—looks bad. Unfathomably bad. Yesterday, CEO Mark Zuckerberg took to Facebook to post what apparently was meant to be a celebratory screenshot of Meta’s Horizon Worlds, a VR game where the digital selves of folks worldwide can gather. The game was just released in France and Spain and the ruler of all metamates wanted to share the good news. As waves of critics pointed out, the design of this ‘immersive’ world is astoundingly underwhelming. In the cursed screenshot, Zuck’s pasty, robotic avatar—the design of which is perhaps a half step above a Wii Mii—is pictured in front of a sparse, sad landscape upon which arbitrarily sized replicas of France’s Eiffel Tower and Spain’s Tibidabo Cathedral are uncomfortably plopped.”
That was just one year ago. However, just two weeks ago, podcaster Lex Fridman arranged with Zuckerberg to demonstrate the latest in Meta technology, and another rather skeptical writer at the Byte covered the result. It is truly stunning.
Writing in the Byte, Frank Landymore describes the hourlong interview session that was held in the Meta Metaverse:
“On Thursday, podcaster Lex Fridman released what he calls the ‘first interview in the Metaverse,’ where he and the Meta CEO have a conversation in VR, using their astoundingly lifelike avatars. Gone are the legless, dumb-looking Mii ripoffs. Here, while Fridman and Zuckerberg sit in different rooms in different parts of the country wearing Quest Pro headsets, their in-Metaverse avatars, each a 3D portrait from the shoulders and up set against a black background, seamlessly chat back and forth while looking alarmingly like their real-life counterparts. ‘It just feels like we’re in the same room,’ Fridman said in the podcast, his avatar faithfully expressing a near deadpan. ‘This is really the most incredible thing I’ve ever seen.’ These photorealistic clones, known as Codec Avatars, have been a years-long endeavor of Zuckerberg’s.”
Fridman calls the Metaverse “incredible” and suggests that it is the future of how we will communicate on the internet. He opens the podcast by saying:
“Mark and I are hundreds of miles apart from each other in physical space, but it feels like we’re in the same room because we appear to each other as photorealistic Codec Avatars in 3D with spatial audio. This technology is incredible, and I think it’s the future of how human beings connect to each other in a deeply meaningful way on the internet. These avatars can capture many of the nuances of facial expressions that we humans use to communicate and motion to each other. Now, I just need to work on upgrading my emotion expressing capabilities of the underlying human.”
With generative AI advancing alongside the metaverse, Zuckerberg says in the interview that AI assistants, too, can be embodied as humanlike avatars that join the virtual environment: “And also, by the way, another thing that I think is going to be fascinating about being able to blend together the digital and physical worlds in this way, is we’re also going to be able to embody AIs as well. So, I think you’ll also have meetings in the future where you’re basically, maybe you’re sitting there physically and then you have a couple of other people who are there as holograms and then you have Bob, the AI, who’s an engineer on your team who’s helping with things and he can now be embodied as a realistic avatar as well and just join the meeting in that way. So, I think that that’s going to be pretty compelling as well.”
In higher education, the metaverse has the potential to become the preferred platform for distance learning. It promises far more realistic interpersonal engagement than we have had before: AI agents can appear in realistic human form, and the subtler aspects of nonverbal communication come through as learners engage with one another and with their instructor. Those interactions can be more robust than ever before. Is this the beginning of the full merging of the digital and the physical in our lives? Are you and your colleagues prepared to address the questions and opportunities raised by this emerging platform?