Abstract
This article outlines the immersive pedagogy and structure for a second-year university course on new media and technology taught inside virtual reality (VR) using a series of metaverse platforms. The immersive pedagogy model is introduced as a conceptual framework for designing courses inside VR. The model includes a scale of immersion to bridge pedagogy and representational fidelity. The structure of the course is outlined, including commentary on lectures, user representation (avatars), field trips, and assignments like building virtual worlds and streaming gameplay.
Keywords
Immersion, Virtual Reality, Media Education, Pedagogy, Metaverse Platforms
1. Introduction
In December 2021, Zoe Weinberg penned a column in The New York Times titled “The Metaverse Is Coming, and the World Is Not Ready for It,” speculating that students may one day attend classes there. Two months earlier, the design for a second-year communication class taught entirely inside VR in the School of Communication at Simon Fraser University (SFU) was already underway (SFU, 2022). In the month following Weinberg’s prediction, a 12-week course began that included ten weeks of classes, each with a two-hour lecture and a 30- to 45-minute tutorial (both conducted inside VR), totalling over 25 hours per student inside Oculus Quest 2 headsets. The objective of the course was to provide students with an opportunity to explore VR technologies (e.g., Google Cardboard, Quest 2) and metaverse platforms as new media, as well as to critique the social imaginaries and hyperbole presented by big tech. Students watched Oculus 360 films like Traveling While Black and engaged with lectures, tutorials, and assignments inside metaverse platforms (see Figure 1).
The metaverse–or the promise of immersive digital worlds–often refers to cyberspace, entailing digital enhancements like VR, augmented reality (AR), or artificial intelligence (AI) (Anderson & Rainie, 2022; Ravenscraft, 2021). Presently, the metaverse refers to a series of disconnected platforms populated by digital avatars in three-dimensional (3D) environments that persist even when people are not ‘there’ (Kemp & Livingstone, 2006; Wood & Solomon, 2009; Lee et al., 2021). The SFU class engaged in a number of these meta-spaces, including Engage, Spatial, Horizon Worlds, Echo VR, Bait, Gun Raiders, and Rec Room.
Goldman Sachs (2018) predicts that VR will reach 15 million learners by 2025. Given the increase in affordability and accessibility of VR (Mills & Brown, 2021) alongside the number of metaversities set to launch in 2022 (Melnick, 2022), this estimate may be conservative. This paper responds to calls to explore teaching and learning through VR in higher education (e.g., Makransky & Petersen, 2021) and proposes the immersive pedagogy model (adapted from Dalgarno and Lee’s (2010) and Fowler’s (2015) work on VR pedagogy) before highlighting aspects of the course (lectures, user representation, field trips, and assignments).
2. Immersive pedagogy model
VR technologies offer promises and challenges for learning environments (Fowler, 2015; Allison, 2008). Adapting frameworks about 3D virtual learning environments (see Dalgarno & Lee, 2010; Fowler, 2015; Fowler & Mayes, 2011), I introduce the immersive pedagogy model to emphasize the technological, pedagogical, and psychological factors of teaching in VR (see Figure 2). The proposed model focuses on representational fidelity, learner interactions, and pedagogy.
Representational fidelity includes factors like spatial audio, realistic environmental displays, user representation, tactile feedback (Dalgarno & Lee, 2010), infrastructure, social VR applications, and tech hardware. Fowler’s (2015) reconstruction of Dalgarno and Lee’s (2010) framework de-emphasizes representational fidelity, which is suitable for standalone VR activities but problematic for tech-dependent courses taught inside VR, as learner interactions–identity, embodied actions, dialogue, verbal and non-verbal communication, empathy (Fowler, 2015; Dalgarno & Lee, 2010), intimacy, and affect–are encased in the operating system. A high level of representational fidelity may not equate to deeper learning, but Fowler’s (2015) rationale for minimizing its significance rests on the premise that there may be some circumstances where categories on the continuum are not relevant. A better solution, however, is to adopt a more flexible framework. The main components of the immersive pedagogy model (i.e., representational fidelity, learner interactions/outcomes, pedagogy, and the scale of immersion) remain stable, but the categories contained within the components can be modified by educators – retaining, adding, or removing elements to meet the objectives of an activity or course.
Fowler (2015, p. 416) recognizes that immersion is one way to bridge “technological, psychological, and pedagogical” learning experiences in 3D worlds, but does so through conceptualization, construction, and dialogue (formerly referred to as courseware) (Mayes & Fowler, 1999). As VR routinely intersects with immersion, interaction, and imagination (Burdea, 1996; Burdea & Coiffet, 2017), I focused on immersion and its connection to presence. The three categories of immersion include full-immersion (head-mounted displays), semi-immersion (large projection screens), and non-immersion (desktop-based VR) (Gutiérrez et al., 2008). Integrating the scale of immersion ensures accessibility and affords immersive hybridity – the ability for students to select a type of participation along the scale of immersion. The SFU class used Quest 2 head-mounted displays, Google Cardboard, large screen projection, and desktop/smartphone applications for VR. The majority of students opted to attend class inside the Quest 2 headset, but one student prone to headaches often attended lectures as an avatar from her laptop.
Higher levels of immersion lead to a higher sense of presence, providing a sensation of being in a virtual environment (Makransky et al., 2018; Cummings & Bailenson, 2016), which is often characterized by the ability of users to block out the physical world (Mills & Brown, 2021). Presence is a subjective state, rather than a feature of technology, and “is experienced on a spectrum that depends on the quality of the immersion and degree of multisensory feedback” (Mills & Brown, 2021, p. 12; see also, Light, 2019; Gleason, 2016). The immersive pedagogy model includes a scale of immersion situated in parallel to a scale of presence. Co-presence is not listed but can be achieved at all stages along the scale of presence. Virtual worlds afford a spatial presence that fosters social experiences in VR (Hackl, 2020).
Fowler (2015) applies conceptualization, construction, and dialogue as the pedagogy, but this may limit some instructors. Instead, I use common practices such as engagement, problem-posing (Freire, 2017), active learning (e.g., Bloom’s taxonomy), experiential learning (Kolb, 1984a; 1984b), high-impact educational practices (HIP) (Linder & Hayes, 2018), and presence pedagogy (Bronack et al., 2008; Fowler, 2015). The difference with immersive pedagogy is that it perpetually weaves between the scales of immersion and presence to reach learners encased in the technical system. This flexibility allows educators to choose what works best for an activity or course.
While representational fidelity weighs heavily in the immersive pedagogy model, I want to explicitly note that I share Fowler’s (2015, p. 420) perspective that it is “imperative to move away from research that starts with an analysis of the technology then seeks to derive learning benefits.” The difference in our approach is that I see representational fidelity and pedagogy as equal measures in a fully immersive VR classroom and use the scale of immersion as a way to bridge these concepts. Representational fidelity and pedagogy were at the forefront of the design process, not one in front of the other, but as side-by-side considerations.
3. Teaching new media in the metaverse
The immersive pedagogy model provided a conceptual foundation for the course. The next section outlines the structure of the course, with an overview of the lectures, user representation, field trips, and main assignments.
3.1 Lectures inside virtual reality
The VR course operated similarly to other university classrooms, including lectures, seminar-style discussions, group activities, board work, and individual presentations. The only functional difference was that our class happened inside Quest 2 (128GB) headsets and students were represented as self-designed avatars. Equipped with accurate motion tracking, a powerful processor, immersive capacity, a wireless design, a 1832 x 1920 pixel resolution per eye, and a (comparably) lightweight frame (Lynch, 2022; Greenwald, 2021), the Quest 2 was a natural choice for the course. Before the first class, I configured the headsets, added our applications, and connected the devices to the campus wifi network. Students were also asked to set up accounts on Facebook and Oculus to assist with the set-up process.[1] The Quest 2 headsets were provided to the students, but students were asked to bring 3.5mm headphones to plug into the devices.
I started the course by casting my headset onto the physical lecture hall screen to show students the operating system and demonstrate the First Steps – a tutorial designed to onboard users to the controllers. After, I led the class into a public corridor on campus where students were given the opportunity to learn how to wear the head-mounted displays, use the controllers, set up guardian boundaries,[2] and complete the First Steps tutorial (see Figure 3). The public spaces were more suitable for certain activities as we could avoid the stairs in the physical lecture hall and set up larger guardian zones to facilitate movement. After students completed the tutorial, they were asked to design avatars across several applications (see section 3.2).
Initially, many students struggled to place eyeglasses inside headsets, locate buttons on the controllers, and navigate the meta-spaces. One student was concerned that her eyeglasses wouldn’t fit inside the headset, but once she figured out the placement, she wore her eyeglasses inside the headset for the remainder of the semester. Another student opted to switch to contact lenses after the first week. The majority of the students were new to VR, but those with previous gaming experience acclimated to the Quest 2 faster. Once students adjusted to the hardware, they unanimously agreed the interface was similar to many social media platforms. After the first class, one student said: “I felt a little bit self-aware at first, but the Oculus world is incredibly immersive, and I therefore had no issues adapting to my environment.” Another student said:
Getting to use… [VR] was mind blowing and so immersive. I found all of the controls to be intuitive… and the hardware wasn’t so bad… After, I was almost relieved to see that everything was still there since VR space is so immersive. Something I wasn’t expecting is that I had not actually moved while I was in VR. It felt like I had just been walking around for an hour but I had not moved a foot in any direction.
The objective of the first class was to build enough literacy to conduct lectures and tutorials inside the headset the following week. Our first lecture in VR was inside Engage – a social metaverse platform with ISO 27001 security certification, customizable representations, virtual locations, IFX (3D objects), and teleportation. Metaverse platforms were selected for their ability to afford dialogic spaces, use IFX, move between multiple locations, connect to the internet (YouTube, Canva, Emaze, Google Docs, etc.), facilitate collaborative exercises,[3] and customize self-representations. Being inside VR made it easy to teleport around virtual worlds, participate in simulated scenarios like resuscitating an infant, or use IFX objects and pens to visualize concepts in the course. The IFX tool allowed me to place objects, whiteboards, and large-scale screens in any location. As a result, our class could move between meta-spaces quickly and lectures were rarely static.

Similar to social media platforms, I had content moderation controls. As the host, I could teleport the entire class with the click of a button, which meant I did not have to wait for students to join me in subsequent spaces – a common limitation with other metaverse platforms. Engage also allowed me to restrict features for select participants. As a class, we decided to open our classroom to the public for two weeks (see Figure 4), and during that time, I gave IFX access to the students but not the public. While content moderation protocols allow the host to mute people, ‘lock’ people in their seats, or permanently remove individuals, I did not need to use these features in our classroom. As Engage was also available through desktop and mobile/tablet applications, it was easy for students to select their level of immersive participation. One day, a student who typically preferred attending class from inside the headset came to class with a headache and opted to participate from her laptop. Non-immersive desktop participants can still move, interact, and talk with the class, but experience a lower degree of presence.
During the first lecture, students experimented with IFX in the Hub Space, placing 3D objects like tractors, dinosaurs, and furniture around the virtual classroom. One student said: “I wasn’t familiar with the Oculus handle features so my movements were quite weird; however, when I got into Engage it was so much fun… I did not expect the engagement between different people to seem so real.” Students enjoyed the haptic nature of clapping, shaking hands, and fist-bumping classmates to activate the tactile feedback that pulsates through the controllers. After a lecture in the virtual lecture hall, a student said, “I was impressed a lot by how a lecture hall in the Oculus would be so real.” Another student said, “My classmates avatars felt very similar to how they are in reality due to the body language and hand gestures, even the turning of the heads reflected how they would be in reality.” Figure 5 highlights some class images across several meta-spaces.
3.2 Avatars: User representation and identity
After creating avatars across a number of metaverse platforms, students learned how to take selfies and pictures from inside the headset and upload those images to OneDrive so they could access them outside the head-mounted display. For a low-stakes project at the start of the semester, students were asked to take selfies of self-constructed avatars across different applications and write a commentary to reflect on representation choices using identity theories. One student found using the tablet on their wrist frustrating at first: “Towards the end, [I] got the hang of how to take pictures and upload the pictures, and started feeling more comfortable.”
Every student–without prompting–created user representations that matched their physical identity. Students wanted to accurately represent their race, sex, style, hair, body shapes, and so forth. A student designed an avatar in Engage to represent herself as a female, but observed, “Since I see myself as a woman, this is not an issue; however, these options posit models of femininity and masculinity, and further compound models of representation for non-binary or gender nonconforming individuals.” The majority of students found the avatar’s skin tone and race difficult to replicate. Another student observed: “For both apps, I had selected black hair, because that is my natural hair color. I was slightly discouraged that neither app had the option to select a multicolored hairstyle because my hair is partially dyed.” A major limitation across metaverse platforms was the inability of students to create avatars the way they wished to be represented, but this didn’t prevent students from forming strong attachments to their avatars. An interesting consequence of this assignment was the subsequent selfie culture that emerged in the class throughout the semester. One student described taking avatar selfies as “strangely satisfying.”
For the majority of the semester, students attended class as cartoon-like avatars. Our class spent so much time inside VR that several students only recognized their classmates as avatars during the first few weeks of class. During one of the two public classes at the end of the semester, we uploaded profile images to Engage to place our actual faces on our avatars – what Mark Zuckerberg calls a professional avatar (see Figure 6). The day we wore our real faces, a number of people joined our class, including a university student from Latvia who was learning how to use the platform for an upcoming course. The students enjoyed speaking with the visitor (we’ll call her Maja) about their own experience taking a class in VR. Later, a student said, “Hearing Maja’s story about studying technology innovation in Latvia made me realize how insane it was that we were interacting with a person across the world.” After chatting with Maja, the students continued delivering presentations on misinformation in the virtual lecture hall. Students described presenting material in VR as “bizarre,” “exciting,” and nerve-wracking. One student said: “It still amazes me that we are able to upload slides from Canva into Engage.” Some students found the experience of presenting in VR “less stressful” because there was less non-verbal feedback, while others found it “hard” because they couldn’t see people’s emotions and the avatars weren’t facially expressive. In a face-to-face talking circle at the end of class, one student remarked about how the generation of faces through AI synthesis created a “more adequate (and somewhat uncanny) element” of the self, contributing to a “feeling of vulnerability.” The SFU class agreed that they would prefer to use their real face on avatars with family, friends, classmates, or co-workers, but would prefer a cartoon-based avatar when interacting with strangers.
3.3 Field trips in virtual worlds
A common feature of the course was field trips to virtual worlds. In a lecture on digital creativity, I started the class in an executive-style meeting room. The students learned how to use a 3D pen before I assigned a group task to complete and present using the virtual whiteboards. Next, I moved students into the Sky Hotel where they drew 3D pictures, picked up their drawings as objects, and assembled a class collage. Students enjoyed the ability to write in the air and then move their writing or drawings around the meta-spaces. During that class, we moved quickly between meta-spaces, ranging from an art gallery to the Oval Office and a theatre. “I felt like we were traveling across many different places in the world,” commented one student. Another said, “I was interested in the gallery room the most because that room was very real: it simulates precisely what a real gallery exhibition look[s] like in real-life.” The entire class walked through the gallery at the same time, viewing the same posters and art; however, the text on the posters was customized, which allowed some of the class to view the text in English while others viewed it in Mandarin.
Later in the class, we went to a Sound Stage theatre where the class engaged in group improv exercises using IFX objects. During the improv session, students appeared, from outside the headsets, to be holding invisible objects while talking loudly, which attracted the attention of people passing by. Towards the end of the lecture, students went to the Pompeii Great Theatre, where they were asked to create a collective story using IFX (see Figure 7). Students used the 3D pens to draw fire and flames, set off fireworks, and place mammals and statues. The lecture not only demonstrated the potential of VR for field trips in education but also the embodied movement from storytelling to storyliving (Tarantini, 2020).
3.4 Assignments in VR
The course included a major assignment for each of the three modules: building virtual worlds, streaming gameplay from VR to social media, and a research project about learning in VR through metaverse platforms.
3.4.1 Building virtual worlds
In preparation for the first assignment on building virtual worlds, students decorated rooms, built part of an amusement park, and experimented with building tools across a variety of applications. One student said: “Being able to decorate a space… was interesting” and similar to “decorating a real space minus the price tag. My creative taste can transfer from reality to VR and the more I am inside VR, the more it feels real and natural.” Additional work blocks were set aside for students to use headsets to build their virtual worlds. One student visited VR worlds in Rec Room, Engage, Spatial, and Horizon Worlds, and built a virtual world in Horizon Worlds (see Figure 8). These exercises led to rich exchanges about the digital economy, real estate in meta-spaces, cryptocurrency, blockchain, non-fungible tokens, and the future of work in virtual worlds. A drawback to building with IFX was that when too many 3D objects were in use at the same time, Engage would often crash or our headsets would disconnect from SFU’s wireless network. These disruptions offered an opportunity to discuss the limitations of VR. One student noted: “As Nicole said… this was an experiment and there is no right way to learn. I feel like I’ve learned a lot from the obstacles and affordances presented by this technology.”
3.4.2 Playing (and streaming) games in virtual reality
For the second assignment, students were asked to stream games from inside the Quest 2 headset to Oculus, Twitch, and YouTube. Before we started, students had one hour to learn how to play games like Echo VR, Bait, and Gun Raiders. Students were asked to play one single-player and one multi-player game using the headset, casting their gameplay through the Oculus desktop, live streaming to Twitch, and recording a copy through Oculus to download and later upload to YouTube (see Figure 9). Afterwards, students included a link to the YouTube video in a 1200-word essay about game streaming. Students developed full, expressive body movements during the game streaming class. After playing Echo VR, one student said they felt like they were part of the game and kept putting their hands up to stop themselves from colliding with a virtual wall: “After getting out of the Oculus headset today, I can say that this was my favorite experience. I felt fully immersed.”
3.4.3 Learning how to conduct research
At the start of the semester, students began making reflective notes about their own experiences as well as their observations of their classmates’ external movements in a class-wide Google Sheet. The dataset was then used at the end of the semester for the students to write a 2500- to 3000-word research paper. The process allowed the second-year students to learn how to design a codebook, collect data, examine data, and write an essay using the findings. As part of this project, students also created drawings to describe their peers’ body motions. Before the assignment was due, students had the opportunity to work in groups to determine the findings. The advantage of the class-wide project was that it gave students a chance to reflect on their experiences throughout the semester – individually and as a class.
4. Conclusion
VR offers both possibilities and limitations in higher education. While there are certainly concerns around login procedures, privacy policies, user data collection, stable wifi connections, spatial audio, cost, and accessibility, the SFU class illustrates the potential of VR for experiential and immersive learning. I propose the immersive pedagogy model as a way to consider the factors that weigh into a VR class. As the case study shows, immersion is a central feature of VR learning and a useful bridge between representational fidelity and pedagogy. The immersive nature of the class was appealing to all of the students. One student reflected on their time in the course, saying:
My experience of learning inside the metaverse was very immersive, engaged, and interactive. I really connected with the course material and the lectures really resonated with me.
I was blown away by the potential of VR and was inspired in numerous ways because of this course. I really loved the lens we used to study new media.
This is one of the first times that I have felt like a class was meant for me in my entire academic career. Learning inside the metaverse has been a blessing.
The most obvious implication of this work is the ability to transfer this type of course design to online learning. Unlike previous separations between face-to-face and online learning, VR enables both to occur at the same time, reshaping the spatial configurations around learning. Inhabiting a virtual body, experiencing and observing a wide range of body language, connecting through high-fives or shared items like markers, creating 3D objects or stories together, walking around, clinking coffee mugs at a cafe during breaks, and visiting an unlimited number of virtual worlds are just a few of the advantages of learning and teaching with VR. Students consistently remarked on how amazing it would have been to have learned inside VR when all of their classes were online during the pandemic. The course highlights the need for more research on VR in education, especially around user representation, VR collaboration, and this new spatial configuration of learning modalities.
5. References
Allison, J. (2008). History educators and the challenge of immersive pasts: A critical review of virtual reality ‘tools’ and history pedagogy. Learning, Media and Technology, 33(4), 343–352. https://doi.org/10.1080/17439880802497099
Anderson, J. & Rainie, L. (February 7, 2022). Visions of the Internet in 2035. Pew Research Center. https://www.pewresearch.org/internet/2022/02/07/visions-of-the-internet-in-2035/
Bronack, S., Sanders, R., Cheney, A., Riedl, R., Tashner, J., & Matzen, N. (2008). Presence Pedagogy: Teaching and Learning in 3D Virtual Immersive Worlds. International Journal of Teaching and Learning in Higher Education, 20(1), 59–69.
Burdea, G. (1996). Force and touch feedback for Virtual Reality. John Wiley & Sons.
Burdea, G. C., & Coiffet, P. (2017). Virtual reality technology. Wiley-IEEE Press.
Cummings, J. J., & Bailenson, J. N. (2016). How Immersive Is Enough? A Meta-Analysis of the Effect of Immersive Technology on User Presence. Media Psychology, 19(2), 272–309. https://doi.org/10.1080/15213269.2015.1015740
Dalgarno, B. & Lee, M. (2010). What are the learning affordances of 3-D virtual environments? British Journal of Educational Technology, 41, 10–32.
Fowler, C. (2015). Virtual reality and learning: Where is the pedagogy? British Journal of Educational Technology, 46(2), 412–422. https://doi.org/10.1111/bjet.12135
Fowler, C. J. H., & Mayes, J. T. (2011). Learning relationships from theory to design. Research in Learning Technology, 7(3). https://doi.org/10.3402/rlt.v7i3.11554
Freire, P. (2017). Pedagogy of the Oppressed. Penguin Classics.
Gleason, S. P. (2016). Technology and the not-so-stable body: ‘Being there’ in the cyborg’s dilemma. Journal For Virtual Worlds Research, 9(2), 1–15.
Greenwald, W. (August 12, 2021). Oculus Quest 2 Review. PC Magazine. https://www.pcmag.com/reviews/oculus-quest-2
Goldman Sachs. (2018). Profiles in innovation report. http://www.goldmansachs.com/our-thinking/pages/technology-driving-innovation-folder/virtual-and-augmented-reality/report.pdf
Gutiérrez, M. A., Vexo, F., & Thalmann, D. (2008). Stepping into Virtual Reality. Springer London, Limited. https://doi.org/10.1007/978-1-84800-117-6
Hackl, C. (August 30, 2020). Social VR, Facebook Horizon and the Future of Social Media. Forbes. https://www.forbes.com/sites/cathyhackl/2020/08/30/social-vr-facebook-horizon–the-future-of-social-media-marketing/?sh=6dc9fce05b19
Kemp, J., & Livingstone, D. (2006, August). Putting a Second Life “metaverse” skin on learning management systems. In Proceedings of the Second Life education workshop at the Second Life community convention (Vol. 20). San Francisco, CA: The University of Paisley.
Kolb, D. A. (1984a). Experiential learning: Experience as the source of learning and development. Pearson FT Press.
Kolb, D. A., Rubin, I. M., & McIntyre, J. M. (1984b). Organizational psychology: Readings on human behavior in organizations (4th ed.). Prentice-Hall.
Lee, H. W., Kim, S., & Uhm, J. P. (2021). Social Virtual Reality (VR) Involvement Affects Depression When Social Connectedness and Self-Esteem Are Low: A Moderated Mediation on Well-Being. Frontiers in Psychology, 12, 753019. https://doi.org/10.3389/fpsyg.2021.753019
Light, E. (2019). Playing in cyberspace: The social performative on Heidelberg Street. Critical Studies in Media Communication, 36(3), 207–220. https://doi.org/10.1080/15295036.2019.1583349
Linder, K. E., & Hayes, C. M. (2018). High-impact practices in online education: Research and best practices. Stylus Publishing.
Lynch, G. (April 22, 2022). Oculus Quest 2 review. TechRadar. https://www.techradar.com/reviews/oculus-quest-2-review
Makransky, G. & Petersen, G. B. (2021). The Cognitive Affective Model of Immersive Learning (CAMIL): a Theoretical Research-Based Model of Learning in Immersive Virtual Reality. Educational Psychology Review, 33(3), 937–958. https://doi.org/10.1007/s10648-020-09586-2
Makransky, G., Wismer, P., & Mayer, R. E. (2019). A gender matching effect in learning with pedagogical agents in an immersive virtual reality science simulation. Journal of Computer Assisted Learning, 35(3), 349–358. https://doi.org/10.1111/jcal.12335
Mayes, J. T. & Fowler, C. J. H. (1999). Learning technology and usability: a framework for understanding courseware. Interacting with Computers, 11, 485-497. https://doi.org/10.1016/S0953-5438(98)00065-4
Melnick, K. (April 7, 2022). 10 “Metaversities” Are Opening Across the US This Fall. VR Scout. https://vrscout.com/news/10-metaversities-are-opening-across-the-us-this-fall/
Meta (2022). About. https://about.facebook.com/meta/
Mills, K. A., & Brown, A. (2021). Immersive virtual reality (VR) for digital media making: Transmediation is key. Learning, Media and Technology. https://doi.org/10.1080/17439884.2021.1952428
Ravenscraft, E. (April 25, 2022). What Is the Metaverse, Exactly? Wired. https://www.wired.com/story/what-is-the-metaverse/
Simon Fraser University (SFU) (2022). The Medium is the Metaverse: Studying New Media in Virtual Reality. School of Communication: News and Community. http://www.sfu.ca/communication/news-and-community/researchnews/the-medium-is-the-metaverse–studying-new-media-in-virtual-reali.html
Tarantini, E. (2020). Immersive Learning for Teacher Education. In CELDA (7th International Conference on Cognition and Exploratory Learning in Digital Age) 2020 Proceedings (pp. 379–382). IADIS Press.
Weinberg, Z. (December 2, 2021). The Metaverse Is Coming, and the World Is Not Ready for It. The New York Times. https://www.nytimes.com/2021/12/02/opinion/metaverse-politics-disinformation-society.html?smid=tw-share
Wood, N. T., & Solomon, M. R. (2009). Virtual social identity and consumer behavior (1st ed.). Routledge. https://doi.org/10.4324/9781315698342
Acknowledgement
Thank you to the School of Communication at SFU for allowing me to teach inside VR and to the CMNS 253 students for an unforgettable semester. I also wish to acknowledge Song Tang for his illustration services (Figure 2) and Nesan Fertado for allowing me to include the tweet in Figure 4.
[1] A major limitation of the Quest 2 is that devices require authenticated logins and restrict the number of accounts per device. For the head-mounted displays to be used widely in higher education, Meta needs to change these policies for educators, ideally removing all login requirements for educational headsets.
[2] A “guardian boundary” is a safety feature that lets Quest 2 users define boundaries within the physical space in which the device is used; the boundaries appear in the headset display when the user physically moves too close to the edge of the defined play area.
[3] We were able to complete group activities, board work, and creative projects together, but struggled to find an application where we could co-write or design presentations easily within meta-spaces. Recent application updates are resolving some of these challenges.