Neverending Search

Revisiting #ALATTT: Trend #1: AR/VR/MR and a touch of AI



Back at ALA Midwinter, I was honored to present on LITA’s Top Tech Trends panel, with fellow panelists Kate Tkacik, Suzanne Wulf, Becky Yoose, James Neal and Cynthia Dudenhoffer. Here’s the slide deck.

Each of us was initially assigned two trends to explore and discuss. I thought I’d break my own two(ish) trends and resources into two posts.

But before I share that, we were able to solve a mystery that has been plaguing librarians for years. How do you pronounce LibGuides?

Here’s the definitive answer in the form of a tweet from Springshare itself:


Now, back to the session. The first of my shares was on AR/VR/MR with a sprinkling of AI.  (Next time around, I’ll share my notes on OER.)

Trend 1: Augmented Reality, Virtual Reality and Mixed Reality (And a little bonus on AI)

We are seeing movement way beyond hype. Companies are investing in immersive technologies. Why? And what’s in it for us? What is our role in creating, curating, sharing, teaching with, engaging with and making immersive experiences available to our communities?

Having been on AASL’s Best Apps for Teaching and Learning Committee for a couple of years now, it’s hard to ignore how immersive learning technologies have taken hold as opportunities for exploration beyond our walls, for new types of experiences beyond our textbooks, for enabling connection, for fostering empathy, for enabling previously unimaginable face-to-face collaborative experiences, and for moving opportunities to create well beyond the 2D experience.

It’s hard to imagine not fully realizing the potential of our mobile devices as learning tools.

What are immersive technologies?

  • AR (augmented reality): enhances what we see in the real world by layering 2D or 3D digital objects over it. (Pokémon Go lets you catch monsters in your neighborhood.) AR often uses a trigger image or target to activate the digital layers.
  • VR (virtual reality): we no longer see the real world. Instead, we are completely immersed in a 360-degree digital environment, experienced through sensory stimuli (such as sights and sounds) provided by a computer, and in which our actions partially determine what happens in the environment.
  • MR (mixed reality): digital objects merge with real-world objects to produce new environments. We use our natural gestures to interact with the digital/media content. MR combines digital interactions with experience of the real world and allows us to collaborate.
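To make the trigger-image idea concrete, here is a toy Python sketch of what happens after a marker-based AR app recognizes a target: a recognized marker ID activates the matching digital layer. The marker IDs and overlay assets are hypothetical, and the computer-vision step that would actually recognize the image is faked; this is only the activation logic, not a real AR implementation.

```python
# Toy sketch of the "trigger image" mechanism in marker-based AR.
# A real app's computer-vision layer would produce a marker ID from
# the camera feed; here we assume that step and show only the lookup
# that activates the matching digital overlay.

# Hypothetical mapping from recognized trigger images to digital layers.
OVERLAYS = {
    "library_shelf_poster": {"type": "video", "asset": "book_talk.mp4"},
    "rosetta_stone_card": {"type": "3d_model", "asset": "rosetta.obj"},
}

def activate_layer(detected_marker_id):
    """Return the digital layer to render over the camera view, if any."""
    overlay = OVERLAYS.get(detected_marker_id)
    if overlay is None:
        return None  # no trigger recognized: show the plain camera feed
    return f"render {overlay['type']} '{overlay['asset']}' over live view"

print(activate_layer("library_shelf_poster"))
print(activate_layer("unknown_image"))
```

The key design point is that the trigger image carries no content itself; it is just a key into the app's library of digital layers.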

In terms of young people, immersive realities have the potential to address the characteristics of interactivity, connectivity, and access identified in Eliza Dresang’s Radical Change Theory as critical in enhancing agency among digital youth.

Imagine the possibilities for

  • creating and participating in immersive scavenger hunt orientations
  • enhancing everyday learning experiences–media-rich annotations of historic places, libraries (shelves, books?), museum environments, individual works of art or architecture
  • engaging with new types of story (See further options on AASL Best Apps and Best Websites)
    • Google Spotlight Stories “Google Spotlight Stories means storytelling for VR. We are artists and technologists making immersive stories for mobile 360, mobile VR and room-scale VR headsets, and building the innovative tech that makes it possible. Enjoy the experience – look, listen, explore – and never worry that you’ll miss anything.” (AASL Best Apps for Teaching and Learning 2018)
  • immersive journalism: sharing new forms of story and journalism with our patrons and students
  • experiencing what it was/is like to live in another time, another place by engaging in immersive virtual field trips–a walk through the solar system, the experience of a tornado, a tour around a refugee camp, a trip back to ancient Rome
  • creating experiences that promote empathy:

“. . .spend days in the life of someone who can no longer afford a home. Interact with your environment to attempt to save your home and to protect yourself and your belongings as you walk in another’s shoes and face the adversity of living with diminishing resources.”

“Experiences are what define us as humans, so it’s not surprising that an intense experience in VR is more impactful than imagining something,” Jeremy Bailenson, co-author of the Stanford empathy study (Herrera et al., 2018)

  • building AR/VR: students can create their own 360-degree, interactive, immersive stories with augmented and virtual reality creation tools. We can inspire our communities to use affordable emerging technologies to create communication products and stories using AR/VR/MR
    • Metaverse – AR Browser “Metaverse is the easiest way to create Augmented Reality experiences. Create mobile games and choose your own adventure interactive stories using the Metaverse Studio and watch them come to life in the Metaverse app browser. Learners can create all kinds of interactive experiences, including games, scavenger hunts, memes, and other educational experiences.” (AASL Best Apps for Teaching and Learning 2018)
    • CoSpaces: Build and navigate 3D and 360-degree virtual reality worlds, either replicating actual places or creating imaginary ones, using objects and backgrounds from CoSpaces’s library or importing your own. CoSpaces also allows users to animate and code creations with a simple visual programming language. Use this app to create simulations, tours, games, and more. All CoSpaces scenes can be experienced in 3D on mobile devices with VR headsets. Grades 2 and up.
  • offering virtual science laboratories for dissection, examination of organs, systems, etc.–an opportunity to fully explore, rotate and dissect a 3D model of the human body (Human Anatomy Atlas 2018)
  • previewing a college campus or vacation venue
  • practicing emergency preparedness routines, for instance, weather crises, active shooter drills
  • test-driving a (new) car
  • constructing furniture and engaging in DIY projects (ala Ikea’s furniture building instructions)
  • translating written language and conversations in real life and in web-based conversations, allowing us to communicate more effectively with new ELL students and their parents and immigrant populations in our communities 
    • Google Translate app (now incorporates Word Lens optical character recognition to translate signs; this one is both AR and AI)
      • text translation: translates between 103 languages by typing
      • offline: translates 59 languages with no internet connection
      • instant camera translation: translates text in images instantly by just pointing your camera (38 languages)
      • photos: takes or imports photos for higher-quality translations (50 languages)
      • conversations: translates bilingual conversations on the fly (32 languages)

Among the tools:

  • Google Expeditions: “immersive education app that allows teachers and students to explore the world through over 1000 virtual-reality (VR) and 100 augmented-reality (AR) tours. You can swim with sharks, visit outer space, and more without leaving the classroom.” See the list of available Expeditions.
  • Google Tour Creator: Create your own tours of your school, library, or community, using imagery from Google Street View or 360 photos, then publish them on Google’s Poly platform–a growing library of free VR and AR objects. Tours may serve as a research product, a reflection following a field trip, or creative writing.
  • Metaverse: allows educators to create AR and VR activities using a storyboard, scenes and blocks.
  • Storyfab: Create short, narrated augmented reality films with virtual actors, scenes and special effects
  • ARKit: design 3D worlds in your own space, kinda like the new diorama. Save, upload, and showcase what you create.
  • Google Street View: Create your own 360 images. Take them with your phone, Android device or tablet.
  • Figment AR:  Augment reality with objects, portals, and effects.  Walk into your portal and upload your own 360 images and videos
  • RoundMe: Make your 360 images interactive by adding hotspots and connect multiple 360 images by using portals to jump from one experience to another without interruption.
  • CoSpacesEDU: Create VR worlds by dragging and dropping items into a 360 space and animate them using code.
  • Merge Cube: Hold a hologram in your hands and create your own 3D objects. Merge Cube works with such apps as:
    • Dino Digger: Dig up dinosaur bones, build them into 3D skeletons, and bring those skeletons to life
    • AnatomyAR+: Hold body organs in your hands
    • HoloGLOBE:  From NASA, hold a real-time, interactive version of the earth in your hands
    • 3D Museum Viewer: places museum artifacts in the palm of your hand. Take the learning to the next level by placing the artifacts to scale inside your classroom–the Rosetta Stone, for instance. In addition, Merge gives you the option to upload your own 3D content using its Object Viewer app.
  • Oculus Go: completely stand-alone; no computer required. Navigate VR space by swiping, selecting and physically moving around. Oculus Go experiences include:
    • Looking Glass: go back in time
    • Wonderful You: experience life inside the womb, watching the growth of the baby and interacting using the five senses
    • MasterWorks: Journey Through History: view Mount Rushmore while audio offers background as you explore the space and its story
    • MEL Chemistry Lab: the Periodic Table comes to life
    • Anne Frank House: Experience what life was like for Anne and her family and friends in 1942, hiding in the Secret Annex
  • Coming soon: Oculus Quest–a no-computer, no-wires VR gaming system

Things to consider: provide a safe environment for your “play area” in the library. Be careful about eyeglasses. Consider germs: wipe down headsets before the next user.


Donally, J. (2018). Learning Transported. International Society for Technology in Education.

Dresang, E. T. (1999). Radical Change: Books for Youth in a Digital Age. New York: H.W. Wilson.

Herrera, F., Bailenson, J., Weisz, E., Ogle, E., & Zaki, J. (2018). Building long-term empathy: A large-scale comparison of traditional and virtual reality perspective-taking. PLoS ONE, 13(10): e0204494.

Jung, T., & tom Dieck, M. C. (2018). Augmented Reality and Virtual Reality: Empowering Human, Place and Business. Cham: Springer International Publishing.

Liu, D., Dede, C., Huang, R., & Richards, J. (2017). Virtual, Augmented, and Mixed Realities in Education. Singapore: Springer Singapore.

Pope, H. (2018). Incorporating Virtual and Augmented Reality in Libraries. Library Technology Reports, 54(6).

Valenza, J. K. (27 May, 2018). On immersive technologies and the library: a visit with author Jaime Donally. School Library Journal.

AI extra:

I couldn’t help spilling this trend over into artificial intelligence (AI) by sharing some of my favorite examples of useful applications:

JSTOR Labs Text Analyzer: Upload an article, your own paper, or an illustration. The tool analyzes the text within the document to find key topics and terms used, and then uses the ones it deems most important (the “prioritized terms”) to find similar content in JSTOR.

Talk to Books – Google Books: Browse passages from books using experimental AI. In Talk to Books, when you type in a question or a statement, the model looks at every sentence in over 100,000 books to find the responses that would most likely come next in a conversation. The response sentence is shown in bold, along with some of the text that appeared next to the sentence for context. (Semantic Experiences, Google Research Blog.)

Microsoft Translator offers real-time translation, improving our ability to support and communicate more effectively with our community–students, parents and new immigrant populations.

Seeing AI The AI in Seeing AI stands for artificial intelligence. This Microsoft app is translating a visual world into words for people with visual difficulties. Documents, products, scenes, people, colors, money, and handwriting are some of the settings offered in this app.  Scenes and people are described along with information like relative location and distance. Microsoft has put accessibility in a pocket-sized format in this powerful tool for iPhones. (AASL Best Apps for Teaching and Learning 2018)

This talking camera app for those with a visual impairment provides the following support:

  • Short text: speaks text as soon as it appears in front of the camera
  • Documents: provides audio guidance to capture a printed page, and recognizes the text along with its original formatting
  • Products: gives audio beeps to help locate barcodes and then scans them to identify products
  • Person: recognizes friends and describes people around you, including their emotions
  • Scene: an experimental feature to describe the scene around you
  • Currency: identifies currency bills when paying with cash
  • Light: generates an audible tone corresponding to the brightness in your surroundings
  • Color: describes the perceived color
  • Handwriting: reads handwritten text

Questions to consider (and possibly, to worry about):

  • What are the potential uses for people with disabilities, the elderly, people who cannot type?
  • What are the implications, good and bad, for AI synthesis and summarization of content? Could this type of synthesis help students make connections among texts and support research paper writing? Can robots write our papers, especially when they are able to learn and master our writing styles?
  • What is the future for machine-generated works of art? Should we be introducing this new creative process?
  • How do recommender systems support (or perhaps supplant) our work?  What types of biases should we address?  (Consider Safiya Noble’s Algorithms of Oppression.)
  • How might libraries use AI-powered digital assistants to support users in searching, discovering and retrieving resources? In what ways might AI support basic reference?
  • In what ways might AI support cataloging, abstracting, indexing?
  • How are libraries using AI to track users and understand their behaviors?
  • In what ways might these functions violate patron/student privacy?
  • How might AI interact with existing systems and resources–for instance, services like Kanopy?
  • How might we create “skills” for Alexa /Amazon Echo and “actions” for Google Assistant to access library services by voice? Might we teach students to create them?
  • How might database services be supported by AI? For instance, libraries like Los Angeles Public have Hoopla and Kanopy services that interact with Alexa. Houston Public offers an Alexa app for its catalog. Worthington Library in Ohio is using voice assistants for communication about library services and events. Other library vendors are investigating opportunities. What are the important questions to ask, specifically regarding privacy?
  • Will the dominance of digital assistants move folks further away from search queries towards natural language in our searching? (Has that train left the station?)
  • What are the privacy issues regarding digital assistants in classrooms and school libraries?
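To make the “skills” question above more concrete: at its core, a custom Alexa skill is a web endpoint (often an AWS Lambda function) that receives a JSON request describing what the user said and returns a JSON response containing the speech to play back. Here is a minimal Python sketch of that contract for a hypothetical library-hours skill; the intent name, the hours, and the wording are all invented for illustration, not taken from any real library’s skill.

```python
# Minimal sketch of a Lambda-style handler for a hypothetical
# "library hours" Alexa custom skill. Alexa POSTs a JSON request;
# the skill returns a JSON response with the speech to render.

def handler(event, context=None):
    request = event["request"]
    if request["type"] == "LaunchRequest":
        # User opened the skill without asking anything specific.
        speech = "Welcome to the library. Ask me about our hours."
    elif (request["type"] == "IntentRequest"
          and request["intent"]["name"] == "LibraryHoursIntent"):
        speech = "The library is open nine to five today."  # hypothetical data
    else:
        speech = "Sorry, I didn't catch that."
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }

# A simplified IntentRequest, shaped like what Alexa would send:
sample = {"request": {"type": "IntentRequest",
                      "intent": {"name": "LibraryHoursIntent"}}}
print(handler(sample)["response"]["outputSpeech"]["text"])
```

A student project could start exactly here: define one intent, hard-code the answer, then swap the hard-coded string for a lookup against the library’s real calendar or catalog API.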

Elgammal, A. (2018). With AI Art, Process is More Important than the Product. Smithsonian Magazine. (Oct. 16).

Griffey, Jason, ed. (2019). Artificial Intelligence and Machine Learning in Libraries. Library Technology Reports (vol. 55, no. 1).

Hennig, N. (2019). Siri, Alexa, and Other Digital Assistants. ABC-CLIO.  Check out my Voices of Search video with Nicole.

Herold, Benjamin. (2018) Classroom Digital Assistants: Teachers’ Aides or Privacy Threats? EdWeek. (July 17).

Coming in the next post: OER, OA and openness

About Joyce Valenza

Joyce is an Assistant Professor of Teaching at the Rutgers University School of Communication and Information, a technology writer, speaker, blogger and learner. Follow her on Twitter: @joycevalenza
