TYPES OF IMMERSIVE EXPERIENCES

 

Augmented Reality (AR)

Computer technology that provides content in context - at the exact time and place we need it. Augmented reality is created by augmenting, or “adding,” 3D models, images, or videos over real-world objects. Augmented reality differs from virtual reality because you are not fully immersed in the digital world.

 

Virtual Reality (VR)

A simulated environment that can be explored in 360 degrees. VR places the user inside a digital environment to give a fully immersive experience. Learn more about virtual reality in the classroom.

 

Mixed Reality (MR) 

A combination of both the physical and digital worlds. Sometimes called the best of both worlds because digital content can interact with and respond to the physical environment. See Microsoft’s overview of mixed reality.

 

Extended Reality (XR)

Augmented reality, virtual reality, and mixed reality are all considered extended reality because they extend our experience of the real, physical world.

 

Reality-Virtuality (RV) Continuum

​

[Diagram: the Reality-Virtuality (RV) Continuum]

​

The possibilities of the immersive experience are represented in the Reality-Virtuality (RV) Continuum above. The RV Continuum was first developed by Milgram and Kishino in 1994 to visually represent the Mixed Reality spectrum. AR is at the lower end of the spectrum and is more closely related to the real world. 

 

Studies have shown that AR is an effective educational tool (especially for adult learners) because it is closely related to the real-world environment. 

 

The Interaction Design Foundation gives the following definitions for the four sections:

  • real environment: only real, physical objects 

  • augmented reality: the real world is supplemented with digital elements 

  • augmented virtuality: the digital world is supplemented with real-world elements

  • virtual environment: only digital objects

 

360-Degree Content

Images and videos that record a view in every direction simultaneously, providing an immersive experience that lets users explore an environment remotely and control the viewing direction.

 

 


TYPES OF AUGMENTED REALITY

 

Marker-Based AR

Relies on a marker in the environment to activate the experience. Marker-based AR is also called image recognition or recognition-based AR because it works by scanning a trigger/marker (e.g., an embedded QR code, object, text, video, or animation).
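
A minimal sketch of the “scan a trigger” step, using OpenCV’s ArUco markers as an open-source stand-in for a commercial marker tracker (the API below is OpenCV 4.7+, and the image file name is hypothetical):

    # Detect fiducial markers in a single camera frame.
    import cv2

    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

    frame = cv2.imread("camera_frame.jpg")  # one frame from the device camera
    corners, ids, _rejected = detector.detectMarkers(frame)

    if ids is not None:
        for marker_id, quad in zip(ids.flatten(), corners):
            # A real AR app would estimate the marker's pose here and draw
            # the 3D model on top of it; this sketch just reports the detection.
            print(f"Marker {marker_id} found at corners {quad.reshape(4, 2).tolist()}")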

 

Example: https://youtu.be/DvjoKDNSEBQ

Example: https://youtu.be/wjWsmHWdVQI

Example: https://youtu.be/E2K052WwzpM

 

Markerless AR

Allows the user to control where they want to place the augmented content. The digital content is scaled to real-life size. Markerless AR relies heavily on smartphone capabilities, such as cameras, sensors, and processors.
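
A rough sketch of the math behind “tap to place” in markerless AR: cast a ray from the camera through the tapped point and intersect it with a detected horizontal surface (simplified here to a ground plane at y = 0; production frameworks expose this via hit-testing APIs):

    def place_on_ground(ray_origin, ray_direction):
        """Return the world-space point where the tap ray hits the ground plane (y = 0)."""
        ox, oy, oz = ray_origin
        dx, dy, dz = ray_direction
        if dy >= 0:
            return None                  # ray never reaches the ground
        t = -oy / dy                     # distance along the ray to y = 0
        return (ox + t * dx, 0.0, oz + t * dz)

    # Camera held 1.5 m above the floor, aimed slightly downward:
    print(place_on_ground((0.0, 1.5, 0.0), (0.0, -0.5, -1.0)))  # (0.0, 0.0, -3.0)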

 

Location-Based AR

A type of markerless AR that relies on smartphone GPS capabilities to tie augmentation to a specific place. It reads data from a device’s camera, GPS, digital compass, and accelerometer. Information and virtual objects are mapped in a particular location and displayed when a user’s device data matches the location.
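
The core check is simple: compare the device’s GPS fix against each mapped location and reveal content within some radius. A sketch using the haversine formula (the coordinates and 75 m radius are made-up values):

    import math

    def distance_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in meters (haversine formula)."""
        r = 6_371_000  # mean Earth radius, meters
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
        a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    poi = (40.7580, -73.9855)      # hypothetical mapped point of interest
    device = (40.7585, -73.9850)   # hypothetical GPS fix from the phone

    if distance_m(*device, *poi) < 75:  # within 75 m of the mapped location
        print("Display the virtual object")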

 

Example: https://youtu.be/qujpsJS_JnI

Example: https://youtu.be/X7IqAHgZlCs

 

Projection-Based AR

Another type of markerless AR that renders virtual objects onto a user’s physical space within a stationary context.

 

Example: https://youtu.be/bA4uvkAStPc

Example: https://youtu.be/LQY5AvRwCN8

 

Superimposition (Overlay) AR

Replaces the user’s original view of an object, either partially or fully, with an augmented view of that same object.

 

AUGMENTED REALITY DEVICES

​

People interact with augmented reality via mobile devices (smartphones and tablets) or a head-mounted display (HMD). 

 

Mobile Devices

Mobile devices use inside-out tracking technology (cameras and sensors) to see and process information. Developers can use software from Apple (ARKit) or Google (ARCore) to develop mobile AR applications. The software packages help with motion tracking, environmental understanding, light estimation, and anchors to produce a more seamless and realistic user experience.

 

Head-Mounted Display (HMD)

Goggles, glasses, or specialized visors with a tiny monitor in front of each eye. Because there are two monitors, images can appear three-dimensional (though some HMDs are monoscopic). They may also be called holographic devices because of their ability to display digital objects as if they existed in the real world.

 

Head-mounted displays have been used since the 1960s to watch films and television shows and to play video games. Today, they’re the primary hands-free option for interacting with AR and VR.

 

Microsoft HoloLens

The Microsoft HoloLens frames include five cameras for analyzing the environment, one camera for measuring depth, one HD video camera, one light sensor, and four microphones. The headset is expensive but one of the few cross-platform capable options. 

 

More information on HoloLens

 

Magic Leap

The Magic Leap headset is primarily used at the enterprise level to enhance human productivity. It is a lighter-weight headset used in architecture/engineering/construction (AEC), healthcare, manufacturing, and other similar industries. Magic Leap is Android-based.

 

More information on Magic Leap

 

Meta Quest 

The Meta Quest headset replaced its predecessor, Oculus Quest. The current versions available as of 2023 are Meta Quest 2 and Meta Quest Pro. Meta Quest is Android-based.

 

More information about Meta Quest

 

Google Cardboard

The Google Cardboard headset is a customizable and affordable option, with prices as low as $14.93 on Amazon. This headset was built primarily for VR experiences, so it may take experimenting with different tech stacks (or SDKs) and development engines to determine the best solution for your project (see article). The Google Cardboard SDK supports Android and iOS projects.

 

More information on Google Cardboard

 

Head Tracking

Most HMDs also include a head tracker so the system can respond to the user’s head movements.


 

​

CREATING A SIMULATION

​

Simulation

​

3D Modeling

Develops a representation of an object based on specific dimensions. The object will have points on the x-, y-, and z-axes. Compare 3D modeling software tools.

 

3D Rendering

Creates a photorealistic image of the space or object based on the size, shape, and texture of the 3D model. It provides the ability to display multiple viewpoints, proper lighting, and accurate performance in a graphic or image, giving the user an immersive real-life experience.

 

Mesh

A collection of vertices, edges, and faces that defines the shape of a 3D object. Each vertex has unique texture and color information attached to it. Based on that data, 3D modeling software fills the areas with the right colors and textures.
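
In code, a mesh is just structured per-vertex data plus faces that index into it. A minimal sketch (the field names are illustrative):

    from dataclasses import dataclass

    @dataclass
    class Vertex:
        position: tuple[float, float, float]  # x, y, z
        color: tuple[float, float, float]     # r, g, b in 0..1
        uv: tuple[float, float]               # texture coordinate

    vertices = [
        Vertex((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 0.0)),
        Vertex((1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (1.0, 0.0)),
        Vertex((0.0, 1.0, 0.0), (0.0, 0.0, 1.0), (0.0, 1.0)),
    ]
    triangles = [(0, 1, 2)]  # one face, built by indexing the vertices above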

 

Ray Casting

Creates a 3D render by casting rays onto the model from the camera’s point of view; whatever each ray hits first determines what appears at that point in the projection.
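
To make the idea concrete, here is a toy ray caster: one ray per pixel, a hit test against a unit sphere at the origin, and an ASCII silhouette instead of real shading (the camera position and image size are arbitrary):

    WIDTH, HEIGHT = 24, 12
    cam = (0.0, 0.0, 3.0)  # camera on the +z axis, looking toward -z

    for j in range(HEIGHT):
        row = ""
        for i in range(WIDTH):
            # Map the pixel to a ray direction through an imaginary image plane.
            x = (i + 0.5) / WIDTH * 2 - 1
            y = 1 - (j + 0.5) / HEIGHT * 2
            d = (x, y, -1.0)
            # Ray-sphere intersection: |cam + t*d|^2 = 1 has a real solution
            # exactly when the (quarter) discriminant b^2 - a*c is non-negative.
            a = sum(k * k for k in d)
            b = sum(o * k for o, k in zip(cam, d))
            c = sum(o * o for o in cam) - 1.0
            row += "#" if b * b - a * c >= 0 else "."
        print(row)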

 

Ray Tracing

Extends ray casting by following rays as they bounce through the scene to simulate shadows, reflections, and refractions.

​

Hologram

Using AR glasses/headsets, the user can see their environment while the glasses display holograms (or 3D models) that blend seamlessly into the user’s surroundings. See more about AR holograms.

 

Software Development Kit (SDK) 

A set of software tools, libraries, and documentation for building applications on a particular platform.

 

ARKit (Apple iOS) 

Mobile device augmented reality framework that integrates iOS device camera and motion features to produce AR experiences for Apple devices. Learn more about Apple’s ARKit.


ARCore (Android)

Mobile device augmented reality framework for Android using motion tracking, environmental understanding, and light estimation. Learn more about Google’s ARCore.


AR Foundation

Cross-platform augmented reality framework for the Unity game engine that lets designers build AR experiences for Android and iOS devices. Designers develop once and then deploy across multiple mobile and wearable devices. Learn more about AR Foundation.

 

This table shows how AR Foundation incorporates ARKit (iOS) and ARCore (Android).

 

OpenXR

An open, royalty-free standard that gives engines native access to a range of devices across the mixed reality spectrum. It enables engines like Unity and Unreal to write portable code once that can then access the native platform features of the user’s holographic or immersive device, no matter which vendor built that platform.

 

Vuforia

Augmented reality developer platform that integrates with Unity and provides unique features, such as Area Targets. It is widely used and supports most phones, tablets, and HMDs. See Vuforia in action.

 

Application Programming Interface (API)

A set of rules that allows two computer programs to communicate with each other. Learn more about APIs.

 

Development Platforms

​

Unity and Unreal

Game engines with software development environments and pre-built gaming components that AR designers can use to plan and build interactive frameworks for the web, mobile devices, and HMDs. Learn more about how Unity and Unreal compare.

 

iOS and Android (ARKit and ARCore)

Mobile operating systems for Apple and Android smartphones and tablets, respectively.

 

Universal Windows Platform (UWP)

Microsoft’s computing platform for Windows 10 and higher. It has cross-platform capabilities so that code doesn’t have to be rewritten for each device. Learn more about UWP.

 

WebAR

Web-based augmented reality that does not require a dedicated mobile app to function. With WebAR, users can experience AR directly in a web browser on a desktop or mobile device. Learn more about WebAR.

 

Occlusion

Accurately rendering a virtual object behind real-world objects, making it appear more realistic in its surroundings. Learn more about the occlusion and depth of a scene.
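
Conceptually, occlusion is a per-pixel depth comparison: draw the virtual object only where it is closer to the camera than the real surface reported by the depth sensor (or depth-from-motion). A tiny sketch with made-up depth values:

    real_depth = [1.2, 1.2, 0.8, 0.8]     # meters to the real surface, per pixel
    virtual_depth = [1.0, 1.0, 1.0, 1.0]  # meters to the virtual object, per pixel

    visible = [v < r for v, r in zip(virtual_depth, real_depth)]
    print(visible)  # [True, True, False, False]: the 0.8 m wall hides the object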

 

Spatial Mapping

Provides a detailed representation of real-world surfaces in the environment, allowing developers to create a realistic AR experience. See how Microsoft uses spatial mapping.

 

Anchors

Fixed, trackable positions in the real world to which virtual content is attached, so that the content holds its place and orientation as the user moves through the scene.
 

Image Markers / Image Tracking

An augmented reality feature that allows apps to detect 2D images and trigger digital content, such as videos, slideshows, 360° video/images, sound, text, and 3D animations, to appear. Learn more about image tracking. And here is a video example of image tracking.

 

Eye Tracking

A technology that monitors eye movements as a means of triggering changes in the content being consumed. (Peterson)

​

 

​

PHYSICAL (REAL-WORLD) ENVIRONMENT

​

Physical Environment

The space surrounding a 3D object, including the ground plane and lighting.

​

Ambient Lighting

The overall diffuse light that comes in from around the environment, making everything visible.

 

Main Directional Light

Determines the direction and intensity of the scene’s primary light source. Ensures that virtual objects in the scene cast shadows consistent with the other visible real objects.

 

Shadows

Often directional; they tell viewers where light sources are coming from.

 

Specular Highlights

Shiny bits of surface that reflect a light source relative to the position of a viewer in a scene.
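
The three lighting entries above map onto the classic terms of the Phong reflection model. A minimal sketch (all vectors are unit-length tuples, and the values are illustrative):

    import math

    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    def shade(normal, light_dir, view_dir, ambient=0.1, shininess=32):
        """Ambient fill + main directional light + specular highlight."""
        diffuse = max(dot(normal, light_dir), 0.0)
        # Reflect the light direction about the surface normal...
        r = tuple(2 * dot(normal, light_dir) * n - l
                  for n, l in zip(normal, light_dir))
        # ...and raise its alignment with the view ray to a "shininess" power.
        specular = max(dot(r, view_dir), 0.0) ** shininess
        return ambient + diffuse + specular  # clamp/tone-map before display

    s = 1 / math.sqrt(3)
    # Upward-facing surface, light from the upper right, viewer on the mirror side:
    print(shade((0.0, 1.0, 0.0), (s, s, s), (-s, s, -s)))  # ~1.68, a bright highlight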

 

Field of View

Describes the size of an AR image when viewed through an HMD. A small field of view means the AR image, as seen by each eye, is very narrow, limiting the size and quality of the image. Learn more about the field of view.
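
The trade-off is easy to quantify: the width of the region an HMD can draw grows with the tangent of half the FOV. A quick sketch (the FOV values are illustrative, not specs for any particular headset):

    import math

    def visible_width(fov_deg, distance_m):
        """Width of the view frustum at a given distance, for a horizontal FOV."""
        return 2 * distance_m * math.tan(math.radians(fov_deg) / 2)

    print(f"{visible_width(40, 2.0):.2f} m wide at 2 m")   # 1.46 m (narrow FOV)
    print(f"{visible_width(100, 2.0):.2f} m wide at 2 m")  # 4.77 m (wide FOV)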

 

​
 

USER EXPERIENCE

 

Six Degrees of Freedom (6DoF)

The directions in which an object or user can move freely within three-dimensional space. (Stevens p. 172)

 

The six degrees are (see the sketch after this list):

  • Heaving: moving up and down along the y-axis

  • Surging: moving forward and backward along the z-axis

  • Swaying: moving left and right along the x-axis

  • Pitch: rotating about the x-axis (tilting up and down)

  • Yaw: rotating about the y-axis (turning left and right)

  • Roll: rotating about the z-axis (tilting side to side)
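
A compact way to see all six degrees together is as a pose record: three translation components and three rotation components (Euler angles here for readability; engines typically store rotation as a quaternion to avoid gimbal lock):

    from dataclasses import dataclass

    @dataclass
    class Pose6DoF:
        x: float      # sway: left/right
        y: float      # heave: up/down
        z: float      # surge: forward/back
        pitch: float  # rotation about the x-axis, degrees
        yaw: float    # rotation about the y-axis, degrees
        roll: float   # rotation about the z-axis, degrees

    # A standing user, eyes 1.6 m up, looking slightly down and 45 degrees right:
    print(Pose6DoF(x=0.0, y=1.6, z=0.0, pitch=-10.0, yaw=45.0, roll=0.0))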

 

 

 

User Interface

​

World space

The 3D x, y, z coordinates of any object, defined by the environment.


Screen space

The 2D space defined by the screen or viewable area. This is reliant on the screen size, position, and device resolution. (Stevens p. 174)
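
Going from world space to screen space is a perspective projection. A minimal pinhole-camera sketch (camera at the origin looking down -z; the focal length and screen size are made-up values):

    def world_to_screen(point, screen_w=1080, screen_h=1920, focal=900.0):
        """Project a world-space point onto 2D screen coordinates."""
        x, y, z = point
        if z >= 0:
            return None                      # behind the camera
        sx = screen_w / 2 + focal * x / -z   # perspective divide by depth
        sy = screen_h / 2 - focal * y / -z   # screen y grows downward
        return (sx, sy)

    print(world_to_screen((0.5, 0.2, -2.0)))  # (765.0, 870.0)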

 

User Flow

The path a user takes through an experience, from the entry point through each interaction to the end goal.

​

User Personas

Representations of a real user, intended to stand in for a key audience and provide a reference point within the specific context of an experience. (Stevens p. 156)


AUDIO

 

Spatial Sound

3D audio effects that place sound sources in three-dimensional space around the user.
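
At its simplest, spatialization pans a mono source between the two ears and attenuates it with distance. Real systems use head-related transfer functions (HRTFs), so this sketch only captures the intuition:

    import math

    def spatialize(sample, azimuth_deg, distance_m):
        """Return (left, right) amplitudes for a mono sample."""
        gain = 1.0 / max(distance_m, 1.0) ** 2               # inverse-square falloff
        pan = (math.sin(math.radians(azimuth_deg)) + 1) / 2  # 0 = hard left, 1 = hard right
        return (sample * gain * (1 - pan), sample * gain * pan)

    print(spatialize(1.0, 90, 2.0))  # source directly to the right: (0.0, 0.25)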

 

Ambisonic Sound

A method of capturing and playing back 360° spatial sound. (Peterson)

 

Binaural Audio

Reproductions of sound the way human ears hear it. The word “binaural” means “using both ears.” When you listen to a binaural recording through headphones, you perceive distinct and genuine 360° sound. (Peterson)

 

Directional Sound

A technology that concentrates acoustic energy into a narrow beam so that it can be projected to a discrete area, much as a spotlight focuses light. (Peterson) 

​
 


 

SOURCES:

 

Stevens, Immersive Design

 

3D User Interfaces

 

https://digitalpromise.org/initiative/360-story-lab/360-production-guide/investigate/augmented-reality/getting-started-with-ar/types-of-ar/
