AVR Unit 1 Part 1
2) A mouse running on a freely rotating ball while exploring a virtual maze that
   appears on a projection screen around the mouse.
• We want our definition of VR to be broad enough to include these examples and
  many more.
• Definition of VR: Inducing targeted behavior in an organism by using
  artificial sensory stimulation, while the organism has little or no awareness
  of the interference.
• Four key components appear in the definition:
1) Targeted behavior: The organism is having an “experience” that was
    designed by the creator. Examples include flying, walking, exploring,
    watching a movie, and socializing with other organisms.
2) Organism: This could be you, someone else, or even another life form such
   as a fruit fly, cockroach, fish, rodent, or monkey (scientists have used VR
   technology on all of these!).
3) Artificial sensory stimulation: Through the power of engineering, one or
   more senses of the organism become co-opted, at least partly, and their
   ordinary inputs are replaced or enhanced by artificial stimulation.
4) Awareness: While having the experience, the organism seems unaware of
   the interference, thereby being “fooled” into feeling present in a virtual
   world. This unawareness leads to a sense of presence in an altered or
   alternative world. It is accepted as being natural.
• Testing the boundaries: How far does our VR definition allow one
  to stray from the most common examples?
• Perhaps listening to music through headphones should be
  included.
• What about watching a movie at a theater? Clearly, technology
  has been used in the form of movie projectors and audio systems
  to provide artificial sensory stimulation.
• Continuing further, what about a portrait or painting on the wall?
  The technology in this case involves paints and a canvas.
• Finally, we might even want reading a novel to be considered as
  VR.
• Who is the fool? When an animal explores its environment, neural
 structures composed of place cells are formed that encode spatial
 information about its surroundings.
• Each place cell is activated precisely when the organism returns to a
  particular location that is covered by it.
• It has been shown that these neural structures may form even while
  the organism is having a VR experience.
• In other words, our brains may form place cells for places that are not
  real! This is a clear indication that VR is fooling our brains, at least
  partially.
• A novel that meticulously describes an environment that does not
  exist may likewise cause place cells to be generated.
• Terminology regarding various “realities”: The term virtual reality
  dates back to German philosopher Immanuel Kant, although his use of the
  term did not involve technology.
• Kant introduced the term to refer to the “reality” that exists in someone’s
  mind, as differentiated from the external physical world, which is also a
  reality.
• The real world refers to the physical world that contains the user at the time
  of the experience, and the virtual world refers to the perceived world as part
  of the targeted VR experience.
• Augmented reality (AR) refers to systems in which most of the visual stimuli
  are propagated directly through glass or cameras to the eyes, and some
  additional structures, such as text and graphics, appear to be superimposed
  onto the user’s world.
• The term mixed reality (MR) is sometimes used to refer to an entire
  spectrum that encompasses VR, AR, and ordinary reality.
• The most important idea of VR is that the user’s perception of
  reality has been altered through engineering, rather than
  whether the environment they believe they are in seems more
  “real” or “virtual”.
• A perceptual illusion has been engineered. Thus, another
  reasonable term for this area, especially if considered as an
  academic discipline, could be perception engineering.
• When considering a VR system, it is tempting to focus only on the traditional
  engineering parts: Hardware and software.
• However, it is equally important, if not more important, to understand and exploit
  the characteristics of human physiology and perception.
• Because we did not design ourselves, these fields can be considered as reverse
  engineering. All of these parts tightly fit together to form perception engineering.
• Interactivity: Most VR experiences involve another crucial
  component, interaction.
• Does the sensory stimulation depend on actions taken by the
  organism?
• If the answer is “no”, then the VR system is called open-loop;
  otherwise, it is closed-loop.
• In the case of closed-loop VR, the organism has partial control
  over the sensory stimulation, which could vary as a result of motions
  of the body, including the eyes, head, hands, or legs.
• Other possibilities include voice commands, heart rate, body
  temperature, and skin conductance.
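The open- vs. closed-loop distinction can be made concrete with a tiny sketch. The function names and `Pose` type below are invented for illustration; the only point is that the closed-loop stimulus takes sensed user state as an argument, while the open-loop stimulus is a function of time alone.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    yaw: float = 0.0  # head orientation about the vertical axis, in radians

def open_loop_stimulus(t: float) -> str:
    # Open loop: the stimulus depends only on time, never on the organism.
    return f"frame rendered from fixed viewpoint at t={t:.2f}"

def closed_loop_stimulus(t: float, head: Pose) -> str:
    # Closed loop: the stimulus also depends on the sensed user state.
    return f"frame rendered at t={t:.2f} from yaw={head.yaw:.2f} rad"

print(open_loop_stimulus(0.5))
print(closed_loop_stimulus(0.5, Pose(yaw=0.3)))
```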
• First- vs. third-person: When a scientist designs an experiment for an
  organism, the separation is clear: the laboratory subject
  (organism) has a first-person experience, while the scientist is a
  third-person observer.
• The scientist carefully designs the VR system as part of an experiment
  that will help to resolve a scientific hypothesis. For example, how does
  turning off a few neurons in a rat’s brain affect its navigation ability?
• On the other hand, when engineers or developers construct a VR
  system or experience, they are usually targeting themselves and people
  like them.
• They feel perfectly comfortable moving back and forth between being
  the “scientist” and the “lab subject” while evaluating and refining their
  work.
• Synthetic vs. captured: Two extremes exist when constructing a virtual world as part
  of a VR experience. At one end, we may program a synthetic world, which is completely
  invented from geometric primitives and simulated physics.
• At the other end, the world may be captured using modern imaging techniques. For
  viewing on a screen, the video camera has served this purpose for over a century.
  Capturing panoramic images and videos and then seeing them from any viewpoint in a VR
  system is a natural extension.
• As humans interact, it becomes important to track their motions, which is an important
  form of capture.
• What are their facial expressions while wearing a VR headset? Do we need to know
  their hand gestures? What can we infer about their emotional state? Are their eyes
  focused on me?
• Synthetic representations of ourselves called avatars enable us to interact and provide
  a level of anonymity. We can also enhance our avatars by tracking the motions and
  other attributes of our actual bodies.
• Health and safety: Unlike simpler media such as radio or
  television, VR has the power to overwhelm the senses and the
  brain, leading to fatigue or sickness.
• This phenomenon has been studied under the heading of simulator
  sickness for decades; in this book we will refer to adverse
  symptoms from VR usage as VR sickness.
• In many cases, it is caused by a careless developer who
  misunderstands or disregards the side effects of the experience
  on the user.
• To engineer comfortable VR experiences, one must understand
  human physiology and perceptual psychology.
• In many cases, fatigue arises because the brain appears to work
  harder to integrate the unusual stimuli being presented to the
  senses.
• Another factor that leads to fatigue is an interface that
  requires large amounts of muscular effort.
• For example, it might be tempting to move objects around in a
  sandbox game by moving your arms around in space. This quickly
  leads to fatigue and an avoidable phenomenon called gorilla arms,
  in which people feel that the weight of their extended arms is
  unbearable.
                   Modern VR Experiences
• The current generation of VR systems was brought about by advances in display, sensing,
  and computing technology from the smartphone industry.
• Video games: People have dreamed of entering their video game worlds for decades. The
  figure below shows several video game experiences in VR. Most gamers currently want to
  explore large, realistic worlds through an avatar.
• Figure (a) shows Valve’s Portal 2 for the HTC Vive headset which is a puzzle-solving
  experience in a virtual world. Figure (b) shows an omnidirectional treadmill peripheral for
  walking through first-person shooter games. These two examples give the user a first-
  person perspective of their character. By contrast, Figure (c) shows Lucky’s Tale for the
  Oculus Rift, which instead yields a comfortable third-person perspective as the user seems
  to float above the character that she controls. Figure (d) shows a game that contrasts with
  all the others in that it was designed specifically to exploit the power of VR: the player
  appears to have a large elephant trunk. The purpose of the game is to enjoy this unusual
  embodiment by knocking things down with a swinging trunk.
• Telepresence: The first step toward feeling like we are somewhere
  else is capturing a panoramic view of the remote environment.
• Google’s Street View and Earth apps already rely on the captured
  panoramic images from millions of locations around the world.
• Simple VR apps that query the Street View server directly enable the
  user to feel as if he is standing in each of these locations.
• Even better is to provide live panoramic video interfaces, through
  which people can attend sporting events and concerts.
• An important component for achieving telepresence is to capture
  a panoramic view: (a) A car with cameras and depth sensors on
  top, used by Google to make Street View. (b) The Insta360 Pro
  captures and streams omnidirectional videos.
• People can take video conferencing to the next level by feeling
  present at the remote location. By connecting panoramic cameras
  to robots, the user can even move around in the remote
  environment.
• Current VR technology allows us to virtually visit far away places
  and interact in most of the ways that were previously possible
  only while physically present.
• This leads to improved opportunities for telecommuting to work.
  It could ultimately help reverse the urbanization trend
  sparked by the 19th-century industrial revolution, leading to
  deurbanization as workers distribute themselves geographically.
• A panoramic video of Paul McCartney performing provides a
  VR experience in which users feel as if they are on stage with the
  rock star.
• Virtual societies: Whereas telepresence makes us feel as if we
  are in another part of the physical world, VR also allows us to form
  entire societies that remind us of the physical world, but are
  synthetic worlds that contain avatars connected to real people.
• People interact in a fantasy world through avatars; such
  experiences were originally designed to view on a screen but can
  now be experienced through VR.
• Groups of people could spend time together in these spaces for a
  variety of reasons, including common special interests,
  educational goals, or simply an escape from ordinary life.
• Empathy: The first-person perspective provided by VR is a powerful
  tool for causing people to feel empathy for someone else’s situation.
• The world continues to struggle with acceptance and equality for
  others of different race, religion, age, gender, social status, and
  education, while the greatest barrier to progress is that most people
  cannot understand what it is like to have a different identity.
• Through virtual societies, many more possibilities can be explored.
• What if you were 10 cm shorter than everyone else?
• What if you taught your course as a different gender?
• What if you were the victim of racial discrimination by the police?
• Using VR, we can imagine many “games of life”
• Education: The first-person perspective could revolutionize many
  areas of education.
• In engineering, mathematics, and the sciences, VR offers the chance
  to visualize geometric relationships in difficult concepts or data that
  are hard to interpret.
• Furthermore, VR is naturally suited for practical training because skills
  developed in a realistic virtual environment may transfer naturally to
  the real environment.
• The motivation is particularly high if the real environment is costly to
  provide or poses health risks.
• One of the earliest and most common examples of training in VR is
  flight simulation.
• A flight simulator in use by the US Air Force. The user sits in a
  physical cockpit while being surrounded by displays that show
  the environment.
• Other examples include firefighting, nuclear power plant safety,
  search-and-rescue, military operations, and medical procedures.
• Beyond these common uses of VR, perhaps the greatest opportunities
  for VR education lie in the humanities, including history, anthropology,
  and foreign language acquisition.
• Consider the difference between reading a book on the Victorian era in
  England and being able to roam the streets of 19th-century London, in
  a simulation that has been painstakingly constructed by historians.
• We could even visit an ancient city that has been reconstructed from
  ruins.
• Fascinating possibilities exist for either touring physical museums
  through a VR interface or scanning and exhibiting artifacts directly
  in virtual museums.
• Virtual prototyping: In the real world, we build prototypes to
  understand how a proposed design feels or functions.
• Virtual prototyping enables designers to inhabit a virtual world
  that contains their prototype. They can quickly interact with it
  and make modifications.
• They also have opportunities to bring clients into their virtual
  world so that they can communicate their ideas.
• Imagine you want to remodel your kitchen. You could construct a
  model in VR and then explain to a contractor exactly how it should
  look.
• Virtual prototyping in VR has important uses in many businesses,
  including real estate, architecture, and the design of aircraft,
  spacecraft, cars, furniture, clothing, and medical instruments.
• Health care: Although health and safety are challenging VR issues,
  the technology can also help to improve our health.
• There is an increasing trend toward distributed medicine, in which
  doctors train people to perform routine medical procedures in remote
  communities around the world.
• Doctors can provide guidance through telepresence, and also use VR
  technology for training.
• In another use of VR, doctors can immerse themselves in 3D organ
  models that were generated from medical scan data.
• This enables them to better plan and prepare for a medical procedure
  by studying the patient’s body shortly before an operation.
• In yet another use, VR can directly provide therapy to help
  patients.
• Examples include overcoming phobias and stress disorders
  through repeated exposure, improving or maintaining cognitive
  skills in spite of aging, and improving motor skills to overcome
  balance, muscular, or nervous system disorders.
• VR systems could also one day improve longevity by enabling aging
  people to virtually travel, engage in fun physical therapy, and
  overcome loneliness by connecting with family and friends through
  an interface that makes them feel present and included in remote
  activities.
• Augmented and mixed reality: In many applications, it is
  advantageous for users to see the live, real world with some
  additional graphics superimposed to enhance its appearance.
• This has been referred to as augmented reality or mixed
  reality.
• By placing text, icons, and other graphics into the real world,
  the user could leverage the power of the Internet to help
  with many operations such as navigation, social interaction,
  and mechanical maintenance.
• The Microsoft HoloLens (2016) uses advanced see-through display
  technology to superimpose graphical images onto the ordinary
  physical world, as perceived by looking through the glasses.
• Niantic’s Pokémon Go is a geolocation-based game from 2016 that
  allows users to imagine a virtual world that is superimposed onto
  the real world. They can see Pokémon characters only by looking
  “through” their smartphone screen.
                        Hardware
(a) The Touch X system by 3D Systems allows the user to feel strong
resistance when poking into a virtual object with a real stylus. A robot arm
provides the appropriate forces. (b) Some game controllers occasionally
vibrate.
• Sensors: For visual and auditory body-mounted displays, the position and orientation
  of the sense organ must be tracked by sensors to appropriately adapt the stimulus.
• The orientation part is usually accomplished by an inertial measurement unit or
  IMU.
• The main component is a gyroscope, which measures its own rate of rotation; the
  rate is referred to as angular velocity.
• To reduce drift error, which accumulates when changes in orientation are
  integrated over time, IMUs also contain an accelerometer and possibly a
  magnetometer.
• Over the years, IMUs have gone from existing only as large mechanical systems in
  aircraft and missiles to being tiny devices inside of smartphones.
• Due to their small size, weight, and cost, IMUs can be easily embedded in wearable
  devices.
• They are one of the most important enabling technologies for the current
  generation of VR headsets and are mainly used for tracking the user’s head
  orientation.
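As a rough illustration of how an accelerometer counteracts gyroscope drift, here is a minimal one-axis complementary filter. The numbers (gyro bias, blend factor `alpha`, sample rate) are invented for the sketch, not taken from any real IMU; actual headsets perform full 3D sensor fusion.

```python
import math

def complementary_filter(gyro_rates, accels, dt, alpha=0.98):
    """Estimate a tilt angle (radians) by fusing a gyroscope, whose
    integrated angular velocity drifts over time, with an accelerometer,
    whose gravity reading gives a noisy but drift-free absolute tilt."""
    angle = 0.0
    for omega, (ax, az) in zip(gyro_rates, accels):
        gyro_angle = angle + omega * dt      # integrate angular velocity
        accel_angle = math.atan2(ax, az)     # tilt implied by gravity direction
        angle = alpha * gyro_angle + (1 - alpha) * accel_angle
    return angle

# Device held stationary at a true tilt of 0.1 rad for 10 seconds; the
# gyro reports only a constant bias of 0.02 rad/s (no real rotation).
n, dt = 1000, 0.01
gyro = [0.02] * n
acc = [(math.sin(0.1), math.cos(0.1))] * n
est = complementary_filter(gyro, acc, dt)
# Pure gyro integration would report 0.02 * 10 = 0.2 rad of phantom
# rotation; the fused estimate stays close to the true 0.1 rad tilt.
print(round(est, 2))
```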
• Digital cameras provide another critical source of information for tracking
  systems.
• Like IMUs, they have become increasingly cheap and portable due to the
  smartphone industry, while at the same time improving in image quality.
• The idea is to identify features or markers in the image that serve as
  reference points for a moving object or a stationary background.
• Cameras are commonly used to track eyes, heads, hands, entire human bodies,
  and any other objects in the physical world.
• One of the main challenges at present is to obtain reliable and accurate
  performance without placing special markers on the user or objects around
  the scene.
• As opposed to standard cameras, depth cameras work actively by projecting
  light into the scene and then observing its reflection in the image.
• This is typically done in the infrared (IR) spectrum. In addition to these
  sensors, we rely heavily on good-old mechanical switches and
  potentiometers to create keyboards and game controllers.
• (a) The Microsoft Kinect sensor gathers both an ordinary RGB image
  and a depth map (the distance away from the sensor for each pixel).
  (b) The depth is determined by observing the locations of projected
  IR dots in an image obtained from an IR camera.
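The dot-observation principle in (b) reduces to stereo triangulation: the IR projector and IR camera form a pair with a known baseline, and the sideways shift (disparity) of each projected dot in the camera image encodes its depth. A minimal sketch with made-up calibration numbers (not actual Kinect parameters):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Triangulated depth for a structured-light sensor: depth is
    inversely proportional to the observed disparity of a projected dot,
    scaled by focal length (pixels) and projector-camera baseline (m)."""
    return focal_px * baseline_m / disparity_px

# Illustrative values: 580 px focal length, 7.5 cm baseline, 29 px shift.
print(depth_from_disparity(focal_px=580, baseline_m=0.075, disparity_px=29))
```

Note the reciprocal relationship: far surfaces produce tiny disparities, which is why depth cameras lose precision with distance.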
• Computers: As we have noticed, most of the needed sensors exist on a
  smartphone.
• Therefore, a smartphone can be dropped into a case with lenses to
  provide a VR experience at little added cost.
• In the near future, we expect to see wireless, all-in-one headsets that
  contain all of the essential parts of smartphones for delivering VR
  experiences.
• These will eliminate unnecessary components of smartphones and will
  instead have customized optics, microchips, and sensors for VR.
• Graphics processing units (GPUs) have been optimized for quickly
  rendering graphics to a screen, and they are currently being adapted to
  handle the specific performance demands of VR.
• Two headsets that create a VR experience by dropping a
  smartphone into a case. (a) Google Cardboard works with a wide
  variety of smartphones. (b) Samsung Gear VR is optimized for one
  particular smartphone (in this case, the Samsung S6).
• Figure shows the hardware components for the Oculus Rift DK2, which became available in late 2014.
• In the lower left corner, you can see a smartphone screen that serves as the display. Above that is a circuit
  board that contains the IMU, display interface chip, a USB driver chip, a set of chips for driving LEDs on the
  headset for tracking, and a programmable microcontroller.
• The lenses, shown in the lower right, are placed so that the smartphone screen appears to be “infinitely far”
  away, but nevertheless fills most of the field of view of the user.
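The “infinitely far” placement follows from the thin-lens equation, 1/f = 1/d_obj + 1/d_img: as the screen approaches the lens’s focal plane, the virtual image distance diverges. A small numerical sketch (the 4 cm focal length is arbitrary, not a Rift DK2 specification):

```python
def virtual_image_distance(f_m, d_obj_m):
    """Thin-lens model: 1/f = 1/d_obj + 1/d_img. With the screen placed
    inside the focal length, d_img comes out negative, meaning a virtual
    image on the same side as the screen. As d_obj approaches f, the
    image recedes to infinity, which is why headset lenses sit roughly
    one focal length from the display."""
    if d_obj_m == f_m:
        return float("inf")
    return 1.0 / (1.0 / f_m - 1.0 / d_obj_m)

print(virtual_image_distance(0.04, 0.04))   # screen at the focal plane
print(virtual_image_distance(0.04, 0.039))  # slightly inside: image ~1.6 m out
```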
• The upper right shows flexible circuits that deliver power to IR LEDs embedded in the headset (they are
  hidden behind IR-transparent plastic). A camera is used for tracking, and its parts are shown in the center.
                       Software
• The Virtual World Generator (VWG) maintains another world, which
  could be synthetic, real, or some combination. From a computational
  perspective, inputs are received from the user and his
  surroundings, and appropriate views of the world are rendered to
  displays.
• The VWG receives inputs from low-level systems that
  indicate what the user is doing in the real world.
• A head tracker provides timely estimates of the user’s head
  position and orientation. Keyboard, mouse, and game
  controller events arrive in a queue, ready to be processed.
• The key role of the VWG is to maintain enough of an internal
  “reality” so that renderers can extract the information they
  need to calculate outputs for their displays.
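A minimal sketch of this update cycle, using an invented class and method names (no real engine API is implied): each step takes the latest tracker estimate, drains the input event queue, and returns the state the renderers need.

```python
from collections import deque

class VirtualWorldGenerator:
    """Hypothetical sketch of the VWG cycle: consume tracker estimates
    and queued input events, update internal 'reality', and hand the
    renderers what they need for the next frame."""
    def __init__(self):
        self.events = deque()              # keyboard/mouse/controller events
        self.head_pose = (0.0, 0.0, 0.0)   # placeholder head position

    def step(self, tracked_head_pose):
        self.head_pose = tracked_head_pose  # timely estimate from the tracker
        while self.events:                  # drain the queued input events
            self.apply(self.events.popleft())
        return {"viewpoint": self.head_pose}  # state extracted by renderers

    def apply(self, event):
        pass  # game logic, physics, and world updates would go here

vwg = VirtualWorldGenerator()
vwg.events.append("button_A")
frame_state = vwg.step((0.0, 1.6, 0.0))
print(frame_state)
```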
• Matched motion: The most basic operation of the VWG is to maintain a
  correspondence between user motions in the real world and the virtual world.
• In the real world, the user’s motions are confined to a safe region, which we will call
  the matched zone.
• One of the greatest challenges is the mismatch of obstacles: What if the user is
  blocked in the virtual world but not in the real world? The reverse is also possible.
• In a seated experience, the user sits in a chair while wearing a headset. The matched
  zone in this case is a small region, such as one cubic meter, in which users can move
  their heads.
• If the user is not constrained to a seat, then the matched zone could be an entire room
  or an outdoor field.
• Larger matched zones tend to lead to greater safety issues. Users must make sure that
  the matched zone is cleared of dangers in the real world, or the developer should make
  them visible in the virtual world.
• A matched zone is maintained between the user in the real
  world and their representation in the virtual world.
• The matched zone could be moved in the virtual world by using an
  interface, such as a game controller, while the user does not
  correspondingly move in the real world.
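As a toy model, a matched zone can be treated as an axis-aligned box in real-world coordinates, plus an offset that places it somewhere in the virtual world. The helper names below are invented for illustration:

```python
def in_matched_zone(pos, zone_min, zone_max):
    """True if a tracked real-world position lies inside the axis-aligned
    matched zone (the safe region cleared of real-world dangers)."""
    return all(lo <= p <= hi for p, lo, hi in zip(pos, zone_min, zone_max))

def to_virtual(pos, zone_offset):
    """Map a real-world position into the virtual world by the current
    offset of the matched zone (moved, e.g., via a game controller)."""
    return tuple(p + o for p, o in zip(pos, zone_offset))

zone_min, zone_max = (0.0, 0.0, 0.0), (1.0, 1.0, 1.0)  # ~1 m^3 seated zone
print(in_matched_zone((0.5, 0.5, 0.5), zone_min, zone_max))   # inside
print(to_virtual((0.5, 0.5, 0.5), (10.0, 0.0, 10.0)))         # placed in world
```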
• User locomotion: In many VR experiences, users want to move well
  outside of the matched zone.
• Imagine you want to explore a virtual city while remaining seated in the
  real world.
• A popular option is to move oneself in the virtual world by operating a
  game controller, mouse, or keyboard.
• By pressing buttons or moving knobs, your virtual self could be
  walking, running, jumping, swimming, flying, and so on.
• You could also climb aboard a vehicle in the virtual world and operate
  its controls to move yourself.
• These operations are certainly convenient, but often lead to sickness
  because of a mismatch between your balance and visual senses.
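A minimal sketch of controller-driven locomotion: each frame, the virtual position advances along the facing direction while the real body stays seated, which is precisely the decoupling between visual and balance senses noted above. All names and numbers are illustrative:

```python
import math

def locomote(pos, yaw, stick_forward, speed, dt):
    """Advance the user's virtual (x, z) position along the current
    facing direction in response to a forward stick deflection, while
    the real body does not move at all."""
    x, z = pos
    x += math.sin(yaw) * stick_forward * speed * dt
    z += math.cos(yaw) * stick_forward * speed * dt
    return (x, z)

pos = (0.0, 0.0)
for _ in range(60):  # one second of frames at 60 Hz, stick fully forward
    pos = locomote(pos, yaw=0.0, stick_forward=1.0, speed=1.5, dt=1 / 60)
print(pos)  # advanced roughly 1.5 m along +z
```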
• Networked experiences: In the case of a networked VR
  experience, a shared virtual world is maintained by a server.
• Each user has a distinct matched zone. Their matched zones
  might overlap in the real world, but care must then be taken to
  avoid unwanted collisions.
• Within the virtual world, user interactions, including collisions,
  must be managed by the VWG.
• If multiple users are interacting in a social setting, then the
  burdens of matched motions may increase.
• As users meet each other, they could expect to see eye motions,
  facial expressions, and body language.
              Human Physiology and Perception
• Our bodies were not designed for VR. By applying artificial stimulation to the senses, we
  are disrupting the operation of biological mechanisms that have taken hundreds of
  millions of years to evolve in a natural environment.
• We are also providing input to the brain that is not exactly consistent with all of our
  other life experiences.
• In some instances, our bodies may adapt to the new stimuli. This could cause us to
  become unaware of flaws in the VR system.
• In other cases, we might develop heightened awareness or the ability to interpret 3D
  scenes that were once difficult or ambiguous.
• Unfortunately, there are also many cases where our bodies react by increased fatigue or
  headaches, partly because the brain is working harder than usual to interpret the
  stimuli.
• Finally, the worst case is the onset of VR sickness, which typically involves symptoms of
  dizziness and nausea.
• Perceptual psychology is the science of understanding how the brain
  converts sensory stimulation into perceived phenomena.
• Here are some typical questions that arise in VR:
1. How far away does that object appear to be?
2. How much video resolution is needed to avoid seeing pixels?
3. How many frames per second are enough to perceive motion as continuous?
4. Is the user’s head appearing at the proper height in the virtual world?
5. Where is that virtual sound coming from?
6. Why am I feeling nauseated?
7. Why is one experience more tiring than another?
8. What is presence?
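Question 2 admits a back-of-the-envelope answer: normal (20/20) visual acuity resolves roughly one arcminute, so a display needs on the order of 60 pixels per degree before individual pixels become invisible. The sketch below is a rough estimate under that assumption, not a rigorous perceptual model:

```python
import math

ACUITY_ARCMIN = 1.0  # ~smallest angle resolved by a typical (20/20) eye

def pixels_needed(fov_degrees):
    """Rough pixel count across a display's field of view before pixels
    disappear, assuming one pixel per arcminute of visual acuity
    (i.e., about 60 pixels per degree)."""
    return math.ceil(fov_degrees * 60.0 / ACUITY_ARCMIN)

print(pixels_needed(100))  # a 100-degree FOV would need ~6000 pixels across
```

Current headsets fall well short of this figure, which is one reason the "screen-door effect" of visible pixels persists.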
• To answer these questions and more, we must understand several things:
1. Basic physiology of the human body, including sense organs and neural
   pathways,
2. The key theories and insights of experimental perceptual psychology,
3. The interference of the engineered VR system with our common perceptual
   processes and the resulting implications or side effects.
• The perceptual side of VR often attracts far too little attention among
  developers.
• In the real world, perceptual processes are mostly invisible to us. Think
  about how little conscious effort it takes to recognize a family member.
• Optical illusions: One of the most popular ways to appreciate the complexity
  of our perceptual processing is to view optical illusions.
• Each one is designed to reveal some shortcoming of our visual system by
  providing a stimulus that is not quite consistent with ordinary stimuli in our
  everyday lives.
• These should motivate you to appreciate the amount of work that our sense
  organs and neural structures are doing to fill in missing details and make
  interpretations based on the context of our life experiences and existing
  biological structures.
• Classification of senses: Perception and illusions are not limited to our
  eyes.
• In each eye, over 100 million photoreceptors respond to electromagnetic energy
  precisely in the frequency range of visible light.
• The auditory, touch, and balance senses involve motion, vibration, or gravitational force;
  these are sensed by mechanoreceptors.
• The sense of touch additionally involves thermoreceptors to detect change in temperature.
• Our balance sense helps us to know which way our head is oriented.
• Finally, our senses of taste and smell are grouped into one category, called the
  chemical senses, which relies on chemoreceptors; these provide signals based on the
  chemical composition of matter on our tongue or in our nasal passages.
• Perception happens after the sense organs convert the stimuli into neural impulses.
• According to the latest estimates, the human body contains around 86 billion neurons. Around 20 billion are
  devoted to the part of the brain called the cerebral cortex, which handles perception and many other high-level
  functions such as attention, memory, language, and consciousness.
• Another important factor in perception and overall cognitive ability is the interconnection between neurons. The
  nucleus or cell body of each neuron is a node that does some kind of “processing”.
• The dendrites are essentially input edges to the neuron, whereas the axons are output edges.
• Through a network of dendrites, the neuron can aggregate information from numerous other neurons, which
  themselves may have aggregated information from others.
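The dendrite/axon description above can be caricatured as a weighted sum with a firing threshold. This is a toy illustration of aggregation only, not a biophysically accurate neuron model:

```python
def neuron_output(inputs, weights, threshold):
    """Toy model of neural aggregation: the cell body sums the signals
    arriving on its dendrites (weighted inputs) and fires on its axon
    when the sum crosses a threshold."""
    activation = sum(x * w for x, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

# Three upstream neurons feed this one through its dendrites:
print(neuron_output([1, 0, 1], [0.6, 0.9, 0.5], threshold=1.0))  # fires
print(neuron_output([0, 1, 0], [0.6, 0.9, 0.5], threshold=1.0))  # stays silent
```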