AR/VR Lab

The document outlines a series of experiments focused on various 3D modeling and application development tools such as Unity, Maya, 3DS Max, AR Toolkit, Vuforia, and Blender. It details the aims, algorithms, and results of each experiment, which include tasks like modeling 3D objects, creating realistic scenes, and developing VR and AR applications. Each experiment concludes with a verification of successful execution.


CONTENTS

S.No  Experiment
1.  Study of tools like Unity, Maya, 3DS Max, AR Toolkit, Vuforia and Blender
2.  Use the primitive objects and apply various projection types by handling the camera
3.  Download objects from the asset store and apply various lighting and shading effects
4.  Model three-dimensional objects using various modelling techniques and apply textures over them
5.  Create three-dimensional realistic scenes and develop simple virtual reality enabled mobile applications which have limited interactivity
6.  Add audio and text special effects to the developed application
7.  Develop VR enabled applications using motion trackers and sensors incorporating full haptic interactivity
8.  Develop AR enabled applications with interactivity like E-learning environment, virtual walkthroughs and visualization of historic places
9.  Develop AR enabled simple applications like human anatomy visualization, DNA/RNA structure visualization and surgery simulation
10. Develop simple MR enabled gaming applications
Ex No: 1 Study of tools like Unity, Maya, 3DS Max, AR Toolkit, Vuforia and Blender
Date:

AIM:
To study tools like Unity, Maya, 3DS Max, AR Toolkit, Vuforia and Blender.
ALGORITHM:
STEP 1: Research and familiarization.
STEP 2: Hands-on practice.
STEP 3: Project-based learning.
STEP 4: Integration and collaboration.
STEP 5: Iterative learning and optimization.
STEP 6: Portfolio development.

UNITY:
Unity is a 2D/3D engine and framework that gives you a system for designing game or app
scenes for 2D, 2.5D and 3D.

COMPONENTS OF UNITY:
• Assets
• Project
• Packages
• Scene
• Components
• GameObject
• Prefab
• Build

MAYA:
Maya is 3D computer graphics software used for creating interactive 3D animations, models,
and simulations. It was first released in 1998 and has since become one of the most
widely used 3D packages in the entertainment, architecture, and product design industries.
Maya is professional 3D software for creating realistic characters and blockbuster-worthy
effects.

• Bring believable characters to life with engaging animation tools.

• Shape 3D objects and scenes with intuitive modelling tools.

• Create realistic effects – from explosions to cloth simulation.


Why use Maya?

• Accelerated workflows: Maya's powerful tools help you iterate faster so you can focus on creativity and meet deadlines.

• Deliver stunning visuals: Add fine details to characters and scenes, and deliver quality work that keeps clients coming back.

COMPONENTS OF MAYA:
• Application Home.
• Menus and menu sets.
• Workspaces.
• Quick layout and Outliner buttons.
• View panel.
• Panel toolbar.
• Status line (toolbar)
• Shelves.

3DS MAX:
3ds Max is professional 3D modelling, rendering and animation software that enables you to
create expansive worlds and premium designs.

• Breathe life into environments and landscapes with robust modelling tools.

• Create finely detailed designs and props with intuitive texturing and shading tools.

• Iterate and produce professional-grade renders with full artistic control.


COMPONENTS OF 3DS MAX:

1. Menu bar

2. Toolbar

3. Animation controls

4. Command panel

5. MAXScript

6. Viewports

7. Color settings

8. Text editor

9. Status bar
AR TOOLKIT:

ARToolKit is an open-source computer tracking library for the creation of strong augmented reality
applications that overlay virtual imagery on the real world. It is currently maintained as an
open-source project hosted on GitHub.

PROGRAMMING LANGUAGE:

C++

VUFORIA:

Vuforia Engine is a software development kit (SDK) for creating Augmented Reality apps. With
the SDK, you add advanced computer vision functionality to your application, allowing it to
recognize images, objects, and spaces with intuitive options to configure your app to interact
with the real world.

BLENDER:

Blender is the free and open source 3D creation suite. It supports the entirety of the 3D
pipeline—modeling, rigging, animation, simulation, rendering, compositing and motion tracking,
even video editing and game creation.

• Example components and nodes: Volumes, Displacement, Color Ramp Node, Combine Color Node, Combine XYZ Node, Crease Angle.
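
Blender can also be scripted through its built-in Python API (bpy). The snippet below is a minimal sketch, assuming it is run from Blender's Scripting workspace; the object placement and the output file name are illustrative and not part of the original manual.

import bpy

# Add a primitive cube, a point light and a camera to the current scene
bpy.ops.mesh.primitive_cube_add(size=1, location=(0, 0, 0))
bpy.ops.object.light_add(type='POINT', location=(2, -2, 3))
bpy.ops.object.camera_add(location=(4, -4, 3), rotation=(1.1, 0, 0.785))
bpy.context.scene.camera = bpy.context.object   # make the new camera the active camera

# Render a still image next to the .blend file (path is a placeholder)
bpy.context.scene.render.filepath = "//scripted_render.png"
bpy.ops.render.render(write_still=True)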
RESULT:

Thus the study of tools like Unity, Maya, 3DS Max, AR Toolkit, Vuforia and Blender was
completed and verified successfully.
EX NO: 2 USE THE PRIMITIVE OBJECTS AND APPLY VARIOUS PROJECTION TYPES BY HANDLING THE CAMERA.
DATE:

AIM:
To use the primitive objects and apply various projection types by handling the camera.
ALGORITHM:
STEP 1: Set up the scene.
STEP 2: Choose the projection types.
STEP 3: Implement the projection functions.
STEP 4: Apply the projections to objects.
STEP 5: Visualize the results.
STEP 6: Adjust camera parameters (optional).
STEP 7: Iterate and experiment.
STEP 8: Document and present the results.

1. PERSPECTIVE PROJECTION:
2. ORTHOGRAPHIC PROJECTION:
3. OBLIQUE PROJECTION:
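
A camera's projection type in Blender can be switched from the Python console; the following is a minimal sketch (the camera name and values are illustrative). Blender has no built-in oblique projection, so it is usually approximated with the camera shift settings.

import bpy

# Create a camera object and make it the active scene camera
cam_data = bpy.data.cameras.new("LabCamera")
cam_obj = bpy.data.objects.new("LabCamera", cam_data)
bpy.context.scene.collection.objects.link(cam_obj)
bpy.context.scene.camera = cam_obj

# 1. Perspective projection
cam_data.type = 'PERSP'
cam_data.lens = 50            # focal length in millimetres

# 2. Orthographic projection
# cam_data.type = 'ORTHO'
# cam_data.ortho_scale = 6    # width of the orthographic view

# 3. Oblique-style projection (approximated via lens shift)
# cam_data.shift_x = 0.3
# cam_data.shift_y = 0.2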

RESULT:

Thus primitive objects were used and various projection types were applied by handling the camera
in the Blender application; the experiment was verified and executed successfully.
EX NO: 3 DOWNLOAD OBJECTS FROM THE ASSET STORE AND APPLY VARIOUS LIGHTING AND SHADING EFFECTS.
DATE:

AIM:

To download objects from the asset store and apply various lighting and shading effects.

ALGORITHM:

1. Click Open in Unity.
2. This launches the Unity Hub.
3. Create a new project or launch an existing one.
4. This launches your project and brings up the Package Manager.
5. Click Download in the Package Manager window.

STEP 1: Download the Blender application and open it.

STEP 2: Download Unreal Engine and the Epic Games Launcher.

STEP 3: Open the Epic Games Launcher.

STEP 4: Select Marketplace, choose Free, and then select Epic Games content.

STEP 5: Search for the best assets and download a free asset.

STEP 6: Go to the Blender application and add your assets.

STEP 7: Go to the search bar, search for your objects, and download the asset.
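
After an asset has been imported, basic lighting and shading can likewise be applied through Blender's Python API. The sketch below assumes the imported mesh is the active object; the light type and material values are illustrative.

import bpy

# Add a sun light above the scene and set its strength
bpy.ops.object.light_add(type='SUN', location=(0, 0, 5))
bpy.context.object.data.energy = 3.0

# Create a simple shaded material and assign it to the active (imported) object
obj = bpy.context.view_layer.objects.active
mat = bpy.data.materials.new(name="DemoMaterial")
mat.use_nodes = True
bsdf = mat.node_tree.nodes["Principled BSDF"]
bsdf.inputs["Base Color"].default_value = (0.8, 0.2, 0.2, 1.0)
bsdf.inputs["Roughness"].default_value = 0.4
obj.data.materials.append(mat)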
OUTPUT:

RESULT:
Thus objects were downloaded from the asset store and various lighting and shading effects were applied
using the Blender application; the experiment was verified and executed successfully.
Ex No: 04 Model three-dimensional objects using various modelling techniques and apply textures over them
Date:

AIM:
To model three dimensional objects using various modelling techniques and apply
textures over them.
ALGORITHM:
1. Import Statements
2. Initialization
3. Texture Loading
4. Sphere Drawing Function (draw_sphere())
5. Main Loop
6. OpenGL Setup
7. Sphere Rotation
8. Texture Mapping
9. Display Update
PROGRAM:
import pygame
from pygame.locals import *
from OpenGL.GL import *
from OpenGL.GLU import *
from math import pi, sin, cos

# Initialize Pygame
pygame.init()

# Set display size
display = (800, 600)
pygame.display.set_mode(display, DOUBLEBUF | OPENGL)

# Load texture
texture_surface = pygame.image.load("texture.jpg")
texture_data = pygame.image.tostring(texture_surface, "RGB", 1)
width = texture_surface.get_width()
height = texture_surface.get_height()

# Create OpenGL texture
texture_id = glGenTextures(1)
glBindTexture(GL_TEXTURE_2D, texture_id)
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR)
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR)
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, width, height, 0, GL_RGB,
             GL_UNSIGNED_BYTE, texture_data)

# Draw a textured sphere from latitude/longitude quad strips
def draw_sphere(radius, slices, stacks):
    for i in range(stacks):
        lat0 = pi * (-0.5 + i / stacks)
        z0 = sin(lat0)
        zr0 = cos(lat0)

        lat1 = pi * (-0.5 + (i + 1) / stacks)
        z1 = sin(lat1)
        zr1 = cos(lat1)

        glBegin(GL_QUAD_STRIP)
        for j in range(slices + 1):
            lng = 2 * pi * j / slices
            x = cos(lng)
            y = sin(lng)

            glTexCoord2f(j / slices, i / stacks)
            glVertex3f(x * zr0 * radius, y * zr0 * radius, z0 * radius)

            glTexCoord2f(j / slices, (i + 1) / stacks)
            glVertex3f(x * zr1 * radius, y * zr1 * radius, z1 * radius)
        glEnd()

# Main loop
clock = pygame.time.Clock()
rotation_angle = 0
while True:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            pygame.quit()
            quit()

    # Clear the screen and set up the camera
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT)
    glLoadIdentity()
    gluPerspective(45, (display[0] / display[1]), 0.1, 50.0)
    glTranslatef(0.0, 0.0, -5)

    # Apply rotation
    rotation_angle += 1
    glRotatef(rotation_angle, 1, 1, 1)

    # Draw the textured sphere
    glEnable(GL_TEXTURE_2D)
    glBindTexture(GL_TEXTURE_2D, texture_id)
    draw_sphere(1, 50, 50)
    glDisable(GL_TEXTURE_2D)

    # Update the display
    pygame.display.flip()
    clock.tick(60)  # Limit to 60 frames per second
OUTPUT:

RESULT:
Thus three-dimensional objects were modelled using various modelling techniques
and textures were applied over them; the program was executed successfully.
EX NO: 05 Create three-dimensional realistic scenes and develop simple virtual reality enabled mobile applications which have limited interactivity
DATE:
AIM:
To create three dimensional realistic scenes and develop simple virtual reality enabled
mobile applications which have limited interactivity.
ALGORITHM:
1. Import Statements
2. vizconnect Configuration
3. Load Environment
4. Define Grabbable Objects
5. Initialize Grabber Tools
6. Collect Participant Information
7. Event Handling for Grabbing Objects
8. Data Logging
9. Tracking Data Collection
PROGRAM:
import viz
import vizfx
import vizconnect
import vizinput
import vizact

# Can use the desktop configuration for desktop mode, or create your own
vizconnect.go('vizconnect_config_vive.py')

env = vizfx.addChild('resources/kitchen.osgb')

# Add objects to grab
blueCup = env.getChild('blueCup')
silverCup = env.getChild('silverCup')
grabbableObjects = [blueCup, silverCup]

# Get the grabber tools by name and supply the list of items which can be grabbed
grabber = vizconnect.getRawTool('grabber')
grabber.setItems(grabbableObjects)
grabber2 = vizconnect.getRawTool('grabber2')
grabber2.setItems(grabbableObjects)

# Collecting participant info
subject = vizinput.input('What is the participant number?')

# Start timer
start_time = viz.tick()

# Save data for grab events
session_data = open('data/session_data' + str(subject) + '.txt', 'a')

def onGrab(e):
    elapsed_time = viz.tick() - start_time
    if e.grabbed == blueCup:
        data = 'Subject ' + str(subject) + ' grabbed blueCup.\t'
        print('grabbed blue cup')
    if e.grabbed == silverCup:
        data = 'Subject ' + str(subject) + ' grabbed silverCup.\t'
        print('grabbed silver cup')
    # Add elapsed time to the data and write it out
    data = data + 'Elapsed time was: ' + str(round(elapsed_time, 2)) + ' seconds\n'
    session_data.write(data)

from tools import grabber
viz.callback(grabber.GRAB_EVENT, onGrab)

# Save data for tracking
tracking_data = open('data/tracking_' + str(subject) + '.txt', 'a')

# Get the tracking data
def getData():
    position = viz.MainView.getPosition()
    # Make a string out of the data
    data = str(round(position[0], 2)) + '\t' + str(round(position[1], 2)) + '\t' + str(round(position[2], 2)) + '\n'
    # Write it to the tracking file
    tracking_data.write(data)

vizact.onupdate(0, getData)
OUTPUT:

RESULT:
Thus three-dimensional realistic scenes were created and a simple virtual reality enabled
mobile application with limited interactivity was developed and executed successfully.
Ex No: 6 Develop AR enabled simple applications like human anatomy visualization, DNA/RNA structure visualization and surgery simulation
Date:
Aim:
To develop AR enabled simple applications like human anatomy visualization,
DNA/RNA structure visualization and surgery simulation

ALGORITHM:
1. Import Statements
2. vizconnect Configuration
3. Load Environment
4. Define Grabbable Objects
5. Initialize Grabber Tools
6. Collect Participant Information
7. Event Handling for Grabbing Objects
8. Data Logging
9. Tracking Data Collection
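
As a minimal illustration of loading a model and adding limited interaction in the same Vizard style as Experiment 5, the sketch below can be used as a starting point; the model path 'resources/anatomy.osgb' and the child node name 'skeleton' are hypothetical placeholders.

import viz
import vizfx
import vizact

viz.go()   # desktop mode; a vizconnect configuration could be used instead

# Load a hypothetical anatomy model and place it in front of the viewer
anatomy = vizfx.addChild('resources/anatomy.osgb')
anatomy.setPosition(0, 1.5, 2)

# Slowly spin the model so it can be inspected from all sides
anatomy.addAction(vizact.spin(0, 1, 0, 15))   # 15 degrees per second around Y

# Toggle a hypothetical 'skeleton' sub-node with the space bar
def toggleSkeleton():
    anatomy.getChild('skeleton').visible(viz.TOGGLE)

vizact.onkeydown(' ', toggleSkeleton)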

PROGRAM:
# WaveGrad2 - PyTorch Implementation

(model architecture figures: img/model_1.png, img/model_2.png)

# Quickstart

pip3 install -r requirements.txt

(TensorBoard screenshots: img/tensorboard_loss.png, img/tensorboard_spec.png, img/tensorboard_audio.png)

@misc{lee2021wavegrad2,
  author = {Lee, Keon},
  title  = {WaveGrad2},
  year   = {2021}
}
Output:

Result:
Thus AR enabled simple applications like human anatomy visualization, DNA/RNA structure
visualization and surgery simulation were developed, executed and verified successfully.
EX NO: 07 Develop VR enabled applications using motion trackers and sensors incorporating full haptic interactivity
DATE:

AIM:
To develop VR enabled applications using motion trackers and sensors incorporating
full haptic interactivity.
ALGORITHM:
1. Initialization
2. OpenGL Setup
3. Cube Definition
4. Haptic Feedback Function
5. Main Loop
6. OpenGL Drawing
7. Mouse Input for Rotation
8. Display Update
PROGRAM:
import pygame
from pygame.locals import *
from OpenGL.GL import *
from OpenGL.GLU import *

# Initialize Pygame
pygame.init()
display = (800, 600)
pygame.display.set_mode(display, DOUBLEBUF | OPENGL)

# Set up the OpenGL perspective
gluPerspective(45, (display[0] / display[1]), 0.1, 50.0)
glTranslatef(0.0, 0.0, -5)

# Define cube vertices and edges
vertices = (
    (1, -1, -1),
    (1, 1, -1),
    (-1, 1, -1),
    (-1, -1, -1),
    (1, -1, 1),
    (1, 1, 1),
    (-1, -1, 1),
    (-1, 1, 1)
)

edges = (
    (0, 1), (0, 3), (0, 4),
    (2, 1), (2, 3), (2, 7),
    (6, 3), (6, 4), (6, 7),
    (5, 1), (5, 4), (5, 7)
)

# Define haptic feedback function (placeholder for a real haptic device call)
def apply_haptic_feedback():
    print("Haptic feedback applied!")

# Main loop
while True:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            pygame.quit()
            quit()
        elif event.type == pygame.MOUSEBUTTONDOWN:
            apply_haptic_feedback()

    # Clear the screen
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT)

    # Draw the cube
    glBegin(GL_LINES)
    for edge in edges:
        for vertex in edge:
            glVertex3fv(vertices[vertex])
    glEnd()

    # Get mouse input for motion tracking
    x, y = pygame.mouse.get_pos()
    rotation_speed = 0.1
    glRotatef(x * rotation_speed, 0, 1, 0)
    glRotatef(y * rotation_speed, 1, 0, 0)

    # Update the display
    pygame.display.flip()
    pygame.time.wait(10)
OUTPUT:

Haptic feedback applied!


Haptic feedback applied!
Haptic feedback applied!
Haptic feedback applied!
Haptic feedback applied!
Haptic feedback applied!
Haptic feedback applied!
Haptic feedback applied!

RESULT:
Thus VR enabled applications using motion trackers and sensors incorporating
full haptic interactivity were developed and executed successfully.
Ex No: 8 Develop AR enabled applications with interactivity like E-learning environment, virtual walkthroughs and visualization of historic places
Date:

Aim:
To develop AR enabled applications with interactivity like E-learning environment, virtual
walkthroughs and visualization of historic places.
Procedure:

1. E-learning Environment:

• Concept: Overlay 3D models, animations, and text labels on real-world objects or environments to enhance learning.
• Implementation: Use frameworks like ARKit (iOS) or ARCore (Android) for device-specific AR development (a marker-based desktop sketch is shown after this list).

2. Virtual Walkthroughs:

• Concept: Create an immersive experience where users can virtually explore buildings, locations, or even historical sites.
• Implementation: Develop 3D models of the environment using photogrammetry (creating 3D models from photographs) or 3D modelling tools.

3. Visualization of Historic Places:

• Concept: Overlay historical reconstructions or augmented information on top of real-world locations to bring history to life.
• Implementation: Gather historical data, photos, and 3D models (if available) of the location in its past state.
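
The overlay idea in step 1 can be prototyped on a desktop without ARKit or ARCore by using marker-based tracking instead. The sketch below is a plain Python alternative using OpenCV's ArUco module (it assumes opencv-contrib-python is installed and a webcam is available); the label text is purely illustrative.

import cv2

# Detect ArUco markers in the webcam feed and overlay a text label on each one
detector = cv2.aruco.ArucoDetector(cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50))
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    corners, ids, _ = detector.detectMarkers(frame)
    if ids is not None:
        cv2.aruco.drawDetectedMarkers(frame, corners, ids)
        for marker_corners, marker_id in zip(corners, ids.flatten()):
            x, y = marker_corners[0][0]   # first corner of the marker
            cv2.putText(frame, "Exhibit %d" % marker_id, (int(x), int(y) - 10),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    cv2.imshow("AR overlay", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()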

Program:

<p align="center">
<br>
<img src="business-card-ar.jpg" alt="Business Card AR">
<br>
<br>
</p>
# Demo

YouTube URL - [https://youtu.be/Q5jT1cb_o1o] (https://youtu.be/Q5jT1cb_o1o)

# Download Build

- Download the latest release


[here](https://github.com/AgrMayank/Business-Card-AR/releases)
## Note

- A _Vuforia license_ (free/paid) is required for building this project for


Android/iOS.
- The Image target needs to be changed before building the project.

## License

This project is licensed under a


[The GNU General Public License v3.0](https://www.gnu.org/licenses/gpl-3.0.en.html).

<hr>
Made with by [AgrMayank](https://AgrMayank.GitHub.io)
Output:

Result:
Thus AR enabled applications with interactivity like an E-learning environment, virtual
walkthroughs and visualization of historic places were developed and executed successfully.
Ex No: 9 Develop AR enabled simple applications like human anatomy visualization, DNA/RNA structure visualization and surgery simulation
Date:

Aim:
To develop AR enabled simple applications like human anatomy visualization, DNA/RNA
structure visualization and surgery simulation.

Algorithm:

1. Human Anatomy Visualization Algorithm:

• Main Goal: Allow users to explore a 3D model of the human body and interact with
specific organs or systems.
• Algorithm Steps:
1. Initialization:
▪ Load the 3D model of the human body into the AR application.
▪ Define variables to track user position and orientation (using ARKit or
ARCore).
2. User Interaction:
▪ Detect user taps or gaze on the screen (touch or camera-based).
▪ Calculate the intersection between the user's ray cast (originating from the
device) and the 3D model (a minimal ray-sphere sketch is given after these algorithm outlines).

2. DNA/RNA Structure Visualization Algorithm:

• Main Goal: Allow users to explore a 3D model of a DNA or RNA molecule and
understand its structure.
• Algorithm Steps:
1. Initialization:
▪ Load the 3D model of the DNA/RNA molecule into the AR application.
▪ Define variables to track user interaction (touch, gestures).
2. User Interaction:
▪ Detect user taps, gestures, or pinching motions on the screen.
▪ Interpret user actions as zooming, rotating, or manipulating the molecule.

3. Surgery Simulation Algorithm (Simple Version):

▪ Load 3D models of a virtual patient (focusing on the surgical area), surgical tools, and a virtual camera.
▪ Define variables to track user hand movements (potentially using external tracking devices).
▪ Track user hand movements and translate them to virtual tool manipulation within the AR space.
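
The intersection step in the anatomy algorithm above is the core interaction in all three applications: a ray is cast from the device through the tapped screen point and tested against the model, often using just its bounding sphere. A minimal ray-sphere test in plain Python, with illustrative values:

import math

def ray_hits_sphere(ray_origin, ray_dir, center, radius):
    """Return True if a ray (origin + t * dir, t >= 0) intersects a sphere."""
    # Vector from the ray origin to the sphere centre
    oc = [c - o for o, c in zip(ray_origin, center)]
    # Distance along the (normalised) ray direction to the closest approach
    t = sum(a * b for a, b in zip(oc, ray_dir))
    if t < 0:
        return False          # the sphere is behind the viewer
    closest = [o + t * d for o, d in zip(ray_origin, ray_dir)]
    return math.dist(closest, center) <= radius

# Example: ray from the camera straight ahead, organ approximated by a bounding sphere
print(ray_hits_sphere((0, 0, 0), (0, 0, 1), center=(0.1, 0.0, 2.0), radius=0.3))   # True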
Program:

<h1 align="center">
<img src="https://github.com/xianfei/SysMocap/assets/8101613/adca7a3c-bdb2-4bda-af26-
7ef9ba218c4c" align="center" height="128px" width="128px">
SysMocap
</h1>

<p align="center">
<a href="https://github.com/xianfei/SysMocap/actions" target="_blank">
<img src="https://github.com/xianfei/SysMocap/actions/workflows/main.yml/badge.svg"
alt="GitHub Actions" />
</a>
<a href="https://github.com/xianfei/SysMocap/releases" target="_blank">
<img src="https://badgen.net/github/release/xianfei/SysMocap?color=cyan" alt="release" />
</a>
<a href="#" target="_blank">
<img src="https://badgen.net/github/forks/xianfei/SysMocap" alt="forks" />
</a>
<a href="#" target="_blank">
<img src="https://badgen.net/github/stars/xianfei/SysMocap?color=yellow" alt="stars" />
</a>

</p>

<p align="center">
English Version | <a href="./README.zh-cn.md">中文版本</a>
</p>

<img width="478" alt="image"


src="https://github.com/xianfei/SysMocap/assets/8101613/7b747e44-789c-4a61-83d7-
c8e784a14856">

- If you got `“SysMocap” is damaged and can’t be opened. You should move it to the Trash.`
Please run `sudo xattr -r -d com.apple.quarantine /Applications/SysMocap.app` in your terminal

#### Run on your computer from source code (need lastest Node.js):

```shell
git clone https://github.com/xianfei/SysMocap.git
cd SysMocap
npm i
npm start
```
### Bugs

- You tell me

### Notice
### Star History

[![Star History Chart](https://api.star-


history.com/svg?repos=xianfei/SysMocap&type=Date)](https://star-
history.com/#xianfei/SysMocap&Date)

- [google/mediapipe/Holistic](https://google.github.io/mediapipe/solutions/holistic.html) for
Mocap

- [kalidokit](https://github.com/yeemachine/kalidokit) for Calulate Mocap Data

- [electron](https://github.com/electron/electron) and Vue.js for GUI Framework

- [Material color utilities](https://github.com/material-foundation/material-color-utilities) for


Color Picking

author={Song, Wenfeng and Wang, Xianfei and Gao, Yang and Hao, Aimin and Hou, Xia},
booktitle={2022 IEEE International Symposium on Mixed and Augmented Reality Adjunct
(ISMAR-Adjunct)},
title={Real-time Expressive Avatar Animation Generation based on Monocular Videos},
year={2022},
volume={},
number={},
pages={429-434},
doi={10.1109/ISMAR-Adjunct57072.2022.00092}}
output:

Result:

Thus AR enabled simple applications like human anatomy visualization, DNA/RNA structure
visualization and surgery simulation were developed and executed successfully.
Ex no:10 Develop simple MR enabled gaming applications
Date:

Aim:
To develop simple MR enabled gaming applications.
Algorithm:
1. Choose a Game Concept

2. MR Development Environment

3. Spatial Mapping

4. Object Recognition

5. Character or Object Interaction

6. User Interface and Feedback

7. Gameplay Mechanics

8. Testing and Optimization

9. Deployment and Distribution
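
The sample project below (Passthrough Measure) is a concrete example of step 7: its entire gameplay mechanic is the distance between two controller-tracked points. A minimal, device-independent sketch in Python follows; the controller positions at the bottom are illustrative placeholders for real tracking data.

import math

measurements = []        # completed measurements, in metres
pending_start = None     # set after the first trigger press

def on_trigger_pressed(controller_position):
    """First press stores the start point; the second press stores the distance."""
    global pending_start
    if pending_start is None:
        pending_start = controller_position
    else:
        length = math.dist(pending_start, controller_position)
        measurements.append(length)
        print("Measured %.3f m" % length)
        pending_start = None

def on_b_pressed():
    """Cancel the measurement in progress, or clear all finished measurements."""
    global pending_start, measurements
    if pending_start is not None:
        pending_start = None
    else:
        measurements = []

# Simulated controller positions in world space (metres)
on_trigger_pressed((0.00, 1.00, 0.50))
on_trigger_pressed((0.30, 1.00, 0.90))   # prints "Measured 0.500 m"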

Program:
# Passthrough Measure

Use your Oculus Quest 2 with Passthrough as a tape measure. Keep in mind that this is a **very
early prototype**.

## Screenshots

<img src="Images/1.jpg" width="300" />

<img src="Images/2.jpg" width="300" />

<img src="Images/3.jpg" width="300" />


## Requirements

- Oculus Quest 2 (system version 35)


- [SideQuest](https://sidequestvr.com)

## How to install? (from GitHub)

- Follow [these instructions](https://sidequestvr.com/setup-howto) to install SideQuest and setup


your Oculus Quest 2 for development.

- [Download the latest


APK](https://github.com/fabio914/PassthroughMeasure/releases/latest/download/measure.apk).

## How to use it?

- On your Quest 2: open "Apps", select "Unknown Sources" on the drop-down menu on the top
right corner, and then select "Measure".

- Press the trigger button to start measuring, and then press it a second time to stop measuring or
press "B" (on the right controller) to cancel this measurement.

- Press "B" (on the right controller) to clear all measurements.

## How to build?

- Make sure you have Unity 2020.3 LTS installed (with Android Build support).

- Clone this project.

- Open the project with Unity, then open the Package Manager, and import the Oculus
Integration package (version 35.0).

- Navigate to **File > Build Settings...**, select the **Android** platform, then select your
Oculus Quest as the **Run device** (if it's plugged in) and then click on **Build and Run**.

## TO-DOs

- [ ] Add support for imperial units.

- [ ] Show instructions.
- [ ] Allow the user to remove a specific measurement.

Output:
Result:
Thus simple MR enabled gaming applications were developed, executed and verified
successfully.
