
About

The CAVE (Augmented and Virtual Environments) Lab is a research initiative under the Center for IoT, PESU, that explores applications of mixed reality, digital twins, human motion tracking, and generative alternate realities in the metaverse. The goal is to teach, learn, research, innovate, and collaborate with industry in the mixed reality space, and the lab is equipped with hardware that supports this research. CAVE also focuses on applying VR technology to remote healthcare, neurodevelopmental disorders, and related areas.


We are recruiting!

Apply for an internship at the CAVE Augmented and Virtual Environments Lab!

List of open projects:

  • Intelligent Virtual Reality Driving Simulator

  • Virtual Reality-Based Education for Autism Spectrum Disorder Patients

  • Mixed Reality Drone Pilot Trainer

  • Human Motion Tracking System Using Sparse IMU Sensors

  • Generative Medical Dataset for Virtual Patients

  • Building Virtual Reality Simulators for Nursing/Medical Training

  • Virtual Reconstruction of Human Motion Data Using IMUs on a 3D Avatar

 

Note:

  • Each project will require 3 students

  • The duration of the internship will be from January 1st to June 30th

  • Stipend will be given as per University norms

  • Open to students of CSE, ECE, and EEE from the 8th semester

Write to: iotcenter@pes.edu

Domains

CAVE focuses on mixed reality research, specifically targeting the following verticals:

Digital Twin

A digital twin is a next-generation technology: a virtual representation of a physical entity. Current state-of-the-art systems are typically used in industrial and manufacturing applications, such as the design and testing of new products and the monitoring and maintenance of existing equipment. CAVE's focus is to explore static, functional, and executable digital twins (DTs) in the metaverse, not just for products and processes but also for physical spaces.
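The distinction between a functional and an executable twin can be sketched in a few lines of code. This is a minimal conceptual illustration only, with invented class and field names, not an implementation used by the lab: the twin mirrors sensed state from the physical asset (functional), and can additionally run "what-if" simulations detached from the physical space (executable).

```python
from dataclasses import dataclass, field

@dataclass
class RoomTwin:
    """A toy executable twin of a physical room: mirrors sensed state
    and can run 'what-if' simulations detached from the physical space."""
    temperature_c: float = 22.0
    occupancy: int = 0
    history: list = field(default_factory=list)

    def ingest(self, reading):
        """Functional twin: sync virtual state from a physical sensor reading."""
        self.temperature_c = reading["temperature_c"]
        self.occupancy = reading["occupancy"]
        self.history.append(dict(reading))

    def simulate_occupancy(self, extra_people, heat_per_person=0.3):
        """Executable twin: predict temperature if more people entered.
        heat_per_person is an illustrative made-up coefficient."""
        return self.temperature_c + extra_people * heat_per_person

twin = RoomTwin()
twin.ingest({"temperature_c": 24.1, "occupancy": 12})
print(round(twin.simulate_occupancy(8), 1))  # → 26.5
```

A static twin would stop at the geometry; the `ingest`/`simulate_occupancy` pair is what makes the twin functional and executable, respectively.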

Generative Alternate Realities


Generative AI is the buzz in the market. From image and video synthesis, summarization, and text synthesis to music and speech, generative AI has captured the attention of the research community. In CAVE, the focus is on generating alternate reality spaces using AI. These alternate realities can be used to create immersive experiences and to simulate real-world scenarios in virtual environments. The goal of this field is to create synthetic environments that are as realistic and believable as possible. Generative alternate realities are an emerging field, and building them is a complex task that requires a combination of generative models, computer graphics, and virtual reality technologies.

Human Motion Tracking


Virtual reality applications do not work without tracking human movement; in other words, tracking human movement is essential for effective user interaction. CAVE focuses on exploring tracking techniques and their effectiveness in a variety of applications, including indoor navigation, autonomous vehicles, biomechanics, sports training, gaming, human-computer interaction, and event management.
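A minimal sketch of the core operation behind IMU-based motion tracking (illustrative only, not the lab's pipeline): an IMU reports its orientation as a unit quaternion, and rotating a limb's rest-pose direction by that quaternion gives the limb's current direction, which can then drive a 3D avatar.

```python
import numpy as np

def quat_rotate(q, v):
    """Rotate 3-vector v by unit quaternion q = (w, x, y, z),
    expanding q * v * q^-1 without building full quaternion products."""
    w, x, y, z = q
    u = np.array([x, y, z])
    return (2.0 * np.dot(u, v) * u
            + (w * w - np.dot(u, u)) * v
            + 2.0 * w * np.cross(u, v))

# Example: a forearm at rest points along +X; the IMU reports a
# 90-degree rotation about the Z axis, so the forearm now points along +Y.
rest_dir = np.array([1.0, 0.0, 0.0])
half = np.pi / 4  # half of 90 degrees
q = np.array([np.cos(half), 0.0, 0.0, np.sin(half)])
print(np.round(quat_rotate(q, rest_dir), 3))  # → [0. 1. 0.]
```

Chaining such rotations down a kinematic skeleton (shoulder, elbow, wrist) is what lets a handful of sparse IMUs reconstruct whole-limb motion.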

Remote Health Care


Virtual reality (VR) technology can be used in remote healthcare to provide patients with remote access to medical care. VR-based remote healthcare allows doctors and other medical professionals to interact with patients in a virtual environment, enabling them to provide diagnosis, treatment, and care from a distance.


Publications

1. Ryu, Jaeyeong, Ashok Kumar Patil, Bharatesh Chakravarthi, Adithya Balasubramanyam, Soungsill Park, and Youngho Chai. ‘Angular Features-Based Human Action Recognition System for a Real Application with Subtle Unit Actions’. IEEE Access 10 (2022): 9645–57.

2. Ryu, Jae Yeong, Bharatesh Chakravarthi, Adithya Balasubramanyam, Ashok Kumar Patil, and Young Ho Chai. ‘Motion Data Editing and Augmentation Method by Using the Motion-Sphere’s Trajectory’. Moving Image & Technology (MINT) 1, no. 1 (2021): 10–14.

3. Patil, Ashok Kumar, Seong Hun Kim, Adithya Balasubramanyam, Jae Yeong Ryu, and Young Ho Chai. ‘Pilot Experiment of a 2D Trajectory Representation of Quaternion-Based 3D Gesture Tracking’. Proceedings of the ACM SIGCHI Symposium on Engineering Interactive Computing Systems, 2019.

4. Patil, Ashok Kumar, Adithya Balasubramanyam, Jae Yeong Ryu, Pavan Kumar Bn, Bharatesh Chakravarthi, and Young Ho Chai. ‘Fusion of Multiple Lidars and Inertial Sensors for the Real-Time Pose Tracking of Human Motion’. Sensors 20, no. 18 (2020): 5342.

5. Patil, Ashok Kumar, Adithya Balasubramanyam, Jae Yeong Ryu, Bharatesh Chakravarthi, and Young Ho Chai. ‘An Open-Source Platform for Human Pose Estimation and Tracking Using a Heterogeneous Multi-Sensor System’. Sensors 21, no. 7 (2021): 2340.

6. Kim, Seonghun, Adithya Balasubramanyam, Dubeom Kim, Young Ho Chai, and Ashok Kumar Patil. ‘Joint-Sphere: Intuitive and Detailed Human Joint Motion Representation’. EuroVis (Short Papers), 2020.

7. Kim, Dubeom, Bharatesh Chakravarthi, Seong Hun Kim, Adithya Balasubramanyam, Young Ho Chai, and Ashok Kumar Patil. ‘MotionNote: A Novel Human Pose Representation’. 2020 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW). IEEE, 2020.

8. Chakravarthi, Bharatesh, Ashok Kumar Patil, Jae Yeong Ryu, Adithya Balasubramanyam, and Young Ho Chai. ‘Scenario-Based Sensed Human Motion Editing and Validation Through the Motion-Sphere’. IEEE Access 10 (2022): 28295–307.

9. Bn, Pavan Kumar, B. Adithya, Ashok Patil, and Young Chai. ‘Gaze-Controlled Virtual Retrofitting of UAV-Scanned Point Cloud Data’. Symmetry 10, no. 12 (2018): 674.

10. Bn, Pavan Kumar, Adithya Balasubramanyam, Ashok Kumar Patil, and Young Ho Chai. ‘GazeGuide: An Eye-Gaze-Guided Active Immersive UAV Camera’. Applied Sciences 10, no. 5 (2020): 1668.

11. Adithya, B., B. N. Pavan Kumar, Hanna Lee, Ji Yeon Kim, Jae Cheol Moon, and Young Ho Chai. ‘An Experimental Study on Relationship between Foveal Range and FoV of a Human Eye Using Eye Tracking Devices’. 2018 International Conference on Electronics, Information, and Communication (ICEIC). IEEE, 2018.

12. Balasubramanyam, Adithya, B. Chethana, Ashok Kumar Patil, Pavan Kumar Bn, and Young Ho Chai. ‘Interactive Virtual Retrofitting of 3D Chiller Models to Optimize Energy Consumption: A 4-in-1 Alignment Use Case’. TECHART: Journal of Arts and Imaging Science 6, no. 2 (2019): 43–48.

13. Balasubramanyam, Adithya, Ashok Kumar Patil, Bharatesh Chakravarthi, Jaeyeong Ryu, and Young Ho Chai. ‘Kinematically Admissible Editing of the Measured Sensor Motion Data for Virtual Reconstruction of Plausible Human Movements’. 2021 IEEE International Conference on Systems, Man, and Cybernetics (SMC). IEEE, 2021.

14. Balasubramanyam, Adithya, Ashok Kumar Patil, Bharatesh Chakravarthi, Jae Yeong Ryu, and Young Ho Chai. ‘Motion-Sphere: Visual Representation of the Subtle Motion of Human Joints’. Applied Sciences 10, no. 18 (2020): 6462.

15. B, Adithya, Pavan Kumar B. N., Young Ho Chai, and Ashok Kumar Patil. ‘Inspired by Human Eye: Vestibular Ocular Reflex Based Gimbal Camera Movement to Minimize Viewpoint Changes’. Symmetry 11, no. 1 (2019): 101.

16. Adithya, B., Hanna Lee, Pavan Kumar Bn, and Youngho Chai. ‘Calibration Techniques and Gaze Accuracy Estimation in Pupil Labs Eye Tracker’. TECHART: Journal of Arts and Imaging Science 5, no. 1 (2018): 38–41.


Projects

Management of Pre-Operative Anxiety using Distraction Therapy in Desktop and Mobile Virtual Reality

Mentor: Dr. Adithya Balasubramanyam

Students: Manogna, Armili, Anusha, Gouri (B. Tech, PES University)

The project focuses on the management of pre-operative anxiety using distraction therapy in a mobile-based virtual reality environment. Depending on the severity of anxiety, users are presented with immersive (HMD), desktop, or mobile-based VR content to manage biomarkers of anxiety in patients, such as heart rate and blood pressure. [Work is under review in the Applied Sciences journal]

A Gamified Zero Knowledge Proof based Immersive application for Rehabilitation of Upper Body Locomotor Disability

Mentor: Dr. Adithya Balasubramanyam

Student: Amit Tonshal (M. Tech, PES University) 

The project aims to create a gamified approach that makes physiotherapy engaging and enables real-time remote monitoring. More than 30 different moves are captured and correlated with the physiotherapy procedures prescribed for upper-body locomotor disability. A zero-knowledge-proof-based system is implemented to prove that the patient engaging in the gamified physiotherapy is performing the exercises correctly. The project combines multiple technologies, including IMU sensors, cameras, controllers, and game engines, to track human body movements.
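One building block of such a proof system, shown here as a heavily simplified sketch rather than the project's actual protocol (a full zero-knowledge proof requires far more machinery), is a cryptographic commitment: the patient commits to the recorded motion data up front, so the session cannot be altered after the fact, and the verifier later checks the revealed recording against the commitment.

```python
import hashlib
import json
import os

def commit(motion_samples):
    """Commit to a motion recording as H(nonce || data).
    The digest can be published immediately; the nonce is kept
    secret until the reveal step."""
    nonce = os.urandom(16)
    data = json.dumps(motion_samples, sort_keys=True).encode()
    digest = hashlib.sha256(nonce + data).hexdigest()
    return digest, nonce

def verify(digest, nonce, motion_samples):
    """Check a revealed recording against the earlier commitment."""
    data = json.dumps(motion_samples, sort_keys=True).encode()
    return hashlib.sha256(nonce + data).hexdigest() == digest

# Hypothetical joint-angle samples from a physiotherapy session.
samples = [{"joint": "elbow", "angle": 92.5}, {"joint": "elbow", "angle": 120.0}]
digest, nonce = commit(samples)
print(verify(digest, nonce, samples))        # honest reveal passes
tampered = samples + [{"joint": "elbow", "angle": 10.0}]
print(verify(digest, nonce, tampered))       # tampered reveal fails
```

A commitment alone reveals the data at verification time; the zero-knowledge layer in the project would additionally let the verifier confirm "the exercises were done correctly" without seeing the raw motion data.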

A Mixed Reality based Executable Digital Twin for Drone Simulation and Pilot Tracking

Mentor: Dr. Adithya Balasubramanyam

 

Student: Adithya, Alwin, Satwik, Manoj (B. Tech, PES University) 

 

This project aims to build a mixed reality drone simulator that provides realistic and immersive training for pilots. The simulator is designed to mimic the operational environments of various types of drones and to enable pilots to learn and practice essential skills, such as flight manoeuvres, navigation, and payload operation.

Visualization of Valence Shell Electron Pair Repulsion Theory

Mentor: Dr. Adithya Balasubramanyam

 

Student: Satwik (B. Tech, Cambridge University)

 

The project visualizes valence shell electron pair repulsion (VSEPR) theory, encouraging new students to learn chemistry in an intuitive and visually appealing way. It enables users to create, modify, and visualize new compounds and to learn the basic structure of each compound, improving learnability.
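The core mapping VSEPR theory describes can be sketched as a lookup from electron-domain counts on the central atom to molecular geometry. This is an illustrative table of the standard textbook geometries, not the project's data model:

```python
# Molecular geometry keyed by (bonding domains, lone pairs) on the
# central atom, per standard VSEPR theory.
VSEPR_GEOMETRY = {
    (2, 0): "linear",
    (3, 0): "trigonal planar",
    (2, 1): "bent",
    (4, 0): "tetrahedral",
    (3, 1): "trigonal pyramidal",
    (2, 2): "bent",
    (5, 0): "trigonal bipyramidal",
    (6, 0): "octahedral",
}

def predict_shape(bonding, lone_pairs):
    """Return the VSEPR geometry for a central atom's domain counts."""
    return VSEPR_GEOMETRY.get((bonding, lone_pairs), "unknown")

print(predict_shape(4, 0))  # CH4 → "tetrahedral"
print(predict_shape(2, 2))  # H2O → "bent"
print(predict_shape(3, 1))  # NH3 → "trigonal pyramidal"
```

In the visualizer, each geometry name would correspond to a fixed arrangement of bond directions around the central atom, which is what makes the structures renderable in 3D.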

Management of Pre-Operative Anxiety using guided meditation in Immersive Virtual Reality 

Mentor: Dr. Adithya Balasubramanyam

 

Students: Manogna, Armili, Anusha, Gouri (B. Tech, PES University)

The project focuses on the management of pre-operative anxiety using guided meditation in an immersive virtual reality environment. Immersive content is rendered to the user to manage biomarkers of anxiety in patients, such as heart rate and blood pressure. [Work is under review in the Applied Sciences journal]

Simultaneous Localization and Indoor Navigation Assist in PESU Campus

Mentor: Dr. Adithya Balasubramanyam

 

Student: Anusha Devanga (Intern)

 

The project reviews various techniques and state-of-the-art indoor navigation assist solutions. It uses a combination of techniques, including marker-based tracking, natural feature tracking, and the sensors onboard the user's mobile phone, to accurately track the user's pose and help the user navigate a minimally modelled environment. The primary objective is to make the system robust and resilient to the changing physical layouts of a university campus.
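A common way to combine such sources, shown here as a hedged one-dimensional sketch with invented parameter values rather than the project's actual implementation, is a complementary filter: the phone's inertial sensors provide fast but drifting updates, while an occasional absolute fix from a detected marker pulls the estimate back toward the truth.

```python
def complementary_update(heading, gyro_rate, dt, marker_heading=None, alpha=0.9):
    """One filter step: dead-reckon heading from the gyro, then blend in
    an absolute marker fix whenever one is available.
    alpha controls how much the inertial estimate is trusted."""
    heading = heading + gyro_rate * dt              # fast update, accumulates drift
    if marker_heading is not None:                  # slow absolute correction
        heading = alpha * heading + (1 - alpha) * marker_heading
    return heading

# Simulated walk: the true heading stays at 90 degrees, but the gyro
# carries a +0.5 deg/s bias; a marker is only visible every 25 steps.
h = 90.0
for step in range(100):
    marker = 90.0 if step % 25 == 0 else None
    h = complementary_update(h, gyro_rate=0.5, dt=0.1, marker_heading=marker)
print(round(h, 1))  # stays below the 95.0 that pure gyro integration would reach
```

The same blend generalizes to full 3D pose, where the inertial term comes from IMU integration and the absolute term from marker or natural-feature detections.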

PESU METAVERSITY - A Static, Functional and Executable Digital Twin of PES University


 

Mentor: Dr. Adithya Balasubramanyam

Student: Armili Paturi (Intern)

 

The project focuses on creating a digital mock-up of PES University that is functional in the sense that users of the digital PESU can navigate and access labs and classrooms through interactions. The immersive experience will enable guided tours for parents and prospective students. An executable twin enables truly global teaching and learning by allowing students to register for courses from various geographical locations while interacting with faculty and fellow students elsewhere in an alternate reality.


Open Positions


Contact Details

iotcenter@pes.edu

BE Block, 1108, Center for IoT, PES University, RR Campus
