Immersive Navigation in XR

This project explores new ways to navigate XR environments, with a focus on a simple, intuitive user experience. At its core is a voice-controlled assistant built on OpenAI's ChatGPT API that answers questions about the environment and helps users move seamlessly through virtual space. The project was developed as an independent study at the University of Applied Sciences Potsdam, supervised by Prof. Reto Wettach through regular consultations, a format that left room for creative exploration while keeping the work within academic standards. The goal is to advance XR interaction and make immersive technologies more accessible.
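
To make the integration concrete, here is a minimal sketch of how such an assistant can be wired up. The function name, JSON command schema, and model choice are assumptions for illustration, not the project's actual code; the idea is that a speech-to-text transcript is sent to the ChatGPT API with a system prompt that constrains replies to a small command vocabulary:

    import json
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    SYSTEM_PROMPT = (
        "You are a navigation assistant inside a virtual solar system. "
        "Reply with JSON only: "
        '{"action": "goto", "target": "<planet>"} to move the user, or '
        '{"action": "answer", "text": "<short answer>"} for questions.'
    )

    def handle_utterance(transcript: str) -> dict:
        """Map one speech-recognition transcript to a structured command."""
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # assumed model; any chat model works here
            response_format={"type": "json_object"},
            messages=[
                {"role": "system", "content": SYSTEM_PROMPT},
                {"role": "user", "content": transcript},
            ],
        )
        return json.loads(response.choices[0].message.content)

    # handle_utterance("take me to Mars")
    # -> {"action": "goto", "target": "Mars"}

Keeping the reply machine-readable is what lets one assistant both answer free-form questions and drive movement through the scene.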

Core Features

  • Voice-controlled navigation powered by the ChatGPT API.
  • Immersive, continuous spatial movement instead of traditional teleportation mechanics.
  • Interactive solar system model that serves as both a visual backdrop and a functional navigation map (see the sketch after this list).
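
The sketch below shows how the solar system model can double as a navigation map. The planet coordinates and the locomotion object are hypothetical stand-ins for the scene's actual data and movement layer; a parsed assistant command is simply resolved to a position and handed off:

    # Hypothetical glue between the solar system model and the movement layer.
    PLANET_POSITIONS = {  # illustrative scene coordinates, not real data
        "Mercury": (4.0, 0.0, 0.0),
        "Mars": (16.0, 0.0, 2.0),
        "Jupiter": (30.0, 0.0, -5.0),
    }

    def execute_command(command: dict, locomotion) -> str:
        """Route a parsed assistant command into the scene."""
        if command.get("action") == "goto":
            target = PLANET_POSITIONS.get(command.get("target"))
            if target is None:
                return "I don't know that destination."
            locomotion.glide_to(target)  # movement layer, sketched below
            return f"Heading to {command['target']}."
        return command.get("text", "")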

Challenges and Learnings

The primary challenge was achieving fluid motion in XR while maintaining user comfort: continuous movement feels more immersive than teleportation, but sudden accelerations are a well-known trigger for motion sickness. Integrating the voice assistant brought its own difficulties, as the AI's responses had to be tuned until interactions felt natural. Both problems highlighted the importance of iterative testing and user feedback when refining complex systems.
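
One common way to keep continuous movement comfortable, and a plausible shape for the movement layer assumed in the sketches above, is to glide the user toward a target with a gentle speed ramp and a hard speed cap. The class below is an illustrative sketch of that general technique with made-up tuning values, not the project's exact implementation:

    import math

    class Locomotion:
        MAX_SPEED = 3.0  # m/s, assumed comfort cap
        ACCEL = 1.5      # m/s^2, gentle ramp to avoid abrupt vection

        def __init__(self, position):
            self.position = list(position)
            self.target = None
            self.speed = 0.0

        def glide_to(self, target):
            self.target = list(target)

        def update(self, dt: float):
            """Advance one frame; called from the XR render loop."""
            if self.target is None:
                return
            delta = [t - p for t, p in zip(self.target, self.position)]
            dist = math.sqrt(sum(d * d for d in delta))
            if dist < 0.05:  # close enough: snap to the target and stop
                self.position, self.target, self.speed = self.target, None, 0.0
                return
            # ramp up smoothly, never exceed the cap, never overshoot
            self.speed = min(self.speed + self.ACCEL * dt, self.MAX_SPEED)
            step = min(self.speed * dt, dist)
            self.position = [p + d / dist * step
                             for p, d in zip(self.position, delta)]

The key point is that per-frame velocity changes stay bounded; a braking phase near the target could be added the same way.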

Outcome and Future Goals

This project demonstrated the potential of voice-controlled navigation in XR. Future goals include improving the accuracy of the assistant's responses and expanding the environment with additional interactive elements.