Mario Schwarz

Software Engineer | (VR) Unity Developer | Designer

Profile

About Me

I'm a passionate software engineer specializing in Unity development and VR/MR applications. With expertise in creating immersive virtual environments and interactive systems, I bring ideas to life through code and creativity.

Unity Programming | VR/MR Design | 3D Assets

Featured Projects

TURTLE: Rabbit Hole

Welcome to Turtle Island

Down the Rabbit Hole is an educational journey inside the TURTLE (Virtual Reality Learning Environment) app. Starting on Turtle Island, you can explore one of four different experiences. After completing the introduction, follow the rabbit to reach Rabbit Island, where you finally descend into a cave that represents a "rabbit hole". The world is designed as a chain of linked environments, from bright coastal scenes to a dimly lit cave, symbolizing the depths of an extremist echo chamber.

The experience is built in Unity 2022.3 with the Universal Render Pipeline and the Meta XR All-in-One SDK, and targets the Meta Quest 3 family. Careful use of baked lighting, custom water and boat motion scripts, optimized shaders, a waypoint system for the rabbit guide Harold Hops, and a localized dialogue system with AI-generated voices keeps the application visually rich while still running smoothly on standalone VR hardware.

Rabbit Island trapdoor animation
  • Stylized island hub with turtle- and rabbit-shaped landforms
  • Immersive journey into the rabbit hole
  • Performance-focused URP setup with baked lighting and tuned shaders
  • Hand-tracking gestures for dialogue and menu interaction
  • Custom NPC waypoint and dialogue tools for the rabbit guide
Water Island Boat
Falling Rabbit
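
To give a rough idea of how the waypoint system for Harold Hops can work, here is a minimal Unity C# sketch (a hypothetical simplification, not the project's actual script): the guide moves toward the next waypoint, turns to face its direction of travel, and advances once it gets close enough.

    using UnityEngine;

    // Minimal sketch of a waypoint follower for an NPC guide.
    // Hypothetical example, not the production code from the project.
    public class WaypointFollower : MonoBehaviour
    {
        public Transform[] waypoints;      // ordered path for the guide
        public float speed = 1.5f;         // movement speed in m/s
        public float arriveDistance = 0.2f;

        private int current;

        void Update()
        {
            if (waypoints == null || current >= waypoints.Length) return;

            Vector3 target = waypoints[current].position;
            transform.position = Vector3.MoveTowards(transform.position, target, speed * Time.deltaTime);

            // Turn to face the direction of travel.
            Vector3 dir = target - transform.position;
            if (dir.sqrMagnitude > 0.001f)
                transform.rotation = Quaternion.Slerp(transform.rotation, Quaternion.LookRotation(dir), 5f * Time.deltaTime);

            // Advance to the next waypoint once close enough.
            if (Vector3.Distance(transform.position, target) < arriveDistance)
                current++;
        }
    }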

In the depths of Echo Chambers

After descending "down the rabbit hole", the cave sequence turns the world into a visual metaphor for social media echo chambers. Real but anonymized extremist posts are projected as floating social feeds, surrounded by interactive elements such as educational pins and holographic commenters. Players learn how conspiracy narratives, hidden symbols, and hate speech can appear on social media.

Guided by Harold Hops and AI-voiced narration, players react to content with hand-tracked swiping, grabbing, thumbs-up, and thumbs-down gestures. They investigate hidden codes and solve a codeword puzzle in order to free themselves from the rabbit hole. The experience is designed to teach about everyday media use and to support digital literacy, critical thinking, and awareness of extremist symbolism in social networks.

  • VR cave that visualizes social media echo chambers as a physical space
  • Real social media posts with anonymized accounts and added educational layers
  • Holographic commenters that model different problematic online behaviors
  • Hand gesture choices that mirror common social media reactions
  • Solving the codeword puzzle as a learning milestone
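
The hand-tracked reactions described above can be pictured as a simple mapping from a recognized gesture to an action on a post. The sketch below is a hypothetical simplification: gesture recognition itself is left to the hand-tracking SDK, and the SocialPost type and its methods are made up for illustration.

    using UnityEngine;

    // Hypothetical sketch: map recognized hand gestures to reactions on a post.
    // Gesture detection itself (Meta XR hand tracking) is assumed to happen elsewhere.
    public enum HandGesture { Swipe, Grab, ThumbsUp, ThumbsDown }
    public enum Reaction { Like, Dislike }

    public class PostReactionHandler : MonoBehaviour
    {
        public void OnGestureRecognized(HandGesture gesture, SocialPost post)
        {
            switch (gesture)
            {
                case HandGesture.ThumbsUp:   post.React(Reaction.Like);    break;
                case HandGesture.ThumbsDown: post.React(Reaction.Dislike); break;
                case HandGesture.Swipe:      post.Dismiss();               break;
                case HandGesture.Grab:       post.Inspect();               break; // open educational layer
            }
        }
    }

    // Placeholder type to keep the sketch self-contained.
    public class SocialPost : MonoBehaviour
    {
        public void React(Reaction r) { /* record the choice, trigger feedback */ }
        public void Dismiss()         { /* swipe the post away */ }
        public void Inspect()         { /* show pins / context info */ }
    }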

VR Language Learning

Learning through interaction

The main goal of this project was to get students to interact with each other in an online Virtual Reality world, discussing different topics in a foreign language they are learning. The project was made in Unity with Photon for multiplayer networking. Two different environments were created to host different discussion topics, with one scenario focusing on production sustainability and one on fairness in public transit.

Students first enter a main lobby area where they specify a room number to host or join, allowing them to meet up with their assigned discussion partners. Here they also choose which of the two discussion environments they want to enter.
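
As a rough illustration of this lobby flow with Photon PUN 2, the sketch below connects to the Photon cloud and then joins or creates a room under the entered number. The field names and scene strings are illustrative stand-ins, not the project's actual code.

    using Photon.Pun;
    using Photon.Realtime;
    using UnityEngine;

    // Rough sketch of the lobby flow: players enter a room number, then either
    // create that room or join the partner who already opened it.
    public class LobbyConnector : MonoBehaviourPunCallbacks
    {
        public string roomNumber = "1234";                 // entered by the student in the lobby UI
        public string discussionScene = "Sustainability";  // or "PublicTransit"

        void Start()
        {
            PhotonNetwork.AutomaticallySyncScene = true;
            PhotonNetwork.ConnectUsingSettings();          // connect to the Photon cloud
        }

        public override void OnConnectedToMaster()
        {
            // Join the room with the given number, creating it if it does not exist yet.
            PhotonNetwork.JoinOrCreateRoom(roomNumber, new RoomOptions { MaxPlayers = 4 }, TypedLobby.Default);
        }

        public override void OnJoinedRoom()
        {
            if (PhotonNetwork.IsMasterClient)
                PhotonNetwork.LoadLevel(discussionScene);  // host loads the chosen discussion environment
        }
    }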

Scene Selection Menu

Sustainability and Fairness

Sustainability Scene

In the sustainability scene, students are tasked with exploring a clothing store and discussing the sustainability aspects of different outfits, which they assemble from multiple individual clothing pieces.

The public transit scene makes the students part of a city council meeting, where they discuss the problems individuals face with public transit in their city and propose solutions to improve the situation.

Fairness in Public Transit Scene


VR Clothing Store

A matter of taste

This VR experience was developed as part of an exhibition for the Experimenta Science Center in Heilbronn, Germany. Users assume the role of a salesperson in a fashion store: they interact with virtual customers and hand over one of many randomized clothing items based on what the customers ask for. The experience is designed to explore themes of personal taste and influences on decision making.
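
The "randomized item" step can be imagined roughly like the following sketch, where a customer requests a category and a random matching piece is picked from the rack. The types and names here are hypothetical, not taken from the exhibition code.

    using System.Collections.Generic;
    using System.Linq;
    using UnityEngine;

    // Hypothetical sketch of delivering a randomized clothing item:
    // the customer asks for a category, the salesperson hands over a random match.
    public class ClothingRack : MonoBehaviour
    {
        public List<ClothingItem> items;

        public ClothingItem PickRandom(ClothingCategory requested)
        {
            var matches = items.Where(i => i.category == requested).ToList();
            if (matches.Count == 0) return null;
            return matches[Random.Range(0, matches.Count)];  // UnityEngine.Random
        }
    }

    public enum ClothingCategory { Shirt, Jacket, Trousers, Shoes }

    [System.Serializable]
    public class ClothingItem
    {
        public string displayName;
        public ClothingCategory category;
    }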


Virtual Teacher

AI-based PowerPoint presentations

During our master's program in software engineering, we felt that learning from the PowerPoint slides provided by some of the teachers was a lackluster experience, and we wanted a more fun and engaging way to study them. Our idea was to build an automated pipeline that turns PowerPoint slides into interactive presentations for one of the courses in our master's program. This Unity project, made by Mario Schwarz and David Flaig, is the result of that course. At the time, we had no access to GPT-4 or similar models with integrated image recognition, nor to an AI voice API, so we built a custom pipeline that combined multiple services to achieve our goal.

Virtual Teacher Menu Screen

Upon opening the application, you are met with a menu to select which presentation to start. Each presentation is a folder of slide images exported from PowerPoint, stored together with all other presentation-specific data. The pipeline starts with Tesseract as an OCR plugin to read the text from each slide, which is then sent together with a prompt to ChatGPT to generate a script the virtual teacher can speak. This script is passed to the ReadSpeaker API (since deprecated and replaced with ElevenLabs), which streams the audio back into the application. The audio is synchronized to the teacher's lip movements with the Unity SALSA LipSync plugin. All data generated by this pipeline is cached, so presentations can be viewed multiple times without running the pipeline again.
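
In simplified form, the pipeline and its caching can be sketched like this. The service interfaces stand in for Tesseract, ChatGPT, and the TTS API; they are hypothetical wrappers, not the project's real classes.

    using System.IO;
    using System.Threading.Tasks;

    // Simplified sketch of the slide-to-speech pipeline with caching.
    public class SlidePipeline
    {
        readonly IOcrService ocr;
        readonly ITextGenerator textGen;
        readonly IVoiceService voice;
        readonly string cacheDir;

        public SlidePipeline(IOcrService ocr, ITextGenerator textGen, IVoiceService voice, string cacheDir)
        {
            this.ocr = ocr; this.textGen = textGen; this.voice = voice; this.cacheDir = cacheDir;
        }

        // Returns the path to the narration audio for one slide image,
        // running OCR -> text generation -> TTS only on a cache miss.
        public async Task<string> GetNarrationAsync(string slideImagePath)
        {
            string audioPath = Path.Combine(cacheDir, Path.GetFileNameWithoutExtension(slideImagePath) + ".wav");
            if (File.Exists(audioPath)) return audioPath;   // cached from an earlier run

            string slideText = await ocr.ReadTextAsync(slideImagePath);
            string script    = await textGen.GenerateAsync("Explain this slide as a teacher:\n" + slideText);
            byte[] audio     = await voice.SynthesizeAsync(script);

            File.WriteAllBytes(audioPath, audio);
            return audioPath;
        }
    }

    // Hypothetical service interfaces, shown only to keep the sketch self-contained.
    public interface IOcrService    { Task<string> ReadTextAsync(string imagePath); }
    public interface ITextGenerator { Task<string> GenerateAsync(string prompt); }
    public interface IVoiceService  { Task<byte[]> SynthesizeAsync(string text); }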

The following video shows an example presentation generated with this pipeline, which we used to let the project present itself during our final project presentation slot.


Get In Touch

Interested in collaboration or have a project in mind? Let's connect!