XR Developer | Unity Developer
I'm an XR Developer with over 4 years of hands-on experience in designing, developing, and deploying immersive XR solutions. Working at the intersection of Extended Reality (XR) and Artificial Intelligence (AI), I've cultivated a deep understanding of how emerging technologies can transform industries, and I bring that foresight into every project I lead.
My journey in immersive tech began in my first year of college, where I worked as a Research Engineer exploring real-world applications of AR, VR, and MR across diverse sectors. As Head of the Research Lab, I not only spearheaded cutting-edge experiments but also led cross-functional teams, gaining full-stack exposure to the entire product development lifecycle, from ideation and prototyping to deployment and optimization.
With a sharp eye for photorealism and 3D asset creation, I’ve conducted in-depth research in photogrammetry,
building robust pipelines for creating highly detailed and optimized models for XR environments. Alongside
academic research, I’ve enriched my expertise through internships and open-source contributions, constantly
refining my technical toolkit.
Beyond the tech, I'm a strong believer in the power of collaboration, communication, and continuous learning. The projects and contributions below reflect that mindset:
As a contributor to the Scribe Project by Wikimedia, I played a key role in enhancing the core functionality of the tool, which assists users in creating well-sourced Wikipedia articles.
For the Octernship open source initiative, I designed and developed an isometric 3D garden environment in Unity, allowing users to interact with plants in an immersive, gamified space.
Developed during my internship with CyberPeace Foundation, this project is a Mixed Reality-based Digital Twin that visualizes an organization’s cybersecurity landscape in real time. Designed to bridge the communication gap between technical teams and senior management, it transforms complex SOC dashboard data into an interactive MR experience. By digitally replicating the organization’s cyber environment, stakeholders can collaboratively monitor threats, analyze vulnerabilities, and make informed decisions with clarity and speed—all through an intuitive and immersive interface.
Cyber War Gaming VR is a gamified VR-based cyber training simulation developed during my internship with CyberPeace Foundation, a military-aligned organization. It immerses players in evolving cyberattack scenarios where they must identify threats like DDoS or SYN floods, take defensive actions (e.g., IP blocking, geo-blocking), and observe real-time impact on system performance. The simulation features a scoring system and provides a detailed end report analyzing each player's responsiveness, accuracy, and the effectiveness—positive or negative—of their decisions, making it a powerful tool for both technical and strategic cybersecurity training.
StrikeXR is a high-fidelity combat training platform developed for the Indian Defense forces, combining VR, AR, and AI to simulate complex battlefield scenarios. It delivers 360° interactive simulations across varied terrains—urban, jungle, desert, and counter-insurgency—enhancing soldier reflexes, decision-making, and tactical coordination. With multiplayer support, squads can train together in synchronized virtual missions. StrikeXR also integrates real-time analytics and adaptive feedback, enabling personalized performance tracking and continuous tactical improvement—all within a safe, cost-effective virtual environment designed to elevate mission readiness.
PlanXR is a Mixed Reality (MR) application developed to transform mission planning for the Territorial Army by replacing traditional map-based approaches with immersive 3D terrain exploration. The app enables officers to virtually interact with real-world landscapes, analyze geographical challenges, and make informed, data-driven decisions. Featuring multi-user collaboration, PlanXR allows teams to co-plan missions in real time within a shared virtual space—enhancing coordination, reducing risk, and significantly improving operational readiness and strategic execution.
ByteSpace is a VR/MR application built for data center managers and infrastructure planners to design and visualize data centers before construction begins. Using highly detailed, photogrammetry-based 3D models of servers, the app allows users to manually place, arrange, and interact with equipment in a life-sized virtual environment. It also features an AI-assisted topology generator, which automatically lays out optimized data center configurations in real time based on user input—streamlining the planning process and reducing costly post-construction changes.
Xrchitect, available on the Meta Store, is a VR/MR application tailored for interior design professionals, students, and educators to create and visualize design concepts in real time. Unlike traditional 2D design tools, Xrchitect immerses users in a fully interactive, realistic 3D environment, allowing them to design and manipulate spaces within actual or virtual rooms. This powerful tool enables users to experiment with layouts, furnishings, and color schemes before committing to real-world changes, offering unmatched immersion and realism for both educational and professional use.
Autistica, available on the Meta Store, is a compassionate VR application designed to support individuals with autism through immersive, multisensory therapies. The app offers five targeted therapy modules—Aural Therapy, EMDR, Music Therapy, Speech Pathology, and Puzzle-Based Cognitive Exercises—each crafted to provide calm, engagement, and structured stimulation. By creating safe, controlled virtual environments tailored to unique sensory needs, Autistica empowers users to explore therapeutic practices in a more accessible and personalized way.
MindMuseum is an innovative VR application that combines Artificial Intelligence with immersive technology to create dynamic, interactive museum experiences in real time. Users can explore personalized exhibitions, where AI generates art, artifacts, and displays on command, responding to voice or text-based input. Whether it’s ancient civilizations or futuristic inventions, MindMuseum tailors content instantly—making it a powerful educational and creative tool that redefines how we experience museums in the digital age.
Indoor Navigation in AR is an augmented reality solution designed to help users navigate complex indoor spaces, offering real-time directions and location-based information through their mobile devices. It enhances user experience by providing interactive, visual guides to easily find their way around large buildings or venues.
A playful recreation of the classic Angry Birds game, where players launch birds at structures to knock down targets. The game offers physics-based fun and interactive mechanics for all ages. Source available on GitHub.
A timeless puzzle game where players arrange falling blocks to clear lines and prevent the screen from filling up. This version brings the classic Tetris experience with smooth gameplay and nostalgic charm. Source available on GitHub.
An immersive, interactive garden simulator where users can design and place their own virtual garden in an isometric view. Developed for Octernship, the app integrates API calls to provide real-time plant information and dynamic interactions.
An action-packed space shooter where players control a spaceship to battle waves of enemies in a visually captivating galaxy. The game features power-ups and increasing difficulty, providing hours of fast-paced fun. Source available on GitHub.
A VR driving experience where users take the wheel in a fully immersive simulation of real-world roads, powered by Google Maps and OpenStreetMap. It offers a realistic driving experience for training or leisure, allowing players to explore virtual environments while navigating through them.
In this research, I utilized a single Kinect sensor integrated with Unity to capture 3D point clouds of a person in real time. The system streams depth data from the Kinect to generate detailed point cloud representations, enabling interactive 3D visualizations for applications in motion tracking and virtual environments. Source available on GitHub.
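The depth-to-point-cloud step can be sketched with the standard pinhole back-projection model. This is an illustrative NumPy sketch, not the actual Unity/Kinect implementation; the intrinsics (fx, fy, cx, cy) are assumed to come from the sensor's calibration.

```python
import numpy as np

def depth_to_point_cloud(depth_m, fx, fy, cx, cy):
    """Back-project a depth image (in meters) into an N x 3 point cloud.

    Uses the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy.
    Pixels with zero depth (no Kinect return) are discarded.
    """
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]  # keep only valid depth samples
```

A pixel at the principal point (cx, cy) maps straight down the optical axis to (0, 0, Z), which is a quick sanity check for the math.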
For this project, I am developing a holographic 3D experience by synchronizing data from multiple Kinect sensors. The aim is to combine depth information from various angles to create an immersive, multi-perspective 3D model, enhancing realism and interactivity for holographic visualization in computer vision applications.
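Combining depth information from multiple sensors generally reduces to transforming each sensor's points into a shared world frame using a calibrated rigid transform. A minimal NumPy sketch of that fusion step, assuming 4x4 sensor-to-world extrinsics (e.g. from a checkerboard calibration) are already known; the actual multi-Kinect synchronization pipeline is more involved:

```python
import numpy as np

def merge_point_clouds(clouds, extrinsics):
    """Fuse point clouds from several sensors into one world frame.

    clouds:     list of (N_i x 3) arrays, one per sensor
    extrinsics: list of 4x4 rigid transforms (sensor -> world)
    """
    merged = []
    for pts, T in zip(clouds, extrinsics):
        homo = np.hstack([pts, np.ones((len(pts), 1))])  # homogeneous coords
        merged.append((homo @ T.T)[:, :3])               # apply rigid transform
    return np.vstack(merged)
```

With accurate extrinsics, points seen by different Kinects from different angles land in the same coordinate system, which is what makes the multi-perspective holographic model possible.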
I conducted in-depth research to develop an optimized pipeline for creating photorealistic models of complex objects, such as servers, covering scanning, reconstruction, manual mesh repair, texturing, UV mapping, and optimization.