The University of Utah Robotics Center is a hub of groundbreaking research in Haptics, Human-Robot Interaction (HRI), and Virtual Reality (VR). In Haptics, the focus is on creating realistic touch experiences in virtual environments and developing haptic feedback systems for a range of applications; this research is crucial for enhancing user immersion and interaction in VR simulations and training environments. Human-Robot Interaction research explores social dynamics, trust, and collaboration between humans and robots, with projects that design intuitive interfaces, study user behavior, and create robots that can assist humans effectively in different tasks. The center's work in Virtual Reality spans immersive simulations, interactive experiences, and novel VR applications for education, training, and entertainment, and researchers collaborate across disciplines to push the boundaries of VR technology and its integration with emerging fields such as artificial intelligence and computer graphics. Through interdisciplinary collaboration, industry partnerships, and a commitment to innovation, the Robotics Center continues to lead the way in shaping the future of Haptics, Human-Robot Interaction, and Virtual Reality research and applications.

Brown Lab

Daniel Brown, PhD

Assistant Professor, School of Computing
Website: https://www.cs.utah.edu/~dsbrown/

My work seeks to directly and efficiently incorporate human input into both the theory and practice of robust machine learning. I apply rigorous theory and state-of-the-art machine learning techniques to enable AI systems to maintain, be robust to, and actively reduce uncertainty over both the human's intent and the corresponding optimal policy. I evaluate my research across a range of applications, including autonomous driving, service robotics, and dexterous manipulation. My goal is to develop robust AI systems that can be deployed in risk-sensitive settings such as self-driving cars, healthcare, and domestic service robots.

Drew Research Lab

Daniel Drew, PhD

Assistant Professor, Electrical & Computer Engineering
Lab Website: The Drew Research Lab for Autonomous Robotic Millisystems

The driving goal behind our work is to make insect-scale robots truly useful as tools in industrial, commercial, and personal settings. This means not only overcoming the extreme resource constraints imposed by their scale, but also delivering capabilities that are wholly unique. Our work ties numerical simulation together with cutting-edge microfabrication and meso-scale assembly techniques, exploring novel actuation, communication, and sensing modalities for holistically designed systems.

Hollerbach Lab

John Hollerbach, PhD (UURC Director)

Professor, School of Computing
Lab Website: Hollerbach Lab

Professor Hollerbach's research involves robotics, virtual environments, and human motor control. Current active projects include locomotion interfaces and haptic interfaces.

Human-Centered Haptics & Robotics Lab

Edoardo Battaglia, PhD

Assistant Professor, Mechanical Engineering
Lab Website: Human-Centered Haptics and Robotics Lab

The Human-Centered Haptics and Robotics (h-CHAR) Laboratory focuses on the design and evaluation of robotic and mechatronic systems for human-focused applications, such as sensing of kinematics and interaction forces, haptic feedback, and training and assistive tasks. Our vision is to create devices that are functional not just from an engineering point of view but also from a user-centered perspective, with the goal of creating technology that is intuitive to use and as unobtrusive as possible. We target applications in high-impact settings such as medical care and diagnosis, training, and virtual/augmented reality.

Human Robot Empowerment Lab (HRE Lab)

Laura Hallock, PhD

Assistant Professor, Mechanical Engineering
Lab Website: Human Robot Empowerment Lab (HRE Lab)

We leverage human sensing to build more effective assistive and rehabilitative robots. Housed in Mechanical Engineering, we are building collaborations across engineering and medicine, including with the Craig H. Neilsen Rehabilitation Hospital. We ask how to create better human–robot systems and apply our new sensing and modeling tools to rehabilitative and assistive robots.

Magnetic & Medical Robotics Lab

Jake Abbott, PhD

Professor, Mechanical Engineering
Lab Website: Magnetic & Medical Robotics Lab

The Magnetic & Medical Robotics Lab in the Robotics Center has many projects that incorporate magnetic technologies in robotics, with a large focus on using magnetic fields to dexterously manipulate objects without any physical contact. Many of our projects consider medical applications of robotics, particularly the application of magnetics in medical robotics. Some of our projects include a human operator as a physical element of the control loop, a topic referred to as "haptics"; most of our haptics projects are aimed at medical applications or use magnetic technologies.

Utah Wearable Robotics Lab

Haohan Zhang, PhD

Assistant Professor, Mechanical Engineering
Lab Website: Utah Wearable Robotics Lab

We specialize in wearable robotic solutions by leveraging mechanism design and modern computational methods. We strive to understand the sensorimotor system through experiments using wearable robotic platforms. Our laboratory aspires to make a positive impact on the daily lives of individuals with motor impairments through increased independence and social inclusion.