The University of Utah Robotics Center is a pioneer in Rehabilitative and Assistive Robotics research, dedicated to developing advanced technologies that enhance quality of life for individuals with disabilities and improve rehabilitation outcomes. Researchers at the center design robotic systems and assistive devices tailored to specific mobility and rehabilitation needs, including exoskeletons, prosthetic devices, and robotic aids that assist individuals with movement impairments in daily activities and therapeutic exercises.

The center's interdisciplinary approach integrates expertise from engineering, rehabilitation sciences, and healthcare, ensuring that research outcomes are clinically relevant and user-centered. Collaborative efforts aim to improve mobility, functionality, and independence for individuals with disabilities. Through innovative design, advanced control algorithms, and user-centric approaches, the center is driving transformative advances in rehabilitative and assistive robotics, positively impacting the lives of people with diverse mobility challenges.

Aligned, Robust, and Interactive Autonomy (ARIA) Lab

Daniel Brown, PhD

Assistant Professor, School of Computing
Website: https://www.cs.utah.edu/~dsbrown/

The Aligned, Robust, and Interactive Autonomy (ARIA) Lab focuses on developing algorithms that enable robots and other AI systems to safely and efficiently interact with, learn from, teach, and empower human users. Our research spans the areas of human-robot interaction, reward and preference learning, imitation learning, human-in-the-loop reinforcement learning, and AI safety. We develop both algorithms and theory, deploy these algorithms on robot hardware platforms, and run user studies to better understand human factors. We are interested in a diverse set of applications including domestic service robots, assistive and medical robotics, bio-inspired swarms, autonomous driving, and personal AI assistants.

HGN Lab for Bionic Engineering

Tommaso Lenzi, PhD

Associate Professor, Mechanical Engineering
Lab Website: HGN Lab for Bionic Engineering

We envision a world where everyone can move and live independently. We envision a world where congenital or acquired body differences, trauma, and injury would not prevent people from pursuing their life goals. We envision a world where advanced bionic technologies will enhance the human body by restoring, preserving, and augmenting human movement ability across the lifespan. To achieve this goal, we focus on the intersection of Robotics, Design, Control, Biomechanics, and Neural Engineering. Our goal is to create new science and develop new technologies empowering the next generation of wearable bionic devices and systems to help people move and live independently, ultimately ending physical disability.

Human-Centered Haptics & Robotics Lab

Edoardo Battaglia, PhD

Assistant Professor, Mechanical Engineering
Lab Website: Human-Centered Haptics and Robotics Lab

The Human-Centered Haptics and Robotics (h-CHAR) Laboratory focuses on the design and evaluation of robotic and mechatronic systems for human-focused applications, such as sensing of kinematics and interaction forces, haptic feedback, and training and assistive tasks. Our vision is to create devices that are functional not just from an engineering point of view but also from a user-centered perspective, with the goal of creating technology that is intuitive to use and as unobtrusive as possible. We target applications in high-impact settings such as medical care and diagnosis, training, and virtual/augmented reality.

Human Robot Empowerment Lab (HRE Lab)

Laura Hallock, PhD

Assistant Professor, Mechanical Engineering
Lab Website: Human Robot Empowerment (HRE) Lab

We leverage human sensing to build more effective assistive and rehabilitative robots. Housed in Mechanical Engineering, we are building collaborations across engineering and medicine, including with the Craig H. Neilsen Rehabilitation Hospital. We ask how to create better human–robot systems and apply our new sensing and modeling tools to rehabilitative and assistive robots.

Neurorobotics Lab

Jacob George, PhD

Assistant Professor, Electrical & Computer Engineering, Physical Medicine & Rehabilitation
Lab Website: Neurorobotics Lab

Our research seeks to augment biological neural networks with artificial neural networks and bionic devices to treat neurological disorders and to further our understanding of neural processing. Working at the intersection of artificial intelligence, robotics, and neuroscience, we are developing biologically inspired artificial intelligence and brain–machine interfaces to restore and enhance human function.

Robotic Systems Lab

Mark A. Minor, PhD

We combine the design, modeling, and control of robotic systems to arrive at novel embodiments that provide levels of adaptability, mobility, and immersion that would not otherwise be possible.

Our lab works on several types of robotic systems, including climbing robots, terrain-adaptable mobile robots, virtual interfaces, autonomous vehicles, and flying robots (ornithopters, helicopters, etc.). We have extensive expertise in the design and control of underactuated nonholonomic systems, kinematic motion control, dynamic motion control, state estimation, sensor development, and data fusion.

Utah Wearable Robotics Lab

Haohan Zhang, PhD

Assistant Professor, Mechanical Engineering
Lab Website: Utah Wearable Robotics Lab

We specialize in wearable robotic solutions, leveraging mechanism design and modern computational methods. We strive to understand the sensorimotor system through experiments using wearable robotic platforms. Our laboratory aspires to make a positive impact on the daily lives of individuals with motor impairments through increased independence and social inclusion.