The University of Utah Robotics Center focuses on the interrelated areas of haptics, augmented reality (AR), virtual reality (VR), human-robot interaction (HRI), and interfaces that enable humans to interact with computers and cyber-physical systems. In haptics, the focus is on creating realistic touch sensations for users interacting with remote or virtual environments. Research topics include the development of new technologies as well as the study of humans as they interact with them. HRI primarily involves humans and robots exchanging information, but it can also involve physical interaction and collaboration.

Magnetic & Medical Robotics Lab

Jake Abbott, PhD

Professor
Department of Mechanical Engineering

The Magnetic & Medical Robotics Lab in the Robotics Center pursues many projects that incorporate magnetic technologies in robotics, with a particular focus on using magnetic fields to dexterously manipulate objects without any physical contact. Many of our projects consider medical applications of robotics, especially the application of magnetics in medical robotics. Some of our projects consider a human operator as a physical element of the control loop, a topic referred to as "haptics"; most of our haptics projects are aimed at medical applications or use magnetic technologies.

Human-Centered Haptics & Robotics Lab

Edoardo Battaglia, PhD

Assistant Professor
Department of Mechanical Engineering

The Human-Centered Haptics and Robotics (h-CHAR) Laboratory focuses on the design and evaluation of robotic and mechatronic systems for human-focused applications, such as sensing of kinematics and interaction forces, haptic feedback, and training and assistive tasks. Our vision is to create devices that are functional not just from an engineering point of view but also from a user-centered perspective, with the goal of creating technology that is intuitive to use and as unobtrusive as possible. We target applications in high-impact settings such as medical care and diagnosis, training, and virtual/augmented reality.

Aligned, Robust, and Interactive Autonomy (ARIA) Lab

Daniel Brown, PhD

Assistant Professor
Kahlert School of Computing

My work seeks to directly and efficiently incorporate human input into both the theory and practice of robust machine learning. I apply rigorous theory and state-of-the-art machine learning techniques to enable AI systems to maintain, be robust to, and actively reduce uncertainty over both the human’s intent and the corresponding optimal policy. I evaluate my research across a range of applications, including autonomous driving, service robotics, and dexterous manipulation. My goal is to develop robust AI systems that can be deployed in risk-sensitive settings such as self-driving cars, healthcare, and domestic service robots.

Neurorobotics Lab

Jacob George, PhD

Assistant Professor
Department of Electrical & Computer Engineering
Physical Medicine & Rehabilitation

Our research seeks to augment biological neural networks with artificial neural networks and bionic devices to treat neurological disorders and to further our understanding of neural processing. Working at the intersection of artificial intelligence, robotics, and neuroscience, we are developing biologically inspired artificial intelligence and brain-machine interfaces to restore and/or enhance human function.

Human Robot Empowerment Lab (HRE Lab)

Laura Hallock, PhD

Assistant Professor
Department of Mechanical Engineering

We’re working to leverage human sensing to build more effective assistive and rehabilitative robots. Housed in Mechanical Engineering, we’re building collaborations across engineering and medicine, including with the Craig H. Neilsen Rehabilitation Hospital. We ask questions about how to create better human–robot systems and apply our new sensing and modeling tools to rehabilitative and assistive robots.

Hollerbach Lab

John Hollerbach, PhD (UURC Director)

Professor
Kahlert School of Computing

Professor Hollerbach's research involves robotics, virtual environments, and human motor control. Current active projects include locomotion interfaces and haptic interfaces.

Robotic Systems Lab

Mark A. Minor, PhD

We synergize design, modeling, and control of robotic systems to arrive at novel embodiments that provide new levels of adaptability, mobility, and immersion that would not otherwise be possible.

Our lab branches into several types of robotic systems, including climbing robots, terrain-adaptable mobile robots, virtual interfaces, autonomous vehicles, and flying robots (ornithopters, helicopters, etc.). We have extensive expertise in the design and control of under-actuated nonholonomic systems, kinematic motion control, dynamic motion control, state estimation, sensor development, and data fusion.

