Intent–Constraint–Risk Sharing for Human–Robot Teams
Description:
The team develops and evaluates a shared-autonomy framework that fuses human intent inference, explicit task and safety constraints, and risk-aware decision making to coordinate human–robot collaboration. They model human intent from multimodal cues (e.g., pose, gestures, task context), encode constraints and safety limits in the controller (e.g., kinematic/force bounds, keep-out zones, ergonomic limits), and implement risk-sharing policies that adapt robot behavior to uncertainty and proximity. They prototype the framework on a representative co-manipulation or hand-off task in ROS2, log interaction metrics (throughput, near-misses, interventions), and assess user workload and trust with standard instruments, releasing code and a concise evaluation report.
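A minimal sketch of the kind of risk-sharing logic the description refers to is shown below, purely for illustration: it blends a human velocity command with an intent-confidence weight and a proximity-based risk factor, then enforces a velocity bound and keep-out zones. All function names, thresholds, and parameters here are hypothetical placeholders, not the project's actual implementation.

```python
# Illustrative sketch only: authority sharing scaled by intent confidence and
# proximity, with hard velocity and keep-out constraints. Names/values are assumed.
import numpy as np

def risk_shared_command(v_human, intent_conf, dist_to_human, v_max=0.25,
                        d_safe=0.30, keep_out=None, position=None):
    """Scale the commanded end-effector velocity by intent confidence and
    human proximity, clamp it to a speed bound, and stop inside keep-out zones."""
    # Risk factor: shrink toward zero as the robot approaches the human.
    proximity_scale = np.clip(dist_to_human / d_safe, 0.0, 1.0)
    # Shared authority: low intent confidence -> more conservative motion.
    v = np.asarray(v_human, dtype=float) * intent_conf * proximity_scale
    # Hard constraint: saturate speed at the controller's velocity bound.
    speed = np.linalg.norm(v)
    if speed > v_max:
        v = v * (v_max / speed)
    # Hard constraint: zero the command inside any keep-out sphere (center, radius).
    if keep_out is not None and position is not None:
        for center, radius in keep_out:
            if np.linalg.norm(np.asarray(position) - np.asarray(center)) < radius:
                return np.zeros_like(v)
    return v

# Example: half-confident intent, 20 cm from the human, end effector inside a keep-out sphere.
cmd = risk_shared_command([0.2, 0.0, 0.1], intent_conf=0.5, dist_to_human=0.20,
                          keep_out=[((0.5, 0.0, 0.3), 0.15)], position=(0.4, 0.0, 0.3))
print(cmd)  # -> [0. 0. 0.]: the keep-out constraint overrides the scaled command
```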
Duration: Fall/Spring
Team: Graduate/Undergraduate students
Compensation: Unpaid
Preferred Skills and Knowledge:
- Python and Unity
- ROS2 and basic robot kinematics
- Human factors & evaluation: experiment design, logging, and surveys (e.g., NASA-TLX, Trust in Automation)
Application Procedure:
Please send your resume and a brief statement of which project areas interest you and why to Chang S. "CS" Nam (csnam@ncat.edu).