CAREER: Re-Thinking the Perception-Action Paradigm for Agile Autonomous Robots
- Funded by National Science Foundation (NSF)
- Total publications: 0
Grant number: 2145277
Key facts
- Disease: COVID-19
- Start & end year: 2022–2027
- Known Financial Commitments (USD): $500,000
- Funder: National Science Foundation (NSF)
- Principal Investigator: Giuseppe Loianno
- Research Location: United States of America
- Lead Research Institution: New York University
- Research Priority Alignment: N/A
- Research Category: Secondary impacts of disease, response & control measures
- Research Subcategory: Social impacts
- Special Interest Tags: N/A
- Study Type: Non-Clinical
- Clinical Trial Details: N/A
- Broad Policy Alignment: Pending
- Age Group: Not Applicable
- Vulnerable Population: Not applicable
- Occupations of Interest: Not applicable
Abstract
Autonomous robots will become pervasive in our society and will solve complex tasks, actively collaborating with each other and with humans. As the recent COVID-19 outbreak has highlighted, autonomous robots can solve a range of time-sensitive problems including logistics, reconnaissance, and disinfection of critical areas. Beyond the pandemic, small-scale robots can help humans in complex or dangerous tasks such as search and rescue, security, and surveillance, and, thanks to their lighter weight, they pose only a modest risk to human safety. These time-sensitive tasks require robots to make fast decisions and agile maneuvers in complex and dynamic environments. State-of-the-art autonomous navigation approaches, while mature, are slow and brittle, preventing robust and resilient agile navigation. This Faculty Early Career Development (CAREER) Program studies the fundamental perception-action problem for agile navigation of autonomous robots in complex environments by developing a novel low-latency, robust, adaptive, safe, and resilient paradigm. This project also aims to educate students on the technical aspects, societal benefits, and ethical use of autonomous systems by establishing a unique multi-disciplinary and inclusive research and educational platform, which includes a core curriculum on robot localization and navigation and a series of online racing hackathons for a customized and inclusive post-pandemic research and educational experience. These efforts will contribute to lowering the barrier to participation in research and education for students, particularly underrepresented minorities.
This project will generate a new foundational theory, including models and algorithms resulting from a principled combination of perception, learning, and control, to holistically design visual perception and action and thereby create small-scale agile autonomous robots. The goal is to capture the strict cross-coupling effects between perception and action and to resolve the perception-action problem jointly and concurrently, speeding up the robots' decision-making process and increasing their agility. The project is organized in three thrusts according to a series of objectives, culminating in innovations in robotics autonomy research and education. A compressed and unified representation of the perception and action spaces is designed to reduce the robot's inference latency and naturally reveals the cross-coupling effects between them. Next, using this representation, the robot will exploit its action-predictive information to enhance its inference capabilities and will employ an optimal control/planning approach to maximize its perception accuracy and quality.
This project is supported by the cross-directorate Foundational Research in Robotics program, jointly managed and funded by the Directorates for Engineering (ENG) and Computer and Information Science and Engineering (CISE).
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.