SBIR Phase I: Assistive Robots for Personal Care and COVID-19 Protection
- Funded by National Science Foundation (NSF)
- Total publications: 0
Grant number: 2036684
Key facts
Disease
COVID-19
Start & end year
2021–2021
Known Financial Commitments (USD)
$255,756
Funder
National Science Foundation (NSF)
Principal Investigator
Michael Dooley
Research Location
United States of America
Lead Research Institution
LABRADOR SYSTEMS INC
Research Priority Alignment
N/A
Research Category
Clinical characterisation and management
Research Subcategory
Supportive care, processes of care and management
Special Interest Tags
Innovation
Study Type
Non-Clinical
Clinical Trial Details
N/A
Broad Policy Alignment
Pending
Age Group
Not Applicable
Vulnerable Population
Not applicable
Occupations of Interest
Not applicable
Abstract
The broader impact/commercial potential of this Small Business Innovation Research (SBIR) Phase I project advances the state of the art of an emerging class of vision-based, autonomous navigation technologies to open new possibilities for low-cost, high-performance personal assistive robots. The robotics solution enables mobility-impaired individuals to have more agency over their environment and enjoy a higher quality of life. It helps address the severe shortage of caregivers for the elderly and post-acute care patients by empowering individuals to maintain their independence, extending the impact of caregivers, and reducing the cost of care in both home and facility settings. Additionally, by providing affordable and reliable isolation support in COVID-19 care settings, the proposed solution can help decrease the financial burden and improve the public health outcomes associated with COVID-19 disease management. The core robotics solution has an immediate addressable market of 11 million high-needs users in the U.S. alone, with projected revenues of roughly $1.65 billion five years after product launch. Further commercialization opportunities come from licensing parts of the developed navigation technology for other robotics applications and developing an ecosystem of complementary products around the core robotics solution.
This Small Business Innovation Research Phase I project seeks to enable a new generation of assistive service robots that are comparable to commercial robots in performance, but significantly more affordable for individual use and personal care applications. The innovation adopts emerging visual positioning technologies from Augmented Reality to enable robust navigation for mobile robots using low-cost, consumer-grade electronics, while addressing a key limitation of visual positioning systems, namely that external lighting conditions and other changes in an environment can dramatically impact their performance. The innovation addresses these challenges via a combination of hardware and software that learns and stabilizes the highest-value visual elements of the environment to maintain persistency across lighting conditions and long periods of time, a development critical to making assistive robots cost-effective for adoption at a large scale. Research objectives include: fully developing and integrating the visual persistency system to achieve accurate and replicable robot navigation performance across a representative range of lighting conditions and visual characteristics of the target operating environments; and benchmarking the resulting solution against state-of-the-art technologies to demonstrate its superior performance (i.e., successful localization in at least 90% of cases where other solutions fail).
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
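The abstract describes, but does not specify, how the system decides which visual elements of an environment remain persistent across lighting changes. The sketch below is purely illustrative and is not the awarded project's method: it assumes a hypothetical landmark-map representation and a simple re-observation-rate scoring rule to show the general idea of retaining only lighting-stable landmarks for localization.

```python
# Hypothetical sketch, NOT Labrador Systems' implementation: score mapped
# visual landmarks by how reliably they are re-observed across mapping
# sessions recorded under different lighting, and keep only stable ones.
from dataclasses import dataclass, field


@dataclass
class Landmark:
    landmark_id: int
    # session name -> True if the landmark was re-detected in that session
    observations: dict[str, bool] = field(default_factory=dict)

    def persistence_score(self) -> float:
        """Fraction of sessions in which this landmark was re-detected."""
        if not self.observations:
            return 0.0
        return sum(self.observations.values()) / len(self.observations)


def stable_landmarks(landmarks: list[Landmark], threshold: float = 0.8) -> list[Landmark]:
    """Keep landmarks re-observed in at least `threshold` of the sessions."""
    return [lm for lm in landmarks if lm.persistence_score() >= threshold]


if __name__ == "__main__":
    sessions = ["daylight", "evening_lamps", "night_low_light"]
    # A structural feature (e.g., a window frame) seen under all lighting:
    lm_window = Landmark(1, {s: True for s in sessions})
    # A lighting-dependent artifact (e.g., a shadow edge) seen only in daylight:
    lm_shadow = Landmark(2, {"daylight": True, "evening_lamps": False,
                             "night_low_light": False})
    kept = stable_landmarks([lm_window, lm_shadow])
    print([lm.landmark_id for lm in kept])  # -> [1]
```

In practice, a visual persistency system of the kind described would operate on real feature maps and far richer signals than a simple re-observation count; this toy threshold only illustrates the filtering concept implied by the abstract.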