
Completed · Standard Grant · National Science Foundation (US)

SBIR Phase I: Narrative interface technology to support two-way human-computer interaction for the disabled community

$2.22M USD

Funder National Science Foundation (US)
Recipient Organization Roplagarin Llc
Country United States
Start Date Aug 15, 2023
End Date Oct 31, 2024
Duration 443 days
Number of Grantees 1
Roles Principal Investigator
Data Source National Science Foundation (US)
Grant ID 2304553
Grant Description

The broader impact/commercial potential of this Small Business Innovation Research (SBIR) Phase I project is to improve software accessibility for the roughly 61 million Americans living with a disability. Interactive technology is not currently standardized, and end-users must rely on accessibility features built into individual software products, which are often inconsistent or nonexistent.

The proposed accessibility interface will enable end-users to use 80% of commercially available software, improving the standard of living for individuals with disabilities and allowing them to engage online both professionally and socially. At the same time, the innovation will allow software distributors to provide low-cost accessibility solutions, fulfill accessibility mandates, and increase their products’ use and revenue.

The innovation will overcome previous limitations of state-of-the-art accessibility solutions, as it will be able to accommodate high-intensity software such as augmented reality (AR). It will also be backward compatible, adding functionality to older platforms and providing a robust and enduring competitive advantage. US-based medical technology companies, game developers, and organizations with accessibility mandates will be targeted initially.

This Small Business Innovation Research (SBIR) Phase I project seeks to provide interactive technology accessibility for blind, deaf, and physically disabled users without requiring additional accessibility hardware. A machine learning (ML) engine will translate visual data into text blocks delivered via text or text-to-speech, with visual information recognized through embedded data.

Natural language processing (NLP) will be used to bind actions to spoken phrases, allowing the end-user complete control of the integrated software. This project will test the interface’s ability to integrate with personal productivity software, web browsers, and a video game, with the goal of enabling users to attain 95% functionality across 80% of software on the market.
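The phrase-to-action binding described above can be illustrated with a minimal sketch. This is a hypothetical implementation, not code from the award: the `CommandBinder` class and the bound phrases are invented for illustration, and a real system would sit on top of a speech-recognition engine rather than plain strings.

```python
# Hypothetical sketch of binding approved spoken phrases to software controls.
# All names (CommandBinder, the sample phrases) are illustrative assumptions.

from typing import Callable, Dict


class CommandBinder:
    """Maps recognized spoken phrases to callable software actions."""

    def __init__(self) -> None:
        self._bindings: Dict[str, Callable[[], str]] = {}

    def bind(self, phrase: str, action: Callable[[], str]) -> None:
        # Normalize the phrase so recognition output matches regardless of
        # case or surrounding whitespace.
        self._bindings[phrase.strip().lower()] = action

    def dispatch(self, spoken: str) -> str:
        """Run the action bound to a recognized phrase, if any."""
        action = self._bindings.get(spoken.strip().lower())
        if action is None:
            return f"No action bound to: {spoken!r}"
        return action()


binder = CommandBinder()
binder.bind("open browser", lambda: "browser opened")
binder.bind("save document", lambda: "document saved")
print(binder.dispatch("Open Browser"))  # case-insensitive match
```

The key design point, reflected in the project description, is that the interface listens only for approved phrases: anything outside the bound vocabulary is rejected rather than guessed at, which keeps control of the integrated software predictable for the end-user.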

The research and development will involve:
i) training the ML engine to use embedded data to reinterpret visual data as text and produce dynamically-created sentences listing all interactive objects within the user’s field of influence,
ii) developing the controller interface, which will be driven by user input, listen for approved phrases, and activate desired software controls, and
iii) expanding the interface’s versatility by creating and implementing ML protocols to add accessibility features to any existing or future software.
The result will be an early prototype with moderate-to-full functionality across selected software.
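Step i) above, producing a dynamically-created sentence that lists the interactive objects in the user's field of influence, can be sketched as follows. The function name and the shape of the embedded metadata (`name`/`role` records) are assumptions for illustration; the award does not specify a data format.

```python
# Hypothetical sketch: turning embedded UI metadata into one spoken-style
# sentence listing the interactive objects available to the user.
# The metadata schema ({"name": ..., "role": ...}) is an assumption.

def describe_interactive_objects(objects):
    """Build a single sentence naming each interactive object and its role."""
    if not objects:
        return "No interactive objects are available."
    parts = [f"{o['name']} ({o['role']})" for o in objects]
    if len(parts) == 1:
        listing = parts[0]
    else:
        # Join all but the last with commas, then add the final item.
        listing = ", ".join(parts[:-1]) + " and " + parts[-1]
    return f"You can interact with: {listing}."


ui_metadata = [
    {"name": "Play", "role": "button"},
    {"name": "Volume", "role": "slider"},
    {"name": "Chat", "role": "text field"},
]
print(describe_interactive_objects(ui_metadata))
# → You can interact with: Play (button), Volume (slider) and Chat (text field).
```

A sentence like this, delivered via text or text-to-speech, is what would let a blind user discover the same controls a sighted user sees on screen.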

This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.

All Grantees

Roplagarin Llc
