FY 2019-2020

Smart Room Demonstration

This demonstration provides assistive technology support for clients seeking to decrease reliance on paid staff. By using Alexa, Google Assistant, and Siri to interact with hubs, devices, and the environment, residential users benefit from voice-, button-, and switch-activated access to environmental controls, entertainment, and communication tools.
All components in the demo are generic but can be customized to the needs of each user.

Comfort and Tools for Living

This is our newest and least developed area of inquiry. We are currently seeking a residential pilot partner to implement remote support for safety monitoring. For example, staff can use the system to detect falls, unlocked doors or windows, or appliances left on. The client can learn who is at the front door and remotely unlock it. We can currently empower the client to control the heater or air conditioner and to monitor appliances using IoT plugs. In the future, we hope to use IoT devices and a voice assistant to empower clients to raise and lower their bed or seating.
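One piece of the monitoring above, flagging an appliance that has been left on, can be reduced to watching the power readings an IoT plug reports. The sketch below is a minimal illustration, not code from the pilot: the wattage samples, standby threshold, and sample limit are all assumed values, and the actual plug polling is left out.

```python
def left_on_alert(watt_samples, standby_watts=5.0, limit_samples=3):
    """Return True if power draw stays above standby level for too long.

    watt_samples: successive power readings from an IoT plug (watts).
    standby_watts: draw at or below this is treated as "off" (assumed value).
    limit_samples: consecutive above-standby readings that trigger an alert.
    """
    streak = 0
    for watts in watt_samples:
        # Count consecutive readings that look like an appliance running.
        streak = streak + 1 if watts > standby_watts else 0
        if streak >= limit_samples:
            return True
    return False
```

In a deployment, staff would poll the plug on a schedule and feed the readings in; a burst of high draw (e.g. a kettle) resets safely once the appliance switches off.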

Robotic Turntable for Individuals with Limited Range of Motion

This experiment combines micro-controllers, stepper motors, servos, and a voice assistant to enable an individual with limited range of motion to use their voice to direct the rotation and tilt of a work surface or even a dinner plate!
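Turning a spoken command such as "rotate ninety degrees" into motor motion comes down to converting degrees into stepper pulses. The sketch below shows that conversion under assumed hardware values; the step count per revolution, microstepping setting, and gear ratio are illustrative, not measurements from the actual turntable.

```python
STEPS_PER_REV = 200   # typical 1.8-degree stepper motor (assumed)
MICROSTEPS = 16       # driver microstepping setting (assumed)
GEAR_RATIO = 4        # motor revolutions per turntable revolution (assumed)

def degrees_to_steps(degrees: float) -> int:
    """Return the microstep count needed to rotate the table by `degrees`."""
    steps_per_table_rev = STEPS_PER_REV * MICROSTEPS * GEAR_RATIO
    return round(degrees / 360.0 * steps_per_table_rev)

print(degrees_to_steps(90))  # quarter turn of the table -> 3200
```

The micro-controller would then emit that many step pulses to the driver, with direction set by the sign of the requested angle.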

Alexa-Enabled Conveyor Mechanism for DOBOT Magician

This experiment uses metal rails to expand the working area of the DOBOT Magician robotic arm. The system includes the DOBOT Magician, stepper motors, bearings, limit switches, a micro-controller, and a choice of wired or wireless access.

Touchless Gesture Ultrasonic ‘Harp’

Using an ultrasonic sensor and a Raspberry Pi, this instrument translates gestures into sounds and/or light. Sounds can be played over built-in or external speakers.
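The core of the instrument is mapping the distance between the player's hand and the sensor to a musical pitch. A minimal sketch of that mapping is below, quantizing the reading to a ten-note major pentatonic scale so any gesture lands on a consonant note. The distance range, scale, and root pitch are assumptions, and the sensor read itself (e.g. via an ultrasonic module on the Raspberry Pi's GPIO) and audio output are stubbed out.

```python
A4 = 440.0                               # root pitch (assumed)
PENTATONIC_SEMITONES = [0, 2, 4, 7, 9]   # major pentatonic offsets

def distance_to_frequency(distance_m, min_d=0.05, max_d=0.50):
    """Quantize a hand-distance reading (meters) into one of ten notes."""
    clamped = min(max(distance_m, min_d), max_d)
    # Scale the usable range onto note indices 0..9.
    index = int((clamped - min_d) / (max_d - min_d) * 9.999)
    octave, degree = divmod(index, len(PENTATONIC_SEMITONES))
    semitones = 12 * octave + PENTATONIC_SEMITONES[degree]
    # Equal-temperament frequency for that many semitones above the root.
    return A4 * 2 ** (semitones / 12.0)
```

In the instrument loop, the Pi would read the sensor repeatedly, call this function, and drive a tone generator (and optionally lights) at the returned frequency.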

Alternative Interfaces for Computer Access

Depending on the needs of the user, accessing a computer or other device can be difficult or impossible. By employing customized switches and buttons, input can be triggered by the head, cheek, foot, knee, or breath. These experiments are bespoke and time-intensive.
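A recurring detail in switch access is debouncing: a mechanical switch bounces when pressed, and without filtering, one cheek or head press can register as several inputs. The sketch below is one common software approach, requiring the raw contact state to stay stable for a short hold time before reporting a press. The hold time is an assumed value, and the GPIO read and the triggered action (e.g. a key event) are application-specific and omitted.

```python
class DebouncedSwitch:
    """Report a single press event per physical switch activation."""

    def __init__(self, hold_ms: float = 30.0):
        self.hold_ms = hold_ms    # time the contact must stay stable (assumed)
        self._stable = False      # last confirmed (debounced) state
        self._candidate = False   # raw state we are waiting to confirm
        self._since_ms = 0.0      # how long the candidate state has persisted

    def sample(self, raw_pressed: bool, elapsed_ms: float) -> bool:
        """Feed one raw reading; return True only on a confirmed press edge."""
        if raw_pressed != self._candidate:
            # Raw state changed (possibly bounce): restart the stability timer.
            self._candidate = raw_pressed
            self._since_ms = 0.0
        else:
            self._since_ms += elapsed_ms
        if self._since_ms >= self.hold_ms and self._candidate != self._stable:
            self._stable = self._candidate
            return self._stable   # True on a confirmed press, False otherwise
        return False
```

Polled every few milliseconds, this yields exactly one trigger per deliberate press while ignoring contact bounce and brief accidental brushes.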