Voice-based usability
If you've been following me on Twitter, you know I've been tinkering with spoken interfaces. In 2015, Michael Collins (lead faculty) and I envisioned a system (in elms) that you could talk to and have it know what to do. Not AI per se, but a really simplistic agency for the system that understood enough to make things easier for you. The foundational work for that is in creating a spoken interface. This is something core in elmsln (and it works in Drupal in general) that has seen a lot of usability refinement over the last two weeks of playing.
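As a rough illustration of what a spoken interface in the browser involves: the sketch below uses the Web Speech API's SpeechRecognition (which is what most browser-side voice work builds on) to turn speech into a transcript and match it against known phrases. The specific command phrases and paths here are hypothetical, not the actual elmsln ones, and the recognition wiring is guarded so the matching logic can be exercised anywhere.

```javascript
// Hypothetical command table: phrases mapped to destinations.
// These names are illustrative only, not elmsln's real commands.
const commands = {
  'go home': '/home',
  'open courses': '/courses',
};

// Pure helper: return the destination for the first known phrase
// found in a transcript, or null if nothing matches.
function matchCommand(transcript) {
  const text = transcript.toLowerCase();
  for (const phrase of Object.keys(commands)) {
    if (text.includes(phrase)) return commands[phrase];
  }
  return null;
}

// Browser-only wiring (guarded so this also loads outside a browser).
if (typeof window !== 'undefined' &&
    ('SpeechRecognition' in window || 'webkitSpeechRecognition' in window)) {
  const Rec = window.SpeechRecognition || window.webkitSpeechRecognition;
  const rec = new Rec();
  rec.continuous = true; // akin to the "continual listening mode" described here
  rec.onresult = (e) => {
    const transcript = e.results[e.results.length - 1][0].transcript;
    const target = matchCommand(transcript);
    if (target) window.location.pathname = target;
  };
  rec.start();
}
```

The real integration is more involved (permissions, restarts when recognition times out, role checks), but the transcript-to-command matching is the heart of it.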
This video shows the current state of the integration and is a window into things to come as these technologies stabilize, get easier to work with, and we integrate them further and get additional user feedback about their use. In this video I've turned on continual listening mode (which is not the default setting) to allow for a seamless experience without having to press any buttons. It is our hope that down the road this option will be available for anyone to enable (right now only admins can go hands-free, staff can use the key command, and by default no one else has these options... yet).
From here, getting the system to speak back to the user, adding custom commands, and improving the usability of this mini-form of agency are not terribly difficult.
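To give a feel for those next steps, here is a minimal sketch of a custom-command registry with spoken responses. It assumes the browser's speech synthesis side of the Web Speech API (`speechSynthesis` / `SpeechSynthesisUtterance`); the registered phrase and its reply are made up for illustration, not taken from elmsln.

```javascript
// Hypothetical registry of custom commands: trigger phrase -> handler
// that returns the sentence the system should speak back.
const registry = new Map();

// Register a custom command under a trigger phrase.
function addCommand(phrase, handler) {
  registry.set(phrase.toLowerCase(), handler);
}

// Run the first registered command found in a transcript and return
// its spoken response, or null if nothing matched.
function runCommand(transcript) {
  const text = transcript.toLowerCase();
  for (const [phrase, handler] of registry) {
    if (text.includes(phrase)) return handler();
  }
  return null;
}

// Speak a response aloud when speech synthesis is available (browser only).
function speak(sentence) {
  if (typeof window !== 'undefined' && 'speechSynthesis' in window) {
    window.speechSynthesis.speak(new SpeechSynthesisUtterance(sentence));
  }
}

// Example custom command (made up for this sketch):
addCommand('what page is this', () => 'You are on the course outline.');
```

Wiring `runCommand` into the recognition callback and passing its result to `speak` is what closes the loop, so the system both listens and talks back.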