Bringing Google Home to ELMS:LN

originally posted on Drupal @ Penn State

I've dreamed of a day when our systems start to work like the home automation and listening devices (NSA spying...) that people are inviting into their homes. "Robots" that listen for trigger words and act on commands are very exciting. What's most interesting to me in trying to build such systems is that they really aren't that hard anymore. Why?

Well, the semantic web is what delivers the things for Siri, Google, and Alexa to say on the other end. When you ask about something and it checks Wikipedia, THAT IS AMAZING... but not really that difficult. Voice recognition is being continuously mapped and improved in accuracy daily as a result of people using things like Google Voice for years (where you basically give them your voice as data in order to improve their speech engines).

So I said, well, I'd like to play with these things. I've written about VoiceCommander in the past, but it was mostly proof of concept. Today I'd like to announce the release of VoiceCommander 2.0 with built-in support for "Ok Google" style Wikipedia voice querying!

To do this, you'll need a few things:

  • VoiceCommander 2.0
  • Annyang library in the sites/all/libraries folder (the VoiceCommander project has a drush command for this; see its documentation)
  • Puzzler module
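Under the hood, annyang maps spoken phrases to callbacks using simple command strings, where a `*splat` wildcard captures everything after the trigger words. Here's a minimal sketch of that style of matching (the helper names are mine, not annyang's internals, and the `what is *term` command mirrors what the whatis module listens for):

```javascript
// Sketch of annyang-style command matching: "what is *term"
// captures everything after the trigger words.
// Hypothetical helpers for illustration, not the library's actual code.
function commandToRegExp(command) {
  // Escape regex metacharacters, then turn "*splat" into a capture group
  const pattern = command
    .replace(/[.+?^${}()|[\]\\]/g, '\\$&')
    .replace(/\*\w+/g, '(.*)');
  return new RegExp('^' + pattern + '$', 'i');
}

function matchCommand(commands, phrase) {
  // Try each registered command against the recognized phrase
  for (const [command, callback] of Object.entries(commands)) {
    const match = phrase.trim().match(commandToRegExp(command));
    if (match) return callback(...match.slice(1));
  }
  return null; // nothing matched
}

// Example command, similar in shape to what you'd hand to annyang.addCommands()
const commands = {
  'what is *term': (term) => `Looking up "${term}"`,
};
```

With annyang itself you'd register the same command object via `annyang.addCommands()` and let the browser's speech recognition feed it phrases; the matching idea is the same.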

Enable the voicecommander_whatis module, tweak the VoiceCommander settings to your liking, and then you'll be able to build things like in this demo. The first video is a quick one-minute look at a voice-based navigational system (this is how we do it in ELMS:LN). The second is me talking through what's involved and what's actually happening, as well as A/B comparing different library configuration settings and how they relate to accuracy downstream.
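The Wikipedia side of the "what is" feature amounts to hitting MediaWiki's public API with whatever followed the trigger phrase. A sketch of the kind of request involved (the `action=opensearch` endpoint is Wikipedia's real API; the wrapper function is illustrative, not voicecommander's actual code):

```javascript
// Build a Wikipedia opensearch URL for a spoken query term.
// action=opensearch is part of the public MediaWiki API;
// this wrapper is a hypothetical illustration.
function wikipediaSearchUrl(term) {
  const params = new URLSearchParams({
    action: 'opensearch', // title/summary search
    search: term,         // the words captured after "what is"
    limit: '1',           // top hit only
    format: 'json',
  });
  return `https://en.wikipedia.org/w/api.php?${params.toString()}`;
}
```

From there the module just has to fetch that URL and read the result back out loud via the browser's speech synthesis.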