Developing a conversational agent for Google Home is complex. But the hardest part is testing it.
Especially if you work in an open-plan office full of facetious colleagues.
OCTO’s BYOP (Bring Your Own Pet) policy does not help: pet sounds are also picked up by Google Home devices.
To our astonishment, we found that this detection is remarkably sophisticated: Google Home seems to have acquired the ability to understand the language of animals!
Dog or Chat Bot?
For example, here is the JSON returned by a Google Home device when Oriane’s cat mewed during a test sequence:
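The original embedded snippet did not survive; the payload looked roughly like this, reconstructed from the attributes documented in the table below (the values are illustrative):

```json
{
  "rawtext": "miaew",
  "locale": "cat-fr",
  "textValue": "Miaou",
  "size": "small",
  "mood": "cool"
}
```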
Intrigued, we tried with Fonzy, our CEO’s dog. This time, the JSON contained the following sequence:
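Again reconstructed from the attributes documented in the table below, with illustrative values (Fonzy’s actual mood is anyone’s guess):

```json
{
  "rawtext": "wooaaf ooaf",
  "locale": "dog-fr",
  "textValue": "Ouaf ouaf",
  "size": "medium",
  "mood": "passive"
}
```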
We reached out to our contacts at Google to find out more.
They confirmed it: since April 2017, Google Speech has used YouTube as its main training source. Given the massive number of cat and dog videos hosted on YouTube, Google’s machine-learning algorithms have automatically learned to classify animal sounds.
Google is collaborating with the startup No More Woof and plans to extend the Google Translate service to promote inter-species communication, allowing your dogs and cats to order food in your absence.
For now, access to the service remains secured: the voice command “OK Google” is still required before Rex can place its order and eat its own dog food without you.
Google declined to comment further or to disclose the list of supported animals. We had to proceed by trial and error to find out which animal sounds work and what type of information is detected and returned.
Here are the first results:
| Attribute | Description | Example of collected data |
| --- | --- | --- |
| rawtext | Phonetic transcription of the sound | “wooaaf ooaf”, “miaew”, “mooo”, “onk” |
| locale | Species and local dialect | “dog-fr”, “cat-fr”, “cow-fr”, “camel-fr” |
| textValue | Value transcribed in the default language of the Google Home device | “Ouaf ouaf”, “Miaou”, “Meuh”, “Onk” |
| size | (Animal?) size | “small”, “medium”, “large”, “huge” |
| mood | Emotional state (?) | “cool”, “passive”, “aggressive” |
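A minimal sketch of how a client might consume such a payload, assuming the field names from the table above (the payload itself and the helper function are hypothetical):

```python
import json

# Hypothetical payload, shaped after the attributes listed in the table above.
payload = """
{
  "rawtext": "miaew",
  "locale": "cat-fr",
  "textValue": "Miaou",
  "size": "small",
  "mood": "cool"
}
"""

def describe_animal(raw: str) -> str:
    """Turn a (hypothetical) animal-speech payload into a readable summary."""
    data = json.loads(raw)
    # "cat-fr" -> species "cat", dialect "fr"
    species, dialect = data["locale"].split("-")
    return f'{data["size"]} {species} ({dialect}) sounding {data["mood"]}: "{data["textValue"]}"'

print(describe_animal(payload))
# → small cat (fr) sounding cool: "Miaou"
```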
According to our observations, it doesn’t work with all animals yet.
For example, understanding the intentions of the OCTO duck is limited. The Mooh box works better.
The goldfish at our office’s reception desk was not recognized when we held a Google Home Mini against the side of its bowl.
We blamed the thickness of the glass and plunged the device into the water. For some unknown reason, it short-circuited before the first bubbles were recognized.
Spurred on by this new challenge, our teams will continue the experiments and keep you posted on the progress of voice recognition for fish.
To be continued…