[Q] Can an app developer access the Moto X's natural language processor?
One of the Moto X's features is always-on listening (Touchless Control), handled by a dedicated low-power core in the X8 chip that exists specifically for natural language processing. As far as I know, the only thing this core is used for is launching Google Now (saying "OK Google Now" triggers it).
Does anyone know whether it's possible to access this language processor from a custom app? I'd love to use the Wit API (a natural language processing service that's free for open-source projects) to make my own functionality accessible by voice. Wit already handles both speech recognition and query interpretation, so the only missing piece is the ability to trigger a custom app with a custom wake phrase instead of launching Google Now (the default). Doing the wake-phrase detection purely in software isn't really viable, because keeping the main CPU listening constantly would drain the battery like mad.
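To be concrete about what I mean by "triggering on a custom phrase": once some recognizer (ideally the X8's low-power core, otherwise software) hands back a transcript, the app-side matching step is trivial. A rough sketch below; the class and method names are just mine, not any real Motorola or Android API.

```java
// Hypothetical wake-phrase matcher: given a transcript from whatever
// recognizer is available, decide whether it starts with our custom phrase.
public class HotwordMatcher {
    // Case-insensitive check that the transcript begins with the wake phrase.
    public static boolean matchesHotword(String transcript, String hotword) {
        String t = transcript.trim().toLowerCase();
        String h = hotword.trim().toLowerCase();
        return t.startsWith(h);
    }
}
```

The hard part isn't this matching, obviously; it's getting transcripts continuously without burning battery, which is exactly what the dedicated core would solve.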
Can anyone give me a push in the right direction, or confirm whether this is even possible?