[Q] Can an app developer access the Moto X's natural language processor?


xtagon · New member · Nov 21, 2013 · Oregon, US · www.xtagon.com

Hi guys,

One of the Moto X's features is always-on listening (Touchless Control), handled by a dedicated core in the X8 chip that exists specifically for natural language processing. As far as I know, the only thing it is used for is Google Now (saying "OK, Google Now" to trigger it).

Do any of you know if it's possible to access this language processor from a custom app? I would love to use the Wit API (a natural language processing service that is free for open source projects) to make my own functionality accessible by voice. Wit already handles both the speech recognition and the interpretation of the query, so all that's missing is a way to trigger a custom app with a custom voice phrase instead of triggering Google Now (the default). There's not much point in doing this in software only, because it would drain the battery like mad.

Can anyone give me a push in the right direction, or tell me whether this is possible?
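
To make it concrete, here is roughly what the interpretation half would look like against Wit's HTTP endpoint. This is only a sketch: the access token is a placeholder, and whatever intents come back depend entirely on how you set up your own Wit project.

Code:
// Minimal sketch: send already-recognized text to Wit's /message endpoint
// and read back the parsed intent/entities as JSON. WIT_TOKEN is a
// placeholder for your own server access token.
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;

public class WitClient {
    private static final String WIT_TOKEN = "YOUR_SERVER_ACCESS_TOKEN"; // placeholder

    public static String interpret(String spokenText) throws Exception {
        String query = URLEncoder.encode(spokenText, "UTF-8");
        URL url = new URL("https://api.wit.ai/message?q=" + query);
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestProperty("Authorization", "Bearer " + WIT_TOKEN);

        BufferedReader reader = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), "UTF-8"));
        StringBuilder body = new StringBuilder();
        String line;
        while ((line = reader.readLine()) != null) {
            body.append(line);
        }
        reader.close();
        conn.disconnect();

        // The response JSON holds the intent and entities Wit pulled out of
        // the phrase; parse it with your JSON library of choice and dispatch
        // on the intent name.
        return body.toString();
    }
}

On Android this has to run off the main thread (an AsyncTask or a background service), and you still need something to produce spokenText in the first place, which is exactly the part the X8 core does for Google Now.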
 

deepukrd · New member · Mar 28, 2014

xtagon said: (quoted above)

I was on the hunt for the same thing.
Is it possible to design an application that uses Motorola's X8 computing system to respond to our own hot word (key word) and invoke our application?
This always-listening feature opens the door to a whole different world of high-potential applications.
Any help would be greatly appreciated.
Thanks in advance :)
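
For what it's worth, the only software-only fallback I'm aware of is Android's standard SpeechRecognizer, which runs on the application processor rather than the X8's low-power core, so it is exactly the battery drain xtagon mentioned. A rough sketch (the hot word below is just an example):

Code:
// Software-only hot-word sketch using Android's standard SpeechRecognizer.
// This keeps the main CPU busy (not the X8's dedicated core), so it is only
// practical for short listening windows, and it needs the RECORD_AUDIO permission.
import android.content.Context;
import android.content.Intent;
import android.os.Bundle;
import android.speech.RecognitionListener;
import android.speech.RecognizerIntent;
import android.speech.SpeechRecognizer;
import java.util.ArrayList;

public class HotWordListener implements RecognitionListener {
    private static final String HOT_WORD = "ok assistant"; // example phrase
    private final SpeechRecognizer recognizer;

    public HotWordListener(Context context) {
        recognizer = SpeechRecognizer.createSpeechRecognizer(context);
        recognizer.setRecognitionListener(this);
    }

    public void startListening() {
        Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
        recognizer.startListening(intent);
    }

    @Override
    public void onResults(Bundle results) {
        ArrayList<String> matches =
                results.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION);
        if (matches != null) {
            for (String phrase : matches) {
                if (phrase.toLowerCase().contains(HOT_WORD)) {
                    // Hot word heard: launch your own activity or service here,
                    // or hand the rest of the phrase to Wit for interpretation.
                    break;
                }
            }
        }
        startListening(); // restart to keep "listening" -- this is what eats the battery
    }

    @Override public void onError(int error) { startListening(); }
    @Override public void onReadyForSpeech(Bundle params) {}
    @Override public void onBeginningOfSpeech() {}
    @Override public void onRmsChanged(float rmsdB) {}
    @Override public void onBufferReceived(byte[] buffer) {}
    @Override public void onEndOfSpeech() {}
    @Override public void onPartialResults(Bundle partialResults) {}
    @Override public void onEvent(int eventType, Bundle params) {}
}

In practice the recognizer only listens for a few seconds per session, so a true always-on hot word without the dedicated hardware ends up both clunky and power-hungry, which is presumably why Motorola kept it on the X8.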
 

maxmousee · Member · Dec 21, 2012 · Amsterdam · OnePlus 3, OnePlus 7

Same here. I tried decompiling Touchless Control, but it's not the correct app.
I think it's either in Motorola Services or Contextual Services (the latter seems more likely).
Decompiling those might reveal something.
Maybe it uses some JNI code to pin the work to that specific core.
I'm still just guessing :(
 
