Wed, Mar 14
It's awesome to see support for this integration. I can start putting together the Python side of a Mycroft Elisa-control skill (in a scratch repo?) from my end that builds on the existing MPRIS2 interface within Elisa; this will add basic voice control over the MPRIS calls. We could then extend the skill as Elisa expands its own D-Bus interface to interact with different parts of the application and handle more complex operations. I have gone ahead and created a very rough flow chart that might give a better view of how Elisa's D-Bus interface could be extended to add additional functionality, and how it all comes together: https://imgur.com/a/oHzk0
Tue, Mar 13
Sat, Mar 10
Something like "Mycroft, play me song xyz from the playlist/library" could be doable if Elisa can send over an object map of the current playlist via a D-Bus callback or some kind of JSON call. Mycroft would then handle the song name matching over an index (some index that matches title/tag/artist, or fingerprint matching from MusicBrainz) and call play over D-Bus. Something like "play me something for xyz mood" would probably require song tagging within Elisa itself.
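To illustrate the matching step, here is a minimal sketch of fuzzy-matching a transcribed title against a playlist map. The playlist dict and the track object paths are hypothetical stand-ins for whatever Elisa would actually send over D-Bus or JSON:

```python
from difflib import get_close_matches

def best_title_match(query, playlist):
    """Match a spoken/transcribed title against playlist titles.

    `playlist` is a hypothetical {title: track_object_path} map, e.g.
    what Elisa could push over a D-Bus callback or a JSON call. Returns
    (title, path) of the closest match, or None if nothing is close.
    """
    titles = list(playlist)
    # Case-insensitive fuzzy match to tolerate speech-to-text errors.
    hits = get_close_matches(query.lower(),
                             [t.lower() for t in titles],
                             n=1, cutoff=0.6)
    if not hits:
        return None
    # Map the lowercased hit back to the original title.
    for title in titles:
        if title.lower() == hits[0]:
            return title, playlist[title]

playlist = {
    "Bohemian Rhapsody": "/org/kde/elisa/track/1",
    "Another One Bites the Dust": "/org/kde/elisa/track/2",
}
match = best_title_match("bohemian rapsody", playlist)
# match -> ("Bohemian Rhapsody", "/org/kde/elisa/track/1")
```

The matched object path could then be handed to whatever play-by-track call Elisa ends up exposing on its D-Bus interface.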
It sounds like a cool idea to have Mycroft-Elisa integration for voice control. If Elisa already implements MPRIS2, it should be rather easy to have Mycroft talk to Elisa over D-Bus for basic actions like play current song, pause, stop, next and previous (example: https://github.com/AIIX/amarok-player-skill). But for deeper integration, like searching by song name, playing a matched song from the library, playing a selected genre, or more natural user queries like "play some song based on my mood", the integration would have to go far beyond what the MPRIS D-Bus interface currently provides.
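For the basic actions, a sketch of driving a standard MPRIS2 player from Python via `dbus-send` might look like the following. The bus name `org.mpris.MediaPlayer2.elisa` is an assumption here (MPRIS2 mandates the `org.mpris.MediaPlayer2.<name>` pattern, but the exact suffix should be checked against a running Elisa):

```python
import subprocess

# Assumed MPRIS2 bus name for Elisa; verify against a running instance.
ELISA_BUS = "org.mpris.MediaPlayer2.elisa"
MPRIS_PATH = "/org/mpris/MediaPlayer2"
PLAYER_IFACE = "org.mpris.MediaPlayer2.Player"

def mpris_command(method):
    """Build the dbus-send argv for a standard MPRIS2 Player method
    (Play, Pause, Stop, Next, Previous, PlayPause)."""
    return [
        "dbus-send", "--session", "--type=method_call",
        "--dest=" + ELISA_BUS, MPRIS_PATH,
        PLAYER_IFACE + "." + method,
    ]

def send(method):
    """Fire the method call on the session bus (needs a running Elisa)."""
    subprocess.run(mpris_command(method), check=True)

# Usage (requires a session bus with Elisa on it):
# send("PlayPause")
```

A real skill would more likely use a D-Bus binding such as dbus-python rather than shelling out, but the interface, object path, and method names are the same either way.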