Eloquence on Android first impressions: CodeFactory still doesn't seem to know how to make an Android TTS voice properly, because it doesn't interrupt as it should at fast rates, same as Vocalizer. With RHVoice, if I flick through an app quickly, it starts speaking the start of every string as I flick. Eloquence, though? If I flick twice super fast, it'll just keep reading the very first string until it's done, and then move on to the final one. #YouHadOneJob
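As a rough illustration of what proper interruption means: the engine's synthesis loop needs to check for a stop request between small audio chunks, not just once per utterance. A toy C++ sketch (nothing to do with CodeFactory's actual code; all the names here are made up):

#include <atomic>
#include <iostream>
#include <sstream>
#include <string>

// Illustrative only: a toy "engine" that speaks word by word and checks a stop
// flag between chunks, which is what lets a new utterance cut off the old one.
class ToySynth {
public:
    void stop() { stopRequested = true; }

    void synthesize(const std::string& text) {
        stopRequested = false;
        std::istringstream words(text);
        std::string word;
        while (words >> word) {
            // If this check only happened once per utterance, you'd get exactly
            // the behaviour described above: the first string plays to the end.
            if (stopRequested) return;
            std::cout << word << ' ' << std::flush; // stand-in for emitting audio
        }
        std::cout << '\n';
    }

private:
    std::atomic<bool> stopRequested{false};
};

int main() {
    ToySynth synth;
    synth.synthesize("the quick brown fox");
    // In a real engine, stop() would be called from the TTS service thread
    // the moment the screen reader sends the next string.
}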

in reply to Jamie Teh

@jcsteh @Scott @KaraLG84 I wish I'd recorded it, but they once sent me a build that was so responsive that turning one of the controls at high speed read the beginning of everything you passed through. I said 'I want that one! Give me that one!' They didn't. Now if you turn too quickly, you just miss things, because I guess (not being a programmer) it has a poll rate or some shit, so if you're over it, it skips. I want a responsiveness slider. Give me a responsiveness slider.
in reply to Andre Louis

@FreakyFwoof @Scott @KaraLG84 Rule number 1 in speech-related accessibility: responsiveness is of utmost importance, almost above everything else, unless prioritising it would cause the information to be inaccurate. I don't know why so many people struggle to understand this, especially since the same people would probably be horrified if their buttery-smooth scrolling had even the slightest hiccup or dropped a single frame. Apple could take a lesson here too.
in reply to Andre Louis

@FreakyFwoof @Scott @KaraLG84 I used to think this delay was some communication lag between KK and the host, but it isn't, because transport buttons, etc. respond very quickly. With OSARA, the DAW buttons end up reading faster than KK's own controls do. That means it's very likely an artificial delay, which is utterly infuriating. It annoyed me enough that I spent a little bit of time trying to see if there was some way I could reverse engineer their protocol and write my own helper, but I didn't get anywhere.
in reply to Chi Kim

@chikim @Scott @FreakyFwoof @KaraLG84 Makes sense. That's honestly what I thought, but I got the impression somewhere from NI that it used auto-mapping features in some DAWs or something, and I assumed those mappings were somehow more sensible. Apparently, I misled myself. On the flip side, this should be fairly straightforward to implement in REAPER. Famous last words.
in reply to Jamie Teh

@jcsteh @chikim @FreakyFwoof @KaraLG84 Hmm, I thought they were doing some nice sorting or something as well. So two questions:
1. Is using the controller to twiddle DAW params still subject to the same delayed speech response?
2. If there's nothing complicated happening, does that mean ReaKontrol could do this with hardware other than MK3, Jamie?
in reply to Scott

@Scott Regarding 2, I wondered that too. The issue is that ReaKontrol doesn't get feedback from the knobs and page buttons unless you're in the mixer view, at which point the user probably wants to use them for the mixer. We could probably have a "fake" mixer view which gets enabled when you press the 4D encoder while in the mixer view or something like that, but it's definitely weird. Plus ReaKontrol doesn't get anything when you touch the knobs, only when you turn them, so you couldn't query the value. @chikim @FreakyFwoof @TheQuinbox @KaraLG84
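To make the shape of that idea concrete, here's a toy sketch; the event handlers and speak() are hypothetical stand-ins, not ReaKontrol's actual code:

#include <iostream>
#include <string>

// Toy sketch of the "fake mixer view" idea. onEncoderPress and onKnobTurn are
// made-up callbacks; speak() stands in for whatever speech output the extension uses.
struct FakeMixerToggle {
    bool paramMode = false; // when true, knobs twiddle params instead of the mixer

    void speak(const std::string& msg) { std::cout << msg << '\n'; }

    void onEncoderPress() {
        // Only reachable from mixer view, since that's the only time ReaKontrol
        // gets knob and page button feedback at all.
        paramMode = !paramMode;
        speak(paramMode ? "parameter mode" : "mixer mode");
    }

    void onKnobTurn(int knob, int delta) {
        if (!paramMode) return; // let the normal mixer handling run
        // Adjust and announce the mapped parameter here. There's no touch event,
        // only turns, so you can't just query a value without moving it.
        speak("knob " + std::to_string(knob) + " moved by " + std::to_string(delta));
    }
};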
in reply to Jamie Teh

@Scott Personally, I don't buy NI's argument that this stuff is only possible with Mk3. Mk3 does more of it on the keyboard hardware itself, sure, but that's no reason they couldn't have done it with Mk2. They'd have to move the display and accessibility rendering out of the KK plugin and into a separate service, but it should still be possible. I think they just chose not to. Of course, I'm not an NI dev, so maybe there's something I'm missing. @chikim @FreakyFwoof @TheQuinbox @KaraLG84
in reply to Jamie Teh

@jcsteh @Scott @chikim @FreakyFwoof @KaraLG84 Yeah maybe it's just a matter of resources. I get the impression they're trying to make their actual software more accessible without the keyboard as well, latest Kontakt update is a good example of this. I can now at least browse presets comfortably without using KK, but I hope they'll make editing accessible from the GUI as well.
in reply to Scott

@Scott If there aren't, that seems like something that could probably be done in ReaKontrol - it would just pretend that only the parameters you chose exist in the order you chose them - but the question is how to set up those templates. I actually wonder whether it might be worth exposing a fake plugin which gives you all the parameters you added to the TCP. That way, you could use track templates to get you just the parameters you regularly use. @chikim @FreakyFwoof @TheQuinbox @KaraLG84
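As a very rough sketch of the "pretend only the chosen parameters exist" part: a small virtual FX wrapper that maps controller knobs onto an ordered list of chosen parameters. The get/set callbacks stand in for REAPER's TrackFX_GetParamNormalized / TrackFX_SetParamNormalized; everything else is made up for illustration:

#include <algorithm>
#include <functional>
#include <string>
#include <vector>

// One entry per parameter the user chose to expose.
struct ChosenParam {
    int fxIndex;        // which FX on the track
    int paramIndex;     // which parameter within that FX
    std::string label;  // what to announce, e.g. "Freq-Low Shelf"
};

class VirtualFx {
public:
    // get/set are placeholders for the real REAPER extension API calls
    // (TrackFX_GetParamNormalized / TrackFX_SetParamNormalized).
    VirtualFx(std::vector<ChosenParam> params,
              std::function<double(int, int)> get,
              std::function<void(int, int, double)> set)
        : params_(std::move(params)), get_(std::move(get)), set_(std::move(set)) {}

    int numParams() const { return static_cast<int>(params_.size()); }
    const std::string& label(int i) const { return params_[i].label; }

    // A controller knob nudges the Nth *chosen* parameter; everything else is invisible.
    void nudge(int i, double delta) {
        const ChosenParam& p = params_[i];
        double v = std::clamp(get_(p.fxIndex, p.paramIndex) + delta, 0.0, 1.0);
        set_(p.fxIndex, p.paramIndex, v);
    }

private:
    std::vector<ChosenParam> params_;
    std::function<double(int, int)> get_;
    std::function<void(int, int, double)> set_;
};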
in reply to Scott

@Scott Oh haha, I didn't realise they didn't have a UI for it. Doing it with config files wouldn't be too hard, but how non-OSARA users would get the parameter numbers is an interesting question. Honestly, it probably wouldn't be that hard to build a UI for it either. I'd just rather do... just about anything else. @chikim @FreakyFwoof @TheQuinbox @KaraLG84
in reply to Scott

@Scott You could do that, I guess, though it makes the config a bit harder to deal with.
VST: ReaEQ (Cockos) = Freq-Low Shelf | Gain-Low Shelf | BW-Low Shelf
Maybe it needs to be YAML or something like that so we can do items on separate lines:
"VST: ReaEQ (Cockos)":
  - Freq-Low Shelf
  - Gain-Low Shelf
  - BW-Low Shelf
But anyway, this is a way off, so no idea why I'm musing on it.
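Still, for what it's worth, even the flat format would be trivial to parse. A quick illustrative sketch that turns one of those lines into an FX-name-to-parameter-list map (the format itself is just the musing above, nothing agreed):

#include <iostream>
#include <map>
#include <string>
#include <vector>

// Parse lines of the form
//   VST: ReaEQ (Cockos) = Freq-Low Shelf | Gain-Low Shelf | BW-Low Shelf
// into a map from FX name to an ordered list of parameter names.
static std::string trim(const std::string& s) {
    const auto start = s.find_first_not_of(" \t");
    if (start == std::string::npos) return "";
    const auto end = s.find_last_not_of(" \t");
    return s.substr(start, end - start + 1);
}

std::map<std::string, std::vector<std::string>> parseConfig(const std::vector<std::string>& lines) {
    std::map<std::string, std::vector<std::string>> result;
    for (const std::string& line : lines) {
        // Split on the first '=' so the FX name can keep its "VST:" prefix and spaces.
        const auto eq = line.find('=');
        if (eq == std::string::npos) continue;
        const std::string fxName = trim(line.substr(0, eq));
        std::vector<std::string> params;
        const std::string rest = line.substr(eq + 1);
        size_t pos = 0;
        while (true) {
            const auto bar = rest.find('|', pos);
            params.push_back(trim(rest.substr(pos, bar - pos)));
            if (bar == std::string::npos) break;
            pos = bar + 1;
        }
        result[fxName] = params;
    }
    return result;
}

int main() {
    auto cfg = parseConfig({"VST: ReaEQ (Cockos) = Freq-Low Shelf | Gain-Low Shelf | BW-Low Shelf"});
    for (const auto& name : cfg["VST: ReaEQ (Cockos)"]) std::cout << name << '\n';
}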
@chikim @FreakyFwoof @TheQuinbox @KaraLG84