CV out fluctuations
I've had the Eloquencer for a month or so and have really enjoyed it.
Recently, however, I'm having an issue where CV output voltages seem to randomly fluctuate by approximately one semitone. If I set track A to produce a steady sequence of C1 notes, and I route those gates/tones to a synth voice, I occasionally hear a slightly higher tone. This seems to occur randomly on one or two notes out of every 16.
* This only occurs on track A. If I copy or set up the same constant sequence on B, C, D, etc., I hear constant tones.
* I verified that under CV, 'Var Prob' is set to 0%.
* I tested with multiple synth voices, and I hear the same occasional higher tone.
* I routed the CV out to a scope and see a steady 1.0V, jumping to 1.08V when the random fluctuations occur. The fluctuations always seem to be ~0.08V higher than the programmed tone. If I transpose all the C1 values to C2, for example, I then see the voltage fluctuate from 1.16V to 1.24V. (Values approximate... I am using o_c as a scope with only 2 digits of displayed precision.)
* I recently installed an EME expander and programmed track A to be sent to MIDI track 1. I never noticed the issue before that and thought it might be related, but after programming track A not to go to the EME, the problem persists.
* I tried disconnecting the EME expander from the Eloquencer, and the problem persists.
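For what it's worth, the ~0.08V offset matches exactly one semitone at 1V/oct (1/12 V ≈ 0.083V). A quick sanity check on the arithmetic (the values are just my scope readings from above):

```python
# 1 V/oct: one octave = 1.0 V, so one semitone = 1/12 V
SEMITONE_V = 1.0 / 12.0

c1 = 1.0                     # programmed C1 output
glitch = c1 + SEMITONE_V     # the occasional higher reading

print(round(SEMITONE_V, 3))  # 0.083
print(round(glitch, 2))      # 1.08 -- matches what I see on the scope
```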
(Hope I'm not doing something dumb via some obscure/unknown setting, but...) this seems like an issue with voltage tracking on my unit's track A output.
ps. I'm running firmware v_1_3_8
Update: I had CV1 assigned to CV Add, Track A, with no patch cable connected to the CV1 input.
Removing the CV1 assignment fixed the problem!
Still seems like a potential issue (e.g., no patch cable attached to CV1 should be equivalent to 0V and have no effect on the associated tracks, no?), but I'm happy to have a workaround and will avoid using that CV assignment feature unless I'm actually sending control voltages.
Another update: I did some experimentation, and with CV1 assigned to CV Add, Track A, and a patch cable connected to a constant voltage source, I can still reproduce the issue.
Is this an artifact of CV Add + quantization, perhaps even a rounding 'bug'?
When the voltage on CV1 goes even slightly above 0.0, and that voltage is added to the 1.0V associated with C1, it looks like the quantizer (scale set to semitone) is bumping the output voltage up to the next semitone. As a result, random slight environmental variations (of hundredths of a volt) cause the quantizer to bump the note up a semitone, causing the apparent random variations.
Shouldn't the quantizer map the presented voltage to the closest semitone (e.g. 1.01V -> 1.00V, while 1.05V -> 1.083V), rather than mapping values barely above 1.0V up to the next higher semitone (1.01V -> 1.083V)?
In other words, are the quantization thresholds equidistant between the semitone values? Should they be? (Assuming the observations above are correct.)
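To illustrate what I mean, here's a sketch of the two behaviors (my own illustration, not the actual Eloquencer code):

```python
import math

SEMITONE_V = 1.0 / 12.0  # semitone step at 1 V/oct

def quantize_nearest(volts):
    """Round to the nearest semitone (thresholds at midpoints)."""
    return round(volts / SEMITONE_V) * SEMITONE_V

def quantize_up(volts):
    """Hypothetical '>' style quantizer: any voltage even slightly
    above a semitone snaps up to the next one."""
    return math.ceil(volts / SEMITONE_V - 1e-9) * SEMITONE_V

# 1.01 V: round-to-nearest keeps C1, the '>' version jumps to C#1
print(round(quantize_nearest(1.01), 3))  # 1.0
print(round(quantize_up(1.01), 3))       # 1.083
```

With midpoint thresholds, a few hundredths of a volt of noise above 1.0V would be harmless; with the '>' style, it bumps the note.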
Hi again @pcharles !
Thanks for the detailed feedback and sorry for the silence, I have been focusing on new stuff and then holidays came.
I will have to add you to the betatesters group, you have potential 😀
This can be explained starting from the original purpose of CV in 1 and CV in 2. They were designed (in hardware) as inputs to control generic parameters, not to track pitch. After releasing the first Eloquencer batch, many users asked for the CV inputs to work on pitch-related duties. So I decided to implement it but, as you perfectly detected, in some units the 0 detection fluctuates between C0 and C#0 because sometimes the ADC that reads the CV inputs can't reach zero. We warned about that in the SW release notes and the manual:
"This tuning will allow the incoming CV IN signals to be interpreted as a V/oct signal. Bear in mind that these CV inputs were not originally intended for pitch tracking and some unexpected behavior can happen in some occasions like some notes jumping to the contiguous semitone."
Actually, that sentence was only referring to the C0 <> C#0 interval.
I have tried some easy hardware fixes that a DIYer could do at home with minimal material, but I haven't come up with an easy solution.
Thanks for the response.
Happy to hear about the DIY h/w fix, if you have a link or some quick info. Otherwise, not a pressing issue for me, so no need to spend more time on this.
I'm still curious whether this could be addressed with a slightly different rounding algorithm in the quantizer portion of the software? E.g., rather than what appears to be '>' type boolean logic, round voltages to the nearest target note (i.e., use voltage thresholds at the midpoints between target notes, as mentioned in the 4th post above). Maybe that's an oversimplification, or not how it works internally?
@pcharles That's what I'm doing in the code: when receiving the digital value (coming from the analog voltage sampled by the ADC), the algorithm looks for the nearest value in a semitone table. The problem is not in the code, it's in the hardware. The code can't see any difference between C0 and C#0 (only in some units). If you send a C0, the digital value is 4095; if you send a C#0, the digital value is also 4095, so both C0 and C#0 are detected as C0 (or C#0). If you send a D0, the value is, let's say, 4084, which is detected as D0. Does it make sense?
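To paraphrase with a sketch (the 4095/4084 codes are the illustrative values from the post above; the table and lookup are my assumptions, not the actual firmware):

```python
# Nearest-neighbor lookup in a semitone table, as described above.
# On a 12-bit ADC (codes 0..4095), in some units both C0 and C#0
# saturate at 4095, so the lookup can't tell them apart.

# Hypothetical table of expected ADC codes for the first few notes
SEMITONE_CODES = {"C0": 4095, "C#0": 4095, "D0": 4084, "D#0": 4073}

def detect_note(adc_value):
    """Return the note whose table code is nearest the ADC reading."""
    return min(SEMITONE_CODES, key=lambda n: abs(SEMITONE_CODES[n] - adc_value))

print(detect_note(4084))  # D0 -- distinct code, detected correctly
print(detect_note(4095))  # C0 -- C#0 collides with the same code, so it's ambiguous
```

No rounding strategy in software can separate two notes that arrive as the same ADC code, which is why midpoint thresholds wouldn't help for that interval.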
Understood. Thank you.