Description
Something is not right with the phase correction and/or calibration process.
In Mk2_3phase_RFdatalog_temp, the function processCurrentRawSample corrects the phase difference with an IIR EMA filter that delays the current channel. Alpha is set to 0.002, but by my count it should be at least 0.85 to obtain a 1/6-sample delay when sampling 6 channels. On top of that, lpf_gain is zero, so this correction is disabled by default.
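
For context, here is my reading of why alpha = 0.002 cannot give the required shift: the low-frequency group delay of a single-pole EMA y[n] = y[n-1] + alpha·(x[n] − y[n-1]) is roughly (1 − alpha)/alpha samples, so alpha = 6/7 ≈ 0.857 gives a 1/6-sample delay while alpha = 0.002 would delay by about 500 samples. A minimal sketch (illustrative names, not the repo's code):

```cpp
// Illustrative sketch (not the repo's code): a single-pole EMA used as a
// fractional-sample delay on the current channel.
//   y[n] = y[n-1] + alpha * (x[n] - y[n-1])
// Its low-frequency group delay is roughly (1 - alpha) / alpha samples.
#include <cstdio>

static float emaDelaySamples(float alpha)
{
  return (1.0f - alpha) / alpha;
}

int main()
{
  std::printf("alpha = 0.002 -> delay ~ %.0f samples\n", emaDelaySamples(0.002f));      // ~499 samples
  std::printf("alpha = 6/7   -> delay ~ %.3f samples\n", emaDelaySamples(6.0f / 7.0f)); // ~0.167 = 1/6 sample
  return 0;
}
```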
Then, looking at cal_CTx_v_meter, processCurrentRawSample no longer delays the current but the voltage instead, and the delay is now applied with a 2-tap FIR filter. i_phaseCal is set to its maximum, so the phase correction is disabled here too. With 6 channels, i_phaseCal should be at least 256/6.
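
To illustrate why a maxed-out i_phaseCal amounts to no correction, here is a minimal sketch of a 2-tap FIR interpolator in 8-bit fixed point (my assumed form, not necessarily the exact expression in the repo): with i_phaseCal = 256 the output is simply the latest voltage sample.

```cpp
// Illustrative sketch (assumed form, not necessarily the repo's exact code):
// 2-tap FIR interpolation between the previous and the latest voltage sample,
// with i_phaseCal as an 8-bit fixed-point factor (256 == 1.0).
#include <cstdint>
#include <cstdio>

static int32_t phaseShiftV(int32_t lastV, int32_t v, int32_t i_phaseCal)
{
  // i_phaseCal == 256 returns the latest sample unchanged (no phase shift);
  // values below 256 move the output back towards the previous sample.
  return lastV + (((v - lastV) * i_phaseCal) >> 8);
}

int main()
{
  const int32_t lastV = 100, v = 200;
  std::printf("i_phaseCal = 256 -> %ld (no shift)\n",
              static_cast<long>(phaseShiftV(lastV, v, 256)));
  std::printf("i_phaseCal = 128 -> %ld (half-sample delay)\n",
              static_cast<long>(phaseShiftV(lastV, v, 128)));
  return 0;
}
```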
There's also f_phaseCal, but it's not used.
I don't see how the metering can be accurate with no phase correction.
One ADC conversion takes 104 µs, so there is at least a 104 µs offset between the voltage and current samples of each phase.
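
For scale, a quick back-of-the-envelope check (assuming a 50 Hz grid) shows that a 104 µs offset is roughly 1.9° of phase error:

```cpp
// Back-of-the-envelope: phase error caused by a 104 us offset at 50 Hz.
#include <cstdio>

int main()
{
  const double period_us = 20000.0; // 50 Hz mains -> 20 ms period
  const double offset_us = 104.0;   // one ADC conversion slot
  std::printf("Phase error: %.2f degrees\n", 360.0 * offset_us / period_us); // ~1.87 degrees
  return 0;
}
```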