I noticed after installing the energy monitor that the readings in my utility company’s app were consistently about 2% higher than the Emporia’s. I figured the Emporia might be more accurate than the power company’s meter, because it seemed unlikely that both the individual sensors and the main sensors would be off by the same amount (the balance value was around zero, i.e. the main CTs agreed with the sum of the individual circuits).
I paid to have someone from the power company come out and test the meter with an expensive-looking calibrated instrument, and they showed me it was within a fraction of a percent of what their meter was reading, so the meter itself is accurate.
To compensate for this, I added a multiplier for all of the CTs in the app, including the main ones (1.02 for 120 V circuits and 2.04 for 240 V circuits).
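For what it’s worth, here’s roughly how I arrived at those numbers. The kWh values below are just placeholders, not my actual usage, and I’m assuming the 2.04 is simply the existing 2× multiplier on 240 V circuits times the same 1.02 correction:

```python
# Rough arithmetic behind the multipliers I entered.
# The kWh figures are placeholders, not my actual usage.
utility_kwh = 1000.0    # what the power company's app reported for the period
emporia_kwh = 980.4     # what the Emporia app reported for the same period

factor = utility_kwh / emporia_kwh       # ~1.02, i.e. the Emporia was reading ~2% low
multiplier_120v = round(factor, 2)       # 1.02 for the 120 V circuits
multiplier_240v = round(2 * factor, 2)   # 2.04 for 240 V circuits (assuming they already carry a 2x multiplier)

print(multiplier_120v, multiplier_240v)  # 1.02 2.04
```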
Since I made that change, the daily and monthly usage in the power company’s app is spot on with Emporia’s.
There are no un-monitored circuits. I know the accuracy can vary a little, but why would it consistently under-record by 2%?
Also, the app lets you enter a multiplier to a tenth of a percent, for example 1.023, but it doesn’t seem to save that level of detail.
I believe it was rounding off the values for the individual CTs as well as the values I put in for the main CTs, and always rounding down (because of a bug, perhaps).
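To put a rough number on how much that rounding could matter, here’s a quick back-of-the-envelope calculation (the monthly total is made up, not my real data):

```python
# Back-of-the-envelope estimate of what the rounding costs.
# The monthly kWh figure is a placeholder, not my real data.
entered_factor = 1.023   # the multiplier I tried to enter
stored_factor = 1.02     # what the app appears to keep after rounding down

monthly_kwh = 1500.0     # hypothetical month of uncorrected Emporia readings
shortfall = monthly_kwh * (entered_factor - stored_factor)
print(f"about {shortfall:.1f} kWh unaccounted for over the month")  # about 4.5 kWh
```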