The CSV file shows votes per state over time for the last few days.
CSV
http://s000.tinyupload.com/?file_id=14566999163598825215
Python script
Issue:
The CSV file has vote-share columns that are used to calculate each candidate's vote count. Those columns carry only 3 decimal places of precision, which leads to vote shaving; they need at least 7 decimal places of precision.
Take a state total of 2,690,400 votes.
A vote share of .493 for Trump gives 1,326,367 votes; a vote share of .502 for Biden gives 1,350,581 votes.
Now change Trump's vote share to .492 and leave Biden's unchanged. This happens all over the place, and by bigger margins.
Trump now has 1,323,677 votes, losing 2,690 votes.
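The shaving effect above is easy to reproduce. A minimal sketch, assuming the candidate totals are reconstructed as `total_votes * share` with the published share rounded to 3 decimal places (the function and variable names here are illustrative, not from the actual feed):

```python
# Sketch: how much one tick of a 3-decimal vote share moves a vote count.
# Assumption: totals are reconstructed as round(total * share).
TOTAL = 2_690_400  # state vote total from the example above

def votes_from_share(share: float) -> int:
    """Reconstruct a candidate's vote count from a rounded share column."""
    return round(TOTAL * share)

# One tick (0.001) of a 3-decimal share moves thousands of votes:
shift_3dp = votes_from_share(0.493) - votes_from_share(0.492)
print(shift_3dp)  # 2690

# At 7 decimal places, one tick (0.0000001) moves well under one vote:
granularity_7dp = TOTAL * 1e-7
print(granularity_7dp)  # ~0.27 votes
```

With 3 decimals, the smallest representable change in share is worth roughly `TOTAL / 1000` votes, so any truncation or rounding in the share column silently adds or removes votes on that scale.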
How can any data engineer use only 3 digits of precision for calculations like this, unless you want rounding errors to hide vote shaving and use them as cover for an "oversight" or "glitch"?
This manipulation and the large vote dumps only show up in key battleground states and a few others.
Does anyone have any time series data by precinct?
This data being used by all media outlets is bunk.
Edison Research collects data themselves from multiple sources, bypassing the Secretary of State(s).
They aren't the authoritative source.
Yes. The Secretary of State publishes it.
I can easily find it for my state.
No, it’s done by each state, separately.
Edison Research is a VENDOR to the Misleadia, not an official agency.