A CSV file showing votes per state over time for the last few days.
CSV:
http://s000.tinyupload.com/?file_id=14566999163598825215
Python script
Issue:
The CSV file has vote-share columns that are used to calculate each candidate's votes. Those columns carry only 3 decimal places of precision, which leads to vote shaving; they need at least 7 decimal places of precision.
Take a total of 2,690,400 votes.
A vote share of .493 for Trump (1,326,367 votes) and a vote share of .502 for Biden (1,350,581 votes).
Now change Trump's vote share to .492 and leave Biden unchanged. This happens all over the place, and by bigger margins.
Now Trump has 1,323,677 votes, losing 2,690 votes.
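The arithmetic above can be reproduced with a few lines of Python. This is just a sketch, assuming the counts are reconstructed as round(total * share) from the published 3-decimal shares; it matches the figures quoted here.

```python
total = 2_690_400  # reported total votes in the example

def votes_from_share(share):
    # Reconstruct a candidate's count from the published vote share.
    return round(total * share)

print(votes_from_share(0.493))  # 1,326,367 (Trump at .493)
print(votes_from_share(0.502))  # 1,350,581 (Biden at .502)
print(votes_from_share(0.492))  # 1,323,677 (Trump after the .001 tick)

# One tick in the last published decimal place moves ~0.001 * total votes:
print(votes_from_share(0.493) - votes_from_share(0.492))  # 2,690
```

So with 3-decimal shares, the smallest possible change in a share moves about 2,690 votes at this total, and the rounding swallows anything smaller.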
How can any data engineer use only 3 digits of precision for calculations like this, unless you want rounding errors to hide vote shaving and use them as cover for an "oversight" or "glitch"?
This manipulation and the large vote dumps are only showing up in key battleground states and a few others.
Does anyone have any time series data by precinct?
This data being used by all media outlets is bunk.
Why are they using percentages instead of counting and providing the exact number of votes? This is so fucked up. What kind of coding is this?
You're thinking like a coder wanting accurate data, not a coder wanting malleable data.
Exactly, I don't get it either.
No reason to do this unless you are trying to hide vote shaving, normalizing the totals and hiding the change behind rounding errors.
I would fire any IT developer or DB engineer who used only 3-digit precision to calculate numbers like this and thought it made sense.
Seems like they were using float16s to save memory. You know, energy saving, green vote counting using the percent methodology that anyone with a 4th grade education wouldn't use unless they were trying to act like Richard Pryor in Superman III.
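For what it's worth, half precision genuinely can't carry numbers at this scale. A quick stdlib check (purely illustrative; there's no evidence the feed actually uses float16):

```python
import struct

def to_float16(x):
    # Round-trip a value through IEEE 754 half precision ('e' format).
    return struct.unpack('e', struct.pack('e', x))[0]

# A 3-decimal share isn't even representable in float16:
print(to_float16(0.493))  # 0.492919921875, off by ~0.00008

# And a statewide total blows past float16's max of 65,504:
try:
    struct.pack('e', 2_690_400.0)
except OverflowError:
    print("2,690,400 overflows half precision entirely")
```

Even the representation error on the share alone (~0.00008) would shift over 200 votes at a 2.69M total, so float16 would make the rounding problem worse, not better.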
Common core coding
Yes