We all know how terrible Marx's understanding of economics was, but how many people knew that he was a terrible mathematician, as well? Here, behold this genius trying to define 0/0:
https://www.researchgate.net/publication/255609552_Calculus_A_Marxist_approach
"In calculating the derivative of a function from first principles, Marx did not like the notion of a limit. When considering (f(x) − f(a))/(x − a) he wanted to put x = a, after suitable cancelling or some other algebraic simplification, and write the result as 0/0. He did not see 0/0 as a fraction; he saw it as one symbol."
For those who don't know, 0/0 is undefined because it literally can't have any consistent definition. So defining it is equivalent to starting your reasoning with a premise like "x and not x", which would absurdly imply that everything is true.
As Wikipedia puts it, if b ≠ 0 then the equation a/b = c is equivalent to a = b × c. Assuming that a/0 is a number c, then it must be that a = 0 × c = 0. However, the single number c would then have to be determined by the equation 0 = 0 × c, but every number satisfies this equation, so we cannot assign a numerical value to 0/0.
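A quick sanity check of that argument (plain Python, nothing fancy): every candidate value of c satisfies 0 = 0 × c, so the equation determines nothing.

```python
# The equation 0 = 0 * c does not pin down a unique c:
# every candidate satisfies it, so 0/0 has no single consistent value.
candidates = [-2, -1, 0, 1, 2, 3.5, 10**9]
solutions = [c for c in candidates if 0 == 0 * c]
print(solutions == candidates)  # True: all of them "solve" it
```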
https://en.wikipedia.org/wiki/Division_by_zero
Weird how they ... somehow forgot ... to mention that in his list of math publications, though:
https://en.wikipedia.org/wiki/Mathematical_manuscripts_of_Karl_Marx?wprov=sfla1
So if any of you are unfortunate enough to know Marx bros, you can point out that he was a terrible mathematician as well as a terrible economist.
Not going to defend Marx, but Wikipedia is not a credible source and you're basing the contradiction on an assumption that 0 = 0 x c, where c is a/0. It stands to reason that this is faulty logic, because 0 x c = a.
Division by zero can be done with a mathematical construct, just like how the square root of negative numbers can be done by expanding into the complex numbers by defining i x i = -1.
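As an aside, that complex extension is already baked into Python's standard cmath module:

```python
import cmath

# Over the reals, sqrt(-4) is undefined; over the complex numbers it isn't,
# because i is defined so that i * i = -1.
z = cmath.sqrt(-4)
print(z)      # 2j
print(z * z)  # (-4+0j)
```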
The problem is that defining 0/0 leads to inconsistencies and Marx does nothing to address them here.
I quoted that bit of Wikipedia because it's simple enough for anyone with basic arithmetic to understand.
The problem isn't that you can't define 0/0 to be some number; the problem is that it has too many possible answers. It's the reverse of the operation x * 0, which is 0 for all x. Because any number could've gone into the equation, any number should be able to come out of it when you reverse it, so trying to assign it a meaning fails: anything could be the answer.
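In code: multiplying by zero collapses every input to the same output, so there is no way to invert it.

```python
# x * 0 maps every input to 0, so "undoing" it (i.e. computing 0/0)
# would have to pick one preimage -- but every number is a valid preimage.
inputs = [-5, 0, 7, 42]
outputs = {x * 0 for x in inputs}
print(outputs)  # {0}: four different inputs, one output
```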
It's like trying to do logic starting with the premise A and not-A. You can derive anything from that premise, so it tells you nothing whatsoever.
Your mistake is assuming x * 0 = 0, because that leads to a contradiction if 0/0 is defined. If x = a/0, then x * 0 = a.
We don't define 0/0 because it contradicts the field axioms. More specifically, we would call that form "indeterminate" rather than assigning it any value.
https://mathworld.wolfram.com/Indeterminate.html
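"Indeterminate" here means that limits of the 0/0 shape can come out as anything. A small numerical illustration (my own example, not from the MathWorld page): f(x) = kx/x takes the 0/0 form at x = 0, yet its limit there is k, for any k you like.

```python
# (k*x)/x tends to the 0/0 form as x -> 0, but the limit is k --
# a different answer for every k, which is why the form is "indeterminate".
def f(k, x):
    return (k * x) / x

for k in [1, 2, -7]:
    print(k, f(k, 1e-9))  # the ratio is already (approximately) k
```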
You can prove that x * 0 = 0 from the field axioms:
https://mathworld.wolfram.com/FieldAxioms.html
Start by applying the distributive axiom to the expression x * (y + 0): we find that x * (y + 0) = x*y + x*0.
But y + 0 = y by the additive identity axiom, so x * (y + 0) = x*y.
So x*y = x * (y + 0) = x*y + x*0.
Ignoring the middle expression, we're left with the relation x*y = x*y + x*0.
Using the additive inverse axiom, subtract x*y from both sides, which gives 0 = x*0.
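The chain of steps above, written out as one derivation:

```latex
\begin{align*}
x(y+0) &= xy + x \cdot 0 && \text{(distributivity)} \\
x(y+0) &= xy             && \text{(since } y + 0 = y \text{)} \\
xy &= xy + x \cdot 0     && \text{(combining the two)} \\
0 &= x \cdot 0           && \text{(add } -(xy) \text{ to both sides)}
\end{align*}
```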
Yeah, but those axioms only work when x/0 isn't a thing by convention. Not having x/0 be a thing is a lesser of two evils type of trade. You could make x/0=0, then we just change how we do math.
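For what it's worth, some proof assistants (Lean, for example) do take exactly that route and define x/0 = 0 so that division is total. A toy Python sketch of the trade-off:

```python
def total_div(a, b):
    # Division made total by convention: a/0 is *defined* to be 0.
    # Nothing is undefined anymore, but familiar identities such as
    # (a/b) * b == a now fail whenever b == 0.
    return 0 if b == 0 else a / b

print(total_div(6, 3))           # 2.0
print(total_div(6, 0))           # 0
print(total_div(6, 0) * 0 == 6)  # False: the round-trip identity breaks
```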
If 0 x c = a, then by definition a=0.