I'm defining a Rational class, so for example a = Rational(1,2) #1/2 and b = Rational(2,3) #2/3, and I want to have c = a + b so that c = Rational(7,6) #7/6. My code so far is:
class Rational(object):
    def __init__(self, v1, v2):
        self.value = v1/v2
    def __add__(self, value2):
        return Rational(self.value + value2.value)
a = Rational(1,2)
b = Rational(2,3)
c = a+b
But I get a TypeError saying that __init__ takes exactly 3 arguments (2 given). Where did I go wrong in the code above, please? Thank you!
According to your class, you create an instance of `Rational` by passing the numerator and denominator to it, but here you're trying to create one just by passing its (floating-point) value. Of course, it's possible to find a rational equivalent to a float, but you haven't taught your class how to do it, and it's not going to magically reverse-engineer itself.

Given the definition of adding fractions, p/q + r/s = (ps + qr) / qs, your addition function should return `Rational(p*s + q*r, q*s)`. The problem is, you haven't kept track of the numerator and denominator in your class, so you have no way of retrieving this information.

As things stand, the best you can do with your addition function is return `self.value + value2.value` as a `float`. So as it stands, your class is basically a long-winded way to do division! To have a meaningful `Rational` class, I would strongly suggest you keep everything in terms of the numerator and denominator as far as possible.

Edit: I forgot to mention - if you're using Python 2.x, your division won't work as it should unless you either convert one (or both) of `v1` or `v2` to `float` before dividing, or, better still, include the line `from __future__ import division` at the top, so that division behaves as you'd expect.
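Putting that advice together, here is one minimal sketch of a numerator/denominator-based version (the attribute names `num`/`den`, the `gcd` reduction, and the `__repr__` are my own choices for illustration, not part of the original code):

```python
from math import gcd  # Python 3.5+; on Python 2 use fractions.gcd

class Rational(object):
    def __init__(self, num, den):
        # Keep the fraction in lowest terms, e.g. Rational(2, 4) -> 1/2.
        g = gcd(num, den)
        self.num = num // g
        self.den = den // g

    def __add__(self, other):
        # p/q + r/s = (p*s + q*r) / (q*s)
        return Rational(self.num * other.den + self.den * other.num,
                        self.den * other.den)

    def __repr__(self):
        return "Rational(%d, %d)" % (self.num, self.den)

a = Rational(1, 2)
b = Rational(2, 3)
c = a + b  # Rational(7, 6)
```

Because the class stores the numerator and denominator instead of a single float, `__add__` can build the exact result with two integer arguments, which is what the original `__init__` expects.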