I'm trying to learn Object Oriented Programming in Python. To do this I need to create a method that calculates the slope of the line that joins the origin to a point. (I think) we're assuming that the origin is (0, 0). For example:
Point(4, 10).slopeFromOrigin()
2.5
Point(12, -3).slopeFromOrigin()
-0.25
Point(-6, 0).slopeFromOrigin()
0
And we're using the equation slope = (Y2 - Y1) / (X2 - X1)
to calculate the slope. Also, since dividing by 0 isn't allowed, we need to return None
when the method fails. Here's what I tried:
class Point:
    # Point class for representing and manipulating x,y coordinates

    def __init__(self, initX, initY):
        # Create a new point at the given coordinates
        self.x = initX
        self.y = initY

    def getX(self):
        return self.x

    def getY(self):
        return self.y

    def distanceFromOrigin(self):
        return ((self.x ** 2) + (self.y ** 2)) ** 0.5

    # define a method called slopeFromOrigin here
    def slopeFromOrigin(self):
        # set origin values for x and y (0,0)
        self.x = 0
        self.y = 0
        # slope = (Y2 - Y1) / (X2 - X1)
        if (Point(x) - self.x) == 0:
            return None
        else:
            return (Point(y) - self.y) / (Point(x) - self.x)
#some tests to check our code
from test import testEqual
testEqual( Point(4, 10).slopeFromOrigin(), 2.5 )
testEqual( Point(5, 10).slopeFromOrigin(), 2 )
testEqual( Point(0, 10).slopeFromOrigin(), None )
testEqual( Point(20, 10).slopeFromOrigin(), 0.5 )
testEqual( Point(20, 20).slopeFromOrigin(), 1 )
testEqual( Point(4, -10).slopeFromOrigin(), -2.5 )
testEqual( Point(-4, -10).slopeFromOrigin(), 2.5 )
testEqual( Point(-6, 0).slopeFromOrigin(), 0 )
As you can see, I'm trying to say that we need the first parameter of Point to be x2 and the second parameter of Point to be y2. I tried it this way and got NameError: name 'y' is not defined on line 32.
I also tried to get the index values of Point like this:
return (Point[0] - self.y / (Point[1] - self.x)
But that also gave me an error message:
TypeError: 'Point' does not support indexing on line 32
I'm not sure how to get the value of the x and y parameters from Point
so that the method works when it's tested. Please share your suggestions if you have any. Thank you.
First problem
You just set the current point to the origin. Don't do that: overwriting self.x and self.y with 0 throws away the point's coordinates, and the distance from the origin would then be 0...
Second problem
Point(x) and Point(y) are not how you get the values for self.x and self.y.

Then, slope is simply "rise over run". Plus you want to return None when self.x == 0.

So, simply:
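Something along these lines (a sketch keeping your method name, and assuming Python 3, where / does true division):

def slopeFromOrigin(self):
    # the other point is the origin (0, 0), so the slope is just rise over run
    if self.x == 0:
        return None
    else:
        return self.y / self.x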
Or even
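for example, the same check written as a conditional expression:

def slopeFromOrigin(self):
    # one-line version: rise over run, or None for a vertical line
    return self.y / self.x if self.x != 0 else None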
Or let Python return None on its own
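that is, relying on the fact that a function which reaches its end without an explicit return gives back None; for example:

def slopeFromOrigin(self):
    if self.x != 0:
        return self.y / self.x
    # no explicit return needed: falling off the end returns None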
I think your confusion lies in thinking that you need to somehow define "the origin". If you needed to do that, you would instead have something like this:
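def slopeFromOrigin(self):
    origin = Point(0, 0)
    # slope = (Y2 - Y1) / (X2 - X1), with (X1, Y1) being the origin
    if self.x == origin.x:
        return None
    return (self.y - origin.y) / (self.x - origin.x)

But since the origin is always (0, 0), subtracting origin.x and origin.y changes nothing, which is why the shorter versions above are all you need.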