Apr-01-2024, 02:23 PM
Hi all,
I am trying to understand the correct usage of classes and methods (i.e., the functions inside the class definition?).
Below you see a class "Punkt", which represents points in a 2D plane with an x and a y coordinate.
Simple enough, I try to add two points p1 and p2.
First try: p1 + p2 --> "unsupported operand type" (understood).
2nd try: write a function "add" which manipulates the .x and .y properties of the "Punkt" object.
3rd try: implement the function in the class definition (is that called a "method"?). Here's the code:
 1  class Punkt():
 2      def __init__(self, x=99, y=99):    # ALWAYS give default values!
 3          self.x = x
 4          self.y = y
 5
 6      def __str__(self):
 7          return f'( {self.x} | {self.y} )'
 8
 9      def add(self, p1, p2):
10          self = Punkt()                 # WHY? WITH this line, every result is 99/99
11          self.x = p1.x + p2.x
12          self.y = p1.y + p2.y
13          return (self)
14
15  p1 = Punkt()    # without __init__ (x and y defaults) here's an error
16  p1.x = 17       # without __init__ (complete) no one has a problem
17  p1.y = 12
18
19  p2 = Punkt(-9, -4)
20
21  p3 = Punkt()
22  p3.add(p1, p2)
23  print(f"Punkt 1: {p1}")
24  print(f"Punkt 2: {p2}")
25  print(f"Punkt 3: {p3}")
In line 10 you see me trying to initiate a "Punkt" object called "self", so I have something to work with. But if I do so, the output of print(f"Punkt 3: {p3}") is always the default value 99/99.
I thought self = Punkt() creates an object called "self", which is a Punkt() and has the default values x=99 and y=99.
After that (lines 11/12), I manipulate self.x and self.y and return the new, edited values.
I tried a lot, over and over - then I accidentally put a # in front of line 10 (self = Punkt()) and BOOM - it worked.
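Maybe it is not even about classes? I tried to reproduce the effect with a plain function that reassigns its parameter - a minimal toy example (names made up by me):

    def overwrite(obj):
        obj = {"x": 99, "y": 99}   # reassigns the LOCAL name 'obj' to a brand-new dict
        obj["x"] = 8               # this edits the new dict, not the caller's

    d = {"x": 0, "y": 0}
    overwrite(d)
    print(d)                       # {'x': 0, 'y': 0} - the original dict is untouched

If that is the same mechanism, then line 10 makes "self" point at a brand-new Punkt(), and lines 11/12 edit that new object instead of p3 - but I am not sure.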
It would be very nice if somebody could explain this to me (in simple words - I'm kind of new to Python :)
Greetings,
MrAyas
Output WITH line 10 active:
Punkt 1: ( 17 | 12 )
Punkt 2: ( -9 | -4 )
Punkt 3: ( 99 | 99 )
Output WITHOUT line 10:
Punkt 1: ( 17 | 12 )
Punkt 2: ( -9 | -4 )
Punkt 3: ( 8 | 8 )
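PS: For the first try (p1 + p2), I assume the "right" way would be to give the class an __add__ method - as far as I understand, that is Python's hook for the + operator. A sketch of what I mean (not tested much):

    class Punkt():
        def __init__(self, x=99, y=99):
            self.x = x
            self.y = y

        def __str__(self):
            return f'( {self.x} | {self.y} )'

        def __add__(self, other):
            # build and return a NEW point instead of touching self
            return Punkt(self.x + other.x, self.y + other.y)

    p1 = Punkt(17, 12)
    p2 = Punkt(-9, -4)
    print(p1 + p2)   # ( 8 | 8 )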