Python Forum

finding the closest floating point number in a list
(Sep-15-2019, 04:16 AM)Skaperen Wrote: i'm going to need to find some extra time to learn numpy.
Definitely. I actually can't believe you haven't learned it yet.
Maybe this vid is helpful.
Introduction to Numerical Computing with NumPy | SciPy 2019 Tutorial
I learn things when I need them; they stay with me better that way. I haven't needed NumPy yet. Of course, the disadvantage is that I won't know there's a solution in there, but I won't know that if I never use it, either. So I wait until I have a real need, or at least a way to use it in my current project. I remember a new friend a few years ago told me I should try Python and that I'd love it. My next project was a script to log in to the main router and pull down the configuration, so I tried it in Python using the pexpect module. My friend was right.
At least you can become familiar with what the most-used Python modules are for and what they are capable of. That way, when you start a project, you will know that there are alternative ways to do something, and you can do deeper research about it and eventually use it. NumPy is fast. Really fast.

I don't know NumPy and I have never used it, but because of this topic I googled it a bit to make an example. I just knew that it's fast. So here is how much:

In [1]: import math

In [2]: nums = list(range(2, 100001))

In [3]: def power_it(numbers):
   ...:     return [math.pow(num, 5) for num in numbers]
   ...:

In [4]: %timeit power_it(nums)
22.6 ms ± 2.65 ms per loop (mean ± std. dev. of 7 runs, 10 loops each)

In [5]: import numpy as np

In [6]: def np_pow(numbers):
   ...:     return np.power(numbers, 5)
   ...:

In [7]: %timeit np_pow(nums)
4.89 ms ± 14.9 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)

In [8]: l = np.array(nums)

In [9]: %timeit np_pow(l)
258 µs ± 1.39 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)
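
And for the question in the thread title, the same vectorized style applies. A minimal sketch, with made-up data and a made-up target value:

In [10]: values = np.array([0.5, 1.7, 3.2, 4.9])   # made-up floats

In [11]: target = 3.0

In [12]: values[np.abs(values - target).argmin()]   # smallest absolute difference wins
Out[12]: 3.2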
I have to thank you for that because I'm planning to use it constantly now. I am surprised!
Consider that I have two browsers open, Chrome and Firefox, with 118 and 499 tabs. ;)
499 tabs????? and it didn't crash or cry?
I don't believe it either, but it's in front of my eyes :D
Two windows, 499 tabs both.
And I really also need to understand the depths of things. For example, I need to look at processing a tar file. The compression layer, I know, is not an issue. But can I change things in an incoming tarball and put it all in an outgoing tarball, handling file contents chunk-by-chunk, so I don't need to extract it all before putting it in a new outgoing tarball? This case needs to change metadata in the tarball, like file owner or file name. In the future I might even need to change file content on the fly, or convert between tar and cpio format. That's why I actually need to go have a look at that module, real soon.
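
For the record, Python's tarfile module can stream like that. Here is a rough sketch (untested, and the file names and new owner are made up) of rewriting metadata member-by-member without extracting anything to disk:

import tarfile

# "r|*" reads a sequential stream with transparent decompression;
# "w|gz" writes a gzip-compressed stream. Nothing lands on disk in between.
with tarfile.open("incoming.tar.gz", "r|*") as src, \
     tarfile.open("outgoing.tar.gz", "w|gz") as dst:
    for member in src:
        member.uname = "newowner"   # change owner metadata (made-up values)
        member.uid = 1000
        # member.name = ... would rename the file the same way
        if member.isfile():
            # extractfile() returns a stream over the current member;
            # addfile() copies it chunk-by-chunk, so memory use stays bounded
            dst.addfile(member, src.extractfile(member))
        else:
            dst.addfile(member)

The pipe modes ("r|", "w|") are the key: they process members strictly in order, which is exactly the constraint of reading one tarball and writing another at the same time.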
Well, actually FF window 1 has 336 tabs open and window 2 has 499, along with Chrome and its own 116 tabs.
But! I see that both browsers load pages only when their tabs are clicked. The Task Manager tells me that FF has 6 processes running and Chrome 15. That's why they don't crash. Or the machine.

I don't learn anything until I need it. Just like you. But I think this is a mistake. I am glad that I did this with NumPy for the simple example above. I'm going to use it instead of a regular for loop when I deal with numbers. Yesterday I realised that I can use it with my old project, which I put on hold because I got stuck. It involves heavy computations with really, really big numbers, and it was too slow. If I knew NumPy, perhaps I would have finished it by now.
Both windows are FF? One process? That would be 835 tabs total. Yikes! They probably coded with infinity in mind, like GNU did with make. All programs should be like that: let people scale up to however much memory they install.