Python Forum
Software Disenchantment
#1
http://tonsky.me/blog/disenchantment/

The article talks about how hardware keeps getting more and more powerful, yet the same programs do essentially the same things, at roughly the same speeds, while using ever more resources. Pretty good read.
#2
Definitely interesting. I'm disappointed that he never returned to that first example about an optimization taking >40 years to pay for itself, and the article definitely feels like exaggeration at times, but I agree it seems absurd that Android is bigger than Windows 10. My biggest complaint is on mobile, where resources are so constrained, and I agree that we're all building on top of layers that have these endemic problems. That said, it's often true that these optimizations aren't worth the time they take, especially in a business context where you're constantly trying to deliver something.

I do want to say that it depends on local culture, though. My previous company was really bad about accumulating tech debt; my current one (at least from what I've seen so far) is really good about balancing tech debt against business needs, mostly avoiding new debt and making a point of paying off what exists.
#3
I feel like it's a balancing act. It's fine for end-point applications like Visual Studio or VS Code to be resource hogs at times. What isn't really OK is when the middle-layer interfaces start bloating, because they "infect" everything downstream in the dependency chain and make the whole system worse.

For the >40 years thing... it isn't that big a deal if a single application is that much faster or slower. But if everything you use were 400% faster, the aggregate speed gain across everyone using those apps would pay for the optimization work in minutes.
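
To put that in rough numbers, here's a back-of-envelope sketch; every figure below is made up purely for illustration:

# Back-of-envelope: how fast an optimization pays for itself in aggregate.
# All figures are made-up assumptions, not real measurements.

dev_hours = 2 * 40 * 52             # ~2 developer-years of optimization work, in hours
users = 1_000_000                   # people running the optimized app
saved_per_user_per_day = 5 / 3600   # 5 seconds saved per user per day, in hours

saved_per_day = users * saved_per_user_per_day   # aggregate hours saved daily
days_to_break_even = dev_hours / saved_per_day

print(f"Aggregate savings: {saved_per_day:.0f} hours/day")
print(f"Break-even after: {days_to_break_even:.1f} days")
# With a million users, even a tiny per-user saving repays years of work in days.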
#4
(Sep-19-2018, 06:59 PM)nilamo Wrote: For the >40 years thing... it isn't that big a deal if a single application is that much faster or slower. But if everything you use were 400% faster, the aggregate speed gain across everyone using those apps would pay for the optimization work in minutes.
I think there's a bit more to it... just like the Python philosophy of "fast enough", you can make some things faster with no real gain. Have a script that you run once a day that takes a second? I don't believe making it 4x faster does any good. Making emails that take 10+ seconds to load open in a second or less would sure be worth it though, in aggregate.
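
A quick sketch of that contrast with made-up numbers:

# Made-up numbers comparing where a speedup actually matters.
runs_per_day = 1
script_seconds = 1.0
script_saved = runs_per_day * (script_seconds - script_seconds / 4)  # 4x faster
print(f"Daily script: {script_saved:.2f} s/day saved")   # ~0.75 s/day: negligible

email_opens_per_day = 50
email_seconds = 10.0
email_saved = email_opens_per_day * (email_seconds - 1.0)  # 10 s down to 1 s
print(f"Email client: {email_saved:.0f} s/day saved")     # 450 s/day: 7+ minutes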

I like your point about "end-point applications" though, that their being slow has a far smaller impact. Along those lines, I hope to one day see a project like PyPy become the standard over CPython. The better the foundations are, the less we have to constantly worry about optimization in day-to-day coding.
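
Just to make that concrete, here's a trivial micro-benchmark (the file name bench.py is my own) you could run under both interpreters, as python3 bench.py and pypy3 bench.py; PyPy's JIT typically does far better on a hot pure-Python loop like this:

# bench.py -- run under both CPython and PyPy to compare.
import time

def hot_loop(n):
    # A tight pure-Python loop: the kind of code a JIT speeds up most.
    total = 0
    for i in range(n):
        total += i * i
    return total

start = time.perf_counter()
hot_loop(50_000_000)
print(f"{time.perf_counter() - start:.2f} s")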
#5
I wonder how much of this is simply backward compatibility. Each time Windows adds new features, the old ones still need to be supported, so the libraries we use keep ballooning in size, full of patches to ancient code that shouldn't really be needed anymore, except that old programs still use it.

Should API maintainers be more willing to remove functionality when there are newer/better ways to do those things? Should libraries be versioned? Or is that walking backward into DLL hell?
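
For what it's worth, a minimal sketch of the deprecate-then-remove approach, using Python's standard warnings module (old_way/new_way are hypothetical names):

import warnings

def old_way(data):
    """Deprecated: kept around for one release cycle, then removed."""
    warnings.warn(
        "old_way() is deprecated; use new_way() instead",
        DeprecationWarning,
        stacklevel=2,  # point the warning at the caller, not this wrapper
    )
    return new_way(data)

def new_way(data):
    # The supported replacement.
    return sorted(data)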

Personally, I'm fine with wasting a ton of disk space if it means a significant improvement in run time.

