
I agree with this. The rationalization I eventually worked out and learned to live with is that as you get into more advanced areas of math, much of what you are learning consists of building blocks for assembling more complex tools - but those are tools you have no use for yet, so you can't hook their constituent elements into any existing framework of understanding.

There is no way past these obstacles (short of spending the multiple lifetimes it took to forge them from first principles) except to memorize them and gradually extend understanding backwards from that memorization into the broader context of dependencies that converge in their formalization.

But having this predicament explained to me up front, ideally somewhere around the age one learns about something as basic as fractions, would have been enormously helpful.



> except to memorize them and gradually extend understanding backwards

That is not the only reason, but it is a big part of it, yes. Some (though not all) of the things I memorize, I admittedly don't fully understand. I try to avoid that, but it happens. Usually, though, the sudden realization comes at a later point, once I've understood more of something else.

This literally happened to me last week. I had been memorizing a "stupid" theorem for a while[1] without realizing why it was useful, until something I was reading covered discontinuities in the n-th derivative of a function and what that means for the terms of the function's Taylor series, and it all clicked, tying everything together.

It's a good feeling!

[1] https://www.dsprelated.com/freebooks/sasp/Spectral_Roll_Off....
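For anyone curious about [1]: here's a quick numpy sketch (my own toy example, not from the linked chapter) of the roll-off idea. A square wave has a jump in the signal itself, so its harmonic magnitudes decay roughly like 1/k, while a triangle wave is continuous with a jump only in its first derivative, so they decay like 1/k^2. In general, a discontinuity first appearing in the n-th derivative gives decay on the order of 1/k^(n+1).

    # Toy demonstration: fit the decay exponent of the odd-harmonic
    # magnitudes for a square wave (jump in the signal, expect ~ -1)
    # and a triangle wave (jump in the derivative, expect ~ -2).
    import numpy as np

    N = 4096
    t = np.linspace(0, 1, N, endpoint=False)
    signals = {
        "square (n=0)": (np.sign(np.sin(2 * np.pi * t)), -1),
        "triangle (n=1)": (2 * np.abs(2 * t - 1) - 1, -2),
    }

    for name, (x, expected) in signals.items():
        mag = np.abs(np.fft.rfft(x)) / N   # harmonic magnitudes
        k = np.arange(1, 40, 2)            # odd harmonics carry the energy
        # Slope of log|X_k| vs. log k is the decay exponent.
        slope = np.polyfit(np.log(k), np.log(mag[k]), 1)[0]
        print(f"{name}: fitted exponent {slope:+.2f} (expected {expected})")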


No! Instead of rushing ahead into meaningless abstraction, you should spend more time learning in the world of concrete examples and applications, to provide meat for those abstract skeletons. We don't need a mathematician who is a poor imitation of a computer or a reference book.


In my experience, it helps to approach from both sides, and that's also part of the point I'm trying to make. If you come only from the "concrete examples and applications" side, you might not actually fully understand it, despite thinking you do, and may miss some of the finer subtleties. Those are often the "gotchas" that get pointed out in textbooks (but even then are easily forgotten).

I am definitely a "hands on" math person. Almost every piece of math I learn ultimately has an application; personally, I would not really be motivated otherwise. I learn math because I want to use it. I then also find joy in the abstract beauty, but I would not uncover that beauty without first having in mind something I could use it for. For me specifically, a large application of the math I'm learning is signal processing (both digital and analog) and analog circuits (i.e. "EE stuff").

And in that, learning some of the "meaningless abstractions" first and only later entangling them with actual applications definitely helped me build a good understanding of the underlying math, which I can in turn integrate with what I'm doing in the "real world".



