
I'm working through Aluffi's Algebra: Chapter 0, which covers abstract algebra (groups, fields, vector spaces, etc.) with category-theoretic foundations. I took undergraduate algebra several years ago, and I'm really interested in category theory from a compositionality perspective, so this is a good opportunity to brush up on both topics.

Aluffi is really well-written. It assumes some degree of mathematical maturity (so it's well-positioned for a second pass at the material), but has a generally conversational tone without being imprecise. The exercises are excellent, too, though occasionally difficult to solve using only the machinery introduced up to that point. (Again, well-suited to readers taking a second pass at algebra.)

Why am I doing this? Leonard Susskind puts it well in this video [1]. To put it in my own words: our senses evolved for the physical world around us, and some of the most technical activities we do today are wildly underserved by our natural senses. That's why we build things like microscopes and telescopes and whatnot -- to extend our senses into new domains. Mathematical intuition is almost another sense in its own right: you gain the ability to perceive abstractions and relationships in ways that are just not well-described by sight or touch. I both enjoy this sense and find it valuable, so of course I'm going to continue honing it :)

[1] https://www.youtube.com/watch?v=2bgZmBAnhdg



IIRC in this podcast: https://player.fm/series/superdatascience-2532807/sds-345-ma...

The interviewer is a researcher at Twitter Cortex applying category theory to ML. His intuition about the possible links reminds me of the kind of extending of the senses you describe.


Awesome, thanks! Listening now, and shared with my brother.


This really reminds me that it's been two decades now since I've taken _any_ algebra. I'd really love to re-learn it from the basics on up. I mean, I still remember a lot of it here and there, but some sort of refresher course that builds up to more advanced material would be awesome.

Any recommendations?


Hell yes! I love hearing about people interested in abstract algebra and wholly support your endeavor! Here is a pretty decent resource on relevant texts:

http://www.cargalmathbooks.com/#Abstract%20Algebra

That page is pretty much gold for math texts in general.

Also, #math on freenode has lots of algebra-strong users on it, though depending on your luck some can be less helpful than others. I love chatting about this kind of thing with people, so if you would like an ad hoc mentor/study-buddy I would be more than happy to help. Feel free to email me at the address in my profile.

Good luck!


Hi there, the email field doesn’t show up - you need to put it in your about section for people to see it :)


Uff. Thanks! Done.


Cool, thanks for the info! I'm going to check out that list.


I think Linear Algebra is traditionally recommended, since you can readily apply a lot of geometric intuitions while picking up the mathematical ones. The drawback is that you have to make sure you're not cheating yourself of the mathematics by over-relying on the geometry.

Sheldon Axler's acclaimed "Linear Algebra Done Right" is freely available as a download [1] through July due to the pandemic. I've not read it (yet!), but I've heard so many good things about it I feel comfortable recommending it off the cuff. :)

(Recommendation: try not to focus too hard on the matrices! They're just convenient representations (syntax!) for the actual things we care about: linear transformations. It's less geometrically intuitive, but it lays a much better foundation for algebraic widgets other than vector spaces. Use the wealth of geometric intuitions to jump-start your mathematical sense.)
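
To make that concrete, here's a minimal Haskell sketch (the names `applyMatrix` and `rotate90` are hypothetical examples of mine): the linear map is just a function between vector spaces, and a matrix is one way of writing that function down relative to a chosen basis.

    -- A minimal sketch of "the matrix is syntax for the linear map".
    type Vec = [Double]                      -- coordinates w.r.t. a chosen basis

    applyMatrix :: [[Double]] -> Vec -> Vec  -- matrix given as a list of rows
    applyMatrix rows v = [sum (zipWith (*) row v) | row <- rows]

    -- The same transformation written directly, with no matrix in sight:
    rotate90 :: Vec -> Vec
    rotate90 [x, y] = [negate y, x]
    rotate90 v      = v                      -- (only meaningful in 2D here)

    -- applyMatrix [[0, -1], [1, 0]] [x, y]  ==  rotate90 [x, y]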

I don't have a lot of recommendations for other algebra topics, unfortunately. My class textbook for abstract algebra was Dummit & Foote, which I found very dry and lacking in intuition. Aluffi is perfectly serviceable if you feel good about your linear algebra; just don't feel like you have to complete every exercise.

Also, I'm a sucker for order theory, so if you're up for something a little less algebraic, pick up Munkres' Topology. I'm consistently surprised at how often topological and order-theoretic intuitions come up in software development. There's a close connection between topology and logic, so perhaps I shouldn't be so surprised -- but I haven't studied Stone duality at all, so it shall remain surprising for now.

[1] https://link.springer.com/book/10.1007/978-3-319-11080-6


I wonder whether HN has heard of "Linear Algebra" by Hoffman and Kunze. I learned from it as an undergraduate and remember being told that it was a classic. Another, more abstract book is by the great Halmos: "Finite-Dimensional Vector Spaces". Both books will stretch and entertain you.


Sheldon Axler's book is, for some reason, hugely hyped on HN whenever this topic comes up.

But I do wish for a more comprehensive book on algebra covering the entire breadth of the field.

Matrix Analysis by Roger A. Horn seems like a good book that covers most of what there is to know about matrices. It could be helpful for people working with graphics.


Sheldon Axler's acclaimed "Linear Algebra Done Right"

Seconded. Highly recommended.


You know something? I know a little abstract algebra: groups, subgroups, quotient groups, and the relevant theorems behind them. It's been disappointingly useless to me, though. Maybe someday I will take the quotient group of two matrix groups... I'm not sure though.


I certainly haven't applied any of those examples either. ^_^ Abstract algebra, topology, etc. are all studies of problems that already have mental frameworks. People already did an incredible amount of legwork building the apparatus for understanding these fields (no pun intended). The value of learning these frameworks, if you're not going to work in those fields directly, is to understand how to build your own framework.

What kind of tools are in your toolbox for breaking problems down? Where is my problem different from others, and where is my problem fundamentally the same? How can we isolate these parts and handle them on their own terms? This is fundamentally mathematics, however it's ultimately expressed.

Here's a small selection of those ideas I've picked up from mathematics that have absolutely paid dividends in my day-to-day:

* The idea of a "homomorphism", a structure-preserving map between two different domains of discourse. The more I learn about category theory, the more I realize that homomorphisms are conceptually everywhere in software. The more I learn about domain-driven design, the more I realize the role functors (a particular kind of homomorphism) really play in software design. (A small sketch of these ideas follows the list.)

* The idea of a "fixed point", for limiting behavior of processes. Fixed points are especially pleasant in domains where processes have some sense in which they "grow monotonically". When I can model a system as a series of operations that "add knowledge" and don't invalidate prior results, I know I have a wealth of analytical tools at my disposal.

* The idea of products (pairing) and sums (choice) in type theory, for modeling interactions between components. I feel like I'm in a straitjacket when using a language without sum types; I have to encode what I really mean using tools that don't let me get there directly.
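
Here's a minimal Haskell sketch of those three ideas. All the names (`PaymentMethod`, `saturate`, etc.) are made-up examples of mine, not from any particular codebase:

    import Data.List (nub, sort)

    -- A sum type (choice): a value is exactly one of these alternatives, and
    -- the compiler forces every consumer to handle each case.
    data PaymentMethod
      = Card String           -- masked card number
      | Invoice Int           -- net terms, in days
      | StoreCredit Double    -- remaining balance

    -- A product type (pairing): a value carries both components at once.
    data Order = Order { amount :: Double, method :: PaymentMethod }

    -- A structure-preserving map (a monoid homomorphism): length maps string
    -- concatenation onto integer addition:
    --   length (xs ++ ys) == length xs + length ys,   length "" == 0

    -- A fixed point of a monotone "add knowledge" step: keep applying the step
    -- until nothing new is learned. (Terminates provided the step eventually
    -- stops producing new facts.)
    saturate :: Ord a => ([a] -> [a]) -> [a] -> [a]
    saturate step facts
      | facts' == facts = facts
      | otherwise       = saturate step facts'
      where facts' = nub (sort (facts ++ step facts))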


Something I've come to think recently about the value of knowing stuff whose usage isn't immediately obvious: when you expand your knowledge, the 'range' of your world changes. So yes, almost by its nature, you won't use the stuff you don't know much about, but you will be hemmed in by your own ignorance. On the other hand, by expanding your knowledge, you also expand your range of experience (your world), and thus find it more useful.


I studied vector mathematics in high school: matrix operations, dot product, cross product, etc. All through these lessons I thought, "What a stupid thing to learn; who would ever use this?" Then after school I became a CAD/CAM developer and spent most of my time working with vector mathematics. It was with the help of OpenGL, so technically I didn't need to understand how these operations worked under the hood, but yep... what a stupid thing to learn indeed.


Most people start abstract algebra with groups, which is not surprising since the underlying definition is very simple and the basic examples are easy to understand. But abstract algebra only really starts to come into its own when you start to learn about rings and modules, which ultimately turn out to be important in proving most of the significant theorems in group theory as well.

One example I've been toying with recently is the link between complex and split-complex numbers; the latter are isomorphic to a direct product of two copies of R. Putting these analogies together leads to a slight improvement of Karatsuba's complex number multiplication algorithm:

https://gist.github.com/scythe/9976568d9d49b3322a33c5cd55895...

The extra storage use does call into question whether this representation can be helpful in practice, but the fact that these abstractions can be unrolled into code is pretty cool.
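
For reference (and not necessarily the exact variant in the linked gist), the classic three-multiplication trick for complex products looks something like this, sketched in Haskell:

    -- Three real multiplications instead of four for (a + bi)(c + di).
    mulComplex3 :: Num a => (a, a) -> (a, a) -> (a, a)
    mulComplex3 (a, b) (c, d) = (t1 - t2, t3 - t1 - t2)
      where
        t1 = a * c
        t2 = b * d
        t3 = (a + b) * (c + d)
    -- Real part: ac - bd.  Imaginary part: (a+b)(c+d) - ac - bd = ad + bc.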


I am curious about category theory. Do you have any resources on where it can be applied?


David Spivak and Brendan Fong have been doing lots of work at MIT to evangelize applied category theory. See their recent books/classes:

Seven Sketches in Compositionality: An Invitation to Applied Category Theory https://arxiv.org/abs/1803.05316

Applied Category Theory mini-course: https://ocw.mit.edu/courses/mathematics/18-s097-applied-cate...

Programming with Categories mini-course (with Bartosz Milewski): http://brendanfong.com/programmingcats.html


I'm no expert, so the best I can do is point you to the research community gathering around applied category theory.

* A conference series + workshop: https://www.appliedcategorytheory.org/

* A journal: https://compositionality-journal.org/

Check out some of the people involved in organizing these events. Names I've followed include John Baez, Pawel Sobocinski, and Tai-Danae Bradley (all of whom are amazing educators; check out their work!).

Tai-Danae Bradley wrote a pamphlet on applied category theory which is very approachable: https://arxiv.org/pdf/1809.05923.pdf . Also check out Jules Hedges' thoughts on the importance of compositionality: https://julesh.com/2017/04/22/on-compositionality/ .



Haskell.

To do IO you need monads. These come from category theory. Haskell is great fun to learn: very different, with a focus on abstractions and abstractions of abstractions. On the one hand, I highly recommend it. On the other, the vast majority of people learning it never do any useful work at all in Haskell. (This last statement will probably excite the Haskell zealots, whom I would encourage to reply with evidence.)

There are about 5 programs I know of, written in Haskell, that you might use for a purpose other than programming a computer.

Anyway category theory definitely comes up in lazy, pure functional programming. A lot.


As a great fan of Haskell myself, I would clarify that Haskell needs monads because of the (pure functional) restrictions it sets for itself, not because I/O itself fundamentally requires explicit monads. (Haskell itself supposedly used a lazy-list approach to input and output before monads caught on -- something like `main :: [Response] -> [Command]`, I think.)

That being said, when you explicitly model side effects, you invariably end up with some kind of monad in your model. Food for thought: in a logic programming setting, where your domain is some flavor of partially ordered set, monads are closure operators (a kind of monotone function). Closure operators give a cool foundation for the semantics of certain kinds of logic paradigms, such as concurrent constraint programming.
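
As a minimal sketch of "explicitly modelling a side effect ends up monadic": here's a hand-rolled logging effect (the `Logged` type and names are hypothetical; it's essentially Writer over `[String]`), and the plumbing that composes such computations is exactly a monad.

    -- A computation that also produces log lines as an explicit output.
    newtype Logged a = Logged (a, [String])

    logMsg :: String -> Logged ()
    logMsg s = Logged ((), [s])

    instance Functor Logged where
      fmap f (Logged (a, w)) = Logged (f a, w)

    instance Applicative Logged where
      pure a = Logged (a, [])
      Logged (f, w1) <*> Logged (a, w2) = Logged (f a, w1 ++ w2)

    instance Monad Logged where
      Logged (a, w) >>= k = let Logged (b, w') = k a in Logged (b, w ++ w')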


This statement that you made—about side effects requiring monads—do you know a proof for that?

I come from the other side, Milewski's book is to me "Functional Programming for Category Theorists".


Nope, no proofs :) Formalizing questions like this is one reason why I'm interested in category theory, so I don't think I have the tools to dig into this right now. But... "side effect" literally means it doesn't show up in a normal input/output function signature, and in a pure functional language like Haskell, there are no side effects. Monads are a particular way of explicitly capturing side effects as a "separate" kind of thing from the function output, using a particular species of functor.

I suppose my statement was a bit strong in that regard. Monads tend to arise very often in the way we build systems, but that doesn't mean they're the only way to cope with side-effects. It does seem likely that any way to capture "alternate outputs" from a function will end up looking like A -> T(B) in some category, though.

(Incidentally, if "A -> T(B)" looks funky, think about polynomials `f(x)` -- the function symbol `f` is just a placeholder for some expression involving the free variable `x`. Could be `A -> (B, String)`. The monad laws only make sure you have some foundation for reasoning about the extra type structure being added in by `T`.)
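
For a concrete (made-up) instance of `A -> T(B)` with `T = Maybe`, here's a tiny Haskell sketch; `(>=>)` is Kleisli composition from Control.Monad:

    import Control.Monad ((>=>))

    safeRecip :: Double -> Maybe Double
    safeRecip 0 = Nothing
    safeRecip x = Just (1 / x)

    safeSqrt :: Double -> Maybe Double
    safeSqrt x
      | x < 0     = Nothing
      | otherwise = Just (sqrt x)

    -- Composing "functions with an extra output shape" directly:
    recipThenSqrt :: Double -> Maybe Double
    recipThenSqrt = safeRecip >=> safeSqrt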


Does this mean that you use a monad instead of a function?


First, a clarification: when looking at a monadic action `f : A -> T(B)`, the monad here is just `T`. The action itself is still just a function. The monad `T` lets you add some extra structure onto your usual output type in a principled way.

That being said, it's true that I'm using `T` itself in a function-like way. Category theory does blur the lines between the two ideas: monads are "functors" with some extra structure, and functors are ("just") functions between categories that preserve categorical structure. But it's critical to notice here (and it's apparent from the type `f : A -> T(B)`) that `T` is not the same kind of function that `f` is. `f` lets you move from one type to another, by mapping each value in one to a value of another. `T` lets you move from a whole category to another category, by mapping each object/type in one to an object/type of another, and mapping each arrow/function in one to an arrow/function of another.

In other words, monads occur at the type level, whereas functions occur at the value level [1]. That means that, colloquially, you can't just use a monad "instead of" a function, any more than you can use "Integer" instead of "42". But as I alluded to above, monads over partial orders are closure operators, and we can often model the evolution of data over time as a partial order. So in that domain, monads literally are functions, and the "side effect" of a closure operator is mutably updating a cell by moving its contents up in the order.

If you model the evolution of data as a partial order, you can indeed obtain monads that more closely resemble normal functions. But a partial order is just a particular kind of category, so even here we've built a separate little domain over which our monad exists. (Of course, functors are all about crossing those domains in principled ways.)

[1] That's why it's not "IO ()" or "Maybe Int" that are monads; it's "IO" and "Maybe" themselves, which are well-behaved functions from types to types.
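
To make the poset remark concrete: a closure operator is any map that is extensive, monotone, and idempotent, and those three laws are exactly the monad laws when the poset is viewed as a category. A toy example (the function name is mine):

    -- A closure operator on (Integer, <=): round up to the next multiple of k
    -- (assuming k > 0).
    -- Extensive:  x <= roundUpTo k x                                ("return")
    -- Idempotent: roundUpTo k (roundUpTo k x) == roundUpTo k x      ("join")
    -- Monotone:   x <= y  implies  roundUpTo k x <= roundUpTo k y   (functoriality)
    roundUpTo :: Integer -> Integer -> Integer
    roundUpTo k x = ((x + k - 1) `div` k) * k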


I quite belatedly realized that you come from Category Theory, so most of my sibling reply is probably old news to you. Sorry!

The most charitable response would be "yes", but it really depends on how you model your domain. Most instances of monads in software are at the type level, not at the value level, and "function" doesn't usually make sense at the type level.

The example I gave of a monad over posets does arise in the semantics of logic programming, but I haven't seen explicit recognition of monads in that area. Not that I've looked that hard, yet.


If you don’t mind me asking, how long do you expect it will take you to go through that text?


I'm not sure! There's a good amount of material that I've never dealt with before, so I'm sure there will be parts where I go relatively slowly. I also have a day job, so I only spend time with the book when I feel up for it. It's kind of an open-ended thing for me right now.


Entirely understand, enjoy and good luck with it!



