
Are graph neural networks designed to solve graph problems like this, or are they a "graph problem"? Or both? :p


Mmm... maybe I'm mistaken. Thanks for the opportunity to reflect a bit more about this. I was remembering this video [0] which talks about Graph Embeddings. In the video, the speaker talks specifically about node classification. Assuming the classes are target functions, this could potentially be used for speculative devirtualization.

Definitely not an expert, just excited to have more tools with which to work on graph data!

[0] Graph Embeddings - Neo4J. Talk by Alicia Frame. https://www.youtube.com/watch?v=oQPCxwmBiWo


I learned them as a generalisation of CNNs where the data is stored in any graph, CNNs being a special case where the graph has a grid structure. Convolutions spread information from nodes to their neighbours. It's just implemented differently because the data isn't a neat 4D array anymore.

In that way, GNNs solve graph problems, the same way CNNs solve image processing problems. Training the GNN is more of an optimisation problem.

Edit: maybe graph theory can help with training on very large graphs but I don't know enough about that.
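To make the "convolution spreads information to neighbours" idea concrete, here's a minimal numpy sketch of one graph-convolution layer on a hypothetical toy graph (the path graph, random weights, and feature values are all made up for illustration):

```python
import numpy as np

# Toy undirected graph with 4 nodes: edges 0-1, 1-2, 2-3 (a path).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

# Add self-loops so each node keeps its own features, then
# symmetrically normalise: A_hat = D^{-1/2} (A + I) D^{-1/2}.
A_loop = A + np.eye(4)
d = A_loop.sum(axis=1)
A_hat = A_loop / np.sqrt(np.outer(d, d))

# Node features (4 nodes x 2 features) and one shared weight matrix.
X = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0],
              [0.0, 0.0]])
rng = np.random.default_rng(0)
W = rng.standard_normal((2, 3))

# One layer: mix each node with its neighbours (A_hat @ X), then
# apply the shared linear map and a ReLU, like a CNN layer would.
H = np.maximum(A_hat @ X @ W, 0.0)
print(H.shape)  # (4, 3): 4 nodes, 3 output features each
```

On a grid graph, `A_hat @ X` reduces to averaging a pixel with its neighbours, which is exactly the special case where this collapses back to an (unweighted-kernel) image convolution.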


Yep, that's a GCN, and probably the most popular GNN is indeed an RGCN :)

The transformer comment is interesting. They're very close, and in general, the tricks people use elsewhere are getting translated to GNNs: convolution, attention, ... . But scaling is still catching up: the past couple of years have seen folks training at the 1M-1B level, but not yet at LLM scale. Critically, the scaling work -- good GPU implementations, good samplers, etc. -- is relatively recent and on a good trajectory.

We tracked GNNs for years but stayed away until heterogeneity + scaling started to get realistic for commercial workloads, and that's finally happening. Major credit to folks like DeepMind, Michael Bronstein, the early GraphSAGE / Jure work, various individual researchers, and now AWS + NVIDIA engineers for the practical engineering advances here.


Coming from a position of knowing nothing about graph neural networks, this is the best description I've seen so far in this thread.


Somewhere in my HN history is this same question and I can't say I've got a conclusive answer. My partially confident takeaway is that GNNs describe the architecture of the neural network itself, much in the same way that convolutional or recurrent are terms used to describe other network architectures.

There are two confusing parts for me

1 - The words network and graph are nearly synonymous in this context, and IIRC most neural network architectures are actually graphs that fit some specific pattern. I don't know what makes a 'graph neural network' special (my guess is it has to do with how the layers relate, but I don't know).

2 - I almost always see a mention of graph-related use cases in the context of GNNs. I don't know if there is a fundamental reason for that, or if it just so happens that people with huge graphs worth applying ML to have really good intuition about how graphs can be leveraged and go that route.


I have no idea, to be honest, but I think it relates to the idea of representing the computation in a reasonably comprehensible way.

Either that, or networks that can solve specific problems on graph structures by more or less general and hence reusable methods. Which could go some way towards the former point, I guess, but that's also dealt with elsewhere. Just give g-scholar a search and see.


In my mind, GNNs are designed to solve graph problems: in the usual case, message passing (I'd emphasise the aggregation step) is what enables you to "do ML on graphs".
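That aggregation step can be made explicit in plain Python. A minimal sketch, assuming a hypothetical toy graph and mean aggregation (one of several common choices; a real GNN would apply learned weights to the update):

```python
# Message passing on an adjacency list: each node gathers its
# neighbours' feature vectors, aggregates them (mean here), and
# combines the aggregate with its own features.
neighbours = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
features = {0: [1.0, 0.0], 1: [0.0, 1.0], 2: [1.0, 1.0], 3: [0.0, 0.0]}

def mean_aggregate(vectors):
    """Element-wise mean of a non-empty list of equal-length vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def message_passing_step(neighbours, features):
    updated = {}
    for node, nbrs in neighbours.items():
        agg = mean_aggregate([features[v] for v in nbrs])  # aggregation step
        # Update: concatenate self features with the neighbour aggregate.
        updated[node] = features[node] + agg
    return updated

h1 = message_passing_step(neighbours, features)
print(h1[0])  # node 0 after one round: [1.0, 0.0, 0.0, 1.0]
```

Stacking k such rounds lets information from k-hop neighbourhoods reach each node, which is the "do ML on graphs" part.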





