
> You found one of your ideas appears in a recently published paper. You can no longer work on it.

This is one of the things I thought of right away when ChatGPT got released last year. "God, there are probably so many PhD candidates in NLP right now feeling despair, like all their work was pointless ...as if millions of voices cried out in terror and were suddenly silenced."

It's hard in the moment to know whether what you're working on has any utility. So just do your best and keep chugging!



I recently met someone who finished her PhD in computer-vision-related work a couple of years ago. She said her specialization now feels useless, but the PhD was still useful for understanding the fundamentals in her current job, which otherwise draws nothing from her research experience.


Hm. That very heavily depends on the specialization. E.g., if you did image processing, it's basically useless now; if you did GANs, diffusion models took over. It's like that for probably most PhDs, but the research skills, writing skills, etc. stay with you forever.


Math is pointless from start to finish, but that doesn't stop them.

A PhD is granted for novelty, not practicality.


> Math is pointless from start to finish

And this attitude, my friends, is the reason why so much software out there is so bad.

We need more of a math mindset when developing software. What can we be sure about, what are the invariants, what can we prove? There is so much crap out there that somebody lacking understanding just tried to wing, and I'm constantly ashamed of it.

Computer science is applied math.


He said maths, not CS. A lot of research mathematics has no application.


Number theory had no applications for centuries. Now, cryptography is based on it, and the modern internet would be unthinkable without it.

Foundational research often does not provide immediate applications. Still, if we don't do it, our understanding of the world is lacking, and it hurts us later down the road.
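The number-theory-to-cryptography link above can be made concrete with a toy Diffie-Hellman key exchange, which is nothing but modular exponentiation from elementary number theory. The parameters below are tiny illustrative values, not secure ones:

```python
# Toy Diffie-Hellman key exchange. The security of the real thing
# rests on the discrete logarithm problem; the parameters here are
# deliberately tiny and insecure, chosen only for illustration.

p = 23          # small prime modulus (illustrative only)
g = 5           # generator

a, b = 6, 15    # private keys chosen by each party

A = pow(g, a, p)   # Alice publishes g^a mod p
B = pow(g, b, p)   # Bob publishes g^b mod p

# Each side combines its private key with the other's public value
# and arrives at the same shared secret, g^(a*b) mod p.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob
```

An eavesdropper sees p, g, A, and B, but recovering a or b from them is the discrete logarithm problem, which is what real key sizes make infeasible.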


While there certainly exists math for the sake of math, there is a trickle down effect that is quite real (there’s also a trickle up effect that is real but that’s unrelated). Someone does some math for the sake of math. Later on, someone who is slightly more applied sees a link between that math and a more applied problem they’re working on. If the idea is truly useful, it propagates down all the way to application-focused practitioners. Researchers exist on a spectrum, generally, between pure theory and pure application.

Math has no application until you find an application for it. Differential equations are just equations until you pair them with physics. Formal logic is just an abstract discussion of human reasoning until you build a circuit, etc.
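The formal-logic-to-circuit point above can be sketched in a few lines: a half-adder, one of the simplest arithmetic circuits, is just two boolean connectives wired together (names and types here are my own illustration):

```python
# A half-adder expressed as pure boolean logic: the sum bit is XOR,
# the carry bit is AND. Wire these up in silicon and abstract logic
# becomes an arithmetic circuit.

def half_adder(a: bool, b: bool) -> tuple[bool, bool]:
    """Return (sum, carry) for two one-bit inputs."""
    s = a != b          # XOR: sum bit
    carry = a and b     # AND: carry bit
    return s, carry

# Full truth table
for a in (False, True):
    for b in (False, True):
        print(a, b, half_adder(a, b))
```

Chain carries through a second stage and you have a full adder; chain those and you have the arithmetic unit of a CPU.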


One wonders if trickle-down mathematics is any more efficient than trickle-down economics. It seems like we might be better off not funding pure math, as a forcing function to push those minds toward applied problems directly, instead of relying on this random serendipity.


It seems like I might be better off picking the winning lottery numbers directly instead of relying on the random serendipity of guessing them and most of the time being wrong.


Why is this the case? Wouldn’t having more than one paper proving/discovering the same thing be good for confidence in either of them?


It's sort of a mix of a lot of small things: 1) the coming conferences will be flooded with LLM analyses, so the space will be heavily saturated and it will be harder to make a significant contribution; 2) LLMs are a new class of model that you might need to include in your analysis, which means learning about and becoming familiar with them; 3) your work might get overshadowed because it's now obsolete in the land of LLMs.

A rough equivalent I can think of would be the emergence of neural networks. When I was working on my Master's on face recognition, neural networks were not the major force they are now. Facial landmark detection used a combination of Haar features and edge detection. These methods weren't outright abandoned, but if NNs had taken off during my research, I would have had to restart my work.
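For readers unfamiliar with the pre-NN approach mentioned above, here is a minimal sketch of a Haar-like "edge" feature computed with an integral image, the classic Viola-Jones ingredient. The function names and the two-rectangle layout are my own illustration, not anyone's production code:

```python
import numpy as np

def integral_image(img):
    """Cumulative 2D sum, so any rectangle sum costs O(1) lookups."""
    return img.cumsum(axis=0).cumsum(axis=1)

def rect_sum(ii, r0, c0, r1, c1):
    """Sum of img[r0:r1, c0:c1] (exclusive ends) via the integral image."""
    total = ii[r1 - 1, c1 - 1]
    if r0 > 0:
        total -= ii[r0 - 1, c1 - 1]
    if c0 > 0:
        total -= ii[r1 - 1, c0 - 1]
    if r0 > 0 and c0 > 0:
        total += ii[r0 - 1, c0 - 1]
    return total

def haar_edge_feature(img, r0, c0, r1, c1):
    """Two-rectangle vertical edge feature: left-half sum minus right-half sum.

    A large response means a strong vertical intensity edge in the window.
    """
    ii = integral_image(img)
    mid = (c0 + c1) // 2
    return rect_sum(ii, r0, c0, r1, mid) - rect_sum(ii, r0, mid, r1, c1)

# Example: a 4x4 patch that is bright on the left, dark on the right
patch = np.ones((4, 4))
patch[:, 2:] = 0
print(haar_edge_feature(patch, 0, 0, 4, 4))  # strong positive response
```

A real detector evaluates thousands of such features at many positions and scales, which is exactly why the O(1) rectangle sums matter.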


In theory yes, in practice many journals are only interested in work with a clear novelty factor.



