
I wouldn't be surprised to see new products from OpenAI targeted specifically at doctors and/or lawyers. Forbidding them from using the regular ChatGPT with legal terms would be a good way to do price discrimination.


Definitely. And in the long run, that is the only way those occupations will be able to operate. At that point, you are locked into an AI dependency.


Read their paper on GDPval (https://arxiv.org/abs/2510.04374). In section 3, it's quite clear that their marketing strategy is now "to cooperate with professionals" and augment them. (Which does not rule out replacing them later, once the regulatory situation is more favorable, say when AGI is a well-accepted fact, if ever.) But this will take a lot of time and a local presence they do not have.


I have seen "AI" in my doctor's office. They have been using it to summarize visits and write after-visit notes.


Can it become a proxy for AI companies to collect patient data and medical histories, or to "train" on the data and sell that as a service to insurance companies?

There's HIPAA, but AI firms have already ignored copyright law, so ignoring HIPAA, or making consent mandatory, is not a big leap from there.


That's likely DAX Copilot, which doesn't provide medical advice.


OpenEvidence is free for anyone with an NPI (National Provider Identifier).



