“Apple has announced new accessibility features for its devices, including Live Speech, Personal Voice, and Point and Speak in Magnifier. These updates will be available later this year and will help people with cognitive, vision, hearing, and mobility disabilities. Live Speech will read aloud text on the screen, while Personal Voice will allow users to create a custom voice for their device. Point and Speak in Magnifier will provide a spoken description of what is being pointed at. These new features are part of Apple's ongoing commitment to making its products accessible to everyone.”
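For anyone curious what "read text aloud" looks like at the API level: here's a minimal sketch using Apple's public AVSpeechSynthesizer API from AVFoundation. To be clear, this isn't Live Speech or Personal Voice itself (those are system features whose internals Apple hasn't published); the personal-voice lookup below assumes the iOS 17 additions (requestPersonalVoiceAuthorization and the isPersonalVoice trait), so treat it as a sketch, not gospel.

```swift
import AVFoundation

// Minimal text-to-speech using the public AVSpeechSynthesizer API.
// This illustrates programmatic speech output in general; it is not
// the Live Speech feature itself.
let synthesizer = AVSpeechSynthesizer()

func speak(_ text: String) {
    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
    utterance.rate = AVSpeechUtteranceDefaultSpeechRate
    synthesizer.speak(utterance)
}

// Assumption: iOS 17-era API. Once the user grants authorization, a
// user-trained Personal Voice shows up as a regular synthesis voice
// tagged with the .isPersonalVoice trait.
func speakWithPersonalVoiceIfAvailable(_ text: String) {
    AVSpeechSynthesizer.requestPersonalVoiceAuthorization { status in
        let utterance = AVSpeechUtterance(string: text)
        if status == .authorized,
           let personal = AVSpeechSynthesisVoice.speechVoices()
               .first(where: { $0.voiceTraits.contains(.isPersonalVoice) }) {
            utterance.voice = personal
        }
        synthesizer.speak(utterance)
    }
}

speak("Hello from the speech synthesizer.")
```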
That’s really not what the job is at this point. At least where I’ve been, there’s a real understanding of what compute systems and software can offer: accelerating the work being done and opening new avenues for experimentation and methods development. It’s very much a collaboration, not one-sided tasking.
That said, you don’t get to just go off and faff around with code. You do have users, and they know more than you about the problem space you’re working in, so you have to spend real time learning what you’re building and the context you’re operating in, which is different from most other tech jobs. The common failure case for software/computer folks coming into bio is not recognizing the difference between biology and physics before building systems: biology rarely hands you clean, first-principles rules you can just encode.