Well, #2 is something that historically has regularly been required by the powers that be, so the addition of #1 as another option is actually an improvement.
it's mostly just agreeing with you (that yes, it was guessing). An LLM has very limited ability to even know whether it was guessing. But it can "cheat" and just say yes it was, if that seems like what you expect to hear.
Judging from the number of comments, it seems that this follows the universal rule of 'yes/no question in the headline' where the answer is always 'no'.
Discussions of LLMs solving problems tend to focus on understanding. Personally, I do not think understanding is required. I can write a very small program that calculates Pi to however many digits I like, or produces any digit in the sequence on demand, without the program or computer having any understanding of what Pi is or what it means. I could get Claude to output that same code when prompted for a way to generate Pi, again with no understanding of what Pi is or what it means.
IMO the ability to provide an accurate solution to a problem is not always based on understanding the problem.
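To make that concrete, here's a sketch of the kind of program I mean: an unbounded spigot algorithm (the Gibbons/Lambert-series variant) that streams the decimal digits of Pi one at a time using nothing but integer arithmetic. Nothing in it "knows" what Pi is:

```python
def pi_digits():
    """Yield decimal digits of Pi forever, via an unbounded spigot
    algorithm -- pure integer arithmetic, no notion of what Pi "means"."""
    q, r, t, j = 1, 180, 60, 2
    while True:
        u = 3 * (3 * j + 1) * (3 * j + 2)
        y = (q * (27 * j - 12) + 5 * r) // (5 * t)
        yield y  # next digit: 3, 1, 4, 1, 5, 9, ...
        q, r, t, j = (10 * q * j * (2 * j - 1),
                      10 * u * (q * (5 * j - 2) + r - y * t),
                      t * u,
                      j + 1)

digits = pi_digits()
print("".join(str(next(digits)) for _ in range(15)))  # 314159265358979
```

The program can emit as many correct digits as you have patience for, which is exactly the point: accurate output, zero understanding.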
You can easily get compressed episodes of a TV show that are 250MB, so it's like watching a TV series at a rate of 2 episodes every 5 minutes. Better quality is in the 500MB-1.5GB range for a 45-minute episode, so even being generous it's 20 minutes of compressed TV, or 70 minutes of uncompressed music, every 5 minutes.
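A quick back-of-the-envelope check of those figures (my own sketch, assuming decimal MB, i.e. 10^6 bytes, and Red Book CD audio at 44.1 kHz / 16-bit / stereo for "uncompressed music"):

```python
def implied_mbps(megabytes: float, minutes: float) -> float:
    """Average bitrate in megabits per second for the given transfer
    (assumes 1 MB = 10**6 bytes; MiB would give slightly higher numbers)."""
    return megabytes * 8 / (minutes * 60)

# 2 compressed 250 MB episodes every 5 minutes:
tv_low = implied_mbps(2 * 250, 5)           # ~13.3 Mbps

# "Generous" case: 20 minutes of TV at ~1 GB per 45-minute episode:
tv_high = implied_mbps(20 * 1000 / 45, 5)   # ~11.9 Mbps

# 70 minutes of uncompressed CD audio: 44100 samples/s * 2 ch * 2 bytes
cd_mb_per_min = 44100 * 2 * 2 * 60 / 1e6    # ~10.6 MB per minute
music = implied_mbps(70 * cd_mb_per_min, 5) # ~19.8 Mbps
```

So all three framings land in roughly the same low-tens-of-Mbps ballpark, with the uncompressed-audio one being the most demanding.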