
One‑shot timestamps (the kind hard‑coded into Claude’s system prompt or passed once at chat‑start) go stale fast. In a project I did with GPT‑4 and Claude during a two‑week programming contest, our chat gaps ranged from 10 seconds to 3 days. As the deadline loomed I needed the model to shift from “perfect” suggestions to “good‑enough, ship it” advice, but it had no idea how much real time had passed.

With an MCP server the model can call now(), diff it against earlier turns, and notice: "you were away 3 h, shall I recap?" or "deadline is 18 h out, let’s prioritise". That continuous sense of elapsed time simply isn’t possible with a static timestamp stuffed into the initial prompt; you'd have to create a new chat to update the time, and every fresh query would require re‑injecting the entire conversation history. MCP gives the model a live clock instead of a snapshot.
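For what it's worth, such a server is tiny. Here's a minimal sketch using the official Python MCP SDK's FastMCP interface; the tool names now() and elapsed_since() are my own choices for illustration, not anything standardized:

    from datetime import datetime, timezone

    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("clock")

    @mcp.tool()
    def now() -> str:
        """Return the current UTC time in ISO 8601 format."""
        return datetime.now(timezone.utc).isoformat()

    @mcp.tool()
    def elapsed_since(iso_timestamp: str) -> str:
        """Given an ISO 8601 timestamp (e.g. one returned by now() in an
        earlier turn), describe how much wall-clock time has passed since."""
        then = datetime.fromisoformat(iso_timestamp)
        delta = datetime.now(timezone.utc) - then
        hours, remainder = divmod(int(delta.total_seconds()), 3600)
        minutes = remainder // 60
        return f"{hours} h {minutes} min have elapsed"

    if __name__ == "__main__":
        mcp.run()  # stdio transport by default

The model calls now() whenever timing matters, keeps the returned string in context, and later hands it back to elapsed_since() — no new chat, no re-injected history.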


