
I have been playing around with MCP, and one of its current shortcomings is that it doesn't support OAuth. This means that credentials need to be hardcoded somewhere. Right now, it appears that a lot of MCP servers are run locally, but there is no reason they couldn't be run as a service in the future.

There is a draft specification for OAuth in MCP, and hopefully this is supported soon.
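
To make the hardcoding point concrete, here's roughly what the status quo looks like with the TypeScript MCP SDK: the server expects a long-lived secret in its environment, which the client config has to spell out in plain text. The env var, server and API names below are just placeholders:

    // status quo sketch: a locally run MCP server reads a long-lived secret at startup
    import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
    import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
    import { z } from "zod";

    // the client config has to hardcode this value somewhere (e.g. an "env" block)
    const apiKey = process.env.EXAMPLE_API_KEY;
    if (!apiKey) throw new Error("EXAMPLE_API_KEY must be set before launch");

    const server = new McpServer({ name: "example-service", version: "0.1.0" });

    server.tool("get_profile", { username: z.string() }, async ({ username }) => {
      // every request reuses the same static credential
      const res = await fetch(`https://api.example.com/users/${username}`, {
        headers: { Authorization: `Bearer ${apiKey}` },
      });
      return { content: [{ type: "text", text: await res.text() }] };
    });

    await server.connect(new StdioServerTransport());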



For the OAuth part, the access_token is all an MCP server needs. Users could complete an OAuth authorization in the settings or via the chatbot, and then let the MCP server handle storage of the access_token.

For remote MCP servers, storing access_tokens is already a very common practice. For locally hosted MCP servers, the harder problem is how to manage a pile of secret keys.
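
A rough sketch of that split for a hosted server, assuming the TypeScript MCP SDK plus a plain OAuth code exchange. The endpoints, client IDs and in-memory store are placeholders; a real service would persist tokens and tie them to sessions:

    import express from "express";
    import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
    import { z } from "zod";

    // placeholder per-user token store; a real hosted MCP service would use a DB
    const tokens = new Map<string, string>();

    // the user does the OAuth authorization in the settings UI / chatbot;
    // the redirect lands here and we keep only the resulting access_token
    const app = express();
    app.get("/oauth/callback", async (req, res) => {
      // in this simplified sketch, `state` carries the user id
      const { code, state: userId } = req.query as { code: string; state: string };
      const tokenRes = await fetch("https://auth.example.com/oauth/token", {
        method: "POST",
        headers: { "Content-Type": "application/x-www-form-urlencoded" },
        body: new URLSearchParams({
          grant_type: "authorization_code",
          code,
          client_id: process.env.OAUTH_CLIENT_ID ?? "",
          client_secret: process.env.OAUTH_CLIENT_SECRET ?? "",
          redirect_uri: "https://mcp.example.com/oauth/callback",
        }),
      });
      const { access_token } = (await tokenRes.json()) as { access_token: string };
      tokens.set(userId, access_token);
      res.send("Authorized - you can close this tab.");
    });
    app.listen(3000);

    // the MCP side only ever sees the stored access_token
    // (transport wiring for the hosted server is omitted from this sketch)
    const server = new McpServer({ name: "hosted-example", version: "0.1.0" });
    server.tool("list_items", { userId: z.string() }, async ({ userId }) => {
      const token = tokens.get(userId);
      if (!token) return { content: [{ type: "text", text: "user has not authorized yet" }] };
      const res = await fetch("https://api.example.com/items", {
        headers: { Authorization: `Bearer ${token}` },
      });
      return { content: [{ type: "text", text: await res.text() }] };
    });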


There's an open source package that lets you defer providing credentials to an MCP server until runtime, via an MCP tool call: https://github.com/supercorp-ai/superargs

For hosted MCPs: https://supermachine.ai
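
Not superargs' actual API, but the general shape of deferring a credential to runtime via a tool call looks something like this with the TypeScript MCP SDK (tool and service names are made up):

    import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
    import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
    import { z } from "zod";

    // nothing secret in the config or environment; the credential arrives later
    let apiKey: string | undefined;

    const server = new McpServer({ name: "deferred-creds-example", version: "0.1.0" });

    // the assistant (or the user, prompted by it) supplies the key mid-conversation
    server.tool("set_api_key", { apiKey: z.string() }, async (args) => {
      apiKey = args.apiKey;
      return { content: [{ type: "text", text: "credential set for this session" }] };
    });

    server.tool("fetch_report", { reportId: z.string() }, async ({ reportId }) => {
      if (!apiKey) {
        return { content: [{ type: "text", text: "call set_api_key first" }] };
      }
      const res = await fetch(`https://api.example.com/reports/${reportId}`, {
        headers: { Authorization: `Bearer ${apiKey}` },
      });
      return { content: [{ type: "text", text: await res.text() }] };
    });

    await server.connect(new StdioServerTransport());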


You could use Nango for the OAuth flow and then pass the user’s token to the MCP server: https://nango.dev/auth

Free for OAuth with 400+ APIs & can be self-hosted

(I am one of the founders)
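
For anyone curious what that looks like from the host's side, here's a minimal sketch assuming the TypeScript MCP SDK; the token-broker endpoint and response shape are placeholders rather than Nango's actual API (their SDK wraps this), and the server command / tool name are made up:

    import { Client } from "@modelcontextprotocol/sdk/client/index.js";
    import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

    // placeholder broker call: treat this endpoint and response shape as
    // illustrative only, not a real SDK method
    async function getUserAccessToken(userId: string): Promise<string> {
      const res = await fetch(`https://broker.example.com/connections/${userId}`, {
        headers: { Authorization: `Bearer ${process.env.BROKER_SECRET_KEY}` },
      });
      const body = (await res.json()) as { access_token: string };
      return body.access_token;
    }

    const accessToken = await getUserAccessToken("user-123");

    // hand the token to a locally launched MCP server via its environment
    const transport = new StdioClientTransport({
      command: "node",
      args: ["./my-mcp-server.js"],
      env: { SERVICE_ACCESS_TOKEN: accessToken },
    });

    const client = new Client({ name: "host-example", version: "0.1.0" }, { capabilities: {} });
    await client.connect(transport);

    // the server never does the OAuth dance itself; it just uses the token it was given
    const result = await client.callTool({ name: "list_items", arguments: {} });
    console.log(result);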


There are remotely run MCP server options out there, such as mcp.run and glama.ai



