At least not at the moment, and perhaps it will stay that way. It's logical to think that LLMs will always be more expensive to run than a simple web or shell script built for a specialised purpose.
Arguably you could drop in an API or a local script for the AI to consume, but I do see the benefit of having it standardised for the industry as MCP if you want something to run as an AI-agnostic infrastructure layer.