The llm CLI is great for quick inference, but fetching web content as context is still manual -- curl + piping, or using tools that break on Cloudflare-protected sites.
Anybrowse (https://anybrowse.dev) could work nicely as an llm plugin for this. It has a simple API (POST URL, get markdown back) and handles JavaScript rendering and Cloudflare. Something like:
llm -m claude anybrowse https://news.ycombinator.com/item?id=xxx "summarize the discussion"
Not sure if the llm plugin architecture supports external content fetching like this, but worth floating as an idea. The MCP interface might also map cleanly to llm's tool system.
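For concreteness, here is a rough sketch of what the plugin side could look like. It assumes llm's fragment-loader hook (`register_fragment_loaders`, so usage would be `llm -f anybrowse:URL "prompt"` rather than the syntax above) and it assumes an Anybrowse endpoint shape, `https://api.anybrowse.dev/fetch` taking a JSON body with a `url` field and returning markdown; both the endpoint URL and the response format are guesses, not documented API.

```python
# Sketch of an llm fragment-loader plugin wrapping a hypothetical
# Anybrowse endpoint (POST a URL, get markdown back).
import json
import urllib.request

# Assumed endpoint -- the real Anybrowse API may differ.
ANYBROWSE_ENDPOINT = "https://api.anybrowse.dev/fetch"


def build_request(url: str) -> urllib.request.Request:
    """Build the POST request: a JSON body carrying the target URL."""
    body = json.dumps({"url": url}).encode("utf-8")
    return urllib.request.Request(
        ANYBROWSE_ENDPOINT,
        data=body,
        headers={"Content-Type": "application/json"},
    )


def anybrowse_loader(argument: str) -> str:
    """Fragment loader: fetch a page and return its markdown rendering."""
    with urllib.request.urlopen(build_request(argument)) as resp:
        return resp.read().decode("utf-8")


# With the llm package installed, registration would look roughly like:
#
#   import llm
#
#   @llm.hookimpl
#   def register_fragment_loaders(register):
#       register("anybrowse", anybrowse_loader)
```

That would make the HN example above something like `llm -f anybrowse:https://news.ycombinator.com/item?id=xxx "summarize the discussion"`, with the fetched markdown injected as context.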