Scpr

Web scraper CLI and MCP server built for humans and coding agents

Rating: 0.0 (0 votes)

Downloads: 0 total

Price: Free (no login needed)

Works With

Claude Code, Cursor, Windsurf, VS Code (developer tool)

About

scpr

scpr is a simple, straightforward web-scraping CLI tool that scrapes pages as Markdown content. It is designed to be used both by humans and by coding agents (either as an MCP server or as a skill).

scpr is written in Go and built on `colly` for web scraping and `html-to-markdown` for converting HTML pages to Markdown.
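To give a feel for the HTML-to-Markdown step, here is a deliberately tiny, self-contained sketch of that kind of conversion. It is not scpr's code and not the real `html-to-markdown` library (which walks the DOM properly); it is a toy that handles just three tags with regexes, to illustrate the transformation:

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// toyMarkdown illustrates the kind of transformation scpr delegates to
// html-to-markdown. The real library parses the DOM; this toy handles
// only <h1>, <a>, and <p> with regexes.
func toyMarkdown(html string) string {
	h1 := regexp.MustCompile(`<h1>(.*?)</h1>`)
	a := regexp.MustCompile(`<a href="(.*?)">(.*?)</a>`)
	p := regexp.MustCompile(`</?p>`)
	out := h1.ReplaceAllString(html, "# $1\n\n")
	out = a.ReplaceAllString(out, "[$2]($1)")
	out = p.ReplaceAllString(out, "")
	return strings.TrimSpace(out)
}

func main() {
	page := `<h1>Example</h1><p>See <a href="https://example.com">docs</a>.</p>`
	fmt.Println(toyMarkdown(page))
	// prints:
	// # Example
	//
	// See [docs](https://example.com).
}
```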

Installation

Install with Go (v1.24+ required):

bash
go install github.com/AstraBert/scpr@latest

Install with NPM:

bash
npm install @cle-does-things/scpr

Extra instructions for Windows installation

If you are on Windows, scpr might not be available right after global installation with npm. In that case, you might need to take extra steps:

  1. Find where the node executable is stored on your machine:
bash
Get-Command node

This will print the directory where node.exe is stored; scpr will be installed at .\bin\scpr.exe in that folder.

[!NOTE]

_If you are using nvm for Windows, node.exe will be at C:\Users\nvm4w\nodejs_

  2. Add {NODE_FOLDER}\bin (in the case of nvm: C:\Users\nvm4w\nodejs\bin) to the PATH environment variable. Follow this guide for instructions on how to set PATH environment variables.
  3. Restart your computer.
  4. Run scpr --help from your terminal. The execution might be flagged by your antivirus, but since the executable does not contain any harmful code, the antivirus will eventually allow it.

Usage

As a CLI tool

Basic usage (scrape a single page):

bash
scpr --url https://example.com --output ./scraped

This will scrape the page and save it as a markdown file in the ./scraped folder.

Recursive scraping

To scrape a page and all linked pages within the same domain:

bash
scpr --url https://example.com --output ./scraped --recursive --allowed example.com --max 3

Parallel scraping

Speed up recursive scraping with multiple threads:

bash
scpr --url https://example.com --output ./scraped --recursive --allowed example.com --max 2 --parallel 5

Additional options

  • --log - Set logging level (info, debug, warn, error)
  • --max - Maximum depth of pages to follow (default: 1)
  • --parallel - Number of concurrent threads (default: 1)
  • --allowed - Allowed domains for recursive scraping (can be specified multiple times)

For more details, run:

bash
scpr --help

As a stdio MCP server

Start the MCP server with:

bash
scpr mcp

And configure it in agents using:

json
{
  "mcpServers": {
    "web-scraping": {
      "type": "stdio",
      "command": "scpr",
      "args": [
        "mcp"
      ],
      "env": {}
    }
  }
}
_The JSON snippet above is the format used by Claude Code; adapt it to your agent before using it._

Contributing


a-gnt's Take

Our honest review

This plugs directly into your AI and gives it new abilities it didn't have before: a web scraper CLI and MCP server built for humans and coding agents. Once connected, just ask your AI to use it. It's completely free and works across most major AI apps. This one just landed in the catalog, so it's worth trying while it's fresh.

Tips for getting started

  1. Tap "Get" above, pick your AI app, and follow the steps. Most installs take under 30 seconds.

What's New

Version 1.0.0 (6 days ago)

Imported from GitHub

Ratings & Reviews

0.0 out of 5 (0 ratings)

No reviews yet. Be the first to share your experience.