Start for free, then choose the plan that best fits your needs
Features by plan
Netpeak Spider Features
Availability
Core Spider Features
This feature comes in handy when crawling websites with JS scripts (that is, almost all sites on the Internet). If you try crawling such a site without JS rendering, the crawler will be unable to detect data added with the help of JavaScript (links, descriptions, images, etc.) and therefore won't be able to analyze the pages correctly.
To execute JavaScript when crawling CSR sites, Netpeak Spider uses one of the latest versions of Chromium, which makes crawling as advanced and close to Googlebot as possible.
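As a simplified illustration (not Netpeak Spider's actual code), here is why a crawler that only reads the raw HTML misses JS-injected content. The sample page and link paths below are made up; the only product link on the page is inserted by a script at runtime, so a static parse never sees it:

```python
from html.parser import HTMLParser

# A page whose only product link is injected by JavaScript at runtime.
RAW_HTML = """
<html><body>
  <a href="/about">About</a>
  <div id="products"></div>
  <script>
    document.getElementById('products').innerHTML =
      '<a href="/product/42">Product 42</a>';
  </script>
</body></html>
"""

class LinkCollector(HTMLParser):
    """Collects href values from <a> tags found in the static HTML."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(v for k, v in attrs if k == "href")

parser = LinkCollector()
parser.feed(RAW_HTML)
print(parser.links)  # only the static link is found; /product/42 is invisible
```

A renderer such as headless Chromium would execute the script first and expose both links to the crawler.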
The program backs up the collected data automatically. This is useful when there is a risk of a sudden computer shutdown and data loss. Additionally, after the crawling is over, you can save the project to have quick access to the crawled data, share the file with colleagues, and even open the list of URLs in Netpeak Checker (yup, our tools are cross-integrated, duh).
Use good ol' Ctrl+C hotkeys, or the 'Copy' option in the context menu. Oh, and make sure to try the extended copy button in the sidebar: in just one click you can copy the contents of the 'Issues', 'Overview', 'Site structure', and 'Scraping' tabs to the clipboard.
Built-in Tools
Reports
Integrations
Netpeak Checker Features
Availability
Core Checker Features
Using a proxy list allows you to avoid blocking while scraping SERP (Google, Bing, Yahoo, Yandex) and analyzing numerous parameters (e.g., Wayback Machine, LinkPad), especially if you deal with a large number of URLs.
We recommend using paid proxies.
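The rotation idea can be sketched roughly like this (a simplified illustration, not the program's actual implementation; the proxy addresses are made-up examples): requests are spread across the list in round-robin order so that no single IP sends enough traffic to trigger a block.

```python
from itertools import cycle

# Hypothetical proxy list (example addresses from the documentation range).
PROXIES = ["203.0.113.10:8080", "203.0.113.11:8080", "203.0.113.12:8080"]

proxy_pool = cycle(PROXIES)

def next_proxy_for(url):
    """Pick the next proxy in round-robin order for the given request."""
    return next(proxy_pool)

# Five SERP requests get spread evenly across the three proxies.
urls = [f"https://www.google.com/search?q=test{i}" for i in range(5)]
assignments = [next_proxy_for(u) for u in urls]
print(assignments)
```

With three proxies, every proxy handles only every third request, which keeps the per-IP request rate low.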
The program backs up the collected data automatically. This is useful when there is a risk of a sudden computer shutdown and data loss. Additionally, after the analysis is over, you can save the project to have quick access to the collected data, share the file with colleagues, and even open the list of URLs in Netpeak Spider (yup, our tools are cross-integrated, duh).
Use good ol' Ctrl+C hotkeys, or the 'Copy' option in the context menu. Oh, and make sure to try the extended copy button in the sidebar: in just one click you can copy the contents of the 'Issues', 'Overview', 'Site structure', and 'Scraping' tabs to the clipboard.
Parameters & Integrations
Serpstat provides data for position monitoring, traffic analysis, and keyword research. Netpeak Checker aggregates data by 10+ Serpstat parameters (for root domains and URLs) – including visibility, SE traffic, number of keywords in top-10 and top-100, etc.
To get parameters from this service, you need to have a paid API key.
Ahrefs provides data for backlink analysis, traffic analysis, and keyword research. Netpeak Checker aggregates data by 30+ Ahrefs parameters (for subdomains, hosts, prefixes, and URLs) – including Domain Rating (DR), number of backlinks, organic traffic estimation, etc.
To get parameters from this service, you need to have a paid API key.
Majestic provides data for comprehensive backlink analysis. Netpeak Checker aggregates data by 35+ Majestic parameters (for root domains, hosts, and URLs) – including Trust Flow, Citation Flow, number of external backlinks, etc.
To get parameters from this service, you need to have a paid API key.
Moz provides data for comprehensive backlink analysis. Netpeak Checker aggregates data by 20+ Moz parameters (for root domains, subdomains, and URLs) – including Domain Authority, MozTrust, MozRank, number of external links, etc.
To get some parameters from this service, you need to have a paid API key.
SimilarWeb provides data on website traffic. Netpeak Checker aggregates data by 14+ SimilarWeb parameters – including total visits with ratio by channels, global rank of the website, category rank, etc.
To get parameters from this service, you need to have a paid API key.
SEMrush provides data for backlink analysis, traffic analysis, and keyword research. Netpeak Checker aggregates data by 15+ SEMrush parameters (for root domains, hosts, and URLs) – including Authority Score, number of backlinks, organic traffic estimation, etc.
To get parameters from this service, you need to have a paid API key.
The CAPTCHA auto-solving feature helps you scrape SERPs and collect search engine parameters quickly and safely, especially when using a proxy list. The program is integrated with the top CAPTCHA solving services: 2Captcha, Anti-Captcha, RuCaptcha, and CapMonster. Moreover, you can use all of them simultaneously – they will work in alternation.
To use CAPTCHA auto-solving, enter the API key of the corresponding service account (IP address and port for the CapMonster program) on the 'CAPTCHA' tab in the program settings and make sure your account balance is positive.
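The alternation between services can be sketched like this (a minimal, hypothetical illustration, not the program's actual code; the real services each expose their own HTTP API keyed by your account's API key, while here a "solver" is just a stand-in function). Services are tried in round-robin order, and a failed service is skipped in favor of the next one:

```python
from itertools import cycle

def make_solver(name, works=True):
    """Build a stand-in solver for a CAPTCHA service (hypothetical)."""
    def solve(captcha_image):
        if not works:
            raise RuntimeError(f"{name}: temporarily unavailable")
        return f"{name}-token"
    return solve

# Round-robin pool over the configured services.
solvers = cycle([
    make_solver("2Captcha"),
    make_solver("Anti-Captcha"),
    make_solver("CapMonster"),
])

def solve_captcha(captcha_image, attempts=3):
    """Alternate between services, falling back to the next on failure."""
    last_error = None
    for _ in range(attempts):
        solver = next(solvers)
        try:
            return solver(captcha_image)
        except RuntimeError as err:
            last_error = err
    raise last_error

print(solve_captcha(b"...image bytes..."))  # first call lands on 2Captcha
```

Because the pool cycles, consecutive CAPTCHAs are distributed across all configured services rather than exhausting one account's balance.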
Support & Updates
Frequently asked questions
Need help?
Help Center
Get instant support: we have already prepared answers to the most frequently asked questions here.
Support via tickets
Don't forget to take a look at the Knowledge Base before opening a ticket. Also, note that we respond within 24 hours on working days.
Submit a ticket