
Netpeak Spider

Netpeak Spider is a unique tool designed to scan and analyze websites with the help of its own robot:

  • Scans a website and lists key SEO parameters.
  • Locates errors, broken links, incorrect redirects, duplicate titles, descriptions and keywords, and more.
  • Analyzes incoming and outgoing internal links.
  • Calculates internal PageRank.
  • Offers flexible scanning configuration and the ability to export data to Excel.
  • It’s all available for FREE!

Netpeak Spider scans websites and lists SEO page data using its own robot. The program includes a unique tool for measuring internal PageRank, as well as a PageRank variant that excludes sitewide links. All results can be exported to Excel.
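
As a rough illustration of how an internal PageRank like this can be computed over a site's link graph, here is a minimal Python sketch. The function name and the link-graph shape are hypothetical, chosen for the example; this is not Netpeak Spider's actual implementation.

```python
def compute_pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page URL to a list of its outgoing internal links."""
    # Collect every page that appears as a source or a target.
    pages = set(links)
    for targets in links.values():
        pages.update(targets)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / n for page in pages}
        for page in pages:
            targets = links.get(page, [])
            if targets:
                # Each outgoing link passes an equal share of this page's rank.
                share = damping * rank[page] / len(targets)
                for target in targets:
                    new_rank[target] += share
            else:
                # Dangling page: spread its rank evenly across the site.
                for other in pages:
                    new_rank[other] += damping * rank[page] / n
        rank = new_rank
    return rank

# A tiny made-up site: the home page is linked to most, so it ranks highest.
ranks = compute_pagerank({
    "/": ["/about", "/blog"],
    "/about": ["/"],
    "/blog": ["/", "/about"],
})
```

The "PR (without cross-referencing)" variant would simply drop sitewide links from the graph before running the same calculation.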

Flexible scanning settings allow these features to be turned on or off:

  • Analysis of parameterized links, anchor links, images and JS files.
  • Using the robots.txt file while scanning.
  • Adding subdomains to spreadsheets.
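
The robots.txt option above can be sketched with Python's standard library: parse the site's rules once, then check each candidate URL before scanning it. The user-agent name and the inline rules below are made up for the example; Netpeak Spider's own robots.txt logic may differ.

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# In a real crawler the rules would be fetched from the site's /robots.txt;
# here they are supplied inline for illustration.
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# Pages under /private/ are skipped; everything else is scanned.
allowed = rp.can_fetch("MyCrawler", "https://example.com/page.html")
blocked = rp.can_fetch("MyCrawler", "https://example.com/private/secret.html")
```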

List of page parameters analyzed by Netpeak Spider:

URL – Page address.
Depth – Depth of the current page from the main page.
LinkCanonical – Whether there is a rel="canonical" attribute and where it links to.
Server response – HTTP status codes (200 - OK, 301 - Moved Permanently, 302 - Moved Temporarily, 404 - Not Found, 503 - Service Unavailable, etc.).
Title – Title tag in head container.
Description – Meta tag that contains a short description of the page's content.
Keywords – Meta tag that contains the page’s keywords.
robots.txt – Determines whether indexation of the page by search engines is allowed according to the robots.txt file.
MetaRobots – Determines the existence of the “robots” meta tag, whether page search engine indexation is allowed (index) or not allowed (noindex), and if the page allows robots to follow its links (follow or nofollow).
Redirects – Number of redirects from the page.
Heading (H1) – Number of H1 headings.
Links from this page – Number of outgoing links from this page.
Links to this page – Number of incoming links to this page within the website.
Internal links – Number of links on a website directing to pages on the same website.
External links – Number of links from this page to other sites.
PR – Internal PageRank index for the current page.
PR (without cross-referencing) – Internal PageRank index excluding sitewide links (links that appear on more than 50% of the site's pages).
Title Duplicates – Locating pages with the same title tags.
Description Duplicates – Locating pages with the same description tags.
Duplicates by Keywords – Locating pages with the same keyword tags.
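
The three duplicate checks above boil down to the same idea: group pages by a tag's value and keep the groups with more than one member. A minimal sketch, with made-up page data:

```python
from collections import defaultdict

# Hypothetical crawl results: URL -> title tag value.
pages = {
    "/a": "Home",
    "/b": "Products",
    "/c": "Home",  # duplicates the title of /a
}

by_title = defaultdict(list)
for url, title in pages.items():
    by_title[title].append(url)

# Keep only titles shared by two or more pages.
duplicates = {title: urls for title, urls in by_title.items() if len(urls) > 1}
# duplicates == {"Home": ["/a", "/c"]}
```

The same grouping works unchanged for description and keywords tags.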
Changelog:
  • fixes and improvements to the endless-redirect detection algorithm;
  • correct handling of incorrect encoding in <meta> tags;
  • the application no longer crashes when loading empty settings files;
  • fixed disappearing UI elements in the settings window;
  • fixes and improvements to page link handling;
  • fixed an issue that made it impossible to scan sites over HTTPS.
  • fixed a font rendering bug that occurred on some systems.
  • fixed a critical bug in page content type detection;
  • fixed a bug in robots.txt handling;
  • fixed URL field resizing when the application window is minimized;
  • fixed incorrect link status type detection when the application is running under WinXP;
  • fixed user-interface issues when the application is running under WinXP;
  • enhanced URL handling: URL-encoded sequences in URLs are now converted to uppercase;
  • enhanced user preferences save/load;
  • changes in used third-party components list;
  • added new third-party.txt (info about used third-party components).
  • fixed bug in relative links handling;
  • fixed bug in invalid links handling;
  • fixed several bugs in robots.txt handling;
  • fixed bug with horizontal scrollbar disappearing after URL column resizing;
  • improved handling of start URL redirect to self with/without www.
  • fixed detection of "endless" redirects;
  • fixed a bug in <base> tag handling;
  • added correct handling for HTTP server responses with error 407 (ProxyAuthRequired);
  • fixed highlighting of links on pages whose <meta> robots tag has a "noindex" value.
  • improved handling of start-url redirects;
  • improved detection of endless redirects;
  • improved autocomplete on URL field;
  • improved handling of relative links;
  • registration and activation are now required.
  • added visual indication of scan process;
  • improved robots.txt handling: links disallowed in robots.txt can now be excluded from the scan;
  • fixed a bug in the redirects handling routine that caused the scan process to cycle;
  • fixed a bug that caused the scan process to cycle when trying to scan a non-working website;
  • fixed a bug that caused robots.txt to be ignored when scanning via Google;
  • improved GET-parameters handling;
  • improved base-URLs handling;
  • corrected mistakes in the English localization.
  • fixed a bug in redirects handling;
  • fixed a bug that caused robots.txt to work improperly;
  • improved general links handling;
  • improved page encoding detection;
  • improved internal links detection.
  • fixed auto-update.
  • fixed bug in activation.

Did you like Netpeak Spider? Share the love via social networks:

Netpeak Soft Community

Found any bugs or errors in our programs, or just want to contact us? Write to us!


Netpeak Spider is used by many SEO specialists, and it became even more popular when we launched a new version with an improved interface. Here are some reviews:

Our features

  • We focus on SEO and PPC. We set a global standard with our ability to get consistent results from these two powerful methods of online marketing
  • We offer unique products on the market – [SEO 2.0] and [PPC 2.0]
  • We are dedicated to showing a positive return on investment using SEO and PPC
  • We are one of the best online marketing outsourcing agencies in Eastern Europe
  • More than 1000 businesses have entrusted us with their projects
  • Netpeak strives to be a worldwide agency leader
  • The Netpeak Client Dashboard helps businesses make decisions based on an exact set of key performance indicators

We make our clients' businesses more profitable

New posts on Netpeak Blog


If you need unique high-quality images and don’t have the budget for a professional designer, you have no choice but to create them yourself. This sounds scary to most content marketers, copywriters, and business owners. But don’t be afraid: we will cover everything you need to know about creating custom images without spending much time or money.


My great delusion: the display network (DN) doesn’t sell. One day I decided to overcome this delusion and started experimenting.


In this article, we will share one of our most valuable tools for downloading and analyzing Search Console data. It will be useful for SEO specialists of all kinds.


As an advertiser, you want your ads to reach the target audience: no one is willing to pay for non-converting clicks. What do you have to do to ensure your ads are seen only by interested clients?