Netpeak Spider is designed to scan websites and list SEO page data using its own crawler. The program includes its own tool for calculating internal PageRank, both with and without sitewide links. All results can be exported to Excel.
Flexible scanning settings allow these features to be turned on or off:
Analysis of parameterized links, anchor links, images and JS files.
Using the robots.txt file while scanning.
Adding subdomains to spreadsheets.
List of page parameters analyzed by Netpeak Spider:
URL – page address.
Depth – click depth of the current page from the main page.
LinkCanonical – whether the rel="canonical" attribute is present and where it points.
Server response – HTTP status code (200 – OK, 301 – Moved Permanently, 302 – Moved Temporarily, 404 – Not Found, 503 – Service Unavailable, etc.).
Title – the title tag in the head container.
Description – the meta tag containing a short description of the page's content.
Keywords – the meta tag containing the page's keywords.
robots.txt – whether search engine indexation of the page is allowed in the robots.txt file.
MetaRobots – whether the "robots" meta tag exists, whether it allows search engine indexation (index or noindex), and whether robots may follow the page's links (follow or nofollow).
Redirects – number of redirects from the page.
Heading (H1) – number of H1 headings.
Links from this page – number of outgoing links from this page.
Links to this page – number of incoming links to this page from within the website.
Internal links – number of links from this page to other pages on the same website.
External links – number of links from this page to other sites.
PR – internal PageRank index of the current page.
PR (without cross-referencing) – internal PageRank index excluding sitewide links (links that appear on more than 50% of the site's pages).
Title Duplicates – pages with identical title tags.
Description Duplicates – pages with identical description tags.
Duplicates by Keywords – pages with identical keywords tags.
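The internal PageRank index listed above can be approximated by a standard iterative PageRank over the site's internal link graph. This is an illustrative sketch only, not Netpeak Spider's documented formula: the damping factor, iteration count, and the sample site graph are all assumptions, and dangling pages (no outgoing links) are not handled.

```python
def internal_pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to.
    Returns a dict of page -> PageRank score (scores sum to 1 when
    every page has at least one outgoing link)."""
    pages = set(links)
    for targets in links.values():
        pages.update(targets)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Start each page at the "random jump" baseline, then add the
        # shares passed along by pages that link to it.
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, targets in links.items():
            if targets:
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new_rank[t] += share
        rank = new_rank
    return rank

# Hypothetical three-page site: the home page receives the most links.
site = {
    "/": ["/about", "/blog"],
    "/about": ["/"],
    "/blog": ["/", "/about"],
}
ranks = internal_pagerank(site)
print(max(ranks, key=ranks.get))  # the home page "/" accumulates the most weight
```

Computing "PR (without cross-referencing)" would then amount to rebuilding `links` with the sitewide links filtered out before calling the same function.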
Current version of the program: 220.127.116.11
fixes and improvements to the endless-redirect detection algorithm;
correct handling of invalid encoding declarations in the <meta> tag;
no more application crashes when loading empty settings files;
fixed disappearing UI elements in the settings window;
fixes and improvements to page link handling;
fixed an issue that made it impossible to scan sites over HTTPS.
fixed a font rendering bug that occurred on some systems.
fixed a critical bug in page content type detection;
fixed a bug in robots.txt handling;
fixed URL field resizing when minimizing the application window;
fixed incorrect link status type detection when running under Windows XP;
fixed user-interface issues when running under Windows XP;
enhanced URL handling: URL-encoded sequences in URLs are now converted to uppercase;
enhanced saving and loading of user preferences;
changes to the list of third-party components used;
added third-party.txt (info about the third-party components used).
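The uppercase conversion mentioned in the URL-handling change matches RFC 3986, which treats percent-encodings as case-insensitive but names uppercase hex digits as the canonical form. A minimal sketch of such normalization (the function name is illustrative, not part of the program):

```python
import re

def normalize_percent_encoding(url):
    # Uppercase only the hex digits of %-escapes, leaving every other
    # character untouched, per RFC 3986's preferred canonical form.
    return re.sub(r'%[0-9a-fA-F]{2}', lambda m: m.group(0).upper(), url)

print(normalize_percent_encoding("/path%2fto%3fpage"))  # /path%2Fto%3Fpage
```

Normalizing case this way lets a crawler recognize `/a%2fb` and `/a%2Fb` as the same URL instead of counting it twice.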
fixed a bug in relative link handling;
fixed a bug in invalid link handling;
fixed a few bugs in robots.txt handling;
fixed the horizontal scrollbar disappearing after resizing the URL column;
improved handling of a start URL redirecting to itself with/without www.
fixed detection of "endless" redirects;
fixed a bug in <base> tag handling;
added correct handling of HTTP 407 (Proxy Authentication Required) server responses;
fixed highlighting of links whose <meta> robots tag has the "noindex" value.
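Endless-redirect detection, mentioned in several entries above, typically amounts to following the redirect chain while tracking already-visited URLs and an upper hop limit. A minimal sketch, assuming a caller-supplied `get_location` lookup (hypothetical; a real crawler would issue HTTP requests and read the Location header):

```python
def follow_redirects(start_url, get_location, max_hops=10):
    """Follow a redirect chain until it ends, loops, or hits the hop limit.

    get_location(url) returns the redirect target for url, or None when
    url is a final destination. Returns (last_url, is_endless).
    """
    seen = {start_url}
    url = start_url
    for _ in range(max_hops):
        target = get_location(url)
        if target is None:
            return url, False       # final destination reached
        if target in seen:
            return target, True     # cycle: endless redirect detected
        seen.add(target)
        url = target
    return url, True                # hop limit exceeded: treat as endless

# Hypothetical two-page loop: a -> b -> a
table = {"http://a/": "http://b/", "http://b/": "http://a/"}
print(follow_redirects("http://a/", table.get))  # ('http://a/', True)
```

The `seen` set catches the redirect-to-self cases (with or without www) from the earlier entries, while `max_hops` guards against chains that never revisit a URL.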
improved handling of start URL redirects;
improved detection of endless redirects;
improved autocomplete on URL field;
improved handling of relative links;
registration and activation are now required.
added visual indication of the scan process;
improved robots.txt handling: links disallowed in robots.txt can now be excluded from the scan;
fixed a bug in the redirect handling routine that caused the scan process to cycle;
fixed a bug that caused the scan process to cycle when trying to scan a non-working website;
fixed a bug that caused robots.txt to be ignored when using scan by Google;
improved GET parameter handling;
improved base URL handling;
corrected mistakes in the English localization.
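The robots.txt exclusion described above can be approximated with Python's standard-library parser. The rules and link list here are illustrative; a real crawler would download /robots.txt from the site rather than feed rules in directly.

```python
from urllib.robotparser import RobotFileParser

# Parse a tiny, hypothetical rule set offline via parse(), which accepts
# an iterable of robots.txt lines.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# Filter a crawl queue, keeping only links the rules allow.
links = ["/index.html", "/private/report.html", "/blog/post"]
allowed = [u for u in links if rp.can_fetch("*", u)]
print(allowed)  # ['/index.html', '/blog/post']
```

Dropping disallowed URLs before they enter the queue is what makes the "exclude from scan" option cheap: the crawler never spends a request on them.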
fixed a bug in redirect handling;
fixed a bug that caused robots.txt to work improperly;
improved general link handling;
improved page encoding detection;
improved internal link detection.
fixed a bug in activation.
Netpeak Spider is used by many SEO specialists, and it became even more popular when we launched a new version with an improved interface. Here are some reviews:
Andrey Kapeltsov (Govitall, #SeoCafe )
Netpeak Spider is a good alternative to Xenu, which is recommended by Google in its webmaster help. But unlike Xenu, Spider is actively supported and developed. I think that in the near future this program will become the best choice for identifying website optimization problems.
Sergei Koksharov (Devaka.ru)
Netpeak Spider is similar to Screaming Frog SEO Spider. The advantage is that it is free and, in addition to the standard functions, it can calculate the weight of documents based on internal links; the disadvantage is that it works only on Windows. I recommend downloading and using it!
Vitaly Kravtsov (Unmedia, Kokoc.com)
Convenient for finding duplicate meta tags, redirects, broken links, etc. A good alternative to Xenu, and it can partly replace Page Weight!
Valentin (Netpeak Blog reader)
Screaming Frog, Page Weight + their own features in one package and everything for free. Thank you!