The SISTRIX Toolbox lets you analyse a URL exactly as you would a domain, discovering its visibility and rankings on Google for both mobile and desktop devices. In this post we’ll show you the different elements of this section, including the indexed pages of a domain.
In the white bar, located between the Toolbox navigation menu and the filters, you’ll find general settings for the whole page, in particular:
- Date: if you don’t choose a date, the Toolbox will show the data for the current week. Thanks to this option you’ll be able to go back in time and find out how the keywords and the rankings of the website developed over the years.
- Filter: The “Expert filter” allows you to create complex filter combinations, which you can also save and load.
- Data source: the Toolbox offers an extended database for mobile SERPs, which is why this is the default option for the table. Here you can also change the data source, choosing desktop results, or see the indexed pages of the domain.
- Export: export the table as a CSV file. To do this you’ll need to use some credits.
- Shortlink: share the page with other Toolbox users. You’ll get a personalised shortlink, active for a few days, that you can share without any limitations.
Finally, the table’s cogwheel icon lets you export the data or add it to a dashboard or a report. Here you’ll also find the “Select columns” function, which allows you to add further columns to the table.
Number of URLs
Here you will see the number of URLs for this domain which rank for at least one keyword in Google’s Top-100 results.
The arrow will show you if there has been an increase or decrease in overall ranking URLs since last week.
Keywords per URL
Here we will show you the average number of keywords within the Top-10 and Top-100 for which the individual URLs are ranking.
HTTPS vs. HTTP

Here you can see how many of the ranking URLs use the HTTPS (SSL) protocol and how many still only use HTTP.
URL Length, Directories and Parameters

This box shows you the average length, architectural depth (directories) and number of parameters across all ranking URLs. In most cases there is no reason to have more than 3 directory levels within your website architecture, and you should try to avoid parameters altogether.
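The three values in this box can all be derived from the URL string itself. The sketch below shows one plausible way to compute them with Python’s standard library; how exactly the Toolbox counts directories is an assumption here, not a documented detail.

```python
from urllib.parse import urlparse, parse_qs

def url_metrics(url):
    # Illustrative sketch of the three values shown in this box; the
    # exact counting rules used by the Toolbox are an assumption.
    parsed = urlparse(url)
    segments = [s for s in parsed.path.split("/") if s]
    # Treat the last path segment as the page itself, not a directory
    depth = max(len(segments) - 1, 0)
    parameters = len(parse_qs(parsed.query))
    return {"length": len(url), "depth": depth, "parameters": parameters}

print(url_metrics("https://www.example.com/blog/2024/post-title?utm_source=x&ref=y"))
```

For the example URL this reports a depth of 2 (the directories `blog` and `2024`) and 2 parameters.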
Table of Ranking URLs
This table holds the different URLs for which the domain (or the host, directory or URL) you are evaluating has rankings within the Top-10 and Top-100 results on Google. Here you’ll also find the best keyword for the respective URL. We consider the best keyword to be the keyword that generates the most organic clicks for this URL.
In the column “Share of Visibility” we show you how each URL contributes to the total visibility of the domain. The value is based on an evaluation of all ranking keywords in the extended database. By using the extended database we can provide more information for domains that have a small Visibility Index.
By default, the table is sorted by the number of Top-100 rankings. You can change this by clicking on the heading of the column you want to sort by.
The filters allow you to refine your list of URLs further and discover features that could be optimised.
Some filters are immediately available on the interface (Quick Filters), while others can be selected by clicking on the green button “Filter now“.
Here’s a quick explanation of every available filter for the URLs:
- URL: finds the URLs that contain (or don’t contain) a given text. For example, the quick filter “Contains underscores” lets you quickly find all URLs with this feature, which is particularly important because Google recommends using hyphens instead of underscores in URLs.
- Number of parameters: filters the URLs by a specific number of parameters. Note that URLs with more than 3 parameters should be avoided, because they are especially prone to duplicate content problems: that’s why we created the quick filter “At least 3 parameters“.
- Number of directories: filters the URLs by a specific number of directories. Thanks to the quick filter “At least 3 directories” you can find those URLs that sit deep in the website architecture, which risk being crawled less often by Google (or not at all) than URLs higher up in the structure.
- Top-10 ratio: indicates how many of the Top-100 keywords also rank on Google’s first result page. The quick filter “Few top rankings“, for example, shows all URLs that rank on the first organic page with fewer than 3% of their Top-100 keywords: you might want to check whether these URLs belong in Google’s index.
- Containing Session IDs: filters the URLs that contain (or don’t contain) Session IDs. Note that Session IDs can easily cause duplicate content problems, so it could be a good idea to check why your system serves Session IDs to users and to Googlebot.
- Containing uppercase characters: shows only the URLs which contain uppercase characters. It is always better to avoid uppercase characters in URLs, as upper- and lower-case versions are technically different URLs, which can get you into trouble when setting up 301 redirects.
- SSL encryption: shows the URLs that use (or don’t use) the HTTPS protocol. This is particularly useful for finding URLs that are still not secure (quick filter: “Without SSL“).
- AMP: filters all the mobile URLs that are AMP pages.
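Most of these filters boil down to simple checks on the URL string or on the ranking counts. The sketch below re-creates a few of them as hypothetical Python functions; the names and thresholds follow this article, not SISTRIX internals.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical re-creations of a few quick filters described above.

def contains_underscores(url):
    return "_" in urlparse(url).path

def at_least_n_parameters(url, n=3):
    return len(parse_qs(urlparse(url).query)) >= n

def contains_uppercase(url):
    parsed = urlparse(url)
    # Hostnames are case-insensitive, so only path and query matter
    return any(c.isupper() for c in parsed.path + parsed.query)

def without_ssl(url):
    return urlparse(url).scheme != "https"

def few_top_rankings(top10_count, top100_count, threshold=0.03):
    # Top-10 ratio: share of Top-100 keywords that also rank in the Top-10
    return top100_count > 0 and top10_count / top100_count < threshold

url = "http://www.example.com/my_page?a=1&b=2&c=3"
print(contains_underscores(url), at_least_n_parameters(url), without_ssl(url))
print(few_top_rankings(top10_count=2, top100_count=150))
```

The example URL trips three filters at once: it contains an underscore, carries three parameters and is served without SSL.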
Indexed Pages History
Thanks to the option “Data Source” you’ll be able to reach the section concerning the indexed pages of the domain.
Here we will show you the number of pages that Google has indexed for the domain (or the host, directory or URL, for detailed evaluations) you are evaluating. These values are gathered by running a site:-query for the domain, following the model site:domainname.tld.
These values are, by Google’s own admission, very rough estimates which can vary. To catch large outliers, we will query this data multiple times per week and calculate the average.
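The averaging step can be illustrated with a small sketch; the sample readings below are made up to show how one implausible estimate gets dampened across a week.

```python
from statistics import mean

# Made-up weekly readings from repeated site:-queries; Google's
# estimates fluctuate, so a single reading alone is unreliable.
samples = [12400, 11900, 31000, 12700]  # 31000 is an obvious outlier

weekly_value = round(mean(samples))
print(weekly_value)  # the averaged figure is what ends up in the chart
```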
Should this value change significantly over time, it is advisable to figure out the reason. You could check whether Google is having problems indexing the page, or whether there is an excess of duplicate or substandard content in the index.
When you use the “Add to Watchlist” function for the domain, we will regularly query the indexed pages and social signals data. You can have up to 100 entries on your watchlist. You can also use the watchlist to tell our Toolbox to query this data for a specific host, a directory or even a URL.
Note that the cogwheel icon will give you more options for your analysis.
Amount of Indexed Pages
This table shows you the current value for the indexed pages as well as the largest (max) and lowest (min) values ever recorded.