


We believe that every activity marketers do can be optimized, and we’ve got a favorite process: research, analyze, test, improve, and repeat – that’s the true formula for optimization. Using this methodology, you can conduct an onsite audit for search engine optimization. Begin with research, gathering information to examine whether technical SEO best practices have been followed. SEO goes beyond technical best practices, of course; the first consideration is always the quality of the content you have on the website, and how you manage and optimize it. But technical factors do matter, and improving them can give your pages a boost.

This is the first post in the “How to Perform an Onsite Audit for SEO” series. In the coming months, we’ll cover the other four essentials: analyze, test, improve, and repeat.

The first step of any thorough SEO audit is an exercise in research with a massive data pull. If you’re lucky enough to have access to a variety of tools, I recommend gathering as much information as you possibly can before starting on your analysis. You may have slight variations based on what you’re able to gain access to, but there are really five key areas you’re looking to gather data for.

Before you start gathering data, it’s essential to understand which date ranges to pull and how that choice impacts your figures. Measuring and comparing performance by periods of time requires a bit of thought about how your business is set up. Consider your business cycles, for example, and choose date ranges that take peak traffic times and traffic lows into account. The last thing you want is to examine data that shows only a small slice of the true picture; essential observations can be missed if the wrong time period is selected. For example, if you make 85% of your sales in Q4, you should look at the entire year. If you see 95% of your traffic from Monday through Friday, you should still look at the full seven-day week (or the month, or the quarter). What makes the most sense for your site? Once you answer that question, you’re ready to proceed to the next steps of data gathering.
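If your analytics tool can export daily totals, a short script makes that date-range check concrete. The sketch below is only an illustration: it assumes a hypothetical CSV export named daily_sessions.csv with date and sessions columns (rename these to match whatever your tool produces) and uses pandas to show how traffic is distributed across weekdays and quarters, so you can pick a range that captures the whole cycle.

```python
# Quick sanity check on how traffic is distributed across weekdays and
# quarters, so the audit date range captures the full picture.
# Assumes a hypothetical export "daily_sessions.csv" with columns
# "date" and "sessions"; adjust the names to match your analytics tool.
import pandas as pd

df = pd.read_csv("daily_sessions.csv", parse_dates=["date"])

# Share of sessions by day of week -- if ~95% lands Monday through Friday,
# report on full weeks (or months/quarters) rather than weekdays only.
by_weekday = df.groupby(df["date"].dt.day_name())["sessions"].sum()
print((by_weekday / by_weekday.sum()).round(3))

# Share of sessions by quarter -- if one quarter dominates (say, Q4),
# audit against the entire year so slow quarters aren't hidden.
by_quarter = df.groupby(df["date"].dt.quarter)["sessions"].sum()
print((by_quarter / by_quarter.sum()).round(3))
```

If one quarter or the weekday block dominates, widen the reporting window accordingly.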

If you use more than one service (for example, Google Analytics and another analytics package), the traffic and behavior reports you get may differ slightly from one another. This is normal and not a cause for concern. Pull reports that showcase how visitors are getting to your site, and hone in on specific reports showing how search traffic performs. Take some time to sift through your analytics for specific data related to keywords, search traffic, where visitors go, how they interact with your site, and which visitors convert into leads. Reports such as the Acquisition > Search Engine Optimization report will be a great start to your analysis.

Google Webmaster Tools and Bing Webmaster Tools provide data straight from the search engines themselves to tell you how your site is performing. When given the opportunity to learn more about your site straight from the proverbial horse’s mouth, you should take advantage! While the tools may be lacking in insights and user function, they make up for it with the sheer amount of data provided. Sign up for both, and once they’re set up and a few weeks have gone by, you’ll have enough data to start on a thorough audit. There are a few key areas to examine; gather any available data for your analysis, and review any blocked URLs or disavowed links.
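The blocked-URL piece of that review is easy to spot-check yourself. Here is a minimal sketch using Python’s standard-library urllib.robotparser; the domain and the list of important URLs are placeholders you would swap for your own. It reads the site’s robots.txt and reports whether a generic crawler may fetch each page.

```python
# Spot-check blocked URLs: read the site's robots.txt and test whether a
# handful of important pages are crawlable. The domain and the URL list
# below are placeholders -- swap in your own.
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"
important_urls = [
    f"{SITE}/",
    f"{SITE}/products/",
    f"{SITE}/blog/",
]

parser = RobotFileParser()
parser.set_url(f"{SITE}/robots.txt")
parser.read()

for url in important_urls:
    allowed = parser.can_fetch("*", url)
    print(f"{'OK     ' if allowed else 'BLOCKED'} {url}")
```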

Have you ever conducted a crawl of your site? Screaming Frog is the ultimate when it comes to data provided through crawls: the program sends a bot to crawl every page on your site, regardless of whether the page shows up in search or not. The data that comes back helps you see duplicate metadata, pages that are live that shouldn’t be, redirects set up incorrectly, and much, much more.
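To make that concrete, here is a toy crawl in the same spirit – not a stand-in for Screaming Frog, just a sketch of the kind of data a crawl collects. It assumes the third-party requests and beautifulsoup4 packages plus a placeholder start URL, and it records status codes, redirect targets, titles, and meta descriptions for same-domain pages, flagging duplicate titles at the end. A production crawler would also respect robots.txt, throttle itself, and handle many more edge cases.

```python
# Toy version of what a crawler collects: status codes, redirect targets,
# titles, and meta descriptions for pages on one domain.
# Assumes "requests" and "beautifulsoup4" are installed; START is a placeholder.
from collections import Counter
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START = "https://www.example.com/"
MAX_PAGES = 50

domain = urlparse(START).netloc
to_visit, seen, rows = [START], set(), []

while to_visit and len(seen) < MAX_PAGES:
    url = to_visit.pop()
    if url in seen:
        continue
    seen.add(url)

    resp = requests.get(url, timeout=10, allow_redirects=False)
    redirect_to = resp.headers.get("Location", "")
    title = description = ""

    if resp.status_code == 200 and "text/html" in resp.headers.get("Content-Type", ""):
        soup = BeautifulSoup(resp.text, "html.parser")
        if soup.title and soup.title.string:
            title = soup.title.string.strip()
        meta = soup.find("meta", attrs={"name": "description"})
        if meta and meta.get("content"):
            description = meta["content"].strip()
        # Queue same-domain links, regardless of whether the page ranks in search.
        for link in soup.find_all("a", href=True):
            target = urljoin(url, link["href"]).split("#")[0]
            if urlparse(target).netloc == domain:
                to_visit.append(target)

    rows.append((url, resp.status_code, redirect_to, title, description))

# Flag duplicate titles -- a common find in crawl data.
title_counts = Counter(r[3] for r in rows if r[3])
for url, status, redirect_to, title, _ in rows:
    dup = " [duplicate title]" if title and title_counts[title] > 1 else ""
    print(f"{status} {url} -> {redirect_to or '-'}{dup}")
```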
