Lastly, Cumulative Layout Shift (CLS) measures visual stability; strive for a CLS score below 0.1. To improve it, give images and embedded videos explicit width and height attributes, and avoid inserting new content above existing content unless it’s in response to a user action (a quick script to catch unsized media is sketched below). You can also use Google Analytics to view traffic from all search engines, not just Google. For one client, I used user engagement metrics from Google Search Console, along with heatmaps, to recover from a Google core update hit. While most people optimize content and backlinks, I dug deeper into how users interacted with the site after landing on it.
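Here’s that sketch: a minimal Python check for img tags missing explicit dimensions. It assumes the third-party requests and beautifulsoup4 packages, and the URL is a placeholder, so treat it as a starting point rather than a complete audit.

```python
import requests
from bs4 import BeautifulSoup

def find_unsized_images(url: str) -> list[str]:
    """Return src values of <img> tags missing an explicit width or height."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    flagged = []
    for img in soup.find_all("img"):
        # Images without reserved dimensions can shift the layout as they load.
        if not img.get("width") or not img.get("height"):
            flagged.append(img.get("src", "(no src)"))
    return flagged

if __name__ == "__main__":
    # example.com is a placeholder; point this at your own pages.
    for src in find_unsized_images("https://example.com/"):
        print("Missing dimensions:", src)
```

Reserving space up front, whether with width/height attributes or CSS aspect-ratio, lets the browser lay out the page before the media finishes loading.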

When you create content, make sure your pages contain original material that is informative, relevant, and on-topic, with the keywords you’re targeting. If you want your site to perform well, you must also nix your low-value pages. To find them, crawl your site with a tool like Screaming Frog; this helps you spot errors and fix them to improve your site’s performance (a bare-bones version of that crawl is sketched after this paragraph). And if dozens of competitors already rank for a certain keyword, rethink your targeting strategy and focus on keywords that will deliver higher-value results.
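Screaming Frog handles this at scale, but to illustrate the idea, here is a rough Python sketch of the same kind of crawl: breadth-first over internal links, logging any non-200 responses. The requests and beautifulsoup4 packages and the 50-page cap are assumptions for the example.

```python
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def crawl_for_errors(start_url: str, max_pages: int = 50) -> dict[str, int]:
    """Breadth-first crawl of one host; return URLs with non-200 status codes."""
    host = urlparse(start_url).netloc
    queue, seen, errors = deque([start_url]), {start_url}, {}
    while queue and len(seen) <= max_pages:
        url = queue.popleft()
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException:
            errors[url] = 0  # unreachable / connection error
            continue
        if resp.status_code != 200:
            errors[url] = resp.status_code
            continue
        # Queue internal links we haven't seen yet, ignoring fragments.
        for a in BeautifulSoup(resp.text, "html.parser").find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            if urlparse(link).netloc == host and link not in seen:
                seen.add(link)
                queue.append(link)
    return errors
```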

Step 3: Evaluate Your Site’s Technical Health

Before diving into the technical audit, there are a number of manual checks I recommend performing on any site. In Search Console’s Crawl Stats report, the most important things to look for are spikes or gaps in crawl requests, download size, and response times. Follow the guidelines from the Coalition for Better Ads to make sure your ads are user-friendly and you’ll be golden. I recommend testing your site’s speed in both Google’s PageSpeed Insights tool and GTmetrix. And if you see any redirect errors, check out our guide to HTTP-to-HTTPS migration for details on how to configure your redirects.
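As a quick illustration of the redirect check, here is a small Python sketch using the requests package. It verifies that the plain-HTTP version of a domain returns a single 301 to HTTPS; example.com is a placeholder, and a real migration audit would check many URLs, not just the homepage.

```python
import requests

def check_https_redirect(domain: str) -> None:
    """Verify that the plain-HTTP homepage 301-redirects straight to HTTPS."""
    resp = requests.get(f"http://{domain}/", allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    if resp.status_code == 301 and location.startswith("https://"):
        print(f"OK: http://{domain}/ -> {location}")
    else:
        print(f"Check redirects: got {resp.status_code}, Location={location!r}")

check_https_redirect("example.com")  # placeholder domain
```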

To audit your site for JavaScript issues, enable JavaScript execution in the expert options when you create a new project (or rebuild an existing one). And though Google has no formal penalty for duplicate content, duplicate pages can still hurt your rankings and reputation. For instance, Google’s algorithm may index the wrong version of a page and serve it in the SERPs, or the whole website may be poorly indexed and treated as spammy because of the sheer number of duplicates.
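If you want to script a rough duplicate check yourself, one simple approach (not the only one, and it only catches exact duplicates) is to hash the visible text of each URL and group matches. This sketch assumes requests and beautifulsoup4 and normalizes whitespace before hashing.

```python
import hashlib
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

def find_duplicate_pages(urls: list[str]) -> dict[str, list[str]]:
    """Group URLs whose visible body text hashes to the same value."""
    groups: dict[str, list[str]] = defaultdict(list)
    for url in urls:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        # Normalize whitespace so trivial formatting differences don't hide duplicates.
        text = " ".join(soup.get_text().split())
        digest = hashlib.sha256(text.encode("utf-8")).hexdigest()
        groups[digest].append(url)
    # Keep only hashes shared by more than one URL.
    return {h: u for h, u in groups.items() if len(u) > 1}
```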

The Raven study found that 22% of page titles and 17% of meta descriptions were duplicates across the sites crawled. Fixing these can improve your click-through rates and avoid confusing Google. A small script like the one below can surface these issues in bulk.

  • Missing or empty titles/descriptions – these need to be written. Each page should have a unique title (ideally 50-60 characters) and meta description (~155 characters) that includes key terms and entices clicks. It’s been reported that 33-34% of pages have missing meta descriptions, which is a huge missed opportunity – don’t let your site be one of them.
  • In Google Search Console’s Coverage report, compare the valid, error, and excluded pages.
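Here is a minimal Python sketch of that bulk check, again assuming requests and beautifulsoup4; the 50-60 and ~155 character thresholds simply mirror the guidelines above.

```python
from collections import Counter

import requests
from bs4 import BeautifulSoup

def audit_titles_and_descriptions(urls: list[str]) -> None:
    """Flag missing, off-length, or duplicate titles and meta descriptions."""
    titles: Counter[str] = Counter()
    for url in urls:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
        title = soup.title.string.strip() if soup.title and soup.title.string else ""
        meta = soup.find("meta", attrs={"name": "description"})
        desc = (meta.get("content") or "").strip() if meta else ""
        if not title:
            print(f"{url}: missing <title>")
        elif not 50 <= len(title) <= 60:
            print(f"{url}: title is {len(title)} chars (aim for 50-60)")
        if not desc:
            print(f"{url}: missing meta description")
        elif len(desc) > 155:
            print(f"{url}: description is {len(desc)} chars (aim for ~155)")
        titles[title] += 1
    for title, count in titles.items():
        if title and count > 1:
            print(f"Duplicate title on {count} pages: {title!r}")
```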

  • Search Engine Optimization (SEO) is a digital marketing practice that improves your website’s visibility in search engine results.
  • Google’s Search Quality Rater Guidelines is a 180-page document used by Google’s real human quality raters to evaluate content.
  • It is important to make sure your site is safe for both users and search engines.
  • They will have experience making technical improvements in your website’s programming languages and CMS.

Disavowing the harmful links and reaching out to webmasters to remove them was tedious but effective. We kept an eye on newly discovered spammy backlinks for a few months while building high-quality backlinks through outreach to reputable sites in the client’s niche. The real game-changer, though, was creating genuinely helpful content that matched search intent and naturally attracted high-quality links. This not only improved time-on-page and bounce rates but also sent stronger engagement signals to Google, and the approach led to a complete recovery of the site’s performance.
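On the disavow step: the file you upload through Google’s disavow tool is plain text, one URL or domain: rule per line, with # for comments. A minimal example with placeholder domains:

```text
# Spammy domains found in the backlink audit (placeholders)
domain:link-farm.example
domain:spammy-directory.example

# A single bad URL
http://bad-blog.example/paid-links-page/
```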