
5 Tips to Avoid Duplicate Content and Indexing Issues


benestrell




Presentation Transcript


  1. 5 TIPS TO AVOID DUPLICATE CONTENT AND INDEXING ISSUES

  2. It is relatively common for e-commerce sites to develop URL structures that create crawling and indexing issues. This is undesirable for a business, as it can cause a plethora of duplicate content and crawl budget complications. Thankfully, we are here to share with you five tips to avoid duplicate content and indexing issues.

  3. 1. Check how many of your pages Google has indexed. Use a “site:example.com” search on Google and note the number of results Google is aware of. The site: operator count should roughly match the page counts in your content management system, sitemap, and server files.
  - If Google returns too few results, determine which pages from your sitemap do not appear in your Google Analytics organic search traffic.
  - If Google returns too many results, run a site crawl and trace the pages with duplicate titles, since these commonly have duplicate content. Then identify what is causing the duplicates and remove them.
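The duplicate-title check above can be sketched as a short script. This is a minimal illustration assuming you already have a URL-to-title mapping from a site crawl; the `page_titles` data and URLs are made up for the example.

```python
from collections import defaultdict

def find_duplicate_titles(page_titles):
    """Group crawled URLs by <title> and keep only titles shared by 2+ pages."""
    by_title = defaultdict(list)
    for url, title in page_titles.items():
        by_title[title.strip().lower()].append(url)
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

# Illustrative crawl output: a sort parameter producing a duplicate page.
page_titles = {
    "https://example.com/shoes":            "Shoes | Example Store",
    "https://example.com/shoes?sort=price": "Shoes | Example Store",
    "https://example.com/bags":             "Bags | Example Store",
}

print(find_duplicate_titles(page_titles))
```

Pages sharing a title are strong duplicate-content candidates, so the script surfaces the two /shoes URLs while leaving /bags alone.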

  4. 2. Optimize sitemaps, robots.txt, and navigation links. The sitemap may be the only way Google can discover pages with no inbound or even internal links, so it is crucial that robots.txt declare the sitemap's location. Additionally, every page you hope to get indexed should be reachable from at least one link on your site, which requires a logical navigation link structure. 3. Identify URL parameters. Any URL parameters that do not substantially change the content should be handled with a noindex directive or canonicalisation.
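For tip 2, declaring the sitemap in robots.txt can look like the sketch below; the domain, paths, and sitemap filename are assumptions you would replace with your own.

```text
# robots.txt at https://example.com/robots.txt
User-agent: *
Disallow: /cart/
Disallow: /account/

# Declare the sitemap so crawlers can find pages with no inbound links
Sitemap: https://example.com/sitemap.xml
```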
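For tip 3, one way to normalise parameterised URLs is to strip the parameters that do not change the content before emitting a canonical link. A minimal Python sketch; the `CONTENT_PARAMS` allow-list is an assumption about which parameters matter on your particular site.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameters that actually change the content (assumption: site-specific).
CONTENT_PARAMS = {"category", "page"}

def canonical_url(url):
    """Drop sort/tracking parameters so duplicate URLs share one canonical URL."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k in CONTENT_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

print(canonical_url("https://example.com/shoes?sort=price&category=running"))
# -> https://example.com/shoes?category=running
```

The returned URL is what you would place in the page's rel="canonical" link, so every sort or tracking variant points back to one indexable version.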

  5. 4. Determine whether your filters are good or bad.
  - Good filters help specify a product and produce unique, substantive pages.
  - Bad filters only reorganise content without actually changing it. Filters of this kind, such as those that sort by price or popularity, should be handled with AJAX, noindex directives, or canonicalisation.
  5. Use noindex and canonicalisation properly.
  - Canonicalise URLs with parameters to their standard versions. Paginated content should also point to a consolidated “view all” page.
  - Use noindex on membership areas and staff login pages, shopping carts and thank-you pages, narrow product categories, and duplicate pages that cannot be canonicalised.
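The two mechanisms in tip 5 are implemented as head tags. A sketch of both; the URLs are placeholders for your own pages.

```html
<!-- On a duplicate or filtered URL, e.g. /shoes?sort=price,
     point search engines at the standard version -->
<link rel="canonical" href="https://example.com/shoes">

<!-- On pages that should never be indexed
     (cart, thank-you, login, narrow categories) -->
<meta name="robots" content="noindex, follow">
```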

  6. 5 Tips to Avoid Duplicate Content and Indexing Issues. Source: https://digitalmarketingauthority.blogspot.com/2018/09/5-tips-to-avoid-duplicate-content-and.html
