Michael Rosa / Sr SEO Strategist, Rio SEO
The digital world never sleeps. In today’s wireless society, users have become accustomed to an ongoing dialogue across multiple channels 24 hours a day, 7 days a week. As online marketers, the task of tracking our customers’ success, both in the search engine rankings and now across numerous social platforms, falls into our hands. Tracking success in social spheres like Facebook, Twitter, and Pinterest has been a challenge for the digital marketing community. As search marketers, we know that an increase in sharing, social signals, and links to our customers’ content is great for SEO. Search engines reward pages that are shared, liked, and tweeted. If you aren’t integrating social and sharing into your SEO strategy, you’re missing a huge opportunity. However, when handled incorrectly, the tactics for tracking client content can create common SEO errors like duplicate content.
When it comes to analytics tracking, one of the most common methods is simply adding a parameter to the URL. Parameters like session IDs or affiliate IDs do not cause any changes to the page content. Other parameters do change the display of a page, such as sortorder, which changes the order in which products appear, or lang, which adjusts the language presented to users.
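To illustrate the difference (these are hypothetical example URLs, not from any real campaign), a tracking parameter produces a new URL for the same content, while a display parameter actually changes what the visitor sees:

```text
# Tracking parameters - same page content, new URL:
http://www.example.com/widgets
http://www.example.com/widgets?sessionid=12345
http://www.example.com/widgets?affiliateid=partner42

# Display parameters - page content or presentation changes:
http://www.example.com/widgets?sortorder=price
http://www.example.com/widgets?lang=fr
```

Both kinds multiply the number of URLs a crawler can find for what marketers think of as "one page."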
Additional tracking parameters for Rio SEO Tag and Trace™ technology can measure social activity with #FBID parameters, which attach a unique identifier to the URL. This enables you to watch how your content moves throughout the Web.
The potential for danger exists when working with URL parameters. Each time a parameter is added to a URL, a new page of duplicate content is created. When the search engines find multiple pages with different parameters, they decide for themselves which page will show in the results, regardless of the marketer’s well-crafted plan. The challenge, then, is to work with the necessary parameters without creating a deep network of duplicate pages.
Fortunately for online marketing teams, there are solutions for working with URL parameters while confining duplicate content issues to the pages outlined in your campaign. Let’s take a look at each option individually.
Google’s URL Parameter Tool- This tool, located in Webmaster Tools under the Crawl section, allows marketers to inform Google about parameters added to their URLs so Google understands the purpose of the new URLs and reduces the appearance of unwanted URLs in the SERPs. Use extreme caution when implementing this approach, as it is very easy to configure the wrong site-wide parameters, which can cause indexation issues and loss of rankings.
<a href="http://www.robotstxt.org/robotstxt.html">Robots TXT Files</a>- (User-agent: * Disallow: /)
For a more aggressive tactic, you can use the robots.txt file to block pages with your parameters from being crawled. By disallowing the bots from crawling parameterized pages you can exercise more control, but use this tactic with great care: a single keystroke can disallow the wrong files and create a whole new set of issues, like well-ranked pages falling out of the index.
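As a sketch, a robots.txt rule targeting the FBID tracking parameter mentioned above might look like the following. Note that wildcard support varies by crawler (Googlebot and Bingbot honor the * wildcard, but it is not part of the original robots.txt convention), and the exact pattern depends on how the parameter appears in your URLs:

```text
User-agent: *
Disallow: /*?*FBID=
```

This blocks crawling of any URL whose query string contains FBID=, while leaving the clean, parameter-free URLs crawlable. One mistyped pattern here can block legitimate pages site-wide, which is why this tactic demands careful testing.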
Canonical Tags- This tag enables the site owner to specify the preferred URL for a given page. It tells the search engine crawler that any URL pointing to this page should be represented by the specified canonical URL when returned in natural search results. The Canonical Link Tag was introduced and adopted as a standard by all of the major search engines to address parameter issues, because major sites like Amazon.com, eBay.com, and others have millions of duplicate URLs that would be affected.
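A minimal example, using a hypothetical product page: every parameterized variation carries a canonical tag in its &lt;head&gt; pointing back to the clean URL, so the search engines consolidate ranking signals onto one page while your tracking parameters keep working:

```html
<!-- In the <head> of http://www.example.com/widgets?FBID=abc123 -->
<link rel="canonical" href="http://www.example.com/widgets" />
```

The tracking parameter still reaches your analytics, but only the canonical URL appears in the search results.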
<a href="http://www.robotstxt.org/meta.html">Robots Meta Tag</a>- (<META NAME="ROBOTS" CONTENT="NOINDEX, NOFOLLOW">)
The Robots Meta tag allows you to tell the bots whether a specific page should be included in the index, and whether they should follow the links on the page. This can be useful, but very time consuming, as the tag has to be added on a page-by-page basis in most cases, and, as with all strategies for duplicate content, it must be handled with great care. Placing the wrong Robots Meta tag on a page can interfere with both your site’s indexation and how PageRank flows through your site.
When dealing with URL parameters, it is easy to see how important careful management is to the success of tracking your marketing campaigns. Generally speaking, the Canonical Tag is the best option. This is how Rio SEO Social Analyze™ is able to monitor social traffic and the movement of your data with no effect on your overall SEO campaigns. This strategy allows for deep analysis of both organic traffic and social media patterns for your published content.
Posted October 8, 2013