Monday, January 28, 2013

Keys to SEO: Content & User Experience – Interview With Bing’s Duane Forrester

by Mark Jackson

For many years, Matt Cutts of Google has been providing a lot of helpful information on search engine optimization (SEO). In case you hadn’t noticed, his counterpart at Bing, Duane Forrester, has been pretty active himself.
If you haven’t already, you should check out Bing’s Webmaster Guidelines, the Webmaster Center Blog, and – of course – his Twitter profile. What’s really unique about Duane is that he comes to his gig at Bing from a background as an SEO practitioner.
[Photo: Duane Forrester at SES San Francisco]
Forrester carries the official title of Senior Product Manager – Webmaster Outreach at Bing and was kind enough to share with all of us his perspective on a few items.



In my initial outreach to Forrester, I tried to persuade him to share some hot tips for how to do well on Bing. Forrester was pretty persistent (insistent?) that we talk about what really matters: the high-level stuff – content, promotion, and user engagement with that content. That’s what matters in today’s SEO.
Here’s my discussion with Duane:
Mark Jackson: Does Bing put as much emphasis on freshness of content as competitor Google?
Duane Forrester: Absolutely – searchers demand fresh content. This helps explain why we believe deeply in partnering with leaders such as Facebook, Twitter, Quora, etc. These partnerships help us bring in relevant, timely and topical information, enhancing our SERP results and helping searchers complete their tasks faster.
Crawling is also a high priority for us and as many have seen over the past year, we’ve continually ramped up our crawl pace and depth. Discovering new content is important. Discovering it fast is paramount.
MJ: How does Bing plan to leverage content from strategic partnerships with Twitter and Facebook as it relates to SERP rankings, and how much does social content influence rankings?
DF: It’s important for businesses to think of this in a broad context. It’s not like there is a number in play here – social helps rank by a factor of X, for example.
It’s important that business owners understand social is a broad communication medium in use by their customers. Whether a business participates or not in the social conversation, it’s happening. Better to be involved and seen as a supportive, inclusive business, than to ignore it and seem aloof.
This perception of a business can impact whether people engage with the website for the business, and we see that engagement, or lack of engagement. That’s a signal we can understand that helps us assign values to businesses.
Basically, if people love you, we’ll want to show you. If they dislike you, we may still want to show you, but it may coincide with negative searches about your business, reinforcing the negative side of things.
MJ: Are you referring to social engagement with the site itself or to their owned/controlled social profiles in other channels? Specifically, can you speak to whether or not you’re looking at comments on blog posts hosted on the site? RTs via Twitter? Likes? Everyone’s trying to get a sense of how much of this could be happening.
DF: How can you tell who is the SEO in the room? They're the one with the hair split 22 ways. Don't overcomplicate this.
Social is about people. If you walk up to the water cooler and everyone is talking about the great service they got at the local garage, you're more likely to try the garage. You go there, have a good experience and tell other people, the cycle of social sharing repeating itself. Not so different online.
Businesses need to engage their visitors - across the usual social spots, in comments on their own site, in enthusiast communities, etc. Ignoring these spaces can be seen by potential customers as a negative for your business, leading them to shop elsewhere.
I'll give you a personal example. I was shopping today for a new home stereo. I'm an audiophile, so I'm a bit picky. A local company came highly recommended, so I went by their shop. I walked into a warehouse, met the guys building the units, the owners of the company, got a hands on demo and watched them quality check my actual unit before I left.
Partway through the transaction, one of the senior guys tells me they don't sell from the shop – that I should have bought online. Luckily he completed the transaction; had he forced me to leave empty-handed and pay shipping for the box to cover 3 miles to my house, I'd have opted to spend my money elsewhere. Instead, I went home, set it up and immediately went online to tell folks in an online forum how great the company and product are.
It’s this positive reaction a business should seek out and make happen. It’s that positive experience I had that prompted me to share and sing their praises. They will succeed because they give great service and sell an excellent product. Not because they got more Likes or RTs.
With social, we watch everything – Facebook, Twitter, Quora, LinkedIn, Google+ and so on. It all helps us understand if, when we slot you in at the top of the rankings, will you bring searchers an excellent experience? That's what we need to meet – that bar for WOWing searchers is high.
You make excellent content, couple that with an excellent UX, social lights up favoring you and we take note of it all. We say, "I want me some of that action!" and rank you better to please searchers.
MJ: Can you speak to Bing’s attention to time on site vs. returning to the SERP? Many talk about this as a Google indicator of content quality.
DF: This is referred to as dwell time. The amount of time depends on the individual and the content they see, but how we use it as a signal also varies.
For example, say you’re looking for today’s temperature. You do a search for “98033 weather”. Assuming you totally miss the temperature displayed in the search results, and click on the results you see, the amount of time it takes the human mind to see and process the number you’re looking for will be small. Thus, clicking back to the SERP in this instance quickly would be seen as normal.
Contrast that with a search for a review of a new product. You know you want an expert opinion, not a sales page. It’s a new product, so the major publications haven’t posted reviews yet and what you see are smaller sites, most simply selling the item – no reviews.
If your goal is to read a review, you’ll recognize the sales page and after a while start flipping through the SERP, clicking results, seeing a familiar pattern and clicking back to the SERP to try another result. It’s obvious (we see page size, text counts, etc.) to us you aren’t reading all the content on the page you just clicked on, so clearly that didn’t give you what you wanted.
This can help explain how fast movers – even unknowns – can gain an early ranking advantage over established brand names. In this example, small bloggers often have an edge in getting to publication faster.
In the long run, the brand names secure rankings through depth of content, trust in brand and user interaction (searchers clicking a SERP result and staying on their site because the site is trusted and answers the searchers question).
MJ: How does Bing combat duplicate content?
DF: We cannot get into the details of the process, but patterns are easy to spot, so we’re constantly scanning to understand if we already have the item or are aware of it. It’s important to keep in mind that not everything is worth indexing. Just because it’s published doesn’t mean anyone will find value in it.
MJ: Does quality of content equate to a large number of words for a page?
DF: Let me be clear about this – hell, no! Quality is quality.
If you bolted extra fenders onto a Mercedes, does it make the car a high quality product? No, it does not. Same thing happens in the world of search. More is not more, unless it’s more.
You ask me to explain how an airplane flies. I write an article explaining it. The wing moves through the air, the air on top of the wing moves faster creating a low pressure area, lifting the wing – and plane – into the air. In a nutshell, that’s how planes fly.
But to really do the topic justice, I need to not just write more words, but explain more related to the topic. Explain how the engine spins the propeller, which pulls the plane forward, moving the air over the wings. I need to explain how to change direction. I need to explain how temperature affects all these factors. So when someone comes to your site to learn “How does an airplane fly”, they get all the answers, not just some of them.
MJ: Google's Panda really hit a lot of site owners hard and some are in the mindset that they now need to be "content mills". Do you agree sites need to be content mills in order to compete in the SERPs?
DF: Absolutely not. Sites need to stay focused on the most important thing – and that’s not what the engines are doing. It’s what their visitors are doing and consuming.
Produce content that meets the consumer’s needs. Produce content that doesn’t just lead to more questions, but answers them as well. Build a user experience that’s so engaging it makes your visitors want to share it with friends.
Producing content just to publish something each week is not going to move the needle the way the business wants. The business wants traffic, page views, sales and revenue. Produce content that engages visitors and makes the visitor want to do business with you. If you WOW them, they will come.
MJ: What are some creative ways you've seen sites create fresh/unique content without churning out content for the sake of content?
DF: Some examples:
  • Run an internal contest to identify content rockstars inside your company. These are folks who know your products/services inside out and have a great storytelling voice.
  • Ask employees to tell a story about your products/services.
  • Winners get (fill in the blank as you like) and they write the blogs for you, build the videos, etc.
  • Build a community around their unique voices.
  • Seek others to fill gaps and attract other people.
  • Understand what “hooks” motivate people (ego hook, humor hook, contrary hook, detail hook, anger hook, etc.). Use these hooks wisely to motivate certain segments of users towards specific actions. Be careful with them – humor is great when everyone thinks it’s funny, but when it falls flat, well, it can be embarrassing.
  • Create videos – people love to consume reviews, news, funny stuff, etc. via video. Keep the videos short – 3 to 7 minutes or so. Create a familiar pattern with your videos: same location (or a variety of awesome locations); same flow or pattern of information coverage; same tone of voice; same presenter if desired, etc.
  • The point behind all this is to uncover new voices to amplify your messages and new ways for visitors to engage with your content.
MJ: Everything else being equal (links/social, etc.), just talking about on-site…If website A has 1,000 pages of (unique) content indexed and website B has 500, will website A “generally speaking” outrank/outperform website B?
DF: Not necessarily. It depends a great deal on how searchers interact with the site. Sure, one site has more pages. But are they useful? Are they being used by people? Do users share them? Do they reference them? You don't win just by building a bigger house.
MJ: Is there an “authority” to having a deeper website? Do you look at how many pages they may have around a semantically similar topic to determine rank for “all keywords within that vertical/category”?
DF: The authority comes from people saying you're an authority. We don't assign authority because you have N number of pages.
MJ: Is it absolutely necessary that a page, that you expect to have rank, have links directly from external sources to do well on Bing?
DF: Nope, but without any links, there's a signal that no one values it – so why should we rank it? New content suffers from this, so there are dependent factors when ranking, obviously. We can't just say "There are no links to this brand new item, so it should never rank well..."
And while we're talking expectations, it’s smart to keep in mind that there is no guarantee for crawling, indexing, and ranking. If the content looks like it'll be useful, the site has a history of providing useful content, etc., then we'll crawl, index, and rank.
MJ: What are the content requirements/measurements for article placement within Bing news?
DF: Here’s the process and some suggestions:
  • Send an email to bns@microsoft.com
  1. Provide an introduction, historical background, and credentials of the site.
  2. Credible ranking of the site in its field, if any.
  3. Name the locale (or audience scope) the site's stories cover. Provide the state/city names and ZIP codes, or describe the groups of users.
  4. Provide statistics on the site.
  5. Is the site mostly news related? Please explain.
  6. Provide the URL of the main news entry point as well as the entry points of major channels.
  7. RSS link to the site.
  8. Does the site follow the best practices outlined in the Bing Webmaster Guidelines?

Summary

When you take all this in, I think what Forrester is sharing is what many of us believe: SEO is leaving behind its history of being a bunch of “tactical executions” and becoming a higher-level strategic affair in which you must think about good marketing – creating meaningful content, promoting it properly, and driving visitors to a web presence with sound usability.
Both Google and Bing want to rank websites that are worthy of ranking and have shown a history of providing a quality user experience for the search queries they may rank for.
It's interesting to think about Forrester's example above in which a quality offline customer experience can tie into signals for measurement of “quality” for SEO. I could write an entire column about this topic alone, but get acquainted with what Google is doing with Google Trusted Stores and the signals that could come from a quality customer experience.
To me, it seems like both Google and Bing are working toward the “algorithm of the future” which weights many more factors into what is determined as “quality” in the SERPs.


The Year in Search & Social Marketing: Highs & Lows of 2012

by Lisa Raehsler
It’s time to take a trip down memory lane to look at the winning and losing moments in online and search marketing in 2012. Everyone loves the thrill of victory and the agony of defeat, so here are a few memorable highlights from search, social, and online tech.
Google – SEO

Losers
Technically, algorithmic updates are ultimately a winning proposition, but when SEO professionals experience several updates in short order, they can feel a bit like a slow kid playing dodgeball. A trilogy of rapid-fire algorithmic updates included the Penguin, Panda, and Exact Match Domain updates, plus more than 65 other Google updates from August and September 2012. Heads up!
Winners
Google this year ramped up unnatural link warnings to webmasters, resulting in a lot of freaked out SEOs. With no means to correct the situation, many felt Google was unfairly penalizing websites in the search results if they had questionable links aiming at them (some bought by less ethical SEO providers with the goal of gaming Google’s algorithm, some not).
This is why SEO professionals greeted Google's Distinguished Engineer Matt Cutts with cheers as he announced the release of the Disavow Links tool, which allows webmasters to ask Google not to take certain links into account when assessing their site. This capability was actually released first by Bing, though Bing’s disavow tool didn’t get nearly as much attention.

Google AdWords

Winners
Dozens of new features were released for AdWords in 2012, with most of them looked upon favorably.
One of the most impactful releases, in terms of creating opportunities for advertisers, was the Analytics integration with Google remarketing. Beginning in the middle of 2012, retargeting lists could be created in Google Analytics with precise granularity – not only based on visits, but also based on any combination of segments that could represent a user’s behavior on the site. The GA remarketing lists are then assigned to, and accessible from, your AdWords accounts. In addition, similar targeting improvements in the AdWords interface, which allow advertisers to define lists based on URLs instead of creating new codes, are advanced-level features that move the dial on any PPC account.
[Image: defining a remarketing list in Google Analytics]
Losers
Ad rotation settings in AdWords allow advertisers to optimize and test ads using ‘optimize for clicks’, ‘optimize for conversions’, or ‘rotate evenly’. In April, some PPC managers were outraged when Google announced that using the ‘rotate evenly’ setting indefinitely can inhibit performance and serve poor quality ads. Instead of rotating the ads indefinitely, the setting would automatically default to a period of 30 days, then ‘optimize to show the ads expected to generate the most clicks’. This gave Google, rather than the advertiser, more control.
With all of the strong feedback coming from the PPC community, Google extended the rotation period to 90 days and offered an opt-out. Finally, Google just completely caved and gave advertisers the ‘rotate indefinitely’ setting back, so freedom was restored and all was good in PPC land.
[Image: AdWords ad rotation options]
R.I.P. Google TV ads. In September, Google TV ads were discontinued and quietly slipped away, perhaps putting an end to the lingering question: “Will Google someday dominate traditional media?” There’s always next year.
And just in case you didn’t Knol, this service was closed down in April. Called a “Wikipedia killer” once or twice, Knol competed with Wikipedia but offered a distinct difference: Knol articles featured personal expertise by emphasizing authorship, more in the vein of Squidoo and HubPages. Since Knol pages included the personal opinions of the author, criticism arose over whether Google-owned/hosted products like this might be unfairly favored in the SERPs.

Yahoo

Losers
Yahoo had issues with management turnover (remember Scott Thompson’s resume fiasco?) and the company cut 2,000 jobs, about 14 percent of its workforce. This was all part of an aggressive effort to restructure the company and grow advertising revenues through targeting mobile and social opportunities.
Winners
Just when Yahoo seemed to have kicked itself to the curb, it got an emotional rebound by hiring ex-Googler Marissa Mayer as its new CEO, who brought optimism and excitement about a potential Yahoo fairy-tale comeback.
Interestingly, much of the news centered around Mayer’s pregnancy and the birth of her baby boy shortly after starting the job. She continued to work with virtually no maternity leave, which stoked a debate about whether such an example would help or hurt feminism and women in the workplace.
In addition, Mayer made it all look easy and was quoted as saying “The baby’s been easy,” thereby creating controversy. At least the conversation distracted the tech community from Yahoo product and technology innovation, which has been slow to emerge under her reign.

Bing

Winners
Bing’s fancy redesign featured three panes: organic search results, paid ads, and social results. Bing was especially proud of its ability to integrate social activity (Facebook) into the SERPs – an important feature that, Bing points out, Google has not been able to achieve.
[Image: Bing redesign showing results for the Drake Hotel, Chicago]
Meanwhile, Bing’s search share increased throughout 2012, with a steady accumulation of market share, growing from 15.2 percent in January to 16.3 percent by December.
Losers
Was it shoppers or Bing who got Scroogled? In response to the arrival of Google Shopping, Microsoft unveiled an anti-Google campaign telling holiday shoppers they were being “Scroogled” because Google’s shopping results now feature only paid ads. It’s true that the product listings are paid search ads by advertisers, but Microsoft contends this discriminates against other companies that do not pay to be listed, misleading consumers.
This multimedia marketing campaign run by Microsoft includes TV spots, print ads, and a new website, Scroogled.com.
The campaign doesn’t mention that Microsoft gives merchants “higher visibility” on Bing Shopping if they pay a third-party site, Shopping.com. Let’s just say plenty of people are getting scroogled.
[Image: Scroogled campaign]

Facebook

Winners
Though they were initially confusing, everyone has settled into the new Timeline brand pages Facebook rolled out early in the year. This SEW post outlined the main benefits, including the ability to brand pages with a unique cover photo, pinned posts at the top of the page to highlight quality content, and private messaging between pages and fans.
Losers
[Image: Facebook IPO]
Facebook’s IPO in May was described as one of the worst in the last 10 years. The highly anticipated IPO was impacted by technical glitches that delayed trading, and investors lost money as a result. The stock lost more than a quarter of its value in the weeks that followed.
Facebook reported $1.26 billion in revenue for Q3 2012 in their second earnings call as a public company. Investors seemed pleased despite Facebook’s $59 million loss, compared to a $227 million profit for the same quarter in 2011, before they became a public company. Immediately after the announcement, Facebook stock rose 8 percent in after-hours trading, bringing hope to Facebook fan boys.
F-commerce a flop? JCPenney, Gap, and Nordstrom have all closed down their Facebook storefronts after trying to gain ecommerce revenues through the social network. Senior marketers reported they were pulling budget from Facebook to redirect funds into their own ecommerce storefronts. Is this just a downswing, or does this represent the beginning of an exodus from Facebook as a direct selling platform?

Summary

2012 brought search engine and social marketers challenges, joy, and plenty to look forward to in 2013. There were more memorable moments than can be covered here (including some great YouTube stories).



Sunday, January 27, 2013

Using Footer Links to Diversify Your Backlink Profile

by Matt Morgan

We're all pretty tired of reading about the Google Penguin algorithm update and how devastating it has been to so many websites. By now we know that an excessive amount of backlinks with the same anchor text can tank your rankings.
This hasn't been good news for all the webmasters who have been including a link to their website in the footer of each and every website they've built with the same anchor text, for example "Web design by..." As SEO professionals and webmasters, we must ensure that our link building efforts create a natural and diverse backlink profile, but how?
The age-old link in the footer of a website looks to Google as if webmasters are trying to manipulate the search engine rankings for a specific key phrase – and having too many links pointing to your site from non-relevant websites can severely hurt you on Google. But the supposed irrelevancy is misunderstood.
Webmasters should be able to take credit for the amazing design and development of the websites they produce. It's relevant, because they really are the ones who designed the site.

The Problems

  1. Zero relevancy.
  2. All links pointing to the homepage.
  3. Too many links with the same anchor text.
  4. Too many dofollow links from the same domain.

In order to overcome the problems that come with webmaster footer links, we need to make a simple change to address these four issues.

Fix: Zero Relevancy

The trick is to establish a relationship between the website you're linking from and your own website. You can bridge the relevancy gap by creating a website design portfolio page and optimizing that page for your client's key phrases as well as your own. Here is a formula that might help make the connection:

Client's Industry + Your Service

For example, here's one of our portfolio entries for a travel site my company designed:

Travel and Tourism Website Design Portfolio

We've optimized our page title, H1, H2, body text, and images for both of our industries, bridging the relevancy gap. For extra relevancy, make sure you use keyword-rich, search engine friendly URLs.

Fix: All Links Pointing to the Homepage

The next part of the puzzle is to adjust the way you program your footer links on your client's website.

Now that you have a portfolio entry on your website that is relevant to your client's website, your footer link can point to your portfolio page instead of your homepage.

Fix: Too Many Links With the Same Anchor Text

The common anchor text for a webmaster's footer link looks something like "web design by..." You can diversify the anchor text in your backlink profile by using the formula "client's industry + your service" – for example, "Travel and Tourism Website Design by...."
[Image: website design footer link example]

Fix: Too Many Dofollow Links From the Same Domain

A dofollow link from a root domain is fantastic. However, by placing a link to your website in the footer of your client's website, you're going to end up with too many links from the same domain, since your link will show up on every page of your client's site. This appears unnatural to Google and other search engines.
You can fix this programmatically by writing an if statement that adds rel="nofollow" to your footer link on any page that isn't the homepage. You can write that statement in a few ways depending on your server type and programming language.
I'm a big fan of WordPress. Here is an example of an if statement that can be used on a WordPress PHP website:
[Image: example of footer link code]
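The original post showed the snippet only as a screenshot, so here is a minimal sketch of the same idea, assuming a standard WordPress theme footer; the portfolio URL and anchor text below are hypothetical placeholders rather than the author's actual markup:

    <?php
    // footer.php (sketch): the homepage passes a followed footer link, while every
    // other page adds rel="nofollow" so the client's site isn't sending thousands
    // of followed sitewide links to a single URL.
    $portfolio_url = 'http://example.com/travel-tourism-website-design/';     // hypothetical
    $anchor_text   = 'Travel and Tourism Website Design by Example Studio';   // hypothetical
    if ( is_front_page() ) {
        echo '<a href="' . esc_url( $portfolio_url ) . '">' . esc_html( $anchor_text ) . '</a>';
    } else {
        echo '<a rel="nofollow" href="' . esc_url( $portfolio_url ) . '">' . esc_html( $anchor_text ) . '</a>';
    }
    ?>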
What About the SERPs?

Here is the best part. By diversifying your footer link's anchor text and hyperlinking to a relevant page on your website, you have now built authority for Industry + Your Service.
Going back to the travel site example from earlier: as of this writing, that website had been live for 4 weeks, and my website is ranking on the first page of Google for the search term "Travel and Tourism Website Design". It's more of a long-tail search term, but hey, I'll take it.
[Image: Google search results for "Travel and Tourism Website Design"]






Here are a few more search terms for which I've landed on the first page of Google using this method:
  • auto transport web design
  • auto part website design
  • metal building website design
  • health club website design
  • restoration web design
  • recycling web design
  • bed and breakfast website design

Final Thoughts
I'd be interested in learning about other ways webmasters have overcome the footer link Penguin algorithm issue. Please share any methods that you've read about or tried yourself in the comments below.



Google Penguin Update: Impact of Anchor Text Diversity & Link Relevancy

by Danny Goodwin

The Google Penguin Update, much like Panda last year, has angered SEOs and webmasters, most of whom say they have played by Google’s rules. Anchor text diversity and link relevancy may be two key factors of Penguin, according to more early analysis.

60% 'Money Keyword' Anchor Text

As discussed in “Google Penguin Update: 5 Types of Link Issues Harming Some Affected Websites”, spammy link signals (paid text links using exact match anchor text, comment spam, guest posts on questionable sites, article marketing sites, and links from dangerous sites) were among the issues for some sites affected by the algorithmic change.
Google was also hitting websites for aggressive anchor text for keywords in backlinks even before Penguin. Post-Penguin, anchor text diversity becomes more critical.
Microsite Masters examined historical data for thousands of websites to investigate whether sites that saw their rankings drop after Penguin were also guilty of having an unbalanced percentage of anchor text for “optimization” or “money” keywords (i.e., whatever term you’re trying to rank No. 1 for) as opposed to more natural-looking mix of linking anchor text (e.g., Your Website’s Title, example.com, www.example.com, “click here”, “here”, “blog post”, etc.).
The results are quite interesting. Websites that saw their search rankings tumble had a money keyword for anchor text in 65 percent or more of their inbound links, according to Microsite Masters (not that this percentage was a guarantee of being hit by Penguin):
[Chart: share of inbound links using money keyword anchor text for sites hit by Google Penguin]


As for sites that weren’t penalized, they had a much more natural looking backlink profile. Sites that had money keyword anchor text less than 50 percent of the time were “all but guaranteed” not to be affected by Penguin.
So essentially, what Google Penguin did was try to correct its rankings by discounting patterns it considers to be link spam. Granted, that potentially also opens the door to negative SEO attacks – competitors could harm a site with a weaker link profile by pointing a bunch of bad links at it – but that's a subject for another discussion.

Links from Similar Niche Sites

Another finding was that Penguin also hit sites with few incoming links from domains and websites in the same niche, according to Microsite Masters. With Google, link quality and relevance are key, so being able to attract quality links from authoritative domains in the same niche is a clear sign that your site or page is relevant.
Basically, Google is looking for signals that can’t be manipulated as easily as anchor text. Getting a link from an authoritative or relevant site in the same niche is much harder to manipulate.
Open Site Explorer and ahrefs are a couple of tools you can use to get a view of your link profile and anchor text diversity.
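If you want to put a rough number on your own anchor text diversity before (or after) digging into those tools, a small script over a backlink CSV export will do it. This is only a sketch under assumptions – the backlinks.csv filename, the "anchor text" column header, and the money keyword are placeholders to swap for your own data:

    <?php
    // anchor_share.php (sketch): what share of inbound links use the money keyword?
    $money_keyword = 'travel website design';            // hypothetical target phrase
    $total = 0;
    $money = 0;
    $fh = fopen('backlinks.csv', 'r');                   // assumed export filename
    $header = array_map('strtolower', fgetcsv($fh));
    $col = array_search('anchor text', $header);         // assumed column header
    if ($col === false) {
        exit("No 'anchor text' column found\n");
    }
    while (($row = fgetcsv($fh)) !== false) {
        $total++;
        if (stripos($row[$col], $money_keyword) !== false) {
            $money++;
        }
    }
    fclose($fh);
    if ($total > 0) {
        printf("%d of %d links (%.1f%%) use the money keyword as anchor text\n",
            $money, $total, 100 * $money / $total);
    }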

Link Pruning

If Penguin has caught you for having “unnatural” or spammy links, “link pruning” is one way to go, according to veteran SEO Bruce Clay, who was interviewed by Search Engine Watch contributor Eric Enge. He advises evaluating your link profile at least once a month, looking for low quality links to remove.
Getting a link to your site removed is about twice as hard as successfully requesting that a website link to you, Clay noted. In one case, a website actually tried to extort him, demanding $10 per link to remove the bad links.
If that were to happen, or you're unable to add enough good links to counter the bad, or you’re otherwise unable to get a spammy link removed, Clay advised that one thing you can do is send Google a list of links you’ve tried to get removed and ask them to discount them, to show you're making the effort.

Duplicate Content & Site Clean Up

While it’s easy to focus just on cleaning up your links in the hopes of a Penguin recovery, it could be something else dragging down your rankings. Perhaps a site Google views unfavorably has scraped your content, and is now either linking to you with some sort of “credit” or it failed to remove an internal link to your site from within the copy text, and now that link from a bad neighborhood is pointing at your website.
Is your content being duplicated on other websites? Not sure? Try Copyscape.
If, after using the tool, you find that other sites have stolen your content, Google has this page to submit a DMCA report and request the removal of content. If Google removes that site, that’s one less bad backlink you have to worry about.
And while you’re at it, do a full spring cleaning or SEO audit to ensure there’s nothing you’re doing to harm yourself. Do an honest evaluation of your own website, or find an SEO pro who can do it for you.
For those websites that have been impacted but haven’t done anything wrong, it may just be the case that you don’t have enough quality backlinks or your domain may not have enough authority. If you’ve been standing still, it’s doubtful your competitors have. It’s up to you to win back the rankings by optimizing for users, building a brand, and creating great content, because as we've learned multiple times now, Google won’t give anything to you.


Friday, January 25, 2013

Link Equity Salvage: 7 Steps for Finding Your Long-Lost Links


by Garrett French
Link equity salvage is the process of finding and redirecting your site's dead pages, folders, and subdomains that still have links. These are the old and mis-redirected, unredirected or simply deleted sections of your site that webmaster tools doesn't know about since the URLs got axed more than 35 days ago.
We're talking about pages that even site crawlers aren't finding, presumably because they don't have any links from the visible pages of your site. And remember, link salvagers: you're not only recovering lost link equity here, but also blocking competitive off-site link salvage experts from capitalizing on your squandered links. There's also a great article worth reading on Great Methods for Reclaiming Lost Backlinks.
You don't necessarily need to rush off hunting for onsite link salvage opportunities though – especially if your site's only a couple of years old and never had a redesign. If you can say yes to 1-2+ of these criteria then definitely keep reading:
  • Your site is 5+ years old
  • Your content naturally earns editorial links
  • You've had 1+ CMS Migrations
  • You've had several major site redesigns over the years
  • You know of at least 1 mismanaged site redesign
  • You have a 10,000+ page site
  • You aren't seeking targeted keyword ranking increases
So I've broken the link equity salvage process into four parts: compiling, status checking and link checking a comprehensive-as-possible list of your site's URLs, and then redirecting them. The majority of the tools here are for compiling that critical master list of URLs.

1. Majestic SEO's Historic Index

Some folks complain of Majestic SEO's large quantity of dead links and pages. Not me.
Look for quick wins by placing your root domain (no www) into the Majestic Site Explorer. Select the Historic Index radio button, then click Explore. Download your Top Links CSV (it shows the highest-value links and the pages on your site they point to) and your Top Pages CSV.
From both of these reports extract your site's URLs. Dedupe. Boom. Presto. Pow. You now have a big list of the most important pages on your site according to Majestic.
You could also run a full site report with Majestic and get all of your site's URLs with an AC rank of 1 or higher. This costs more resources but provides a more thorough list.
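To illustrate the extract-and-dedupe step, here is a rough sketch; the filenames and URL column positions are assumptions about how the two Majestic exports happen to be laid out, so adjust them to your actual CSVs:

    <?php
    // merge_urls.php (sketch): pull the target-URL column from both exports,
    // normalize, and dedupe into a single master list.
    $exports = ['top_pages.csv' => 0, 'top_links.csv' => 1]; // file => assumed URL column
    $urls = [];
    foreach ($exports as $file => $col) {
        $fh = fopen($file, 'r');
        if ($fh === false) { continue; }
        fgetcsv($fh);                                        // skip the header row
        while (($row = fgetcsv($fh)) !== false) {
            if (isset($row[$col]) && $row[$col] !== '') {
                $urls[strtolower(trim($row[$col]))] = true;  // keyed array = free dedupe
            }
        }
        fclose($fh);
    }
    file_put_contents('master_urls.txt', implode(PHP_EOL, array_keys($urls)));
    echo count($urls) . " unique URLs written to master_urls.txt\n";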

2. Xenu's Link Sleuth

Xenu is a relentless beast of URL discovery, and it even status checks the URLs for you. It won't find every last URL, at least it hasn't in my tests, and it obviously can't find your legacy pages that still have links from offsite like Majestic does. It finds only what's linked to onsite (as far as I understand how it works). 

3. Xenu's Orphan Checker

I haven't used this on a client's site yet – only salivated at the opportunity to try it out and run a comparison to what can be found via Majestic. Give Xenu's Orphan Checker FTP access and it looks for orphan pages with no links from anywhere on your site.
My guess is that the Orphan Checker isn't going to show you anything that's been flat out deleted from your server, as can sometimes happen, so it's not a replacement for Majestic. If you're on an obsessive hunt for link equity it's worth a check though.

4. Check Links Pages, Old Directories and Press Releases

If your site has been getting editorial links and publishing press releases for years you could have links to now-dead pages from pages that Majestic may not have discovered. First you need to prospect for links pages and old press releases and then check those pages for your domain with a bulk link checker.
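If you'd rather not paste those prospects into a third-party checker, a crude script can do the first pass for you. A sketch under assumptions – prospect_pages.txt (one URL per line) and the domain are placeholders, and it only does a simple text match rather than parsing the HTML:

    <?php
    // find_my_links.php (sketch): fetch each prospect page and report whether it
    // still mentions your domain. Requires allow_url_fopen; a curl version works too.
    $my_domain = 'example.com';                                    // hypothetical domain
    $pages = array_filter(array_map('trim', file('prospect_pages.txt')));
    foreach ($pages as $page) {
        $html  = @file_get_contents($page);
        $found = ($html !== false) && (stripos($html, $my_domain) !== false);
        echo ($found ? 'LINK FOUND' : 'no link') . "\t" . $page . PHP_EOL;
    }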

5. Use Google Queries to Find Legacy Subdomains

Not every salvage requires a dead page – it could be a long forgotten initiative prompted by an executive long gone from your organization. These subdomain-discovery queries, taught to me by Entrepreneur.com's SEO Jack Ngyuen, can help you find possibly-abandoned subdomains from your organization's subdomain-happy heyday of 2005.
  • *.domain.com
  • *.domain.com -inurl:www.
  • site:*.domain.com
  • site:*.domain.com -inurl:www.
Some of these queries work for some sites but not others. I assume it depends on the size and/or configuration of the site.

6. URL Status Checkers

Once you've compiled your insanely large list of URLs it's time to check, recheck, and re-recheck their status codes. Yup, I'd advise at least three checks of a URL list no matter what tool you're using.
One-off URL status checkers abound. You're going to need something with a bit more capacity. I know and love the bulk URL status checker built into my scraper suite. It looks like Dixon Jones has an HTTP Status Checker too.
Whatever tool you choose, it needs to work in bulk – large bulk. If it's on your desktop it could be tying up a machine for a few days. And remember – check your lists of failed URLs a couple more times – you'll always shake out more false-positives.
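If you'd rather roll your own than tie up a desktop tool, even a short script can grind through a big list. A minimal sketch (the master_urls.txt filename is an assumption; re-run the failures a couple of times, as noted above):

    <?php
    // check_status.php (sketch): read one URL per line, print its HTTP status code.
    $urls = array_filter(array_map('trim', file('master_urls.txt')));
    foreach ($urls as $url) {
        $ch = curl_init($url);
        curl_setopt_array($ch, [
            CURLOPT_NOBODY         => true,   // status only, skip the body
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_FOLLOWLOCATION => false,  // report 301/302/404 as-is
            CURLOPT_TIMEOUT        => 10,
        ]);
        curl_exec($ch);
        echo curl_getinfo($ch, CURLINFO_HTTP_CODE) . "\t" . $url . PHP_EOL;
        curl_close($ch);
    }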

7. Bulk Link Count Checkers

With your dead URLs in hand it's time to separate the wheat from the chaff. This requires a bulk link checker – ideally one into which you can paste (potentially) thousands of dead URLs.
I know of two. There are probably more, but these are the only two I know of at the moment.
Majestic has a nifty "Bulk Backlink Checker" built in, though it has a limit of 300 URLs (at least at my subscription level). If you've got 6,000 dead URLs to check you could run it 20 times. Also, I've built a bulk backlink checker that accepts as many URLs as you can copy and paste in – it utilizes the Linkscape data set.
Once you get your data back from either tool you can sort by number of referring domains and at last start the process of mapping and 301 redirecting your equity back where it belongs.
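How you implement the redirects depends on your server, but on a PHP site one simple pattern is a lookup table consulted before anything else renders. This is a sketch with made-up example paths, not a drop-in solution – large maps are usually better handled at the web server level:

    <?php
    // legacy-redirects.php (sketch): include early in your front controller.
    // Map each salvaged dead URL to its best live equivalent, then 301 it.
    $redirects = [
        '/old-category/widget-guide.html' => '/guides/widgets/', // hypothetical
        '/press/2006/launch.htm'          => '/about/press/',    // hypothetical
    ];
    $path = parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH);
    if (isset($redirects[$path])) {
        header('Location: ' . $redirects[$path], true, 301);     // permanent = passes equity
        exit;
    }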
