Thursday, April 29, 2010

30 SEO Problems & the Tools to Solve Them (Part 2 of 2)

Posted by randfish

Last November, I authored a popular post on SEOmoz detailing 15 SEO Problems and the Tools to Solve Them. It focused on a number of free tools and SEOmoz PRO tools. Today, I'm finishing up that project with a stab at another set of thorny issues that continually confound SEOs and how some new (and old) tools can come to the rescue.


Some of these are obvious and well known; others are obscure and brand new. All of them solve problems - and that's why tools should exist in the first place. Below, you'll find 20+ tools that answer serious issues in smart, powerful ways.


#1 - Generating XML Sitemap Files


The Problem: XML Sitemap files can be challenging to build, particularly as sites scale past a few hundred or few thousand URLs. SEOs need tools to build these, as they can substantially improve a site's indexation and its potential to earn search traffic.


Tools to Solve It: GSiteCrawler, Google Sitemap Generator


GSiteCrawler

GSiteCrawler: Downloadable software to create XML Sitemaps


Google Sitemap Generator

Download a few files from Google Code and Install on Your Webserver


Sitemap Generator

Looks like Google Webmaster Tools, doesn't it? :-)


Both GSiteCrawler & Google Sitemap Generator require a bit of technical know-how, but even non-programmers (like me) can stumble their way through and build efficient and effective XML Sitemaps.
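If you're curious what these tools actually produce (or want to roll a tiny one yourself), here's a minimal sketch that writes a valid Sitemap file with nothing but the Python standard library - the URLs below are made-up placeholders, not a real crawl:

```python
# Minimal XML Sitemap writer - a sketch, not a replacement for GSiteCrawler.
# The URL list is hypothetical; in practice you'd feed it from a crawl or database.
from xml.sax.saxutils import escape

urls = [
    {"loc": "http://www.example.com/", "lastmod": "2010-04-29", "changefreq": "daily", "priority": "1.0"},
    {"loc": "http://www.example.com/blog/", "lastmod": "2010-04-28", "changefreq": "daily", "priority": "0.8"},
    {"loc": "http://www.example.com/tools/", "lastmod": "2010-03-15", "changefreq": "weekly", "priority": "0.6"},
]

lines = ['<?xml version="1.0" encoding="UTF-8"?>',
         '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
for u in urls:
    lines.append("  <url>")
    lines.append("    <loc>%s</loc>" % escape(u["loc"]))
    lines.append("    <lastmod>%s</lastmod>" % u["lastmod"])
    lines.append("    <changefreq>%s</changefreq>" % u["changefreq"])
    lines.append("    <priority>%s</priority>" % u["priority"])
    lines.append("  </url>")
lines.append("</urlset>")

with open("sitemap.xml", "w") as f:
    f.write("\n".join(lines))
```

The output follows the sitemaps.org protocol; keep in mind each file maxes out at 50,000 URLs, beyond which you'll want a Sitemap index file pointing to multiple Sitemaps.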


#2 - Tracking the Virality of Blog/Feed Content


The Problem: Even experienced bloggers have trouble predicting which posts will "go wide" and which will fall flat. To improve your track record, you need historical data showing where and how your posts perform in the wild world of social media. What's needed is a cloud-based tracking tool that can sync up with the Twitters, Facebooks, Diggs, Reddits, StumbleUpons & Deliciouses of the web and provide these metrics in an easy-to-use, historical view.


Tools to Solve It: PostRank Analytics


PostRank Analytics

PostRank's nightly emails keep me wracking my brains for better blog post ideas


PostRank sends me nightly reports on how the SEOmoz blog performs across the web - numbers from Digg, Delicious, Twitter, Facebook and more. By using this, I can get a rough sense of how posts perform in the social media marketplace and, over time, hopefully train myself to author more interesting content.


Addition: Melanie from PostRank added a discount code in the comments for SEOmoz users! Use the coupon code "SEOmoz" to get three free months instead of just one.


#3 - Comparing the Relative Traffic Levels of Multiple Sites


The Problem: We all want to know not only how we're doing with web traffic, but how it compares to the competition. Free services like Compete.com and Alexa have well-documented accuracy problems, while paid services like Hitwise, Comscore & Nielsen cost an arm and a leg (and even then, don't perform particularly well with sites in the sub-million visits/month range).


Tools to Solve It: Quantcast, Google Trends for Websites


Quantcast

If a site has been "Quantified," no other competitive traffic tool on the web will be as accurate


Quantcast

Since both sites are "Quantified," I can be sure the data quality is excellent


I've complained previously about the inaccuracies of Alexa (as have many others). It's really for entertainment purposes only. Compete.com is better, but still suffers from lots of inaccuracy, data gaps, directionally wrong estimates and a general feeling of unreliability in the marketplace. Quantcast, on the other hand, is excellent for comparing sites that have entered their "Quantified" program. This involves putting Quantcast's tracking code onto each page of the site; you're basically peeking into their analytics.


Sadly, Quantcast isn't on every site (and their guesstimates appear no better than Compete when they don't have direct data). Fortunately, one organization has stepped up with a surprisingly good alternative - Google.


Google Trends for Websites


Google Trends for Websites allows you to plug in domains and see traffic levels. Much like with the AdWords Keyword Tool, the numbers themselves seem to run high, but the comparisons often look much better. Google Trends has become the only traffic estimator I trust - still only as far as I could throw a Google Mini, but better than nothing.


#4 - Seeing Pages the Way Search Engines Do


The Problem: Every engineering & development team builds web pages in unique ways. This is great for making the Internet an innovative place, but it can make for nightmares when optimizing for search engines. As professional SEOs, we need to be able to see pages, whether in development environments or live on the web, the same way the engines do.


Tools to Solve It: SEO-Browser, Google Cached Snapshot, New Mozbar


SEO Browser


A longtime favorite site of mine, SEO Browser lets you surf like an engine


SEOmoz on SEO Browser


Poor Google; that's all they see when they crawl our pretty site


SEO-Browser is a great way to get a quick sense of what the engines can see as they crawl your site's pages and links. The world of engines may seem a bit drab, but it can also save your hide in the event that you've put out code or pages that engines can't properly parse.


Google Cached Snapshot


I wonder if Googlebot ever gets tired of blue, purple and gray...


Google's own cached snapshot of a page (available via a search query, as a bookmarklet, or in the mozbar's dropdown) is the ultimate research tool for knowing what the engine "sees." The only trouble is that it only shows the past (and only works on pages that allow caching). To get a preview, SEO Browser or our friend below can be useful.


Mozbar User Agent Switch

The mozbar lets you dress up like Google whenever the occasion is right


One of Will Critchlow's feature requests in the new mozbar was the ability to switch user agents, turn off JavaScript and images and, in essence, become the bot in your browser. Luckily, he also forced us to place a gray overlay in the right-hand corner that alerts you to the settings you've changed and gives you an easy, one-click "return to normal." Browsing like a bot = solved!
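If you'd rather script it than browse it, a rough approximation of "becoming the bot" is easy enough: fetch the page with a spider-style user agent and reduce the HTML to the text and links an engine would parse. A minimal sketch (the user-agent string and URL are examples, not an exact Googlebot match):

```python
# A rough "see it like a bot" fetch: request the page with a spider-style
# user agent, then reduce it to the text and links an engine would parse.
from urllib.request import Request, urlopen
from html.parser import HTMLParser

class BotView(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links, self.text, self._skip = [], [], 0
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)
    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1
    def handle_data(self, data):
        if not self._skip and data.strip():
            self.text.append(data.strip())

req = Request("http://www.example.com/",
              headers={"User-Agent": "Mozilla/5.0 (compatible; MyTestBot/1.0)"})
html = urlopen(req).read().decode("utf-8", "ignore")
view = BotView()
view.feed(html)
print("TEXT:", " ".join(view.text)[:500])
print("LINKS:", view.links[:25])
```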


#5 - Identifying Crawl Errors


The Problem: Discovering site problems like 302 redirects (that should be 301s), pages blocked by robots.txt (here's why that's a bad idea), missing title tags, duplicate/similar content, 40x and 50x errors, etc. is a task no human can efficiently perform. We need the help of robots - automated crawlers that can dig through a site, find the issues and notify us.


Tools to Solve It: GSiteCrawler, Xenu, GGWMT


Xenu Link Sleuth

Mmmm... Parallel Threads


Xenu Link Sleuth 2

She canna hold on much longer cap'n!


We've already covered GSiteCrawler in this post, but for those unaware, it can be a great diagnostic tool as well as a Sitemap builder. Xenu is much the same, though somewhat more intuitive for this purpose. Tom's written very elegantly about it in the past, so I won't rehash much, other than to say - it shows errors & potential issues Google Webmaster Tools doesn't, and that can be a lifesaver.


GGWMTools Crawl Errors

Doh! I think we messed up some stuff when KW Difficulty relaunched :(


Google Webmaster Tools is extremely popular, well known and well used. And yet... lots of us still have crawl errors we haven't addressed (just look at the 500+ problems on SEOmoz.org in the screenshot above). Exporting to Excel, sorting, and sending to engineering with fixes for each type of issue can save a lot of heartache and earn back a lot of lost traffic and link juice.
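For small batches of URLs, you can also sanity-check status codes and titles yourself before handing a list to engineering. Here's a minimal sketch (the URLs are hypothetical; a real crawler like Xenu or GSiteCrawler is the right tool for anything larger):

```python
# A tiny status checker for a handful of URLs: flags 302s (that should
# probably be 301s), 4xx/5xx responses and missing <title> tags.
import re
import urllib.request
import urllib.error

class NoRedirect(urllib.request.HTTPRedirectHandler):
    # Returning None here makes redirects surface as HTTPError
    # with the original 3xx code intact, instead of being followed.
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

opener = urllib.request.build_opener(NoRedirect())
urls = ["http://www.example.com/", "http://www.example.com/old-page"]

for url in urls:
    try:
        resp = opener.open(url, timeout=15)
        code = resp.getcode()
        body = resp.read().decode("utf-8", "ignore")
    except urllib.error.HTTPError as e:
        code, body = e.code, ""
    except urllib.error.URLError as e:
        print(url, "FAILED:", e.reason)
        continue
    issues = []
    if code == 302:
        issues.append("302 (temporary) redirect - should it be a 301?")
    elif code >= 400:
        issues.append("error response %d" % code)
    elif code == 200 and not re.search(r"<title>\s*\S", body, re.I):
        issues.append("missing or empty <title> tag")
    print(url, "->", code, "; ".join(issues) or "OK")
```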


#6 - Determine if Links to Your Site Have Been Lost


The Problem: Sites don't always do a great job maintaining their pages and links (according to our data, 75% of the web disappears in 6 months). Many times, these vanishing pages and links are of great interest to SEOs, who want to know whether their link acquisition and campaigning efforts are being maintained. But how do you confirm whether the links to your site that were built last month are still around today?


Tools to Solve It: Virante's Link Atrophy Diagnosis


Virante's Link Atrophy Tool

Does that mean Stuntdubl & SEOmoz are "going steady?"


This tool comes courtesy of the great team over at Virante, and it's a pretty terrific application of Linkscape data (via the SEOmoz API) to a real SEO need. The tool checks the links reported from Linkscape/Open Site Explorer and determines which, if any, have been lost. Many times it's just links off the front page of blogs or news sites as archives fall to the back, but sometimes it can help you ID a link partner or source that's no longer pointing your way and facilitate a quick, painless reclamation. The best part is there's no registration or installation required - it's entirely plug and play.
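If you'd rather run a spot-check yourself, the core of what the tool does is straightforward: re-fetch the pages that used to link to you and confirm the link is still there. A rough sketch - the linking pages and targets below are made up, and a stricter version would parse href attributes rather than string-match:

```python
# Given pages that used to link to you, confirm the links are still there.
# The (linking page, target) pairs are hypothetical; in practice they'd come
# from a Linkscape/Open Site Explorer export or your own link records.
import urllib.request

known_links = [
    ("http://www.example-blog.com/our-favorite-tools/", "http://www.seomoz.org/"),
    ("http://www.example-news.com/2009/12/roundup/",    "http://www.seomoz.org/blog"),
]

for source, target in known_links:
    try:
        html = urllib.request.urlopen(source, timeout=15).read().decode("utf-8", "ignore")
    except Exception as e:
        print("LOST (page unreachable):", source, "-", e)
        continue
    if target in html:
        print("STILL LINKED:", source)
    else:
        print("LINK GONE:", source, "no longer mentions", target)
```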


Addition: Russ from Virante added a discount code in the comments for SEOmoz users! Use the coupon code "seomoz30" in order to get more results from these tools.


#7 - Find 404 Errors on a Site (without GG WM Tools) and Create 301s


The Problem: Google's Webmaster Tools are great for spotting 404s, but the data can be, at times, unwieldy (as when thousands of pages are 404ing, but only a few of them really matter) and it's only available if you can get access to the Webmaster Tools account (which can stymie plenty of SEOs in the marketing department or from external consultancies). We need a tool to help spot those important, highly linked-to 404s and turn them into 301s. 


Tools to Solve It: Virante's PageRank Recovery Tool


Virante's PageRank Recovery Tool

3.99 mozRank for ~0.00 effort


The thinking behind this tool is brilliant, because it solves a problem from end to end. By not only grabbing well-linked-to pages that 404, but actually writing the code to create an .htaccess file with 301s to your choice of pages, the tool is a "no-brainer" solution.
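The .htaccess output itself is nothing exotic - Apache's mod_alias "Redirect 301" directive, one line per dead URL. If you already know which 404ing paths matter and where they should point, a few lines of script will write the file for you (the paths and domain below are hypothetical):

```python
# Turn a mapping of dead (404) paths into permanent redirects for Apache.
# The paths are examples; the output uses mod_alias's "Redirect 301" syntax.
redirects = {
    "/old-tools/keyword-difficulty": "/tools/keyword-difficulty",
    "/blog/2008/some-deleted-post":  "/blog",
}

with open(".htaccess", "w") as f:
    for old, new in sorted(redirects.items()):
        f.write("Redirect 301 %s http://www.example.com%s\n" % (old, new))
```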


#8 - See New Links that are Sending Traffic (and Old Ones that Have Stopped)


The Problem: Most analytics tools have an export function that, combined with some clever Excel, could help you puzzle out the sites/pages that have started to send you traffic (and those that once did but have stopped). It's a pain - manual labor, easy to screw up and not a particularly excellent use of your precious time.


Tools to Solve It: Enquisite


Enquisite Links Report


I love the ability to look across the past few months and see the trend of new pages and new domains sending links, as well as to identify links that have stopped sending traffic. Some of those may be ripe for reclamation; others might just need a nudge to mention or link over in their next piece/post. This report is also a great way to judge how link building campaigns are performing on the less SEO-focused axis: sending direct traffic.


#9 - Research Trending/Temporal Popularity of Keywords


The Problem: Keyword demand fluctuates over time, sometimes with little warning. Knowing how search volume is impacted by trends and geography is critical to SEOs targeting fields with these demand fluctuations.


Tools to Solve It: Google Insights, Trendistic


Google Insights

Hmmm.... Maybe we should launch Open Webmaster Tools next?


Google Insights

We need to make it out to India & Brazil more often, too!


Google Insights is great for seeing keyword trending, related terms and countries of popularity (though the last of these we've found to be somewhat suspect at times). However, sometimes you're really interested in what's about to become popular. For that, turning to trend sites can be a big help.


Trendistic


Although Trendistic doesn't yet have a "suggest" feature to help identify terms & phrases that may soon become popular searches, it does help establish the "tipping point" at which a buzzword on Twitter may become a trend in web search. As we discussed in the Whiteboard Friday on Twitter as an SEO Research Tool, finding the spot at which search volume begins spiking can present big opportunities for fresh content.


#10 - Analyze Domain Ownership & Hosting Data


The Problem: When researching domains to buy, considering partnerships or conducting competitive analysis, gathering data about a site's hosting and ownership can be an essential step in the process.


Tools to Solve It: Domaintools


DomainTools

We should make sure to re-register this domain...


Long the gold standard in the domainer's toolbox, DomainTools (once called whois.sc) provides in-depth research about a domain's owners, their server and, sometimes most interestingly, the other domains owned by that entity. BTW - they're spot on; SEOmoz owns about 80 other domains besides our own (though we only really use this one and OpenSiteExplorer right now).
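For the truly curious: the WHOIS protocol underneath all of this is about as simple as protocols get - open a TCP connection to port 43, send the domain name, read the reply. A bare-bones sketch (whois.verisign-grs.com answers for .com/.net; other TLDs use other servers, and full registrant detail often requires a follow-up query to the registrar's own server):

```python
# Raw WHOIS lookup: the protocol is just "send the domain name over TCP port 43".
import socket

def whois(domain, server="whois.verisign-grs.com"):
    sock = socket.create_connection((server, 43), timeout=15)
    sock.sendall((domain + "\r\n").encode("ascii"))
    chunks = []
    while True:
        data = sock.recv(4096)
        if not data:
            break
        chunks.append(data)
    sock.close()
    return b"".join(chunks).decode("utf-8", "ignore")

print(whois("example.com"))
```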


#11 - Investigate a Site/Page's History


The Problem: What happened on this page last month or last year? When conducting web research about links, traffic and content, we all need the ability to go "back in time" and see what previously existed on our sites/pages (or those of competitors, link sources, etc.). Did traffic referrals drop? Have search rankings changed dramatically? Did a previously available piece of content fall off the web? The real question is: how do we answer these questions?


Tools to Solve It: Wayback Machine




Before 2005, we were on a different domain!


SEOmoz in 2005


If you remember this version of the site, you're officially "old school"


Yeah, yeah, you've probably heard of the Wayback Machine, powered by Alexa's archive of the Internet and endlessly entertaining to web researchers and pranksters alike. What might surprise you is how valuable it can be as an SEO diagnostic tool, particularly when you're performing an investigation into a site that doesn't keep good records of its activity. Reversing a penalty, a rankings drop, an oddity in traffic, etc. can consume massive amounts of time if you don't know where to look and how. Add Wayback to the CSI weapons cache - it will come in handy.
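If you want snapshots programmatically rather than through the web interface, the Internet Archive exposes an availability lookup that returns the closest archived copy of a URL. A sketch - the endpoint and JSON field names are as I understand them, so treat this as a starting point rather than gospel:

```python
# Look up the closest archived snapshot of a URL via the Internet Archive's
# availability API. Field names reflect the JSON response as I understand it.
import json
import urllib.parse
import urllib.request

def closest_snapshot(url, timestamp="20050101"):
    query = urllib.parse.urlencode({"url": url, "timestamp": timestamp})
    api = "http://archive.org/wayback/available?" + query
    data = json.load(urllib.request.urlopen(api, timeout=15))
    snap = data.get("archived_snapshots", {}).get("closest")
    return (snap["url"], snap["timestamp"]) if snap else None

print(closest_snapshot("http://www.seomoz.org/", "20050101"))
```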


#12 - Determine Semantically Connected Terms/Phrases


The Problem: Chances are, the search engines are doing some form of semantic analysis (looking at the words and phrases on a page around a topic to determine its potential relevance to the query). Thus, employing these "connected" keywords on your pages is a best practice for good SEO (and probably quite helpful to users in many cases as well). The big question is - which words & phrases are related (in the search engines' eyes) to the ones I'm targeting?


Tools to Solve It: Google Wonder Wheel


Google Wonder Wheel


Nothing about "Yellow Shoes?"


We don't know for certain that this is a technique that provides massive benefit, but we're optimistic that tests will show it has some value. If you'd like to participate in the experiment, take related phrases from the Wonder Wheel and employ them on your pages. Please do report back with details :-)


#13 - Analyze a Page's Optimization of Images


The Problem: When image search and image accessibility/optimization are critical to your business/client, you need tools to help analyze a page's consistency and adherence to best practices in handling image dimensions, alt attributes, etc.


Tools to Solve It: Image Analyzer from Juicy Studio


Juicy Image Analyzer 

Doh! We need to add some dimensions onto our images.


It's not the prettiest tool in the world, but it does get the job done. The image analyzer will give any page a thorough evaluation, showing missing alt attributes and image dimensions (which can help with page rendering speed) and listing the names/alts in a thorough report. If you have image galleries you're aiming to optimize for image search, this is a great diagnostic system.
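The same basic audit is easy to script for a one-off check: list every <img> on a page and flag missing alt text or missing explicit dimensions. A minimal sketch using only the Python standard library (the URL is an example; Juicy Studio's analyzer is far more thorough):

```python
# A bare-bones image audit: flag <img> tags missing alt text or explicit
# width/height attributes (explicit dimensions help the browser lay out the page faster).
from urllib.request import Request, urlopen
from html.parser import HTMLParser

class ImageAudit(HTMLParser):
    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        a = dict(attrs)
        problems = []
        if not a.get("alt", "").strip():
            problems.append("missing/empty alt")
        if "width" not in a or "height" not in a:
            problems.append("no explicit dimensions")
        print(a.get("src", "(no src)"), "-", "; ".join(problems) or "OK")

url = "http://www.example.com/gallery/"
req = Request(url, headers={"User-Agent": "Mozilla/5.0 (compatible; ImageAudit/1.0)"})
ImageAudit().feed(urlopen(req).read().decode("utf-8", "ignore"))
```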


#14 - Instant Usability Testing


The Problem: Fast feedback on a new landing page, product page, tool design or web page (of any kind) can be essential to smoothing over rough launches. But tools aren't enough - we need actual human beings (and not the biased ones in our friend groups or company) giving fast, functional feedback. That's a challenge.


Tools to Solve It: Five Second Test, Feedback Army


FiveSecondTest

It can't be that easy, can it?


FiveSecondTest

Wow... It totally is! Here I am helping give feedback to a local geek squad.


FeedbackArmy

Users are easier to come by than we think


Both FeedbackArmy & FiveSecondTest offer the remarkable ability to get instant feedback from real users on any page, function or tool you want to test, at a fraction of the price normal usability testing requires. What I love is that because it's so easy, it lowers the barrier to that first, critical step of reaching out to users. Over time, I hope systems like these help make the web as a whole a more friendly, easy-to-use experience. Now there's no excuse!


#15 - Measure Tweet Activity to a URL Across Multiple URL Shortener Platforms


The Problem: You've got your bit.ly, your j.mp, your tinyurl, your ow.ly and dozens more URL shorteners. Between this plethora of options and standard HTML links pasted into tweets, keeping up with all the places your URL is being shared can be a big challenge.


Tools to Solve It: Backtweets


BackTweets

Tweeting links in the middle of the night is fun!


Bit.ly can track bit.ly links, and many other services offer their own tracking systems, but only Backtweets aggregates all of the sources and makes it easy to see what people are saying about your pages no matter how they encode the URL. Now if only we could get this to integrate with PostRank and Search.Twitter.com and Trendistic and make the interface super-gorgeous and have it integrate with Google Analytics... and... and...
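Even without Backtweets, you can at least normalize the shortened URLs you encounter by following their redirects to the final destination, which makes matching them against your own pages possible. A quick sketch (the short URLs are made-up examples):

```python
# Expand shortened URLs by following their redirects to the final destination,
# so tweets using bit.ly, j.mp, tinyurl, etc. can be matched against your pages.
import urllib.request

def expand(short_url):
    # urlopen follows redirects by default; .geturl() reports where we ended up.
    return urllib.request.urlopen(short_url, timeout=15).geturl()

for short in ["http://bit.ly/example1", "http://tinyurl.com/example2"]:
    try:
        print(short, "->", expand(short))
    except Exception as e:
        print(short, "-> could not expand:", e)
```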


#16 - BONUS: Determining Keyword Competition Levels


Bonus! I mentioned last week in a comment that I'd make a post about the new Keyword Difficulty Tool. Since this post is all about tools anyway, I figured I'd toss it in and save you the trouble of clicking an extra link in your feedreader.


The Problem: Figuring out which keywords have more/less demand than others is easy (and Google does a great job of it most of the time). Figuring out how competitive those keywords are - how hard it will actually be to rank for them - is much tougher.


Tools to Solve It: New Keyword Difficulty Tool


The real problem was that our previous keyword difficulty tool attempted to use 2nd-order effects and indirect metrics to estimate the competitiveness of a particular keyword term/phrase. While it's true that more popular/searched-for keywords TEND to be more competitive, this is certainly not always the case (and in fact, what SEOs probably care about most is when a keyword has high traffic but relatively weak sites/pages in the SERPs). The new tool attempts to fix this by relying on Page Authority (correlation data here) and using a weighted average across the top ranking sites and pages.
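To make the "weighted average" idea concrete, here's a toy illustration: higher-ranking results count for more, and the score is the weighted mean of their Page Authority. The PA values and weights below are invented for the example - this is not SEOmoz's actual formula:

```python
# Toy illustration of a weighted-average difficulty score: higher-ranking
# results count for more. The Page Authority values and weights are invented;
# this is NOT the actual Keyword Difficulty formula, just the general idea.
page_authority = [72, 68, 66, 61, 58, 55, 54, 50, 47, 45]  # top 10 results (hypothetical)
weights        = [10,  9,  8,  7,  6,  5,  4,  3,  2,  1]  # position 1 weighted most

score = sum(pa * w for pa, w in zip(page_authority, weights)) / sum(weights)
print("Difficulty: %.0f%%" % score)  # ~62% with these made-up numbers
```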


Keyword Difficulty

Running five keywords at a time is way better than one

(we're working to add more - promise)


Keyword Difficulty Scores

The best bet here looks like "best running shoes" - relatively lower difficulty, but still high volume


Keyword Difficulty for Best Running Shoes

Oh yeah, looking at the top positions, a few dozen good links and some on-page and we're there


Reverse engineering the rankings is never easy, but parsing through KW Difficulty reports certainly makes it less time-consuming. Watch out for the scores, though - a 65% is pretty darn tough, and even a 40% is no walk in the park. At long last, I feel really good about this tool; it suffered for a good 18 months, and it's nice to have it back in my primary repertoire with such solid functionality.




I'm sure there are plenty of remarkable tools I've missed and there are likely questions about these problems, too. Feel free to address both in the comments!


p.s. This was written very late at night and I need to be up and on a plane at precisely butt-o'clock tomorrow morning, so editing will have to slide until Jen wakes up and gives this a good once-over. Sorry about any errors in the meantime :-)


Note from Jen: I finally woke up and made a few minor edits. :) I also added a discount code from Virante "seomoz30" AND a discount code from PostRank "SEOmoz". Tools Rule!




