Good morning folks. We’re coming to you live this morning from the SES conference in London. We’ll be covering the sessions all day today as well as tomorrow so feel free to follow along with all the coverage.
For our first session this morning we’re having a look at Successful Information Architecture with Alan Perkins from Silverdisc and Richard Baxter from SEOgadget (the image is also from Rich).
Alan Perkins from Silverdisc
Alan started the session off by suggesting that your site has a mission that it’s trying to achieve and that should be accomplished through your information architecture.
The first step in achieving a good IA that will reveal the most relevant and important content to the searcher is to find out what the searcher’s mission is. You need to do keyword research and use the same language searchers use in your navigation and in the way you lay out your content. He ultimately defined the mission of building a site as answering these searchers’ missions. A bit meta, but I think I agree with his aims.
Alan then talked about some tools you may want to use for your keyword research, and used the example of Zappos targeting UK searchers for “mens running shoes” versus “mens trainers”. These sorts of oversights can be very costly if you don’t account for local dialects – the difference is in the realm of 10x more searches for the first phrase in the US and 10x more for the second in the UK. Alan suggested that Google Insights for Search tends to be the best tool for this.
Building a Good Skeleton or Wireframe
Alan talked about the need for a good skeleton and a hierarchical website (think Yahoo! Directory or DMOZ). This is, in his view, a smart and logical way for both users and spiders to navigate a moderate sized site and will help you in the long term. He also provided some helpful tips for improving internal linking within this sort of website:
Breadcrumbs are a great way to improve internal linking and pass weight up the trail to the most important pages. Not only are they useful for users as an extra way to navigate your site, they also naturally build up more links from deep pages back towards your most important, higher-level pages (an added pro-tip from me: they are also a great way to build keyword-rich anchor text pointing at those pages!).
Additionally, Alan pointed out that Google is now showing breadcrumbs in the SERPs – and although Google will sometimes generate these automatically, the most reliable way to get breadcrumbs displayed for your deeper pages is to mark them up with microformats.
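For the curious, a minimal sketch of what that markup might look like at the time of writing, using Google’s data-vocabulary.org Breadcrumb vocabulary (shown here as microdata; the example.com URLs and category names are placeholders, not from the session):

```html
<!-- One breadcrumb item per category level; URLs and titles are illustrative -->
<div itemscope itemtype="http://data-vocabulary.org/Breadcrumb">
  <a href="http://www.example.com/shoes/" itemprop="url">
    <span itemprop="title">Shoes</span>
  </a>
</div>
›
<div itemscope itemtype="http://data-vocabulary.org/Breadcrumb">
  <a href="http://www.example.com/shoes/trainers/" itemprop="url">
    <span itemprop="title">Trainers</span>
  </a>
</div>
```

The key idea is that each link in the visible breadcrumb trail is wrapped in markup the search engine can parse, so the trail can be reproduced in the SERP snippet.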
Rules of thumb
- link to the most common/important pages of content from the homepage.
- think of the typical visitor and what is likely to be most interesting to them or most searched for.
“The Marketing Spine”
For me, this was the best takeaway from Alan’s portion of the presentation. Alan talked about “the marketing spine” when building a site, and by this he meant: build a structure/core that you will stick to. The IA (the hierarchy, at least) should essentially never change, and all of the URLs central to your brand (i.e. your most exciting content) should be permanent. This results in fewer broken links for people linking to the site – and if they don’t need to update these links all the time (or fear that they might have to in the future) they will be much happier to link to your site.
Other Key Suggestions:
- You can change sales/fulfilment/etc. URLs, but you don’t want to change your “marketing” or landing page URLs
- Resist URLs that include terms like “new” or “special”
- Extensions – try to avoid .html or .pl because lord only knows how your site is going to be built in the future. Alan suggests using URLs that end with a slash (/).
- Social – liked and shared URLs. Just as with link building, link sharing is important, and changing a URL makes it hard to keep up and means you’ll lose the accumulated link and/or social juice.
Faceted Classification (“Guiding Navigation”)
Faceted navigation is another feature of many larger sites that Alan discussed as a great way to offer more user-friendly navigation (Duncan Morris covered this in depth at Distilled’s Pro Seminars). Sites like Zappos use it, effectively allowing users to look for shoes by all sorts of criteria (size, availability, colour, etc.). The problem is that it can lead to some messy issues if you let these pages get indexed and search engines start treating these internal search results as category pages. This is a good feature for large websites, but it needs great care from an SEO standpoint. Alan’s suggestions:
- Use rel=canonical on these potential duplicate pages
- Make careful use of robots.txt and the robots meta tag
- Use product feeds and sitemaps
Ultimately these solutions are not fail-safe, and Alan was hesitant to name a single “best” solution, but it is definitely important to think through your strategy if you choose to use this type of navigation.
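To make the first two suggestions concrete, here is a hedged sketch of the relevant tags as they might appear in the head of a filtered page (the example.com URLs and filter parameters are placeholders, not from the talk):

```html
<!-- Option 1: on a filtered URL such as /trainers/?colour=blue&size=9,
     point search engines back at the main category page -->
<link rel="canonical" href="http://www.example.com/trainers/" />

<!-- Option 2: keep the filtered page out of the index entirely,
     while still letting spiders follow the links on it -->
<meta name="robots" content="noindex, follow" />
```

A Disallow pattern in robots.txt can also block crawling of faceted URLs altogether, but as Alan said, none of these is fail-safe on its own.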
Alan closed by likening good IA to the difference between a castle and a sandcastle; they both look nice, but what really matters is the foundation that keeps them standing.
Richard Baxter from SEOgadget
Those who have attended conferences previously and heard Rich speak on information architecture will know that he has creative solutions for reducing silo effects within sites, cleaning up and flattening websites, and making sure your sites are getting indexed. If you’ve been fortunate enough to see this before you probably won’t learn much from this part of the post, but if you’ve not had the pleasure, please read on: this is really sound advice.
Rich kicked off with the humble disclaimer that most SEOs have their own unique view of IA and that these are only his views; however, I would say his advice is clever and worth listening to, even if he was being a bit modest.
The first step is making sure users and search engines can find our sites and get where they want to go easily and efficiently.
Getting started: indexation.
How hard are your pages working for you? Are they actually generating any traffic? A Google site: search is not a good representation of what is actually being indexed.
Question: how many clicks does it take from your homepage to your deepest content? This question is essential for both users and SEO.
Deep content (generally long-tail content) is a key generator of traffic, but it is often stuck at the bottom of sites with really poor architecture. Bad site architecture can bury this content so deep that a long-tail strategy fails completely – and given the amount of time that goes into content creation, this is NOT what you want. As a general rule the long tail is very volatile.
- Your first aim should really be to reduce the number of clicks to your deepest content.
- In general the most authoritative page is your homepage – and as a result it will attract the most links naturally.
- You should really try to have fewer than 3 clicks to the deepest level, 4 at a push.
- Flatten this out – it is fundamental to have a flat architecture with intelligent internal linking if you want to get the most out of these deeper pages.
Silos are often created because of the layout of a site and the natural tendency to create categories and not connect these categories to one another at all.
As an example Rich talked about a search for gifts under £20. On the site he showed, the “Top 5 Gifts” were always displayed and exactly the same across the whole site. Rich pointed out that these folks are seriously missing a trick: internal links should aim to understand what users are looking for and show relevant, dynamically loaded categories here (all under £20, or all Valentine’s Day focussed, for example). By doing so, you increase the quality and relevance of the other products you are showing the user (i.e. hopefully drive more sales) but also increase cross-linking to a number of pages rather than the same 5 over and over.
Basics of Navigation
As Rich says, and I can attest, there are so many sites where you turn off JavaScript and disable the CSS stylesheets and all of the links disappear. This usually means that these links (i.e. your most important navigational links) cannot be seen across your site by Google or by some less capable devices. To figure out if you are suffering from this, try looking at the text-only view of Google’s cache of the page. If you can’t see the links that appeared in the navigation here, neither can Google! This seems so basic, but you really should make sure your global navigation is crawlable, as it is the backbone of good IA.
Rich’s suggestion: use a CSS-styled DHTML “cross-browser drop-down cascading validating menu”
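The point of that mouthful is that the links live in plain HTML and the drop-down behaviour comes from CSS, so spiders see everything. A minimal sketch of the pattern (category names and paths are placeholders, not from the session):

```html
<style>
  /* Sub-menus are plain HTML lists, hidden until hover –
     the links are always present in the markup for crawlers */
  .nav li ul { display: none; }
  .nav li:hover ul { display: block; }
</style>
<ul class="nav">
  <li><a href="/shoes/">Shoes</a>
    <ul>
      <li><a href="/shoes/trainers/">Trainers</a></li>
      <li><a href="/shoes/boots/">Boots</a></li>
    </ul>
  </li>
</ul>
```

Contrast this with menus built entirely by JavaScript, where the sub-menu links may never appear in the HTML the spider fetches.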
Who is doing this well?
Amazon is really good at this (e.g. “customers who bought this item also bought…”). You can easily justify cross-category links like these to dev and design teams who are worried about a “less pretty” version of the navigation, because they help both users and search engines. Result: improvement in indexation and more long-tail traffic.
How many links should I have on my homepage?
There was an interesting Q&A on this one, but as a finger in the air: 150? Designers may well say “but that looks horrible” – and it would if you stacked them all up – but there are clever pure-HTML solutions to this.
Rich used Cheapflights as an example – they have so many destinations, so how do they cope? The Cheapflights homepage is categorised by type of holiday (skiing, beach, etc.), and rather than put everything in the same place they rely on some clever styling: click “more” and another load of links is revealed. The nice thing about this is that those links are visible to the search engines the whole time but only shown to users when they want to see them! This is a smart way to increase the number of links on the page.
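A sketch of the general pattern (this is my illustration of the idea, not Cheapflights’ actual code; the destinations and paths are placeholders). The extra links sit in the HTML from the start and are only hidden visually, so crawlers always see them:

```html
<ul id="destinations">
  <li><a href="/flights/spain/">Spain</a></li>
  <li><a href="/flights/france/">France</a></li>
  <!-- These links are always in the markup; CSS just hides them from users -->
  <li class="extra" style="display: none;"><a href="/flights/iceland/">Iceland</a></li>
  <li class="extra" style="display: none;"><a href="/flights/norway/">Norway</a></li>
</ul>
<a href="#" onclick="
  var extras = document.querySelectorAll('#destinations .extra');
  for (var i = 0; i &lt; extras.length; i++) extras[i].style.display = 'list-item';
  return false;">more destinations</a>
```

The crucial design choice is hiding with CSS rather than fetching the extra links with JavaScript on click, which would keep them out of the crawled HTML entirely.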
Pages need to be visually appealing to convert well, but they can be done using clever HTML rather than relying on technology that hides the navigation from the search engines.
Make unique category pages! IA is only one small part of the bigger picture. Get some unique content on every page, particularly the category pages. 200 words across 600 category pages makes a massive difference – and since the Mayday update this has become even more important.
Rich further suggested using user-generated content to complement syndicated content and help you avoid duplicate content issues.
Rich’s main point here was to make your pages unique! Asking users for a review, for example, can have a huge impact on the long tail.
Tools to use:
Rich suggested using Google Webmaster Tools (how many pages does Google say it has in the index?) – not foolproof, but it can often alert you to big blocks of missing content and is much more reliable than a Google site: search.
SEOmoz – their PRO tool now has an integrated analytics tool and can help you suss out some of these issues, including which pages have likely been indexed and when you have way too many links on a page.
That’s all for this session! Time for some Real Time Search next from me.
Posted in Search Engine Strategies | Tags: SES London 2011