SEO for the Average Joe: Website Structure and Crawlability

How do search engines pick your website to rank for the right keywords from among millions of other sites around the world? It seems like a perfect example of finding a needle in a haystack, but they’ve created hefty programs called spiders or bots to do just that.

Googlebot, Google’s indexing program, continuously crawls the Web updating Google’s database of websites. When it gets to yours, it will take cues from the site structure and content to determine what your site is all about. Today we’ll talk about how you can optimize your site structure for the best results when the search engine bots find you.

Navigation

A good rule of thumb: the easier and more intuitive your site is for people to navigate, the easier it will be for the search engine bots to determine what your site is about. Organize your pages into a hierarchy that groups related topics together. If you have a large site, Google may not get to every page, so make sure your most important content sits within the first couple of tiers. Ideally, these pages will be accessible from the main navigation bar on your home page.

Links between pages are the paths users and bots use to get around your site. In addition to the primary navigation bar, links between pages help visitors find their way around and tell the bots a little more about what’s important on your site. Use descriptive keywords as anchor text for links between pages. These can include links within the body of your pages, footer links, and images you use as buttons.
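
For example, in your page’s HTML, a descriptive link beats a generic one. The page address here is just a hypothetical example:

<!-- Descriptive anchor text tells visitors and bots what the linked page is about -->
<a href="/kitchen-remodeling">kitchen remodeling services</a>

<!-- Vague anchor text like this tells them nothing -->
<a href="/kitchen-remodeling">click here</a>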

You’ll also want to avoid elements that bog down load time, such as large image files and videos that play automatically. The bots give each site only so much of their time, and they may skip pages that take too long to finish loading.

Tip: If you have lower pages you feel are important to have indexed, link to them directly from your home page — either in text, in the navigation bar, or with footer links.

URL Structure

Do the URLs of your inner pages look something like this?

www.yoursite.com/index.php?option=com_content&view=article&id=8&Itemid=4

URLs packed with long strings of parameters aren’t just hard for people to read; they also tell the search engine bots nothing about the page. Make your inner page URLs brief and descriptive, like this:

www.yoursite.com/descriptive-title

URLs should also make your site’s hierarchy easy for people and bots to understand. Include the path in the URLs of pages deeper in your site, like so:

www.yoursite.com/descriptive-title/specific-topic

Tip: Use a hyphen (-) rather than an underscore (_) to separate words in your URLs.
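
If you’re curious how sites serve friendly URLs like the ones above, here’s a rough sketch of the kind of rewrite rule that does it on an Apache server. The file and parameter names are placeholders, and most content management systems handle this for you, so treat it as an illustration rather than something to copy blindly:

RewriteEngine On
# Leave real files and folders (images, stylesheets) alone
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
# Hand the friendly path to the site's script as a parameter
RewriteRule ^(.*)$ index.php?page=$1 [L,QSA]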

Redirects

Duplicate pages and links that lead nowhere can hurt your search engine rankings. The solution is to guide bots and visitors to where you want them to be. Two tools will cover most of your needs: 301 redirects and custom 404 error pages.

301 redirects — These are useful when multiple URLs point to the same page, either because you’ve moved your site to a new address or because your site can be reached in several ways (ex. yoursite.com, www.yoursite.com, home.yoursite.com). A small piece of server configuration on the old or duplicate address tells Google and browsers to go to the correct page instead, and most visitors won’t even notice they’ve been redirected.
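
On an Apache server, for example, both situations can be handled with a few lines in the site’s .htaccess file. The addresses below are placeholders, and other servers and content management systems have their own equivalents:

# Permanently redirect a page that has moved
Redirect 301 /old-page.html http://www.yoursite.com/descriptive-title

# Send the bare domain to the www version so each page lives at one address
RewriteEngine On
RewriteCond %{HTTP_HOST} ^yoursite\.com$ [NC]
RewriteRule ^(.*)$ http://www.yoursite.com/$1 [R=301,L]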

Custom 404 pages — When you delete or move a page, visitors and bots who follow the old URL are, by default, dumped on a bare 404 error page: a dead end. You can customize your 404 page so it looks friendlier and offers links to continue on your site, which keeps visitors and bots moving happily through your pages. If the old page had built up authority, add a 301 redirect to the most relevant replacement page so that value isn’t lost.
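
On Apache, pointing the server at your custom page takes a single line in .htaccess; the filename here is just an example, and most content management systems offer a 404 template you can edit instead:

# Serve our custom error page whenever a URL isn't found
ErrorDocument 404 /404.html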

Tip: Sites with large online stores tend to delete and duplicate pages frequently as products change. Make sure you have a system for adding the appropriate redirects as that happens.

Sitemap

Sitemaps are just what they sound like: maps of your site that make it easy for visitors and bots to find the pages they’re looking for. There are two main kinds: HTML and XML. HTML sitemaps are more useful for visitors, while the search engines prefer XML. You can add both or just one.
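
For a sense of what the search engines see, here is a minimal XML sitemap listing two placeholder pages; in practice a plugin or generator usually builds and updates this file for you:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.yoursite.com/</loc>
  </url>
  <url>
    <loc>http://www.yoursite.com/descriptive-title</loc>
  </url>
</urlset>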

Keep in mind that Google can index your site even without a sitemap. On a large site, though, a sitemap may get you indexed a little faster and may even get a few more pages indexed. It’s worth the time to add one.

Tip: You can get a free XML sitemap file made for your site at xml-sitemaps.com.

SEO for the Average Joe: Overview of Search Engine Optimization

Most business owners these days recognize the importance of creating an online presence as a company. Having a website allows your potential customers to quickly and conveniently learn about your products or services, your company’s history, your location and hours (if applicable), your prices, and anything else they’ll want to know before deciding to give you their money. However, your website is only helpful if people are actually visiting it, and that’s where search engine optimization (SEO) comes in.

This article will teach you the basic ideas behind SEO, creating a foundation for understanding the practical advice, how-to information, and deeper SEO concepts presented in the subsequent articles in the “SEO for the Average Joe” series.

Search Engines and the World Wide Web

The World Wide Web is made up of several billion individual web pages. These pages cover every topic you can think of and tons of topics you’d never think of. They contain text, images, audio, and video, forming an enormous and ever-expanding pool of data in a variety of forms.

Water, Water Everywhere . . .

The sheer amount of information on the Web is awe-inspiring, but the actual value of this virtual library isn’t just the fact that all that information exists: actual value comes when you find the information you need. Sure, there are billions of pages out there, but you don’t want a billion pages when you sit down at the computer and get online. What you want is just a few relevant pages with the information you need. If you know right where that information is, you just type the address (or “URL”) of the desired web page into your web browser’s address bar or click on a bookmark and go directly to that page. That’s easy. What happens, though, if you don’t know where to find what you’re looking for? That’s when a search engine becomes your best friend.

Making the Web Practical

Search engines (like Google, Yahoo! Search, or Bing) play a vital role in today’s online world. They use automated programs to constantly scan web pages (“web crawling”), record pertinent data about those pages (“indexing”), and provide links to relevant indexed pages when you enter a search query in the search engine’s interface. They take the Web’s mind-boggling vastness and make it into something practical for you.

For example, let’s say you’re considering getting a pet and you do a search for “popular indoor dog breeds.” Nearly instantaneously, the search engine will dive into its enormous index of web page information and display links to the pages it determines are most relevant and helpful based on your search query. To determine relevance, search engines use extremely complex algorithms that analyze many different factors. Which pages contain the words you searched for, or words closely related to your query? Which pages do other people visit when they’re looking for this kind of information? Which sites are considered authoritative in this field (as evidenced by other sites linking to them as a reference)? Using a proprietary evaluation method and a trove of stored data about web pages, each search engine attempts to give you the best results for your query, and the better the result, the higher it appears on the search engine results page (SERP).

Why Ranking Higher Matters

Here’s where things get practical for your business. Most people using a search engine only look at the first page of results, and results listed higher on the SERP are far more likely to get clicked; the top result draws more clicks than any other by a wide margin. If you own a business, you should have dollar signs in your eyes right now. As mentioned earlier, your company’s website only helps if people actually visit it, and what better way to get people to visit than to rank high in the SERPs for popular searches?

Because ranking well with the search engines can bring so much traffic (i.e., potential customers) to your site, enlightened business owners are investing more and more in optimizing their sites to rank well with the search engines, especially Google, which has nearly ninety percent of the search engine market share. An entire industry has sprung up around increasing website traffic through improving search rankings.

“Black Hat” and “White Hat” SEO

SEO companies use a number of tactics to improve search rankings for websites. These tactics are broadly categorized as “black hat” and “white hat.”

The Dark Side
Black Hat SEO

Black hat SEO consists of deceptive practices that focus on the search engine rather than site visitors, attempting to trick search engines into ranking a page higher. This often involves creating spammy inbound links on several low-quality sites, attempting to convince the search engine algorithms that the target site is authoritative and popular. Stuffing the target site’s displayed content and behind-the-scenes code with popular search terms is another major black hat technique. In the early days of search engines, tactics like these could be quite effective, but over time, search algorithms have evolved and are usually able to detect sites using unnatural SEO tactics and prevent them from influencing rankings. In fact, many sites tainted by black hat SEO are even penalized and removed from the SERPs altogether.

The Light Side
White Hat SEO

White hat SEO is based on the philosophy that by focusing on users instead of directly on search engines, the search engines will naturally recognize good, relevant sites and reward them with higher rankings. White hat SEO involves optimizing website content and coding in legitimate ways to clearly show their relevance to popular search queries. It also uses high-quality content and links on third-party sites to reference the primary site as an authority in its corresponding industry.

Choosing Sides

Because search engine algorithms are frequently updated in a never-ending quest to provide the best results, SEO practices are constantly shifting and adapting, as well. Black hat SEO tactics often lose their impact as search engines update, but white hat SEO strives for evergreen effectiveness that will weather any changes in search algorithms. Black hat is the “Dark Side” of the Force that is SEO. It may give some immediate power, but it leaves its user twisted, ugly, weakened, and despised in the end. White hat SEO, the “Light Side” of the Force, is powerful, enduring, and beneficial for all. If it isn’t obvious from everything you just read, you should never use black hat SEO tactics. White hat SEO is safer and more effective, and the days of black hat SEO are quickly coming to an end.

Enough Snorkeling—Dive Deeper!

Now that you have a solid understanding of search engines and what they can mean for your business, it’s time to go into more depth regarding specific ways to optimize your company’s online presence. Keep reading the “SEO for the Average Joe” series to learn more about everything that goes into making a site rank better with the search engines.