An important aspect of Search Engine Optimization is making your website easy for both users and search engine robots to understand. Although search engines have become increasingly sophisticated, in many ways they still can’t see and understand a web page the same way a human does. SEO helps the engines figure out what each page is about, and how it may be useful for users.
The major search engines all operate on the same principles. Automated search bots crawl the web, follow links and index content in massive databases. They accomplish this with a type of dazzling artificial intelligence that is nothing short of amazing. That said, modern search technology is not all-powerful. There are technical limitations of all kinds that cause immense problems in both inclusion and rankings. I’ve listed the most common below:
1. Spidering and Indexing Problems
- Search engines can’t complete online forms (such as a login), and thus any content contained behind them may remain hidden.
- Websites using a CMS (Content Management System) often create duplicate versions of the same page – a major problem for search engines looking for completely original content.
- Errors in a website’s crawling directives (robots.txt) may lead to blocking search engines entirely.
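To illustrate, a single misplaced directive in robots.txt can shut crawlers out of an entire site. This hypothetical file shows the mistake alongside the likely intent:

```
User-agent: *
Disallow: /

# The single "/" above blocks every page from every crawler.
# What was probably intended: block only a private folder, e.g.
# Disallow: /private/
```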
- Poor link structures lead to search engines failing to reach all of a website’s content. In other cases, poor link structures allow search engines to spider content, but leave it so minimally exposed that it’s deemed “unimportant” by the engine’s index.
- Interpreting non-text content. Although the engines are getting better at reading non-HTML text, content in rich media formats is traditionally difficult for search engines to parse. This includes text in Flash files, images, video, audio, and plug-in content.
2. Content to Query Matching
- Text that is not written in common terms that people use to search. For example, writing about “young persons prestidigitator” when people actually search for “kids magician”.
- Language and internationalization subtleties. For example, color vs colour. When in doubt, check what people are searching for and use exact matches in your content.
- Mixed contextual signals. For example, the title of your blog post is “Detroit’s Best Magician” but the page is actually about organizing a great birthday party. These mixed messages send confusing signals to search engines.
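As a sketch of that last point (the titles here are invented for illustration), consistent signals mean the title tag and the page's main heading describe the same topic:

```html
<!-- Mixed signals: the title promises one topic, the content delivers another -->
<title>Detroit's Best Magician</title>
<h1>How to Organize a Great Birthday Party</h1>

<!-- Consistent signals: title and heading describe the same topic -->
<title>How to Organize a Great Birthday Party</title>
<h1>How to Organize a Great Birthday Party</h1>
```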
SEO and Marketing
SEO isn’t just about getting the technical details of search-engine friendly web development correct. It’s also about marketing. This is perhaps the most important concept to grasp about the functionality of search engines. You can build a perfect website, but its content can remain invisible to search engines unless you promote it. This is due to the nature of search technology, which relies on the metrics of relevance and importance to display results.
When search marketing began in the mid-1990s, manual submission, the meta keywords tag and keyword stuffing were all regular parts of the tactics necessary to rank well. In 2004, link bombing with anchor text, buying hordes of links from automated blog comment spam injectors and the construction of inter-linking farms of websites could all be leveraged for traffic. In 2011, social media marketing and vertical search inclusion are mainstream methods for conducting search engine optimization.
In order to be listed in the search engines, your most important content should be in HTML text format. Images, Flash files, Java applets, and other non-text content are often ignored or devalued by search engine spiders, despite advances in crawling technology.
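As a sketch (file names and copy are hypothetical), the same message can be made visible to spiders by pairing rich media with real HTML text and a descriptive alt attribute:

```html
<!-- Indexable: HTML text plus a descriptive alt attribute on the image -->
<img src="magician-show.jpg" alt="Kids magician performing at a birthday party">
<p>Our kids magician performs at birthday parties across Detroit.</p>

<!-- Hard to index: the same message locked inside a Flash movie -->
<object data="intro.swf" type="application/x-shockwave-flash"></object>
```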
Just as search engines need to see content in order to list pages in their massive keyword-based indices, they also need to see links in order to find the content. A crawlable link structure – one that lets their spiders browse the pathways of a website – is vital in order to find all of the pages on a website. Hundreds of thousands of sites make the critical mistake of structuring their navigation in ways that search engines cannot access, thus impacting their ability to get pages listed in the search engines’ indices.
Generally speaking, search engines cannot follow:
- Submission-required forms
- Links pointing to pages blocked by the meta robots tag or robots.txt
- Frames or iframes
- Links in Flash, Java, or other plug-ins
- Links on pages with many hundreds or thousands of links
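To illustrate the difference (the URL is hypothetical), a standard anchor element is crawlable, while navigation that exists only in JavaScript generally is not:

```html
<!-- Crawlable: a plain HTML anchor with an href attribute -->
<a href="/birthday-magician">Birthday magician</a>

<!-- Generally not crawlable: a "link" that exists only in script -->
<span onclick="window.location='/birthday-magician'">Birthday magician</span>
```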
Keywords are fundamental to the search process – they are the building blocks of language and of search. In fact, the entire science of information retrieval (including web-based search engines like Google) is based on keywords. As the engines crawl and index the contents of pages around the web, they keep track of those pages in keyword-based indices. Thus, rather than storing 25 billion web pages all in one database, the engines have millions and millions of smaller databases, each centered on a particular keyword term or phrase. This makes it much faster for the engines to retrieve the data they need in a mere fraction of a second.
Obviously, if you want your page to have a chance of ranking in the search results for “magic,” it’s wise to make sure the word “magic” is part of the indexable content of your document.
Keywords dominate how we communicate our search intent and interact with the engines.
When a search is performed, the engine matches pages to retrieve based on the words entered into the search box. Other data, such as the order of the words (“corporate magician” vs. “magician corporate”), spelling, punctuation, and capitalization of those keywords provide additional information that the engines use to help retrieve the right pages and rank them.
To help accomplish this, search engines measure the ways keywords are used on pages to help determine the “relevance” of a particular document to a query. One of the best ways to optimize a page’s rankings is to ensure that keywords are prominently used in titles, text, and meta data.
Generally, the more specific your keywords, the better your chances of ranking, because there is less competition. The map graphic to the left compares the broad term magician to the more specific keyword birthday magician. Notice that while there are a lot of results (size of country) for the broad term, there are far fewer results, and thus less competition, for the specific term.
Since the dawn of online search, folks have abused keywords in a misguided effort to manipulate the engines. This involves “stuffing” keywords into text, the URL, meta tags and links. Unfortunately, this tactic almost always does more harm than good to your site.
In the early days, search engines relied on keyword usage as a prime relevancy signal, regardless of how the keywords were actually used. Today, although search engines still can’t read and comprehend text as well as a human, the use of machine learning has allowed them to get closer to this ideal.
The best practice is to use your keywords naturally and strategically
That said, keyword usage and targeting are still a part of the search engines’ ranking algorithms, and we can leverage some effective best practices for keyword usage to help create pages that are close to “optimized.”
- Use the keyword in the title tag at least once. Try to keep the keyword as close to the beginning of the title tag as possible.
- Once prominently near the top of the page.
- At least 2-3 times, including variations, in the body copy on the page, and sometimes a few more if there’s a lot of text content. Adding more instances beyond this tends to have little to no impact on rankings.
- At least once in the alt attribute of an image on the page. This not only helps with web search, but also image search, which can occasionally bring valuable traffic.
- Once in the URL.
- At least once in the meta description tag. Note that the meta description tag does NOT get used by the engines for rankings, but rather helps to attract clicks by searchers from the results page, as it is the “snippet” of text used by the search engines.
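Pulled together, the recommendations above might look like this minimal page targeting the phrase “birthday magician” (the URL, names, and copy are invented for illustration):

```html
<!-- URL: example.com/birthday-magician — the keyword appears once in the URL -->
<head>
  <!-- Keyword at the start of the title tag -->
  <title>Birthday Magician for Kids' Parties in Detroit</title>
  <!-- Meta description: not a ranking factor, but it becomes the snippet -->
  <meta name="description"
        content="Hire a birthday magician who keeps kids laughing. Shows for parties of all sizes in the Detroit area.">
</head>
<body>
  <!-- Keyword prominently near the top of the page -->
  <h1>Birthday Magician in Detroit</h1>
  <p>Looking for a birthday magician? Our shows combine comedy and classic
     magic, with two to three natural uses of the phrase in the body copy.</p>
  <!-- Keyword in an image's alt attribute, which also helps image search -->
  <img src="show.jpg" alt="Birthday magician performing a card trick">
</body>
```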
The title element of a page is meant to be an accurate, concise description of a page’s content. It is critical to both user experience and search engine optimization.
As title tags are such an important part of search engine optimization, the following best practices for title tag creation make for terrific low-hanging SEO fruit. The recommendations below cover the critical parts of optimizing title tags for search engine and usability goals.
Place important keywords close to the front
The closer to the start of the title tag your keywords are, the more helpful they’ll be for ranking and the more likely a user will be to click them in the search results.
Include branding

Ending every title tag with a brand-name mention helps increase brand awareness and creates a higher click-through rate among people who know and like the brand. Sometimes it makes sense to place your brand at the beginning of the title tag instead, such as on your homepage. Since words at the beginning of the title tag carry more weight, be mindful of what you are trying to rank for.
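For instance (the brand and page names here are hypothetical), the two placements might look like:

```html
<!-- Keyword first, brand last: typical for interior pages -->
<title>Birthday Party Magician in Detroit | Magic Mike's</title>

<!-- Brand first: often reserved for the homepage -->
<title>Magic Mike's | Detroit Magicians for Parties and Events</title>
```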
Consider readability and emotional impact
Title tags should be descriptive and readable. Creating a compelling title tag will pull in more visits from the search results and can help to invest visitors in your site. Thus, it’s important to not only think about optimization and keyword usage, but the entire user experience. The title tag is a new visitor’s first interaction with your brand and should convey the most positive impression possible.
The meta description tag exists as a short description of a page’s content. Search engines do not use the keywords or phrases in this tag for rankings, but meta descriptions are the primary source for the snippet of text displayed beneath a listing in the results.
The meta description tag serves the function of advertising copy, drawing readers to your site from the results and thus, is an extremely important part of search marketing. Crafting a readable, compelling description using important keywords (notice how Google “bolds” the searched keywords in the description) can draw a much higher click-through rate of searchers to your page.
Meta descriptions can be any length, but search engines generally truncate snippets longer than roughly 160 characters, so it’s wise to stay within that limit.
In the absence of a meta description, search engines will create the search snippet from other elements of the page. For pages that target multiple keywords and topics, letting the engines build the snippet themselves is a perfectly valid tactic.
Place yourself in the mind of a user and look at your URL. If you can easily and accurately predict the content you’d expect to find on the page, your URLs are appropriately descriptive. You don’t need to spell out every last detail in the URL, but a rough idea is a good starting point.
Shorter is better
While a descriptive URL is important, minimizing length and trailing slashes will make your URLs easier to copy and paste (into emails, blog posts, text messages, etc.) and more likely to be displayed in full in the search results.
Keyword use is important (but overuse is dangerous)
If your page is targeting a specific term or phrase, make sure to include it in the URL. However, don’t go overboard by trying to stuff in multiple keywords for SEO purposes – overuse will result in less usable URLs and can trip spam filters.
The best URLs are human readable, without lots of parameters, numbers and symbols. Even a single dynamic parameter in a URL can result in lower overall rankings and reduced indexing.
Use hyphens to separate words
Not all web applications accurately interpret separators like the underscore (_), plus (+), or space (%20), so use the hyphen (-) to separate words in a URL, as in google-fresh-factor.
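For example (these URLs are hypothetical), hyphen-separated words stay readable where other separators can be mangled or run together:

```
Good:   example.com/blog/google-fresh-factor
Risky:  example.com/blog/google_fresh_factor    (underscores are not always treated as word separators)
Avoid:  example.com/index.php?id=327&cat=14     (parameters and numbers, no readable keywords)
```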