Things to Consider When Changing Your SEO Strategies

The nature of SEO has changed more in the past year than it did in the previous five years combined. To better serve our clients, we need to commit to changing our SEO strategies and tactics at a similar pace. Mobile technologies and voice recognition are altering the way people access the internet, with fifty percent of consumers using voice search more frequently now than they did 12 months ago.

I find it useful to think about SEO from two perspectives: users and bots. While both types of optimization aim to make your page more visible and may impact your SERPs, SEO places a heavier emphasis on user experience, while spider or crawler optimization is entirely about appealing to bots. As Neil Patel puts it on KISSmetrics:

Search engine optimization is focused more upon the process of optimizing for user’s queries. Googlebot optimization is focused upon how Google’s crawler accesses your site.

Before we discuss content, keywords and social signals, let’s make sure your site is search engine friendly so that Googlebot can understand your content strategy.

Make Your Website Friendly to Bots and Search Engines

Ensure Your Pages Are Crawlable

Your page is crawlable when search engine spiders can find and follow links within your website. Configure your .htaccess and robots.txt files so that they don't block your site's critical pages. You may also want to provide text versions of pages that rely heavily on rich media files, such as Flash and Silverlight.
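As a minimal sketch, a robots.txt along these lines keeps crawlers out of housekeeping areas without blocking important pages (the paths below are hypothetical placeholders; adjust them to your own site):

    # Allow all crawlers, but keep them out of non-public areas
    User-agent: *
    Disallow: /admin/
    Disallow: /tmp/

    # Point crawlers at your sitemap
    Sitemap: https://www.example.com/sitemap.xml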

Of course, the opposite is true if you want to prevent a page from showing up in search results. In this case, you'll need to block the page from being indexed by using the noindex robots meta tag or the X-Robots-Tag HTTP header. Note that if you use the noindex meta tag or X-Robots-Tag, you should not also disallow the page in robots.txt: the page must be crawled before the tag can be seen and obeyed.
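For example, either of the following prevents indexing; the meta tag goes in the page's HTML head, while the HTTP header also works for non-HTML files such as PDFs:

    <!-- in the page's <head> -->
    <meta name="robots" content="noindex">

    # or sent as an HTTP response header
    X-Robots-Tag: noindex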

Avoid Redirect Chains

When your website has long redirect chains, i.e. a large number of 301 and 302 redirects in a row, spiders such as Googlebot may drop off before they reach your destination page, which means that page won't be indexed. Best practice is to have as few redirects as possible on your website, and no more than two in a row.
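One way to audit your site for chains is to follow each URL and count the hops. Here's a minimal sketch using Python's requests library (the URL is a placeholder):

    import requests

    def redirect_chain(url):
        """Return every URL visited on the way to the final page."""
        response = requests.get(url, allow_redirects=True, timeout=10)
        # response.history holds one entry per intermediate redirect
        return [r.url for r in response.history] + [response.url]

    chain = redirect_chain("https://www.example.com/old-page")
    if len(chain) > 3:  # more than two redirects before the destination
        print("Redirect chain too long: " + " -> ".join(chain))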

Fix Broken Links

The conventional wisdom late last year was that broken links do not play a substantial role in rankings, even though they greatly impede Googlebot’s ability to index and rank your website. That said, Google’s algorithm has improved substantially over the years, and anything that affects user experience is likely to impact search engine ranking.
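A simple script can flag broken links before Googlebot trips over them. This sketch assumes you already have a list of URLs to check; on a real site you would extract them from your pages or sitemap:

    import requests

    urls = [
        "https://www.example.com/about",
        "https://www.example.com/old-press-release",
    ]

    for url in urls:
        try:
            # HEAD is cheaper than GET when only the status code matters
            status = requests.head(url, allow_redirects=True, timeout=10).status_code
        except requests.RequestException:
            status = None
        if status is None or status >= 400:
            print(f"Broken link: {url} (status: {status})")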

Set Parameters on Dynamic URLs

Spiders treat multiple dynamic URLs that lead to the same page as separate pages. You can manage your URL parameters by going to Google Search Console and clicking Crawl > URL Parameters. From there, you can let Googlebot know when your CMS adds parameters to your URLs that don't change a page's content.
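A complementary technique for the same problem, widely used alongside the Search Console setting, is a rel="canonical" tag that tells crawlers which version of a parameterized URL is the "real" page. For example, if these hypothetical URLs all serve the same content:

    https://www.example.com/shoes?sort=price
    https://www.example.com/shoes?sessionid=12345

each variant's HTML head can point at the canonical version:

    <link rel="canonical" href="https://www.example.com/shoes">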

Clean Up Your Sitemap

XML sitemaps help both your users and spider bots by making your content better organized and easier to find. Try to keep your sitemap up-to-date and purge it of any clutter that may harm your site’s usability, including 400-level pages, unnecessary redirects, non-canonical pages, and blocked pages.
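For reference, a clean sitemap entry is simple; every URL listed should be a live, canonical, indexable page (the URLs and dates below are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.com/</loc>
        <lastmod>2016-01-15</lastmod>
      </url>
      <url>
        <loc>https://www.example.com/services/</loc>
        <lastmod>2016-01-10</lastmod>
      </url>
    </urlset>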

Feeds Still Help

Feeds such as RSS and Atom allow websites to deliver content to users even when they're not browsing your website. While RSS feeds have long been a good way to boost your readership and engagement, they're also among the pages Googlebot visits most often. When your website receives an update (e.g. new products, a blog post, a site change), submit it through Google's FeedBurner so you can be sure it's properly indexed.
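Most CMSs generate a feed automatically, but for reference, a minimal RSS 2.0 feed looks like this (titles and URLs are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <rss version="2.0">
      <channel>
        <title>Example Blog</title>
        <link>https://www.example.com/blog</link>
        <description>Latest posts from Example Blog</description>
        <item>
          <title>Things to Consider When Changing Your SEO Strategies</title>
          <link>https://www.example.com/blog/changing-seo-strategies</link>
          <pubDate>Fri, 15 Jan 2016 09:00:00 GMT</pubDate>
        </item>
      </channel>
    </rss>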

Don’t Build External Links: Earn Them

Evaluate your inbound link profile. Google only wants to count links you editorially “earn.” Google has an uncanny ability to spot earned or editorial vs. manipulative links. Many of Google’s algorithm updates over the past two years have targeted websites with links via directories, forums, account profiles, guest posting, and commenting. Yet there remains a strong correlation between the number of spider visits and the number of quality external links.

Maintain Internal Link Integrity

A well-organized internal linking structure may also improve user experience, especially if users can reach any area of your website in three clicks. Making everything more easily accessible in general means visitors will linger longer, which may improve your SERPs.

Content, Keywords and Social Signals

Keyword matching is now intent matching. Avoid AdWords-only keyword research. Try adding these techniques to your keyword discovery process.

Use Google Suggest

Many queries with hundreds or thousands of monthly searches don't show up in AdWords; however, they will appear in Google Suggest. Keyword Tool is a great way to streamline this process.
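If you want to pull suggestions in bulk, tools like Keyword Tool build on Google's unofficial autocomplete endpoint. Here's a minimal sketch; because the endpoint is undocumented, it can change or be rate-limited at any time, so treat this as exploratory only:

    import requests

    def google_suggestions(seed):
        """Fetch autocomplete suggestions for a seed term (unofficial endpoint)."""
        response = requests.get(
            "https://suggestqueries.google.com/complete/search",
            params={"client": "firefox", "q": seed},
            timeout=10,
        )
        # The response is a JSON array: [seed, [suggestion, suggestion, ...]]
        return response.json()[1]

    print(google_suggestions("wide fit dress shoes"))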

Broaden Your Keyword and Topic Base

How do your client's competitors, customers, and industry professionals describe what you are promoting? Visit industry-related sites, forums, and association sites. Look at event calendars for seminar titles, competitor ads, and member posts. Often, you'll discover valuable keywords or hot topics that you or your client missed.

Include the voice of the customer. Popular questions, phrases, and terms that your audience uses in web forums/social media are excellent sources of keyword data expansion.

Use PPC Data

If you are using AdWords, make sure to look at the actual search queries that generated your impressions. It is always eye-opening to discover the actual terms used by your visitors, not just the terms you bid on. Google says more than 40% of search terms are unique, meaning that they have never been used before. Often these queries reveal terrific organic keyword or topic optimization opportunities.

Voice and Mobile Search

Consumers no longer have to type awkward or disjointed queries into their phones. Now, anyone can pick up their phone and ask it, “where can I find wide-fit, black men’s dress shoes for under $100?” and Siri, Cortana, or Google Now will instantly provide the answer they’re looking for. The growth of voice search is sure to lead to many more natural language queries that provide much greater amounts of contextual information and data about the searcher’s true intent. Voice search is a critical factor for local businesses, especially retailers, restaurateurs and those in services and trade industries.

Content Creation

Only a short while ago, we could still create & scale keyword-targeted pages of “good, unique content.” Remember, search engines are improving in their ability to understand meaning and context, not just keywords.

Pages with numerous keywords and keyword permutations are frequently outranked by pages with the following attributes:

  • Relevant – contains content the engines can interpret as on-the-topic to the searcher’s intent & phrasing
  • One-of-a-Kind – doesn’t appear elsewhere on the web
  • Helpful – resolves the searcher’s query in a useful, efficient manner
  • Uniquely Valuable – provides information that’s unavailable (or hard to get) elsewhere
  • Great UX – is easy & pleasurable to consume on any device

Social Media Signals

In 2014, many people believed that there was a strong correlation between tweets (and Facebook and Google+ shares) and higher SEO rankings. What appears to be true is that the value of social media in search is that social shares expose content to those who might link to it (and amplify it in other ways).

Facebook is still the biggest driver of social traffic, and most of that traffic still comes from a small percentage of content: social links & shares skew to the top 5% of content. We use Facebook ads to buy exposure, which leads to links via sharing; we do not purchase links themselves.

Don’t Publish Content Just to Get Links and Rankings

Know what you're up against. Writing to go viral is likely to fail. As noted above, social traffic concentrates in a small share of content; in one analysis, the top 10% of content drove 94.3% of it. This illustrates the importance of optimizing content for social sharing. For example, Upworthy A/B tests every piece of content with at least four headlines, continually refining for clicks per share and shares per view (a minimal sketch of that kind of headline comparison follows the list below).

  • Be Strategic and Relevant – the content must tie to business goals and fit with your branding
  • Identify and Target Likely Amplifiers – there should be an obvious answer to the question “who will help spread this content and why?”
  • Think Long-term: Don’t Publish Once and Forget – most content will fail; you need to be willing to invest longer and harder than the competition to find success
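Here is that sketch: a hypothetical comparison of two headlines' click-through rates using a simple two-proportion z-test. The numbers are invented for illustration:

    from math import sqrt

    # Hypothetical results: clicks and impressions for two headline variants
    a_clicks, a_views = 120, 4000
    b_clicks, b_views = 165, 4100

    p_a, p_b = a_clicks / a_views, b_clicks / b_views
    p_pool = (a_clicks + b_clicks) / (a_views + b_views)
    se = sqrt(p_pool * (1 - p_pool) * (1 / a_views + 1 / b_views))
    z = (p_b - p_a) / se

    print(f"CTR A: {p_a:.2%}, CTR B: {p_b:.2%}, z = {z:.2f}")
    # |z| > 1.96 corresponds to roughly 95% confidence the lift is real
    if abs(z) > 1.96:
        print("Headline B's improvement looks statistically significant.")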

Measuring Success

Before beginning any marketing initiative, we start by creating a model that specifically lists the business objectives, goals and targets, purpose, and methodology. It’s important to have at least one goal in each of these three categories: acquisition, behavior, conversion.

This allows us to streamline reporting, understand what worked and what did not, and understand the return on SEO investment. This approach also protects clients from investments with diminishing returns, as can be the case with trying to apply every SEO “trick in the book.”

Search Traffic Performance by Page, Not by Keyword

Google stopped providing organic keyword data a few years ago. While we have had some success backing into this information by combining data from Google Webmaster Tools (now Search Console), Google Analytics, AdWords, and Facebook Insights, we find that evaluating traffic by page and source presents a more accurate picture of the customer's experience and intent.

Make sure whatever analytics application you use is tracking your referral information properly. A fair amount of social media and mobile traffic can end up recorded as direct traffic. If necessary, tag your links to make sure you can identify each channel properly.
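UTM parameters are the standard way to tag links for Google Analytics. As a sketch, this builds a tagged URL (the campaign values are placeholders):

    from urllib.parse import urlencode

    base_url = "https://www.example.com/spring-sale"
    utm = {
        "utm_source": "facebook",        # where the link is posted
        "utm_medium": "social",          # the channel type
        "utm_campaign": "spring-sale",   # the campaign name
    }
    print(base_url + "?" + urlencode(utm))
    # -> https://www.example.com/spring-sale?utm_source=facebook&utm_medium=social&utm_campaign=spring-sale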

SEO is getting harder. More changes are coming, and fast. The “exciting new SEO miracle tactic” you read on a blog last year could have you in hot water with Google this year. Not all tactics make sense for every business. A corporate website selling software in North America is going to have vastly different needs and customers from a new restaurant or a recently-launched online clothing store.

Program and code your website for bots. Understand your customers’ needs and intent. Design and write for visitors. Learn how to use analytics to track performance. Choose your online resources carefully.

One of my favorite resources is social media. I use it to talk to other SEO professionals. Most of what I have learned is thanks to the many qualified professionals I’ve been fortunate enough to work with over the years. I’d love to hear from you, too!

Before moving to Western MA, Dan launched his career in New York in advertising and public relations, where he worked with some of the country’s top brands. Dan also has many years’ experience in small-business and corporate marketing, finance, franchise business operations and field consulting. In 2005, Dan became the first area president of TruePresence, a national internet marketing firm specializing in web design and search engine marketing. Dan’s clients have included Johnson & Johnson, Sears, Warner-Lambert, Monsanto and Pepsi, but he prefers the individuality of his smaller business clients. Dan launched The Green Internet Group to help business owners fully leverage digital marketing and social media by offering results-driven marketing planning, consulting, training and creative services.
