
DuckDuckGo is growing fast but not enough to grab SEOs’ attention


The search engine’s share is small, but some of its metrics are better than Bing’s, according to a third-party analysis.

Last month, DuckDuckGo announced that it had exceeded 9 billion searches in 2018, up from 4 billion in 2016. By comparison, Google sees more than a trillion searches per year globally.

Growing fast. DuckDuckGo’s popularity has grown as privacy has become a more significant issue over the past several years. According to the company, it will “shatter” its 2018 traffic record this year. Even so, it controls less than 1 percent of all U.S. search volume.

US search engine market share (1/19)

Source: StatCounter

SEOs not paying attention. We reached out to a number of SEOs and found that no one is focused on optimizing for DuckDuckGo. However, one local SEO noted that Yelp ranks very well for local queries. So optimizing for Yelp will help locally-focused businesses with visibility and discovery on DuckDuckGo.

In 2015, Neil Patel wrote about four SEO tactics for DuckDuckGo. And DuckDuckGo’s Daniel Davis offered this general advice: “Our recommendation is to continue putting users first, focusing on high quality content that they appreciate.”

Why you should care. According to a 2016 analysis of usage and traffic by SimilarWeb, DuckDuckGo outperformed Bing in terms of bounce rates and user engagement. The same analysis suggests that its audience was more tech-savvy than average and more privacy-conscious. Two years later the audience may have broadened.

Given the tiny market share, it’s unlikely that many SEOs will devote time to DuckDuckGo any time soon. But if it continues to grow, that will change. It’s not outrageous to suggest that if present trends continue, DuckDuckGo’s share could ultimately exceed either Bing’s or Yahoo’s.

This article was originally posted at Search Engine Land by Greg Sterling on January 28, 2019.

How Reliable is Google’s Web.dev SEO Score?

Google’s Web.dev tool offers an SEO score ranging from 0 to 100. What does the tool measure for the SEO metric, and how useful is it?

What is SEO?

What do you expect an SEO score to mean? That depends on how you define SEO.

Moz defines SEO within the context of traffic:

What is SEO?
“Search engine optimization (SEO) is the practice of increasing the quantity and quality of traffic to your website through organic search engine results.”

Unlike Moz, Wikipedia defines SEO within the context of visibility in search rankings. Search engine visibility is, in effect, another way of saying ranking on a search engine.

“Search engine optimization (SEO) is the process of affecting the online visibility of a website or a web page in a web search engine’s unpaid results—often referred to as “natural”, “organic”, or “earned” results”

Search Engine Journal defines SEO as:

“The process of optimizing a website… so it will appear in prominent positions in the organic results of search engines. Successful SEO makes a site appealing to users and search engines.”

  1. Moz’s definition of SEO focuses on traffic.
  2. Wikipedia’s definition is about search engine rankings.
  3. Search Engine Journal’s definition of SEO focuses on ranking and on being appealing to users.

Search Engine Journal’s definition is interesting because it recognizes how important it is to appeal to users. Google’s search rankings are primarily about surfacing what users expect to see; if users don’t find content that appeals to them, they are disappointed.

How Does Google Define SEO?

Google’s SEO Starter Guide defines SEO as:

“Search Engine Optimization is about helping search engines understand and present content.”

Understanding content means writing the content in a way that is clear and focused. It may also include adding images, structured data, and metadata that help the search engine understand what a page is about.

Google’s Web.dev site defines SEO in terms of how easily a site can be crawled and its content understood, for the purpose of bringing in more traffic.

Here is how Google’s Web.dev site defines SEO:

“Making your content discoverable matters because it’s how you get more relevant users viewing your content…

By making sure search engines can find and automatically understand your content, you are improving the visibility of your site for relevant searches.

This is called SEO, or search engine optimization, which can result in more interested users coming to your site. Audit your site and check the SEO results to see how well search engines can surface your content.”

What does the Web.dev SEO Score Measure?

Web.dev is based on Google’s Chrome Lighthouse extension. According to the official Lighthouse page, these are the nine factors Google uses to create the SEO score (a sketch of how you might approximate several of them yourself follows the list):

  1. Document does not have a meta description
  2. Document doesn’t have a title element
  3. Document doesn’t have a valid hreflang
  4. Document doesn’t have a valid rel=canonical
  5. Document doesn’t use legible font sizes
  6. Document uses plugins (Flash)
  7. Links do not have descriptive text
  8. Page has unsuccessful HTTP status code
  9. Page is blocked from indexing
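Several of these checks are simple document-level tests you can approximate outside the tool. Below is a minimal sketch in Python, assuming the third-party requests and beautifulsoup4 libraries and a hypothetical URL; it illustrates the kinds of conditions Lighthouse audits rather than reproducing the actual implementation, which runs in a headless Chrome instance.

```python
import requests
from bs4 import BeautifulSoup

def basic_seo_checks(url: str) -> dict:
    """Approximate a few of the Lighthouse SEO audits for one page."""
    resp = requests.get(url, timeout=10)
    soup = BeautifulSoup(resp.text, "html.parser")
    robots = soup.find("meta", attrs={"name": "robots"})
    return {
        # Page has successful HTTP status code
        "http_ok": resp.status_code < 400,
        # Document has a title element
        "has_title": soup.title is not None and bool(soup.title.get_text(strip=True)),
        # Document has a meta description
        "has_meta_description": soup.find("meta", attrs={"name": "description"}) is not None,
        # Document has a rel=canonical link (validity not checked here)
        "has_canonical": soup.find("link", attrs={"rel": "canonical"}) is not None,
        # Page is not blocked from indexing via a robots meta tag
        "indexable": not (robots and "noindex" in robots.get("content", "").lower()),
    }

print(basic_seo_checks("https://example.com/"))  # hypothetical URL
```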

What Does the Web.dev SEO Score Mean?

According to the Web.dev definition, Google’s SEO score is a measure of how well search engines can “surface” content.

What does Google mean by surface? Does Google mean how well it can rank content? Or does Google mean how well it can discover content?

I believe Google means that the SEO score is a measure of how well Google can crawl and discover content.

Google’s main Web.dev page defines SEO as:

“Checks for best practices to ensure your site is discoverable.”

The next page defines discoverability as:

“Easily discoverable
Ensure users can find your site easily through search.”

That is an artfully vague description of what it means to be discoverable. Does Google mean ranking? Or does Google mean making the content discoverable so that, if other ranking factors align, users can find the content through search?

Google is vague on the issue of what SEO is.

Publishers understand the concept of SEO within the context of ranking and traffic, yet Google’s Web.dev site does not use the words rank or ranking anywhere; searching the site for them yields zero results. That is a curious omission for a tool that offers an SEO metric, and it appears to be a conscious one on Google’s part.

The Web.dev site is not referencing SEO in the context of ranking. It is referencing SEO in the context of crawling and making content easy to understand.

Searching for the word discoverability, by contrast, yields two results.

It is clear that Google’s Web.dev SEO results are not concerned with ranking.

Page Scores 90 and Ranks #85

Here is a screenshot of a web page that ranks #85 in Google for the phrase “how to diagnose arthritis.”

Here is the Web.dev SEO score of that .edu web page that ranks #85:

As you can see, the web page scores 90 for SEO. The only reason it didn’t score 100 is that it was missing a meta description tag.

A meta description isn’t even necessary for ranking, yet it still counts for 10 percent of the SEO score. This makes it clear that the SEO score is based on a definition of SEO that has less to do with ranking and traffic (how the industry defines SEO) and more to do with crawling and indexing, otherwise known as discoverability.

Is Web.Dev SEO Score a Useful Metric?

The answer depends on how you define SEO. Google’s definition appears to come down to whether your title tag and meta description exist and whether Google can access the page. That’s a limited definition of SEO.

The SEO industry and Wikipedia start with Google’s definition, then expand it to include traffic, ranking, and user satisfaction.

It’s naive to expect Google to provide an SEO tool that gives a clear answer as to how likely a page is to rank. That’s probably one of the reasons Google removed the PageRank meter from its toolbar.

A more accurate description of the tool is that its SEO score does not conform to the industry’s definition of SEO.

The SEO score is not an indicator of how likely a page will be able to rank. A page that scores 90 yet ranks #85 is proof of that.

Rankings and traffic are the two qualities the SEO industry associates with the word SEO. Web.dev offers no insights into those factors.

It may be more accurate if Google’s Web.dev SEO score were rebranded as an Indexability or Discoverability score.

This article was originally posted at Search Engine Journal by Roger Montti on November 16, 2018.

7 things that hurt your SEO rankings and how to fix them – Part 2

Bad SEO entails practices that fall outside the boundaries of Google’s webmaster guidelines and hurt your website’s optimization for search engines.

There are many techniques that can hurt your SEO rankings if implemented. In 2017, Serpstat found about 300 million errors when it indexed 175 million pages with an SEO audit tool. These errors stemmed from not doing SEO the right way. Having discussed seven of these errors already, here are more things that could hurt your rankings and how you can fix them.

Accessibility and indexation

The accessibility and indexation of your site contribute greatly to how visible your pages are on search engines. Some of the categories to consider include:

Canonical tag: duplicate content makes it difficult for search engines to decide which page to show users, which can hurt the visibility of both pages. If you implement a rel=canonical tag, ensure it is done correctly to avoid losing rankings (a verification sketch follows this list). Common mistakes include:

  • Inserting the tag in the <body> instead of the <head> section
  • Pointing rel=canonical at a 404 page
  • Launching without checking the code, which can trigger search engines to de-index pages.
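As a quick illustration, here is a sketch in Python (using the requests and beautifulsoup4 libraries, with a hypothetical URL) that tests for the first two mistakes: a canonical tag outside the <head>, and a canonical target that returns an error.

```python
import requests
from bs4 import BeautifulSoup

def check_canonical(url: str) -> None:
    """Flag two common rel=canonical mistakes on a single page."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    tag = soup.find("link", attrs={"rel": "canonical"})
    if tag is None:
        print("no rel=canonical tag found")
        return

    # Mistake 1: the tag belongs in <head>; search engines may ignore
    # (or distrust) canonical hints found in the body.
    if tag.find_parent("head") is None:
        print("WARNING: rel=canonical is not inside <head>")

    # Mistake 2: the canonical target should resolve successfully,
    # not point at a 404 page.
    target = tag.get("href", "")
    status = requests.head(target, allow_redirects=True, timeout=10).status_code
    if status >= 400:
        print(f"WARNING: canonical target {target} returns HTTP {status}")

check_canonical("https://example.com/some-page")  # hypothetical URL
```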

Noindex tag: if you no longer need a noindex tag on a webpage, remove it as soon as possible. With the tag still there, search engines will not index the webpage, which could leave you wondering why your SEO isn’t improving. Always keep track of your pages to know when a tag is no longer relevant.

Robots.txt: always check for pages hidden in robots.txt and take them out when necessary to help improve your SEO rankings. If a page blocked in your robots.txt file redirects elsewhere, the crawler will likely never see the redirect.
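Python’s standard library includes a robots.txt parser, so a spot check takes only a few lines. A sketch, assuming a hypothetical site:

```python
from urllib import robotparser

# Load the live robots.txt and test whether important URLs are crawlable.
rp = robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")  # hypothetical site
rp.read()

for page in ["https://example.com/", "https://example.com/products/"]:
    if not rp.can_fetch("Googlebot", page):
        print(f"blocked for Googlebot: {page}")
```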

Nofollow links: nofollow links have no SEO value, but you could be penalized by search engines for not using the attribute properly. Many websites fall victim to this, as they often feature links on their web pages that are unrelated to the content of the page. This ends up dropping their SEO rankings.

Links and bad redirects

While links are great to help drive traffic and boost your SEO ranking, they could also ruin your SEO efforts if they aren’t managed well.

Broken links

Broken links on your web pages should be rectified or removed as soon as possible. Broken links can stem from entering the wrong URL, removal or permanent relocation of the linked webpage by the destination website, or software on the user’s end that blocks access to the destination website. WordPress users can integrate the Broken Link Checker plugin to find and get rid of dead links; you can also check manually, as sketched below.
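If you’d rather script the manual check than rely on a plugin, a rough sketch (Python, using requests and beautifulsoup4, with a hypothetical URL) might look like this:

```python
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

def find_broken_links(page_url: str) -> list:
    """Fetch a page, request every link on it, and report the failures."""
    soup = BeautifulSoup(requests.get(page_url, timeout=10).text, "html.parser")
    broken = []
    for a in soup.find_all("a", href=True):
        link = urljoin(page_url, a["href"])
        if not link.startswith("http"):
            continue  # skip mailto:, javascript:, fragment-only links
        try:
            status = requests.head(link, allow_redirects=True, timeout=10).status_code
        except requests.RequestException:
            status = None
        if status is None or status >= 400:
            broken.append(link)
    return broken

print(find_broken_links("https://example.com/"))  # hypothetical URL
```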

How to disavow negative backlinks

Google has a Disavow Tool that can help protect your site from penalties that may arise from bad linking and also help neutralize bad links. The tool simply signals Google to ignore the backlinks you specify. To disavow negative backlinks, find the links you want to disavow, create a disavow file, and then upload it to the Google Disavow Tool. Once this is done, the specified links will no longer be considered by Google.
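The disavow file itself is plain text: one URL or domain: rule per line, with # lines treated as comments. A small sketch that assembles one (the URLs and domains are hypothetical examples):

```python
# Assemble a disavow file in the plain-text format Google's tool accepts.
bad_urls = ["http://spammy-directory.example/paid-links.html"]  # hypothetical
bad_domains = ["link-farm.example"]                             # hypothetical

lines = ["# Disavow file generated after a backlink audit"]
lines += [f"domain:{d}" for d in bad_domains]  # disavow an entire domain
lines += bad_urls                              # disavow individual URLs

with open("disavow.txt", "w") as f:
    f.write("\n".join(lines) + "\n")
```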

Bad redirects and best redirects – 301 and 302

301 and 302 redirects might look similar to a user, but definitely not to search engines. A 301 signals a permanent move to a new location, while a 302 is temporary, yet many site owners mix the two up and use either without thinking much about the difference. If you use a 302 rather than a 301, search engines treat the move as temporary and continue to index the old URL, which could affect your SEO rankings.
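You can confirm which kind of redirect a moved URL actually serves by inspecting the redirect chain. A sketch in Python with the requests library (hypothetical URL):

```python
import requests

# Follow the redirect chain and report whether each hop is permanent.
resp = requests.get("https://example.com/old-page", allow_redirects=True, timeout=10)
for hop in resp.history:
    kind = "permanent" if hop.status_code in (301, 308) else "temporary"
    print(f"HTTP {hop.status_code} ({kind}): {hop.url}")
print(f"final URL: {resp.url} (HTTP {resp.status_code})")
```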

Not maximizing Google Search Console

Google Search Console is packed with lots of benefits that should be maximized in order to have the best SEO experience. Some of the things to pay attention to in Google Search Console include search analytics, links to your site, mobile usability, robots.txt tester, sitemaps, index status, and security issues. Once an identified issue is fixed, your rankings will be improved and your website will gain more traction.

Meta tags

Meta tags are important for SEO and are usually among the first things covered in SEO training. Your key meta tags, including the keywords attribute, title tag, meta description attribute, and meta robots attribute, should be taken seriously, as they help search engines understand what a page is about.

Don’t make titles and descriptions too long or too short. Following Google’s current meta title guideline, the optimal title length is 10-15 words, which is about 78 characters.

Your description should be between 110 and 120 characters, for easy optimization for both mobile and desktop. While you ensure your title and description aren’t too long, you should also be careful not to make them too short. Your meta tags should provide enough info about the page to help the search engines understand the content.

Google encourages creating good meta descriptions: ensure there’s a description for every page on your site and that each one is different, since duplicate content could mess with your rankings. You should also include clearly tagged facts in the description. A quick length check is sketched below.
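A simple way to audit these lengths across pages is to script the check. A sketch (Python, requests plus beautifulsoup4, hypothetical URL); the thresholds mirror the guidelines above, and the minimum title length is an assumption added for illustration:

```python
import requests
from bs4 import BeautifulSoup

def check_meta_lengths(url: str) -> None:
    """Flag titles and meta descriptions outside the suggested ranges."""
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

    title = soup.title.get_text(strip=True) if soup.title else ""
    desc_tag = soup.find("meta", attrs={"name": "description"})
    desc = desc_tag.get("content", "") if desc_tag else ""

    if not 20 <= len(title) <= 78:   # lower bound is an assumed sanity check
        print(f"{url}: title is {len(title)} chars; aim for up to ~78")
    if not 110 <= len(desc) <= 120:  # range suggested in this article
        print(f"{url}: description is {len(desc)} chars; aim for 110-120")

check_meta_lengths("https://example.com/")  # hypothetical URL
```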

Conclusion

Doing SEO wrong will hurt your SEO rankings, while following accurate SEO practices based on Google’s standards will help your website succeed. These common errors should be avoided at all costs. If you are caught flouting the rules, you might be penalized by Google, which could cause a huge drop in your rankings.

This article was originally posted at Search Engine Watch by Guy Sheetrit on December 5, 2018.

7 things that hurt your SEO rankings and how to fix them

The top listing in Google’s organic search results draws 33 percent of traffic, while the second spot garners 18 percent, according to a study by online ad network Chitika.

After that, it’s a fight to see who secures enough traffic, and of course, in this sort of scenario you need all the help you can get.

Being penalized by Google and experiencing a drop in SEO rankings is one of the worst things that can happen to a website. Now, fluctuations are par for the course, especially considering the rapidly evolving Google algorithms.

When your search rankings take a huge tumble, you need to adopt a proactive approach before your site gets lost in organic-search obscurity. And this “approach” involves fixing the seven cardinal SEO mistakes listed below:

Avoid keyword stuffing

Use the same keywords repeatedly? You might want to stop! Of course, if it is necessary for your content to make sense, then you’ve got no other choice. But if you seek to optimize your copy in this manner, then you’re in for a rude awakening.

Not only does it discourage visitors from reading or interacting with your content, but it also signals to search engines that you’re attempting to outsmart their algorithms. And that is not something Google takes lightly.

The comic strip linked below reimagines keyword stuffing as part of a normal conversation. See how many times the man uses “lunch,” “fine,” “talking funny,” and “mean” in the first, second, third, and fourth panels, respectively. If it’s THIS irritating in regular dialog, imagine how your readers would feel reading content like this.

https://c1.staticflickr.com/1/724/21695308292_443d1a2570_b.jpg

Use an online tool like Live Keyword Analysis or Addme.com to calculate the keyword density. Remove excess keywords to keep your density around 1.5 percent. Mention your keywords in the title, the description, your opening paragraph, and once or twice in the body of your content. Make sure it all sounds natural. That should do the trick and help you regain some of your lost SEO rankings.
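If you’d rather compute density yourself than use an online tool, the calculation is simple: keyword occurrences as a share of total words. A sketch in Python (this is one common way of counting phrase density; tools differ in the exact formula):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Share of the copy's words taken up by occurrences of the keyword."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    kw = keyword.lower().split()
    hits = sum(
        1 for i in range(len(words) - len(kw) + 1) if words[i:i + len(kw)] == kw
    )
    return 100.0 * hits * len(kw) / max(len(words), 1)

copy = "Wind turbine parts for every turbine. Buy wind turbine parts today."
print(f"{keyword_density(copy, 'wind turbine parts'):.1f}%")  # far above 1.5
```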

Check your website speed

Almost half of online users expect a web page to load within two seconds or less, and they abandon a website that does not load within three seconds, revealed a survey by Akamai and Gomez.com. So ensure quick load times for your website by leveraging browser caching, optimizing images, minifying code, and activating resource compression.

Achieve all this by using a free tool like PageSpeed Insights from Google to determine the current speed of your website. Also, look at the actionable recommendations offered by the tool to increase your load times.
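PageSpeed Insights also exposes a public API (v5 at the time of writing), so you can pull scores programmatically. A sketch; the response field names follow the v5 API, and the audited URL is hypothetical:

```python
import requests

# Query the PageSpeed Insights v5 API for a mobile performance score.
# An API key (the "key" parameter) is recommended for regular use.
PSI = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {"url": "https://example.com/", "strategy": "mobile"}
data = requests.get(PSI, params=params, timeout=60).json()

score = data["lighthouseResult"]["categories"]["performance"]["score"]
print(f"mobile performance score: {score * 100:.0f}/100")
```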


Never buy links

Give your website enough time to become successful. Creating good content is hard work but it pays off in the end. Resort to shortcuts and you get penalized.

One of these no-no shortcuts involves buying backlinks, especially from unreliable sources. As soon as Google finds out, it cuts your rankings significantly. According to one survey, 22 percent of web admins still buy links without disclosure.

So, the next time you spot an SEO ad promising hundreds of links along with a first-page ranking for a ridiculously low price, ignore it. Links from social networking accounts and spammy, untrustworthy sites hurt your website. A few of these companies claim to protect you by creating a “link pyramid” or “link wheel” that points to an intermediary page.

The truth is, these might work for some time, but as Google continues to evolve and deal more strictly with spam content, they will learn about this practice and shut you down.

Become mobile friendly

With Google prioritizing a mobile-first approach, make sure your website is mobile friendly. According to Google, 85 percent of all websites in mobile search results now meet the criteria for the mobile-friendly label. Become a part of the trend and enjoy a smooth flow of traffic.

Otherwise, if your site is not responsive and people are unable to view you on tablets and smartphones, then not only will your rankings suffer, but your customer inquiries and conversions will too. That’s because users will leave your website and visit one that actually fits this requirement.

Use tools like Screenfly by Quicktools to check whether your site is responsive or not. If not, use another tool like Bmobilized to convert your existing pages.


Get rid of ads

Recent changes made to AdSense rules by Google indicate that stricter rules are going to be put in place for sites “with more advertising than publisher-provided content.” So, if you’ve been indulging in this practice, get ready to bid your SEO rankings goodbye.

Ads prompt users to leave your website and hurt your user-experience metrics. Once those metrics become critically low, it is usually a sign to Google that your website holds no value for your visitors, and Google will demote you over time.

Plus, ads have led to the rise of ad blocking. In fact, a report by Adobe and PageFair concluded that the approximate loss of worldwide Internet revenue because of blocked advertising in 2015 was $21.8 billion. So, unless you want to be penalized without any payoff, all you need to do is get rid of the ads and your site will be fine.

Handle technical issues immediately

Technical problems like network outages, poor hosting, slow connectivity, and server downtime can affect your site rankings.

If Google constantly abandons attempted crawls on your site, in due time, your SEO rankings will go down. Of course, short server outages don’t matter, but if it becomes a regular occurrence, then you need to look for a new host.

Identify the problem first. This might not be easy, but it becomes quite obvious if your site goes down every 10 minutes. Or use an online tool like Downforeveryoneorjustme to check whether your page is up or down. Determine whether the problem lies with your host rather than your Internet plan. You will find plenty of decent web hosting options, like Liquidweb.
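A crude uptime monitor is also easy to script, and a log of failures makes the case for (or against) switching hosts concrete. A sketch, assuming a hypothetical URL:

```python
import time
import requests

URL = "https://example.com/"  # hypothetical site to watch

while True:
    try:
        up = requests.get(URL, timeout=10).status_code < 500
    except requests.RequestException:
        up = False
    if not up:
        # Timestamped log lines show whether outages are rare or chronic.
        print(f"{time.strftime('%Y-%m-%d %H:%M:%S')} DOWN: {URL}")
    time.sleep(300)  # poll every five minutes
```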

Maintain the quality of your guest posts

Guest blogging can be a great tool for SEO and lead generation. Unfortunately, as of 2015, only 6 percent of bloggers published original content as guest posts. That’s a dismal number when you consider what an amazing way it is to give your website an edge against the competition.

Use scraping tools like the one from Guestpost.com to conduct automatic scrapes of every website that accepts guest posts related to your keywords. However, when it comes to your own website, make sure you accept only high-quality guest posts.

Feature fresh writers on your site and post original and relevant content that appeals to your audience. Also, make sure you maintain a balance between content produced by the site and content offered to your page in exchange for an author bio and a link.


Final words

If you want to survive in the virtual world and stay relevant, then you need to focus on raising your SEO rankings. Follow the steps given above to fix bad SEO and regain your rankings.

This article was originally posted at Search Engine Watch by Guy Sheetrit on November 27, 2018.

7 Essential SEO Browser Extensions & Plugins

The vast majority of people use browsers to access the web.

But most SEO professionals take it a step further and use those same browsers to do a lot more.

In fact, some of the most important tools in my arsenal are my browser and its extensions.

So, let’s dive right in and see what we can do with them. And the best part is, all of them are free.

1. Ghost Browser


Ghost Browser is built on Chrome so everything available to you there is available in Ghost Browser… and more.

The biggest reason I switched to Ghost Browser is for what they call “sessions.”

Essentially, a session is a fresh instance of Chrome that operates within its own window or even individual tabs.

In the image above, you’ll see there are four different colored tabs, each with a different site up.

Each of these colors represents a session, and the sessions are independent, which is why one tab can be logged out, another logged in, and a third logged in from a different location.

You can run multiple tabs for each session.

So, for example, I can be logged into one session as me to manage an AdWords campaign but also logged in through a different account to access the client’s analytics and Search Console (where they aren’t controlled by the same account).

Similarly, each session can be used to log into different social accounts, etc.

Ghost Browser also has Tasks. Essentially, you can save sets of tabs with their login state as a project, so you don’t have to log in and out or reopen all the tab sets you regularly use each time you need to access them.

The free version supports three sessions at a time – which already makes it three times better than what you’re likely working with now.

There is a pro version. I found it worth the investment based on my usage; you may not.

Either way, after using the free version for a bit you’ll wonder how you ever survived without it.

Note: I also discussed Ghost Browser in my article on non-SEO tools for the SEO.

2. Chrome Developer Tools


There is virtually no way I could cover all the features, functions, and uses of Chrome Developer Tools in this article.

The tool is built into Chrome and is accessed via Chrome Menu > More tools > Developer tools.

Easily the most common tasks I use it for are finding code, determining the size of elements, and troubleshooting them.

As illustrated in the image above, the tool allows you to hover over and select an element on a given webpage and it will display its code and (in this case) the computed output characteristics.

You can even adjust the code right in the Developer Tools to see how it would render prior to making the changes to the live site. You can do this on the desktop site or set it to render the page as it would on many popular mobile devices.

This isn’t the most advanced of its functionality, which includes various speed, security, and troubleshooting capabilities ranging from generating waterfalls of resource load times to indicating which resources are slowing down the site load.

The advantage to all this vs. some of the other online tools you might use?

The results are real world.

That is, you’re seeing how your browser on your connection is impacted by the resources being loaded.

Read the tool’s page and explore. There’s a ton in there.

When you know what data you have access to, you’ll know where to look when you need answers to related questions.

3. SEO Quake


SEO Quake is a classic among the extensions and plugins used by SEO pros – and for good reason.

Essentially, SEO Quake gives rapid access to an array of data that we all want.

On any given webpage a simple click to pull in the metrics will list backlinks data, cache dates, indexing information, and more.

With a couple of extra clicks, you gain access to internal and external backlinks data, keyword density information (if you’re interested in that), and a ton of information regarding the use of Schema, heading tags, metas, and more.

SEO Quake is not a replacement for site audit and analysis tools but gives an excellent quick snapshot of a page’s information.

Perfect when you need just some limited info or are on a phone call and need to pull up some core metrics.

It even ties in with SEMrush to yield some basic traffic stats as well – handy for competitor research, especially when considering new content strategies.

4. User-Agent Switcher


This is extremely helpful when developing new sites – especially when that site is built using less predictable technologies.

Essentially, User-Agent Switcher is exactly what it sounds like: an easy-to-set-up plugin that switches the user-agent info your browser sends, allowing you to view a site as a different browser or bot.

I can’t count the number of times I’ve used it to troubleshoot crawl issues or uncover differences in how Googlebot is viewing a page versus a browser.

Obviously, it can also be helpful when determining how a site will load with different browsers or operating systems as well.
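Under the hood, all such a plugin does is change the User-Agent header the browser sends, and you can reproduce the basic check in a few lines. A sketch in Python (the Googlebot string is Google’s published user agent; the target URL is hypothetical). Note that sites verifying the real Googlebot by reverse DNS may still treat this request differently than the actual crawler:

```python
import requests

URL = "https://example.com/"  # hypothetical URL
AGENTS = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
    "googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                 "+http://www.google.com/bot.html)",
}

# Fetch the same page as two different user agents and compare responses;
# large differences can point to cloaking or crawl-specific issues.
for name, ua in AGENTS.items():
    resp = requests.get(URL, headers={"User-Agent": ua}, timeout=10)
    print(f"{name}: HTTP {resp.status_code}, {len(resp.text)} bytes")
```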

5. Tag Assistant


Google Tag Assistant is an extremely useful tool, especially for those who use Google Tag Manager.

It can also be handy when you simply need to identify issues with analytics or other tracking codes (AdWords, etc.).

The icon for the extension changes color depending on whether there are issues detected and makes note of the issues when clicked.

Search Engine Journal gets a green light, as illustrated above, but warnings and errors get reported when there are duplicate or empty tags or tags not configured properly.

Tag Assistant also reports duplicate analytics code and related tracking issues.

It handily lets you record a session, allowing you to navigate paths within your site (or others’) and then review the recording to find errors and issues.

It’s an invaluable troubleshooting tool across an array of scenarios.


6. Show Title Tag


By no means a critical plugin but a handy one, Show Title Tag simply displays the page title within the browser.

You can move it to any of the corners of the browser and the red text indicates where it’s likely to be cut off in search results.

It’s helpful when viewing competitors’ sites to quickly see how they’re doing their titling beyond the short snippet that appears in the tab, without viewing the source or opening Developer Tools.

This plugin is also helpful when you’re navigating your own site, highlighting instances where your title might be too long.

7. Ghost Proxy Control


The Ghost Proxy Control extension comes pre-loaded with Ghost Browser.

Basically, you can add your proxies in and access them easily via the extension.

One of the big perks of the combo of Ghost Browser and Ghost Proxy Control is that you can load different proxies into different sessions and basically have a tab for each location.

I’ve found this incredibly useful for checking SERP results from various locations and having the ability to view them at the same time, side by side.

It’s especially interesting for local SEO: you can check not just rankings from different locations, but also easily compare how the layout might differ.

The control allows for a proxy to be assigned to a single tab or an entire session (indicated by multiple tabs of the same color).

While the extension is free, proxies generally are not.

Although free proxies are available, I pay about $20 per month for 10 dedicated proxies.

Conclusion

There are definitely more browser extensions and plugins than what I’ve included on this list of essentials.

However, most of those require subscriptions, are too similar to one of those noted above, or don’t apply to the duties of virtually every SEO pro I know.

For example, Moz and Buzzsumo have great extensions. But they can be quite frustrating unless you have a paid subscription.

This article was originally posted at Search Engine Journal by Dave Davies on October 14, 2018.

Will Redoing Your Homepage Every Month Help Your Google Rank?

Welcome to another edition of Ask an SEO! Today’s question comes from Kevin. He asks:

I handle the online marketing/website development for a B2B consulting company. After overhauling our homepage, and cleaning up much of the content, my boss tells me that he wants the homepage redone every month.

He is under the impression that it will “look better” in the eyes of Google, and help with rankings.

I told him this is not true, but he sees a well-known brand do it, so clearly that relates to our B2B website. I need a simple, flat answer, and I’m hoping an expert with the Search Engine Journal can save me the grief and headache of having to argue this.

I’ll start with an apology. There’s not really any such thing as a simple, flat answer in SEO.

But if I had to give you one in this case, it would be no.

No, it isn’t necessary to change your homepage once a month to succeed in SEO.

This idea probably comes from an SEO concept called “freshness,” which suggests that fresher content has more value in the eyes of the search engines.

This can be true in certain cases, especially when it comes to time-sensitive information or news.

But fresh content isn’t valuable for all types of websites.

Further, freshness and time relevance usually apply to pages that are about specific topics, not a general page like a home page.

Unless you are a news organization, your home page probably doesn’t change all that frequently.

A commercial site like Apple changes because they promote different or newer products, but as you’ve pointed out in your question, that’s different than a B2B website where the products or services usually stay the same.

A final consideration is the development time it would require to update the home page once per month. Most likely you would find a better return on investment for that time and expense if you use it to create new content or improve the user experience of the existing site.

In general, I recommend encouraging your boss to make things better for your users, whether that’s a more intuitive design, more FAQs, content about how to use your products or services, or helpful industry commentary.

It may seem counterintuitive, but focusing on users rather than search engines will almost always result in more successful SEO.

This article was originally posted at Search Engine Journal by Jenny Halasz on October 9, 2018.

Do you still need a PPC tool with the new Google Ads?


Four ways a great PPC tool can turn you into a PPC rockstar

On the surface, it’s easy to wrongly conclude that Google and Bing are automating PPC pros right out of relevance. Basic PPC tasks can now happen with very little human intervention through the Google and Bing interfaces — easy enough for a novice PPC manager to create and launch pretty good campaigns.

That’s terrific. The big engines have made viable PPC accessible to the masses. Today, even a basic mom and pop shop can drive business effectively and inexpensively by tapping into the billions of searches happening daily.

The challenge, though, is that great PPC is actually becoming more challenging than ever, in part because of automation. Ironic? Yes. And while it may seem that Google and Bing have made it really easy to operate solely within their platforms, there is actually greater need for powerful third-party tools — a trend we expect to see continue.

Let’s address the obvious head-on: Optmyzr is one of those third-party tools, so, of course, we’ll say our service is needed. It’s important, though, to have an informed discussion about the true role of automation — where, how, and when to apply machines to take PPC programs from good to great.

Smart automation: The key to greatness

PPC automation via Optmyzr is fueled by machine learning and artificial intelligence. When coupled with powerful human intelligence, smart PPC pros have the power to run extraordinary campaigns with speed and agility we could only dream of a few short years ago.

Here are four powerful ways PPC pros can save time and energy automating critical tasks, freeing up time (and brainpower) to apply human expertise to strategize, refine, and elevate their craft:

Build search campaigns from e-commerce data

Yes, you can automate the lion’s share of campaign building, start-to-finish. We’re not talking about simply setting up the basics and running them. Using tools like Campaign Automator from Optmyzr, PPC pros can tap external e-commerce data, taking complex (and often messy or confusing) spreadsheets and automatically building out keyword-targeted campaigns. Essential data such as brand, price, color, size, channel — essentially any variable — can be set for dynamic insertion using templates you create.

Advertisers can specify a template and let Optmyzr’s Campaign Automator build out and maintain an inventory-driven search campaign on Google Ads.

The PPC pro can easily tie in inventory data from the Google Merchant Center Feed as well, preview campaigns to verify how they will appear and then launch. Campaign Automator even allows fully automatic campaign updates based on changes to the templates, inventory levels and a host of other attributes.

Check out our latest demos to see the automation first hand.

Streamline Shopping Ads (for Google and Bing!)

While powerful drivers of conversion, Shopping Ads can be nothing short of tedious to create manually in Google Ads and Bing Ads. The more products and product groups you have, the more time you can spend manually creating just your campaign structures. No exaggeration, PPC pros know this can take several hours, if not several days, if you’re creating Shopping Ads for thousands of products.

The Optmyzr Shopping Campaign Builder virtually eliminates product-by-product manual creation by automating deeper levels of campaigns. Automation puts products into product groups and generates the ad groups for you. The PPC pro can then apply his or her time to actually thinking about nuances, bid adjustments, and fine-tuning Shopping Ads campaigns, instead of spending arduous hours setting up the structures.

Deeper automation throughout the process allows syncing campaigns with inventory, finding product attributes that don’t perform well, changing product group bids based on various attributes and identifying negative keywords. The PPC pro still has the ability to jump in at any point in the process to apply their knowledge and skill to fine tune and make critical adjustments.

An analysis of the performance of a shopping feed for Google Ads showing the number of conversions coming from products at different price points.

Remember the tedium noted above? It gets worse. After you’re done with the manual set up in Google Ads, then you’d need to turn your attention to repeating the process for Bing Ads. With Optmyzr, the same deep automation for your Google programs can be replicated for Bing Ads.

As with Campaign Automator, spend some time getting a deeper tutorial through our latest demos.

Automate repetitive tasks with advanced prebuilt scripts

Clients of all sizes garner exceptional value implementing advanced scripts into their Optmyzr workflow. Question is — are you a PPC pro or a scripting guru? PPC pros need not spend countless hours crafting the perfect script to automate key tasks. The Optmyzr team does all of that.

By making it easy to access, find and install powerful scripts, PPC pros can find time-saving automations to improve reporting, bids and budgets, notifications, optimizations and a lot more.

Manage Google Ads Scripts without editing a single line of JavaScript code through Optmyzr.

These advanced scripts go far beyond the stock scripts available via Google Ads, adding greater depth and functionality across PPC tasks. We’ve made them really easy to install and crafted a form-based user interface that allows PPC pros to modify them to their needs — without having to do the coding yourself.

Implementing scripts is as easy as downloading the script, copying/pasting it into your Google Ads account, and immediately beginning to generate outputs such as spreadsheets or take actions such as pausing broken links.

One more reminder — you can find in-depth demos for the automations discussed in this article.

Cross-channel reporting

One of the most time-consuming tasks for PPC pros is generating reports to keep clients informed. While Google Ads includes a reporting module, it simply cannot cover what PPC pros do on other platforms like Bing, Facebook, and Amazon.

Monitoring, alerts, data visualization presets, charts, and tables can all be highly automated, giving extraordinary insight along with visual appeal that captures a client’s attention.

A third-party tool like Optmyzr can help with this. We routinely hear from clients who say they’ve been able to automate reports that would commonly take five hours per month per client down to 30 minutes.

Winning the paid search race

Don’t get us wrong. The automations present in the Google and Bing interfaces are great, but they only go so far. PPC Management Systems — like Optmyzr — exist to help PPC pros become PPC rockstars.

In this article, we intentionally only explored four areas of opportunity that can help you strategically automate programs. The reality is there are many more automation opportunities search pros can tap within Optmyzr’s powerful PPC management system. Most critical — look to automate tasks that eat up time and don’t really tap your brainpower.

Flexibility. Ease. Efficiency. Control. By crafting automations that go deep into the tasks and functions of paid search programs, we make it our mission to put the PPC pro in the driver’s seat of all of their campaigns. Combining the power of human intelligence and vision with AI and machine learning, smart approaches to automation will take your game to the next level and help keep your organization a step ahead in the intense paid search race.

This article was originally posted at Search Engine Land by Optmyzr (Sponsored) on October 15, 2018.

Bing says it is improving web crawler efficiency


Bing is working on making sure their crawler doesn’t miss new content and at the same time overload your web servers.

Fabrice Canel, principal program manager for Bing Webmaster Tools, provided an update on his team’s efforts to improve the efficiency of their web crawler, BingBot.

Responding to user feedback. The update is a follow up to his talk at SMX Advanced in June, during which he announced an 18-month effort to improve BingBot. Canel asked the audience to submit suggestions and feedback.

In a blog post Tuesday, Canel said the team has made numerous improvements based on this feedback and thanked the SMX audience for its contributions. He said they will be “continuing to improve” the crawler and will share what they’ve done in a new “BingBot series” on the Bing webmaster blog.

BingBot’s goal. In this first post, Canel outlined the goal for BingBot, which is to use an algorithm to determine “which sites to crawl, how often, and how many pages to fetch from each site.” To ensure sites’ servers aren’t overloaded by the crawler, BingBot aims to limit its “crawl footprint” on a site while keeping content in the index as fresh as possible.

This “crawl efficiency” is the balance Bing is working to strike at scale. Canel said, “We’ve heard concerns that bingbot doesn’t crawl frequently enough and their content isn’t fresh within the index; while at the same time we’ve heard that bingbot crawls too often causing constraints on the websites resources.” It’s a work in progress.

Why should you care? Bing is clearly listening to the webmaster and SEO community. The Webmaster Tools team is making changes to ensure its crawler does not overload your servers while becoming faster and more efficient at finding new content on your web site. Bing is actively working on this and says it will continue to do so.

How does this impact you? If you add new content to your web site and Bing doesn’t see it, it won’t rank it. That means searchers using Bing will not find your new content.

Recently Bing shut down the anonymous submit-URL tool, and we have seen reports that Bing is not acting on submit-URL requests even in Bing Webmaster Tools. It is possible the tweaks and changes Bing is making are causing some of this slowness with crawling and indexing. But ultimately, Bing is clearly working on the issue.

This article was originally posted at Search Engine Land by Barry Schwartz on October 17, 2018.

How Often Google Crawls and Indexes

In a webmaster hangout, a publisher asked how fast Google removes pages from the index once a noindex, nofollow has been added. The publisher stated they had added noindex but the page remained in Google’s index. Google’s John Mueller responded with an answer that described how often some pages are crawled.

John Mueller revealed that URLs are crawled at different rates. That’s somewhat well understood. What was of more interest was that he said some URLs can be crawled as little as once every six months.

The publisher stated:

“We’re seeing stuff that’s from a long time ago, where we’ve changed the noindex nofollow but we’re still seeing it in the index. And this is several months after we’ve changed this.”

John Mueller answered:

“I think the hard part here is that we don’t crawl URLs with the same frequency all the time. So some URLs we will crawl daily. Some URLs maybe weekly. Other URLs every couple of months, maybe even every once half year or so.

So this is something that we try to find the right balance for, so that we don’t overload your server.

And if you made significant changes on your website across the board then probably a lot of those changes are picked up fairly quickly but there will be some leftover ones.

So in particular if you do things like site queries then there’s a chance that you’ll see those URLs that get crawled like once every half year. They’ll still be there after a couple of months.

And that’s kind of… the normal time for us to kind of reprocess/re-crawl things. So it’s not necessarily a sign that something is technically completely broken.

But it does mean that if you think that these URLs should really not be indexed at all, then maybe you can kind of back that up and say well here’s a sitemap file with the last modification date so that Google goes off and tries to double-check these a little bit faster than otherwise.”

Use the Sitemap to Trigger Updated Crawling

John Mueller suggested updating the sitemap so that Googlebot discovers the last-modified date and uses it as a hint to go out and re-crawl the old web pages.
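For reference, the lastmod hint lives in the sitemap itself. A minimal sketch that writes a one-URL sitemap with an updated modification date (hypothetical URL; the namespace is the standard sitemaps.org schema):

```python
from datetime import date

url = "https://example.com/old-page"  # hypothetical URL to re-crawl
sitemap = f"""<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>{url}</loc>
    <lastmod>{date.today().isoformat()}</lastmod>
  </url>
</urlset>
"""

# Googlebot can use the fresh <lastmod> date as a hint to re-crawl sooner.
with open("sitemap.xml", "w") as f:
    f.write(sitemap)
```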

Google URL Inspection Tool

Something John Mueller didn’t mention is using Google’s URL Inspection tool. According to Google’s Webmaster Help page on re-indexing, a submission can take up to a week or two.

The URL Inspection tool is useful if you have a few individual URLs that need re-crawling. If you have a large number of web pages, Google recommends submitting a sitemap instead.

More information on how to ask Google to re-crawl URLs:
https://support.google.com/webmasters/answer/6065812

Watch the Webmaster Hangout here:
https://youtu.be/gC7aVygbMIk?t=2594

This article was originally posted at Search Engine Journal by Roger Montti on October 16, 2018.

Learn the Basics of Quality Link Building for SEO


What Is Link Building? A Definition

Link building, simply put, is the process of getting other websites to link back to your website. All marketers and business owners should be interested in building links to drive referral traffic and increase their site’s authority.

Why build links? Google’s algorithms are complex and always evolving, but backlinks remain an important factor in how every search engine determines which sites rank for which keywords. Building links is one of the many tactics used in search engine optimization (SEO) because links are a signal to Google that your site is a quality resource worthy of citation. Therefore, sites with more backlinks tend to earn higher rankings.


There’s a right way and a wrong way, however, to build links to your site. If you care about the long-term viability of your site and business, you should only engage in natural link building, meaning the process of earning links rather than buying them or otherwise achieving them through manipulative tactics (sometimes known as black-hat SEO, a practice that can get your site essentially banned from the search results).

That said, natural, organic link building is a difficult, time-consuming process. Not all links are created equal: a link from an authoritative website like the Wall Street Journal will have a greater impact on your rankings on the SERP than a link from a small or newly built website, but high-quality links are harder to come by.

This guide will teach you how to build quality links that improve your organic rankings without violating Google guidelines.

Remember, link building is imperative in achieving high organic search rankings.

Why Link Building Is Important for SEO

Link building is important because it is a major factor in how Google ranks web pages. Google notes that:

“In general, webmasters can improve the rank of their sites by increasing the number of high-quality sites that link to their pages.”

Imagine that we own a site promoting wind turbine equipment that we sell. We’re competing with another wind turbine equipment manufacturer. One of the ranking factors Google will look at in determining how to rank our respective pages is link popularity.


While the above example provides a general visual understanding of why link building is important, it’s very basic. It omits key factors such as:

  • The trust and authority of the linking pages.
  • The SEO and content optimization of the respective sites.
  • The anchor text of the incoming links.


The most important concept to understand is that, as Google says, you’re more likely to have your content rank higher for keywords you’re targeting if you can get external websites to link to your pages.

Simple Link Building Strategies: How To Get Other Sites to Link to You

There are a number of link building strategies used to get external websites to link to yours:

  • Content Creation & Promotion – Create compelling, unique, high-quality content that people will naturally want to reference and link to, and tell people about it. You have to spread the word before you can expect anyone to find your content and link to it!
  • Reviews & Mentions – Put your product, service, or site in front of influencers in your industry, such as popular bloggers or people with a large social media following.
  • Links from Friends & Partners – Ask people you know and people you work with to link to your site. Remember that relevance matters; links from sites that are in the same general industry or niche as your site will have more value than links from random, unrelated sites.

It can take a while to build a lot of links, but be patient, and remember that shortcuts like buying links are against Google’s guidelines and can be devastating for your SEO. Don’t take chances.

Build Links for Free with Internal Link Building

There’s an easy, underrated way to build links to the pages you’re attempting to improve search engine rankings for. And it’s a method you have total control over: Internal link building.

In attempting to get a Web page to rank, there are a few key factors to consider:

  • Anchor Text – One of the most important things search engines take into account in ranking a page is the actual text a linking page uses to talk about your content. So if someone links to our Good Guys Wind Turbine Parts site with the text “wind turbine parts”, that will help us to rank highly for that keyword phrase, whereas if they had simply used text like “Good Guys LLC” to link to our site, we wouldn’t enjoy the same ranking advantage for the phrase “wind turbine parts”.
  • Quality of the Linking Page – Another factor taken into account is the quality of the page that is sending the link; search engines allow links from high-quality, trusted pages to count more in boosting rankings than questionable pages and sites.
  • Page the Link is Aimed At – Many times, when people talk about your site they’ll link to the home page. This makes it difficult for individual pages to achieve high rankings (because it’s so difficult for them to generate their own link equity).

These are all elements we can’t control in attempting to get other sites to link to us. We can, however, control all of these elements in linking to our own pages from our own content. We can:

  • Determine what anchor text to use.
  • Decide which page to point that anchor text at.
  • Ensure that the quality and content of the linking page is high (since it’s our page!).

Building external links to your site is important, but by focusing more of your efforts on the optimization of these internal links you can build quality inbound links with rich anchor text to the proper pages, which will provide you with an unparalleled ranking boost (for free!).

Internal Link Building Tools and Tips

So how do you go about building these great internal links? Well, you can set up a system for interlinking your pages in a few easy steps:

  • Keyword Research for Link Building – First, you need to utilize a keyword research tool to have numerous keywords suggested to you that are both relevant and popular.
  • Assign Keywords to Content – Next, you have to group your keywords strategically, creating a search-friendly information architecture.
  • Link Pages Using Targeted Anchor Text – The final step is to apply your keyword research to intelligent inter-linking; you do this by linking to content using the keywords you’ve discovered (see the sketch after this list).
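Here is a small sketch of that third step in Python: a keyword-to-page map (the wireframe described below) is checked against draft copy for internal-link opportunities. The URLs and keywords reuse this article’s hypothetical wind turbine site.

```python
import re

# Keyword-to-page map: the internal-linking "wireframe".
wireframe = {
    "wind turbine parts": "https://goodguyswindturbineparts.com/wind-turbine-parts",
    "wind turbine rotors": "https://goodguyswindturbineparts.com/turbine-rotors",
    "wind turbine shaft": "https://goodguyswindturbineparts.com/wind-turbine-shaft",
}

draft = "Our guide covers wind turbine rotors and picking a wind turbine shaft."

# Report every keyword mention that should become a targeted internal link.
for keyword, url in wireframe.items():
    if re.search(rf"\b{re.escape(keyword)}\b", draft, re.IGNORECASE):
        print(f'link "{keyword}" -> {url}')
```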

The execution of the third item is key. You need to be sure that you’re linking to the right pages with the right anchor text. Here are a couple of quick tips for carrying that out effectively:

Use Your Site Search

This one’s pretty simple, and can be used for multiple purposes:

  • Finding pages on your site to link to a new page – When you create new content, search your site for mentions of related keyword variations and link those mentions to the new page.
  • Finding a page that’s been created to link to – Your site may have multiple content authors. In this case, you may have a vague idea that a page about “wind turbine rotors” has been created, but you don’t know the page title or URL. In this case, you can either type the keyword into your site search to find the corresponding page, or use Google itself. To do this we’d simply type: “site:http://www.goodguyswindturbineparts.com intitle:wind turbine rotors” into Google. This would return all of the pages containing that phrase that Google has indexed.

Create an Internal SEO Link Building Wireframe

To do this, you simply need to map the keywords you’d like to target to the most logical pages. So, let’s say we have three pages to choose from:

  • goodguyswindturbineparts.com/wind-turbine-parts
  • goodguyswindturbineparts.com/turbine-rotors
  • goodguyswindturbineparts.com/wind-turbine-shaft

Since the turbine rotors page definitely seems to be the best fit for our “wind turbine rotors” keyword, we’ll align that keyword with that page.

We can similarly match “wind turbine parts” and “wind turbine shaft” with the corresponding pages. In a spreadsheet, this might look something like this:

[Spreadsheet: each page mapped to its target keywords – e.g., goodguyswindturbineparts.com/turbine-rotors → “wind turbine rotors,” “turbine rotors”]

As you can see, each page is associated with multiple keywords. By making this document available to all of your content writers, they can quickly see which pages are targeting which keywords; they can also instantly check your SEO wireframe to see which keywords have been targeted (with a simple Ctrl+F!).

This article was originally posted at WordStream by an unknown author on December 30, 2008.