Daily Search Forum Recap: March 31, 2015

Here is a recap of what happened in the search forums today, through the eyes of the Search Engine Roundtable and other search forums on the web.

Search Engine Roundtable Stories:

Other Great Search Forum Threads:


Source: SEroundTable

Google Still Dominant, But Baidu Benefitting From Google Ban In China Says eMarketer


Google continues to dominate the global search market with 54.7 percent of search ad revenues worldwide in 2014. But mainland China’s ban on Google is giving Baidu, the search leader in that country, a huge advantage. Baidu will see its global share of search ad revenues increase from 6.4 percent in 2013 to 8.8 percent in 2015, according to new data from eMarketer.

“Baidu is reaping the benefits of Google’s ban in China—and of course, a massive and growing internet user population,” says eMarketer in the report, which breaks out search ad revenues from the overall digital advertising market for the first time.

The research firm notes that China will account for $14.90 billion, about 18 percent, of the global search spend in 2015. The U.S., by comparison, will account for $25.66 billion in search ad spend this year. But with China’s search spend growing 32.8 percent this year, nearly double the overall global growth of 16.2 percent, it’s easy to see that China could soon eclipse U.S. search spend. That is spend Google is missing out on.

For another perspective on future growth: the U.S. has internet penetration of over 86 percent of the population, while in China just 46 percent have internet access, according to Internet Live Stats.

Google’s search ad share is expected to shrink marginally from 55.2 percent in 2013 to 54.5 percent in 2015, but its search ad revenues will continue to far outweigh its competitors’. The company is expected to bring in $38.42 billion in search revenues in 2014 and $44.46 billion in 2015. Baidu’s revenue is expected to grow from $5.35 billion in 2014 to $7.18 billion in 2015.

Microsoft and Yahoo will see their combined search ad revenues grow by just 6.5 percent in 2015. Bing saw strong growth in 2014, with its search ad share rising from 3.7 percent in 2013 to 4.2 percent in 2014, and its share is expected to hold steady in 2015. Yahoo is expected to see stronger revenue growth in 2015, rising to $1.90 billion from $1.78 billion in 2014. However, Yahoo’s global share will continue to shrink, from 2.5 percent in 2014 to 2.3 percent in 2015. Last week, the two companies extended talks to renegotiate their search deal, which hit its five-year mark in March.

Search ad spending is expected to total $81.59 billion globally in 2015, up 16.2 percent from 2014. It is expected to grow at nearly 10 percent a year through 2019, topping $130.58 billion globally.

The post Google Still Dominant, But Baidu Benefitting From Google Ban In China Says eMarketer appeared first on Search Engine Land.

Source: SEland

Did Wikipedia Take An Organic Hit In Google Search?

Wikipedia has historically dominated Google’s search results – no one would argue with that.

But now with the knowledge graph, does Google have a need to rank Wikipedia at the top, when instead, it just takes the content from Wikipedia and other sources and places that information in the top knowledge graph box? The answer seems to be no.

A WebmasterWorld thread has webmasters and SEOs commenting that Wikipedia no longer dominates the search results for many queries. So I checked SearchMetrics, and there does seem to be a downward trend:

Wikipedia SearchMetrics

Here is what one webmaster said in the forums. Before, the Google search results ranked like this:

1. Wikipedia page
2. Big organization page
3. Government agency page
4. My page

Now, he said, they look more like this:

1. Big organization page
2. Government agency page
3. Wikipedia page
4. My page

I think the knowledge graph and quick answers have something to do with it.

Forum discussion at WebmasterWorld.


Source: SEroundTable

Moz Builds That Spam Identification Tool In Site Explorer

Almost two years ago, Rand Fishkin from Moz posted about looking into building a spam detection tool as a way for sites to figure out (1) whom not to get links from, (2) which bad links to remove, and (3) how Google may determine if a site is spammy and why.

The industry was torn, thinking this may be just Rand’s way of doing automated “outing,” but the truth is, a tool like this can be useful to “link forensics” SEOs (I used that term deliberately).

Rand announced the new paid tool on the Moz blog yesterday. Here is a quick video of how it works:

In short, it looks at just 17 different factors; the more of those flags a site trips, the spammier its score. (A rough sketch of this kind of flag counting follows the list below.)

Here are the flags Moz uses:

  • Low mozTrust to mozRank ratio: Sites with low mozTrust compared to mozRank are likely to be spam.
  • Large site with few links: Large sites with many pages tend to also have many links and large sites without a corresponding large number of links are likely to be spam.
  • Site link diversity is low: If a large percentage of links to a site are from a few domains it is likely to be spam.
  • Ratio of followed to nofollowed subdomains/domains (two separate flags): Sites with a large number of followed links relative to nofollowed are likely to be spam.
  • Small proportion of branded links (anchor text): Organically occurring links tend to contain a disproportionate amount of branded keywords. If a site does not have a lot of branded anchor text, it’s a signal the links are not organic.
  • Thin content: If a site has a relatively small ratio of content to navigation chrome it’s likely to be spam.
  • Site mark-up is abnormally small: Non-spam sites tend to invest in rich user experiences with CSS, Javascript and extensive mark-up. Accordingly, a large ratio of text to mark-up is a spam signal.
  • Large number of external links: A site with a large number of external links may look spammy.
  • Low number of internal links: Real sites tend to link heavily to themselves via internal navigation and a relative lack of internal links is a spam signal.
  • Anchor text-heavy page: Sites with a lot of anchor text are more likely to be spam than those with more content and fewer links.
  • External links in navigation: Spam sites may hide external links in the sidebar or footer.
  • No contact info: Real sites prominently display their social and other contact information.
  • Low number of pages found: A site with only one or a few pages is more likely to be spam than one with many pages.
  • TLD correlated with spam domains: Certain TLDs are more spammy than others (e.g., .pw).
  • Domain name length: A long subdomain name like “bycheapviagra.freeshipping.onlinepharmacy.com” may indicate keyword stuffing.
  • Domain name contains numerals: Domain names with numerals may be automatically generated and therefore spam.
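For illustration only, here is a minimal Python sketch of that kind of flag counting. The flag functions, thresholds, and TLD list below are hypothetical stand-ins, not Moz’s actual 17 factors or formulas; the point is just that every tripped flag adds to the score.

```python
import re

def long_domain_name(domain: str, max_len: int = 30) -> bool:
    """Hypothetical flag: unusually long (sub)domain names may be keyword-stuffed."""
    return len(domain) > max_len

def contains_numerals(domain: str) -> bool:
    """Hypothetical flag: domain names with digits may be auto-generated."""
    return bool(re.search(r"\d", domain))

def spammy_tld(domain: str, bad_tlds: tuple = (".pw",)) -> bool:
    """Hypothetical flag: some TLDs correlate with spam (illustrative list only)."""
    return domain.endswith(bad_tlds)

def spam_score(domain: str) -> int:
    """Count how many flags a domain trips; more flags means a spammier score."""
    checks = (long_domain_name, contains_numerals, spammy_tld)
    return sum(check(domain) for check in checks)

# The long pharmacy subdomain from the list above trips the length flag.
print(spam_score("bycheapviagra.freeshipping.onlinepharmacy.com"))  # 1
print(spam_score("example.com"))                                    # 0
```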

What do you think? I have yet to play directly with the tool.

Forum discussion at Twitter.

Image credit to BigStockPhoto for taekwondo fighter


Source: SEroundTable

Google News Updates Inclusion Process: Rejection Notices Bug

Google News has a pretty relaxed inclusion process relative to the older days of trying to get into Google News. The new process is documented in detail over here.

A key difference now is that you can only request inclusion via the Google News Publisher Center. Google’s documentation reads:

If you’ve read all of the above and your site follows our Webmaster guidelines, as well as Google News’ general, technical, and quality guidelines, then you’re ready to apply for inclusion within the Google News Publisher Center!

You will see a button next to all your verified sites in the publisher center that reads “Request Inclusion Into Google News.”

Right now, though, when you go through that process and are rejected, Google may not send you a rejection notice. Stacie Chan from Google said in a Google News Help thread that they need to add rejection notices to the new workflow. She wrote:

We have one more step we’re implementing this week: if you applied to Google News recently, we didn’t display a rejection note. We’re working on that so you’ll know exactly what your inclusion status is.

Here is what the inclusion form looks like when you try to submit a site:

Forum discussion at Google News Help.


Source: SEroundTable

When Did The Eiffel Tower Open To The Public? 126 Years Ago Today.

Did you know that 126 years ago today, on March 31, 1889, the Eiffel Tower in Paris, France, opened to the public?

Well, if you didn’t know, now everyone does because Google has a special logo, aka Google Doodle, on their home page reminding searchers.

Google Eiffel Tower Logo

It was built as the entrance arch to the 1889 World’s Fair and opened to the public on March 31st of that year. The Eiffel Tower is 1,063 feet tall, about the height of an 81-storey building, and is the tallest structure in Paris. It was actually the tallest structure in the world from 1889 to 1930.

Did you know that construction started on January 28, 1887 and was completed on March 15, 1889? That’s just over two years.

Here is a picture of the Eiffel Tower:

Eiffel Tower

Here is an animated GIF of the construction of the Google Doodle for the Eiffel Tower:

Forum discussion at Google+.


Source: SEroundTable

Daily Search Forum Recap: March 30, 2015

Here is a recap of what happened in the search forums today, through the eyes of the Search Engine Roundtable and other search forums on the web.

Search Engine Roundtable Stories:

Other Great Search Forum Threads:


Source: SEroundTable

Google AdWords Express Can Now Block Search Phrases

Google quietly announced on Google+ that those who use Google AdWords Express can now remove keyword phrases they no longer want to target.

Google said they have added the “ability to remove search phrases that may not be the best fit for your business (and add them back if you need to).”

Here is how to remove (or add back) search phrases you do not want to target:

  • Click the Edit icon next to your search phrases.
  • Uncheck the box next to the search phrase to remove it, or check the box to add it back.
  • Click Save when you’re done.

This new feature launched Friday to AdWords Express advertisers in Australia, Canada, India, New Zealand, South Africa, the United Kingdom, and the United States.

Forum discussion at Google+.


Source: SEroundTable

Page Load Takes Two Seconds? Google May Slow Crawling Your Site.

Almost all SEOs know that a busy or slow server will result in GoogleBot slowing how it crawls your web site, and we also know that extremely slow sites and pages can be negatively impacted when it comes to ranking well in the Google search results.

But how slow is too slow?

Google’s John Mueller called out a specific load time as being too slow for GoogleBot to crawl a site at its normal rate. He said in a Google Webmaster Help thread:

We’re seeing an extremely high response-time for requests made to your site (at times, over 2 seconds to fetch a single URL). This has resulted in us severely limiting the number of URLs we’ll crawl from your site, and you’re seeing that in Fetch as Google as well. My recommendation would be to make sure that your server is fast & responsive across the board. As our systems see a reduced response-time, they’ll automatically ramp crawling back up (which gives you more room to use Fetch as Google too).

He specifically called out “over 2 seconds” to load a single URL on this site, which is resulting in GoogleBot “severely limiting the number of URLs” it will crawl on that site.

Note that John is not saying anything about the PageSpeed ranking algorithm here; this is just about crawling of the site.
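If you want to spot-check how close your own pages are to that threshold, here is a minimal sketch using Python’s requests library. The URL and sample count are placeholders, and requests’ elapsed time (request sent to response headers received) is only a rough proxy for whatever GoogleBot actually measures.

```python
import requests

SLOW_THRESHOLD = 2.0  # seconds, per John Mueller's comment above

def check_response_time(url: str, samples: int = 5) -> float:
    """Fetch a URL several times and report the average response time."""
    times = []
    for _ in range(samples):
        resp = requests.get(url, timeout=10)
        # elapsed covers from sending the request to receiving the headers
        times.append(resp.elapsed.total_seconds())
    avg = sum(times) / len(times)
    verdict = "may be slowing GoogleBot's crawl" if avg > SLOW_THRESHOLD else "looks OK"
    print(f"{url}: average {avg:.2f}s over {samples} requests ({verdict})")
    return avg

check_response_time("https://www.example.com/")  # placeholder URL
```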

We often don’t hear specific numbers like this from Google.

Forum discussion at Google Webmaster Help.


Source: SEroundTable