
In roughly 24 hours coronavirus makes sports, a longtime sanctuary in times of crisis, disappear


Sports has always been the escape during times of crisis and collective stress. But now the very act of staging sports threatens to fuel the coronavirus pandemic and compound the stress.





‘It’s a big moment.’ Washington State leaves no doubt against Colorado, breaking drought at Pac-12 tournament


Not weighed down by their 10-year drought at the Pac-12 tournament, the Cougars trailed for just 87 seconds against Colorado on Wednesday night before driving the Buffaloes into the ground, 82-68, at T-Mobile Arena.






Due to coronavirus, NCAA grants extra year of eligibility to spring athletes, considers same for winter athletes


After the cancellation of the spring and winter championship tournaments stemming from concerns over the novel coronavirus pandemic, the NCAA will grant an extra year of eligibility to athletes who participate in spring sports, the organization announced Friday.






Pac-12 commissioner Larry Scott discusses conference’s financial hit and ‘concern and anxiety’ over athletes because of coronavirus


The Pac-12 is facing a revenue hit of at least $1 million per school from the cancellation of its men’s basketball tournament and March Madness, although the full extent of the damage won’t be known for weeks.





Isaiah Stewart announces he’s leaving Washington Huskies to enter NBA draft


On Wednesday, Stewart announced he's leaving Washington and entering the NBA draft, where he's expected to be selected in the first round.





When it comes to academics and diversity, Gonzaga is No. 1 seed


Gonzaga stood out in a study that seeded men’s and women’s NCAA tournament brackets based on graduation rates, academic success and diversity in the head-coaching ranks.





WSU coaches Nick Rolovich and Kyle Smith taking temporary salary reductions as part of ‘cost containment’ measure


To help compensate for lost NCAA distribution and added expenditures caused by the novel coronavirus outbreak, Washington State announced multiple “cost containment” measures Monday.





Notre Dame, Oregon top 2021 Maui Invitational field


LAHAINA, Hawaii (AP) — Former tournament champion Notre Dame and Oregon headline the 2021 Maui Invitational field. The bracket, announced Friday, also includes Butler, Houston, Saint Mary’s, Wisconsin, Texas A&M and host Chaminade. Notre Dame won the Maui title in its last appearance in 2017, beating Wichita State in the championship game. Wisconsin is making […]





Virus could ‘smolder’ in Africa, cause many deaths, says WHO


JOHANNESBURG (AP) — The coronavirus could “smolder” in Africa for years and take a high death toll across the continent, the World Health Organization has warned. The virus is spreading in Africa, but so far the continent has not seen a dramatic explosion in the number of confirmed cases. More than 52,000 confirmed infections and […]





Canadian provinces allow locked-down households to pair up — threatening hurt feelings all around


While jurisdictions around the world begin to relax coronavirus restrictions, a handful are pioneering a novel — and potentially fraught — approach: the double bubble. In Canada, the provinces of Newfoundland and Labrador and New Brunswick are trying it.





Vatican cardinal in row over claim that virus hurts religion


ROME (AP) — A petition signed by some conservative Catholics claiming the coronavirus is an overhyped “pretext” to deprive the faithful of Mass and impose a new world order has run into a hitch. The highest-ranking signatory, Cardinal Robert Sarah, head of the Vatican’s liturgy office, claims he never signed the petition. But the archbishop […]





Spain’s army predicts 2 more waves of coronavirus


BARCELONA, Spain (AP) — Spain’s army expects there to be two more outbreaks of the new coronavirus, according to an internal report seen by The Associated Press. The army report predicts “two more waves of the epidemic” and that Spain will take “between a year and a year-and-a-half to return to normality.” The document was […]





Amid pandemic, Pompeo to visit Israel for annexation talks


WASHINGTON (AP) — Secretary of State Mike Pompeo will travel to Israel next week for a brief visit amid the coronavirus pandemic and lockdown, a trip that’s expected to focus on Prime Minister Benjamin Netanyahu’s plans to annex portions of the West Bank, the State Department said Friday. Pompeo will make the lightning trip to […]





A top aide to Vice President Pence tests positive for coronavirus


WASHINGTON — A top aide to Vice President Mike Pence tested positive for coronavirus on Friday, making her the second known person working at the White House to contract the illness in the past two days, according to several sources familiar with the situation. Katie Miller, the vice president’s press secretary, was notified Friday about […]





Child in New York dies; syndrome tied to coronavirus is suspected


NEW YORK — A child died in a Manhattan hospital on Thursday from what appeared to be a rare syndrome linked to the coronavirus that causes life-threatening inflammation in critical organs and blood vessels of children, the hospital said. If confirmed, it would be the first known death in New York related to the mysterious […]





Hidden toll: Mexico ignores wave of coronavirus deaths in capital


MEXICO CITY — The Mexican government is not reporting hundreds, possibly thousands, of deaths from the coronavirus in Mexico City, dismissing anxious officials who have tallied more than three times as many fatalities in the capital as the government publicly acknowledges, according to officials and confidential data. The tensions have come to a head in […]





Two White House coronavirus cases raise question of whether anyone is really safe


WASHINGTON — In his eagerness to reopen the country, President Donald Trump faces the challenge of convincing Americans that it would be safe to go back to the workplace. But the past few days have demonstrated that even his own workplace may not be safe from the coronavirus. Vice President Mike Pence’s press secretary tested […]





Reopenings bring new cases in S. Korea, virus fears in Italy


ROME (AP) — South Korea’s capital closed down more than 2,100 bars and other nightspots Saturday because of a new cluster of coronavirus infections, Germany scrambled to contain fresh outbreaks at slaughterhouses, and Italian authorities worried that people were getting too friendly at cocktail hour during the country’s first weekend of eased restrictions. The new […]





Coronavirus takes a toll in Sweden’s immigrant community


STOCKHOLM (AP) — The flight from Italy was one of the last arrivals that day at the Stockholm airport. A Swedish couple in their 50s walked up and loaded their skis into Razzak Khalaf’s taxi. It was early March and concerns over the coronavirus were already present, but the couple, both coughing for the entire […]





‘Fear kills:’ WWII vets recall war, reject panic over virus


YAKUTSK, Russia (AP) — On the 75th anniversary of the Allied victory in World War II, The Associated Press spoke to veterans in ex-Soviet countries and discovered that lessons they learned during the war are helping them cope with a new major challenge — the coronavirus pandemic. As they recalled the horrors of the […]





Russian volunteers search for fallen World War II soldiers


KHULKHUTA, Russia (AP) — Crouching over the sun-drenched soil, Alfred Abayev picks up a charred fragment of a Soviet warplane downed in a World War II battle with advancing Nazi forces. “You can see it was burning,” he says, pointing at the weathered trace of a red star. Abayev and members of his search team […]





Russia, Belarus mark Victory Day in contrasting events


MOSCOW (AP) — Russian President Vladimir Putin marked Victory Day, the anniversary of the defeat of Nazi Germany in World War II, in a ceremony shorn of its usual military parade and pomp by the coronavirus pandemic. In neighboring Belarus, however, the ceremonies went ahead in full, with tens of thousands of people in the […]





Google Florida 2.0 Algorithm Update: Early Observations

It has been a while since Google has had a major algorithm update.

They recently announced one which began on the 12th of March.

What changed?

It appears multiple things did.

When Google rolled out the original version of Penguin on April 24, 2012 (primarily focused on link spam) they also rolled out an update to an on-page spam classifier for misdirection.

And, over time, it was quite common for Panda & Penguin updates to be sandwiched together.

If you were Google & had the ability to look under the hood to see why things changed, you would probably want to obfuscate any major update by changing multiple things at once to make reverse engineering the change much harder.

Anyone who operates a single website (& lacks the ability to look under the hood) will have almost no clue about what changed or how to adjust to the algorithms.

In the most recent algorithm update some sites which were penalized in prior "quality" updates have recovered.

Though many of those recoveries are only partial.

Many SEO blogs will publish articles about how they cracked the code on the latest update by publishing charts like the first one without publishing that second chart showing the broader context.

The first penalty any website receives might be the first of a series of penalties.

If Google smokes your site & it does not cause a PR incident & nobody really cares that you are gone, then there is a very good chance things will go from bad to worse to worser to worsterest, technically speaking.

“In this age, in this country, public sentiment is everything. With it, nothing can fail; against it, nothing can succeed. Whoever molds public sentiment goes deeper than he who enacts statutes, or pronounces judicial decisions.” - Abraham Lincoln

Absent effort & investment to evolve FASTER than the broader web, sites which are hit with one penalty will often further accumulate other penalties. It is like compound interest working in reverse - a pile of algorithmic debt which must be dug out of before the bleeding stops.
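The compound-interest-in-reverse idea can be put in numbers. A toy sketch, where the penalty percentages are invented purely for illustration:

```python
# Toy model: each algorithmic penalty multiplies remaining traffic.
# The percentages are invented for illustration only.
def traffic_after_penalties(baseline, penalty_fractions):
    """Apply a sequence of fractional traffic losses multiplicatively."""
    remaining = baseline
    for loss in penalty_fractions:
        remaining *= (1 - loss)
    return remaining

# Three stacked hits of 40%, 30% and 25% leave ~31.5% of baseline traffic.
print(traffic_after_penalties(100_000, [0.40, 0.30, 0.25]))
```

Because the losses compound rather than add, digging out of the pile requires reversing several penalties before total traffic meaningfully recovers.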

Further, many recoveries may be nothing more than a fleeting invitation to false hope: a prompt to pour more resources into a site that is struggling in an apparent death loop.

The above site which had its first positive algorithmic response in a couple years achieved that in part by heavily de-monetizing. After the algorithm updates already demonetized the website over 90%, what harm was there in removing 90% of what remained to see how it would react? So now it will get more traffic (at least for a while) but then what exactly is the traffic worth to a site that has no revenue engine tied to it?

That is ultimately the hard part. Obtaining a stable stream of traffic while monetizing at a decent yield, without the monetizing efforts leading to the traffic disappearing.

A buddy who owns the above site was working on link cleanup & content improvement on & off for about a half year with no results. Each month was a little worse than the prior month. It was only after I told him to remove the aggressive ads a few months back that he likely had any chance of seeing any sort of traffic recovery. Now he at least has a pulse of traffic & can look into lighter touch means of monetization.

If a site is consistently penalized then the problem might not be an algorithmic false positive, but rather the business model of the site.

The more something looks like eHow, the more fickle Google's algorithms will be in how they treat it.

Google does not like websites that sit at the end of the value chain & extract profits without having to bear far greater risk & expense earlier into the cycle.

Thin rewrites, largely speaking, don't add value to the ecosystem. Doorway pages don't either. And something that was propped up by a bunch of keyword-rich low-quality links is (in most cases) probably genuinely lacking in some other aspect.

Generally speaking, Google would like themselves to be the entity at the end of the value chain extracting excess profits from markets.

This is the purpose of the knowledge graph & featured snippets: to allow the results to answer the most basic queries without third party publishers getting anything. The knowledge graph serves as a floating vertical that eats an increasing share of the value chain & forces publishers to move higher up the funnel & publish more differentiated content.

As Google adds features to the search results (flight price trends, a hotel booking service on the day AirBNB announced they acquired HotelTonight, ecommerce product purchase on Google, shoppable image ads just ahead of the Pinterest IPO, etc.) it forces other players in the value chain to consolidate (Expedia owns Orbitz, Travelocity, Hotwire & a bunch of other sites) or add greater value to remain a differentiated & sought after destination (travel review site TripAdvisor was crushed by the shift to mobile & the inability to monetize mobile traffic, so they eventually had to shift away from being exclusively a reviews site to offer event & hotel booking features to remain relevant).

It is never easy changing a successful & profitable business model, but it is even harder to intentionally reduce revenues further or spend aggressively to improve quality AFTER income has fallen 50% or more.

Some people do the opposite & make up for a revenue shortfall by publishing more lower end content at an ever faster rate and/or increasing ad load. Either of which typically makes their user engagement metrics worse while making their site less differentiated & more likely to receive additional bonus penalties to drive traffic even lower.

In some ways I think the ability for a site to survive & remain through a penalty is itself a quality signal for Google.

Some sites which are overly reliant on search & have no external sources of traffic are ultimately sites which tried to behave too similarly to the monopoly that ultimately displaced them. And over time the tech monopolies are growing more powerful as the ecosystem around them burns down:

If you had to choose a date for when the internet died, it would be in the year 2014. Before then, traffic to websites came from many sources, and the web was a lively ecosystem. But beginning in 2014, more than half of all traffic began coming from just two sources: Facebook and Google. Today, over 70 percent of traffic is dominated by those two platforms.

Businesses which have sustainable profit margins & slack (in terms of management time & resources to deploy) can better cope with algorithmic changes & change with the market.

Over the past half decade or so there have been multiple changes that drastically shifted the online publishing landscape:

  • the shift to mobile, which both offers publishers lower ad yields while making the central ad networks more ad heavy in a way that reduces traffic to third party sites
  • the rise of the knowledge graph & featured snippets which often mean publishers remain uncompensated for their work
  • higher ad loads which also lower organic reach (on both search & social channels)
  • the rise of programmatic advertising, which further gutted display ad CPMs
  • the rise of ad blockers
  • increasing algorithmic uncertainty & a higher barrier to entry

Each one of the above could take a double digit percent out of a site's revenues, particularly if a site was reliant on display ads. Add them together and a website which was not even algorithmically penalized could still see a 60%+ decline in revenues. Mix in a penalty and that decline can chop a zero or two off the total revenues.
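To see how those headwinds compound into the "60%+ decline" figure, assume for illustration that each of the six shaved roughly 15% off revenue:

```python
# Illustrative only: six independent headwinds, each costing ~15% of revenue.
decline = 1 - 0.85 ** 6
print(f"{decline:.1%}")  # → 62.3%
```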

Businesses with lower margins can try to offset declines with increased ad spending, but that only works if you are not in a market with 2 & 20 VC fueled competition:

Startups spend almost 40 cents of every VC dollar on Google, Facebook, and Amazon. We don’t necessarily know which channels they will choose or the particularities of how they will spend money on user acquisition, but we do know more or less what’s going to happen. Advertising spend in tech has become an arms race: fresh tactics go stale in months, and customer acquisition costs keep rising. In a world where only one company thinks this way, or where one business is executing at a level above everyone else - like Facebook in its time - this tactic is extremely effective. However, when everyone is acting this way, the industry collectively becomes an accelerating treadmill. Ad impressions and click-throughs get bid up to outrageous prices by startups flush with venture money, and prospective users demand more and more subsidized products to gain their initial attention. The dynamics we’ve entered is, in many ways, creating a dangerous, high stakes Ponzi scheme.

And sometimes the platform claws back a second or third bite of the apple. Amazon.com charges merchants for fulfillment, warehousing, transaction based fees, etc. And they've pushed hard into launching hundreds of private label brands which pollute the interface & force brands to buy ads even on their own branded keyword terms.

They've recently jumped the shark by adding a bonus feature: even when a brand paid Amazon to send traffic to its listing, Amazon would insert a spam popover offering a cheaper private label branded product:

Amazon.com tested a pop-up feature on its app that in some instances pitched its private-label goods on rivals’ product pages, an experiment that shows the e-commerce giant’s aggressiveness in hawking lower-priced products including its own house brands. The recent experiment, conducted in Amazon’s mobile app, went a step further than the display ads that commonly appear within search results and product pages. This test pushed pop-up windows that took over much of a product page, forcing customers to either click through to the lower-cost Amazon products or dismiss them before continuing to shop. ... When a customer using Amazon’s mobile app searched for “AAA batteries,” for example, the first link was a sponsored listing from Energizer Holdings Inc. After clicking on the listing, a pop-up window appeared, offering less expensive AmazonBasics AAA batteries.

Buying those Amazon ads was quite literally subsidizing a direct competitor pushing you into irrelevance.

And while Amazon is destroying brand equity, AWS is doing investor relations matchmaking for startups. Anything to keep the current bubble going ahead of the Uber IPO that will likely mark the top in the stock market.

As the market caps of big tech companies climb they need to be more predatory to grow into the valuations & retain employees with stock options at an ever-increasing strike price.

They've created bubbles in their own backyards where each raise requires another. Teachers either drive hours to work or live in houses subsidized by loans from the tech monopolies that get a piece of the upside (provided they can keep their own bubbles inflated).

"It is an uncommon arrangement — employer as landlord — that is starting to catch on elsewhere as school employees say they cannot afford to live comfortably in regions awash in tech dollars. ... Holly Gonzalez, 34, a kindergarten teacher in East San Jose, and her husband, Daniel, a school district I.T. specialist, were able to buy a three-bedroom apartment for $610,000 this summer with help from their parents and from Landed. When they sell the home, they will owe Landed 25 percent of any gain in its value. The company is financed partly by the Chan Zuckerberg Initiative, Mark Zuckerberg’s charitable arm."

The above sort of dynamics have some claiming peak California:

The cycle further benefits from the Alchian-Allen effect: agglomerating industries have higher productivity, which raises the cost of living and prices out other industries, raising concentration over time. ... Since startups raise the variance within whatever industry they’re started in, the natural constituency for them is someone who doesn’t have capital deployed in the industry. If you’re an asset owner, you want low volatility. ... Historically, startups have created a constant supply of volatility for tech companies; the next generation is always cannibalizing the previous one. So chip companies in the 1970s created the PC companies of the 80s, but PC companies sourced cheaper and cheaper chips, commoditizing the product until Intel managed to fight back. Meanwhile, the OS turned PCs into a commodity, then search engines and social media turned the OS into a commodity, and presumably this process will continue indefinitely. ... As long as higher rents raise the cost of starting a pre-revenue company, fewer people will join them, so more people will join established companies, where they’ll earn market salaries and continue to push up rents. And one of the things they’ll do there is optimize ad loads, which places another tax on startups. More dangerously, this is an incremental tax on growth rather than a fixed tax on headcount, so it puts pressure on out-year valuations, not just upfront cash flow.

If you live hundreds of miles away the tech companies may have no impact on your rental or purchase price, but you can't really control the algorithms or the ecosystem.

All you can really control is your mindset & ensuring you have optionality baked into your business model.

  • If you are debt-levered you have little to no optionality. Savings give you optionality. Savings allow you to run at a loss for a period of time while also investing in improving your site and perhaps having a few other sites in other markets.
  • If you operate a single website that is heavily reliant on a third party for distribution then you have little to no optionality. If you have multiple projects that enables you to shift your attention toward working on whatever is going up and to the right while letting anything that is failing pass time without becoming overly reliant on something you can't change. This is why it often makes sense for a brand merchant to operate their own ecommerce website even if 90% of their sales come from Amazon. It gives you optionality should the tech monopoly become abusive or otherwise harm you (even if the intent was benign rather than outright misanthropic).

As the update ensues Google will collect more data with how users interact with the result set & determine how to weight different signals, along with re-scoring sites that recovered based on the new engagement data.

Recently a Bing engineer named Frédéric Dubut described how they score relevancy signals used in updates:

As early as 2005, we used neural networks to power our search engine and you can still find rare pictures of Satya Nadella, VP of Search and Advertising at the time, showcasing our web ranking advances. ... The “training” process of a machine learning model is generally iterative (and all automated). At each step, the model is tweaking the weight of each feature in the direction where it expects to decrease the error the most. After each step, the algorithm remeasures the rating of all the SERPs (based on the known URL/query pair ratings) to evaluate how it’s doing. Rinse and repeat.
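The training loop Dubut describes — tweak each feature weight in the direction that decreases error, re-measure, rinse and repeat — can be sketched in miniature as gradient descent on a linear scoring model. The feature vectors, judged ratings, and learning rate below are all invented for illustration; production rankers use far richer models:

```python
# Minimal sketch of the iterative weight-tuning loop described above.
# Each pair: (features of a URL/query pair, human-judged rating) — invented.
training_data = [
    ([1.0, 0.2], 0.9),   # strong anchor match, weak engagement
    ([0.1, 0.9], 0.7),   # weak anchor match, strong engagement
    ([0.0, 0.1], 0.1),
]

weights = [0.0, 0.0]
learning_rate = 0.1

for step in range(500):
    for features, rating in training_data:
        predicted = sum(w * f for w, f in zip(weights, features))
        error = predicted - rating
        # Tweak each weight in the direction that reduces squared error,
        # then re-measure against the rated pairs: rinse and repeat.
        weights = [w - learning_rate * error * f
                   for w, f in zip(weights, features)]

print([round(w, 2) for w in weights])
```

After enough passes the weights settle near the least-squares fit of the ratings, which is the automated "remeasure and tweak" cycle the quote describes.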

That same process is ongoing with Google now & in the coming weeks there'll be the next phase of the current update.

So far it looks like some quality-based re-scoring was done & some sites which were overly reliant on anchor text got clipped. On the back end of the update there'll be another quality-based re-scoring, but the sites that were hit for excessive manipulation of anchor text via link building efforts will likely remain penalized for a good chunk of time.

Update: It appears a major reverberation of this update occurred on April 7th. From early analysis, Google is mixing in showing results for related midtail concepts on a core industry search term & they are also in some cases pushing more aggressively on doing internal site-level searches to rank a more relevant internal page for a query where the homepage might have ranked in the past.





Keyword Not Provided, But it Just Clicks

When SEO Was Easy

When I got started on the web over 15 years ago I created an overly broad & shallow website that had little chance of making money because it was utterly undifferentiated and crappy. In spite of my best (worst?) efforts while being a complete newbie, sometimes I would go to the mailbox and see a check for a couple hundred or a couple thousand dollars come in. My old roommate & I went to Coachella & when the trip was over I returned to a bunch of mail to catch up on & realized I had made way more while not working than what I spent on that trip.

What was the secret to a total newbie making decent income by accident?

Horrible spelling.

Back then search engines were not as sophisticated with their spelling correction features & I was one of 3 or 4 people in the search index that misspelled the name of an online casino the same way many searchers did.

The high minded excuse for why I did not scale that would be claiming I knew it was a temporary trick that was somehow beneath me. The more accurate reason would be thinking in part it was a lucky fluke rather than thinking in systems. If I were clever at the time I would have created the misspeller's guide to online gambling, though I think I was just so excited to make anything from the web that I perhaps lacked the ambition & foresight to scale things back then.

In the decade that followed I had a number of other lucky breaks like that. One time one of the original internet bubble companies that managed to stay around put up a sitewide footer link targeting the concept that one of my sites made decent money from. This was just before the great recession, before Panda existed. The concept they targeted had 3 or 4 ways to describe it. 2 of them were very profitable & if they targeted either of the most profitable versions with that page the targeting would have sort of carried over to both. They would have outranked me if they targeted the correct version, but they didn't so their mistargeting was a huge win for me.

Search Gets Complex

Search today is much more complex. In the years since those easy-n-cheesy wins, Google has rolled out many updates which aim to feature sought after destination sites while diminishing the sites which rely on "one simple trick" to rank.

Arguably the quality of the search results has improved significantly as search has become more powerful, more feature rich & has layered in more relevancy signals.

Many quality small web publishers have gone away due to some combination of increased competition, algorithmic shifts & uncertainty, and reduced monetization as more ad spend was redirected toward Google & Facebook. But the impact as felt by any given publisher is not the impact as felt by the ecosystem as a whole. Many terrible websites have also gone away, while some formerly obscure though higher-quality sites rose to prominence.

There was the Vince update in 2009, which boosted the rankings of many branded websites.

Then in 2011 there was Panda as an extension of Vince, which tanked the rankings of many sites that published hundreds of thousands or millions of thin content pages while boosting the rankings of trusted branded destinations.

Then there was Penguin, a penalty that hit many websites which had heavily manipulated or otherwise aggressive-looking link profiles. Google felt there was a lot of noise in the link graph, which was their justification for Penguin.

There were updates which lowered the rankings of many exact match domains. And then increased ad load in the search results along with the other above ranking shifts further lowered the ability to rank keyword-driven domain names. If your domain is generically descriptive then there is a limit to how differentiated & memorable you can make it if you are targeting the core market the keywords are aligned with.

There is a reason eBay is more popular than auction.com, Google is more popular than search.com, Yahoo is more popular than portal.com & Amazon is more popular than store.com or shop.com. When that winner-take-most impact of many online markets is coupled with the move away from using classic relevancy signals, the economics shift to where it makes a lot more sense to carry the heavy overhead of establishing a strong brand.

Branded and navigational search queries could be used in the relevancy algorithm stack to confirm the quality of a site & verify (or dispute) the veracity of other signals.

Historically relevant algo shortcuts become less appealing as they become less relevant to the current ecosystem & even less aligned with the future trends of the market. Add in negative incentives for pushing on a string (penalties on top of wasting the capital outlay) and a more holistic approach certainly makes sense.

Modeling Web Users & Modeling Language

PageRank was an attempt to model the random surfer.

When Google is pervasively monitoring most users across the web they can shift to directly measuring their behaviors instead of using indirect signals.
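The random-surfer model PageRank formalized can be sketched with a few lines of power iteration over a tiny hypothetical web graph; the links and damping factor below are illustrative, not Google's production values:

```python
# Power-iteration sketch of PageRank's "random surfer" on a hypothetical
# three-page graph with no dangling pages (every page links out).
def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1 / len(pages) for p in pages}
    for _ in range(iterations):
        # With probability (1 - damping) the surfer jumps to a random page...
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            for target in outlinks:
                # ...otherwise they follow a random outbound link.
                new_rank[target] += damping * rank[page] / len(outlinks)
        rank = new_rank
    return rank

# A links to B and C; B links to C; C links back to A.
ranks = pagerank({"A": ["B", "C"], "B": ["C"], "C": ["A"]})
print(ranks)
```

The ranks converge to the stationary distribution of that surfer: here C, receiving links from both A and B, ends up with the largest share.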

Years ago Bill Slawski wrote about the long click, opening by quoting Steven Levy's In the Plex: How Google Thinks, Works, and Shapes Our Lives:

"On the most basic level, Google could see how satisfied users were. To paraphrase Tolstoy, happy users were all the same. The best sign of their happiness was the "Long Click" — this occurred when someone went to a search result, ideally the top one, and did not return. That meant Google had successfully fulfilled the query."

Of course, there's a patent for that. In Modifying search result ranking based on implicit user feedback they state:

user reactions to particular search results or search result lists may be gauged, so that results on which users often click will receive a higher ranking. The general assumption under such an approach is that searching users are often the best judges of relevance, so that if they select a particular search result, it is likely to be relevant, or at least more relevant than the presented alternatives.

If you are a known brand you are more likely to get clicked on than a random unknown entity in the same market.

And if you are something people are specifically seeking out, they are likely to stay on your website for an extended period of time.

One aspect of the subject matter described in this specification can be embodied in a computer-implemented method that includes determining a measure of relevance for a document result within a context of a search query for which the document result is returned, the determining being based on a first number in relation to a second number, the first number corresponding to longer views of the document result, and the second number corresponding to at least shorter views of the document result; and outputting the measure of relevance to a ranking engine for ranking of search results, including the document result, for a new search corresponding to the search query. The first number can include a number of the longer views of the document result, the second number can include a total number of views of the document result, and the determining can include dividing the number of longer views by the total number of views.
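The determination described in that passage — dividing the number of longer views by the total number of views — is simple to express directly; the view counts below are hypothetical:

```python
# Relevance measure per the patent language above: longer views of a
# document result divided by its total views. Counts are hypothetical.
def long_click_relevance(long_views, total_views):
    if total_views == 0:
        return 0.0
    return long_views / total_views

# A result viewed 200 times, 150 of them "long" (the user did not bounce):
print(long_click_relevance(150, 200))  # → 0.75
```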

Attempts to manipulate such data may not work.

safeguards against spammers (users who generate fraudulent clicks in an attempt to boost certain search results) can be taken to help ensure that the user selection data is meaningful, even when very little data is available for a given (rare) query. These safeguards can include employing a user model that describes how a user should behave over time, and if a user doesn't conform to this model, their click data can be disregarded. The safeguards can be designed to accomplish two main objectives: (1) ensure democracy in the votes (e.g., one single vote per cookie and/or IP for a given query-URL pair), and (2) entirely remove the information coming from cookies or IP addresses that do not look natural in their browsing behavior (e.g., abnormal distribution of click positions, click durations, clicks_per_minute/hour/day, etc.). Suspicious clicks can be removed, and the click signals for queries that appear to be spammed need not be used (e.g., queries for which the clicks feature a distribution of user agents, cookie ages, etc. that do not look normal).
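The first safeguard — one vote per cookie for a given query-URL pair — might be sketched as a deduplication pass over the click log; the click records below are invented for illustration:

```python
# Sketch of the "one single vote per cookie for a given query-URL pair"
# safeguard quoted above. Click records are invented for illustration.
def deduplicated_votes(clicks):
    """Count at most one vote per (cookie, query, url) triple."""
    seen = set()
    votes = {}
    for cookie, query, url in clicks:
        key = (cookie, query, url)
        if key in seen:
            continue  # repeat clicks from the same cookie are ignored
        seen.add(key)
        votes[(query, url)] = votes.get((query, url), 0) + 1
    return votes

clicks = [
    ("cookie1", "tennis court hire", "example.com"),
    ("cookie1", "tennis court hire", "example.com"),  # duplicate, dropped
    ("cookie2", "tennis court hire", "example.com"),
]
print(deduplicated_votes(clicks))
```

The second safeguard, dropping cookies and IPs whose behavior looks unnatural, would run as a separate filter before votes are counted at all.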

And just like Google can make a matrix of documents & queries, they could also choose to put more weight on search accounts associated with topical expert users based on their historical click patterns.

Moreover, the weighting can be adjusted based on the determined type of the user both in terms of how click duration is translated into good clicks versus not-so-good clicks, and in terms of how much weight to give to the good clicks from a particular user group versus another user group. Some users' implicit feedback may be more valuable than other users' due to the details of a user's review process. For example, a user that almost always clicks on the highest ranked result can have his good clicks assigned lower weights than a user who more often clicks results lower in the ranking first (since the second user is likely more discriminating in his assessment of what constitutes a good result). In addition, a user can be classified based on his or her query stream. Users that issue many queries on (or related to) a given topic T (e.g., queries related to law) can be presumed to have a high degree of expertise with respect to the given topic T, and their click data can be weighted accordingly for other queries by them on (or related to) the given topic T.
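The two weighting ideas above, discounting undiscriminating clickers and boosting topical experts, could look something like the sketch below. The specific thresholds and boost values are invented for illustration; the patent language does not disclose them.

```python
def user_click_weight(click_positions):
    """Weight a user's good clicks by how discriminating their clicking looks:
    users who nearly always click the #1 result get less weight than users
    who often scan deeper first. Thresholds are illustrative assumptions."""
    if not click_positions:
        return 1.0
    top_rate = click_positions.count(1) / len(click_positions)
    return 0.5 if top_rate > 0.9 else 1.0

def topic_weight(user_query_counts, topic, boost=1.5, min_queries=20):
    """Boost clicks from users who query a topic often (assumed expertise rule)."""
    return boost if user_query_counts.get(topic, 0) >= min_queries else 1.0
```

A user who only ever clicks position 1 would contribute half-weight votes, while a user with 25 law-related queries in their history would have their clicks on law queries boosted.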

Google was using click data to drive their search rankings as far back as 2009. David Naylor was perhaps the first person to publicly spot this: Google was ranking Australian websites for [tennis court hire] in the UK & Ireland, in part because that is where most of the click signal came from, as the phrase was most widely searched for in Australia. In the years since, Google has done a better job of geographically isolating clicks to prevent problems like the one David Naylor noticed, where almost all the search results in one geographic region came from a different country.

Whenever SEOs mention using click data to search engineers, the engineers are quick to respond that while they might consider any signal, clicks would be too noisy. But if a signal has noise, an engineer works around the noise by finding ways to filter it out or by combining multiple signals. To this day Google states they are still working to filter noise from the link graph: "We continued to protect the value of authoritative and relevant links as an important ranking signal for Search."

The site with millions of inbound links, few intentional visits & those who do visit quickly click the back button (due to a heavy ad load, poor user experience, low quality content, shallow content, outdated content, or some other bait-n-switch approach)...that's an outlier. Preventing those sorts of sites from ranking well would be another way of protecting the value of authoritative & relevant links.

Best Practices Vary Across Time & By Market + Category

Along the way, concurrent with the above sorts of updates, Google also improved their spelling auto-correct features, auto-completed search queries for many years through a feature called Google Instant (though they later undid forced query auto-completion while retaining automated search suggestions), and then rolled out a few other algorithms that further allowed them to model language & user behavior.

Today it would be much harder to get paid above median wages explicitly for sucking at basic spelling or scaling some other individual shortcut to the moon, like pouring millions of low quality articles into a (formerly!) trusted domain.

Nearly a decade after Panda, eHow's rankings still haven't recovered.

Back when I got started with SEO the phrase Indian SEO company was associated with cut-rate work where people were buying exclusively based on price. Sort of like a "I got a $500 budget for link building, but can not under any circumstance invest more than $5 in any individual link." Part of how my wife met me was she hired a hack SEO from San Diego who outsourced all the work to India and marked the price up about 100-fold while claiming it was all done in the United States. He created reciprocal links pages that got her site penalized & it didn't rank until after she took her reciprocal links page down.

With that sort of behavior widespread (a hack US firm teaching people working in an emerging market poor practices), it likely meant many SEO "best practices" which were learned in an emerging market (particularly where the web was also underdeveloped) would be more inclined to be spammy. Considering how far ahead many Western markets were on the early Internet & how India has so many languages & how most web usage in India is based on mobile devices where it is hard for users to create links, it only makes sense that Google would want to place more weight on end user data in such a market.

If you set your computer location to India Bing's search box lists 9 different languages to choose from.

The above is not to state anything derogatory about any emerging market, but rather that various signals are stronger in some markets than others. And competition is stronger in some markets than others.

Search engines can only rank what exists.

"In a lot of Eastern European - but not just Eastern European markets - I think it is an issue for the majority of the [bream? muffled] countries, for the Arabic-speaking world, there just isn't enough content as compared to the percentage of the Internet population that those regions represent. I don't have up to date data, I know that a couple years ago we looked at Arabic for example and then the disparity was enormous. so if I'm not mistaken the Arabic speaking population of the world is maybe 5 to 6%, maybe more, correct me if I am wrong. But very definitely the amount of Arabic content in our index is several orders below that. So that means we do not have enough Arabic content to give to our Arabic users even if we wanted to. And you can exploit that amazingly easily and if you create a bit of content in Arabic, whatever it looks like we're gonna go you know we don't have anything else to serve this and it ends up being horrible. and people will say you know this works. I keyword stuffed the hell out of this page, bought some links, and there it is number one. There is nothing else to show, so yeah you're number one. the moment somebody actually goes out and creates high quality content that's there for the long haul, you'll be out and that there will be one." - Andrey Lipattsev – Search Quality Senior Strategist at Google Ireland, on Mar 23, 2016

Impacting the Economics of Publishing

Now search engines can certainly influence the economics of various types of media. At one point some otherwise credible media outlets were pitching the Demand Media IPO narrative that Demand Media was the publisher of the future & what other media outlets would look like. Years later, after heavily squeezing the partner network & promoting programmatic advertising that reduces CPMs by the day, Google is funding partnerships with multiple news publishers like McClatchy & Gatehouse to try to revive the news dead zones even Facebook is struggling with.

"Facebook Inc. has been looking to boost its local-news offerings since a 2017 survey showed most of its users were clamoring for more. It has run into a problem: There simply isn’t enough local news in vast swaths of the country. ... more than one in five newspapers have closed in the past decade and a half, leaving half the counties in the nation with just one newspaper, and 200 counties with no newspaper at all."

As mainstream newspapers continue laying off journalists, Facebook's news efforts are likely to continue failing unless they include direct economic incentives, as Google's programmatic ad push broke the banner ad:

"Thanks to the convoluted machinery of Internet advertising, the advertising world went from being about content publishers and advertising context—The Times unilaterally declaring, via its ‘rate card’, that ads in the Times Style section cost $30 per thousand impressions—to the users themselves and the data that targets them—Zappo’s saying it wants to show this specific shoe ad to this specific user (or type of user), regardless of publisher context. Flipping the script from a historically publisher-controlled mediascape to an advertiser (and advertiser intermediary) controlled one was really Google’s doing. Facebook merely rode the now-cresting wave, borrowing outside media’s content via its own users’ sharing, while undermining media’s ability to monetize via Facebook’s own user-data-centric advertising machinery. Conventional media lost both distribution and monetization at once, a mortal blow."

Google is offering news publishers audience development & business development tools.

Heavy Investment in Emerging Markets Quickly Evolves the Markets

As the web grows rapidly in India, they'll have a thousand flowers bloom. In 5 years the competition in India & other emerging markets will be much tougher as those markets continue to grow rapidly. Media is much cheaper to produce in India than it is in the United States. Labor costs are lower & they never had the economic albatross that is the ACA adversely impact their economy. At some point the level of investment & increased competition will mean early techniques stop having as much efficacy. Chinese companies are aggressively investing in India.

“If you break India into a pyramid, the top 100 million (urban) consumers who think and behave more like Americans are well-served,” says Amit Jangir, who leads India investments at 01VC, a Chinese venture capital firm based in Shanghai. The early stage venture firm has invested in micro-lending firms FlashCash and SmartCoin based in India. The new target is the next 200 million to 600 million consumers, who do not have a go-to entertainment, payment or ecommerce platform yet— and there is gonna be a unicorn in each of these verticals, says Jangir, adding that it will be not be as easy for a player to win this market considering the diversity and low ticket sizes.

RankBrain

RankBrain appears to be based on using user clickpaths on head keywords to help bleed rankings across into related searches which are searched less frequently. A Googler didn't state this specifically, but it is how they would be able to use models of searcher behavior to refine search results for keywords which are rarely searched for.
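One hedged way to picture that bleed-over: treat the overlap in clicked URLs between a head query and a rare related query as a similarity measure, and let the rare query borrow the head query's click-derived scores in proportion. This is a speculative illustration of the general idea, not a description of RankBrain's actual mechanics; every name and threshold here is an assumption.

```python
def query_similarity(head_clicks, rare_clicks):
    """Jaccard overlap of clicked-URL sets, a crude proxy for relatedness."""
    a, b = set(head_clicks), set(rare_clicks)
    return len(a & b) / len(a | b) if (a | b) else 0.0

def bled_score(head_scores, head_clicks, rare_clicks, min_similarity=0.3):
    """Let a rare query borrow a head query's per-URL click scores in
    proportion to how similar their click behavior looks. All thresholds
    and shapes here are assumptions for illustration only."""
    sim = query_similarity(head_clicks, rare_clicks)
    if sim < min_similarity:
        return {}
    return {url: score * sim for url, score in head_scores.items()}
```

A rare query sharing most of its clicked URLs with a head query would inherit a discounted version of the head query's scores; an unrelated query would inherit nothing.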

In a recent interview in Scientific American, a Google engineer stated: "By design, search engines have learned to associate short queries with the targets of those searches by tracking pages that are visited as a result of the query, making the results returned both faster and more accurate than they otherwise would have been."

Now a person might go out and try to search for something a bunch of times or pay other people to search for a topic and click a specific listing, but some of the related Google patents on using click data (which keep getting updated) mentioned how they can discount or turn off the signal if there is an unnatural spike of traffic on a specific keyword, or if there is an unnatural spike of traffic heading to a particular website or web page.
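The spike-discounting idea those patents describe can be sketched with a simple outlier test. A z-score against the historical daily click distribution is my assumed stand-in here; the patents do not disclose the actual statistical test.

```python
from statistics import mean, stdev

def is_unnatural_spike(daily_clicks, z_threshold=3.0):
    """Flag the most recent day's clicks if they sit far outside the
    historical distribution, in which case the signal could be discounted
    or turned off. The z-score test and threshold are assumptions."""
    history, today = daily_clicks[:-1], daily_clicks[-1]
    if len(history) < 2:
        return False  # not enough history to judge
    sigma = stdev(history)
    if sigma == 0:
        return today != history[0]
    return (today - mean(history)) / sigma > z_threshold
```

A keyword that normally sees around 100 clicks a day and suddenly sees 500 would be flagged, while ordinary day-to-day variation would pass.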

And, since Google is tracking the behavior of end users on their own website, anomalous behavior is easier to track than it is tracking something across the broader web where signals are more indirect. Google can take advantage of their wide distribution of Chrome & Android where users are regularly logged into Google & pervasively tracked to place more weight on users where they had credit card data, a long account history with regular normal search behavior, heavy Gmail users, etc.

Plus there is a huge gap between the cost of traffic & the ability to monetize it. You might have to pay someone a dime or a quarter to search for something & there is no guarantee it will work on a sustainable basis even if you paid hundreds or thousands of people to do it. Any of those experimental searches will have no lasting value unless they influence rank, but even if they do influence rankings it might only last temporarily. If you bought a bunch of traffic into something genuine Google searchers didn't like, then even if it started to rank better temporarily, the rankings would quickly fall back if the real end user searchers disliked the site relative to other sites which already rank.

This is part of the reason why so many SEO blogs mention brand, brand, brand. If people are specifically looking for you in volume & Google can see that thousands or millions of people specifically want to access your site then that can impact how you rank elsewhere.

Even looking at something inside the search results for a while (dwell time) or quickly skipping over it to have a deeper scroll depth can be a ranking signal. Some Google patents mention how they can use mouse pointer location on desktop or scroll data from the viewport on mobile devices as a quality signal.

Neural Matching

Last year Danny Sullivan mentioned how Google rolled out neural matching to better understand the intent behind a search query.

The above Tweets capture what the neural matching technology intends to do. Google also stated:

we’ve now reached the point where neural networks can help us take a major leap forward from understanding words to understanding concepts. Neural embeddings, an approach developed in the field of neural networks, allow us to transform words to fuzzier representations of the underlying concepts, and then match the concepts in the query with the concepts in the document. We call this technique neural matching.
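The "transform words to fuzzier representations of the underlying concepts" step can be illustrated with the classic averaged-word-vector trick: embed each word, average the vectors, and compare query and document concepts by cosine similarity. The toy two-dimensional vectors below are invented for illustration; production systems use learned embeddings with hundreds of dimensions, and Google's actual neural matching models are far more sophisticated.

```python
import math

def embed(words, vectors):
    """Average word vectors into a fuzzy 'concept' representation."""
    known = [w for w in words if w in vectors]
    if not known:
        return [0.0] * len(next(iter(vectors.values())))
    dims = len(vectors[known[0]])
    return [sum(vectors[w][i] for w in known) / len(known) for i in range(dims)]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

# Toy 2-dimensional vectors, invented for this example.
vecs = {"cheap": [1.0, 0.1], "budget": [0.9, 0.2],
        "hotel": [0.1, 1.0], "lodging": [0.2, 0.9]}
sim = cosine(embed(["cheap", "hotel"], vecs), embed(["budget", "lodging"], vecs))
```

Even though "cheap hotel" and "budget lodging" share no words, their averaged concept vectors land in nearly the same place, which is the whole point of matching concepts rather than strings.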

To help people understand the difference between neural matching & RankBrain, Google told SEL: "RankBrain helps Google better relate pages to concepts. Neural matching helps Google better relate words to searches."

There are a couple research papers on neural matching.

The first one was titled A Deep Relevance Matching Model for Ad-hoc Retrieval. It mentioned using Word2vec; here are a few quotes from the research paper:

  • "Successful relevance matching requires proper handling of the exact matching signals, query term importance, and diverse matching requirements."
  • "the interaction-focused model, which first builds local level interactions (i.e., local matching signals) between two pieces of text, and then uses deep neural networks to learn hierarchical interaction patterns for matching."
  • "according to the diverse matching requirement, relevance matching is not position related since it could happen in any position in a long document."
  • "Most NLP tasks concern semantic matching, i.e., identifying the semantic meaning and inferring the semantic relations between two pieces of text, while the ad-hoc retrieval task is mainly about relevance matching, i.e., identifying whether a document is relevant to a given query."
  • "Since the ad-hoc retrieval task is fundamentally a ranking problem, we employ a pairwise ranking loss such as hinge loss to train our deep relevance matching model."
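The pairwise ranking loss the last quote mentions is simple enough to show directly. This is the standard hinge loss over a (relevant document, irrelevant document) pair for one query; the margin value of 1.0 is a common default, assumed here rather than taken from the paper.

```python
def pairwise_hinge_loss(score_relevant, score_irrelevant, margin=1.0):
    """Hinge loss over a (relevant, irrelevant) document pair for one query:
    zero once the relevant doc outscores the irrelevant doc by the margin,
    otherwise growing linearly with the violation."""
    return max(0.0, margin - score_relevant + score_irrelevant)
```

A well-separated pair contributes no gradient, so training effort concentrates on pairs the model still orders incorrectly or too narrowly.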

The paper mentions how semantic matching falls down when compared against relevancy matching because:

  • semantic matching relies on similarity matching signals (some words or phrases with the same meaning might be semantically distant), compositional meanings (matching sentences more than meaning) & a global matching requirement (comparing things in their entirety instead of looking at the best matching part of a longer document); whereas,
  • relevance matching can put significant weight on exact matching signals (weighting an exact match higher than a near match), adjust weighting on query term importance (one word or phrase in a search query might have a far higher discrimination value & deserve far more weight than the next) & leverage diverse matching requirements (allowing relevancy matching to happen in any part of a longer document).
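Those three relevance-matching properties can be caricatured in a few lines: exact term matches only, weighted by inverse document frequency so rare terms dominate, with the match allowed anywhere in the document. This is a crude illustrative sketch of the properties listed above, not the paper's actual neural architecture.

```python
import math

def relevance_match(query_terms, doc_terms, doc_freq, n_docs):
    """Score a document on exact query-term matches, giving rare (high-IDF)
    terms more weight, and letting the match occur anywhere in the document.
    doc_freq maps each term to how many of the n_docs contain it."""
    doc = set(doc_terms)  # position-independent: a match anywhere counts
    return sum(math.log(n_docs / (1 + doc_freq.get(t, 0)))
               for t in query_terms if t in doc)
```

Matching a rare term like a city name contributes far more than matching a stopword, which is exactly the "query term importance" behavior the paper describes.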

Here are a couple images from the above research paper.

And then the second research paper is Deep Relevance Ranking Using Enhanced Document-Query Interactions:

"interaction-based models are less efficient, since one cannot index a document representation independently of the query. This is less important, though, when relevancy ranking methods rerank the top documents returned by a conventional IR engine, which is the scenario we consider here."

That same sort of re-ranking concept is being better understood across the industry. There are ranking signals that earn some base level ranking, and then results get re-ranked based on other factors like how well a result matches the user intent.

Here are a couple images from the above research paper.

For those who hate the idea of reading research papers or patent applications, Martinibuster also wrote about the technology here. About the only part of his post I would debate is this one:

"Does this mean publishers should use more synonyms? Adding synonyms has always seemed to me to be a variation of keyword spamming. I have always considered it a naive suggestion. The purpose of Google understanding synonyms is simply to understand the context and meaning of a page. Communicating clearly and consistently is, in my opinion, more important than spamming a page with keywords and synonyms."

I think one should always consider user experience over other factors; however, a person could still use variations throughout the copy & pick up a bit more traffic without coming across as spammy. Danny Sullivan mentioned the super synonym concept was impacting 30% of search queries, so there are still a lot of queries which may only be available to those who use a specific phrase on their page.

Martinibuster also wrote another blog post tying more research papers & patents to the above. You could probably spend a month reading all the related patents & research papers.

The above sort of language modeling & end user click feedback complement links-based ranking signals in a way that makes it much harder to luck one's way into any form of success by being a terrible speller or just bombing away at link manipulation without much concern toward any other aspect of the user experience or the market you operate in.

Pre-penalized Shortcuts

Google was even issued a patent for predicting site quality based upon the N-grams used on the site & comparing those against the N-grams used on other established site where quality has already been scored via other methods: "The phrase model can be used to predict a site quality score for a new site; in particular, this can be done in the absence of other information. The goal is to predict a score that is comparable to the baseline site quality scores of the previously-scored sites."

Have you considered using a PLR (private label rights) package to generate the shell of your site's content? Good luck with that, as some sites trying that shortcut might be pre-penalized from birth.

Navigating the Maze

When I started in SEO one of my friends had a dad who was vastly smarter than I am. He advised me that Google engineers were smarter, had more capital, had more exposure, had more data, etc etc etc ... and thus SEO was ultimately going to be a malinvestment.

Back then he was at least partially wrong because influencing search was so easy.

But in the current market, 16 years later, we are near the inflection point where he would finally be right.

At some point the shortcuts stop working & it makes sense to try a different approach.

The flip side of all the above changes is that as the algorithms have become more complex they have gone from being a headwind to people ignorant about SEO to being a tailwind to those who do not focus excessively on SEO in isolation.

If one is a dominant voice in a particular market, if they break industry news, if they have key exclusives, if they spot & name the industry trends, if their site becomes a must read & what amounts to a habit ... then they perhaps become viewed as an entity. Entity-related signals help them, and the same signals that work against people who merely lucked into a bit of success become a tailwind rather than a headwind.

If your work defines your industry, then any efforts to model entities, user behavior or the language of your industry are going to boost your work on a relative basis.

This requires sites to publish frequently enough to be a habit, or publish highly differentiated content which is strong enough that it is worth the wait.

Those which publish frequently without being particularly differentiated are almost guaranteed to eventually walk into a penalty of some sort. And each additional person who reads marginal, undifferentiated content (particularly if it has an ad-heavy layout) brings that site one visitor closer to eventually getting whacked. Success becomes self-regulating. Any short-term success becomes self-defeating if one has a highly opportunistic short-term focus.

Those who write content that only they could write are more likely to have sustained success.





Brands vs Ads

Brand, Brand, Brand

About 7 years ago I wrote about how the search relevancy algorithms were placing heavy weighting on brand-related signals after Vince & Panda on the (half correct!) presumption that this would lead to excessive industry consolidation which in turn would force Google to turn the dials in the other direction.

My thesis was Google would need to increasingly promote some smaller niche sites to make general web search differentiated from other web channels & minimize the market power of vertical leading providers.

The reason my thesis was only half correct (and ultimately led to the absolutely wrong conclusion) is Google has the ability to provide the illusion of diversity while using eye candy & displacement efforts to shift an increasing share of searches from organic to paid results.

Shallow Verticals With a Shill Bid

As long as any market has at least 2 competitors in it Google can create a "me too" offering that they hard code front & center and force the other 2 players (along with other players along the value chain) to bid for marketshare. If competitors are likely to complain about the thinness of the me too offering & it being built upon scraping other websites, Google can buy out a brand like Zagat or a data supplier like ITA Software to undermine criticism until the artificially promoted vertical service has enough usage that it is nearly on par with other players in the ecosystem.

Google need not win every market. They only need to ensure there are at least 2 competing bids left in the marketplace while dialing back SEO exposure. They can then run other services to redirect user flow and force the ad buy. They can insert their own bid as a sort of shill floor bid in their auction. If you bid below that amount they'll collect the profit through serving the customer directly, if you bid above that they'll let you buy the customer vs doing a direct booking.

Adding Volatility to Economies of Scale

Where this gets more than a bit tricky is if you are a supplier of third party goods & services where you buy in bulk to get preferential pricing for resale. If you buy 100 rooms a night from a particular hotel based on the presumption of prior market performance & certain channels effectively disappear, you have to bid above market to sell some portion of the rooms, because getting anything for them is better than leaving them unsold.

"Well I am not in hotels, so thankfully this won't impact me" is an incomplete thought. Google Ads now offer a lead generation extension.

Dipping a bit back into history here, but after Groupon said no to Google's acquisition offer, Google promptly partnered with players 2 through n to ensure Groupon did not have a lasting competitive advantage. In the fullness of time most of those companies died, LivingSocial was acquired by Groupon for nothing & Groupon is today worth less than the amount they raised in VC & IPO funding.

Markets Naturally Evolve Toward Promoting Brands

When a vertical is new a player can compete just by showing up. Then over time as the verticals become established consumers develop habits, brands beat out generics & the markets get consolidated down to being heavily influenced & controlled by a couple strong players.

In the offline world of atoms there are real world costs tied to local regulations, shipping, sourcing, supply chains, inventory management, etc. The structure of the web & the lack of marginal distribution cost causes online markets to be even more consolidated than their offline analogs.

When Travelocity outsourced their backend infrastructure to Expedia most people visiting their website were unaware of the change. After Expedia acquired the site, longtime Travelocity customers likely remained unaware. In some businesses the only significant difference in the user experience is the logo at the top of the page.

Most large markets will ultimately consolidate down to a couple players (e.g. Booking vs Expedia) while smaller players lack the scale needed to have the economic leverage to pay Google's increasing rents.

This sort of consolidation was happening even when the search results were mostly organic & relevancy was driven primarily by links. As Google has folded in usage data & increased ad load on the search results it becomes harder for a generically descriptive domain name to build brand-related signals.

Re-sorting the Markets Once More

It is not only generically descriptive sorts of sites that have faded though. Many brand investments turned out to be money losers after the search result set was displaced by more ads (& many brand-related search result pages also carry ads above the organic results).

The ill informed might write something like this:

Since the Motorola debacle, it was Google's largest acquisition after the $676 million purchase of ITA Software, which became Google Flights. (Uh, remember that? Does anyone use that instead of Travelocity or one of the many others? Neither do I.)

The reality is brands lose value as the organic result set is displaced. To make the margins work they might desperately outsource just about everything but marketing to a competitor / partner, which will later acquire them for a song.

Travelocity had roughly 3,000 people on the payroll globally as recently as a couple of years ago, but the Travelocity workforce has been whittled to around 50 employees in North America with many based in the Dallas area.

The best relevancy algorithm in the world is trumped by preferential placement of inferior results which bypasses the algorithm. If inferior results are hard coded in placements which violate net neutrality for an extended period of time, they can starve other players in the market from the vital user data & revenues needed to reinvest into growth and differentiation.

Value plays see their stocks crash as growth slows or goes in reverse. With the exception of startups funded by Softbank, growth plays are locked out of receiving further investment rounds as their growth rate slides.

Startups like Hipmunk disappear. Even an Orbitz or a Travelocity becomes a bolt-on acquisition.

The viability of TripAdvisor as a stand alone business becomes questioned, leading them to partner with Ctrip.

TripAdvisor has one of the best link profiles of any commercially oriented website outside of perhaps Amazon.com. But ranking #1 doesn't count for much if that #1 ranking is below the fold. Or, even worse, if Google literally hides the organic search results.

TripAdvisor shifted their business model to allow direct booking to better monetize mobile web users, but as Google has eaten screen real estate & grown Google Travel into a $100 billion business, other players have seen their stocks sag.

Top of The Funnel

Google sits at the top of the funnel & all other parts of the value chain are complements to be commoditized.

  • Buy premium domain names? Google's SERPs test replacing domain names with words & make the words associated with the domain name gray.
  • Improve conversion rates? Your competitor almost certainly did as well, now you both can bid more & hand over an increasing economic rent to Google.
  • Invest in brand awareness? Google shows ads for competitors on your brand terms, forcing you to buy to protect the brand equity you paid to build.

Search Metrics mentioned Hotels.com was one of the biggest losers during the recent algorithm updates: "I’m going to keep on this same theme there, and I’m not going to say overall numbers, the biggest loser, but for my loser I’m going to pick Hotels.com, because they were literally like neck and neck, like one and two with Booking, as far as how close together they were, and the last four weeks, they’ve really increased that separation."

As Google ate the travel category the value of hotel-related domain names has fallen through the floor.

Most of the top selling hotel-related domain names were sold about a decade ago:

On August 8th HongKongHotels.com sold for $4,038. A decade ago that name likely would have sold for around $100,000.

And the new buyer may have overpaid for it!

Growing Faster Than the Market

Google consistently grows their ad revenues 20% a year in a global economy growing at under 4%.

There are only about 6 ways they can do that:

  • growth of web usage (though many of those who are getting online today have a far lower disposable income than those who got on a decade or two ago did)
  • gain marketshare (very hard in search, given that they effectively are the market in most markets outside of a few countries like China & Russia)
  • create new inventory (new ad types on image search results, Google Maps & YouTube)
  • charge more for clicks
  • improve at targeting through better surveillance of web users (getting harder after GDPR & similar efforts from some states in the next year or two)
  • shift click streams away from organic toward paid channels (through larger ads, more interactive ad units, less appealing organic result formatting, pushing organic results below the fold, hiding organic results, etc.)

Six of One, Half-dozen of the Other

Wednesday both Expedia and TripAdvisor reported earnings after hours & both fell off a cliff: "Both Okerstrom and Kaufer complained that their organic, or free, links are ending up further down the page in Google search results as Google prioritizes its own travel businesses."

Losing 20% to 25% of your market cap in a single day is an extreme move for a company worth billions of dollars.

Thursday Google hit fresh all time highs.

"Google’s old motto was ‘Don’t Be Evil’, but you can’t be this big and profitable and not be evil. Evil and all-time highs pretty much go hand in hand." - Howard Lindzon

Booking held up much better than TripAdvisor & Expedia as they have a bigger footprint in Europe (where antitrust is a thing) and they have a higher reliance on paid search versus organic.

Frozen in Fear vs Fearless

The broader SEO industry is to some degree frozen by fear. Roughly half of SEOs claim to have not bought *ANY* links in a half-decade.

Long after most of the industry has stopped buying links some people still run the "paid links are a potential FTC guidelines violation" line as though it is insightful and/or useful.

Ask the people carrying Google's water what they think of the official FTC guidance on poor ad labeling in search results and you will hear the beautiful sound of crickets chirping.

Where is the ad labeling in this unit?

Does small gray text in the upper right corner stating "about these results" count as legitimate ad labeling?

And then when you scroll over that gray text and click on it you get "Some of these hotel search results may be personalized based on your browsing activity and recent searches on Google, as well as travel confirmations sent to your Gmail. Hotel prices come from Google's partners."

Ads, Scroll, Ads, Scroll, Ads...

Zooming out a bit further on the above ad unit to look at the entire search result page, we can now see the following:

  • 4 text ad units above the map
  • huge map which segments demand by price tier, current sales, luxury, average review, geographic location
  • organic results below the above wall of ads, and the number of organic search results has been reduced from 10 to 7

How many scrolls does one need to do to get past the above wall of ads?

If one clicks on one of the hotel prices the follow up page is ... more ads.

Check out how the ad label is visually overwhelmed by a bright blue pop over.

Defund

It is worth noting Google Chrome has a built-in ad blocking feature which allows them to strip all ads from third party websites that do not follow Google's ad experience best practices.

You won't see ads on websites that have poor ad experiences, like:

  • Too many ads
  • Annoying ads with flashing graphics or autoplaying audio
  • Ad walls before you can see content

When these ads are blocked, you'll see an "Intrusive ads blocked" message. Intrusive ads will be removed from the page.

And, as a bonus: to some, paid links are a crime, but Google can sponsor academic conferences for market regulators while requesting the payments not be disclosed.

Excessive Profits = Spam

Hotels have been at the forefront of SEO for many years. They drive massive revenues & were perhaps the only vertical ever referenced in the Google rater guidelines, which explicitly stated all affiliate sites should be labeled as spam even if they are helpful to users.

Google has won most of the profits in the travel market & so they'll need to eat other markets to continue their 20% annual growth.

As they grow, other markets disappear.

"It's a bug that you could rank highly in Google without buying ads, and Google is trying to fix the bug." - Googler John Rockway, January 31, 2012

Some people who market themselves as SEO experts not only recognize this trend but even encourage this sort of behavior:

Zoopla, Rightmove and On The Market are all dominant players in the industry, and many of their house and apartment listings are duplicated across the different property portals. This represents a very real reason for Google to step in and create a more streamlined service that will help users make a more informed decision. ... The launch of Google Jobs should not have come as a surprise to anyone, and neither should its potential foray into real estate. Google will want to diversify its revenue channels as much as possible, and any market that allows it to do so will be in its sights. It is no longer a matter of if they succeed, but when.

If nobody is serving a market that is justification for entering it. If a market has many diverse players that is justification for entering it. If a market is dominated by a few strong players that is justification for entering it. All roads lead to the pile of money. :)

Extracting information from the ecosystem & diverting attention from other players while charging rising rents does not make the ecosystem stronger. Doing so does not help users make a more informed decision.

Information as a Vertical

The dominance Google has in core profitable vertical markets also exists in the news & general publishing categories. Some publishers get more traffic from Google Discover than from Google search. Publishers which try to turn off Google's programmatic ads find their display ad revenues fall off a cliff:

"Nexstar Media Group Inc., the largest local news company in the U.S., recently tested what would happen if it stopped using Google’s technology to place ads on its websites. Over several days, the company’s video ad sales plummeted. “That’s a huge revenue hit,” said Tony Katsur, senior vice president at Nexstar. After its brief test, Nexstar switched back to Google." ... "Regulators who approved that $3.1 billion deal warned they would step in if the company tied together its offerings in anticompetitive ways. In interviews, dozens of publishing and advertising executives said Google is doing just that with an array of interwoven products."

News is operating like many other (broken) markets. The Salt Lake Tribune converted to a nonprofit organization.

Many local markets have been consolidated down to ownership by a couple of private equity roll-ups looking to further consolidate the market. GateHouse Media acquired Gannett & has a $1.8 billion mountain of debt to pay off.

McClatchy - the second largest domestic newspaper chain - may soon file for bankruptcy:

there’s some nuance in this new drama — one of many to come from the past decade’s conversion of news companies into financial instruments stripped of civic responsibility by waves of outside money men. After all, when we talk about newspaper companies, we typically use their corporate names — Gannett, GateHouse, McClatchy, MNG, Lee. But it’s at least as appropriate to use the names of the hedge funds, private equity companies, and other investment vehicles that own and control them.

The Washington Post - owned by Amazon's Jeff Bezos - is creating an ad tech stack which serves other publishers & brands, though they also believe a reliance on advertiser & subscription revenue is unsustainable: “We are too beholden to just advertiser and subscriber revenue, and we’re completely out of our minds if we think that’s what’s going to be what carries us through the next generation of publishing. That’s very clear.”

Future Prospects

We are nearing inflection points where many markets that seemed somewhat disconnected from search will still end up being dominated by Google. Gmail, Android, Web Analytics, Play Store, YouTube, Maps, Waze ... are all additional points of leverage beyond the core search & ads products.

If all roads lead to money one can't skip healthcare - now roughly 20% of the United States GDP.

Google scrubbed many alternative health sites from the search results. Some of them may have deserved it. Others were perhaps false positives.

Google wants to get into the healthcare market in a meaningful way. Google bought Fitbit and partnered with Ascension on a secret project gathering health information on over 50 million Americans.

Google is investing heavily in quantum computing. Google Fiber was a nothingburger to force competing ISPs into accelerating expensive network upgrades, but beaming in internet services from satellites will allow Google to bypass local politics, local regulations & heavy network infrastructure construction costs. A startup named Kepler recently provided high-bandwidth connectivity to the Arctic. When Google launches a free ISP there will be many knock-on effects causing partners to long for the day when Google was only as predatory as they are today.

"Capitalism is an efficient system for surfacing and addressing the needs of consumers. But once it veers toward control over markets by a single entity, those benefits disappear." - Seth Godin




v

Favicon SEO

Google recently copied their mobile result layout over to desktop search results. The three big pieces which changed as part of that update were:

  • URLs: In many cases Google will now show breadcrumbs in the search results rather than showing the full URL. The layout no longer differentiates between HTTP and HTTPS. And the URLs shifted from an easily visible green color to a much easier to miss black.
  • Favicons: All listings now show a favicon next to them.
  • Ad labeling: the ad label sits in the same spot as the favicon does for organic search results, but it is a black which sort of blends into the URL line. Over time, expect the black ad label to become a lighter color, paralleling how Google made ad background colors lighter over time.

One could expect this change to boost the CTR on ads while lowering the CTR on organic search results, at least until users get used to seeing favicons and stop thinking of them as ads.

The Verge panned the SERP layout update. Some folks on Reddit hate this new layout as it is visually distracting, the contrast on the URLs is worse, and many people think the organic results are ads.

I suspect a lot of phishing sites will use subdomains patterned off the brand they are arbitraging coupled with bogus favicons to try to look authentic. I wouldn't reconstruct an existing site's structure based on the current search result layout, but if I were building a brand new site I might prefer to put it at the root instead of on www so the words were that much closer to the logo.

Google provides the following guidelines for favicons:

  • Both the favicon file and the home page must be crawlable by Google (that is, they cannot be blocked to Google).
  • Your favicon should be a visual representation of your website's brand, in order to help users quickly identify your site when they scan through search results.
  • Your favicon should be a multiple of 48px square, for example: 48x48px, 96x96px, 144x144px and so on. SVG files, of course, do not have a specific size. Any valid favicon format is supported. Google will rescale your image to 16x16px for use in search results, so make sure that it looks good at that resolution. Note: do not provide a 16x16px favicon.
  • The favicon URL should be stable (don’t change the URL frequently).
  • Google will not show any favicon that it deems inappropriate, including pornography or hate symbols (for example, swastikas). If this type of imagery is discovered within a favicon, Google will replace it with a default icon.
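The size rule above is mechanical enough to sanity-check in code. A minimal sketch in Python for raster favicons (the function name and checks are illustrative, not any official Google API):

```python
def is_valid_favicon_size(width: int, height: int) -> bool:
    """Check a raster favicon against Google's stated size guidance:
    it should be square, a positive multiple of 48px, and not supplied
    at 16x16 (Google rescales to 16x16 itself). SVG favicons have no
    fixed size and are not covered by this check."""
    if width != height:               # must be square
        return False
    if (width, height) == (16, 16):   # explicitly discouraged size
        return False
    return width > 0 and width % 48 == 0

# 48x48, 96x96, 144x144 pass; 16x16, 50x50 and non-square sizes fail.
```

Since Google rescales everything down to 16x16 anyway, it is also worth previewing the image at that size before shipping it.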

In addition to the above, I thought it would make sense to provide a few other tips for optimizing favicons.

  • Keep your favicons consistent across sections of your site if you are trying to offer a consistent brand perception.
  • In general, less is more. 16x16 is a tiny space, so if you try to convey a lot of information inside of it, you'll likely end up creating a blob that almost nobody but you recognizes.
  • It can make sense to include the first letter from a site's name or a simplified logo widget as the favicon, but it is hard to include both in a single favicon without it looking overdone & cluttered.
  • A colored favicon on a white background generally looks better than a white icon on a colored background, as having a colored background means you are eating into some of the scarce pixel space for a border.
  • Using a square shape versus a circle gives you more surface area to work with.
  • Even if your logo has italics on it, it might make sense to avoid using italics in the favicon to make the letter look cleaner.

Here are a few favicons I like & why I like them:

  • Citigroup - manages to get the word Citi in there while looking memorable & distinctive without looking overly cluttered
  • Nerdwallet - the N makes a great use of space, the colors are sharp, and it almost feels like an arrow that is pointing right
  • Inc - the bold I with a period is strong.
  • LinkedIn - very memorable using a small part of the word from their logo & good color usage.

Some of the other memorable ones that I like include: Twitter, Amazon, eBay, Paypal, Google Play & CNBC.

Here are a few favicons I dislike & why:

  • Wikipedia - the W is hard to read.
  • USAA - they included both the logo widget and the 4 letters in a tiny space.
  • Yahoo! - they used inconsistent favicons across their sites & use italics on them. Some of the favicons have the whole word Yahoo in them while the others are the Y! in italics.

If you do not have a favicon Google will show a dull globe next to your listing. Real Favicon Generator is a good tool for creating favicons in various sizes.

What favicons do you really like? Which big sites do you see that are doing it wrong?




v

Revenue Quality & Leverage

The coronavirus issue is likely to linger for some time.

Up to 70% of Germany could become infected & some countries like the UK are even considering herd immunity as a strategy:

"I’m an epidemiologist. When I heard about Britain’s ‘herd immunity’ coronavirus plan, I thought it was satire"
- William Hanage

What if their models are broken?

Many companies like WeWork or Oyo have played it fast and loose chasing growth, while slower-growing companies have been levering up to fund share buybacks. Airlines spent 96% of free cash flow on share buybacks. The airlines seek a $50 billion bailout package.

There are knock-on effects from Boeing to TripAdvisor to Google all the way down to the travel affiliate blogger, local restaurants closing, the over-levered bus company going through bankruptcy & bondholders eating a loss on the debt.

Companies are going to let a lot of skeletons out of the closet as literally anything and everything bad gets attributed to coronavirus. Layoffs, renegotiating contracts, pausing ad budgets, renegotiating debts, requesting bailouts, etc. The Philippine stock market was recently trading at 2012 levels & closed indefinitely.

Brad Geddes mentioned advertisers have been aggressively pulling PPC budgets over the past week: “If you have to leave the house to engage in the service, it just seems like it’s not converting right now.”

During the prior recession Google repriced employee options to retain talent.

In spite of consumers being glued to the news, tier one news publishers are anticipating large ad revenue declines:

Some of the largest advertisers, including Procter & Gamble, Unilever, Apple, Microsoft, Danone, AB InBev, Burberry and Aston Martin, made cuts to sales forecasts for the year. With the outlook for the spread of the virus changing by day, many companies are caught in a spiral of uncertainty. That tends to gum up decisions, and ad spending is an easy expenditure to put on pause. The New York Times has warned that it expects advertising revenue to decline “in the mid-teens” in the current quarter as a result of coronavirus.

More time online might mean search engines & social networks capture a greater share of overall ad spend, but if large swaths of the economy do not convert & how people live changes for an extended period of time, it will take time for new categories to build the economic engines that replace the old, out-of-favor ones.

[IMPORTANT: insert affiliate ad for cruise vacations here]

As Google sees advertisers pause ad budgets Google will get more aggressive with keeping users on their site & displacing organic click flows with additional ad clicks on the remaining advertisers.

When Google or Facebook see a 5% or 10% pullback other industry players might see a 30% to 50% decline as the industry pulls back broadly, focuses more resources on the core, and the big attention merchants offset their losses by clamping down on other players.
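The arithmetic behind that asymmetry is worth making explicit: if the dominant platforms hold most of the spend and shrink less than the market, everyone else absorbs an outsized share of the decline. A toy model in Python (all numbers hypothetical):

```python
def residual_decline(total_spend: float, platform_share: float,
                     industry_drop: float, platform_drop: float) -> float:
    """Return the implied revenue decline for non-platform players when
    the dominant platform's revenue falls less than the industry average."""
    platform = total_spend * platform_share
    others = total_spend - platform
    new_total = total_spend * (1 - industry_drop)
    new_platform = platform * (1 - platform_drop)
    new_others = new_total - new_platform
    return 1 - new_others / others

# With a 60% platform share, a 10% industry pullback and only a 5% platform
# decline, everyone else falls 17.5% - far worse than the market average.
print(round(residual_decline(100.0, 0.6, 0.10, 0.05), 3))  # prints 0.175
```

The same mechanism works in reverse: if the platform falls at exactly the industry rate, the rest of the market only sees the average decline.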

At its peak TripAdvisor was valued at about $14 billion & it is now valued at about $2 billion.

TripAdvisor announced layoffs. As did Expedia. As did Booking.com. As did many hotels. And airlines. etc. etc. etc.

I am not suggesting people should be fearful or dominated by negative emotions. Rather, one should live as though many others will be living that way.

In times of elevated uncertainty, in business it is best to not be led by emotions unless they are positive ones. Spend a bit more time playing if you can afford to & work more on things you love.

Right now we might be living through the flu pandemic of 1918 and the Great Depression of 1929 while having constant access to social media updates. And that's awful.

Consume less but deeper. Less Twitter, less news, fewer big decisions, read more books.

It is better to be more pragmatic & logic-based in determining opportunity cost & the best strategy to use than to be led by extreme fear.

  • If you have sustainable high-margin revenue treasure it.
  • If you have low-margin revenue it might quickly turn into negative margin revenues unless something changes quickly.
  • If you have low-margin revenue which is sustainable but has under-performed less stable high-margin revenue, you might want to put a bit more effort into those sorts of projects, as they are more likely to endure.

On a positive note, we might soon get a huge wave of innovation...

"Take the Great Depression. Economist Alexander Field writes that “the years 1929–1941 were, in the aggregate, the most technologically progressive of any comparable period in U.S. economic history.” Productivity growth was twice as fast in the 1930s as it was in the decade prior. The 1920s were the era of leisure because people could afford to relax. The 1930s were the era of frantic problem solving because people had no other choice. The Great Depression brought unimaginable financial pain. It also brought us supermarkets, microwaves, sunscreen, jets, rockets, electron microscopes, magnetic recording, nylon, photocopying, teflon, helicopters, color TV, plexiglass, commercial aviation, most forms of plastic, synthetic rubber, laundromats, and countless other discoveries."

The prior recession led to trends like Groupon. The McJobs recovery led to services like Uber & DoorDash. Food delivery has been trending south recently, though perhaps the stay-at-home economy will give it a boost.

I have been amazed at how fast affiliates moved with pushing N95 face masks online over the past couple months. Seeing how fast that stuff spun up really increases the perceived value of any sustainable high-margin businesses.

Amazon.com is hiring another 100,000 warehouse workers as people shop from home. Amazon banned new face masks and hand sanitizer listings. One guy had to donate around 18,000 cleaning products he couldn't sell.

I could see online education becoming far more popular as people aim to retrain while stuck at home.

What sorts of new industries will current & new technologies lead to as more people spend time working from home?




v

New Version of SEO Toolbar

Our programmer recently updated our SEO toolbar to work with the most recent version of Firefox.

You can install it from here. After you install it the toolbar should automatically update on a forward basis.

It is easy to toggle on or off simply by clicking on the green or gray O. If the O is gray it is off & if it is green it is on.

The toolbar shows site & page level link data from data sources like SEMrush, Ahrefs & Majestic, along with estimated Google search traffic from SEMrush and some social media metrics.

At the right edge of the toolbar there is a [Tools] menu which pulls in the age of a site from the Internet Archive Wayback Machine and the IP address hosting the site, cross links into search engine cached copies of pages, and offers access to our SEO Xray on-page analyzer.
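The Wayback Machine lookup can be approximated with Archive.org's public availability API (https://archive.org/wayback/available). A sketch of parsing its JSON response in Python - no live request is made here, and the helper name is my own:

```python
def snapshot_year(api_response: dict):
    """Extract the capture year from a parsed Wayback Machine
    availability-API response. Requesting with timestamp=19960101
    biases the 'closest' snapshot toward the earliest capture, which
    roughly approximates a site's age. Returns None if no snapshot
    is available."""
    closest = api_response.get("archived_snapshots", {}).get("closest")
    if not closest or not closest.get("available"):
        return None
    return int(closest["timestamp"][:4])

# Abbreviated example of the API's response shape:
sample = {"archived_snapshots": {"closest": {"available": True,
                                             "timestamp": "19990125084553"}}}
print(snapshot_year(sample))  # prints 1999
```

This only approximates site age: the Archive's first capture can lag a site's launch, and robots.txt exclusions can hide snapshots entirely.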

SEO today is much more complex than it was back when we first launched this toolbar, as back then almost everything was just links, links, links. All metrics in isolation are somewhat useless, but being able to see estimated search traffic stats right near link data & being able to click into your favorite data sources to dig deeper can help save a lot of time.

For now the toolbar is still only available on Firefox, though we could theoretically have it work on Chrome *if* at some point we trusted Google.




v

How to clean your face mask to help prevent getting and spreading the coronavirus


PHILADELPHIA — Both the Centers for Disease Control and Prevention and the Pennsylvania Department of Health now recommend we all wear face masks when going about essential tasks in public, as part of the fight against the coronavirus pandemic. Those types of masks have become a hot topic online during the ongoing outbreak, and are […]




v

Here are some activities to do this weekend even while staying at home


As we continue to quarantine under Gov. Jay Inslee's "stay at home" order, there are still lots of fun activities you can do this weekend. So, stay in, read a book, start a movie marathon and order some takeout.




v

Having pandemic-related food and body anxieties amid the coronavirus pandemic? You’re not alone.


Living through a pandemic will inevitably take a toll on our minds and bodies. Here are some tips for treating your mind and body well under quarantine.




v

Here’s a mental health tip to get you through coronavirus quarantine: Find tranquility in nature


Since humans are such social animals, this time of confinement and isolation makes it more crucial than ever to connect — with friends and family, but also with nature. Here’s why being around nature can help your mental health during this stressful time.




v

Coronavirus pushed spin, barre, yoga and other fitness classes online. Here’s how Seattle-area fitness studios have adapted


In these coronavirus pandemic times, online yoga has become as ubiquitous as online dating. But for some other kinds of fitness classes, the switch to virtual instruction has been more challenging.




v

Saving money on critical brand-name drugs


Prescription drugs can cost a fortune. One reader wonders aloud about saving money on brand-name medicine.




v

Technology’s had us ‘social distancing’ for years. Can our digital ‘lifeline’ get us through the coronavirus pandemic?


In some ways, we’ve been social distancing for years as more aspects of our social lives go digital. So now, we may be uniquely equipped (if not conditioned) to adapt our lives to stay-at-home orders.




v

Coronavirus pandemic triggers a wave of self-sufficiency around Seattle: Vegetable gardens, urban chickens are in-demand


Since the outbreak of the coronavirus pandemic, many local plant nurseries say there’s been a run on seeds as people all over Seattle take to gardening to grow food and provide solace during an uncertain time.




v

Poison center calls spike during coronavirus pandemic as more people are exposed to cleaning and disinfecting agents


Be cautious handling — and mixing — cleaning supplies, read labels and follow directions. Many of the accidental, and potentially dangerous, recent exposures reported to the Washington Poison Center have been from ordinary household cleaning supplies or the combination of them.




v

Trump raises question of ultraviolet light and COVID-19. We ask doctors, scientists.


President Donald Trump speculated about ultraviolet rays. But artificial UV techniques are ineffective and likely deadly for treating an infected person, scientists say — and some can be extremely dangerous used at home for disinfecting.




v

Doctors’ practices are hurt by coronavirus pandemic, just when they’re most needed


Many physician practices, like other businesses, are questioning how they'll survive the coronavirus outbreak, according to the Washington State Medical Association.




v

How to know when you need to toss those limp vegetables


We’ve all been there before — staring down a questionable bag of veggies and a decision over what to do with them. Here’s how to tell what you should and shouldn’t eat.




v

Two celestial treats will be visible this week — and both are worth going outside in your jammies


A huge asteroid will make a (relatively) close pass of Earth early Wednesday, but you'll need a telescope to see that; however, an exceptionally bright Venus should be visible to the naked eye at dusk and in the early evenings. Look to the west.




v

Quercetin solved a spring allergy problem


Q: I had such a terrible allergy attack that I couldn’t get my head off my desk to drive myself home. It was 1987, and I was very reluctant to take any medication. My boss gave me a pill she said was safe because it was plant-based. It was quercetin. When she checked on me […]




v

The coronavirus pandemic has taken a toll on our collective mental health. Can nutrition help?


Though there isn’t a diet that has been scientifically proven to sustain or improve your mental health, research suggests eating certain foods can correlate with improved mental well-being.




v

Fast-moving weather systems mean the week will start wet and get wetter


As the rain gets heavier by midweek, we can also expect cooler lowland temperatures and snow in the mountains.




v

Sunny, beautiful weather is here this week! Getting outside can relieve stress — just stay away from other people


If self-isolating or social-distancing to slow the spread of the novel coronavirus has been stressful, you can get a much-needed mental-health boost by getting some sunshine, exercise and fresh air -- as long as you stay away from others.




v

Earthquake shakes Utah, rattling frayed coronavirus nerves


SALT LAKE CITY (AP) — A moderate earthquake Wednesday near Salt Lake City shut down a major air traffic hub, damaged a spire atop a temple and frightened millions of people already on edge from the coronavirus pandemic. There were no reports of injuries. The 5.7-magnitude quake just after 7 a.m. damaged the spire and […]




v

It’s cherry blossom season, but because of the coronavirus, the UW invites you to watch from home


The UW wants you to stay away from the quad — but you can add the school's cherry blossoms to your home streaming queue.




v

Seattle to close major parks, beaches this weekend due to coronavirus fears during expected warmer weather


Seattle is closing more than a dozen of the city’s largest and most popular parks for the weekend because officials are worried about people crowding into the parks to enjoy the pleasant spring weather and spreading the novel coronavirus to each other, Mayor Jenny Durkan said Thursday.