
Version of ‘Let It Be’ sung only by women





Government must signal banks to extend long-term credit





DNC to Investigate Ohio Voting Irregularities

The leader of the Democratic National Committee announced Monday that he will launch an investigation into voting irregularities in Ohio, where lines snaked outside some inner-city polling places on Election Day and provisional ballots were sometimes in short supply.





On Nov. 2, GOP Got More Bang for Its Billion

In the most expensive presidential contest in the nation's history, John F. Kerry and his Democratic supporters nearly matched President Bush and the Republicans, who outspent them by just $60 million, $1.14 billion to $1.08 billion, an analysis shows.





In Wash. State, Democrat Takes Office Amid Suit

The freshly inaugurated Democratic governor's grip on the job she won by the tissue-thin margin of 129 votes remains wobbly, as Republicans press state courts to order a new election.





Kerry Cites Suppressed Votes in Election

Sen. John F. Kerry, in some of his most pointed public comments yet about the presidential election, invoked Martin Luther King Jr.'s legacy on Monday as he criticized President Bush and decried reports of voter disenfranchisement.





Report Acknowledges Exit Poll Inaccuracies

Interviewing for the 2004 exit polls was the most inaccurate of any in the past five presidential elections as procedural problems compounded by the refusal of large numbers of Republican voters to be surveyed led to inflated estimates of support for John F. Kerry, according to a report released Wednesday by the research firms responsible for the flawed surveys.





Panel to Start Writing Social Security Bill

Five months after President Bush launched his drive to overhaul Social Security, the difficult, if not impossible, task of drafting legislation begins Tuesday when the Senate Finance Committee holds the first hearing on options to secure Social Security's future.





Fewer U.S. Deaths Linked to Obesity

A new government study has concluded that obesity causes about 112,000 deaths each year in the United States, far fewer than a previous, highly publicized estimate by another part of the same agency.





Hospital Services Performed Overseas

A movement toward greater use of telemedicine is widening the spectrum of care doctors can provide from afar and enabling more outsourcing of services overseas.





Corona: US unemployment rate at record high

The unemployment rate in the United States rose to 14.7 percent in April - the highest level since record-keeping began after World War II. The true figures could be even higher.





UN Security Council: Maas denounces unilateral national action

At a meeting of the UN Security Council, Foreign Minister Maas called for more support for international bodies. Nationalism ends in destruction, he warned, marking the end of World War II 75 years ago. By A. Passenheim.





Bahn: Union expects slow recovery

For years the Bahn set passenger records. With the corona crisis came the collapse. Many customers will not return quickly, believes EVG union chief Hommel. That is mainly due to the decline in business travel.





Death of Little Richard: the rock 'n' roll pioneer falls silent

The really big hits eluded Little Richard after the late 1950s, yet his influence shaped generations of artists over decades. Now the rock 'n' roll original has died at age 87. By Arthur Landwehr.





EU leaders: Europe is "very fragile at the moment"

The EU's top officials have expressed concern about the state of the union. The corona crisis threatens to weaken Europe - at the expense of the poorest. There is considerable criticism of the border closures.





Germany and world maps of coronavirus cases

How many confirmed coronavirus cases are there? The interactive maps give a current overview for Germany and the world. They also show how many people have died and how many have recovered.





Military parade in Belarus: tightly packed and without masks

Despite warnings of the risk of coronavirus infection, Belarus celebrated the 75th anniversary of the victory over Nazi Germany. Thousands of soldiers paraded through Minsk. Criticism came from the WHO and Russia.





A third of restaurants in Mecklenburg-Vorpommern open again

On Saturday, restaurateurs in Mecklenburg-Vorpommern were the first in Germany allowed to reopen. After a seven-week forced break, around a third of businesses took the chance.





Corona in slaughterhouses: criticism of communal housing

After hundreds of slaughterhouse workers tested positive for the coronavirus, several politicians are demanding consequences. They criticize above all the cramped housing conditions of the mostly Eastern European workers.





Corona measures: bishops spread conspiracy theories

Several Catholic bishops criticize the corona measures, drawing on widespread conspiracy theories. They see the "prelude to a world government." The German Bishops' Conference has voiced sharp criticism.





Ex-Washington State coach Mike Leach apologizes after tweeting photo of woman with noose


Mississippi State's new coach posted, and later deleted, a tweet of a photo of an elderly woman resting in a chair and simultaneously knitting a noose to pass her time during coronavirus self-quarantine.





WSU coach Nick Rolovich has ‘fit like a glove’ in Pullman. But success will be measured on the field.


Rolovich has brought his fun to The Palouse, hired in January as Washington State’s new football coach, replacing Mike Leach, who went to Mississippi State. But winning Cougs over will ultimately be decided on the field.





WSU coaches Nick Rolovich and Kyle Smith taking temporary salary reductions as part of ‘cost containment’ measure


To help compensate for lost NCAA distribution and added expenditures caused by the novel coronavirus outbreak, Washington State announced multiple “cost containment” measures Monday.





Former Washington State tackle Andre Dillard donates strength equipment, nutrition items to alma mater


The Woodinville grad, who plays for the Philadelphia Eagles, sent packages the school will distribute to its athletes.





Despite loaded receiver class, WSU Cougars’ Dezmon Patmon hopes to hear name called in NFL draft


It's considered to be a historically deep receiver draft class this year, but the 6-foot-4 receiver hopes to stand out with his size.





One of two Power Five schools without a 2021 commit, Washington State faces hurdle in recruiting


Of the 65 programs that make up college football’s “Power Five” conferences, 63 have at least one prospect committed in the 2021 recruiting class. Washington State and Arizona are the two that don't.





California wide receiver Orion Peters becomes first WSU Cougars commit in 2021 class


Inglewood (Calif.) High wide receiver Orion Peters pledged to WSU, becoming the first 2021 prospect to do so when he announced his decision on Twitter Friday night.





WSU receiver Renard Bell’s family survives frightening bout with the novel coronavirus


Anyone who stumbled on the tweet sent out by Renard Bell at 2:41 p.m. Friday would understand why the Washington State wide receiver is smiling again. “My grandma is fully recovered from COVID-19,” Bell posted with two emojis – the first depicting a set of hands praying and the second of a heart. My grandma […]





Three-star offensive tackle Christian Hilborn becomes WSU’s second 2021 commit


Christian Hilborn, a 6-foot-5, 280-pound offensive tackle from Highland High School in Utah has pledged to the Cougars, becoming WSU's second commit of the 2021 class.





Washington Huskies cancel all sports competitions through March 29 amid coronavirus concerns


The University of Washington will suspend athletic-related activities and events through March 29 due to concerns regarding the novel coronavirus. “The University of Washington athletic department has announced it will suspend all athletic-related activities and events, including workouts, training and practices, through the end of the winter quarter and spring break (March 29) for all […]





‘It’s a big moment.’ Washington State leaves no doubt against Colorado, breaking drought at Pac-12 tournament


Not weighed down by their 10-year drought at the Pac-12 tournament, the Cougars trailed for just 87 seconds against Colorado on Wednesday night before driving the Buffaloes into the ground, 82-68, at T-Mobile Arena.





Due to coronavirus, NCAA grants extra year of eligibility to spring athletes, considers same for winter athletes


After the cancellation of the spring and winter championships tournaments stemming from concerns over the novel coronavirus pandemic, the NCAA will grant an extra year of eligibility to athletes who participate in spring sports, the organization announced Friday.





Take a trip down memory lane with the best — and worst — memories of the Kingdome


On the anniversary of the Kingdome's implosion, we take a trip down memory lane to relive its best and worst moments.





Pac-12 commissioner Larry Scott discusses conference’s financial hit and ‘concern and anxiety’ over athletes because of coronavirus


The Pac-12 is facing a revenue hit of at least $1 million per school from the cancellation of its men’s basketball tournament and March Madness, although the full extent of the damage won’t be known for weeks.





When it comes to academics and diversity, Gonzaga is No. 1 seed


Gonzaga stood out in a study that seeded men’s and women’s NCAA tournament brackets based on graduation rates, academic success and diversity in the head-coaching ranks.





Four-star center Dishon Jackson commits to Washington State


Coach Kyle Smith has added one of the top-rated prospects in program history to an already robust 2020 recruiting class.









Notre Dame, Oregon top 2021 Maui Invitational field


LAHAINA, Hawaii (AP) — Former tournament champion Notre Dame and Oregon headline the 2021 Maui Invitational field. The bracket, announced Friday, also includes Butler, Houston, Saint Mary’s, Wisconsin, Texas A&M and host Chaminade. Notre Dame won the Maui title in its last appearance in 2017, beating Wichita State in the championship game. Wisconsin is making […]





Amid pandemic, Pompeo to visit Israel for annexation talks


WASHINGTON (AP) — Secretary of State Mike Pompeo will travel to Israel next week for a brief visit amid the coronavirus pandemic and lockdown, a trip that’s expected to focus on Prime Minister Benjamin Netanyahu’s plans to annex portions of the West Bank, the State Department said Friday. Pompeo will make the lightning trip to […]





A top aide to Vice President Pence tests positive for coronavirus


WASHINGTON — A top aide to Vice President Mike Pence tested positive for coronavirus on Friday, making her the second known person working at the White House to contract the illness in the past two days, according to several sources familiar with the situation. Katie Miller, the vice president’s press secretary, was notified Friday about […]





Hidden toll: Mexico ignores wave of coronavirus deaths in capital


MEXICO CITY — The Mexican government is not reporting hundreds, possibly thousands, of deaths from the coronavirus in Mexico City, dismissing anxious officials who have tallied more than three times as many fatalities in the capital than the government publicly acknowledges, according to officials and confidential data. The tensions have come to a head in […]





Two White House coronavirus cases raise question of whether anyone is really safe


WASHINGTON — In his eagerness to reopen the country, President Donald Trump faces the challenge of convincing Americans that it would be safe to go back to the workplace. But the past few days have demonstrated that even his own workplace may not be safe from the coronavirus. Vice President Mike Pence’s press secretary tested […]





Reopenings bring new cases in S. Korea, virus fears in Italy


ROME (AP) — South Korea’s capital closed down more than 2,100 bars and other nightspots Saturday because of a new cluster of coronavirus infections, Germany scrambled to contain fresh outbreaks at slaughterhouses, and Italian authorities worried that people were getting too friendly at cocktail hour during the country’s first weekend of eased restrictions. The new […]





Coronavirus takes a toll in Sweden’s immigrant community


STOCKHOLM (AP) — The flight from Italy was one of the last arrivals that day at the Stockholm airport. A Swedish couple in their 50s walked up and loaded their skis into Razzak Khalaf’s taxi. It was early March and concerns over the coronavirus were already present, but the couple, both coughing for the entire […]





Militants increasing attacks on Burkina Faso mines


BOUDA, Burkina Faso (AP) — Jihadists burst into the gold mine where Moussa Tambura worked in Burkina Faso, forbidding everyone from smoking and drinking. It wasn’t long before the men returned and leveled the place to the ground. “They attacked the site, killed people and burned houses,” said Tambura, 29, clenching his fists. He was […]





Google Florida 2.0 Algorithm Update: Early Observations

It has been a while since Google has had a major algorithm update.

They recently announced one which began on the 12th of March.

What changed?

It appears multiple things did.

When Google rolled out the original version of Penguin on April 24, 2012 (primarily focused on link spam) they also rolled out an update to an on-page spam classifier for misdirection.

And, over time, it was quite common for Panda & Penguin updates to be sandwiched together.

If you were Google & had the ability to look under the hood to see why things changed, you would probably want to obfuscate any major update by changing multiple things at once to make reverse engineering the change much harder.

Anyone who operates a single website (& lacks the ability to look under the hood) will have almost no clue about what changed or how to adjust to the algorithms.

In the most recent algorithm update some sites which were penalized in prior "quality" updates have recovered.

Though many of those recoveries are only partial.

Many SEO blogs will publish articles about how they cracked the code on the latest update by publishing charts showing a recovery without the companion chart showing the broader context.

The first penalty any website receives might be the first of a series of penalties.

If Google smokes your site & it does not cause a PR incident & nobody really cares that you are gone, then there is a very good chance things will go from bad to worse to worser to worsterest, technically speaking.

“In this age, in this country, public sentiment is everything. With it, nothing can fail; against it, nothing can succeed. Whoever molds public sentiment goes deeper than he who enacts statutes, or pronounces judicial decisions.” - Abraham Lincoln

Absent effort & investment to evolve FASTER than the broader web, sites which are hit with one penalty will often further accumulate other penalties. It is like compound interest working in reverse - a pile of algorithmic debt which must be dug out of before the bleeding stops.

Further, many recoveries may be nothing more than a fleeting invitation to false hope - an invitation to pour more resources into a site that is struggling in an apparent death loop.

The above site which had its first positive algorithmic response in a couple years achieved that in part by heavily de-monetizing. After the algorithm updates already demonetized the website over 90%, what harm was there in removing 90% of what remained to see how it would react? So now it will get more traffic (at least for a while) but then what exactly is the traffic worth to a site that has no revenue engine tied to it?

That is ultimately the hard part. Obtaining a stable stream of traffic while monetizing at a decent yield, without the monetizing efforts leading to the traffic disappearing.

A buddy who owns the above site was working on link cleanup & content improvement on & off for about a half year with no results. Each month was a little worse than the prior month. It was only after I told him to remove the aggressive ads a few months back that he likely had any chance of seeing any sort of traffic recovery. Now he at least has a pulse of traffic & can look into lighter touch means of monetization.

If a site is consistently penalized then the problem might not be an algorithmic false positive, but rather the business model of the site.

The more something looks like eHow, the more fickle Google's algorithms will be in how they receive it.

Google does not like websites that sit at the end of the value chain & extract profits without having to bear far greater risk & expense earlier into the cycle.

Thin rewrites, largely speaking, don't add value to the ecosystem. Doorway pages don't either. And something that was propped up by a bunch of keyword-rich low-quality links is (in most cases) probably genuinely lacking in some other aspect.

Generally speaking, Google would like themselves to be the entity at the end of the value chain extracting excess profits from markets.

This is the purpose of the knowledge graph & featured snippets: to allow the results to answer the most basic queries without third party publishers getting anything. The knowledge graph serves as a floating vertical that eats an increasing share of the value chain & forces publishers to move higher up the funnel & publish more differentiated content.

As Google adds features to the search results (flight price trends, a hotel booking service on the day AirBNB announced they acquired HotelTonight, ecommerce product purchase on Google, shoppable image ads just ahead of the Pinterest IPO, etc.) it forces other players in the value chain to consolidate (Expedia owns Orbitz, Travelocity, Hotwire & a bunch of other sites) or add greater value to remain a differentiated & sought after destination (travel review site TripAdvisor was crushed by the shift to mobile & the inability to monetize mobile traffic, so they eventually had to shift away from being exclusively a reviews site to offer event & hotel booking features to remain relevant).

It is never easy changing a successful & profitable business model, but it is even harder to intentionally reduce revenues further or spend aggressively to improve quality AFTER income has fallen 50% or more.

Some people do the opposite & make up for a revenue shortfall by publishing more lower end content at an ever faster rate and/or increasing ad load. Either of which typically makes their user engagement metrics worse while making their site less differentiated & more likely to receive additional bonus penalties to drive traffic even lower.

In some ways I think the ability for a site to survive & remain through a penalty is itself a quality signal for Google.

Some sites which are overly reliant on search & have no external sources of traffic are ultimately sites which tried to behave too similarly to the monopoly that ultimately displaced them. And over time the tech monopolies are growing more powerful as the ecosystem around them burns down:

If you had to choose a date for when the internet died, it would be in the year 2014. Before then, traffic to websites came from many sources, and the web was a lively ecosystem. But beginning in 2014, more than half of all traffic began coming from just two sources: Facebook and Google. Today, over 70 percent of traffic is dominated by those two platforms.

Businesses which have sustainable profit margins & slack (in terms of management time & resources to deploy) can better cope with algorithmic changes & change with the market.

Over the past half decade or so there have been multiple changes that drastically shifted the online publishing landscape:

  • the shift to mobile, which both offers publishers lower ad yields & makes the central ad networks more ad heavy in a way that reduces traffic to third party sites
  • the rise of the knowledge graph & featured snippets which often mean publishers remain uncompensated for their work
  • higher ad loads which also lower organic reach (on both search & social channels)
  • the rise of programmatic advertising, which further gutted display ad CPMs
  • the rise of ad blockers
  • increasing algorithmic uncertainty & a higher barrier to entry

Each one of the above could take a double digit percent out of a site's revenues, particularly if a site was reliant on display ads. Add them together and a website which was not even algorithmically penalized could still see a 60%+ decline in revenues. Mix in a penalty and that decline can chop a zero or two off the total revenues.

Businesses with lower margins can try to offset declines with increased ad spending, but that only works if you are not in a market with 2 & 20 VC fueled competition:

Startups spend almost 40 cents of every VC dollar on Google, Facebook, and Amazon. We don’t necessarily know which channels they will choose or the particularities of how they will spend money on user acquisition, but we do know more or less what’s going to happen. Advertising spend in tech has become an arms race: fresh tactics go stale in months, and customer acquisition costs keep rising. In a world where only one company thinks this way, or where one business is executing at a level above everyone else - like Facebook in its time - this tactic is extremely effective. However, when everyone is acting this way, the industry collectively becomes an accelerating treadmill. Ad impressions and click-throughs get bid up to outrageous prices by startups flush with venture money, and prospective users demand more and more subsidized products to gain their initial attention. The dynamics we’ve entered is, in many ways, creating a dangerous, high stakes Ponzi scheme.

And sometimes the platform claws back a second or third bite of the apple. Amazon.com charges merchants for fulfillment, warehousing, transaction based fees, etc. And they've pushed hard into launching hundreds of private label brands which pollute the interface & force brands to buy ads even on their own branded keyword terms.

They've recently jumped the shark by adding a bonus feature where even when a brand paid Amazon to send traffic to their listing, Amazon would insert a spam popover offering a cheaper private label branded product:

"Amazon.com tested a pop-up feature on its app that in some instances pitched its private-label goods on rivals’ product pages, an experiment that shows the e-commerce giant’s aggressiveness in hawking lower-priced products including its own house brands. The recent experiment, conducted in Amazon’s mobile app, went a step further than the display ads that commonly appear within search results and product pages. This test pushed pop-up windows that took over much of a product page, forcing customers to either click through to the lower-cost Amazon products or dismiss them before continuing to shop. ... When a customer using Amazon’s mobile app searched for “AAA batteries,” for example, the first link was a sponsored listing from Energizer Holdings Inc. After clicking on the listing, a pop-up window appeared, offering less expensive AmazonBasics AAA batteries."

Buying those Amazon ads was quite literally subsidizing a direct competitor pushing you into irrelevance.

And while Amazon is destroying brand equity, AWS is doing investor relations matchmaking for startups. Anything to keep the current bubble going ahead of the Uber IPO that will likely mark the top in the stock market.

As the market caps of big tech companies climb they need to be more predatory to grow into the valuations & retain employees with stock options at an ever-increasing strike price.

They've created bubbles in their own backyards where each raise requires another. Teachers either drive hours to work or live in houses subsidized by loans from the tech monopolies that get a piece of the upside (provided they can keep their own bubbles inflated).

"It is an uncommon arrangement — employer as landlord — that is starting to catch on elsewhere as school employees say they cannot afford to live comfortably in regions awash in tech dollars. ... Holly Gonzalez, 34, a kindergarten teacher in East San Jose, and her husband, Daniel, a school district I.T. specialist, were able to buy a three-bedroom apartment for $610,000 this summer with help from their parents and from Landed. When they sell the home, they will owe Landed 25 percent of any gain in its value. The company is financed partly by the Chan Zuckerberg Initiative, Mark Zuckerberg’s charitable arm."

The above sort of dynamics have some claiming peak California:

The cycle further benefits from the Alchian-Allen effect: agglomerating industries have higher productivity, which raises the cost of living and prices out other industries, raising concentration over time. ... Since startups raise the variance within whatever industry they’re started in, the natural constituency for them is someone who doesn’t have capital deployed in the industry. If you’re an asset owner, you want low volatility. ... Historically, startups have created a constant supply of volatility for tech companies; the next generation is always cannibalizing the previous one. So chip companies in the 1970s created the PC companies of the 80s, but PC companies sourced cheaper and cheaper chips, commoditizing the product until Intel managed to fight back. Meanwhile, the OS turned PCs into a commodity, then search engines and social media turned the OS into a commodity, and presumably this process will continue indefinitely. ... As long as higher rents raise the cost of starting a pre-revenue company, fewer people will join them, so more people will join established companies, where they’ll earn market salaries and continue to push up rents. And one of the things they’ll do there is optimize ad loads, which places another tax on startups. More dangerously, this is an incremental tax on growth rather than a fixed tax on headcount, so it puts pressure on out-year valuations, not just upfront cash flow.

If you live hundreds of miles away the tech companies may have no impact on your rental or purchase price, but you can't really control the algorithms or the ecosystem.

All you can really control is your mindset & ensuring you have optionality baked into your business model.

  • If you are debt-levered you have little to no optionality. Savings give you optionality. Savings allow you to run at a loss for a period of time while also investing in improving your site and perhaps having a few other sites in other markets.
  • If you operate a single website that is heavily reliant on a third party for distribution then you have little to no optionality. If you have multiple projects, that enables you to shift your attention toward working on whatever is going up and to the right while letting anything that is failing pass time, without becoming overly reliant on something you can't change. This is why it often makes sense for a brand merchant to operate their own ecommerce website even if 90% of their sales come from Amazon. It gives you optionality should the tech monopoly become abusive or otherwise harm you (even if the intent was benign rather than outright misanthropic).

As the update ensues Google will collect more data on how users interact with the result set & determine how to weight different signals, along with re-scoring sites that recovered based on the new engagement data.

Recently a Bing engineer named Frédéric Dubut described how they score relevancy signals used in updates:

As early as 2005, we used neural networks to power our search engine and you can still find rare pictures of Satya Nadella, VP of Search and Advertising at the time, showcasing our web ranking advances. ... The “training” process of a machine learning model is generally iterative (and all automated). At each step, the model is tweaking the weight of each feature in the direction where it expects to decrease the error the most. After each step, the algorithm remeasures the rating of all the SERPs (based on the known URL/query pair ratings) to evaluate how it’s doing. Rinse and repeat.
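As a rough illustration of that loop, here is a minimal sketch in Python (the data, feature names & the squared-error stand-in for human SERP ratings are all invented): at each step the weights move in the direction that most decreases error, then the model is re-scored.

```python
# Minimal sketch of iterative feature-weight training (hypothetical data).
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((500, 3))                   # features per query/URL pair (e.g. links, clicks, freshness)
true_w = np.array([0.6, 0.3, 0.1])         # hidden "ideal" weights for the toy example
y = X @ true_w + rng.normal(0, 0.05, 500)  # stand-in for human relevance ratings

w, lr = np.zeros(3), 0.1
for _ in range(200):
    pred = X @ w
    grad = 2 * X.T @ (pred - y) / len(y)   # direction that decreases squared error most
    w -= lr * grad                         # tweak each feature weight; rinse & repeat
print(w.round(2))                          # ≈ [0.6 0.3 0.1]
```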

That same process is ongoing with Google now & in the coming weeks there'll be the next phase of the current update.

So far it looks like some quality-based re-scoring was done & some sites which were overly reliant on anchor text got clipped. On the back end of the update there'll be another quality-based re-scoring, but the sites that were hit for excessive manipulation of anchor text via link building efforts will likely remain penalized for a good chunk of time.

Update: It appears a major reverberation of this update occurred on April 7th. From early analysis, Google is mixing in showing results for related midtail concepts on a core industry search term & they are also in some cases pushing more aggressively on doing internal site-level searches to rank a more relevant internal page for a query where the homepage might have ranked in the past.





Keyword Not Provided, But it Just Clicks

When SEO Was Easy

When I got started on the web over 15 years ago I created an overly broad & shallow website that had little chance of making money because it was utterly undifferentiated and crappy. In spite of my best (worst?) efforts while being a complete newbie, sometimes I would go to the mailbox and see a check for a couple hundred or a couple thousand dollars come in. My old roommate & I went to Coachella & when the trip was over I returned to a bunch of mail to catch up on & realized I had made way more while not working than what I spent on that trip.

What was the secret to a total newbie making decent income by accident?

Horrible spelling.

Back then search engines were not as sophisticated with their spelling correction features & I was one of 3 or 4 people in the search index that misspelled the name of an online casino the same way many searchers did.

The high minded excuse for why I did not scale that would be claiming I knew it was a temporary trick that was somehow beneath me. The more accurate reason would be thinking in part it was a lucky fluke rather than thinking in systems. If I were clever at the time I would have created the misspeller's guide to online gambling, though I think I was just so excited to make anything from the web that I perhaps lacked the ambition & foresight to scale things back then.

In the decade that followed I had a number of other lucky breaks like that. One time one of the original internet bubble companies that managed to stay around put up a sitewide footer link targeting the concept that one of my sites made decent money from. This was just before the great recession, before Panda existed. The concept they targeted had 3 or 4 ways to describe it. 2 of them were very profitable & if they targeted either of the most profitable versions with that page the targeting would have sort of carried over to both. They would have outranked me if they targeted the correct version, but they didn't so their mistargeting was a huge win for me.

Search Gets Complex

Search today is much more complex. In the years since those easy-n-cheesy wins, Google has rolled out many updates which aim to feature sought after destination sites while diminishing the sites which rely on "one simple trick" to rank.

Arguably the quality of the search results has improved significantly as search has become more powerful, more feature rich & has layered in more relevancy signals.

Many quality small web publishers have gone away due to some combination of increased competition, algorithmic shifts & uncertainty, and reduced monetization as more ad spend was redirected toward Google & Facebook. But the impact as felt by any given publisher is not the impact as felt by the ecosystem as a whole. Many terrible websites have also gone away, while some formerly obscure though higher-quality sites rose to prominence.

There was the Vince update in 2009, which boosted the rankings of many branded websites.

Then in 2011 there was Panda as an extension of Vince, which tanked the rankings of many sites that published hundreds of thousands or millions of thin content pages while boosting the rankings of trusted branded destinations.

Then there was Penguin, which was a penalty that hit many websites which had heavily manipulated or otherwise aggressive appearing link profiles. Google felt there was a lot of noise in the link graph, which was their justification for Penguin.

There were updates which lowered the rankings of many exact match domains. And then increased ad load in the search results along with the other above ranking shifts further lowered the ability to rank keyword-driven domain names. If your domain is generically descriptive then there is a limit to how differentiated & memorable you can make it if you are targeting the core market the keywords are aligned with.

There is a reason eBay is more popular than auction.com, Google is more popular than search.com, Yahoo is more popular than portal.com & Amazon is more popular than a store.com or a shop.com. When that winner-take-most impact of many online markets is coupled with the move away from using classic relevancy signals, the economics shift to where it makes a lot more sense to carry the heavy overhead of establishing a strong brand.

Branded and navigational search queries could be used in the relevancy algorithm stack to confirm the quality of a site & verify (or dispute) the veracity of other signals.

Historically relevant algo shortcuts become less appealing as they become less relevant to the current ecosystem & even less aligned with the future trends of the market. Add in negative incentives for pushing on a string (penalties on top of wasting the capital outlay) and a more holistic approach certainly makes sense.

Modeling Web Users & Modeling Language

PageRank was an attempt to model the random surfer.
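For reference, a minimal power-iteration sketch of that random-surfer model (toy three-page graph; the 0.85 damping factor follows the original PageRank paper):

```python
# Random surfer: with probability 0.85 follow a random outlink, else jump anywhere.
import numpy as np

links = {0: [1, 2], 1: [2], 2: [0]}   # page -> pages it links to
n, d = 3, 0.85
M = np.zeros((n, n))
for src, outs in links.items():
    for dst in outs:
        M[dst, src] = 1 / len(outs)   # surfer picks an outlink uniformly

pr = np.ones(n) / n
for _ in range(100):
    pr = (1 - d) / n + d * M @ pr     # random jump + link following
print(pr.round(3))                    # steady-state visit probabilities
```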

When Google is pervasively monitoring most users across the web they can shift to directly measuring their behaviors instead of using indirect signals.

Years ago Bill Slawski wrote about the long click, in which he opened by quoting Steven Levy's In the Plex: How Google Thinks, Works, and Shapes our Lives:

"On the most basic level, Google could see how satisfied users were. To paraphrase Tolstoy, happy users were all the same. The best sign of their happiness was the "Long Click" — This occurred when someone went to a search result, ideally the top one, and did not return. That meant Google has successfully fulfilled the query."

Of course, there's a patent for that. In Modifying search result ranking based on implicit user feedback they state:

user reactions to particular search results or search result lists may be gauged, so that results on which users often click will receive a higher ranking. The general assumption under such an approach is that searching users are often the best judges of relevance, so that if they select a particular search result, it is likely to be relevant, or at least more relevant than the presented alternatives.

If you are a known brand you are more likely to get clicked on than a random unknown entity in the same market.

And if you are something people are specifically seeking out, they are likely to stay on your website for an extended period of time.

One aspect of the subject matter described in this specification can be embodied in a computer-implemented method that includes determining a measure of relevance for a document result within a context of a search query for which the document result is returned, the determining being based on a first number in relation to a second number, the first number corresponding to longer views of the document result, and the second number corresponding to at least shorter views of the document result; and outputting the measure of relevance to a ranking engine for ranking of search results, including the document result, for a new search corresponding to the search query. The first number can include a number of the longer views of the document result, the second number can include a total number of views of the document result, and the determining can include dividing the number of longer views by the total number of views.
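Worked through in code, that last sentence reduces to a simple ratio; the 60-second threshold for what counts as a "longer view" below is an assumption for illustration only.

```python
# Long-click relevance per the patent: longer views divided by total views.
def long_click_relevance(view_durations_sec, long_view_threshold=60):
    """view_durations_sec: how long each searcher stayed before returning to the results."""
    total_views = len(view_durations_sec)
    longer_views = sum(1 for d in view_durations_sec if d >= long_view_threshold)
    return longer_views / total_views if total_views else 0.0

print(long_click_relevance([240, 95, 30, 180]))  # 0.75 - a destination people dwell on
print(long_click_relevance([5, 12, 8, 70]))      # 0.25 - a page that earns quick bounces
```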

Attempts to manipulate such data may not work.

safeguards against spammers (users who generate fraudulent clicks in an attempt to boost certain search results) can be taken to help ensure that the user selection data is meaningful, even when very little data is available for a given (rare) query. These safeguards can include employing a user model that describes how a user should behave over time, and if a user doesn't conform to this model, their click data can be disregarded. The safeguards can be designed to accomplish two main objectives: (1) ensure democracy in the votes (e.g., one single vote per cookie and/or IP for a given query-URL pair), and (2) entirely remove the information coming from cookies or IP addresses that do not look natural in their browsing behavior (e.g., abnormal distribution of click positions, click durations, clicks_per_minute/hour/day, etc.). Suspicious clicks can be removed, and the click signals for queries that appear to be spammed need not be used (e.g., queries for which the clicks feature a distribution of user agents, cookie ages, etc. that do not look normal).
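A toy version of those two safeguards (field names & the abnormality threshold are invented) might look like:

```python
# (1) democracy in the votes: one vote per cookie for a given query/URL pair;
# (2) drop cookies whose behavior doesn't fit a natural user model, approximated
#     here by an absurd click count within the log window.
from collections import Counter

def count_click_votes(click_log, max_clicks_per_cookie=100):
    clicks_by_cookie = Counter(c["cookie"] for c in click_log)
    votes = set()
    for c in click_log:
        if clicks_by_cookie[c["cookie"]] > max_clicks_per_cookie:
            continue                                    # abnormal cookie: disregard its clicks
        votes.add((c["cookie"], c["query"], c["url"]))  # dedupe to a single vote
    return Counter((q, u) for _, q, u in votes)         # votes per query-URL pair
```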

And just like Google can make a matrix of documents & queries, they could also choose to put more weight on search accounts associated with topical expert users based on their historical click patterns.

Moreover, the weighting can be adjusted based on the determined type of the user both in terms of how click duration is translated into good clicks versus not-so-good clicks, and in terms of how much weight to give to the good clicks from a particular user group versus another user group. Some user's implicit feedback may be more valuable than other users due to the details of a user's review process. For example, a user that almost always clicks on the highest ranked result can have his good clicks assigned lower weights than a user who more often clicks results lower in the ranking first (since the second user is likely more discriminating in his assessment of what constitutes a good result). In addition, a user can be classified based on his or her query stream. Users that issue many queries on (or related to) a given topic T (e.g., queries related to law) can be presumed to have a high degree of expertise with respect to the given topic T, and their click data can be weighted accordingly for other queries by them on (or related to) the given topic T.
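In code, that sort of per-user weighting could be as simple as the following sketch (every multiplier here is invented for illustration):

```python
# Discount undiscriminating clickers; upweight users whose query history
# suggests topical expertise.
def click_weight(user, topic):
    w = 1.0
    if user["avg_click_position"] < 1.5:
        w *= 0.5       # almost always takes the top result: less discriminating
    if topic in user["frequent_query_topics"]:
        w *= 2.0       # presumed expertise on this topic
    return w

expert = {"avg_click_position": 3.2, "frequent_query_topics": {"law"}}
print(click_weight(expert, "law"))  # 2.0
```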

Google was using click data to drive their search rankings as far back as 2009. David Naylor was perhaps the first person who publicly spotted this. Google was ranking Australian websites for [tennis court hire] in the UK & Ireland, in part because that is where most of the click signal came from. That phrase was most widely searched for in Australia. In the years since Google has done a better job of geographically isolating clicks to prevent things like the problem David Naylor noticed, where almost all search results in one geographic region came from a different country.

Whenever SEOs mention using click data to search engineers, the search engineers quickly respond about how they might consider any signal but clicks would be a noisy signal. But if a signal has noise an engineer would work around the noise by finding ways to filter the noise out or combine multiple signals. To this day Google states they are still working to filter noise from the link graph: "We continued to protect the value of authoritative and relevant links as an important ranking signal for Search."

The site with millions of inbound links, few intentional visits & those who do visit quickly click the back button (due to a heavy ad load, poor user experience, low quality content, shallow content, outdated content, or some other bait-n-switch approach)...that's an outlier. Preventing those sorts of sites from ranking well would be another way of protecting the value of authoritative & relevant links.

Best Practices Vary Across Time & By Market + Category

Along the way, concurrent with the above sorts of updates, Google also improved their spelling auto-correct features, auto-completed search queries for many years through a feature called Google Instant (though they later undid forced query auto-completion while retaining automated search suggestions), and then they rolled out a few other algorithms that further allowed them to model language & user behavior.

Today it would be much harder to get paid above median wages explicitly for sucking at basic spelling or scaling some other individual shortcut to the moon, like pouring millions of low quality articles into a (formerly!) trusted domain.

Nearly a decade after Panda, eHow's rankings still haven't recovered.

Back when I got started with SEO the phrase Indian SEO company was associated with cut-rate work where people were buying exclusively based on price. Sort of like a "I got a $500 budget for link building, but can not under any circumstance invest more than $5 in any individual link." Part of how my wife met me was she hired a hack SEO from San Diego who outsourced all the work to India and marked the price up about 100-fold while claiming it was all done in the United States. He created reciprocal links pages that got her site penalized & it didn't rank until after she took her reciprocal links page down.

With that sort of behavior widespread (hack US firm teaching people working in an emerging market poor practices), it likely meant many SEO "best practices" which were learned in an emerging market (particularly where the web was also underdeveloped) would be more inclined to being spammy. Considering how far ahead many Western markets were on the early Internet & how India has so many languages & how most web usage in India is based on mobile devices where it is hard for users to create links, it only makes sense that Google would want to place more weight on end user data in such a market.

If you set your computer location to India, Bing's search box lists 9 different languages to choose from.

The above is not to state anything derogatory about any emerging market, but rather that various signals are stronger in some markets than others. And competition is stronger in some markets than others.

Search engines can only rank what exists.

"In a lot of Eastern European - but not just Eastern European markets - I think it is an issue for the majority of the [bream? muffled] countries, for the Arabic-speaking world, there just isn't enough content as compared to the percentage of the Internet population that those regions represent. I don't have up to date data, I know that a couple years ago we looked at Arabic for example and then the disparity was enormous. so if I'm not mistaken the Arabic speaking population of the world is maybe 5 to 6%, maybe more, correct me if I am wrong. But very definitely the amount of Arabic content in our index is several orders below that. So that means we do not have enough Arabic content to give to our Arabic users even if we wanted to. And you can exploit that amazingly easily and if you create a bit of content in Arabic, whatever it looks like we're gonna go you know we don't have anything else to serve this and it ends up being horrible. and people will say you know this works. I keyword stuffed the hell out of this page, bought some links, and there it is number one. There is nothing else to show, so yeah you're number one. the moment somebody actually goes out and creates high quality content that's there for the long haul, you'll be out and that there will be one." - Andrey Lipattsev – Search Quality Senior Strategist at Google Ireland, on Mar 23, 2016

Impacting the Economics of Publishing

Now search engines can certainly influence the economics of various types of media. At one point some otherwise credible media outlets were pitching the Demand Media IPO narrative that Demand Media was the publisher of the future & what other media outlets will look like. Years later, after heavily squeezing on the partner network & promoting programmatic advertising that reduces CPMs by the day, Google is funding partnerships with multiple news publishers like McClatchy & Gatehouse to try to revive the news dead zones even Facebook is struggling with.

"Facebook Inc. has been looking to boost its local-news offerings since a 2017 survey showed most of its users were clamoring for more. It has run into a problem: There simply isn’t enough local news in vast swaths of the country. ... more than one in five newspapers have closed in the past decade and a half, leaving half the counties in the nation with just one newspaper, and 200 counties with no newspaper at all."

As mainstream newspapers continue laying off journalists, Facebook's news efforts are likely to continue failing unless they include direct economic incentives, as Google's programmatic ad push broke the banner ad:

"Thanks to the convoluted machinery of Internet advertising, the advertising world went from being about content publishers and advertising context—The Times unilaterally declaring, via its ‘rate card’, that ads in the Times Style section cost $30 per thousand impressions—to the users themselves and the data that targets them—Zappo’s saying it wants to show this specific shoe ad to this specific user (or type of user), regardless of publisher context. Flipping the script from a historically publisher-controlled mediascape to an advertiser (and advertiser intermediary) controlled one was really Google’s doing. Facebook merely rode the now-cresting wave, borrowing outside media’s content via its own users’ sharing, while undermining media’s ability to monetize via Facebook’s own user-data-centric advertising machinery. Conventional media lost both distribution and monetization at once, a mortal blow."

Google is offering news publishers audience development & business development tools.

Heavy Investment in Emerging Markets Quickly Evolves the Markets

As the web grows rapidly in India, they'll have a thousand flowers bloom. In 5 years the competition in India & other emerging markets will be much tougher as those markets continue to grow rapidly. Media is much cheaper to produce in India than it is in the United States. Labor costs are lower & they never had the economic albatross that is the ACA adversely impacting their economy. At some point the level of investment & increased competition will mean early techniques stop having as much efficacy. Chinese companies are aggressively investing in India.

“If you break India into a pyramid, the top 100 million (urban) consumers who think and behave more like Americans are well-served,” says Amit Jangir, who leads India investments at 01VC, a Chinese venture capital firm based in Shanghai. The early stage venture firm has invested in micro-lending firms FlashCash and SmartCoin based in India. The new target is the next 200 million to 600 million consumers, who do not have a go-to entertainment, payment or ecommerce platform yet— and there is gonna be a unicorn in each of these verticals, says Jangir, adding that it will be not be as easy for a player to win this market considering the diversity and low ticket sizes.

RankBrain

RankBrain appears to be based on using user clickpaths on head keywords to help bleed rankings across into related searches which are searched less frequently. A Googler didn't state this specifically, but it is how they would be able to use models of searcher behavior to refine search results for keywords which are rarely searched for.
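Purely as speculation in line with the above (Google has not published the mechanism), the rank-bleeding idea could be sketched as reusing click-validated rankings from the nearest head query, matched here by naive token overlap:

```python
# Hypothetical sketch: serve a rare tail query with the rankings click data
# has already validated for the most similar head query.
def head_query_rankings(tail_query, head_click_rankings):
    """head_click_rankings: {head query: [urls ranked by click data]}."""
    tail = set(tail_query.split())
    best = max(head_click_rankings, key=lambda h: len(tail & set(h.split())))
    return head_click_rankings[best]

head_click_rankings = {
    "best credit card": ["site-a.example", "site-b.example"],
    "cheap flights": ["site-c.example"],
}
print(head_query_rankings("best travel credit card", head_click_rankings))
```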

In a recent interview in Scientific American a Google engineer stated: "By design, search engines have learned to associate short queries with the targets of those searches by tracking pages that are visited as a result of the query, making the results returned both faster and more accurate than they otherwise would have been."

Now a person might go out and try to search for something a bunch of times or pay other people to search for a topic and click a specific listing, but some of the related Google patents on using click data (which keep getting updated) mentioned how they can discount or turn off the signal if there is an unnatural spike of traffic on a specific keyword, or if there is an unnatural spike of traffic heading to a particular website or web page.

And, since Google is tracking the behavior of end users on their own website, anomalous behavior is easier to track than it is tracking something across the broader web where signals are more indirect. Google can take advantage of their wide distribution of Chrome & Android where users are regularly logged into Google & pervasively tracked to place more weight on users where they had credit card data, a long account history with regular normal search behavior, heavy Gmail users, etc.

Plus there is a huge gap between the cost of traffic & the ability to monetize it. You might have to pay someone a dime or a quarter to search for something & there is no guarantee it will work on a sustainable basis even if you paid hundreds or thousands of people to do it. Any of those experimental searchers will have no lasting value unless they influence rank, but even if they do influence rankings it might only last temporarily. If you bought a bunch of traffic into something genuine Google searchers didn't like then even if it started to rank better temporarily the rankings would quickly fall back if the real end user searchers disliked the site relative to other sites which already rank.

This is part of the reason why so many SEO blogs mention brand, brand, brand. If people are specifically looking for you in volume & Google can see that thousands or millions of people specifically want to access your site then that can impact how you rank elsewhere.

Even looking at something inside the search results for a while (dwell time) or quickly skipping over it to have a deeper scroll depth can be a ranking signal. Some Google patents mention how they can use mouse pointer location on desktop or scroll data from the viewport on mobile devices as a quality signal.

Neural Matching

Last year Danny Sullivan mentioned how Google rolled out neural matching to better understand the intent behind a search query.

Danny Sullivan's tweets captured what the neural matching technology intends to do. Google also stated:

we’ve now reached the point where neural networks can help us take a major leap forward from understanding words to understanding concepts. Neural embeddings, an approach developed in the field of neural networks, allow us to transform words to fuzzier representations of the underlying concepts, and then match the concepts in the query with the concepts in the document. We call this technique neural matching.
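A bare-bones illustration of that quote (the three-dimensional embeddings below are made up; production systems use far larger vectors): pool word embeddings into fuzzy concept vectors & match query to document by cosine similarity, echoing the oft-cited "soap opera effect" example.

```python
# Concept matching: the query & document share zero words yet match strongly.
import numpy as np

emb = {  # hypothetical 3-d word embeddings
    "why": np.array([0.1, 0.0, 0.2]), "does": np.array([0.0, 0.1, 0.1]),
    "my": np.array([0.1, 0.1, 0.0]), "tv": np.array([0.9, 0.2, 0.1]),
    "look": np.array([0.2, 0.8, 0.1]), "strange": np.array([0.1, 0.9, 0.3]),
    "soap": np.array([0.2, 0.85, 0.25]), "opera": np.array([0.15, 0.8, 0.2]),
    "effect": np.array([0.1, 0.75, 0.3]),
}

def concept(words):
    v = np.mean([emb[w] for w in words if w in emb], axis=0)
    return v / np.linalg.norm(v)   # unit-length "concept" vector

query = concept("why does my tv look strange".split())
doc = concept("soap opera effect".split())
print(float(query @ doc))          # ≈ 0.93 despite no shared words
```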

To help people understand the difference between neural matching & RankBrain, Google told SEL: "RankBrain helps Google better relate pages to concepts. Neural matching helps Google better relate words to searches."

There are a couple research papers on neural matching.

The first one was titled A Deep Relevance Matching Model for Ad-hoc Retrieval. It mentioned using Word2vec & here are a few quotes from the research paper:

  • "Successful relevance matching requires proper handling of the exact matching signals, query term importance, and diverse matching requirements."
  • "the interaction-focused model, which first builds local level interactions (i.e., local matching signals) between two pieces of text, and then uses deep neural networks to learn hierarchical interaction patterns for matching."
  • "according to the diverse matching requirement, relevance matching is not position related since it could happen in any position in a long document."
  • "Most NLP tasks concern semantic matching, i.e., identifying the semantic meaning and infer"ring the semantic relations between two pieces of text, while the ad-hoc retrieval task is mainly about relevance matching, i.e., identifying whether a document is relevant to a given query."
  • "Since the ad-hoc retrieval task is fundamentally a ranking problem, we employ a pairwise ranking loss such as hinge loss to train our deep relevance matching model."

The paper mentions how semantic matching falls down when compared against relevancy matching because:

  • semantic matching relies on similarity matching signals (some words or phrases with the same meaning might be semantically distant), compositional meanings (matching sentences more than meaning) & a global matching requirement (comparing things in their entirety instead of looking at the best matching part of a longer document); whereas,
  • relevance matching can put significant weight on exact matching signals (weighting an exact match higher than a near match), adjust weighting on query term importance (one word or phrase in a search query might have a far higher discrimination value & deserve far more weight than the next) & leverage diverse matching requirements (allowing relevancy matching to happen in any part of a longer document) - as sketched below
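A toy scorer mirroring those three properties (the windowing & weighting choices are my assumptions, not the paper's architecture):

```python
# Exact matches only, weighted by term importance (IDF), taken from whichever
# local window of a long document matches best (position-independent).
import math

def relevance_score(query_terms, doc_terms, doc_freq, n_docs, window=50):
    idf = {t: math.log(n_docs / (1 + doc_freq.get(t, 0))) for t in query_terms}
    best = 0.0
    for start in range(0, max(1, len(doc_terms)), window):
        chunk = set(doc_terms[start:start + window])
        score = sum(idf[t] for t in query_terms if t in chunk)  # exact-match signal
        best = max(best, score)   # the match may occur anywhere in the document
    return best
```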


And then the second research paper is Deep Relevancy Ranking Using Enhanced Document-Query Interactions:

"interaction-based models are less efficient, since one cannot index a document representation independently of the query. This is less important, though, when relevancy ranking methods rerank the top documents returned by a conventional IR engine, which is the scenario we consider here."

That same sort of re-ranking concept is being better understood across the industry. There are ranking signals that earn some base level ranking, and then results get re-ranked based on other factors like how well a result matches the user intent.


For those who hate the idea of reading research papers or patent applications, Martinibuster also wrote about the technology here. About the only part of his post I would debate is this one:

"Does this mean publishers should use more synonyms? Adding synonyms has always seemed to me to be a variation of keyword spamming. I have always considered it a naive suggestion. The purpose of Google understanding synonyms is simply to understand the context and meaning of a page. Communicating clearly and consistently is, in my opinion, more important than spamming a page with keywords and synonyms."

I think one should always consider user experience over other factors; however, a person could still use variations throughout the copy & pick up a bit more traffic without coming across as spammy. Danny Sullivan mentioned the super synonym concept was impacting 30% of search queries, so there are still a lot which may only be available to those who use a specific phrase on their page.

Martinibuster also wrote another blog post tying more research papers & patents to the above. You could probably spend a month reading all the related patents & research papers.

The above sort of language modeling & end user click feedback complement links-based ranking signals in a way that makes it much harder to luck one's way into any form of success by being a terrible speller or just bombing away at link manipulation without much concern toward any other aspect of the user experience or market you operate in.

Pre-penalized Shortcuts

Google was even issued a patent for predicting site quality based upon the N-grams used on the site & comparing those against the N-grams used on other established sites where quality has already been scored via other methods: "The phrase model can be used to predict a site quality score for a new site; in particular, this can be done in the absence of other information. The goal is to predict a score that is comparable to the baseline site quality scores of the previously-scored sites."
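A rough sketch of that idea (the Jaccard-similarity weighting is an invented stand-in for the patent's phrase model):

```python
# Predict a new site's quality from the baseline scores of already-scored
# sites whose n-gram profiles it most resembles.
def ngrams(text, n=3):
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def predict_site_quality(new_site_text, scored_sites):
    """scored_sites: list of (site_text, baseline_quality_score) pairs."""
    profile = ngrams(new_site_text)
    sims = []
    for text, score in scored_sites:
        other = ngrams(text)
        overlap = len(profile & other) / max(1, len(profile | other))  # Jaccard
        sims.append((overlap, score))
    total = sum(s for s, _ in sims) or 1.0
    return sum(s * q for s, q in sims) / total  # similarity-weighted average
```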

Have you considered using a PLR package to generate the shell of your site's content? Good luck with that, as some sites trying that shortcut might be pre-penalized from birth.

Navigating the Maze

When I started in SEO one of my friends had a dad who was vastly smarter than I am. He advised me that Google engineers were smarter, had more capital, had more exposure, had more data, etc etc etc ... and thus SEO was ultimately going to be a malinvestment.

Back then he was at least partially wrong because influencing search was so easy.

But in the current market, 16 years later, we are near the inflection point where he would finally be right.

At some point the shortcuts stop working & it makes sense to try a different approach.

The flip side of all the above changes is that as the algorithms have become more complex they have gone from being a headwind for people ignorant about SEO to being a tailwind for those who do not focus excessively on SEO in isolation.

If one is a dominant voice in a particular market, if they break industry news, if they have key exclusives, if they spot & name the industry trends, if their site becomes a must read & what amounts to a habit ... then they perhaps become viewed as an entity. Entity-related signals then help them, and the same signals that work against people who merely lucked into a bit of success become a tailwind rather than a headwind.

If your work defines your industry, then any efforts to model entities, user behavior or the language of your industry are going to boost your work on a relative basis.

This requires sites to publish frequently enough to be a habit, or publish highly differentiated content which is strong enough that it is worth the wait.

Those which publish frequently without being particularly differentiated are almost guaranteed to eventually walk into a penalty of some sort. And each additional person who reads marginal, undifferentiated content (particularly if it has an ad-heavy layout) brings that site one visitor closer to eventually getting whacked. Success becomes self-regulating. Any short-term success becomes self-defeating if one has a highly opportunistic short-term focus.

Those who write content that only they could write are more likely to have sustained success.





Revenue Quality & Leverage

The coronavirus issue is likely to linger for some time.

Up to 70% of Germany's population could become infected & some countries like the UK are even considering herd immunity as a strategy:

"I’m an epidemiologist. When I heard about Britain’s ‘herd immunity’ coronavirus plan, I thought it was satire"
- William Hanage

What if their models are broken?

Many companies like WeWork or Oyo have played fast and loose chasing growth, while slower-growing companies have been levering up to fund share buybacks. Airlines spent 96% of free cash flow on share buybacks; now the airlines seek a $50 billion bailout package.

There are knock-on effects from Boeing to TripAdvisor to Google all the way down to the travel affiliate blogger, local restaurants closing, the over-levered bus company going through bankruptcy & bondholders eating a loss on the debt.

Companies are going to let a lot of skeletons out of the closet as literally anything and everything bad gets attributed to coronavirus. Layoffs, renegotiating contracts, pausing ad budgets, renegotiating debts, requesting bailouts, etc. The Philippine stock market was recently trading at 2012 levels & then closed indefinitely.

Brad Geddes mentioned advertisers have been aggressively pulling PPC budgets over the past week: “If you have to leave the house to engage in the service, it just seems like it’s not converting right now.”

During the prior recession Google repriced employee options to retain talent.

In spite of consumers being glued to the news, tier one news publishers are anticipating large ad revenue declines:

Some of the largest advertisers, including Procter & Gamble, Unilever, Apple, Microsoft, Danone, AB InBev, Burberry and Aston Martin, made cuts to sales forecasts for the year. With the outlook for the spread of the virus changing by the day, many companies are caught in a spiral of uncertainty. That tends to gum up decisions, and ad spending is an easy expenditure to put on pause. The New York Times has warned that it expects advertising revenue to decline “in the mid-teens” in the current quarter as a result of coronavirus.

More time online might mean search engines & social networks capture a greater share of overall ad spend, but if large swaths of the economy do not convert & how people live changes for an extended period of time, it will take time for new categories to create the economic engines that replace the old, out-of-favor categories.

[IMPORTANT: insert affiliate ad for cruise vacations here]

As Google sees advertisers pause ad budgets, Google will get more aggressive about keeping users on their site & displacing organic click flows with additional ad clicks from the remaining advertisers.

When Google or Facebook see a 5% or 10% pullback other industry players might see a 30% to 50% decline as the industry pulls back broadly, focuses more resources on the core, and the big attention merchants offset their losses by clamping down on other players.

At its peak TripAdvisor was valued at about $14 billion & it is now valued at about $2 billion.

TripAdvisor announced layoffs. As did Expedia. As did Booking.com. As did many hotels. And airlines. etc. etc. etc.

I am not suggesting people should be fearful or dominated by negative emotions. Rather, one should live as though many others will be living that way.

In times of elevated uncertainty, in business it is best to not be led by emotions unless they are positive ones. Spend a bit more time playing if you can afford to & work more on things you love.

Right now we might be living through the flu pandemic of 1918 and the Great Depression of 1929 while having constant access to social media updates. And that's awful.

Consume less but deeper. Less Twitter, less news, fewer big decisions, read more books.

It is better to be more pragmatic & logic-based in determining opportunity cost & the best strategy to use than to be led by extreme fear.

  • If you have sustainable high-margin revenue treasure it.
  • If you have low-margin revenue it might quickly turn into negative margin revenues unless something changes quickly.
  • If you have low-margin revenue which is sustainable but has under-performed less stable high-margin revenue, you might want to put a bit more effort into those sorts of projects, as they are more likely to endure.

On a positive note, we might soon get a huge wave of innovation...

"Take the Great Depression. Economist Alexander Field writes that “the years 1929–1941 were, in the aggregate, the most technologically progressive of any comparable period in U.S. economic history.” Productivity growth was twice as fast in the 1930s as it was in the decade prior. The 1920s were the era of leisure because people could afford to relax. The 1930s were the era of frantic problem solving because people had no other choice. The Great Depression brought unimaginable financial pain. It also brought us supermarkets, microwaves, sunscreen, jets, rockets, electron microscopes, magnetic recording, nylon, photocopying, teflon, helicopters, color TV, plexiglass, commercial aviation, most forms of plastic, synthetic rubber, laundromats, and countless other discoveries."

The prior recession led to trends like Groupon. The McJobs recovery led to services like Uber & DoorDash. Food delivery has been trending south recently, though perhaps the stay-at-home economy will give it a boost.

I have been amazed at how fast affiliates moved with pushing N95 face masks online over the past couple months. Seeing how fast that stuff spun up really increases the perceived value of any sustainable high-margin businesses.

Amazon.com is hiring another 100,000 warehouse workers as people shop from home. Amazon banned new face masks and hand sanitizer listings. One guy had to donate around 18,000 cleaning products he couldn't sell.

I could see online education becoming far more popular as people aim to retrain while stuck at home.

What sorts of new industries will current & new technologies lead to as more people spend time working from home?





Increasing Time on Site

Changing User Intents

Google's search quality rater document highlights how the intent of searches can change over time for a specific keyword.

A generic search for [iPhone] is likely to be related to the most recent model. A search for [President Bush] was likely related to the 41st president until his son was elected, after which it most likely referred to the 43rd.

Faster Ranking Shifts

About 17 years ago, when Google was young, they did monthly updates where most of any pending ranking signal shifts got folded into the rankings at once. The web today changes much faster in terms of the rate of publishing, the amount of news consumption, increasing political polarization, social media channels that amplify outrage & how quickly any cultural snippet can be taken out of context.

Yesterday President Trump had some interesting stuff to say about bleach. In spite of there being an anime series by the same name, news coverage of the presser has driven great interest in the topic.

And that interest is already folded into the organic search results through Google News insertion, Twitter tweet insertion, and the query deserves freshness (QDF) algorithm driving insertion of news stories in other organic search ranking slots.

If a lot of people are searching for something and many trusted news organizations are publishing information about a topic then there is little risk in folding fresh information into the result set.
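
A hedged guess at how such a freshness trigger could be shaped (purely illustrative, not Google's actual QDF implementation): require both a clear spike over the query's historical baseline and corroboration from multiple trusted news sources.

    def deserves_freshness(current_qps, baseline_qps, trusted_news_count,
                           spike_ratio=3.0, min_sources=5):
        # Fold fresh news into the result set only when interest has
        # clearly spiked AND several trusted outlets cover the topic,
        # which keeps the risk of amplifying junk low.
        spiking = current_qps >= spike_ratio * max(baseline_qps, 1e-9)
        corroborated = trusted_news_count >= min_sources
        return spiking and corroborated

    print(deserves_freshness(1200, 50, trusted_news_count=12))  # True
    print(deserves_freshness(60, 50, trusted_news_count=12))    # False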

Temporary Versus Permanent Change

When the intent of a keyword changes, sometimes the change is transitory & sometimes it is not.

One of the most common ad-driven business models online is to take something that was once paid, make it free, and then layer ads or some other premium features on top to monetize a different part of the value chain. TripAdvisor democratized hotel reviews. Zillow made foreclosure information easily accessible for free, etc.

The success of remote working & communication services like Skype, Zoom, Basecamp, Slack, Trello, and the ongoing remote work experiment the world is going through will permanently change some consumer behaviors & how businesses operate.

A Pew survey found that 43% of Americans said someone in their household recently lost a job, had their hours reduced and/or took a pay cut. Hundreds of thousands of people are applying to work in Amazon's grueling fulfillment centers.

To many of these people a lone wolf online job would be a dream come true.

If you had a two hour daily commute and were just as efficient working at home most days would you be in a rush to head back to the office?

How many former fulltime employees are going to become freelancers building their own small businesses they work on directly while augmenting it with platform work on other services like Uber, Lyft, DoorDash, Upwork, Fiverr, 99 Designs, or even influencer platforms like Intellifluence?

If big publishers are getting disintermediated by monopoly platforms & ad networks are offering crumbs of crumbs, there's no harm in selling custom ads directly, or in having your early publishing efforts subsidized through custom side deals as you build market awareness and invest in building other products and services to sell.

WordPress keeps adding more features. Many technology services like Shopify, Stripe & Twilio are making most parts of the tech stack outside of marketing cheaper & easier to scale.

Some universities are preparing for the fall semester being entirely online. As technology improves, we spend more time online, more activities happen online, and more work becomes remote. All this leads to the distinction between online and offline losing meaning other than perhaps in terms of cost structure & likelihood of bankruptcy.

Before Panda / After Panda


Before the Panda update, each additional page which was created was another lotto ticket and a chance to win. If users had a crappy experience on a page or site, maybe you didn't make the sale, but if the goal of the page was to make the content so crappy that the ads looked more appealing by comparison, that could lead to fantastic monetization while it lasted.

That strategy worked well for eHow, fueling the pump-n-dump Demand Media IPO.

Demand Media had to analyze eHow and pay to delete over a million articles which they deemed to have a negative economic value in the post-Panda world.

After the Panda update having many thin pages laying around and creating more thin pages was layering risk on top of risk. It made sense to shift to a smaller, tighter, deeper & more differentiated publishing model.

Entropy & Decay

The web goes through a constant state of reinvention.

Old YouTube Flash embeds break.

HTTP content calls in sites that were upgraded to HTTPS break.

Software which is not updated has security exploits.

If you have a large website and do not regularly update where you are linking to, your site is almost certainly linking to porn and malware sites somewhere.

As users shifted to mobile, websites that ignored mobile interfaces became relatively less appealing.

Changing web browser behaviors can break website logins and how data is shared across websites dependent on third party services.

Competition improves.

Algorithms change.

Ads eat a growing share of real estate on dominant platforms while organic reach slides.

Everything on the web is constantly dying as competition improves, technology changes and language gets redefined.
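
Much of that rot is cheap to detect. Here is a minimal outbound-link audit sketch (standard library only; a real audit would also need crawl politeness, retries & checks on where redirects land, since a dead domain re-registered by a spammer still returns 200):

    import urllib.request
    import urllib.error

    def find_broken_links(urls, timeout=10):
        broken = []
        for url in urls:
            try:
                req = urllib.request.Request(
                    url, method="HEAD",
                    headers={"User-Agent": "link-audit/0.1"})
                urllib.request.urlopen(req, timeout=timeout)
            except (urllib.error.URLError, ValueError, TimeoutError) as e:
                # URLError covers HTTP error statuses & dead hosts alike.
                broken.append((url, str(e)))
        return broken

    print(find_broken_links(["https://example.com/",
                             "https://this-domain-should-not-exist.example/"]))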

Staying Relevant

Even if a change in user intent is transitory, in some cases it can make sense to re-work a page to address a sudden surge of interest, improving time on site & user engagement metrics while making the content on your page more citation-worthy. If news writers are still chasing a trend, an in-depth background piece gives them something they may want to link at.

Since the COVID-19 implosion of the global economy began, I've seen two different clients get a sudden surge in traffic which would make little to no sense unless one considered currently spreading news stories.

News coverage creates interest in topics, shapes perspectives of topics, and creates demand for solutions.

If you read the right people on Twitter sometimes you can be days, weeks or even months ahead of the broader news narrative. Some people are great at spotting the second, third and fourth order effects of changes. You can spot stories bubbling up and participate in the trends.

An Accelerating Rate of Change

When the web was slower & easier you could find an affiliate niche and succeed in it, sometimes for years, before solid competition arrived. One of the things that floored me most this year from a marketing perspective was how quickly spammers ramped up a full court press amplifying the fear the news media was pitching. I get something like a hundred spam emails a day pitching face masks and other COVID-19 solutions, and probably see 50+ other daily ads from services like Outbrain & similar networks.

The web moves so much faster that the SEC is already taking COVID-19 related actions against dozens of companies. Google banned advertising protective masks and recently announced they are rolling out advertiser ID verification to increase transparency.

If Google is looking at their advertisers with a greater degree of suspicion even heading into an economic downturn, when Expedia is pulling $4 billion from their ad budget, Amazon is cutting back on their Google ad budget & Google has decided to freeze hiring, then it makes far more sense to keep reinvesting in improving any page which is getting a solid stream of organic search traffic.

Company Town

After Amazon cut their Google ad budget in March Google decided to expand Google Shopping to include free listings. When any of the platforms is losing badly they can afford to subsidize that area and operate it at a loss to try to gain marketshare while making the dominant player in that category look more extreme.

When a player is dominant in a category they can squeeze down on partners. Amazon once again cut affiliate payouts and the Wall Street Journal published an article citing 20 current and former Amazon insiders who stated Amazon uses third party merchant sales data to determine which products to clone:

Amazon employees accessed documents and data about a bestselling car-trunk organizer sold by a third-party vendor. The information included total sales, how much the vendor paid Amazon for marketing and shipping, and how much Amazon made on each sale. Amazon’s private-label arm later introduced its own car-trunk organizers. ... Amazon’s private-label business encompasses more than 45 brands with some 243,000 products, from AmazonBasics batteries to Stone & Beam furniture. Amazon says those brands account for 1% of its $158 billion in annual retail sales, not counting Amazon’s devices such as its Echo speakers, Kindle e-readers and Ring doorbell cameras.

Amazon does not even need to sell their private label products to shift their economics. As Amazon clones products they force the branded ad buy for a company to show up for their own branded terms, taking another bite out of the partner: "Fortem spends as much as $60,000 a month on Amazon advertisements for its items to come up at the top of searches, said Mr. Maslakou."

Amazon has grown so dominant they've not only cut their affiliate payouts & search advertising while hiring hundreds of thousands of employees, but they've also dramatically slowed down shipping times while pulling back on their on-site "people also purchase" promotions to get users to order less.

While they are growing stronger, department stores and other legacy retailers are careening toward bankruptcy.

Multiple Ways to Improve

If you have a page which is ranking that gets a sudden spike in traffic, it makes a lot of sense to look at current news & ask whether the intent of the searcher has changed. If it has, address it as best you can in the most relevant way possible, even if the change is temporary, then consider switching back to the old version of the page or reorganizing your content if/when/as the trend passes.

One of the pages mentioned above was a pre-Panda "me too" type page which was suddenly flooded with thousands of visitors. A quality inbound link can easily cost $100 to multiples of that. If a page is already getting thousands of visitors, why not invest a couple hundred dollars into dramatically improving it, knowing that some of those drive-by users will likely eventually share it? Make the page an in-depth guide with great graphics and some of those tens of thousands of visitors will eventually link to it, as they were already interested in the topic, the page already gets a great stream of traffic, and the content quality is solid.

Last week a client had a big spike from a news topic that changed the intent of a keyword. Their time on site for those visitors was under a minute. After the page was re-created to reflect the changing consumer intent, time on site jumped to over 3 minutes for users entering on that page. Those users had a far lower bounce rate, a far better user experience, are more likely to trust the site enough to seek it out again, and this sends a signal to Google that the site is still maintained & relevant to the modern search market.

There are many ways to chase the traffic stream:

  • create new content on new pages
  • gut the old page & publish entirely new content
  • re-arrange the old page while publishing new relevant breaking news at the top

In general I think the third option is often the best approach because you are aligning the page which already sees the traffic stream with the content they are looking for, while also ensuring any users from the prior intent can still access what they are looking for.

If the trend is huge, or the change in intent is permanent then you could also move the old content to a legacy URL archived page while making the high-traffic page focus on the spiking news topic.

The above advice applies to pages which rank for keywords that change in intent, but it can also apply to any web page which has a strong flow of user traffic. Keep improving the things people see most because improvements there have the biggest returns. How can you make a page deeper, better, more differentiated from the rest of the web?

Does Usage Data Matter?

Objectively, if people visit your website and do not find what they were looking for, they are going to click the back button and be done with you.

Outdated content that has become irrelevant due to changing user tastes is only marginally better than outright spam.

While Google suggests they largely do not use bounce rate or user data in their rankings, they have also claimed end user data was the best way they could determine if the user was satisfied with a particular search result. Five years ago Bill Slawski wrote a blog post about long clicks which quoted Steven Levy's In The Plex book:

"On the most basic level, Google could see how satisfied users were. To paraphrase Tolstoy, happy users were all the same. The best sign of their happiness was the "Long Click" — This occurred when someone went to a search result, ideally the top one, and did not return. That meant Google has successfully fulfilled the query."

Think of how many people use the Chrome web browser or have Android tracking devices on them all hours of the day. There is no way Google would be able to track those billions of users every single day without finding a whole lot of signal in the noise.
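
As a back-of-the-envelope illustration of the long click concept (my own sketch of the general idea in the book quote, not any documented Google pipeline): classify each click by dwell time & whether the user bounced back to the results page, then aggregate per result.

    from collections import defaultdict

    LONG_CLICK_SECONDS = 30  # assumed threshold; a real system would tune this

    def long_click_rates(click_log):
        # click_log: iterable of (result_url, dwell_seconds, returned_to_serp)
        stats = defaultdict(lambda: [0, 0])  # url -> [long clicks, total clicks]
        for url, dwell, returned_to_serp in click_log:
            stats[url][1] += 1
            if dwell >= LONG_CLICK_SECONDS and not returned_to_serp:
                stats[url][0] += 1  # the searcher looked satisfied
        return {url: long / total for url, (long, total) in stats.items()}

    log = [("a.com/guide", 240, False), ("a.com/guide", 6, True),
           ("b.com/thin-page", 4, True)]
    print(long_click_rates(log))  # a.com/guide: 0.5, b.com/thin-page: 0.0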





Here are some activities to do this weekend even while staying at home


As we continue to quarantine under Gov. Jay Inslee's "stay at home" order, there are still lots of fun activities you can do this weekend. So, stay in, read a book, start a movie marathon and order some takeout.