Google Florida 2.0 Algorithm Update: Early Observations

www.seobook.com | March 18, 2019

It has been a while since Google has had a major algorithm update. They recently announced one which began on the 12th of March.

"This week, we released a broad core algorithm update, as we do several times per year. Our guidance about such updates remains as we've covered before. Please see these tweets for more about that: https://t.co/uPlEdSLHoX https://t.co/tmfQkhdjPL" — Google SearchLiaison (@searchliaison), March 13, 2019

What changed? It appears multiple things did.

When Google rolled out the original version of Penguin on April 24, 2012 (primarily focused on link spam) they also rolled out an update to an on-page spam classifier for misdirection. And, over time, it was quite common for Panda & Penguin updates to be sandwiched together.

If you were Google & had the ability to look under the hood to see why things changed, you would probably want to obfuscate any major update by changing multiple things at once to make reverse engineering the change much harder.

Anyone who operates a single website (& lacks the ability to look under the hood) will have almost no clue about what changed or how to adjust with the algorithms.

In the most recent algorithm update some sites which were penalized in prior "quality" updates have recovered. Though many of those recoveries are only partial.

Many SEO blogs will publish articles about how they cracked the code on the latest update by publishing charts like the first one without publishing that second chart showing the broader context.

The first penalty any website receives might be the first of a series of penalties. If Google smokes your site & it does not cause a PR incident & nobody really cares that you are gone, then there is a very good chance things will go from bad to worse to worser to worsterest, technically speaking.

"In this age, in this country, public sentiment is everything. With it, nothing can fail; against it, nothing can succeed. Whoever molds public sentiment goes deeper than he who enacts statutes, or pronounces judicial decisions." - Abraham Lincoln

Absent effort & investment to evolve FASTER than the broader web, sites which are hit with one penalty will often further accumulate other penalties. It is like compound interest working in reverse - a pile of algorithmic debt which must be dug out of before the bleeding stops.

Further, many recoveries may be nothing more than a fleeting invitation to false hope. To pour more resources into a site that is struggling in an apparent death loop.

The above site which had its first positive algorithmic response in a couple years achieved that in part by heavily de-monetizing. After the algorithm updates already demonetized the website over 90%, what harm was there in removing 90% of what remained to see how it would react?

So now it will get more traffic (at least for a while), but then what exactly is the traffic worth to a site that has no revenue engine tied to it?

That is ultimately the hard part. Obtaining a stable stream of traffic while monetizing at a decent yield, without the monetizing efforts leading to the traffic disappearing.

A buddy who owns the above site was working on link cleanup & content improvement on & off for about a half year with no results. Each month was a little worse than the prior month. It was only after I told him to remove the aggressive ads a few months back that he likely had any chance of seeing any sort of traffic recovery.
Now he at least has a pulse of traffic & can look into lighter touch means of monetization.

If a site is consistently penalized then the problem might not be an algorithmic false positive, but rather the business model of the site. The more a site looks like eHow, the more fickle Google's algorithms will be with it. Google does not like websites that sit at the end of the value chain & extract profits without having to bear far greater risk & expense earlier into the cycle.

Thin rewrites, largely speaking, don't add value to the ecosystem. Doorway pages don't either. And something that was propped up by a bunch of keyword-rich low-quality links is (in most cases) probably genuinely lacking in some other aspect.

Generally speaking, Google would like themselves to be the entity at the end of the value chain extracting excess profits from markets.

"RIP Quora!!! Q&A On Google - Showing Questions That Need Answers In Search https://t.co/mejXUDwGhT pic.twitter.com/8Cv1iKjDh2" — John Shehata (@JShehata), March 18, 2019

This is the purpose of the knowledge graph & featured snippets: to allow the results to answer the most basic queries without third party publishers getting anything. The knowledge graph serves as a floating vertical that eats an increasing share of the value chain & forces publishers to move higher up the funnel & publish more differentiated content.

As Google adds features to the search results (flight price trends, a hotel booking service on the day AirBNB announced they acquired HotelTonight, ecommerce product purchase on Google, shoppable image ads just ahead of the Pinterest IPO, etc.) it forces other players in the value chain to consolidate (Expedia owns Orbitz, Travelocity, Hotwire & a bunch of other sites) or add greater value to remain a differentiated & sought after destination (travel review site TripAdvisor was crushed by the shift to mobile & the inability to monetize mobile traffic, so they eventually had to shift away from being exclusively a reviews site to offer event & hotel booking features to remain relevant).

It is never easy changing a successful & profitable business model, but it is even harder to intentionally reduce revenues further or spend aggressively to improve quality AFTER income has fallen 50% or more. Some people do the opposite & make up for a revenue shortfall by publishing more lower end content at an ever faster rate and/or increasing ad load. Either of those typically makes their user engagement metrics worse while making their site less differentiated & more likely to receive additional bonus penalties to drive traffic even lower.

In some ways I think the ability for a site to survive & remain through a penalty is itself a quality signal for Google. Some sites which are overly reliant on search & have no external sources of traffic are ultimately sites which tried to behave too similarly to the monopoly that ultimately displaced them.

And over time the tech monopolies are growing more powerful as the ecosystem around them burns down:

"If you had to choose a date for when the internet died, it would be in the year 2014. Before then, traffic to websites came from many sources, and the web was a lively ecosystem. But beginning in 2014, more than half of all traffic began coming from just two sources: Facebook and Google. Today, over 70 percent of traffic is dominated by those two platforms."
Businesses which have sustainable profit margins & slack (in terms of management time & resources to deploy) can better cope with algorithmic changes & change with the market.

Over the past half decade or so there have been multiple changes that drastically shifted the online publishing landscape:

- the shift to mobile, which both offers publishers lower ad yields while making the central ad networks more ad heavy in a way that reduces traffic to third party sites
- the rise of the knowledge graph & featured snippets, which often mean publishers remain uncompensated for their work
- higher ad loads, which also lower organic reach (on both search & social channels)
- the rise of programmatic advertising, which further gutted display ad CPMs
- the rise of ad blockers
- increasing algorithmic uncertainty & a higher barrier to entry

Each one of the above could take a double digit percent out of a site's revenues, particularly if a site was reliant on display ads. Add them together and a website which was not even algorithmically penalized could still see a 60%+ decline in revenues. Mix in a penalty and that decline can chop a zero or two off the total revenues.

Businesses with lower margins can try to offset declines with increased ad spending, but that only works if you are not in a market with 2 & 20 VC fueled competition:

"Startups spend almost 40 cents of every VC dollar on Google, Facebook, and Amazon. We don't necessarily know which channels they will choose or the particularities of how they will spend money on user acquisition, but we do know more or less what's going to happen. Advertising spend in tech has become an arms race: fresh tactics go stale in months, and customer acquisition costs keep rising. In a world where only one company thinks this way, or where one business is executing at a level above everyone else - like Facebook in its time - this tactic is extremely effective. However, when everyone is acting this way, the industry collectively becomes an accelerating treadmill. Ad impressions and click-throughs get bid up to outrageous prices by startups flush with venture money, and prospective users demand more and more subsidized products to gain their initial attention. The dynamics we've entered is, in many ways, creating a dangerous, high stakes Ponzi scheme."

And sometimes the platform claws back a second or third bite of the apple. Amazon.com charges merchants for fulfillment, warehousing, transaction based fees, etc. And they've pushed hard into launching hundreds of private label brands which pollute the interface & force brands to buy ads even on their own branded keyword terms.

They've recently jumped the shark by adding a bonus feature where even when a brand paid Amazon to send traffic to their listing, Amazon would insert a spam popover offering a cheaper private label branded product:

"Amazon.com tested a pop-up feature on its app that in some instances pitched its private-label goods on rivals' product pages, an experiment that shows the e-commerce giant's aggressiveness in hawking lower-priced products including its own house brands. The recent experiment, conducted in Amazon's mobile app, went a step further than the display ads that commonly appear within search results and product pages. This test pushed pop-up windows that took over much of a product page, forcing customers to either click through to the lower-cost Amazon products or dismiss them before continuing to shop. ...
When a customer using Amazon's mobile app searched for "AAA batteries," for example, the first link was a sponsored listing from Energizer Holdings Inc. After clicking on the listing, a pop-up window appeared, offering less expensive AmazonBasics AAA batteries."

Buying those Amazon ads was quite literally subsidizing a direct competitor pushing you into irrelevance.

And while Amazon is destroying brand equity, AWS is doing investor relations matchmaking for startups. Anything to keep the current bubble going ahead of the Uber IPO that will likely mark the top in the stock market.

"Some thoughts on Silicon Valley's endgame. We have long said the biggest risk to the bull market is an Uber IPO. That is now upon us." — Jawad Mian (@jsmian), March 16, 2019

As the market caps of big tech companies climb they need to be more predatory to grow into the valuations & retain employees with stock options at an ever-increasing strike price.

They've created bubbles in their own backyards where each raise requires another. Teachers either drive hours to work or live in houses subsidized by loans from the tech monopolies that get a piece of the upside (provided they can keep their own bubbles inflated).

"It is an uncommon arrangement — employer as landlord — that is starting to catch on elsewhere as school employees say they cannot afford to live comfortably in regions awash in tech dollars. ... Holly Gonzalez, 34, a kindergarten teacher in East San Jose, and her husband, Daniel, a school district I.T. specialist, were able to buy a three-bedroom apartment for $610,000 this summer with help from their parents and from Landed. When they sell the home, they will owe Landed 25 percent of any gain in its value. The company is financed partly by the Chan Zuckerberg Initiative, Mark Zuckerberg's charitable arm."

The above sort of dynamics have some claiming peak California:

"The cycle further benefits from the Alchian-Allen effect: agglomerating industries have higher productivity, which raises the cost of living and prices out other industries, raising concentration over time. ... Since startups raise the variance within whatever industry they're started in, the natural constituency for them is someone who doesn't have capital deployed in the industry. If you're an asset owner, you want low volatility. ... Historically, startups have created a constant supply of volatility for tech companies; the next generation is always cannibalizing the previous one. So chip companies in the 1970s created the PC companies of the 80s, but PC companies sourced cheaper and cheaper chips, commoditizing the product until Intel managed to fight back. Meanwhile, the OS turned PCs into a commodity, then search engines and social media turned the OS into a commodity, and presumably this process will continue indefinitely. ... As long as higher rents raise the cost of starting a pre-revenue company, fewer people will join them, so more people will join established companies, where they'll earn market salaries and continue to push up rents. And one of the things they'll do there is optimize ad loads, which places another tax on startups. More dangerously, this is an incremental tax on growth rather than a fixed tax on headcount, so it puts pressure on out-year valuations, not just upfront cash flow."

If you live hundreds of miles away the tech companies may have no impact on your rental or purchase price, but you can't really control the algorithms or the ecosystem.
All you can really control is your mindset & ensuring you have optionality baked into your business model.

If you are debt-levered you have little to no optionality. Savings give you optionality. Savings allow you to run at a loss for a period of time while also investing in improving your site and perhaps having a few other sites in other markets.

If you operate a single website that is heavily reliant on a third party for distribution then you have little to no optionality. If you have multiple projects, that enables you to shift your attention toward working on whatever is going up and to the right while letting anything that is failing pass time, without becoming overly reliant on something you can't change.

This is why it often makes sense for a brand merchant to operate their own ecommerce website even if 90% of their sales come from Amazon. It gives you optionality should the tech monopoly become abusive or otherwise harm you (even if the intent was benign rather than outright misanthropic).

As the update ensues Google will collect more data on how users interact with the result set & determine how to weight different signals, along with re-scoring sites that recovered based on the new engagement data.

Recently a Bing engineer named Frédéric Dubut described how they score relevancy signals used in updates:

"As early as 2005, we used neural networks to power our search engine and you can still find rare pictures of Satya Nadella, VP of Search and Advertising at the time, showcasing our web ranking advances. ... The 'training' process of a machine learning model is generally iterative (and all automated). At each step, the model is tweaking the weight of each feature in the direction where it expects to decrease the error the most. After each step, the algorithm remeasures the rating of all the SERPs (based on the known URL/query pair ratings) to evaluate how it's doing. Rinse and repeat."

That same process is ongoing with Google now & in the coming weeks there'll be the next phase of the current update. So far it looks like some quality-based re-scoring was done & some sites which were overly reliant on anchor text got clipped. On the back end of the update there'll be another quality-based re-scoring, but the sites that were hit for excessive manipulation of anchor text via link building efforts will likely remain penalized for a good chunk of time.

Update: It appears a major reverberation of this update occurred on April 7th. From early analysis, Google is mixing in showing results for related midtail concepts on a core industry search term & they are also in some cases pushing more aggressively on doing internal site-level searches to rank a more relevant internal page for a query where the homepage might have ranked in the past.
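Circling back to the Dubut quote above: the iterative re-weighting he describes is, in essence, gradient descent over feature weights against labeled query/URL ratings. A minimal sketch of that loop, for illustration only; the feature names, learning rate and squared-error objective are my assumptions, not Bing's or Google's actual pipeline:

```python
import numpy as np

# Toy training data: each row is a (query, URL) pair described by ranking
# features, e.g. [link_authority, content_depth, click_satisfaction].
# Targets are human relevance ratings for those pairs (hypothetical values).
features = np.array([
    [0.9, 0.2, 0.4],
    [0.3, 0.8, 0.7],
    [0.1, 0.1, 0.2],
    [0.7, 0.6, 0.9],
])
ratings = np.array([0.6, 0.8, 0.1, 0.95])

weights = np.zeros(features.shape[1])  # start with no opinion on any feature
learning_rate = 0.1

for step in range(500):
    predicted = features @ weights            # current relevance scores
    error = predicted - ratings               # how far off the model is
    gradient = features.T @ error / len(ratings)
    weights -= learning_rate * gradient       # nudge each weight toward lower error

print("learned feature weights:", weights.round(3))
```

Each pass re-measures the predictions against the known ratings and nudges every feature weight in the direction that most reduces the error, which is the "rinse and repeat" loop the quote describes.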
Keyword Not Provided, But it Just Clicks

www.seobook.com | April 9, 2019

When SEO Was Easy

When I got started on the web over 15 years ago I created an overly broad & shallow website that had little chance of making money because it was utterly undifferentiated and crappy. In spite of my best (worst?) efforts while being a complete newbie, sometimes I would go to the mailbox and see a check for a couple hundred or a couple thousand dollars come in.

My old roommate & I went to Coachella & when the trip was over I returned to a bunch of mail to catch up on & realized I had made way more while not working than what I spent on that trip.

What was the secret to a total newbie making decent income by accident? Horrible spelling.

Back then search engines were not as sophisticated with their spelling correction features & I was one of 3 or 4 people in the search index that misspelled the name of an online casino the same way many searchers did.

The high minded excuse for why I did not scale that would be claiming I knew it was a temporary trick that was somehow beneath me. The more accurate reason would be thinking in part it was a lucky fluke rather than thinking in systems. If I were clever at the time I would have created the misspeller's guide to online gambling, though I think I was just so excited to make anything from the web that I perhaps lacked the ambition & foresight to scale things back then.

In the decade that followed I had a number of other lucky breaks like that. One time one of the original internet bubble companies that managed to stay around put up a sitewide footer link targeting the concept that one of my sites made decent money from. This was just before the great recession, before Panda existed. The concept they targeted had 3 or 4 ways to describe it. 2 of them were very profitable & if they targeted either of the most profitable versions with that page the targeting would have sort of carried over to both. They would have outranked me if they targeted the correct version, but they didn't, so their mistargeting was a huge win for me.

Search Gets Complex

Search today is much more complex. In the years since those easy-n-cheesy wins, Google has rolled out many updates which aim to feature sought after destination sites while diminishing the sites which rely on "one simple trick" to rank.

Arguably the quality of the search results has improved significantly as search has become more powerful, more feature rich & has layered in more relevancy signals.

Many quality small web publishers have gone away due to some combination of increased competition, algorithmic shifts & uncertainty, and reduced monetization as more ad spend was redirected toward Google & Facebook. But the impact as felt by any given publisher is not the impact as felt by the ecosystem as a whole. Many terrible websites have also gone away, while some formerly obscure though higher-quality sites rose to prominence.

There was the Vince update in 2009, which boosted the rankings of many branded websites.

Then in 2011 there was Panda as an extension of Vince, which tanked the rankings of many sites that published hundreds of thousands or millions of thin content pages while boosting the rankings of trusted branded destinations.

Then there was Penguin, which was a penalty that hit many websites which had heavily manipulated or otherwise aggressive appearing link profiles. Google felt there was a lot of noise in the link graph, which was their justification for the Penguin.
There were updates which lowered the rankings of many exact match domains. And then increased ad load in the search results, along with the other above ranking shifts, further lowered the ability to rank keyword-driven domain names. If your domain is generically descriptive then there is a limit to how differentiated & memorable you can make it if you are targeting the core market the keywords are aligned with.

There is a reason eBay is more popular than auction.com, Google is more popular than search.com, Yahoo is more popular than portal.com & Amazon is more popular than a store.com or a shop.com.

When that winner-take-most impact of many online markets is coupled with the move away from using classic relevancy signals, the economics shift to where it makes a lot more sense to carry the heavy overhead of establishing a strong brand.

Branded and navigational search queries could be used in the relevancy algorithm stack to confirm the quality of a site & verify (or dispute) the veracity of other signals.

Historically relevant algo shortcuts become less appealing as they become less relevant to the current ecosystem & even less aligned with the future trends of the market. Add in negative incentives for pushing on a string (penalties on top of wasting the capital outlay) and a more holistic approach certainly makes sense.

Modeling Web Users & Modeling Language

PageRank was an attempt to model the random surfer. When Google is pervasively monitoring most users across the web they can shift to directly measuring their behaviors instead of using indirect signals.

Years ago Bill Slawski wrote about the long click, in which he opened by quoting Steven Levy's In the Plex: How Google Thinks, Works, and Shapes our Lives:

"On the most basic level, Google could see how satisfied users were. To paraphrase Tolstoy, happy users were all the same. The best sign of their happiness was the 'Long Click' — This occurred when someone went to a search result, ideally the top one, and did not return. That meant Google has successfully fulfilled the query."

Of course, there's a patent for that. In Modifying search result ranking based on implicit user feedback they state:

"user reactions to particular search results or search result lists may be gauged, so that results on which users often click will receive a higher ranking. The general assumption under such an approach is that searching users are often the best judges of relevance, so that if they select a particular search result, it is likely to be relevant, or at least more relevant than the presented alternatives."

If you are a known brand you are more likely to get clicked on than a random unknown entity in the same market. And if you are something people are specifically seeking out, they are likely to stay on your website for an extended period of time.

"One aspect of the subject matter described in this specification can be embodied in a computer-implemented method that includes determining a measure of relevance for a document result within a context of a search query for which the document result is returned, the determining being based on a first number in relation to a second number, the first number corresponding to longer views of the document result, and the second number corresponding to at least shorter views of the document result; and outputting the measure of relevance to a ranking engine for ranking of search results, including the document result, for a new search corresponding to the search query."
"The first number can include a number of the longer views of the document result, the second number can include a total number of views of the document result, and the determining can include dividing the number of longer views by the total number of views."

Attempts to manipulate such data may not work:

"safeguards against spammers (users who generate fraudulent clicks in an attempt to boost certain search results) can be taken to help ensure that the user selection data is meaningful, even when very little data is available for a given (rare) query. These safeguards can include employing a user model that describes how a user should behave over time, and if a user doesn't conform to this model, their click data can be disregarded. The safeguards can be designed to accomplish two main objectives: (1) ensure democracy in the votes (e.g., one single vote per cookie and/or IP for a given query-URL pair), and (2) entirely remove the information coming from cookies or IP addresses that do not look natural in their browsing behavior (e.g., abnormal distribution of click positions, click durations, clicks_per_minute/hour/day, etc.). Suspicious clicks can be removed, and the click signals for queries that appear to be spammed need not be used (e.g., queries for which the clicks feature a distribution of user agents, cookie ages, etc. that do not look normal)."

And just like Google can make a matrix of documents & queries, they could also choose to put more weight on search accounts associated with topical expert users based on their historical click patterns:

"Moreover, the weighting can be adjusted based on the determined type of the user both in terms of how click duration is translated into good clicks versus not-so-good clicks, and in terms of how much weight to give to the good clicks from a particular user group versus another user group. Some user's implicit feedback may be more valuable than other users due to the details of a user's review process. For example, a user that almost always clicks on the highest ranked result can have his good clicks assigned lower weights than a user who more often clicks results lower in the ranking first (since the second user is likely more discriminating in his assessment of what constitutes a good result). In addition, a user can be classified based on his or her query stream. Users that issue many queries on (or related to) a given topic T (e.g., queries related to law) can be presumed to have a high degree of expertise with respect to the given topic T, and their click data can be weighted accordingly for other queries by them on (or related to) the given topic T."

Google was using click data to drive their search rankings as far back as 2009. David Naylor was perhaps the first person who publicly spotted this. Google was ranking Australian websites for [tennis court hire] in the UK & Ireland, in part because that is where most of the click signal came from. That phrase was most widely searched for in Australia. In the years since, Google has done a better job of geographically isolating clicks to prevent things like the problem David Naylor noticed, where almost all search results in one geographic region came from a different country.

Whenever SEOs mention using click data to search engineers, the search engineers quickly respond about how they might consider any signal but clicks would be a noisy signal. But if a signal has noise an engineer would work around the noise by finding ways to filter the noise out or combine multiple signals.
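Taken together, the patent language describes a fairly simple core computation: a long-click ratio per query/URL pair, computed only over clicks that survive spam filtering, optionally weighted by how discriminating the user appears to be. A minimal sketch of that idea follows; the thresholds, weights and field names are hypothetical illustrations, not values from the patent:

```python
from dataclasses import dataclass

@dataclass
class Click:
    user_id: str
    dwell_seconds: float      # time on the clicked result before returning
    suspicious: bool          # flagged by a (hypothetical) user-behavior model
    user_weight: float = 1.0  # e.g. more discriminating searchers could earn > 1.0

LONG_CLICK_SECONDS = 60.0     # hypothetical cutoff for a "longer view"

def relevance_measure(clicks: list[Click]) -> float:
    """Weighted share of long clicks among non-suspicious clicks, counting
    each user only once so no single cookie/IP stuffs the vote."""
    seen_users = set()
    long_views = total_views = 0.0
    for click in clicks:
        if click.suspicious or click.user_id in seen_users:
            continue                      # discard spam & duplicate votes
        seen_users.add(click.user_id)
        total_views += click.user_weight
        if click.dwell_seconds >= LONG_CLICK_SECONDS:
            long_views += click.user_weight
    return long_views / total_views if total_views else 0.0

clicks = [
    Click("a", 120, False, 1.5),   # discriminating user, long view
    Click("b", 5, False),          # quick bounce back to the results
    Click("c", 300, False),        # long view
    Click("bot", 999, True),       # flagged as suspicious, ignored
    Click("a", 200, False, 1.5),   # second vote from the same user, ignored
]
print(round(relevance_measure(clicks), 3))
```

The resulting measure would then be one signal among many handed to a ranking engine, per the patent's claim about "outputting the measure of relevance to a ranking engine."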
To this day Google states they are still working to filter noise from the link graph: "We continued to protect the value of authoritative and relevant links as an important ranking signal for Search."

The site with millions of inbound links, few intentional visits & those who do visit quickly click the back button (due to a heavy ad load, poor user experience, low quality content, shallow content, outdated content, or some other bait-n-switch approach)... that's an outlier. Preventing those sorts of sites from ranking well would be another way of protecting the value of authoritative & relevant links.

Best Practices Vary Across Time & By Market + Category

Along the way, concurrent with the above sorts of updates, Google also improved their spelling auto-correct features, auto-completed search queries for many years through a feature called Google Instant (though they later undid forced query auto-completion while retaining automated search suggestions), and then they rolled out a few other algorithms that further allowed them to model language & user behavior.

Today it would be much harder to get paid above median wages explicitly for sucking at basic spelling or scaling some other individual shortcut to the moon, like pouring millions of low quality articles into a (formerly!) trusted domain. Nearly a decade after Panda, eHow's rankings still haven't recovered.

Back when I got started with SEO the phrase "Indian SEO company" was associated with cut-rate work where people were buying exclusively based on price. Sort of like a "I got a $500 budget for link building, but can not under any circumstance invest more than $5 in any individual link."

Part of how my wife met me was she hired a hack SEO from San Diego who outsourced all the work to India and marked the price up about 100-fold while claiming it was all done in the United States. He created reciprocal links pages that got her site penalized & it didn't rank until after she took her reciprocal links page down.

With that sort of behavior widespread (a hack US firm teaching people working in an emerging market poor practices), it likely meant many SEO "best practices" which were learned in an emerging market (particularly where the web was also underdeveloped) would be more inclined to being spammy. Considering how far ahead many Western markets were on the early Internet, how many languages India has & how most web usage in India is on mobile devices where it is hard for users to create links, it only makes sense that Google would want to place more weight on end user data in such a market.

If you set your computer location to India, Bing's search box lists 9 different languages to choose from.

The above is not to state anything derogatory about any emerging market, but rather that various signals are stronger in some markets than others. And competition is stronger in some markets than others. Search engines can only rank what exists.

"In a lot of Eastern European - but not just Eastern European markets - I think it is an issue for the majority of the [bream? muffled] countries, for the Arabic-speaking world, there just isn't enough content as compared to the percentage of the Internet population that those regions represent. I don't have up to date data, I know that a couple years ago we looked at Arabic for example and then the disparity was enormous. So if I'm not mistaken the Arabic speaking population of the world is maybe 5 to 6%, maybe more, correct me if I am wrong.
But very definitely the amount of Arabic content in our index is several orders below that. So that means we do not have enough Arabic content to give to our Arabic users even if we wanted to. And you can exploit that amazingly easily, and if you create a bit of content in Arabic, whatever it looks like, we're gonna go, you know, we don't have anything else to serve this, and it ends up being horrible. And people will say, you know, this works. I keyword stuffed the hell out of this page, bought some links, and there it is, number one. There is nothing else to show, so yeah, you're number one. The moment somebody actually goes out and creates high quality content that's there for the long haul, you'll be out and that there will be one." - Andrey Lipattsev, Search Quality Senior Strategist at Google Ireland, on Mar 23, 2016

Impacting the Economics of Publishing

Now search engines can certainly influence the economics of various types of media. At one point some otherwise credible media outlets were pitching the Demand Media IPO narrative that Demand Media was the publisher of the future & what other media outlets will look like.

Years later, after heavily squeezing on the partner network & promoting programmatic advertising that reduces CPMs by the day, Google is funding partnerships with multiple news publishers like McClatchy & Gatehouse to try to revive the news dead zones even Facebook is struggling with.

"Facebook Inc. has been looking to boost its local-news offerings since a 2017 survey showed most of its users were clamoring for more. It has run into a problem: There simply isn't enough local news in vast swaths of the country. ... more than one in five newspapers have closed in the past decade and a half, leaving half the counties in the nation with just one newspaper, and 200 counties with no newspaper at all."

As mainstream newspapers continue laying off journalists, Facebook's news efforts are likely to continue failing unless they include direct economic incentives, as Google's programmatic ad push broke the banner ad:

"Thanks to the convoluted machinery of Internet advertising, the advertising world went from being about content publishers and advertising context—The Times unilaterally declaring, via its 'rate card', that ads in the Times Style section cost $30 per thousand impressions—to the users themselves and the data that targets them—Zappo's saying it wants to show this specific shoe ad to this specific user (or type of user), regardless of publisher context. Flipping the script from a historically publisher-controlled mediascape to an advertiser (and advertiser intermediary) controlled one was really Google's doing. Facebook merely rode the now-cresting wave, borrowing outside media's content via its own users' sharing, while undermining media's ability to monetize via Facebook's own user-data-centric advertising machinery. Conventional media lost both distribution and monetization at once, a mortal blow."

Google is offering news publishers audience development & business development tools.

Heavy Investment in Emerging Markets Quickly Evolves the Markets

As the web grows rapidly in India, they'll have a thousand flowers bloom. In 5 years the competition in India & other emerging markets will be much tougher as those markets continue to grow rapidly.

Media is much cheaper to produce in India than it is in the United States. Labor costs are lower & they never had the economic albatross that is the ACA adversely impacting their economy.
At some point the level of investment & increased competition will mean early techniques stop having as much efficacy. Chinese companies are aggressively investing in India.

"'If you break India into a pyramid, the top 100 million (urban) consumers who think and behave more like Americans are well-served,' says Amit Jangir, who leads India investments at 01VC, a Chinese venture capital firm based in Shanghai. The early stage venture firm has invested in micro-lending firms FlashCash and SmartCoin based in India. The new target is the next 200 million to 600 million consumers, who do not have a go-to entertainment, payment or ecommerce platform yet — and there is gonna be a unicorn in each of these verticals, says Jangir, adding that it will be not be as easy for a player to win this market considering the diversity and low ticket sizes."

RankBrain

RankBrain appears to be based on using user clickpaths on head keywords to help bleed rankings across into related searches which are searched less frequently. A Googler didn't state this specifically, but it is how they would be able to use models of searcher behavior to refine search results for keywords which are rarely searched for.

In a recent interview in Scientific American a Google engineer stated: "By design, search engines have learned to associate short queries with the targets of those searches by tracking pages that are visited as a result of the query, making the results returned both faster and more accurate than they otherwise would have been."

Now a person might go out and try to search for something a bunch of times or pay other people to search for a topic and click a specific listing, but some of the related Google patents on using click data (which keep getting updated) mention how they can discount or turn off the signal if there is an unnatural spike of traffic on a specific keyword, or if there is an unnatural spike of traffic heading to a particular website or web page. And, since Google is tracking the behavior of end users on their own website, anomalous behavior is easier to spot there than it is across the broader web where signals are more indirect.

Google can take advantage of their wide distribution of Chrome & Android, where users are regularly logged into Google & pervasively tracked, to place more weight on users with credit card data on file, a long account history with regular normal search behavior, heavy Gmail usage, etc.

Plus there is a huge gap between the cost of traffic & the ability to monetize it. You might have to pay someone a dime or a quarter to search for something & there is no guarantee it will work on a sustainable basis even if you paid hundreds or thousands of people to do it. Any of those experimental searchers will have no lasting value unless they influence rank, but even if they do influence rankings it might only last temporarily. If you bought a bunch of traffic into something genuine Google searchers didn't like, then even if it started to rank better temporarily the rankings would quickly fall back if the real end user searchers disliked the site relative to other sites which already rank.

This is part of the reason why so many SEO blogs mention brand, brand, brand. If people are specifically looking for you in volume & Google can see that thousands or millions of people specifically want to access your site, then that can impact how you rank elsewhere.
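One way to picture the "bleed" from head keywords into rarer related searches is to treat queries as similar when they lead to clicks on the same pages, and then let a rare query borrow the click evidence of its nearest head query. This is only my illustration of the general idea described above, not Google's actual RankBrain implementation; the queries, URLs, blending factor and click counts are made up:

```python
from collections import defaultdict

# Hypothetical click counts: query -> {url: clicks}
clicks = {
    "credit cards": {"issuerA.com": 900, "reviewsite.com": 400, "bank.com": 250},
    "best credit cards": {"reviewsite.com": 350, "issuerA.com": 300},
    "low apr credit cards for nurses": {"forum.com": 3, "reviewsite.com": 2},  # tail query
}

def cosine(a: dict, b: dict) -> float:
    """Cosine similarity between two sparse click vectors."""
    dot = sum(a[u] * b[u] for u in a.keys() & b.keys())
    norm = lambda v: sum(x * x for x in v.values()) ** 0.5
    return dot / (norm(a) * norm(b)) if a and b else 0.0

def blended_scores(tail_query: str, alpha: float = 0.7) -> dict:
    """Blend the tail query's thin click data with its most similar head query."""
    head, sim = max(
        ((q, cosine(clicks[tail_query], clicks[q])) for q in clicks if q != tail_query),
        key=lambda pair: pair[1],
    )
    scores = defaultdict(float)
    for url, c in clicks[tail_query].items():
        scores[url] += alpha * c
    for url, c in clicks[head].items():
        scores[url] += (1 - alpha) * sim * c   # borrowed evidence, scaled by similarity
    return dict(scores)

print(blended_scores("low apr credit cards for nurses"))
```

In this toy setup the rare query inherits most of its ranking evidence from the similar head query, which is roughly the behavior the patents & public statements above hint at.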
Even looking at something inside the search results for a while (dwell time) or quickly skipping over it to have a deeper scroll depth can be a ranking signal. Some Google patents mention how they can use mouse pointer location on desktop or scroll data from the viewport on mobile devices as a quality signal. Neural Matching Last year Danny Sullivan mentioned how Google rolled out neural matching to better understand the intent behind a search query. This is a look back at a big change in search but which continues to be important: understanding synonyms. How people search is often different from information that people write solutions about. pic.twitter.com/sBcR4tR4eT— Danny Sullivan (@dannysullivan) September 24, 2018 Last few months, Google has been using neural matching, --AI method to better connect words to concepts. Super synonyms, in a way, and impacting 30% of queries. Don't know what "soapopera effect" is to search for it? We can better figure it out. pic.twitter.com/Qrwp5hKFNz— Danny Sullivan (@dannysullivan) September 24, 2018 The above Tweets capture what the neural matching technology intends to do. Google also stated: we’ve now reached the point where neural networks can help us take a major leap forward from understanding words to understanding concepts. Neural embeddings, an approach developed in the field of neural networks, allow us to transform words to fuzzier representations of the underlying concepts, and then match the concepts in the query with the concepts in the document. We call this technique neural matching. To help people understand the difference between neural matching & RankBrain, Google told SEL: "RankBrain helps Google better relate pages to concepts. Neural matching helps Google better relate words to searches." There are a couple research papers on neural matching. The first one was titled A Deep Relevance Matching Model for Ad-hoc Retrieval. It mentioned using Word2vec & here are a few quotes from the research paper: "Successful relevance matching requires proper handling of the exact matching signals, query term importance, and diverse matching requirements." "the interaction-focused model, which first builds local level interactions (i.e., local matching signals) between two pieces of text, and then uses deep neural networks to learn hierarchical interaction patterns for matching." "according to the diverse matching requirement, relevance matching is not position related since it could happen in any position in a long document." "Most NLP tasks concern semantic matching, i.e., identifying the semantic meaning and inferring the semantic relations between two pieces of text, while the ad-hoc retrieval task is mainly about relevance matching, i.e., identifying whether a document is relevant to a given query." "Since the ad-hoc retrieval task is fundamentally a ranking problem, we employ a pairwise ranking loss such as hinge loss to train our deep relevance matching model."
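That pairwise hinge loss is simple to write down in isolation. A minimal sketch of the training objective only (the scoring function, margin and toy numbers are assumptions; the actual DRMM model is far richer):

```python
def pairwise_hinge_loss(score_relevant, score_irrelevant, margin=1.0):
    """Hinge loss for one training triple (query, relevant doc, irrelevant doc).

    The model is penalized whenever the relevant document does not outscore
    the irrelevant one by at least `margin`.
    """
    return max(0.0, margin - (score_relevant - score_irrelevant))

# Toy example: the relevant document only barely outscores the irrelevant one,
# so the model still pays a loss and its weights get pushed further apart.
print(pairwise_hinge_loss(0.7, 0.5))   # 0.8
print(pairwise_hinge_loss(2.1, 0.3))   # 0.0 -- ranked correctly with margin to spare
```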
The paper mentions how semantic matching falls down when compared against relevancy matching because: semantic matching relies on similarity matching signals (some words or phrases with the same meaning might be semantically distant), compositional meanings (matching sentences more than meaning) & a global matching requirement (comparing things in their entirety instead of looking at the best matching part of a longer document); whereas relevance matching can put significant weight on exact matching signals (weighting an exact match higher than a near match), adjust weighting on query term importance (one word or phrase in a search query might have a far higher discrimination value & might deserve far more weight than the next) & leverage diverse matching requirements (allowing relevancy matching to happen in any part of a longer document). Here are a couple images from the above research paper. And then the second research paper is Deep Relevancy Ranking Using Enhanced Document-Query Interactions: "interaction-based models are less efficient, since one cannot index a document representation independently of the query. This is less important, though, when relevancy ranking methods rerank the top documents returned by a conventional IR engine, which is the scenario we consider here." That same sort of re-ranking concept is being better understood across the industry. There are ranking signals that earn some base level ranking, and then results get re-ranked based on other factors like how well a result matches the user intent. Here are a couple images from the above research paper. For those who hate the idea of reading research papers or patent applications, Martinibuster also wrote about the technology here. About the only part of his post I would debate is this one: "Does this mean publishers should use more synonyms? Adding synonyms has always seemed to me to be a variation of keyword spamming. I have always considered it a naive suggestion. The purpose of Google understanding synonyms is simply to understand the context and meaning of a page. Communicating clearly and consistently is, in my opinion, more important than spamming a page with keywords and synonyms." I think one should always consider user experience over other factors; however, a person could still use variations throughout the copy & pick up a bit more traffic without coming across as spammy. Danny Sullivan mentioned the super synonym concept was impacting 30% of search queries, so there are still a lot of queries which may only be available to those who use a specific phrase on their page. Martinibuster also wrote another blog post tying more research papers & patents to the above. You could probably spend a month reading all the related patents & research papers. The above sort of language modeling & end user click feedback complement links-based ranking signals in a way that makes it much harder to luck one's way into any form of success by being a terrible speller or just bombing away at link manipulation without much concern toward any other aspect of the user experience or market you operate in. Pre-penalized Shortcuts Google was even issued a patent for predicting site quality based upon the N-grams used on the site & comparing those against the N-grams used on other established sites where quality has already been scored via other methods: "The phrase model can be used to predict a site quality score for a new site; in particular, this can be done in the absence of other information. The goal is to predict a score that is comparable to the baseline site quality scores of the previously-scored sites."
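The patent does not disclose the actual model, but the flavor of the idea can be sketched: build an n-gram profile for each already-scored site, then score a new site by its similarity to those profiles. The trigram features, cosine similarity and similarity-weighted average below are illustrative assumptions of mine, not the patented method:

```python
from collections import Counter
from math import sqrt

def ngram_profile(text, n=3):
    """Character trigram counts -- a crude stand-in for the patent's phrase model."""
    text = " ".join(text.lower().split())
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def cosine(a, b):
    dot = sum(a[k] * b[k] for k in a.keys() & b.keys())
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def predict_site_quality(new_text, scored_sites):
    """Similarity-weighted average of the scores of previously-scored sites.

    scored_sites: list of (site_text, baseline_quality_score) pairs.
    """
    weights = [(cosine(ngram_profile(new_text), ngram_profile(text)), score)
               for text, score in scored_sites]
    total = sum(w for w, _ in weights)
    return sum(w * s for w, s in weights) / total if total else 0.0

# Toy example: spun, repetitive text lands closer to the low-quality profile.
scored = [("original in-depth research with cited sources and clear writing", 0.9),
          ("best cheap deals buy now best cheap deals buy now click here", 0.2)]
print(round(predict_site_quality("cheap deals cheap deals buy now click", scored), 2))
```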
Have you considered using a PLR package to generate the shell of your site's content? Good luck with that as some sites trying that shortcut might be pre-penalized from birth. Navigating the Maze When I started in SEO one of my friends had a dad who is vastly smarter than I am. He advised me that Google engineers were smarter, had more capital, had more exposure, had more data, etc etc etc ... and thus SEO was ultimately going to be a malinvestment. Back then he was at least partially wrong because influencing search was so easy. But in the current market, 16 years later, we are near the inflection point where he would finally be right. At some point the shortcuts stop working & it makes sense to try a different approach. The flip side of all the above changes is that as the algorithms have become more complex they have gone from being a headwind to people ignorant about SEO to being a tailwind to those who do not focus excessively on SEO in isolation. If one is a dominant voice in a particular market, if they break industry news, if they have key exclusives, if they spot & name the industry trends, if their site becomes a must read & is what amounts to a habit ... then they perhaps become viewed as an entity. Entity-related signals help them, & the same signals that work against people who merely lucked into a bit of success become a tailwind rather than a headwind. If your work defines your industry, then any efforts to model entities, user behavior or the language of your industry are going to boost your work on a relative basis. This requires sites to publish frequently enough to be a habit, or publish highly differentiated content which is strong enough that it is worth the wait. Those which publish frequently without being particularly differentiated are almost guaranteed to eventually walk into a penalty of some sort. And each additional person who reads marginal, undifferentiated content (particularly if it has an ad-heavy layout) is one additional visitor that site is closer to eventually getting whacked. Success becomes self-regulating. Any short-term success becomes self-defeating if one has a highly opportunistic short-term focus. Those who write content that only they could write are more likely to have sustained success. A mistake people often make is to look at someone successful, then try to do what they are doing, assuming it will lead to similar success. This is backward. Find something you enjoy doing & are curious about. Get obsessed, & become one of the best at it. It will monetize itself.— Neil Strauss (@neilstrauss) March 30, 2019 Full Article
or AMP'd Up for Recaptcha By www.seobook.com Published On :: 2019-06-30T21:47:54+00:00 Beyond search Google controls the leading distributed ad network, the leading mobile OS, the leading web browser, the leading email client, the leading web analytics platform, the leading mapping platform, the leading free video hosting site. They win a lot. And they take winnings from one market & leverage them into manipulating adjacent markets. Embrace. Extend. Extinguish. Imagine taking a universal open standard that has zero problems with it and then stripping it down to it's most basic components and then prepending each element with your own acronym. Then spend years building and recreating what has existed for decades. That is @amphtml— Jon Henshaw (@henshaw) April 4, 2019 AMP is an utterly unnecessary invention designed to further shift power to Google while disenfranchising publishers. From the very start it had many issues with basic things like supporting JavaScript, double counting unique users (no reason to fix broken stats if they drive adoption!), not supporting third party ad networks, not showing publisher domain names, and just generally being a useless layer of sunk cost technical overhead that provides literally no real value. Over time they have corrected some of these catastrophic deficiencies, but if it provided real value, they wouldn't have needed to force adoption with preferential placement in their search results. They force the bundling because AMP sucks. Absurdity knows no bounds. Googlers suggest: "AMP isn’t another “channel” or “format” that’s somehow not the web. It’s not a SEO thing. It’s not a replacement for HTML. It’s a web component framework that can power your whole site. ... We, the AMP team, want AMP to become a natural choice for modern web development of content websites, and for you to choose AMP as framework because it genuinely makes you more productive." Meanwhile some newspapers have about a dozen employees who work on re-formatting content for AMP: The AMP development team now keeps track of whether AMP traffic drops suddenly, which might indicate pages are invalid, and it can react quickly. All this adds expense, though. There are setup, development and maintenance costs associated with AMP, mostly in the form of time. After implementing AMP, the Guardian realized the project needed dedicated staff, so it created an 11-person team that works on AMP and other aspects of the site, drawing mostly from existing staff. Feeeeeel the productivity! Some content types (particularly user generated content) can be unpredictable & circuitous. For many years forums websites would use keywords embedded in the search referral to highlight relevant parts of the page. Keyword (not provided) largely destroyed that & then it became a competitive feature for AMP: "If the Featured Snippet links to an AMP article, Google will sometimes automatically scroll users to that section and highlight the answer in orange." That would perhaps be a single area where AMP was more efficient than the alternative. But it is only so because Google destroyed the alternative by stripping keyword referrers from search queries. The power dynamics of AMP are ugly: "I see them as part of the effort to normalise the use of the AMP Carousel, which is an anti-competitive land-grab for the web by an organisation that seems to have an insatiable appetite for consuming the web, probably ultimately to it’s own detriment. ... 
This enables Google to continue to exist after the destination site (eg the New York Times) has been navigated to. Essentially it flips the parent-child relationship to be the other way around. ... As soon as a publisher blesses a piece of content by packaging it (they have to opt in to this, but see coercion below), they totally lose control of its distribution. ... I’m not that smart, so it’s surely possible to figure out other ways of making a preload possible without cutting off the content creator from the people consuming their content. ... The web is open and decentralised. We spend a lot of time valuing the first of these concepts, but almost none trying to defend the second. Google knows, perhaps better than anyone, how being in control of the user is the most monetisable position, and having the deepest pockets and the most powerful platform to do so, they have very successfully inserted themselves into my relationship with millions of other websites. ... In AMP, the support for paywalls is based on a recommendation that the premium content be included in the source of the page regardless of the user’s authorisation state. ... These policies demonstrate contempt for others’ right to freely operate their businesses. After enough publishers adopted AMP Google was able to turn their mobile app's homepage into an interactive news feed below the search box. And inside that news feed Google gets to distribute MOAR ads while 0% of the revenue from those ads find its way to the publishers whose content is used to make up the feed. Appropriate appropriation. :D Thank you for your content!!! Well this issue (bug?) is going to cause a sh*t storm... Google @AMPhtml not allowing people to click through to full site? You can’t see but am clicking the link in top right iOS Chrome 74.0.3729.155 pic.twitter.com/dMt5QSW9fu— Scotch.io (@scotch_io) June 11, 2019 The mainstream media is waking up to AMP being a trap, but their neck is already in it: European and American tech, media and publishing companies, including some that originally embraced AMP, are complaining that the Google-backed technology, which loads article pages in the blink of an eye on smartphones, is cementing the search giant's dominance on the mobile web. Each additional layer of technical cruft is another cost center. Things that sound appealing at first blush may not be: The way you verify your identity to Let's Encrypt is the same as with other certificate authorities: you don't really. You place a file somewhere on your website, and they access that file over plain HTTP to verify that you own the website. The one attack that signed certificates are meant to prevent is a man-in-the-middle attack. But if someone is able to perform a man-in-the-middle attack against your website, then he can intercept the certificate verification, too. In other words, Let's Encrypt certificates don't stop the one thing they're supposed to stop. And, as always with the certificate authorities, a thousand murderous theocracies, advertising companies, and international spy organizations are allowed to impersonate you by design. Anything that is easy to implement & widely marketed often has costs added to it in the future as the entity moves to monetize the service. This is a private equity firm buying up multiple hosting control panels & then adjusting prices. This is Google Maps drastically changing their API terms. 
This is Facebook charging you for likes to build an audience, giving your competitors access to those likes as an addressable audience to advertise against, and then charging you once more to boost the reach of your posts. This is Grubhub creating shadow websites on your behalf and charging you for every transaction created by the gravity of your brand. Shivane believes GrubHub purchased her restaurant’s web domain to prevent her from building her own online presence. She also believes the company may have had a special interest in owning her name because she processes a high volume of orders. ... it appears GrubHub has set up several generic, templated pages that look like real restaurant websites but in fact link only to GrubHub. These pages also display phone numbers that GrubHub controls. The calls are forwarded to the restaurant, but the platform records each one and charges the restaurant a commission fee for every order Settling for the easiest option drives a lack of differentiation, embeds additional risk & once the dominant player has enough marketshare they'll change the terms on you. Small gains in short term margins for massive increases in fragility. "Closed platforms increase the chunk size of competition & increase the cost of market entry, so people who have good ideas, it is a lot more expensive for their productivity to be monetized. They also don't like standardization ... it looks like rent seeking behaviors on top of friction" - Gabe Newell The other big issue is platforms that run out of growth space in their core market may break integrations with adjacent service providers as each want to grow by eating the other's market. Those who look at SaaS business models through the eyes of a seasoned investor will better understand how markets are likely to change: "I’d argue that many of today’s anointed tech “disruptors” are doing little in the way of true disruption. ... When investors used to get excited about a SAAS company, they typically would be describing a hosted multi-tenant subscription-billed piece of software that was replacing a ‘legacy’ on-premise perpetual license solution in the same target market (i.e. ERP, HCM, CRM, etc.). Today, the terms SAAS and Cloud essentially describe the business models of every single public software company. Most platform companies are initially required to operate at low margins in order to buy growth of their category & own their category. Then when they are valued on that, they quickly need to jump across to adjacent markets to grow into the valuation: Twilio has no choice but to climb up the application stack. This is a company whose ‘disruption’ is essentially great API documentation and gangbuster SEO spend built on top of a highly commoditized telephony aggregation API. They have won by marketing to DevOps engineers. With all the hype around them, you’d think Twilio invented the telephony API, when in reality what they did was turn it into a product company. Nobody had thought of doing this let alone that this could turn into a $17 billion company because simply put the economics don’t work. And to be clear they still don’t. But Twilio’s genius CEO clearly gets this. If the market is going to value robocalls, emergency sms notifications, on-call pages, and carrier fee passed through related revenue growth in the same way it does ‘subscription’ revenue from Atlassian or ServiceNow, then take advantage of it while it lasts. 
Large platforms offering temporary subsidies to ensure they dominate their categories & companies like SoftBank spraying capital across the markets is causing massive shifts in valuations: I also think if you look closely at what is celebrated today as innovation you often find models built on hidden subsidies. ... I’d argue the very distributed nature of microservices architecture and API-first product companies means addressable market sizes and unit economics assumptions should be even more carefully scrutinized. ... How hard would it be to create an Alibaba today if someone like SoftBank was raining money into such a greenfield space? Excess capital would lead to destruction and likely subpar returns. If capital was the solution, the 1.5 trillion that went into telcos in late '90s wouldn’t have led to a massive bust. Would a Netflix be what it is today if a SoftBank was pouring billions into streaming content startups right as the experiment was starting? Obviously not. Scarcity of capital is another often underappreciated part of the disruption equation. Knowing resources are finite leads to more robust models. ... This convergence is starting to manifest itself in performance. Disney is up 30% over the last 12 months while Netflix is basically flat. This may not feel like a bubble sign to most investors, but from my standpoint, it’s a clear evidence of the fact that we are approaching a something has got to give moment for the way certain businesses are valued." Circling back to Google's AMP, it has a cousin called Recaptcha. Recaptcha is another AMP-like trojan horse: According to tech statistics website Built With, more than 650,000 websites are already using reCaptcha v3; overall, there are at least 4.5 million websites use reCaptcha, including 25% of the top 10,000 sites. Google is also now testing an enterprise version of reCaptcha v3, where Google creates a customized reCaptcha for enterprises that are looking for more granular data about users’ risk levels to protect their site algorithms from malicious users and bots. ... According to two security researchers who’ve studied reCaptcha, one of the ways that Google determines whether you’re a malicious user or not is whether you already have a Google cookie installed on your browser. ... To make this risk-score system work accurately, website administrators are supposed to embed reCaptcha v3 code on all of the pages of their website, not just on forms or log-in pages. About a month ago when logging into Bing Ads I saw recaptcha on the login page & couldn't believe they'd give Google control at that access point. I think they got rid of that, but lots of companies are perhaps shooting themselves in the foot through a combination of over-reliance on Google infrastructure AND sloppy implementation Today when making a purchase on Fiverr, after converting, I got some of this action Hmm. Maybe I will enable JavaScript and try again. Oooops. That is called snatching defeat from the jaws of victory. My account is many years old. My payment type on record has been used for years. I have ordered from the particular seller about a dozen times over the years. And suddenly because my web browser had JavaScript turned off I was deemed a security risk of some sort for making an utterly ordinary transaction I have already completed about a dozen times. On AMP JavaScript was the devil. And on desktop not JavaScript was the devil. 
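For context on what that risk score actually is: after the page-side reCaptcha v3 script returns a token, the site's server asks Google's siteverify endpoint to grade the visitor and then applies its own threshold. A minimal sketch of that server-side check, assuming the `requests` library is available; the 0.5 cutoff and the "checkout" action name are arbitrary choices for illustration:

```python
import requests

SITEVERIFY_URL = "https://www.google.com/recaptcha/api/siteverify"

def allow_request(token, secret_key, min_score=0.5, expected_action="checkout"):
    """Ask Google to grade the visitor, then apply our own threshold.

    token: the value returned by grecaptcha.execute() in the browser.
    secret_key: the site's reCaptcha secret (server-side only).
    """
    resp = requests.post(SITEVERIFY_URL,
                         data={"secret": secret_key, "response": token},
                         timeout=5)
    result = resp.json()
    if not result.get("success"):
        return False                        # token invalid or expired
    if result.get("action") != expected_action:
        return False                        # token was minted for a different action
    return result.get("score", 0.0) >= min_score   # v3 risk score: 0.0 (bot) to 1.0 (human)
```

Note that the grading itself happens entirely on Google's side; the site only sees the score, which is exactly why turning off JavaScript or clearing Google cookies can make a long-standing customer look like a bot.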
Pro tip: Ecommerce websites that see substandard conversion rates from using Recaptcha can boost their overall ecommerce revenue by buying more Google AdWords ads. --- As more of the infrastructure stack is driven by AI software there is going to be a very real opportunity for many people to become deplatformed across the web on an utterly arbitrary basis. That tech companies like Facebook also want to create digital currencies on top of the leverage they already have only makes the proposition that much scarier. If the tech platforms host copies of our sites, process the transactions & even create their own currencies, how will we know what level of value they are adding versus what they are extracting? Who measures the measurer? And when the economics turn negative, what will we do if we are hooked into an ecosystem we can't spend additional capital to get out of when things head south? Full Article
or New Keyword Tool By www.seobook.com Published On :: 2019-09-14T10:06:45+00:00 Our keyword tool is updated periodically. We recently updated it once more. For comparison's sake, the old keyword tool looked like this, whereas the new keyword tool looks like this. The upsides of the new keyword tool are: fresher data from this year; more granular data on ad bids vs click prices; a listed ad clickthrough rate; more granular estimates of Google AdWords advertiser ad bids; & more emphasis on commercial oriented keywords. With the new columns of [ad spend] and [traffic value], here is how we estimate those: paid search ad spend = search ad clicks * CPC; organic search traffic value = ad impressions * 0.5 * (100% - ad CTR) * CPC. The first of those two is rather self-explanatory. The second is a bit more complex. It starts with the assumption that about half of all searches do not get any clicks, then it subtracts the paid clicks from the total remaining pool of clicks & multiplies that by the cost per click.
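Those two estimates translate directly into code. A minimal sketch using the formulas exactly as stated above (the function and variable names are mine; the 50% no-click figure is the tool's own stated assumption):

```python
def estimate_paid_search_ad_spend(ad_clicks, cpc):
    """Paid search ad spend = search ad clicks * CPC."""
    return ad_clicks * cpc

def estimate_organic_traffic_value(ad_impressions, ad_ctr, cpc):
    """Organic search traffic value = ad impressions * 0.5 * (100% - ad CTR) * CPC.

    Assumes roughly half of all searches get no click at all, removes the share
    of the remaining clicks that go to ads, and values the rest at the CPC.
    ad_ctr is a fraction, e.g. 0.04 for a 4% ad clickthrough rate.
    """
    return ad_impressions * 0.5 * (1.0 - ad_ctr) * cpc

# Example: 10,000 monthly ad impressions, 4% ad CTR, $2.50 CPC, 400 ad clicks
print(estimate_paid_search_ad_spend(400, 2.50))            # 1000.0
print(estimate_organic_traffic_value(10_000, 0.04, 2.50))  # 12000.0
```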
The new data also has some drawbacks: rather than listing search counts specifically, it lists relative ranges like low, very high, etc.; & since it tends to tilt more toward keywords with ad impressions, it may not have coverage for some longer tail informational keywords. For any keyword where there is insufficient coverage we re-query the old keyword database for data & merge it across. You will know the data came from the new database if the first column says something like low or high, & that the data came from the older database if there are specific search counts in the first column. For a limited time we are still allowing access to both keyword tools, though we anticipate removing access to the old keyword tool in the future once we have collected plenty of feedback on the new keyword tool. Please feel free to leave your feedback in the below comments. One of the cool features of the new keyword tool worth highlighting further is the difference between estimated bid prices & estimated click prices. In the following screenshot you can see how Amazon is estimated as having a much higher bid price than actual click price, largely because, due to low keyword relevancy, the entities other than the official brand which Google arbitrages onto popular trademark terms require much higher bids to appear on those competing terms. Historically, this difference between bid price & click price was a big source of noise on lists of the most valuable keywords. Recently some advertisers have started complaining about the "Google shakedown" from how many brand-driven searches are simply leaving the .com part off of a web address in Chrome & then being forced to pay Google for their own pre-existing brand equity. When Google puts 4 paid ads ahead of the first organic result for your own brand name, you’re forced to pay up if you want to be found. It’s a shakedown. It’s ransom. But at least we can have fun with it. Search for Basecamp and you may see this attached ad. pic.twitter.com/c0oYaBuahL— Jason Fried (@jasonfried) September 3, 2019 Full Article
or Dofollow, Nofollow, Sponsored, UGC By www.seobook.com Published On :: 2019-10-24T05:20:14+00:00 A Change to Nofollow Last month Google announced they were going to change how they treated nofollow, moving it from a directive toward a hint. As part of that they also announced the release of parallel attributes rel="sponsored" for sponsored links & rel="ugc" for user generated content in areas like forums & blog comments. Why not completely ignore such links, as had been the case with nofollow? Links contain valuable information that can help us improve search, such as how the words within links describe content they point at. Looking at all the links we encounter can also help us better understand unnatural linking patterns. By shifting to a hint model, we no longer lose this important information, while still allowing site owners to indicate that some links shouldn’t be given the weight of a first-party endorsement.
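As a concrete illustration of how a CMS might apply those attributes, here is a minimal sketch. The mapping and helper function are my own assumptions; the attribute values themselves (nofollow, sponsored, ugc) come from Google's announcement, which also notes that values can be combined, e.g. rel="ugc nofollow", for tools that only understand nofollow:

```python
REL_FOR_LINK_TYPE = {
    "editorial": None,          # a normal, followed editorial link
    "sponsored": "sponsored",   # paid placements
    "ugc": "ugc",               # forum posts, blog comments, other user content
    "untrusted": "nofollow",    # anything else the site doesn't want to vouch for
}

def render_link(url, anchor_text, link_type="editorial"):
    """Emit an anchor tag with the rel hint matching the new attributes."""
    rel = REL_FOR_LINK_TYPE.get(link_type)
    rel_attr = f' rel="{rel}"' if rel else ""
    return f'<a href="{url}"{rel_attr}>{anchor_text}</a>'

print(render_link("https://example.com", "advertiser", "sponsored"))
# <a href="https://example.com" rel="sponsored">advertiser</a>
print(render_link("https://example.com/profile", "forum user", "ugc"))
# <a href="https://example.com/profile" rel="ugc">forum user</a>
```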
In many emerging markets the mobile web is effectively the entire web. Few people create HTML links on the mobile web outside of social networks, where links are typically nofollow by default. This reduces the potential link signal available, which Google can offset by either tracking what people do directly and/or shifting how the nofollow attribute is treated. Google shifting how nofollow is treated is a blanket admission that Penguin & other elements of "the war on links" were perhaps a bit too effective and have started to take valuable signals away from Google. Google has suggested the shift in how nofollow is treated will not lead to any additional blog comment spam. When they announced nofollow they suggested it would lower blog comment spam. Blog comment spam remains a growth market long after the gravity of the web has shifted away from blogs onto social networks. Changing how nofollow is treated only makes any sort of external link analysis that much harder. Those who specialize in link audits (yuck!) have historically ignored nofollow links, but now that is one more set of things to look through. And the good news for professional link auditors is that this increases the effective cost they can charge clients for the service. Some nefarious types will notice when competitors get penalized & then fire up Xrummer to help promote the penalized site, ensuring that the link auditor bankrupts the competing business even faster than Google. Links, Engagement, or Something Else... When Google was launched they didn't own Chrome or Android. They were not yet pervasively spying on billions of people: If, like most people, you thought Google stopped tracking your location once you turned off Location History in your account settings, you were wrong. According to an AP investigation published Monday, even if you disable Location History, the search giant still tracks you every time you open Google Maps, get certain automatic weather updates, or search for things in your browser. Thus Google had to rely on external signals as their primary ranking factor: The reason that PageRank is interesting is that there are many cases where simple citation counting does not correspond to our common sense notion of importance. For example, if a web page has a link on the Yahoo home page, it may be just one link but it is a very important one. This page should be ranked higher than many pages with more links but from obscure places. PageRank is an attempt to see how good an approximation to "importance" can be obtained just from the link structure. ... The definition of PageRank above has another intuitive basis in random walks on graphs. The simplified version corresponds to the standing probability distribution of a random walk on the graph of the Web. Intuitively, this can be thought of as modeling the behavior of a "random surfer".
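That random-surfer intuition maps directly onto the classic power-iteration computation. A minimal sketch on a made-up toy graph; the 0.85 damping factor is the value commonly cited in the PageRank literature, and the dangling-page handling is one common convention rather than the only one:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Power iteration over an adjacency dict {page: [pages it links to]}."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}           # surfer starts anywhere
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}  # random jump
        for page, outlinks in links.items():
            if not outlinks:                               # dangling page: spread evenly
                share = damping * rank[page] / len(pages)
                for p in pages:
                    new_rank[p] += share
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share              # follow an outgoing link
        rank = new_rank
    return rank

# A single link from the well-linked hub ends up outweighing three links
# from obscure pages, matching the Yahoo example in the quote above.
toy_web = {"yahoo": ["newsite"],
           "blog1": ["yahoo", "oldsite"], "blog2": ["yahoo", "oldsite"],
           "blog3": ["yahoo", "oldsite"],
           "newsite": [], "oldsite": []}
print({page: round(score, 3) for page, score in pagerank(toy_web).items()})
```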
Google's reliance on links turned links into a commodity, which led to all sorts of fearmongering, manual penalties, nofollow and the Penguin update. As Google collected more usage data those who overly focused on links often ended up scoring an own goal, creating sites which would not rank. Google no longer invests heavily in fearmongering because it is no longer needed. Search is so complex most people can't figure it out. Many SEOs have reduced their link building efforts as Google dialed up weighting on user engagement metrics, though it appears the tide may now be heading in the other direction. Some sites which had decent engagement metrics but little in the way of link building slid on the update late last month. As much as Google desires relevancy in the short term, they also prefer a system complex enough to external onlookers that reverse engineering feels impossible. If they discourage investment in SEO they increase AdWords growth while gaining greater control over algorithmic relevancy. Google will soon collect even more usage data by routing Chrome users through their DNS service: "Google isn't actually forcing Chrome users to only use Google's DNS service, and so it is not centralizing the data. Google is instead configuring Chrome to use DoH connections by default if a user's DNS service supports it." If traffic is routed through Google that is akin to them hosting the page in terms of being able to track many aspects of user behavior. It is akin to AMP or YouTube in terms of being able to track users and normalize relative engagement metrics. Once Google is hosting the end-to-end user experience they can create a near infinite number of ranking signals given their advancement in computing power: "We developed a new 54-qubit processor, named “Sycamore”, that is comprised of fast, high-fidelity quantum logic gates, in order to perform the benchmark testing. Our machine performed the target computation in 200 seconds, and from measurements in our experiment we determined that it would take the world’s fastest supercomputer 10,000 years to produce a similar output." Relying on "one simple trick to..." sorts of approaches is frequently going to come up empty. EMDs Kicked Once Again I was one of the early promoters of exact match domains when the broader industry did not believe in them. I was also quick to mention when I felt the algorithms had moved in the other direction. Google's mobile layout, which they are now testing on desktop computers as well, replaces green domain names with gray words which are easy to miss. And the favicon icons sort of make the organic results look like ads. Any boost a domain name like CreditCards.ext might have garnered in the past due to matching the keyword has certainly gone away with this new layout that further depreciates the impact of exact-match domain names. At one point in time CreditCards.com was viewed as a consumer destination. It is now viewed ... below the fold. If you have a memorable brand-oriented domain name, the favicon can help offset the above impact somewhat, but matching keywords is becoming a much more precarious approach to sustaining rankings as the weight on brand awareness, user engagement & authority increases relative to the weight on anchor text. Full Article