When we were putting together our new book, Internet Marketing Hype, we often found ourselves returning to the first chapter and debating whether the assertion that “Content is King” can really be counted as a myth. It’s certainly a cliché, repeated endlessly on sites around the Web, but for good reason. The pre-eminence of content makes publishers feel that what they’re doing is worthwhile. All of that research, writing, persuading and audience-building is what publishing is all about. But as anyone who has ever set up a website, written posts, then reviewed their Google Analytics stats knows, good content isn’t enough. You also need the search engine juice, the marketing and the promotion to bring in readers.
In the book, we stressed the importance of good content but argued that distribution should stand alongside it as an equal partner in a site’s success. The two, though, don’t always complement each other. In fact, the most common arena for the conflict between good content and good marketing is the Web page itself and the words it contains. For content producers, Web pages should be well written, carefully researched and thoughtful enough to build an audience, create trust and produce sales. For search engine experts, sites are meant to be read by robots. They should be filled with keywords that push them up in the search results and bring in masses of traffic, in the hope that enough visitors will convert to bring in revenue even if they don’t enjoy what they read.
Turning off the Keyword Firehose
That keyword-based firehose approach may now be under threat. Internet marketing expert John Schwartz’s plea for pages written for humans, not search engine rankings, is one symptom of a backlash against websites that sacrifice quality for quantity. Google’s own Farmer update, which penalized content farms like Answerbag and ArticlesBase that produce large amounts of keyword-based articles, could suggest that the days of keyword-stuffing are now as much a part of Internet history as Lycos and AskJeeves.
In fact though, the picture is more complex. Although some keyword sites took a major hit from Farmer (eZineArticles fell 90 percent; AssociatedContent dropped 93 percent) others did surprisingly well. The Huffington Post saw a sharp rise in rankings after Farmer was rolled out and even eHow saw an improvement. As Jim Edwards of Bnet explained, Google appeared to have targeted the Web’s bottom-feeders, striking down sites that offer no value at all while allowing sites like Huffington Post that offer a little value from unpaid contributors to benefit.
The difference between those two classes of website isn’t big. eZineArticles has plenty of poor content written to gain links and exposure but buried in its piles are some worthwhile posts with real value. The Huffington Post has lots of well-known writers but it also publishes plenty of posts written by people who are best ignored. Perhaps the clearest difference is in the intent. Article banks exist to promote websites; bloggers, even on content platforms, write to be read.
JC Penney’s Black Hat Link-Buying
One option could be to skip the SEO on the Web page altogether and turn to more black-hat techniques. Even giant companies do this. Earlier this year, The New York Times reported on JC Penney’s link-buying strategy that had netted it the top spot on search results for terms ranging from “dresses” to “grommet top curtains.” The company’s SEO firm had been paying for links on unrelated websites in a successful attempt to push it up the rankings. Once the scheme was discovered, Google dropped JC Penney down the rankings and the catalog firm fired its SEO team. Although the company is likely to have made plenty of short-term sales as a result of its link-buying the long-term effect of Google’s punishment may well end up costing it more.
So what stand should a website publisher take in the battle between keywords and meaningful words?
One solution might be to separate the two elements of a successful Web page. Even large commercial sites like Match.com draw a distinction between the user side of a Web page and the search engine side. The top of the company’s home page is dominated by a search box that pulls visitors in and quickly gives them faces to browse. The bottom of the page, though, written in a grey font that’s difficult to read, is standard, keyword-stuffed SEO copy targeted at robots. It’s not a practice that can work easily on a content page like a blog, but it might be possible to squeeze a few extra keywords into an author bio at the end of the post.
A better approach, though, may be more nuanced. Google’s own definition of “good quality content” is vague, and while the rules on link-trading and false linking are clear enough, the differences between sites like ArticlesBase and Huffington Post aren’t always obvious.
The best solution, then, is likely a combination of the following:
- Write the best possible content in the clearest possible way;
- Work in natural keywords at a rate of about 1-3 percent, sacrificing keywords for clarity where necessary;
- Give the page time to build an audience and develop links naturally. (Google, apparently, becomes suspicious of sites that suddenly develop a rash of backlinks).
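To make the second point concrete, here is a minimal sketch of how a writer might check that a keyword stays inside that rough 1–3 percent range. The function, the sample sentence and the density formula (keyword occurrences divided by total words) are illustrative assumptions, not an official SEO metric.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return the keyword's share of total words, as a percentage.

    A multi-word keyword counts as one occurrence per match.
    This is a rough editorial check, not a ranking formula.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    kw = keyword.lower().split()
    n = len(kw)
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == kw)
    return 100.0 * hits / len(words)

# Hypothetical page copy targeting the keyword "curtains".
sample = ("Grommet top curtains hang from metal-ringed holes. "
          "Choose curtains that match the room, and measure "
          "before buying curtains.")

print(round(keyword_density(sample, "curtains"), 1))  # well above 3 percent
```

A result far above 3 percent, as here, would be the cue to trade a few keyword repetitions back for clarity, as the list above suggests.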
That last point is really key. John Schwartz’s argument against posts written for robots wasn’t based on the fear that Google penalizes sites that stuff their pages with keywords. It was based on the fear that keyword-rich posts fail to engage audiences. Good content, he argues, doesn’t just do well in search engines; it also resonates with readers, building the trust necessary to create sales. As he puts it:
“Rankings are great, but they don’t buy anything. PEOPLE do.”
The usual reason that people turn to black-hat methods, whether it’s keyword stuffing or link-buying, isn’t that they want success but that they want success now. Winning a top ranking, though, takes time — as does building the trust with an audience necessary to turn readers into buyers, something that can only be done with good content.