Content management systems: Why we can’t have nice things

In a rare but welcome turn of events, this week I read three thoughtful deep dives about content management systems.

1

I found myself nodding a lot at this MediaShift piece that discussed how magazines can better use analytics to determine their digital focus. Some highlights:

“We watch numbers on each of these platforms and determine what platforms can have a rich workflow and rich experience, and where we want to enhance the content with video. We also have replica editions where people are happy with just a flipbook. We make decisions on a per-platform basis [by considering] the return on investment of any of these.” —Kerrie Keegan, Reader’s Digest

“All of the different platforms — not even just production platforms like Mag+, Zinio, Adobe DPS, but also Apple versus Google versus Amazon versus Next Issue — all of those have a different set of analytics and metrics that can be obtained. Those really differ widely. It’s one of the core challenges for anybody trying to publish in this space and across those markets…. The challenges aren’t really technical at this point. The challenges are what I call infrastructure. In print, we all know what rate base is, what CPMs are going to be, what metrics we pay attention to. We don’t have the same infrastructure for monetizing digital. From an advertising point of view, does rate base matter, or is it interaction, engagement, time in app?” —Mike Haney, Mag+

2

This excellent piece from Nieman Lab gets into the inner workings of Scoop, the New York Times’s CMS, with Luke Vnenchak. The parts I found most interesting had to do with something I always advocate: better integration of basic editorial functions, such as, oh, I don’t know, editing words, into CMSes.

Scoop incorporates a number of real-time editing options that might look familiar to Google Docs users. Different team members can work on different parts of a story at the same time: “For example, a reporter can work on the article while an editor is writing the headline and summary and a producer is adding multimedia. But one editor can’t work on the headline while another works on the summary.”
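
Out of curiosity about how that kind of per-field coordination might work, here is a minimal sketch in Python. It is my own toy illustration, not the Times’s actual Scoop implementation; the Article and FieldLockError names are invented for the example.

```python
import threading

class FieldLockError(Exception):
    """Raised when a field is already checked out by someone else."""

class Article:
    """Toy model of per-field locking (not Scoop's real design): collaborators
    can edit different fields (body, headline, summary) simultaneously,
    but two people can never hold the same field at once."""

    def __init__(self):
        self.fields = {"body": "", "headline": "", "summary": ""}
        self._editors = {}              # field name -> user currently editing it
        self._guard = threading.Lock()  # protects the _editors table

    def checkout(self, field, user):
        """Claim a field for editing, or fail if a colleague already has it."""
        with self._guard:
            holder = self._editors.get(field)
            if holder and holder != user:
                raise FieldLockError(f"{field} is locked by {holder}")
            self._editors[field] = user

    def save(self, field, user, text):
        """Write the new text and release the field."""
        with self._guard:
            if self._editors.get(field) != user:
                raise FieldLockError(f"{user} does not hold the lock on {field}")
            self.fields[field] = text
            del self._editors[field]

# A reporter and an editor can work on different parts of the same story:
story = Article()
story.checkout("body", "reporter")
story.checkout("headline", "editor")
# ...but a second editor calling story.checkout("headline", "editor2")
# would raise FieldLockError, mirroring the one-person-per-field rule.
```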

Isn’t it amazing that this very basic functionality is so hard to come by in most off-the-shelf CMSes? And for systems with “content management” right there in the name, most CMSes are abysmal at actually managing content in the editorial sense:

One thing that is always handy in newsrooms is a system for tracking the status of stories as they move from assigning and writing to editing. Beyond knowing the status of an article, Vnenchak said they want the system to track when stories run online and in print, and how a story is performing once it’s published.

Our asks as editors are quite standard, if not primitive, from a content-making standpoint. Something as essential as status tracking should be a common CMS feature, not a rare one.
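
To underline how modest the ask is, here is a minimal sketch of a story-status tracker in Python. It is a hypothetical illustration of the idea, not Scoop’s actual data model; the Story class and the status names are invented.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

# Invented workflow states, in the order a story normally moves through them.
STATUSES = ["assigned", "writing", "editing", "ready", "published"]

@dataclass
class Story:
    """Toy status tracker: records where a story is in the workflow,
    when it changed state, and when it ran online or in print."""
    slug: str
    status: str = "assigned"
    history: list = field(default_factory=list)   # (status, timestamp) pairs
    ran_online: Optional[datetime] = None
    ran_in_print: Optional[datetime] = None

    def advance(self, new_status: str) -> None:
        """Move the story forward and keep an audit trail of when it happened."""
        if STATUSES.index(new_status) <= STATUSES.index(self.status):
            raise ValueError(f"cannot move from {self.status} to {new_status}")
        self.status = new_status
        self.history.append((new_status, datetime.now()))

story = Story("cms-column")
story.advance("writing")
story.advance("editing")
print(story.status, story.history)
```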

3

Finally, an intriguing post that could indicate the end of cobbled-together, homegrown editorial CMSes. Much can be said about Google, but even its detractors have to admit that when the company puts its mind to doing something, it gets done. That something might soon be a CMS “that would unify editorial, advertising and perhaps commerce activities for media companies.”

The so-far-untapped opportunity that Google is chasing — articulated with greater frequency this year in ad tech circles — is to take a holistic approach to managing yield that spans multiple publisher revenue sources and screen form factors.

The idea that a unifying, editorially driven CMS hasn’t yet been developed is rather shocking in itself. But the arguments the article makes against Google developing such a product are the pinnacle of self-reproach and shame. It’s almost as though all of online publishing has been told by its shrieking mother, “This is why we can’t have nice things!” and internalized the message:

A CMS could be a tough sell for Google, especially as a number of publishers have lately staked their future on the strength of a proprietary CMS. Three prominent examples are Vox Media, whose vaunted Chorus CMS is considered its secret sauce, BuzzFeed, which has baked native advertising into its content platform, and The New York Times, where technology-powered storytelling is seen as core to its editorial and advertising mission. For such publishers, adopting a CMS from a large platform player like Google would be tantamount to outsourcing the very notion of innovation.

Additionally many established publishers have customized their content tools to integrate with legacy publishing systems. Many publishers use multiple CMSs, for instance a custom platform powered by Drupal alongside WordPress for blogging. So there’s a big technical hurdle to adopting any off-the-shelf solution Google has on offer. That’s setting aside the technical and human resources barriers required to migrate away from “good enough” content systems.

This last part reminds me of the great Aimee Mann song “Momentum”: “But I can’t confront the doubts I have/I can’t admit that maybe the past was bad/And so, for the sake of momentum/I’m condemning the future to death/So it can match the past.”

It seems obvious that we should embrace enhancements to CMSes for editorial, be they analytics, metrics, platforms, workflows, or appropriate ad-edit collaboration.


Junk at scale vs. quality in proportion

SF Weekly recently published an in-depth look at the Bleacher Report, a sports-centric site whose content is populated almost entirely by its readers. As the article notes, it “[tapped] the oceanic labor pool of thousands of unpaid sports fanatics typing on thousands of keyboards.” The site is user-generated content taken to its logical extreme, for good and bad. The good being the scale of coverage; the bad, the poorly written content.

But now it’s gone pro, hired real writers and editors, and been polished up — and the “lowest-common-denominator crap,” editor King Kaufman says, has been gussied up. The site is now owned by Turner Broadcasting, which snapped it up this summer for a couple hundred mil. Not bad for a site that was built on the backs of unpaid superfans.

I’m not interested in the Bleacher Report per se, but I am interested in the idea that nowadays crap at scale matters less than quality in proportion, because that shift is part of a larger trend sparked by disparate forces in the evolution of the Internet. Those forces have come together to wipe away a short-lived business model built on garbage content that ranked well in search but left the reader unfulfilled. The model’s most prominent proponent was Demand Media (and its sites, among them eHow and Livestrong), but certainly the Bleacher Report qualifies too.

The article does a good job explaining how Bleacher Report (and Demand) initially found so much success — basically, by cheating search engines:

Reverse-engineering content to fit a pre-written headline is a Bleacher Report staple. Methodically crafting a data-driven, SEO-friendly headline and then filling in whatever words justify it has been a smashing success.

The piece also touches on the larger context of the shift from what it calls “legacy media” to the current landscape:

After denigrating and downplaying the influence of the Internet for decades, many legacy media outlets now find themselves outmaneuvered by defter and web-savvier entities like Bleacher Report, a young company engineered to conquer the Internet. In the days of yore, professional media outlets enjoyed a monopoly on information. Trained editors and writers served as gatekeepers deciding what stories people would read, and the system thrived on massive influxes of advertising dollars. That era has gone, and the Internet has flipped the script. In one sense, readers have never had it so good — the glut of material on the web translates into more access to great writing than any prior era. The trick is sifting through the crap to find it. Most mainstream media outlets are unable or unwilling to compete with a site like Bleacher Report, which floods the web with inexpensive user-generated content. They continue to wither while Bleacher Report amasses readers and advertisers alike.

But that being the case, we’re now entering a brand-new era, one that will attempt to combine the scale and optimization of the new guys with the polish of the old. And we’re seeing the end of the SEO-engineered-dreck model for three reasons:

1. The rise of social media as currency
2. Google’s Panda algorithm change
3. Advertiser interest

1. The rise of social media as currency
Used to be, back in the aughts, when you were looking for (for example) a podiatrist, you’d Google “podiatrist 10017.” You’d get pages and pages of results; you’d sift through them and cross-reference them to your insurance provider, then go to the doctor, discover he had a terrible bedside manner, and decide you’d rather keep your darn ingrown toenail. Nowadays, your first move would probably be to ask your friends on Facebook or Twitter, “Anyone in NYC have a recommendation for a good podiatrist who takes Blue Cross?” And you’d get a curated response from a dependable source (or even a few of them).

Plainly, social media users endorse people, products and articles that are meaningful. You’d never tweet, “Great analysis of how to treat an ingrown toenail on eHow” (at least not unironically). But you might recommend an article from Fast Company on the latest from ZocDoc.

There will always be a place for search — it’s one of the main entryways into any news or information site, and that’s not going to change anytime soon — but high-quality content from a trustworthy source is becoming increasingly valuable again.

2. Google’s Panda algorithm change
In early 2011, Google changed its algorithm in an update it called Panda. This meant that, broadly speaking, better content ranked higher in Google’s results. Its advice to publishers regarding SEO was basically, “Create good content and we’ll find it.”

No longer could Demand Media’s and Bleacher Report’s search-engine-spamming formula win them page views. In fact, Demand Media completely retooled itself in response, saying that “some user-generated content will be removed from eHow, while other content will run through an editing and fact-checking process before being re-posted.”

In other words, quality started to matter to users, who let Google know it, and Google responded accordingly. The result was a sea change in how content was produced and ranked, forcing a completely new business model on Demand and its ilk.

3. Advertiser interest
Advertisers have long shunned poor-quality content. From the beginning, they almost never wanted placements on comment pages, which can feature all-caps rants, political extremism at its worst and altogether unsavory sentiments (which is why many news sites feature comments separately — you thought that tab or link to comments on a separate page was a UX choice? Hardly). The SF Weekly article quotes Bleacher Report’s Kaufman, who says of its transformation to better-quality stuff, “This was not a decision made by the CEO, who got tired of his friends saying at parties, ‘Boy, Bleacher Report is terrible.’ Bleacher Report reached a point where it couldn’t make the next level of deal, where whatever company says ‘We’re not putting our logo next to yours because you’re publishing crap.’ Okay, that’s the market speaking.”

So it is. A longer story for another time, but neither advertisers nor publishers are getting a lot of bang out of banner ads, CPMs and click-through rates. Increasingly, the least you can do to appeal to the market, if you’re a publisher, is create good content. How to do it without breaking your budget and while devising new technologies, maintaining your legacy product and operations, and appealing to readers…well, if I knew the answer to that, I’d be a rich woman.

Meantime, even though “critics from traditional journalistic outlets continue to knock Bleacher Report as a dystopian wasteland where increasingly attention-challenged readers slog through troughs of half-cooked word-gruel, inexpertly mixed by novice chefs,” the site is making money like you wouldn’t believe. It doesn’t break stories; it owns them (the same is true of the Huffington Post).

Time for the “legacy” to embrace the future.


Trusted brands rule social

UCLA and HP researchers have determined that successful tweets have common — and predictable — characteristics. Per this fascinating piece in the Atlantic, the researchers’ algorithm can predict a tweeted article’s popularity “with a remarkable 84 percent accuracy” based on the principle that a story’s social success can be defined by its source, its category, the language used and the celebrity factor. But the striking thing is just how much the “source” part accounts for:

“What led most overwhelmingly, and most predictably, to sharing was the person or organization who shared the information in the first place. …Brand, even and especially on the Internet, matters. Online, the researchers are saying, the power of the brand is exactly what it has been since brands first emerged in the Middle Ages: It’s a vector of trust. …When it comes to news, trust is actually much more important than emotion. Shareability is largely a function of reliability.”
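
The article doesn’t publish the researchers’ actual model, but the general approach (score an article on a handful of features, with the source carrying most of the weight) is easy to picture. Here is a hedged sketch in Python; the feature names and weights are invented for illustration, not taken from the study.

```python
# Invented, illustrative weights: "source" dominates, echoing the study's finding
# that who shares an article predicts its popularity far more than what it says.
FEATURE_WEIGHTS = {
    "source_trust": 0.60,           # reputation of the publishing outlet
    "category_appeal": 0.15,        # how shareable the topic category tends to be
    "language_subjectivity": 0.15,  # emotional/subjective language in the piece
    "celebrity_mentions": 0.10,     # the "celebrity factor"
}

def predicted_share_score(features: dict) -> float:
    """Weighted sum of normalized (0 to 1) feature values; higher means more shareable."""
    return sum(weight * features.get(name, 0.0)
               for name, weight in FEATURE_WEIGHTS.items())

article = {"source_trust": 0.9, "category_appeal": 0.5,
           "language_subjectivity": 0.4, "celebrity_mentions": 0.2}
print(predicted_share_score(article))  # a 0-to-1 score; roughly 0.7 for this made-up article
```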

It’s all part of the trend of consumers having conversations with brands and vice versa — instead of being overtly marketed at and sold to, as in days past — and of the resulting trust awarded to brands that do it well. Extrapolating a bit: content marketing and social marketing, which help brands build that trust and have those conversations, now have measurable evidence behind them in this study.

As recently as last year, many brands’ strategy could be summarized by the following (ridiculous) two-pronged approach: 1. Chase SEO (damn the quality of the result); 2. Pray for something to (somehow) go viral. But the Internet changes with alarming rapidity, and the past year and a half has seen a major shift away from these tactics. SEO baiting abated, thanks to Google tweaking its algorithms to rank better content higher, and brands acknowledged that since viral content is by its nature unreliable, they shouldn’t rely on it.

This isn’t to say that search and innate shareability shouldn’t be considerations for brands — they absolutely should be; they’re foundational. But the strategy going forward is to reach users where they are (Facebook, Twitter, Instagram, Pinterest, etc.), give them something reliable and useful, and earn trust in return.

So-called old media, for their part, must become trusted sources again in this new landscape. Successful new brands (Fab.com, to name one) are taking it a step further with an almost post-branded attitude: Their online presence not only establishes trust with consumers, but their conversational, understanding tone also unpacks branding itself and exposes undisguised sellers as outmoded entities that peddle wares to you but don’t really get you.

Reaching consumers and establishing trust by getting them isn’t a new concept in advertising and marketing, but it’s one that must be repeatedly learned anew as consumer attitudes evolve. It’s a snarky world, but it’s the one we live in, and brand strategies must evolve or perish.


Content strategy in context

From content strategist Rahel Bailie’s Intentional Design blog, regarding what she terms Big Content, which is to say: “Consideration of content beyond the copy, and even beyond the content.”

“When users feel good about an experience now, they will give feedback now. Conversely, when users have a bad experience, they are more likely to hold onto that feeling of indignation until they feel heard. …For organizations that increasingly depend on user-generated content as part of their marketing strategy, it’s important for them to (a) get users to generate content and (b) get users to generate content that reflects well on their customer experience. In other words, building an environment that encourages users to give immediate feedback should increase the number of instances of positive feedback.”

More — way more — about content strategy and how it relates to user experience in her Big Design SlideShare deck. My favorite bit is the slide on what content means in context.
