The deflationary impact of the Internet on incumbent business models
Okay, so I am back onto the Internet and Journalism series. In case you’ve arrived here without reading the others first, no worries; this post can work as a standalone. But I do recommend backfilling and reading the other posts. While I am making recommendations, let me also add that this site is reader-supported via Patreon. So please hit up this link and become a Patron.
My narrative on the Internet thus far
Here’s what I’ve laid out:
First, in “The Internet is for data”, I wrote how the Internet has to be dominated by search because its whole purpose – connecting billions of people – comes down to sifting through a byzantine maze of data to find the specific data you want. Yahoo tried to win with a curated-data approach, which works in the “walled garden” atmosphere of a 1990s America Online. But when you have billions of people and organizations adding data to the web daily, curation isn’t scalable.
In my second post, I wrote how “Suspension of disbelief is critical to process data”. There were two corollaries I wanted to open up there. I laid out the first – that the panoply of virtual sources of information increases scepticism by disseminating fake news. We can’t rely on crowd-sourced signals of source authority or authenticity. So we retreat to safe spaces to consume our media. That leads to polarization. The second point I haven’t gone into detail on – that news organizations that rely on search will do well despite reader scepticism.
But in the third post, I went in a different direction and talked about “The Internet as a disrupter on data, audio and video platforms”. In this post, I want to link that idea to the idea that search is critical for news organizations, especially now that high data rates are enabling disruption across all three platforms – written (data), audio and video. Let’s start with newspapers.
Newspapers get shellacked by the HotJobs and Ebays of the world
In the old days, news was subsidized by classified advertising. If you read a newspaper for the news, opinion and information, all you would see was the news and the display ads next to it. But deep in the recesses of the newspaper was a treasure trove of revenue: the classifieds.
When I was at Yahoo’s HotJobs division, we were busy siphoning off that revenue stream by taking job classifieds to the Internet, making them more compelling via search and serving job seekers a much wider geographical footprint of job adverts. The basic job classified model was the same, but the digital platform was superior to what the newspapers had. And it was cheaper for employers to place ads too. So all of those ad dollars got sucked out of the newspapers. Only through their CareerBuilder platform did the newspapers win a decent amount of it back.
At the same time, Ebay was busy crushing the other big classified revenue stream of newspapers. And Ebay ramped so quickly after the Internet bubble crashed that it became the first example of the power of network effects in peer-to-peer Internet networks. All of the sellers were on Ebay. And all of the buyers too. Which is more important? Who cares. If you’re a buyer, you know where to go. If you’re a seller, same thing. And so, all of that revenue went out the window for newspapers.
It was the data rates that mattered here
What I’d like to stress, though, is that the reason this happened so fast is that newspapers were in a uniquely bad position: they were selling written data. And written data is the least bulky of the media types to pipe across the net. As the Internet built out, industries that relied almost exclusively on written words (and a few pictures) would naturally be the first to feel massive disintermediation. Newspapers are in one such industry.
But you don’t need a huge ramp in Internet data rates to get to audio platforms. As I mentioned in the last post in the series:
Text is much less data intensive than audio. But audio is an order of magnitude less data intensive than video. A resume might take up 100 or 200 kilobytes of data, while an mp3 audio song file might take up 4 to 8 megabytes of data. The biggest leap is to video, where a 40 minute video encoded at 720p would take up 2.4 gigabytes.
You’re talking 40 times more data for a song than a resume, whereas you’re talking about 10,000 to 20,000 times more data for a decent-quality video. So it’s no wonder the music industry was pretty quick to see its business model disrupted.
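To make those ratios concrete, here’s a quick back-of-the-envelope calculation using the rough file sizes quoted above. The specific figures are illustrative assumptions, not measurements:

```python
# Rough file sizes quoted above (illustrative assumptions, not measurements)
resume_kb = 150   # a text resume: roughly 100-200 KB, take the midpoint
song_mb = 6       # an mp3 song: roughly 4-8 MB, take the midpoint
video_gb = 2.4    # a 40-minute video encoded at 720p

# Convert everything to kilobytes so the ratios are easy to compare
song_kb = song_mb * 1024
video_kb = video_gb * 1024 * 1024

print(f"song vs resume: ~{song_kb / resume_kb:.0f}x more data")    # ~41x
print(f"video vs resume: ~{video_kb / resume_kb:.0f}x more data")  # ~17,000x
```

Depending on which end of those ranges you pick, the video multiple lands in the tens of thousands, which is the point: video is a different beast entirely.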
Data-only platforms make unbundling easier and that reduces margins
In business, tying two products together is a great way to get some incremental revenue, especially when one product is indispensable. The core of the anti-trust case against Microsoft 20 years ago was about Microsoft tying their Windows operating system to other software products Microsoft was selling. The recent EU fine against Google is essentially about the same issue, this time on Google’s Android platform.
But Microsoft and Google are the exception to the rule. On data platforms like the Internet, it’s actually more common to see the unbundling of these ties that exist on other platforms.
For example, in the old days, if you wanted to get a job in America, you would buy the Sunday newspaper. This paper was chock full of classified advertising, much more than the job adverts available on other days. And of course, the Sunday newspaper cost a lot more as a result. Now, the newspapers offered a lot of extra entertainment and opinion on Sunday too. But the reality is that a lot of what made the Sunday paper indispensable was the classifieds. The Internet ‘unbundled’ that offering. And that made the Sunday paper much less indispensable.
The music industry was built around bundling
Since audio data isn’t that much more dense than text data, it’s no surprise that the music industry came under attack very early on. A lot of people talk about piracy and services like Napster as the driving force of what killed the music industry’s margins. But the reality is it was unbundling – and services like iTunes – that did it.
The music industry bundles music in albums by single artists or in a compilation or in a soundtrack. These albums are big sellers. But in the pre-Internet days they were much bigger sellers.
Here’s Billboard magazine from 2014:
You know this. You tossed your CD tower years ago. The Sam Goody at your shopping mall is a dim memory. And while downloads have taken the place of compact discs, the number of albums downloaded each year pales in comparison to the number of CDs people bought back in the day.
As much as that’s common knowledge, you might be shocked how insanely different today’s album-buying landscape is from the music industry in 1994. As part of our celebration of 1994, we compared the 10 best-selling albums of the first eight months of 1994 (period ending 9/4/1994) to the 10 best-selling albums of the first eight months of 2014 (period ending 9/7/2014). All data comes from Nielsen SoundScan.
…Let’s put it this way: The second best-selling album of 2014, Beyonce’s self-titled, hasn’t sold more than one million copies in this calendar year. But back in 1994, 38 albums had already sold more than a million copies by Sept. 4…
It’s not just that hit albums sold more back then — it’s also that there were more platinum-selling albums in the U.S. each year. In 1994, dozens of records went platinum every year. These days, not so much.
That’s unbundling.
Here’s the thing though: when iTunes started selling singles by the gazillion, all of that incremental revenue music companies got from bundling 11 songs into an album and selling it went poof. What’s more, the price point of $0.99 for a single on iTunes was much lower than the price point for selling a CD single or a record single with a B-side. So margins got crushed.
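To see why margins got crushed, here’s a hedged sketch of the arithmetic. The album price and the number of singles a casual fan buys are assumptions for illustration, not industry data; only the $0.99 price point comes from the text above:

```python
# Illustrative assumptions only - not actual industry figures
cd_album_price = 15.00      # assumed price a fan once paid for a full CD album
itunes_single_price = 0.99  # the iTunes price point mentioned above
singles_bought = 2          # assume the casual fan only wants the two hit songs

bundled_revenue = cd_album_price
unbundled_revenue = itunes_single_price * singles_bought

print(f"bundled (album):     ${bundled_revenue:.2f}")
print(f"unbundled (singles): ${unbundled_revenue:.2f}")
print(f"lost per casual fan: ${bundled_revenue - unbundled_revenue:.2f}")  # $13.02
```

Multiply that gap across millions of casual buyers and the incremental revenue from bundling eleven songs into an album simply disappears.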
Quick thought on how digitizing data makes disruption possible
As I was writing this I thought about the Kodaks, Nikons and Canons of the world. And it occurred to me that one piece I have left out is the digitization process. Words are easy to digitize because each character fits into a single byte – the standard ASCII set has just 128 characters, 256 in its extended forms. That’s what makes the data payload for written text so low. Digitizing a song requires more data. And digitizing a photo requires even more data.
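As a rough illustration of those escalating payloads, here’s a quick sketch. The byte counts are ballpark assumptions, not measurements:

```python
# Ballpark payload estimates for digitizing different media (assumptions, not measurements)
page_of_text = 3_000                 # ~3,000 characters at one byte each
song_mp3 = 6 * 1024 * 1024           # ~6 MB for a compressed audio track
photo_6mp_raw = 6_000_000 * 3        # 6 megapixels x 3 bytes per pixel (RGB), uncompressed
photo_6mp_jpeg = photo_6mp_raw // 8  # JPEG compression shrinks that by very roughly 8x

for label, size in [("page of text", page_of_text),
                    ("mp3 song", song_mp3),
                    ("6 MP photo, raw", photo_6mp_raw),
                    ("6 MP photo, jpeg", photo_6mp_jpeg)]:
    print(f"{label}: ~{size / 1024:,.0f} KB")
```

The exact numbers don’t matter; the ordering does. Text is trivial to move around, audio and photos take real bandwidth, and video takes vastly more still.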
Just over twenty years ago, I bought a nice Nikon N6006 analogue SLR camera that I took on trips. And of course, I usually used Kodak film. Getting film developed was expensive and time-consuming. And oftentimes the shots I took sucked. So I took as few pictures as possible, with appropriate care on each shot, knowing every bad one cost me money.
Once digital SLRs hit 6 megapixels almost 15 years ago, that was over. The picture quality was terrific. And I could take as many pictures as I wanted because bad shots no longer cost me anything.
Kodak got crushed.
The same evolution has occurred with moving pictures, which require a step up in data intensity and processing speed. That has made the motion picture industry ripe for disintermediation. The key factor holding things back was slow Internet data speeds.
Now it’s the movie and TV industry’s turn to feel the Internet
Data transfer speeds are huge these days. So, we have finally arrived at the moment where unbundling occurs in the motion picture and TV industry.
Right now, I am trialing several live TV services so that I can stop paying for channels I don’t care about. The way the US cable industry has become profitable is by forcing customers to pay for bundles of content that include stuff they don’t care about. The cable companies know which content is ‘indispensable’, so they bundle – that is, tie – that content with other, less valuable content, jacking up the price in the process.
With data transfer speeds as high as they are now, live TV services can chop those bundles up and reduce the price. As it stands now, we still have bundles. But over time, I believe we will see a much more a la carte pricing system that will reduce how much money the cable companies can generate.
The key variable in that process is vertical integration, where cable companies like Comcast also own content like the NBC suite of stations. Comcast can use this vertical integration as leverage to prevent unbundling in order to keep their profit margins high. And the threat of disintermediation and lax anti-trust laws in the US are a major reason media mergers are leading the way in deal flow.
Despite the mergers, media companies’ revenue models will come under assault. The days of huge margins from bundling are over.
Final thought: back to newspapers
The New York Daily News laid off half of its editorial staff yesterday. The cost pressures are just too great. A paper like the Daily News just doesn’t offer enough of a differentiated product, in a world awash in free news on the Internet, to garner enough revenue to maintain its existing staff count.
As newspapers downsize under the assault of Internet-inspired revenue loss, alternative media will spring up to replace them. And from my perspective, much of this alternative media will have lower journalistic standards, contain more bias and partisanship, and be more prone to peddling ‘fake news’.
The question for these media outlets is where the revenue stream will come from. That’s the same question newspapers are asking themselves. One thing is clear though: alternative media outlets that can rank highly in the Internet’s search and sharing algorithms will win in this world. And to the degree they can become embedded inside the partisan safe spaces now developing in reaction to the byzantine media landscape, they will be met with a more trusting and credulous audience.