Following a comprehensive report on market structure and antitrust, the Committee for the Study of Digital Platforms (at the Stigler Center at the University of Chicago) has released a report, which I co-authored, on media and regulatory options to enhance citizen welfare. Our main policy recommendations are:
- Introducing some public funding for news organizations, relying on citizen choice, to support journalism. The allocation mechanism should be designed to promote competition and entry and to limit the entrenchment of large incumbent news media outlets. The funds should be allocated directly by citizens, independently of any government intervention. Special consideration should be given to the funding of local journalism, where most of the aforementioned problems are concentrated today. This funding mechanism is highly cost effective: $50 per US adult is likely to be sufficient.
- All mergers and acquisitions involving news companies should be subject, in addition to the standard antitrust review, to a news plurality review. Standard competition policy protects direct consumer welfare, and therefore does not take into account the indirect effects that excessive media concentration can have on citizen welfare. We propose an approach to quantifying news plurality that is neutral to the identities of the owners of the merging entities and to the platform on which news content is delivered. The proposed approach, based on attention shares, has been used in a recent merger decision in the UK.
- Developing a new regulatory system that will ensure the necessary transparency regarding information flows and algorithms. This can be done through a new regulatory framework and oversight body that sets standards for the disclosure of information and news sources, develops source-based reputational mechanisms, and brings to light biases and choices in editorial decisions and in the algorithms for presenting the news. These regulators should produce periodic reports on news consumption and the influence of algorithm design on the distribution of news and the behavior of users.
- Digital platforms enjoy a hidden subsidy worth billions of dollars by being exempt from liability for most of the speech on their platforms (Section 230). We do not propose repealing Section 230; rather, we propose that platforms wishing to enjoy this protection should have to agree to take clear measures to prioritize content according to criteria other than the maximization of ad revenue.
The report develops each in considerable detail.
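To make the attention-share idea concrete, here is a minimal sketch of how a plurality screen might work. The report's proposal is based on attention shares; the specific concentration index used below (an HHI-style sum of squared shares) and all outlet names and numbers are my own illustrative assumptions, not the report's exact formula.

```python
# Hypothetical attention-share plurality screen.
# Shares are computed from attention (e.g. minutes of news consumption),
# pooled across platforms, which keeps the measure neutral to delivery platform.

def attention_shares(minutes_by_outlet):
    """Convert per-outlet attention totals into shares summing to 1."""
    total = sum(minutes_by_outlet.values())
    return {outlet: m / total for outlet, m in minutes_by_outlet.items()}

def concentration(shares):
    """HHI-style index on shares in [0, 1]; higher means more concentrated."""
    return sum(s * s for s in shares.values())

# Hypothetical pre-merger attention data (minutes per day, all platforms pooled).
pre = {"OutletA": 400, "OutletB": 300, "OutletC": 200, "OutletD": 100}

# Hypothetical merger: OutletA acquires OutletD, so their attention pools.
post = {"OutletA+D": 500, "OutletB": 300, "OutletC": 200}

delta = concentration(attention_shares(post)) - concentration(attention_shares(pre))
print(f"Concentration change from merger: {delta:+.3f}")  # prints +0.080
```

A plurality review could flag mergers whose change in this index exceeds some threshold, independently of who the owners are.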
The recommendations that interest me most are the ones on transparency. These address the problem nowadays called 'fake news.' We saw it not as a problem of some content being untruthful or purposely misleading; as a practical matter, that is a problem we face all the time in information economics. Instead, it is a problem of empowerment. Attention is limited, so when people evaluate information presented to them for its quality, they need context, or at least the hope of context. Fake news is a problem when it is difficult to convince others that news is fake. The best mechanism we have for dealing with the coexistence of limited attention and the need to assess information quality is an assessment of the reputation of the entities providing the information.
The problem today is that it is hard to identify the source of a piece of information, let alone its interests. The good news is that we do not need a heavy hand to provide that information. Instead, we need a mechanism so that those who want to provide 'source context' can do so in a verifiable manner.
The idea in the report is to create a registry of sorts where entities can (a) provide verifiable information on their interests and identity and (b) sign particular pieces of information. That's it. When information sourced by them is published, it carries a link back to the registry. If you are a media outlet, this is quite natural. If you are someone else, you can register and then provide information.
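The registry mechanism above can be sketched in a few lines. This is my own toy illustration, not the report's design: a real deployment would use public-key signatures and independently verified identities, whereas here an HMAC with a registry-held secret stands in for a digital signature, and all entity names are hypothetical.

```python
import hashlib
import hmac

class SourceRegistry:
    """Toy registry: entities declare identity/interests and sign content."""

    def __init__(self):
        self._entities = {}  # entity_id -> {"interests": str, "key": bytes}

    def register(self, entity_id, interests, secret_key):
        """(a) Record verifiable identity and interest information."""
        self._entities[entity_id] = {"interests": interests, "key": secret_key}

    def sign(self, entity_id, content):
        """(b) Sign a piece of content on behalf of a registered entity."""
        key = self._entities[entity_id]["key"]
        return hmac.new(key, content.encode(), hashlib.sha256).hexdigest()

    def verify(self, entity_id, content, signature):
        """A reader follows the published link back here to check the signature."""
        expected = self.sign(entity_id, content)
        return hmac.compare_digest(expected, signature)

    def lookup(self, entity_id):
        """Expose the signer's declared interests, the point of the mechanism."""
        return self._entities[entity_id]["interests"]

registry = SourceRegistry()
registry.register("outlet-x", interests="owned by Example Holdings", secret_key=b"demo")
sig = registry.sign("outlet-x", "Story text...")
print(registry.verify("outlet-x", "Story text...", sig))   # prints True
print(registry.verify("outlet-x", "Altered story", sig))   # prints False
```

Note that the registry never judges truth: verification only ties content to a declared identity, and `lookup` reveals whose interests lie behind it.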
The end result is that content providers who are happy to sign their work will do so. Others will not. Does this stop someone from signing content that is misleading? No. It just gives people the opportunity to find out who that person is and what their interests are. In other words, it empowers.
There is much more of interest in the report and hopefully these recommendations will provoke new debate.