Abstract
Because the United States has no digital regulator to set minimum quality or safety standards for digital products, dominant platforms have both the ability and the permission to harm consumers, and that harm will only accelerate. Digital platforms with market power have no incentive to shoulder the expense of providing safe, high-quality services, because the marginal costs of increased quality and safety, often in the form of human beings engaged in content moderation or fact-checking, are so high. If providing better quality would increase profits, digital platforms would have done so already. Instead, digital platforms act like automobile manufacturers before regulators required seatbelts: they insist that people would rather ride in unsafe cars than shoulder the extra expense of a seatbelt. This argument is especially difficult to counter in digital markets, because many consumers do not realize that these services are not free; they are paying for them with their attention, their data, their mental health, and their democracy. Recent actions by the CEOs of Meta and X have demonstrated the propensity of social media sites to increase their profits by ending investments in safety, regardless of the risks to users and in violation of the laws of other nations. It is past time for United States lawmakers to hold digital businesses to the same standards as other sectors, such as food, automobiles, and pharmaceuticals, and to require that their products be safe.
Department
Law
Subject
Computer Law, Internet Law, Consumer Safety Law
Publication Date
2025
Journal Title
Richmond Journal of Law & Technology (JOLT)
Publisher
University of Richmond
Document Type
Article
Recommended Citation
Margaret O’Grady & Fiona Scott Morton, Digital Platform Safety and The Problem of Variable Costs, 31 RICH. J.L. & TECH. 195 (2025).