November 22, 2024
The Friel-Scanlan lecture was established in 1989 in memory of Francis S. Friel and Sara G. Scanlan Friel by Francis A. Scanlan (LAW ’50). The award recognizes outstanding faculty scholarship by a Temple Law professor, who then delivers a lecture to the law school community. Professor Salil K. Mehra, Professor of Law, Director of the LL.M. in Asian Law, and Co-Faculty Director of Temple University’s Institute for Law, Innovation, and Technology, received the award and delivered a lecture entitled “Antitrust and AI: Price-fixing and Monopolization by Algorithm.”
Professor Mehra discussed “algorithmic collusion” and “surveillance pricing” — each of which describes a potential harm to consumers driven by increasingly powerful software and technologies. Professor Mehra suggests that these tools, individually and in combination, create greater potential for consumer harm.
Kevin Sherry [KS]: Can you talk a little bit about your background with antitrust and what led you down this path?
Salil Mehra [SM]: It was an area that appealed to me when I was a student. At the time, this would have been the late 90s, and then really for the 15 years after that, it wasn’t seen as a particularly lively area. You know, as late as 2010, Richard Posner was saying it was a dead field. But I thought there was potential in it. I thought it was interesting, so I started my career in it after law school. I clerked for a judge in the First Circuit, but then I went to the U.S. DOJ and worked in the Antitrust Division, and that’s where I started my legal career, really.
KS: You broke down antitrust law into a few different sections in terms of traditional merger review and the Sherman Act, but would you say more recent advancements in terms of technology could lead to potentially more consumer harm?
SM: Yes, there are three big areas in terms of how much activity there is in antitrust. One is merger review; another is Section 2 (Sherman Act) monopolization, which wasn’t very active until about 2019 or 2020, although suddenly we have, you know, five cases against four big tech firms. But what I’ve been focused on, like you said, is Section 1, which is about agreements in restraint of trade. It’s the idea that, effectively, it’s become easier for competitors to collude and potentially fix prices because of breakthroughs in technology and in the kinds of software tools available to them. The ability to collect, process, and digest data, and then propose or recommend or set prices based on that data, has also developed.
Depending on how these apps or services are provided, they can be very useful. If you’re competing in a market and setting your price, you can get a handle on what the market is and understand how fast things are changing in terms of supply and demand. That’s very useful and can potentially be really good in helping the market adjust to changes; that’s what we would call efficiency. But there’s also potential for harm, and that’s what we see in some of the cases I was talking about in this talk, with some of the providers who are drawing scrutiny, like RealPage and apartment rental pricing. Here, the technology now allows a third party to collect real-time data from competitors and then quickly supply various suggestions. The concern is that the suggestions are actually potentially price-fixing.
KS: In terms of looking at an algorithm leading to inflated pricing, do you think there’s any natural market factor deterrence to this type of collusion and do you think that can be kind of effective in its own way?
SM: I think there is always some constraint on anticompetitive conduct like monopolization and price-fixing: even a monopolist can’t raise prices without limit. If we had, say, one national private automobile company, it would certainly charge more for cars than companies do now, but it would still be constrained by people’s budgets. Even with a monopoly, it doesn’t want to sell zero cars by charging $200,000 per car. So, while there’s a constraint, the constraint kicks in only after a significant amount of pain for the consumer.
KS: Leading up to the election, there was some discussion of the idea of price-gouging. Is that referring to a similar notion of algorithmic pricing or surveillance pricing?
SM: Not necessarily. Some countries do have competition or antitrust laws that allow for the conclusion that a price is exorbitant or too high. In the US, at least at the federal level, we don’t have a concept like that. We do have price-gouging laws in a lot of states, but they tend to be focused on sharp price increases during periods of declared emergency.
KS: You mentioned some pending legislation in terms of creating a rebuttable presumption of an agreement when competitors are using the same software. Is there any thought to what that would look like in reality?
SM: Yes, it comes down to sharing data with your competitors that you would not normally share with them. Effectively, it could put providers of such products on notice if they’re taking nonpublic information from various competitors. Basically, it says that if you’re collecting it, processing it, and sharing it back with them, you should be careful. When we see that, it is reminiscent of the plus factors in Twombly: something more than parallel conduct, such as competitors sharing that kind of information, or meeting in secret, or any number of other things. The presumption would allow a court to infer that there is an agreement. I think the goal would be not just to have more litigation, but to have more carefully designed, less anticompetitive third-party software.
However, we also see examples like the city of Philadelphia, which has a bill on the table that proposes to ban a similar type of software used by landlords. I think that goes a little too far, because there are probably a lot of situations where the software can be properly designed and maybe not built into a subscription service that suggests what price you and your competitors should charge. There are a lot of situations where some sort of revenue management software actually makes sense and is potentially good for everyone.
KS: Some current articles mention the idea that eventually machine learning and AI can arrive at a collusion decision on its own without any bad (human) actors. Do you think that’s possible?
SM: There are some theoretical papers by economists who’ve done experiments with software in a kind of software lab and have shown that, under certain assumptions, the software applications actually do learn to talk to each other and collude. I don’t know if we have that yet in real life, but it is something to be concerned about going forward.
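[Editor's note: The experiments Professor Mehra alludes to typically pit reinforcement-learning pricing agents against each other in a simulated market. The following is a minimal, purely illustrative sketch of that kind of setup. The price grid, demand function, and learning parameters are hypothetical assumptions, and these simple independent learners are far cruder than the state-dependent Q-learning used in the actual economics literature.]

```python
import random

# Toy sketch (illustrative only): two bandit-style learning agents
# repeatedly set prices in a simulated duopoly. Each agent keeps a
# running estimate (a "Q-value") of the payoff of each price level
# and mostly picks the price that has paid off best so far.

PRICES = [1, 2, 3, 4, 5]   # hypothetical discrete price levels; cost is 0
ALPHA, EPSILON = 0.1, 0.2  # learning rate and exploration rate (assumed)

def profits(p1, p2):
    """Simple Bertrand-style demand: the cheaper firm serves all 10
    buyers; a tie splits the market evenly."""
    if p1 < p2:
        return p1 * 10, 0
    if p2 < p1:
        return 0, p2 * 10
    return p1 * 5, p2 * 5

def choose(q, rng):
    """Epsilon-greedy choice: usually exploit the best-known price,
    occasionally explore a random one."""
    if rng.random() < EPSILON:
        return rng.randrange(len(PRICES))
    return max(range(len(PRICES)), key=q.__getitem__)

def simulate(rounds=5000, seed=0):
    rng = random.Random(seed)
    q1 = [0.0] * len(PRICES)
    q2 = [0.0] * len(PRICES)
    for _ in range(rounds):
        a1, a2 = choose(q1, rng), choose(q2, rng)
        r1, r2 = profits(PRICES[a1], PRICES[a2])
        q1[a1] += ALPHA * (r1 - q1[a1])  # move estimate toward observed payoff
        q2[a2] += ALPHA * (r2 - q2[a2])
    return q1, q2

q1, q2 = simulate()
best1 = PRICES[max(range(len(PRICES)), key=q1.__getitem__)]
best2 = PRICES[max(range(len(PRICES)), key=q2.__getitem__)]
print("learned price levels:", best1, best2)
```

This stripped-down version only shows the mechanics of learning a pricing policy from repeated market feedback. In the richer experiments, each agent also conditions on rivals' past prices, which is where supra-competitive, tacitly "collusive" pricing can emerge under certain assumptions.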
KS: You spoke about a hypothetical surveillance pricing in terms of looking at the labor market specifically, where theoretically an employer could train software to come up with the lowest possible wage they could pay an employee. Is there currently a legal protection for employees in this type of situation?
SM: In general, in the private sector, when you’re hiring someone, a contract at will is the presumption, and it’s what you negotiate, subject to basic safety regulation and whatever the appropriate minimum wage is in that area. Employers are increasingly scraping social media to see what exactly people are tied to personally. You could see software tools that currently suggest a hotel price or a rent instead suggest the lowest wage or salary that someone would be willing to work for. And for an employer, that would be huge, because a big factor in keeping costs down is labor costs. Right now, there’s really no legal protection against that.
Kevin Sherry (LAW ’26) is a Student Editor for the Temple 10-Q.