The De-Humanization of Anti-Trust

Categories: technology, economy

What? Companies are increasingly relying on algorithms to set prices. These ‘bots’ mine vast quantities of data, monitor competitors, generate forecasts, and automatically set prices to maximize profit. Some can even learn, like humans, through trial and error. But what happens if these bots collude instead of competing?
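The trial-and-error learning described above can be illustrated with a minimal sketch: a pricing bot that tries different prices, observes the resulting profit, and gradually favours the most profitable one. All names, prices, and the demand curve here are hypothetical, chosen only to make the mechanism concrete.

```python
import random

# Hypothetical candidate price points and unit cost (assumptions for illustration).
PRICES = [8.0, 9.0, 10.0, 11.0, 12.0]
COST = 5.0

def demand(price):
    """Toy demand curve (assumed): fewer units sell at higher prices."""
    return max(0.0, 20.0 - 1.5 * price)

def learn_price(episodes=5000, epsilon=0.1, seed=0):
    """Epsilon-greedy trial-and-error: explore random prices occasionally,
    otherwise exploit the price with the best average profit observed so far."""
    rng = random.Random(seed)
    avg_profit = {p: 0.0 for p in PRICES}
    counts = {p: 0 for p in PRICES}
    for _ in range(episodes):
        if rng.random() < epsilon:                   # explore: try a random price
            price = rng.choice(PRICES)
        else:                                        # exploit: best price so far
            price = max(PRICES, key=lambda p: avg_profit[p])
        profit = (price - COST) * demand(price)      # observe the outcome
        counts[price] += 1                           # update running average profit
        avg_profit[price] += (profit - avg_profit[price]) / counts[price]
    return max(PRICES, key=lambda p: avg_profit[p])

print(learn_price())  # → 9.0 (the profit-maximizing price in this toy market)
```

Note that no price is programmed in: the bot discovers the profit-maximizing price purely from feedback. When several such agents learn against each other's prices, researchers worry they may settle on supra-competitive prices without any explicit "agreement".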

So what? Antitrust laws prohibit collusion but were designed with human actors in mind. Prosecutors rely on evidence of “communication” and legal concepts like “agreement” and “intent” that are not necessarily applicable to autonomous machines interacting in the cloud. Moreover, computers are not bound by the same cognitive (or ethical) constraints that humans face. In the digital world, cartel-like outcomes may be achievable on a scale and scope not previously imagined. Consumers may suffer from higher prices, but who, if anyone, would be liable if bots collude? How could this be detected and prevented?

Source: The New Yorker – When Bots Collude


Date modified

2017-03-29