Are you being overcharged by clever AI? Watchdog looks at whether algorithms hurt competition

To stop algorithms from charging unfair prices when we shop online, the UK’s competition watchdog is launching a new investigation into the ways that AI systems might harm consumers – an issue that, the organization says, has so far lacked in-depth research and analysis, yet affects most of us in our everyday lives. 

While a lot of attention has focused on algorithmic harms in general, the Competition and Markets Authority (CMA) suggested that little work has been done in the specific area of consumer and competition harms, and reported that almost no research on the topic exists in the UK. There is particularly little insight into the ways that automated systems tailor costs, shopping options or rankings to each individual’s online behavior, often leading to consumers paying higher prices than they should. 

For this reason, the CMA has asked academics and industry to submit evidence about the potential harms caused by the misuse of algorithms, and is launching a program called “Analyzing Algorithms”, which could even help identify specific firms that are violating consumers’ rights, so that cases can be taken forward if needed. 

Kate Brand, director of data science at CMA, said: “We want to receive as much information as possible from stakeholders in academia, the competition community, firms, civil society and third sector organizations in order to understand where the harm is occurring and what the most effective regulatory approach is to protect consumers in the future.” 

From ordering food to arranging travel, a huge part of everyday life is spent making choices that involve an online platform of some sort. Equipped with ever-larger datasets about each individual’s preferences and behavior, businesses widely use algorithms to make sure that those choices are as optimal as possible. In many cases this is beneficial to the consumer, who will find themselves faced with a selection of options that better matches their expectations; but algorithms can also be exploited to manipulate users’ choices, sometimes unfairly, and often without them being aware of the process. 

In a preliminary research document that was published as part of the CMA’s call for evidence, the organization lifted the lid on the harmful impact of some of these algorithms. AI systems, for instance, might be used for personalized pricing – the act of advertising different prices to different customers, based on how much the technology predicts a given user is willing to pay. 
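The mechanism can be sketched in a few lines of code. Everything below – the signals, the weights and the prices – is an invented illustration of the general idea, not any real firm's model:

```python
# Hypothetical sketch of personalized pricing. Every signal, weight and
# price here is an invented illustration, not any firm's actual system.

BASE_PRICE = 100.0

def predicted_willingness(profile):
    """Toy willingness-to-pay score in [0, 1] built from browsing signals."""
    score = 0.0
    if profile.get("device") == "high_end_phone":
        score += 0.4  # crude proxy: pricier device, higher predicted budget
    score += min(profile.get("repeat_visits", 0), 5) * 0.1  # repeated interest
    return min(score, 1.0)

def personalized_price(profile):
    """Quote up to 30% above the base price to users predicted to pay more."""
    return round(BASE_PRICE * (1 + 0.3 * predicted_willingness(profile)), 2)
```

Two shoppers looking at the same product would see different prices – the one whose behavior suggests keener interest is quoted more – and neither would know the other’s price.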

Personalized pricing is only the tip of the iceberg, and the most obvious form of unfair practices. Behind the scenes, said the CMA, examples abound of algorithms influencing customers’ behavior, and indirectly leading them to pay higher prices.  

Take the all-too-familiar promotional message informing you that the product you are browsing has limited availability – a message easily generated by an algorithm that calculates the relevant figure. According to the CMA, this can create a sense of urgency in consumers, leading them to buy more and spend less time searching. While there is nothing harmful in informing customers about stock, reports have shown that these messages can be misleading or false; in some cases, the number of people also viewing that pair of shoes is effectively random. 
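The gap between the honest and the misleading version of such a widget is small. This sketch is purely illustrative – no real site's code, just the shape of the practice the reports describe:

```python
import random

def scarcity_banner(stock):
    """Honest use: report the real stock figure when it is genuinely low."""
    return f"Only {stock} left in stock!"

def viewers_banner(actual_viewers=None):
    """Misleading use: when no real figure exists, some sites have been
    found to display an essentially random number instead."""
    count = actual_viewers if actual_viewers is not None else random.randint(8, 25)
    return f"{count} people are viewing this right now!"
```

Both banners look identical to the shopper; only the first is anchored to a real measurement.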

According to the CMA, some companies also prompt buyers for feedback at a calculated time when they are more likely to leave a positive review, which generates less useful information for consumers generally. Ineffective fake-review detection can equally allow some products or brands to be hyped, misleading future buyers. With three-quarters of people reporting that they are influenced by reviews when they shop online, “the failure to detect these reviews and remove them can lead to consumers purchasing products or services that they do not want,” said the CMA. 

A major sticking point that was identified by the competition watchdog is that of rankings. There is still little understanding of the algorithms that determine what appears, and in which order, when a consumer types in a search query. What seems certain, however, is that the list of options that is automatically generated is not always created with the buyer’s best interest in mind. 

In a previous investigation, for example, the CMA found that some hotel booking sites ranked search results in a way that gave a false impression of a room’s popularity, when in fact the order in which the options were presented depended on the amount of commission a hotel paid to the site. 
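To see how quietly such a reordering can happen, consider this sketch. The hotel names, relevance scores and commission rates are invented for illustration; the point is only how little code separates the two rankings:

```python
# Hypothetical ranking sketch: names, relevance scores and commission
# rates are invented to illustrate the effect the CMA described.

hotels = [
    {"name": "Seaview", "relevance": 0.9, "commission": 0.10},
    {"name": "Grand",   "relevance": 0.6, "commission": 0.30},
    {"name": "Plaza",   "relevance": 0.8, "commission": 0.15},
]

def rank_by_relevance(items):
    """What the consumer might reasonably expect: best match first."""
    return sorted(items, key=lambda h: -h["relevance"])

def rank_with_commission(items, weight=2.0):
    """Quietly blending commission into the score reorders the results."""
    return sorted(items, key=lambda h: -(h["relevance"] + weight * h["commission"]))
```

With these numbers, the least relevant hotel jumps to the top of the commission-weighted list purely because it pays the site the most – and nothing on the results page reveals why.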

Lines get blurrier in the case of online marketplaces that sell their own goods alongside those of third parties; there are many examples of companies presenting their own products and services more favorably to win buyers over.  

There are multiple examples of bad algorithmic practices that eventually affect consumers’ decisions and the price they pay for their purchases, but there is ultimately little knowledge of how widespread the problem is. The CMA, in fact, maintained that regulators are only beginning to grasp the depth of the issue.  

“Algorithms play an important role online but, if not used responsibly, can potentially do a tremendous amount of harm to consumers and businesses. Assessing this harm is the first step towards being able to ensure consumers are protected and complements our wider work in digital markets to promote greater competition and innovation online,” said Brand. 

Given the opacity of the algorithmic systems at play, and the size of some of the firms implementing them, some strong regulatory work is needed to limit the potential harm that the technology could cause.  

Among some of the powers that might be granted to regulators, the CMA suggested requiring firms to disclose information about their algorithms to researchers and auditors, as well as ordering companies to make certain changes in the design of their existing systems. In the meantime, the most long-established advice still applies: think twice before you buy. 
