Australia reckons it’s ready to fight algorithmic collusion as world scrambles to review laws


21 November 2017. By James Panichi, Phoebe Seers and Matthew Newman.

The prospect of unaccountable computer programs scattered around the world joining forces to fix prices and create global cartels is not something that Australian regulators are losing much sleep over.

While regulators in Europe, the US and Asia scramble to work out how best to tackle the threat posed by collusive algorithms, Australia's top competition official made a speech last week on the issue that was surprisingly laid-back.

“In Australia, we take the view that you cannot avoid liability by saying ‘my robot did it’,” said Rod Sims, delivering a deft quote destined to punch through to the next day’s headlines.

So, as regulators in Asia-Pacific and Europe are saying that holding collusive artificial intelligence to account under existing laws won’t be easy, the Australian Competition & Consumer Commission is telling the world that it has things in hand.

Whatever the reason for the ACCC's optimism, its view is at some variance with the outlook shared by regulators in other jurisdictions.

In the EU, for example, the competition regulator’s ongoing investigation of electronics suppliers Asus, Denon & Marantz, Philips and Pioneer suggests that pricing software that adjusts retail prices to match those of rivals is already creating problems.

And it’s not just the EU watchdog that’s playing catch-up. Earlier this year, the UK's top antitrust official, David Currie, said that competition authorities may not have the right tools to confront pricing algorithms that react quickly to changes in competitors' prices.

"Machine learning means that the algorithms may themselves learn that coordination is the best way to maximize longer-term business objectives," said Currie, chairman of the Competition and Markets Authority.

In other words, artificial intelligence may choose to adopt anticompetitive behavior without the consent or even the knowledge of company managers, creating a regulatory headache when it comes to liability.
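The dynamic Currie describes can be sketched in a toy simulation. Everything here is an illustrative assumption, not drawn from any real pricing system: two sellers each run the same simple automated rule, matching a cheaper rival and testing a small increase whenever prices are level. No agreement exists, yet prices ratchet upward together.

```python
# Toy sketch of tacit algorithmic coordination (all values are assumptions).
COST = 5.0      # hypothetical unit cost
STEP = 0.5      # hypothetical upward "price test" increment
CEILING = 12.0  # hypothetical upper bound the rule will not exceed

def reprice(own_price, rival_price):
    """One seller's automated rule: match a cheaper rival (never going
    below cost), otherwise nudge the price up toward the ceiling."""
    if rival_price < own_price:
        return max(rival_price, COST)        # match, but never price below cost
    return min(own_price + STEP, CEILING)    # rival is at or above us: nudge up

def simulate(p_a, p_b, rounds):
    """Alternate the two sellers' repricing and return the final prices."""
    for _ in range(rounds):
        p_a = reprice(p_a, p_b)
        p_b = reprice(p_b, p_a)
    return p_a, p_b

# Starting from different prices, both sellers drift to the ceiling
# without any communication between them.
final_prices = simulate(8.0, 7.0, rounds=20)
print(final_prices)  # (12.0, 12.0)
```

The point of the sketch is that each rule is individually defensible as "competitive" price matching, yet the joint effect resembles coordination, which is exactly the liability puzzle regulators are describing.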

But Sims' argument is that Australia has what other jurisdictions may lack: a revamped set of competition laws that target the effects of collusion, not just the act of agreeing to collude. The ACCC believes that is an important distinction.

Harper power

Australia’s beefed-up competition laws, based on the 2015 Harper Report, specify that a company is prohibited from “engaging in a concerted practice that has the purpose, effect or likely effect of substantially lessening competition.”

That, Sims says, is the key to fighting algorithms. The ACCC no longer has to limit itself to pursuing backroom deals involving captains of industry — as it did, for example, in the 2007 Visy packaging cartel case, when the decision to collude with rival Amcor was shown to have gone all the way to the top.

Under the Harper changes to Section 46 of the Competition and Consumer Act 2010, the ACCC can move beyond the “meeting of the minds” test when considering whether there has been cooperation between companies that harms competition. It needs only to identify the effects of collusion.

“By focusing on the effect or the likely effect of conduct ... the new misuse of market power provision is fit for purpose to prohibit this conduct,” Sims told a conference in Sydney last week, referring specifically to a “profit-maximizing algorithm” that may “work out the oligopolistic pricing game.”

Sims believes that, under Australian laws, companies that have benefited from the artificial intelligence mechanisms they have deployed can be held responsible, whether or not there had been a deal — or even the intention — to collude.

Asian equation

The ACCC’s upbeat assessment of its ability to take on algorithms stands in contrast to the more cautious approach taken in the few Asian countries that have considered the issue.

In a statement to MLex, the Competition Commission of Singapore said that although current laws would be able to catch aspects of collusion involving algorithms, it had nonetheless identified some areas of regulatory uncertainty and promised to “stay vigilant.”

“[T]here are currently no settled positions on ... concerns involving algorithms, such as how a [company’s] independent business justifications for using third-party pricing algorithms may be weighed against any competitive effect that may result from its use,” the CCS said.

The Singapore regulator said there was little clarity on “how liability may be established for any autonomous decision-making that results in collusive outcomes.” In other words, the Singaporeans have concerns about the very issue — that of liability — that has given Sims a spring in his step. 

“As the increasing use of algorithms in the Big Data environment is currently an evolving field, it is perhaps too early or even inappropriate for there to be a settled position on these aspects,” the CCS said.

That gray area was highlighted earlier this year, when the CCS joined forces with the Personal Data Protection Commission and the Intellectual Property Office of Singapore to prepare a study of the impact of data use on competition law.

Published in August, the study noted a “lively discussion” among stakeholders about whether existing laws were adequately equipped to deal with future developments involving algorithms.

Under Singaporean legislation, prohibitions against price fixing require an “agreement.” That means liability may be difficult to establish where self-learning algorithms autonomously arrive at collusive outcomes, because no agreement between companies exists.

However, the CCS says that when “the use of algorithms is to effect, further, support or facilitate any pre-existing or intended anticompetitive agreement or concerted practice,” it will have no problem in hitting those involved with the full force of the Competition Act.

Yet any prosecution for collusion still relies on the existence of an agreement among flesh-and-blood humans rather than robots.

Other competition regulators in the region haven’t shown the same interest in getting ahead of the threat posed by artificial intelligence, or are choosing to see how other jurisdictions tackle the problem.

The Japan Fair Trade Commission has expressed an awareness of “digital cartels” and possible issues around how Japanese legislation is to be interpreted in dealing with the problem, but has not yet made any progress on that front. A Big Data report released earlier in the year said Japan would monitor the situation closely.

New Zealand’s competition regulator, the Commerce Commission, told MLex that it was “aware of overseas cases” in which prices set by algorithms had been “investigated as anticompetitive conduct” and that it would “continue to adapt its understanding” of the issue to “apply its investigative toolkit appropriately.”

- Additional reporting by Toko Sekiguchi
