One of the features of the 2008 collapse was that many banking models were applied incorrectly (or were simply incorrect) and, in a classic case of ‘group think’, all arrived at the same wrong answer. This happened in a market where the banks were competing directly, and that group-think problem has an interesting implication for the future of competition. Historically, collusion has required businesses to get together and make a conscious decision to collude, and analytical models have been used to detect those collusions. The film ‘A Beautiful Mind’, about the life of John Nash, covers how he invented much of the modelling behind game theory; it mentions at the end how that work was later used to detect collusion and cartels in economics.
But is there a problem with this in a big data world? If I’m getting real-time feeds of weather, inventory and my competitors’ prices, and using them to adjust my prices dynamically, including understanding how my price shifts have historically affected theirs, then I’m building a model for collusion without explicit collusion. In other words, the models of a few companies could reach a point of stability that maximises each of their profitability: as one moves its price up, the rest follow to increase that profitability further. These models are potentially fully automated on each side and are not intended to drive collusion, but the net effect of every model adapting to the competition can be that collusion becomes implicit.
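The dynamic can be illustrated with a toy simulation. This is a minimal sketch using an entirely hypothetical pricing rule (not any real pricing system): each automated agent simply never undercuts and always follows a competitor’s rise, and one agent occasionally probes a little higher. No agent is programmed to collude, yet prices ratchet upward together.

```python
# Hypothetical sketch: two automated pricing agents, each reacting only to
# the other's last observed price. Neither is instructed to collude.

def follow(own: float, competitor: float) -> float:
    """A simple price-following rule: never undercut, always match a rise."""
    return max(own, competitor)

p_a, p_b = 12.0, 15.0
history = []
for t in range(30):
    # Agent A probes 2% higher every fifth round; because B's model
    # follows the rise, each probe sticks and becomes the new baseline.
    if t % 5 == 0:
        p_a = follow(p_a, p_b) * 1.02
    else:
        p_a = follow(p_a, p_b)
    p_b = follow(p_b, p_a)
    history.append((round(p_a, 2), round(p_b, 2)))

print(history[0], history[-1])  # prices end higher than they started, in lockstep
```

With a purely local, profit-seeking rule on each side, the stable outcome looks exactly like a negotiated price rise, which is the regulatory puzzle described above.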
It’s certainly a future challenge for regulators to distinguish deliberate collusion from model-based implicit collusion that works against consumers but may not fall under the current legal definition of collusion.
In an automated Big Data and Predictive Analytics world, what will be the new definition of a cartel?