Getting over COVID-19:

Effective decision-making using personalised analytics


Professor Ben Fahimnia

As Australia appears to have successfully flattened the COVID-19 curve, it is now at a crucial point: finding a path to economic recovery without unleashing a second wave of infection. Policymakers are having to review and analyse numerous reports, articles and datasets from around the world in order to arrive at a solution best suited to our unique Australian circumstances.

Artificial intelligence (AI) and data science tools can track and analyse data faster than a virus can travel, but the accuracy of these tools still relies on human judgement.

There are situations, such as ordering toilet paper stock[1], in which machines (analytics) can make automated decisions with no human involvement, based purely on statistics or heuristics. This is not the case for COVID-19-related decisions, for two reasons.
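Before turning to those reasons, it is worth seeing how simple such a fully automated ordering rule can be. The sketch below is purely illustrative; the quantities and thresholds are invented rather than drawn from any real retailer’s system.

    # Illustrative sketch only: a simple reorder-point heuristic of the kind
    # that lets a machine place stock orders with no human involvement.
    # All figures are invented for illustration.
    def reorder_quantity(on_hand, daily_demand, lead_time_days, safety_stock, order_up_to):
        # Return how many units to order, or 0 if current stock is sufficient.
        reorder_point = daily_demand * lead_time_days + safety_stock
        if on_hand <= reorder_point:
            return order_up_to - on_hand
        return 0

    # Example: 120 units on hand, 40 sold per day, a 3-day replenishment lead time,
    # 30 units of safety stock, and restocking up to 300 units.
    print(reorder_quantity(on_hand=120, daily_demand=40, lead_time_days=3,
                           safety_stock=30, order_up_to=300))  # orders 180 units

A rule like this needs nothing more than recent sales statistics, which is precisely why it can run unattended.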

First, analytics require a great deal of past data to be able to generate meaningful insights. In the case of COVID-19, we have insufficient historical data to facilitate automated decisions.

Second, when facing decisions with multiple objectives – such as COVID-19 policy decisions – machines are used to generate a set of “feasible solutions”. Each of these solutions may partially satisfy some of the objectives. A human decision maker then chooses the most preferred one [2].
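As a purely illustrative sketch of that second point, the snippet below scores a handful of made-up candidate policies on two objectives (both to be minimised), keeps only the non-dominated “feasible solutions”, and leaves the final choice to a human. The policy names and scores are assumptions invented for illustration.

    # Illustrative sketch only: filter candidate policies to the non-dominated set.
    # Policy names and scores are invented; both objectives are to be minimised.
    candidates = {
        "full reopening":    {"economic_cost": 1, "health_risk": 9},
        "staged reopening":  {"economic_cost": 4, "health_risk": 4},
        "extended lockdown": {"economic_cost": 9, "health_risk": 1},
        "no clear plan":     {"economic_cost": 8, "health_risk": 8},
    }

    def dominates(a, b):
        # a dominates b if it is no worse on every objective and strictly better on at least one
        return all(a[k] <= b[k] for k in a) and any(a[k] < b[k] for k in a)

    feasible = {
        name: score for name, score in candidates.items()
        if not any(dominates(other, score)
                   for other_name, other in candidates.items() if other_name != name)
    }

    print(feasible)  # "no clear plan" is dominated; the other three remain as trade-offs

Notice that the machine cannot say whether “staged reopening” is better than “extended lockdown”; ranking those trade-offs is exactly the judgement call that falls to a human.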

Therefore, we are currently relying on humans to interpret and apply the results produced by the analytical models. But we know that human judgement is influenced by an individual’s background and cognitive biases[3], and decisions become even more distorted in highly uncertain, high-stress environments.

For example, a decision maker who is biased against China is probably more comfortable making policy decisions in line with Trump tweets that emphasise the Chinese origins of COVID-19. This is the so-called confirmation bias, in which decision makers favour information that supports their existing views.

Another type of bias is the “framing effect”, which relates to how a problem is framed. Imagine the difference in policy if decisions are framed as “minimising damage to the economy” versus “avoiding a national health crisis”. These are just two of more than 100 biases that affect how we interpret data and make decisions.

Therefore, it is paramount to understand how human cognition and individual biases can influence the solutions generated by AI-powered tools. This is an emerging area of academic research.

“Personalised analytics” is the new generation of data analytics that aims to get the best out of both data science and human judgement by minimising the impact of detrimental cognitive biases. Personalised analytics are analytical models that can be customised to an individual’s background and personal characteristics.
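One way such a customisation could work in principle is to blend a model’s output with a decision maker’s own estimate, discounting the human input when that individual’s profile flags a strong relevant bias. This is a purely hypothetical sketch of the idea, not a description of any existing tool; the function, weighting scheme and numbers are all assumptions.

    # Hypothetical sketch only: blend an analytical forecast with a human estimate,
    # trusting the human less as their profiled bias score grows.
    # All names, weights and numbers are invented for illustration.
    def personalised_estimate(model_forecast, human_estimate, bias_score):
        # bias_score in [0, 1]: 0 = no known relevant bias, 1 = strong bias
        human_weight = 0.5 * (1 - bias_score)
        return human_weight * human_estimate + (1 - human_weight) * model_forecast

    # Example: the model predicts 100 new cases, an optimistic decision maker says 60,
    # and their profile flags a strong optimism bias (0.8).
    print(personalised_estimate(model_forecast=100, human_estimate=60, bias_score=0.8))  # 96.0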

There is no doubt that the growing volume of data and the continuing maturation of AI and data analytics can help our policymakers make more informed decisions. But a shift toward the development and adoption of personalised analytics is essential to help integrate “analytics” and “intuition” and achieve what neither individuals nor machines could reach on their own.

 

[1] Perera, N., Fahimnia, B., & Travis, T. (2020), ‘Behavioral experiments on inventory and ordering decisions: A systematic review’, International Journal of Operations & Production Management, in press.

[2] Fahimnia, B., Pournader, M., Siemsen, E., Bendoly, E., & Wang, C. (2019), ‘Behavioral operations and supply chain management: A review and literature mapping’, Decision Sciences, 50(6), 1127-1183.

[3] Bendoly, E., Croson, R., Goncalves, P., & Schultz, K. (2010), ‘Bodies of knowledge for research in behavioral operations’, Production and Operations Management, 19(4), 434-452.