
Data Study Group



Principal Investigator: Doyne Farmer

J. Doyne Farmer is Director of the Complexity Economics program at the Institute for New Economic Thinking at the Oxford Martin School, Professor in the Mathematical Institute at the University of Oxford, and an External Professor at the Santa Fe Institute.


His current research is in economics, including agent-based modelling, financial instability and technological progress. He was a founder of Prediction Company, a quantitative automated trading firm that was sold to UBS in 2006. His past research includes complex systems, dynamical systems theory, time series analysis and theoretical biology.


Researcher: Claire Connelly

Claire Connelly is a visiting academic at the Institute for New Economic Thinking in Oxford, and a research fellow at the Global Institute for Sustainable Prosperity.


With a background in journalism, Connelly has written for leading publications including The Australian Financial Review, The Saturday Paper, the ABC, SBS, Which-50, New Matilda, and the Sydney Morning Herald. She has featured regularly on television shows, web series, radio programmes and podcasts, including ABC's The Drum, Channel 10's The Project, SBS' The Feed, 2SER's Fourth Estate, and Radio National's Breakfast, Drive, Hack, and Download This Show.

Project Summary

Data are necessary for measuring and understanding the macroeconomy. Indeed, macroeconomics can be thought of as a collection of formal frameworks to explain some measured outcomes that arise from complex interactions of many different agents. It follows that the data we observe and collect can influence research agendas. Different observations may encourage different approaches in macroeconomics. In fact, many sciences advance by creating new datasets that then lead to new questions, insights and theories.


Collecting data in the social sciences is a value-laden exercise: choices about what to include or exclude embed judgments about what matters. Even the question of what constitutes data can be interpreted through different scholarly lenses. But without data, it is difficult to make advances that could satisfy the falsification principle.


Since the creation of the system of national and international accounts seventy years ago, the structure of economies has changed profoundly. Today we have global supply chains, sub-national governments, and cross-border movements of labour, capital, ideas and ownership, with consequences for how risks are transferred and borne, and even for the traditional domains of monetary and taxation control.


Yet at the same time, technological progress enhances our ability to conceptualise, observe and measure these interactions. It opens up the possibility of going beyond traditional prices and quantities to observe the motivations behind economic actions. Measuring different outcomes and underlying motivations could transform how we approach some of the deep empirical problems in macroeconomics.


We are setting up a ‘Data Study Group’ to ask what kind of data could significantly enhance our understanding of the macroeconomy. The Data Study Group is tasked with collecting perspectives from leading academics and policy makers around the world on what an ‘ideal data set’ would look like. We are also interested in the macroeconomic questions such data would help answer, the problems it might solve, and how difficult the data would be to assemble.

Results

Results will be published here when available.