Barely a month goes by in China or in India without some macroeconomic data being published that leaves investors scratching their heads. However, before most investors have even had time to absorb the data release, macroeconomic forecasts based on the puzzling data are published by breathless economists working for brokerages. In this opinion piece, Pramit Bhattacharya explains how the Economics profession came to this low ebb. He begins by creating a ‘burning platform’, namely that Economists trained in India’s top universities are increasingly finding no takers in the job market:
“The poor placement record of India’s top engineering schools this year has received much attention. What went unnoticed are dismal placements at top economics schools. One reason is the growing caution among employers that face multiple global uncertainties. The other is a reassessment of the need for economic analysts in the age of artificial intelligence (AI).
While there is a cyclical element to this downturn, it is worth asking if the training that economists receive needs an upgrade. Over the past four decades, they have gained more from computerization and digitization than other social scientists. Rising computing power made it increasingly easy to run econometric models on large data-sets. It helped academic researchers produce more research, and enabled graduate students to master sophisticated modelling techniques.
Over time, economists began taking the data-generating process for granted. Whatever appeared on their computer screens was viewed as an accurate description of reality. In search of quick results, data scrutiny and primary research took a backseat.
Now, a faster breed of analysts has appeared. AI models today can produce as good or as bad a forecast as an average graduate, and in a much shorter span of time. To be sure, AI models suffer from the garbage-in-garbage-out syndrome. But economists who aren’t trained well in scrutinizing data-sets also suffer from the same syndrome. Unless they re-examine their relationship with economic statistics, economists are likely to lose jobs to AI-driven bots.”
So how can Economists fight back and ensure their survival in this age of AI? Mr Bhattacharya says that they need to use their brains a little bit more and crunch data a little less: “Writing more than half a century ago, Nobel-winning economist Oskar Morgenstern pointed out that most economic statistics are built on a number of simplifying assumptions and judgement calls. But economists often use these statistics as if they were error-free measures of socioeconomic realities. In The Accuracy of Economic Observations, Morgenstern criticized economic statisticians for failing to alert data users about the error margins in economic estimates. But he was equally critical of his own tribe of economic researchers for failing to “distinguish between what we think we know and what we really can and do know”.
To improve the accuracy of economic observations, Morgenstern advocated deeper engagement between data producers and users, openness in acknowledging errors of economic measurement, and careful reviews to bring down errors over time. Most importantly, he wanted economics students to be trained to question the ‘facts’ put up before them, so that they learn “how terribly hard it is… to find out what truly is a ‘fact’.”
Morgenstern’s observations hold true for all geographies, but more so for data-deficient poor countries such as India. In post-independence India, the first generation of economic statisticians (such as Moni Mukherjee and Uma Dutta Roy Choudhury) and economic practitioners (such as K.N. Raj, Jagdish Bhagwati and T.N. Srinivasan) were deeply conscious of this problem. In most of their writings, you will find careful descriptions of the data-sets they used and their limitations.
In his foreword to Mukherjee’s 1969 book on national accounting in India, Nobel laureate Simon Kuznets noted that the problems in adapting international standards to a poor and diverse economy ran like a “red thread through the volume” and reflected “a proper concern with this problem.”
Faced with incomplete data-sets, leading economists of the day learned to complement quantitative techniques with other methods, including case studies and historical analysis. The limited access to high-speed computers meant that they couldn’t run endless regression models even if they wanted to. Some of the most prominent economic studies of that era, including Bhagwati’s 1971 critique of India’s ‘licence raj’ regime and Raj’s 1975 report on Kerala’s human development model, were essentially case studies. They relied on both quantitative and qualitative evidence to build their arguments.
Sensing the inadequacy of the state-level National Sample Survey consumption data, Raj had to collect primary data on food consumption patterns to construct alternate estimates of household consumption expenditure in Kerala. Bhagwati and his co-author, the late Padma Desai, had to reconstruct industry-wise input-output ratios to make them comparable over time. If they had just run regression models on official data-sets, they wouldn’t have unearthed the insights that they did.
Today’s generation of young economists needs to relearn the skills that the old masters possessed.”
If you want to read our other published material, please visit https://marcellus.in/blog/
Note: The above material is neither investment research, nor financial advice. Marcellus does not seek payment for or business from this publication in any shape or form. The information provided is intended for educational purposes only. Marcellus Investment Managers is regulated by the Securities and Exchange Board of India (SEBI) and is also an FME (Non-Retail) with the International Financial Services Centres Authority (IFSCA) as a provider of Portfolio Management Services. Additionally, Marcellus is also registered with the US Securities and Exchange Commission (“US SEC”) as an Investment Advisor.