Published on: 6th June, 2019

At the end of each week, we will share with you our favourite reads. We would be grateful if you could reciprocate. This week’s reads focus on the inverse correlation between manager compensation and return on capital, the world of lab-grown diamonds, why doctors hate computers, why symptom trackers make you feel worse, the Deccan kings’ religious history and the need for continuous upskilling.
If you want to read our other published material, please visit

1. Long read: Return on Capital Super Heroes
Author: Thomas Macpherson
Source: GuruFocus
American fund manager Tom Macpherson of Nintai Partners highlights something that many of us who manage money know but never fully acknowledge: “Managers who are frugal, both in their personal lives and in their compensation, generally run companies with higher returns on capital than those overseen by the most highly compensated managers.”
Nintai Partners’ number crunching suggests that CEOs who are paid less generate more Return on Capital: “We chose 36 companies from the top decile of the S&P 500 and 49 companies from the top decile of the Russell 3000 and calculated return on capital for the years 2009 to 2013. We chose the same amount for the lowest decile. In addition, we wanted to see if individuals who are compensated at either the highest or lowest decile had a different tenure within their respective companies. Last, we were interested to see what happened to the company’s return on capital after that management team had left. (The pool of candidates was smaller here simply because not everyone has left yet.)
It turns out there was a strong correlation. The numbers seen below show that management who are compensated the least have a dramatic – and positive – impact on ROC. I should note this applied in every industry sector listed (in the Corporate Sector Make Up). The lowest 10% in compensation achieved an average ROC from 2009 to 2013 of 17.3%. The highest 10% compensated managers achieved a ROC of only 11.4%. Additionally, the lowest paid managers stayed on in their CEO position nearly 10 years longer. As an investor, not only does your holding managed by the lowest compensated managers generate a higher ROC, but it achieves it for a longer period of time.”
The same finding apparently holds true in fund management – lower paid fund managers generate better returns: “It turns out the data show that managers who are frugally compensated have a tendency to generate much higher return on capital than those who are lavishly compensated.”
So why do we see this correlation? Is there causation here, i.e. are lower-paid CEOs and fund managers doing something through their frugality which is helping their firms make more money? Macpherson sees three effects at play:
·       “….some individuals run a tight financial ship throughout their lives. This type of tight-fisted approach to spending can run through an entire business, creating extraordinary internal returns (such as return on capital, return on equity and return on assets) as well as investor returns….
·       …managers who faithfully showed up on the lowest compensated list year after year remained at their post far longer than those who were the highest compensated.
·       …While return on capital decreased slightly after the departure of a lowest-compensated manager, generally the company continued to outperform companies with the highest compensated managers. This persisted over the next five to 10 years in nearly every case. The values of the cheapskate CEO had a lasting effect.”

2.  Long read: Beyond the Hype of Lab-Grown Diamonds
Author: Maddie Stone
Source: Gizmodo
Diamonds were formed deep inside the bowels of Planet Earth billions of years ago. “As the edges of Earth’s tectonic plates plunged down into the upper mantle, bits of carbon, some likely hailing from long-dead life forms were melted and compressed into rigid lattices. Over millions of years, those lattices grew into the most durable, dazzling gems the planet had ever cooked up. And every so often, for reasons scientists still don’t fully understand, an eruption would send a stash of these stones rocketing to the surface inside a bubbly magma known as kimberlite. There, the diamonds would remain, nestled in the kimberlite volcanoes that delivered them from their fiery home, until humans evolved, learned of their existence, and began to dig them up.”
However, diamond mining scars the earth. Hence, some diamond seekers have turned to lab-grown diamonds. “These gems aren’t simulants or synthetic substitutes; they are optically, chemically, and physically identical to their Earth-mined counterparts. They’re also cheaper, and in theory, limitless. The arrival of lab-grown diamonds has rocked the jewelry world to its core and prompted fierce pushback from diamond miners.”
The lab-grown diamond industry is about to take off: “Today, that sector is taking off. The International Grown Diamond Association (IGDA), a trade group formed in 2016 by a dozen lab diamond growers and sellers, now has about 50 members, according to IGDA secretary general Dick Garard. When the IGDA first formed, lab-grown diamonds were estimated to represent about 1 percent of a $14 billion rough diamond market. This year, industry analyst Paul Zimnisky estimates they account for 2-3 percent of the market. He expects that share will only continue to grow as factories in China that already produce millions of carats a year for industrial purposes start to see an opportunity in jewelry.”
As a result, the establishment is not just waking up and taking notice, it is blessing lab-grown diamonds: “In the summer, the Federal Trade Commission (FTC) reversed decades of guidance when it expanded the definition of a diamond to include those created in labs and dropped ‘synthetic’ as a recommended descriptor for lab-grown stones. The decision came on the heels of the world’s top diamond producer, De Beers, announcing the launch of its own lab-grown diamond line, Lightbox…”
Now, here comes the problem for you and me – the lab-grown diamonds are virtually indistinguishable from the natural diamonds and that creates a real risk that any diamonds we have might lose their value: “And while lab-grown diamonds boast the same sparkle as their Earthly counterparts, they do so at a significant discount. Zimnisky said that today, your typical one carat, medium quality diamond grown in a lab will sell for about $3,600, compared with $6,100 for its Earth-mined counterpart—a discount of about 40 percent. Two years ago, that discount was only 18 percent. And while the price drop has “slightly tapered off” as Zimnisky put it, he expects it will fall further thanks in part to the aforementioned ramp up in Chinese production, as well as technological improvements. …Zimnisky said that if the price falls too fast, it could devalue lab-grown diamonds in the eyes of consumers.”

3. Long read: Why Doctors Hate Their Computers
Author: Atul Gawande
Source: The New Yorker
IT systems in large enterprises often cause anguish among the very users whose lives they were created to simplify. In this piece from the New Yorker, Atul Gawande, a surgeon better known as a best-selling author, applies this observation to the world of doctors, where the advent of healthcare IT systems has neither made them significantly more effective nor productive enough to enjoy a better work-life balance; instead, it has diverted much of their working hours away from facing patients to facing computer screens and logging information. At the same time, he points out, there is no denying that such systems can help doctors improve their diagnoses, for example by providing better access to case histories. Dr Gawande concludes that, as everywhere else, it is a fine balance between systems and human interaction that makes service delivery most effective. As we build out our systems to make Marcellus a scalable organisation, this comes as a timely reminder.
“A 2016 study found that physicians spent about two hours doing computer work for every hour spent face to face with a patient—whatever the brand of medical software. In the examination room, physicians devoted half of their patient time facing the screen to do electronic tasks. And these tasks were spilling over after hours. The University of Wisconsin found that the average workday for its family physicians had grown to eleven and a half hours. The result has been epidemic levels of burnout among clinicians. Forty per cent screen positive for depression, and seven per cent report suicidal thinking—almost double the rate of the general working population.
Something’s gone terribly wrong. Doctors are among the most technology-avid people in society; computerization has simplified tasks in many industries. Yet somehow we’ve reached a point where people in the medical profession actively, viscerally, volubly hate their computers.
…Before, Sadoughi almost never had to bring tasks home to finish. Now she routinely spends an hour or more on the computer after her children have gone to bed. She gave me an example. Each patient has a “problem list” with his or her active medical issues, such as difficult-to-control diabetes, early signs of dementia, a chronic heart-valve problem. The list is intended to tell clinicians at a glance what they have to consider when seeing a patient. Sadoughi used to keep the list carefully updated—deleting problems that were no longer relevant, adding details about ones that were. But now everyone across the organization can modify the list, and, she said, “it has become utterly useless.” Three people will list the same diagnosis three different ways. Or an orthopedist will list the same generic symptom for every patient (“pain in leg”), which is sufficient for billing purposes but not useful to colleagues who need to know the specific diagnosis (e.g., “osteoarthritis in the right knee”). Or someone will add “anemia” to the problem list but not have the expertise to record the relevant details; Sadoughi needs to know that it’s “anemia due to iron deficiency, last colonoscopy 2017.” The problem lists have become a hoarder’s stash.
“They’re long, they’re deficient, they’re redundant,” she said. “Now I come to look at a patient, I pull up the problem list, and it means nothing. I have to go read through their past notes, especially if I’m doing urgent care,” where she’s usually meeting someone for the first time. And piecing together what’s important about the patient’s history is at times actually harder than when she had to leaf through a sheaf of paper records. Doctors’ handwritten notes were brief and to the point. With computers, however, the shortcut is to paste in whole blocks of information—an entire two-page imaging report, say—rather than selecting the relevant details. The next doctor must hunt through several pages to find what really matters. Multiply that by twenty-some patients a day, and you can see Sadoughi’s problem.
…The I.B.M. software engineer Frederick Brooks, in his classic 1975 book, “The Mythical Man-Month,” called this final state the Tar Pit. There is, he said, a predictable progression from a cool program (built, say, by a few nerds for a few of their nerd friends) to a bigger, less cool program product (to deliver the same function to more people, with different computer systems and different levels of ability) to an even bigger, very uncool program system (for even more people, with many different needs in many kinds of work).
As a program adapts and serves more people and more functions, it naturally requires tighter regulation. Software systems govern how we interact as groups, and that makes them unavoidably bureaucratic in nature. There will always be those who want to maintain the system and those who want to push the system’s boundaries. Conservatives and liberals emerge.
The Tar Pit has trapped a great many of us: clinicians, scientists, police, salespeople —all of us hunched over our screens, spending more time dealing with constraints on how we do our jobs and less time simply doing them. And the only choice we seem to have is to adapt to this reality or become crushed by it.
….Medicine is a complex adaptive system: it is made up of many interconnected, multilayered parts, and it is meant to evolve with time and changing conditions. Software is not. It is complex, but it does not adapt. That is the heart of the problem for its users, us humans. Adaptation requires two things: mutation and selection. Mutation produces variety and deviation; selection kills off the least functional mutations. Our old, craft-based, pre-computer system of professional practice—in medicine and in other fields—was all mutation and no selection. There was plenty of room for individuals to do things differently from the norm; everyone could be an innovator. But there was no real mechanism for weeding out bad ideas or practices.
Computerization, by contrast, is all selection and no mutation. Leaders install a monolith, and the smallest changes require a committee decision, plus weeks of testing and debugging to make sure that fixing the daylight-saving-time problem, say, doesn’t wreck some other, distant part of the system.
Why can’t our work systems be like our smartphones—flexible, easy, customizable? The answer is that the two systems have different purposes. Consumer technology is all about letting me be me. Technology for complex enterprises is about helping groups do what the members cannot easily do by themselves—work in coördination. Our individual activities have to mesh with everyone else’s. What we want and don’t have, however, is a system that accommodates both mutation and selection.
Many fear that the advance of technology will replace us all with robots. Yet in fields like health care the more imminent prospect is that it will make us all behave like robots. And the people we serve need something more than either robots or robot-like people can provide. They need human enterprises that can adapt to change.”

4.  Short read: If Akbar and Ibrahim II had met, they would have liked each other: Manu S Pillai
Author: William Dalrymple interviews Manu Pillai
Source: The Hindu
This conversation between two of India’s finest historians writing today – William Dalrymple and Manu Pillai – exposes the cardboard-cutout history that is routinely taught in India’s schools and that infects the popular narrative of how India has evolved.
Pillai points out that whilst Shivaji (and his war with the Mughals) is the main prism through which most of us view the Deccan in the Middle Ages, the history of the Deccan is far richer than that: “The Deccan witnessed fascinating events, presided over by even more fascinating people. We have Persian immigrants transform into kings; African warlords marrying their daughters to Sultans; begums who refused purdah (provoking their sons to blot them out of paintings); and a constant religious exchange, whether it was at court where Hindu poets sang of the Mahabharata to Sultans, or on the ground where the saint Eknath could produce a ‘Hindu-Turk Samvad’.”
In fact, as Pillai explains in his book “Rebel Sultans”, the Deccan can boast of a star cast of heroes beyond Shivaji and the Mughals: “Ibrahim Adil Shah II of Bijapur ranks high on my list. This late-16th, early-17th century prince was an extraordinary figure. He could get, given the times in which he lived, cruel where matters of power were concerned…To some today, this sort of action can be taken out of its historical context and used to paint people as “good” or “bad” when the fact is that across the board, till a few centuries ago, violence and power went hand in hand. Ibrahim, however, was supremely interesting. His patronage of art (including a luckless European painter and the miniaturist Farrukh Beg), his love for music, his interest in literature, his contribution to architecture, all combined in a long reign to establish him as one of the finest historical figures to have shaped early-modern India.
The other figure has to be Malik Ambar. Here was a man born in Africa, snatched as a boy and sold into slavery, who arrives in India and rises to near-princely rank through sheer determination and not a little shrewdness.
As the Mughals begin their conquest of the Deccan in the early 17th century, all that stands in their way for decades is this man, and at least two generations of emperors were reduced to barking insults in frustration. Jehangir, especially, hated him to the extent that he commissioned a painting showing himself shooting an arrow at Ambar’s impaled head — something he never succeeded in doing in real life, of course.”
Pillai also does his bit to dispel the notion that the Deccan is where Islam and Hinduism clashed to the detriment of one or the other. Religion, Pillai explains, was used then – as it is now – as a way of canvassing support: “Naipaul is not the first to succumb to the romanticised idea of Vijayanagar as a Hindu bulwark against Islamic aggression. It was certainly founded by Hindu brothers, whose ideology of state was set in Sanskritic terms. But from the very start, we find that it was not driven by religion. Bukka, one of the empire’s founders, for instance, was hardly a man who despised Muslims — why, then, would he invite the Sultan of Delhi to ally with him and destroy the Bahmani state? In inscriptions, we find that ‘Turks’ are only despised as much as other Hindu kingdoms are, and Muslims are not at the receiving end of any pronounced, unusual hostility.
Besides, Vijayanagar wasn’t sunk in a sea of religious resentments from the past — it was a land of bold innovations that looked enthusiastically to the future. One of the most remarkable things about this empire and its rulers is also that, from the late 1340s, we have them use a particularly revealing title. They called themselves Hinduraya Suratrana, Sultans among Hindu Kings. This is fascinating because on the one hand this appears to be the first time that Indian sovereigns use the word ‘Hindu’ consciously in projecting their self-image. But while they do so, they also lay claim to the title of ‘Sultan’. At once, they were both — Hindu Sultans alongside Muslim Sultans.
Bijapur, Golconda and Ahmadnagar too are interesting from the ‘Hindu-Muslim’ question. They too formally projected themselves as ideal Muslim rulers. The Sultans were out to “destroy infidels”, while Vijayanagar wanted to rid the world of “Turks” and “mlechchas”. But real life was not over-blown rhetoric — we find Hindus at the feet of Muslim saints, and Muslims adopting Hindu customs. Vijayanagar actively sought Muslim cavalrymen for its forces, just as the Sultans needed Brahmins and Marathas to sustain their power.”

5.  Short read: Why Tracking Your Symptoms Can Make You Feel Worse
Author: Elena Lacey
Source: WIRED
Elena Lacey at WIRED cites research studies showing how tracking your symptoms can actually make you feel worse rather than aid your recovery.
“Fifteen percent of adults in the US use an app regularly or occasionally to track symptoms of a disease. About as many use a sleep-tracking app to figure out whether they get enough shut-eye. Turns out, dwelling on symptoms, including insomnia, makes them more likely to occur. Call it the nocebo effect—the dark sibling of the placebo effect, the familiar mind-over-matter tendency that makes us feel better if we take a sugar pill that we believe is an effective medication.
“The body’s response can be triggered by negative expectations,” says Luana Colloca, a University of Maryland neuroscientist and physician who studies placebo and nocebo effects. “It’s a mechanism of self-defense. From an evolutionary point of view, we’ve developed mechanisms to prevent dangerous situations.”
The symptom tracker doesn’t just reveal your highs and lows. It produces a state of anxiety—and possibly more pain.
That’s because our expectations shape how we feel. About 18 percent of people enrolled in trials of migraine drugs reported side effects—from a sugar pill. (They didn’t know if they were taking the real drug or the fake one.) In a different study, people who were told that their postoperative morphine was ending felt a sudden surge in pain; other patients whose morphine drip stopped without a specific warning didn’t feel that intense pain.
A stunning example of how the mind shapes our physiology emerged from a recent Stanford University study of how people react to learning about genetic risk factors. About 200 study participants took genetic tests and were told that, based on the results, they were either at risk of or protected from two obesity-related factors: cardiorespiratory (heart-lung) exercise capacity or satiety (feeling full) after eating. In fact, they had been assigned to the different groups randomly.
The news changed their physiology to match what they were told. Regardless of their actual DNA-based risk, they had more or less lung capacity and endurance when exercising and more or less of a hormone that makes people feel full.”

6. Short read: Influencer grannies are a lesson in how to reinvent oneself
Author: Utkarsh Amitabh
Source: Livemint
Utkarsh Amitabh, in his succinct post, talks about the need for everyone to keep evolving in order to stay relevant in the current environment. He cites the example of Italian grannies who have embraced social media and digital marketing to spread their handmade pasta recipes and their culture. Their readiness to evolve and accept technology has helped them augment their skills and remain relevant in today’s world. Utkarsh refers to the authors of “The 100-Year Life” to predict three defining features of work in the 21st century:
1)      people are likely to live longer;
2)      the lifespan of organizations will reduce; and
3)      the concept of retirement will fade away, partly due to financial reasons and partly out of choice.
Based on the above factors, Utkarsh says that people might have to learn to work in different industries, sectors and functions every few years, and hence the ability to evolve and learn will be key to success in the future.
“Vicky Bennison read zoology in college and graduated with an MBA from the University of Bath. She then worked in international development across Siberia, South Africa and Turkmenistan. Today, she is best known as the person behind Pasta Grannies, a YouTube channel that finds and films real Italian grannies — nonnas — making handmade pasta. These grannies make lip-smacking pasta and tell delightful stories. What amazes me even more is how grandmas have embraced social media, learned digital marketing and emerged as media entrepreneurs across the world. Closer home, we have the example of Mastanamma, the world’s oldest celebrity chef who got her big break at 105 when her grandson filmed her cooking eggplant curry and put it online. She had cataract, wore dentures, and cooked on an open fire. As The New York Times said, it was all part of the charm. Mastanamma was a natural on camera and got one million subscribers in two years.
These grandmas offer precious insights about the future of work, especially the importance of reinventing oneself. Unfortunately, most journals and media reports overemphasize the importance of certain skills without explaining how challenging it gets to acquire them with each passing year. Reinventing our mental models will probably be the most crucial aspect of finding work in the coming years. Let’s understand why. The authors of 100 Year Life, Lynda Gratton and Andrew Scott, offer three defining features of work in the 21st century. First, people are likely to live longer. Second, the lifespan of organizations will reduce. Third, the concept of retirement will fade away, partly due to financial reasons and partly out of choice. Combining all these factors, it is easy to visualize how one might have to learn to work in different industries, sectors and functions every few years. One of the first things we will see is the disruption of the traditional study, work and retire model by loops of work followed by study. People will probably go to college multiple times in their lives or enrol in a specialized degree at 75. It is also possible that college degrees get split into chunks or the notion of going to college is replaced by alternate learning and apprenticeship models. Several venture capital backed companies in Silicon Valley are already tinkering with this. It is clear that lifelong learning will be central to all our lives. Lifelong learning doesn’t mean chasing buzzwords, hashtags and the latest obsessions. If we do that, we will be on a perennial wild goose chase because there are way too many new things. To become effective lifelong learners, we must figure out ways to connect the dots between what we already know and what we aspire to know. What we aspire to know must follow our curiosity and factor our strengths, interests and time availability. 
The Italian grandmas and centenarians like Mastanamma succeeded because they leveraged their strengths and worked on things they cared about. They used technology to augment their potential and thoroughly enjoyed the experience of reinventing themselves as cutting-edge digital content producers. In addition to income, starting up at 100 gave them something to look forward to and added meaning to their lives. If you want to see how this manifests, check out Gina Petitti’s YouTube video thanking her fans on reaching 100,000 subscribers.
Videos are great but if you prefer a real-life demo, you can meet my grandmother at the India International Centre Library, writing chapters of her new book on her tablet. Perhaps Vicky Bennison will consider doing a video series on Indian grannies as well.”

Note: the above material is neither investment research, nor financial advice. Marcellus does not seek payment for or business from this email in any shape or form. Marcellus Investment Managers is regulated by the Securities and Exchange Board of India as a provider of Portfolio Management Services and as an Investment Advisor.
Copyright © 2019 Marcellus Investment Managers Pvt Ltd, All rights reserved.
