On Tuesday’s broadcast of CNN’s “Cuomo Primetime,” acting Deputy DHS Secretary Ken Cuccinelli said that President Donald Trump’s Thursday statement on the riot at the U.S. Capitol was “what we wanted to hear on Wednesday,” the day the riot took place, and that the message came “late.”
Cuccinelli said, “[W]hat he said on Thursday is what we wanted to hear on Wednesday, right? And it was late.”
Cuccinelli added that he “can’t imagine in a million years that he’d be calling people to violence next week.”
Health authorities have issued a warning to Canberrans after death cap mushrooms were spotted sprouting earlier than usual in the territory.
The ACT’s acting deputy chief health officer, Miranda Harris, on Thursday said the territory normally wouldn’t see the lethal mushrooms until March or April.
But they’d popped up already this year – most likely due to wet weather and milder summer temperatures.
In 2012 the mushrooms killed two Canberra residents who ate them at a dinner party on New Year’s Eve and in 2014 they seriously poisoned four others.
“As the name suggests, death cap mushrooms can be deadly,” Dr Harris said.
“All parts of the mushroom are poisonous whether they have been cooked or not.”
Death cap mushrooms, which are easily mistaken for edible mushrooms, often grow near established oak trees.
A spokesman for ACT Health reiterated there had been sightings in the territory already this year, but did not confirm where. He said the mushrooms could sprout anywhere.
Dr Harris warned the community not to risk touching wild mushrooms with their bare hands, and to keep animals and children away from them.
“If you think you may have eaten a death cap mushroom, urgently seek medical attention at a hospital emergency department and take any remaining mushroom to the hospital for identification,” she said.
“Eating wild mushrooms is just not worth the risk.
“Don’t eat mushrooms you have found in the wild, and only purchase mushrooms from a reputable supplier.”
Dr Harris said symptoms of death cap mushroom poisoning generally started appearing six to 24 hours, or sometimes longer, after eating them.
The symptoms included stomach pains, nausea, vomiting and diarrhea.
“The chances of survival increase where treatment is started early,” Dr Harris said.
She said anyone who spotted a wild mushroom in a public area could report it to Access Canberra on 13 22 81. More information on death cap mushrooms can be found on ACT Health’s website.
London: A British judge has denied Julian Assange bail, finding he is a flight risk.
It comes just two days after Judge Vanessa Baraitser ruled against his extradition to the United States to face spying charges.
The United States government is appealing the ruling and on Wednesday evening (AEDT) successfully argued that the Australian be kept in Belmarsh Prison while the High Court proceedings take place.
More to come.
U.S. President Donald Trump signed a $900 billion pandemic relief package Sunday, ending days of drama over his refusal to accept the bipartisan deal that will deliver long-sought cash to businesses and individuals and avert a federal government shutdown.
The massive bill includes $1.4 trillion to fund government agencies through September and contains other end-of-session priorities such as money for cash-starved transit systems and an increase in food stamp benefits.
The signing, at his private club in Florida, came after a day of vocal criticism from Republicans and Democrats over his objections to the bipartisan agreement, which passed the House and Senate by large margins with lawmakers believing they had Trump’s support. His eleventh-hour demands, including a push for larger relief checks and scaled-back spending, had blindsided members of both parties. His subsequent foot-dragging resulted in a lapse in unemployment benefits for millions struggling to make ends meet and threatened a government shutdown in the midst of a pandemic.
Signing the bill into law prevents another crisis of Trump’s own creation and ends a standoff with his own party during the final days of his administration.
It was unclear what Trump had accomplished with his delay, beyond empowering Democrats to push for the higher checks that his party opposes. In a statement, Trump repeated his frustrations with the COVID-19 relief bill for providing only $600 checks to most Americans instead of the $2,000 that his fellow Republicans rejected. He also complained about what he considered unnecessary spending by the government at large.
“I will sign the Omnibus and Covid package with a strong message that makes clear to Congress that wasteful items need to be removed,” Trump said in the statement.
While the president insisted he would send Congress “a redlined version” with items to be removed under the rescission process, those are merely suggestions to Congress. The bill, as signed, would not necessarily be changed.
Lawmakers now have breathing room to continue debating whether the relief checks should be as large as the president has demanded. The Democratic-led House supports the larger checks and is set to vote on the issue Monday, but it’s expected to be ignored by the Republican-held Senate where spending faces opposition. For now, the administration can only begin work sending out the $600 payments.
Republicans and Democrats swiftly welcomed Trump’s decision to sign the bill into law.
“The compromise bill is not perfect, but it will do an enormous amount of good for struggling Kentuckians and Americans across the country who need help now,” said Senate Majority Leader Mitch McConnell, R-Ky. “I thank the President for signing this relief into law.”
House Speaker Nancy Pelosi, D-Calif., called the signing “welcome news for the fourteen million Americans who just lost the lifeline of unemployment benefits on Christmas Weekend, and for the millions more struggling to stay afloat during this historic pandemic and economic crisis.”
But others slammed Trump’s delay in turning the bill into law. In a tweet, Rep. Gerry Connolly, D-Va., accused Trump of having “played Russian roulette with American lives. A familiar and comfortable place for him.”
Senate Democratic leader Chuck Schumer, D-N.Y., said he would offer Trump’s proposal for $2,000 checks for a vote in Senate — putting Republicans on the spot.
“The House will pass a bill to give Americans $2,000 checks. Then I will move to pass it in the Senate,” Schumer tweeted. “No Democrats will object. Will Senate Republicans?”
Democrats are promising more aid to come once President-elect Joe Biden takes office, but Republicans are signalling a wait-and-see approach.
In the face of growing economic hardship, spreading disease and a looming shutdown, lawmakers on Sunday had urged Trump to sign the legislation immediately, then have Congress follow up with additional aid. Aside from unemployment benefits and relief payments to families, money for vaccine distribution, businesses and more was on the line. Protections against evictions also hung in the balance.
“What the president is doing right now is unbelievably cruel,” said Sen. Bernie Sanders, I-Vt. “So many people are hurting. … It is really insane and this president has got to finally … do the right thing for the American people and stop worrying about his ego.”
Republican Sen. Pat Toomey of Pennsylvania said he understood that Trump “wants to be remembered for advocating for big checks, but the danger is he’ll be remembered for chaos and misery and erratic behaviour if he allows this to expire.”
Toomey added: “So I think the best thing to do, as I said, sign this and then make the case for subsequent legislation.”
The same point was echoed by Maryland Gov. Larry Hogan, a Republican who’s criticized Trump’s pandemic response and his efforts to undo the election results. “I just gave up guessing what he might do next,” he said.
Republican Rep. Adam Kinzinger of Illinois said too much is at stake for Trump to “play this old switcheroo game.”
“I don’t get the point,” he said. “I don’t understand what’s being done, why, unless it’s just to create chaos and show power and be upset because you lost the election.”
Washington had been reeling since Trump turned on the deal. Fingers pointed at administration officials, including Treasury Secretary Steven Mnuchin, as lawmakers tried to understand whether they were misled about Trump’s position.
“Now to be put in a lurch, after the president’s own person negotiated something that the president doesn’t want, it’s just — it’s surprising,” Kinzinger said.
Kinzinger spoke on CNN’s “State of the Union,” and Hogan and Sanders on ABC’s “This Week.”
‘Dear Disney family, I’m here to let you know Leanza passed this afternoon,’ a friend said in a post. ‘She was so loved. I don’t feel like writing a lot right now; my heart is broken.’
Leanza Cornett, who was crowned Miss America in 1993, died Wednesday at a Jacksonville hospital where she was being treated for a head injury she suffered less than three weeks ago.
Cornett, 49, was married to news presenter Mark Steines from 1995 until 2013. He took to social media on Wednesday to pay tribute to his ex-wife and the mother of their two sons.
The Miss America Organization subsequently shared a statement on Facebook confirming her passing.
‘Leanza had a bright and beautiful spirit and her laugh was infectious. We know she meant so much to so many, including all of you,’ the organization said.
‘We are devastated by this sudden loss in our Miss America family and we are deeply sorry for her family and close friends for their loss.
‘At the moment, we do not have any further information regarding a service for Leanza and we ask that you please respect her family during this difficult time. Hold on tight to those you love today. Time is certainly precious.’
No cause of death was given; however, posts from her support page reveal Cornett ‘sustained an enormous blow to the back of her head’ and underwent emergency surgery after falling in her kitchen on October 12.
‘The surgery she had on Tuesday was to stop the bleeding in her brain,’ close friend Sue Roberts said in an October 18 post.
‘Right now there is some continued bleeding and swelling. Brain injuries are not black and white. We have to just take this day by day.’
Roberts did not provide further details of the accident but revealed it had occurred in a moment that ‘could have happened to anyone.’
‘A fall in the kitchen. I am a self-proclaimed dork and have fallen a million times in my house. Accidents happen. I’m saying this because I think all of us need to just hug each other a little tighter and love each other a little longer,’ she added.
During her pageant career, Cornett, who grew up in Jacksonville, won the Miss America title in 1993 a year after being crowned Miss Florida.
She was also known as the first actress to play a live-action version of Ariel in the production of The Little Mermaid at Florida’s Walt Disney World Resort in 1991, according to WJXT.
Cornett went on to appear as an actress on TV shows including CSI: Crime Scene Investigation, Weeds and Saved by the Bell: The New Class.
She married former Entertainment Tonight host Mark Steines in 1995 before the pair divorced in 2013.
She is survived by their two sons, Avery Steines, 16, and Kai Steines, 18.
Steines, 56, who is now married to author Julie Steines, paid tribute to his ex-wife in an emotional Instagram post on Wednesday.
‘It is with a heavy heart that I share with you the passing of my ex-wife, Leanza the mother to our two extraordinary sons Kai and Avery,’ he wrote.
‘We will always remember the wonderful times shared during her short time here on earth. I find comfort knowing Kai and Avery will forever have the best guardian angel watching over them as they navigate life’s path.
‘I ask that you please keep them as well as Leanza’s parents and her family in your prayers.’
“It’s not true that no one needs you anymore.” These words came from an elderly woman sitting behind me on a late-night flight from Los Angeles to Washington, D.C. The plane was dark and quiet. A man I assumed to be her husband murmured almost inaudibly in response, something to the effect of “I wish I was dead.”
I didn’t mean to eavesdrop, but couldn’t help it. I listened with morbid fascination, forming an image of the man in my head as they talked. I imagined someone who had worked hard all his life in relative obscurity, someone with unfulfilled dreams—perhaps of the degree he never attained, the career he never pursued, the company he never started.
At the end of the flight, as the lights switched on, I finally got a look at the desolate man. I was shocked. I recognized him—he was, and still is, world-famous. Then in his mid‑80s, he was beloved as a hero for his courage, patriotism, and accomplishments many decades ago.
As he walked up the aisle of the plane behind me, other passengers greeted him with veneration. Standing at the door of the cockpit, the pilot stopped him and said, “Sir, I have admired you since I was a little boy.” The older man—apparently wishing for death just a few minutes earlier—beamed with pride at the recognition of his past glories.
For selfish reasons, I couldn’t get the cognitive dissonance of that scene out of my mind. It was the summer of 2015, shortly after my 51st birthday. I was not world-famous like the man on the plane, but my professional life was going very well. I was the president of a flourishing Washington think tank, the American Enterprise Institute. I had written some best-selling books. People came to my speeches. My columns were published in The New York Times.
But I had started to wonder: Can I really keep this going? I work like a maniac. But even if I stayed at it 12 hours a day, seven days a week, at some point my career would slow and stop. And when it did, what then? Would I one day be looking back wistfully and wishing I were dead? Was there anything I could do, starting now, to give myself a shot at avoiding misery—and maybe even achieve happiness—when the music inevitably stops?
Though these questions were personal, I decided to approach them as the social scientist I am, treating them as a research project. It felt unnatural—like a surgeon taking out his own appendix. But I plunged ahead, and for the past four years, I have been on a quest to figure out how to turn my eventual professional decline from a matter of dread into an opportunity for progress.
Here’s what I’ve found.
The field of “happiness studies” has boomed over the past two decades, and a consensus has developed about well-being as we advance through life. In The Happiness Curve: Why Life Gets Better After 50, Jonathan Rauch, a Brookings Institution scholar and an Atlantic contributing editor, reviews the strong evidence suggesting that the happiness of most adults declines through their 30s and 40s, then bottoms out in their early 50s. Nothing about this pattern is set in stone, of course. But the data seem eerily consistent with my experience: My 40s and early 50s were not an especially happy period of my life, notwithstanding my professional fortunes.
So what can people expect after that, based on the data? The news is mixed. Almost all studies of happiness over the life span show that, in wealthier countries, most people’s contentment starts to increase again in their 50s, until age 70 or so. That is where things get less predictable, however. After 70, some people stay steady in happiness; others get happier until death. Others—men in particular—see their happiness plummet. Indeed, depression and suicide rates for men increase after age 75.
This last group would seem to include the hero on the plane. A few researchers have looked at this cohort to understand what drives their unhappiness. It is, in a word, irrelevance. In 2007, a team of academic researchers at UCLA and Princeton analyzed data on more than 1,000 older adults. Their findings, published in the Journal of Gerontology, showed that senior citizens who rarely or never “felt useful” were nearly three times as likely as those who frequently felt useful to develop a mild disability, and were more than three times as likely to have died during the course of the study.
One might think that gifted and accomplished people, such as the man on the plane, would be less susceptible than others to this sense of irrelevance; after all, accomplishment is a well-documented source of happiness. If current accomplishment brings happiness, then shouldn’t the memory of that accomplishment provide some happiness as well?
Maybe not. Though the literature on this question is sparse, giftedness and achievements early in life do not appear to provide an insurance policy against suffering later on. In 1999, Carole Holahan and Charles Holahan, psychologists at the University of Texas, published an influential paper in The International Journal of Aging and Human Development that looked at hundreds of older adults who early in life had been identified as highly gifted. The Holahans’ conclusion: “Learning at a younger age of membership in a study of intellectual giftedness was related to … less favorable psychological well-being at age eighty.”
This study may simply be showing that it’s hard to live up to high expectations, and that telling your kid she is a genius is not necessarily good parenting. (The Holahans surmise that the children identified as gifted might have made intellectual ability more central to their self-appraisal, creating “unrealistic expectations for success” and causing them to fail to “take into account the many other life influences on success and recognition.”) However, abundant evidence suggests that the waning of ability in people of high accomplishment is especially brutal psychologically. Consider professional athletes, many of whom struggle profoundly after their sports career ends. Tragic examples abound, involving depression, addiction, or suicide; unhappiness in retired athletes may even be the norm, at least temporarily. A study published in the Journal of Applied Sport Psychology in 2003, which charted the life satisfaction of former Olympic athletes, found that they generally struggled with a low sense of personal control when they first stopped competing.
Recently, I asked Dominique Dawes, a former Olympic gold-medal gymnast, how normal life felt after competing and winning at the highest levels. She told me that she is happy, but that the adjustment wasn’t easy—and still isn’t, even though she won her last Olympic medal in 2000. “My Olympic self would ruin my marriage and leave my kids feeling inadequate,” she told me, because it is so demanding and hard-driving. “Living life as if every day is an Olympics only makes those around me miserable.”
Why might former elite performers have such a hard time? No academic research has yet proved this, but I strongly suspect that the memory of remarkable ability, if that is the source of one’s self-worth, might, for some, provide an invidious contrast to a later, less remarkable life. “Unhappy is he who depends on success to be happy,” Alex Dias Ribeiro, a former Formula 1 race-car driver, once wrote. “For such a person, the end of a successful career is the end of the line. His destiny is to die of bitterness or to search for more success in other careers and to go on living from success to success until he falls dead. In this case, there will not be life after success.”
Call it the Principle of Psychoprofessional Gravitation: the idea that the agony of professional oblivion is directly related to the height of professional prestige previously achieved, and to one’s emotional attachment to that prestige. Problems related to achieving professional success might appear to be a pretty good species of problem to have; even raising this issue risks seeming precious. But if you reach professional heights and are deeply invested in being high up, you can suffer mightily when you inevitably fall. That’s the man on the plane. Maybe that will be you, too. And, without significant intervention, I suspect it will be me.
The Principle of Psychoprofessional Gravitation can help explain the many cases of people who have done work of world-historical significance yet wind up feeling like failures. Take Charles Darwin, who was just 22 when he set out on his five-year voyage aboard the Beagle in 1831. Returning at 27, he was celebrated throughout Europe for his discoveries in botany and zoology, and for his early theories of evolution. Over the next 30 years, Darwin took enormous pride in sitting atop the celebrity-scientist pecking order, developing his theories and publishing them as books and essays—the most famous being On the Origin of Species, in 1859.
But as Darwin progressed into his 50s, he stagnated; he hit a wall in his research. At the same time an Austrian monk by the name of Gregor Mendel discovered what Darwin needed to continue his work: the theory of genetic inheritance. Unfortunately, Mendel’s work was published in an obscure academic journal and Darwin never saw it—and in any case, Darwin did not have the mathematical ability to understand it. From then on he made little progress. Depressed in his later years, he wrote to a close friend, “I have not the heart or strength at my age to begin any investigation lasting years, which is the only thing which I enjoy.”
Presumably, Darwin would be pleasantly surprised to learn how his fame grew after his death, in 1882. From what he could see when he was old, however, the world had passed him by, and he had become irrelevant. That could have been Darwin on the plane behind me that night.
It also could have been a younger version of me, because I have had precocious experience with professional decline.
As a child, I had just one goal: to be the world’s greatest French-horn player. I worked at it slavishly, practicing hours a day, seeking out the best teachers, and playing in any ensemble I could find. I had pictures of famous horn players on my bedroom wall for inspiration. And for a while, I thought my dream might come true. At 19, I left college to take a job playing professionally in a touring chamber-music ensemble. My plan was to keep rising through the classical-music ranks, joining a top symphony orchestra in a few years or maybe even becoming a soloist—the most exalted job a classical musician can hold.
But then, in my early 20s, a strange thing happened: I started getting worse. To this day, I have no idea why. My technique began to suffer, and I had no explanation for it. Nothing helped. I visited great teachers and practiced more, but I couldn’t get back to where I had been. Pieces that had been easy to play became hard; pieces that had been hard became impossible.
Perhaps the worst moment in my young but flailing career came at age 22, when I was performing at Carnegie Hall. While delivering a short speech about the music I was about to play, I stepped forward, lost my footing, and fell off the stage into the audience. On the way home from the concert, I mused darkly that the experience was surely a message from God.
But I sputtered along for nine more years. I took a position in the City Orchestra of Barcelona, where I increased my practicing but my playing gradually deteriorated. Eventually I found a job teaching at a small music conservatory in Florida, hoping for a magical turnaround that never materialized. Realizing that maybe I ought to hedge my bets, I went back to college via distance learning, and earned my bachelor’s degree shortly before my 30th birthday. I secretly continued my studies at night, earning a master’s degree in economics a year later. Finally I had to admit defeat: I was never going to turn around my faltering musical career. So at 31 I gave up, abandoning my musical aspirations entirely, to pursue a doctorate in public policy.
Life goes on, right? Sort of. After finishing my studies, I became a university professor, a job I enjoyed. But I still thought every day about my beloved first vocation. Even now, I regularly dream that I am onstage, and wake to remember that my childhood aspirations are now only phantasms.
I am lucky to have accepted my decline at a young enough age that I could redirect my life into a new line of work. Still, to this day, the sting of that early decline makes these words difficult to write. I vowed to myself that it wouldn’t ever happen again.
Will it happen again? In some professions, early decline is inescapable. No one expects an Olympic athlete to remain competitive until age 60. But in many physically nondemanding occupations, we implicitly reject the inevitability of decline before very old age. Sure, our quads and hamstrings may weaken a little as we age. But as long as we retain our marbles, our quality of work as a writer, lawyer, executive, or entrepreneur should remain high up to the very end, right? Many people think so. I recently met a man a bit older than I am who told me he planned to “push it until the wheels came off.” In effect, he planned to stay at the very top of his game by any means necessary, and then keel over.
But the odds are he won’t be able to. The data are shockingly clear that for most people, in most fields, decline starts earlier than almost anyone thinks.
According to research by Dean Keith Simonton, a professor emeritus of psychology at UC Davis and one of the world’s leading experts on the trajectories of creative careers, success and productivity increase for the first 20 years after the inception of a career, on average. So if you start a career in earnest at 30, expect to do your best work around 50 and go into decline soon after that.
The specific timing of peak and decline varies somewhat depending on the field. Benjamin Jones, a professor of strategy and entrepreneurship at Northwestern University’s Kellogg School of Management, has spent years studying when people are most likely to make prizewinning scientific discoveries and develop key inventions. His findings can be summarized by a gloomy little ditty from the physicist Paul Dirac, lamenting that a physicist past 30 is as good as dead.
Dirac overstates the point, but only a little. Looking at major inventors and Nobel winners going back more than a century, Jones has found that the most common age for producing a magnum opus is the late 30s. He has shown that the likelihood of a major discovery increases steadily through one’s 20s and 30s and then declines through one’s 40s, 50s, and 60s. Are there outliers? Of course. But the likelihood of producing a major innovation at age 70 is approximately what it was at age 20—almost nonexistent.
Much of literary achievement follows a similar pattern. Simonton has shown that poets peak in their early 40s. Novelists generally take a little longer. When Martin Hill Ortiz, a poet and novelist, collected data on New York Times fiction best sellers from 1960 to 2015, he found that authors were likeliest to reach the No. 1 spot in their 40s and 50s. Despite the famous productivity of a few novelists well into old age, Ortiz shows a steep drop-off in the chance of writing a best seller after the age of 70. (Some nonfiction writers—especially historians—peak later, as we shall see in a minute.)
This research concerns people at the very top of atypical professions. But the basic finding appears to apply more broadly. Scholars at Boston College’s Center for Retirement Research studied a wide variety of jobs and found considerable susceptibility to age-related decline in fields ranging from policing to nursing. Other research has found that the best-performing home-plate umpires in Major League Baseball have 18 years less experience and are 23 years younger than the worst-performing umpires (who are 56.1 years old, on average). Among air traffic controllers, the age-related decline is so sharp—and the potential consequences of decline-related errors so dire—that the mandatory retirement age is 56.
In sum, if your profession requires mental processing speed or significant analytic capabilities—the kind of profession most college graduates occupy—noticeable decline is probably going to set in earlier than you imagine.
If decline not only is inevitable but also happens earlier than most of us expect, what should we do when it comes for us?
Whole sections of bookstores are dedicated to becoming successful. The shelves are packed with titles like The Science of Getting Rich and The 7 Habits of Highly Effective People. There is no section marked “Managing Your Professional Decline.”
But some people have managed their declines well. Consider the case of Johann Sebastian Bach. Born in 1685 to a long line of prominent musicians in central Germany, Bach quickly distinguished himself as a musical genius. In his 65 years, he published more than 1,000 compositions for all the available instrumentations of his day.
Early in his career, Bach was considered an astoundingly gifted organist and improviser. Commissions rolled in; royalty sought him out; young composers emulated his style. He enjoyed real prestige.
But it didn’t last—in no small part because his career was overtaken by musical trends ushered in by, among others, his own son, Carl Philipp Emanuel, known as C.P.E. to the generations that followed. The fifth of Bach’s 20 children, C.P.E. exhibited the musical gifts his father had. He mastered the baroque idiom, but he was more fascinated with a new “classical” style of music, which was taking Europe by storm. As classical music displaced baroque, C.P.E.’s prestige boomed while his father’s music became passé.
Bach easily could have become embittered, like Darwin. Instead, he chose to redesign his life, moving from innovator to instructor. He spent a good deal of his last 10 years writing The Art of Fugue, not a famous or popular work in his time, but one intended to teach the techniques of the baroque to his children and students—and, as unlikely as it seemed at the time, to any future generations that might be interested. In his later years, he lived a quieter life as a teacher and a family man.
What’s the difference between Bach and Darwin? Both were preternaturally gifted and widely known early in life. Both attained permanent fame posthumously. Where they differed was in their approach to the midlife fade. When Darwin fell behind as an innovator, he became despondent and depressed; his life ended in sad inactivity. When Bach fell behind, he reinvented himself as a master instructor. He died beloved, fulfilled, and—though less famous than he once had been—respected.
The lesson for you and me, especially after 50: Be Johann Sebastian Bach, not Charles Darwin.
How does one do that?
A potential answer lies in the work of the British psychologist Raymond Cattell, who in the early 1940s introduced the concepts of fluid and crystallized intelligence. Cattell defined fluid intelligence as the ability to reason, analyze, and solve novel problems—what we commonly think of as raw intellectual horsepower. Innovators typically have an abundance of fluid intelligence. It is highest relatively early in adulthood and diminishes starting in one’s 30s and 40s. This is why tech entrepreneurs, for instance, do so well so early, and why older people have a much harder time innovating.
Crystallized intelligence, in contrast, is the ability to use knowledge gained in the past. Think of it as possessing a vast library and understanding how to use it. It is the essence of wisdom. Because crystallized intelligence relies on an accumulating stock of knowledge, it tends to increase through one’s 40s, and does not diminish until very late in life.
Careers that rely primarily on fluid intelligence tend to peak early, while those that use more crystallized intelligence peak later. For example, Dean Keith Simonton has found that poets—highly fluid in their creativity—tend to have produced half their lifetime creative output by age 40 or so. Historians—who rely on a crystallized stock of knowledge—don’t reach this milestone until about 60.
Here’s a practical lesson we can extract from all this: No matter what mix of intelligence your field requires, you can always endeavor to weight your career away from innovation and toward the strengths that persist, or even increase, later in life.
Like what? As Bach demonstrated, teaching is an ability that decays very late in life, a principal exception to the general pattern of professional decline over time. A study in The Journal of Higher Education showed that the oldest college professors in disciplines requiring a large store of fixed knowledge, specifically the humanities, tended to get evaluated most positively by students. This probably explains the professional longevity of college professors, three-quarters of whom plan to retire after age 65—more than half of them after 70, and some 15 percent of them after 80. (The average American retires at 61.) One day, during my first year as a professor, I asked a colleague in his late 60s whether he’d ever considered retiring. He laughed, and told me he was more likely to leave his office horizontally than vertically.
Our dean might have chuckled ruefully at this—college administrators complain that research productivity among tenured faculty drops off significantly in the last decades of their career. Older professors take up budget slots that could otherwise be used to hire young scholars hungry to do cutting-edge research. But perhaps therein lies an opportunity: If older faculty members can shift the balance of their work from research to teaching without loss of professional prestige, younger faculty members can take on more research.
Patterns like this match what I’ve seen as the head of a think tank full of scholars of all ages. There are many exceptions, but the most profound insights tend to come from those in their 30s and early 40s. The best synthesizers and explainers of complicated ideas—that is, the best teachers—tend to be in their mid-60s or older, some of them well into their 80s.
That older people, with their stores of wisdom, should be the most successful teachers seems almost cosmically right. No matter what our profession, as we age we can dedicate ourselves to sharing knowledge in some meaningful way.
A few years ago, I saw a cartoon of a man on his deathbed saying, “I wish I’d bought more crap.” It has always amazed me that many wealthy people keep working to increase their wealth, amassing far more money than they could possibly spend or even usefully bequeath. One day I asked a wealthy friend why this is so. Many people who have gotten rich know how to measure their self-worth only in pecuniary terms, he explained, so they stay on the hamster wheel, year after year. They believe that at some point, they will finally accumulate enough to feel truly successful, happy, and therefore ready to die.
This is a mistake, and not a benign one. Most Eastern philosophy warns that focusing on acquisition leads to attachment and vanity, which derail the search for happiness by obscuring one’s essential nature. As we grow older, we shouldn’t acquire more, but rather strip things away to find our true selves—and thus, peace.
At some point, writing one more book will not add to my life satisfaction; it will merely stave off the end of my book-writing career. The canvas of my life will have another brushstroke that, if I am being forthright, others will barely notice, and will certainly not appreciate very much. The same will be true for most other markers of my success.
What I need to do, in effect, is stop seeing my life as a canvas to fill, and start seeing it more as a block of marble to chip away at and shape something out of. I need a reverse bucket list. My goal for each year of the rest of my life should be to throw out things, obligations, and relationships until I can clearly see my refined self in its best form.
And that self is … who, exactly?
Last year, the search for an answer to this question took me deep into the South Indian countryside, to a town called Palakkad, near the border between the states of Kerala and Tamil Nadu. I was there to meet the guru Sri Nochur Venkataraman, known as Acharya (“Teacher”) to his disciples. Acharya is a quiet, humble man dedicated to helping people attain enlightenment; he has no interest in Western techies looking for fresh start-up ideas or burnouts trying to escape the religious traditions they were raised in. Satisfied that I was neither of those things, he agreed to talk with me.
I told him my conundrum: Many people of achievement suffer as they age, because they lose their abilities, gained over many years of hard work. Is this suffering inescapable, like a cosmic joke on the proud? Or is there a loophole somewhere—a way around the suffering?
Acharya answered elliptically, explaining an ancient Hindu teaching about the stages of life, or ashramas. The first is Brahmacharya, the period of youth and young adulthood dedicated to learning. The second is Grihastha, when a person builds a career, accumulates wealth, and creates a family. In this second stage, the philosophers find one of life’s most common traps: People become attached to earthly rewards—money, power, sex, prestige—and thus try to make this stage last a lifetime.
The antidote to these worldly temptations is Vanaprastha, the third ashrama, whose name comes from two Sanskrit words meaning “retiring” and “into the forest.” This is the stage, usually starting around age 50, in which we purposefully focus less on professional ambition, and become more and more devoted to spirituality, service, and wisdom. This doesn’t mean that you need to stop working when you turn 50—something few people can afford to do—only that your life goals should adjust.
Vanaprastha is a time for study and training for the last stage of life, Sannyasa, which should be totally dedicated to the fruits of enlightenment. In times past, some Hindu men would leave their family in old age, take holy vows, and spend the rest of their life at the feet of masters, praying and studying. Even if sitting in a cave at age 75 isn’t your ambition, the point should still be clear: As we age, we should resist the conventional lures of success in order to focus on more transcendentally important things.
I told Acharya the story about the man on the plane. He listened carefully, and thought for a minute. “He failed to leave Grihastha,” he told me. “He was addicted to the rewards of the world.” He explained that the man’s self-worth was probably still anchored in the memories of professional successes many years earlier, his ongoing recognition purely derivative of long-lost skills. Any glory today was a mere shadow of past glories. Meanwhile, he’d completely skipped the spiritual development of Vanaprastha, and was now missing out on the bliss of Sannyasa.
There is a message in this for those of us suffering from the Principle of Psychoprofessional Gravitation. Say you are a hard-charging, type-A lawyer, executive, entrepreneur, or—hypothetically, of course—president of a think tank. From early adulthood to middle age, your foot is on the gas, professionally. Living by your wits—by your fluid intelligence—you seek the material rewards of success, you attain a lot of them, and you are deeply attached to them. But the wisdom of Hindu philosophy—and indeed the wisdom of many philosophical traditions—suggests that you should be prepared to walk away from these rewards before you feel ready. Even if you’re at the height of your professional prestige, you probably need to scale back your career ambitions in order to scale up your metaphysical ones.
When the New York Times columnist David Brooks talks about the difference between “résumé virtues” and “eulogy virtues,” he’s effectively putting the ashramas in a practical context. Résumé virtues are professional and oriented toward earthly success. They require comparison with others. Eulogy virtues are ethical and spiritual, and require no comparison. Your eulogy virtues are what you would want people to talk about at your funeral. As in He was kind and deeply spiritual, not He made senior vice president at an astonishingly young age and had a lot of frequent-flier miles.
You won’t be around to hear the eulogy, but the point Brooks makes is that we live the most fulfilling life—especially once we reach midlife—by pursuing the virtues that are most meaningful to us.
I suspect that my own terror of professional decline is rooted in a fear of death—a fear that, even if it is not conscious, motivates me to act as if death will never come by denying any degradation in my résumé virtues. This denial is destructive, because it leads me to ignore the eulogy virtues that bring me the greatest joy.
How can I overcome this tendency? The Buddha recommends, of all things, corpse meditation: Many Theravada Buddhist monasteries in Thailand and Sri Lanka display photos of corpses in various states of decomposition for the monks to contemplate. “This body, too,” students are taught to say about their own body, “such is its nature, such is its future, such is its unavoidable fate.” At first this seems morbid. But its logic is grounded in psychological principles—and it’s not an exclusively Eastern idea. “To begin depriving death of its greatest advantage over us,” Michel de Montaigne wrote in the 16th century, “let us deprive death of its strangeness, let us frequent it, let us get used to it; let us have nothing more often in mind than death.”
Psychologists call this desensitization, in which repeated exposure to something repellent or frightening makes it seem ordinary, prosaic, not scary. And for death, it works. In 2017, a team of researchers at several American universities recruited volunteers to imagine they were terminally ill or on death row, and then to write blog posts about either their imagined feelings or their would-be final words. The researchers then compared these expressions with the writings and last words of people who were actually dying or facing capital punishment. The results, published in Psychological Science, were stark: The words of the people merely imagining their imminent death were three times as negative as those of the people actually facing death—suggesting that, counterintuitively, death is scarier when it is theoretical and remote than when it is a concrete reality closing in.
For most people, actively contemplating our demise so that it is present and real (rather than avoiding the thought of it via the mindless pursuit of worldly success) can make death less frightening; embracing death reminds us that everything is temporary, and can make each day of life more meaningful. “Death destroys a man,” E. M. Forster wrote, but “the idea of Death saves him.”
Decline is inevitable, and it occurs earlier than almost any of us wants to believe. But misery is not inevitable. Accepting the natural cadence of our abilities sets up the possibility of transcendence, because it allows the shifting of attention to higher spiritual and life priorities.
But such a shift demands more than mere platitudes. I embarked on my research with the goal of producing a tangible road map to guide me during the remaining years of my life. This has yielded four specific commitments.
The biggest mistake professionally successful people make is attempting to sustain peak accomplishment indefinitely, trying to make use of the kind of fluid intelligence that begins fading relatively early in life. This is impossible. The key is to enjoy accomplishments for what they are in the moment, and to walk away perhaps before I am completely ready—but on my own terms.
So: I’ve resigned my job as president of the American Enterprise Institute, effective right about the time this essay is published. While I have not detected deterioration in my performance, it was only a matter of time. Like many executive positions, the job is heavily reliant on fluid intelligence. Also, I wanted freedom from the consuming responsibilities of that job, to have time for more spiritual pursuits. In truth, this decision wasn’t entirely about me. I love my institution and have seen many others like it suffer when a chief executive lingered too long.
Leaving something you love can feel a bit like a part of you is dying. In Tibetan Buddhism, there is a concept called bardo, which is a state of existence between death and rebirth—“like a moment when you step toward the edge of a precipice,” as a famous Buddhist teacher puts it. I am letting go of a professional life that answers the question Who am I?
I am extremely fortunate to have the means and opportunity to be able to walk away from a job. Many people cannot afford to do that. But you don’t necessarily have to quit your job; what’s important is striving to detach progressively from the most obvious earthly rewards—power, fame and status, money—even if you continue to work or advance a career. The real trick is walking into the next stage of life, Vanaprastha, to conduct the study and training that prepare us for fulfillment in life’s final stage.
Time is limited, and professional ambition crowds out things that ultimately matter more. To move from résumé virtues to eulogy virtues is to move from activities focused on the self to activities focused on others. This is not easy for me; I am a naturally egotistical person. But I have to face the fact that the costs of catering to selfishness are ruinous—and I now work every day to fight this tendency.
Fortunately, an effort to serve others can play to our strengths as we age. Remember, people whose work focuses on teaching or mentorship, broadly defined, peak later in life. I am thus moving to a phase in my career in which I can dedicate myself fully to sharing ideas in service of others, primarily by teaching at a university. My hope is that my most fruitful years lie ahead.
Because I’ve talked a lot about various religious and spiritual traditions—and emphasized the pitfalls of overinvestment in career success—readers might naturally conclude that I am making a Manichaean separation between the worlds of worship and work, and suggesting that the emphasis be on worship. That is not my intention. I do strongly recommend that each person explore his or her spiritual self—I plan to dedicate a good part of the rest of my life to the practice of my own faith, Roman Catholicism. But this is not incompatible with work; on the contrary, if we can detach ourselves from worldly attachments and redirect our efforts toward the enrichment and teaching of others, work itself can become a transcendental pursuit.
“The aim and final end of all music,” Bach once said, “should be none other than the glory of God and the refreshment of the soul.” Whatever your metaphysical convictions, refreshment of the soul can be the aim of your work, like Bach’s.
Bach finished each of his manuscripts with the words Soli Deo gloria—“Glory to God alone.” He failed, however, to write these words on his last manuscript, “Contrapunctus 14,” from The Art of Fugue, which abruptly stops mid-measure. His son C.P.E. added these words to the score: “Über dieser Fuge … ist der Verfasser gestorben” (“At this point in the fugue … the composer died”). Bach’s life and work merged with his prayers as he breathed his last breath. This is my aspiration.
Throughout this essay, I have focused on the effect that the waning of my work prowess will have on my happiness. But an abundance of research strongly suggests that happiness—not just in later years but across the life span—is tied directly to the health and plentifulness of one’s relationships. Pushing work out of its position of preeminence—sooner rather than later—to make space for deeper relationships can provide a bulwark against the angst of professional decline.
Dedicating more time to relationships, and less to work, is not inconsistent with continued achievement. “He is like a tree planted by streams of water,” the Book of Psalms says of the righteous person, “yielding its fruit in season, whose leaf does not wither, and who prospers in all he does.” Think of an aspen tree. To live a life of extraordinary accomplishment is—like the tree—to grow alone, reach majestic heights alone, and die alone. Right?
The secret to bearing my decline—to enjoying it—is to become more conscious of the roots linking me to others. If I have properly developed the bonds of love among my family and friends, my own withering will be more than offset by blooming in others.
When I talk about this personal research project I’ve been pursuing, people usually ask: Whatever happened to the hero on the plane?
I think about him a lot. He’s still famous, popping up in the news from time to time. Early on, when I saw a story about him, I would feel a flash of something like pity—which I now realize was really only a refracted sense of terror about my own future. Poor guy really meant I’m screwed.
But as my grasp of the principles laid out in this essay has deepened, my fear has declined proportionately. My feeling toward the man on the plane is now one of gratitude for what he taught me. I hope that he can find the peace and joy he is inadvertently helping me attain.
Arthur C. Brooks is a contributing writer at The Atlantic, a professor of the practice of public leadership at the Harvard Kennedy School, a senior fellow at the Harvard Business School, and host of the podcast The Art of Happiness With Arthur Brooks.
Kyrgyzstani Prime Minister Kubatbek Boronov has resigned from office after just a few months on the job, amid protests and riots in Bishkek against parliamentary election results. National lawmakers have already appointed his replacement: Sadyr Zhaparov, a former member of parliament who was freed from prison just hours earlier by demonstrators. A court previously sentenced Zhaparov to 10 years behind bars for allegedly organizing riots.
WITHIN A WEEK of discovering she was pregnant in late April, Sylvie (not her real name) knew she wanted an abortion. The pandemic had made her the sole breadwinner, and she had a young daughter to look after. She called Marie Stopes, a charity, which arranged a phone consultation with a representative from BPAS, another charity, at her local hospital. Four days later a packet of medicine arrived through the letterbox, and she terminated her pregnancy at home with the support of her partner. Abortion is a “horrible thing,” she says. But “in terms of how it was handled, it couldn’t have gone better”.
Sylvie is one of 23,000 women in England and Wales who had an abortion at home between April and June. That this was possible was due to a temporary change in the rules introduced as the country went into lockdown. In normal times, the first of the two pills required for a medical abortion must be taken at a hospital or clinic. But emergency measures, introduced on March 30th to avoid unnecessary hospital visits, designated women’s homes as another place where the pills could be taken, at least until ten weeks of gestation.
As a result of the change, abortions are now happening earlier. Data published on September 10th show that between January and June this year, there were 109,836 abortions in England and Wales. Some 50% of these, including Sylvie’s, were performed before seven weeks, compared with fewer than 40% during the same period in 2019. The proportion performed before ten weeks rose from 81% to 86%. There was also a small uptick in the overall number.
Abortion is usually a safe procedure, but there is a direct correlation between the risk of complication and weeks of gestation, says Sam Rowlands, a doctor at the British Society of Abortion Care Providers, a representative group. That means easing access to early terminations has increased the safety of abortion care, says Edward Morris of the Royal College of Obstetricians and Gynaecologists. Both groups have called for the changes to be made permanent. The government has said it will launch a public consultation on the matter.
The picture is gloomier in those parts of Europe where politicians did not do much to ease access to abortion. Recent research by Abigail Aiken of the University of Texas at Austin looked at enquiries to Women on Web, a Canadian charity that provides pills to women in countries where at-home abortions are illegal. She found that during the pandemic they shot up in Italy (by 68%) and Portugal (by 139%). In Britain they fell to negligible levels.
Sylvie says the new way of doing things also reduced the psychological toll of the procedure. In 2011 she had to wait five weeks for an abortion, by which point she was nearing her second trimester. She lives in rural Cornwall, an area she says is “lacking in health care [providers] and forward thinking”. She remembers being passed “from pillar to post” while attempting to get an abortion. The experience was so bad she made a formal complaint. This time, however, she says the process was “respectful”, “compassionate” and, crucially for her, “private”. ■
This article appeared in the Britain section of the print edition under the headline “Call to action”
Bipolar disorder and thought disorders such as schizophrenia are serious mental disorders, which often have a great impact on a person’s life and well-being. In a number of cases, bipolar disorder and schizophrenia are first diagnosed several years after the onset of the disorder. The delay in diagnosis is often associated with an unfavorable prognosis for the course of the disorders.
The sooner the patient gets the correct diagnosis and begins targeted treatment, the better the prognosis. For this reason, researchers are aiming at identifying risk factors that will aid psychiatrists in reaching the correct diagnosis as early as possible.
Many people who develop bipolar disorder or psychosis initially come into contact with the mental health services due to depression. A research team from the Danish psychiatry project iPSYCH examined a dataset of 16,949 people aged 10-35 who had been treated for depression at a psychiatric hospital in Denmark.
“Our goal with the study was to investigate whether genetic factors are associated with an increased risk of developing bipolar disorder or psychosis among patients with depression. This knowledge can potentially be used in clinical practice to identify patients who should be monitored even more closely,” said lead author Dr. Katherine Musliner from the National Centre for Register-based Research.
Study results are published in the American Journal of Psychiatry.
One factor the researchers examined was whether the genetic risk scores for bipolar disorder and schizophrenia could help psychiatrists determine which of their patients with depression were at greatest risk of subsequently developing bipolar disorder or a psychosis. A genetic risk score represents a person’s individual genetic risk of developing a disorder.
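In broad terms, a genetic (polygenic) risk score of this kind is computed as a weighted sum of a person’s risk-allele counts, with per-variant weights estimated from genome-wide association studies. The source does not describe the study’s exact scoring pipeline, so the sketch below is only a minimal illustration of the general technique; the variant IDs and effect weights are hypothetical.

```python
# Minimal sketch of a polygenic risk score: a weighted sum of risk-allele
# counts (0, 1, or 2 per variant), with weights taken from GWAS effect sizes.
# All variant IDs and weights below are hypothetical illustrations, not
# values from the study discussed in the article.

def polygenic_risk_score(genotypes, weights):
    """genotypes: dict mapping variant ID -> risk-allele count (0, 1, or 2).
    weights: dict mapping variant ID -> per-allele effect size (e.g. log odds).
    Returns the weighted sum over variants present in both dicts."""
    return sum(weights[v] * genotypes[v] for v in weights if v in genotypes)

# Hypothetical example: three variants for one individual.
weights = {"rs0001": 0.12, "rs0002": -0.05, "rs0003": 0.30}
genotypes = {"rs0001": 2, "rs0002": 1, "rs0003": 0}

score = polygenic_risk_score(genotypes, weights)
print(round(score, 2))  # 0.12*2 + (-0.05)*1 + 0.30*0 = 0.19
```

In practice such scores are standardized against a reference population and used to rank individuals by relative genetic risk, which is why, as the researchers note below, a small effect size limits their clinical usefulness.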
“One thing we discovered was that the genetic risk score for bipolar disorder is associated with an increased risk of developing bipolar disorder, and that the genetic risk score for schizophrenia is associated with an increased risk of developing a psychosis among patients who have been diagnosed with depression,” says Musliner.
Musliner clarifies that although there was a correlation, the effect of the genetic risk scores was relatively small. Another member of the research group behind the study, Professor Søren Dinesen Østergaard from the Department of Clinical Medicine and Aarhus University Hospital – Psychiatry, said caution is needed when interpreting the results.
“At present, the genetic risk scores cannot contribute to early diagnosis of bipolar disorder and psychoses in clinical practice, but it cannot be ruled out that this could be the future scenario. On the other hand, our study confirms that having a parent with bipolar disorder or a psychosis is a strong predictor for the development of these particular disorders after depression.
“This underlines the importance of getting information about mental disorders in the family as part of the assessment of people suffering from depression,” he said.
SHANGHAI (Reuters) – Mainland China reported nine new COVID-19 cases as of Sept. 16, down from 12 reported a day earlier, the country’s national health authority said on Thursday.
The National Health Commission said in a statement all new cases were imported infections involving travellers from overseas. The number of new asymptomatic patients also fell to 14 from 16 a day earlier, though China does not count these patients as confirmed cases.
The total number of confirmed COVID-19 cases in mainland China now stands at 85,223, while the death toll remained unchanged at 4,634.
(Reporting by Jing Wang and Engen Tham; writing by Se Young Lee; Editing by Shri Navaratnam)