Volatility And Valuations

Authored by Nick Colas via DataTrek Research,

We will start today’s discussion with two numbers: 5.6 percent and 3.4 percent. Those are the 20-year compounded annual growth rates (CAGR) for the S&P 500 from 1999 – 2018 on a nominal and real (after-inflation) basis. If those strike you as pathetically low, you are correct:

  • That 5.6 percent nominal return is barely half the S&P’s long run average 20-year CAGR of 10.8 percent.
  • It is also the worst nominal 20-year CAGR since the period spanning the Great Depression.
  • The 3.4 percent inflation-adjusted 20-year CAGR is the worst since the 1969 – 1988 timeframe, where the 1980s bull market only barely made up for the inflationary 1970s. Long run inflation-adjusted S&P returns are 7.1 percent, more than double that 3.4 percent result.
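The relationship between those nominal and real figures is ordinary compound-growth arithmetic. A minimal sketch, with illustrative index levels rather than actual S&P data:

```python
def cagr(start_value, end_value, years):
    """Compounded annual growth rate between two index levels."""
    return (end_value / start_value) ** (1 / years) - 1

def real_rate(nominal, inflation):
    """Fisher relation: back average inflation out of a nominal rate."""
    return (1 + nominal) / (1 + inflation) - 1

# Illustrative: an index that triples over 20 years with 2% average inflation.
nominal = cagr(100.0, 300.0, 20)   # ~5.6% per year
real = real_rate(nominal, 0.02)    # ~3.6% per year
print(f"nominal {nominal:.1%}, real {real:.1%}")
```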

Now, one might say that 1999 is an unfair starting point, but the 20-year CAGR data tells much the same story about sub-par or merely average returns across other timeframes:

  • 1997 – 2016: 7.6 percent nominal, 5.5 percent real returns
  • 2002 – 2021: 9.4 percent nominal, 7.1 percent real returns

The reason for these disappointing results for long-term equity investors comes down to two 5-year periods: 1997 – 2002 and 2007 – 2012. In both cases, the S&P 500 went nowhere for half a decade. Returns from 2003 – 2006 were decent, averaging 15 percent with no drawdown years, but that still was not enough to make up for the stagnant bookends on either side of that time span.

Interestingly, it was not corporate earnings power that caused these two “lost” half decades:

  • S&P 500 earnings were $44.01 in 1997, when the index closed the year at 970. When the S&P got back to that level in Q2 2003, trailing 4 quarter earnings were $48.95/share. That is a difference of 11 percent.
  • In 2007, the S&P earned $82.54/share and the index finished the year at 1,475. The next time the S&P was at similar levels after the Financial Crisis/Great Recession was in early 2013, when the S&P had earned $98.35/share in the prior 4 quarters. The difference here is 19 percent, but the index was flat from 2007 to 2013.
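Those earnings-growth figures check out directly from the per-share numbers quoted above:

```python
# Earnings-per-share figures quoted above, in dollars per share.
eps_1997, eps_2003 = 44.01, 48.95
eps_2007, eps_2013 = 82.54, 98.35

growth_a = eps_2003 / eps_1997 - 1   # ~11%
growth_b = eps_2013 / eps_2007 - 1   # ~19%
print(f"1997-2003: {growth_a:.0%}, 2007-2013: {growth_b:.0%}")
```

Earnings grew through both stretches; it was the multiple, not the “E”, that went nowhere.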

We cannot blame interest rates for this contraction in price-earnings multiples. Ten-year Treasury yields were lower in 2003 than in 1997 (3 versus 6 percent) and in 2013 relative to 2007 (1 – 2 percent versus 5 percent). If anything, multiples should have recovered more quickly and been higher in 2003 and 2013 than in 1997 and 2007. And yet, they clearly were not.

The chart below, which shows the 100-day rolling average of the CBOE VIX Index, offers a reasonable explanation for why US equity valuations contracted over 1997 – 2002 and 2007 – 2012 even with the tailwind of lower interest rates. As highlighted, in both periods the VIX was consistently above its long-run average of 20 for years on end. Yes, the S&P bottomed before the end of each period of volatility (2002 and 2009). The trouble was that valuations remained compressed for far longer than just the 2000 – 2002 and 2008 bear markets. That is why the S&P flatlined for 5 years in each case rather than just 1 – 3 years.
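A 100-day rolling average like the one in the chart is straightforward to reproduce. This sketch uses synthetic data as a stand-in for actual CBOE VIX closes (the series and threshold logic are assumptions for illustration):

```python
import numpy as np
import pandas as pd

# Synthetic stand-in for daily VIX closes (real use would load CBOE data).
rng = np.random.default_rng(0)
dates = pd.bdate_range("1997-01-01", periods=1500)
close = pd.Series(18 + 0.05 * rng.normal(0, 4, len(dates)).cumsum(), index=dates)

roll_100d = close.rolling(window=100).mean()   # the chart's smoothing
elevated = roll_100d > 20                      # above the long-run VIX average
print(f"{elevated.mean():.0%} of days had an elevated 100-day average")
```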

We have highlighted the current VIX running averages on the rightmost part of the graph, and those broadly resemble prior problematic periods for equity valuations. Happily, volatility has not yet overly damaged S&P price-earnings multiples relative to pre-pandemic levels. The S&P trades for 17.5x current earnings power of $228/share. In 2019, PE ratios ran between 18.5 – 19.5x. A bit of a haircut, true, but consistent with higher interest rates so let’s call it a wash.
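The multiple math in that paragraph works out as follows (using the midpoint of the 2019 range, which is my assumption):

```python
eps = 228.0                      # current S&P 500 earnings power, per the note
pe_now = 17.5
implied_level = pe_now * eps     # ~3,990 on the index
pe_2019 = (18.5 + 19.5) / 2      # midpoint of the 2019 PE range
haircut = pe_now / pe_2019 - 1   # ~-8% multiple compression vs 2019
print(f"implied level {implied_level:.0f}, haircut {haircut:.0%}")
```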

I do, however, worry about long-lasting equity market volatility far more than I worry about recession. Large public companies know how to make money, even during periods of economic stress, as the data presented above shows. The market-weighted nature of the S&P indexing process constantly resets in favor of businesses that accomplish that task better than others. Recession or no, these are fundamentally positive and permanent features and the cornerstone of our view that US large caps are the most productive asset class for long term investors.

The problem is that volatility grinds away at investor confidence. The longer it lasts, the lower stock valuations go. That is entirely rational, if unwelcome, but it takes years to regain investors’ trust after a long bout of volatility. That is how you end up with zero stock market returns for 5 years, and subpar returns for periods as long as 2 decades.

I think Chair Powell and the Federal Reserve understand this problem, albeit from the wider perspective of creating an environment consistent with sustainable economic growth. The Fed needs to get inflation under control quickly and permanently, because until they accomplish that goal capital markets volatility will remain high. That will limit capital formation and investment over the longer term, making the next economic cycle weaker than it would otherwise be.

Takeaway: while it may be painful in the near-term, long-term US equity investors should be hoping for very aggressive and effective monetary policy over the next 6-12 months and look to add stock exposure to portfolios as that unfolds. 

That will be the pathway to holding equity valuations at current levels and offers the possibility of better multiples in the next cycle.

The alternative – another 1-2 years of uncertainty – would threaten structural returns for 5 years or longer. History is clear on that point.

Tyler Durden Sun, 09/11/2022 - 21:00


Elon Musk says the Boring Company will reach a $1 trillion market cap by 2030

Musk said there’s really only one roadblock to the company achieving this mega-cap value.


Elon Musk wants to create and control an artificial superintelligence and guide humanity in an effort to colonize Mars. But before we get there, he wants to solve the problem of traffic right here on Earth. 

In 2016, the tech billionaire tweeted himself into a new company: "Traffic is driving me nuts. I am going to build a tunnel boring machine and just start digging..." he wrote. A series of tweets followed this proclamation as the idea germinated and cemented in Musk's head: "It shall be called 'The Boring Company.' I am actually going to do this."

Related: Elon Musk is frustrated about a major SpaceX roadblock

The firm's goal is to "solve the problem of soul-destroying traffic" by creating a series of underground transportation tunnels. Taking transportation underground, the company says, additionally "allows us to repurpose roads into community-enhancing spaces, and beautify our cities."

The tunneling company broke ground on its first project in February 2017 and has since completed three projects: the Las Vegas Convention Center (LVCC) Loop, the Hyperloop Test Track, and the R&D Tunnel. It is currently working on the 68-mile Vegas Loop, which will eventually connect 93 stations around Las Vegas. Once in operation, the Vegas Loop will transport 90,000 passengers every hour, according to the company. 


Part of Musk's proposition is that, with the right technology, he can make tunneling a quick and relatively inexpensive process. The company's Prufrock machine allows Boring to "construct mega-infrastructure projects in a matter of weeks instead of years." The machine can mine one mile/week, with new iterations expected to further increase that output. 

Elon Musk is looking to transform traffic and transportation with one of his many ventures. (Bloomberg/Getty Images)

By 2030, YouTuber and investor Warren Redlich wrote in a post on X, Boring will have more than 10,000 miles of tunnel. By 2035, he said, that number will rise to 100,000. With that increase in tunnel space, Redlich thinks Boring will IPO by 2028 and hit a $1 trillion market valuation by 2030. 
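As a rough feasibility check on the 10,000-mile figure (the start year and the per-machine pace below are my assumptions, the latter taken from the one-mile-per-week Prufrock rate mentioned earlier):

```python
miles_target = 10_000
miles_per_machine_week = 1            # Prufrock's quoted pace
weeks_available = (2030 - 2025) * 52  # assume digging at scale from 2025
machines_needed = miles_target / (miles_per_machine_week * weeks_available)
print(f"~{machines_needed:.0f} machines running nonstop")
```

In other words, the claim implies dozens of machines boring continuously for the rest of the decade.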

Musk said that this bullish prediction might actually be possible. 

"This is actually possible from a technology standpoint," he wrote in response. "By far the biggest impediment is getting permits. Construction is becoming practically illegal in North America and Europe!"



NULISA: Ultra-Sensitive Immunoassay Platform for Profiling Fluid-Based Neurodegenerative Protein Biomarkers

Efforts to identify biomarkers for neurodegenerative disease have been hampered by the lack of a proteomic tool with the required sensitivity to detect…


Broadcast Date: October 11, 2023
Time: 8:00 am PT, 11:00 am ET, 17:00 CET

Efforts to identify biomarkers for neurodegenerative disease have been hampered by the lack of a proteomic tool with the required sensitivity to detect very low concentrations of brain-derived proteins in plasma or serum and the ability to multiplex many analytes in a single assay.

In this webinar, we will describe the NULISA Platform, a novel immunoassay with attomolar level sensitivity and high multiplex capability. The performance of the NULISA assay was benchmarked to existing immunoassay platforms and Dr. Zetterberg will present data from his evaluation of the NULISA system’s ability to detect serum biomarkers associated with Alzheimer’s disease. Dr. Henrik Zetterberg is a leading researcher in the field of Alzheimer’s disease who has spent the past 10 years focused on the discovery and validation of blood-based biomarkers for CNS disorders.

 

During the presentation we will offer a chance to pose questions to our expert panelists. Any questions submitted during the webinar will be answered at a later date.

Doug Hinerfeld, PhD
Senior Director of Application Support
Alamar Biosciences
Henrik Zetterberg, MD, PhD
Professor of Neurochemistry
University of Gothenburg

 

The post NULISA: Ultra-Sensitive Immunoassay Platform for Profiling Fluid-Based Neurodegenerative Protein Biomarkers appeared first on GEN - Genetic Engineering and Biotechnology News.


NASA’s Webb finds carbon source on surface of Jupiter’s moon Europa

Jupiter’s moon Europa is one of a handful of worlds in our solar system that could potentially harbor conditions suitable for life. Previous research…


Jupiter’s moon Europa is one of a handful of worlds in our solar system that could potentially harbor conditions suitable for life. Previous research has shown that beneath its water-ice crust lies a salty ocean of liquid water with a rocky seafloor. However, planetary scientists had not confirmed if that ocean contained the chemicals needed for life, particularly carbon.

Science credit: Geronimo Villanueva (NASA/GSFC), Samantha Trumbo (Cornell Univ.), NASA, ESA, CSA. Image processing: Geronimo Villanueva (NASA/GSFC), Alyssa Pagan (STScI)

Astronomers using data from NASA’s James Webb Space Telescope have identified carbon dioxide in a specific region on the icy surface of Europa. Analysis indicates that this carbon likely originated in the subsurface ocean and was not delivered by meteorites or other external sources. Moreover, it was deposited on a geologically recent timescale. This discovery has important implications for the potential habitability of Europa’s ocean.

“On Earth, life likes chemical diversity – the more diversity, the better. We’re carbon-based life. Understanding the chemistry of Europa’s ocean will help us determine whether it’s hostile to life as we know it, or if it might be a good place for life,” said Geronimo Villanueva of NASA’s Goddard Space Flight Center in Greenbelt, Maryland, lead author of one of two independent papers describing the findings.

 

“We now think that we have observational evidence that the carbon we see on Europa’s surface came from the ocean. That’s not a trivial thing. Carbon is a biologically essential element,” added Samantha Trumbo of Cornell University in Ithaca, New York, lead author of the second paper analyzing these data.

 

NASA plans to launch its Europa Clipper spacecraft, which will perform dozens of close flybys of Europa to further investigate whether it could have conditions suitable for life, in October 2024.

 

A Surface-Ocean Connection

 

Webb finds that on Europa’s surface, carbon dioxide is most abundant in a region called Tara Regio – a geologically young area of generally resurfaced terrain known as “chaos terrain.” The surface ice has been disrupted, and there likely has been an exchange of material between the subsurface ocean and the icy surface.

 

“Previous observations from the Hubble Space Telescope show evidence for ocean-derived salt in Tara Regio,” explained Trumbo. “Now we’re seeing that carbon dioxide is heavily concentrated there as well. We think this implies that the carbon probably has its ultimate origin in the internal ocean.”

 

“Scientists are debating how much Europa’s ocean connects to its surface. I think that question has been a big driver of Europa exploration,” said Villanueva. “This suggests that we may be able to learn some basic things about the ocean’s composition even before we drill through the ice to get the full picture.”

 

Both teams identified the carbon dioxide using data from the integral field unit of Webb’s Near-Infrared Spectrograph (NIRSpec). This instrument mode provides spectra with a resolution of 200 x 200 miles (320 x 320 kilometers) on the surface of Europa, which has a diameter of 1,944 miles, allowing astronomers to determine where specific chemicals are located.
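For a sense of scale, the quoted figures imply roughly ten resolution elements across Europa's disk:

```python
resolution_mi = 200      # NIRSpec IFU spatial resolution quoted above
diameter_mi = 1_944      # Europa's diameter, per the release
elements_across = diameter_mi / resolution_mi
print(f"~{elements_across:.0f} resolution elements across the disk")
```

That coarse grid is still enough to localize the carbon dioxide to a single region like Tara Regio.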

 

Carbon dioxide isn’t stable on Europa’s surface. Therefore, the scientists say it’s likely that it was supplied on a geologically recent timescale – a conclusion bolstered by its concentration in a region of young terrain.

 

“These observations only took a few minutes of the observatory’s time,” said Heidi Hammel of the Association of Universities for Research in Astronomy, a Webb interdisciplinary scientist leading Webb’s Cycle 1 Guaranteed Time Observations of the solar system. “Even with this short period of time, we were able to do really big science. This work gives a first hint of all the amazing solar system science we’ll be able to do with Webb.”

Searching for a Plume

 

Villanueva’s team also looked for evidence of a plume of water vapor erupting from Europa’s surface. Researchers using NASA’s Hubble Space Telescope reported tentative detections of plumes in 2013, 2016, and 2017. However, finding definitive proof has been difficult.

 

The new Webb data shows no evidence of plume activity, which allowed Villanueva’s team to set a strict upper limit on the rate of material potentially being ejected. The team stressed, however, that their non-detection does not rule out a plume.

 

“There is always a possibility that these plumes are variable and that you can only see them at certain times. All we can say with 100% confidence is that we did not detect a plume at Europa when we made these observations with Webb,” said Hammel.

 

These findings may help inform NASA’s Europa Clipper mission, as well as ESA’s (European Space Agency’s) upcoming Jupiter Icy Moons Explorer (JUICE).

 

The two papers will be published in Science on Sept. 21.

 

The James Webb Space Telescope is the world’s premier space science observatory. Webb is solving mysteries in our solar system, looking beyond to distant worlds around other stars, and probing the mysterious structures and origins of our universe and our place in it. Webb is an international program led by NASA with its partners, ESA (European Space Agency) and the Canadian Space Agency.

