Case Study Analysis 1
Case Study Analysis
Student’s Name
Institution
Financial Statements Analysis
Materials costs were 33.8%, 33.3%, and 34.3% of sales in 2005, 2006, and 2007 respectively, and direct labor was 10.1%, 10.4%, and 11.2% over the same period. Total cost of goods sold as a percentage of sales was 50.5%, 49.4%, and 50.5% in 2005, 2006, and 2007 respectively, which means the gross margin was 49.5% in 2005 and 2007 and 50.6% in 2006. Gross margin is gross profit divided by sales. It indicates the company's profitability and efficiency at the most basic level, before operating expenses are deducted, and it estimates how much the company earns after accounting for the direct costs required to produce its goods and deliver its services.
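The gross margin arithmetic above can be sketched as follows. The COGS percentages are taken from the text; the sales figures are hypothetical placeholders chosen only to illustrate the calculation, not the case company's actual numbers.

```python
# Hypothetical sales figures (the text gives only percentages of sales).
sales = {2005: 1000.0, 2006: 1100.0, 2007: 1250.0}
# COGS as a percentage of sales, from the text.
cogs_pct = {2005: 0.505, 2006: 0.494, 2007: 0.505}

for year in sorted(sales):
    cogs = sales[year] * cogs_pct[year]
    gross_profit = sales[year] - cogs
    gross_margin = gross_profit / sales[year]   # gross profit / sales
    print(f"{year}: gross margin = {gross_margin:.1%}")
```

Run on these inputs, the loop reproduces the margins stated in the text: 49.5% in 2005 and 2007 and 50.6% in 2006.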
Market Approach
The market approach bases the value of the target business on sales of comparable businesses or business interests. It is particularly useful when valuing public companies, or private ones large enough to contemplate a public listing, because data on comparable firms are publicly and readily available. Under this approach, the analyst identifies recent, arm's-length transactions involving similar private and public companies and then derives pricing multiples. Several variants are available, including the guideline public company method and the merger and acquisition method (Palepu, Wright, Bradbury, & Coulton, 2020). The guideline public company method considers the market prices of comparable public companies' stock. A pricing multiple is calculated by dividing a comparable stock price by an accounting variable such as net income or operating cash flow. The merger and acquisition method derives pricing multiples from real-world transactions in which entire comparable firms or operating divisions were sold. The resulting multiples are then applied to the target company's economic variables (Damodaran, 2012). Under the market approach, the level of value obtained depends on whether the target company's economic variables are adjusted for discretionary items. When an expert makes discretionary adjustments that are available only to controlling shareholders, this can obviate the use of a control premium; however, the preliminary value may still contain an implicit discount for lack of control.
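The guideline public company mechanics can be sketched briefly. All prices and earnings here are hypothetical; the point is the two steps the text describes: derive a multiple from comparables, then apply it to the target's economic variable.

```python
# Hypothetical comparable companies: market price and earnings per share.
comparables = [
    {"price": 40.0, "eps": 2.5},   # P/E 16.0
    {"price": 66.0, "eps": 4.4},   # P/E ~15.0
    {"price": 51.0, "eps": 3.0},   # P/E 17.0
]

# Step 1: pricing multiple = comparable price / accounting variable (here EPS).
multiples = sorted(c["price"] / c["eps"] for c in comparables)
median_pe = multiples[len(multiples) // 2]   # use the median to damp outliers

# Step 2: apply the multiple to the target company's economic variable.
target_eps = 3.20                            # hypothetical target EPS
indicated_value_per_share = median_pe * target_eps
print(indicated_value_per_share)             # 16.0 * 3.20 = 51.2
```

Using the median rather than the mean is one common way to keep a single outlier comparable from distorting the indicated value.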
The Income Approach
When reliable market data are harder to obtain, the business valuation analyst can turn to the income approach. The approach converts expected future economic benefits, generally cash flow, into a present value. Because it bases value on the business's ability to generate future economic benefits, the method is generally suitable for profitable businesses that are well established in their industries (Boisjoly, Conine Jr, & McDonald IV, 2020). The capitalization of earnings method capitalizes projected future economic benefits at a relevant rate of return. The analyst may consider adjustments for items such as discretionary expenses, non-recurring revenues and expenses, unusual tax issues or accounting treatments, and differences in capital structure; the method is most relevant for firms with stable returns and cash flows.
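A minimal sketch of the capitalization of earnings method: the normalized benefit is divided by a capitalization rate, conventionally the discount rate less long-term growth. The dollar inputs below are hypothetical.

```python
def capitalized_value(benefit, discount_rate, long_term_growth):
    """Capitalization of earnings: value = normalized benefit / cap rate,
    where cap rate = discount rate - long-term growth rate."""
    cap_rate = discount_rate - long_term_growth
    if cap_rate <= 0:
        raise ValueError("discount rate must exceed long-term growth")
    return benefit / cap_rate

# Hypothetical inputs: $500k normalized cash flow, 18% required return, 3% growth.
print(capitalized_value(500_000, 0.18, 0.03))  # 500,000 / 0.15
```

The single-period formula is why the text restricts this method to firms with stable returns: one normalized benefit figure is assumed to persist, growing at a constant rate.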
The discounted cash flow method is also an income method. In addition to the factors considered in the capitalization of earnings method, the analyst accounts for projected cash flows over a discrete period, in this case a five-year projection, plus a terminal value at the end of that five-year period (Henschke & Homburg, 2009). All future cash flows and the terminal value are then discounted to present value using a discount rate rather than a capitalization rate.
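The discrete-period-plus-terminal-value structure can be sketched as below. The five-year forecast figures are hypothetical; the terminal value uses a Gordon-growth formula, which is one common choice rather than the only one.

```python
def dcf_value(cash_flows, discount_rate, terminal_growth):
    """Discount a discrete forecast (here five years) plus a Gordon-growth
    terminal value back to the present."""
    # Present value of each explicitly forecast cash flow.
    pv = sum(cf / (1 + discount_rate) ** t
             for t, cf in enumerate(cash_flows, start=1))
    # Terminal value at the end of the discrete period, then discounted back.
    terminal = cash_flows[-1] * (1 + terminal_growth) / (discount_rate - terminal_growth)
    pv += terminal / (1 + discount_rate) ** len(cash_flows)
    return pv

# Hypothetical five-year cash flow forecast, 12% discount rate, 3% terminal growth.
flows = [100.0, 110.0, 120.0, 130.0, 140.0]
print(round(dcf_value(flows, 0.12, 0.03), 1))
```

Note that the capitalization rate from the previous method reappears here inside the terminal value: the discount rate less the terminal growth rate.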
Analysts often spend a great deal of time building hyper-complex three-statement models, complete with revenue builds and an extensive set of line items (Da & Schaumburg, 2011). These models are genuinely useful for understanding a company's financial condition, margins, and growth, but they are extremely unreliable for determining at what price to purchase a stock; there is far too much ambiguity. Even if an analyst could perfectly project a company's financials and link them into a free-cash-flow DCF, there is no guarantee the company's stock price would converge on the calculated intrinsic value (Jennergren, 2010). A stock price represents the consensus view of millions of individuals at a given point in time, and if they all form their opinions using different models and valuation methods, the stock's price will not line up precisely with the target of the one analyst who got it right.
So projecting a precise target is not an effective way to generate alpha. Still, many analysts rely on sell-side price targets for the trajectory of a stock, believing that, over time, a stock's price will roughly follow the trajectory of its earnings: if a stock trades at $50 and the Wall Street consensus target is $60, the stock should trend upward. In some sense this may be true if enough investors act on the Wall Street recommendation. Investors seldom, if ever, take sell-side targets at face value, but many cross-check their valuations against sell-side ones, and there is a bias toward some degree of alignment with them (Moyo & Mache, 2018). Many amateur investors also follow sell-side recommendations, as do many financial advisors. Yet analysts have rarely been accurate with their price targets in the past (Bergeron, Gueyie, & Sedzro, 2018). Most use comparable multiples in their valuations, which are (a) highly susceptible to manipulation, a problem given that analysts are incentivized to issue "buy" recommendations, and (b) blind to the state of the broader market (if the entire market or sector is overvalued, the comparable multiple will be unjustifiably high). Moreover, sell-side analysts have no skin in the game with their recommendations; they do not have to put their money where their mouths are.
Many investors end up deep in the weeds building monster financial models for their valuations. However, past a certain point, the more complex the model, the less accurate it becomes. There is a fascinating study of professional odds-makers for horse racing. A group of them were asked to set odds on a race between 10 horses and were told they could have any 4 pieces of data for comparing them (jockey weight, age, breed, etc.). They forecast with 19% accuracy, which is not half bad, and reported feeling about 10% confident in their conclusions. They were then asked to project another race, this time with more pieces of information. After several iterations of this (up to something like 30 pieces of data), their accuracy had remained roughly constant, but their confidence had risen significantly despite the stagnation in accuracy. So complexity and extra information are not always as beneficial as one might intuitively think.
The main conclusion is that there should be a simpler way to value a stock that accounts for reality - meaning the actual behavior of the market - along with some degree of fundamentals, rather than relying entirely on one or the other.
Consider this valuation technique:
- The historical harmonic-average P/E ratio of the S&P 500 is 14.25 (monthly since 1928), and the average annual EPS growth over that time was 6.32%. Both appear to revert to the mean over time.
  o Hence, this seems a better way to capture the average perception of the market: 6.32% earnings growth warrants a 14.25 P/E multiple.
- So a company with growth above 6.32% should trade at a higher multiple, and one with lower growth at a lower multiple.
- Then one adds an adjustment for risk: a company with 6.32% growth but less risk (a narrower distribution of possible outcomes) than the average company should trade at a higher multiple, and vice versa.
- Finally, one adds a margin of safety (e.g., 5-10%) to compensate for unexpected events and forecasting error.
This method would only require calculating a growth rate, which could be done with a simplified P&L forecast sheet and would be easy to run a sensitivity analysis on.
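The technique above can be sketched in a few lines. The base P/E and growth figures come from the text; the linear growth scaling, the risk factor, and the default margin of safety are illustrative assumptions, not a standard formula.

```python
BASE_PE = 14.25       # historical harmonic-average S&P 500 P/E (from the text)
BASE_GROWTH = 0.0632  # historical average annual EPS growth (from the text)

def fair_value(eps, growth, risk_factor=1.0, margin_of_safety=0.10):
    """Growth- and risk-adjusted multiple, with a haircut for forecast error.
    Assumption: the multiple scales linearly with growth relative to the
    market average; risk_factor > 1 means below-average risk."""
    adj_pe = BASE_PE * (growth / BASE_GROWTH) * risk_factor
    return eps * adj_pe * (1 - margin_of_safety)

# Hypothetical: $4.00 EPS, 8% growth, slightly below-average risk.
print(round(fair_value(4.00, 0.08, risk_factor=1.05), 2))
```

By construction, a company with exactly 6.32% growth, average risk, and no margin of safety lands back on the 14.25 multiple; sensitivity analysis amounts to sweeping the growth input.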
The qualitative factors are, therefore, the most important piece of the analysis. Studies have also shown that decision-makers with lots of quantitative data overweight quantifiable indicators while missing qualitative ones. Buffett and Munger do not spend their time modeling depreciation, capex, or equity issuances. This is not to say these factors should be ignored, but there is an opportunity cost in spending too much time on them and, from what I have seen, decreasing marginal returns the deeper into the weeds one gets.
Valuation is one leg of a framework for picking stocks; the other two legs are sentiment and financials. To make an out-of-consensus stock pick, one needs a different, and correct, view on at least one of those three dimensions. With a valuation approach, the aim is to learn which valuation metric the market is using to value the stock and to determine whether that metric or analysis is the correct one, that is, the one with the most predictive power going forward (Saastamoinen & Savolainen, 2019). To make an out-of-consensus call, one needs a thesis as to why the market is using X when it should be looking at Y and, most importantly, the catalyst that will cause the market to switch to Y.
Valuation can also be useful when evaluating sentiment. If one believes a stock is overbought or oversold, both intrinsic valuation exercises and market valuations can help identify mispricings (Hawkins, 2002). However, care should be taken with market-based comparable valuation and benchmarking, because sectors and even industries sometimes exhibit momentum, which tends to eliminate relative mispricings. Intrinsic valuations can be more helpful during those periods.
Finally, valuation is critical for determining whether a financial factor is causing a mispricing. For example, if a company reports an accounting error that seems superficially important but does not affect the valuation method the market uses, the error should not affect the price over a longer period, and one should be able to take advantage of any overblown immediate market reaction. Additionally, taking historical views across multiple valuation metrics can help identify financial and accounting changes that are material and should move the stock. For instance, if margins improve and more free cash flow is generated, but the cash is reinvested in the business and does not hit EPS, future cash flows should increase and move the stock, yet the change might be overlooked because EPS did not move (Da & Schaumburg, 2011). In short, there is no single valuation method one can use as a panacea for a given company, sector, or industry. Valuation exercises are tools for understanding how a stock could be mispriced and should be used alongside other quantitative, technical, and fundamental research methods.
Multiples and residual income valuation are my go-to methods. DCF is the theoretically correct way to value an asset (and works fine for bonds), but unless it is a mature, stable company with historically consistent cash flows (utilities, and the JNJs and PGs of the world), I do not bother with it because it is bound to be fairly inaccurate. DCF valuations vary greatly with the input assumptions, and we tend to get more accurate assumptions for more stable, mature businesses. In hindsight (yes, hindsight is always 20/20), however, I have found DCF to be accurate even for fast-growing emerging markets, and one develops better assumptions about those markets over time. One can also back-test management's forecasts: if management has been consistent and its 2009 forecasts proved accurate for 2010, and so on, one knows its forecasts are reliable, and one can use its 2016 forecasts onward to model future earnings effectively. Throw in one's own variables as well for safety. If the current stock price comes out grossly below the computed value, one has a winner. DCF worked best for the broader market, using assumptions based purely on past performance. It proved to be the most accurate and consistent technique overall, and that accuracy and consistency were further enhanced by altering my assumptions for those markets; the statistical improvement over other techniques was overwhelming, even with naive assumptions. A model, such as a DCF, is a tool for converting one kind of belief into a different kind of estimate, no more, no less. Its ingredients are the risk-adjusted time value of money and the tautological observation that a financial asset is worth the present value of its future cash flows.
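Residual income valuation, named above as a go-to method, can be sketched as follows: equity value equals current book value plus the present value of earnings in excess of the equity charge. The inputs are hypothetical, and the sketch makes two simplifying assumptions: clean-surplus accounting with no dividends, and no residual income beyond the forecast horizon.

```python
def residual_income_value(book_value, forecast_earnings, cost_of_equity):
    """Equity value = book value + PV of (earnings - cost_of_equity * book value).
    Simplifications: clean surplus, no dividends, no terminal residual income."""
    value = book_value
    bv = book_value
    for t, earnings in enumerate(forecast_earnings, start=1):
        ri = earnings - cost_of_equity * bv        # residual income in year t
        value += ri / (1 + cost_of_equity) ** t    # discount it to the present
        bv += earnings                             # roll book value forward
    return value

# Hypothetical: $50/share book value, 10% cost of equity, three-year forecast.
print(round(residual_income_value(50.0, [7.0, 7.5, 8.0], 0.10), 2))
```

One appeal of this method over a DCF is that much of the value sits in the observable book value today, so errors in the forecast assumptions move the answer less.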
Recommendations
Different approaches suit different desired outcomes. If one cares where the market in a company's common equity goes in six months, no fundamental valuation metric will get the analyst there. If one is looking for long-term value, a DCF lets the analyst translate price levels and operating assumptions into an expected return, E(R), for the capital stack. One then compares that E(R) (effectively the WACC) to the best guess of the market's E(R) (with the CAPM as the most common estimator) and decides whether price-value convergence is likely, with a margin of safety built in. No serious investor looks at sell-side price targets and concludes that they predict security price movements; sell-side analysts are useful mainly because they can help one understand a business or gain more access to management.
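The CAPM estimate of the market's E(R) referred to above is a one-line formula; the inputs below are hypothetical placeholders.

```python
def capm_expected_return(risk_free, beta, market_return):
    """CAPM: E(R) = rf + beta * (E(Rm) - rf)."""
    return risk_free + beta * (market_return - risk_free)

# Hypothetical inputs: 3% risk-free rate, beta of 1.2, 8% expected market return,
# giving 0.03 + 1.2 * 0.05 = 9%.
print(round(capm_expected_return(0.03, 1.2, 0.08), 4))
```

In practice, the beta estimate and the equity risk premium are themselves judgment calls, which is why the text treats the CAPM as an estimator prism rather than ground truth.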
References
Bergeron, C., Gueyie, J. P., & Sedzro, K. (2018). Consumption, residual income valuation, and
long-run risk. Journal of Theoretical Accounting Research, 13(2), 1-32.
Boisjoly, R. P., Conine Jr, T. E., & McDonald IV, M. B. (2020). Working capital management:
Financial and valuation impacts. Journal of Business Research, 108, 1-8.
Da, Z., & Schaumburg, E. (2011). Relative valuation and analyst target price forecasts. Journal
of Financial Markets, 14(1), 161-192.
Damodaran, A. (2012). An Introduction to Valuation. Retrieved June 25, 2021, from
http://people.stern.nyu.edu/adamodar/pdfiles/eqnotes/ValIntro.pdf
Hawkins, G. B. (2002). Why Time Travel in Business Valuation is Wrong. Business Valuation
Review, 21(3), 1-8.
Henschke, S., & Homburg, C. (2009). Equity valuation using multiples: Controlling for
differences amongst peers. SSRN Working Paper No. 1270812. Retrieved from
https://ssrn.com/abstract=1270812
Jennergren, L. P. (2010). On the forecasting of net property, plant and equipment and
depreciation in firm valuation by the discounted cash flow model. Journal of Business
Valuation and Economic Loss Analysis, 5(1).
Moyo, V., & Mache, F. (2018). Inferring The Cost Of Equity: Does The CAPM Consistently
Outperform The Income And Multiples Valuation Models? Journal of Applied Business
Research (JABR), 34(3), 519-532.
Palepu, K. G., Wright, S., Bradbury, M., & Coulton, J. (2020). Business analysis and valuation:
Using financial statements. Cengage AU.
Saastamoinen, J., & Savolainen, H. (2019). Does the choice in valuation method matter in the
judicial appraisal of private firms? Journal of Business Finance & Accounting, 46(1-2),
183-199.