Financial Python

Studies in Finance and Python

Monte Carlo and fundamental analysis

A recent discussion about stock options and the creation of Trefis (and its ability to model firm value in a friendly way) made me wonder: why isn't Monte Carlo used more often in standard valuation models? Every b-school graduate has used @Risk or Crystal Ball, so associating probability distributions with revenue, expense, and other model drivers should be at least vaguely familiar.

This occurred to me because Trefis has a "crowdsourcing" feature that allows users to share their valuations with each other. If one could extract the driving assumptions from all of these models (assuming there are a lot of them for a given firm), I imagine the resulting valuation distribution might approximate the Monte Carlo model a single analyst would come up with.
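To make "a Monte Carlo model" concrete, here's a minimal Python sketch of the idea. Every distribution and the terminal-multiple shortcut below are made up purely for illustration; a real model would use the analyst's own drivers and a proper DCF.

    import numpy as np

    rng = np.random.default_rng(0)
    n_sims = 100_000

    # Illustrative distributions on the model drivers (all numbers made up)
    rev_growth = rng.normal(0.06, 0.03, n_sims)   # annual revenue growth
    op_margin = rng.normal(0.18, 0.04, n_sims)    # operating margin
    exit_mult = rng.normal(8.0, 1.5, n_sims)      # exit multiple on operating income

    rev0, years = 1_000.0, 5                      # current revenue ($mm) and horizon
    rev_t = rev0 * (1 + rev_growth) ** years      # terminal revenue
    firm_value = rev_t * op_margin * exit_mult    # crude terminal-value proxy

    # The output is a valuation distribution rather than a point estimate
    print("median value:", np.percentile(firm_value, 50).round(1))
    print("5th / 95th percentiles:", np.percentile(firm_value, [5, 95]).round(1))

Run the same loop over a pile of crowdsourced assumption sets instead of random draws and you would get the kind of valuation distribution described above.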

But why do this? Growth estimates (e.g. sales, expenses, etc.) reflect an analyst's opinion about the stock, right? If you don't believe your valuation and outlook, what's the point? By articulating a risk profile for a given valuation, one is forced to consider the risk picture more broadly. Even if your expected valuation agrees with the last trading price, the risk profile of the valuation can still be used (via options) to account for other potential outcomes. One could even compare the "fundamental" risk profile with that implied by stock options to determine whether there are meaningful differences in opinion. I know Bloomberg has implemented the variance-gamma option model that allows analysts to extract a return distribution that takes into account the implied volatility skew. Combining this with a Black-Litterman exercise to estimate returns for a given portfolio (e.g. S&P500) might make for some interesting analysis.

For example, I imagine a portfolio manager might apply the Black-Litterman approach to the S&P 500 and determine where the firm's fundamental analysts diverge meaningfully from the returns implied by the current 'optimal' index pricing/weighting. By adding a risk-profile layer to this basic analysis using Monte Carlo, the portfolio manager might find ways to trade a portfolio of options more effectively than simply buying or selling the underlying stock as he attempts to trade into his optimal exposures. Indeed, even if the firm's fundamental analysts agree completely with the returns implied by the Black-Litterman exercise, the individual firm risk profiles could suggest some micro or macro hedging via individual stock options or index options.
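On the comparison idea from above (lining up a "fundamental" risk profile against an option-implied one), here's an equally rough sketch that simply compares quantiles. The lognormal stand-in below is purely illustrative; a skew-aware fit such as the variance-gamma model mentioned earlier would be the real input.

    import numpy as np

    rng = np.random.default_rng(1)
    n_sims = 100_000

    # Hypothetical "fundamental" return distribution from an analyst's Monte Carlo
    fundamental_ret = rng.normal(0.08, 0.25, n_sims)

    # Stand-in for the option-implied return distribution (a plain lognormal here;
    # in practice it would be backed out of the vol surface, e.g. via variance-gamma)
    implied_ret = np.exp(rng.normal(0.02, 0.30, n_sims)) - 1

    # Where do the two views diverge?
    for q in (5, 25, 50, 75, 95):
        f = np.percentile(fundamental_ret, q)
        i = np.percentile(implied_ret, q)
        print(f"{q:>2}th percentile: fundamental {f:+.1%} vs. implied {i:+.1%}")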

One concern is term mismatch. Listed stock options are generally short-dated, whereas fundamental analysts typically (or should I say allegedly?) look for fundamental value to be realized over a longer horizon (years vs. weeks or months). I suppose one could look at LEAPS, but I'm not sure how practical it is to trade those longer-dated contracts.

Anyway, food for a future note-to-self. Maybe I just ate too much Thai food.

Written by DK

January 22, 2010 at 2:28 am

Posted in Finance

Synthetic tranches intuition for stock option guys

I had a couple of interesting conversations comparing equity options to tranches, so I thought I'd develop some of the parallels here.

I'm assuming you're already familiar with equity options (but perhaps not tranches), so let me start by walking you through an options example. Let's assume there is a stock index that, for argument's sake, can vary between $0 and $100. Now consider the following series of call spreads on this index.

  • Call spread A = long call option with a strike of $0, short a call with a strike of $3.
  • Call spread B = long call option with a strike of $3, short a call with a strike of $7.
  • Call spread C = long at $7, short at $10.
  • Call spread D = long at $10, short at $15.
  • Call spread E = long at $15, short at $30.
  • Call spread F = long at $30, short at $100 (I know we've limited the stock to $100, but work with me here).

Let's say the index trades at around $1.50. Call spread A is most sensitive to changes in the index price (relative to the other call spreads) since it is "at-the-money" (ATM). In contrast, the $30-$100 spread offers little value since it is so far "out-of-the-money" (OTM). If the stock price increases to $5, call spread A has moved completely "in-the-money" (ITM) and is no longer as sensitive to moves in the underlying index (the maximum PnL for the spread has been realized). Call spread B is now the ATM option portfolio. As the index price moves, the value of each call spread will fluctuate depending on whether it is ITM, ATM, or OTM. Another way to look at it is in terms of option premium. If the index is trading at $1.50, I'll likely get much more premium by selling call spread A or B than call spread F.
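To make the moneyness picture concrete, here's a minimal Python sketch of the expiry payoffs of these spreads at a few index levels (pre-expiry values and greeks are a separate matter):

    def call_spread_payoff(index_level, k_low, k_high):
        """Expiry payoff of long a call struck at k_low, short a call struck at k_high."""
        return max(index_level - k_low, 0.0) - max(index_level - k_high, 0.0)

    spreads = {"A": (0, 3), "B": (3, 7), "C": (7, 10),
               "D": (10, 15), "E": (15, 30), "F": (30, 100)}

    for index_level in (1.5, 5.0, 20.0):
        payoffs = {name: call_spread_payoff(index_level, lo, hi)
                   for name, (lo, hi) in spreads.items()}
        print(f"index at ${index_level}:", payoffs)

At $1.50 only spread A has intrinsic value; at $5.00 spread A is capped out and spread B is the one at the money, which is exactly the progression described above.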

Now let's consider the constituents of this index. Let's say it's made up of biotech companies that are highly dependent upon a certain upstream compound, pending FDA approval, for their businesses to succeed. If the compound is approved, these companies are going to make tons of money and the value of the index will likely approach $100. If it is not approved, the value of the index will approach $0. Your estimate of the compound's likelihood of approval will bias your view of call spread relative value. If you think approval is more likely than the market expects, you may be able to purchase the $30-100 call spread cheaply since it's OTM. If enough people agree with you, the premium on the $30-100 call spread will be driven higher until it reaches some equilibrium level. This reflects the binary nature of the approval process and the highly correlated expected returns of the index constituents.

The example would be much different if the index were made up of a well-diversified group of companies spanning different sectors. Some constituent stocks will go up and some will go down, but one might expect the distribution of potential index values to look more bell-shaped than the binary outcome described in the biotech example. In this case, the value of the $30-100 call spread will remain low since the index is unlikely to reach those higher levels (again, relative to the biotech example).

Now stop. Replace the "$" signs in the example above with "%", generalize the "biotech vs. diversified" discussion in your head to correlated vs. uncorrelated, and substitute "expected loss" for "expected return." You officially understand standardized synthetic tranches. Tranches on the standard CDX index work in exactly the same manner. The expected loss of the index is tranched into 0-3%, 3-7%, etc., slices. If the index is implying a loss of 1.5%, for example, the 0-3% tranche is the ATM tranche. The intuition regarding the greeks, discussed in previous posts, follows naturally (delta, gamma, rolldown/theta, vega/correl01).
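To see the mapping in code, here's a tiny sketch of the loss absorbed by a tranche as a function of portfolio loss. It is literally a call spread on loss, with the attachment and detachment points playing the role of strikes (the slices below mirror the standard CDX tranches referenced above):

    def tranche_loss(portfolio_loss, attach, detach):
        """Loss absorbed by a tranche: a 'call spread' on portfolio loss struck at attach/detach."""
        return min(max(portfolio_loss - attach, 0.0), detach - attach)

    tranches = [(0.00, 0.03), (0.03, 0.07), (0.07, 0.10),
                (0.10, 0.15), (0.15, 0.30), (0.30, 1.00)]

    portfolio_loss = 0.015   # index implying a 1.5% loss, so the 0-3% tranche is the "ATM" one
    for attach, detach in tranches:
        loss = tranche_loss(portfolio_loss, attach, detach)
        print(f"{attach:.0%}-{detach:.0%} tranche absorbs {loss:.2%} of portfolio notional "
              f"({loss / (detach - attach):.0%} of its own notional)")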

One common stumbling block is the whole expected return vs. expected loss business.  To be explicit, credit guys are primarily concerned with expected loss (default risk) whereas equity guys are focused on expected return. If I buy protection on the 0-3% tranche, I expect default risk to increase. When I buy the $0-3 call spread, I expect the stock price to increase. So remember, when you talk about CDS, you should talk explicitly in terms of buying and selling protection.

  • Buy protection = I expect things to get crappier (I want to short the credit)
  • Buy call option = I expect things to improve (I want to get long the stock)

So from a directional perspective (crappier <–> better), I suppose buying tranche protection is more like buying a put spread on a stock/index. For whatever reason, though, I prefer to think of it as buying a call spread on expected loss. This preference is driven by the quoting conventions of credit vs. stocks. CDS is quoted in spread (which reflects default risk) while stocks are quoted in terms of price.

The same term structure considerations apply as well, though one should remember that CDS maturities (e.g. 5, 7, 10 years) are much longer than those of typical equity options.
 
Anyway, there are direct lines one can draw between stock options and standard synthetic tranches. Hopefully this helps bridge the gap.

And for something totally unrelated, here's a link to an oldie but goodie:

Written by DK

January 9, 2010 at 7:14 pm

Posted in Finance

Top 5 Posts

I just noticed I passed 100 posts a little while ago. Small potatoes in comparison to the geometrically increasing data puking contest that is the Internet. Nevertheless, as we close out 2009, I thought it would be interesting to review the top five posts:

1) Delta and Mark-to-Market. A brief explanation of corporate synthetic tranche value sensitivity to the underlying portfolio.
2) Use your.flowingdata.com…for the children. Track your baby's sleep schedule (and pretty much anything else) via Twitter.
3) Sqlite and Sqlalchemy. An example of using python and a popular object-relational mapper.
4) Using Google Apps Python Provisioning API. An example of pulling user data via the python API and writing it to excel.
5) Use python and sqlite3 to build a database. A quick intro to python's sqlite3 module.

The "best of the rest":

It's been an interesting year. While most of my posts are derived from Interweb tidbits I find interesting, my original posts were much more popular (according to the admittedly crude Posterous stats). I have no ambitions for this blog, but I hope some of the factoids featured here have helped you or at least offered some entertainment. Best wishes for 2010!

Written by DK

December 29, 2009 at 12:58 am

Posted in Finance

This Time It’s Different

I was looking up Reinhart and Rogoff's book, This Time It's Different, on Amazon and the following results appeared:

Awesome…

Written by DK

December 21, 2009 at 4:53 pm

Posted in Finance

Gamma, Delta, and Mark-to-Market

I've gotten a few questions about gamma vs. delta as it relates to tranches (partially in reaction to an old post), so I thought I'd post my response here.

As I mentioned in "Delta and Mark-to-Market," one way to describe tranche risk is in terms of delta:

The delta of a tranche describes the leverage of the tranche relative to the underlying portfolio. So if a given tranche has a delta of 3x, a one-dollar swing in the underlying portfolio should result in roughly a $3 swing in the value of the tranche.

So, in the correlation market, one can "delta-hedge" spread risk by buying/selling the underlying index against the tranche. Given the example above, if I sold $10m in tranche protection and wanted to delta-hedge, I would buy 3x notional (or $30m) index protection. Theoretically speaking, this hedged position is now immunized against spread movement but still exposed to correlation risk.

As I also mentioned in that older post, however, tranches gain and lose delta depending on the expected loss of the underlying index. If the expected loss of the index is moving toward the attachment/detachment point of the tranche, the tranche gains delta. If expected loss is moving away from the attachment/detachment point, the tranche is losing delta.

This change in delta is sometimes called gamma risk. A position that is "long gamma" typically benefits from market volatility. The easiest way to explain this is to examine a hedged position. Let's examine a delta-hedged equity tranche position, where one sells equity tranche protection and buys index protection. Back when spreads were low, the equity tranche exhibited deltas on the order of 15x. So for the sake of argument (meaning the numbers that follow are completely made up but are directionally accurate), you'd sell $1 million in equity protection and buy $15 million in index protection.

Now, let's assume the index spread increases such that the equity tranche delta falls to 10x. As a result, your delta-hedged position is now over-hedged, in the sense that you own 15x delta but only need 10x. So you can sell your excess $5 million in index protection at a profit (since you bought protection and spreads are now higher). You've experienced a mark-to-market loss on your equity tranche, but made money on your hedge. Now let's assume spreads fall again, such that your delta increases to 18x. You buy more index protection to reset your hedge. If this process repeats itself over time, with spreads oscillating back and forth, you'll make money on your hedge by "buying low and selling high" as the delta changes. This is an example of a "long gamma" position that benefits from market volatility.
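Here's a deliberately stylized Python sketch of that rebalancing loop. Every input (the spread path, the deltas, the flat risky-duration approximation of hedge P&L) is made up purely to show the "buy low, sell high" mechanics; it is not a pricing model.

    tranche_notional = 1.0    # $mm of equity tranche protection sold
    risky_duration = 4.5      # flat proxy for the index's spread sensitivity (made up)

    # Made-up path of (index spread in bp, equity tranche delta): delta falls as
    # spreads widen and rises as they tighten, per the discussion above
    path = [(50, 15.0), (80, 10.0), (60, 13.0), (90, 9.0), (70, 12.0)]

    hedge_notional = 0.0      # index protection currently owned ($mm)
    avg_spread = 0.0          # average spread at which that protection was bought
    realized_pnl = 0.0        # P&L realized by trimming the hedge ($mm)

    for spread, delta in path:
        target = delta * tranche_notional
        change = target - hedge_notional
        if change > 0:        # delta rose: buy more protection at the (lower) spread
            avg_spread = (avg_spread * hedge_notional + spread * change) / target
        else:                 # delta fell: sell excess protection at the (higher) spread
            realized_pnl += (-change) * (spread - avg_spread) / 10_000 * risky_duration
        hedge_notional = target
        print(f"spread {spread}bp, delta {delta:4.1f}x -> hedge {hedge_notional:4.1f}mm, "
              f"realized hedge P&L {realized_pnl:6.3f}mm")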

Keep in mind that the numbers in the example above are pretty crude and not representative of what you'd see in current markets. Nevertheless, the basic mechanics should provide some decent intuition.

Written by DK

December 6, 2009 at 9:52 pm

Posted in Finance

Mint on Unemployment

Written by DK

December 4, 2009 at 2:23 pm

Posted in Finance

The Physics of Economics & Climate Change

This article describes a recent study by Tim Garrett, a scientist in Utah. The study is based on the concept that physics can be used to characterize the evolution of civilization. From the article:

"I'm not an economist, and I am approaching the economy as a physics problem," Garrett says. "I end up with a global economic growth model different than they have."

Garrett treats civilization like a "heat engine" that "consumes energy and does 'work' in the form of economic production, which then spurs it to consume more energy," he says.

"If society consumed no energy, civilization would be worthless," he adds. "It is only by consuming energy that civilization is able to maintain the activities that give it economic value. This means that if we ever start to run out of energy, then the value of civilization is going to fall and even collapse absent discovery of new energy sources."

Garrett says his study's key finding "is that accumulated economic production over the course of history has been tied to the rate of energy consumption at a global level through a constant factor."

One zinger comes towards the end:

"The problem is that, in order to stabilize emissions, not even reduce them, we have to switch to non-carbonized energy sources at a rate about 2.1 percent per year. That comes out to almost one new nuclear power plant per day."

I have no idea how solid this type of analysis is, but it sure is creative! It can't be any worse than some of the other models out there that are currently in use…

Written by DK

November 28, 2009 at 8:18 pm

Posted in Finance

Google Wave is built for sales & trading desks (and a little on Chrome OS)

I finally got a Google Wave invitation (yaay) and have fooled around with it a bit. It’s tough to really kick the tires when most of the people you would wave with don’t have an account yet. The only other option is to wade into massive public waves that appear a bit chaotic. It’s like when I first discovered usenet and electronic bulletin boards way back when. I had no idea what was going on and the geek factor was kicked up a notch. But it was also sort of cool. Anyway, here’s former Lifehacker Gina Trapani explaining Google Wave at W2E:

Nevertheless, is it just me, or does Google Wave cry out for a trading desk application? I can see an enterprising outfit using Google’s open source Wave protocol to bring trading communications into the 21st century. Between the persistent state of wave “documents” and the extensibility it offers with bots and gadgets, I could see Google Wave replacing many solutions firms currently depend on for internal and external communication. There are good structural reasons why it probably won’t happen, but a little speculation doesn’t hurt.

From my experience, investment banks currently use a patchwork of communication channels. Most have their own internal chat system, Bloomberg messaging/chat, email, AIM (well they used to use AIM), and the telephone. From a research perspective, notes are syndicated via email, Bloomberg, internal chat, proprietary blog-like systems, and (of course) hardcopy.

So what does Google Wave offer? From an inside-the-firm perspective, it’s easy to see Wave helping traders, analysts, and salespeople collaborate around a central hub of information. That’s the whole point of having a “desk” where people sit right next to each other – to improve communication. In a global enterprise, however, it can be difficult to achieve the immediacy market-making demands. Using a centralized wave to manage communications would certainly reduce the number of tools in use and provide a re-playable record of the day’s activity. For example, currency traders in NY could replay or review a shared global wave as they take over from London, etc. Wave gadgets could also be created for the ever-popular polls that get sent out to clients and other traders in the bank. In-line responses would also help organize the information in a single place rather than switching from chat to email to Bloomberg throughout the day. I could see a salesperson subscribing to a trading wave (he may be able to make a risk-free trade by crossing with another salesperson) and maintaining a client wave (for those clients who choose to do so).

For firms with strong data infrastructures, I could see Wave paired with plotting and analytical extensions that could be used to share data and potential insights. Before Lehman’s demise, LehmanLive was a great example of a firm moving to the web in a way that allowed the entire firm to leverage its data and analytics. For those of you who remember, imagine LehmanLive, POINT, and Google Wave all wrapped up into a single package, and you get where I’m going with this.

Many of the same benefits could be enjoyed by clients in separate, sandboxed waves. And since firms can implement their own Wave system, client accounts could be created that access the firm’s servers rather than Google’s. Compliance will love it, too, since waves are persistent (again, see the playback feature). Those who want to do something shady will probably stick to the phone…

Of course, it’s probably a long shot that any of this will happen. The Bloomberg network effect has been well documented: everyone uses it because everyone uses it! As such, it can crowd out patience for another system. Furthermore, the wave approach isn’t immediately familiar (though I have no doubt Wall Street would adopt the technology if it thought it would make more money). One might argue that, in liquid markets, information already travels pretty darn fast (particularly as computers cut humans out of the loop). In less liquid, over-the-counter markets, there’s actually an incentive to fight transparency, since it has a direct negative impact on profitability… though the push for volume and market sustainability often drives these markets towards transparency in the end. Finally, for structured products, the process is so long and complicated, who cares? Just tell the lawyers to hurry up!

A final thought on Chrome OS. I watched the presentation today and was tickled by a pointed question from a member of the audience who essentially asked, “What can I do on Chrome OS that I can’t do on a regular browser?” The answer was along the lines of “uh, nothing really… but you won’t get the really fast boot-up!” From an IT perspective, however, I could see Chrome OS being a godsend. Again, as an open source project, a firm could build Chrome OS into a netbook for use with a distributed workforce. If you are the aforementioned firm with a strong, web-enabled infrastructure (using Wave, even!), an analyst or salesperson in the field could have instant access to most or all relevant data on the road, using either local storage or a wifi connection/VPN. Since all data on the netbook is encrypted (at least according to the keynote), it’s essentially worthless (from a corporate perspective) to anyone who steals it. And netbooks are CHEAP.

Anyway, my two cents…

Written by DK

November 19, 2009 at 11:52 pm

Posted in Finance

Trefis decomposes stock price

via TechCrunch:

Started by three engineers and math whizzes from MIT and Cornell (Manish Jhunjhunwala, Adam Donovan, and Cem Ozkaynak) who did time at McKinsey and UBS bank, Trefis breaks down a stock price by the contribution of a company’s major products and businesses. For instance, 51.3 percent of Apple’s stock price is attributed to the iPhone, 25.5 percent to the Macintosh, and only 7.7 percent to iTunes and iPhone apps. Don’t agree? You can change the underlying assumptions by simply dragging lines on charts forecasting the future price of the iPhone, its market share going out to 2016, and so forth. Every time you change an assumption, the price target changes accordingly.

So let’s take a company we all love to hate, AT&T. The screenshot above shows how Trefis decomposes the company’s stock price. You can click through to get a more in-depth breakdown of AT&T’s business. There’s also a social component to the service where subscribers can contribute their own customized models.

There aren’t that many companies to choose from, but Trefis is still in the free period. I imagine users will have to pay for full access in the future. In any case, it seems like a neat toy.

Written by DK

November 17, 2009 at 12:59 pm

Posted in Finance

Stock Ticker Orbital Comparison = COOL

Care of Flowing Data, Stock Ticker Orbital Comparison (STOC) is one of the coolest representations of the market I’ve seen. Although I can’t see anyone really trading on top of this visualization metaphor, it does make one think of how correlations and other parameters might be represented via animation.

STOC was built using Processing, a Java-based visualization environment that originated at the MIT Media Lab. I understand there are Scala and JavaScript versions in development as well. The closest Python equivalents I can think of are NodeBox and Mayavi. In any case, STOC has swerve. Respect.

Written by DK

October 13, 2009 at 11:50 am

Posted in Finance
