The recent SEC curbs on shorting a select list of financial stocks got me thinking about ways to avoid these silly procedural rules. I was short some of these financials, and in deciding whether or not to cover, I had to also consider the fact that I would likely be unable to short the stock again.
I called my broker, and they confirmed my suspicion: there was no stock available for shorting--for the time being. I asked whether it was likely that my short position might get called away--that is, that I might be forced to cover. Highly unlikely, was the answer. I described my predicament (a juxtaposition of fear against greed): I wanted to cover (fear), but still keep the borrowed shares around for shorting later (greed). They said there would be no guarantee that there'd be any more shares to short after I had covered. What if, I suggested, I bought the stock in my long account, didn't touch my short account, and later sold the shares on the long side? Can't do that either, they said. Once I bought the shares on the long side, they would be in a boxed position, and I would not be able to sell any more shares. I thanked the broker and hung up.
The solution to my problem had already presented itself in that last question. All I had to do was to go long the shares in another account I own at another broker: the combined account positions could exactly offset each other, and I would be able to later return to a net short position simply by selling the shares in the long account at the other broker.
I doubt the procedure described here is illegal, [ Ignorantia juris non excusat warning and other usual IANAL, IANAIA disclaimers go here], and if it is, I certainly couldn't find anything about it on the web. (Truth be told, I didn't ask either of my brokers: I had heard enough stupid "No"s for one day.)
My solution to this short-term problem, created by a seemingly capricious regulator, led me to consider its more general application. For the regulator, by making shorting difficult, has created artificial scarcity. A short position, then, by virtue of its scarcity, must have more value than its nominal value. How much more?
Let's call this double account tactic, where your long shares in one account are exactly offset by your short shares in another account, an open boxed position. I suggest the scarcity value of a short position might somehow be connected to the cost of money (e.g. as employed in the Black-Scholes model) required to maintain an open boxed position in the same number of shares. I say "somehow", because while I can easily quantify how much it's currently costing me to maintain my open box, there is no general way to determine what it might cost another person (the cost of borrowing shares varies from broker to broker).
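To make the cost-of-carry idea concrete, here is a back-of-the-envelope sketch in Python. Every number and rate below is a made-up placeholder for illustration, not a quote from any broker; as noted above, actual borrow fees, margin rates, and short rebates vary from broker to broker.

```python
# Hypothetical carrying cost of an open boxed position.  All rates below are
# made-up placeholders, not quotes from any broker.

def open_box_carry_cost(shares, price, borrow_fee, margin_rate,
                        short_rebate, days):
    """All rates annualized.  Returns the dollar cost of holding the box.

    borrow_fee:   fee the short side pays to keep the shares borrowed
    margin_rate:  interest on money borrowed to finance the long side
    short_rebate: interest earned on short-sale proceeds (often ~0, or
                  even negative, for hard-to-borrow names)
    """
    notional = shares * price
    return notional * (borrow_fee + margin_rate - short_rebate) * days / 365.0

# Example: 1,000 shares of a $30 hard-to-borrow stock, boxed for 90 days.
cost = open_box_carry_cost(1000, 30.0, 0.08, 0.05, 0.0, 90)
print(round(cost, 2))   # roughly $960 to keep the option to short alive
```

Under these toy numbers, keeping the box open for a quarter costs nearly a thousand dollars on a $30,000 notional position--one crude lower bound on the scarcity value of being able to short at will.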
To put it another way, I suggest an open boxed position is a store of all the regulatory overhead associated with shorting. A speculator wishing to short hard-to-borrow shares must consider both price movement and the availability of shares to short in the timing of their trade. It is a well-known fact that open interest tends to increase at price peaks, making shorting at the best price a double challenge. This puts the short trader at a tactical disadvantage to the long trader. In order to mitigate that disadvantage, a speculator who plans to short shares at, for example, some future higher price might build a (net neutral) open boxed position, to be unboxed (net short) at a future time when the price or trend is right. In building the open boxed position, you are satisfying all the regulatory overhead of shorting ahead of time (whether that overhead is the requirement that your short never be naked, the now defunct up-tick rule, or some other silly red tape impeding your ability to short at the precise time of your choosing). Alternatively, as described at the beginning of this post, a short position can evolve into an open boxed position.
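The box/unbox mechanics above reduce to trivial bookkeeping. A minimal sketch (the account names are hypothetical):

```python
# Minimal bookkeeping for the two-broker box; account names are hypothetical.

def net_exposure(accounts):
    """Sum of per-account share positions (positive = long, negative = short)."""
    return sum(accounts.values())

accounts = {"broker_A_short": -1000, "broker_B_long": 1000}
assert net_exposure(accounts) == 0      # boxed: market-neutral

# Unbox by selling the long shares at broker B when the price is right:
accounts["broker_B_long"] = 0
assert net_exposure(accounts) == -1000  # net short again, no new borrow needed
```

The point of the exercise is the last line: going from net neutral to net short requires only an ordinary sale of long shares, with none of the regulatory overhead of initiating a short.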
That you can easily skirt the intent, if not the letter, of regulatory restrictions designed to impede shorting is a testament to the fact that shorting is a natural, organic outgrowth of market activity. It is not an activity born of the invention of some newfangled derivatives instrument; rather, people have been shorting stuff for centuries. So I guess it shouldn't surprise us if we later find that the new SEC curbs on shorting only worked temporarily. The market always finds ways to correct artificial scarcity.
Friday, July 18, 2008
Sunday, July 13, 2008
Let me begin with this disclaimer: though mathematically trained (physics background), I am not a mathematician. OK, got that out of the way.
I have found a simple way to derive Gauss's estimate of the prime density function using probability heuristics, alone.
Excerpt From MathWorld: http://mathworld.wolfram.com/PrimeNumberTheorem.html
In 1792, when only 15 years old, Gauss proposed that ρ(n) ~ 1 / ln n, the prime density function.
Gauss later refined his estimate to π(n) ~ Li(n), where Li(n) = ∫₂ⁿ dx / ln x is the logarithmic integral.
I have not seen a simple derivation for this estimate, and if one exists, I am surprised that it is not more widely used in expositions on the subject of the distribution of primes. What follows is a very short argument based on a probability model. According to this model, we'll find that the distribution of primes is governed by a delay differential equation of the form
Q'(x) = - Q(x) Q( √x ) / x
which has a solution
Q(x) = 1 / (2 ln x)
Anyway, to me, what's interesting is how, using some specious probability arguments about the distribution of primes, I was able to set up the equation and then solve it with a test function inspired by Gauss's estimate. This hints at something more meaningful in my Clouseau-esque accident.
The setup, I have come to find, is similar in spirit to Harald Cramér's probabilistic, heuristic arguments for estimating the distribution of primes (the difference here being that we don't use any other results from number theory). Here it is:
The joint probability of a randomly chosen positive number n not being divisible by two distinct primes p and k is (1 - 1/p)(1 - 1/k).
Define Q(x) = ∏(1 - 1/p)
taken over all primes p ≤ x.
The joint probability of a randomly chosen positive number n not being divisible by any prime p ≤ x is Q(x). (Equ. 2a)
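As a sanity check of this step (in a straightforward Python translation): over one full period of the divisibility pattern--the primorial 2·3·5·7·11·13 = 30030--the fraction of integers divisible by none of those primes equals ∏(1 - 1/p) exactly.

```python
# Sanity check: over one full period of the divisibility pattern, the
# fraction of integers with no prime factor <= 13 is exactly prod(1 - 1/p).
primes = [2, 3, 5, 7, 11, 13]

Q = 1.0
for p in primes:
    Q *= 1 - 1 / p

period = 1
for p in primes:
    period *= p            # 2*3*5*7*11*13 = 30030, one full divisibility cycle

count = sum(1 for n in range(1, period + 1)
            if all(n % p != 0 for p in primes))

print(count, count / period, Q)   # the two fractions agree
assert abs(count / period - Q) < 1e-12
```

The agreement is exact (the count is just Euler's totient of the primorial); the hand waving only enters when we pretend the same independence holds over intervals that aren't a full period.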
Using Equ. 2a, vigorous hand waving, and a pinch of salt, we can say
The probability that a randomly chosen positive number x is a prime is Q( √x ). (Equ. 3)
What we're saying here is that for x to be prime, it suffices to show that it is not divisible by any prime less than the square root of x. Equ. 3 says that in the neighborhood of x, the 'average' distance between primes is 1 / Q( √x ).
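Here's a rough empirical check of this claim: compare the observed density of primes in a window just above x = 10^6 with Q(√x). (As it turns out, they agree only up to a constant factor close to 1; more on the constant when we get to Mertens below.)

```python
# Compare the empirical prime density just above x = 10**6 with Q(sqrt(x)).

def primes_upto(n):
    is_p = [True] * (n + 1)
    is_p[0] = is_p[1] = False
    for i in range(2, int(n ** 0.5) + 1):
        if is_p[i]:
            for j in range(i * i, n + 1, i):
                is_p[j] = False
    return [i for i in range(2, n + 1) if is_p[i]]

x = 10 ** 6
small = primes_upto(1000)        # primes up to sqrt(x)

Q_sqrt_x = 1.0                   # Q(sqrt(x)) = prod over p <= 1000 of (1 - 1/p)
for p in small:
    Q_sqrt_x *= 1 - 1 / p

# Empirical prime density in a window just above x.  Any n in the window with
# no prime factor <= 997 is prime, since the next prime is 1009 and
# 1009**2 > x + window.
window = 10 ** 4
count = sum(1 for n in range(x, x + window)
            if all(n % p != 0 for p in small))
density = count / window

print(Q_sqrt_x, density, Q_sqrt_x / density)
```

The ratio comes out a bit above 1 rather than exactly 1, which is the first hint that the independence assumption behind Equ. 2a is too naive by a constant factor.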
Now the d.d.e. above comes from trying to approximate Q(x) using this probabilistic model. The idea is to use that approximation in order to estimate a prime counting function:
∫ Q(√x) dx
But we don't have an analytic expression that approximates Q, yet. Instead of setting up an integral equation, we try a differential approach. Consider the change in Q as x passes over two very large consecutive primes p1 < p2:
ΔQ = -Q(p1) / p2 ~ -Q(x) / x
Δx = p2 - p1 ~ 1 / Q( √x )
Dividing the top equation by the bottom one, you get
Q'(x) = - Q(x) Q( √x ) / x
the d.d.e. I described above.
I had read somewhere how the 15-year-old Gauss had been able to come up with his logarithmic integral for estimating the number of primes less than n. Was his integral inspired by a similar probabilistic argument? Maybe, but googling it, I couldn't find much. So, I plugged in C / ln x and solved for the constant C (=1/2).
Q(x) = 1 / (2 ln x)
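It's easy to check that C = 1/2 is forced. With Q(x) = C / ln x, the derivative is Q'(x) = -C / (x ln²x), while the right-hand side of the d.d.e. works out to -(C/ln x)(C/ln √x)/x = -2C² / (x ln²x); the two match only when C = 2C². A quick numerical confirmation:

```python
from math import exp, log, sqrt

def residual(C, x):
    # Test function Q(x) = C / ln x, whose derivative is -C / (x ln^2 x),
    # plugged into the d.d.e.  Q'(x) = -Q(x) Q(sqrt(x)) / x.
    Q = lambda t: C / log(t)
    lhs = -C / (x * log(x) ** 2)
    rhs = -Q(x) * Q(sqrt(x)) / x
    return lhs - rhs

x = exp(10.0)                 # any sample point > 1 works
print(abs(residual(0.5, x)))  # ~0:  C = 1/2 satisfies the equation
print(abs(residual(1.0, x)))  # not ~0:  C = 1 does not
assert abs(residual(0.5, x)) < 1e-15
assert abs(residual(1.0, x)) > 1e-7
```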
Does it mean anything?
I suspect it might, which is why I posted it. I did a bit of cursory reading on the topic, but alas, I'm an amateur. My claim that
Q(x) = ∏( 1 - 1/p) ~ 1 / (2 ln x)
does not agree with Mertens' asymptotic formula (1874)
∏( 1 - 1/p) ~ exp(-γ) / ln x,
where γ ≈ 0.5772 is the Euler-Mascheroni constant, so the true leading constant is exp(-γ) ≈ 0.5615 rather than my 1/2.
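A quick numerical look (sieving primes up to 10^6) confirms Mertens: the product times ln x sits near exp(-γ) ≈ 0.5615, about 12% above my 1/2.

```python
from math import exp, log

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def primes_upto(n):
    is_p = [True] * (n + 1)
    is_p[0] = is_p[1] = False
    for i in range(2, int(n ** 0.5) + 1):
        if is_p[i]:
            for j in range(i * i, n + 1, i):
                is_p[j] = False
    return [i for i in range(2, n + 1) if is_p[i]]

x = 10 ** 6
prod = 1.0
for p in primes_upto(x):
    prod *= 1 - 1 / p

val = prod * log(x)
print(val, exp(-EULER_GAMMA))  # both ~0.56, clearly not 0.5
```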
Still, there's something here that piques my nose. My result, when plugged into the prime counting integral, agrees exactly with Gauss's estimate:
∫ Q(√x) dx = ∫ (1 / (2 ln √x)) dx = ∫ (1 / ln x) dx
What do you think? Is this interesting, or is this old?
7-09 Addendum: Interestingly, if we adjusted the model so that
Q'(x) = - Q(x) Q( x^(1/n) ) / x
then Gauss's estimate would still hold for most n. It's as if the forward distribution constrains the back distribution.
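The same test-function check goes through for the generalized equation: Q(x) = C / ln x gives Q'(x) = -C/(x ln²x) against a right-hand side of -nC²/(x ln²x), forcing C = 1/n. The implied prime probability Q(x^(1/n)) = (1/n) / ((1/n) ln x) = 1 / ln x is then the same for every n, which is why Gauss's estimate survives the change of exponent:

```python
from math import log

def residual(C, n, x):
    # Test function Q(x) = C / ln x in the generalized d.d.e.
    # Q'(x) = -Q(x) Q(x**(1/n)) / x.
    Q = lambda t: C / log(t)
    lhs = -C / (x * log(x) ** 2)
    rhs = -Q(x) * Q(x ** (1.0 / n)) / x
    return lhs - rhs

x = 10.0 ** 8
for n in range(2, 7):
    C = 1.0 / n
    assert abs(residual(C, n, x)) < 1e-18   # C = 1/n solves the equation
    # ...and the implied prime density at x is always 1 / ln x:
    assert abs(C / log(x ** (1.0 / n)) - 1 / log(x)) < 1e-15
print("C = 1/n solves the generalized d.d.e. for n = 2..6")
```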