Author Topic: Jim Simons / Renaissance Technologies - is value investing not the only way?  (Read 6254 times)

LC

  • Hero Member
  • *****
  • Posts: 4238
I would actually disagree with you there. I think the problem is these modellers are requiring precision and not accuracy.

The entire point of statistical modelling is to use a sparse number of datapoints to create a generalized model. Go back to Stats 101 and the sample-size problem. What is the generally accepted minimum number of samples? It is 25 or 30. At 200 points you can reach significance at 99% confidence.
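Whether 25-30 samples is enough depends entirely on the effect size you are trying to detect. A quick normal-approximation sketch - all numbers here are illustrative assumptions (a medium effect of Cohen's d = 0.5, two-sided 99% confidence, 80% power, two-sample test), not anything from the post:

```python
from math import ceil
from statistics import NormalDist

# Per-group sample size for a two-sample z-test (normal approximation):
#   n = 2 * ((z_{1-alpha/2} + z_{power}) / d)^2
z = NormalDist()
d = 0.5                           # assumed effect size (Cohen's d, "medium")
z_alpha = z.inv_cdf(1 - 0.01/2)   # 99% confidence, two-sided -> ~2.576
z_power = z.inv_cdf(0.80)         # 80% power -> ~0.842
n = ceil(2 * ((z_alpha + z_power) / d) ** 2)
print(n)  # per-group sample size needed under these assumptions
```

For a large effect (d = 1.0) the same formula gives roughly 24 per group, which is where the 25-30 rule of thumb comes from; smaller effects need far more data.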

I have some coworkers from medical research - we used 50 or 100 samples to draw medical conclusions back then...and now portions of the bank claim they can't build a sufficiently accurate model due to lack of data when they have datasets in the thousands.

The tradeoff is you can use 500 data points to create a generalized model, but it will lack precision. Or you can build a model with 500,000,000 datapoints (we have them - do not believe anyone who says they lack data unless it is risk-specific), but it lacks the ability to generalize over time.

Modellers try to take the best of both worlds with various methods to reduce overfitting (you can google the regularization methods) but IMHO there is only one true method, which is intuition - and this currently cannot be modelled or at least I am not aware how.

You can model things very close to intuition, but you need highly non-linear models that require a lot of data points.  For trading on info from a 10-Q, I can assure you a huge problem is data issues.  I'm not sure what you mean by lacking precision versus generalization.  Are you talking about the bias-variance tradeoff?  The bias-variance tradeoff means a more complex model requires more data to generalize well, so a 500-million-point model should generalize very well unless your model complexity is very high.
They are using consumer credit data to estimate 10-Q reported sales. There are most definitely hundreds of millions of data points for consumer data and market data - we build various types of ML and NN models using this data and they still suffer from overfitting. In some cases (GBMs) they are essentially your parents' logistic regressions and decision trees...but we still run into the problem of overfitting, particularly with the NNs. We are purposefully using less data (large holdout samples) and other techniques to battle the overfitting problem. Where you run into real problems is specific risk-stripe data such as ops risk, where you only have a handful of incidents over 20 years. Difficult to model using any methodology.
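The regularization-plus-holdout idea mentioned above can be sketched in a few lines. This is a toy example with made-up data, not the bank's actual modelling setup: a one-parameter ridge fit, where the penalty lam shrinks the coefficient and a holdout set is used to judge generalization:

```python
import random

random.seed(0)

# Closed-form 1-D ridge solution: w = sum(x*y) / (sum(x^2) + lam).
# Larger lam shrinks w toward zero (more regularization).
def ridge_fit(xs, ys, lam):
    return sum(x * y for x, y in zip(xs, ys)) / (sum(x * x for x in xs) + lam)

def mse(xs, ys, w):
    return sum((y - w * x) ** 2 for x, y in zip(xs, ys)) / len(xs)

# Synthetic data: true relationship y = 2x plus noise.
xs = [random.uniform(-1, 1) for _ in range(30)]
ys = [2 * x + random.gauss(0, 0.5) for x in xs]
train_x, train_y = xs[:20], ys[:20]   # training set
hold_x, hold_y = xs[20:], ys[20:]     # holdout set, kept out of fitting

for lam in (0.0, 1.0, 10.0):
    w = ridge_fit(train_x, train_y, lam)
    print(lam, round(w, 3), round(mse(hold_x, hold_y, w), 3))
```

Larger lam always shrinks the fitted weight in this closed form; the holdout error is what tells you how much shrinkage the data actually wants.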
"Lethargy bordering on sloth remains the cornerstone of our investment style."
----------------------------------------------------------------------------------------
brk.b | irm | nlsn | pm | t | v | xom


cameronfen

  • Hero Member
  • *****
  • Posts: 670

No.  Your credit card data has 100 million data points, but you only have revenue numbers every quarter, so that's your bottleneck.  Even though you have 100 million x values, you only have about 500 y values.  That being said, you may have more in the cross-section, but then you need to do something fancy like transfer learning, as it's not so easy to use revenue from one company to predict another.

There are a lot of quants still using logistic regression and decision trees, but based on the people Rentech hires, I don't think they focus on these things.  Judging from their hires, the big-name quants who focus on building better models (as opposed to finding more interesting factors) concentrate on deep learning and heavy-duty models, so they can't use fundamental data.  You sound like you are closer to the AQR/Tobias Carlisle approach to quantitative investing, where you try to find predictive factors.  The Rentech approach is quite different from yours.  And again, it's forex, so how predictive are your fundamental indicators going to be outside of occasional tail events?

ritrading

  • Jr. Member
  • **
  • Posts: 67
My understanding of what Renaissance does is value investing. The only way to make money is to buy low and sell high. If you want to keep the money you make, buy below fair value and sell higher.

They have automated what people like Buffett used to do manually. Scan a huge number of documents, scan every company, and build the ultimate contrarian investor by programming a machine to do it. What you get out of this process is:

   scanning every document/piece of data + a machine-trader   =   a super-Buffett.

What I see is hedge-funds who cannot build such a program resort to momentum investing. The "quant" funds such as Renaissance, Two Sigma, Millennium lack the momentum stocks in their 13-Fs. Munger is right that investing has become a lot more competitive.

Renaissance does statistical arbitrage. They operate in a similar manner to physicists trying to predict behavior or movements in stars, quantum particles, etc. It is many iterations of hypothesis testing and lots of data. It does not look anything like value investing.

Poor Charlie

  • Full Member
  • ***
  • Posts: 144
A few comments:

Returns
You can’t compare the returns given in the book and reach the conclusion, as Zuckerman does, “that no one in the investment world comes close [to Medallion].”  The returns for the funds listed in the appendix are internal rates of return (and effectively the compounded annual growth rate for Berkshire).  But Medallion’s is an arithmetic return.  For a fund like Medallion that pays out most of its earnings (they’ve distributed 10x their current capital), the arithmetic return will be very different from the compounded return.  Consider that if Simons was indeed able to compound at 66%, the initial $18 million in Medallion would now be worth $120 trillion ($18 million*1.66^31 = $120 trillion).

I would actually prefer to have an investment return, say, 15% and reinvest at 15% over one that returned 66% but couldn’t reinvest.  It wouldn’t even be a close call.
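The arithmetic behind both points is easy to check. A sketch using the figures from the post above (31 years, $18 million initial capital):

```python
# If Medallion's 66% were truly compounded for 31 years on the initial $18M:
initial = 18e6
compounded_66 = initial * 1.66 ** 31
print(f"{compounded_66:.3g}")  # on the order of $1.2e14, i.e. ~$120 trillion

# Growth multiple of $1 over 31 years:
# 66%/yr earned on a fixed base (profits paid out, never reinvested)
# versus 15%/yr fully reinvested.
simple_66 = 1 + 31 * 0.66     # ~21x
reinvested_15 = 1.15 ** 31    # ~76x
print(round(simple_66, 1), round(reinvested_15, 1))
```

Under these numbers, 15% fully reinvested beats 66% earned on a never-growing base, which is the point about reinvestment above.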

Leverage
Renaissance isn’t the only firm operating with a lot of leverage in the financial world.  Banks are levered +10x and do fine.  Same with trading houses (Marc Rich + Co, Phibro, etc.), which run +20x.  Even market-makers (Citadel and all the Chicago prop shops) seem to do OK with +30x leverage.  What’s even more remarkable is that they all do it with either short-term funding (commercial paper, letters of credit, repo, etc.) or call loans (margin debt, demand deposits, etc.). 

I wouldn’t want to have 30:1 in call loans against soybeans or some other commodity, but a lot of people seem to make it work. 

“Is value investing not the only way?”
I don’t understand what the big deal is about Renaissance.  Is it really that surprising someone with a 180 IQ and a background in mathematics is making money trading?  He’s not the first and he won’t be the last. 

scorpioncapital

  • Lifetime Member
  • Hero Member
  • *****
  • Posts: 1938
    • scorpion capital
Is there a way to extrapolate this sort of cumulative return of Renaissance into an annual compound return, assuming reinvestment?

TwoCitiesCapital

  • Hero Member
  • *****
  • Posts: 2484
Is there a way to extrapolate this sort of cumulative return of Renaissance into an annual compound return, assuming reinvestment?

I mean, yea you could extrapolate and calculate, but part of the reason their returns are what they are is because they DIDN'T compound.

You probably couldn't do what they have done managing the amount of money Buffett does. Too much scrutiny from regulators, too much at risk if it goes wrong, too hard to move quietly in the markets, and it gets hard to continue running leverage at 30x if you manage $100 billion.

Jurgis

  • Hero Member
  • *****
  • Posts: 4971
    • Portfolio
An interesting exercise might be: assume you get RenTech return but not compounded. Assume you then invest it 60/40 index/bonds or whatever. Are you better off than investing in BRK 10/20 years ago? What if 100/0 index? Not sure if there's enough data to model this.
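A minimal sketch of that exercise, with loudly hypothetical rates (a 39% payout-style fund return, a 7% side portfolio standing in for 60/40, and a 20% BRK-like compounder - none of these are sourced figures):

```python
# A fund returns r_fund on a fixed capital base and pays the profit out each
# year; distributions are reinvested in a side portfolio earning r_side.
# Compare the total to simply compounding r_alt in a buy-and-hold position.
def payout_strategy(r_fund, r_side, years, capital=1.0):
    side = 0.0
    for _ in range(years):
        side = side * (1 + r_side) + capital * r_fund  # reinvest distribution
    return capital + side

years = 20
rentech_like = payout_strategy(r_fund=0.39, r_side=0.07, years=years)
brk_like = 1.20 ** years  # hypothetical fully-compounded alternative
print(round(rentech_like, 1), round(brk_like, 1))
```

Swap r_side for an assumed index return to test the 100/0 case; the answer is very sensitive to which rates you plug in, which is the data problem the post raises.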
"Human civilization? It might be a good idea." - Not Gandhi
"Before you can be rich, you must be poor." - Nef Anyo
"Money is an illusion" - Not Karl Marx
--------------------------------------------------------------------
"American History X", "Milk", "The Insider", "Dirty Money", "LBJ"

Poor Charlie

  • Full Member
  • ***
  • Posts: 144
Is there a way to extrapolate this sort of cumulative return of Renaissance into an annual compound return, assuming reinvestment?

One thing mentioned in the book I thought was telling was Simons’ Madoff investment.  Madoff built his scheme around the idea he could deliver steady 12 to 13% returns.  So what is Simons doing withdrawing from his fund, which is earning 66%, and investing in a fund earning a purported 13%?  If anyone could/should have been reinvesting in Medallion it was Simons.  And this wasn’t when he was running billions of dollars; this was in the early 90s when the fund had less than $100 million in capital.

It’s tough to come up with a valid compounded return figure for Medallion given that (a) nearly all the fund’s earnings were distributed and (b) the guy running the fund was reinvesting his distributions at a much lower rate (even when the fund was small). 

scorpioncapital

  • Lifetime Member
  • Hero Member
  • *****
  • Posts: 1938
    • scorpion capital
It's interesting. To me it boils down to: do you have the guts to put in and compound 100 percent of your net worth each year? So many investors get wild returns on 1/10 of their portfolio, but ask them whether they would put everything in, and they wouldn't. But if you don't put everything in, your return on total growing capital is going to decline. I can be a genius on some ten-bagger, but if I put 10 percent in, my capital isn't going up 10x. It seems amazing when an investor can compound all capital every year at anything above 15 percent.
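The position-sizing arithmetic behind that point, as a one-line sketch (assuming the rest of the portfolio stays flat):

```python
# A 10x winner sized at 10% of the portfolio, with the other 90% unchanged,
# grows the whole portfolio only 1.9x.
position, rest = 0.10, 0.90
multiple = rest * 1.0 + position * 10.0
print(multiple)  # 1.9
```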

cherzeca

  • Hero Member
  • *****
  • Posts: 2844
simons is an anomaly. a quant who caters to quants and who is willing to give them everything they ask for.  only a quant would do that, and most firms aren't run by quants.  I saw a video of simons explaining his keys to success, and the one thing that struck me was when he said he gives his employees the best infrastructure there is.  if you think about it, how do you attract and retain great employees?  with a tough NDA and noncompete? no, with a great organization that says yes to any reasonable employee request.  it harkens back to his days at the think tank where everyone could spend 50% of their time on whatever they wanted.  if you give employees the massive computing power, massive data access and supportive culture they want, they will stay...and make damn good models.  they may even have a good cafeteria...