Roger Bagula
2008-12-17 19:20:54 UTC
http://arstechnica.com/news.ars/post/20081215-axioms-downturns-and-a-global-computer-crash.html
Posted by: "V.Z. Nuri" ***@yahoo.com vznuri
Tue Dec 16, 2008 7:46 pm (PST)
Axioms, downturns, and a global (computer?) crash
By Jon Stokes | Published: December 15, 2008 - 09:20PM CT
One of the oldest and best acronyms in computing is GIGO--"garbage in,
garbage out"--and it sums up a truth about the world of automated number
crunching that is often forgotten by both programmers and end users. The
GIGO principle means that no matter how detailed and accurate your
computational model is, the results are always dependent on the quality
of your input. A well-crafted model that is supplied with a credible,
realistic set of inputs can greatly improve the quality of your
forecasting and decision-making. But when you start feeding that same
model biased data and outright lies, the results can be catastrophic.
Just ask Wall Street.
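The GIGO principle is easy to demonstrate: a toy one-line forecasting model does its arithmetic perfectly no matter what you feed it. A minimal sketch (my own illustration, not from the article; all numbers are hypothetical):

```python
def projected_value(price_today, annual_growth, years):
    """Toy forecasting model: compound a single growth assumption forward.
    The arithmetic is always exact; the output is only as good as the input."""
    return price_today * (1 + annual_growth) ** years

home = 300_000  # hypothetical starting price

# Credible input: modest long-run appreciation.
print(projected_value(home, 0.03, 10))

# Garbage input: "prices rise 12% a year, forever."
# Same model, same flawless arithmetic, nonsense output.
print(projected_value(home, 0.12, 10))
```

The model never complains; only the assumption behind the second input is wrong, and the output inherits that error with compound interest.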
When bankers do models
Right now, there is no shortage of attempts under way to blame our
current economic woes on this or that party or factor or
institution--and, hey, why not? There's plenty of blame to go around.
But when the history of this epoch-making period in capitalism is
written, there will definitely be a few chapters devoted to computers
and the people who ran them.
Two very brief first drafts of that history have already come out, one
of them a New York Times article from earlier this month. In the piece,
journalist Stephen Lohr looks at the role of financial engineering in
the crisis and concludes that the computer-driven risk assessment models
that were used to value many of the complex securities at the root of
the current crisis weren't necessarily badly designed. Rather, the money
men took these elaborate tools--all of them designed by a class of math
and physics PhDs known as "quants"--and loaded them with bad data built
on faulty assumptions.
Perhaps the downturn's ur-assumption--the mistaken notion at the root of
so much of the world's present pain--was that the value of US real
estate would continue to rise indefinitely. That single, very faulty
assumption was used as input for model after model, and the resulting
output was sliced, diced, baked, and served up to investors from Los
Angeles to the Ukraine.
In this respect, you can think of the present crisis as the fruit of
Wall Street's (willful?) forgetfulness of two important truisms: the
aforementioned GIGO principle and the famous economic dictum, attributed
variously to Lord Acton and Herbert Stein, that "things that can't go on
forever, don't."
When models do bankers
The second writer to recently tackle the role of computer models in the
downturn is Barry Ritholtz, proprietor of The Big Picture blog and one
of the handful of macroeconomists who are now getting plenty of
well-deserved attention for having publicly and loudly called the
downturn well in advance. Ritholtz's new article in Scientific American
explains that there were more faulty assumptions at work than just the
idea that the real estate boom was permanent. (As another economist
pointed out recently, whenever everyone thinks that a particular asset
class's value will continue to rise indefinitely, that's called a "bubble.")
Ritholtz writes the following about the way in which the quants
built their models:
"As Benoit Mandelbrot, the fractal pioneer who is a longtime critic
of mainstream financial theory, wrote in Scientific American in 1999,
established modeling techniques presume falsely that radically large
market shifts are unlikely and that all price changes are statistically
independent; today's fluctuations have nothing to do with
tomorrow's--and one bank's portfolio is unrelated to the next's. Here is
where reality and rocket science diverge."
The bit about the models' presumption that radically large market
shifts are unlikely echoes the thinking of Nassim Taleb, another
prominent "permabear" whose fortune and reputation have both been
greatly enhanced by his accurate and profitable forecasting of the
recent fall.
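Mandelbrot's complaint can be put in numbers. Under a Gaussian model, a very large market move is effectively impossible; under a fat-tailed model such as a Student-t distribution (a common stand-in for fat-tailed returns, chosen here for illustration and not named in the article), it is merely rare:

```python
import math

def gaussian_tail(x):
    """P(Z > x) for a standard normal variable."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def student_t3_tail(x):
    """P(T > x) for a Student-t variable with 3 degrees of freedom,
    via its closed-form CDF."""
    cdf = 0.5 + (1 / math.pi) * (
        (x / math.sqrt(3)) / (1 + x * x / 3) + math.atan(x / math.sqrt(3))
    )
    return 1 - cdf

# How likely is a move of k scale units under each model?
for k in (3, 5, 10):
    print(f"{k}-unit move:  Gaussian {gaussian_tail(k):.2e}   "
          f"fat-tailed (t3) {student_t3_tail(k):.2e}")
```

For a 10-unit move the two models disagree by roughly twenty orders of magnitude: the Gaussian calls it a never-in-the-lifetime-of-the-universe event, while the fat-tailed model calls it unusual but entirely possible. That gap is where, in Mandelbrot's phrase, reality and rocket science diverge.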
A global hard reset?
Taleb also has another major problem with computers that has nothing to
do with their uses and abuses in banking. From Taleb's perspective,
computers have made the whole of the modern economy too complex and too
efficient. From inventory management systems that ensure that retail
outlets hold the optimal amount of inventory (no less and no more) for a
given day and location, to the massive options pricing machines that
time trades with millisecond precision, the entirety of the
computer-driven global economy is like one massive model that was
assembled--most of it over the course of the past decade--on the
governing assumption that the future would look pretty much like the
past. And when that widely shared assumption breaks down, then the
system ceases to behave in a predictable way, because it has been too
finely tuned to operate under a set of parameters that no longer
pertain. (Or so the argument goes.)
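The overfitting argument has a direct software analogue: a system tuned tightly to historical parameters fails abruptly when the regime shifts. A minimal sketch (my own illustration with hypothetical numbers, not Taleb's), using a just-in-time inventory rule:

```python
import statistics

# Hypothetical demand history: the stable past the system was tuned to.
history = [100, 98, 103, 101, 99, 102, 100, 97]

# "Optimal" stock level, tuned tightly to that past:
# the mean plus a thin two-standard-deviation buffer.
stock = statistics.mean(history) + 2 * statistics.stdev(history)

def can_serve(demand):
    """The finely tuned system works only while demand looks like history."""
    return demand <= stock

print(can_serve(103))   # True: within the old regime, the tuning holds
print(can_serve(140))   # False: regime shift, the governing assumption breaks
```

Nothing in the code is buggy; the failure comes entirely from calibrating to a past that the future declined to resemble.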
In computer science terms, you could say that both Taleb and Mandelbrot,
in a recent and very scary interview with Charlie Rose, have essentially
argued that the current global system is in an "undefined state." This
means that there is no way to predict what its output will be, which is
why government attempts to meddle massively with the inputs will
certainly have some kind of impact, though nobody can say what that
impact will be. Government intervention becomes the equivalent of "percussive
maintenance," i.e., beating on the side of the machine on the chance
that you'll magically unbreak it.
If Ritholtz, Taleb, Mandelbrot, and the rest of the computer modeling
and financial engineering naysayers are correct about the big picture,
then we really are arguably in the midst of a bona fide computer crash. Not
an individual computer crash, of course, but a computer crash in the
sense of Sun Microsystems' erstwhile marketing slogan, "the network is
the computer." That is, we have all of these machines in different
sectors of the economy, and we've networked all of them together either
directly (via an actual network) or indirectly (by using the collective
"output" of machines in one sector as input for the machines in another
sector), and like any other computer system the whole thing hums along
nicely... up until the point when it doesn't.
It may be that we should fundamentally rethink our pervasive reliance on
these machines that we've come to fetishize, especially if we wind up
with a hard reset. It's not that they aren't very effective tools, but
rather that they're so devastatingly effective, especially when they're
networked together and Metcalfe's Law kicks in to multiply not only our
intelligence, but our collective human frailty.
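Metcalfe's Law values a network by its potential connections, which grow roughly as the square of the number of nodes; the same links that multiply value also multiply the routes by which one machine's garbage output becomes another's input. A quick sketch of the scaling (illustrative only):

```python
def pairwise_links(n):
    """Metcalfe's Law counts potential connections among n nodes:
    n * (n - 1) / 2, which grows roughly as n squared."""
    return n * (n - 1) // 2

# As networked machines multiply, so do the channels through which
# one sector's output becomes another sector's input.
for n in (10, 100, 1000):
    print(n, "machines ->", pairwise_links(n), "links")
```

Ten machines share 45 links; a thousand share 499,500. Whatever flows through those links, whether intelligence or frailty, scales the same way.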