This TED talk by Conrad Wolfram, of Wolfram Research, will really resonate with anyone who follows system dynamics and learner-directed learning.

He asks, “What is math?” and decomposes it into four steps:

- Posing the right questions
- Translating the real-world problem into a mathematical formulation
- Computing
- Mapping the mathematical answer back to the real world and verifying it

He argues that 80% of conventional math education is expended on step 3, which is boring if you do it by itself. Instead, he says, we should use increasingly ubiquitous computers for step 3 and focus on the fun parts – 1, 2 & 4.

This is basically a generalization of the modeling process and the SD approach to education. I do millions of calculations per day, but not more than a few by hand or in my head. The real wrangling is with steps 1, 2 & 4 – real-world problems that Conrad describes as knotty and horrible, with hair all over them.

I relate well to a few of Wolfram’s thoughts and not to others.

Programming to understand? Sure; I agree. Use J. Seriously. That combines what he seeks with something so much more powerful (both on the thinking and computing levels) than most ever see. While it’s a great computation engine, I used APL, its ancestor, for years purely as a notation of thought.

I use R for stats because of the libraries, but it’s a challenge compared to the cleanness and terseness of J, and having to wade through thousands of libraries to find the one you want is less than fun (even though many of the libraries are excellent, which is one of the things that leads to my next comment).

The computer as silver bullet? Yikes! I’ve learned to run from most silver bullets, including SD when pitched as such. There are few such bullets in my experience, but there are many who are trying to cast their pet approach as one. That said, there are lots of useful tools. (The connection to the prior paragraph? Great code libraries are only great when you can remember and find them.)

Automated calculation as important? Certainly. SD is a great example. I use R and ggobi for statistics, and I have done MCMC calculations on SD models that involve thousands of successive simulation runs to estimate parameter values from data. I can imagine doing SD with paper and pencil (that’s how Jay Forrester started out, right?), but I can’t imagine doing MCMC that way.
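To make the MCMC point concrete, here is a minimal sketch (my illustration, not Bill’s actual model or code) of a Metropolis sampler in Python, estimating one parameter of a toy exponential-decay model from noisy data. Each proposal requires a fresh simulation run, so thousands of runs accumulate – exactly the kind of loop that is unthinkable by hand:

```python
import math
import random

random.seed(42)

# Toy "simulation model": exponential decay y(t) = exp(-k * t).
def simulate(k, times):
    return [math.exp(-k * t) for t in times]

# Synthetic observations from true k = 0.5 plus Gaussian noise.
true_k = 0.5
times = [0.5 * i for i in range(20)]
sigma = 0.05
data = [y + random.gauss(0, sigma) for y in simulate(true_k, times)]

def log_likelihood(k):
    # Gaussian log-likelihood of the data given parameter k.
    preds = simulate(k, times)
    return sum(-0.5 * ((d - p) / sigma) ** 2 for p, d in zip(preds, data))

# Metropolis random walk on k: thousands of successive simulation runs.
k, ll = 0.1, log_likelihood(0.1)
samples = []
for step in range(5000):
    k_new = k + random.gauss(0, 0.05)
    if k_new > 0:
        ll_new = log_likelihood(k_new)
        if math.log(random.random()) < ll_new - ll:
            k, ll = k_new, ll_new
    samples.append(k)

# Discard burn-in and summarize the posterior.
burned = samples[1000:]
posterior_mean = sum(burned) / len(burned)
print(f"posterior mean k ~ {posterior_mean:.3f} (true k = {true_k})")
```

A real SD application would replace `simulate` with a full model run, which is why each MCMC chain amounts to thousands of simulations.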

Yet I also use and teach what some have called “pocket stats”: rules of thumb devised by Tukey and others that you can apply in the moment, without a computer or even calculator, to solve real problems. And I still have a circular slide rule I use on occasion — oops, I shouldn’t have revealed that, I suppose, but it does fit nicely in my pocket, and it has never run low on energy. 🙂
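I don’t know exactly which rules Bill has in mind, but a classic example of a pocket stat is Tukey’s quick test (1959): count the values in one sample that exceed all of the other, add the values in the other sample that fall below all of the first, and a total of about 7 or more signals significance at roughly the 5% level. A sketch, with made-up numbers:

```python
def tukey_quick_test(a, b):
    """Tukey's two-sample 'pocket' test: total exceedance count.
    Roughly: 7+ ~ 5% level, 10+ ~ 1%, 13+ ~ 0.1% (Tukey, 1959).
    Assumes one sample holds the overall max and the other the min."""
    # Orient so that `hi` is the sample with the larger maximum.
    hi, lo = (a, b) if max(a) >= max(b) else (b, a)
    top = sum(1 for x in hi if x > max(lo))     # hi values above all of lo
    bottom = sum(1 for x in lo if x < min(hi))  # lo values below all of hi
    return top + bottom

# Hypothetical data: you can do this count in your head at a whiteboard.
control = [12, 14, 15, 16, 18, 19]
treated = [17, 20, 21, 22, 23, 25]
count = tukey_quick_test(control, treated)
print(count, "=> significant at ~5%" if count >= 7 else "=> not significant")
```

The code is only there to spell the rule out; the whole point is that the counting takes seconds with no machine at all.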

If energy is limited (see LTG, peak oil, whatever) and if materials are limited (hafnium, lithium, etc.), then isn’t computing limited? Should we at least spend some time enabling ourselves to make sense through mental math if calculating becomes expensive again?

Real-world math as distinct from “school math”? Of course. In many ways, school math is too simplistic (linear ODEs, etc.), but real-world math (statistics in Excel? confidence intervals and P-values instead of credible intervals and probabilities?) can be simplistic in its own way, or else hard to interpret. Good math for a good purpose sounds good, but Wolfram seems to assume that computed math is always good math.
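The confidence-vs-credible distinction can be made concrete with a small sketch (my illustration, with made-up numbers): a textbook normal-approximation confidence interval for a proportion next to a flat-prior Bayesian credible interval, the latter computed on a plain grid so no special functions are needed:

```python
import math

# Observed: s successes in n trials (hypothetical numbers).
s, n = 12, 40
p_hat = s / n

# Frequentist: normal-approximation ("Wald") 95% confidence interval.
se = math.sqrt(p_hat * (1 - p_hat) / n)
ci = (p_hat - 1.96 * se, p_hat + 1.96 * se)

# Bayesian: a flat Beta(1,1) prior gives a Beta(s+1, n-s+1) posterior.
# Take the 95% equal-tailed credible interval from a normalized grid.
grid = [i / 10000 for i in range(1, 10000)]
dens = [p**s * (1 - p)**(n - s) for p in grid]
total = sum(dens)
cdf, lo, hi = 0.0, None, None
for p, d in zip(grid, dens):
    cdf += d / total
    if lo is None and cdf >= 0.025:
        lo = p
    if hi is None and cdf >= 0.975:
        hi = p
cred = (lo, hi)

print(f"95% confidence interval: ({ci[0]:.3f}, {ci[1]:.3f})")
print(f"95% credible interval:   ({cred[0]:.3f}, {cred[1]:.3f})")
```

With a flat prior the two intervals land close together numerically, but they say different things: the credible interval is a probability statement about the parameter, while the confidence interval is a coverage property of the procedure – which is exactly the interpretive subtlety “school math” rarely dwells on.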

For a nuanced approach, see Donald Knuth’s graduate-level problem-solving seminar, which I once watched on videotape. He emphasized apportioning tasks appropriately between the human and the computer, and then he gave the students really interesting, really hard problems to work. I think you can find information on one of those sessions at http://www.cs.columbia.edu/~kar/pubsk/seminar.ps (I don’t recall any of these problems from the video, so perhaps it was a different year or a different course; the one I saw had somewhat more real-world problems, as I recall).

Reactions?

Bill