Feynman’s Rigor
All of the things I admire about Richard Feynman -- his intellect, and verve, and eloquence -- seem like special cases of a more general feature of his mind, namely, its almost fanatical devotion to rigor.
Here's an example of what I mean, taken from Danny Hillis's wonderful essay, "Richard Feynman and The Connection Machine":
Concentrating on the algorithm for a basic arithmetic operation was typical of Richard's approach. He loved the details. In studying the router, he paid attention to the action of each individual gate and in writing a program he insisted on understanding the implementation of every instruction. He distrusted abstractions that could not be directly related to the facts. When several years later I wrote a general interest article on the Connection Machine for [Scientific American], he was disappointed that it left out too many details. He asked, "How is anyone supposed to know that this isn't just a bunch of crap?"
Feynman would only claim to know something if he knew he knew it. And the way he got there, I think, was by never stretching himself too thin -- never moving to step C without first nailing A and B. So when he approached a new problem he would start with concrete examples at the lowest level; when he read, he would "translate" everything into his own words (or better yet, a vivid picture). My guess is that for him, the worst feeling in the world was not being able to explain an idea clearly, because that meant he didn't really own it.
Of course I'd like to think I'm the same way, but I'm not. I skim passages and skip equations. I claim to know more than I do.
But I say that, fully confident that most everyone I know is the same way; it turns out that being a "details guy" is harder than it sounds.
* * *
Now the exciting thing, I think, is that you can teach yourself to work a little more rigorously.
One way is to buy a good abstract algebra textbook and work through all of the problems. Algebra's a good place to start because (a) it doesn't require calculus, (b) the problems are intuitive, and (c) it's mostly proof-based. Which means you'll get the full thrill of really knowing things (because you proved them), without having to learn a whole new language (e.g. analysis).
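
To give a flavor of what "proof-based" buys you, here's the kind of exercise you'd meet in the first chapter -- my own sketch, not pulled from any particular textbook -- written out in LaTeX:

```latex
\documentclass{article}
\usepackage{amssymb}
\begin{document}

% A classic first exercise: a group has exactly one identity element.
\textbf{Claim.} If $e$ and $e'$ are both identity elements of a group $G$,
then $e = e'$.

\textbf{Proof.} Since $e'$ is an identity, $e \cdot e' = e$. Since $e$ is an
identity, $e \cdot e' = e'$. Hence $e = e'$. \hfill $\square$

\end{document}
```

Every step is forced by the definitions; there's nowhere for a half-understood idea to hide.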
But a better recommendation might be to start hacking. For one thing, you can start building stuff right away; with math it takes a lot longer (7-8 years) to get to the point where you can produce something original.
What's really good about hacking, though, for the purposes of rigorizing, is that you can't make a program work without an honest-to-God understanding of the details. The reason is that the interpreter doesn't actually do much "interpreting"; it does exactly what you say, no more or less. Which means you have to know exactly what you are talking about.
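
Here's a toy illustration of that literalness (in Python, though any language makes the same point; the numbers are arbitrary):

```python
# The interpreter evaluates what you wrote, not what you meant.
print(7 / 2)    # 3.5 -- "/" is true division in Python 3
print(7 // 2)   # 3   -- "//" floors the result; the machine won't guess which one you wanted

print(0.1 + 0.2 == 0.3)  # False -- the sum is 0.30000000000000004 in binary floating point
```

The machine isn't being perverse; it's doing precisely what it was told, down to the last bit.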
That imperative becomes clearest when something goes wrong, because that's when you really have to look under the hood. Was that index supposed to cut off at the second-to-last item of your list or the third-to-last? What's happening to those instance variables at the end of each loop? Why is that function not getting called? The only way to ensure a long chain of computation ends up the way it's supposed to is to know what's happening at every step, and in that sense, debugging a program teaches you to do explicitly what guys like Feynman seem to do naturally: work hard at level zero, keeping a close eye on every moving part.
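
Concretely, here's the kind of moving part I mean -- a made-up list and loop, purely for illustration:

```python
# Did you mean to cut off at the second-to-last item or the third-to-last?
# Slicing forces you to commit to an answer.
items = ["a", "b", "c", "d", "e"]
print(items[:-1])  # ['a', 'b', 'c', 'd']  -- drops the last item
print(items[:-2])  # ['a', 'b', 'c']       -- drops the last two

# And what's happening to that variable at the end of each loop?
total = 0
for i, item in enumerate(items):
    total += i
print(i, total)    # 4 10 -- i keeps its final value after the loop exits
```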