# Abstract thinking

As I was struggling through my calculus homework tonight, I came to a possibly self-defining realization:

I do much better in situations where there could be many answers, depending on how one perceives something--rather than situations where one has to dig deeply and use up tons of energy to find only one correct answer.

_________________

Who’s better at math than a robot? They’re made of math!

Sedaka

Veteran

Joined: 16 Jul 2006

Age: 38

Gender: Female

Posts: 4,597

Location: In the recesses of my mind

that's why i love evolution... all about versatility and wonder.

_________________

Neuroscience PhD student

got free science papers?

www.pubmed.gov

www.sciencedirect.com

http://highwire.stanford.edu/lists/freeart.dtl

The biggest problem for everyone with stuff like calculus is error propagation. If you make a grammatical error in an essay, it doesn't affect the correctness of the rest of your work after that point. In math, either the error escapes your notice or you realise something's not working right, but you don't know where you went wrong, so you have to backtrack. The longer the problem, the more likely you are to make one or more errors, and the more time you have to spend tediously figuring out where you made the dumb mistake, or else turn in an incorrect answer. I like learning the concepts and theories, but I don't like doing all the work. That's why we get computers to do that kind of stuff for us.

I really love learning theories and concepts, too, and that is a good part of how undergrad physics is usually taught. But without also doing the actual work, much intuition and inspiration is lost in the process. It's hard to know the balance here. I'm not saying one has to go calculate all the numbers, for example, or do grungy manipulation of symbols forever. But there is a very, very important role played by having to do that work, too. And it's not just a matter of being able to do these things when your computer is down for repair. It's about seeing patterns in the work and gaining new insights from there.

Some time ago, there was a science news report on software that, together with force-feedback gloves and special goggles, was used to simulate the forces of electric potentials between molecular fragments. A chemist would put on the goggles and gloves and find himself or herself in a virtual reality where they could grab a peptide chain, for example, and try to bring it near some protein receptor or the like. The computer would calculate all the forces involved and press back via the gloves, so that the chemist got direct feedback. Over time, these chemists would begin to develop an instinct, one they couldn't get any other way, about what could be done chemically. Granted, here the computer is doing the tiny calculations to drive the gloves and the virtual-reality simulation, but the chemist wasn't just learning about concepts and theories. They'd already been through all that before. This was about sitting there personally trying to get two molecular pieces to orient themselves and bond, if possible, and getting some sense of what all those complex forces actually "feel" like, from "doing it manually," so to speak.

In any case, I've gained a lot of insights that no book was able to teach me, just from having to sit down and "do the calcs" manually. Things that teachers don't teach and books don't publish, and new ways to think about how things work that no one else around me had ever considered. I know, because I've explained some of these insights to other physicists and received the accolades one gets when others "see" what you see and find it very helpful. You don't get that from books. You get it from doing things, personally.

Galileo wrote something about this process, though I'd need to write a lot more to give the fuller context. Just a sec.... Ah, yes. Here it is. Keep in mind that in his time, the term 'philosophy' was roughly equivalent to today's use of 'physics', though not quite the same, and note that it included math in its meaning, as physics often does. It was Galileo himself who developed the idea of philosophy into the much better idea of experimental philosophy (closer to modern scientific methods).

But men who go on forever copying pictures and never get around to drawing from nature can never become perfect artists, or even good judges of painting. For they remain unpracticed in separating the good from the bad, or the accurately from the poorly drawn, by means of recognizing in nature itself as a result of countless experiences the true effects of foreshortening, of backgrounds, of lights and shadows, of reflections, and of the infinite variations in differing viewpoints.

In the same way a man will never become a philosopher by worrying forever about the writings of other men, without ever raising his own eyes to nature's works in the attempt to recognize there the truths already known and to investigate some of the infinite number that remain to be discovered. This, I say, will never make a man a philosopher, but only a student of other philosophers and an expert in their works. I do not believe that you would esteem as a good painter a man who had made so great a study of the drawings and canvases of all painters that he could promptly identify the style of each one, even if he could also imitate them.

The point here, to my mind, is that learning good concepts and good theory is certainly part of the process. But only part. The hard work of going forth and actually doing the drudgery, discovering for yourself what others have also discovered, but doing so in your own right and with your own effort, has its value. To learn to separate the good from the bad takes direct, hands-on experience.

One can _learn_ the theory of pendulums from a book. But when you set out to build them, one after another, and test what the theory says, you may find something new on your own or gain some insight that others have missed. Doing the work trains an eye for recognizing something important, in a way that learning good concepts from others alone cannot.

For example, in the practice of making pendulums in class, one finds that students often drill slightly different sized holes for the pin that the pendulum rocks on as it swings. The relative diameters of pin and hole can actually shift the period by 2-3% or so. That is much more than the error of the timing devices commonly found in the lab, much more than the locally known error in 'g', and much more than the error of the usual approximate equation (the commonly taught formula is NOT exact, as the 2nd-order differential equation involves a transcendental function that complicates the solution). Most students will just do the experiment and chalk the discrepancy up to some random flaw they don't even bother to investigate. Learning to discern when it is important to dig deeper is a skill you learn by doing, and no book will teach it to you. It comes from gaining a practiced eye.
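To put rough numbers on that pin-and-hole effect, here is a small sketch of my own (an illustration, not something from the thread), using the small-angle formula T = 2*pi*sqrt(L/g). Since the period grows like the square root of the effective length, a few percent of extra effective length from a sloppy pivot shifts the period by roughly half that fraction. The 5% figure below is an assumed value, just for illustration.

```python
import math

def period_small_angle(length_m, g=9.81):
    """Small-angle pendulum period: T = 2*pi*sqrt(L/g)."""
    return 2 * math.pi * math.sqrt(length_m / g)

L = 1.0                                  # nominal pendulum length in metres
T_nominal = period_small_angle(L)

# An oversized pivot hole lets the pendulum rock on the pin, acting like
# a slightly longer effective length.  Since T ~ sqrt(L), a fractional
# length change dL/L shifts the period by roughly (1/2) * dL/L.
T_sloppy = period_small_angle(L * 1.05)  # assume 5% extra effective length

shift = (T_sloppy - T_nominal) / T_nominal
print(f"nominal T = {T_nominal:.4f} s, sloppy T = {T_sloppy:.4f} s")
print(f"period shift = {shift:.1%}")     # about 2.5%, well above stopwatch error
```

A shift of that size swamps typical lab timing error, which is the point: the discrepancy is systematic, not random, and a practiced eye learns to go looking for it.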

Anyway, it all goes together -- theory and practice.

Jon

_________________

Say what you will about the sweet mystery of unquestioning faith. I consider a capacity for it terrifying. [Kurt Vonnegut, Jr.]

Interesting thread.

I lean more towards calling it lateral thinking, but it's probably the same thing: thinking outside the square. If this line of thinking is respected, it's a great tool for gaining knowledge from others and for getting along with them.

Of course if it's not, all heck breaks loose!

For myself, it's a bit of a mixed bag. Chiefly because I know I'm most comfortable when things are black and white, and lateral thinking doesn't always fit that bill, so I adapt it as best I can. I call that "a shade of grey that I can cope with".

But as a rule I like lateral/abstract thinking. It's what makes us unique, and I for one am proud of that.

I do much better in situations where there could be many answers, depending on how one perceives something--rather than situations where one has to dig deeply and use up tons of energy to find only one correct answer.

I'm the opposite, I thrive on the objective. I love facts and deriving patterns from those facts. I love it when my hypotheses are corroborated by objective facts. I am at my best when I am digging deeply to find the "one correct answer."

I'm the same. I love mathematical concepts, but hate doing the actual math problems!

richardbenson

Xfractor Card #351

Joined: 30 Oct 2006

Gender: Male

Posts: 13,553

Location: Leave only a footprint behind

i have more concrete thinking than abstract, but i think thats why i have such a rough time trying to live on my own. i have expectations of how things should work, and once they dont, all hell breaks loose

_________________

Winds of clarity, a universal understanding, come and go; I've seen through the Darkness to understand the bounty of Light

I do much better in situations where there could be many answers, depending on how one perceives something--rather than situations where one has to dig deeply and use up tons of energy to find only one correct answer.

One of the problems with the way calculus is usually taught in school is that it develops the concepts not in the natural way they originally evolved with Newton and Leibniz, but instead as they were much later founded more rigorously by Weierstrass and Dedekind in the mid-to-late 1800s. The physicists had been using a concept that the mathematicians refused to embrace as valid (it was not rigorous from a mathematician's point of view, though it was very easy to imagine with), and it took the mathematicians nearly two centuries to figure out why what the physicists had been using seemed to work so well. But what they created was something very difficult to get across to most people. It was rigorous. But it was Rube Goldberg, by comparison.

It wasn't until Abraham Robinson's work in the 1960s that the original, much more natural way of thinking about calculus used by the early physicists was placed on sound mathematical ground, in what is called "non-standard analysis." I find it much, much more natural to apply.
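A concrete, modern echo of that infinitesimal style (my own sketch, not something from the thread) is dual-number arithmetic: carry a formal symbol d with d*d = 0 through ordinary algebra, and the derivative falls out as the coefficient of d. The class name and details below are my own illustration.

```python
class Dual:
    """Number of the form a + b*d, where d is a formal infinitesimal
    with d*d = 0.  Ordinary algebra on these carries derivatives along."""
    def __init__(self, a, b=0.0):
        self.a = a   # finite part
        self.b = b   # coefficient of the infinitesimal d

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.a + other.a, self.b + other.b)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # (a + b*d)(c + e*d) = ac + (ae + bc)*d, because d*d = 0
        return Dual(self.a * other.a, self.a * other.b + self.b * other.a)
    __rmul__ = __mul__

x = Dual(3.0, 1.0)     # x = 3 plus an infinitesimal step
y = x * x + 2 * x      # f(x) = x^2 + 2x, computed with plain algebra
print(y.a, y.b)        # finite part f(3) = 15, infinitesimal coefficient f'(3) = 8
```

Nothing here is a limit argument; it is just algebra with one extra variable type, which is exactly the spirit of the old infinitesimal reasoning that Robinson later made rigorous.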

I was taught the standard way and had a lot of difficulty fitting the concepts into the pictures I needed to think in. They seemed non-visual and highly dependent upon rote memory to apply well: the Weierstrass/Dedekind foundations use the limit definition of the derivative and then back into the idea of the integral from there. It was hard, painstaking work for me to begin to visualize things that way.

Some years later, I reached into my library for a table of integrals and derivatives and pulled out a book (a red textbook) I had on the shelf with the basic formulas on the inside flaps. The derivatives were given there in a different-than-usual form, and because of that, and perhaps because I was ready to see something new, I did a complete double-take as a new world dawned on me. It was like a revelation of sorts, and suddenly all of this made complete sense. Only years later, reading Newton's original writing and talking with other physicists about it, did I learn that this wasn't just my own discovery but the way the whole concept had been discovered in the first place, and I learned then of Robinson's non-standard analysis work from the '60s. So it is pretty well grounded now. But it is a whole lot easier to see than the traditional Rube Goldberg, limit-theorem, backwards way of seeing things that Weierstrass and Dedekind thrust upon the world, and which later math educators wrote their textbooks to require.

I'm not saying that the rigor of Weierstrass and Dedekind's approach is a waste. It's not. Especially for those who want to become mathematicians, it's absolutely vital. But for most folks who just want to apply it well, and who are not planning to become mathematicians in their own right, I think it's a bit much.

Basically, calculus is just old-fashioned algebra with a new type of variable added into the mix. All the same rules apply, cancelling terms and so on, but this new kind of variable is always prefixed with a 'd' to remind you that it can only hold infinitesimal values (values greater than zero yet smaller than any positive finite value). The ratio dy/dx just compares the size of the tiny infinitesimal dy to the tiny infinitesimal dx, and it is no problem to understand, for example, that one might be twice the size of the other, so that dy/dx = 2. It's just the addition of a new variable type; everything else remains the same.

Summing them up is quite visual, too. For example, the integral of (x dx) should actually be seen as (x * dx), in other words, as some finite value x multiplied by some infinitesimal dx. That product will of course be infinitesimal in size, too. But it is really just an area (a product can be seen as a rectangle), so imagine a rectangle 'x' high and 'dx' wide. To sum them up, just place them side by side and look at the picture. If you go from x=0 to, say, x=x0, then the first rectangle is 0 high and dx wide, the next is dx high and dx wide, the next is 2*dx high and dx wide, and so on. Set side by side, the picture is just a triangle: the little dx-wide bases must sum to x0, and the last rectangle is x0 high and dx wide, so the height of the triangle is also x0. A right triangle x0 wide at the base and x0 high at x=x0 clearly has total area (1/2)*x0^2. Quite visual, and better than backing into the idea by reversing the limit-theorem machinery.
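That triangle picture is easy to check numerically. The sketch below (my own illustration in Python, not from the thread) sums thin rectangles of height x and width dx and watches the total approach the triangle area (1/2)*x0^2 as dx shrinks:

```python
def sum_rectangles(x0, n):
    """Approximate the integral of x dx from 0 to x0 by summing
    n thin rectangles, each x high and dx = x0/n wide."""
    dx = x0 / n
    total = 0.0
    for k in range(n):
        x = k * dx            # left edge of the k-th rectangle
        total += x * dx       # area of one thin rectangle
    return total

x0 = 3.0
exact = 0.5 * x0**2           # the triangle area (1/2)*x0^2 from the picture
for n in (10, 100, 10_000):
    print(f"n={n:6d}: sum of rectangles = {sum_rectangles(x0, n):.5f}, "
          f"triangle area = {exact}")
```

As n grows the staircase of rectangles hugs the triangle more closely, which is the whole geometric content of the infinitesimal argument.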

Much much more to say on this, but that's a start. I wish classes would catch up on this.

Jon

_________________

Say what you will about the sweet mystery of unquestioning faith. I consider a capacity for it terrifying. [Kurt Vonnegut, Jr.]
