Hubris, Original Sin, and Heuristics II: Methodology and Uncertainty

The Task.  Here I set aside my daily life, my social roles, and the narrowly professional scope of my training to undertake the philosopher’s task:  to start from nothing and understand everything.

I can, and indeed must, stand on the shoulders of the billions of students and teachers who have come before me:  family, friends, professionals, artists, academics, prophets.  I shall make the best use I can of every idea, tool, record, and inspiration they have left behind that I can find; but to the limit of my ability I shall not accept or reject anything without consideration and as much understanding as I can muster.

Even with hundreds of years of enlightenment, thousands of years of history, and millions of years of evolution to help me, and perhaps inherent in me, I can only hope the philosopher’s task is merely audacious rather than vain; but for me, it is necessary.

We are not—or at least I am not—great enough to comprehend everything at once.  I believe now that is the sole province of God; and to be human is to be essentially limited.  Being merely human, I must take philosophy one step at a time, breaking the task into pieces I can hope to manage.

Experience suggests the obvious place to start any task is at the beginning, with the first step.  Logic suggests conclusions can only be as sound as the original premises or principles they proceed from, and the method by which they are derived.  I believe this makes sense; but it leaves me with the difficulty of identifying the original principles and the root method.

The Middle.  The challenge is profound; and it is compounded because surely I do not stand at the beginning.  Instead, I must be deep in the middle—of learning, of life, of science and art, of history, of evolution.  I cannot even clearly imagine, let alone reason back to or perceive, the beginning… can I?

The beginning—my beginning—is lost in the past.  And despite the extraordinary progress of technology and science, I doubt they will ever permit us to record, for later reference, each step in the evolution of anyone’s worldview and understanding (unless somehow science and technology can bridge the gap between the subjective and the objective, which, at least now, I suspect may prove to be impossible).

How could one possibly begin with one’s own beginnings?  Today, and very likely forever, we cannot even tell for certain when human existence begins (or even life itself, other than by reference to objective definitions), let alone the soul.  If not as a categorical matter, how possibly as an individual experience?  I can remember almost nothing from before a certain age; and when memory begins, it is, for all but a handful of people, spotty, piecemeal, and structured in a manner that is neither chronological nor formally deductive.  Babies, even weeks after birth, may be outraged when they pull their own hair, because they don’t yet understand that what their hand is pulling on is their own hair—and they don’t yet understand that they are moving and grasping anything with their own hand!  It takes weeks and months to sort out that one has a body, that there is something beyond one’s body, that acting in the most basic and physical ways has consequences, however simple and uneventful.  This learning process doesn’t reach fruition automatically, or even in a time period measurable in minutes or hours or days.  Simply to roll over, or to push up from the ground with her arms, long before she can even walk, a baby has to learn a thousand prerequisite principles, and acquire at least an essential or working knowledge of sensation, motor control, time, space, and the limits of her own body.

On one level it seems natural that, when sitting up or standing are extraordinary achievements, examining one’s nature, identity, life, and meaning are things many people don’t prioritize, or indeed may actively avoid.  But the more important conclusion is surely that if a baby can achieve so extraordinarily much, in so brief a time, what we can achieve as adults within the human lifespan must be truly amazing!  And we should try our best to do so.

Even the most theoretical physicists and clergy must construct an interpretation of the past based on observation and inference, mustn’t they?  Unfortunately, although I believe they are not only useful, but necessary, inference and observation are categorically and necessarily flawed, in a way epistemologists cannot bear to contemplate but small children understand and experience.

For methodology, the implication is that one cannot truly start any analysis at the beginning because (1) we cannot remember the beginnings of our own analysis, and probably never could; (2) we probably cannot practically reconstruct the beginnings of our own analysis, because there are thousands or millions of steps between whatever the first principles actually are and the point where we can even conceive of original premises; and (3) there is probably a point (which I am not persuaded anyone can yet identify, but probably somewhere between the instant of conception and the time of birth) where whatever is happening as we learn changes fundamentally in nature from the objective to the subjective.

As an imperfect analogy to illustrate this third point, one might compare human development and learning to hardware and software.  Software, which one might compare imperfectly (but simply for purposes of illustration) to the mind, is language built on deductive reasoning structures.  Hardware, which one might compare equally imperfectly to the brain, is a physical construct.  Comparing life, in the same way, to our conscious efforts: we assemble, shape, or at least affect materials from the world around us, using our empirical observations about the qualities of those materials and how they react to stimuli, to create logical pathways.  Miraculously, when silicon and other trace elements are arranged into very complex patterns, they create channels that can support and express, in the form of software, at least an electrical analogy to reason.

Inference.  For centuries children have frustrated their parents’ reliance on inference by repeating the question “Why?” in response to every explanation.  There is always a practical (and frequently an emotional) end to this line of questioning; but never a logical, categorical, necessary, or sufficient one.  Arguably the greatest methodological dispute in history is between rationalists and empiricists.  Surely it persists, at a minimum, because we sense each position is useless without the other:  induction is inarguable and practical, but inherently passive, essentially objective, limited to correlation, and statistical; whereas deduction is logical, powerful, essentially subjective, causal, and goal-oriented, but untethered and either indeterminate or internally inconsistent.

Or to take things a step further, induction and deduction may require one another.  If formal logic is intuitive, does that make it natural, or only genetically determined?  More importantly, it seems unlikely that, if it were truly intuitive, it would be the subject of college courses with, effectively, a prerequisite of a generation of life and a decade of formal education.  Like many people, I have struggled my whole life to understand the world around me; but it took decades to even begin understanding myself, and perhaps twenty years just to begin to perceive my true self, or its existence.  If arriving at logic has so many prerequisites, how can we not suspect that each element of formal logic is something we accept only because empirical induction persuades us to accept it?  Conversely, as deceptively simple as the notion of induction is, it is a notion we question, study, and test; and perhaps more profoundly, only correlation, and at the outermost limit perhaps temporal sequence—not causation—can be observed or tested.  The human mind may take many empirical experiences for granted—people acted as if an unsupported object would fall for millennia before Isaac Newton stopped to ask “what’s going on here?”—but the human mind seems to search for some kind of meaning, and perhaps utility or purpose, in the world around it; and even when a particular phenomenon goes unquestioned, people necessarily make sense of it, or at least make use of it, because they are living their lives in its presence—and either course seems to embrace and rely on induction.

To step back, in a page and a half of trying to understand where I can begin, I have relied on (or at least considered) dozens of concepts that most of us don’t even have the vocabulary or background to study until adolescence or adulthood.  Along the same lines, developmental theorists struggle even to establish what kinds of understanding the human brain is capable of supporting in its different phases.  Rightly (showing how difficult it is to think about first principles until we are educated and acculturated to do so) or wrongly (showing how impossible it is to fully comprehend, or even remember or perceive, our own past), developmental theory seems to indicate that faculties are added and improved from conception until about 25 years of age; implying that we are capable of more profound understanding only deep into our learning and development cycle (even if the rate at which understanding is acquired and improved steadily decreases).  And, perhaps, suggesting that reason and sense themselves are only permitted by, and conversely must necessarily be limited by, our biology.

Observation.  Although adults may understand and appreciate it more fully than children, most of us can remember, probably back into childhood, the difficulty of distinguishing between dreams and reality.  Indeed, researchers have posited that young children are incapable of distinguishing between reality and dreams.  I, and apparently many others, can remember believing a dream was real, experiencing a dream as reality, and even waking up and not immediately being able to distinguish between what was real and what was a dream.  In the twilight between dreaming and reality I have experienced, and many others have reported, that even after waking one may still be unsure how much was real and how much was a dream, until… well, perhaps, until the experience of the dream fades and the experience of reality stands in front of them.  But maybe this suggests not that the difference between reality and dreaming is clear, only that whatever we are experiencing seems real to us.

One may make observations about the objective world, or the subjective one.  Perhaps for the moment I may assume that all of us, through our experiences with the follies of others (and, in moments when we can admit them, of ourselves), are aware of how tricky and unreliable our observations about subjective experiences and meaning can be.  But I suspect it is harder to hold onto the trickiness and unreliability of observations about objective experiences, because our access to them is mediated, and we are thus insulated from inconsistent perspectives about them, in a way we are not with subjective observations.

To me, for whatever reason, it is often helpful to remember two things in order to recognize the difficulty of understanding observations of the objective world, and the deeper question of their validity, as it is sometimes claimed by proponents of radical empiricism and the scientific method.  First is the contrast between what we perceive, and intuitively accept, as the baby’s immediate and very real experience of pain when it pulls its own hair, and the mechanisms (at least two of them, motor control and sensation) connecting the baby’s consciousness to the world, which are necessary for the baby to experience subjective pain in relation to an objective event.  The world comes to us and affects us, imperfectly, through our senses and sensory nerves; and we go to and affect it, as imperfectly, through our motor neurons.  One may call the preceding description a subjective one, from the perspective of the baby’s experience.  Second, by contrast, attempting to observe or explain the same process from an objective viewpoint might begin with the comparison often made between the brain, studied by neuroscience, and the mind, studied by psychology (which, to be generous to a discipline, might both be studied by psychiatry).  From this perspective, it is the subjective experience of the person that is at arm’s length, and often doubted altogether as a valid or real or meaningful point of view.
