Might be easier to separate them into tasks. Reasoning about reasoning itself could be a meta activity. Whatever you learn about reasoning itself by reasoning about reasoning (its pros and cons) can THEN feed into your reasoning about other things.
It’s one direction at least.
Example: Reasoning is pragmatic. That’s a kind of axiom, a general truism. But it’s missing a word. Reasoning is […] pragmatic. Sometimes pragmatic. Often pragmatic. Usually pragmatic. There are a number of ways you can make the statement true without committing to a complete equivalence.
From that basis, reasoning can lead to pragmatic truths. But is it ONLY useful in pragmatic truths? Are thought experiments pragmatic truths? What about “pure reasoning?”
“Pure reasoning” is somewhat analogous to a game. It has strict rules to follow. These strict rules are, to put it bluntly, “made up”. Yet, they’re useful. Back to pragmatic we go.
They create a self-supporting framework whereby *as long as* you accept “the rules”, you end up with logically consistent results. This is useful – i.e., pragmatic – rather than ending up with a system that “leaks”.
Then we have the biology of the situation. Can humans utilize pure reasoning? Perhaps with damaged emotional systems they can, but otherwise, not really. It’s more of a “nice goal”.
Yet, can we create machines based upon pure reasoning? Sure. It’s a subset of our available capacities to do so. But it’s always incomplete as a mirror for human truth, because human truths utilize emotional content (such as the ‘need to be right’ – one of many driving forces towards truth-seeking – or perhaps it’s a ‘need to regulate ambiguity’).
So, what do I do?
I divorce “reason” from the systems we created called “logic”. I prefer to think of reason and reasoning in the form of, “What conclusions might humans typically come to after thought and reflection?”
I can use the systems of logic as well as other systems that we have: introspection, coming together for a consensus opinion and declaring it “objective”, stuff like that.