───✱*.。:。✱*.:。✧*.。✰*.:。✧*.。:。*.。✱ ───
The discussion for module three focused on cognitive biases and common errors in reasoning, and specifically what they imply for the everyday reliability of our judgment. In my post, I explained that confirmation bias is my main pitfall: I tend to seek out sources that support what I already think and skim past counter-evidence. I also noticed that hindsight bias can make outcomes feel predictable after the fact. Rather than assuming I can avoid biases entirely, I think better reasoning comes from slowing down, writing out my reasons, and grounding my opinions in evidence, while recognizing that fatigue and stress make it harder to detect these biases in my own judgment.
Lindsay mentioned that even when memory and perception are unreliable (like conflicting recollections after a car incident), structural aids and objective records can justify greater confidence. Bobby added that anchoring can skew judgments (like first prices framing later ones), which shows why deliberate checks matter. Reygan and Maya stressed that although we can’t remove bias completely, we should ask whether we believe something because it’s true or just because we want it to be true, and we should actively seek out disconfirming views; challenging our judgments this way makes them more reliable rather than arbitrary.
Our group connected these ideas to real practice. For example, Reygan and Bobby pointed out how social media creates echo chambers, which can intensify confirmation bias. This made me think about how intellectual humility is earned not by certainty but by exposure to diverse viewpoints and by “checking ourselves.” Maya also made the important point that in social settings, group agreement can nudge us to conform even when we don’t fully believe something. This reminds me of the concept of herd mentality, the tendency to conform one’s behavior to that of a group rather than follow one’s own judgment.
I think good reasoning means forming the best-supported judgments given the current evidence, not just following one’s biases. It’s impossible to eliminate bias completely, but slowing down helps me think more critically rather than just going with what’s familiar. Biases don’t fully defeat rational thinking; instead, they can motivate us to adopt practices like slowing down and seeking counter-evidence.
───✱*.。:。✱*.:。✧*.。✰*.:。✧*.。:。*.。✱ ───