(Warning - this may get a little philosophical)
Should we tell the truth? At face value, it's a question most of us would likely answer in the affirmative. I believe most people would agree that they should tell the truth, while at the same time admitting that they don't always do so. Some would even suggest there are certain "exceptions" - times when it's "ok" to fib a little (Hope Hicks might be one of these people). On the surface, truth seems pretty easy to understand, but when you dig deeper, things get a little more confusing.
For the sake of argument (which I enjoy, by the way), we might ask, "Well, what is the truth, exactly?" Does it mean telling it like it is, telling it like we see it, or telling it like we think it should be? And to go further, you might ask whether truth is the same thing as reality, or at least within the same arena. In other words, does what we see in reality amount to truth, or is truth something else altogether - something transcendent, something outside of humanity that gives it its authority? Here, we're not talking about truth as something we can attest to in the real world but about the idea of truth - Truth with a capital T.