EDIT: got downvoted! I would love love love to know why! Not offended, just curious.
However, one day, a philosopher described situations in which a justified true belief still fails to count as knowledge. This is the Gettier problem.
What you describe is something akin to a network of Bayesian conditionals attached to certain propositions, which update their relative weights upon confrontation with new information. We know with certainty that this process has significant benefits in general (it's certainly better than most systems that don't internalize new information), but it can and does create false reasoning.
In short, it's good but not sufficient to create knowledge. The problem of individuals creating ideological filter bubbles around themselves is closely related to the idea that their evidentiary priors become more and more rigid as they accumulate confirmatory evidence that justifies their views over time. The issue isn't that they stop taking in new information, but that the new information is interpreted through that existing belief network.
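The rigidity described above can be illustrated with a toy Bayesian update loop (the numbers and the `bias` term are hypothetical, a sketch of "evidence interpreted through the belief itself", not a model of any real reasoner):

```python
def update(prior, p_e_given_h, p_e_given_not_h):
    """One Bayes update: P(H|E) = P(E|H)P(H) / P(E)."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# The evidence itself is neutral, but its interpretation is skewed
# by the current belief: the stronger the prior, the more
# "confirmatory" each new observation is taken to be.
prior = 0.6
for _ in range(10):
    bias = 0.5 + 0.4 * (prior - 0.5)  # interpretation distorted toward the belief
    prior = update(prior, p_e_given_h=bias, p_e_given_not_h=1 - bias)

print(prior)  # ratchets toward 1.0 despite neutral evidence
```

With an unbiased interpretation (`bias` fixed at 0.5) the prior never moves; the feedback between belief and interpretation is what produces the ossification.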
Thankfully, as a super-organism, we have a great solution for that mental ossification. We die. New people who have less evidentiary accumulation can address the issue with new priors and often that's all that's needed for huge breakthroughs.
The question is "what is knowledge?", not "do we know that we know p?". And I see no issue with the definition of knowledge as justified, true belief. Now, if I believe p, and you ask me whether I know p, I may say yes. But whether I actually know p will depend on whether my justification is valid (that it really is a justification and a sufficient one) and whether it is true, which has nothing to do with whether anyone knows whether the justification is valid and the belief is true. It's a separate question, and conflating the two questions leads to an infinite regress of skepticism. So the definition of knowledge qua knowledge still stands.
I would also suggest you try to apply your general approach to the very theory you are proposing. I see an opportunity for retorsion arguments.
Death forces our species-wide belief set to go through the constrained channel of education and communication, the same way that our bodily attributes go through the constrained channel of our germ-line genes.
This process lossily compresses the signals, which allows for drift or attenuation when the next generation reconstructs the beliefs and associated behaviors. Transmission also applies stress that acts as a filter to weed out beliefs that are no longer adaptive.
I disagree (without down-voting). This is basically a one-man echo chamber: you take what you like (it doesn't matter how many eloquent words you use to describe this, the result is the same) and reject what would challenge your beliefs and make them weaker. That's the opposite of the critical thinking so needed in the real world, and a prime source of why the current world, particularly the West, is so torn to pieces about shit like Russia, Trump, guns, migrants and so on.
Stuff in life is complex, always, almost at a fractal level. You keep learning, if you actually want to, about new viewpoints that will challenge your current ones, every effin' day. Maybe in the end the conclusion is don't trust anybody, people are generally a-holes etc. That's still fine as long as it represents truth.
Stuff in life is complex, people are assholes, but even assholes have good ideas sometimes. I recommend listening to everyone who speaks for themselves in good faith. Anyone can cook!
Most unresolved disagreements I know of are because the groups disagree on some unprovable underlying assumption. Switching positions on it doesn't make the beliefs stronger or weaker.
Being able to believe something and stick to it, regardless of challenges from competing interests or forms of coercion: that's more valuable in practice than being more reconciliatory.
(Quoting Aristotle always puts me in the mood to rank things.)
Nothing groundbreaking, and in the end nothing that needed so much perfectionism around it.
The belief that something has to be perfect is one of the strongest I see among founders here on HN and elsewhere. It's almost always bad. I have zero examples where that ended up being good. Yet, even though the facts are clear, it's extremely hard to overcome.
* Your input box doesn't look like a text box
* The 'enter' key doesn't work in the text box
* 'Refresh' neither looks like a refresh icon, nor has a label
* The fade on the right of the gallery implies you can scroll, but this isn't possible
* The generated logo + icon pair wasn't immediately noticeable (the first image is the icon without text, and the first icon isn't guaranteed to be noticed); consider generating an image with text + logo on a transparent background and putting it above the 4 sample images.
On desktop, you can actually scroll right now
That being said, this comment feels more like self-promotion than conversation. Don't do that.
And the comment section is rarely a representative sample of your target audience.
I get really OCD at times, and to avoid that I started focusing on the "most basic functionality" and forced myself to launch when that experience was possible - from there, everything I get caught up in is still an improvement.
I've done exactly that tho, and I well overcooked a project by holding it back til I had it just right.
Hopefully you didn't hate the experience and you're better for it.
I liked the idea of the site immediately - logos can really suck and this site is perfect when a logo doesn't really matter enough to spend time on.
I'm sure you're very limited in scope by trademarks and copyrights, but a little more variety would be great. Colors would be awesome - being able to choose 2-3 would be better, even with everything else the same.
More background images, or an option to use my own. Seriously tho - not a bad start at all. Fantastic side project.
I had no issues with the UI and was plugging away almost immediately.
I'll check it out again for sure.
If so, does it matter if it’s perfect when your goal is just to boost top of funnel for the agency?
- What it actually means to fail
- That failure is inherently bad
- What will happen next after failure occurs
- What it says about me when I fail
- What others will think about me when I fail
- That I can’t recover from failure
etc.
If you grow up hearing that failure is bad/wrong/implies something about you as a person, it might never occur to you that another framing is that life is a series of experiments, and failure can be one of the best ways to zero in on success (in some cases, this may be the only possible way).
As far as I can tell, it’s beliefs all the way down, and adjusting certain beliefs can fundamentally transform experience relative to all downstream implications of that belief.
This in turn inspired Bell's theorem, and eventually quantum information theory.
This is where Kuhn is so helpful: in understanding that even scientists have immense difficulty, if not vigorous myopia, when stuck with wrong beliefs. Paradigm shifts by funerals are easier over decades than getting scientists to evolve their models.
Not only is it impossible with current human knowledge to construct an infallible theory that predicts everything we encounter, it is also impossible with current human physiology never to cling to wrong ideas in the face of counter evidence. When examining our rationality, we must not only admit our data are incomplete and our theories flawed, but we ourselves might be thinking foolishly.
I strongly disagree with this, unless we are only talking about beliefs that are about facts of the universe.
For example, my strongest belief is that all people have an equal right to exist and pursue their own purpose... this is not a belief about the facts of the universe, but about my own morality. I don't think it has a chance to be 'wrong'
Example: I love my wife. This is an emotional commitment. It can't be 'wrong' in a factual sense - that's the wrong rubric for it. So it's not really a belief in that sense either.
A belief should be amenable to facts, evidence, or some sort of feedback. If it isn't it's ultimately not a belief. It's excluded from the kinds of decision-making and reasoning he's describing.
Everyone who says this naturally excludes pedophiles, nazis, or any other "undesirables" in their given society, whoever is deemed to be socially unacceptable in the current moral framework.
I get your point, but I disagree.
I won't say (because it isn't true) that "my strongest belief is that all people have an equal right to exist and pursue their own purpose."
However, as you point out there are those who are "deemed to be socially unacceptable in the current moral framework." I agree that subjective standards have no place in the law -- anywhere.
I do believe that all sentient beings have agency. But in a free, open society, such agency needs to be constrained in order to promote social cohesion.
That said, we shouldn't attempt to restrict what someone believes. Rather, we should restrict actions that infringe on the rights of others within a society.
To use your examples, there's nothing "wrong", per se, with being a "pedophile" (that is, a person who is attracted to/sexually desires pre-pubescent children), but actually abusing children to satisfy those desires infringes on the rights of those children and should be (and in most places already is) restricted/criminalized.
As far as "nazis" are concerned, again holding beliefs in accord with nazi-ism/white supremacy isn't "wrong", but taking action in support of such beliefs may well be wrong (e.g., shooting up a church full of African-Americans or a synagogue full of Jews, etc.) when it infringes on the rights of others.
To put a fine point on this, there's nothing inherently wrong/evil/bad about believing that sex with four year-olds or killing Jews and people of color is right and good. Personally, I find such beliefs to be repugnant, but that doesn't give me (and shouldn't give the government either) the right to restrict those people from holding/sharing such beliefs.
At the same time, should someone act on those beliefs (e.g., abusing children, killing Jews or blacks, etc.), the government should absolutely at least attempt to stop (and/or detain/incarcerate) folks who are actually infringing on the rights of others.
Belief is far stronger - that's why people do things all the time they themselves at one point "knew" they couldn't do.
If you start with a flawed belief - things won't improve from there. You'll end up "knowing" a whole lot of stuff that reinforces your flawed belief - simply glossing over/ignoring/downplaying the facts that don't support it... this becomes a bit of a feedback loop after a while.
So either learn to let go of your beliefs and adapt or at least don't firmly establish beliefs until after you know enough stuff to decide for yourself what to believe.
I reevaluate mine all the time and I'm not wrong on any of my strong convictions - albeit from my point of view, which I've made as broad as possible, but I'm still human.
My highest beliefs today are built upon a foundation of information, learning and mistakes - I may state a belief with a single sentence but I can write books about why I've arrived at that belief.
I don't think that's morality - I sometimes do things I "know" to be immoral, when the justification warrants it, but I've never knowingly decided to believe something I know is wrong - even if I was forced, I'd only pretend to believe at best.
In college I'd cheat on a test tho if I thought it was the only way I'd pass - bc I believed passing was more important than the test... maybe it's a bad example of immorality.
Anyways, I completely agree with Cortesoft - I'm settling on the understanding that all people everywhere are fundamentally important, collectively and individually.
Allowing and empowering all people to live their best lives is in all of our best interest. I've gone further even than equal right to existence and yet I'm supremely confident.
I think this rant also rather effectively demonstrates exactly what the OP was saying about our strongest convictions.
An incorrect fundamental belief - say I believed the earth was flat - would be implicit in everything I believe after that, just part of my world view, muddling up everything I think about anything - and I wouldn't even be aware of it.
Mental liquidity. Fantastic.
Otherwise knowledge can be an immovable trap that becomes harder to avoid/escape the more stuff you know.
Scientists are great examples of this - if it can't be scientifically methodized, it doesn't exist and therefore must be explainable within the framework they already know, bc that's always right ;)
"What if ***** were true? Surely it can't be true. If it were, that would be terrible."
That's motivated reasoning. Remember that the truth of any hypothesis is not influenced by how much you want it to be true, or false. Some hypotheses are deeply uncomfortable, but you should nonetheless strive to believe the truth. Or rather, what is best supported by the evidence. Even if it hurts.
Taking up the most uncomfortable (i.e. "forbidden") hypothesis and giving it the weight required to attempt to prove it to yourself is not a systematic way of finding truth; it's a way of deceiving yourself into believing in the simplistic frameworks of other people's paranoid conspiracy theories.
It is worth citing the Litany of Gendlin:
What is true is already so.
Owning up to it doesn't make it worse.
Not being open about it doesn't make it go away.
And because it's true, it is what is there to be interacted with.
Anything untrue isn't there to be lived.
People can stand what is true,
for they are already enduring it.
Don't jump to new (wrong) conclusions, break out of your mental prison of conclusions.
It's unclear to me in what respect the opinions are "strong" if not one's conviction in them. To my mind a strong opinion is an opinion one is confident in.
Also it's unclear to me if/how/why this is better than "less opinions". Like is it better to have a "strong opinion weakly held" on topic X versus "My opinion is pending scientific research will answer this"?
A nitpick -- I actually have a pretty big distaste for maxims that have some cutesy rhyming/wordplay to them (in this case it's X y, !X z, X = strong).
As far as "strong opinions, weakly held", this is one of my favorites at work in a large scale product engineering environment. It goes beyond "mental liquidity" as described in the OA (which is really just about the "weakly held" part). The "strong opinions" part is that often times groups will succumb to analysis paralysis or unwillingness to make a decision due to group dynamics. Having a strong opinion (ideally backed by knowledge and expertise) is a way to push through and bring clarity. The risk is there is a personality type prone to blustering overconfidence that will push a group in a certain direction without reasonable justification. Ideally what you want is a critical mass of smart, decisive, but open-minded people who are quick to assimilate new evidence into their viewpoint.
That's the correct interpretation of "strong opinions", as I understand the phrase.
The "weakly held" part means that you are willing to adjust your opinion in the face of contradictory evidence, which is difficult to do for deeply held beliefs.
A lot of it also sounds like common sense to me; the people capable of grasping this:
> Be careful what beliefs you let become part of your identity.
Are quite capable of adjusting themselves.
Everything else falls into either Ego, or people being self-(un)aware, and for the latter - you can only change "their" belief system if they themselves are willing to change.
“I have a tight enough knowledge and grasp of my beliefs to intentionally control my sense of identity” is a fascinating belief to turn into an identity.
According to the Kegan theory, it's possible. I'd be fascinated to see it if anyone knows of a study that demonstrates self authorship in a population of real people.
It’s a somewhat amusing thought that there is this human phenomenon wherein we can transcend nature, nurture, the id, the ego, the superego, biology and chemistry — and overwhelmingly those that achieve this enlightened state coincidentally tend to end up as self-help bloggers and motivational speakers.
http://www.vega.org.uk/video/subseries/8
> Q: "Do you like the idea that our picture of the world has to be based on a calculation which involves probability?"
> A: "...if I get right down to it, I don't say I like it and I don't say I don't like it. I got very highly trained over the years to be a scientist and there's a certain way you have to look at things. When I give a talk I simplify a little bit, I cheat a little bit to make it sound like I don't like it. What I mean is it's peculiar. But I never think, this is what I like and this is what I don't like, I think this is what it is and this is what it isn't. And whether I like it or I don't like it is really irrelevant and believe it or not I have extracted it out of my mind. I do not even ask myself whether I like it or I don't like it because it's a complete irrelevance."
I think that's critical, because if you become emotionally involved with promoting an abstract idea, it becomes part of your personal identity or self-image, and then changing your mind about it in the face of new evidence becomes very difficult if not impossible.
In another lecture, Feynman also said something about not telling Nature how it should behave, as that would be an act of hubris or words to that effect, you just have to accept what the evidence points to, like it or not.
(Changing your mind about what's morally acceptable, socially taboo, aesthetically pleasing etc. is an entirely different subject, science can't really help much with such questions.)
Coining a new term when a perfectly good one exists is unfortunate but happens, as seen with the author here.
Edit: here's a link to neuroplasticity (aka brain plasticity):
This is a great question. And "decade" is a good time frame not only because of size but because it's a long enough time frame there's a better chance people will have good answers.
The Dee Hock quotes (“A belief is not dangerous until it turns absolute” and “We are built with an almost infinite capacity to believe things because the beliefs are advantageous for us to hold, rather than because they are even remotely related to the truth”) are great too.
Different angle: it's not simply "fooling" oneself, but it's because ideas are one way or another built on top of an ideological foundation.
Einstein rejecting quantum theory on the basis that the universe shouldn't have a random component to it is also rejecting the idea of having to re-examine all philosophy past Descartes and Newton, which aligned so well with society's viewpoint at the time - a deterministic, cause-consequence universe, where things have logical explanations and where hard work is rewarded.
"I never allow myself to have an opinion on anything that I don't know the other side's argument better than they do."
Check out this article [0] for a description of ACT from a founder's perspective.
This sounds like a rehash of Popperian epistemology. We should look forward to disproving existing theories (finding new problems), because it leads to new, better theories.
A great essay in this area is Venk’s Cactus and the Weasel. https://www.ribbonfarm.com/2014/02/20/the-cactus-and-the-wea...
The link contains a number of reasons why people get trapped in sunk cost fallacy.
When you accept on faith a handful of principles that deal with an unknowable domain, it becomes much easier to be less attached to the other stuff.
For me, I’ve found success and deep value in exploring non-sectarian Buddhist philosophy, which points directly at the problems caused by attachment to ideas and things, and does a good job of deconstructing thought processes that most of us engage in without realizing.
To me, this is less about choosing to accept certain principles on faith as much as it is about recognizing/acknowledging that this is what we already do in most aspects of our lives.
To anyone who can find value in traditional religious contemplation while avoiding the downsides, more power to you. The point of my comment isn’t to say there’s nothing to be found there, but if the version of religiosity you’re familiar with is the toxic kind, there are other paths to follow that get at some arguably important insights without some of the baggage that can be difficult to avoid.
(I realize Buddhism has religious roots, but there is a long history of exploring the underlying insights in a non-religious context e.g. Zen, and the analytical framework associated with traditions like Dzogchen and Vipassana are applicable without any of the metaphysical underpinnings).
On the toxic part, sorry to hear that. I think anything can be made toxic, independent of the value of the underlying concept (i.e. someone may have a horrible experience with a coach, but that doesn't take away from the value of fitness in general) - but it sounds like you have a pattern that works well for you.
I've heard Judaism characterized as very accepting of discourse and reinterpretation of itself. Does this strike you as accurate? If so, it sounds like a kind of mental liquidity...
> When you accept on faith a handful of principles that deal with an unknowable domain
Sounds like mathematics, in which practitioners become used to both the process of relying on a set of axioms and selecting them for the purposes of exploring or constraining systems, which makes one aware that there's a certain degree of choice or even potentially arbitrariness to it...
For example, the study of the Talmud is an example of both mental training in debating an issue from several perspectives, and the installment of the idea that this is part of the religion.
You can also look up "Jewish responsa" on Wikipedia as a diving point into this.
Perhaps community fits even better.
I personally am free enough to design my own life without boundaries.
I am not trying to persuade you and I am holding back from expounding on what I mean at length here, just sharing the perspective.
Einstein came up with most of what physicists now recognize as the essential features of quantum physics. He was not anti quantum, he just believed randomness could not be a fundamental feature of nature.
One of the big ones had to do with whether the "fields" formulation was valid and primary. One of the issues is that if you follow the fields formulation that Einstein believed in out to its conclusion, you get things like "atomic orbitals never decay".
Which, of course, is obviously wrong. And an example of one of the reasons why Bohr is considered to have won his debates with Einstein.
Except
Einstein was right! We now know that when you isolate an atom, its atomic orbital decay gets slower and slower the more you isolate it.
The problem at the time was that all of the experiments that could be run were statistical aggregations and obscured the nature of single state quantum systems.
Long story. Lots of tears. Get your hanky.
It's served me well, in my technical work.
I now do a lot of stuff that I used to scoff at.
Liquidity implies the frame of fluid dynamics, just like data liquidity.
Bringing physics to science is as useful in the mind as it is in software.
I think from my teens to my early 20s my political stance changed dramatically, and at any one point in time I would think that whatever I held to be true I would continue to in the future. But what always changed my belief system was not encountering some new piece of information that changed my idea or made me "update my priors" (in the crude Bayesian system, a most despicable philosophy of our era). It was always something that radically changed how it was that I understood the world around me, something that made my way of thinking about things shift so dramatically that I had to abandon my old ideas. I think everyone should read Marx, Nietzsche, and Freud for that reason, even if you think they are heinous and evil, because they radically question the logic and order of society and knowledge, and their writings are deeply disturbing to many for that reason.
What changes people's perspectives is generally what people want to avoid (to the author's point). And the more you want to avoid something or "prove it wrong," oftentimes the more it changes the way you think about the world.
> It was always something that radically changed how it was that I understood the world around me, something that made my way of thinking about things shift so dramatically that I had to abandon my old ideas.
If you have examples, I would much appreciate it, although I do not mean to pry.
For the reason I said in that quote: Bayesian updates require upholding the same structure and considering things in networks of pre-established probabilities. This is very useful for, say, sports betting or the weather, but in one's day-to-day life, especially in political circumstances, things often happen that are completely unpredictable, because the socially normative prediction algorithms are always set to reinforce the normative operations of society, making any radical change inconceivable.
>If you have examples, I would much appreciate it, although I do not mean to pry.
I moved towards the Left because it seemed to me that the basis of so-called "reasonable" ideologies like libertarianism and general laissez-faire market conditions always held within them notions of "fairness" and that people would "get what they deserved." It seemed to me that this is language more appropriate for disciplining a child than for ordering our society. In a sense, what was conceived of as "rational" by those who supported the systems of power was revealed to me to be nothing more than ideology which justified that power.