EDIT: To clarify, using the term "erroneous" under the Kuhnian view (disclaimer: my view) is a bit disingenuous, because the criteria under which we are claiming these theories to be wrong did not exist in their historical contexts: on the contrary, some of these theories, such as gravity, happened to be progressive, brilliant, and in some cases extremely useful as a simplifying framework in the next paradigm. All of the historical and technological ingredients, such as the ability to calculate the speed of light accurately, or Maxwell's equations, didn't exist for Newton. Do we call the fact that he missed them an error?
In my opinion, "erroneous" as a term should be reserved for beliefs that were incorrect given existing frameworks: Lamarckian evolution, Einstein's cosmological constant, the postulation of the luminiferous ether, Hilbert's axiomatic program that was disproven in his own time by Gödel... THESE were erroneous.
It would be like a hypothetical future in which a more advanced technology comes along, produces a totally revolutionary and paradigm-shifting scientific framework, and then declares that our views in 2010 were "erroneous". Probably a little extreme.
Fairly sure that's just a progressive step towards Einstein's model.
It is difficult to define "wrong" in a scientific context, since everything is just a model that matches observations. Of course, as more observations come in, models will naturally be disproved. That doesn't mean they were wrong for the entirety of their existence, which I think was your point, but you haven't applied it to the examples you raised.
In the example of the ether, it was believed to permeate all matter and to be the medium for light to travel through. It explained all observations at the time, until the Michelson-Morley experiment settled it.
I agree that "wrong" is difficult to define in these contexts, but remember that the only evidence for the existence of the ether was that 1) light was thought to be a wave, per Maxwell's mathematical formulation, and 2) all known waves had a physical medium. That's it. It was a very ad hoc construction that immediately presented ways to test it. When it was tested by the MM experiment, it turned out to be undetectable at best, thereby exposing its weakness as a theory. Einstein didn't use the theory as a springboard because it was so good and useful; he used it as a springboard because it was so bad.
Basically, some life events lead to inheritable changes. The mechanism is apparently methylation (or other similar changes) to the DNA. Why this is inheritable I do not understand.
One rather relevant example from a recent New Scientist: in rats, overweight fathers had offspring who were more likely to get diabetes.
I get the impression it's early days with all this stuff, but it's going to be interesting.
Most of our epigenetic information is reset when we reproduce. Epigenetic changes play a big role in the differentiation of cells into their different types; since epigenetics changes the expression of genes, it can make cells behave differently. There are several methyltransferase enzymes whose job is to set up methylation patterns in the different types of cells while the embryo is still developing.
Methylation of DNA tends to persist through cell division because there are enzymes which approximately copy over the methylation patterns during DNA replication. If some epigenetic changes happen in sperm or eggs, then some of this may be carried over to offspring. You're right that we're still in the early days of mapping this out and understanding it. Molecular biology is a crazy mess that makes the worst spaghetti code look downright reasonable. That's evolution for you....
DNA methylation is heritable because the DNA strands that are methylated become part of the gametes and thus part of the developing fetus. The DNA is physically modified, and that same modified DNA combines with DNA from the other parent to form the DNA for the child.
This is an oversimplification, but imagine you have a sweater that you dye bright green. You then unravel it and use part of the yarn, along with yarn from another sweater, to knit a new sweater. Some of the new sweater will have the same physical change, the bright green dye, that the old sweater had.
> that same modified DNA combines with DNA from the other parent to form the DNA for the child.
Sure, but only cell#1 has the actual molecules from the parents.
Cell division requires that the DNA be replicated. What surprises me is that the replication step (apparently) preserves methylation. (If it didn't, there would be no observable epigenetic effect in the organism).
i.e. from an information perspective, the bits encoded in the DNA don't just come from the sequence of bases, but also from "out of band" data.
Strongly suggest that you read What is Life by Lynn Margulis and Dorion Sagan.
In it the authors discuss the evolution of the eukaryotic cell: a multi-part cell that evolved due to penetration or symbiosis of multiple organisms. Pretty much all complex life evolved out of eukaryotic cells. In particular the mitochondria that power your cells did not evolve from the same stuff that the rest of your cells did -- they were an addition later on.
Likewise, most of the genetic difference between us and chimpanzees (or, for that matter, blue-green algae) comes from external sources -- viruses and 'parasites'.
For that matter, 90% of the cells in your body are not genetically yours -- they are other microbes (though these account for only 10% of your body mass). If you are purged of these cells, you die. This lack of microbial transfer was one of the reasons suggested for why infant mortality was so high in the years when babies were isolated from their mothers and other human contact upon birth.
I like this one. It almost seems to contradict our current education system:
"Among cognitive psychologists, there is widespread agreement that people learn best when they are actively engaged with a topic, have to actively problem solve, as we would put it 'construct meaning.' Yet, among individuals young and old, all over the world, there is a view that is incredibly difficult to dislodge. To wit: Education involves a transmission of knowledge/information from someone who is bigger and older (often called 'the sage on the stage') to someone who is shorter, younger, and lacks that knowledge/information. No matter how many constructivist examples and arguments are marshaled, this view — which I consider a misconception — bounces back. And it seems to be held equally by young and old, by individuals who succeeded in school as well as by individuals who failed miserably.
Now this is not a scientific misconception in the sense of flat earth or six days of creation, but it is an example of a conception that is extraordinarily robust, even though almost no one who has studied cognition seriously believes it holds water.
Let me take this opportunity to express my appreciation for your many contributions to our current thinking."
That one sounds rather argumentative. He's taking conventional wisdom about learning (active engagement, problem solving) that is accepted dogma among practicing teachers (at least in U.S. primary and secondary schools) and trying to paint it as a minority view. Without hearing some evidence for the robustness of the "transmission" idea, I'm tempted to dismiss this as a rhetorical trick. I suspect there are some popular classroom practices he disagrees with, and he's trying to stigmatize them by linking them to an old, discredited idea.
U.S. primary and secondary schools do not accept the constructivist school as dogma. Perhaps you have simply misunderstood the differences he was stressing.
For more information on this, you should probably look at what Seymour Papert has been doing in Maine lately.
What he said was "people learn best when they are actively engaged with a topic, have to actively problem solve, as we would put it 'construct meaning.'" That isn't controversial at all, which is why I suspect it's a rhetorical trick. Scratch the surface and ask him why he thinks teachers really don't believe in it (which they all claim to do) and I'm guessing he'll say they don't believe it because they don't accept some other ideas (the entire dogma of the constructivist school, perhaps?) which he thinks are consequences of believing in engaged, active learning.
In other words, he's saying Y follows from X, so if you don't accept Y then you don't believe X. It's the same rhetorical device as saying, "Nobody cares about civil liberties anymore [because George Bush got re-elected]," or "I guess nobody else here loves his country [because I'm the only one who went to the Tea Party rally last weekend]."
It's always easier to stick up for a vague but popular concept (such as liberty, love of country, or active, engaged learning) than to argue in favor of the controversial concrete policies you think follow from it.
Anecdotally, I'd concur with the author. Most of my education seemed irrelevant. I learned to program and understand math because I wanted to make computer games.
My maths is pretty rusty but... I think describing Euclidean Geometry as wrong is a bit harsh - it's just the system you get when you treat the Parallel Postulate as axiomatic.
If you allow variation in this area you get Elliptical/Hyperbolic geometries - but these extend Euclidean Geometry, they don't invalidate it.
"Wrong" might overstate it, but certainly, for a long time Euclidean Geometry was thought to be a statement about an absolute truth that existed in the Universe. There could be no other geometry. Recall the shock and dismay with which the educated public first heard about hyperbolic geometry. If math is always true, they wondered, then how could another geometry exist alongside Euclidean Geometry? It was as if reality had been disproven. It took a long time for the educated public to come round to the view that we could have multiple geometries, each true for certain given assumptions.
That is, of course, correct - but surely that is more how Euclidean Geometry was interpreted rather than there actually being an internal problem with Euclidean Geometry.
Wouldn't this be a case of Newtonian mechanics as well? There's nothing "wrong" with the math, other than it doesn't fully explain reality like we initially thought it did, and so the theory is wrong. A lot less wrong than anything we had before, but still wrong.
But don't all theories of the physical world come with caveats and assumptions attached? In the case of Newtonian mechanics it is clear that there are cases (the very small, the very fast) where it doesn't apply - but that doesn't mean it isn't useful in more normal contexts.
I seem to recall (I could well be wrong) that the calculations for the Voyager space probes are all done using Newtonian mechanics. For something to be wrong implies to me that it is of no utility -- which clearly isn't the case with Newton's work.
Exactly, which is why I was wondering why you'd hesitate to call Euclidean geometry wrong when, asked to explain reality, it fails to do so. But it seems we're operating under quite different definitions of wrong. (Whose utility are you basing your definition on? For me there are plenty of obviously wrong things that nevertheless can be useful for me or someone else.) I lump myself in the Asimov crowd of "wrongness", which is linked on this page. (Hey, a flat-earth theory works in limited cases too.)
I think it's the term "wrong" that I object to - Newtonian mechanics is still perfectly "correct" for a large number of circumstances. Just because a more refined, and rather more complex, theory comes along a few centuries later doesn't invalidate the fact that if you stay within the boundaries where it is known to be perfectly valid, it is the best thing to use.
When you model some physical system you do it with an end in mind, and a lot of assumptions have to be made (e.g. that gas flow is non-turbulent, or that heat flow is uniform) which are all wrong in some sense, because reality isn't like that - but if you didn't make these simplifying assumptions, you'd never get anything done.
Mathematical models aren't reality - they are an attempt to create structures that allow us to explain and predict and the assumptions underlying the mappings between these structures and reality are just as important as the contents of the theories themselves. It's possible to model the same system in umpteen different ways - indeed as there is no all encompassing theory-of-absolutely-everything you have to make some important decisions when you model something as to what is relevant. Trying to model plate tectonics in terms of string theory is unlikely to get you very far - that doesn't make string theory "wrong".
The idea that medieval people believed in a flat Earth is not true. The shape and approximate size of the Earth had been known since ancient times (3rd century BC).
Columbus was not turned down at first because people believed he'd fall off the edge of the Earth; they knew he could not have possibly reached Asia, the distance was too great. It was only pure luck (finding America) that saved him. A venture capitalist in those times would have been rational not to buy into the expedition.
As for geocentrism, that has some merits. But note that Ptolemy and his followers were really hindered by a lack of good astronomical data. Using the methods of those times, an epicyclical model of the planets was pretty accurate and could have been refined further. The model of Copernicus was deemed superior because it needed fewer epicycles to make the math right. The breakthrough came only after Kepler got Tycho Brahe's very accurate observations (taken over 20 years); once he had enough datapoints, he could come up with his famous laws.
And so, let's not blame the Ancients right away. They were not as ignorant as we may believe.
Reasonable people thought that the ocean (without America, there would be no separate Atlantic and Pacific oceans) was considerably smaller than it really is, not least due to the impossibility of reliably measuring longitude, so they vastly overestimated the east-west extent of Eurasia. Just look at the globe Behaim made in 1491-1493 (based on data by Pope Sixtus IV), which shows a really small ocean: http://en.wikipedia.org/wiki/Martin_Behaim#The_Erdapfel
According to the best contemporary data, the investment would have been highly speculative (as all voyages across the sea), but not insanely so.
Maybe the flat earth theory was invoked as a polite way of telling Columbus no. In the same way that many investors never say no, they often say 'not right now'. The story must have had a kernel of truth to grow from.
The belief that Earth was flat only existed up until around eight thousand years ago, before the institution of anything you could call "science," so I don't even think that one should count.
I was really disappointed to see that sort of thing mentioned; I had been hoping for better, more wild-goose-chase-y examples where the wrong hypothesis was compelling and consistent with equations that accurately modeled reality - things like the luminiferous ether, which yielded the accurate Lorentz equations, and the caloric theory of heat which yielded many accurate equations and experimental results (indeed, its fluid model of heat is so parallel to reality that you probably recall your mechanical engineer friends at school taking "thermals and fluids" classes). The Earth being flat was debunked the first time someone saw a boat come in to pier and is thoroughly uninteresting.
Well, what about the notion of a fixed earth around which other things rotate? Granted, this gets fuzzy about how wrong it is when you consider that there's no absolute reference frame, but if you read history, you'll find out that the reason they couldn't believe that the earth was moving was that they could not measure the stellar parallax. In other words, they knew that the stars should appear to shift a little if the earth was moving. Yes, they do in fact shift, but they're so many light years away that it took a long time before anyone came up with anything that could measure it.
>Well, what about the notion of a fixed earth around which other things rotate?
As you, and relativity, say - it's not wrong to choose an arbitrary fixed point.
>it took a long time before anyone came up with anything that could measure [stellar parallax]
Indeed, the Pythagoreans assumed terrestrial motion (according to Copernicus) just as others assumed the opposite. Neither assumption is bad, and neither is worse, given that there was no obtainable evidence to support either position at the time.
I think you're wrong here. It's true that it's a myth that scholars in Columbus's time thought the earth was flat, but the sphericity of the earth was certainly not known worldwide for the past eight thousand years, nor was it even common knowledge among non-scholars in Columbus's time (as evidenced by the fact that artistic depictions of a flat earth were common).
Wikipedia:
> The Flat Earth model is a view that the Earth's shape is a flat plane or disk. Most pre-modern cultures have had conceptions of a flat Earth, including ancient Greece until the classical period, the Bronze Age and Iron Age civilizations of the Ancient Near East until the Hellenistic period, Ancient India until the Gupta period (early centuries AD) and China until the 17th century. It was also typically held in the cultures of the New World until the time of European contact, and a flat Earth domed by the firmament in the shape of an inverted bowl is common in pre-scientific societies.[1]
> The paradigm of a spherical Earth was developed in ancient Greek astronomy, beginning with Pythagoras (6th century BC), although most Pre-Socratics retained the flat Earth model. Aristotle accepted the spherical shape of the Earth on empirical grounds around 330 BC, and knowledge of the spherical Earth gradually began to spread beyond the Hellenistic world from then on.[2][3][4][5]
> The misconception that educated people at the time of Columbus believed in a flat Earth has been referred to as "The Myth of the Flat Earth".[6] In 1945, it was listed by the Historical Association (of Britain) as the second of 20 in a pamphlet on common errors in history.
>The belief that Earth was flat only existed up until around eight thousand years ago, [...]
Can you give your evidence for this assertion?
I think this is a modern interpretation, an attempt to demonstrate that we are somehow advanced beings now. IMO the person of 8000 years ago was probably as capable as the modern man; given the same nutrition and the same access to reference works and technology, I don't think we would generally outshine them. Quite possibly the opposite.
I don't know how "force of gravity" is a wrong belief but "the only force you are actually feeling is the upward force exerted by your own muscles in order to keep your arm accelerating continuously away from a straight path in spacetime" is correct. Can anyone explain?
EDIT: my mind is totally blown by this; it put together some pieces of general relativity in a new way for me today. thanks HN!
The "force" of gravity is not a fundamental concept in general relativity.
Here's a way of thinking about what's going on.
Consider a large mass, like the Earth. That mass curves spacetime in the vicinity of the Earth. A small object, like a satellite, simply moves completely freely in a "straight line" (i.e., geodesic path) according to that curved geometry. The curved geometry is such that those geodesics are just the satellite orbits that we see. In other words, the satellite isn't affected by any "force", it's just moving in a straight line in a geometry that happens to be curved, so we see it doing circles around the Earth.
It's exceedingly neat: in John Wheeler's great phrase, matter tells spacetime how to curve, and spacetime tells matter how to move. No forces required! Of course, after the fact you can tack on a notion of "force", but it's in no way fundamental.
Same thing is going on with projectile motion here on Earth.
Once you internalize this point of view, the statement about holding your arm in place becomes a lot clearer. The "natural" force-free thing your arm wants to do is to move along geodesics of spacetime, which means falling toward the Earth at an acceleration of g. But if we exert a (muscular) force to keep it up, then we can hold it in place. In short, in this point of view, forces are things which cause deviations from geodesic motion.
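For anyone who wants the statement above in symbols, here's the standard textbook form (a sketch, not anything specific to this thread): free fall is motion along a geodesic, and a non-gravitational force shows up as the deviation from it.

```latex
% Free fall: no force, motion along a geodesic of spacetime
\frac{d^2 x^\mu}{d\tau^2}
  + \Gamma^\mu{}_{\alpha\beta}\,
    \frac{dx^\alpha}{d\tau}\frac{dx^\beta}{d\tau} = 0

% Held in place by a non-gravitational force F^\mu (e.g. your muscles):
\frac{d^2 x^\mu}{d\tau^2}
  + \Gamma^\mu{}_{\alpha\beta}\,
    \frac{dx^\alpha}{d\tau}\frac{dx^\beta}{d\tau} = \frac{F^\mu}{m}
```

The Christoffel symbols \Gamma^\mu{}_{\alpha\beta} encode the curvature ("matter tells spacetime how to curve"); set F^\mu = 0 and you recover geodesic motion, i.e. free fall.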
What's always bothered me about these kinds of animations is that there is a grid of straight lines, but none of the objects actually follow these grid lines. Instead, the geodesics they follow are determined by the geometry of the deformed surface, invoking the viewer's intuitive understanding of gravity... in order to understand the unintuitive formulation of it.
Actually, the math is wonderfully simple, deriving a great deal of benefit from tensor theory and the power of coordinate-invariant arguments. It's a non-trivial concept in full, the shape of space and time, but considering the wild opportunity for it to be complex, the formulae are exceedingly simple.
He is saying this because, in general relativity, gravitation is no longer explained as a force between two masses, but as their "warping" of their surrounding space(time).
This is true, but it doesn't invalidate the description of gravity as being an attractive force between masses. The warping of spacetime is just an explanation of what's happening to create that force.
There are no forces involved in Einstein's gravitation. You can see it as a pseudo-force (like Coriolis or centrifugal) if you like, for practical (and mental sanity) reasons. But it's not a true force.
I never really understood this part. General relativity gives us one explanation of gravity (mass curves space-time), while quantum theory tells us that gravity arises from the exchange of particles. So... how do we reconcile? I imagine if we ever discover a unified theory, it'll manage to deal with that. But how does say string theory (for lack of other examples) reconcile these two views?
...or is it something as simple as the exchange of particles causes the warping of space >.>
> quantum theory tells us that gravity arises from the exchange of particles
No, it doesn't. That statement is wrong on two levels:
1) We don't have a confirmed quantum theory of gravity
2) In quantum field theories in general it's not really accurate to say that "[forces] arise from the exchange of particles". In some QFTs, in some situations, you can perform a perturbative expansion which maps onto the Feynman diagram view of particle exchange, but that's not universally possible. They really are quantum _field_ theories, not quantum particle theories.
vibragiel is basically right, but I'll just comment here that if you think hard about what you are arguing, it is semantics. A physicist will define a force as an object's deviation from a geodesic (straight line motion in the relevant geometry). In this case, gravity is not a force, as objects attracted by gravity are indeed following geodesics. On the other hand, if you define a force as an object's deviation caused by a second object from the path that would be taken were the second object not there, then gravity is indeed still a force. In this case, the second object is changing the geometry itself and, hence, the associated geodesic.
It is semantics, and all comes down to what's a 'force' and what isn't in theoretical physics. I was arguing from the second definition.
The first definition is tautological: First, you define a curved space, accounting for gravitational forces such that a small object will follow a certain geodesic. Then, you say that since it does indeed follow said geodesic in that geometry without deviation, gravity is not a force. Well of course not!
With the same hand-wavy argument, you could define the curvature of space based on electromagnetic field strength, for example, and argue that electromagnetism is not a force.
>With the same hand-wavy argument, you could define the curvature of space based on electromagnetic field strength, for example, and argue that electromagnetism is not a force.
No. You're very wrong here. You (irritatingly) seem to think that because you have only been exposed to popularized hand-wavy arguments, physicists necessarily rely on them.
The vast majority of possible forces (including Newtonian gravity and electromagnetism) can not be explained the way gravity is explained. Your confusion probably comes from not understanding how much smaller the space of possible geometries is than the space of possible forces.
In fact, the stunningly beautiful thing about gravity is that it can be derived from knowing only that it is a geometric effect and obeys a few symmetries. Even if you relax some of these symmetries (e.g. Brans-Dicke theory) the resulting parameterizable space of possible geometries is very restricted.
> With the same hand-wavy argument, you could define the curvature of space based on electromagnetic field strength, for example, and argue that electromagnetism is not a force.
Not really so much, since different objects couple to electromagnetism differently based on their charge. All objects follow the same geodesics in general relativity.
>All objects follow the same geodesics in general relativity.
As far as we know .. I'm trying to imagine particles that do not [Higgs-teflon engage] I'm assuming they would be extra-dimensional (by definition when considering geodesics).
If you decouple particles from the Higgs field, where do they go?
Sorry, schoolboy error - I was thinking of the relativistic mass as if it were interacting with the Higgs field. TBH I'm still not clear that it couldn't.
It's the same way that there's not really such a thing as centrifugal force. It's a fictitious force that you perceive as a result of being in an accelerating frame of reference.
If you are standing on the ground, the only actual force is the normal force of the ground acting on your feet to stop you from falling toward the center of the Earth. It feels like you are pushing down on the Earth, but that's technically a pseudo-force.
Another way of looking at it is to imagine that you are standing in an elevator in space that is accelerating at 1 g. That scenario will feel exactly the same as if the elevator were standing still on the surface of the Earth, but it's clearer that the elevator is pushing you, and not the other way around.
From the article: "I know a union that got a substantial pay raise because a politician did not understand that adding and then subtracting 20% gets you to another result from the one you started."
I've been trying to work this one through. One possible solution (using a hefty dose of mathematical illiteracy) that looks like one of those word problems that occasionally appear in Marilyn vos Savant's column:
True. 20% of y is larger than 20% of x (which is what was added to x to obtain y) because y is a larger number. Therefore subtracting 20% of y from y will give you some value smaller than x. Eg: 100 + 20% = 120; 120 - 20% = 96.
I don't know whether the error you introduced is intentional or not, but 120 * 0.8 = 96, not 106.
Another way of thinking about this is that 1.2 * 0.8 = 0.96, so it is a four percent pay cut. (This is true for any situation like this and can be remembered by the difference of two squares formula from your early algebra class -- (1 + r)(1-r) = 1 - r^2, where r is the fraction raise/pay cut.)
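The arithmetic in this subthread is easy to sanity-check; a quick sketch (the starting salary of 100 is just for illustration):

```python
# A 20% raise followed by a 20% cut is a net 4% cut,
# because (1 + r) * (1 - r) = 1 - r**2.
r = 0.20                            # 20% raise, then 20% cut

salary = 100.0
after_raise = salary * (1 + r)      # 120.0
after_cut = after_raise * (1 - r)   # 96.0, not back to 100

net_factor = (1 + r) * (1 - r)      # 0.96, i.e. a 4% pay cut overall
```

The asymmetry is exactly the difference-of-two-squares identity mentioned above: the cut is taken from the larger, post-raise number.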
I think that one of the interesting threads between contributors to this is the general expression of empathy and consideration for those who espoused or worked towards ideas that were later proven mistaken. An example is Charles Simonyi:
I think we are all too fast to label old theories "wrong", and with this we weaken the science of today — people say — with some justification from the facts as given to them — that since the old "right" is now "wrong", the "right" of today might also be tainted. I do not believe this — today's "right" is just fine, because yesterday's "wrong" was also much more nuanced, "more right" than we are often led to believe.
One belief that wasn't mentioned is the idea of phlogiston: that flammable material contains a substance called phlogiston, and that the escaping of that substance is equivalent to burning. This belief was held for around a hundred years, from the end of the seventeenth century until the end of the eighteenth century.
The debate as to whether the Earth or sun is the center of the universe was silly. It's all about frames of reference and simplicity of calculations in predicting observations. See "The Grand Design" by Stephen Hawking and Leonard Mlodinow.
People prefer a simplified view of history where people were dumb and ignorant, even if it's often proven wrong. The reality is that reasonably correct interpretations / ideas are often replaced by misinformation for long periods of time. The most obvious example: how do you lose a city? But even something as simple as crop rotation has been found, then lost, then found again.
For a more popular example, two thousand years ago many people thought the earth was round based on a wide range of evidence. They even calculated the size of the earth with a fair amount of precision. A more striking example: a scurvy cure was found, then lost, and then found again over hundreds of years. http://idlewords.com/2010/03/scott_and_scurvy.htm
Might that not be because people were illiterate and any permanent record was very rare indeed?
Perhaps people were indeed dumb and ignorant. If no one had told you all of these things since birth, would you have any idea that the earth goes around the sun? Supposing you are a knight, a farmer, a priest, a whatever, rather than someone dedicating yourself to the study of the stars, and no one ever told you that the earth goes around the sun, or even moves, would you ever find out or know at all? And would you not believe the priest, or whoever seems to know about the subject?
It is not that people are dumb. They just do not know. A doctor can tell me whatever he likes that sounds reasonable, and as I would have no other option, I would believe him.
The most educated have learned quite a bit, but once you start talking about average populations, you find that a higher percentage of people in the US believed in evolution 100 years ago than do today.
And when you start talking about actual numbers, you find that there are over a billion people living in the world today who can't read. Heck, due to population growth there are probably more people in the world today who believe the world is flat than there were in 300 BC, and there are pictures of the thing.
Many (but not all) scientists assumed the far side of the moon would turn out to look much the same as the side we are familiar with. [...] And I argued with Hornig [Donald Hornig, Chairman of the President's Science Advisory Committee] about it and he said, 'Why? It looks just like this side.' And it turned out it didn't."
I did not know that. How significant is the difference?
> Among cognitive psychologists, there is widespread agreement that people learn best when they are actively engaged with a topic, have to actively problem solve, as we would put it 'construct meaning.' Yet, among individuals young and old, all over the world, there is a view that is incredibly difficult to dislodge. To wit: Education involves a transmission of knowledge/information from someone who is bigger and older (often called 'the sage on the stage') to someone who is shorter, younger, and lacks that knowledge/information. No matter how many constructivist examples and arguments are marshaled, this view — which I consider a misconception — bounces back. And it seems to be held equally by young and old, by individuals who succeeded in school as well as by individuals who failed miserably.
Some of my favorite "wrong" or erroneous 'beliefs' (apart from the Flat Earth and geocentric beliefs):
-- Luminiferous ether: that light propagates via the "ether" medium
-- "Bad air" theory of infectious diseases (which is still held in many countries)
-- "Stress causes ulcers"
-- Leprosy is contagious
-- The belief that the first five (or two, or three) years of a child's life determine his/her personality in adulthood. This one is still widely held and accepted.
Oh, and just for kicks: 'Intelligent Design' and its variants. :-)
And just to throw it out: what will it take to convince people that Global Warming / Climate Change is real? This is perhaps one of the least-understood and most controversial beliefs of our times.
Dirichlet used the ~4000 B.C. date for the creation of the Earth in one of his examples of statistical inference. Not sure whether that counts as believing it, though.
I'm shocked that none of the experts mentioned any of the widely-held Eastern "scientific beliefs" in dietary supplements, herbal remedies, or reflexology.
These are still commonly practiced, yet have been scientifically shown to be no more effective than a placebo.
Many herbs are pharmacologically active, to the point of being dangerous if taken in conjunction with OTC or prescription medication. Rhino horn and reflexology do seem to belong in the "bogus belief" category.
Exactly! Peyote, aspirin, cannabis, et cetera are real drugs. Most herbs are not active. Some herbs can even worsen the condition they are traditionally used to treat.
Reading through the list shows just what a high standard any current theory must be held to, given the number of past failures in what are fundamental understandings of the world we inhabit.
I don't know about ridiculed, but once quantum gravity gets sorted out there is a good chance that either relativity or quantum mechanics is going to be "wrong" in the Newtonian sense of "inaccurate at X scale."
I think Edge picked a very bad word. Isn't "scientific belief" a silly oxymoron?
At least, many of those "beliefs" weren't actually beliefs, but scientific theories that worked very well but were later improved. It's the case of Newtonian gravitational force.
What point am I missing? Clearly, the definition of "scientific belief" used in this context is something like "belief held by people in the scientific community". The central point here seems to be that, while "belief" may have a connotation of something that is lacking evidence, it's in fact a far more general term that applies even to genuine knowledge.
>the definition of "scientific belief" used in this context is something like "belief held by people in the scientific community"
I hope not. The definition of "scientific belief" is belief in something supported by the scientific method. Put another way: formally and rigorously established based on agreed axioms.
Being a scientist doesn't make your belief that you're Imhotep reincarnated a scientific belief.
>The definition of "scientific belief" is belief in something supported by a scientific method.
How many examples on the OP did you read? How many of those examples were supported by a scientific method? (Probably some, but not all.) I think my definition better suits the context, quibbling aside.
>you either have "knowledge" or you have "false beliefs" based on best available science.
Personally I think it would be truly naive to imagine that we now have what you call "knowledge" as opposed to having in science the best (based on consensus) available description of the universe. If you wish to call the standard model "false belief" then I can go with that but it seems a bit overly fussy.
We should understand that we take our axioms and build on them and measure against them but that we need to adjust those axioms as evidence comes to light.
Axioms, many scientists fail to realise, are beliefs without scientific justification. Not only is physics built on them, but the mathematics we use to build our physics and the logic we use to support our mathematics are built on them too. What is more, Gödel shows us that we can't prove that logic to be complete and consistent from within.
Yes Pyrrho is my hero but I think Carneades went a bit far.
Doesn't applying "no false premises" (à la Nozick) remove the so-called Gettier problem entirely, though, at least from an epistemological viewpoint? It seems so, based on my hour or so of reading just now.
Thanks for that. I always thought that definition was somehow just very, very wrong, but it kept getting cited by people who didn't really care what the definition was.
I still think it's a very bad word to pick. "Belief" is widely accepted to mean holding a proposition regardless of evidence, which isn't something you could call "scientific".
EDIT: To clarify, using the term "erroneous" under the Kuhnian view (disclaimer: my view) is a bit disingenuous, because the criteria under which we are claiming these theories to be wrong did not exist in their historical contexts. On the contrary, some of these theories, such as gravity, happened to be progressive, brilliant, and in some cases extremely useful as a simplifying framework in the next paradigm. All of the historical and technological ingredients, such as the ability to calculate the speed of light accurately, or Maxwell's equations, didn't exist for Newton. Do we count what he missed as an error?
In my opinion, "erroneous" as a term should be reserved for beliefs that were incorrect given existing frameworks: Lamarckian evolution, Einstein's cosmological constant, the postulation of the luminiferous ether, Hilbert's axiomatic program that was disproven in his own time by Gödel... THESE were erroneous.
It would be like a future in which (hypothetically) a more advanced technology comes along, computes a totally revolutionary and paradigm-shifting scientific framework, and declares our views of 2010 "erroneous". Probably a little extreme.