Spotting bad science & Critical Thinking

* give handout
* go through the list below
* ask them to think about this on their own and come up with an example of
  something they have seen or read that has one or more of the elements
  below [think] - fine to spend a few minutes online
* have them pair up in groups to share their thoughts and come up with a
  coherent combination of their ideas [pair] - also fine to spend some time online
* share their findings with the rest of the class briefly [share] - how does
  this affect you? consequences?
* follow up: critical thinking cheat sheet
* then: logical fallacies

If short on time, skip the critical thinking worksheet and do logical fallacies.

************************
* Spotting Bad Science *
************************

All of this applies to all sciences, including physics, but the easy examples
are largely from medicine and health, since those areas are more broadly familiar.

1. Sensationalized headlines
   - need for clicks - not new, but amplified
   - oversimplifying - always trying to reduce cognitive load
   - Feynman quote: "if I could explain it in 5 minutes, it wouldn't be worth a Nobel Prize"
   - drawing conclusions beyond the scope of the work
     - widely cited study on the benefits of champagne had no human participants
     - "driving while dehydrated is as dangerous as drunk driving" - funded by Coke
     - physics: "string theory may solve X" [it has not]

2. Misinterpreted results
   - distorting or misrepresenting actual research - try to read the original
     source; at least the abstract and conclusions are usually readable
   - perhaps the writer didn't understand the science, perhaps they were
     misled, perhaps it was deliberate

3. Conflicts of interest
   - well known that Exxon funded climate change denial
   - companies do fund legitimate research, but it should add a bit of skepticism
   - Andrew Wakefield - antivax doctor - had ties to a group suing vaccine makers
     - the paper claimed no causal relationship, but the press conference was another story

4. Correlation & causation
   - particularly diet & health - _correlation_ of variables can be an accident!
   - example of a 'false dilemma': assuming the only options are "correlated
     _causally_" or "not correlated at all" - it can be coincidence!

5. Unsupported conclusions
   - champagne example - may not work in humans! the conclusions do not follow
     from the study
   - lab from Wednesday: a straight-line graph of T vs L is _not_ correct in
     general! only over the limited range you studied
   - should be T^2 ~ L: double the length and the period only goes up by ~40%

6. Problems with sample size
   - statistics ... next week. did you study enough examples to rule out coincidences?
   - Coke/dehydration example - only 11 volunteers! the odds of a coincidence are high
   - NFL season vs MLB season (and _still_ it comes down to the last days)
   - birthday problem - only 23 people needed for a 50% chance that two share a birthday
     - better than 50-50 chance two of _us_ have the same birthday!
   **** [circulate a sheet, write day & month, just for fun]

7. Unrepresentative samples used
   - e.g., human trials not reflecting the larger population
   - a huge and well-documented issue for women in general and for people of color
   - no, really:
     https://endpoints.elysiumhealth.com/why-women-are-underrepresented-in-clinical-trials-398c9e0735a
     https://www.scientificamerican.com/article/clinical-trials-have-far-too-little-racial-and-ethnic-diversity/

8. No control group used
   - e.g., drug trial - there has to be a group that _doesn't_ take it
   - physics: lab last time - controlling for all other variables as much as
     possible! starting angle, twist/rotation, etc.
   - do everything the same _except one thing_

9. No blind testing used
   - placebo effect - neither researchers nor patients should know! both can bias
   - physics: only look at the data when collection is done

10. Selective reporting of data
    - "cherry picking" - just dishonest. using only the data that supports your conclusions
    - alarmingly common in everyday life - remember what reinforces, forget what doesn't
    - complicated experiment: we _hate_ data that disagrees, but it may be even
      more interesting once you understand it!

11. Irreproducible results
    - has someone else shown this independently?
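The two numerical claims above - the pendulum scaling in item 5 and the
birthday coincidence in item 6 - can both be checked in a few lines (a sketch,
with no inputs beyond the notes):

```python
import math

# Pendulum: T = 2*pi*sqrt(L/g), so T^2 is proportional to L.
# Doubling L multiplies T by sqrt(2) -- about a 41% increase, not 2x.
factor = math.sqrt(2)
print(f"doubling L raises T by {100 * (factor - 1):.0f}%")

# Birthday problem: chance that at least two of n people share a birthday.
def shared_birthday(n, days=365):
    p_all_distinct = 1.0
    for k in range(n):
        p_all_distinct *= (days - k) / days
    return 1 - p_all_distinct

print(f"n=23: P(shared birthday) = {shared_birthday(23):.3f}")  # ~0.507
```

So 23 people is indeed the smallest group where a shared birthday is more
likely than not - at 22 the probability is still just under 50%.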
    - huge in health/diet, but common to all sciences
    - big problem: it isn't splashy to reproduce a result; everyone wants to be
      first - tons of research has not been reproduced and may not be right!
    - bigger problem: no one reports failures or what didn't work. wasted time & information

12. Non-peer-reviewed material
    - need honest and anonymous critique to find flaws - we all have blind spots

*********************
* Critical Thinking *
*********************

As above: go over the list below briefly, ask them to think about it briefly,
then discuss with neighbors. Can you think of situations where this would have
helped recently?

WHO:   who benefits? who is harmed?
WHAT:  what are the strengths/weaknesses? what is another perspective? what is
       another alternative? what would be a counter-argument?
WHERE: where would we see this in the real world? where are there similar
       concepts/situations? where would this be a problem?
WHEN:  when are there benefits? when are there problems? when does action need
       to be taken?
WHY:   why is this a problem? why is this relevant to me/others? why is it this
       way? why are people influenced by this?
HOW:   how is this similar to ...? how does this disrupt things? how does this
       benefit us? how does this harm us?

*********************
* Logical Fallacies *
*********************

Tu quoque, "appeal to hypocrisy", or "whataboutism":
   Person A makes claim X.
   Person B asserts that A's actions or past claims are inconsistent with the truth of X.
   Therefore, X is false.
i.e., "the opposition has done bad things, so how can they be right?"
   - charge them with hypocrisy rather than addressing their argument
   - a form of false equivalence
   - this is what "but her emails" was - whatever your opinion, it was a
     successful attempt to avoid one argument in favor of another

False equivalence:
   "The Deepwater Horizon oil spill is no different from your neighbor dripping
   some oil on the ground when changing the oil in his car."
   - but you do not have billions of neighbors changing oil; that is the comparison
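To put a rough number on the scale gap in the oil example (the figures here
are ballpark assumptions for illustration, not from the notes: the commonly
cited ~4.9 million barrels for the spill, and maybe a cup of oil for a
careless home oil change):

```python
import math

# Rough, assumed figures -- only the order of magnitude matters here.
spill_gal = 4.9e6 * 42   # ~4.9 million barrels spilled, 42 gallons per barrel
drip_gal = 0.06          # a dripped oil change: roughly a cup of oil

ratio = spill_gal / drip_gal
print(f"spill / drip ~ 10^{math.log10(ratio):.1f}")
```

Even with generous numbers for the neighbor, the two events differ by
roughly nine to ten orders of magnitude.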
   - the two differ by _many_ orders of magnitude
   - negative news: if everything is negative, who's to say this person is any
     worse? "all is bad, degree doesn't matter"
   - "criminals don't obey laws, so more laws won't help" - so ... we're just
     doing The Purge all the time, then?

Wronger than wrong: equating two errors when one is clearly more wrong than the other
   - saying the earth is flat is really, really wrong and easily disproven
   - saying the earth is not perfectly spherical is a _subtlety_
   - related: false _balance_, the "need to look at both sides"
     - there are not always two morally or factually equivalent sides
     - this is why you do not see creationist-vs-scientist debates very often

Common themes:
   - wanting everything to be binary, with both choices equivalent. tribalism.
     - physics: 'proven' or 'disproven' isn't a thing; it is a matter of degrees
   - wanting everything to be simple and lacking nuance. reduced cognitive load.
     - reporting physics: trying to appeal to the layperson gives the wrong idea
     - where are our flying cars? don't promise the moon if you can't deliver -
       but the public demands the moon. so what do you do?

Relating to science (everyday applications are obvious):
   - "whataboutism": Newtonian gravity does a good job of predicting planetary
     orbits, etc., but it doesn't contain Einstein's theory of relativity;
     therefore, it is invalid.
     - problem: the theories are _complementary_ - Newtonian gravity is a
       limiting case of Einstein's
     - its predictions are valid _within its domain_; Einstein's just does better
     - also: basically any recommendation about food
   - false equivalence: nearly any case of "doing X is as bad as drunk driving"
   - wronger than wrong: the flat earth example
   - binary choices: eating X is either good or bad for you [no, we are complicated]
     - red wine is shown to have some positives. are you shocked it also has negatives?
   - lacking nuance: basically all science reporting you read
     - nuance does not generate clicks. in any discipline. we are lazy.