{{Short description|State of prolonged public dispute or debate}}
{{Other uses}}
'''Controversy''' is a state of prolonged public dispute or debate, usually concerning a matter of conflicting [[opinion]] or point of view. The word was coined from the [[Latin]] ''controversia'', as a composite of ''controversus'' – "turned in an opposite direction".

==Legal==
In the [[jurisprudence|theory of law]], a controversy differs from a [[legal case]]; while legal cases include all suits, [[Criminal law|criminal]] as well as [[civil law (common law)|civil]], a controversy is a purely civil proceeding. For example, the [[Case or Controversy Clause]] of [[Article Three of the United States Constitution]] ([[Article Three of the United States Constitution#Section 2: Judicial power, jurisdiction, and trial by jury|Section 2]], Clause 1) states that "the judicial Power shall extend ... to Controversies to which the United States shall be a Party". This clause has been deemed to impose a requirement that United States federal courts are not permitted to hear cases that do not pose an actual controversy—that is, an actual dispute between adverse parties which is capable of being resolved by the [[court]]. In addition to setting out the scope of the jurisdiction of the federal judiciary, it also prohibits courts from issuing [[advisory opinion]]s, or from hearing cases that are either [[ripeness|unripe]], meaning that the controversy has not arisen yet, or [[mootness|moot]], meaning that the controversy has already been resolved.

==Benford's law==
{{Main|Gregory Benford#Benford's law of controversy|l1=Benford's law of controversy}}
[[Gregory Benford#Benford's law of controversy|Benford's law of controversy]], as expressed by the astrophysicist and science fiction author [[Gregory Benford]] in 1980, states: ''[[Passion (emotion)|Passion]] is [[Proportionality (mathematics)#Inverse proportionality|inversely proportional]] to the amount of real [[information]] available.''<ref>{{cite web |url=https://www.eff.org/Misc/EFF/?f=quotes.eff.txt |title=EFF Quotes Collection 19.6 |publisher=[[Electronic Frontier Foundation]] |date=2001-04-09 |access-date=2016-12-04 |archive-date=2007-09-29 |archive-url=https://web.archive.org/web/20070929083639/http://www.eff.org/Misc/EFF/?f=quotes.eff.txt |url-status=dead }}</ref><ref>{{cite web|archive-url=https://web.archive.org/web/20080822143815/http://www.sysprog.net/quotlaws.html |archive-date=2008-08-22|url=http://www.sysprog.net/quotlaws.html|title=Quotations: Computer Laws |work=SysProg |access-date=2007-03-10}}</ref> In other words, it claims that the less factual information is available on a topic, the more controversy can arise around that topic – and the more facts are available, the less controversy can arise. Thus, for example, controversies in physics would be limited to subject areas where experiments cannot be carried out yet, whereas controversies would be inherent to politics, where communities must frequently decide on courses of action based on insufficient information.
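Stated symbolically (an illustrative formalization only; Benford expressed the law in words rather than as an equation), the claim is that the passion <math>P</math> surrounding a topic varies inversely with the amount of real information <math>I</math> available about it:
<math display="block">P \propto \frac{1}{I}</math>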
==Psychological bases==
Controversies are frequently thought to be a result of a lack of confidence on the part of the disputants – as implied by [[Gregory Benford#Benford's law of controversy|Benford's law of controversy]], which only talks about lack of information ("passion is inversely proportional to the amount of real information available"). For example, in analyses of the political controversy over [[anthropogenic climate change]], which is exceptionally virulent in the [[United States]], it has been proposed that those who are opposed to the scientific consensus do so because they do not have enough information about the topic.<ref>{{Cite journal| volume = 9| issue = 3| pages = 297–312| last = Ungar| first = S.| s2cid = 7089937| title = Knowledge, ignorance and the popular culture: climate change versus the ozone hole| journal = Public Understanding of Science| year = 2000 | doi = 10.1088/0963-6625/9/3/306}}</ref><ref>{{Cite journal| volume = 1| issue = 1| pages = 35–41| last = Pidgeon| first = N.|author2=B. Fischhoff| s2cid = 85362091| title = The role of social and decision sciences in communicating uncertain climate risks| journal = Nature Climate Change| year = 2011|bibcode = 2011NatCC...1...35P |doi = 10.1038/nclimate1080 }}</ref> A study of 1540 US adults<ref>{{Cite journal| last = Kahan| first = Dan M.|author2=Maggie Wittlin |author3=Ellen Peters |author4=Paul Slovic |author5=Lisa Larrimore Ouellette |author6=Donald Braman |author7=Gregory N. Mandel | title = The Tragedy of the Risk-Perception Commons: Culture Conflict, Rationality Conflict, and Climate Change| year = 2011| doi = 10.2139/ssrn.1871503| ssrn = 1871503|hdl=1794/22097 | s2cid = 73649608|hdl-access=free }}</ref> found instead that levels of scientific literacy correlated with the strength of [[public opinion on climate change|opinion on climate change]], but not with which side of the debate they stood on.

The puzzling phenomenon of two individuals being able to reach different conclusions after being exposed to the same facts has frequently been explained (particularly by Daniel Kahneman) by reference to a '[[bounded rationality]]' – in other words, that most judgments are made using fast-acting [[heuristic]]s<ref>{{Cite journal| issn = 0002-8282| volume = 93| issue = 5| pages = 1449–1475| last = Kahneman| first = Daniel| title = Maps of Bounded Rationality: Psychology for Behavioral Economics| journal = The American Economic Review| date = 2003-12-01| jstor = 3132137| doi = 10.1257/000282803322655392| url = http://www.econ.tuwien.ac.at/Lotto/papers/Kahneman2.pdf| citeseerx = 10.1.1.194.6554| access-date = 2017-10-24| archive-url = https://web.archive.org/web/20180219074537/http://www.econ.tuwien.ac.at/lotto/papers/Kahneman2.pdf| archive-date = 2018-02-19| url-status = dead}}</ref><ref>{{Cite journal| volume = 185| issue = 4157| pages = 1124–31| last = Tversky| first = A.| author2 = D. Kahneman| title = Judgment under uncertainty: Heuristics and biases| journal = Science| year = 1974| bibcode = 1974Sci...185.1124T| doi = 10.1126/science.185.4157.1124| pmid = 17835457| s2cid = 143452957| url = https://apps.dtic.mil/sti/citations/AD0767426| access-date = 2017-08-30| archive-date = 2018-06-01| archive-url = https://web.archive.org/web/20180601235707/http://www.dtic.mil/docs/citations/AD0767426| url-status = live}}</ref> that work well in everyday situations, but are not amenable to decision-making about complex subjects such as climate change.
[[Anchoring]] has been identified as particularly relevant in climate change controversies,<ref>{{Cite journal| doi = 10.1016/j.jenvp.2010.03.004| issn = 0272-4944| volume = 30| issue = 4| pages = 358–367| last = Joireman| first = Jeff|author2=Heather Barnes Truelove |author3=Blythe Duell | title = Effect of outdoor temperature, heat primes and anchoring on belief in global warming| journal = Journal of Environmental Psychology| date = December 2010}}</ref> as individuals are found to be more positively inclined to believe in climate change if the outside temperature is higher, if they have been primed to think about heat, and if they are primed with higher temperatures when thinking about future temperature increases from climate change.

In other controversies, such as that around the [[HPV vaccine]], the same evidence seemed to license inference to radically different conclusions.<ref>{{Cite news| issn = 0362-4331| last = Saul| first = Stephanie|author2=Andrew Pollack| title = Furor on Rush to Require Cervical Cancer Vaccine| work = The New York Times| access-date = 2011-11-26| date = 2007-02-17| url = https://www.nytimes.com/2007/02/17/health/17vaccine.html}}</ref> Kahan et al.<ref>{{Cite journal| last = Kahan| first = Dan M.|author2=Donald Braman |author3=Geoffrey L. Cohen |author-link3=Geoffrey L. Cohen|author4=Paul Slovic |author5=John Gastil | title = Who Fears the HPV Vaccine, Who Doesn't, and Why? An Experimental Study of the Mechanisms of Cultural Cognition| date = 2008-07-15 |journal=Law and Human Behavior | ssrn = 1160654}}</ref> explained this by the cognitive biases of biased assimilation<ref>{{Cite journal| doi = 10.1037/0022-3514.37.11.2098| issn = 0022-3514| volume = 37| issue = 11| pages = 2098–2109| last = Lord| first = Charles G.|author2=Lee Ross |author3=Mark R. Lepper | title = Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence.| journal = Journal of Personality and Social Psychology| year = 1979| citeseerx = 10.1.1.372.1743}}</ref> and a credibility heuristic.<ref>{{Cite journal| doi = 10.1086/266350| volume = 15| issue = 4| pages = 635–650| last = Hovland| first = Carl I.|author2=Walter Weiss| title = The Influence of Source Credibility on Communication Effectiveness| journal = Public Opinion Quarterly| date = 1951-12-21}}</ref>

Similar effects on reasoning are also seen in non-scientific controversies, for example in the [[Gun politics in the United States|gun control debate in the United States]].<ref name="guncontrol">{{Cite journal| last = Braman| first = Donald|author2=James Grimmelmann |author3=Dan M. Kahan | title = Modeling Cultural Cognition |journal=Social Justice Research | date = 20 July 2007| ssrn = 1000449}}</ref> As with other controversies, it has been suggested that exposure to empirical facts would be sufficient to resolve the debate once and for all.<ref>{{Cite journal| volume = 151| issue = 4| pages = 1341–1348| last = Fremling| first = G.M.|author2=J.R. Lott Jr| title = The Surprising Finding That Cultural Worldviews Don't Explain People's Views on Gun Control| journal = U. Pa. L. Rev.| year = 2002| doi = 10.2307/3312932| jstor = 3312932| url = https://scholarship.law.upenn.edu/cgi/viewcontent.cgi?article=3214&context=penn_law_review}}</ref><ref>{{Cite conference| publisher = National Bureau of Economic Research| last = Ayres| first = I.|author2=J.J. Donohue III| title = Shooting down the more guns, less crime hypothesis| year = 2002}}</ref>
In computer simulations of cultural communities, beliefs were found to polarize within isolated sub-groups, based on the mistaken belief that the community had unhindered access to the ground truth.<ref name="guncontrol" /> Such confidence in the group's ability to find the ground truth is explicable through the success of [[wisdom of the crowd]]-based inferences.<ref>{{Cite journal| last = Lee| first = M.D. |author2=M. Steyvers |author3=M. de Young |author4=B.J. Miller | title = A Model-Based Approach to Measuring Expertise in Ranking Tasks}}</ref> However, if there is no access to the ground truth, as there was not in this model, the method fails.

[[Bayes estimator|Bayesian decision theory]] allows these failures of rationality to be described as part of a statistically optimized system for decision-making. Experiments and computational models in [[multisensory integration]] have shown that sensory input from different senses is integrated in a statistically optimal way;<ref>{{Cite journal| issn = 0028-0836| volume = 415| issue = 6870| pages = 429–433| last = Ernst| first = Marc O.|author2=Martin S. Banks| title = Humans integrate visual and haptic information in a statistically optimal fashion| journal = Nature| date = 2002-01-24| doi = 10.1038/415429a|bibcode = 2002Natur.415..429E| pmid=11807554| s2cid = 47459}}</ref> in addition, the inferences used to attribute multiple sensory inputs to a single source appear to rely on Bayesian inference about the causal origin of the sensory stimuli.<ref>{{Cite journal| volume = 8| issue = 3| pages = 24.1–11| last = Wozny| first = D.R. |author2=U.R. Beierholm |author3=L. Shams| title = Human trimodal perception follows optimal statistical inference| journal = Journal of Vision| year = 2008| doi = 10.1167/8.3.24| pmid = 18484830| doi-access = free}}</ref> As such, it appears neurobiologically plausible that the brain implements decision-making procedures that are close to optimal for Bayesian inference.

Brocas and Carrillo propose a model of decision-making based on noisy sensory inputs,<ref>{{Cite journal| doi = 10.1016/j.geb.2011.10.001| issn = 0899-8256| last = Brocas| first = Isabelle|author2=Juan D. Carrillo| title = From perception to action: An economic model of brain processes| journal = Games and Economic Behavior| volume=75| pages=81–103| year = 2012}}</ref> in which beliefs about the state of the world are modified by Bayesian updating and decisions are then made once a belief passes a threshold. They show that this model, when optimized for single-step decision-making, produces belief [[anchoring]] and polarization of opinions – exactly as described in the [[global warming controversy]] context: in spite of identical evidence being presented, pre-existing beliefs (or the evidence presented first) have an overwhelming effect on the beliefs formed. In addition, the preferences of the agent (the particular rewards that they value) also influence the beliefs formed – this explains the biased assimilation (also known as [[confirmation bias]]) described above. This model allows the production of controversy to be seen as a consequence of a decision maker optimized for single-step decision-making, rather than as a result of the limited reasoning posited by [[Daniel Kahneman]]'s [[bounded rationality]].
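The threshold element of such models can be illustrated with a short simulation (a minimal sketch, not the Brocas–Carrillo model itself; the observation reliability, decision threshold, and evidence sequence are illustrative assumptions). Two agents perform identical Bayesian updating on the same mixed evidence, but because each commits to a conclusion as soon as its belief crosses a threshold, the evidence seen first determines which conclusion is reached:

<syntaxhighlight lang="python">
def bayes_update(belief, observation, reliability=0.6):
    """Update P(H) after one binary observation. An observation supports H
    with probability `reliability` if H is true, and with probability
    1 - reliability if H is false."""
    like_h = reliability if observation else 1 - reliability
    like_not_h = (1 - reliability) if observation else reliability
    return like_h * belief / (like_h * belief + like_not_h * (1 - belief))

def decide(prior, evidence, threshold=0.8):
    """Update the belief observation by observation, committing to a decision
    as soon as the belief crosses the upper or lower threshold."""
    belief = prior
    for obs in evidence:
        belief = bayes_update(belief, obs)
        if belief >= threshold:
            return "accept H", round(belief, 3)
        if belief <= 1 - threshold:
            return "reject H", round(belief, 3)
    return "undecided", round(belief, 3)

# The same mixed evidence, presented in opposite orders.
evidence = [True] * 4 + [False] * 4
print(decide(prior=0.5, evidence=evidence))        # ('accept H', 0.835)
print(decide(prior=0.5, evidence=evidence[::-1]))  # ('reject H', 0.165)
</syntaxhighlight>

Both agents start from the same prior and receive the same eight observations; only the order differs, yet each commits before the countervailing evidence arrives, mirroring the anchoring and polarization effects described above.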
==See also==
{{Spoken Wikipedia|Controversy.ogg|date=2013-06-27}}
*[[Argument]]
*[[Bipartisanship]]
*[[Dialectic]]
*[[Misinformation]]
*[[ProCon.org]]
*[[Scandal]]
*[[Third rail (politics)]]
{{clear}}

==References==
{{Reflist|30em}}

==External links==
{{Wikiquote}}
{{Wiktionary|controversy}}
* [[Brian Martin (social scientist)|Brian Martin]], ''[https://www.bmartin.cc/pubs/14cm/14cm.pdf The Controversy Manual]'' (Sparsnäs, Sweden: Irene Publishing, 2014).
* [https://www.semanticjuice.com/controversial-topics/ Controversial topics] based on [[machine learning]] on Wikipedia data
* [https://controversial.today/ Controversial Today]

{{Authority control}}

[[Category:Controversies| ]]
[[Category:English words]]