In Journal Club, students and professors talk together about exciting new psychology research. By reading and discussing a short, current article, students develop essential skills for understanding and interpreting psychological research.


Spring Semester 2018

  • Monday, February 26, 2018, 4:30-5:30pm, Centenary Square 210
    Discussion led by Dr. Amy Hammond (PSY369 Human Sexuality)

The Political Divide Over Same-Sex Marriage: Mating Strategies in Conflict?
By David Pinsof & Martie Haselton

Although support for same-sex marriage has grown dramatically over the past decade, public opinion remains markedly divided. Here, we propose that the political divide over same-sex marriage represents a deeper divide between conflicting mating strategies. Specifically, we propose that opposition to same-sex marriage can be explained in terms of (a) individual differences in short-term mating orientation and (b) mental associations between homosexuality and sexual promiscuity. We created a novel Implicit Association Test to measure mental associations between homosexuality and promiscuity. We found that mental associations between homosexuality and promiscuity, at both the implicit and the explicit levels, interacted with short-term mating orientation to predict opposition to same-sex marriage. Our model accounted for 42.3% of the variation in attitudes toward same-sex marriage, and all predictors remained robust when we controlled for potential confounds. Our results reveal the centrality of mating psychology in attitudes toward same-sex marriage.


  • Monday, March 12, 2018, 4:30-5:30pm, Centenary Square 210
    Discussion led by Dr. Jessica Alexander (PSY329W Brain & Language)

Color naming across languages reflects color use
By Edward Gibson, Richard Futrell, Julian Jara-Ettinger, Kyle Mahowald, Leon Bergen, Sivalogeswaran Ratnasingam, Mitchell Gibson, Steven T. Piantadosi, and Bevil R. Conway

What determines how languages categorize colors? We analyzed results of the World Color Survey (WCS) of 110 languages to show that despite gross differences across languages, communication of chromatic chips is always better for warm colors (yellows/reds) than cool colors (blues/greens). We present an analysis of color statistics in a large databank of natural images curated by human observers for salient objects and show that objects tend to have warm rather than cool colors. These results suggest that the cross-linguistic similarity in color-naming efficiency reflects colors of universal usefulness and provide an account of a principle (color use) that governs how color categories come about. We show that potential methodological issues with the WCS do not corrupt information-theoretic analyses, by collecting original data using two extreme versions of the color-naming task, in three groups: the Tsimane’, a remote Amazonian hunter-gatherer isolate; Bolivian-Spanish speakers; and English speakers. These data also enabled us to test another prediction of the color-usefulness hypothesis: that differences in color categorization between languages are caused by differences in overall usefulness of color to a culture. In support, we found that color naming among Tsimane’ had relatively low communicative efficiency, and the Tsimane’ were less likely to use color terms when describing familiar objects. Color-naming among Tsimane’ was boosted when naming artificially colored objects compared with natural objects, suggesting that industrialization promotes color usefulness.


  • Monday, April 9, 2018, 4:30-5:30pm, Centenary Square 210
    Discussion led by Dr. Peter Zunick (PSY205 Research Methods)

False-Positive Psychology: Undisclosed Flexibility in Data Collection and Analysis Allows Presenting Anything as Significant
By Joseph P. Simmons, Leif D. Nelson, and Uri Simonsohn

In this article, we accomplish two things. First, we show that despite empirical psychologists’ nominal endorsement of a low rate of false-positive findings (≤ .05), flexibility in data collection, analysis, and reporting dramatically increases actual false-positive rates. In many cases, a researcher is more likely to falsely find evidence that an effect exists than to correctly find evidence that it does not. We present computer simulations and a pair of actual experiments that demonstrate how unacceptably easy it is to accumulate (and report) statistically significant evidence for a false hypothesis. Second, we suggest a simple, low-cost, and straightforwardly effective disclosure-based solution to this problem. The solution involves six concrete requirements for authors and four guidelines for reviewers, all of which impose a minimal burden on the publication process.


If you have questions, please contact Dr. Amy Hammond in the Psychology Department.