

Reading Beyond Headline Rare For Most on Social Media, Study Finds
  • Posted November 22, 2024


Three out of four times, your Facebook friends don't read past the headline when they share a link to political content. 

Experts say that's somewhat surprising -- and downright scary. 

People who share without clicking may be unwittingly aiding hostile adversaries aiming to sow seeds of division and distrust, warned S. Shyam Sundar, a professor of media effects at Penn State University.

"Superficial processing of headlines and blurbs can be dangerous if false data are being shared and not investigated," said Sundar, corresponding author of the new study published Nov. 19 in the journal Nature Human Behavior. 

"Disinformation or misinformation campaigns aim to sow the seeds of doubt or dissent in a democracy -- the scope of these efforts came to light in the 2016 and 2020 elections," he added in a Penn State news release.

To learn more about content shared on social media, his team analyzed more than 35 million public posts containing links shared on Facebook between 2017 and 2020. The links included political content from both ends of the spectrum — and it was shared without clicking more often than politically neutral content.

While the study was limited to Facebook, researchers said their findings likely apply to other social media platforms as well. 

Data for the analysis were provided in collaboration with Facebook's parent company, Meta. 

The data included user demographics and behaviors, including a "political page affinity score," which was determined by identifying the pages each user follows.

Users fell into one of five groups — very liberal, liberal, neutral, conservative and very conservative.

Researchers then used AI to find and classify political terms in the linked content, scoring each link on that same five-point scale based on the number of shares it received from each affinity group.

One by one, researchers manually sorted 8,000 links, identifying content as political or non-political. That data trained an algorithm that analyzed 35 million links that were shared more than 100 times by Facebook users in the United States. 
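The news release doesn't describe the study's actual model, but the general approach -- hand-labeling a small sample of links and training a classifier to label the rest -- can be sketched. The snippet below is only an illustrative assumption, using scikit-learn's TfidfVectorizer and LogisticRegression on made-up headlines, not the researchers' pipeline.

```python
# Illustrative sketch only: a small hand-labeled sample trains a text
# classifier, which is then applied to a much larger set of shared links.
# The study's actual model and features are not specified in this article.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical manually sorted links (headline text -> political or not).
labeled_headlines = [
    "Senate passes sweeping election reform bill",
    "10 easy weeknight dinner recipes",
    "Governor signs controversial immigration order",
    "Local team wins championship in overtime",
]
labels = ["political", "non-political", "political", "non-political"]

# Train on the manually labeled sample...
classifier = make_pipeline(TfidfVectorizer(), LogisticRegression())
classifier.fit(labeled_headlines, labels)

# ...then label the far larger collection of shared links automatically.
new_headlines = [
    "New poll shows tight race ahead of the midterms",
    "How to keep houseplants alive through winter",
]
print(classifier.predict(new_headlines))
```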

From that analysis, a pattern emerged that held true at the individual level.

"The closer the political alignment of the content to the user -- both liberal and conservative -- the more it was shared without clicks," said study co-author Eugene Cho Snyder, an assistant professor of humanities and social sciences at the New Jersey Institute of Technology. "They are simply forwarding things that seem on the surface to agree with their political ideology, not realizing that they may sometimes be sharing false information."

Meta also provided data from a third-party fact-checking service, which flagged more than 2,900 links to false content.

In all, these links were shared more than 41 million times -- without being clicked, according to the study.

Of these, 77% came from conservative users and 14% from liberal users. Up to 82% of links to false information came from conservative news domains, researchers found.

Sundar said social media platforms could take steps to curb sharing without clicking -- for example, users could be required to acknowledge that they have read the content in full before sharing.

"If platforms implement a warning that the content might be false and make users acknowledge the dangers in doing so, that might help people thing before sharing," Sundar said.

It wouldn't, however, stop intentional disinformation campaigns, he added.

"The reason this happens may be because people are just bombarded with information and are not stopping to think it through," Sundar said. "Hopefully, people will learn from our study and become more media literate, digitally savvy and, ultimately, more aware of what they are sharing."

More information

The American Psychological Association has more about misinformation and disinformation.

SOURCE: Penn State, news release, Nov. 20, 2024

HealthDay
Health News is provided as a service to Tinsley Bible Drug Co. site users by HealthDay. Neither Tinsley Bible Drug Co. nor its employees, agents, or contractors review, control, or take responsibility for the content of these articles. Please seek medical advice directly from your pharmacist or physician.
Copyright © 2024 HealthDay All Rights Reserved.