
Problem Statement

Misinformation is not a problem that originated in contemporary times. However, today's information dissemination landscape, dominated by social media platforms, has turned it into one of the defining problems of the information age. This brings us to the question: what makes people believe fake news, and why is it such a hard problem to tackle effectively?

Solution

News Guide is a browser extension that runs automatically on social media platforms and scores news based on the quality of sources, the quality of citations, and the tone used. Unlike other current AI tools, it rates content on a 0-100 scale and gives the user a recommended result rather than only a credible / not credible label.
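As a rough illustration of the recommended-result idea (the thresholds and labels below are placeholder assumptions, not the actual cut-offs used by our model):

  // Illustrative sketch only: thresholds and labels are assumptions, not the real model.
  function recommendation(score: number): string {
    // score is the overall credibility value on the 0-100 scale
    if (score >= 75) return "Likely credible";
    if (score >= 50) return "Read with caution";
    return "Low credibility - verify before sharing";
  }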

Approach

PROJECT TYPE

Team project (3 members)

MY ROLE

UX Researcher

UX Designer

DURATION

4 months

YEAR

2022


Human-AI Interaction principles have been incorporated into our design:

Explainability

We include explanations throughout the extension to clarify the algorithmic process, both for the total score and for each individual indicator. Users can also read more via a link to our external website, which documents our detailed methodology and includes model cards.

Transparency

For transparency, the result calculation process is displayed using a pie chart, and a breakdown of the score is presented for each evaluated article so that the scoring process is fully visible to the user.

Fairness and Accountability

For accountability, users can submit feedback when they believe the algorithm has made an unfair assessment. This recourse to human intervention supports accountability and helps foster trust in our solution.

01

BACKGROUND RESEARCH

Literature Review

We analyzed Pew Research Center data on news consumption habits across social media platforms, along with its News Platforms Fact Sheet.

90%

of those aged 18-29 get their news from digital devices

86%

Do not fact-check often

68%

are worn out by the amount of news online (news fatigue)

31%

of Americans (nearly a third) still regularly get their news from Facebook

Competitive Analysis

We found that digital tools designed to combat fake news can be broadly categorized by their primary focus: Bot/Spam Detection, Credibility Scoring, Fact-Checking, Education/Training, Verification, Whitelisting, and Codes/Standards. To further evaluate AI-based solutions for tackling misinformation, we explored two of these categories in depth: Credibility Assessment and Fact-Checking Tools.

Credibility Assessment

The tools in this category were not succinct in the information they provided.

Fact-Checking Tools

We ran initial assessments of their usability and ease of information access and found that it was difficult to gather information at a glance from these websites.

The tools that particularly interested us, however, were those that straddled the boundary between fact-checking and credibility assessment. We found several examples of such tools and chose The Factual and Newstrition as the two applications to dive deeper into.

User Interview

GOAL

1. Understand participant news consumption habits on social media.

2. Learn how often people evaluate news and what motivates them to do so.

3. Get insights about their current news evaluation methods and challenges in using them.

4. Gather expectations for a news evaluation tool.

PARTICIPANTS

All participants were in the 18-29 age group.

4 KEY POINTS

Thoughts on the relevancy of the problem of misinformation

  • 80% said misinformation is a crucial issue that is complex to solve.

  • 50% said there is simply too much information online, and that information is subjective, which can make it hard to distinguish what is true from what is false.

  • 40% cited marketing strategies and political agendas as driving factors behind the problem, adding an extra layer of complexity to solving it.

Attitudes towards News Evaluation

  • Motivation and interest were the two most frequently cited reasons that pushed people to critically evaluate the news they consume online.

  • Participants check hotly contested current affairs so that they can make up their own minds about the issue.

  • Time was cited as an important factor when it comes to evaluating news.

  • "I do not have time to evaluate" was a common refrain.

  • Participants tend to evaluate information that is relevant or specific to their own context.

Current News Evaluation Strategies

  • Google Search was the most commonly mentioned method for evaluating a piece of news.

  • Participants seek out official or otherwise trusted news sources to verify claims.

  • They discuss important events or news with family or friends to gather more information before evaluating an article further.

  • Emphasis was placed on looking at an issue from multiple viewpoints, as it helped them build context and form their own opinions.

  • A few participants relied heavily on knowledge-based websites such as Wikipedia and YouTube for further news evaluation.

Concerns or Expectations from an AI-based News Credibility Assessment Tool

Concerns

  • 100% said they would not fully trust any tool by itself, as such tools could themselves be biased.

  • Participants worried that a tool could be influenced by the developers' or designers' own biases, political agendas, or general outlook on contentious issues.

Expectations

  • Present multiple viewpoints or coverage of the same issue.

  • Help them evaluate information at a glance without worsening the information overload they already feel.


02

USER RESEARCH

Re-defined Goal

The literature review, competitive analysis, and interviews we conducted as part of our generative research helped us redefine our goals as follows:

01

Discover the challenges and barriers that users face in the adoption and sustained use of News Credibility Evaluation Tools.

02

Create a prototype of a fact-checking tool that would integrate seamlessly into the user’s news consumption environment.

After revising our goal, we designed an observational study and an online survey to evaluate existing tools and assess evaluation criteria that should be included in the news credibility assessment tool we design.

Observational Study

GOAL

  1. Understand what tools people use to evaluate news on social media.

  2. Learn their reasons for using these tools, or their concerns about doing so.

PARTICIPANTS

Six social media users in various occupations, all regular consumers of news on social media and in the 18-29 age group.



PROCESS

Each participant was directed to browse news posts on a test Facebook account that we had set up for this experiment. We then asked them to complete the following two tasks:

Task 1

Evaluate an article of their choice using their preferred method for assessing news articles.

Task 2

Evaluate the same article from Task 1 using the evaluation tools The Factual and Newstrition.

RESULT

Preferred news evaluation method


Positive

  1. Google Search is convenient and comfortable to use.

  2. Google offers a large repository of articles.

Negative

  1. Information overload in Google Search results.

  2. The need to think of the right keywords for an efficient search.

Additional Finding

Participants tend to look for further articles that report the same news, ideally from other trustworthy news outlets.

We decided to incorporate selected Google search results into our system.

Factors that make people distrust the news

Motivation for using evaluation tools


Reasons for increased motivation

  1. Results appear automatically after installing the tools.

  2. Greater awareness of the credibility of news on social media.

Comparison of The Factual and Newstrition

Comfortable to read/use

(rating from 1 to 5) 


Indicators clear to you

(rating from 1 to 5) 


Tool Preference

Something we can learn from

The Factual

  1. Color coding

    • Each section has a representative color

  2. Smooth user experience

    • Shows the summary result directly on each social media post

    • Easy setup without much effort on the user's end

  3. Simple layout

    • Divides information into colored box sections

Newstrition

  1. Understandable language

What needs to be improved

The Factual

  1. Lack of understandability

  2. Lack of transparency on the result

  3. Low color contrast interface

Newstrition

  1. Low discoverability

  2. Lack of transparency

  3. Complicated user flow

The most frequently mentioned concerns were:
"I don't understand how the result is calculated."
"I don't understand the meaning of the result."
"I don't know how these indicators help establish whether the news is true."

News Assessment Factor Survey 

GOAL

Dig deeper to determine the relative importance (weight) of each news credibility factor.

RESULT

We received 38 responses in total, with a roughly even split of males and females.

While evaluating a news article, which of the factors matter to you?

Importance of each factor, calculated as a weighted score

Weighted Score = (count of Yes × 1) + (count of Maybe × 0.5) + (count of No × 0)
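As an illustration (invented counts, not our actual survey numbers): a factor that received 30 Yes, 6 Maybe, and 2 No responses out of 38 would get a weighted score of 30 × 1 + 6 × 0.5 + 2 × 0 = 33.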

How is the recommended result / score computed?

From the survey, the top seven factors (Sources, Tone of the article, Title, Date of publication, Evidence cited, Source bias, and Meaningful quotes) each received over 90% positive responses, so we decided to use these factors in our main content.
The weight (percentage) of each factor is determined by the weighted score calculated above in the analysis of the News Assessment Factor Survey.
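The sketch below illustrates how this could work: each factor's weighted score is normalized into a share of the total, and the per-factor scores are combined into the overall 0-100 result. The numbers and the simple weighted-average combination are assumptions for illustration, not our exact survey values or model.

  // Illustrative sketch: the weighted scores below are placeholder numbers,
  // not our actual survey results.
  const factorWeightedScores: Record<string, number> = {
    sources: 36, toneOfArticle: 34, title: 33, datePublished: 35,
    evidenceCited: 37, sourceBias: 33, meaningfulQuotes: 32,
  };

  // Normalize each factor's weighted score into a fractional weight (share of the total).
  const totalWeight = Object.values(factorWeightedScores).reduce((a, b) => a + b, 0);
  const factorWeights: Record<string, number> = Object.fromEntries(
    Object.entries(factorWeightedScores).map(([name, score]) => [name, score / totalWeight])
  );

  // Combine per-factor scores (each on a 0-100 scale) into the overall 0-100 recommended result.
  function recommendedScore(perFactorScores: Record<string, number>): number {
    return Math.round(
      Object.entries(factorWeights).reduce(
        (sum, [name, weight]) => sum + weight * (perFactorScores[name] ?? 0),
        0
      )
    );
  }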


03

IDEATE & DESIGN

User Flow

Participants liked The Factual's thumbnail on the Facebook post and favored the extension's simple, intuitive flow. We decided to follow a similar flow to The Factual, outlined in the steps below; a rough content-script sketch follows the steps.

01

A thumbnail showing the credibility score appears automatically at the top right of each post.

02

Hovering over the thumbnail pops up a view showing the evaluation criteria and the total score / recommended result.

03

The user can click the View Summary button to see the complete assessment content.
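To make step 01 more concrete, here is a minimal content-script sketch. It is illustrative only: the post selector, badge markup, and fetchCredibilityScore helper are hypothetical placeholders, not our actual implementation.

  // Minimal content-script sketch for step 01 (illustrative; the post selector,
  // badge markup, and scoring call below are placeholder assumptions).
  async function annotatePosts(): Promise<void> {
    // Hypothetical selector standing in for the feed's news-post elements.
    const posts = document.querySelectorAll<HTMLElement>('[data-newsguide-post]');
    for (const post of posts) {
      if (post.dataset.newsguideScored) continue; // skip posts we already annotated
      post.dataset.newsguideScored = "true";
      const articleUrl = post.querySelector('a')?.href ?? "";
      const score = await fetchCredibilityScore(articleUrl); // placeholder scoring call
      const badge = document.createElement('div');
      badge.className = 'newsguide-badge'; // styled via the extension's CSS
      badge.textContent = String(score);   // the 0-100 credibility score
      badge.title = 'Hover for the evaluation breakdown'; // hook for step 02
      post.appendChild(badge);
    }
  }

  // Placeholder: a real extension would message its background service worker here.
  async function fetchCredibilityScore(articleUrl: string): Promise<number> {
    return 72; // stubbed value for illustration
  }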

Content Design

THUMBNAIL OF CREDIBILITY SCORE

HOVER INFO OF THUMBNAIL

SUMMARY

We divided the assessment content into a few sections:

Result

Basic Information

This information is basic but crucial. Users can judge for themselves when looking at it, since site bias might affect their trust in a news article, according to feedback from the earlier interviews.

Evaluation Factors

This section provides more information on each individual indicator and its calculated score, which contributes to the overall credibility score.

User Feedback

We believe user feedback and control are critical to improving the underlying AI model’s output and user experience. When users have the right level of control over the system, they’re more likely to trust it.

Hint Design for Explanation

We added hints to each indicator to increase the explainability of the assessment factors.

Tab Design for Related Information

According to the observational study results, people tend to read additional related sources to evaluate news, and they prefer Google for this. To increase convenience and usability, we incorporated this into our extension so that users can browse related Google articles and videos directly within it.


04

EVALUATION

Usability Testing

GOAL

Check the usefulness and efficiency of our proposed design solution and identify any changes or improvements required.

PARTICIPANTS

We recruited 6 participants who are avid consumers of news on social media. Participants included both genders and fell within our target age group of 18-29.


PROCESS

The moderator asked participants to perform five tasks using the high-fidelity user interface we designed, thinking aloud throughout the process.

TASK 1

Thumbnail of Credibility Score

TASK 2

Hover Info of Thumbnail

TASK 3

Summary Page

TASK 4

Related Articles Page

TASK 5

Related Videos Page

RESULT

Task 1

Evaluate the Thumbnail of Credibility Score on Facebook

Participants browsed Facebook as usual while we observed their reactions on noticing the thumbnail with the overall credibility score.

PROS

  • 5 out of 6 participants noticed the thumbnail.

  • The color alerted them to stay skeptical of the post.

CONS

  • The logo is unclear and resembles a foreign-language character, which confused them.

Task 2

Evaluate the Hover Information of the Thumbnail

Participants walked through the evaluation process in their own way; we then asked them what information they expected the hover view to provide.

PROS

  • Hover information is concise and clear.

CONS

  • 50% misinterpreted the meaning of the posted date.

  • The black View Summary button is not visually appealing.

  • The brief summary should show all the indicators.

Task 3

Evaluate the Summary Page

01

PROS

  • The hint next to the credibility score helped them understand the relationship between the statement and the numerical score.

  • All participants said the hint beside the "Credible" wording is clear.

03

PROS

  • 5 out of 6 stated that the hint gave them a good mental model of how the extension works.

  • The graph showing the weight of each indicator was especially useful for understanding the methodology.

05

CONS

  • 4 participants pointed out that the wording "External Sources" did not match the scale (medium) we used.

  • They suggested that "Citation Quality" would better convey our intention: the quality of the cited sources.

  • Color representation should be consistent with the other indicators.