Lapse

User Research, UX Design
Brief
Misinformation has become a huge problem on social media platforms. The challenge is to design a new outcome that differs from the current solutions.
Solution
I have designed a system that requires the user to pause and think before sharing content on social media platforms. This can be done in two different ways: queuing, and reply-and-share.

Full Process

Primary Research

I conducted interviews with different social media users to better understand how misinformation on social media platforms has influenced them.

Secondary Research

To understand the kinds of content shared on social media, I collected examples of this content by screenshot.

Research - articles

After my primary and secondary research I had a better understanding of how misinformation impacts social media platforms and their users. I then turned to published articles to see what other researchers had found.

Pew Research Center asked respondents whether the misinformation problem will improve in the future: 51% said the information environment will not improve, and 49% said it will improve.

Pew Research
Philip J. Nickel, lecturer at Eindhoven University of Technology in the Netherlands, “The decline of traditional news media and the persistence of closed social networks will not change in the next 10 years. These are the main causes of the deterioration of a public domain of shared facts as the basis for discourse and political debate.”
Christian H. Huitema, “The quality of information will not improve in the coming years, because technology can’t improve human nature all that much.”

Science Magazine 
Lies spread faster than the truth
There is worldwide concern over false news and the possibility that it can influence political, economic, and social well-being. To understand how false news spreads, Vosoughi et al. used a data set of rumor cascades on Twitter from 2006 to 2017. About 126,000 rumors were spread by 3 million people. False news reached more people than the truth; the top 1% of false news cascades diffused to between 1000 and 100,000 people, whereas the truth rarely diffused to more than 1000 people. Falsehood also diffused faster than the truth. The degree of novelty and the emotional reactions of recipients may be responsible for the differences observed.

Research Summary

All of the people I interviewed have come across misinformation; most of it concerned politics, because of elections, or Covid, vaccines, and treatments.
Social media has become a platform that helps spread new information to the general public, but it can also pose a major threat when that information is misleading, whether it is spread maliciously or not.

Who was sharing it? Mostly social media groups, or people with strong opinions on the internet
What content was it? Politics or Covid
Where was it? Social media and WhatsApp groups

Problem Statement - social media

Social media has a low barrier to entry: anyone can post and create the content they would like to share. Companies use algorithms to suggest content a user will like, creating echo chambers that keep users on the platform as long as possible in order to profit from advertising. As trust in major media declines, users have begun to trust content on social media instead, and many share content after reading only the headline.

Ideation

Throughout ideation, I sketched out different possible ways to help combat misinformation, from exhibition displays that raise users' awareness to requiring users to provide evidence when posting content on social media.

User Behaviour Research

“Whereas false stories inspired fear, disgust, and surprise in replies, true stories inspired anticipation, sadness, joy, and trust. Contrary to conventional wisdom, robots accelerated the spread of true and false news at the same rate, implying that false news spreads more than the truth because humans, not robots, are more likely to spread it.” - MIT article on ScienceMag

“More specifically within the domain of political fake news, anger has been suggested to promote politically aligned motivated belief in misinformation, whereas anxiety has been posited to increase belief in politically discordant fake news due to increased general feelings of doubt (Weeks 2015). In other words, anger may promote biased, intuitive, motivated reasoning, whereas anxiety may encourage individuals to consider opposing viewpoints (MacKuen et al. 2010) and perhaps even improve the overall quality of information seeking (Valentino et al. 2008).” - Cognitive Research: Principles and Implications

“If you get people to stop and think, they do a better job of evaluating what they’re reading.” - The Times

Problem Statement - User Behaviour

I have come to the conclusion that humans are more likely to share misinformation than robots and algorithms are. Most of this false content is shared within small, trusted networks such as friends and family. Most false content provokes negative emotions in users, such as anger, anxiety, and fear, leading them to share without thinking; giving users time to stop and rethink can prevent them from sharing misinformation.

Idea Sketch

I came up with the idea of having the user pause and think before sharing content on social media.
In order to post, the user has to answer three questions: how they feel about the content emotionally, and what they think others will feel when they see it.
This addresses the problem by making the user pause and reflect on the content they are sharing.
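The pause-and-think gate can be sketched as a simple rule: the share action stays locked until every reflection question has been answered. This is a minimal illustration only; Lapse was designed as screens, not code, and all names below are hypothetical.

```python
# Minimal sketch of the "reply and share" gate: the post is unlocked only
# when every reflection question has a non-empty answer.
# All names here are illustrative, not part of any real platform API.

REFLECTION_QUESTIONS = [
    "What did you feel when you saw this content?",
    "Why are you sharing this?",
    "How do you imagine others feel when they see this content?",
]

def can_share(answers: dict) -> bool:
    """True only when each question has been given a non-blank answer."""
    return all(answers.get(q, "").strip() for q in REFLECTION_QUESTIONS)

# Two answered questions -> still locked
partial = {REFLECTION_QUESTIONS[0]: "surprised",
           REFLECTION_QUESTIONS[1]: "it made me laugh"}
print(can_share(partial))   # False

# All three answered -> share unlocks
complete = {q: "some reflection" for q in REFLECTION_QUESTIONS}
print(can_share(complete))  # True
```

The point of the gate is not to validate the answers themselves; any non-empty reflection unlocks the post, because the goal is the pause, not the content of the reply.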

User Persona

From my interviews with different users, I came up with two user personas:
one who uses social media very often and wants to become an influencer, and
one who uses social media casually.

User Journey

Considering that some users share content on social media very often, or even rely on posting content as an occupation, I designed two paths for users to choose between when they decide to share content.

Wireframing

Based on the User Journey I designed, I created a few wireframes for user testing.

Questions for the user

Open-ended questions
1. What did you feel when you saw this content?
2. Why are you sharing this?
3. How do you imagine others feel when they see this content?
4. What did you like about the content?
5. Who posted the content?
6. Could you describe this article in 3 words?
(Each question follows the auxiliary verb + subject + main verb pattern.)

The questions are phrased to make users feel comfortable, rather than interrogated or challenged. While answering, users re-ask the questions in their own minds. The expected outcome is not about the users' answers; it is a strategy for getting the user to pause their action and think.

User Testing & Result

In the user test, participants were asked to share the first piece of content immediately, but could only share once every half hour. For the second piece of content, they were asked a set of questions before they could share.

Result: The results of the testing were positive. Users had no trouble with the journey and completed the tasks without guidance.
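The queuing path tested above can be sketched as a simple rate limit: a share goes through only if at least half an hour has passed since the user's previous share; otherwise it is held in a queue. This is a hypothetical sketch of the behaviour, not an implementation from the project.

```python
# Minimal sketch of the "queuing" path: shares are limited to one per
# half hour; anything sooner is queued. All names are illustrative.

QUEUE_INTERVAL = 30 * 60  # half an hour, in seconds

class ShareQueue:
    def __init__(self):
        self.last_share_time = None  # timestamp of the last successful share
        self.pending = []            # content held back by the rate limit

    def request_share(self, content, now):
        """Share immediately if the interval has passed; otherwise queue."""
        if self.last_share_time is None or now - self.last_share_time >= QUEUE_INTERVAL:
            self.last_share_time = now
            return True   # shared now
        self.pending.append(content)
        return False      # queued for later

q = ShareQueue()
print(q.request_share("first post", now=0))     # True: nothing shared yet
print(q.request_share("second post", now=600))  # False: only 10 minutes later
print(q.request_share("third post", now=1900))  # True: over half an hour later
```

The delay, like the questions, is not meant to block sharing outright; it simply inserts time between impulse and action, which the user-behaviour research suggests is when people reconsider.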

Prototype

I chose Facebook and Twitter as the social media platforms on which to design Lapse, because they are two of the most popular social media platforms worldwide, and because these two platforms have been among the most affected by misinformation.