Zagrut, meaning beware, is a news reader app that critiques the existing system of online news and gently guides the user away from prevalent media bias.
Zagrut reached the Top 12 finalists from over 100 projects at the PILOT Innovation Challenge (National Association of Broadcasters, USA, Nov 2018) and is currently contending to be among the Top 6 at NAB Futures, Seattle, Jan 2019.
Discipline: UX Design, UI Design
Technology: Sketch, Invision Studio, Adobe Suite
Association: Parsons School of Design, MFA Design+Tech Studio, NAB Futures
Work in Progress: Second iteration, prototyping, user tests, and Service Blueprinting
What is Zagrut?
Zagrut in the Indian language of Konkani means beware.
It takes a route of empathy towards a user who is biased or affected by bias, which in turn makes the app unbiased and trustworthy, and thus contributes to solving the ‘wicked problem’ of media bias.
It functions like a regular news reader but gently coaches users to become aware of their own bias trends without finger-pointing, and encourages them to stay as close as possible to a central point of view. It has four main functions:
- Context before content
- Multiple Perspectives
- Reading TLDR versions
- Keeping an Eye
Information about the article is at the user’s fingertips: coaching to judge the nature of the headline, understanding the date and how aged the news is, and knowing the news source, the news organization, its owners and affiliations, and the author of the article. Each piece of information has easy one-click access to learn more. This ‘context before content’ is sourced from Wikipedia indexing.
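As an illustration of how such a context card might be assembled from Wikipedia, here is a minimal Python sketch against Wikipedia’s public REST page-summary endpoint. The fields kept in the card and the `shape_card`/`context_card` names are my own assumptions for illustration, not Zagrut’s actual implementation.

```python
import json
from urllib.request import Request, urlopen

SUMMARY_URL = "https://en.wikipedia.org/api/rest_v1/page/summary/{}"

def shape_card(page: dict) -> dict:
    """Reduce a Wikipedia page-summary payload to a small 'context card'."""
    return {
        "title": page.get("title"),
        "description": page.get("description"),
        "summary": page.get("extract"),
        "read_more": page.get("content_urls", {}).get("desktop", {}).get("page"),
    }

def context_card(topic: str) -> dict:
    """Fetch context for a source, owner, or author, e.g. context_card('Reuters')."""
    req = Request(SUMMARY_URL.format(topic.replace(" ", "_")),
                  headers={"User-Agent": "zagrut-prototype/0.1"})
    with urlopen(req) as resp:
        return shape_card(json.load(resp))
```

The one-click “know more” link then simply opens the `read_more` URL of the card.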
Every article, when read or shared, is accompanied by two other perspectives. This helps break the social-media echo chamber, because one of its main causes is that people share only single-perspective news articles that feed into similar views.
Also, the news feed draws heavily from known center-bias sources like Bloomberg, BBC, or Reuters, which are regarded as neither left- nor right-inclined. The more the user reads left, the more right-leaning sources populate the feed, and vice versa. This coaches the user to develop a habit of reading multiple perspectives and staying at the center of the bias spectrum.
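The counterweighting behaviour described above could be sketched as follows. This is a minimal sketch, assuming each source carries a bias score from -1.0 (far left) to +1.0 (far right) supplied by external raters such as AllSides or MediaBiasFactCheck; the `FeedBalancer` class and its parameters are hypothetical.

```python
from collections import deque

class FeedBalancer:
    """Track the running average bias of what the user reads and
    counter-weight the next batch of stories towards the opposite side."""

    def __init__(self, window: int = 20):
        # bias scores of the most recently read articles
        self.recent = deque(maxlen=window)

    def record_read(self, bias_score: float) -> None:
        self.recent.append(bias_score)

    @property
    def drift(self) -> float:
        """Positive = the user is drifting right; negative = drifting left."""
        return sum(self.recent) / len(self.recent) if self.recent else 0.0

    def rank(self, candidates):
        """Order candidate (title, bias) pairs so that articles which pull
        the user back towards the center come first."""
        target = -self.drift  # the ideal counterweight negates the drift
        return sorted(candidates, key=lambda c: abs(c[1] - target))

balancer = FeedBalancer()
for score in (-0.8, -0.6, -0.7):   # the user has been reading left-leaning pieces
    balancer.record_read(score)

feed = balancer.rank([("Left op-ed", -0.8), ("Wire report", 0.0), ("Right op-ed", 0.7)])
# right-leaning and center items now outrank further left-leaning ones
```

The sliding window means the counterweight fades as reading habits re-center, rather than punishing the user for old behaviour.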
Reading a shorter version can be important, especially when guided to read multiple perspectives. Users can at least get a glance at what news from other perspectives is trying to say.
Zagrut gives easy access to related articles for a particular news story that the user is interested in. This way the user can see how the story has changed over time, exposing political evils like doublespeak, denial, and hypocrisy.
In March 2018, the Facebook-Cambridge Analytica data scandal exploded, exposing flaws within democratic systems globally, not just in the USA. As someone from India living in the United States, I found the idea springing from a personal place of frustration against the nexus between media and political parties, which generates mammoth distrust in data, privacy, and truth.
My ideation started by putting my thoughts and feelings into what I call free writing: you stick your pen to paper and write a marathon of thoughts and questions.
I filtered them into key issues, because the problem at hand is a large and messy one that can go around in circles. Here I realized that strict focus was important to move further in this project.
I further organized them into a mind map so that I could understand the connections laterally and not just linearly. This was very helpful for my process and served as a good guide into my domain research. Again, it was important to keep the research focused and stay cautious of wandering into the never-ending space of information in this domain.
My research took shape in broadly three parts:
- Understanding the Problem
- Understanding the Solutions
- User research
Understanding the Problem
The internet and social media may have exacerbated low trust and ‘fake news’, but we find that in many countries the underlying drivers of mistrust are as much to do with deep-rooted political polarisation and perceived mainstream media bias.
Echo chambers and filter bubbles are undoubtedly real for some, but we also find that – on average – users of social media, aggregators, and search engines experience more diversity than non-users. With data covering more than 30 countries and five continents, the Reuters Institute Digital News Report 2017 is a reminder that the digital revolution is full of contradictions and exceptions. Some of the key findings from the research are:
Growth in social media for news is flattening out in some markets, as messaging apps that are (a) more private and (b) tend not to filter content algorithmically are becoming more popular. Only a quarter (24%) of the respondents think social media do a good job of separating fact from fiction, compared to 40% for the news media. Our qualitative data suggest that users feel the combination of a lack of rules and viral algorithms is encouraging low-quality and ‘fake news’ content to spread quickly.
In most countries, we find a strong connection between distrust in the media and perceived political bias. This is particularly true in countries with high levels of political polarisation like the United States, Italy, and Hungary. Almost a third of our sample (29%) say they often or sometimes avoid the news. For many, this is because it can have a negative effect on mood. For others, it is because they can’t rely on news to be true.
Importance of Smartphones
Mobile marches on, outstripping computer access for news in an increasing number of countries. Mobile news notifications have grown significantly in the last year. In a related development, there has been a significant growth in mobile news aggregators, notably Apple News, but also Snapchat Discover for younger audiences. Both have doubled usage with their target groups in the last year.
Smartphones are now as important for news inside the home as outside. More smartphone users now access news in bed (46%) than use the device when commuting to work.
In terms of online news subscriptions, we have seen a very substantial ‘Trump bump’ in the US (from 9 to 16%) along with a tripling of news donations. Most of those new payments have come from the young – a powerful corrective to the idea that young people are not prepared to pay for online media, let alone news. We have new evidence that news brands may be struggling to cut through on distributed platforms. In a study tracking more than 1,500 respondents in the UK, we found that while most could remember the path through which they found a news story (Facebook, Google, etc.), less than half could recall the name of the news brand itself when coming from search (37%) and social (47%).
The biggest change has been the growth of news accessed via social media sites like Facebook and Twitter. In the United States, social media became a key player in the story of the election not least because of its well-documented role in spreading made-up news stories, such as that Pope Francis endorsed Donald Trump or that Hillary Clinton sold weapons to ISIS. Over half (51%) of our US sample now get news via social media – up five percentage points on last year and twice as many as accessed in 2013.
We should also remember that there are significant generational splits in the sources used for news. Across all countries, younger groups are much more likely to use social media and digital media as their main source of news, while older groups cling to the habits they grew up with (TV, radio, and print).
Flipboard curates the news you want by watching what you read and offering similar stories. Apple News app is a decent offering if you take the time to curate what you want to see. You can select the sources you want to accept stories from and then they will filter through to your notification center.
We show, via a massive experiment on Facebook (a sample of 689,003 users), that emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness. We provide experimental evidence that emotional contagion occurs without direct interaction between people (exposure to a friend expressing an emotion is sufficient), and in the complete absence of nonverbal cues.
We tend to follow politicians we agree with; respondents on the left are five times more likely to follow left-leaning politicians in social media than politicians from the right. The same is true in reverse in equal proportion.
This suggests that following politicians in social media may be contributing to greater polarisation. On the other hand, we should remember that in a pre-digital age political activists would have spent a considerable amount of time with people who held similar views as well. What is different is the scale of this activity. Over half of social media users (54%) in the United States following politicians equates to around a third of the entire US online population.
From the above reports we can also find that:
- The biggest change has been the growth of news accessed via social media sites like Facebook and Twitter. It is striking that, outside the United States and the United Kingdom, growth in the use of social media for news seems to be flattening out.
- Overall around a quarter (23%) of our respondents now find, share, or discuss news using one or more messaging applications. We’ve been tracking the growth of WhatsApp for some time but its use for news has jumped significantly in the last year to 15%, with considerable country-based variation.
- On a mobile phone in particular, where it can be difficult to move quickly between multiple apps and websites, the convenience of a one-stop-shop can be compelling. Sometimes these news aggregations are stand-alone products (Flipboard, SmartNews), at other times they are part of a wider service (Apple News, Google News, Snapchat Discover, Kakao Channel, and Line News). This second group – that are both destinations in their own right and allow content to hook into established ecosystems – are currently showing the strongest growth in our data. Despite the rise of aggregators, social media and search remain the most important gateways to online content, alongside traffic coming to their own websites and apps.
- We can also add up preferences for content that is selected by an algorithm (search, social, and many aggregators) and compare with that selected by an editor (direct, email, and mobile notifications). More than half of us (54%) prefer paths that use algorithms to select stories rather than editors or journalists (44%). This effect is even more apparent for those who mainly use smartphones (58%) and for younger users (64%).
- As the smartphone extends its grip on the home and becomes the central organizing device of the digital age, it is worth reflecting on the implications for publishers. Two key factors are likely to be at play: (a) more publishers have enabled deep linking to apps from search, social, and email; (b) the substantial increase in mobile notifications noted earlier, as publishers pursue loyalty strategies and take advantage of new platform capabilities.
- Reading news as text is a more preferred mode of news consumption than video.
- The widespread public debate over fake news and media bias has prompted us to look in detail at the issue of trust in the news media and in social media. Part of that has been to investigate a link between political polarisation and perceived media bias in a number of countries. We have explored these issues through our core survey, through analyzing open-ended answers on trust from 10 countries and from our focus group activity in a smaller group of countries including the United States.
Understanding the Solutions
On any platform of social media, when we point out with facts and reliable documentation that news is fake, or that a post or source is biased, irrespective of the truth there tends to be a knee-jerk reaction to oppose the correction, because fingers are being pointed at someone. It needs to be determined whether this is a platform effect (we don’t want to be corrected in public), some other factor, or simply our human tendency to defend the things we identify with.
Though partisanship and selective exposure are strong determinants of attitudes and behavior, intense media concentration on an issue may alter partisans’ evaluations of politicians by changing the balance of headlines. Contrary to conventional wisdom, robots accelerated the spread of both true and false news at the same rate. This implies that false news spreads more than the truth because humans–not robots–are more likely to spread it.
The solution here is to take the problem and try to draw it in another way – for example, if echo chamber means reading the same source, create a solution that will force you to read multiple sources. If forcing is close to poking fingers at a person’s biases or beliefs, then do this in a bland way or with no labels attached – either to the source or to the user’s bias.
We can deduce the following from the above reflections:
- It’s easy to identify and study the very biased to least biased sources
- We can mention who is the owner of media, funded by, and other details of the source
- Searching the same topic across left, center, and right sources presents multiple articles/views
- Following a story leads to accountability
- AI can be used to summarise articles for TLDR (Too Long, Didn’t Read) versions.
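The production app would presumably use a trained summarization model, but the TLDR idea can be sketched with a naive extractive summarizer: score sentences by word frequency and keep the top few in their original order. The `tldr` function, the stop-word list, and the scoring are illustrative assumptions only.

```python
import re
from collections import Counter

def tldr(text: str, max_sentences: int = 2) -> str:
    """Naive extractive summary: rank sentences by the frequency of the
    words they contain, then return the best ones in reading order."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    words = re.findall(r"[a-z']+", text.lower())
    stop = {"the", "a", "an", "is", "are", "of", "to", "and", "in", "that", "it"}
    freq = Counter(w for w in words if w not in stop)
    # Indices of sentences, best-scoring first
    scored = sorted(
        range(len(sentences)),
        key=lambda i: -sum(freq[w] for w in re.findall(r"[a-z']+", sentences[i].lower())),
    )
    keep = sorted(scored[:max_sentences])  # restore original order
    return " ".join(sentences[i] for i in keep)
```

A real TLDR pipeline would swap this for an abstractive model, but the contract stays the same: full article in, short optional summary out, with one tap through to the original.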
My user research included a quantitative approach: sending out surveys and getting users’ opinions on how they chose a news app to read or use daily. I also wanted a user poll on understanding a few icons to be used in the visual design of my app. I wanted to force myself to take a risk, pick one icon that is generally not used in the context of this app, and test it against users’ perception.
The qualitative research included narration sessions with users, which were required for my ‘sense and flow’ process. In this process, users narrate out loud whatever they are seeing, thinking, or understanding about the paper prototype of the app. This helps me understand whether users grasp the solution and the essence of the app’s intended approach.
The Mobile App – Zagrut
Today, most news is biased, and we are unaware of our own biases. A source’s bias is established by having people from multiple perspectives rate that single source. Our apps and news readers have bias built in: the ‘My Feed’ or ‘News for you’ is a biased feed that only feeds you the news you like.
I named the app ‘zagrut’, pronounced za-groot, which means ‘beware’ in my mother tongue, Konkani, from South India. Zagrut is a mobile app to critique the existing system of news propagation and consumption and makes the user aware of the bias in this environment.
Zagrut focuses on the following solutions:
- Home Page presents a ‘latest feed’ with the option of choosing topics but not sources. ‘My feed’ or ‘news for you’ is absent.
- Meta Article (data about the article) is shown in focus before the content itself.
- Users are propelled to judge the headline.
- Echo Chambers – users are seamlessly shown multiple perspectives, and the feed shifts towards the biased side that is read less often.
- The app has a dedicated onboarding for new users
- Users can follow a particular news story
- Sharing a single news article always shares it together with all three perspectives.
Why will people trust this app?
- Why do people trust bottled water, when they can actually drink safe tap water?
- Why did people trust Facebook until now, and why is their trust shaken after the data scandal broke out?
- Are reasons more than the obvious?
- Manufactured demand: research shows no trust in ‘media’ irrespective of side
- Perception and perceived control: My app keeps the reader in the judging seat
- Scaring and Misleading: Leaders like Trump/Modi openly shunning media
- Transparency: App states everything as is with no labels.
- Why do people trust Wikipedia or why is it seen as trustworthy?
- Machine-learning mammoths and tech giants like Facebook, Google Search, and YouTube already quote Wikipedia.
- Bias raters are popping up in response to demand: AllSides.com, MediaBiasFactCheck.com
How can I make an app for smartphones that will
- engage users from all sides
- make them read both sides of a story
- make them aware of media sources, authors, and known biases
- get them to follow a particular story
- restrict their curation and “trends”
- generate TLDR versions as optional reading
- Vox: is an American news and opinion website owned by Vox Media. The website was founded in 2014 by Melissa Bell, Matthew Yglesias, and Ezra Klein. Vox is noted for its concept of explanatory journalism.
- Snopes and PolitiFact:
Snopes.com is one of the first online fact-checking websites. It is a widely known resource for validating and debunking urban legends and similar stories in American popular culture, receiving 300,000 visits a day in 2010.
PolitiFact.com is a project operated by the Tampa Bay Times, in which reporters and editors from the Times and affiliated media fact-check statements by members of Congress, the White House, lobbyists, and interest groups. They publish original statements and their evaluations on the PolitiFact.com website and assign each a “Truth-O-Meter” rating.
- ‘Fake News’ Games:
PolitiTruth is a game about “distinguishing political fact from fiction.” In our current political climate, full of fake news and of the term fake news being co-opted, anything that gets the truth out to as many people as possible, as easily and enjoyably as possible, is more than just a Game of the Year.
Factitious is another game to test users’ ability to detect fake news from real.
- The Wall Street Journal – Red Feed, Blue Feed: To demonstrate how reality may differ for different Facebook users, The Wall Street Journal created two feeds, one “blue” and the other “red.” If a source appears in the red feed, a majority of the articles shared from the source were classified as “very conservatively aligned” in a large 2015 Facebook study. For the blue feed, a majority of each source’s articles aligned “very liberal.” These aren’t intended to resemble actual individual news feeds. Instead, they are rare side-by-side looks at real conversations from different perspectives.
- AllSides.com is a multi-partisan revolution and set of technologies fueled by data and people like you. The AllSides Bias Rating is based on a crowd-driven, patented technology. From bias ratings to AllSides for Schools and civil dialogue across divides, it’s people-powered tech.
MediaBiasFactCheck.com founded in 2015, is an independent online media outlet. MBFC News is dedicated to educating the public on media bias and deceptive news practices. MBFC News’ aim is to inspire action and a rejection of overtly biased media. We want to return to an era of straightforward news reporting. Funding for MBFC News comes from site advertising, individual donors, and the pockets of bias checkers.
- InShorts is a news app that selects the latest and best news from multiple national and international sources and summarises them into a short and crisp format of 60 words or less, personalized for you, in both English and Hindi. All summarised stories contain only headlines and facts, no opinions, to help you stay informed of current affairs.
- Check for being truly partisan-agnostic, including your own presumptions
- Evaluate sentiment rather than poking at belief or identity, as in the arguments that happen on social media
- Find the right proxy
- Measure the ‘do’ rather than the ‘say’ – collect feedback on what people are actually reading rather than their stated opinions
- People at the extreme ends of the bias spectrum are not the focus; the attempt is to caution people around the center against drifting towards the extremes
I imagined my process in stages, as a linear flowchart, but through my design thought process and practice it evolved into a circular method. I first started with sketches, from writing to mind maps to rough drawings, and then charted out the user flow. This was then made into a paper prototype to be presented to a few users – not for the user interface, but to understand the essence of the app, its meaning, the flow of thoughts and actions, and to spot anything abrupt or unusual.
While presenting the paper prototypes to the users, I asked them to narrate out loud, so that their thoughts, expectations, and understanding of the app and their actions were clear to themselves and to me, and I could record their reactions accurately. This fed back into the drawing board for a higher-fidelity prototype. All of this comprised the stage I term “sense and flow”. After sense and flow, the process was linear, though of course it had many iterations and requires more.
- Sense and Flow: the above described circular process. Prototypes 1 and 2 below belong to this section of the process.
- Primary high fidelity: This is the most understandable prototype you can quickly make – that would seem like a well-developed wireframe. Prototypes 3 and above were built here at this stage.
- Word Design and Visual Design: Choosing the right wording, expressions, and phrases.
- User Reactions: Showing high fidelity interactive prototypes and recording reactions – this was done during the ‘Major Major DIMENSIONS’ Studio Exhibition show, May 5th, 2018.
- Iterations – for refining the app, minor details, added features – based on more elaborate user feedback and reactions recorded.
For a news app that speaks against bias, I decided that it should be ‘colorless’ and stick to greys and whites. To add to the aesthetic, I chose the Pantone Color of the Year, Ultra Violet, which also represents the ‘centre’ that is neither left- nor right-biased; it is used only as an accent color.
The rest of the visual design was simple, not too far away from the regular news apps, and I based this on Google’s Material Design basics. The app will need a few more iterations on the visual design.
[ Work in progress ]
- You are fed news only from ‘center’ sources by default.
- If you read one biased news source, you will be shown another from the opposite end; the same logic is translated onto the news feed.
- You will have one news story from each side (left, right, and center) to read; without reading all three, you reset back home and pick your next story.
- A unique key is generated for every page combination, so that when a story is shared outside the app, clicking the link takes you back to the screen showing multiple sources of the same story.
- Filtering and judging will not change the general news feed, but reading will.
- TLDR versions of articles will be generated by machine learning as summaries. This can be turned off in settings, and users always have direct access to the full versions of articles.
- Next steps are to frame and refine all the features iteratively.
- Give multiple iterations to the visual design
- Frame this in two user contexts, the USA and India, and record more user reactions.
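The unique sharing key mentioned in the feature list could be derived deterministically from the story and its attached perspective articles, so the same combination always maps to the same link. A minimal sketch; the function name, the IDs, and the `zagrut.example` URL are all hypothetical.

```python
import hashlib

def share_key(story_id, perspectives):
    """Derive a short, stable key for a story plus its set of perspective
    articles, so an external share link reopens the same multi-source view."""
    # Sort so the key does not depend on the order perspectives were attached.
    payload = story_id + "|" + "|".join(sorted(perspectives))
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()[:12]

key = share_key("climate-bill-2018", ["reuters-123", "foxnews-456", "msnbc-789"])
deep_link = f"https://zagrut.example/s/{key}"
```

Because the key is a content hash rather than a database row ID, the app can regenerate it on any device and resolve the link back to the same screen of multiple sources.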
Journalism is being hit by forces that have been building for some time but the past year has seen this story break out from its media bubble to attract the attention of policymakers, politicians, and even the wider public. The news itself has become the news.
From the platform perspective, there is an increased recognition that algorithms are rarely neutral, nor can they deal with the nuances and complexities of our modern world. As regulators and legislators circle in the wings, Google and Facebook are responding in various ways including – in the news area – through partnerships with independent fact-checkers and the testing of new algorithms that attempt to break people out of their bubbles. They know too that their long-term business depends on building far higher levels of trust than our survey demonstrates people currently have in social media in particular. The crisis over fake news could be the best thing that has happened to journalism – or the worst.
Zagrut is not a magic wand aimed at solving all these problems. However, it is a step in that direction: critiquing the existing system of news propagation and consumption and making the user aware of the bias in this environment.