Filter bubble

A filter bubble is a state of intellectual isolation that can result from personalized searches, in which a website algorithm selectively guesses what information a user would like to see based on information about the user, such as location, past click behavior, and search history. As a result, users become separated from information that disagrees with their viewpoints, effectively isolating them in their own cultural or ideological bubbles. The choices made by these algorithms are not transparent. Prime examples include Google Personalized Search results and Facebook's personalized news stream. The bubble effect may have negative implications for civic discourse, according to Pariser, but contrasting views regard the effect as minimal and addressable. The surprising results of the 2016 US presidential election have been attributed to the influence of social media platforms such as Twitter and Facebook, which in turn has called into question the effect of the "filter bubble" phenomenon on users' exposure to fake news and echo chambers, spurring new interest in the term, with many concerned that the phenomenon may harm democracy.

(Technology such as social media) lets you go off with like-minded people, so you're not mixing with, sharing with, and understanding other points of view... It's super important. It's turned out to be more of a problem than I, or many others, would have expected.


Concept

The term was coined by Internet activist Eli Pariser around 2010 and discussed in his 2011 book of the same name. According to Pariser, users get less exposure to conflicting viewpoints and are isolated intellectually in their own informational bubble. He related an example in which one user searched Google for "BP" and got investment news about British Petroleum, while another searcher got information about the Deepwater Horizon oil spill, and noted that the two search results pages were "very different".

Pariser defined his concept of a filter bubble in more formal terms as "that personal ecosystem of information that's been catered by these algorithms". An Internet user's past browsing and search history is built up over time as they indicate interest in topics by "clicking links, viewing friends, putting movies in your queue, reading news stories", and so forth. An Internet firm then uses this information to target advertising to the user, or to make certain types of information appear more prominently in search results pages. The process is not random either; it operates as a three-step process. According to Pariser's book: "First, you figure out who people are and what they like. Then, you provide them with content and services that best fit them. Finally, you tune to get the fit just right." This illustrates how we let the media shape our thinking because of the repeated messages we encounter every day.
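
As a rough illustration of this profile-match-refine loop, here is a minimal sketch in Python. The item data, topic tags, scoring rule, and function names (build_profile, rank_items, update_profile) are hypothetical and greatly simplified; real personalization systems are proprietary and far more elaborate.

    from collections import Counter

    def build_profile(click_history):
        """Step 1: infer who the user is from what they have clicked on."""
        return Counter(topic for item in click_history for topic in item["topics"])

    def rank_items(candidates, profile):
        """Step 2: surface the content that best matches the inferred interests."""
        def score(item):
            return sum(profile.get(topic, 0) for topic in item["topics"])
        return sorted(candidates, key=score, reverse=True)

    def update_profile(profile, clicked_item):
        """Step 3: tune the fit; every new click sharpens the profile further."""
        profile.update(clicked_item["topics"])
        return profile

    # A user who has only clicked business stories keeps seeing business stories.
    history = [{"topics": ["business", "oil"]}, {"topics": ["business", "markets"]}]
    candidates = [
        {"title": "BP investment outlook", "topics": ["business", "oil"]},
        {"title": "Deepwater Horizon spill report", "topics": ["environment", "oil"]},
    ]
    profile = build_profile(history)
    print([item["title"] for item in rank_items(candidates, profile)])

Under this toy scoring rule, a searcher whose history leans toward business sees the investment story ranked first, echoing Pariser's "BP" example above.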

According to a Wall Street Journal study of how tracking and filtering work, "the top fifty Internet sites install an average of 64 data-laden cookies and personal tracking beacons each". Google, specifically, uses 57 signals to customize your search results. Searching for a word like "depression" on Dictionary.com, for example, lets the site install more than 200 tracking cookies and beacons on your computer so that other websites can then target you with antidepressants.

Other terms have been used to describe this phenomenon, including the "ideological framework" and "the figurative sphere that surrounds you as you search the Internet". The related term, "echo chamber", was originally applied to news media, but is now applied to social media as well.

Pariser's idea of the filter bubble was popularized after the TED talk he gave in May 2011, in which he gave examples of how filter bubbles work and where they can be seen. In a test seeking to demonstrate the filter bubble effect, Pariser asked several friends to search for the word "Egypt" on Google and send him the results. Comparing two of the friends' first pages of results, while there was overlap between them on topics such as news and travel, one friend's results prominently included links to information on the then-ongoing 2011 Egyptian revolution, while the other friend's first page of results did not include such links.

In The Filter Bubble, Pariser warns that a potential downside of filtered searching is that it "closes us off to new ideas, subjects, and important information", and "creates the impression that our narrow self-interest is all that exists". In his view, this is potentially harmful to both individuals and society. He criticized Google and Facebook for offering users "too much candy, and not enough carrots". He warned that "invisible algorithmic editing of the web" may limit our exposure to new information and narrow our outlook. According to Pariser, the detrimental effects of filter bubbles include harm to society in general, in the sense that they may "undermine civic discourse" and make people more vulnerable to "propaganda and manipulation". He wrote:

A world constructed from the familiar is a world in which there's nothing to learn... (since there is) invisible autopropaganda, indoctrinating us with our own ideas.

Many people are unaware that filter bubbles even exist. This can be seen in an article in The Guardian, which mentions the fact that "more than 60% of Facebook users are entirely unaware of any curation on Facebook at all, believing instead that every single story from their friends and followed pages appeared in their news feed". A brief explanation of how Facebook decides what goes on a user's news feed is that the ranking algorithm takes into account "how you have interacted with similar posts in the past".

Filter bubbles have been described as exacerbating a phenomenon called splinternet or cyberbalkanization, which happens when the internet becomes divided into sub-groups of like-minded people who become insulated within their own online community and fail to get exposure to different views. These concerns date back to the early days of the publicly accessible internet, with the term "cyberbalkanization" being coined in 1996.

Similar concepts

Barack Obama's farewell address identified a concept similar to filter bubbles as a "threat to [Americans'] democracy", namely the "retreat into our own bubbles, ... especially our social media feeds, surrounded by people who look like us and share the same political outlook and never challenge our assumptions... And increasingly, we become so secure in our bubbles that we start accepting only information, whether it's true or not, that fits our opinions, instead of basing our opinions on the evidence that is out there".

Reaction

There are conflicting reports about the extent to which personalized filtering is happening and whether such activity is beneficial or harmful. Analyst Jacob Weisberg, writing in June 2011 for Slate, conducted a small, non-scientific experiment to test Pariser's theory, which involved five associates with different ideological backgrounds conducting a series of searches ("John Boehner", "Barney Frank", "Ryan plan", and "Obamacare") and sending Weisberg screenshots of their results. The results varied only in minor respects from person to person, and any differences did not appear to be ideology-related, leading Weisberg to conclude that a filter bubble was not in effect, and to write that the idea that most Internet users were "feeding at the trough of a Daily Me" was overblown. Weisberg asked Google to comment, and a spokesperson stated that algorithms were in place to deliberately "limit personalization and promote variety". Book reviewer Paul Boutin did a similar experiment to Weisberg's among people with differing search histories and again found that the different searchers received nearly identical search results. Interviewing programmers at Google off the record, journalist Per Grankvist found that user data used to play a bigger role in determining search results, but that Google, through testing, found the search query itself to be by far the best determinant of which results to display.

A scientific study from Wharton that analyzed personalized recommendations also found that these filters can actually create commonality, not fragmentation, in online music taste. Consumers reportedly use the filters to expand their taste rather than to limit it. Harvard law professor Jonathan Zittrain disputed the extent to which personalization filters distort Google search results, saying that "the effects of search personalization have been light". Furthermore, Google lets users shut off personalization features if they choose, by deleting Google's record of their search history and setting Google not to remember their search keywords and visited links in the future.

Although algorithms do limit political diversity, some of the filter bubble is the result of user choice. In a study by data scientists at Facebook, they found that for every four Facebook friends who share a user's ideology, the user has one friend with contrasting views. No matter what Facebook's News Feed algorithm does, people are simply more likely to befriend or follow people who share similar beliefs. The nature of the algorithm is that it ranks stories based on a user's history, resulting in a reduction of "politically cross-cutting content by 5 percent for conservatives and 8 percent for liberals". However, even when people are given the option to click on a link offering contrasting views, they still default to their most-viewed sources: "[U]ser choice decreases the likelihood of clicking on a cross-cutting link by 17 percent for conservatives and 6 percent for liberals." A cross-cutting link is one that introduces a different point of view than the user's presumed point of view, or what the website has pegged as the user's beliefs.

Facebook's study thus found it "inconclusive" whether or not the algorithm played as big a role in filtering News Feeds as people assumed. The study also found that "individual choice", or confirmation bias, likewise affected what gets filtered out of users' News Feeds. Some social scientists criticized this conclusion, because the point of protesting the filter bubble is precisely that the algorithms and individual choice work together to filter News Feeds. They also criticized Facebook's small sample size, which is about "9% of actual Facebook users", and the fact that the study results "can not be reproduced" because the study was conducted by "Facebook scientists" who had access to data that Facebook does not make available to outside researchers.

Although the study found that only about 15-20% of the average Facebook user's friends subscribe to the opposite side of the political spectrum, Julia Kaman of Vox theorized that this could have positive implications for viewpoint diversity. These "friends" are often acquaintances with whom we would likely not share our politics without the internet. Facebook may foster a unique environment in which a user sees, and possibly interacts with, content posted or re-posted by these "second-tier" friends. The study found that "24 percent of the news items liberals saw were conservative-leaning and 38 percent of the news conservatives saw was liberal-leaning". These interactions can supply diverse information and sources that may moderate users' views. Similarly, a New York University study of Twitter filter bubbles concluded that "Individuals now have access to a wider span of viewpoints about news events, and most of this information is not coming through the traditional channels, but either directly from political actors or through their friends and relatives. Furthermore, the interactive nature of social media creates opportunities for individuals to discuss political events with their peers, including those with whom they have weak social ties." According to these studies, social media may be diversifying the information and opinions users come into contact with, though there is much speculation around filter bubbles and their ability to create deeper political polarization.

A study by researchers from Oxford, Stanford, and Microsoft examined the browsing histories of 1.2 million US users of the Bing Toolbar add-on for Internet Explorer between March and May 2013. They selected 50,000 of those users who were active news consumers, then classified whether the news outlets they visited were left- or right-leaning, based on whether the majority of voters in the counties associated with the users' IP addresses voted for Obama or Romney in the 2012 presidential election. They then identified whether news stories were read after accessing the publisher's site directly, through the Google News aggregation service, through web searches, or through social media. The researchers found that while web searches and social media do contribute to ideological segregation, the vast majority of online news consumption consisted of users directly visiting left- or right-leaning mainstream news sites, and consequently being exposed almost exclusively to views from a single side of the political spectrum. Limitations of the study included selection issues such as Internet Explorer users skewing older than the general Internet population; Bing Toolbar usage and the voluntary (or unknowing) sharing of browsing history selecting for users who are less concerned about privacy; the assumption that all stories in left-leaning publications are left-leaning, and the same for right-leaning publications; and the possibility that users who are not active news consumers may get most of their news via social media, and thus experience stronger effects of social or algorithmic bias than users who essentially self-select their bias through their choice of news publications (assuming they are aware of the publications' biases).



There are reports that Google and other sites maintain vast "files" of information about their users, which might enable them to personalize individual Internet experiences further if they chose to do so. For instance, the technology exists for Google to track users' past histories even if they do not have a personal Google account or are not signed into one. One report stated that Google had collected "10 years' worth" of information amassed from varying sources such as Gmail, Google Maps, and other services besides its search engine, although a contrary report was that trying to personalize the Internet for each user is technically challenging for an Internet firm to achieve despite the huge amount of data available. Analyst Doug Gross of CNN suggested that filtered searching seems to be more helpful for consumers than for citizens, and would help a consumer searching for "pizza" find local delivery options based on a personalized search, appropriately filtering out distant pizza stores. Organizations such as the Washington Post, The New York Times, and others have experimented with creating new personalized information services, aiming to tailor results to those that users are likely to like or agree with.

When filter bubbles are in place, they can create specific moments that scientists call "Whoa" moments. A "Whoa" moment is when an article, advertisement, post, or similar item appears on your computer that relates to a current action or the current use of an object. Scientists coined the term after a young woman, going through her daily routine that included drinking coffee, opened her computer and saw an advertisement for the very brand of coffee she was drinking: "Sit down and open up this morning over coffee, and there are two ads for Nespresso, the 'whoa' moment when the product you're drinking is on the screen in front of you." "Whoa" moments occur when people are "found": advertising algorithms target specific users based on their "click behavior" in order to increase their sales revenue. "Whoa" moments can also nudge users to stick with a routine and stay loyal to a product.

Some designers have developed tools to counteract the effects of filter bubbles (see Countermeasures below). The Swiss radio station SRF voted the word Filterblase (the German translation of filter bubble) word of the year 2016.

Countermeasures

By individual users

In The Filter Bubble: What the Internet Is Hiding from You, Internet activist Eli Pariser highlights how the increasing occurrence of filter bubbles further emphasizes the value of one's bridging social capital, as defined by Robert Putnam. Indeed, while bonding capital corresponds, on one hand, to the formation of strong ties between like-minded people, thus reinforcing a sense of social homogeneity, bridging social capital, on the other hand, represents the creation of weak ties between people with potentially divergent interests and viewpoints, hence introducing significantly more heterogeneity. In that sense, high bridging capital is much more likely to promote social inclusion by increasing our exposure to a space where we address problems that transcend our niches and narrow self-interests. Fostering one's bridging capital, for example by connecting with more people in an informal setting, can therefore be an effective way to reduce the influence of the filter bubble phenomenon.

Users can in fact take many actions to burst through their filter bubbles, for example by making a conscious effort to evaluate what information they are exposing themselves to, and by thinking critically about whether they are engaging with a broad range of content. This view argues that users should change the psychology of how they approach media, rather than relying on technology to counteract their biases. Users can consciously avoid news sources that are unverifiable or weak. Chris Glushko, VP of Marketing at IAB, advocates using fact-checking sites such as Snopes.com to identify fake news. Technology can also play a valuable role in combating filter bubbles.

Websites such as allsides.com and hifromtheotherside.com aim to expose readers to different perspectives through diverse content. Some additional plug-ins are aimed at helping people step out of their filter bubbles and make them aware of their personal perspectives; these tools show content that contradicts the user's beliefs and opinions. For instance, Escape Your Bubble asks users to indicate a specific political party they want to be more informed about. The plug-in then suggests articles from well-established sources relating to that political party, encouraging users to become more educated about the other party. In addition to plug-ins, there are apps created with the mission of encouraging users to open up their echo chambers. Read Across the Aisle is a news app that reveals whether or not users are reading from diverse news sources that include multiple perspectives. Each source is color-coded, representing the political leaning of each article. When users only read news from one perspective, the app communicates that to the user and encourages the reader to explore other sources with opposing viewpoints. Although apps and plug-ins are tools humans can use, Eli Pariser stated that "certainly, there is some individual responsibility here to really seek out new sources and people who aren't like you".

Because web-based advertising can further the effect of the filter bubble by exposing users to more of the same content, users can block many ads by deleting their search history, turning off targeted ads, and downloading browser extensions. Extensions such as Escape Your Bubble for Google Chrome aim to help curate content and prevent users from being exposed only to biased information, while Mozilla Firefox extensions such as Lightbeam and Self-Destructing Cookies let users visualize how their data is being tracked and remove some of the tracking cookies. Some use anonymous or non-personalized search engines such as YaCy, DuckDuckGo, StartPage, Disconnect, and Searx to prevent companies from gathering their web-search data. The Swiss daily Neue Zürcher Zeitung is beta-testing a personalized news engine app which uses machine learning to guess what content a user is interested in, while "always including an element of surprise"; the idea is to mix in stories that the user is unlikely to have followed in the past.

The EU is taking measures to lessen the effect of the filter bubble. The European Parliament is sponsoring inquiries into how filter bubbles affect people's ability to access diverse news, and it has introduced a program aimed at educating citizens about social media. In the US, the CSCW panel suggests the use of news aggregator apps to broaden media consumers' news intake. News aggregator apps scan current news articles and direct readers to different viewpoints on a given topic. Users can also use a diversity-aware news balancer, which visually shows media consumers whether they are leaning left or right in their reading, indicating a right lean with a bigger red bar or a left lean with a bigger blue bar. A study evaluating this kind of news balancer found "a small change in reading behavior, toward more balanced exposure, among users who saw the feedback, compared with a control group".
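
As a rough sketch of how such a news balancer might work, the following Python snippet tallies the political lean of the outlets a user has read and draws simple text bars. The outlet names, lean mapping, and bar rendering are hypothetical illustrations, not the actual tool evaluated in the study.

    from collections import Counter

    # Hypothetical mapping from outlets to a coarse political lean.
    # A real balancer would rely on a curated or crowd-sourced dataset.
    OUTLET_LEAN = {
        "outlet_a": "left",
        "outlet_b": "right",
        "outlet_c": "center",
    }

    def reading_balance(outlets_read):
        """Return the share of left-, center-, and right-leaning articles read."""
        counts = Counter(OUTLET_LEAN.get(outlet, "unknown") for outlet in outlets_read)
        total = sum(counts.values()) or 1
        return {lean: counts[lean] / total for lean in ("left", "center", "right", "unknown")}

    def render_bars(balance, width=30):
        """Print one bar per lean; a longer bar signals heavier reading on that side."""
        for lean in ("left", "center", "right"):
            bar = "#" * int(balance[lean] * width)
            print(f"{lean:>6} | {bar} {balance[lean]:.0%}")

    history = ["outlet_a", "outlet_a", "outlet_b", "outlet_c", "outlet_a"]
    render_bars(reading_balance(history))

A skewed reading history produces one visibly longer bar, which is the kind of at-a-glance feedback the study describes.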

By media companies

In light of recent concerns about information filtering on social media, Facebook acknowledged the presence of filter bubbles and has taken steps toward removing them. In January 2017, Facebook removed personalization from its Trending Topics list in response to problems with some users not seeing highly talked-about events there. Facebook's strategy is to reverse the Related Articles feature it implemented in 2013, which posted related news stories after the user read a shared article. The revamped strategy flips this process and posts articles from different perspectives on the same topic. Facebook is also attempting a vetting process whereby only articles from reputable sources are shown. Along with the founder of Craigslist and a few others, Facebook has invested $14 million in efforts "to increase trust in journalism around the world, and to better inform the public conversation". The idea is that even if people are only reading posts shared by their friends, at least those posts will be credible.

Similarly, on January 30, 2018, Google also acknowledged the existence of filter bubble difficulties within its platform. Because current Google searches pull algorithmically ranked results based on "validity" and "relevance", which show and hide certain search results, Google is seeking to combat this. By training its search engine to recognize the intent of a search inquiry rather than the literal syntax of the question, Google is attempting to limit the size of filter bubbles. The initial phase of this training is to be introduced in the second quarter of 2018. Questions that involve bias and/or controversial opinions will not be addressed until a later time, which raises a larger problem that still exists: whether the search engine acts as an arbiter of truth or as a knowledgeable guide by which to make decisions.

In April 2017, news surfaced that Facebook, Mozilla, and Craigslist had contributed the majority of a $14 million donation to CUNY's "News Integrity Initiative", aimed at eliminating fake news and creating more honest news media.

Later, in August, Mozilla, maker of the Firefox web browser, announced the formation of the Mozilla Information Trust Initiative (MITI). MITI serves as a collective effort to develop products, research, and community-based solutions to combat the effects of filter bubbles and the proliferation of fake news. Mozilla's Open Innovation team leads the initiative, striving to combat misinformation, with a specific focus on products relating to literacy, research, and creative interventions.

Ethical implications

As the popularity of cloud services increases, the personalized algorithms used to construct filter bubbles are expected to become more widespread. Scholars have begun to consider the effect of filter bubbles on social media users from an ethical standpoint, particularly with respect to personal freedom, security, and information bias. Filter bubbles in popular social media and personalized search sites can determine the particular content seen by users, often without their direct consent or cognizance, owing to the algorithms used to curate that content. Critics of filter bubbles speculate that individuals may lose autonomy over their own social media experience and have their identities socially constructed as a result of the pervasiveness of filter bubbles.

Technologists, social media engineers, and computer specialists have also examined the prevalence of filter bubbles. Mark Zuckerberg, founder of Facebook, and Eli Pariser, author of The Filter Bubble, have expressed concerns regarding the risks of privacy loss and information polarization. The information of users of personalized search engines and social media platforms is not private, though some people believe it should be. The concern over privacy has resulted in a debate over whether or not it is moral for information technologists to take users' online activity and manipulate their future exposure to related information.

Because the content seen by individual social media users is influenced by algorithms that produce filter bubbles, users of social media platforms are more susceptible to confirmation bias and may be exposed to biased, misleading information. Social sorting and other unintentional discriminatory practices are also anticipated as a result of personalized filtering.

In light of the 2016 US presidential election, scholars have likewise expressed concern about the effect of filter bubbles on democracy and democratic processes, as well as the rise of "ideological media". These scholars fear that users will be unable to think beyond their narrow self-interest, because filter bubbles create personalized social feeds that isolate them from diverse points of view and from the communities around them. Of particular concern is how filter bubbles manipulate news feeds through algorithms, contributing to the proliferation of "fake news" and potentially affecting political leanings, including how users vote.

See also

  • Communal reinforcement
  • Confirmation bias
  • Content farming
  • Deradicalization
  • Echo chamber - a similar phenomenon in which ideas are amplified in a closed system and opposing views are aggressively censored
  • False consensus effect
  • Group polarization
  • Media consumption
  • Search engine manipulation effect
  • Selective exposure theory
  • Serendipity - the antithesis of the filter bubble
  • Search engines that claim to avoid filter bubbles: DuckDuckGo, Ixquick, MetaGer, Searx, and Startpage.


Further reading

  • Pariser, Eli. The Filter Bubble: What the Internet Is Hiding from You, Penguin Press (New York, May 2011) ISBN 978-1-59420-300-8
  • Green, Holly (August 29, 2011). "Breaking Out of Your Internet Filter Bubble". Forbes . Retrieved December 4, 2011 .
  • Friedman, Ann. "Go Viral." Columbia Journalism Review 52.6 (2014): 33-34. Communication & Mass Media Complete.
  • Bozdag, Engin; van den Hoven, Jeroen (December 18, 2015). "Breaking up the filter bubble: democracy and design". Ethics and Information Technology . 17 (4): 249-265. doi: 10.1007/s10676-015-9380-y.


External links

  • Filtering bubbles in internet search engines, Newsnight/BBC News, June 22, 2011

Source of the article: Wikipedia
