Essay planning
Critically analyse some societal benefits and dangers of social media
• Find, evaluate, synthesise and use information from a variety of sources.
• Apply skills of critical analysis to the task.
• Express ideas effectively, and communicate information appropriately and accurately.
• Use the Harvard referencing style.
When handing in the essay, include:
-Your name
-Student id
-Unit title
-Assessment title
-Word count
___________________________________________________________________________________
Benefits:
-https://www.businessinsider.com/facebook-is-using-ai-to-try-to-predict-if-youre-suicidal-2018-12?r=US&IR=T /
While Facebook has been scrutinized for collecting data from its users, it seems they are putting this data collection to beneficial use: an attempt to help users who appear suicidal by monitoring their posts.
-Filter bubbles // In my opinion these are both beneficial and negative
-https://ebookcentral.proquest.com/lib/mmu/reader.action?docID=5963805 blockchains: how they are good and bad
-https://mmu.on.worldcat.org/search/detail/1107456276?queryString=How%20social%20media%20benefits%20&clusterResults=true&groupVariantRecords=false /how social media affects higher education
-fandoms
Dangers:
-music AI: fandoms pushed the AI-generated Drake and The Weeknd song, which UMG responded to by taking it down
-Digital currencies and NFTs, and how they are used for scams and pump-and-dumps through social media and influencers: 'NFT bros', pump-and-dumps, and the dangers they pose
-AI in media / AI music voices / presidents / mis- and disinformation
-fandoms: active audience theory is about fandoms; Reddit and Twitter fan art and fan fiction
-Moral panic is a theory that describes the way the mass media create folk devils out of products or groups of people. Coronavirus media coverage / claims of TikTok stealing data
-Trolling and cyberbullying are terms used to describe the way celebrities and individuals are targeted online in a negative way
-The hypodermic syringe model is about media having a direct negative impact: people are easily swayed by media to believe what they are told. The raid on the Capitol was formed via ideas spread by news channels like Fox and by social media. This links to filter bubbles, as these people were able to organise the event without most others knowing, because only they were consuming content about it
-Filter bubbles on TikTok: something to do with kids being depressed and being fed depressing content on TikTok
__________________________________________________________________________________
This is a great example of an introduction, copy it!!
This essay will examine and explain how the impact of social media has affected society, both beneficially and negatively. It will show how filter bubbles, created by algorithms, can cause moral panic and affect people's experiences online. It will also show how the worldviews of the users in these active audiences can change due to what they watch and consume, whether through mis- or disinformation that has spread online. The essay will also show how these fandoms can affect outside influence and what that can cause on a global scale. Furthermore, the essay will conclude by showcasing how people can use social media while minimising the effects that filter bubbles can potentially cause, which can decrease the amount of mis- and disinformation present on social media.
https://participedia.net/case/7999
https://www.npr.org/2022/10/28/1131500833/me-too-harvey-weinstein-anniversary
Filter bubbles can have both a beneficial and a negative effect on social media. A user experience made specifically for what a person enjoys seems great, as it can allow the connectivity of communities and people to grow online and lets users experience content they will agree with. 'They can help us find the information we need to see and hear' (Pariser, 2011). One specific event that took place over social media and changed the landscape of society was the 'Me Too' movement, in which women used social media platforms such as Twitter and Instagram to talk about past experiences with famous figures, or past relationships, that had left them traumatised. In one journal article, the author explains how the 'virality' of the Me Too hashtag, which trended on Twitter, 'fulfilled its purpose of raising awareness of sexual violence.' Twitter's and Instagram's algorithmic filter bubbles were what made this hashtag go viral, which opened up more people to a growing community of women speaking up online about their past experiences. Furthermore, an article from NPR examines the ripple effects this community caused, one example being the infamous rise of 'cancel culture'. The author delves into Harvey Weinstein's termination, following a Wall Street Journal article recounting his crimes of sexual abuse, which intersected with the growing Me Too movement taking place on social media. The author states that 'Dozens of women stepped forward to publicly share the extent of the powerful producer's bad acts', which was thanks to more women feeling comfortable coming forward, knowing there was a community of women alongside them. Even though cancel culture has a bad reputation now, during its beginning stages it showed how a community formed through an algorithmic filter bubble could change how society acts.
This relates to Eli Pariser's statement, as it helped many people find, and post, information they needed to see and hear, showing how social media can have a beneficial effect on society, and how people in good faith who need help can use these tools to find a community that helps them make a change.
Pariser, E. (2011) The Filter Bubble: What the Internet Is Hiding from You. Google Books [Online] (Accessed: 10 March 2023).
https://www.washingtonpost.com/technology/2021/09/03/facebook-misinformation-nyu-study/
https://www.thelancet.com/journals/landig/article/PIIS2589-7500(20)30227-2/fulltext
https://www.abc.net.au/news/science/2018-01-17/what-anti-vaccination-groups-reveal-about-facebook-filter-bubble/9324876
https://networkconference.netstudies.org/2021/2021/04/25/how-filter-bubbles-and-echo-chambers-reinforce-negative-beliefs-and-spread-misinformation-through-social-media/
While there are many examples showing how filter bubbles can benefit society, one negative example is the constant flow of mis- and disinformation spread across social media. During the pandemic, large numbers of posts spread online that either downplayed the spread of the virus or misinformed people about what was actually in the vaccine. The term 'anti-vax' has been around for years, as misinformation has long been used to spread the idea that vaccines cause autism in children. An article by ABC Science, written by Ariel Bogle, examined how Facebook's filter bubbles caused 'echo chambers', giving insight into how communities such as anti-vax groups form on these apps. Alongside this, a report from The Lancet Digital Health published during the pandemic noted that '31 million people follow anti-vaccine groups on Facebook', which suggests that Facebook's algorithm leads people with these views into echo chambers, in which they are fed the same misinformation they already believe. During the rollout of the vaccine, this greatly misled the public's opinion of it, which contributed to the pandemic lasting far longer than estimated. There are many contributing factors that can explain why Facebook is home to so many of these misinformed communities that spread mis- and disinformation on the app. An article by The Washington Post, published in 2021, shows that American users are more likely to interact with 'fake news' than with factual posts. It goes on to explain that these posts containing misinformation are politically motivated, containing both far-right and far-left views, and are posted by users who are not political journalists, while factual information on the app is mostly politically neutral and is posted by professional news accounts.
The article proceeds to say that 'posts with far right or left leaning views, gain more traction'. This shows how Facebook's filter bubbles and its algorithm spread misinformation based on a person's political views, which can have massive repercussions on political elections. This relates to the hypodermic needle theory of media effects, which holds that people are easily swayed into believing information from a source they trust. Furthermore, events such as the 2021 raid on the Capitol are seen to have been caused by misinformation spread through Facebook by far-right accounts claiming the 2020 election was rigged. This shows how the filter bubbles present on almost all social media could potentially be a danger to society, as misinformation could spread so far that it causes a mass of uninformed people to attack others and damage property. This proves that while they may help positive communities grow online and make society change for the better, online filter bubbles, and the algorithms social media companies use, could potentially cause many problems and form negative communities fuelled by the misinformation they consume online.
https://about.fb.com/news/2017/03/building-a-safer-community-with-new-suicide-prevention-tools/
https://newatlas.com/facebook-suicide-prevention-algorithm-ethics-concern/58415/#:~:text=The%20authors%20cite%20a%20variety,a%20profound%20lack%20of%20transparency.
While Facebook has been scrutinized for collecting data from its users, and for its algorithm that spreads misinformation, it seems they are also putting these tools to beneficial use: the company appears to be using its resources to help prevent suicide across the world. On Meta's website in 2017, they showcased how they were building a safer community with their 'suicide prevention tools', which would predict a user's mental state from their posts and contact the authorities in any case where that might help the person. Through this, Meta have created a way to use filter bubbles and their algorithm to benefit society and try to help people in need. They also include many built-in reporting and support-contact tools that allow people to ask for help for themselves or others. However, while this may seem like a step in the right direction, it also poses a danger for vulnerable people who did not consent to real-world intervention because of what they post on social media. An article from New Atlas commented that the suicide-prevention algorithm raises 'ethical issues', and adds that the tool has been banned in Europe under a GDPR rule that deems it a 'privacy violation'. Building on this, the article explains that 'Neither the general public nor the medical community actually know how successful the system is', as Meta is not transparent about how many suicides are prevented by these tools. Moreover, the tool's rapid escalation to calling the authorities could cause serious social issues for people who do not actually need help, as police could be called on unwitting citizens. Furthermore, the reporting system announced alongside it can easily be abused for cyberbullying, trolling, and even by toxic fandoms. This is a major design flaw that could have massive repercussions and be a danger to society, as people can be targeted for their opinions on the app.
This can become a bigger issue because of the large amounts of misinformation and disinformation prevalent on the app. Altogether, this tool has shown promise, as it helps to prevent tragedies and limit the spread of suicidal material on the app, which could benefit society, given that mental health has been a problem for many young adults and teenagers on social media. Through the tool, Meta attempted to use their algorithm and filter bubbles to create an easy way for people to get help with their struggling mental health before self-harm was inflicted. However, the system they created has proven inconsistent and a societal danger to people who may not have needed help, as well as an invasion of privacy that endangers its users.
https://theconversation.com/ukraine-how-social-media-images-from-the-ground-could-be-affecting-our-response-to-the-war-178722
https://apnews.com/article/tiktok-russia-propaganda-labels-ukraine-kremlin-china-88ecd866a2c34218ebbaccb45e435c12#:~:text=WASHINGTON%20(AP)%20%E2%80%94%20A%20year,policy%20has%20been%20applied%20inconsistently.
https://www.theatlantic.com/technology/archive/2021/06/your-tiktok-feed-embarrassing/619257/
https://www.youtube.com/watch?v=FEXlPxwZE_A
Nowadays, information spreads fast through social media, and this helps society understand what is affecting or changing their lives. We live in a digital society in which access to world news is easier than it has ever been. Many 'journalist' accounts across social media help to inform viewers of issues every day, which helps more people understand current events that may affect them personally. One example of this is the mass broadcasting of the war taking place in Ukraine on TikTok and Twitter; an article from The Conversation calls it the first 'TikTok war' due to the degree to which civilians in Ukraine have been documenting the war on these apps, showing people the effects of the war and explaining how they can help. This amount of citizen journalism allows society to understand the situation better, and even shows imagery and videos from the ground of the war, adding a human feeling to the posts that resonates with many people. Alongside this, an article from The Guardian builds on this by explaining how social media influencers in Russia have become beacons of resistance, speaking out against the conditions the Russian government is putting Ukraine's people through. Through this we can understand that these social media apps are being used to spread information that will greatly affect how people react to the war. Moreover, the Russian influencers and civilian journalists speaking out against the war, and documenting it, greatly benefit society across the world, as they can give hope to people who are suffering because of this war and potentially change the outcome of this terrible tragedy. Of course, with these tools, countries such as Russia and North Korea can use them to their advantage, spreading misinformation and propaganda across these social media sites.
TikTok's algorithm has already garnered a lot of controversy over its addictive nature and how it directs users into strong filter bubbles that could be detrimental to their social life or mental health. 'I'm scared of the person TikTok thinks I am' (Tiffany, 2021). However, what Russia has been using the algorithm for could be a lot more dangerous for society, especially the younger audience that dominates the app. An article published by AP News showed that despite TikTok's attempts to mitigate 'Pro-Russian propaganda' accounts, researchers found more than 80 accounts exploiting TikTok's algorithm to spread misinformation concerning Western governments such as the US and UK. This shows that social media channels like TikTok can be a huge danger to society, as they are able to reach such a large audience of young adults and teens and potentially indoctrinate them, using this misleading content, into supporting the Russian invasion. This is similar to how many terrorist organisations were able to indoctrinate teens using social media or online webchats, which resulted in an increase of young adults running away to join these groups. Overall, apps such as TikTok and Twitter need to implement better anti-propaganda tools to help prevent young adults and teens from being misled, as these situations have in the past proven very dangerous for society.
To conclude, many of these sources have shown that the tools social media companies use to keep you on their app, and in a filter bubble, can have a positive effect on society: they can help lock up criminals, spread information that benefits people's lives, and help people when they are in need. However, this does not downplay the danger that these echo chambers and algorithms can inflict on society. Misinformation is very hard to avoid online, and people often believe information that is not true, which can create communities that spread even more 'fake news' to a larger audience. This is a massive danger to society, as it can cause moral panic in large audiences and lead to events such as the 2021 raid on the Capitol. The best way to avoid misinformation online is to watch or read your news from a trusted and verified news source, and not to automatically believe a civilian journalist who appears on your For You page, as the information they are spreading to their audience could be incorrect. While it is relatively easy for more informed people to judge whether what they are watching is correct, parents should understand that their children can be easily influenced by content on social media, and should make sure their child is not accessing websites or accounts they should not.