As social media platforms move toward a more personalized experience, users now receive more of the information they are likely to be interested in, based on their prior browsing and clicks. This is starting to play a huge role in our civic system. Someone who feels one way about a social issue will see suggested ads for articles that reinforce their viewpoint instead of ads that show the other side of the argument. Another example is how people spend their money at online stores: personalization by algorithms creates a 'loyal' customer base through suggested ads that draw the user back to the same store. Some argue this plays a huge part in the polarization of our society. Ultimately, however, it falls upon the user to decide whether to follow that suggested information. A good example is the role of peer pressure in drinking alcohol. Sure, peer pressure has been shown to increase the chances of someone drinking, but the final decision to take a sip or not rests with the individual. The same goes for the personalization of information.
When reflecting on my own filter bubble, I decided to look at my YouTube page. Looking at someone's YouTube page can give you a reasonable read on their personality: the music they tend to listen to, the political videos they choose to watch, the attention-grabbing viral videos they are drawn to, and, overall, what they choose to watch in their free time. At the top of the YouTube page there is a "Recommended" video category. This is the "money maker" of all the algorithms on YouTube; it shows videos you might be interested in watching based on your prior viewing. Mine in particular showed 5 music-type videos (note: some were songs I had already listened to on YouTube, but as covers, live performances, or newly released videos), 2 Stephen Colbert videos, a John Oliver video, and a "Yao Ming top 10 plays of his career" video. Taken together, the content the algorithm chose for me is not surprising. Nowadays I primarily use YouTube to listen to music, with an occasional sports video or politically left short video, and I am subscribed to the "Late Show with Stephen Colbert" YouTube page. What surprised me was the particular band behind the suggested videos. Five of the six music videos were by the band Tool. I'm not an avid listener of Tool, but I had listened to a few of their songs in the past week in particular. This suggests that the algorithm powering the Recommended section weights recent views just as much as, if not more than, total views.
As you scroll down the page, the next section you see is "Watch It Again." It seems to show videos you may have watched once or twice in the past but haven't watched in a while, similar to the "Friends that Need Love" section in Snapchat. The next section for me is "Recently Uploaded," followed by the words "Recommended videos for you." I do often look for videos that are very up to date and recent, so that could explain it. The sections after that are a mix of channels I am subscribed to and, again, playlists recommended for me. These recommended playlists show more of my consistent "music filter bubble," with playlists built around Sublime, 311, Incubus, and others. To me, it is interesting that these selections sit closer to the bottom of the page rather than in the top "Recommended" section. Is that particular algorithm judging my music taste with a simple recency equation? Or is it more complex than that? For example, it could see that I tend to gravitate toward more recent videos when it comes to news, and therefore assume that the music I have listened to most recently matters most. My brain hurts.
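To make the "simple recency equation" idea concrete, here is a toy sketch of one way a recency-weighted score could work. Everything here is my own assumption for illustration, not anything YouTube has published: the half-life decay, the seven-day default, and the example watch histories are all hypothetical.

```python
import time

def recency_weighted_score(view_timestamps, now, half_life_days=7.0):
    """Score an artist or channel by summing exponentially decayed view weights.

    Each past view contributes 0.5 ** (age_in_days / half_life_days), so a view
    from one half-life ago counts half as much as a view from today. This is a
    hypothetical model, not YouTube's actual ranking formula.
    """
    day = 86400.0  # seconds per day
    return sum(
        0.5 ** ((now - t) / day / half_life_days)
        for t in view_timestamps
    )

now = time.time()
day = 86400.0

# Hypothetical histories: a long-time favorite vs. a band binged this week.
old_favorite = [now - 60 * day] * 10   # ten views, two months ago
recent_binge = [now - 2 * day] * 4     # four views, two days ago

# Under this model the recent binge outscores the old favorite, matching
# the "weighted by recent views" behavior observed in the Recommended row.
print(recency_weighted_score(old_favorite, now))
print(recency_weighted_score(recent_binge, now))
```

Under these assumptions, ten views of an old favorite from two months ago decay to almost nothing, while four views from this week keep nearly full weight, which would explain a week of Tool listening crowding out bands I have watched far more in total.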
In conclusion, the complexity or simplicity of an algorithm is not the final word on what information you choose to take in. Whether it's the music you listen to, the news articles you read, or the purchases you make, the user still has the ability to pick and choose the information they take in. The entirety of information is at our disposal; it just requires an extended amount of effort to look past the personal bubble that has formed. And the sad truth is that this really isn't in our nature. As humans and as a society, we keep looking for the easiest route, and this particular issue is no different. Now we can simply find the things we enjoy more easily. We need to be careful that we don't abuse this accommodation and split our society.