Fortnite 1kill 1piece of cloth
Strip Fortnite is only the latest in a string of problems with YouTube's search and suggestion algorithms. These videos don't actually contain explicit sexual content, just some skin here and there and the occasional blurred chest, but they allude to and tease the possibility of uncensored nudity at every opportunity.

Take known idiot RiceGum's Strip Fortnite videos, for example. Each starts with a micro-montage of what is implied to occur in the video. Quick cuts between cleavage and the subjects screaming usually conclude with a half-second of a shirt beginning to come off, though the videos themselves are necessarily censored if it ever gets to that point. It's just enough to send little Billy's imagination off a sheer sexual cliff. It's exploitative garbage. But the strategy has worked out well for the guy: a few of his Strip Fortnite videos have over 10 million views, doubling, sometimes tripling, other videos of his released at the same time.

It's indicative of a larger problem on YouTube, in which creators conflate known, sometimes taboo, fetishes with popular search terms in order to tap into the largest audience possible. OK, so there's a lot of super horny, crass imagery on YouTube, and every little Billy in the world can see the lion's share of it without any restrictions, even though there are policies in place to shield poor little Billy's eyes.

Trouble over a particular YouTuber's take on the trend has drawn the most (well-deserved) ire. Content fool touchdalight recently posted, and shortly after removed, a video that implied he would be playing Strip Fortnite with his 13-year-old sister, a fictional premise that wasn't actually depicted in the video; they're cheap actors playing roles. But the video represents the extent to which some YouTubers will go to dance around nudity and sexual content guidelines in order to grab eyes. As pointed out by Polygon, popular Fortnite Twitch streamers are becoming aware of the trend, with TSM_Myth calling out RiceGum's open abuse of the Fortnite/nudity-tease killer combo.

This isn't a new phenomenon, either. In November of 2017, James Bridle collected his findings on a then widespread trend of videos aimed at children, and often starring them, in which the children or popular superhero, cartoon, or Disney characters were placed in disturbing situations. Many of these channels had millions of subscribers, and some had billions of views, clearly reaching their target audience by disguising genuine horror beneath cute thumbnails any tech-savvy preschooler would click on. YouTube has since acknowledged and addressed the issue by deleting around 50 channels right away and many more since (though they're unfairly forgiving to YouTube icons they can't afford to lose completely). And yet hundreds of videos still game the algorithm, dropping every popular superhero and Disney character in the title to grab the attention of a young, innocent audience.

Good on TSM_Myth for knowing his audience, though I worry YouTube's slow enforcement of their existing policy against a gathering storm of illicit content designed to exploit young Fortnite enthusiasts won't deter anyone looking for some easy views, unless YouTube comes out in force, that is.