The Washington Post recently discovered that many of the social networking apps on Apple's App Store contain reviews complaining about unwanted sexual experiences; what is shocking is that many of these complaints involve children. This calls into question Apple's reputation as a safe and secure platform.
The Post dug through more than 130,000 reviews of six apps on the App Store, including Chat for Strangers, ChatLive, Monkey, Yubo and Holla, using a machine learning algorithm. These apps rank among the top 100 social media apps on Apple's App Store. The analysis found that at least 2 per cent of the reviews for the app called Monkey described unwanted sexual advances, some of them directed at children.
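The Post has not published the details of its classifier, so as a rough illustration of how one might triage 130,000 reviews automatically before human inspection, here is a minimal keyword-based sketch. The term list, function names, and sample reviews are all invented for illustration; this is far simpler than a real machine-learning model.

```python
# Hypothetical sketch: a keyword filter that flags reviews mentioning
# unwanted advances, so a small flagged subset can be read by humans.
# The flag terms below are illustrative, not the Post's actual criteria.

FLAG_TERMS = {"predator", "predators", "inappropriate", "explicit"}

def flag_review(text: str) -> bool:
    """Return True if the review contains any flag term (case-insensitive)."""
    words = text.lower().split()
    return any(term in words for term in FLAG_TERMS)

def flagged_share(reviews: list[str]) -> float:
    """Fraction of reviews that trip the filter."""
    if not reviews:
        return 0.0
    return sum(flag_review(r) for r in reviews) / len(reviews)

reviews = [
    "Great app, met lots of friends",
    "Predators all over the site, avoid",
    "Fun video chat",
    "Someone sent explicit content unprompted",
]
print(flagged_share(reviews))  # 2 of 4 sample reviews flagged -> 0.5
```

A real pipeline would replace the keyword set with a trained text classifier, but the overall shape — score every review, surface the small flagged fraction — is the same.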
Monkey is a social media app that lets you video chat with strangers. One review, published in September, read: "A man who is sick in the head and disgusting decided to show some things that shouldn't have been shown." A more recent review, posted last month, stated: "This is a lawsuit waiting to happen. Predators all over the site."
All of these random-chat apps show similar reviews; in fact, at least one-fifth of the reviews for ChatLive complained about such unsolicited behaviour by people with insidious intentions. Apple's website claims that it carefully reviews every app, and the company has always taken pride in providing security. But these reviews do make you wonder whether Apple can maintain that layer of protection as the platform grows. It is well known that Apple takes a cut from the apps that generate revenue.
According to Apple, it reviews 100,000 apps every week with the help of software as well as human reviewers. Apple spokesperson Fred Sainz said, "We created the App Store to be a safe and trusted place for our customers to get apps and take all reports of inappropriate or illegal conduct extremely seriously." He added, "If the purpose of these apps is not inappropriate, we want to give the developers a chance to ensure they are properly complying with the rules, but we don't hesitate to remove them from the App Store if they don't."
But the main problem here is that some of these apps have existed in the App Store for years, and, according to a former employee, Apple does not bother to check all the reviews posted by users. The problem may be bigger than it appears: only a handful of users ever write a review, and after interviews with parents and teenagers, experts say there appears to be a broader problem across the platform.
These random-chat apps are designed to connect you with random people, unlike other social media apps that connect you with people you already know. The idea behind these apps is to let strangers meet in the hope of forming a romantic connection, but that does not stop younger users from accessing them. The apps are very easy to use: a single tap matches you with a random person and connects you through a video call. Lonely teenagers are especially vulnerable, turning to these apps to combat their loneliness.
Apple's website promises to remove content that crosses the line or threatens the safety of children. The company states that it is strictly against pornographic content and that it takes full responsibility for any content that slips through. Yet despite the fact that these apps carry an age restriction of 17 and older, nothing actually stops underage users from downloading them; a 9-year-old was able to download adult apps without any restrictions.
Apple does have control over who can and cannot download an app, yet it finds it difficult to apply that control across the App Store. The complaints are not limited to sexual advances, either; there are also reviews about racism and bullying. Apple says its approach to moderation has improved and that it rejects 40 per cent of the app submissions it receives.