
Former Employees Confirm Disturbing Political Bias on Facebook

Conservatives have been complaining about Facebook being biased against them for years. According to a new report, those claims are more than just paranoia.

Michael Nunez at Gizmodo reports that according to a former journalist who worked on the project, “Facebook workers routinely suppressed news stories of interest to conservative readers from the social network’s influential ‘trending’ news section.” The report lists specific topics that were prevented from trending, including former IRS official Lois Lerner, who was accused of targeting conservative groups; Wisconsin Gov. Scott Walker; and Chris Kyle, the former Navy SEAL who was killed in 2013.

Furthermore, several former Facebook news curators told Gizmodo they were explicitly ordered to “inject” certain topics into the trending column even if they weren’t popular, suggesting that liberal topics were given preference based on the company’s ideological bias.

According to Gizmodo’s source, Facebook’s trending team — a small group of young writers, largely from Ivy League or private East Coast universities — prioritizes which stories trend on the website and wields an incredible amount of influence over what the company’s billion-plus users see.

“Depending on who was on shift, things would be blacklisted or trending,” the source said, adding, “I’d come on shift and I’d discover that CPAC or Mitt Romney or Glenn Beck or popular conservative topics wouldn’t be trending because either the curator didn’t recognize the news topic or it was like they had a bias against Ted Cruz.”

Gizmodo’s source, who is conservative, was so troubled by these topics being suppressed that he kept a journal of examples, which included conservative news aggregator the Drudge Report and former Fox News contributor Steven Crowder.

“I believe it had a chilling effect on conservative news,” the source said.

In addition, curators were told to insert stories as “trending” even when they weren’t organically trending, according to a former curator.

One example was “Black Lives Matter,” which the former curator says wasn’t getting much Facebook attention until it was injected into the trending topic by the curators.

“Facebook got a lot of pressure about not having a trending topic for ‘Black Lives Matter,’” the source said. “They realized it was a problem, and they boosted it in the ordering. They gave it preference over other topics. When we injected it, everyone started saying, ‘Yeah, now I’m seeing it as number one.’” (For more from the author of “Former Employees Confirm Disturbing Political Bias on Facebook” please click HERE)

Follow Joe Miller on Twitter HERE and Facebook HERE.

Facebook Employees Asked Mark Zuckerberg If They Should Try to Stop a Donald Trump Presidency

This week, Facebook CEO Mark Zuckerberg appeared to publicly denounce the political positions of Donald Trump’s presidential campaign during the keynote speech of the company’s annual F8 developer conference.

“I hear fearful voices calling for building walls and distancing people they label as ‘others,’” Zuckerberg said, never referring to Trump by name. “I hear them calling for blocking free expression, for slowing immigration, for reducing trade, and in some cases, even for cutting access to the internet.”

For a developer’s conference, the comments were unprecedented—a signal that the 31-year-old billionaire is quite willing to publicly mix politics and business. Zuckerberg has donated to campaigns in the past, but has been vague about which candidates he and his company’s political action committee support.

Inside Facebook, the political discussion has been more explicit. Last month, some Facebook employees used a company poll to ask Zuckerberg whether the company should try “to help prevent President Trump in 2017.”

Every week, Facebook employees vote in an internal poll on what they want to ask Zuckerberg in an upcoming Q&A session. A question from the March 4 poll was: “What responsibility does Facebook have to help prevent President Trump in 2017?” (Read more from “Facebook Employees Asked Mark Zuckerberg If They Should Try to Stop a Donald Trump Presidency” HERE)

Follow Joe Miller on Twitter HERE and Facebook HERE.

Facebook Bans Firearms Sales

Facebook says it’s cracking down on online gun sales, announcing Friday a new policy barring private individuals from advertising or selling firearms on the world’s largest social network.

The new policy also applies to Facebook’s photo-sharing service Instagram. It comes after years of complaints from gun control groups that Facebook and other online sites are frequently used by unlicensed sellers and by buyers not legally eligible to purchase firearms.

Facebook “was unfortunately and unwittingly serving as an online platform for dangerous people to get guns,” said Shannon Watts of Everytown for Gun Safety, a group that launched a public campaign to convince the social network to change its policies two years ago. (Read more from “Facebook Announces Stricter Policy on Firearms Sales” HERE)

Follow Joe Miller on Twitter HERE and Facebook HERE.

The FCC Says It Can’t Force Google and Facebook to Stop Tracking Their Users

The Federal Communications Commission said Friday that it will not seek to impose a requirement on Google, Facebook and other Internet companies that would make it harder for them to track consumers’ online activities.

The announcement is a blow to privacy advocates who had petitioned the agency for stronger Internet privacy rules. But it’s a win for many Silicon Valley companies whose business models rely on monetizing Internet users’ personal data.

It’s also the latest move in an ongoing battle to defend the agency’s new net neutrality rules, which opponents warned would result in the regulation of popular Web sites and online services. By rejecting the petition, the FCC likely hopes to defuse that argument. The rules, which took effect this summer, allow the FCC to regulate only providers of Internet access, not individual Web sites, said a senior agency official.

Consumer Watchdog, an activist group, petitioned the FCC in June to support a technology that would allow consumers to signal to Web sites that they did not want to be tracked. By clicking a button in their browser settings, users would have been able to send a “do not track” message to Web site operators when they surfed the Internet.

Some Web sites have committed to honoring those requests voluntarily, but many do not. If it had succeeded, the petition could have made Do Not Track a U.S. standard. (Read more from “The FCC Says It Can’t Force Google and Facebook to Stop Tracking Their Users” HERE)

Follow Joe Miller on Twitter HERE and Facebook HERE.

No Surprise: Facebook Is Stalking You

Facebook is following you around the Web. You knew that, right?

How else would Facebook know to serve that panda video straight into your news feed, and leave your college friend’s ill-informed rant about Pacific trade deals in the dark bowels of its servers? How else would it know to serve you with 7,000 ads for wedding dress vendors the very day you announce your engagement?

Facebook knows what you like. It knows what you don’t like. It probably knows whether you have been naughty or nice, and will be selling that data to Santa this Christmas season.

This bothers many people, especially since Facebook keeps expanding the list of things it knows about you, and the ways it is willing to use that data to make money.

The recent announcement that Facebook would soon target ads using your “likes” and “shares” has triggered some Olympic-level teeth-gnashing from the Electronic Frontier Foundation, because Facebook will get information from you not just when you actually “like” something, but when you load a page that has a “like” button on it. (Read more from “No Surprise: Facebook Is Stalking You” HERE)

Follow Joe Miller on Twitter HERE and Facebook HERE.

Here’s How Many ‘Dislikes’ a Post on Facebook Needs to Be Removed

Facebook released new information on its planned integration of a dislike button into user posts [last week], and it will likely be greeted with ambivalence. It appears that the dislike button will not just be for show, but will in fact be functional, affecting any post that reaches a certain threshold of dislikes received. The number being bandied around by Mark Zuckerberg and Facebook developers at the moment is ten: receiving ten dislikes on your post will result in Facebook’s algorithms removing the post as potentially disruptive and upsetting to fellow users.

“Facebook is about connections and bringing people together,” Facebook founder Mark Zuckerberg said regarding the upcoming changes, “and creating a space where people can interact in a safe and respectful manner. Occasionally our users are subjected to things that they find upsetting, are objectionable, and violate the community standards. Currently the mechanism we have in place to deal with this issue is the report option and function. It allows us to remove objectionable material, but requires us to maintain a large workforce to process post reports at great expense. It is also both cumbersome and slow. Many inappropriate and hurtful posts can remain up for extended periods of time before a moderator is able to examine the material of the post and remove it if necessary.”

“At the same time that we have this inefficient system of reporting, we have also had user requests to integrate a dislike button into user posts for several years,” continued Zuckerberg. “We realized that by combining the two functions, reporting and the existence of a dislike button, not only could we streamline the removal of objectionable material from Facebook, but we could also achieve massive cost savings as the workforce needed to monitor and process reports is eliminated entirely. It’s a win for our shareholders as we increase profitability, and it’s a win for users, who will be able to police the potentially psychologically harmful material they are exposed to in real time.” (Read more from “Here’s How Many ‘Dislikes’ a Post on Facebook Needs to Be Removed” HERE)

Follow Joe Miller on Twitter HERE and Facebook HERE.

If You Changed Your Facebook Pic to a Rainbow Flag, You May Have Fallen Into One BIG Trap

Recently, following the announcement of the legalization of same-sex marriage in America, people have been loudly voicing their opinions on the ruling. For supporters, one of the easiest yet most effective ways has been to change their profile picture to add a transparent rainbow overlay, using a tool set up by Facebook itself.

Within just a few hours, over 1 million users had changed their Facebook profile picture to include this rainbow image using the ‘Celebrate Pride’ tool. To date, more than 26 million users have used the tool.

While it seems like an innocent way to show your support for marriage for gay Americans, there may be something slightly more disingenuous going on. Facebook users using the ‘Celebrate Pride’ tool may be falling into a trap – a data trap, that is.

Back in 2014, it was discovered that Facebook had been conducting psychological experiments on its users without their knowledge. The social media giant studied how different users’ moods and statuses reflected what they saw in their own news feeds, even going so far as to serve specific content to users to determine how it would affect their mood.

Now, the rise of the ‘Celebrate Pride’ tool has some users theorizing that their information is being collected again. The thought is that the company is trying to find out how likely a user is to change their profile picture in response to a social issue and whether it is based on previous social declarations or even based on their other friends’ profile pictures. (Read more from “If You Changed Your Facebook Pic to a Rainbow Flag, You May Have Fallen Into One BIG Trap” HERE)

Follow Joe Miller on Twitter HERE and Facebook HERE.

Facebook Will Now Take Your Silent Lurking Into Account for News Feed Rankings

By now, most people know that their comments, likes, clicks, and shares on Facebook affect what they ultimately see on the social network. But Facebook revealed today that it has tweaked its news feed algorithm to also take into account the time people spend on posts, even if they don’t take any actions. That’s right, Facebook knows when you’re lurking . . .

But the modification isn’t as simple as counting the number of seconds people spend on posts. Those with slower internet connections, for instance, are likely to spend more time on stories due to the loading time. So Facebook will look at how much time a post spends on a user’s screen relative to other posts presented in the user’s news feed. Based on that, it will surface similar stories higher up in the feed. (A spokesperson for the company tells [this reporter] that Facebook is able to tell if a user is actively looking at the social network on a computer, or if the site is open in a tab or window in the background.) (Read more from “Facebook Will Now Take Your Silent Lurking Into Account for News Feed Rankings” HERE)

Follow Joe Miller on Twitter HERE and Facebook HERE.

No Surprise: Facebook DOES Collect the Text You Don’t Post

Photo Credit: Information Age

Facebook collects all content that is typed into its website, even if it is not posted, a tech consultant has discovered.

In December 2013, it was reported that Facebook plants code in browsers that returns metadata every time somebody types out a status update or comment but deletes it before posting.

At the time, Facebook maintained that it only received information indicating whether somebody had deleted an update or comment before posting it, and not exactly what the text said.

However, Príomh Ó hÚigínn, a tech consultant based in Ireland, has claimed this is not the case after inspecting Facebook’s network traffic through a developer tool and screencasting software.

‘I realised that any text I put into the status update box was sent to Facebook’s servers, even if I did not click the post button,’ he wrote on his blog yesterday. (Read more from “No Surprise: Facebook DOES Collect the Text You Don’t Post” HERE)

Follow Joe Miller on Twitter HERE and Facebook HERE.

Facebook Accused of Tracking All Users Even If They Delete Accounts, Ask Never to Be Followed

A new report claims that Facebook secretly installs tracking cookies on users’ computers, allowing it to follow users around the internet even after they’ve left the website, deleted their account and requested to no longer be followed.

Academic researchers said that the report showed that the company was breaking European law with its tracking policies. The law requires that users are told if their computers are receiving cookies except for specific circumstances.

Facebook’s tracking — which it does so that it can tailor advertising — involves putting cookies or small pieces of software on users’ computers, so that they can then be followed around the internet. Such technology is used by almost every website, but European law requires that users are told if they are being given cookies or being tracked. Companies don’t have to tell users if the cookies are required to connect to a service or if they are needed to give the user information that they have specifically requested.

But Facebook’s tracking policy allows it to track users if they have simply been to a page on the company’s domain, even if they weren’t logged in. That includes pages for brands or events, which users can see whether or not they have an account.

Facebook told The Independent that it disputes the report’s accusations. (Read more from “Facebook Accused of Tracking All Users Even If They Delete Accounts, Ask Never to Be Followed” HERE)

Follow Joe Miller on Twitter HERE and Facebook HERE.