This week alone, Facebook revealed two more instances of mishandling user data without permission. One was the sharing of data with four Chinese companies over a period of years, in a case that may violate a 2011 FTC consent agreement. The other left 14 million users' private posts public for four days in May without their consent. The worst is yet to come.
The Frightening Reality of What Facebook is Doing to Us
As news surfaced in March about Cambridge Analytica’s abuse of private data provided to it by Facebook, a much more serious issue was brewing in the background regarding the social media giant. The problem is not how third parties use Facebook’s data but how Facebook plans to use it for its own purposes, powered by artificial intelligence.
During the Cambridge Analytica scandal, Facebook’s users learned with considerable shock precisely how much data Facebook is gathering about them. Under CEO Mark Zuckerberg’s direction, almost every move an individual makes on a smartphone or computer with Facebook installed and operating is being captured and logged.
It has long been understood that just using Facebook involves giving up some of your privacy. The app openly asks for access to all of your contact information to allow it to operate – something it says it is doing to help you connect to potential friends. It also asks for access to your browsing history, phone call information and more. It is so “boilerplate” in its requests that most of us do not blink when we click “okay” because many other apps ask for similar things.
As the details of the Cambridge Analytica mess became better known, so, too, did the details of what information Facebook is storing about its users. This includes every message sent with Facebook Messenger, details of every phone call made and every text message – for at least a year – and, of course, browsing history. All that data is also tracked with respect to time, sequence, location and duration where appropriate. Instagram and the stand-alone version of Facebook Messenger are tracked in a similar manner.
U.K.-based Cambridge Analytica's first step was an app called This Is Your Digital Life, developed by researcher Aleksandr Kogan. Some 270,000 people were paid to use the app. Then, using Facebook data connected with the app (with Facebook’s explicit permission and knowledge), Cambridge Analytica pulled in data on those 270,000 users and on their Facebook friends. The total number of users whose data Cambridge Analytica harvested is estimated at 87 million.
Cambridge Analytica then used that data to create targeted political advertising during the 2016 U.S. political campaign, apparently including the U.S. presidential campaign.
For its part, Facebook called Cambridge Analytica’s misuse of the data a “breach of trust” by the original contact to which it had provided data access for Kogan’s app. Zuckerberg also told European Union (EU) lawmakers that Facebook would refuse to compensate users for the misuse of that data. That answer came on May 24, in written responses to questions Zuckerberg did not have time to address while appearing before the lawmakers earlier that week. His justification, apparently, was that no bank account or credit card details had been shared as part of the breach and that no EU user data had been compromised in any way. It is a cold, unfeeling answer that only an accountant could love. It also ignores that Facebook ethically, and probably legally, bears liability for giving the information away in the first place without ensuring that it would be safeguarded.
This only covers the issue of customer information leaks and the misuse of data by outsiders. What no one was talking about in any of these cases is what Facebook itself is going to do with the access to its own data.
With 2.19 billion active Facebook users worldwide as of the first quarter of 2018, there is an enormous treasure chest of data available for the company to run through. Since an estimated 98% of its revenue comes from advertising, we already know that Facebook will use that data to generate the highest possible revenue by creating advertising precisely targeting those users.
How Facebook is going to use that data is where things get a little frightening.
It is an established practice that Facebook already uses the data in one simple way: by sending you advertisements for the things you are already looking at. Keep in mind that with access to your browsing data, Facebook already knows that you are checking out certain sites. Those of us with Facebook accounts have probably been shocked more than once by seeing an advertisement directly sent to us after we had already been looking at it earlier in a separate browser window. As an example, if you happen to have opened a website for a luxury vacation resort in one window, do not be surprised if you soon see an advertisement for that same resort appearing in your Facebook feed.
A second and somewhat similar approach to how Facebook will use this data is by providing advertising alternatives to what you were looking at or browsing for. A classic example would be if you had been looking at one brand of soft drink a lot; Facebook might provide an ad for a different soft drink in your feed.
A third approach is old technology but is still a little disturbing when one realizes it is happening: stitching together a number of steps for simple predictive behavior modeling. Because Facebook has access to so much information about who you are, where you live, demographics that can predict your buying capability and habits, and where you have been browsing, the company can make guesses and target advertising to what you might be looking for. Consider an example of that practice, which others have done but which no one will acknowledge is happening now at Facebook. Suppose Facebook sees you search for the Blue Book price of a certain make, model and year of car; then sees you looking at multiple new-car sites; and then sees you checking out financing options on still another site. It is an easy logical jump to conclude that you may be considering buying a new car and will need financing. Facebook could then serve you a targeted ad for a specific new car model based on everything else it knows about you. It knows whether you are single or married, because many users give that information away. It knows the locations you visit from geotagging. It often knows your age and can estimate your income based on your region and other factors it has already harvested. It can then make one giant leap forward and guess, with very high accuracy, what kinds of vehicles you might want.
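The chain of inferences described above can be sketched as a toy scoring model. Everything here – the signal names, weights and threshold – is a hypothetical illustration of the general technique, not Facebook's actual (proprietary) system:

```python
# Toy illustration of intent inference from browsing signals.
# All signals, weights, and thresholds are invented for this sketch.

SIGNAL_WEIGHTS = {
    "viewed_blue_book_pricing": 0.35,   # checked resale value of a car
    "visited_new_car_sites": 0.30,      # browsed multiple dealer sites
    "checked_financing_options": 0.25,  # looked at auto-loan pages
    "geotagged_at_dealership": 0.10,    # location history near a dealer
}

def car_purchase_intent(observed_signals):
    """Sum the weights of the recognized signals observed for one user."""
    return sum(SIGNAL_WEIGHTS[s] for s in observed_signals if s in SIGNAL_WEIGHTS)

def choose_ad(observed_signals, threshold=0.6):
    """Serve the targeted car ad only when inferred intent is high enough."""
    if car_purchase_intent(observed_signals) >= threshold:
        return "new_car_with_financing_offer"
    return "generic_ad"

user_signals = ["viewed_blue_book_pricing",
                "visited_new_car_sites",
                "checked_financing_options"]
print(choose_ad(user_signals))  # combined score 0.90 clears the threshold
```

The point of the sketch is that no single signal gives the game away; it is the combination of several weak signals that produces a confident prediction.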
This is just the beginning. There is a very much related sort of technology that Facebook has developed – something the company calls loyalty prediction. This involves using all that information about you – including all those demographics and search information discussed earlier – to estimate when you are likely to stop being loyal to a particular product brand. The other way of saying the same thing is, of course, to predict when you are likely the most vulnerable to advertisements that might nudge you to buy something different.
Facebook and other companies apparently attempted to track information at one time about the things each of us purchased online. Then they shared that information with your Facebook friends, saying that this “friend” of yours had purchased such an item and maybe you might want to try it yourself. That approach backfired as being too invasive, even in an era when we all regularly give away our privacy in return for access to tools and technologies such as what Facebook offers.
The loyalty prediction model could make use of the same data, since Facebook has access not only to your own data but also to that of your friends. The company likely “knows” – with a high degree of accuracy – what your closest friends are purchasing. It can use that information to guess what you personally are most susceptible to purchasing, based on your own data as well as that of your influencer group.
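A minimal sketch of how such a loyalty score might work, combining one's own behavior with friends' purchases. The function name, inputs and formula are all hypothetical illustrations of the idea, not a known Facebook model:

```python
# Hypothetical sketch of "loyalty prediction": estimate how likely a user
# is to switch brands, from their own behavior plus their friends' buying.

def switch_probability(own_brand_purchases, rival_brand_views, friend_rival_purchases):
    """Crude vulnerability score in [0, 1]: more rival-brand exposure
    (your own browsing plus your friends' purchases) relative to your own
    loyal purchases means a higher estimated chance of switching."""
    rival_exposure = rival_brand_views + 2 * friend_rival_purchases  # friends weigh double
    total = own_brand_purchases + rival_exposure
    return rival_exposure / total if total else 0.0

# A loyal user: many own-brand purchases, little rival exposure.
print(switch_probability(own_brand_purchases=20, rival_brand_views=1, friend_rival_purchases=0))

# A vulnerable user: friends have switched, and the user is browsing rivals.
print(switch_probability(own_brand_purchases=3, rival_brand_views=4, friend_rival_purchases=5))
```

An advertiser would then be sold access to users whose score crosses some threshold – exactly the moment the article describes, when a nudge toward a different brand is most likely to land.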
Loyalty prediction is also part of an even more important advertising-related technology at Facebook, one called FBLearner Flow. It starts with three types of information Facebook has on each user. The first is what users reveal about themselves simply through their actions on the site and in connection with other actions they take (such as allowing Facebook to track their browsing history). The second is Facebook's classification of each user according to an estimated 52,000+ unique attributes it assigns as “categories of interest” – for example, information suggesting where one shops regularly or what household income one has. The third is data purchased from third-party data brokers, presumably covering individuals who are users on the site.
According to a study by ProPublica, which was also the source of the information about the 52,000 unique attributes Facebook tracks on its users, Facebook provides potential ad buyers with a list of approximately 29,000 categories of information. Most of those come from the attributes list noted earlier. An estimated 600 of those categories were purchased from third parties who have data on us that Facebook can merge with its own information. That information can then be used for advertisers to decide which of those categories of potential purchasers they want to reach with their ads.
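In code terms, the merge is conceptually simple. The category names below are invented for illustration; only the overall shape (Facebook's own attributes unioned with broker-purchased categories into one advertiser-facing list) reflects what the ProPublica study describes:

```python
# Illustrative only: merging first-party attributes with purchased
# third-party data into advertiser-facing targeting categories.
# Category names are invented; the real lists run to tens of thousands.

facebook_attributes = {
    "shops_at_warehouse_clubs",
    "household_income_100k_plus",
    "frequent_international_traveler",
}
broker_purchased = {
    "recently_applied_for_auto_loan",   # bought from a data broker
    "household_income_100k_plus",       # may overlap with Facebook's own data
}

# The merged view an advertiser could target against.
targeting_categories = facebook_attributes | broker_purchased
print(sorted(targeting_categories))
```

The union operation is why the broker data is so valuable: a few hundred purchased categories can fill gaps (like offline financial behavior) that Facebook cannot observe on its own platform.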
This helps provide many of those somewhat creepy advertisements some of us see in our Facebook feed that seem way too “on target.” This is just the beginning.
With FBLearner Flow, Facebook uses a form of artificial intelligence (AI) that feeds everything it knows about its users, both individually and collectively, into a machine-learning simulation program. That program uses the data to predict a variety of possible consumer behaviors, based on what it has observed happening in the past and what it calculates can be influenced in the future. It then sorts those outcomes into collective groups of people who can logically be expected to respond in near-identical ways. Finally, Facebook markets those groups to advertisers as targets for specific advertising designed for those markets.
If that all sounds nearly impossible to imagine, remember again the sheer magnitude of the database of behaviors and people that Facebook has. It has 2.19 billion active users multiplied by the 52,000+ unique attributes it categorizes, plus outside data it buys on its users to help fill in important details. With such large numbers, it is relatively straightforward to come up with high-likelihood targets for almost any kind of advertisers.
FBLearner Flow does not work in a vacuum, of course. It is an AI-driven machine-learning program. It tracks the success of specific types of advertisements with specific groups, so it can retune where ads can be placed for the highest level of return for the advertisers. The program is also actively adjusting and retuning its data inputs, so the groups that are being targeted are always up-to-date.
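The retuning loop the article describes can be reduced to a simple, deterministic sketch: measure how each audience group responded, then shift the next period's spend toward the groups that converted best. The group names, numbers and proportional rule here are illustrative assumptions, not Facebook's actual allocation logic:

```python
# Toy feedback loop: reallocate ad budget in proportion to each
# audience group's observed click-through rate (CTR).

def reallocate_budget(results, total_budget):
    """Give each group a share of next period's budget proportional
    to its observed CTR, so better-performing groups get more spend."""
    rates = {g: r["clicks"] / r["impressions"] for g, r in results.items()}
    total_rate = sum(rates.values())
    return {g: total_budget * rate / total_rate for g, rate in rates.items()}

last_week = {
    "young_urban_renters": {"impressions": 10000, "clicks": 400},   # 4% CTR
    "suburban_homeowners": {"impressions": 10000, "clicks": 100},   # 1% CTR
}
budget = reallocate_budget(last_week, total_budget=1000.0)
print(budget)  # {'young_urban_renters': 800.0, 'suburban_homeowners': 200.0}
```

Run every cycle, a rule like this is what keeps the targeted groups “always up-to-date”: spend continuously migrates toward whoever is responding.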
Where this all goes beyond creepy and becomes almost unimaginable is when one also considers what in computer systems is referred to as the “feedback loop” for these advertisements. What if Facebook were to “promise” a certain level of return for an advertiser? That is certainly a reasonable conclusion for something it could do, if not now then definitely in the future. Then imagine that a given advertising campaign does not meet its targets. Facebook could then jump in and, perhaps with its own in-house digital advertising studios, create a customized campaign to nudge just the right number of potential buyers over the edge and bring things up to target. It could also check those campaigns in its simulations in advance and predict what could happen with staggering accuracy.
The combination of the surgically precise influencing tools Facebook has, plus the sheer volume of users and its dominant position in the advertising business, is only going to make the company more powerful in the future. The precision with which Facebook can do this also means it could be less costly to run any kind of campaign designed to influence human behavior. It is entirely logical to assume that soon we could see the following come to pass, using Facebook as the means to make it all happen:
- Plan precision-targeted campaigns (conducted totally within U.S. borders, so they are all legal) that are bought and paid for by major corporations and activist groups to get candidates who are “on the edge” into office.
- Convince groups who are critically needed to elect certain candidates not to vote at all or, alternatively, to turn out in large numbers where they never have before.
- Use data that is at best misleading, but that stops short of outright lying, to support these advertising campaigns.
- Target start-ups with disruptive innovations so that existing companies continue to dominate. That has never before been possible with such precision and such clear return on investment. Established money-makers can now make this kind of manipulation part of their regular business campaigns to push down potential competitors.
- Drive stock prices up through legal manipulation via precisely timed advertising.
- Drive local support for a potentially harmful business venture in a community, again by targeting local influencers with far more precision than ever before.
- More precisely lobby potential members of Congress.
- Manipulate the population at large to support positions they may not have considered in the past. Actively manipulate the language the population at large uses in talking about things so that it is easier to accept an idea that the majority might normally reject.
- Launch targeted campaigns to get the population “in line” with unpopular decisions that have already been made. If a president were to declare an unpopular war, for example, imagine the government investing a billion dollars in targeted ads and news manipulation on Facebook to turn opinion in its favor. Or imagine if the divisive relocation of the U.S. embassy in Israel from Tel Aviv to Jerusalem, widely panned in Europe, became the subject of a targeted U.S. influence campaign aimed at Facebook users in the EU.
Unfortunately, these are all likely scenarios that not only might happen in the future but just might already be happening now.
There are uglier things to consider that could happen just by labeling them differently than the obvious. These could include making hate groups more likable, encouraging discrimination without having to pass a single law or getting the public to accept near-criminal actions by large corporations as not just “okay” but also something to celebrate as being in “America’s best interests.”

Facebook’s advertisements, with their precision targeting and likely increasing ability to guarantee results using AI, could end up being one of the main ways the country gets in line with everything from consumer marketing to broad-based economic behavior and the support of government actions. They could, if not regulated correctly, rule the world.
There are those who might question how something as simple as advertisements could make such a difference. Consider, then, what has already been alleged regarding Donald Trump regularly watching the popular morning talk show Fox & Friends. Many media groups have noted that whatever is discussed on that show often ends up referenced in Trump’s Twitter feed, both during and immediately after the show. What is rarely noted but is also true is that corporate advertisers know this too. They are apparently now buying and running advertisements on that show in the Washington, D.C., market just for Donald Trump to see. It has even been alleged that some of those advertisements may have been what pushed Trump over the edge to support the steep steel and aluminum import tariffs imposed recently.
If targeting just one person with the right advertisements at the right time can make such a difference, imagine what it could mean for 2.19 billion people to be targeted all at the same time – on any topic.
This is just what we know Facebook is doing now. The questions we need to be asking are what is it doing that we don’t yet know about and what will it do in the future?
Most humans are easily influenced, and social engineering and mind control technology and techniques have evolved rapidly during the past century.
Have you ever gone into a big-box wholesale store and bought far more than you intended to, or felt that you should be careful not to accidentally steal something? If you have, then you were consciously feeling the effects of very basic mind-control technology using subliminal messaging. Most large stores legally use hidden technology to influence shoppers’ behavior; it works very well on most people, and most people have no idea that it is occurring.
Vast amounts of money are being invested in developing more effective ways to influence your behavior, and much of it is being invested in digital communications and social media.
Facebook exists to make more and more money, and its primary revenue stream is from providing access to your brain and everything that can be known about you in order to manipulate your behavior.
Facebook is going to do whatever makes it the most money without its key people going to jail. Most of its advertisers are the same way. They are amoral money-making systems that push ethical and legal boundaries to make increasingly more money, and they do that by persuading you to buy their products and stealing your money when they can get away with it. They also use their growing power to remove ethical and legal boundaries that might limit their profit. They actually write most of the new laws and the repeals of existing laws in the United States.
But money is not the only motivator for Facebook and its clients. It is also about power – using power to gain more power – and, as is shown in our next article, Facebook is willing to align itself with some very sinister forces in order to increase its power.
No one forces us to use Facebook, and we can all live without it. We could use the time spent on Facebook to have real face-to-face relationships with real people or get more involved in our communities and create better conditions for everyone.