Data gathered on the web has vastly enhanced the capabilities of marketers. With people regularly sharing personal details online and internet cookies tracking every click, companies can now gain unprecedented insight into individual consumers and target them with tailored ads. But when this practice feels invasive, it can prompt a strong backlash.
Marketers today need to understand where to draw the line. The good news is that psychologists already know a lot about what triggers privacy concerns off-line. If marketers avoid those tactics, use data judiciously, focus on increasing trust and transparency, and offer people control over their personal data, their ads are much more likely to be accepted by consumers and to raise interest in engaging with a company and its products.
But there is also evidence that this practice can lead to a consumer backlash. Marketers need to understand when personalized ads will be met with acceptance and when with annoyance. Consumers dislike two techniques in particular: using information obtained on a third-party website rather than on the site where the ad appears, and using inferred information about the consumer (for instance, about a pregnancy). With users regularly sharing personal data online and web cookies tracking every click, marketers have been able to gain unprecedented insight into consumers and serve up solutions tailored to their individual needs.
The results have been impressive. But the research supporting ad personalization has tended to study consumers who were largely unaware that their data dictated which ads they saw. Public outcry over company data breaches and the use of targeting to spread fake news and inflame political partisanship have, understandably, put consumers on guard.
This injects a whole new dynamic into the mix: How will targeted ads fare in the face of increased consumer awareness? On one hand, awareness could increase ad performance if it makes customers feel that the products they see are personally relevant. Supporters of cookies and other surveillance tools say that more-relevant advertising leads to a more valuable, enjoyable internet experience.
On the other hand, awareness could decrease ad performance if it activates concerns about privacy and provokes consumer opposition. The latter outcome seems more likely if marketers continue with a business-as-usual approach. One study revealed that when a law requiring websites to inform visitors of covert tracking started to be enforced in the Netherlands, ad click-through rates dropped.
Controlled experiments have found similar results. Some firms have done better than others at anticipating how customers will react to personalization.
The retailer sent coupons for maternity-related products to women it inferred were pregnant. They included a teenager whose father was incensed, and then abashed to discover that his daughter was, in fact, expecting. When the New York Times reported the incident, many consumers were outraged, and the chain had a PR problem on its hands. Similarly, Urban Outfitters walked back the gender-based personalization of its home page after customers complained. Privacy norms are not always intuitive: for example, we often share intimate details with total strangers while keeping secrets from loved ones.
Nevertheless, social scientists have identified several factors that predict whether people will be comfortable with the use of their personal information. One of these factors is fairly straightforward: the nature of the information. Common sense holds that the more intimate it is (data on sex, health, and finances is especially sensitive), the less comfortable people are with others knowing it.
It can also be taboo to openly infer information about someone, even if those inferences are accurate. In our recent studies we learned that these norms about information also apply in the digital space. We asked consumers to rate how acceptable they found each of several common data-collection methods to be. A statistical technique called factor analysis identified clusters of practices that consumers tended to dislike, and those clusters mirrored the practices that make people uncomfortable off-line.
Next, we wanted to see what effect adherence to, or violation of, privacy norms would have on ad performance, so we divided the participants in our study into three groups. We found that when people dislike the way their information is shared, purchase interest drops. We then conducted a similar test using declared (acceptable) versus inferred (unacceptable) information.
In sum, these experiments offer evidence that when consumers realize their personal information is flowing in ways they dislike, purchase interest declines. Three factors can increase the upside of targeted ads for both marketers and consumers.
Taking them into account will help marketers provide personalized ads that inform consumers of products they want and need, but in a way that feels acceptable. A common practice advertisers currently use to preempt a targeting backlash is to offer voluntary ad transparency.
In some cases, consumers can click on an icon to find out why an ad has been displayed to them. Such disclosure can be beneficial when targeting is performed in an acceptable manner, especially if the platform delivering the ad is otherwise trusted by its customers. In one experiment conducted with Facebook users, we first asked participants how much they trusted the social media company. Next, we directed them to find the first advertisement in their Facebook news feed and read its accompanying transparency message.
We asked them to indicate whether the message conveyed that the ad had been generated using first- or third-party information and using declared or inferred information.
Then we inquired about how interested they were in purchasing the advertised product and engaging with the advertiser in general by, say, visiting its website or liking its Facebook page. Overall, ads resulting from unacceptable information flows performed worse than those resulting from acceptable flows. We also found that when trust was high, disclosing acceptable flows actually boosted click-through rates. In a set of field experiments, we partnered with Maritz Motivation Solutions, which runs redemption websites for loyalty programs such as airline frequent-flier programs, a context in which consumer trust tends to be high.
These sites use the same technology as the large e-commerce sites, except that the currency is points instead of money. Central to many privacy concerns is the loss of control. Consumers may not object to information being used in a particular context, but they worry about their inability to dictate who else might get access to it and how it will be used down the line.
In one natural experiment, a nonprofit ran targeted ads on Facebook. Midway through the experiment, Facebook instated new privacy features that gave users more control over their personal information without changing the attributes that advertisers could use to target people.
The social media platform allowed people to keep their connections private and to manage their privacy settings more easily. Before this policy change, the personalized ads did not perform particularly well; if anything, users were slightly less likely to click on them than on generic ads. After the change, however, the personalized ads were almost twice as effective as the generic ones.
In another experiment we showed participants a targeted advertisement, systematically varying the disclosures appearing alongside it.
About half of Facebook users say they are not comfortable when they see how the platform categorizes them, and 27% maintain that the site's classifications do not accurately represent them.
With one group of participants, the ad was accompanied by a message saying that unacceptable third-party information had been used to generate it. A second group saw the same transparency message, plus a prompt reminding them that they could set their ad preferences. A third group simply saw the ad.
Purchase interest was lower in the first group than in the last group. However, in the second group (consumers who were reminded that they could dictate their ad preferences), purchase interest was just as high as in the group that had seen no message. In other words, reminding consumers that they can meaningfully control their privacy settings buffered any backlash to unacceptable data collection. However, there was also a fourth group in this experiment, whose reactions unfortunately highlight the potential for consumers to be misled.
This time, however, participants were merely reminded that they could choose their profile picture. Purchase interest in this group, too, was just as high as in the group that had seen no message. Meanwhile, data brokers aggregate all kinds of personal information from platforms like Facebook as well as from internet shopping sites, store loyalty programs, and even credit card companies.
Revealing why personal data has been used to generate an ad can help consumers realize the upside of targeting. A commitment to provide justification can also foster appropriate use of data. It might be tempting, though, to manipulate consumers by giving them meaningless opportunities to feel in control, creating a false sense of empowerment.
While such tactics may work in the short term, we believe they are ultimately misguided. Even setting aside the potential ethical issues, deceit erodes trust if it is discovered. And as our experiments show, trust enhances the positive effects of using personal information in ways consumers deem acceptable.
Research into other areas also suggests that trust has spillover benefits. An off-line analogue may be useful here as a guide: You might gain a temporary advantage by deceiving a friend, but the damage, if the deception is discovered, is deep and lasting.
Relationships are stronger if they are honest. So what suggestions would we make to digital marketers looking to maximize the potential of ad targeting?
The privacy paradox
We offer five. First, stay away from sensitive information; in particular, try to avoid using anything about health conditions, sexual orientation, and the like.
This presents challenges to companies that sell sensitive goods, which may want to avoid targeting altogether. There is a wide spectrum between concealment and full disclosure, with many acceptable points in between. As a general rule of thumb, we suggest that marketers at least be willing to provide information about their data-use practices upon request. Such disclosures should be clear and easily accessible.