9 Jun 2017
Five big ideas we’re exploring at Nudgestock 2017
UPDATE: Dispatches from the Canvas8 HQ

For advertisers and marketers – seasoned players in the game of inspiring emotional reactions – an understanding of behavioural economics can reveal powerful insights. At Nudgestock 2017, created by Ogilvy Change, some of the world’s biggest thinkers are gathering today to discuss how behavioural economics can lead marketers to new creative heights. As the event’s official insights partner, Canvas8 conducted a series of interviews with the speakers in anticipation of the main event. Here are five of the biggest ideas these thinkers are exploring.

Author
Lore Oxford is a cultural theorist and strategist. She’s also the author of the Substack column ‘Why tho?’, where she writes about internet culture and the adoption of Web3.

1. How can handwriting make marketing more human?

“As amazing as chat features are on websites, when someone has a problem or a complaint, they can be frustrating and you just want to speak to a human. And often, when a case has been resolved, there is no afterthought,” says Inkpact founder Charlotte Pearce. Considering that 35% of consumers in the UK, Germany and Benelux countries say they won’t use a brand again after just one poor customer experience, getting that apology right is crucial.

As brands strive to become more ‘human’ – whether that’s chatting to customers on WhatsApp or providing empathy training to staff – marketing communications are following suit. “Our mission is to help other companies think of their customers as a friend. If your friend wasn’t happy, would you leave it and not talk to them again? How would you send a communication?” asks Pearce. “When a customer expects an apology, they are looking for something that shows they care about me. If you were to annoy a friend or a partner, generally you would pick up the phone or go and see them. Companies can’t do that, but writing an apology over an email is just like breaking up over an email – no one wants to do that. It doesn’t feel that they care.”

Instead, Pearce argues that marketing should be made as human as possible: “We suggest that clients sign off from a person or a persona and a department. So Jenny in the marketing team or David in the finance team.” But what if a customer finds out it wasn’t really written by Jenny or David? “We get the question all of the time: what if someone found out that it wasn’t from the CEO?” says Pearce. “I actually think that a lot of people don’t mind. It’s the fact that someone sat down and wrote it and that the company wanted you to get a really personal message. The fact that a human has sat down and written the letter makes the customer value it more.”

Explore the full report here.

Charlotte Pearce is the founder of Inkpact and has been named in the 30 Under 30 list 2017.

A nicely-timed note can nudge a customer into action
Парки Татарстана, Creative Commons (2017)

2. Is it possible to program ethical tech?

Much of the discussion around treating robots badly comes down to the fact that we can’t help but treat technology like it’s human. “People say cats rub themselves against your leg because they think you’re a big cat, and that this is the standard greeting among felines,” says technology ethicist Dr. Blay Whitby. “Cats know you’re not a cat, but they only have cat behaviour in their repertoire. They can only do cat behaviour. It’s the same with humans; we only have human behaviour in our repertoire.” It’s why robots that are designed to be cute are more likely to win us over, and it’s why people respond as emotionally to seeing a robot being abused as they would to seeing a person receiving the same treatment.

Technology doesn’t need to be particularly sophisticated to elicit this response; Whitby refers to the ‘Eliza effect’. “The Eliza effect is the tendency of humans to attribute much more understanding and human-like features to a piece of AI than it actually has,” says Whitby. “Every time I see a chatbot, I see Eliza technology. There’s no real attempt to understand what you’re saying – it’s just an attempt to fool somebody by producing canned text.”

This attribution of human qualities leads to two things: an expectation of morality, and an instinct to treat the technology as you would treat a human. But when something behaves ‘too human’, that can also end badly. The term ‘uncanny valley’ is traditionally used to describe the unsettling feeling someone experiences when a robot looks too human, but studies suggest it applies to how a robot behaves, too. With the market for robots predicted to be worth $1.5 billion by 2019, it’s important for developers to understand how robots can be designed so that people feel comfortable around them. In early 2017, the Defense Advanced Research Projects Agency (DARPA) invested $6.5 million in research to understand how to design AI that’s more trustworthy.

Explore the full report here.

Dr. Blay Whitby is a philosopher and technology ethicist, specialising in computer science, artificial intelligence and robotics.

There’s never been a better time to be a technology ethicist
Informedmag, Creative Commons (2017)

3. How do we shape each other’s behaviours?

“Think about how much time you spend thinking about the things your friends would like, planning a date that they would find pleasing, or dating people and thinking about what their favourite things are and picking out gifts for them,” says psychologist Diana Fleischman. “There is a huge amount of cognitive effort that goes into remembering other people’s preferences and aversions.” In today’s unprecedentedly social environment, Fleischman argues, we’re quite unaware of how much effort we put into remembering these very personal details. But this attention pays off during emotional interactions, in which we reinforce our relationships by doing things others like – such as giving the right gift or making a special meal.

“We have things that we all tend to like – food, warmth, sex, things like that,” explains Fleischman. “There are also idiosyncratic preferences that individuals have that are very unique to them, [like] the colour blue or certain types of cars, food or interactions.” These are part of a lifetime of strategic behaviour we engage in to achieve what we want when dealing with others. “It happens a lot with people who want something from someone else. So you will probably remember your boss’ favourite foods or the things that drive them crazy more than you will remember other people’s [preferences and aversions]. And that is not just so that you don’t annoy them, but also so that you can potentially reinforce them.”

In fact, it happens most often between people with a shared future, who train each other to become better at cooperating and satisfying each other’s desires. “The mating relationships are interesting because in a sense there is a lot of shared fate,” says Fleischman. “One of the most important adaptive problems that we have is finding a mate and getting out of them what we want, whether they’re a man or a woman. This is an area where there is very strong pressure to shape one another.”

Explore the full report here.

Diana Fleischman is a Senior Lecturer in Psychology at the University of Portsmouth, and part of the Centre for Comparative and Evolutionary Psychology.

We’re constantly conditioning others, often subconsciously
Leo Hidalgo, Creative Commons (2016)

4. How do you solve a problem with nudges?

“Imagine a problem as a table with a bunch of legs, each one supporting the existence of that problem. If you knock those legs out one by one, eventually the table comes down,” says Stevyn Colgan, author of Why Did the Policeman Cross the Road?. “That kind of ethos can be applied to any problem.” Even subtle interventions can prove effective. For example, Happy City is a Canada-based start-up helping city planners create public infrastructure that encourages good behaviour and deters bad behaviour, while also fighting the anxiety epidemic impacting many city-dwellers. Meanwhile, Know My Neighbour is a scheme in Brighton and Hove that aims to make the city the first where everyone knows their neighbours, thereby combatting isolation.

It’s not always the solution that’s the hard part though – identifying the problem in the first place is also tricky. In a survey of senior executives at public and private companies across 17 countries, 85% agreed that their organisations were bad at problem diagnosis, with the majority saying this carries significant costs. It’s hardly a surprise then that the management consultancy industry is booming, reaching a valuation of £45 billion in the US in 2015, with the UK’s sector estimated to be worth nearly £6.8 billion.

As much as companies, brands and governments can nudge people towards better behaviour, ultimately the change has to come from the individual. But we’re a pessimistic bunch: research from Ipsos MORI in 2015 found that Britons consistently overestimate the bad behaviour of others. 69% think their fellow citizens eat more sugar than they should, though nutritional surveys suggest the figure is only 47%. People also think 65% of the population aren’t saving enough for retirement, while government studies place that proportion at 43%. “As soon as you involve people in anything, there is no simple solution,” says Colgan. However, pre-empting people’s actions through behavioural insights could help fix some of society’s ills. “Call me an idealist, but I believe we can change the world if we move away from reactive change to proactive change.”

Explore the full report here.

Stevyn Colgan was a member of the Metropolitan Police Problem Solving Unit and he is the author of 2016 book ‘Why Did the Policeman Cross the Road?’

How can behavioural insights help solve society’s problems?
Tim Gouw, Creative Commons (2016)

5. How do people make moral judgements?

“Morality is all about cooperation,” explains Dr. Oliver Scott Curry, Director of the Oxford Morals Project. “It’s a collection of different strategies for solving social problems.” As humans, we face a number of issues when it comes to cooperating, because people can work together in different ways. “From things like helping your family or your social group, to compromising or trading favours – like I’ll scratch your back if you scratch mine – there are a whole bunch of strategies we use to resolve these conflicts between people, either by forming hierarchies where everyone knows their place, negotiating fair compromises or respecting each other's property.” This range of tactics and rules is what we call morals.

“When it comes to working as a team, you have loyalty, solidarity, unity and conformity morals,” he adds, discussing how different types of moral emotions are expressed in various cooperation scenarios. “When it comes to trade, you have trust, honesty, reciprocity, guilt, gratitude and revenge, whereas when it comes to resolving conflict, you might have bravery or generosity morals.”

These moral dispositions are rooted in our nature as social animals. Indeed, research suggests we’re hardwired with knowledge of right and wrong, and we see the same set of moral values in nearly all cultures. But, as Dr. Curry explains, culture does have a part to play in shaping people’s morals. “Generally, people in different cultures value the same kind of behaviours, however, cultures vary by prioritising or emphasising one or other type of cooperation, which means different moral values are prioritised. In some places, your first duty is to your family, but in other places that is seen as nepotism and other more impartial norms, like fairness, take precedence.” People recognise these influences – a survey conducted by the Barna Group found that 57% of Americans say their morals are shaped by personal experiences and 65% say wider cultural influences have an impact.

Explore the full report here.

Dr. Oliver Scott Curry is a Senior Researcher and Director of the Oxford Morals Project at the Institute of Cognitive and Evolutionary Anthropology, University of Oxford.

nudgestockfestival.com