Sales psychology has a problem

Psychology naturally wants to publish new findings. In fact, social science journals have rejected 95 percent of the replication studies submitted to them. Could this have something to do with the fact that we can't really trust psychology in the sales world?

In today's podcast, Jörg Dennis Krüger discusses some important points about psychology, and has a few interesting examples.

TRANSCRIPTION OF THIS EPISODE OF THE PODCAST

Hello, my name is Jörg Dennis Krüger, and as my head psychologist at reception has just said:

Yes, I am the conversion hacker. 

And in this episode, let's talk briefly about sales psychology, or psychology in general. Because psychology has a problem: it is not an exact system. If I drop something here, I can say with certainty that it will fall to the ground. That works in physics, but it doesn't work in psychology. I can't assume that just because I mark something red, the user will perceive it accordingly.

Because it is much more complex than that. Much more complex than many sales psychologists would have us believe. Many of these sales psychology tricks don't work at all in many stores, or they only work very selectively: only in certain situations, only for very specific target groups, and so on, because it is simply much more complex.

The fact is that some psychological triggers and approaches work and some don't. Whether a trigger has its intended effect depends very much on how the person in question is thinking: what situation they are in, what emotional state they are in, what their general starting position is.

What kind of person are they? What background knowledge do they have? And in psychology especially, there is an extremely large number of tiny triggers that can completely change behavior but are often not even known. This becomes particularly clear when you look at the replication crisis in psychology.

The replication crisis (which exists in psychology, but also in other disciplines) is essentially a methodological crisis. It became known because studies that people relied on, whose results were trusted, simply could not be replicated.

Because a study is only good if it delivers the same result every time I repeat it. If it delivers a different result every time, it's worthless. For example, if I drop my cell phone and sometimes it falls to the floor, sometimes it flies upwards, sometimes to the left, sometimes to the right, then that "study" is worth nothing; it is not reliable. In psychology, this methodological crisis has been discussed increasingly since 2011, and it has cast an enormous amount of doubt on the reliability of published psychological studies.

The whole thing attracted a lot of attention because the studies by social psychologist Daryl Bem could not be replicated in three separate replication attempts. These critical reports were initially rejected by major journals like Science and a few others because they didn't want to believe it! "We published this study once, and it's valid. The fact that it can't be replicated now is probably down to the replicators' approach; they did a bad job." And then the whole issue suddenly became bigger.

After all, the reproducibility of research results by other researchers is simply a fundamental requirement of scientific research. And if that doesn't work, if you can't reproduce it, then you have a problem.

And in psychology in particular, positive results spread easily. People are happy that something worked, so there are lots of lectures by psychologists, tips from psychologists, and specialist articles on sales psychology, all based on positive results from certain studies. These results also shape the content of most specialist journals, which is no wonder, because people naturally like to report on new, exciting studies. But attempts to reproduce them often remain unpublished.

We have this problem in the natural sciences too, but it is apparently strongest in psychology. So what are the causes of this lack of scientific control? In the natural sciences, in physics, chemistry and the like, replication happens almost automatically: a study is published and immediately many scientists around the world start checking whether it works, because they want to use these findings themselves. So a false result is exposed very quickly.

In psychology, the incentive is to publish new findings, which is particularly attractive to young scientists. And indeed, journals in the social sciences have rejected 95 percent of the replication studies submitted to them. As many as 54 percent of reviewers at such journals say that they prefer new studies to replication studies.

And now, of course, you can ask yourself whether these journals are also somewhat afraid that much of what they publish is not reproducible, which would call the entire journal into question. Why else would they reject so many replication manuscripts? You can see that we are suddenly inside such a huge complex that people may not even want to try to verify claims made in psychology.

People just believe it, because our entire "science" could be affected by it. And it goes a little further. A lot of research has been done on this in the last twelve years, and a lot of thought has gone into it. For example, the well-known social psychologist Diederik Stapel wrote at least 30 publications based on completely invented data. This was not discovered by replication, but on the basis of information from his own working group.

There have also been allegations against two other social psychologists, Dirk Smeesters and Jens Förster, who are said to have worked with data that was not quite proper. Retractions of study results used to be almost unheard of in psychology and in the social sciences generally, but they have suddenly increased significantly, with fraud as the main reason. And yes, it is of course a difficult situation when an entire branch of research is more or less called into question and there really are so many shortcomings in its studies.

On top of that, even studies that are otherwise well done are often carried out with samples that are far too small. We are often talking about studies with only 20 to 30 people, which has practically no statistical significance at all. Results based on such small samples can quickly be reversed if a single outlier is included, or excluded before the calculation. Then you have even less data, or the whole thing goes in the wrong direction, and you can more or less produce whatever result you want.
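To see how fragile such tiny samples are, here is a minimal sketch with entirely made-up numbers: a hypothetical 25-person "study" in which a single extreme value in group A decides which group wins.

```python
from statistics import mean

# Made-up numbers for illustration only: 25 participants per group,
# with one extreme outlier (20.0) hiding in group A.
group_a = [5.0, 4.8, 5.1, 4.9, 5.2] * 4 + [4.7, 5.0, 4.9, 5.1, 20.0]
group_b = [5.4, 5.5, 5.3, 5.6, 5.5] * 5

print(mean(group_a) > mean(group_b))   # True: group A looks better

# Exclude the single outlier and the conclusion flips.
trimmed_a = [x for x in group_a if x < 10]
print(mean(trimmed_a) > mean(group_b))  # False: now group A looks worse
```

One data point out of fifty reverses the headline finding, which is exactly why a 20-to-30-person study tells you very little on its own.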

So this is an extreme problem, especially in psychology. 18 percent of studies also have statistical flaws, and in 15 percent there are errors that conveniently favor the hypothesis (so that you got exactly the result you wanted). 15 percent! That's huge. That's insane. The A/B tests we run have far more data behind them and a much better statistical basis, and yet people often complain that even that data is not really meaningful because there aren't enough conversions.

But many of these scientific studies, especially in psychology, are unfortunately very difficult to rely on.

And now I come to the finding that in many cases even tiny things influence reproducibility. There is this very famous experiment claiming that if people hold a pencil in their mouth and thereby automatically smile a little, their mood improves.

This is a very old, well-known experiment, which has of course been blown out of all proportion. It goes back to a 1988 study by Strack, Martin and Stepper, in which they put forward a facial feedback hypothesis, said to go back to Charles Darwin, stating that a change in facial expression is associated with changes in subjectively perceived emotions.

In other words, there were two groups: one group was given a pencil between their teeth, and the other group was asked to hold the pencil with their lips. With a real smile, you use two muscles, the zygomaticus major for the mouth movement and the orbicularis oculi for the eye and cheek movement. This is of course different from a simulated, fake smile, in which the eyes in particular often don't move. We know this from many celebrities who smile only with their mouths, which always looks a bit strange.

With the pencil between their lips or teeth, these test groups then had to watch a cartoon and rate it. The result: those with the pencil between their teeth, the forced smile, rated the cartoon as significantly funnier than the other participants.

Word got around that this study showed the pencil trick helped a little, and of course the facial feedback idea was taken to a wide audience. The only problem: it could never really be replicated.

Researchers tried again and again to reproduce it, but it never really worked. So was the result actually significant? Obviously not, because if you can't reproduce it, we can save ourselves the trouble; it's no good. But then one of the original professors took up this criticism and tried to reproduce the whole experiment himself. And it was found that the smallest changes to the experiment change the data.

For example, in the replication experiments the participants were always filmed with a camera, so they were being monitored. This was not the case in the original experiment, where people were not monitored. And in his own replication experiments, he claims to have discovered something important: being observed by the camera changes the result so much that people no longer take the smile as positive feedback. They didn't respond to the pencil because they felt they were being watched.

So we can't blindly trust all this psychology stuff. And we know this from psychologists and therapists too: you can't just push buttons in people.

You can't just say, "He's depressed, I'll tell him this and that, and then he'll be fine," the way it perhaps works in medicine with medication. But even there you keep finding out that certain things don't work.

Tip: read up on the effectiveness of paracetamol. It is very interesting to note that for a relatively large number of people, paracetamol has no effect beyond the placebo effect. Very exciting! 

Yes, and so when it comes to online store design, we have to completely abandon all this psychology stuff that claims, "Yes, this applies to everyone in general." It all sounds great and convincing, but then it just doesn't work. Instead, we first have to set up the store well on the basis of best practices.

You can also take certain best practices from psychology, but they are really more about perception and contrast: presenting elements in such a way that they are perceived and recognized correctly. And then you have to go into A/B testing and find out what really works reliably for the visitors who come to your site, for which traffic channels, and so on. Psychology, conversion psychology, sales psychology: a very difficult topic that only works to a very limited extent.
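For those who want to put numbers behind their A/B tests, here is a minimal sketch of a standard pooled two-proportion z-test, the kind of check that gives test results the statistical basis mentioned above. The function name and the conversion figures are invented for illustration; they are not from any particular tool.

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Pooled two-proportion z-test comparing two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical shop numbers: variant A converts 200 of 10,000 visitors,
# variant B converts 260 of 10,000.
z, p = two_proportion_z(200, 10_000, 260, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With 10,000 visitors per variant the difference comes out clearly significant (p well below 0.05), which is a far stronger basis than a 20-to-30-person lab study.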

I would really like to call for comments now. 

If you go to jkd.de/podcast, you can find all our podcast episodes directly from our site, and you can leave comments there. 

So please write in the comments: what do you think about sales psychology? What good experiences have you had? What bad experiences? I would like to revisit this topic and show a few examples so that we can delve deeper into the subject. On the one hand, to collect which "best practices" with a psychological background we can recommend and which we should leave out; and on the other hand, to figure out what has no real influence and just sounds great.

So jkd.de/podcast. And as always, leave five stars on the podcast service of your choice and simply follow Jörg Dennis Krüger on Instagram and Facebook. I'm looking forward to it. 

So all the best and see you next time, Dennis Krüger.
