Of course, psychology always wants to publish new findings. And in fact, social science journals simply rejected 95 percent of the replication studies submitted to them. Could this have something to do with the fact that we can't really trust psychology in the sales world?
In today's podcast, Jörg Dennis Krüger discusses some important points on the subject of psychology and gives a few interesting examples.
TRANSCRIPTION OF THIS EPISODE OF THE PODCAST
Hello, my name is Jörg Dennis Krüger, and as my chief psychologist at reception just said:
Yes, I am the Conversion Hacker.
And in this episode let's just talk briefly about the topic of sales psychology, or psychology in general. Because psychology has a problem: it is not a system with fixed rules. If I drop something here, I can say that it will fall to the floor. That works in physics, but it doesn't work in psychology. I can't assume that just because I mark something red, the user will perceive it accordingly.
Because it is far more complex than that, and far more complex than many sales psychologists would like us to believe. Many of these sales psychology tricks don't work at all in many shops. Or they only work in very select cases, only in certain situations, only for very specific target groups and so on, precisely because it is so complex.
Whether psychological triggers and psychological approaches work or not, and what they actually trigger, depends very much on how the person in front of them thinks: what situation they are currently in, what emotional state they are in, what their starting position is in general.
What kind of person is this? What background knowledge do they have? And there are, especially in psychology, an extremely large number of tiny triggers that can completely change behavior, but which we often don't even know about. This becomes particularly clear when you look at the replication crisis in psychology.
Because the replication crisis (it exists in psychology, but also in other disciplines) is, in a sense, a methodological crisis. It became known because studies whose results people had trusted simply could not be replicated.
Because a study is only good if it delivers the same result every time it is repeated. If it produces a different result every time, then it is worthless. If, for example, when I drop my cell phone it sometimes falls to the ground, sometimes flies up, sometimes goes left or right, then such a "study" is worthless; it is not reliable. And such a methodological crisis has been discussed more and more in psychology since 2011, and it raises an incredible number of doubts about the reliability of published psychological studies.
And the whole thing received a lot of attention because social psychologist Daryl Bem's studies simply could not be replicated in three replication attempts. These critical reports were initially rejected by major journals like Science and a few others because they didn't want to believe it! "We published this study once, and it is valid. And the fact that it can't be replicated now is probably due to the replicators' approach, because they did a bad job." And then the whole issue suddenly became bigger.
The reproducibility of research results by other researchers is simply a fundamental requirement for scientific research work. And if it doesn't work and you can't reproduce it, you have a problem.
And especially in psychology, positive results spread easily. People are happy that it worked. And then there are lots of lectures by psychologists, tips from psychologists and articles about sales psychology, all built on these positive results from certain studies. They also shape the content of most specialist journals, which is no wonder, because people naturally like to report on new, exciting studies. But attempts to reproduce them often remain unpublished.
We also have this problem in the natural sciences, but it is clearly most severe in psychology. So what are the causes of these problems, of this lack of scientific control? In physics, chemistry and similar fields, people naturally want to replicate right away: a study is published and a lot of scientists around the world immediately start checking whether it works, because they also want to use this knowledge. And so it is very easy to falsify something there.
In psychology, on the other hand, people particularly want to publish new findings, which is especially attractive to young scientists. And in fact, journals in the social sciences simply rejected 95 percent of the replication studies submitted. And a full 54 percent of the reviewers at such journals say that they prefer new studies to replication studies.
And now you can of course ask yourself whether, especially at these journals, there is also a certain fear that a lot of things cannot be reproduced and would thus call the entire journal into question. Why else would they reject so many of the manuscripts about replication studies? And you can see that we suddenly find ourselves in such a tangled situation that people may not even want to try to verify claims made in psychology.
You just believe in it, because our entire "science" could be affected by it. And then the whole thing goes a little further. A lot of research and thought has gone into this over the last twelve years. For example, there is the well-known social psychologist Diederik Stapel. He wrote at least 30 publications with completely made-up data. This was not discovered by replicating the work, but rather through information from his own working group.
There are also new allegations against two other social psychologists, Dirk Smeesters and Jens Förster. They are said to have worked with data that is not entirely clean. Retractions of study results used to occur almost never in psychology, or in the social sciences generally, but they have suddenly increased significantly, with fraud being the main reason. And yes, it is of course a difficult thing when an entire branch of research is more or less called into question and when so many of these studies have such weak points.
Not only is it mainly the "good" studies that get published; those studies are often carried out with samples that are far too small. We are often talking about studies with only 20 to 30 people, which of course carries practically no statistical weight at all. With samples that small, the result can quickly flip depending on whether a single outlier is included or excluded before the calculation. Then you have even less data, or the whole thing points in the other direction, and in a sense you can shape your own result, as the little sketch below illustrates.
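Here is a minimal sketch of that effect, with made-up numbers (not data from any real study): two groups of roughly a dozen people, compared once with and once without a single outlier. With samples this small, that one data point can decide whether the comparison looks significant.

```python
# Hypothetical illustration: with ~12 people per group, one outlier decides the outcome.
# The ratings below are invented (a 1-7 "how funny was it" scale), not real study data.
import numpy as np
from scipy import stats

treatment = np.array([6.1, 5.8, 6.3, 5.9, 6.2, 6.0, 5.7, 6.4, 6.1, 5.9, 6.2, 1.5])  # last value is the outlier
control   = np.array([5.6, 5.9, 5.7, 5.5, 5.8, 5.6, 5.4, 5.9, 5.7, 5.5, 5.8, 5.6])

# Welch's t-test with the outlier included
t_with, p_with = stats.ttest_ind(treatment, control, equal_var=False)

# The same test after dropping that single participant
t_without, p_without = stats.ttest_ind(treatment[:-1], control, equal_var=False)

print(f"p-value with the outlier:    {p_with:.3f}")    # nowhere near significant
print(f"p-value without the outlier: {p_without:.4f}")  # suddenly highly significant
```

Whether the researcher keeps or drops that one participant completely changes the story that gets published.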
And in this respect it is an extreme problem, especially in psychology. 18 percent of the studies also have statistical deficiencies, and 15 percent have errors that were often in favor of the hypothesis (so that you got exactly what you wanted). 15 percent! That is huge! Well, that's crazy. The A/B tests we run have far more data and a much better statistical basis, and yet the criticism there is often that the data isn't really meaningful because you supposedly don't have enough conversions. A rough comparison is sketched below.
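For comparison, here is a rough sketch of how a shop A/B test is evaluated, again with invented visitor and conversion counts: two variants, a few thousand visitors each, and a simple chi-squared test on the conversion numbers.

```python
# Hypothetical A/B test evaluation: thousands of visitors instead of 20-30 lab participants.
# The visitor and conversion counts are invented for illustration.
from scipy.stats import chi2_contingency

visitors_a, conversions_a = 8_000, 240   # variant A (e.g. the current shop)
visitors_b, conversions_b = 8_000, 300   # variant B (e.g. a changed product page)

table = [
    [conversions_a, visitors_a - conversions_a],
    [conversions_b, visitors_b - conversions_b],
]

chi2, p, dof, expected = chi2_contingency(table)
print(f"Conversion rate A: {conversions_a / visitors_a:.2%}")  # 3.00%
print(f"Conversion rate B: {conversions_b / visitors_b:.2%}")  # 3.75%
print(f"p-value: {p:.4f}")  # well below 0.05 with this much traffic
```

With this volume of data, a single unusual visitor can't flip the result the way one outlier does in a 25-person study.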
But many of these scientific studies, especially in psychology, are unfortunately very difficult to rely on.
And this brings me to the point that, when it comes to reproducibility, it has also been found that in many cases even tiny things influence the result. There's this very famous experiment claiming that if people hold a pencil in their mouth and are thereby made to smile a little, their mood improves.
And this is a very old, well-known experiment, and of course it has been blown out of proportion. It goes back to a study by Strack, Martin and Stepper from 1988. In this study, they put forward a facial feedback hypothesis, which goes back to Charles Darwin and states that a changed facial expression is associated with changed, subjectively perceived emotions.
So they formed two groups. One group was given a pencil between their teeth and the other group was asked to hold the pencil with their lips. In a real smile, you use two muscles: the zygomaticus major for the mouth movement and the orbicularis oculi for the eye and cheek movement. This of course differs from a simulated, fake smile, in which the eyes in particular often do not move. We know this from many celebrities who smile only with their mouths, and it always looks a bit strange.
So, with the pencil between their lips or between their teeth, these test groups then had to watch a cartoon video and rate it. The result was that those with the pencil between their teeth rated the cartoon video as significantly funnier than the other participants.
And then word got around that this study showed the pencil really did something. Of course this went public and reached a wide audience with this facial feedback idea. The problem is, it could never really be replicated.
People kept trying to reproduce it, but somehow never really managed to. So was the result really significant? Obviously not, because if you can't reproduce it, then we can skip all the fun; it's of no use. But then one of these professors took up this criticism and tried to reproduce the entire experiment. And it was found that even the smallest changes to the experiment caused changes in the data.
For example, in the reproduction experiments the participants were always filmed with a camera, so they were being monitored. This was not the case in the original experiment, where people were not monitored. And in his replication experiments, he says he discovered something important: namely, that this observation through the camera changes the result so much that people suddenly no longer register the smile as positive feedback. The pencil then made no difference to them, because they felt they were being watched.
So, we can't really trust all this psychology stuff. And we also know this from psychologists and therapists and so on: they can't just push people's buttons.
You can't just say here: "He's depressed, then I have to tell him this and that, and then it'll work again," as might be the case in medicine with medication. But even then you always find out that certain things don't work.
Tip: find out more about the effectiveness of paracetamol. Very interesting that for a relatively large number of people, paracetamol has no effect that goes beyond the placebo effect. Very exciting!
Yes, and in this respect, especially when it comes to online shop design, we have to completely say goodbye to all this psychology nonsense where we simply say, “Yes, that applies to everyone in general”. It all sounds great and convincing, but then it just doesn't work. Of course, we first have to build the shop well based on best practices.
You can also take certain best practices from psychology. But it's actually more about perception and contrasts, about presenting the elements in such a way that they can be perceived and recognized correctly. And then you really have to go into A/B testing and test what reliably works for the visitors who come to my site, for which traffic channels and so on; you also need enough traffic for that, as roughly sketched below. And psychology, conversion psychology, sales psychology: a very difficult topic that only works to an extremely limited extent.
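To give a feeling for how much traffic such a test needs, here is a back-of-the-envelope sample size estimate using the standard two-proportion approximation; the conversion rates, significance level and power are assumptions chosen for illustration, not fixed recommendations.

```python
# Rough sample size estimate for an A/B test (standard two-proportion approximation).
# All numbers here are assumptions: baseline 3.0% conversion, hoped-for 3.75%,
# 5% significance level, 80% power.
from scipy.stats import norm

p1, p2 = 0.030, 0.0375
alpha, power = 0.05, 0.80

z_alpha = norm.ppf(1 - alpha / 2)   # two-sided critical value
z_power = norm.ppf(power)

n_per_variant = ((z_alpha + z_power) ** 2 * (p1 * (1 - p1) + p2 * (1 - p2))) / (p1 - p2) ** 2
print(f"Visitors needed per variant: {n_per_variant:.0f}")  # on the order of several thousand
```

So an effect of that size needs thousands of visitors per variant before it can be called reliable, which is exactly why studies with 20 to 30 people prove very little.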
And now I'd really like to ask you for comments.
If you go to jkd.de/podcast, all of our podcast episodes are available directly from our site and you can leave comments there.
So please, write to me in the comments: what do you think about sales psychology? What good experiences have you had? What bad experiences have you had? I would like to take up the topic again and show a few examples so that we can go deeper into this. On the one hand, to collect which "best practices" with perhaps a psychological background we can recommend and which we should leave behind, and on the other hand, to think about what has no influence and just sounds great.
So jkd.de/podcast. And as always, leave five stars on the podcast service of your choice and simply follow Jörg Dennis Krüger on Instagram and Facebook. I look forward to it.
So all the best and until next time, yours, Dennis Krüger.