Blog


We all like to believe in mind over matter: that we are rational agents with free will, in control of our bodies, impulses and sensations. However, a growing body of research suggests this belief is far from the truth.

Researchers across disciplines and cultures are showing that our bodies are far more involved in our thinking than we like to believe. Our cognitive processes are embedded in a system that spans much of the rest of the body, from the heart to the gut.

The human instinct to avoid social humiliation runs deep. Psychologists identify shame as one of the deepest and most universal human fears, approaching even the fear of death. We all have the impulse to save face, and many of us become particularly defensive, aggressive or withdrawn when that impulse is challenged.

This fear of losing our dignity plays out in important ways in our everyday conversations. It is this very fear that often keeps us advocating opinions long after we have privately abandoned them, lest we embarrass ourselves by admitting our earlier ignorance.

We are already seeing the metaverse transform everything from work to play. This immersive and interactive version of the internet is engaging users across every sphere of life.

It’s worth noting that as recently as mid-2021, the word ‘metaverse’ barely registered in the world of business and technology. While the term itself dates back to science fiction, notably Neal Stephenson’s 1992 novel Snow Crash, the idea of a metaverse entered the mainstream lexicon in late 2021, when Mark Zuckerberg and others began to tout it as the way of the future. Facebook’s high-profile rebranding as Meta in October 2021 only cemented the idea.

Many of the ideas and assumptions we now take for granted were once controversial. Today it is the opposing view that is considered extreme, and those who hold it are likely to be rejected, or simply to stay silent. Majority opinion feels stable, but it is constantly in flux.

Back in the 1830s, French historian and political scientist Alexis de Tocqueville suggested that “As long as the majority is still undecided, discussion is carried on; but as soon as its decision is irrevocably pronounced, everyone is silent, and the friends as well as the opponents of the measure unite in assenting to its propriety.” Known as the Tocqueville effect, this dynamic in human behaviour is alive and well today.[1]

If you have ever found yourself believing that humans use only 10 percent of their brains, that vitamin C cures a cold or that eating carrots improves eyesight, you have experienced the persuasive impact of repetition.[1]

The reality is that none of the statements above is true, but we have each heard them so many times throughout our lives that we naturally come to believe they must be. This is known as the “illusory truth effect.”

Politicians and marketers know well that if we hear something repeated often enough, we are likely to believe it to be true. Repetition is so effective because it plays to the two variables by which we assess whether something is true: whether the information concurs with our existing understanding, and whether it feels familiar.

In a world of increasing polarisation, empathy stands out as a virtue that restores humanity to conversations and is powerful enough to change even the firmest of opinions. And few things evoke empathy as poignantly as first-hand experience.

I vividly remember a simulated experience that forever changed my perspective on the plight of refugees. A few years ago, my family and I spent a few weeks volunteering at an aid agency in Hong Kong called Crossroads. While Crossroads’ core work is the shipment of essential aid and materials to impoverished people around the globe, it is also committed to education and community engagement.

Are you remembering things correctly? What do you do when you know something but can’t quite recall it? What if our memories have evolved over time? What if we are imagining a memory that, in fact, never existed?

What does it take to change a mind? This question has garnered much attention in recent years. As politics becomes more polarised and opinions more entrenched, shifting the beliefs and behaviour of individuals and groups has become a growing priority for many.

Some persuasion practitioners have sought to create formulas or frameworks for doing so. Take Blair Warren’s one-sentence persuasion approach, which promises to capture the secret of meaningful influence in a single 27-word sentence:

Storytelling is one of the hottest buzzwords of the last few years. Far from being a mere trend or fad, though, the power of narrative is unrivalled in its ability to capture and hold a listener’s attention. From science to sociology, narrative consistently emerges as fundamental to human wiring; as a tool of persuasion, its significance is hard to overstate.

The human brain appears to be so wired for narrative that when a story is absent, we will create one in order to make sense of the world around us. Some of the pioneering work examining this dynamic was conducted in the 1940s by the legendary Austrian psychologist Fritz Heider and his colleague Marianne Simmel. Their research centred on showing subjects a simple animation of two triangles, a rectangle and a circle. With only one exception, every single research participant read an entire plot line, complete with drama, love affairs and bullying, into the four shapes on the screen. Without a clear narrative, it is human nature to invent one.[1]

You’ve likely read the headlines about how many millions of jobs will be taken by robots, or what percentage of professions will disappear in the coming years. While some of these predictions are deliberately crafted for dramatic effect, they may well be close to the mark.

The most thorough and widely reported research into potential automation-led job losses in the coming years was conducted by researchers at Oxford University, who found that as much as 47% of total United States employment was at ‘high risk of computerisation’ by the early 2030s[1]: more than 64 million jobs in all.[2]
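A quick back-of-envelope check, using only the figures above: 64 million ÷ 0.47 ≈ 136 million, which is broadly in line with the size of the total US workforce around the time the research was published, so the two numbers are consistent.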
