1
CUSTOMER EXPERIENCE MINDSET
If there’s one reason we have done better than our peers . . . it is because we have focused like a laser on customer experience.
—JEFF BEZOS
Customer experience has become a front-and-center conversation across the world. In fact, 72 percent of businesses say that improving the customer experience is their top priority. Brands that have superior customer experience bring in 5.7 times more revenue than their competitors. Customer experience is a big differentiator, and three trends are behind how this has come to be: the rise of the experience economy, a power shift from the company to the customer, and advances in technology that allow us to create more powerful customer experiences.
While price and quality are still the top considerations for consumers making a purchase decision, in one study 73 percent of respondents said that a good experience is key in influencing their brand loyalties. Providing an experience around a product or service is a smart investment. Companies that differentiate on experience find ways to make the customer’s life easier or better, to make the customer feel special, or to wrap a story around the product. One cannot simply throw up a product that is identical to thousands of other products and expect to be successful.
Today we live in a world where YOLO (“you only live once”) drives purchasing decisions for generation Z (born between 1998 and 2016), as well as millennials (born after 1980). Consumers increasingly value experiences over things, powering the experience economy. By 2020, generation Z will account for 40 percent of all consumers.
Part of what is driving the experience economy is the desire to curate digital lives, which you will read more about in this chapter. In one study, nearly 20 percent of generation Z respondents said they have stayed at a specific hotel or destination in order to score a positive response from followers on posts on their own social media channels. Up to 71 percent of generation Z would get a part-time job to save up for a leisure trip.
Millennials are also saving their money for interesting experiences. More than three in four millennials would choose to spend money on a desirable experience or event over buying an attractive product, and 55 percent of millennials say they’re spending more on events and live experiences than ever before. Recent research from Dr. Thomas Gilovich suggested that people find more lasting happiness investing in experiences than in buying things. The study found that experiences are the glue of our social lives. Experiences enhance social relations and form a person’s identity—even a bad experience makes for a great story to share via social media.
A recent example of how this trend translates into business is LVMH—a luxury retailer we’ll talk more about later in this chapter—which recently purchased the luxury travel company Belmond, with plans to make traditional luxury travel less about opulent hotels and accommodations and more about one-of-a-kind experiences. They realize modern consumers care more about creating an Instagrammable memory than about purchasing the hottest new product. Brands like T-Mobile and Casper are also creating showrooms where customers can experience the product, make a memory, and buy something if they desire. But how did we get here, to a time when people would rather save up for an exotic trip, participate in a compelling adventure, or learn a new skill with friends than buy a thing?
In 2008, the stock market crashed in the worst financial crisis since the Great Depression. I was living and working in New York City at the time, at a conference company, and I will never forget how quiet the streets were. Restaurants once filled with patrons were completely empty. This crisis in the subprime mortgage market in the United States became a full-blown global banking crisis with the collapse of the investment bank Lehman Brothers. The same year, an investor named Bernie Madoff—whose firm was one of the top market makers on Wall Street—was found to have operated the biggest Ponzi scheme in history, a form of fraud that gives the illusion of a sustainable business. Prosecutors estimated the size of the fraud to be almost $65 billion.
Millennials—today the largest living adult generation—were watching as their career plans were disrupted. Young people who went to school to become bankers or lawyers found the jobs had vanished. Their parents and grandparents lost their 401(k) plans or retirement savings. Young people learned that wealth could disappear at a moment’s notice.
The things that had defined their parents’ and grandparents’ lives were not going to be the main achievements of millennials’ lives. More young people opted to live in cities. Millennials opted to focus on their own aspirational dreams, putting off marriage, homeownership, and kids.
I say all this without mentioning the enormous innovation happening in tandem with these events: the internet and the proliferation of the smartphone—which armed 2.5 billion people (as of 2019) with a small computer in their pocket. People were now able to broadcast their lives online. Social media played a big role in enabling young people to create an identity online.
This leads us to modern life, where many people choose to spend their money on experiences over things. According to McKinsey, consumers of all ages are opting for experiences, with millennials—the largest-spending group today—spending the most. Lifecasting has become a popular activity, and people curate their digital footprints with extreme focus and care. Facebook (created in 2004), YouTube (created in 2005), Twitter (created in 2006), and Instagram (created in 2010) ushered in a new era of curation and lifecasting, inspiring users to generate Instagrammable content. The smartphone, in tandem with the viral growth of these social networks, played a critical role in getting us to this inflection point today.
The sharing economy democratized travel, empowering more people to explore the world at better prices. Airbnb launched an offering on its platform called “experiences” to “discover things the locals do.” Airbnb advertises this offering as “activities designed and led by inspiring locals” that immerse guests in unique worlds. Learn to bake with a famous pastry chef in Paris, take a dance class in Havana, Cuba, or get a photography lesson from an expert who will help you photograph the Sydney Opera House at sunset.
People want to feel something: they want to experience all that this world has to offer, and they want to share these experiences with the world. Customer experience must be geared to this modern customer, who values experiences over things and expects the businesses they frequent to empower and assist them as they navigate modern life.
THIS BRINGS US TO THE second trend driving the customer of the future: social media and the shift in power and influence from the corporation to the customer. Social media and smartphones armed customers with a platform and a microphone. Suddenly customers were talking about the experiences they were having, and companies were not prepared to engage with them. Companies thought social media would be a way for them to engage positively with customers, but customers expected brands on sites like Twitter and Facebook to help solve customer service issues. Companies were forced to engage with customers, expanding their social media teams and adding resources to serve customers on multiple channels.
Not only did customers demand companies help them with service needs, but they also had unprecedented power and an ability to share their responses to marketing campaigns. Companies faced backlash when they released ads that were perceived as racist, sexist, or offensive to groups such as veterans. Tone-deaf advertisements resulted in social media protests, and sometimes affected stock prices. Companies struggled to engage in public discourse.
The business world expected social media to empower brands, but instead customers became empowered. Nearly three-quarters of millennials report that their perception of a brand improves when it is clear the company responds to customers’ social media inquiries. But brands were not accustomed to immediate feedback from customers (and from people on the internet who might not even be customers). Brands were required to respond and engage. Brands became more self-conscious of public opinion, now that it could travel fast and light a fire that could bring down a company. From a customer service perspective, there has been a palpable shift in power from the brand to the customer. Social media forced brands to treat customers better, because the world was watching and they were now being held accountable.
THE THIRD TREND THAT SHAPES the current state of customer experience is technology. In the past, customer experience technology was an “above-and-beyond” customer relationship management tool, but today having a customer relationship management tool is table stakes. Technology has empowered businesses to create efficiencies and meet the needs of a new era of demanding customers. More than half of customers actively seek to buy from the most innovative companies. We are seeing exponential growth in technology: Over the last five decades, the number of transistors (the tiny electrical components that perform basic operations) on a single chip has doubled roughly every two years. This is the reason a modern smartphone packs incredible capability into such a small package. Connectivity has proven to be a critical aspect of modern life: Whether they’re in a rice paddy in China or on a cruise off the coast of New Zealand, people expect to be connected to the internet and able to communicate with their family and friends. There are more than 5 billion unique mobile users in the world today, up 100 million (2 percent) in the past year. They all expect to have connectivity wherever they go.
The cloud—which we will talk more about in chapter 6—has made data storage incredibly easy and affordable. Every company today is a technology company. Increasingly, we’re seeing companies call that out—like Allstate, an insurance company that now calls itself a data and technology company.
Technology increasingly shapes customers’ most beloved experiences. In a survey of 15,000 consumers, 73 percent of respondents said they value companies that offer up-to-date technology as part of the customer experience. Technology allows companies to provide zero-friction, seamless customer experiences. Many of these companies are digital natives that deeply value digital innovation. According to Fortune, five of the top ten “most admired companies” are technology companies, including Apple, Amazon, Microsoft, Netflix, and Alphabet (Google). Most brands are reliant on a few key technology companies. Even Netflix—which competes with Amazon’s content services—relies on Amazon to host its content in the cloud through Amazon Web Services. Advances in machine learning and artificial intelligence have enabled a new era of automation, of robots, and of personalization. Companies can do much more with data than they ever could before, not only gaining insights on the past but also predicting the future—enabling them to better anticipate customer needs, gain real-time feedback, optimize pricing, identify customer flight risk factors, staff up or down, or make real-time marketing bets. In a 2018 survey of US senior decision-makers, big data and analytics was listed as the most important emerging technology for enhancing the customer experience.
Customers today get consumer-grade technology experiences in their personal lives, enjoying the use of social media, apps, and Apple products. Consumer-grade technology means apps and devices that are seamless to use, and customers expect the same frictionless experiences from the companies they do business with. Companies today are scrambling to digitize and are focusing on digital transformations, better leveraging technology to solve all kinds of problems with efficiencies, logistics, and supply chain. Companies are in a race to make life easier and better for customers—and they are starting by creating efficiencies for employees. By focusing on issues like logistics and supply chain, companies are better able to get customers products faster and create more personalized customer experiences. In chapter 7, we will look at digital transformation and how companies can do it right. A study at MIT found that companies that have embraced digital transformation are 26 percent more profitable than their peers.
Excerpt from: Blake Morgan, The Customer of the Future (Apple Books).
Excerpt from: Simon Sinek, The Infinite Game, chapter 5.
THE RESPONSIBILITY OF BUSINESS (REVISED)
Business today is subject to a dizzying rate of change. And all that change seems to be taking its toll. The time it takes before a company is forced out of the game is getting shorter and shorter. The average life of a company in the 1950s, if you recall, was just over 60 years. Today it is less than 20 years. According to a 2017 study by Credit Suisse, disruptive technology is the reason for the steep decline in company life span. However, disruptive technologies are not a new phenomenon. The credit card, the microwave oven, Bubble Wrap, Velcro, the transistor radio, television, computer hard disks, solar cells, optical fiber, plastic and the microchip were all introduced in the 1950s. Save for Velcro and Bubble Wrap (which are disruptive in a completely different way), that’s a pretty good list of disruptive technologies. “Disruption” is likely not the cause of the challenge; it’s a symptom of a more insidious root cause. It is not technology that explains failure; it is the leaders’ failure to envision the future of their business as the world changes around them. It is the result of shortsightedness. And shortsightedness is an inherent condition of leaders who play with a finite mindset. In fact, the rise of this kind of shortsightedness over the past 50 years can be traced back to the philosophies of a single person.
In a watershed article from 1970, Milton Friedman, the Nobel Prize–winning economist, who is considered one of the great theorists of today’s form of capitalism, laid out the foundation for the theory of shareholder primacy that is at the heart of so much finite-minded business practice today. “In a free-enterprise, private-property system,” he wrote, “a corporate executive is an employee of the owners of the business. He has direct responsibility to his employers. That responsibility is to conduct the business in accordance with their desires, which generally will be to make as much money as possible while conforming to the basic rules of the society, both those embodied in law and those embodied in ethical custom.” Indeed, Friedman insisted that “there is one and only one social responsibility of business, to use its resources and engage in activities designed to increase its profits so long as it stays within the rules of the game.” In other words, according to Friedman, the sole purpose of business is to make money and that money belongs to shareholders. These ideas are now firmly ingrained in the zeitgeist. Today it is so generally accepted that the “owner” of a company sits at the top of the benefit food chain and that business exists solely to create wealth, that we often assume that this was always the way that the game of business was played and is the only way it can be played. Except it wasn’t . . . and it isn’t.
Friedman seemed to have a very one-dimensional view of business. And as anyone who has ever led, worked for or bought from a business knows, business is dynamic and complicated. Which means it is possible that, for the past 40+ years, we have been building companies with a definition of business that is actually bad for business and undermines the very system of capitalism it proclaims to embrace.
Capitalism Before Friedman
For a more infinite-minded alternative to Friedman’s definition of the responsibility of business, we can go back to Adam Smith. The eighteenth-century Scottish philosopher and economist is widely accepted as the father of economics and modern capitalism. “Consumption,” he wrote in The Wealth of Nations, “is the sole end and purpose of all production and the interest of the producer ought to be attended to, only so far as it may be necessary for promoting that of the consumer.” He went on to explain, “The maxim is so perfectly self-evident, that it would be absurd to attempt to prove it.” Put simply, the company’s interests should always be secondary to the interest of the consumer (ironically, a point Smith believed so “self-evident” that he felt it was absurd to try to prove it, and yet here I am writing a whole book about it).
Smith, however, was not blind to our finite predilections. He recognized that “in the mercantile system the interest of the consumer is almost constantly sacrificed to that of the producer; and it seems to consider production, and not consumption, as the ultimate end and object of all industry and commerce.” In a nutshell, Smith accepted that it was human nature for people to act to advance their own interests. He called our propensity for self-interest the “invisible hand.” He went on to theorize that because the invisible hand was a universal truth (because of our selfish motivations we all want to build strong companies), it ultimately benefits the consumer. “It is not from the benevolence of the butcher, the brewer, or the baker that we can expect our dinner, but from their regard to their own interest,” he explained. The butcher has a selfish desire to offer the best cuts of meat without regard for the brewer or the baker. And the brewer wants to make the best beer, regardless of what meat or bread is available on the market. And the baker wants to make the tastiest loaves without any consideration for what we may put on our sandwiches. The result, Smith believed, is that we, the consumers, get the best of everything . . . at least we do if the system is balanced. However, Smith did not consider a time in which the selfishness of outside investors and an analyst community would put that system completely out of balance. He did not anticipate that an entire group of self-interested outsiders would exert massive pressure on the baker to cut costs and use cheaper ingredients in order to maximize the investors’ gains.
If history or eighteenth-century brogue-tongued philosophers are not your jam, we need simply look at how capitalism changed after the idea of shareholder supremacy took over—which only happened in the final decades of the twentieth century. Prior to the introduction of the shareholder primacy theory, the way business operated in the United States looked quite different. “By the middle of the 20th century,” said Cornell corporate law professor Lynn Stout in the documentary series Explained, “the American public corporation was proving itself one of the most effective and powerful and beneficial organizations in the world.”
Companies of that era allowed average Americans, not just the wealthiest, to share in the investment opportunities and enjoy good returns. Most important, “executives and directors viewed themselves as stewards or trustees of great public institutions that were supposed to serve not just the shareholders, but also bondholders, suppliers, employees and the community.” It was only after Friedman’s 1970 article that executives and directors started to see themselves as responsible to their “owners,” the shareholders, and not as stewards of something bigger. The more that idea took hold in the 1980s and ’90s, the more incentive structures inside public companies and banks became excessively focused on shorter-and-shorter-term gains to the benefit of fewer and fewer people. It’s during this time that the annual round of mass layoffs to meet arbitrary projections became an accepted and common strategy for the first time. Prior to the 1980s, such a practice simply didn’t exist. It was common for people to work a practical lifetime for one company. The company took care of them and they took care of the company. Trust, pride and loyalty flowed in both directions. And at the end of their careers these long-time employees would get their proverbial gold watch. I don’t think getting a gold watch is even a thing anymore. These days, we either leave or are asked to leave long before we would ever earn one.
Capitalism Abuse
The finite-minded form of capitalism that exists today bears little resemblance to the more infinite-minded form that inspired America’s founders (Thomas Jefferson owned all three volumes of Smith’s Wealth of Nations) and served as the bedrock for the growth of the American nation. Capitalism today is, in name only, the capitalism that Adam Smith envisioned over 200 years ago. And it looks nothing like the capitalism practiced by companies like Ford, Kodak and Sears in the late 19th and early 20th centuries, before they too fell prey to finite thinking and lost their way. What many leaders in business practice these days is more an abuse of capitalism, or “capitalism abuse.” As with alcohol abuse, “abuse” means the improper use of something: using it for a reason other than that for which it was intended. And if capitalism was intended to benefit the consumer, and the leaders of companies were to be stewards of something greater than themselves, they are not using it that way today.
Some may say my view—that the purpose of a company is not just to make money but to pursue a Just Cause—is naïve and anticapitalist. First, I would urge us all to beware the messenger. My assumption is that those who most fiercely defend Friedman’s views on business, and many of the current and accepted business practices he inspired, are the ones who benefit most from them. But business was never just about making money. As Henry Ford said, “A business that makes nothing but money is a poor kind of business.” Companies exist to advance something—technology, quality of life or anything else with the potential to ease or enhance our lives in some way, shape or form. That people are willing to pay money for whatever a company has to offer is simply proof that they perceive or derive some value from those things. Which means the more value a company offers, the more money and the more fuel it will have for further advancements. Capitalism is about more than prosperity (measured in features and benefits, dollars and cents); it’s also about progress (measured in quality of life, technological advancements and the ability of the human race to live and work together in peace).
The constant abuse since the late 1970s has left us with a form of capitalism that is now, in fact, broken. It is a kind of bastardized capitalism organized to advance the interests of a few people who abuse the system for personal gain, and it has done little to advance the true benefits of capitalism as a philosophy (as evidenced by anticapitalist and protectionist movements around the globe). Indeed, the entire philosophy of shareholder primacy and Friedman’s definition of the purpose of business were promoted by investors themselves as a way to incentivize executives to prioritize and protect their finite interests above all else.
It is due in large part to Milton Friedman’s ideas, for example, that corporations started tying executive pay to short-term share price performance rather than to the long-term health of the company. And those who embraced Friedman’s views rewarded themselves handsomely. The Economic Policy Institute reported that in 1978, the average CEO made approximately 30 times the average worker’s pay. Since then, the average CEO has seen a nearly 950 percent increase in earnings, while the American worker has seen just over 11 percent. According to the same report, average CEO pay has increased at a rate 70 percent faster than the stock market!
It doesn’t take an MBA to understand why. As Dr. Stout explains in her book, The Shareholder Value Myth, “If 80 percent of the CEO’s pay is based on what the share price is going to do next year, he or she is going to do their best to make sure that share price goes up, even if the consequences might be harmful to employees, to customers, to society, to the environment or even to the corporation itself in the long-term.” When we tie pay packages directly to stock price, it promotes practices like closing factories, keeping wages down, implementing extreme cost cutting and conducting annual rounds of layoffs—tactics that might boost the stock price in the near term, but often do damage to an organization’s ability to survive and thrive in the Infinite Game. Buybacks are another, often legitimate, practice that has been abused by public-company executives seeking to prop up their share price. When a company buys back its own shares, it temporarily increases demand for the stock, which, by the laws of supply and demand, temporarily drives up the price (and temporarily makes the executives look good).
Though many of the practices used to drive up stock prices in the short term sound ethically dubious, if we look back to Friedman’s definition of the responsibility of business, we find that he leaves the door wide open for such behavior, even encourages it. Remember, his only guidance is that companies must act within the bounds of the law and “ethical custom.” I am struck by that awkward phrase, “ethical custom.” Why not just say “ethics”? Does ethical custom mean that if we do something frequently enough it becomes normalized and is thus no longer unethical? If so many companies use regular rounds of mass layoffs, putting people’s livelihoods at stake, to meet arbitrary projections, does that strategy then cease to be unethical? If everyone is doing it, it must be okay.
As a point of fact, laws and “ethical customs” usually come about in response to abuses, not by predicting them. In other words, they always lag behind. Based on the common interpretation of Friedman’s definition, it’s almost a requirement for companies to exploit those gaps to maximize profit until future laws and ethical customs tell them they can’t. Based on Friedman, it is their responsibility to do so!
Technology companies, like Facebook, Twitter and Google, certainly look like they are more comfortable asking for forgiveness as they run afoul of ethical customs, as opposed to leading with a fundamental view of how they safeguard one of their most important assets: our private data. Based on Friedman’s standards, they are doing exactly what they should do.
If we are using a flawed definition of business to build our companies today, then we are likely also promoting people and forming leadership teams best qualified to play by the finite rules that Friedman espoused—leadership teams that are probably the least equipped to navigate the ethical requirements necessary to avoid exploiting the system for self-gain. Built with the wrong goal in mind, these teams are more likely to make decisions that do long-term damage to the very organizations, people and communities they are supposed to be leading and protecting. As King Louis XV of France said in 1757, “Après moi le déluge.” “After me comes the flood.” In other words, the disaster that will follow after I’m gone will be your problem, not mine. A sentiment that seems to be shared by too many finite leaders today.
The Pressure to Play with a Finite Mindset
It’s a big open secret among the vast majority of public-company executives that the theory of shareholder primacy and the pressure Wall Street exerts on them are actually bad for business. The great folly is that despite this knowledge and their private grumblings and misgivings, they continue to defend the principle and yield to the pressure.
I am not going to waste precious ink making a drawn-out argument about the long-term impact of what happened to our country and global economies when executives bowed to those pressures. It is enough to call attention to the man-made recession of 2008, the increasing stress and insecurity too many of us feel at work and a gnawing feeling that too many of our leaders care more about themselves than they do about us. This is the great irony. The defenders of finite-minded capitalism act in a way that actually imperils the survival of the very companies from which they aim to profit. It’s as if they have decided that the best strategy to get the most cherries is to chop down the tree.
Thanks in large part to the loosening of regulations originally introduced to prevent banks from wielding the kind of influence and speculative tendencies that caused the Great Depression of 1929, investment banks once again wield massive amounts of power and influence. The result is obvious—Wall Street forces companies to do things they shouldn’t do and discourages them from doing things they should.
Entrepreneurs are not immune from the pressure either. In their case, there is often intense pressure to demonstrate constant, high-speed growth. To achieve that goal, or when growth slows, they turn to venture capital or private equity firms to raise money. Which sounds good in theory. Except there is a flaw in the business model of private equity that can wreak havoc with any company keen to stay in the game. For private equity and venture capital firms to make money, they have to sell. And it’s often about three to five years after they make their initial investment. A private equity firm or venture capitalist can use all the flowery, infinite game, Cause-focused language they want. And they may believe it. Up until the point they have to sell. And then all of a sudden many will care a lot less about the Just Cause and all the other stakeholders. The pressure investors can exert on the company to do things in the name of finite objectives can be and often is devastating to the long-term prospects of the company. Long is the list of purpose-driven executives who say that their investors are different, that they do care about the company’s Cause . . . until it’s time to sell. (The ones I talked to asked that I not mention the names of their companies for fear of upsetting their investors.)
Business today is subject to a dizzying rate of change. And all that change seems to be taking its toll. The time it takes before a company is forced out of the game is getting shorter and shorter. The average life of a company in the 1950s, if you recall, was just over 60 years. Today it is less than 20 years. According to a 2017 study by Credit Suisse, disruptive technology is the reason for the steep decline in company life span. However, disruptive technologies are not a new phenomenon. The credit card, the microwave oven, Bubble Wrap, Velcro, the transistor radio, television, computer hard disks, solar cells, optical fiber, plastic and the microchip were all introduced in the 1950s. Save for Velcro and Bubble Wrap (which are disruptive in a completely different way), that’s a pretty good list of disruptive technologies. “Disruption” is likely not the cause of the challenge but a symptom of a more insidious root cause. It is less about the technology, per se, and more about the leaders’ failure to envision the future of their business as the world changes around them. It is the result of shortsightedness. And shortsightedness is an inherent condition of leaders who play with a finite mindset. In fact, the rise of this kind of shortsightedness over the past 50 years can be traced back to the philosophies of a single person.
In a watershed article from 1970, Milton Friedman, the Nobel Prize–winning economist, who is considered one of the great theorists of today’s form of capitalism, laid out the foundation for the theory of shareholder primacy that is at the heart of so much finite-minded business practice today. “In a free-enterprise, private-property system,” he wrote, “a corporate executive is an employee of the owners of the business. He has direct responsibility to his employers. That responsibility is to conduct the business in accordance with their desires, which generally will be to make as much money as possible while conforming to the basic rules of the society, both those embodied in law and those embodied in ethical custom.” Indeed, Friedman insisted that “there is one and only one social responsibility of business, to use its resources and engage in activities designed to increase its profits so long as it stays within the rules of the game.” In other words, according to Friedman, the sole purpose of business is to make money and that money belongs to shareholders. These ideas are now firmly ingrained in the zeitgeist. Today it is so generally accepted that the “owner” of a company sits at the top of the benefit food chain and that business exists solely to create wealth, that we often assume that this was always the way that the game of business was played and is the only way it can be played. Except it wasn’t . . . and it isn’t.
Friedman seemed to have a very one-dimensional view of business. And as anyone who has ever led, worked for or bought from a business knows, business is dynamic and complicated. Which means it is possible that, for the past 40+ years, we have been building companies with a definition of business that is actually bad for business and undermines the very system of capitalism it proclaims to embrace.
Capitalism Before Friedman
For a more infinite-minded alternative to Friedman’s definition of the responsibility of business, we can go back to Adam Smith. The eighteenth-century Scottish philosopher and economist is widely accepted as the father of economics and modern capitalism. “Consumption,” he wrote in The Wealth of Nations, “is the sole end and purpose of all production and the interest of the producer ought to be attended to, only so far as it may be necessary for promoting that of the consumer.” He went on to explain, “The maxim is so perfectly self-evident, that it would be absurd to attempt to prove it.” Put simply, the company’s interests should always be secondary to the interest of the consumer (ironically, a point Smith believed so “self-evident” that he felt it was absurd to try to prove it, and yet here I am writing a whole book about it).
Smith, however, was not blind to our finite predilections. He recognized that “in the mercantile system the interest of the consumer is almost constantly sacrificed to that of the producer; and it seems to consider production, and not consumption, as the ultimate end and object of all industry and commerce.” In a nutshell, Smith accepted that it was human nature for people to act to advance their own interests. He called our propensity for self-interest the “invisible hand.” He went on to theorize that because the invisible hand was a universal truth (because of our selfish motivations we all want to build strong companies), it ultimately benefits the consumer. “It is not from the benevolence of the butcher, the brewer, or the baker that we can expect our dinner, but from their regard to their own interest,” he explained. The butcher has a selfish desire to offer the best cuts of meat without regard for the brewer or the baker. And the brewer wants to make the best beer, regardless of what meat or bread is available on the market. And the baker wants to make the tastiest loaves without any consideration for what we may put on our sandwiches. The result, Smith believed, is that we, the consumers, get the best of everything . . . at least we do if the system is balanced. However, Smith did not consider a time in which the selfishness of outside investors and an analyst community would put that system completely out of balance. He did not anticipate that an entire group of self-interested outsiders would exert massive pressure on the baker to cut costs and use cheaper ingredients in order to maximize the investors’ gains.
If history or 18th-century brogue-tongued philosophers are not your jam, we need simply look at how capitalism changed after the idea of shareholder supremacy took over—which only happened in the final decades of the twentieth century. Prior to the introduction of the shareholder primacy theory, the way business operated in the United States looked quite different. “By the middle of the 20th century,” said Cornell corporate law professor Lynn Stout in the documentary series Explained, “the American public corporation was proving itself one of the most effective and powerful and beneficial organizations in the world.”
Companies of that era allowed for average Americans, not just the wealthiest, to share in the investment opportunities and enjoy good returns. Most important, “executives and directors viewed themselves as stewards or trustees of great public institutions that were supposed to serve not just the shareholders, but also bondholders, suppliers, employees and the community.” It was only after Friedman’s 1970 article that executives and directors started to see themselves as responsible to their “owners,” the shareholders, and not stewards of something bigger. The more that idea took hold in the 1980s and ’90s, the more incentive structures inside public companies and banks themselves became excessively focused on shorter-and-shorter-term gains to the benefit of fewer and fewer people. It’s during this time that the annual round of mass layoffs to meet arbitrary projections became an accepted and common strategy for the first time. Prior to the 1980s, such a practice simply didn’t exist. It was common for people to work a practical lifetime for one company. The company took care of them and they took care of the company. Trust, pride and loyalty flowed in both directions. And at the end of their careers these long-time employees would get their proverbial gold watch. I don’t think getting a gold watch is even a thing anymore. These days, we either leave or are asked to leave long before we would ever earn one.
Capitalism Abuse
The finite-minded form of capitalism that exists today bears little resemblance to the more infinite-minded form that inspired America’s founders (Thomas Jefferson owned all three volumes of Smith’s Wealth of Nations) and served as the bedrock for the growth of the American nation. Capitalism today is, in name only, the capitalism that Adam Smith envisioned over 200 years ago. And it looks nothing like the capitalism practiced by companies like Ford, Kodak and Sears in the late 19th and early 20th centuries, before they too fell prey to finite thinking and lost their way. What many leaders in business practice these days is more of an abuse of capitalism, or “capitalism abuse.” As in the case of alcohol abuse, “abuse” is defined as improper use of something. To use something for a reason other than that for which it was intended. And if capitalism was intended to benefit the consumer, and the leaders of companies were to be the stewards of something greater than themselves, they are not using it that way today.
Some may say my view—that the purpose of a company is not just to make money but to pursue a Just Cause—is naïve and anticapitalist. First, I would urge us all to beware the messenger. My assumption is that those who most fiercely defend Friedman’s views on business, and many of the current and accepted business practices he inspired, are the ones who benefit most from them. But business was never just about making money. As Henry Ford said, “A business that makes nothing but money is a poor kind of business.” Companies exist to advance something—technology, quality of life or anything else with the potential to ease or enhance our lives in some way, shape or form. That people are willing to pay money for whatever a company has to offer is simply proof that they perceive or derive some value from those things. Which means the more value a company offers, the more money and the more fuel they will have for further advancements. Capitalism is about more than prosperity (measured in features and benefits, dollars and cents); it’s also about progress (measured in quality of life, technological advancements and the ability of the human race to live and work together in peace).
The constant abuse since the late 1970s has left us with a form of capitalism that is now, in fact, broken. It is a kind of bastardized capitalism that is organized to advance the interests of a few people who abuse the system for personal gain, which has done little to advance the true benefits of capitalism as a philosophy (as evidenced by anticapitalist and protectionist movements around the globe). Indeed, the entire philosophy of shareholder primacy and Friedman’s definition of the purpose of business was promoted by investors themselves as a way to incentivize executives to prioritize and protect their finite interests above all else.
It is due in large part to Milton Friedman’s ideas, for example, that corporations started tying executive pay to short-term share price performance rather than the long-term health of the company. And those who embraced Friedman’s views rewarded themselves handsomely. The Economic Policy Institute reported that in 1978, the average CEO made approximately 30 times the average worker’s pay. Where the average CEO has seen a nearly 950 percent increase in their earnings, the American worker, meanwhile, has seen just over 11 percent in theirs. According to the same report, average CEO pay has increased at a rate 70 percent faster than the stock market!
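The arithmetic those figures imply is worth making explicit: if the 1978 ratio was roughly 30:1 and the two groups' pay then grew at the quoted rates, the ratio today follows by simple scaling. A back-of-the-envelope sketch (the function and the projection are my own illustration, not EPI's published calculation):

```python
# Back-of-the-envelope projection of the CEO-to-worker pay ratio,
# using only the figures quoted in the text. Illustrative arithmetic.

def projected_ratio(base_ratio, ceo_growth_pct, worker_growth_pct):
    """Scale a base pay ratio by each group's cumulative pay growth."""
    ceo_multiplier = 1 + ceo_growth_pct / 100        # +950% -> 10.5x
    worker_multiplier = 1 + worker_growth_pct / 100  # +11%  -> 1.11x
    return base_ratio * ceo_multiplier / worker_multiplier

ratio_now = projected_ratio(base_ratio=30, ceo_growth_pct=950, worker_growth_pct=11)
print(round(ratio_now))  # roughly 284 to 1
```

In other words, a 30:1 ratio compounded at those two growth rates lands near 284:1, which is the scale of gap the passage is describing.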
It doesn’t take an MBA to understand why. As Dr. Stout explains in her book, The Shareholder Value Myth, “If 80 percent of the CEO’s pay is based on what the share price is going to do next year, he or she is going to do their best to make sure that share price goes up, even if the consequences might be harmful to employees, to customers, to society, to the environment or even to the corporation itself in the long-term.” When we tie pay packages directly to stock price, it promotes practices like closing factories, keeping wages down, implementing extreme cost cutting and conducting annual rounds of layoffs—tactics that might boost the stock price in the near term, but often do damage to an organization’s ability to survive and thrive in the Infinite Game. Stock buybacks are another often legitimate practice that has been abused by public company executives seeking to prop up their share price. When a company buys back its own shares, it temporarily increases demand for its stock, which, by the laws of supply and demand, temporarily drives up the price (and temporarily makes the executives look good).
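The buyback effect described above reduces to simple division: total earnings stay the same, but spreading them over fewer shares lifts the per-share numbers the market watches. A hedged illustration with invented figures:

```python
# Invented numbers: a buyback leaves total earnings unchanged but shrinks
# the share count, so earnings per share (EPS) rise without any
# improvement in the underlying business.

def eps(total_earnings, shares_outstanding):
    """Earnings per share: total profit divided by share count."""
    return total_earnings / shares_outstanding

earnings = 100_000_000                    # annual profit, same before and after
shares_before = 50_000_000
shares_after = shares_before - 5_000_000  # firm repurchases 10% of its shares

print(eps(earnings, shares_before))  # 2.0
print(eps(earnings, shares_after))   # ~2.22: same business, higher EPS
```

An 11 percent jump in EPS from the buyback alone, with nothing about the company's products, customers, or prospects having changed.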
Though many of the practices used to drive up stock prices in the short term sound ethically dubious, if we look back to Friedman’s definition of the responsibility of business, we find that he leaves the door wide open for such behavior, even encourages it. Remember, his only guidance for the responsibility companies must obey is to act within the bounds of the law and “ethical custom.” I am struck by that awkward phrase, “ethical custom.” Why not just say “ethics”? Does ethical custom mean that if we do something frequently enough it becomes normalized and is thus no longer unethical? If so many companies use regular rounds of mass layoffs, trading people’s livelihoods to meet arbitrary projections, does that strategy then cease to be unethical? If everyone is doing it, it must be okay.
As a point of fact, laws and “ethical customs” usually come about in response to abuses, not by predicting them. In other words, they always lag behind. Based on the common interpretation of Friedman’s definition, it’s almost a requirement for companies to exploit those gaps to maximize profit until future laws and ethical customs tell them they can’t. Based on Friedman, it is their responsibility to do so!
Technology companies, like Facebook, Twitter and Google, certainly look like they are more comfortable asking for forgiveness as they run afoul of ethical customs, as opposed to leading with a fundamental view of how they safeguard one of their most important assets: our private data. Based on Friedman’s standards, they are doing exactly what they should do.
If we are using a flawed definition of business to build our companies today, then we are likely also promoting people and forming leadership teams best qualified to play by the finite rules that Friedman espoused—leadership teams that are probably the least equipped to navigate the ethical requirements necessary to avoid exploiting the system for self-gain. Built with the wrong goal in mind, these teams are more likely to make decisions that do long-term damage to the very organizations, people and communities they are supposed to be leading and protecting. As King Louis XV of France said in 1757, “Après moi le déluge.” “After me comes the flood.” In other words, the disaster that will follow after I’m gone will be your problem, not mine. A sentiment that seems to be shared by too many finite leaders today.
The Pressure to Play with a Finite Mindset
It’s a big open secret among the vast majority of public-company executives that the theory of shareholder primacy and the pressure Wall Street exerts on them are actually bad for business. The great folly is that despite this knowledge and their private grumblings and misgivings, they continue to defend the principle and yield to the pressure.
I am not going to waste precious ink making a drawn-out argument about the long-term impact of what happened to our country and global economies when executives bowed to those pressures. It is enough to call attention to the man-made recession of 2008, the increasing stress and insecurity too many of us feel at work and a gnawing feeling that too many of our leaders care more about themselves than they do about us. This is the great irony. The defenders of finite-minded capitalism act in a way that actually imperils the survival of the very companies from which they aim to profit. It’s as if they have decided that the best strategy to get the most cherries is to chop down the tree.
Thanks in large part to the loosening of regulations that were originally introduced to prevent banks from wielding the kind of influence and speculative tendencies that contributed to the Great Depression, investment banks once again wield massive amounts of power and influence. The result is obvious—Wall Street forces companies to do things they shouldn’t do and discourages them from doing things they should.
Entrepreneurs are not immune from the pressure either. In their case, there is often intense pressure to demonstrate constant, high-speed growth. To achieve that goal, or when growth slows, they turn to venture capital or private equity firms to raise money. Which sounds good in theory. Except there is a flaw in the business model of private equity that can wreak havoc with any company keen to stay in the game. For private equity and venture capital firms to make money, they have to sell. And it’s often about three to five years after they make their initial investment. A private equity firm or venture capitalist can use all the flowery, infinite game, Cause-focused language they want. And they may believe it. Up until the point they have to sell. And then all of a sudden many will care a lot less about the Just Cause and all the other stakeholders. The pressure investors can exert on the company to do things in the name of finite objectives can be and often is devastating to the long-term prospects of the company. Long is the list of purpose-driven executives who say that their investors are different, that they do care about the company’s Cause . . . until it’s time to sell. (The ones I talked to asked that I not mention the names of their companies for fear of upsetting their investors.)
There is no such thing as constant growth, nor is there any rule that says high-speed growth is necessarily a great strategy when building a company to last. Where a finite-minded leader sees fast growth as the goal, an infinite-minded leader views growth as an adjustable variable. Sometimes it is important to strategically slow the rate of growth to help ensure the security of the long-term or simply to make sure the organization is properly equipped to withstand the additional pressures that come with high-speed growth. A fast-growing retail operation, for example, may choose to slow the store expansion schedule in order to put more resources into training and development of staff and store managers. Opening stores is not what makes a company successful; having those stores operate well is. It’s in a company’s interest to get things done right now rather than wait to deal with the problems high-speed growth can cause later. The art of good leadership is the ability to look beyond the growth plan and the willingness to act prudently when something is not ready or not right, even if it means slowing things down.
From the 1950s to the ’70s, the concept of “forecasting” was considered critical across multiple institutions. Teams of “futurists” were brought in to examine technological, political and cultural trends in order to predict their future impact and prepare for it. (Such a practice may have helped Garmin proactively adapt to advancements in mobile phone technology instead of being forced to react to it.) Even the United States federal government was in on it. In 1972, Congress established the Office of Technology Assessment specifically to examine the long-term impact of proposed legislation. “They’re beginning to realize that legislation will remain on the books for 20 or 50 years before it’s reviewed,” said Edward Cornish, president of the World Future Society, “and they want to be sure that what they do now won’t have an adverse impact years from today.” However, the discipline fell out of favor during the 1980s, with some in government thinking it a waste of money to try to “predict the future.” The office was officially closed in 1995. Though today futurists still exist in the business world, they are usually tasked with helping a company predict trends that can be marketed to rather than assessing future impact of current choices.
Finite-focused leaders are often loath to sacrifice near-term gains, even if it’s the right thing to do for the future, because near-term gains are the ones that are most visible to the market. And the pressure this mindset exerts on others in the company to focus on the near-term often comes at the detriment of the quality of the services or the products we buy. That is the exact opposite of what Adam Smith was talking about. If the investor community followed Smith’s philosophies, they would be doing whatever they could to help the companies in which they invested make the best possible product, offer the best possible service and build the strongest possible company. It’s what’s good for the customer and the wealth of nations. And if shareholders really were the owners of the companies in which they invested, that is indeed how they would act. But in reality, they don’t act like owners at all. They act more like renters.
Consider how differently we drive a car we own versus one we rent, and all of a sudden it will become clear why shareholders seem more focused on getting to where they want to go with little regard to the vehicle that’s taking them there. Turn on CNBC on any given day and we see discussions dominated by talk of trading strategies and near-term market moves. These are shows about trading, not about owning. They are giving people advice on how to buy and flip a house, not how to find a home to raise a family. If short-term-focused investors treat the companies in which they invest like rental cars, i.e., not theirs, then why must the leaders of the companies treat those investors like owners? The fact is, public companies are different from private companies and do not need to conform to the same traditional definition of ownership. If our goal is to build companies that can keep playing for lifetimes to come, then we must stop automatically thinking of shareholders as owners, and executives must stop thinking that they work solely for them. A healthier way for all shareholders to view themselves is as contributors, be they near-term or long-term focused.
Whereas employees contribute time and energy, investors contribute capital (money). Both forms of contribution are valuable and necessary to help a company succeed, so both parties should be fairly rewarded for their contributions. Logically, for a company to get bigger, stronger or better at what it does, executives must ensure that the benefit provided by investors’ money or employees’ hard work goes first, as Adam Smith pointed out, to those who buy from the company. When that happens, it is easier for the company to sell more, charge more, build a more loyal customer base and make more money for the company and its investors alike. Or am I missing something here? In addition, executives need to go back to seeing themselves as stewards of great institutions that exist to serve all the stakeholders, an impact that serves the wants, needs and desires of all those involved in a company’s success, not just a few.
The fact is, we all want to feel like our work and our lives have meaning. It’s part of what it means to be human. We all want to feel a part of something bigger than ourselves. I have to believe this contributes to the reason so many companies say they primarily serve their people and their customers when they are in fact primarily serving their executive ranks and their shareholders. For many of us, even if we don’t have the words, the modern form of capitalism we have just feels like something doesn’t align with our values. Indeed, if we all truly embraced Friedman’s definition of business, then companies would have visions and missions that were solely about maximizing profit and we’d all be fine with it. But they don’t. If the true purpose of business was only to make money, there would be no need for so many companies to pretend to be cause or purpose driven. Saying a business exists for something bigger and actually building a business to do it are not the same thing. And only one of those strategies has any value in the Infinite Game.
1. Introduction
Only a few years after introducing strategy maps to performance management by incorporating them into the balanced scorecard, Kaplan and Norton (2004) remarked that these were “as big an insight to executives as the balanced scorecard itself” (p. 9). It was a significant observation, given that the balanced scorecard became one of the most widely used frameworks in practice (Rigby and Bilodeau, 2015). The power of the map as first introduced stemmed from its purported ability to effectively describe strategy in a cohesive and straightforward way, thereby increasing the likelihood of successful strategy implementation (Kaplan and Norton, 2001). Strategy maps could also be used to aid in formulating strategy, structuring problems, defining measures and objectives, and making decisions (Kaplan and Norton, 2004; Kaplan and Norton, 2006; Lueg and Julner, 2014).
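For readers unfamiliar with the tool: a strategy map is, in essence, a directed graph of hypothesized cause-and-effect links between objectives, typically running from learning-and-growth objectives up through internal process and customer objectives to financial ones. The sketch below is a minimal illustration of that structure; the objectives are invented for the example, not drawn from Kaplan and Norton:

```python
# Hypothetical strategy map as a directed graph: each edge asserts that
# achieving the source objective is believed to support the target.
# Objective names are invented for illustration.

strategy_map = {
    "train frontline staff": ["reduce order errors"],          # learning & growth
    "reduce order errors": ["improve customer satisfaction"],  # internal process
    "improve customer satisfaction": ["grow repeat revenue"],  # customer
    "grow repeat revenue": [],                                 # financial
}

def downstream(objective, graph):
    """All objectives that the given objective ultimately supports."""
    reached, stack = set(), list(graph.get(objective, []))
    while stack:
        node = stack.pop()
        if node not in reached:
            reached.add(node)
            stack.extend(graph.get(node, []))
    return reached

print(sorted(downstream("train frontline staff", strategy_map)))
# ['grow repeat revenue', 'improve customer satisfaction', 'reduce order errors']
```

Tracing the chain from a bottom-level objective to the financial outcome is exactly the “cohesive description of strategy” the map is claimed to provide.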
However, nearly two decades after introducing the strategy map to performance management, evidence suggests that the impact of strategy maps for performance management practice remains limited. There is evidence that few organizations use strategy maps as a part of the balanced scorecard or other performance management framework (Speckbacher et al., 2003; Tapinos et al., 2010), despite these being linked to effective use and satisfaction (Laitinen et al., 2010; Lueg and Julner, 2014). Further, strategy mapping in general often fails to be included in descriptions of the balanced scorecard (see Rigby, 2017), and is seldom used as a standalone tool in practice (Tapinos et al., 2010). In short, it appears that the strategy map and strategy mapping have not realized their potential for performance management.
There are several issues that could explain the lack of impact to date. First, descriptions of the role of strategy maps and how they are meant to work within the balanced scorecard framework have remained vague, often do not specify the outcome intended through their use, or apply overly generalized conceptions of performance (Hoque, 2014; Lueg, 2015; Öllinger et al., 2015). Second, many scholarly works on strategy maps remain normative (Islam, 2018), or take the limited view of the strategy map as a management control device (Tapinos et al., 2010). Despite a few developments (e.g. the possibility of including time delays), this narrow focus contrasts with an evolving discussion of strategy mapping, and of causal mapping more generally, in management and operations research (Hodgkinson and Clarkson, 2005), which has not entered mainstream discussions of the tool within performance management. Rather, discussions of the strategy map as it appears in performance management remain bound to the balanced scorecard framework, which, it should be noted, appears to be on the decline (Rigby and Bilodeau, 2018). Therefore, if the map is to reach its breakthrough potential for performance management, it is useful to consider it separately from the balanced scorecard.
Therefore, the aim of this paper is to revisit a major component of performance management, the strategy map, to thoroughly consider the theory of how it works, and to consider this theory within a specific performance management context. This aim carries two intended contributions: first, specifying purpose and extracting theory can help practitioners better fit maps to purpose and employ them more effectively. This synthesis addresses this aim specifically by offering several propositions inferred from the review results. Second, it aims to allow performance management research and practice to adapt, adjust, and expand existing and emerging theory on maps and mapping beyond that offered in the original balanced scorecard framework. In other words, instead of asking whether strategy maps “work,” the interest of this study is to develop an understanding of the generative mechanisms behind strategy maps:
RQ1. How and in what circumstances do strategy maps contribute to increased organizational performance?
The objective of this paper is to address the research question through a realist synthesis (Pawson, 2006) of empirical studies on the use of strategy maps as a part of a performance management framework. A realist synthesis is a type of systematic literature review that focuses on developing a theory of how a particular tool, framework, program or intervention is meant to work, and then examines the evidence to evaluate the strength of the theory. Because it focuses on theory rather than the tool itself, it is well-suited for evaluating complex interventions like the use of strategy maps, in which there may be multiple, conflicting factors influencing its outcomes. The idea is that by separating the theory from the tool, realist synthesis can facilitate knowledge creation and make it easier to adapt its use to a particular context.
The paper proceeds as follows: first, it explores realist synthesis and the methods of review. Next, results are presented, and then discussed along with implications for research and practitioners.
Downloaded by 50.224.134.150 At 11:12 07 January 2019 (PT)
2. Methodology
Most interest around the strategy map within performance management has maintained Kaplan and Norton’s focus on the technical aspects of strategy maps (see Islam, 2018, for a recent review of these) to the detriment of the sensemaking processes that take place around them. Underlying this focus is a common position within performance management studies that the interpretation of performance information is straightforward, linked to positivism (Micheli and Mari, 2014). These assumptions can be problematic when considering the social aspects of performance management (Beer and Micheli, 2018), a criticism that has been applied to strategy maps (Modell, 2012).
Therefore, a potentially fruitful means of understanding how maps work is to also revisit the philosophical assumptions upon which considerations of the strategy map in performance management have been built.
This paper describes a realist synthesis (Pawson, 2006). In practical terms, the method begins with a guiding question: “What works for whom under what circumstances, how, and why?” (Wong et al., 2013). Underlying this question is a realist philosophy of science, which will be briefly discussed in the following paragraphs as a backdrop to the synthesis method.
2.1 Why realism?
Scientific realism developed largely in response to a criticism that traditional research approaches were limited in their ability to provide explanations because they relied on artificially creating or assuming closed experimental conditions (Sayer, 1992). In most cases, experimental closure is undesirable or impossible because reality is fundamentally open (Bhaskar, 1975). This openness quickly comes into conflict with the more commonly employed Humean view of causality, which seeks to establish scientific laws through the observation of events in succession (Hume, 1967).
Under this empiricist approach, reality is seen as obeying universal laws which can be uncovered through the repeated observation of events. Researchers can then induce the existence of these laws, which can then be tested via statistical methods to establish their validity.
However, scientific practice under the empiricist approach has been criticized because it effectively reduces reality to observable events. In social systems, this position has been cited as especially problematic because it allows for the meaningfulness of social interactions to be completely ignored or greatly reduced (Bhaskar, 1979).
As an alternative, realism adopts a generative view of causality under which cognitive, social and physical entities interrelate to produce events via mechanisms. The primary aim of science under this perspective is to identify these mechanisms and understand their nature in order to improve practice (Bhaskar, 2014, p. v). However, disagreements exist on the meaning of the term “mechanism”, which have complicated its application in practice (Dalkin et al., 2015), and so some further clarification is needed.
First, mechanisms are described as the generally unobservable relations between processes, physical and social structures, and ideas that produce outcomes (Astbury and Leeuw, 2010; Mingers and Standing, 2017), which may operate in different contexts in which other mechanisms may be operating simultaneously. Because of the focus on how mechanisms operate in particular contexts to produce outcomes, realist evaluation often reports results in a “CMO” configuration for context, mechanism and outcome (Pawson, 2013). However, several researchers have pointed out continued confusion on what constitutes a mechanism and what does not (Craver, 2009; Dalkin et al., 2015; Mingers and Standing, 2017). This discussion adopts the view of Mingers (2014), in which the mechanism explains the relation between the entities within a system that gives rise to the outcome of interest.
Before illustrating the concept of mechanism used here, it is important to note that from the realist perspective, mechanisms operate in a stratified reality (Astbury and Leeuw, 2010;
Bhaskar and Danermark, 2006). There are a number of ways in which realists conceive of stratification (Bhaskar, 2010), but what is important here is the concept of emergence, i.e. that the properties of an entity cannot be reduced to any one of its components, but rather emerge from their interaction.
An example using a matchstick can help to illustrate these concepts. At one level, the combination of its chemical composition and the friction of the surface create a process of combustion which, given the right conditions (e.g. the presence of oxygen), will produce a flame. Chemical composition and combustion is the mechanism that explains the outcome of the flame but provide part, but not all of the explanation. For example, to achieve the generation of the flame matches generally cannot be lit under water. Neither will the flame be produced if the wrong technique is used: too much pressure, and the matchstick breaks. Too little, and there will not be enough friction for the reaction to take place.
This type of analysis is open to higher-order considerations such as why the match might be struck in the first place, or the systems of production and infrastructure that could explain its existence. It also includes an interest in secondary outcomes: light a match on an airplane, for example, and the interrelation of various social structures will likely result in the person’s arrest – an emergent outcome which cannot be explained through the match’s chemical properties alone and requires understanding how people make sense of the action.
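As a purely illustrative sketch, the CMO configuration described above can be thought of as a three-part record; the class, field names and example strings below are my own, not drawn from the realist evaluation literature:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CMOConfiguration:
    """A context-mechanism-outcome record, as reported in realist evaluation."""
    context: str    # circumstances in which the mechanism is triggered
    mechanism: str  # relation between entities that generates the outcome
    outcome: str    # result produced when the mechanism fires in this context

# Hypothetical encoding of the matchstick example from the text:
match_cmo = CMOConfiguration(
    context="dry surface, oxygen present, correct striking technique",
    mechanism="friction plus chemical composition producing combustion",
    outcome="a flame",
)

print(match_cmo.outcome)  # a flame
```

Treating each reported finding as such a record is what later allows findings from different studies to be compared against the same proposed mechanism.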
2.2 Why realist synthesis?
Adopting a realist approach to discovery has several implications for how research is carried out and, importantly, how evidence is cumulated and synthesized. Critically, rejecting a view of causality based on events implies that traditional forms of systematic literature review (Tranfield et al., 2003) require revisiting.
Systematic literature review originated in the field of medicine as a means of consolidating existing knowledge. These reviews were meant to increase rigor over traditional, narrative reviews through transparency, inclusivity, and a focus on explanation (Denyer and Tranfield, 2009). Realist synthesis adopts many of the elements of these reviews, but requires adapting explanations into the generative view, adopting a more flexible approach to evidence gathering and collection, and abandoning the traditional hierarchy of evidence in evaluation. These elements and their implications will be discussed below, corresponding with the stages of review, but essentially realist syntheses involve two processes: extracting the theories of how a particular intervention works (the mechanisms) via abductive redescription or abstraction, and evaluating the strength of those theories through a critical examination of the studies uncovered through the search processes.
The following section describes the stages and methods of review, which, following Pawson (2006), include identifying a topic, extracting theory, searching for literature, selecting and appraising evidence, extraction, and analysis and synthesis.
2.3 Identifying the topic of review
The interest of this discussion is in extracting the theory of strategy maps within a performance management context, where with few exceptions, strategy maps are discussed as a part of the balanced scorecard framework. Here, a scoping study revealed generally vague descriptions of how the strategy maps were meant to work, corroborating observations of much literature on the balanced scorecard in general (Hoque, 2014). Therefore, it was thought that a focus on strategy maps would have the greatest potential impact for practitioners and also would benefit performance measurement theory building.
2.4 Extracting the theory of strategy maps within a performance management framework
In a realist synthesis, how an intervention is meant to work often needs to be interpreted or adapted to fit the realist ontology. Even if some research implicitly uses a generative model of causality, few are described initially in such a way (Wynn and Williams, 2012). Others may be useful for evaluating the effectiveness of maps but focus on outcomes whose primary interest is not the direct improvement of organizational performance, e.g. for conflict resolution (Ackermann et al., 2016).
Therefore, a scoping study served to develop an initial classification of potential mechanisms using the foundational texts of the balanced scorecard (e.g. Kaplan and Norton, 2001, 2004, 2006), practitioner resources on the topic (Balanced Scorecard Institute, 2017) and reviews on causal maps and strategy maps (Hodgkinson and Clarkson, 2005; Lueg and Julner, 2014). Theories resulting from the scoping study were refined as the study progressed through a process of abstraction or abductive redescription – in other words, describing how the maps were meant to work in uniform terms to fit performance management.
These were grouped according to their associated performance measurement stage, whether to structure problems, develop, implement or modify a performance management system, or for use as an analysis or communication tool. During the search process, the background section of each study included in the full-text review was evaluated to extract the theory, if present, of how the strategy map or mapping process was meant to work.
The mechanism theory, presented in Section 3, was further divided into hierarchies depending on level, such that the lowest involved largely psychological processes, and the highest considered organizational outcomes. This process and its implications will be explored in the discussion section, but centered on examining how maps could affect organizational properties via the actions of many individuals (Astbury and Leeuw, 2010).
2.5 Search processes
Figure 1 shows an outline of the process for the synthesis. The search for studies to evaluate the propositions began with keyword searches for "performance measurement" in the academic citation databases of Scopus and Web of Knowledge, later expanded to include "causal map" and "strategy map." The searches were intentionally broad to increase the likelihood of including relevant articles in the review, and resulted in 6,583 unique articles. Additional text filters reduced these to 4,225 articles for title and abstract review. The review also relied heavily on the snowball approach, following Denyer et al. (2008), where the references of each selected article were searched for relevant evidence.
2.6 Selection and appraisal of evidence
For the purposes of this review, the definition of performance measurement came from Franco-Santos et al. (2007), who argue that a performance measurement system exists if there are processes of measure design and selection, data capture, and information provision, features performance measures and supporting infrastructure, and has the role of measuring performance. This definition was selected because it encompasses only the necessary conditions of a performance measurement system, and would allow for a wide range of texts to be included.
Selection criteria:
• addresses performance measurement or management in organizations;
• describes an empirical study;
• explores the consequences of the use of strategy or causal maps for either structuring problems, developing performance measures, communicating performance or analyzing performance;
• journal is included in the Scopus Citations Index or Journal Citations Report;
• article is published between 1992 and 2017; and
• results in English.
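Applied in stages, the criteria above amount to a conjunctive filter. The sketch below is a hypothetical illustration (the record fields are invented, not the authors' actual extraction schema), with the indexing requirement relaxed for sufficiently rigorous studies, as was done for Vo et al. (2005):

```python
def meets_criteria(article: dict) -> bool:
    """Return True if a candidate article passes all inclusion criteria."""
    return bool(
        article["addresses_performance_management"]  # PM in organizations
        and article["empirical"]                     # describes an empirical study
        and article["explores_map_consequences"]     # structuring/developing/using maps
        and (article["indexed_journal"] or article["sufficient_rigor"])
        and 1992 <= article["year"] <= 2017
        and article["language"] == "English"
    )

candidates = [
    {"addresses_performance_management": True, "empirical": True,
     "explores_map_consequences": True, "indexed_journal": True,
     "sufficient_rigor": True, "year": 2011, "language": "English"},
    {"addresses_performance_management": True, "empirical": False,  # e.g. purely normative
     "explores_map_consequences": True, "indexed_journal": True,
     "sufficient_rigor": False, "year": 2005, "language": "English"},
]

included = [a for a in candidates if meets_criteria(a)]
print(len(included))  # 1
```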
Selection criteria were applied in stages. Titles and abstracts were reviewed separately to exclude only those articles that did not meet the selection criteria. Articles with the possibility of relevance were passed on for further review and were considered relevant if they could be used to evaluate the developing program theory.
Articles that met all the inclusion criteria that were published in peer-reviewed journals were included, though not all impacted the final synthesis to an equal extent. For example, though the study by Cugini et al. (2011) on the application of strategy maps in a university setting provided an example of a successful implementation, the study mainly focuses on describing the resulting strategically linked scorecard, offering little evidence for evaluating underlying causal mechanisms. On the other hand, studies were also evaluated if they were considered to have sufficient rigor and relevance but were not in either citation index, though only one, that of Vo et al. (2005), was included in this fashion.
Application of the selection criteria resulted in 52 studies which were included in the final review. Of these, more than 60 percent were featured in journals with a 2017 SCImago Journal Rank in the first quartile, with over a third of the studies in three- and four-star journals in the 2018 ABS Academic Journal Guide, both common means of establishing quality (e.g. Franco-Santos et al., 2012).
2.7 Extraction
An extraction form was used to categorize the proposed mechanisms, context, subject, intervention characteristics, and an assessment of relevance and rigor of each of the studies. As it became clear which factors were of particular interest, the extraction form was refined to include the new information, and studies which had been previously examined were examined again to consider any new information. This reflects a recognition that database protocols may need more flexibility in studies on organizations than in the context of evidence-based medicine (Tranfield et al., 2003).
2.8 Analysis and synthesis process
Unlike traditional systematic review, the process of analysis and synthesis takes place alongside assessing relevance and extracting data. Following Pawson (2006) and Wong et al. (2013), full texts were reviewed and analyzed. The logical mode for this process is referred to as abstraction by Pawson (2006) and abductive redescription by Bhaskar (2016), i.e. describing events in a theoretically significant way. The result is an evolving “mechanism sketch” (Craver, 2006), a baseline categorization of the critical features, processes and actors that can explain how strategy maps generate the outcomes of interest.
This baseline, and another key part of the synthesis process, comes from comparing and contrasting findings from the included studies to infer a likely explanation, so that relevant findings could be used to develop specific propositions. Though not discussed specifically by Pawson (2006), the process could be thought of as inference to the best explanation (Lipton, 2004). It is important to note, first, that the same study may support one proposition but not another. Second, because the focus is on generative mechanisms, studies may also inform the evaluation of more than one proposition or mechanism. In this way, the findings of these studies were used to evaluate the mechanisms that were derived in the process of abstraction.
3. A theory of maps for performance management
Performance management refers to a wide range of processes which center on setting goals, defining performance measures, reviewing and acting upon performance data, and the activities that surround these, with the ultimate goal to improve organizational performance (Bititci et al., 2018). Strategy maps have been implicated in any number of these activities, but broadly, their use can be seen as addressing three separate but interrelated performance management stages or processes. These stages can be to structure problems, generally in the form of strategy formation, to select, define, modify or develop an existing performance management component or system, or to communicate, analyze or evaluate performance, here referred to as use. It should be noted that studies within performance management rarely distinguish between these different purposes, which, as will be discussed, have complicated research into strategy maps.
The following section explores how maps are seen to drive the desired positive outcomes of each stage. This theory is the result of abstraction described in the previous section, and its purpose is to provide a high-level framework that facilitates the evaluation of results. Alluding again to the match example where combustion provides a baseline explanation for how a match generates a flame, this section aims to find a baseline explanation as to how a strategy map would generate its outcomes.
A summary of the articles included in this review can be found in Table AI which includes the citation, the methodological approach, propositions addressed, research context, the type of strategy map, its complexity, elicitation technique and, if appropriate, the method of its development.
3.1 Strategy mapping for problem structuring
Strategy maps within performance management were originally presented as a way of “describing strategy” in order to understand it (Kaplan and Norton, 2000). This statement highlights that mapping for structuring problems is an active process which aims to facilitate the generation of ideas, gaining a broader understanding, and ultimately pursuing a more effective strategy. Within management studies, mapping has been used to achieve a wide range of ends. Of interest to this review are the mechanisms that explain how the creation of maps work for strategy formation and execution for an individual, in groups, and finally how these can lead to the pursuit of a more effective strategy and increased organizational performance.
3.1.1 The outcome: what is a structured problem? Broadly, when exploring outcomes for individuals, these studies are concerned with gaining a deeper understanding of an issue. Understanding is discussed as task performance (Öllinger et al., 2015), new knowledge or ideas (Goodier et al., 2010), presenting a diverse range of concepts (Goodier and Soetanto, 2013), or complexity of maps presented (Xu, 2011).
There is also an interest in how participants perceive the strategy or the strategy-making process, which is often pursued in tandem. For example, mapping can be used to change how people feel about the strategy itself, whether by allowing their views to be heard, by separating the ideas from the speaker, or through the motivational effects these can generate (Ackermann and Eden, 2011). Because of this potential, mapping is used for consensus building and conflict resolution (Ackermann and Eden, 2005; Ackermann et al., 2014, 2016). Ultimately, within performance management, the outcomes discussed above are meant to facilitate the pursuit of a more appropriate or effective strategy (Goodier et al., 2010; Jenkins and Johnson, 1997). A full list of outcomes for structuring found in this review is included in Table I.
3.1.2 How are maps meant to help structure problems? Figure 2 presents the mechanisms that were found in the literature that would explain how strategy maps can generate learning, motivation, ownership and, ultimately, the pursuit of a more effective strategy – the outcomes sought through their use as a tool for structuring problems. These outcomes correspond to three levels that have been abstracted from the literature: a psychological level whose outcomes are understanding and motivation, a group or social level where, in addition to reaching a shared, broader understanding, there can be positive changes in attitude, and finally, the generation and selection of an appropriate course of action at the organizational level.
For the individual, maps are meant to lead to understanding by functioning as a kind of mirror, a process referred to here as actualization. By creating a map, the mapper makes ideas about an issue explicit, and thereby can see and reflect upon them. Eden and Ackermann (2018) refer to the map in this process as a “transitional object.” The nature of the knowledge created and how actualization works have been debated extensively (see Hodgkinson and Clarkson, 2005 for an overview) but remain outside the scope of this paper. What is important is that the node-link structure of causal maps specifically is a key component because it allows seeing, reflecting upon and possibly modifying how ideas relate to one another (Eden, 1988).
Groups can achieve consensus or shared understanding, more holistic views of an issue and have more ideas presented in several ways. First, through the actualization process, participants are able to avoid embarrassment and “save face” (Eden, 2004), participate more, and also perceive the process as fair. As a result, participation, motivation and ownership of the strategy formation process increase. This mechanism is referred to here as inclusion.
Second, the visual mapping process allows participants to “piggy back” (Shaw et al., 2009) off one another’s ideas, and so the process has a self-referential effect. This mechanism is referred to here as reinforcement.
The ideas generated through mapping provide multiple alternatives for action beyond those of other techniques, and so allow decision makers to choose a more appropriate course of action through the increased understanding gained through mapping. This mechanism is referred to here as choice.
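The four structuring mechanisms and the levels at which they operate can be summarized in a small lookup table; this is my own condensed reading of the section above, not a reproduction of the authors' Figure 2:

```python
# Mechanism -> level and headline outcome, per the structuring discussion above.
structuring_mechanisms = {
    "actualization": {"level": "individual",
                      "outcome": "understanding via externalized, reviewable ideas"},
    "inclusion":     {"level": "group",
                      "outcome": "participation, motivation and ownership"},
    "reinforcement": {"level": "group",
                      "outcome": "idea generation by piggy-backing on others"},
    "choice":        {"level": "organizational",
                      "outcome": "selection of a more appropriate course of action"},
}

group_level = [name for name, props in structuring_mechanisms.items()
               if props["level"] == "group"]
print(group_level)  # ['inclusion', 'reinforcement']
```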
Figure 2 also includes a number of components which condition whether and the extent to which actualization will take place. These will be considered further when evaluating the evidence but can be divided roughly into the characteristics of the mapper and their environment, including the nature of the problem. As will be discussed, in groups and for the organization these are especially important for explaining (lack of) outcomes.
3.2 Mapping for system development
For the current discussion, “development” refers to processes that aim to alter the state of an existing performance measurement or management system and is meant to include both implementation of a new system and adaptation of existing ones. Within performance management, there is clear interest in using maps for system development and in developing maps themselves (Bourne and Bourne, 2011; Kaplan and Norton, 2004).
3.2.1 What outcomes are sought for development? Generally, the outcome sought during development is selecting or creating an “appropriate” measure, or more broadly, creating a more effective performance measurement system. The terms “appropriate” and “effective” are dependent on their context and take on different meanings in the studies in this review but drew on performance management literature. For example, Lucianetti (2010) investigates the use of strategy maps for translating strategy into operational goals, for adopting new performance measures, and for making cause and effect relationships between measures explicit. Drawing on Neely et al. (1995), Montemari and Nielsen (2013) seek measures that are related to specific goals, controllable, have an explicit management purpose, reflect system causality and provide vision. Studies also seek coherence, completeness, a balance of measures (Cugini et al., 2011; Parisi, 2013) or consensus as to the appropriateness of the included measures (Aranda and Arellano, 2010; Francioli and Cinquini, 2014).
3.2.2 How do maps help develop performance management systems? Development is generally discussed as an extension of the structuring process (Aranda and Arellano, 2010; Parisi, 2013). That is, mapping is meant to assist with the selection of measures or with the attribution of value. In effect, strategy maps help answer "what do we measure?" (Montemari and Nielsen, 2013), either by actualizing the idea, or by providing a sufficiently broad vision of the organization, thus increasing the likelihood that appropriate measures are chosen to be developed and included, or that other performance management system components are adapted to align to strategy.
3.3 A theory of strategy maps for use
Within performance management, the potential of maps for communicating and effectively analyzing organizational strategy and performance has been widely discussed (Francioli and Cinquini, 2014; Kaplan, 2012; Nørreklit et al., 2012). Rather than centering on the process of mapping, this discussion begins when a map has already been formed and codified. The typical form this takes within performance management is a hierarchical map, sometimes arranged into perspectives following the balanced scorecard, of a limited number of performance measures (Kaplan and Norton, 2004). The following sections will consider what these reports have been used to achieve, and how they are meant to achieve it.
3.3.1 What outcomes are sought through use? Strategy maps have primarily been discussed within the context of diagnostic and interactive use (Simons, 1995). That is, there is an interest in evaluating the extent to which the organization has been effective or efficient in its pursuit of the strategy (diagnostic), and also in evaluating the extent to which the current strategy is appropriate (interactive). The interest within performance management centers around how maps can lead to better understanding and decision making, and ultimately to increased organizational performance. For an individual evaluating a map-style report, this review is concerned with how strategy maps effectively communicate performance relative to other types of communication.
Operationalized, the aim of using a strategy map for evaluation can be categorized broadly as enabling improved decision making for the individual, and for the organization consensus, collaboration and double-loop learning (Argyris, 2010). A list of outcomes of interest included in this review is included in Table III.
3.3.2 How do maps work for use? How maps are meant to bring about the outcomes described above can be separated into mechanisms explaining improved decision making at the individual level and at the organizational level. For the individual, the claim is that the node-link structure suits the way the mind works, helping to reduce cognitive load while allowing the inclusion of a more representative depiction of reality (Frederiksen et al., 2011). This mechanism is referred to in Figure 3 as processing.
There is some discussion that suggests that communicating and analyzing strategy maps facilitates understanding and empowerment, which facilitate organizational learning, consensus and strategic alignment (Kaplan and Norton, 2004; Kaplan and Norton, 2006). Because these discussions revolve around both evaluating the extent to which a given strategy has been achieved and also evaluating the appropriateness of the strategy itself, this mechanism is referred to here and appears in Figure 3 as evaluation.
4. Evaluating the evidence
The previous section has outlined how strategy maps are meant to work within a performance management context. However, as with explaining how a match produces a flame, what is also needed is an understanding of the key conditioning components that explain whether a given attempt will produce the intended outcome or not. Therefore, the following section examines the evidence on the circumstances under which each of these mechanisms operates.
6. Conclusions
Two decades after its introduction to the field, the strategy map has the potential to represent a major contribution to contemporary performance management. This review suggests that separating the strategy map from the balanced scorecard could help it realize its potential as a breakthrough theory within performance management. Doing so allows the identification of mechanisms that explain how strategy mapping can facilitate strategy formation, performance measurement system development, and strategy evaluation and communication, which can further lead to the development of more effective applications of the concept.
Realizing the potential of the strategy map will require addressing a mismatch between the research focus to date and organizational reality. To fully utilize strategy maps within performance management, researchers will need to better understand how they interact with other performance management components. Doing so will require shifting focus from evaluative tasks for diagnostic use – representing the majority of research on evaluation – to observing how maps function in organizations and how they can support the overall strategic dialog. Experimental research is helpful for better understanding the behavioral effects of these maps, yet such studies often neglect the difficulty of developing and implementing maps for use in organizations, which generally operate in conditions of frequent strategic change (Porporato et al., 2017). Therefore, a major contribution of this review is to highlight the importance of differentiating these processes in order to analyze how maps work in organizations.
The second contribution of this review is that it begins to separate the theory of strategy maps from any particular tool or framework, which in performance management is generally the balanced scorecard. Through the realist synthesis process, the review offers a “mechanism sketch” (Craver, 2006), a baseline categorization of the critical features, processes and actors that can explain how strategy maps generate the outcomes of interest. Given the realist assumption of openness, the exact way that these features interrelate will vary from situation to situation, but the mechanism should remain constant.
Further, 12 propositions are offered on how strategy maps will work, for which purpose, and in what circumstances. Future research within performance management can build upon these to develop a unique theory of maps that is specific to and useful for the field. More research is needed to understand, for example, how the use of strategy maps for evaluation might lead to unintended, potentially negative impacts when they are combined with existing incentive structures (Cheng and Coyte, 2014; Mastilak et al., 2012), but there is also a need to explore interactions with target setting, defining KPIs, information flows and other performance management components. Doing so opens the possibility of discovering new applications of strategy maps and mapping within performance management.
Separating the theory from the tool is also important because it can help to explain and address failures at different levels. Distinguishing levels could help explain why, for example, strategy maps could effectively improve communications across groups, but lead to poor decision making in an individual evaluative task. The view offered here is that understanding the two requires a consideration of largely different levels, one primarily cognitive, the other situated in and conditioned by organizational-level elements. Perhaps most importantly, a focus on how maps work can help the strategy map to establish its own place within performance management study, and to evolve in the rapidly changing organizational context (Bititci et al., 2012).
The review represents one of very few realist syntheses in management studies, though recent calls for more reviews of this type highlight their perceived potential (Jones and Gatrell, 2014). By focusing on the underlying theory of how strategy maps are meant to work, these types of reviews open new lines of questioning that could be of interest to performance measurement and management.
Although the findings are encouraging, the review is limited in several ways. Perhaps most importantly, by taking a broad view of strategy maps across three stages of performance management, nuance has been sacrificed in the analysis of each. While maintaining sufficient breadth is useful for considering strategy maps within performance management at a high level, future studies will be needed to better establish particular configurations of elements that generate outcomes. This is not a call for lists in the form of context, mechanism and outcome, but rather for continued focus on building nuanced explanations of strategy maps.
The findings of this paper are important for practitioners using or considering adopting strategy maps. First, they highlight that creating strategy maps is a highly accessible activity for achieving shared understanding of what organizations do and how they do it, even among diverse groups of stakeholders. What is significant, and distinct from recent reviews (e.g. Islam, 2018), is that the process of creation drives much of the benefit to be had from the strategy map, and is one that likely requires significantly less investment than many elements of the performance management system. For example, simply attempting to create a strategy map as a group can be a useful exercise that generates consensus. These benefits can be carried over to develop or implement appropriate performance measures, where maps serve as a focal point for discussion linking measures to strategy. Conversely, practitioners should proceed with caution before investing in strategy map-style reports for communicating performance for diagnostic use. Not only are there multiple challenges to developing such reports, but they may also have unintended effects on behavior or simply be ignored.
The original purpose of the strategy map was to describe strategy at a time when intangible assets were being recognized as central to gaining sustainable competitive advantage. In the current global context, characterized by an increasing rate of change, the introduction of disruptive technology and societal shifts, organizations that effectively address complexity will have an advantage over those which cannot (Kelly, 2015). This review suggests that the strategy map is particularly well-suited to addressing this need because of its ability to support consensus building and learning, and therefore could support critical performance management aims in ways that have to date not been fully explored. By considering the theory of how strategy maps work and in what circumstances, both researchers and practitioners alike can move toward realizing the full potential of strategy maps in performance management.
Only a few years after introducing strategy maps to performance management by incorporating them into the balanced scorecard, Kaplan and Norton (2004) remarked that these were “as big an insight to executives as the balanced scorecard itself” (p. 9). It was a significant observation, given that the balanced scorecard became one of the most widely used frameworks in practice (Rigby and Bilodeau, 2015). The power of the map as first introduced stemmed from its purported ability to describe strategy in a cohesive and straightforward way, thereby increasing the likelihood of successful strategy implementation (Kaplan and Norton, 2001). Strategy maps could also aid in formulating strategy, structuring problems, defining measures and objectives, and decision making (Kaplan and Norton, 2004, 2006; Lueg and Julner, 2014).
However, nearly two decades after introducing the strategy map to performance management, evidence suggests that the impact of strategy maps for performance management practice remains limited. There is evidence that few organizations use strategy maps as a part of the balanced scorecard or other performance management framework (Speckbacher et al., 2003; Tapinos et al., 2010), despite these being linked to effective use and satisfaction (Laitinen et al., 2010; Lueg and Julner, 2014). Further, strategy mapping in general often fails to be included in descriptions of the balanced scorecard (see Rigby, 2017), and is seldom used as a standalone tool in practice (Tapinos et al., 2010). In short, it appears that the strategy map and strategy mapping have not realized their potential for performance management.
There are several issues that could explain the lack of impact to date. First, descriptions of the role of strategy maps and how they are meant to work within the balanced scorecard framework have remained vague, often do not specify the outcome intended through their use, or apply overly generalized conceptions of performance (Hoque, 2014; Lueg, 2015; Öllinger et al., 2015). Second, many scholarly works on strategy maps remain normative (Islam, 2018), or take the limited view of the strategy map as a management control device (Tapinos et al., 2010). Despite a few developments (e.g. the possibility of including time delays), this narrow focus contrasts with an evolving discussion of strategy mapping and the related practice of causal mapping in management and operations research (Hodgkinson and Clarkson, 2005), which has not entered mainstream discussions of the tool within performance management. Rather, discussions of the strategy map as it appears in performance management remain bound to the balanced scorecard framework, which, it should be noted, appears to be on the decline (Rigby and Bilodeau, 2018). Therefore, if the map is to reach its breakthrough potential for performance management, it is useful to consider it separately from the balanced scorecard.
Therefore, the aim of this paper is to revisit a major component of performance management, the strategy map, to thoroughly consider the theory of how such maps work, and to consider this within a unique performance management context. There are two intended contributions. First, specifying purpose and extracting theory can help practitioners better fit maps to purpose and allow them to be employed more effectively; this synthesis addresses this aim specifically by offering several propositions inferred from the review results. Second, it aims to permit performance management research and practice to adapt, adjust and expand existing and emerging theory on maps and mapping beyond that offered in the original balanced scorecard framework. In other words, instead of asking whether strategy maps “work,” the interest of this study is to develop an understanding of the generative mechanisms behind strategy maps:
RQ1. How and in what circumstances do strategy maps contribute to increased organizational performance?
The objective of this paper is to address the research question through a realist synthesis (Pawson, 2006) of empirical studies on the use of strategy maps as a part of a performance management framework. A realist synthesis is a type of systematic literature review that focuses on developing a theory of how a particular tool, framework, program or intervention is meant to work, and then examines the evidence to evaluate the strength of the theory. Because it focuses on theory rather than the tool itself, it is well-suited for evaluating complex interventions like the use of strategy maps, in which there may be multiple, conflicting factors influencing its outcomes. The idea is that by separating the theory from the tool, realist synthesis can facilitate knowledge creation and make it easier to adapt its use to a particular context.
The paper proceeds as follows: first, it explores realist synthesis and the methods of review. Next, results are presented, and then discussed along with implications for research and practitioners.
2. Methodology
Most interest around the strategy map within performance management has maintained Kaplan and Norton’s focus on the technical aspects of strategy maps (see Islam, 2018, for a recent review of these) to the detriment of the sensemaking processes that take place around them. Underlying this focus is a common position within performance management studies that the interpretation of performance information is straightforward, linked to positivism (Micheli and Mari, 2014). These assumptions can be problematic when considering the social aspects of performance management (Beer and Micheli, 2018), a criticism that has been applied to strategy maps (Modell, 2012).
Therefore, a potentially fruitful means of understanding how maps work is to also revisit the philosophical assumptions upon which considerations of the strategy map in performance management have been built.
This paper describes a realist synthesis (Pawson, 2006). In practical terms, the method begins with a guiding question: “What works for whom under what circumstances, how, and why?” (Wong et al., 2013). Underlying this question is a realist philosophy of science, which will be briefly discussed in the following paragraphs as a backdrop to the synthesis method.
2.1 Why realism?
Scientific realism developed largely in response to a criticism that traditional research approaches were limited in their ability to provide explanations because they relied on artificially creating or assuming closed experimental conditions (Sayer, 1992). In most cases, experimental closure is undesirable or impossible because reality is fundamentally open (Bhaskar, 1975). This openness quickly comes into conflict with the more commonly employed Humean view of causality, which seeks to establish scientific laws through the observation of events in regular succession (Hume, 1967).
Under this empiricist approach, reality is seen as obeying universal laws which can be uncovered through the repeated observation of events. Researchers can then induce the existence of these laws, which can then be tested via statistical methods to establish their validity.
However, scientific practice under the empiricist approach has been criticized because it effectively reduces reality to observable events. In social systems, this position has been cited as especially problematic because it allows for the meaningfulness of social interactions to be completely ignored or greatly reduced (Bhaskar, 1979).
As an alternative, realism adopts a generative view of causality under which cognitive, social and physical entities interrelate to produce events via mechanisms. The primary aim of science under this perspective is to identify these mechanisms and understand their nature in order to improve practice (Bhaskar, 2014, p. v). However, disagreements exist on the meaning of the term “mechanism”, which have complicated its application in practice (Dalkin et al., 2015), and so some further clarification is needed.
First, mechanisms are described as the generally unobservable relations between processes, physical and social structures, and ideas that produce outcomes (Astbury and Leeuw, 2010; Mingers and Standing, 2017), which may operate in different contexts in which other mechanisms may be operating simultaneously. Because of the focus on how mechanisms operate in particular contexts to produce outcomes, realist evaluation often reports results in a “CMO” configuration for context, mechanism and outcome (Pawson, 2013). However, several researchers have pointed out continued confusion on what constitutes a mechanism and what does not (Craver, 2009; Dalkin et al., 2015; Mingers and Standing, 2017). This discussion adopts the view of Mingers (2014), in which the mechanism explains the relation between the entities within a system that gives rise to the outcome of interest.
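To make the “CMO” configuration concrete, the record a realist reviewer extracts from a study can be caricatured as a small data structure. This is a minimal, hypothetical sketch for illustration only; the field values are invented, not drawn from any study in the review.

```python
from dataclasses import dataclass

# Hypothetical illustration of a realist "CMO" record (Pawson, 2013):
# a context, a proposed mechanism, and the outcome it is thought to generate.
@dataclass(frozen=True)
class CMOConfiguration:
    context: str    # the circumstances in which the mechanism fires
    mechanism: str  # the (often unobservable) relation producing the outcome
    outcome: str    # the observed or intended result

# An example entry a reviewer might extract from a study of strategy mapping:
example = CMOConfiguration(
    context="diverse stakeholder group forming strategy",
    mechanism="inclusion: participation without loss of face",
    outcome="consensus and ownership of the strategy",
)

print(example.mechanism.split(":")[0])  # prints the mechanism's short label
```

The point of the structure is Pawson's: the mechanism field is not an event but a relation, and the same mechanism entry may recur across records with different contexts and outcomes.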
Before illustrating the concept of mechanism used here, it is important to note that from the realist perspective, mechanisms operate in a stratified reality (Astbury and Leeuw, 2010;
Bhaskar and Danermark, 2006). There are a number of ways in which realists conceive of stratification (Bhaskar, 2010), but what is important here is the concept of emergence, i.e. that the properties of an entity cannot be reduced to any one of its components, but rather emerge from their interaction.
An example using a matchstick can help to illustrate these concepts. At one level, the combination of its chemical composition and the friction of the surface create a process of combustion which, given the right conditions (e.g. the presence of oxygen), will produce a flame. The chemical composition and the process of combustion constitute the mechanism that explains the outcome of the flame, but they provide only part of the explanation. For example, matches generally cannot be lit under water. Neither will the flame be produced if the wrong technique is used: too much pressure, and the matchstick breaks; too little, and there will not be enough friction for the reaction to take place.
This type of analysis is open to higher-order considerations such as why the match might be struck in the first place, or the systems of production and infrastructure that could explain its existence. It also includes an interest in secondary outcomes: light a match on an airplane, for example, and the interrelation of various social structures will likely result in the person’s arrest – an emergent outcome which cannot be explained through the match’s chemical properties alone and requires understanding how people make sense of the action.
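The match example can be caricatured in code: the same mechanism (friction plus chemistry producing combustion) generates the outcome only in a conducive context. The function, its argument names and its thresholds are all illustrative assumptions, not part of the source's argument.

```python
# Toy caricature of the match example: the mechanism (friction + chemistry
# -> combustion) only generates the outcome (a flame) in the right context.
# All names and numeric thresholds here are illustrative, not from the source.
def strike_match(friction: float, oxygen: bool, underwater: bool) -> str:
    if underwater or not oxygen:
        return "no flame"           # the context blocks the mechanism
    if friction > 0.9:
        return "broken matchstick"  # too much pressure: wrong technique
    if friction < 0.3:
        return "no flame"           # not enough friction for combustion
    return "flame"                  # mechanism fires in a conducive context

print(strike_match(friction=0.5, oxygen=True, underwater=False))  # flame
print(strike_match(friction=0.5, oxygen=True, underwater=True))   # no flame
```

The sketch also hints at why realists reject event-based causality: observing "strike, then flame" repeatedly says nothing about oxygen or technique, which only become visible when the mechanism itself is theorized.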
2.2 Why realist synthesis?
Adopting a realist approach to discovery has several implications for how research is carried out and, importantly, how evidence is cumulated and synthesized. Critically, rejecting a view of causality based on events implies that traditional forms of systematic literature review (Tranfield et al., 2003) require revisiting.
Systematic literature review originated in the field of medicine as a means of consolidating existing knowledge. These reviews were meant to increase rigor over traditional, narrative reviews through transparency, inclusivity, and a focus on explanation (Denyer and Tranfield, 2009). Realist synthesis adopts many of the elements of these reviews, but requires adapting explanations to the generative view, adopting a more flexible approach to evidence gathering and collection, and abandoning the traditional hierarchy of evidence in evaluation. These elements and their implications will be discussed below, corresponding with the stages of review, but essentially realist syntheses involve two processes: extracting the theories of how a particular intervention works (the mechanisms) via abductive redescription or abstraction, and evaluating the strength of those theories through a critical examination of the studies uncovered through the search processes.
The following section describes the stages and methods of review, which, following Pawson (2006), include identifying a topic, extracting theory, searching for literature, selection and appraisal, extraction, and analysis and synthesis.
2.3 Identifying the topic of review
The interest of this discussion is in extracting the theory of strategy maps within a performance management context, where with few exceptions, strategy maps are discussed as a part of the balanced scorecard framework. Here, a scoping study revealed generally vague descriptions of how the strategy maps were meant to work, corroborating observations of much literature on the balanced scorecard in general (Hoque, 2014). Therefore, it was thought that a focus on strategy maps would have the greatest potential impact for practitioners and also would benefit performance measurement theory building.
2.4 Extracting the theory of strategy maps within a performance management framework
In a realist synthesis, how an intervention is meant to work often needs to be interpreted or adapted to fit the realist ontology. Even if some research implicitly uses a generative model of causality, few are described initially in such a way (Wynn and Williams, 2012). Others may be useful for evaluating the effectiveness of maps but focus on outcomes whose primary interest is not the direct improvement of organizational performance, e.g. for conflict resolution (Ackermann et al., 2016).
Therefore, a scoping study served to develop an initial classification of potential mechanisms using the foundational texts of the balanced scorecard (e.g. Kaplan and Norton, 2001, 2004, 2006), practitioner resources on the topic (Balanced Scorecard Institute, 2017) and reviews on causal maps and strategy maps (Hodgkinson and Clarkson, 2005; Lueg and Julner, 2014). Theories resulting from the scoping study were refined as the study progressed through a process of abstraction or abductive redescription – in other words, describing how the maps were meant to work in uniform terms to fit performance management.
These were grouped according to their associated performance measurement stage, whether to structure problems, develop, implement or modify a performance management system, or for use as an analysis or communication tool. During the search process, the background section of each study included in the full-text review was evaluated to extract the theory, if present, of how the strategy map or mapping process was meant to work.
The mechanism theory, presented in Section 3, was further divided into hierarchies depending on level, such that the lowest involved largely psychological processes, and the highest considered organizational outcomes. This process and its implications will be explored in the discussion section, but centered on examining how maps could affect organizational properties via the actions of many individuals (Astbury and Leeuw, 2010).
2.5 Search processes
Figure 1 shows an outline of the process for the synthesis. The search for studies to evaluate the propositions began with keyword searches for “performance measurement” in the academic citation databases of Scopus and Web of Knowledge, and was later expanded to include “causal map” and “strategy map.” The searches were intentionally broad to increase the likelihood of including relevant articles in the review, and resulted in 6,583 unique articles. Additional text filters reduced these to 4,225 articles for title and abstract review. The review also relied heavily on the snowball approach, following Denyer et al. (2008), in which the references of each selected article were searched for relevant evidence.
2.6 Selection and appraisal of evidence
For the purposes of this review, the definition of performance measurement came from Franco-Santos et al. (2007), who argue that a performance measurement system exists if there are processes of measure design and selection, data capture, and information provision, features performance measures and supporting infrastructure, and has the role of measuring performance. This definition was selected because it encompasses only the necessary conditions of a performance measurement system, and would allow for a wide range of texts to be included.
Selection criteria:
• addresses performance measurement or management in organizations;
• describes an empirical study;
• explores the consequences of the use of strategy or causal maps for either structuring problems, developing performance measures, communicating performance or analyzing performance;
• journal is included in the Scopus Citations Index or Journal Citations Report;
• article is published between 1992 and 2017; and
• results in English.
Selection criteria were applied in stages. Titles and abstracts were reviewed separately to exclude only those articles that did not meet the selection criteria. Articles with the possibility of relevance were passed on for further review and were considered relevant if they could be used to evaluate the developing program theory.
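The staged application of the selection criteria can be sketched as a simple screening filter. This is an illustrative sketch only: the field names, the rule encodings and the sample records are hypothetical, and the real review applied judgment (e.g. on relevance to the program theory) that a mechanical filter cannot capture.

```python
# Hypothetical encoding of the review's selection criteria as screening rules.
# Field names and sample records are invented for illustration.
CRITERIA = [
    lambda a: a["topic"] in {"performance measurement", "performance management"},
    lambda a: a["empirical"],                      # describes an empirical study
    lambda a: a["uses_strategy_or_causal_maps"],   # explores map consequences
    lambda a: a["indexed"],                        # Scopus CI or Journal Citations Report
    lambda a: 1992 <= a["year"] <= 2017,           # publication window
    lambda a: a["language"] == "en",               # results in English
]

def screen(articles):
    """Exclude only articles that clearly fail a criterion; pass the rest on."""
    return [a for a in articles if all(rule(a) for rule in CRITERIA)]

sample = [
    {"topic": "performance measurement", "empirical": True,
     "uses_strategy_or_causal_maps": True, "indexed": True,
     "year": 2010, "language": "en"},
    {"topic": "marketing", "empirical": True,
     "uses_strategy_or_causal_maps": False, "indexed": True,
     "year": 2005, "language": "en"},
]
print(len(screen(sample)))  # 1 article survives screening
```

In the actual review, of course, the title-and-abstract stage only excluded clear failures, and borderline articles were carried forward for full-text judgment rather than filtered mechanically.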
Articles published in peer-reviewed journals that met all the inclusion criteria were included, though not all influenced the final synthesis to an equal extent. For example, though the study by Cugini et al. (2011) on the application of strategy maps in a university setting provided an example of a successful implementation, it mainly focuses on describing the resulting strategically linked scorecard, offering little evidence for evaluating underlying causal mechanisms. Studies were also included if they were considered to have sufficient rigor and relevance but did not appear in either citation index, though only one, that of Vo et al. (2005), was included in this fashion.
Application of the selection criteria resulted in 52 studies which were included in the final review. Of these, more than 60 percent were featured in journals with a 2017 SCImago
Journal Rank in the first quartile, with over a third of the studies in three and four-star journals in the 2018 ABS Academic Journal Guide, both common means of establishing quality (e.g. Franco-Santos et al., 2012).
2.7 Extraction
An extraction form was used to categorize the proposed mechanisms, context, subject, intervention characteristics, and an assessment of the relevance and rigor of each of the studies. As it became clear which factors were of particular interest, the extraction form was refined accordingly, and previously examined studies were reviewed again to capture any new information. This reflects a recognition that review protocols may need more flexibility in studies on organizations than in the context of evidence-based medicine (Tranfield et al., 2003).
2.8 Analysis and synthesis process
Unlike traditional systematic review, the process of analysis and synthesis takes place alongside assessing relevance and extracting data. Following Pawson (2006) and Wong et al. (2013), full texts were reviewed and analyzed. The logical mode for this process is referred to as abstraction by Pawson (2006) and abductive redescription by Bhaskar (2016), i.e. describing events in a theoretically significant way. The result is an evolving “mechanism sketch” (Craver, 2006), a baseline categorization of the critical features, processes and actors that can explain how strategy maps generate the outcomes of interest.
This baseline, and another key part of the synthesis process, comes from comparing and contrasting findings from the included studies to infer a likely explanation, so that relevant findings could be used to develop specific propositions. Though not discussed specifically by Pawson (2006), the process can be thought of as inference to the best explanation (Lipton, 2004). It is important to note that the same study may support one proposition but not another; because the focus is on generative mechanisms, a study may also inform the evaluation of more than one proposition or mechanism. In this way, the findings of these studies were used to evaluate the mechanisms derived in the process of abstraction.
3. A theory of maps for performance management
Performance management refers to a wide range of processes which center on setting goals, defining performance measures, reviewing and acting upon performance data, and the activities that surround these, with the ultimate goal of improving organizational performance (Bititci et al., 2018). Strategy maps have been implicated in any number of these activities, but broadly, their use can be seen as addressing three separate but interrelated performance management stages or processes. These stages are to structure problems, generally in the form of strategy formation; to select, define, modify or develop an existing performance management component or system; or to communicate, analyze or evaluate performance, here referred to as use. It should be noted that studies within performance management rarely distinguish between these different purposes, which, as will be discussed, has complicated research into strategy maps.
The following section explores how maps are seen to drive the desired positive outcomes of each stage. This theory is the result of abstraction described in the previous section, and its purpose is to provide a high-level framework that facilitates the evaluation of results. Alluding again to the match example where combustion provides a baseline explanation for how a match generates a flame, this section aims to find a baseline explanation as to how a strategy map would generate its outcomes.
A summary of the articles included in this review can be found in Table AI which includes the citation, the methodological approach, propositions addressed, research context, the type of strategy map, its complexity, elicitation technique and, if appropriate, the method of its development.
3.1 Strategy mapping for problem structuring
Strategy maps within performance management were originally presented as a way of “describing strategy” in order to understand it (Kaplan and Norton, 2000). This statement highlights that mapping for structuring problems is an active process which aims to facilitate the generation of ideas, gaining a broader understanding, and ultimately pursuing a more effective strategy. Within management studies, mapping has been used to achieve a wide range of ends. Of interest to this review are the mechanisms that explain how the creation of maps work for strategy formation and execution for an individual, in groups, and finally how these can lead to the pursuit of a more effective strategy and increased organizational performance.
3.1.1 The outcome: what is a structured problem? Broadly, when exploring outcomes for individuals, these studies are concerned with gaining a deeper understanding of an issue. Understanding is discussed as task performance (Öllinger et al., 2015), new knowledge or ideas (Goodier et al., 2010), presenting a diverse range of concepts (Goodier and Soetanto, 2013), or complexity of maps presented (Xu, 2011).
There is also an interest in how participants perceive the strategy or the strategy-making process, which is often pursued in tandem. For example, mapping can be used to change how people feel about the strategy itself, whether by allowing their views to be heard, by separating ideas from the speaker, or through the motivational effects these can generate (Ackermann and Eden, 2011). Because of this potential, mapping is used for consensus building and conflict resolution (Ackermann and Eden, 2005; Ackermann et al., 2014, 2016). Ultimately, within performance management, the outcomes discussed above are meant to facilitate the pursuit of a more appropriate or effective strategy (Goodier et al., 2010; Jenkins and Johnson, 1997). A full list of outcomes for structuring found in this review is included in Table I.
3.1.2 How are maps meant to help structure problems? Figure 2 presents the mechanisms found in the literature that would explain how strategy maps can generate learning, motivation, ownership and, ultimately, the pursuit of a more effective strategy – the outcomes sought through their use as a tool for structuring problems. These outcomes correspond to three levels that have been abstracted from the literature: a psychological level whose outcomes are understanding and motivation; a group or social level where, in addition to reaching a shared, broader understanding, there can be positive changes in attitude; and finally, the generation and selection of an appropriate course of action at the organizational level.
For the individual, maps are meant to lead to understanding by functioning as a kind of mirror, a process referred to here as actualization. By creating a map, the mapper makes ideas about an issue explicit, and thereby can see and reflect upon them. Eden and Ackermann (2018) refer to the map in this process as a “transitional object.” The nature of the knowledge created and how actualization works have been debated extensively (see Hodgkinson and Clarkson, 2005 for an overview) but remain outside the scope of this paper. What is important is that the node-link structure of causal maps specifically is a key component because it allows seeing, reflecting upon and possibly modifying how ideas relate to one another (Eden, 1988).
Groups can achieve consensus or shared understanding, more holistic views of an issue, and a greater range of presented ideas, in several ways. First, through the actualization process, participants are able to avoid embarrassment and “save face” (Eden, 2004), participate more, and also perceive the process as fair. As a result, participation, motivation and ownership of the strategy formation process increase. This mechanism is referred to here as inclusion.
Second, the visual mapping process allows participants to “piggy back” (Shaw et al., 2009) off one another’s ideas, and so the process has a self-referential effect. This mechanism is referred to here as reinforcement.
The ideas generated through mapping provide multiple alternatives for action beyond those of other techniques, and so allow decision makers to choose a more appropriate course of action through the increased understanding gained through mapping. This mechanism is referred to here as choice.
Figure 2 also includes a number of components which condition whether and the extent to which actualization will take place. These will be considered further when evaluating the evidence but can be divided roughly into the characteristics of the mapper and their environment, including the nature of the problem. As will be discussed, in groups and for the organization these are especially important for explaining (lack of) outcomes.
3.2 Mapping for system development
For the current discussion, “development” refers to processes that aim to alter the state of an existing performance measurement or management system and is meant to include both implementation of a new system and adaptation of existing ones. Within performance management, there is clear interest in using maps for system development and in developing maps themselves (Bourne and Bourne, 2011; Kaplan and Norton, 2004).
3.2.1 What outcomes are sought for development? Generally, the outcome sought during development is selecting or creating an “appropriate” measure, or more broadly, creating a more effective performance measurement system. The terms “appropriate” and “effective” are dependent on their context and take on different meanings in the studies in this review but drew on performance management literature. For example, Lucianetti (2010) investigates the use of strategy maps for translating strategy into operational goals, for adopting new performance measures, and for making cause and effect relationships between measures explicit. Drawing on Neely et al. (1995), Montemari and Nielsen (2013) seek measures that are related to specific goals, controllable, have an explicit management purpose, reflect system causality and provide vision. Studies also seek coherence, completeness, a balance of measures (Cugini et al., 2011; Parisi, 2013) or consensus as to the appropriateness of the included measures (Aranda and Arellano, 2010; Francioli and Cinquini, 2014).
3.2.2 How do maps help develop performance management systems? Development is generally discussed as an extension of the structuring process (Aranda and Arellano, 2010; Parisi, 2013). That is, mapping is meant to assist with the selection of measures or with the attribution of value. In effect, strategy maps help answer “what do we measure?” (Montemari and Nielsen, 2013), either by actualizing the idea or by providing a sufficiently broad vision of the organization, thus increasing the likelihood that appropriate measures are chosen to be developed and included, or that other performance management system components are adapted to align with strategy.
3.3 A theory of strategy maps for use
IJPPM

Within performance management, the potential of maps for communicating and effectively analyzing organizational strategy and performance has been widely discussed (Francioli and Cinquini, 2014; Kaplan, 2012; Nørreklit et al., 2012). Rather than centering on the process of mapping, this discussion begins when a map has already been formed and codified. The typical form this takes within performance management is a hierarchical map of a limited number of performance measures, sometimes arranged into perspectives following the balanced scorecard (Kaplan and Norton, 2004). The following sections will consider what these reports have been used to achieve, and how they are meant to achieve it.
3.3.1 What outcomes are sought through use? Strategy maps have primarily been discussed within the context of diagnostic and interactive use (Simons, 1995). That is, there is an interest in evaluating the extent to which the organization has been effective or efficient in its pursuit of the strategy (diagnostic), and also in evaluating the extent to which the current strategy is appropriate (interactive). The interest within performance management centers on how maps can lead to better understanding and decision making, and ultimately to increased organizational performance. For an individual evaluating a map-style report, this review is concerned with how effectively strategy maps communicate performance relative to other forms of communication.
Operationalized, the aims of using a strategy map for evaluation can be categorized broadly as enabling improved decision making for the individual and, for the organization, consensus, collaboration and double-loop learning (Argyris, 2010). A list of the outcomes of interest covered in this review is provided in Table III.
3.3.2 How do maps work for use? How maps are meant to bring about the outcomes described above can be separated into mechanisms explaining improved decision making at the individual level and at the organizational level. For the individual, the claim is that, given the way the mind works, the node-link structure is well suited for use, helping to reduce cognitive load while at the same time allowing the inclusion of a more representative depiction of reality (Frederiksen et al., 2011). This mechanism is referred to in Figure 3 as processing.
There is some discussion suggesting that communicating and analyzing strategy maps facilitates understanding and empowerment, which in turn facilitate organizational learning, consensus and strategic alignment (Kaplan and Norton, 2004, 2006). Because these discussions revolve around evaluating both the extent to which a given strategy has been achieved and the appropriateness of the strategy itself, this mechanism is referred to here, and appears in Figure 3, as evaluation.
4. Evaluating the evidence
The previous section has outlined how strategy maps are meant to work within a performance management context. However, as in explaining how a match produces a flame, what is also needed is an understanding of the key conditioning components that explain whether a given attempt will produce a flame or not. Therefore, the following section
6. Conclusions
Two decades after its introduction to the field, the strategy map has the potential to represent a major contribution to contemporary performance management. This review suggests that separating the strategy map from the balanced scorecard could help it realize its potential as a breakthrough theory within performance management. Doing so allows the identification of mechanisms that explain how strategy mapping can facilitate strategy formation, performance measurement system development, and strategy evaluation and communication, which can further lead to the development of more effective applications of the concept.
Realizing the potential of the strategy map will require addressing a mismatch between the research focus to date and organizational reality. To fully utilize strategy maps within performance management, researchers will need to better understand how these maps function alongside other performance management components. Doing so will require shifting focus from evaluative tasks for diagnostic use – representing the majority of research on evaluation – to observing how maps function in organizations and how they can support the overall strategic dialog. Experimental research is helpful for better understanding the behavioral effects of these maps, yet it often neglects the difficulty of developing and implementing them for use in organizations, which generally operate in conditions of frequent strategic change (Porporato et al., 2017). Therefore, a major contribution of this review is to highlight the importance of differentiating these processes in order to analyze how maps work in organizations.
The second contribution of this review is that it begins to separate the theory of strategy maps from any particular tool or framework, which in performance management is generally the balanced scorecard. Through the realist synthesis process, the review offers a “mechanism sketch” (Craver, 2006), a baseline categorization of the critical features, processes and actors that can explain how strategy maps generate the outcomes of interest. Given the realist assumption of openness, the exact way that these features interrelate will vary from situation to situation, but the mechanism should remain constant.
Revisiting strategy mapping

Further, 12 propositions are offered on how strategy maps will work, for which purpose, and in what circumstances. Future research within performance management can build upon these to develop a unique theory of maps that is specific to and useful for the field. More research is needed to understand, for example, how the use of strategy maps for evaluation might lead to unintended, potentially negative impacts when they are combined with existing incentive structures (Cheng and Coyte, 2014; Mastilak et al., 2012), but there is also a need to explore interactions with target setting, the definition of KPIs, information flows and other performance management components. Doing so opens the possibility of discovering new applications of strategy maps and mapping within performance management.
Separating the theory from the tool is also important because it can help to explain and address failures at different levels. Distinguishing between levels could help explain why, for example, strategy maps could effectively improve communication across groups, yet lead to poor decision making in an individual evaluative task. The view offered here is that understanding the two requires consideration of largely different levels, one primarily cognitive, the other situated in and conditioned by organizational-level elements. Perhaps most importantly, a focus on how maps work can help the strategy map to establish its own place within performance management study, and to evolve in the rapidly changing organizational context (Bititci et al., 2012).
The review represents one of very few realist syntheses in management studies, though recent calls for more reviews of this type highlight their perceived potential (Jones and Gatrell, 2014). By focusing on the underlying theory of how strategy maps are meant to work, these types of reviews open new lines of questioning that could be of interest to performance measurement and management.
Although the findings are encouraging, the review is limited in several ways. Perhaps most importantly, by taking a broad view of strategy maps across three stages of performance management, nuance has been sacrificed in the analysis of each. While maintaining sufficient breadth is useful for considering strategy maps within performance management at a high level, future studies will be needed to better establish particular configurations of elements that generate outcomes. This is not a call for lists in the form of context, mechanism and outcome, but rather for continued focus on building nuanced explanations of strategy maps.
The findings of this paper are important for practitioners using, or considering adopting, strategy maps. First, they highlight that creating strategy maps is a highly accessible activity for achieving a shared understanding of what organizations do and how they do it, even among diverse groups of stakeholders. What is significant, and distinct from recent reviews (e.g. Islam, 2018), is that the process of creation drives much of the benefit to be had from the strategy map, and it is a process that likely requires significantly less investment than many elements of the performance management system. For example, simply attempting to create a strategy map as a group can be a useful exercise that generates consensus. These benefits can be carried over to developing or implementing appropriate performance measures, where maps serve as a focal point for discussion linking measures to strategy. Conversely, practitioners should proceed with caution before investing in strategy map-style reports for communicating performance for diagnostic use. Not only are there multiple challenges to developing such reports, but they may also have unintended effects on behavior or simply be ignored.
The original purpose of the strategy map was to describe strategy at a time when intangible assets were being recognized as central to gaining sustainable competitive advantage. In the current global context, characterized by an increasing rate of change, the introduction of disruptive technology and societal shifts, organizations that effectively address complexity will have an advantage over those which cannot (Kelly, 2015). This review suggests that the strategy map is particularly well-suited to addressing this need because of its ability to support consensus building and learning, and therefore could support critical performance management aims in ways that have to date not been fully explored. By considering the theory of how strategy maps work and in what circumstances, both researchers and practitioners alike can move toward realizing the full potential of strategy maps in performance management.
Wednesday, August 29, 2018
STRATEGY, PLANNING, AND PREPARING
Since its high tide in the 1970s, the strategic planning school, led by writers like Igor Ansoff and Peter Lorange, has fallen out of favor. They were the heirs of von Bülow. Henry Mintzberg has played the role of a polemical Clausewitz, his efforts culminating in The Rise and Fall of Strategic Planning, a book of over 400 pages devoted to a detailed critique of the planners, the final chapter of which also tried to salvage something positive from their methods.8 It is sobering to realize that this book appeared as late as 1994. It is sad that it expends so much effort on describing what you should not do, whereas Clausewitz and von Moltke concentrated on what you should do. Mintzberg points out that “formal planning does not create strategy so much as deal with the consequences of strategy created in other ways,”9 ways he describes elsewhere as “crafting strategy.”10 The order is critical: first the strategy, then the plan.
If the notion of strategy as a plan is moribund, the notion of it as a framework for decision making is gaining ground. The change is being led by practitioners. In an article published in 2001 that rediscovers some of the principles of von Moltke’s essay just 130 years later, Kathy Eisenhardt and Don Sull quote examples of companies including Yahoo! and eBay, Dell and Cisco, Miramax and Nortel as conceiving of strategy as “simple rules” which guide decision making.11 The examples make the idea sound new and modern, whereas it is merely enlightened.
Among those adopting the more enlightened view are planners themselves. Daniel Simpson spent nine years as head of strategy and planning at a $3bn consumer goods company headquartered in the US. Disillusioned by the results of planning and the need to absorb much of the literature Mintzberg toiled through (some of which, he opines, is “not very helpful” and a portion of which he describes as “complete rubbish”), Simpson concludes that the keys to success are “an overall sense of direction and an ability to be flexible.”12 The example of successful practice he quotes is Welch’s “planful opportunism,” one case in which we know for certain that von Moltke was a direct influence. Welch himself had great influence in this area, not only because of the status of GE and his record, but also because at the beginning of his tenure GE was generally recognized as the leading exponent of strategic planning. Welch’s view was that as a result strategic thinking had almost disappeared, and in 1984 he dismantled the planning system.13
Simpson adds an interesting comment after citing Welch. “I think more successful companies are developed through this sort of planful opportunism than through the vision of an exceptional CEO,” he writes. “They aren’t in the media spotlight as much as companies with the visionary CEO, but they are more common.”14 This is no surprise; exceptional CEOs are by definition uncommon. However, it also demonstrates that the original intention of the Prussian reformers, to create an intelligent organization whose performance did not depend on its being led by a genius, is as true in business. And if there is evidence that our thinking about strategy is catching up with von Moltke’s, at this point we are still behind. We are extraordinarily reluctant to admit that luck plays a part in business success. The media create a cult of CEO heroes and their salaries are now such that restless shareholders have become rebellious. We would do well to remember that while a leader’s reputation is ultimately based on success, “how much of it is in fact down to his own efforts is very hard to say.”
This is a serious matter. A recent scholarly article argues that the greater a CEO’s celebrity, the greater their perceived control over the actions and performance of their firm. This leads CEOs to continue to take actions associated with their own celebrity, and to create hubris.15 This poses a double jeopardy: the delusion that one can control external events (i.e., a denial of friction); and the delusion that one is solely responsible for success, with a concomitant tendency to command a great deal more than is necessary (i.e., a reversal of a core principle of mission command). Hubris encourages a return to the deadly cycle of organizational stagnation we examined in Chapter 2. As I pointed out above, because friction is rooted in human finitude, ignoring it is to play at being God. To attribute to CEO-heroes the ability to control events and be immune to good or bad luck is at heart a metaphysical worldview reminiscent of Greek polytheism or even, at the extreme, medieval theology.
Strategy, then, demands a certain type of thinking. It sets direction and therefore clearly encompasses what von Moltke calls a “goal,” “aim,” or “purpose.” Let us call this element the aim. An aim can be an end-point or destination, and aiming means pointing in that direction, so it encompasses both “going west” and “getting to San Francisco.” The aim defines what the organization is trying to achieve with a view to gaining competitive advantage. How we set about achieving the aim depends on relating possible aims to the external opportunities offered by the market and our internal capabilities. The process of thinking strategically involves relating three points of a triangle, as in Figure 11.
A good strategy creates coherence between our capabilities, the opportunities we can detect, and our aims. Different people have a tendency to start with, and give greater weight to, one or other of these three factors. Where they start from does not matter. Where they end up does. The result must be cohesion. If any one of these factors floats off on its own, dominates thinking at the expense of the others, or is simply mismatched, then in time perdition will follow.
The strategy triangle confronts us with the first observation von Moltke makes about the nature of strategy: reciprocity between ends and means. Both are ambiguous and interdependent. In most of our day-to-day problems, the end is a given. It is fixed and we just have to work out the means of achieving it. In Figure 11, the two-headed arrows indicate that our consideration of the means (our capabilities and the opportunities we face) codetermines the ends (our aims).
Reciprocity pervades not only strategic thinking but decision making and action. Because the effects of our actions depend not merely on what we do but on the actions of other independent wills, strategy will need to adapt to the newly created situations which result. It is thus a “system of expedients.” The task of strategy is not completed by the initial act of setting direction. Strategy develops further as action takes place, old opportunities close off, new ones arise, and new capabilities are built. The relationship between strategy development and execution is also reciprocal. Doing strategy means thinking, doing, learning, and adapting. It means going round the loop. The reappraisal of ends and means is continuous.
In assessing ends and means, we have above all to be realistic. Developing strategy is an intellectual activity. It involves discerning facts and applying rationality. Leadership is a moral activity. It involves relating to people and generating emotional commitment. Developing a strategy around pre-existing emotional commitments is courting disaster. When people convince themselves that they have the capability to do something that in fact they do not, just because a lot of other people seem to be doing so, or convince themselves that the market will love the latest thing to pop out of R&D, just because their own engineers love it, strategies fail. When companies set themselves the aim of growing from an also-ran to a market leadership position in two years simply because doing so will boost the CEO’s share options, shareholders’ money is squandered on failed acquisitions and hopeless investments.
Many of the best-known strategy development tools – such as Porter’s five forces and value chain models, the matrices for displaying competitive position used by BCG or McKinsey, cost analysis, supply curves, market segmentation, and so on – are in fact tools for analyzing the situation and trying to work out what drives success. Useful though they are, they do not produce strategies. They help to sort out information, simplify the complexities of reality, and focus attention on the essentials of the situation, internal or external. They are only effective if they generate insight into the basis of competition. A notion central to Clausewitz’s thinking about strategy was that war aims and the strategy adopted to realize them should be developed from an understanding of what I am calling the “basis of competition,” and what he called the enemy’s “center of gravity.” “Making out this centra gravitatis in the enemy’s war effort,” he wrote, “to identify its spheres of influence, is a central point of strategic judgment.”16 The term, like friction, is borrowed from mechanics:
Just as the center of gravity is always to be found where the greatest mass is brought together, and just as every blow delivered against the load’s center of gravity is the most effective… so it is in war. The forces of every protagonist, whether a single state or an alliance of partners, have a certain unity, and by virtue of this some coherence; it is where there is coherence that we find analogies to a center of gravity. There are therefore certain centers of gravity in these forces, the movement and direction of which govern other points, and these centers of gravity are to be found where the largest forces are gathered.17
So it is in business too. Businesses engage in a vast range of activities. The art of strategic thinking is to identify which of them is the decisive differentiator, the determinant of competitive advantage. It involves mastering and sorting through a vast range of activities and simplifying them accurately down to the essentials which make the difference. The true strategist is a simplifier of complexity. Not many people can consistently do it well.
Clausewitz knew that. Indeed, so rare did he judge the qualities leading to strategic insight to be, that he gave the chapter in which he describes them the title “Military Genius.”18 We should treat this much-abused term with caution. Clausewitz was using it in the precise sense defined by Kant: genius is a gift of nature which intuitively develops the rules of human practices such as the arts.19 Clausewitz’s comments are worth quoting:
If he is to successfully prevail in this constant struggle with the unexpected, then two qualities are essential: firstly a mind which even in this heightened darkness is not without some shafts of inner light which lead him to the truth, and then the courage to follow that dim light. The first can be characterized with the French expression coup d’oeil and the second is conviction.20
This sounds a bit dangerous. It could be an excuse for stubbornness, for not listening, for bees in the bonnet and private agendas. That is why it is rare. The key is determination based on insight. Clausewitz realized this:
There are people who possess a highly refined ability to penetrate the most demanding problems, who do not lack the courage to shoulder many burdens, but who nevertheless cannot reach a decision in difficult situations. Their courage and their insight stand apart from each other, never meet, and in consequence they cannot reach a decision. Conviction results from an act of mind which realizes that it is necessary to take a risk and by virtue of that realization creates the will to do so… the sign of a genius for war is the average rate of success.21
The phenomenon of making good judgments in uncertainty has since been the object of careful examination. It is about the use of intuition.
Psychologist Gary Klein has made a study of intuitive decision making. By observing experts in a given field in situations in which they made decisions, Klein realized that they did not follow the conventional “rational model” of developing and evaluating options before choosing between them. They seemed to go straight to the answer, using what appeared to nonexperts, and indeed often to themselves, to be a “sixth sense.” On analysis, the sixth sense turned out to be perfectly rational. It was based on pattern recognition. Through years of experience in their field, experts build up patterns of expectation, and notice immediately when something unusual occurs which breaks the pattern. These signals make the “right” decision obvious to them. It looks to others and feels to them to be intuitive, but the intuition is schooled, and rational. Clausewitz gives it the French name coup d’oeil, the “glance of a practiced eye.” Germans more usually refer to Fingerspitzengefühl, the “feeling in your fingertips.” In the Anglo-Saxon world things take place more viscerally – it is “gut feeling.” Whatever the language, schooled intuition is the basis of insight.22 It was this discipline which von Moltke mastered in his domain of military strategy.
Insights into the center of gravity of a business and hence innovative strategies tend to come from people of long experience who have an unusual capacity to reflect on that experience in such a way that they become aware of the patterns it shows. This awareness enables them to understand how all the elements of their experience relate to each other so that they can grasp and articulate the essentials. Because of this, what to others is a mass of confusing facts is to them a set of clear patterns making the answer to many problems obvious.
Hence they have the courage to act. Because they base their decisions on that understanding, and because that understanding is sound, they tend in the long run to get more things right than wrong and so demonstrate the above-average success rate that Clausewitz identifies as marking them out. We tend to speak of them as having “good judgment.” In their field they do. But because it is grounded in pattern recognition, the quality of their judgment is dependent on context and they do not necessarily display it in every area of human activity.23
A short story may illustrate the point.
A few years ago, I visited a manufacturer of domestic boilers. At the time, the company was number three in the market and was not only making good returns but gaining share, closing the gap with the number two player. I asked all the top executives why the company was so successful. One said it was the quality of the product – but he admitted that the differences with competitors’ products were small. One said it was the brand – but had to admit that the market leader’s brand was also very strong. So it went on: R&D, technology, production efficiency, delivery times, customer service – all had their advocates, but none in itself felt compelling.
My last interview was with the managing director. I asked him once again why the business was so successful. “Let me tell you how our business works,” he said. “Almost all of our domestic business is for replacement of existing boilers. People replace boilers when their existing ones break down. What do you do when your boiler breaks down? You call the installer,” he continued, answering his own question. “When he tells you the boiler is too old to repair because he can’t get the parts, what do you do?” He paused. “I’ll tell you. You do what he suggests. And when you ask him which new boiler to install, he tells you that too. So 90 percent of all purchasing decisions are made by the installer.” He paused to let this sink in. “Our business,” he said deliberately, “is about service to the installer.” “But I am the only person around here who gets that. They all think I’m an old man with a bee in his bonnet.” He looked me in the eye. “We are being successful because we offer our installers better service than any of our competitors. But we can do even better. I know that if we gear up the whole company toward optimizing service to the installer, right across the value chain, we can become market leader.”
It all seemed very simple. It made perfect sense. The company was clearly doing more to enhance service to the installer than any other player in the market. Everyone knew that it was important – but so were lots of other things. The managing director was the only one there who regarded it as essential. He knew every detail of his business, built up over 30 years of experience. He did not only know every tree in his particular wood, he could describe the state of the bark on each one. However, he was the only one who could readily describe the shape of the wood. He had grasped the basis of competition, the center of gravity of the business, and hence the source of its competitive advantage.
This informed all his operational decisions. He wanted to increase the number of visits installers paid to the company’s site – which was already more than any of their rivals – and build a new training center. He was obsessed with the quality of its installation literature. He was ready to invest whatever it took to increase spare parts availability at the distributors so that installers did not waste time waiting for a part. He wanted the new range of boilers the company was just developing to be energy efficient, quiet, and reliable, but above all he wanted them to be easy to install. And so on. And it was working.
He wanted to run some strategy workshops to focus all his top team on optimizing service to the installer. They were already making their implicit strategy happen, but as it became explicit and the top team grew more aligned, so decision making and execution became more focused. At the time of writing the company has overtaken the number two player, and is closing the gap with the market leader.
In this example, service to the installer is the source of competitive advantage my friends are seeking to exploit. Their aim is to achieve leadership of their chosen segments. They have identified becoming the supplier of choice to the installer as an opportunity across the market, and by excelling at that they are unhinging the position of their major competitors. They already have the capabilities to do so, but they are investing further in those capabilities and creating others. They are doing what all successful strategists do, which is to build further on their existing strengths. They therefore have a coherent strategy – they have linked up all three corners of the strategy triangle.
Their capabilities took time to build and have become complex and interlocking. They have allowed the company to build a position in the market which is sustainable because they also create barriers around it, making it difficult for competitors to do the same thing as well as they do. The proposition they offer installers is a powerful one. That results in further intangible advantages such as their reputation. Their proposition has become hard to copy, and by continuing to invest in its strengths, the company is maintaining its advantage. Their strategy informs all their decisions and their operational plans. It is being pursued as a central idea under continually evolving circumstances.
Their competitors are having to play a similar game, because service to the installer is the center of gravity of the business as a whole. Other businesses admit of more than one center of gravity. In the airline business, one can compete on the basis of service, focusing on the business traveler, but in the last decade some have realized that another option is to compete on price, and the low-cost airline – offering a very different value proposition – has changed the business as a whole, based on an insight into another set of market opportunities and a different set of corresponding capabilities. Centers of gravity are not static. For example, changes in technology have altered the basis of competition in the computer business from the period of the CPU, through the distributed server, to the PC, to the laptop. Failing to shift its position fast enough, the original dominant player, IBM, lost its position, went through a crisis, and has emerged as a survivor in a very different and more diverse competitive landscape.
Identifying the competitive center of gravity is a first step in setting direction and will inform further decisions. The most fundamental strategic decisions are those determining the compass heading and/or destination. From those follow further decisions about investment, resource allocation, and actions. The direction has to be turned into a path, the route of which is always informed by the center of gravity, but which also takes account of changing circumstances. That means that making the strategy happen will require a whole series of decisions on the part of a wide range of people.
Being made in the context of strategy, those decisions will have the reciprocal relationship between ends and means that is characteristic of it. As they involve overall direction, they will tend to be cross-functional and, as von Moltke observed, they will tend to be “one-offs” because every situation is unique. If we approach them with the natural, intuitive decision-making approach described by Gary Klein, we run a serious risk of getting things wrong. Unless we are strategy specialists (as some consultants are), it is unlikely that our experience base will be appropriate and we may tend to prejudge an issue as being of a certain type. That is the main reason most of the functional executives in the boiler company could not see that service to the installer was the center of gravity. They all knew that it was important. There is an enormous difference between knowing that something is important and realizing that it is the basis of competition.
Having an inappropriate experience base is dangerous when the nature of the issue itself is at stake. We are also liable to become emotionally anchored on a certain solution or type of solution. We therefore need to put together a diverse team and run a disciplined process of going round the loop, moving from the framing of the question itself, through option generation, to option evaluation, and back to reframing the question. It is a characteristic of high-performance teams that they go round the loop more quickly and reframe more often than average ones. It is usually reframing that generates creative solutions. It is because it involves systematic, “going-round-the-loop” thinking rather than linear thinking that von Moltke can refer to strategy as a “free practical art.”
In order to provide guidance for decision making under continually evolving circumstances, strategy can be thought of as an intent.
Excerpt from: Bungay, Stephen. “The Art of Action: Leadership that Closes the Gaps between Plans, Actions and Results”. iBooks.