Shyam's Slide Share Presentations


This article/post is from a third-party website. The views expressed are those of the author. We at Capacity Building & Development may not necessarily subscribe to them completely. The relevance and applicability of the content is limited to certain geographic zones; it is not universal.


Monday, March 20, 2017

The case for digital reinvention 4

As executives assess the scope of their investments, they should ask themselves if they have taken only a few steps forward in a given dimension—by digitizing their existing customer touchpoints, say. Others might find that they have acted more significantly by digitizing nearly all of their business processes and introducing new ones, where needed, to connect suppliers and users.
To that end, it may be useful to take a closer look at Exhibit 6, which comprises six smaller charts. The last of them totals up actions companies take in each dimension of digitization. Here we can see that the most assertive players will be able to restore more than 11 percent of the 12 percent loss in projected revenue growth, as well as 7.3 percent of the 10.4 percent reduction in profit growth. Such results will require action across all dimensions, not just one or two—a tall order for any management team, even those at today’s digital leaders.

Looking at the digital winners

To understand what today’s leaders are doing, we identified the companies in our survey that achieved top-quartile rankings in each of three measures: revenue growth, EBIT growth, and return on digital investment.

We found that leading companies are more than twice as likely as the rest to tie their digital and corporate strategies closely together. What’s more, winners tend to respond to digitization by changing their corporate strategies significantly. This makes intuitive sense: many digital disruptions require fundamental changes to business models. Further, 49 percent of leading companies invest in digital more heavily than their counterparts do, compared with only 5 percent of the laggards, 90 percent of which invest less than their counterparts. It’s unclear which way the causation runs, of course, but it does appear that heavy digital investment is a differentiator.

Leading companies not only invested more but also did so across all of the dimensions we studied. In other words, winners exceed laggards in both the magnitude and the scope of their digital investments (Exhibit 7). This is a critical element of success, given the different rates at which these dimensions are digitizing and their varying effect on economic performance. 

Strengths in organizational culture underpin these bolder actions. Winners were less likely to be hindered by siloed mind-sets and behavior or by a fragmented view of their customers. A strong organizational culture is important for several reasons: it enhances the ability to perceive digital threats and opportunities, bolsters the scope of actions companies can take in response to digitization, and supports the coordinated execution of those actions across functions, departments, and business units.

Bold strategies win

So we found a mismatch between today’s digital investments and the dimensions in which digitization is most significantly affecting revenue and profit growth. We also confirmed that winners invest more, and more broadly and boldly, than other companies do. Then we tested two paths to growth as industries reach full digitization.

The first path emphasizes strategies that change a business’s scope, including the kind of pure-play disruptions the hyperscale businesses discussed earlier generate. As Exhibit 8 shows, a great strategy can by itself retrieve all of the revenue growth lost, on average, to full digitization—at least in the aggregate industry view. Combining this kind of superior strategy with median performance in the nonstrategy dimensions of McKinsey’s digital-quotient framework—including agile operations, organization, culture, and talent—yields total projected growth of 4.3 percent in annual revenues. (For more about how we arrived at these conclusions, see sidebar “About the research.”)

Most executives would fancy the kind of ecosystem play that Alibaba, Amazon, Google, and Tencent have made on their respective platforms. Yet many recognize that few companies can mount disruptive strategies, at least at the ecosystem level. With that in mind, we tested a second path to revenue growth (Exhibit 9).

In the quest for coherent responses to a digitizing world, companies must assess how far digitization has progressed along multiple dimensions in their industries and the impact that this evolution is having—and will have—on economic performance. And they must act on each of these dimensions with bold, tightly integrated strategies. Only then will their investments match the context in which they compete.



The case for digital reinvention 3

Instead, the survey indicates that distribution channels and marketing are the primary focus of digital strategies (and thus investments) at 49 percent of companies. That focus is sensible, given the extraordinary impact digitization has already had on customer interactions and the power of digital tools to target marketing investments precisely. By now, in fact, this critical dimension has become “table stakes” for staying in the game. Standing pat is not an option.

The question, it seems, looking at exhibits 4 and 5 in combination, is whether companies are overlooking emerging opportunities, such as those in supply chains, that are likely to have a major influence on future revenues and profits. That may call for resource reallocation. In general, companies that strategically shift resources create more value and deliver higher returns to shareholders. This general finding could be even more true as digitization progresses.

Our survey results also suggest companies are not sufficiently bold in the magnitude and scope of their investments (see sidebar “Structuring your digital reinvention”). Our research (Exhibit 6) suggests that the more aggressively they respond to the digitization of their industries—up to and including initiating digital disruption—the better the effect on their projected revenue and profit growth. The one exception is the ecosystem dimension: an overactive response to new hyperscale competitors actually lowers projected growth, perhaps because many incumbents lack the assets and capabilities necessary for platform strategies.


The case for digital reinvention 2

This finding confirms what many executives may already suspect: by reducing economic friction, digitization enables competition that pressures revenue and profit growth. Current levels of digitization have already taken out, on average, up to six points of annual revenue and 4.5 points of growth in earnings before interest and taxes (EBIT). And there’s more pressure ahead, our research suggests, as digital penetration deepens (Exhibit 2).

While the prospect of declining growth rates is hardly encouraging, executives should bear in mind that these are average declines across all industries. Beyond the averages, we find that performance is distributed unequally, as digital further separates the high performers from the also-rans. This finding is consistent with a separate McKinsey research stream, which also shows that economic performance is extremely unequal. Strongly performing industries, according to that research, are three times more likely than others to generate market-beating economic profit. Poorly performing companies probably won’t thrive no matter which industry they compete in.

At the current level of digitization, median companies, which secure three additional points of revenue and EBIT growth, do better than average ones, presumably because the long tail of companies hit hard by digitization pulls down the mean. But our survey results suggest that as digital increases economic pressure, all companies, no matter what their position on the performance curve may be, will be affected.

Uneven returns on investment

That economic pressure will make it increasingly critical for executives to pay careful heed to where—and not just how—they compete and to monitor closely the return on their digital investments. So far, the results are uneven. Exhibit 3 shows returns distributed unequally: some players in every industry are earning outsized returns, while many others in the same industries are experiencing returns below the cost of capital. 

These findings suggest that some companies are investing in the wrong places or investing too much (or too little) in the right ones—or simply that their returns on digital investments are being competed away or transferred to consumers. On the other hand, the fact that high performers exist in every industry (as we’ll discuss further in a moment) indicates that some companies are getting it right—benefiting, for example, from cross-industry transfers, as when technology companies capture value in the media sector.

Where to make your digital investments

Improving the ROI of digital investments requires precise targeting along the dimensions where digitization is proceeding. Digital has widely expanded the number of available investment options, and simply spreading the same amount of resources across them is a losing proposition. In our research, we measured five separate dimensions of digitization’s advance into industries: products and services, marketing and distribution channels, business processes, supply chains, and new entrants acting in ecosystems.

How fully each of these dimensions has advanced, and the actions companies are taking in response, differ according to the dimension in question. And there appear to be mismatches between opportunities and investments. Those mismatches reflect advancing digitization’s uneven effect on revenue and profit growth, because of differences among dimensions as well as among industries. Exhibit 4 describes the rate of change in revenue and EBIT growth that appears to be occurring as industries progress toward full digitization. This picture, combining the data for all of the industries we studied, reveals that today’s average level of digitization, shown by the dotted vertical line, differs for each dimension. Products and services are more digitized, supply chains less so. 

To model the potential effects of full digitization on economic performance, we linked the revenue and EBIT growth of companies to a given dimension’s digitization rate, leaving everything else equal. The results confirm that digitization’s effects depend on where you look. Some dimensions take a bigger bite out of revenue and profit growth, while others are digitizing faster. This makes intuitive sense. As platforms transform industry ecosystems, for example, revenues grow—even as platform-based competitors put pressure on profits. As companies digitize business processes, profits increase, even though little momentum in top-line growth accompanies them.

The biggest future impact on revenue and EBIT growth, as Exhibit 4 shows, is set to occur through the digitization of supply chains. In this dimension, full digitization contributes two-thirds (6.8 percentage points of 10.2 percent) of the total projected hit to annual revenue growth and more than 75 percent (9.4 out of 12 percent) to annual EBIT growth.

Despite the supply chain’s potential impact on the growth of revenues and profits, survey respondents say that their companies aren’t yet investing heavily in this dimension. Only 2 percent, in fact, report that supply chains are the focus of their forward-looking digital strategies (Exhibit 5), though headlining examples such as Airbnb and Uber demonstrate the power of tapping previously inaccessible sources of supply (sharing rides or rooms, respectively) and bringing them to market. Similarly, there is little investment in the ecosystems dimension, where hyperscale businesses such as Alibaba, Amazon, Google, and Tencent are pushing digitization most radically, often entering one industry and leveraging platforms to create collateral damage in others. 


The case for digital reinvention

Digital technology, despite its seeming ubiquity, has only begun to penetrate industries. As it continues its advance, the implications for revenues, profits, and opportunities will be dramatic.

Image credit : Shyam's Imagination Library

As new markets emerge, profit pools shift, and digital technologies pervade more of everyday life, it’s easy to assume that the economy’s digitization is already far advanced. According to our latest research, however, the forces of digital have yet to become fully mainstream. On average, industries are less than 40 percent digitized, despite the relatively deep penetration of these technologies in media, retail, and high tech.

As digitization penetrates more fully, it will dampen revenue and profit growth for some, particularly the bottom quartile of companies, according to our research, while the top quartile captures disproportionate gains. Bold, tightly integrated digital strategies will be the biggest differentiator between companies that win and companies that don’t, and the biggest payouts will go to those that initiate digital disruptions. Fast-followers with operational excellence and superior organizational health won’t be far behind.

The case for digital reinvention 


These findings emerged from a research effort to understand the nature, extent, and top-management implications of the progress of digitization. We tailored our efforts to examine its effects along multiple dimensions: products and services, marketing and distribution channels, business processes, supply chains, and new entrants at the ecosystem level (for details, see sidebar “About the research”). We sought to understand how economic performance will change as digitization continues its advance along these different dimensions. What are the best-performing companies doing in the face of rising pressure? Which approach is more important as digitization progresses: a great strategy with average execution or an average strategy with great execution?

The research-survey findings, taken together, amount to a clear mandate to act decisively, whether through the creation of new digital businesses or by reinventing the core of today’s strategic, operational, and organizational approaches.

More digitization—and performance pressure—ahead

According to our research, digitization has only begun to transform many industries (Exhibit 1). Its impact on the economic performance of companies, while already significant, is far from complete.


What’s Your Data Worth?

Many businesses don’t yet know the answer to that question. But going forward, companies will need to develop greater expertise at valuing their data assets.

Image credit : Shyam's Imagination Library

In 2016, Microsoft Corp. acquired the online professional network LinkedIn Corp. for $26.2 billion. Why did Microsoft consider LinkedIn to be so valuable? And how much of the price paid was for LinkedIn’s user data — as opposed to its other assets? Globally, LinkedIn had 433 million registered users and approximately 100 million active users per month prior to the acquisition. Simple arithmetic tells us that Microsoft paid about $260 per monthly active user.
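The arithmetic behind that figure is simple enough to sketch; the "$260" in the text is this result rounded:

```python
# Back-of-the-envelope price per user, using the figures quoted above.
deal_price_usd = 26.2e9        # Microsoft's purchase price for LinkedIn
monthly_active_users = 100e6   # approximate active users per month at acquisition

price_per_mau = deal_price_usd / monthly_active_users
print(f"${price_per_mau:.0f} per monthly active user")  # prints "$262 per monthly active user"
```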

Did Microsoft pay a reasonable price for the LinkedIn user data? Microsoft must have thought so — and LinkedIn agreed. But the deal generated scrutiny from the rating agency Moody’s Investors Service Inc., which conducted a review of Microsoft’s credit rating after the deal was announced. What can be learned from the Microsoft–LinkedIn transaction about the valuation of user data? How can we determine if Microsoft — or any acquirer — paid a reasonable price?

The answers to these questions are not clear. But the subject is growing increasingly relevant as companies collect and analyze ever more data. Indeed, the multibillion-dollar deal between Microsoft and LinkedIn is just one recent example of data valuation coming to the fore. Another example occurred during the Chapter 11 bankruptcy proceedings of Caesars Entertainment Operating Co., Inc., a subsidiary of the casino gaming company Caesars Entertainment Corp.

One area of conflict was the data in Caesars’ Total Rewards customer loyalty program; some creditors argued that the Total Rewards program data was worth $1 billion, making it, according to a Wall Street Journal article, “the most valuable asset in the bitter bankruptcy feud at Caesars Entertainment Corp.” A 2016 report by a bankruptcy court examiner on the case noted instances where sold-off Caesars properties — having lost access to the customer analytics in the Total Rewards database — suffered a decline in earnings. But the report also observed that it might be difficult to sell the Total Rewards system, since incorporating it into another company’s loyalty program would pose a challenge. Although the Total Rewards system was Caesars’ most valuable asset, its value to an outside party was an open question.

As these examples illustrate, there is no formula for placing a precise price tag on data. But in both of these cases, there were parties who believed the data to be worth hundreds of millions of dollars.

Exploring Data Valuation

To research data valuation, we conducted interviews and collected secondary data on information activities in 36 companies and nonprofit organizations in North America and Europe. Most had annual revenues greater than $1 billion. They represented a wide range of industry sectors, including retail, health care, entertainment, manufacturing, transportation, and government.

Although our focus was on data value, we found that most of the organizations in our study were focused instead on the challenges of storing, protecting, accessing, and analyzing massive amounts of data — efforts for which the information technology (IT) function is primarily responsible.

While the IT functions were highly effective in storing and protecting data, they alone cannot make the key decisions that transform data into business value. Our study lens, therefore, quickly expanded to include chief financial and marketing officers and, in the case of regulatory compliance, legal officers. Because the majority of the companies in our study did not have formal data valuation practices, we adjusted our methodology to focus on significant business events triggering the need for data valuation, such as mergers and acquisitions, bankruptcy filings, or acquisitions and sales of data assets. Rather than studying data value in the abstract, we looked at events that triggered the need for such valuation and that could be compared across organizations.
All the companies we studied were awash in data, and the volume of their stored data was growing on average by 40% per year. We expected this explosion of data would place pressure on management to know which data was most valuable. However, the majority of companies reported they had no formal data valuation policies in place. A few identified classification efforts that included value assessments. These efforts were time-consuming and complex. For example, one large financial group had a team working on a significant data classification effort that included the categories “critical,” “important,” and “other.” Data was categorized as “other” when the value was judged to be context-specific. The team’s goal was to classify hundreds of terabytes of data; after nine months, they had worked through fewer than 20 terabytes.
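To put that pace in perspective, here is a rough projection. The 300 TB total is our own assumption, since the article says only "hundreds of terabytes"; the pace is the article's:

```python
# Rough projection of the classification effort at the observed pace.
total_tb = 300        # assumed total; the article says only "hundreds of terabytes"
classified_tb = 20    # upper bound on what the team had classified
elapsed_months = 9

pace_tb_per_month = classified_tb / elapsed_months
months_remaining = (total_tb - classified_tb) / pace_tb_per_month
print(f"About {months_remaining / 12:.1f} more years at the current pace")  # ≈ 10.5 years
```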

The difficulty that this particular financial group encountered is typical. Valuing data can be complex and highly context-dependent. Value may be based on multiple attributes, including usage type and frequency, content, age, author, history, reputation, creation cost, revenue potential, security requirements, and legal importance. Data value may change over time in response to new priorities, litigation, or regulations. These factors are all relevant and difficult to quantify.

A Framework for Valuing Data

How, then, should companies formalize data valuation practices? Based on our research, we define data value as the composite of three sources of value: (1) the asset, or stock, value; (2) the activity value; and (3) the expected, or future, value. Here’s a breakdown of each value source:
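As an illustrative sketch only, the decomposition might be captured as follows. The article defines the three sources but not how to combine them, so a simple sum stands in for the composite, and all names and figures here are our own:

```python
from dataclasses import dataclass

@dataclass
class DataValue:
    """The three sources of data value described in the article."""
    asset_value: float      # (1) the asset, or stock, value
    activity_value: float   # (2) the value of the data in use
    expected_value: float   # (3) the expected, or future, value

    def composite(self) -> float:
        # Modeled as a simple sum purely for illustration.
        return self.asset_value + self.activity_value + self.expected_value

# Hypothetical figures for a customer-loyalty data set.
loyalty = DataValue(asset_value=400e6, activity_value=250e6, expected_value=350e6)
print(f"Composite value: ${loyalty.composite():,.0f}")  # Composite value: $1,000,000,000
```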

1. Data as Strategic Asset

For most companies, monetizing data assets means looking at the value of customer data. This is not a new concept; the idea of monetizing customer data is as old as grocery store loyalty cards. Customer data can generate monetary value directly (when the data is sold, traded, or acquired) or indirectly (when a new product or service leveraging customer data is created, but the data itself is not sold). Companies can also combine publicly available and proprietary data to create unique data sets for sale or use.

How big is the market opportunity for data monetization? In a word: big. The Strategy& unit of PwC has estimated that, in the financial sector alone, the revenue from commercializing data will grow to $300 billion per year by 2018.

2. The Value of Data in Use

Data use is typically defined by two things: the application — such as a customer relationship management system or a general ledger — and the frequency of use, which in turn is measured by the application workload, the transaction rate, and the rate of data access.

The frequency of data usage brings up an interesting aspect of data value. Conventional, tangible assets generally exhibit decreasing returns to use. That is, they decrease in value the more they are used. But data has the potential — not always, but often — to increase in value the more it is used. That is, data viewed as an asset can exhibit increasing returns to use. For example, Google Inc.’s Waze navigation and traffic application integrates real-time crowdsourced data from drivers, so the Waze mapping data becomes more valuable as more people use it.

The major costs of data are in its capture, storage, and maintenance. The marginal costs of using it can be almost negligible. An additional factor is time of use: The right data at the right time — for example, transaction data collected during the Christmas retail sales season — may be of very high value.

Of course, usage-based definitions of value are two-sided; the value attached to each side of the activity is unlikely to be the same. For example, for a traveler lost in an unfamiliar city, mapping data sent to the traveler’s cellphone may be of very high value for one use, but the traveler may never need that exact data again. On the other hand, the data provider may keep the data for other purposes — and use it over and over again — for a very long time.

3. The Expected Future Value of Data

Although the phrases “digital assets” or “data assets” are commonly used, there is no generally accepted definition of how these assets should be counted on balance sheets. In fact, if data assets are tracked and accounted for at all — a big “if” — they are typically commingled with other intangible assets, such as trademarks, patents, copyrights, and goodwill. There are a number of approaches to valuing intangible assets. For example, intangible assets can be valued on the basis of observable market-based transactions involving similar assets; on the income they produce or cash flow they generate through savings; or on the cost incurred to develop or replace them.

What Can Companies Do?

No matter which path a company chooses to embed data valuation into company-wide strategies, our research uncovered three practical steps that all companies can take.

1. Make valuation policies explicit and sharable across the company. It is critical to develop company-wide policies in this area. For example, is your company creating a data catalog so that all data assets are known? Are you tracking the usage of data assets, much like a company tracks the mileage on the cars or trucks it owns? Making implicit data policies explicit, codified, and sharable across the company is a first step in prioritizing data value.

A few companies in our sample were beginning to manually classify selected data sets by value. In one case, the triggering event was an internal security audit to assess data risk. In another, the triggering event was a desire to assess where in the organization the volume of data was growing rapidly and to examine closely the costs and value of that growth.

The strongest business case we found for data valuation was in the acquisition, sale, or divestiture of business units with significant data assets. We anticipate that in the future, some of the evolving responsibilities of chief data officers may include valuing company data for these purposes. But that role is too new for us to discern any aggregate trends at this time.

2. Build in-house data valuation expertise. Our study found that several companies were exploring ways to monetize data assets for sale or licensing to third parties. However, having data to sell is not the same thing as knowing how to sell it. Several of the companies relied on outside experts, rather than in-house expertise, to value their data. We anticipate this will change. Companies seeking to monetize their data assets will first need to address how to acquire and develop valuation expertise in their own organizations.

3. Decide whether top-down or bottom-up valuation processes are the most effective within the company. In the top-down approach to valuing data, companies identify their critical applications and assign a value to the data used in those applications, whether they are a mainframe transaction system, a customer relationship management system, or a product development system. Key steps include defining the main system linkages — that is, the systems that feed other systems — associating the data accessed by all linked systems, and measuring the data activity within the linked systems. This approach has the benefit of prioritizing where internal partnerships between IT and business units need to be built, if they are not already in place.
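The linkage-mapping step of the top-down approach can be sketched as a small graph problem: find every system whose data ultimately feeds a critical application. The system names here are hypothetical:

```python
# "A feeds B" edges: each system maps to the systems it feeds.
feeds = {
    "mainframe_tx": ["general_ledger", "crm"],
    "crm": ["billing"],
    "product_dev": [],
}

def upstream_of(target: str, feeds: dict) -> set:
    """Systems whose data flows, directly or transitively, into `target`."""
    reverse = {}
    for src, dests in feeds.items():
        for dest in dests:
            reverse.setdefault(dest, []).append(src)
    seen, stack = set(), [target]
    while stack:
        for src in reverse.get(stack.pop(), []):
            if src not in seen:
                seen.add(src)
                stack.append(src)
    return seen

# Everything feeding the (critical) billing application:
print(upstream_of("billing", feeds))  # {'crm', 'mainframe_tx'}, in some order
```

Once the upstream set for each critical application is known, the data each linked system touches can be associated with that application and its activity measured, as the text describes.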

A second approach is to define data value heuristically — in effect, working up from a map of data usage across the core data sets in the company. Key steps in this approach include assessing data flows and linkages across data and applications, and producing a detailed analysis of data usage patterns. Companies may already have much of the required information in data storage devices and distributed systems.

Whichever approach is taken, the first step is to identify the business and technology events that trigger the business’s need for valuation. A needs-based approach will help senior management prioritize and drive valuation strategies, moving the company forward in monetizing the current and future value of its digital assets.

Reproduced from MIT Sloan Management Review

Saturday, March 18, 2017

When To NOT Use Isotype Controls

Antibodies can bind to cells in a specific manner — where the Fab portion of the antibody binds to a high-affinity specific target, or the Fc portion of the antibody binds to Fc receptors (FcR) on the surface of some cells.

They can also bind to cells in a nonspecific manner, where the Fab portion binds to a low-affinity, non-specific target. Further, as cells die and membrane integrity is compromised, antibodies can bind non-specifically to intracellular targets.

So, the question is, how can you identify and control for this observed nonspecific antibody binding? 

To answer this question, many research groups adopted a control known as the isotype control. The concept is that an antibody with the same isotype (both heavy and light chain) as the antibody of interest, but targeting a protein not present on the surface of the target cells, is used to label the cells; any staining it produces is taken to represent nonspecific binding and is excluded.

Why Isotype Controls Often Fall Short 

Isotype controls were once the most popular negative control for flow cytometry experiments.

They are still very often included by some labs, almost abandoned by others, and a subject of confusion for many beginners. What are they, and why and when are they needed? Are they of any use at all, or just a waste of money?

Most importantly, why do reviewers keep asking for them when they review papers containing flow data?

Isotype controls were classically meant to show what level of nonspecific binding you might have in your experiment. The idea is that there are several ways that an antibody might react in undesirable ways with the surface of the cell.

Not all of these can be directly addressed by this control (such as cross-reactivity to a similar epitope on a different antigen, or even to a different epitope on the same antigen). What it does give you is an estimate of non-specific (non-epitope-driven) binding, whether Fc-mediated binding or completely nonspecific “sticky” cell adhesion.

In order to be useful, the isotype control should ideally match the antibody of interest in species, heavy-chain class (IgA, IgG, IgD, IgE, or IgM), and light-chain class (kappa or lambda); carry the same fluorochrome (PE, APC, etc.); and have the same F:P ratio — a measure of how many fluorescent molecules are present on each antibody.

This, unfortunately, makes the manufacture of ideal isotype controls highly impractical. 

There is even a case to be made that differences in the amino acid sequence of the variable regions of both the light and heavy chains might result in variable levels of undesirable adherence in isotypes versus your antibody of interest. 
Moving Beyond Isotype Controls

Many flow cytometry researchers are no longer using isotype controls, with some suggesting they be left out of almost all experiments.

If you spend any time browsing the Purdue Cytometry list, you’ll see these same arguments presented in threads about isotype controls. 

A report in Cytometry Part A lays out the options for controls in several categories, along with the pros and cons of each; its section on isotype controls summarizes the problems with their use very clearly.

A second report, in Cytometry Part B, takes a similar approach, and its section on isotype controls likewise summarizes the problems very clearly.

The article also illustrates differences in undesirable binding at different levels, using the same clone from different manufacturers.

For example, the figure below shows how even the same isotype control clone can result in highly variable levels of undesirable staining.

If you do use isotype controls in your experiment, they must match as many of the following characteristics as possible for your specific antibody — species, isotype, fluorochrome, F:P ratio, and concentration.

Here are 5 cases against using isotype controls alone...

1. Isotype controls are not needed for bimodal experiments.

You don’t need isotype controls for experiments with clearly bimodal staining. For example, if you are looking for T cells and B cells in peripheral blood, the negative cells also present in the circulation provide a built-in gating reference.

As seen in the second figure below, it is extremely easy to pick out CD4 and CD8 positive cells in the sample of lysed mouse blood.

2. Isotype controls are not sufficient for post-cultured cells.

If you are using post-cultured cells, the isotype control might give you some information about the inherent “stickiness” of your cells.

However, this measurement is not a value you can subtract from your specific antibody sample to determine fluorescence intensity or percent positive.

Instead, the measurement is simply a qualitative measure of “stickiness” and the effectiveness of Fc-blocking in your protocol.

3. Isotype controls should not be used as gating controls.

If you are using multiple fluorochromes and your concern is false positivity caused by spectral overlap, you will be better served by a fluorescence-minus-one (FMO) control, in which all antibodies are included except the one for the channel you suspect is most prone to error from spectral overlap.

4. Isotype controls should not be used to determine positivity.

You should absolutely not be using isotype controls to determine positive versus negative cells — or, as mentioned in #3 above, as a gating control.

5. Isotype controls are not always sufficient for determining non-specific antibody adherence.

Isotype controls cannot always determine non-specific antibody adherence versus, for example, free fluorochrome adherence. For this, you need to use isoclonic controls. If you add massive amounts of non-fluorochrome conjugated monoclonal antibody to your staining reaction, your fluorescence should drop. If it does not, your issue is not due to nonspecific antibody binding, but to free fluorochrome binding.
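The decision logic of the isoclonic control can be summarized in a few lines. The function and the 50 percent cutoff below are illustrative only, not part of any standard protocol:

```python
def interpret_isoclonic_control(stained_mfi, competed_mfi, drop_threshold=0.5):
    """Interpret an isoclonic control.

    stained_mfi: median fluorescence with the labeled antibody alone
    competed_mfi: median fluorescence after adding a large excess of the
        same clone, unconjugated
    drop_threshold: fractional drop treated as meaningful (hypothetical cutoff)
    """
    fractional_drop = (stained_mfi - competed_mfi) / stained_mfi
    if fractional_drop >= drop_threshold:
        return "antibody-mediated binding (competed away by unlabeled clone)"
    return "signal persists: suspect free-fluorochrome binding"

# The signal collapses when the unlabeled clone competes it away:
print(interpret_isoclonic_control(5000, 400))
# The signal barely moves: the fluorochrome itself is sticking to the cells:
print(interpret_isoclonic_control(5000, 4600))
```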

Sunday, March 12, 2017

Brain hardwired to respond to others’ itching 03-12

Some behaviors — yawning and scratching, for example — are socially contagious, meaning if one person does it, others are likely to follow suit. Now, researchers at Washington University School of Medicine in St. Louis have found that socially contagious itching is hardwired in the brain.

Studying mice, the scientists have identified what occurs in the brain when a mouse feels itchy after seeing another mouse scratch. The discovery may help scientists understand the neural circuits that control socially contagious behaviors.

“Itching is highly contagious,” said principal investigator Zhou-Feng Chen, PhD, director of the Washington University Center for the Study of Itch. “Sometimes even mentioning itching will make someone scratch. Many people thought it was all in the mind, but our experiments show it is a hardwired behavior and is not a form of empathy.”

For this study, Chen’s team put a mouse in an enclosure with a computer screen. The researchers then played a video that showed another mouse scratching.

“Within a few seconds, the mouse in the enclosure would start scratching, too,” Chen said. “This was very surprising because mice are known for their poor vision. They use smell and touch to explore areas, so we didn’t know whether a mouse would notice a video. Not only did it see the video, it could tell that the mouse in the video was scratching.”

Next, the researchers identified a structure called the suprachiasmatic nucleus (SCN), a brain region that controls when animals fall asleep or wake up. The SCN was highly active after the mouse watched the video of the scratching mouse.

When the mouse saw other mice scratching — in the video and when placed near scratching littermates — the brain’s SCN would release a chemical substance called GRP (gastrin-releasing peptide). In 2007, Chen’s team identified GRP as a key transmitter of itch signals between the skin and the spinal cord.

“The mouse doesn’t see another mouse scratching and then think it might need to scratch, too,” Chen said. “Instead, its brain begins sending out itch signals using GRP as a messenger.”

Chen’s team also used various methods to block GRP or the receptor it binds to on neurons. Mice in which GRP or its receptor was blocked in the brain’s SCN region did not scratch when they saw others scratch, but they retained the ability to scratch normally when exposed to itch-inducing substances.

Chen believes the contagious itch behavior the mice engaged in is something the animals can’t control.

“It’s an innate behavior and an instinct,” he said. “We’ve been able to show that a single chemical and a single receptor are all that’s necessary to mediate this particular behavior. The next time you scratch or yawn in response to someone else doing it, remember it’s really not a choice nor a psychological response; it’s hardwired into your brain.”

View at the original source

Saturday, March 11, 2017

New NASA Radar Technique Finds Lost Lunar Spacecraft 03-11

DSS-14 is NASA's 70-meter (230-foot) antenna located at the Goldstone Deep Space Communications Complex in California. It is known as the “Mars Antenna” as it was first to receive signals from the first spacecraft to closely observe Mars, Mariner 4, on March 18, 1966.
Credits: NASA/JPL-Caltech

Finding derelict spacecraft and space debris in Earth’s orbit can be a technological challenge. Detecting these objects in orbit around Earth’s moon is even more difficult. Optical telescopes are unable to search for small objects hidden in the bright glare of the moon.

However, a new technological application of interplanetary radar pioneered by scientists at NASA’s Jet Propulsion Laboratory in Pasadena, California, has successfully located spacecraft orbiting the moon -- one active, and one dormant. This new technique could assist planners of future moon missions.

“We have been able to detect NASA’s Lunar Reconnaissance Orbiter [LRO] and the Indian Space Research Organization’s Chandrayaan-1 spacecraft in lunar orbit with ground-based radar,” said Marina Brozović, a radar scientist at JPL and principal investigator for the test project. “Finding LRO was relatively easy, as we were working with the mission’s navigators and had precise orbit data where it was located. Finding India’s Chandrayaan-1 required a bit more detective work because the last contact with the spacecraft was in August of 2009.”

Add to the mix that the Chandrayaan-1 spacecraft is very small, a cube about five feet (1.5 meters) on each side -- about half the size of a smart car. Although the interplanetary radar has been used to observe small asteroids several million miles from Earth, researchers were not certain that an object of this smaller size as far away as the moon could be detected, even with the world’s most powerful radars. Chandrayaan-1 proved the perfect target for demonstrating the capability of this technique.

This computer-generated image depicts Chandrayaan-1’s location at the time it was detected by the Goldstone Solar System radar on July 2, 2016. In the graphic, the 120-mile (200-kilometer) wide purple circle represents the width of the Goldstone radar beam at lunar distance. The radar beam was pointed 103 miles (165 kilometers) off the lunar surface. The white box in the upper-right corner depicts the strength of the echo. As the spacecraft entered and exited the radar beam (purple circle), the echo from the spacecraft alternated between very strong and very weak as the radar beam scattered from its flat metal surfaces. Once the spacecraft flew outside the beam, the echo was gone.
Credits: NASA/JPL-Caltech

While they all use microwaves, not all radar transmitters are created equal. The average police radar gun has an operational range of about one mile, while air traffic control radar goes to about 60 miles. To find a spacecraft 237,000 miles (380,000 kilometers) away, JPL’s team used NASA's 70-meter (230-foot) antenna at NASA's Goldstone Deep Space Communications Complex in California to send out a powerful beam of microwaves directed toward the moon. Then the radar echoes bounced back from lunar orbit were received by the 100-meter (330-foot) Green Bank Telescope in West Virginia.
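For a sense of the distances involved, the echo’s round-trip travel time can be estimated by treating both the Goldstone-to-moon and moon-to-Green-Bank legs as roughly one lunar distance each (a simplification, since the two stations are not collocated):

```python
# Rough round-trip light time for the bistatic radar link described above.
C_KM_PER_S = 299_792.458       # speed of light, km/s
LUNAR_DISTANCE_KM = 380_000    # as quoted in the article

round_trip_s = 2 * LUNAR_DISTANCE_KM / C_KM_PER_S
print(f"echo delay ≈ {round_trip_s:.2f} s")  # roughly two and a half seconds
```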

Finding a derelict spacecraft at lunar distance that has not been tracked for years is tricky because the moon is riddled with mascons (regions with higher-than-average gravitational pull) that can dramatically affect a spacecraft’s orbit over time, or even cause it to crash into the moon. JPL’s orbital calculations indicated that Chandrayaan-1 is still circling some 124 miles (200 kilometers) above the lunar surface, but it was generally considered “lost.”

However, with Chandrayaan-1, the radar team made use of the fact that the spacecraft is in polar orbit around the moon, so it crosses above the lunar poles on every orbit. On July 2, 2016, the team pointed Goldstone and Green Bank at a location about 100 miles (160 kilometers) above the moon’s north pole and waited to see if the lost spacecraft crossed the radar beam. Chandrayaan-1 was predicted to complete one orbit around the moon every two hours and eight minutes. Something with the radar signature of a small spacecraft did cross the beam twice during four hours of observations, and the interval between detections matched the time it would take Chandrayaan-1 to complete one orbit and return to the same position above the moon’s pole.
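The quoted period is easy to check from first principles. Assuming a circular orbit about 200 kilometers above the mean lunar radius and standard values for the moon’s gravitational parameter, Kepler’s third law reproduces the figure:

```python
import math

MU_MOON = 4902.8    # lunar gravitational parameter, km^3/s^2
R_MOON = 1737.4     # mean lunar radius, km
ALTITUDE = 200.0    # km, as quoted in the article

a = R_MOON + ALTITUDE                                # circular-orbit radius
period_s = 2 * math.pi * math.sqrt(a**3 / MU_MOON)   # Kepler's third law

hours, rem_s = divmod(period_s, 3600)
print(f"orbital period ≈ {int(hours)} h {rem_s / 60:.0f} min")  # ≈ 2 h 8 min
```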

Radar imagery acquired of the Chandrayaan-1 spacecraft as it flew over the moon’s south pole on July 3, 2016. The imagery was acquired using NASA's 70-meter (230-foot) antenna at the Goldstone Deep Space Communications Complex in California. This is one of four detections of Chandrayaan-1 from that day.
Credits: NASA/JPL-Caltech

The team used data from the return signal to estimate its velocity and the distance to the target.  This information was then used to update the orbital predictions for Chandrayaan-1.

“It turns out that we needed to shift the location of Chandrayaan-1 by about 180 degrees, or half a cycle from the old orbital estimates from 2009,” said Ryan Park, the manager of JPL’s Solar System Dynamics group, who delivered the new orbit back to the radar team.  “But otherwise, Chandrayaan-1’s orbit still had the shape and alignment that we expected.”

Radar echoes from the spacecraft were obtained seven more times over three months and are in perfect agreement with the new orbital predictions. Some of the follow-up observations were done with the Arecibo Observatory in Puerto Rico, which has the most powerful astronomical radar system on Earth. Arecibo is operated by the National Science Foundation with funding from NASA’s Planetary Defense Coordination Office for the radar capability.

Hunting down LRO and rediscovering Chandrayaan-1 have provided the start for a unique new capability. Working together, the large radar antennas at Goldstone, Arecibo and Green Bank demonstrated that they can detect and track even small spacecraft in lunar orbit. Ground-based radars could play a part in future robotic and human missions to the moon, both as a collision hazard assessment tool and as a safety mechanism for spacecraft that encounter navigation or communication problems.

JPL manages and operates NASA's Deep Space Network, including the Goldstone Solar System Radar, and hosts the Center for Near-Earth Object Studies for NASA's Near-Earth Object Observations Program, an element of the Planetary Defense Coordination Office within the agency's Science Mission Directorate.

Friday, March 10, 2017

Exciting new possibilities in the Quantum Realm 03-10

The strangeness of the quantum realm opens up exciting new technological possibilities.

A BATHING cap that can watch individual neurons, allowing others to monitor the wearer’s mind. A sensor that can spot hidden nuclear submarines. A computer that can discover new drugs, revolutionise securities trading and design new materials. A global network of communication links whose security is underwritten by unbreakable physical laws. Such—and more—is the promise of quantum technology.

All this potential arises from improvements in scientists’ ability to trap, poke and prod single atoms and wispy particles of light called photons. Today’s computer chips get cheaper and faster as their features get smaller, but quantum mechanics says that at tiny enough scales, particles sail through solids, short-circuiting the chip’s innards. Quantum technologies come at the problem from the other direction. Rather than scale devices down, quantum technologies employ the unusual behaviours of single atoms and particles and scale them up. Like computerisation before it, this unlocks a world of possibilities, with applications in nearly every existing industry—and the potential to spark entirely new ones.

Strange but true

Quantum mechanics—a theory of the behaviour at the atomic level put together in the early 20th century—has a well-earned reputation for weirdness. That is because the world as humanity sees it is not, in fact, how the world works. Quantum mechanics replaced wholesale the centuries-old notion of a clockwork, deterministic universe with a reality that deals in probabilities rather than certainties—one where the very act of measurement affects what is measured. Along with that upheaval came a few truly mind-bending implications, such as the fact that particles are fundamentally neither here nor there but, until pinned down, both here and there at the same time: they are in a “superposition” of here-there-ness. The theory also suggested that particles can be spookily linked: do something to one and the change is felt instantaneously by the other, even across vast reaches of space. This “entanglement” confounded even the theory’s originators.

It is exactly these effects that show such promise now: the techniques that were refined in a bid to learn more about the quantum world are now being harnessed to put it to good use. Gizmos that exploit superposition and entanglement can vastly outperform existing ones—and accomplish things once thought to be impossible.

Improving atomic clocks by incorporating entanglement, for example, makes them more accurate than those used today in satellite positioning. That could improve navigational precision by orders of magnitude, which would make self-driving cars safer and more reliable. And because the strength of the local gravitational field affects the flow of time (according to general relativity, another immensely successful but counter-intuitive theory), such clocks would also be able to measure tiny variations in gravity. That could be used to spot underground pipes without having to dig up the road, or track submarines far below the waves.

Other aspects of quantum theory permit messaging without worries about eavesdroppers. Signals encoded using either superposed or entangled particles cannot be intercepted, duplicated and passed on. That has obvious appeal to companies and governments the world over. China has already launched a satellite that can receive and reroute such signals; a global, unhackable network could eventually follow.

The advantageous interplay between odd quantum effects reaches its zenith in quantum computers. Rather than the 0s and 1s of standard computing, a quantum computer’s bits are in superpositions of both, and each “qubit” is entangled with every other. Using algorithms that recast problems in quantum-amenable forms, such computers will be able to chomp their way through calculations that would take today’s best supercomputers millennia. Even as high-security quantum networks are being developed, a countervailing worry is that quantum computers will eventually render obsolete today’s cryptographic techniques, which are based on hard mathematical problems.
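The difficulty of matching quantum machines classically comes down to bookkeeping: describing n entangled qubits in full generality requires 2^n complex amplitudes. A toy illustration of that growth (the memory figure in the comment assumes 16 bytes per complex amplitude):

```python
def amplitudes_needed(n_qubits: int) -> int:
    """Number of complex amplitudes in a general n-qubit state."""
    return 2 ** n_qubits

for n in (1, 10, 50):
    print(n, amplitudes_needed(n))

# At 50 qubits there are already ~1.1e15 amplitudes; at 16 bytes each,
# storing the state exactly would take around 18 petabytes of memory,
# which is why even modest quantum machines outrun classical simulation.
```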

Long before that happens, however, smaller quantum computers will make other contributions in industries from energy and logistics to drug design and finance. Even simple quantum computers should be able to tackle classes of problems that choke conventional machines, such as optimising trading strategies or plucking promising drug candidates from scientific literature. Google said last week that such machines are only five years from commercial exploitability. This week IBM, which already runs a publicly accessible, rudimentary quantum computer, announced expansion plans. As our Technology Quarterly in this issue explains, big tech firms and startups alike are developing software to exploit these devices’ curious abilities. A new ecosystem of middlemen is emerging to match new hardware to industries that might benefit.

The solace of quantum

This landscape has much in common with the state of the internet in the early 1990s: a largely laboratory-based affair that had occupied scientists for decades, but in which industry was starting to see broader potential. Blue-chip firms are buying into it, or developing their own research efforts. Startups are multiplying. Governments are investing “strategically”, having paid for the underlying research for many years—a reminder that there are some goods, such as blue-sky scientific work, that markets cannot be relied upon to provide.

Fortunately for quantum technologists, the remaining challenges are mostly engineering ones, rather than scientific. And today’s quantum-enhanced gizmos are just the beginning. What is most exciting about quantum technology is its as yet untapped potential. Experts at the frontier of any transformative technology have a spotty record of foreseeing many of the uses it will find; Thomas Edison thought his phonograph’s strength would lie in elocution lessons. For much of the 20th century “quantum” has, in the popular consciousness, simply signified “weird”. In the 21st, it will come to mean “better”.

View at the original source

Monday, March 6, 2017

10 Principles of Strategy through Execution 03-07

“We are all in the gutter,” wrote Oscar Wilde, “but some of us are looking at the stars.” That is the nature of strategy through execution. You operate deep in the weeds, managing countless day-to-day tasks and transactions. At the same time, you keep a steady gaze on your company’s long-term goals and on ways you can stand out from your competitors.

Having a close link between strategy and execution is critically important. Your strategy is your promise to deliver value: the things you do for customers, now and in the future, that no other company can do as well. Your execution occurs in the thousands of decisions made each day by people at every level of your company.

Quality, innovation, profitability, and growth all depend on having strategy and execution fit together seamlessly. If they don’t fit — if you can’t deliberately align them in a coherent way — you risk operating at cross-purposes and losing your focus. This problem is all too common. In a recent Strategy& global survey, 700 business executives were asked to rate their company’s top leaders in terms of their skill at strategy creation and at execution. Only 8 percent were credited as being very effective at both.

Strategy&, the strategy consulting business of PwC, has been studying the relationship between strategy and execution for years. We have found that the most iconic enterprises — companies such as Apple, Amazon, Danaher, IKEA, Starbucks, and the Chinese appliance manufacturer Haier, all of which compete successfully time after time — are exceptionally coherent. They put forth a clear winning value proposition, backed up by distinctive capabilities, and apply this mix of strategy and execution to everything they do.

Any company can follow the same path as these successful firms, and an increasing number of companies are doing just that. If you join them, you will need to cultivate the ability to translate the strategic into the everyday. This means linking strategy and execution closely together by creating distinctive, complex capabilities that set your company apart, and applying them to every product and service in your portfolio. These capabilities combine all the elements of execution — technology, human skills, processes, and organizational structures — to deliver your company’s chosen value proposition.

How do you accomplish this on a day-to-day basis? How do you get the strategists and implementers in your company to work together effectively? These 10 principles, derived from our experience at Strategy&, can help you avoid common pitfalls and accelerate your progress. For companies that truly embrace strategy through execution, principles like these become a way of life.

1. Aim High

Don’t compromise your strategy or your execution. Set a lofty ambition for your strategy: not just financial success but sustained value creation, making a better world through your products, services, and presence. Apple’s early goal of making “a computer for the rest of us,” which effectively shaped the personal computer industry, is a classic example.

Next, aim just as high on the execution side, with a dedication to excellence that seems almost obsessive to outsiders. Apple, for instance, has long been known for its intensive interest in every aspect of product design and marketing, iterating endlessly until its notoriously demanding leaders are satisfied. The company’s leaders do not consider execution beneath them; it is part of what makes Apple special.

Together, a strong long-term strategy and a fierce commitment to excellent execution can transform not only a company, but a regional economy. After the 1992 Olympics in Barcelona, a group of local political and business leaders realized, with some disappointment, that the event hadn’t triggered the economic growth they had expected. So they resolved to change the region’s economy in other ways. Led by the mayor, the group created a common base of technologies and practices and set up training programs for local enterprises. By 2014, after two decades of persistent effort, the city had become a hub for research and technology companies. One legacy of the Olympics is a group of about 600 sports-related companies with a collective annual revenue of US$3 billion and 20,000 employees.

In carrying out this first principle, the top executives of your company must lead the way. They must learn to set lofty goals, establish a clear message about why those goals are relevant, and stick to them without compromise. This may take a while, because lofty goals require patience. You need the perseverance to maintain your standards and the confidence to believe you will reach the goals soon enough. Leaders must demonstrate that courage and commitment, or no one else will. At the same time, don’t be surprised if the rewards start to appear sooner than you expect — both financial rewards and the intrinsic pleasure of working with highly capable people on relevant projects. With high aspirations (for example, IKEA’s goal of “creating a better everyday life for the many people” or Amazon’s self-proclaimed role as the “everything store”), you recruit talented people who are deeply committed to being there. That’s one way you’ll know that you’re aiming high enough: The whole organization will start to feel like a better place to work.

2. Build on Your Strengths

Your company has capabilities that set it apart, things you do better than anyone else. You can use them as a starting point to create greater success. Yet more likely than not, your strongest capabilities have been obscured over the years. If, like most companies, you pursue opportunities that crop up without thinking much about whether you have the prowess needed to capture them, you can gradually lose sight of what you do best, or why customers respond to it.

Take an inventory of your most distinctive capabilities. Look for examples where you have excelled as a company, achieving greatly desired outcomes without heroic efforts. Articulate all the different things that had to happen to make these capabilities work, and figure out what it will take to build on your strengths, so that you can succeed the same way more consistently in the future.

Sometimes a particular episode will bring to light new ways of building on your strengths. That’s what happened at Bombardier Transportation, a division of a Canadian firm and one of the world’s largest manufacturers of railroad equipment. To win a highly competitive bid for supplying 66 passenger train cars to a British rail operator, Bombardier shifted its manufacturing and commercial models to a platform-based approach, which allowed it to use and reuse the same designs for several different types of railway cars. “Platforming,” which was a new operational strategy for the industry, required adjustments to Bombardier’s supplier relationships and product engineering practices. But the benefits were immediate: lower costs, less technology risk, faster time-to-market, and better reliability.

Bombardier won the bid — and, more importantly, learned from the experience, making the episode a model for other bids and contracts. When some Bombardier engineers complained about the platform approach on the grounds that it curtailed their creativity, the leadership had an immediate answer: The platform demonstrated capabilities that competitors couldn’t match and the company’s creativity could be focused on innovation. Additional contracts soon followed.

The more knowledge you have about your own capabilities, the more opportunities you’ll have to build on your strengths. So you should always be analyzing what you do best, gathering data about your practices, and conducting postmortems. In every case, there is something to learn — about your operations, and also about the choices you make and the value you’re able to deliver.

3. Be Ambidextrous

In the physical world, ambidexterity is the ability to use both hands with equal skill and versatility. In business, it’s the ability to manage strategy and execution with equal competence. In some companies, this is known as being “bilingual”: able to speak the language of the boardroom and the shop floor or software center with equal facility. Ambidextrous managers can think about the technical and operational details of a project in depth and then, without missing a beat, can consider its broader ramifications for the industry. If strategy through execution is to become a reality, people across the enterprise need to master ambidexterity.

Lack of ambidexterity can be a key factor in chronic problems. For instance, if IT professionals focus only on execution when they manage ERP upgrades or the adoption of new applications, they may be drawn to vendors for their low rates or expertise on specific platforms instead of their ability to design solutions that support the company’s business strategy. When the installation fails to deliver the capabilities that the company needs, there will be an unplanned revision; the costs will balloon accordingly, and the purchase won’t fulfill its promise.

We recognize, of course, that not everyone needs to be equally conversant in the company’s strategy. A typical paper goods manufacturer, for example, employs chemists who research hydrogen bonds to discover ways to make paper towels more absorbent. They may not need to spend much time debating strategy in the abstract, but they do need to be aware of how their role fits in. Like the apocryphal bricklayer who sees himself as building a cathedral, the highly skilled technologists on your team must recognize that they are not merely fulfilling a spec but rather developing a technology unlike anyone else’s, for the sake of building highly distinctive capabilities. They might even help figure out what those capabilities should be.

Similarly, your top leaders don’t have to be experts on hydrogen bonds or cloud-based SQL server hosting, but they do have to be conversant enough with technological and operational details to make the right high-level decisions. No longer can a senior executive credibly say, “I don’t use computers. My staff is my computer.” If your leaders aren’t ambidextrous, they risk being eclipsed or outperformed by someone who is.

In The Self-Made Billionaire Effect: How Extreme Producers Create Massive Value (Portfolio, 2014), John Sviokla and Mitch Cohen suggest using the word producers to describe ambidextrous individuals. Self-made billionaires, such as Spanx founder Sara Blakely, POM Wonderful cofounder Lynda Resnick, Uniqlo founder Tadashi Yanai, and Morningstar founder Joe Mansueto have this quality. They can both envision a blockbuster strategy and figure out in detail how to develop and sell it to customers. There are similarly ambidextrous people in every company, but they often go unappreciated. Find them, recognize and reward them, and give them opportunities to influence others.

Foster ambidexterity in practices and processes as well as in people. For example, in your annual budgeting exercises, ask people to explain the relationship of each line item to the company’s strategy, and specifically to the capability it is enabling. Over time, this approach will channel investments toward projects with a more strategic rationale. 

4. Clarify Everyone’s Strategic Role

When the leaders of the General Authority of Civil Aviation (GACA) of Saudi Arabia decided to improve the way they ran the country’s 25 airports, they started with the hub in Riyadh, one of the largest airports in the country. They had already outsourced much of their activity, redesigning airport practices and enhancing operations. But not much had changed. Convening the directors and some department leaders, the head of the airport explained that some seemingly minor operational issues — long customs lines, slow boarding processes, and inadequate basic amenities — were not just problems in execution. They stood in the way of the country’s goal of becoming a commercial and logistics hub for Africa, Asia, and Europe. Individual airport employees, he added, could make a difference.

The head of the airport then conducted in-depth sessions with employees on breaking down silos and improving operations. In these sessions, he turned repeatedly to a common theme: Each minor operational improvement would affect the attractiveness of the country for commercial travel and logistics. A wake-up call for staff, the sessions marked a turning point for the airport’s operational success. Other airports in the Saudi system are now expected to follow suit.

The people in your day-to-day operations — wherever they are, and on whatever level — are continually called upon to make decisions on behalf of the enterprise. If they are not motivated to deliver the strategy, the strategy won’t reach the customers. It is well established that financial rewards and other tangible incentives will go only so far in motivating people. Workers cannot make a greater personal commitment unless they understand why their jobs make a difference, and why the company’s advancement will help their own advancement.

Successful leaders spend a great deal of time and attention on the connection between strategy and personal commitment. One such leader has run the trade promotion effectiveness (TPE) capability at two global consumer packaged goods (CPG) companies over the past several years. CPG companies use this capability to build the momentum of key brands. It involves assembling assortments of products to promote, merchandising them to retailers, arranging in-store displays and online promotions, adjusting prices and discounts to test demand, and assessing the results. A great TPE capability consistently attracts customers and compels them to seek out the same products for months after the campaign ends. TPE and related activities often represent the second-largest item (after the cost of goods sold) on the P&L statement. This in itself indicates the capability’s strategic importance for CPG companies.

In both enterprises, this executive took the time to go up and down the organization, making a case for why the specific mechanics of trade promotion matter to the value proposition of the company and, ultimately, to its survival. He made it a point to talk numbers but didn’t limit the conversation to them. “We spend billions at this company on promotions,” he might say. “We have to get back $100 million in added revenue next year, and another $100 million on top of that the year after.” He then urged employees to develop better promotions that would attract more consumers and strengthen collaboration with retailers. This combination of numbers and mission made it clear how people’s individual efforts could affect the company’s prospects.

5. Align Structures to Strategy

Set up all your organizational structures, including your hierarchical design, decision rights, incentives, and metrics, so they reinforce your company’s identity: your value proposition and critical capabilities. If the structures of your company don’t support your strategy, consider removing them or changing them wholesale. Otherwise, they will just get in your way.

Consider, for example, the metrics used to track the results delivered by call center employees. In many companies, these individuals must follow a script and check off that they’ve said everything on the list — even at the risk of irritating potential customers. Better instead to get employees to fully internalize the company’s strategy and grade them on their prowess at solving customer problems.

Danaher, a conglomerate of more than 25 companies specializing in environmental science, life sciences, dental technologies, and industrial manufacturing technologies, is intensely focused on creating value through operational excellence. Critical to this approach are metrics built into the Danaher Business System, the company’s intensive continuous improvement program. Only eight key metrics, called “core value drivers” to underline their strategic relevance, are tracked constantly in all Danaher enterprises. The financial metrics (core growth, operating margin expansion, working capital returns, and return on invested capital) are used not just by investors but also by managers to evaluate the value of their own activities.

Danaher also tracks two customer-facing metrics (on-time delivery and quality as perceived by customers), and two metrics related to employees (retention rates and the percentage of managerial positions filled by internal candidates). Lengthy in-person operating reviews, conducted monthly, are very data driven, focusing on solving problems and improving current practices. The metrics are posted on the shop floor, where anyone can see the progress that’s being made — or not being made — toward clear targets. The meetings are constructive: People feel accountable and challenged, but also encouraged to rise to the challenges.

Data analytics is evolving to the point where it can help revitalize metrics and incentives. A spreadsheet is no longer enough to capture and analyze this body of material; you can use large information management systems programmed to deliver carefully crafted performance data. No matter how complex the input, the final incentives and metrics need to be simple enough to drive clear, consistent behavior. More generally, every structure in your organization should make your capabilities stronger, and focus them on delivering your strategic goals.
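To make the idea of a small, focused metrics dashboard concrete, here is a minimal sketch of how a handful of Danaher-style financial value drivers might be rolled up from period financials. All figures, field names, and formulas below are simplified illustrative assumptions, not Danaher’s actual definitions or systems.

```python
# Hypothetical sketch: rolling up a few "core value driver" style metrics
# from raw period financials. All figures and formulas are invented for
# illustration; a real system would pull these from ERP/BI pipelines.

from dataclasses import dataclass

@dataclass
class Period:
    revenue: float
    operating_income: float
    working_capital: float
    invested_capital: float

def core_value_drivers(prev: Period, curr: Period) -> dict:
    """Return a simple period-over-period dashboard of financial drivers."""
    margin_prev = prev.operating_income / prev.revenue
    margin_curr = curr.operating_income / curr.revenue
    return {
        # Year-over-year revenue growth, in percent
        "core_growth_pct": 100 * (curr.revenue - prev.revenue) / prev.revenue,
        # Operating margin expansion, in basis points
        "margin_expansion_bps": 10_000 * (margin_curr - margin_prev),
        # How many times working capital turns over in revenue
        "working_capital_turns": curr.revenue / curr.working_capital,
        # Return on invested capital, in percent
        "roic_pct": 100 * curr.operating_income / curr.invested_capital,
    }

prev = Period(revenue=1000.0, operating_income=150.0,
              working_capital=200.0, invested_capital=800.0)
curr = Period(revenue=1080.0, operating_income=173.0,
              working_capital=205.0, invested_capital=820.0)

for name, value in core_value_drivers(prev, curr).items():
    print(f"{name}: {value:.1f}")
```

The design point is the one the article makes: however complex the underlying data, the output is a handful of numbers simple enough to post on a shop floor and review monthly.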

6. Transcend Functional Barriers

Great capabilities always transcend functional barriers. Consider Starbucks’ understanding of how to create the right ambience, Haier’s ability to rapidly manufacture home appliances to order, and Amazon’s aptitude for launching products and services enabled by new technologies. These companies all bring people from different functions to work together informally and creatively. Most companies have some experience with this. For example, any effective TPE capability brings together marketing, sales, design, finance, and analytics professionals, all working closely together and learning from one another. The stronger the cross-functional interplay and the more it is supported by the company’s culture, the more effective the promotion.

Unfortunately, many companies unintentionally diminish their capabilities by allowing functions to operate independently. It’s often easier for the functional leaders to focus on specialized excellence, on “doing my job better” rather than on “what we can accomplish together.” Pressed for time, executives delegate execution to IT, HR, or operational specialists, who are attuned to their areas of expertise but not necessarily to the company’s overall direction. Collaborative efforts bring together people who don’t understand each other or, worse, who pursue competing objectives and agendas. When their narrow priorities conflict, the teams end up stuck in cycles of internal competition. The bigger a company gets, the harder it becomes to resolve these problems.

You can break this cycle by putting together cross-functional teams to blueprint, build, and roll out capabilities. Appoint a single executive for each capability team, accountable for fully developing the capability. Ensure this person has credibility at all levels of the organization. Tap high-quality people from each function for this team, and give the leader the authority to set incentives for performance.

There’s always the risk that these cross-functional teams will be seen as skunkworks, separate from the rest of the enterprise. To guard against this risk, you need a strong dotted line from each team member back to the original function. Sooner or later, the capabilities orientation will probably become habitual, affecting the way people (including functional leaders) see their roles: not as gatekeepers of their expertise, but as contributors to a larger whole.

7. Become a Fully Digital Enterprise

The seventh principle should affect every technological investment you make — and with luck, it will prevent you from making some outdated ones. Embrace digital technology’s potential to transform your company: to create fundamentally new experiences and interactions for your customers, your employees, and every other constituent. Until you use technology this way, many of your IT investments will be wasted; you won’t realize their potential in forming powerful new capabilities.

Complete digitization will inevitably broaden your range of strategic options, enabling you to pursue products, services, and innovations that weren’t feasible before. For example, Under Armour began as a technologically enabled sports apparel company, specializing in microfiber-based synthetic fabrics that felt comfortable under all conditions. To keep its value proposition as an innovator, it aggressively expanded into fitness trackers and the development of smart apparel. The company is now developing clothing that will provide data that can both help athletes raise their game and point the way to design improvements.

Adopting digital technology may mean abandoning expensive legacy IT systems, perhaps more rapidly than you had planned. Customers and employees have come to expect the companies they deal with to be digitally sophisticated. They now take instant access, seamless interoperability, smartphone connectivity, and an intuitively obvious user experience for granted. To be sure, it is expensive and risky to shift digital systems wholesale, and therefore you need to be judicious; some companies are applying the Fit for Growth approach to IT, in which they reconsider every expense, investing more only in those that are directly linked to their most important capabilities. (See “Building Trust while Cutting Costs,” by Vinay Couto, Deniz Caglar, and John Plansky.)

Fortunately, cloud-based technologies provide many more options than were available before. To boost agility and reduce costs, you can outsource some tech activities, while keeping others that are distinctive to your business. You also can use embedded sensors and analytics to share data across your value chain and collaborate more productively (an approach known as “Industry 4.0” and the “Industrial Internet of Things”). The biggest constraint is no longer the cost and difficulty of implementation. It’s your ability to combine business strategy, user experience, and technological prowess in your own distinctive way. 

8. Keep It Simple, Sometimes

Many company leaders wish for more simplicity: just a few products, a clear and simple value chain, and not too many projects on the schedule. Unfortunately, it rarely works out that way. In a large, mainstream company, execution is by nature complex. Capabilities are multifaceted. Different customers want different things. Internal groups design new products or processes without consulting one another. Mergers and acquisitions add entirely new ways of doing things. Although you might clean house every so often, incoherence and complexity creep back in, along with the associated costs and bureaucracy.

The answer is to constantly seek simplicity, but in a selective way. Don’t take a machete to your product lineup or org chart. Remember that not all complexity is alike. One advantage of aligning your strategy with your capabilities is that it helps you see your operations more clearly. You can distinguish the complexity that truly adds value (for example, a supply chain tailored to your most important customers) from the complexity that gets in your way (for example, a plethora of suppliers when only one or two are needed).

As Vinay Couto, Deniz Caglar, and John Plansky explain in Fit for Growth: A Guide to Strategic Cost Cutting, Restructuring, and Renewal (Wiley, 2017), effective cost management depends on the ability to ruthlessly cut the investments that don’t drive value. Customer-facing activities can be among the worst offenders. Some customers need more tailored offerings or elaborate processes, but many do not.

For example, Lenovo, a leading computer hardware company with twin headquarters in China and the U.S. (Lenovo acquired the ThinkPad line with its purchase of IBM’s personal computer business), has a strategy based on cross-pollination of innovation between two entirely different markets. The first is “relationship” customers (large enterprises, government agencies, and educational institutions), which purchase in large volume, need customized software, and are often legacy IBM customers. The second is “transactional” customers (individuals and smaller companies), typically buying one or two computers at a time, all seeking more or less the same few models; these customers are sensitive to cost and to the quality of the user experience.

Lenovo has a single well-developed hardware and software innovation capability aimed at meeting the needs of both types of customers. But its supply chain capability is bifurcated. The relationship supply chain is complex, designed to provide enterprise customers with greater responsiveness and flexibility. Lenovo’s computer manufacturing plant in Whitsett, N.C., which opened in 2013, was designed for fast shipping, large orders, and high levels of customization. Meanwhile, the company maintains a simpler supply chain with manufacturing sites in low-cost locations for its transactional customers.

The principle “keep it simple, sometimes” is itself more complex than it appears at first glance. It combines three concepts in one: First, be as simple as possible. Second, let your company’s strategy be your guide in adding the right amount of complexity. Third, build the capabilities needed to effectively manage the complexity inherent in serving your markets and customers.

9. Shape Your Value Chain

No company is an island. Every business relies on other companies in its network to help shepherd its products and services from one end of the value chain to the other. As you raise your game, you will raise the game of other operations you work with, including suppliers, distributors, retailers, brokers, and even regulators.

Since these partners are working with you on execution, they should also be actively involved in your strategy. That means selling your strategy to them, getting them excited about taking the partnership to a whole new level, and backing up your strategic commitment with financing, analytics, and operational prowess. For example, when the Brazilian cosmetics company Natura Cosméticos began sourcing ingredients from Amazon rain forest villages, its procurement staff discovered that the supply would be sustainable only if they built deeper relationships with their suppliers. Beyond paying suppliers, they needed to invest in the suppliers’ communities. The company has held to that commitment even during down periods.

Use leading-edge digital technology to align analytics and processes across your value chain. In the past, companies that linked operations to customer insight in innovative ways did it through vertical integration, by bringing all parts of the operation in-house. For example, Inditex created a robust in-house network that linked its Zara retail stores with its design and production teams. Real-time purchase data allowed designers to find out what was selling — and what wasn’t — more quickly than their competitors could. This approach has helped Zara introduce more items that would sell quickly while keeping costs down. And it has helped Inditex outpace its rivals in both profitability and growth.

At the time Inditex developed its system, vertical integration was a prerequisite for linking operations to customer insight that tightly. But now the technology has changed, and in a cloud-based computing environment, you no longer need full vertical integration. You can achieve the same result through integrated business platforms (some managed by third-party logistics companies such as Genpact, and others being developed as joint ventures). By allowing several companies to share real-time data seamlessly, these platforms enable each participating company to set more ambitious strategic goals.

10. Cultivate Collective Mastery

The more bound your company is by internal rules and procedures for making and approving decisions, the slower it becomes. Hence the frustration leaders have with the pace of bureaucracy, in which people can’t make decisions because they don’t know what the strategic priorities are — or even what other stakeholders will think. In a world where disruption has become prevalent, your company can’t afford the time or expense of operating this way.

The alternative is what we call collective mastery. This is a cultural attribute, often found in companies where strategy through execution is prevalent. It is the state you reach when communication is fluid, open, and constant. Your strategists understand what will work or not work because they talk easily with functional specialists. Your functional specialists know not only what they’re supposed to do, but why it matters. Everyone moves quickly and decisively, because they have the ingrained judgment to know who to consult, and when. People trust one another to make decisions on behalf of the whole.

Many of the attributes of Silicon Valley companies owe a great deal to the high level of collective mastery in the area. The culture of these companies encourages risk taking, because it’s expected that people will make mistakes — not as a goal, of course, but in the process of learning. People expect their colleagues to be informal, quick-thinking, and unassuming. They rely on systems and processes only when they add value, and are willing to jettison them at other times. With this type of culture, people can focus on getting results.

Collective mastery builds over time when people have the support and encouragement they need to work easily and readily across organizational boundaries, with a high level of trust and frequent informal contact. Even when they hold different perspectives, they get to the point where they understand one another’s thinking.

To operate this way, you have to be flexible. That doesn’t mean giving up your strategy; you still should pursue only opportunities with which you have the capabilities to win. Indeed, knowing what you do best allows you to be closer to the customers who matter, and to give more autonomy to employees. Because you are less distracted by nonstrategic issues, you have the attention and resources to pursue worthwhile opportunities as soon as they arise. Collective mastery also makes it easier to conduct an experiment: to launch a project and learn from the response without making a huge commitment. This high level of fluidity and flexibility is essential for navigating in a volatile economic landscape.

In the end, the 10 principles of strategy through execution will do more than help you achieve your business goals. They will also help build a new kind of culture, one in which people are aware of where you’re going and how you’re going to get there. The capabilities you build, and the value you provide, are larger than any individual can make them. But by creating the right kind of atmosphere, you make it possible to not just stand in the weeds and look at the stars, but reach a higher level than you may ever have thought you would.
