Companies Learn and Adapt to Their Competitive Environments

Most organisms behave like complex adaptive systems. As their parts sense and respond to constantly changing conditions outside the system, each part interacts with and influences the other parts. Ultimately, this yields new patterns of behavior for the system as a whole.

In brief, such an organism is an open system that is smart: it continually learns, adapts, and evolves.

For example, consider the lowly anthill. The ant colony displays collective behavior, yet each ant acts individually. If you block the path a line of ants takes to a food source, the system will adapt: it will develop a new path to that food source. The change, however, is produced by the interactions of small groups of ants. The queen ant doesn't instruct each ant to change its individual behavior. The new path emerges out of the interconnected actions of individuals.
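The way a new path can emerge from simple local rules, with no central instruction, can be sketched in a toy simulation. Everything here is invented for illustration: ants pick one of two paths in proportion to its pheromone level, only ants that reach food reinforce their path, and pheromone slowly evaporates.

```python
import random

# Toy sketch of emergent path-finding. Each ant independently picks a path
# with probability proportional to its pheromone level; only ants that reach
# food (an unblocked path) lay new pheromone, and pheromone evaporates.
def run_colony(paths, blocked=None, ants=1000, evaporation=0.95):
    pheromone = {p: 1.0 for p in paths}
    for _ in range(ants):
        # Roulette-wheel choice weighted by pheromone.
        r = random.uniform(0, sum(pheromone.values()))
        for choice, level in pheromone.items():
            r -= level
            if r <= 0:
                break
        if choice != blocked:
            pheromone[choice] += 1.0   # a successful trip reinforces the path
        for p in pheromone:
            pheromone[p] *= evaporation
    return max(pheromone, key=pheromone.get)

random.seed(42)
print(run_colony(["short", "long"]))                   # colony settles on a path
print(run_colony(["short", "long"], blocked="short"))  # a new path emerges: long
```

Blocking the reinforced path shifts the colony onto the alternative purely through the interaction of individual choices; no "queen" in the code ever issues an instruction.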

In the same way, a company is a complex adaptive system. Much like an anthill, it displays a collective response when each employee or stakeholder individually changes his or her behavior in response to a dynamic operating environment: shifting consumer demand, supply chain bottlenecks, and so on.

The company relies on a variety of learning feedback loops to deal with the forces of change in its environment.

Researchers at the consulting firm CSC Index identified five attributes of intelligent systems:1

Sensing, or bringing awareness to everyday things.

Adapting, or modifying behavior to fit the environment.

Learning, or using experience to improve performance.

Inferring, or drawing conclusions from rules and observations.

Anticipating, or thinking and reasoning about what to do next.

We discussed the first of these five attributes in our exploration of the "Sense and Respond" trend. In this section, we'll discuss the second and third of CSC's smart attributes.

We'll describe how companies are improving their learning and adapting capabilities by leveraging advances in artificial intelligence, including neural networks. Later, we will explore inferring and anticipating in our discussions of "Genetic Algorithms" and "Seed, Select, and Amplify."

Recall the distinction between single-loop responses and double-loop learning. Double-loop feedback processes involve questioning, and possibly revising, the rules and beliefs governing a system's actions: the way things work or are traditionally done. Those rules, beliefs, assumptions, or even strategies may reflect conditions that no longer hold. Double-loop feedback is the hallmark of a complex adaptive system.
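The distinction can be sketched with a deliberately simple, hypothetical thermostat: single-loop feedback corrects the action against a fixed rule, while double-loop feedback first asks whether the rule's own assumptions still hold.

```python
# Hypothetical sketch: a thermostat whose governing rule is "hold 21 degrees."
# Single-loop feedback only corrects the action; double-loop feedback also
# questions the rule when its underlying assumption (occupancy) no longer holds.

def single_loop(temp, setpoint=21.0):
    # Adjust the action against a fixed rule; the rule itself is never examined.
    return "heat" if temp < setpoint else "idle"

def double_loop(temp, occupied, setpoint=21.0):
    # First revise the rule if its assumption no longer holds, then act.
    if not occupied:
        setpoint = 15.0  # the old setpoint reflected conditions that no longer hold
    return "heat" if temp < setpoint else "idle"

print(single_loop(18.0))                  # heats an empty building: "heat"
print(double_loop(18.0, occupied=False))  # rule revised first: "idle"
```

The single-loop controller keeps optimizing an obsolete rule; the double-loop controller changes the rule itself, which is exactly the behavior the trend anticipates in companies.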

The promise of artificial intelligence and advanced computing is that companies will engage in more double-loop learning processes using expert systems, neural networks, data and text mining, and so on. Companies will "learn and adapt" to their complex, changing competitive landscapes using these tools.

The hope is that they will become less fixed and mechanical, and more evolving and organic. In other words, as these systems become increasingly intelligent, organizations will morph and evolve into ever more complex adaptive systems. They will become increasingly life-like.

An adaptive enterprise can be defined as one that constantly stays alert to changes in the marketplace and rapidly adapts its strategies and operations in order to survive and thrive in that environment.

Let's examine the technological advances in artificial intelligence and advanced computing that create these hopes. Perhaps the best-known example of artificial intelligence was the system known as Deep Blue, the world-famous chess program. Deep Blue's ability to evaluate 200 million possible chess moves per second was derived from specialized hardware and software that combined the knowledge of a grandmaster with the processing speed of an IBM PowerParallel SP computer.

Deep Blue, however, relied on brute-force search rather than learning; the learning systems attracting the most attention are neural networks. These networks have grown increasingly complex in the 21st century, and we expect their rate of change to accelerate dramatically over the coming five years. A neural network is a computer system in which learning is achieved by linking several layers of simple processing elements called neurons and adjusting the strengths of the connections between them. Conventional computers use algorithms to solve problems; as a result, they can only solve problems that we already know how to solve. Neural networks, by contrast, can solve problems, spot trends, and detect patterns that are beyond the capabilities of people and other computing techniques.
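A minimal sketch of the layered idea: each "neuron" is a weighted sum passed through a squashing function, and stacking two layers lets the network compute XOR, a pattern no single neuron can represent. The weights below are hand-set so the example is deterministic; in a real network they would be learned from examples.

```python
import math

# Minimal sketch of layered neurons. The weights are hand-set so the example
# is deterministic; in practice they would be learned (e.g. by backpropagation,
# which nudges each connection weight to reduce prediction error).

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def neuron(inputs, weights, bias):
    # A neuron is a weighted sum of its inputs passed through a squashing function.
    return sigmoid(sum(i * w for i, w in zip(inputs, weights)) + bias)

def xor_net(a, b):
    h1 = neuron([a, b], [20, 20], -10)    # hidden neuron approximating OR
    h2 = neuron([a, b], [-20, -20], 30)   # hidden neuron approximating NAND
    # Output layer: OR AND NAND equals XOR, which no single neuron can compute.
    return neuron([h1, h2], [20, 20], -30)

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, round(xor_net(a, b)))  # prints the XOR truth table: 0, 1, 1, 0
```

The point of the layering is the same one the paragraph makes: the combined system solves a problem that none of its simple parts, taken alone, knows how to solve.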

Researchers at the Battelle Memorial Institute note that these technologies are very good at making inferences from imprecise or incomplete input data.2 "They are good pattern recognition engines and robust classifiers, with the ability to generalize in making decisions about imprecise input data."

Neural networks now solve business problems that prove too complicated for functional experts. For this reason, they are portrayed as rivaling human intelligence. When organizations leverage these networks, they are able to adapt more readily to increasingly complex, chaotic and changing environments. Consider the following neural network applications:

Overnight, Wal-Mart analyzes point-of-sale data gathered from its more than 3,000 stores to make forecasts about the sales of every product at each store.3 Because these forecasts are so accurate, the company cuts the costs of inventories and focuses its promotional spending on the products that offer the best returns on investment. According to a whitepaper published by CSC Index, "At [Wal-Mart's] corporate headquarters, they want to know trends down to the last Q-tip, but it would take several lifetimes for a human analyst to glean anything from the equivalent of two million books of data contained in a terabyte."

A software system named CopLink is helping Tucson police solve crimes. Although sniper suspects Lee Boyd Malvo and John Allen Muhammad were apprehended before Tucson police arrived in Washington, D.C. with the technology, the application turned up eerie clues that could have identified the pair in last year's sniper shooting spree. Police in Westminster, California are also developing a system "to predict crime trends, improve patrol assignments, and develop better crime-prevention programs."

At Southwest Airlines, executives used complexity-theory modeling to analyze the carrier's freight operations and make them more efficient. The models showed that the airline was transferring many packages to the most direct flights, leading to unnecessary handling and storing of packages. By letting some cargo take more roundabout routes, the airline has cut the cargo transfer rate by 70 percent at its six freight hubs, saving millions of dollars, according to InformationWeek.4

Procter & Gamble ran a simulation to help achieve an inventory reduction of 25 percent in its immense supply network, which includes some 250 product lines. Contrary to conventional thinking, the company found that if it sent out its delivery trucks more often, before they were filled to capacity, it could slash the cost of inventory in its warehouses. This led P&G to estimate that the technology could cut the time and cost of its supply chain in half through lower inventories, lower transportation costs, and fewer purchasing managers.
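The counterintuitive trucking result, that dispatching before trucks are full can lower warehouse inventory, can be illustrated with a toy simulation. All quantities below are invented and bear no relation to P&G's actual figures.

```python
import random

# Toy simulation of the trucking insight: shipping on a fixed daily schedule,
# even with part-full trucks, can leave less stock sitting in the warehouse
# than waiting to fill every truck. All quantities are invented.

def simulate(policy, days=1000, capacity=100, seed=1):
    rng = random.Random(seed)
    stock, inventory_days = 0, 0
    for _ in range(days):
        stock += rng.randint(20, 60)       # goods arriving from production
        if policy == "wait_until_full":
            while stock >= capacity:       # dispatch only completely full trucks
                stock -= capacity
        else:                              # "ship_daily": dispatch even part-full
            stock -= min(stock, capacity)
        inventory_days += stock            # stock left sitting overnight
    return inventory_days / days           # average overnight warehouse inventory

print(simulate("wait_until_full"))   # carries stock while waiting to fill trucks
print(simulate("ship_daily"))        # clears the warehouse every day
```

The "wait until full" policy always leaves a partial truckload sitting overnight; the scheduled policy trades fuller trucks for emptier warehouses, which is the trade-off the simulation revealed to P&G.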

These applications are not without their drawbacks. Companies will need broad access to a wide variety of data; many companies still lack robust information databases as a result of poor ERP installations. Managing the data is just the beginning. Building these models is a complex process. Few companies have the expertise in-house.

Several companies now provide these services on a consulting basis, including Genalytics, Magnify, Quadstone, Fair Isaac Corporation, California Scientific Software, NCR's Teradata division, the SAS Institute, and SPSS. Yet some companies will be uncomfortable turning their customer databases over to external parties.

Based on the previous analysis, we forecast the following four developments:

1. Expect widespread roll-out of what we call "predictive intelligence" applications using technologies such as data mining, neural networks, and genetic algorithms. Some companies currently use deterministic statistical analysis to build such applications: for example, TiVo recommends television programming to its subscribers, and Amazon recommends items based on a customer's purchase history. But we expect this technology to become even more sophisticated. Consider the following:

Predictive analysis tool vendor Sightward helped the skin-and-hair products retailer The Body Shop and the Seattle-based kitchenware provider Sur La Table, Inc. improve their catalog sales. The Body Shop estimates that revenue per catalog increased 10 to 20 percent. Sur La Table estimates that 35 percent of the mailing list generated by predictive analysis would not have been found using standard prospecting analytic techniques.

Financial giant HSBC uses the technology to forecast foreign-exchange rates for its traders. The firm expects to generate foreign exchange models of unprecedented detail and accuracy as it combines its own market knowledge with artificial intelligence software and parallel computing capabilities.5 U.S. Bancorp, Wachovia Bank, and the Credit Union National Association have reduced their credit-card fraud rates by 70 percent using neural network-based applications.
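Recommendation engines of the kind TiVo and Amazon pioneered can be sketched, in drastically simplified form, as co-occurrence counting over purchase histories. The baskets and items below are invented; real systems add weighting, normalization, and far larger histories.

```python
from collections import Counter

# Toy co-occurrence recommender: suggest the items that most often appear in
# other customers' baskets alongside items this customer already owns.

baskets = [
    {"coffee", "filters", "mug"},
    {"coffee", "filters"},
    {"coffee", "filters"},
    {"coffee", "mug"},
    {"tea", "mug"},
]

def recommend(owned, baskets, top_n=1):
    scores = Counter()
    for basket in baskets:
        if basket & owned:                # this customer shares a purchase
            for item in basket - owned:   # count what else they bought
                scores[item] += 1
    return [item for item, _ in scores.most_common(top_n)]

print(recommend({"coffee"}, baskets))  # coffee buyers most often also buy filters
print(recommend({"tea"}, baskets))     # the lone tea buyer also bought a mug
```

Even this crude counting captures the core mechanism: the system learns from the accumulated behavior of many individuals, not from rules anyone wrote down.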

2. Business intelligence will help companies make sense of what they're sensing. As we explained in our discussion of the "sense and respond" trend, we predict an explosion in company information from a variety of sources, including instant messaging and RFID sensors. A company's ERP system will store much of this data; however, an ever greater portion will be unstructured information stored on web pages, in e-mails, in text messages, and so on. At the same time that companies are taking advantage of data mining, we also expect a growing but slower roll-out of text mining applications. While these use the same neural networks as data mining applications, they also employ more recent tools such as intelligent agents, which are in the early stages of development. We therefore anticipate that the number of text mining installations will initially trail the number of data mining installations over the next two to three years. In seven to 10 years, however, we expect the number of companies installing new text mining applications to exceed those installing new data mining applications. Initial installations point to the promise and power of future text mining applications. Consider the following:

The National Security Agency's Echelon system eavesdrops on voice messages and text sent over the Internet and telephone lines. The amount of data the NSA collects is staggering. According to BusinessWeek Online,6 the entire adult population of the U.S. couldn't begin to manually filter all the material the NSA collects, which is why DARPA is a primary sponsor of research on artificial intelligence.

The Department of Defense Intelligence Information Systems is developing an application that will monitor and filter up to 35,000 messages a day, including text, graphics, and rich media, at 40 sites around the world.7 The software will also be used to search quickly through 20 million messages that the system has stored over the last 15 years.
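At its simplest, the filtering step in such message-monitoring systems can be sketched as keyword scoring over incoming messages. The keywords, weights, threshold, and messages below are all invented; production text mining uses statistical classifiers and entity extraction, but the pipeline shape (ingest, score, filter) is the same.

```python
# Toy keyword-scoring filter: score each message against a weighted keyword
# list and surface only the messages that cross a threshold.

KEYWORDS = {"shipment": 2, "delayed": 3, "refund": 3, "thanks": -1}

def score(message):
    # Sum the weights of any known keywords appearing in the message.
    return sum(KEYWORDS.get(word, 0) for word in message.lower().split())

def filter_messages(messages, threshold=3):
    # Surface only the messages whose score crosses the threshold.
    return [m for m in messages if score(m) >= threshold]

inbox = [
    "thanks for the update",
    "my shipment is delayed again",
    "please process my refund",
]
print(filter_messages(inbox))  # flags the shipment and refund messages
```

Scaling this shape from three messages to 35,000 a day is an engineering problem; replacing the hand-written keyword list with learned models is where the neural network and intelligent-agent work discussed above comes in.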

3. The number of activities requiring cross-enterprise coordination will explode, especially as business executives recognize the interdependence of companies across their extended value chains. These developments point to the growing importance of inter-enterprise processes such as resource allocation, production scheduling, and logistics routing, which most executives now call "supply chain planning." When combined with data and text mining across the Internet, however, these processes begin to represent something much greater: the collective intelligence of the extended enterprise, the brain of the ecosystem. Consider the following:

Procter & Gamble plans to install a programmed agent at each decision point along the supply chain. The ultimate goal is to link the software to its suppliers¡¯ IT systems. This will automate every purchasing decision the company makes. But more significantly, all of the decisions along the supply chain will be made at the same time, rather than in sequence.

Some experts envision the creation of an artificial being with the combined knowledge of the world¡¯s greatest minds by combining all of the data on all of the systems linked to the Internet. For example, a computer program called SmarterChild goes to an on-line dictionary for correct word spellings, to a sports Web site for the correct score of every game, and so on. These programs can draw upon all of the databases on the Web, not only a single database. Before long, according to some scientists, such programs could evolve into an intelligent network that consists of all of the approximately 1 billion computers on the Internet.

4. Computers will become largely autonomous. Smart products already contain diagnostic and self-healing properties unimaginable a decade ago. BMW automobiles can detect problems while operating, remotely contact the company's computer, self-diagnose the condition, and download temporary software fixes until the owner returns the car to the dealer. In the UK, telephone networks route their own calls and continuously change their programming to cope with changes in demand. Mainframe storage devices now have that capability as well, an economical development when one considers that it costs twice as much to manage storage systems as it does to buy them. We also expect that computers will increasingly be able to program themselves and so assume greater degrees of autonomy. Advances in genetic algorithms and genetic programming, which we'll discuss as we explore the next trend, make this possible. IBM's research budget for autonomic computing already approaches $500 million, and other companies are developing their own autonomic computing solutions.

References:

1. Eric Beinhocker, "Strategy at the Edge of Chaos," McKinsey Quarterly, 1997, Number 1. © 1997 McKinsey & Company. All rights reserved.
2. "What Is an Artificial Neural Network?" Pacific Northwest National Laboratory: www.emsl.pnl.gov/proj/neuron/neural/what.html
3. Otis Port, Michael Arndt, and John Carey, "Smart Tools," Business Week, March 24, 2003. © 2003 The McGraw-Hill Companies. All rights reserved.
4. Rick Whiting, "Companies Boost Sales Efforts with Predictive Analysis," InformationWeek, February 25, 2002. © 2002 CMP Media LLC. All rights reserved.
5. Bien Perez, "HSBC Banks on Brain Power," South China Morning Post, June 19, 2002. © 2002 South China Morning Post Publishers Limited. All rights reserved.
6. Otis Port, Michael Arndt, and John Carey, "Smart Tools," Business Week, March 24, 2003. © 2003 The McGraw-Hill Companies. All rights reserved.
7. Kevin Marron, "Tools for Taming Data Chaos," The Globe and Mail, September 6, 2002. © 2002 Bell Globemedia Publishing, Inc. All rights reserved.
