Archive for the ‘Foresight’ Category

In the same way that the Web has changed the communication habits of millions of people, social computing is evolving to help us work with and in complex adaptive systems. The insights from Snowden and Jackson about ‘foresight’ and complexity, and their relationship with social computing, are fascinating not just for futures planning, but also for re-thinking processes in knowledge-intensive organisations, such as knowledge management, communication and collaboration. For instance:

Developing a sensor network: The ability to quickly access authoritative guidance from colleagues, and to tap into regular streams of intelligence regarding clients, competitors, market changes, and so on, is crucial to the development of actionable current awareness. But too often, companies rely on a handful of sources to feed them information, constrain sharing to document- and email-centric models, and squeeze people’s interactions into pre-existing software models and workflows. That leaves the better part of a company’s extensive network of resources untapped, along with a void of higher-level meta-data or collective intelligence that could otherwise be derived from people’s diverse activities and contributions to a social computing platform.

If employees, clients and collaborators are able to contribute fragments of information, like tags, bookmarks, comments and links, as they come across or create the information in the course of their daily work, this benefits the individual (there is little or no extra effort involved), and when that information is aggregated, patterns can be detected which help others to spot trends and focus on hotspots in real time. That makes for a potent early warning system and truly effective current awareness.
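The aggregation step can be sketched in a few lines of Python. This is a minimal illustration only, not a description of any particular platform: the fragments, tags and timestamps below are invented, and a real system would draw on live streams of bookmarks and comments rather than a hard-coded list.

```python
from collections import Counter
from datetime import datetime, timedelta

# Hypothetical fragments: (timestamp, tag) pairs contributed by many users
# as a by-product of their daily work (bookmarking, commenting, clipping).
fragments = [
    (datetime(2009, 5, 1, 9), "credit-risk"),
    (datetime(2009, 5, 1, 10), "basel-ii"),
    (datetime(2009, 5, 2, 9), "credit-risk"),
    (datetime(2009, 5, 2, 11), "credit-risk"),
    (datetime(2009, 5, 2, 12), "litigation"),
]

def trending_tags(fragments, now, window=timedelta(days=1), top_n=3):
    """Count tags inside a recent time window to surface 'hotspots'."""
    recent = Counter(tag for ts, tag in fragments if now - ts <= window)
    return recent.most_common(top_n)

now = datetime(2009, 5, 2, 13)
print(trending_tags(fragments, now))
# Within the last day: 'credit-risk' appears twice, 'litigation' once.
```

No individual contributor set out to signal a trend; the signal only emerges when the fragments are pooled, which is the point of the sensor-network metaphor.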

Sharing your way to competitive advantage: To operate in complex networked environments, companies are having to rethink old models based on the control of proprietary information. Snowden articulated it in these terms: “The paradigm has shifted. For whole of last century ownership gave economic power. Now, it is the speed at which you exploit things that matters not ownership. A strategy of openness makes more things available to you. What matters then is your agility and ability to exploit things.”

Over the last few years we have seen some companies become more open and share their learning and information with clients and other organisations. For instance, Innocentive.com enables companies, academic institutions and NFPs to come together in an open innovation marketplace to post challenges, share ideas and devise breakthrough solutions. In the UK, several major international law firms established the Banking Legal Technology portal in 2006 due to pressure from investment banking clients wishing to reduce costs and streamline access to all advice and information from the different firms. Likewise, Legal OnRamp provides another forum for lawyers to share information and showcase their expertise, and for in-house counsel to access the precedents of major law firms and to pool their resources with other general counsel. Going forward, we will see companies using increasing volumes of fragmented data (e.g. tweets, blogs, comments, HTML links and pages) to contribute to social extranets, accessible by clients and competitors alike.

In that way, companies will get to see more and do more for less. By opening up the scanning process, not only will they add to the overall pool from which they can draw, they will also be presented with new narratives and possibilities which would not have been apparent or available in a closed setting. It will then be companies’ ability to interpret and apply the information quickly, innovatively and insightfully that will provide competitive advantage.

Developing new meaning through deliberate ambiguity: The classic ‘young woman / old woman’ optical illusion is a good example of ambiguity: there’s an old and a young woman in the same picture. Perhaps you see one or both of them. How long did it take you to focus on the different images? Does that mean anything? Is one more persuasive than the other? Snowden proposed ‘deliberate ambiguity’ as a vehicle for encouraging emergent meaning and contributing to the effectiveness and richness of a work. As we increasingly use fragmented materials in our work (clipping items from feed readers, adding notes and tags to them, linking the clippings to blog posts, and engaging people in further online discussions and idea sharing), we deliberately introduce a higher degree of ambiguity to the system. It is precisely this ambiguity that allows us to interpret and give new meaning to the fragments, which provides new perspectives, ideas and interpretations. This is the source of innovation and difference – not best practice and compliance regimes.

There are also ramifications for traditional information categorisation and classification regimes, the purpose of which was to disambiguate and establish order in the system. Efforts to create order in this way can be counter-productive. If you are looking for something that hasn’t been categorised in the way you expect, then you probably won’t find it (quickly or perhaps at all). You are also less likely to make valuable serendipitous discoveries by stumbling across items that sit outside of traditional categories. As Thomas Gruber (2007) explains in his article “Ontology of Folksonomy: A Mash-up of Apples and Oranges”:

“Tags introduce distributed human intelligence into the system. As others have pointed out, Google’s revolution in search quality began when it incorporated a measure of “popular” acclaim — the hyperlink — as evidence that a page ought to be associated with a query. When the early webmasters were manually creating directories of interesting sites relevant to their interests, they were implicitly “voting with their links.” Today, as the adopters of tagging systems enthusiastically label their bookmarks and photos, they are implicitly voting with their tags. This is, indeed, “radical” in the political sense, and clearly a source of power to exploit.”

In that way, user participation in the form of social tagging offers a far more powerful means of discovering information and meaning.
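Gruber’s ‘voting with tags’ idea can be made concrete with a small sketch. The bookmarks, users and document names below are all hypothetical; the point is simply that each user’s tag acts as an implicit vote, so items tagged by more people rank higher for that tag, without any central taxonomy.

```python
from collections import defaultdict

# Hypothetical bookmarks from a shared tagging system: (user, item, tag).
bookmarks = [
    ("ana", "doc-17", "open-innovation"),
    ("ben", "doc-17", "open-innovation"),
    ("cai", "doc-17", "crowdsourcing"),
    ("ana", "doc-42", "open-innovation"),
    ("ben", "doc-99", "compliance"),
]

def rank_by_tag(bookmarks, query_tag):
    """Treat each (user, tag) pair as an implicit vote; rank items for a tag."""
    votes = defaultdict(set)  # item -> set of users who applied the tag
    for user, item, tag in bookmarks:
        if tag == query_tag:
            votes[item].add(user)
    return sorted(votes, key=lambda item: len(votes[item]), reverse=True)

print(rank_by_tag(bookmarks, "open-innovation"))  # ['doc-17', 'doc-42']
```

Notice that ‘doc-17’ also surfaces under ‘crowdsourcing’, a label its author may never have anticipated; that is the serendipity that rigid pre-defined categories tend to suppress.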

Using technology to provide decision support: Although previous generations dreamed of artificial intelligence, with people feeding computers information and receiving answers, we now understand that the roles should be reversed: we are interested in using computer networks to augment human intelligence and make it easier for us to make decisions of our own. This is the key differentiating factor about social computing – it has human agency in it. Whilst computers can present more data, human agency is needed to determine the meaning of the information fragments. That requires us to deliberately model and look at things from different perspectives, then present the data back for human-based interpretation and decision making.

To conclude: “the whole point about technology is to provide decision support for human beings not to make decisions” (Dave Snowden).


In today’s complex and turbulent environment, organisations need ‘foresight’ to be able to respond promptly to various change drivers including technology, sustainability, globalisation and the economy. Essentially ‘foresight’ is a participative approach to creating shared long-term visions to inform short term decision-making processes (see http://www.foresight-network.eu/). During a recent webinar, Dave Snowden (Cognitive Edge) and Mike Jackson (Shaping Tomorrow) outlined how horizon scanning and social computing can help organisations plan for the future, protect themselves against unexpected threats and exploit forthcoming opportunities.

Jackson outlined the following five components of foresight development and managing change:

  • ID and Monitor Change: Identify patterns from the stories, fragments of information and behaviours of many participants in a system or network and decide how those patterns impact business.
  • Critique implications: Inform the impact assessment with a cross-section of information not just intelligence regarding one’s own industry. That means monitoring much more change and developing a better peripheral vision to be able to understand the broader implications for the business.
  • Imagine difference: Establish the risks and alternatives for different scenarios.
  • Envision preferred route forward: Establish where you are, then determine where you want to go by scanning plausible, possible and probable ideas and changes for the future.
  • Plan and implement: Identify goals, resources, strategies and stakeholders required to create change and help them cope with the inherently uncertain future.

The participatory, evolutionary and social nature of ‘foresight’ development makes for a snug fit with social computing (i.e. the simpler, more networked online applications that connect people and allow them to pool their knowledge and interact better with those in their network). More specifically, social computing enables the content and online interactions to constantly shift so as to better reflect the knowledge, ideas, opinions, preferences and even aspirations of all contributors. Not only does this allow us to develop a better radar of what is happening across our network, it also provides us with a higher level of collaborative intelligence: a range of opportunities and outputs that could not be created by any number of individuals or small groups working alone.

(Interestingly, these are also features of complex adaptive systems (emergent, highly connected and simple on the micro level; complex and unpredictable on the macro level) which evolve through rapid feedback loops making them highly adaptive to changing conditions.)

Snowden picked up here, talking about complexity theory, the creation of human sensor networks and the need to manage the evolutionary potential of the present as an alternative to traditional scenario planning. Referring to his recent blog post, Think anew, Act anew: Scenario Planning, Snowden cited a wonderful quote from Seneca:

“The greatest loss of time is delay and expectation, which depend upon the future. We let go the present, which we have in our power, and look forward to that which depends upon chance, and so relinquish a certainty for an uncertainty.”

This quote emphasises that what matters now is managing the present by switching from “fail safe design strategies to safe fail experimentation”. That involves the use of small early interventions in the form of exploratory behaviour, allowing the ones with good results to be amplified and the ones that don’t work to be eliminated.
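The amplify-or-eliminate logic of safe-fail experimentation can be caricatured in code. This is a toy sketch under invented assumptions (the probe names, success rates, budget and threshold are all made up, and real interventions are not coin flips); it only illustrates the shape of the portfolio approach: run many cheap probes, keep what the evidence favours, drop the rest.

```python
import random

random.seed(1)  # deterministic demo run

# Hypothetical probes and their (unknown in practice) success probabilities.
probes = {"wiki-pilot": 0.8, "forum-pilot": 0.3, "tag-pilot": 0.6}

def safe_fail_round(budget, success_rate):
    """Spend a small budget of trials and return the observed success ratio."""
    successes = sum(random.random() < success_rate for _ in range(budget))
    return successes / budget

def run_portfolio(probes, budget=20, threshold=0.5):
    """Amplify probes whose observed results beat the threshold; drop the rest."""
    amplified, dampened = [], []
    for name, rate in probes.items():
        observed = safe_fail_round(budget, rate)
        (amplified if observed >= threshold else dampened).append(name)
    return amplified, dampened

amplified, dampened = run_portfolio(probes)
print("amplify:", amplified, "eliminate:", dampened)
```

The contrast with fail-safe design is that no single probe is bet-the-farm: each is cheap enough that elimination is an acceptable, even expected, outcome.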

Snowden went on to outline the three fundamental consequences of complexity theory which need to be addressed to manage a complex system; he covers these in greater depth, along with a critique of horizon scanning and scenario planning, in the post mentioned above.

  • Need for distributed cognition. The crux of this is decentralisation and mass participation. The idea that the few can decide for the many, whether it be drawing up scenarios, selecting technology, imposing structures or the like, is inherently unstable. Instead, we need to start to use large numbers of people to feed decision making processes with current information and diverse perspectives.
  • Fragments are key. Material that is used must be finely granulated. A big problem with traditional scenario planning is that it produces ‘chunked’ reports. The human brain has evolved to handle fragmented data patterns – pictures, comments, captions, labels, etc. One of the reasons social computing is so successful is that it presents information in multiple inter-threaded fragments, so that the brain can ‘conceptually blend’ and link those fragments to determine how to move forward. Documents don’t tune in to the evolutionary nature of humans. Fragmented information has evolutionary advantage.
  • Overcome disintermediation. People making decisions about the future have to have direct contact with raw intelligence. They can’t afford to have middle management or processes mediating, summarising or grouping information. The information must come from trusted sources and permit interaction with those sources (so the information can be validated).

In Part II of this post I will look at some of the implications of this participative evolutionary approach for traditional current awareness and information creation and categorisation processes.
