
Consider for a moment where you turn when you’re looking for a document, a reference or information about who’s working on what and with which clients. Commonly, people will ask their colleagues or send out an email to their team or people in their wider network. People use their networks whenever and wherever they can to supplement their face-to-face interactions, and to get information that’s tailored to what they need, when they need it. It is precisely these drivers that have led to the exponential rise of online social networks and the evolution of social technologies.

However, instead of supporting the social networks through which information and knowledge circulate, many of the large, centralised, top-down implementations in firms have focused on enforcing information and management processes. It’s no wonder that many of these specialist applications are underused – with their different interfaces and rules for user interactions that require people to spend time figuring out how to use them, compiling information to be approved for inclusion, and then trying to find the information once it has made it into the system. They are not user-friendly, and they don’t reflect the workings of a network where people turn to people to get what they need.

Aside from these technology issues, a shift is needed away from traditional ideas associated with knowledge ‘management’. People use technology because it provides an individual benefit, like getting their work done more efficiently or building their expertise in an area that will help them win clients or get promoted. It’s time to get rid of the notion that people must capture and share information to make the firm more profitable. Instead, we should be thinking about the behaviour shift and support needed to make individuals more productive, so that sharing becomes a by-product of doing rather than an end in itself.

Social software can play a useful role in streamlining the interaction and communication necessary to support existing ways of working. It can for instance help tackle the burgeoning email and information overload problems suffered by so many legal professionals, and help them quickly and easily find what they need when they need it.

It requires simple changes to the way people work, like using a wiki to prepare pitches instead of sending out emails to a limited group of contributors. That change can provide the immediate benefits of reducing email traffic and keeping all the information in one place for assimilation, review and future reference. It also provides the flow-on benefits of giving greater transparency (subject to any confidentiality restrictions) to those who would otherwise have been excluded from the pitch preparation process and adding to the collective intelligence of the firm. Likewise, feed readers and social bookmarking are excellent personal KM tools. Not only do those tools provide direct benefits to individuals by putting current, relevant information at their fingertips, they also provide a collective benefit. On the one hand, people can find out about others’ interests or expertise in different fields; on the other, when the information is aggregated, patterns can be determined which help others to spot trends and focus on hot spots in real time.

This is one of the most important lessons of the Web 2.0 world for the enterprise social computing world, and hints at an important improvement that online social networking can bring to bear on the firm – a significant increase in participation based on the fact that the tools support individual needs. These shifts will shape the possibility of new, flatter and less costly ways of working in the future.

For lawyers, social networking has always been an important feature of the way they do business, and there are many characteristics of lawyerly behaviour that map very closely to the features of online social networking, such as:

  • Relationship-based business development;
  • Individual brand based on reputation and trust;
  • Expertise location and knowledge proliferation through social networks;
  • Development of legal content and expertise as a social endeavour;
  • Strong guild-like legal community.

Nevertheless, as traditionally conservative adopters of technology, many lawyers simply have not had the time to consider the implications of these social and technological developments, whilst others dismiss them as passing fads or consider them unlikely to have any real impact on the legal world.

The popularity of networking sites like Facebook, Twitter and YouTube has tended to limit perceptions of social networking to the online, out-of-work pastime of the younger (Net) generation, leaving many lawyers struggling to see beyond these media-created impressions of online networking.

Some question the value of professional networking sites, which have yet to attract a critical mass of participants. Others do not see activities like micro-blogging, social tagging and bookmarking as relevant, or are concerned about perceived risks associated with online social networking, stemming from breaches of ethics or data security and “inappropriate” behaviour.

These concerns, which need to be acknowledged and addressed if we are to see widespread adoption, have not deterred some innovative legal professionals who have observed the highly visible success and popularity of sites such as Wikipedia, Delicious, Facebook and LinkedIn, and are getting involved in social networking in an effort to secure competitive advantage through:

  • Development and exploitation of social capital within online social networks;
  • Development of collective intelligence, both inside the firm and more broadly within a market context;
  • Informal knowledge sharing using online social tools and networks.

Within the firm, over-structured group collaboration tools are increasingly giving way to lightweight wiki-based team and group spaces. Costly internal newsletters are becoming blogs, one-way intranet publishing is being opened up using wikis, RSS is starting to replace email alerts and internal social networks are taking forward the concept of expertise location and ‘know who’.

Within the marketplace, online social networking is helping legal professionals and firms alike to increase their visibility and be part of the conversation wherever it is happening, build reputation and relationships, recruit and retain the best and brightest new legal minds who have grown up as internet natives, and provide value-added, personalised legal services and secure referrals.

Clearly, there are many opportunities to re-think the way firms operate and emerge as more effective businesses. Have you thought about the potential for improvement in your firm?

With the release of our publication Social Networking for the Legal Profession, I will be introducing in a series of blogs some of the topics and themes outlined in more detail in the report.  I will start the series with posts outlining the context in which online social networking and social computing is asking us to rethink the way legal practices operate and emerge as more effective businesses.  In this first post I consider how turbulent conditions are forcing change, with later posts discussing the rapid rise of social software, the business role for social networking and the shaping of new ways of working.*

The impact of the recession looks set to have a profound and long-term effect on the legal profession, especially at the top end of the market. As companies do fewer deals and cut back on their legal spend, business is drying up in certain practice areas and revenues are falling for most firms, as evidenced by regular media reports of law firm lay-offs. Many firms are restructuring and paring-back costs, whilst others are seeking new avenues to exploit. In such times, client retention is of prime importance, and winning new business in the downturn has become an even greater challenge.

Hence, lawyers are constantly seeking ways to get closer to their clients and provide greater value, whilst market forces are pushing them to be leaner, more efficient and innovative in their service delivery.  There is also an ongoing struggle to capitalise on the skills, experience and talent needed to improve firms’ agility, overall effectiveness and competitiveness – shortcomings that are more visible now the market is no longer growing as it has in recent years.

Such acute conditions have brought into sharp relief a number of other challenging trends facing the legal profession, namely:

  • Market pull towards commoditisation: Commoditisation has been at the heart of Richard Susskind’s contention that lawyers must adapt to the concept of legal services as commodities and “embrace better, quicker, less costly, more convenient, and publicly valued ways of working”.   This trend towards commoditisation is being driven from a number of quarters including technology advances  (e.g. online and document assembly technologies), standardisation and packaging of lower risk transactional work and client demand for smarter, more cost-effective legal services. The pull towards commoditisation means that price and quality are no longer the only differentiators or drivers of competitive advantage.   Instead, advantage is derived from leveraging intangible assets and capabilities, which most obviously surround the capture, sharing and innovative delivery of knowledge.
  • The rise of the knowledge economy and knowledge markets: In today’s service-based economy, social and professional networks circulate valuable information, ideas, skills and opportunities quickly and effectively.  As a result, what and who you know determines where and how far you go.  As legal and support staff continue to be laid off, thousands of years of knowledge and expertise – largely untapped – walk out the door with them.  That knowledge can sit in various places, including email exchanges, memos, meeting notes, hard and shared drives, not to mention in the heads of the departees and their networks.
  • Technological advances: Information and communication technology continues to evolve and have a pervasive impact on our personal and professional lives.  Since its inception, the web has fundamentally changed the way users interact, connect and communicate.  Technology’s continued and rapid evolution is offering increasing opportunities for re-engineering of business. Unlike previous generations of technology, which essentially offered the opportunity of ‘substitution innovation’ (doing what had always been done a little better), new social technologies offer possibilities for radical change in the way things are done.
  • Generation shifts and expectations: The ‘Net’ generation has grown up on the internet, communicating with instant messaging, text messaging, social networking and (perhaps less commonly) via email.  They are always online and are used to constant real-time contact, multitasking, communicating, sharing and networking.  As a result, the expectation of younger or more Internet-savvy lawyers is that they can have the same freedom, flexibility and power inside and beyond the firm as they can using social tools for their personal affairs.

My next post will discuss the rapid rise of social software on the Web and in businesses.

In the same way that the Web has changed the communication habits of millions of people, social computing is evolving to help us work with and in complex adaptive systems. The insights from Snowden and Jackson about ‘foresight’ and complexity, and their relationship with social computing, are fascinating not just for futures planning, but also for re-thinking processes in knowledge intensive organisations like knowledge management, communication and collaboration processes. For instance:

Developing a sensor network: The ability to quickly access authoritative guidance from colleagues, and to tap into regular streams of intelligence regarding clients, competitors, market changes, and so on, is crucial to the development of actionable current awareness. But too often, companies rely on a handful of sources to feed them information, constrain sharing to document- and email-centric models, and squeeze people’s interactions into pre-existing software models and workflows. That leaves the better part of a company’s extensive network of resources untapped, and a void where the higher-level metadata or collective intelligence derived from people’s diverse activities and contributions to a social computing platform should be.

If employees, clients and collaborators are able to contribute fragments of information – tags, bookmarks, comments and links – as they come across or create information in the course of their daily work, the individual benefit comes at little or no extra effort, and when that information is aggregated, patterns can be determined which help others to spot trends and focus on hotspots in real time. That makes for a potent early warning system and truly effective current awareness.
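
To make the aggregation idea concrete, here is a minimal sketch in Python. It assumes a hypothetical stream of (timestamp, tag) fragments and simply compares how often each tag appears in the most recent week against the period before it; the names and data are illustrative only, not part of any particular product.

```python
from collections import Counter
from datetime import datetime, timedelta

# Hypothetical fragments: (timestamp, tag) pairs captured as people bookmark,
# tag or comment in the course of their daily work.
fragments = [
    (datetime(2009, 6, 1, 9, 30), "restructuring"),
    (datetime(2009, 6, 1, 11, 0), "banking"),
    (datetime(2009, 6, 8, 10, 15), "restructuring"),
    (datetime(2009, 6, 8, 14, 45), "restructuring"),
    (datetime(2009, 6, 9, 9, 0), "insolvency"),
]

def hotspots(fragments, now, window_days=7):
    """Flag tags that appear more often in the recent window than before it."""
    window_start = now - timedelta(days=window_days)
    recent = Counter(tag for ts, tag in fragments if ts >= window_start)
    earlier = Counter(tag for ts, tag in fragments if ts < window_start)
    return {tag: count for tag, count in recent.items() if count > earlier.get(tag, 0)}

print(hotspots(fragments, now=datetime(2009, 6, 10)))
# {'restructuring': 2, 'insolvency': 1} -- topics gathering momentum this week
```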

Sharing your way to competitive advantage: To operate in complex networked environments, companies are having to rethink old models based on the control of proprietary information. Snowden articulated it in these terms: “The paradigm has shifted. For whole of last century ownership gave economic power. Now, it is the speed at which you exploit things that matters not ownership. A strategy of openness makes more things available to you. What matters then is your agility and ability to exploit things.”

Over the last few years we have seen some companies become more open and share their learning/information with clients and other organisations. For instance, Innocentive.com enables companies, academic institutions and NFPs to come together in an open innovation marketplace to post challenges, share ideas and devise breakthrough solutions. In the UK, several major international law firms established the Banking Legal Technology portal in 2006 due to pressure from investment banking clients wishing to reduce costs and streamline access to all advice/information from the different firms. Likewise, Legal OnRamp provides another forum for lawyers to share information and showcase their expertise, and for in-house counsel to access the precedents of major law firms and pool their resources with other general counsel. Going forward, we will see companies using increasing volumes of fragmented data (e.g. tweets, blogs, comments, html links and pages) to contribute to social extranets, accessible by clients and competitors alike.

In that way, companies will get to see more and do more for less. By opening up the scanning process, not only will they add to the overall pool from which they can draw, they will also be presented with new narratives and possibilities which would not have been apparent or available in a closed setting. It will then be companies’ ability to interpret and apply the information quickly, innovatively and insightfully that will provide competitive advantage.

Developing new meaning through deliberate ambiguity: The classic ‘young woman or old woman’ picture is a good example of ambiguity: there’s an old and a young woman in the same image. Perhaps you see one or both of them. How long did it take you to focus on the different images? Does that mean anything? Is one more persuasive than the other?  Snowden proposed ‘deliberate ambiguity’ as a vehicle for encouraging emergent meaning and contributing to the effectiveness and richness of a work. As we increasingly work with fragmented materials – clipping items from feed readers, adding notes and tags, linking the clippings to blog posts and engaging people in further online discussions and idea sharing – we deliberately introduce a higher degree of ambiguity into the system. It is precisely this ambiguity that allows us to interpret and give new meaning to the fragments, which provides new perspectives, ideas and interpretations. This is the source of innovation and difference – not best practice and compliance regimes.

There are also ramifications for traditional information categorisation and classification regimes, the purpose of which was to disambiguate and establish order in the system. Efforts to create order in this way can be counter-productive. If you are looking for something that hasn’t been categorised in the way you expect, then you probably won’t find it (quickly or perhaps at all). You are also less likely to make valuable serendipitous discoveries by stumbling across items that sit outside of traditional categories. As Thomas Gruber (2007) explains in his article “Ontology of Folksonomy: A Mash-up of Apples and Oranges”:

“Tags introduce distributed human intelligence into the system. As others have pointed out, Google’s revolution in search quality began when it incorporated a measure of “popular” acclaim — the hyperlink — as evidence that a page ought to be associated with a query. When the early webmasters were manually creating directories of interesting sites relevant to their interests, they were implicitly “voting with their links.” Today, as the adopters of tagging systems enthusiastically label their bookmarks and photos, they are implicitly voting with their tags. This is, indeed, “radical” in the political sense, and clearly a source of power to exploit.”

In that way, user participation in the form of social tagging offers a far more powerful means of discovering information and meaning.

Using technology to provide decision support: Although previous generations dreamed of artificial intelligence, with people feeding computers information and receiving answers, we now understand that the roles should be reversed: we are interested in using computer networks to augment human intelligence and make it easier for us to make decisions of our own. This is the key differentiating factor about social computing – it has human agency in it. Whilst computers can present more data, human agency is needed to determine the meaning of the information fragments. That requires us to deliberately model and look at things from different perspectives, then present the data back for human-based interpretation and decision making.

To conclude: “the whole point about technology is to provide decision support for human beings not to make decisions” (Dave Snowden).

Yesterday, I went along to the InformatieProfessional Conference here in Amsterdam. As with all things associated with the web these days, the theme of the conference was Integrity 2.0. Key issues revolved around data privacy, information reliability and management of information overload. Lee gave a great presentation on the use of “Social network as information filters”. Here’s what he had to say:

The affordances of the social web allow us to build a new relationship with each other and with information.  New forms of media consumption and the architecture of participation hold important implications for information management:

Sharing as a by product of action: During the 1990s we saw a rise in interest in KM, which came up with a host of ideas that were never implemented.  The problem lay in the precepts on which knowledge ‘management’ was built, i.e. that people could and should share because sharing is a good idea.  But people are fundamentally lazy and selfish.  They don’t share unless they have to.  And even if they wanted to, the tools available to them have been so difficult to use and unfit for the purpose that they themselves created barriers to participation.  Now, we have the ability to support effective sharing by placing flexible, user-friendly social tools like wikis, blogs and status updates ‘in the flow’ of people’s daily work.  Contributing to the collective intelligence of the organisation takes no extra effort and flows from the very activities necessary for people to get their jobs done.

Socialisation of information: The second phenomenon is that social computing makes invisible data visible.  Information that was previously private or hidden in databases or behind firewalls is being ‘socialised’.  A key difference that the social web holds for information professionals is the way it enables individuals to manage their feeds and flows of information. It also offers new ways of aggregating information, which provides new levels of meaning and adds significant value.  For instance, we have seen how Google employs page visits to feed back into and improve page rankings, which acts as an extremely effective recommendation system. Even in the absence of social tools or sophisticated algorithms, organisations have a wealth of metadata available to them – what people are reading, their browsing and searching behaviours, time spent on pages, and so on.  However, to date they have been terrible at surfacing this information, wasting extremely valuable data.
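
As an illustration of how much can be surfaced from metadata that already exists, here is a minimal sketch, assuming nothing more than a hypothetical intranet access log of (user, page) views. It derives a simple ‘people who read this also read…’ signal; the data and names are invented for the example.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical access log: (user, page) pairs already sitting in most intranets.
views = [
    ("alice", "pitch-template"), ("alice", "client-x-brief"),
    ("bob", "pitch-template"), ("bob", "fee-guidelines"),
    ("carol", "pitch-template"), ("carol", "client-x-brief"),
]

def also_read(views):
    """Count how often two pages are read by the same person (simple co-visitation)."""
    pages_by_user = defaultdict(set)
    for user, page in views:
        pages_by_user[user].add(page)
    co_counts = defaultdict(int)
    for pages in pages_by_user.values():
        for a, b in combinations(sorted(pages), 2):
            co_counts[(a, b)] += 1
    return dict(co_counts)

print(also_read(views))
# {('client-x-brief', 'pitch-template'): 2, ('fee-guidelines', 'pitch-template'): 1}
```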

Rapid feedback is key to evolution: These days we see thousands of new internet businesses launch, and they expose themselves to constant feedback.   That means they will fail quickly or succeed because they are robust and responsive to market conditions.  Inside companies, the situation is very different.  Traditional static intranets don’t have the same evolutionary forces at play.  Information is simply broadcast and there are no feedback loops through which people can signal their preferences, and improve or change what they are being given.  Internal tools are evolving very slowly compared to their web-counterparts, because the lessons of the web about the need for interaction, transparency and feedback are not being applied in the enterprise context.

Networked productivity: Companies also need to move beyond the obsession with personal productivity and look to networked productivity.  That requires more and better information sharing, and its aggregation to create ambient intelligence.  We are still exploring and tapping into the great source of value of networks in the enterprise.  Consider for instance the ways in which following people on twitter, reading blogs, discovering new information via Digg or delicious tags makes us more productive collectively.

What does this mean for managing information?

The answer often results in a binary debate about focusing on experts on the one hand and the wisdom of crowds on the other.  Each of these views is too simplistic on its own, but they are not mutually exclusive.  If you were to look on Google for the best restaurants in New York City, page-rank-driven user recommendations would provide you with a set of de facto facts about the ‘best restaurants’ based on people’s search and browse behaviours.  We don’t know they are the best, but we do know that enough people have clicked on the page to make it worth considering.  On the other hand, WolframAlpha seeks to establish a fact, but the problem is that it hasn’t got a clue about how to do so.  The ‘fact’ simply can’t be established through semantic data because there are different ways of establishing what is ‘true’ in this context.

So which do we use: the individual or distributed model?  On the one hand, gurus like Steve Jobs commonly do an outstanding job of deciding what it is that everybody will have and love.  On the other, we have the development of the Linux kernel using distributed expertise.  Two equally powerful scenarios.  However, recently we have also seen examples of experts testifying in trials based on their interpretation of information behaviour, and getting their opinions very wrong.  Similarly, in healthcare, we are being advised that what was said to be good for us yesterday is not good today ‘in light of what we now know’.  Being open to interpretation, knowledge and truth mean different things to different people and change over time.

That has ramifications for the way we manage information – using networks and human signals to improve information findability:

  • Findability: Making information progressively easier to find is much better than relying on search alone.  Whilst some companies look to black-box solutions like Autonomy to find the ‘right’ answer, others are using social tagging to build an accurate picture of what information is and isn’t important in their systems.  Leaving trails is a far better way to find information.
  • Human signals: Signals are a very powerful way of validating information. Working through our networks, we see what people have read, commented on or voted for the most, and use that contextual information to help guide us in our search for our ‘facts’ or meaning (a minimal sketch combining tags and signals follows this list).
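
Here is one way the two ideas might be combined, offered as a hedged sketch only: hypothetical items carry community tags plus simple human signals (reads, comments, votes), and results for a tag are ordered by the strength of those signals. The weightings and field names are illustrative, not a prescription.

```python
# Hypothetical items: community tags plus the human signals colleagues leave behind.
items = [
    {"title": "Merger checklist", "tags": {"m&a", "checklist"},
     "reads": 120, "comments": 8, "votes": 15},
    {"title": "Old merger memo", "tags": {"m&a"},
     "reads": 5, "comments": 0, "votes": 0},
    {"title": "Holiday rota", "tags": {"admin"},
     "reads": 300, "comments": 2, "votes": 1},
]

def rank(items, query_tag):
    """Keep items carrying the query tag and rank them by their human signals."""
    def signal(item):
        # Weight active signals (comments, votes) more heavily than passive reads.
        return item["reads"] + 5 * item["comments"] + 10 * item["votes"]
    return sorted((i for i in items if query_tag in i["tags"]), key=signal, reverse=True)

for item in rank(items, "m&a"):
    print(item["title"])
# Merger checklist
# Old merger memo
```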

If we continue to manage information as we did in the past we will inevitably create information overload and increasing sources of frustration for our consumers. In the past, the job of information managers was to codify and store information.  Most of the metaphors surrounding this work related to putting information into boxes.  This approach is not robust or scalable and leads to filter failure. We need to move away from the obsession with storage, and towards weaving a fabric of information through which people operate.  Notably, the connective tissue (e.g. signals, links and tags) is as important as the information it points to.  All of this is based on people who, by their actions, indicate what they think is important and useful.

It is this human-generated web of information that is the only effective way of dealing with the information deluge.  Every day, we have too much information pushed at us via email.  We sit like Pavlov’s dog waiting for the tinkle to alert us to the arrival of new mail, only to dutifully go to our inbox (and salivate) over what usually turns out to be spam.  This is a disturbing productivity drain.  Too much of the wrong kind of information commands people’s attention. In addition, most enterprise communication and collaboration tools cannot distinguish between the variable velocity and life span of information.  Which information is current only in the moment, and which has more durable and lasting significance?

To cope with these problems, we need better filters and better radars.  Your ‘filters’ are your network – including Twitter, Delicious, Digg, StumbleUpon, etc. – signalling links or sites you should read because people you trust think they are important.  But using your network as a filter, in isolation, can lead to group think, as you tend to be attracted to people with similar interests, views or roles. In-built bias is not a bad thing as long as you have other mechanisms for finding new information.  This is where your ‘radar’ comes in.  It comprises alerts, searches and smart feeds, which are always on the lookout for new stuff.  The combination of the two is needed to capitalise on ambient awareness.
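
As a rough sketch of how filters and radars might be combined, suppose (hypothetically) you have one list of links flagged by trusted contacts and another produced by standing alerts and smart feeds; merging them and surfacing items that arrive via both gives a simple ambient-awareness reading list. The sources and URLs below are invented for illustration.

```python
# Hypothetical inputs: links flagged by your network (filters) and links matched
# by standing alerts and smart feeds (radar).
network_picks = {
    "http://example.com/ruling-on-x": ["colleague_a", "colleague_b"],
    "http://example.com/deal-commentary": ["colleague_c"],
}
radar_hits = {
    "http://example.com/ruling-on-x": "client X alert",
    "http://example.com/new-regulation": "insolvency feed",
}

def reading_list(network_picks, radar_hits):
    """Merge both sources; items surfaced by network AND radar float to the top."""
    merged = {
        url: {"via_network": url in network_picks, "via_radar": url in radar_hits}
        for url in set(network_picks) | set(radar_hits)
    }
    return sorted(merged.items(),
                  key=lambda kv: (kv[1]["via_network"], kv[1]["via_radar"]),
                  reverse=True)

for url, sources in reading_list(network_picks, radar_hits):
    print(url, sources)
```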

In fact, one of the main purposes of knowledge management is to help people find good information on which to make better decisions.  This involves far more than processing email, memos and other document-centric communications.  People are incredibly adept at receiving and processing ambient information.  In the office we overhear other people’s conversations, we see what people are working on, we receive snippets of news from our feeds or the paper, and so on.  This information is constantly feeding our consciousness, and the human brain has evolved to process these huge volumes of fragmented, ambiguous information.  But if people constantly have their noses in their inbox, or they are forced into document-centric models of information sharing, they are cut off from valuable information sources and flows.

Online social networking acts as an excellent operational information filter.  We are used to connecting with people and exchanging information in shared spaces, and this behaviour is reflected online in social and business networking sites like Facebook and LinkedIn.  Instead of going to Google to search for the best restaurants in NYC, people now go to their network and get better, more relevant results.

These activities socialise the information, along with the language and meaning.  An experiment run by the Sony computer lab used robots to describe images projected onto a wall.  The robots had to rapidly learn how to communicate with each other to come up with a description.  They found that at the beginning of the experiment, the number of words used for a concept was quite large but declined over time as the robots negotiated meaning and converged on the designated concept. The finding: Polysemy declines rapidly for new concepts as dominant terms emerge.

Likewise, the process of social tagging is fascinating, especially its effect on interactions and understanding.  As we label our information, we find people who share our perceptions or interests, or we even add new meaning through the label itself. This is the power of folksonomies over taxonomies, which for decades have made information all but impossible for most people to find.  Instead of trying to structure everything and remove all ambiguity, we should use a top-down categorisation system for things that are broadly correct (e.g. regions, products, practice areas) and, below that, allow human-generated emergent metadata like labels to act as a more effective, social way of navigating through information, letting the structure of the language come from people in the field.
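
A minimal sketch of that hybrid approach, with invented documents and field names: a small fixed taxonomy (region, practice area) sits on top, and free-form user tags provide the emergent layer underneath for narrowing and discovery.

```python
# Hypothetical documents: fixed taxonomy fields on top, free-form user tags beneath.
documents = [
    {"title": "Share purchase precedent", "region": "UK", "practice": "corporate",
     "tags": {"spa", "warranties", "useful-template"}},
    {"title": "Distressed debt note", "region": "UK", "practice": "banking",
     "tags": {"restructuring", "useful-template"}},
]

def browse(documents, practice=None, tag=None):
    """Narrow by the fixed taxonomy first, then by emergent user tags."""
    results = documents
    if practice:
        results = [d for d in results if d["practice"] == practice]
    if tag:
        results = [d for d in results if tag in d["tags"]]
    return [d["title"] for d in results]

print(browse(documents, practice="corporate"))   # ['Share purchase precedent']
print(browse(documents, tag="useful-template"))  # both documents surface
```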

For information professionals, this means moving from tending boxes and labels to becoming information networkers.  It means being guides rather than gatekeepers.  Information professionals need to share 21st-century competencies with people, helping them to use their networks as filters and to establish their radars, giving greater control to the individual.  All of this points to a much more interesting future role for information professionals.

In today’s complex and turbulent environment, organisations need ‘foresight’ to be able to respond promptly to various change drivers including technology, sustainability, globalisation and the economy. Essentially ‘foresight’ is a participative approach to creating shared long-term visions to inform short term decision-making processes (see http://www.foresight-network.eu/). During a recent webinar, Dave Snowden (Cognitive Edge) and Mike Jackson (Shaping Tomorrow) outlined how horizon scanning and social computing can help organisations plan for the future, protect themselves against unexpected threats and exploit forthcoming opportunities.

Jackson outlined the following five components of foresight development and managing change:

  • ID and Monitor Change: Identify patterns from the stories, fragments of information and behaviours of many participants in a system or network and decide how those patterns impact business.
  • Critique implications: Inform the impact assessment with a cross-section of information not just intelligence regarding one’s own industry. That means monitoring much more change and developing a better peripheral vision to be able to understand the broader implications for the business.
  • Imagine difference: Establish the risks and alternatives for different scenarios.
  • Envision preferred route forward: Establish where you are, then determine where you want to go by scanning plausible, possible and probable ideas and changes for the future.
  • Plan and implement: Identify goals, resources, strategies and stakeholders required to create change and help them cope with the inherently uncertain future.

The participatory, evolutionary and social nature of ‘foresight’ development makes for a snug fit with social computing (i.e. the simpler, more networked online applications that connect people and allow them to pool their knowledge and interact better with those in their network). More specifically, social computing enables the content and online interactions to constantly shift so as to better reflect the knowledge, ideas, opinions, preferences and even aspirations of all contributors. Not only does this allow us to develop a better radar of what is happening across our network, it also provides us with a higher level of collaborative intelligence: a range of opportunities and outputs that could not be created by any number of individuals or small groups working alone.

(Interestingly, these are also features of complex adaptive systems (emergent, highly connected and simple on the micro level; complex and unpredictable on the macro level) which evolve through rapid feedback loops making them highly adaptive to changing conditions.)

Snowden picked up here, talking about complexity theory, the creation of human sensor networks and the need to manage the evolutionary potential of the present as an alternative to traditional scenario planning. Referring to his recent blog Think anew, Act anew: Scenario Planning, Snowden cited a wonderful quote from Seneca:

“The greatest loss of time is delay and expectation, which depend upon the future. We let go the present, which we have in our power, and look forward to that which depends upon chance, and so relinquish a certainty for an uncertainty.”

This quote emphasises that what matters now is managing the present by switching from “fail safe design strategies to safe fail experimentation”. That involves the use of small early interventions in the form of exploratory behaviour, allowing the ones with good results to be amplified and the ones that don’t work to be eliminated.

Snowden went on to outline the three fundamental consequences of complexity theory which need to be present to manage a complex system, and which he covers in great depth, along with a critique of horizon scanning and scenario planning, in the post mentioned above.

  • Need for distributed cognition. The crux of this is decentralisation and mass participation. The idea that the few can decide for the many, whether it be drawing up scenarios, selecting technology, imposing structures or the like, is inherently unstable. Instead, we need to start to use large numbers of people to feed decision making processes with current information and diverse perspectives.
  • Fragments are key. Material that is used must be finely granulated. A big problem with traditional scenario planning is that it produces ‘chunked’ reports. The human brain has evolved to handle fragmented data patterns – pictures, comments, captions, labels, etc. One of the reasons social computing is so successful is that it presents information in multiple inter-threaded fragments, so that the brain can ‘conceptually blend’ those and link those fragments to determine how to move forward. Documents don’t tune in to the evolutionary nature of humans. Fragmented information has evolutionary advantage.
  • Overcome disintermediation. People making decisions about the future have to have direct contact with raw intelligence. They can’t afford to have middle management or processes mediating, summarising or grouping information. The information must come from trusted sources and permit interaction with those sources (so the information can be validated).

In Part II of this post I will look at some of the implications of this participative evolutionary approach for traditional current awareness and information creation and categorisation processes.

To what extent does your company facilitate social networking between employees split by geographical or organisational distance, or with (existing or potential) clients and business partners?  What’s the value of this social capital to the company (i.e. the connections within and between social networks as well as connections among individuals)?  How does it change the nature of the opportunities and constraints each person faces, and the flow-on effects to the team and company as a whole?

IBM recently published its research surrounding Beehive (an experimental internal platform designed to blur the boundaries of work and home, professional and personal, and business and fun).  The report provides empirical evidence of the power of nurturing social capital in the enterprise. IBM Social Networking Research.pdf

The researchers studied issues associated with adoption, usage, motivations, and impact of social networking in the workplace, and they found that:

[E]ven with limited use of Beehive, over a relatively short amount of time, there are associations between types of usage and … different types of social capital. When someone is using Beehive for meeting new contacts, they report a greater interest in making these types of contacts at the company in general.

When someone is using Beehive for keeping up with known colleagues, both in their workgroup and in their extended network of loose ties, they report having closer ties with their immediate network (bonding social capital), a higher sense of citizenship (willingness to help the greater good of the company), and greater access to both new people and expertise within the company [(bridging social capital)].

And finally, the more intensely someone uses Beehive (meaning more frequent visits and stronger associations with the community on the site) the higher they report their social capital is, across all measures. They have closer bonds to their network, they have a greater willingness to contribute to the company, they have a greater interest in connecting globally, have greater access to new people, and a greater ability to access expertise.”

As IBM has illustrated with its customised development of Beehive, supporting social networking in the organisation means more than simply bringing in-house the functionality of (public) social networking tools.

Instead, social networking functionality should be integrated not only with existing information systems, but also with the particular needs of the organisation to enable people to grow informal networks which exist alongside formal structures, and fully exploit the wealth of information and expertise circulating in and around the organisation.  The latter is very difficult for public social networking sites to deliver.

As with any change initiative, building the right adoption models is just as important as building the right architectural/technical models.  Adoption models raise important issues around the situating of social tools, control of people’s (private) information and discussions, and building on existing networking behaviours, to ensure that levels of information flow and control match needs, cultures and expectations.

Here are a few thoughts in that regard:

  • Well ‘situated’ social tools: This is a concept that we’ve talked a lot about (e.g. here, here and here) as it helps in lowering the barrier to adoption.  By ensuring the networking platform is well integrated with key existing information systems and social tools, trails are automatically created as people contribute to and work with information; when aggregated in profile/personal pages, these trails automatically reflect people’s social network and information connections.  The information is constantly refreshed and kept current without extra effort on the part of the individual user.  People can readily identify who’s working with whom on what, or who is connected with whom and may be able to make an introduction or support a proposal/project idea.
  • Technology and communication preferences: To maximise involvement, tools need to be made available which reflect people’s preferences for technology and communication style.  As we are seeing from the public domain, there should be a greater emphasis on presence sharing, status updates and other ad hoc style exchanges during people’s work, which can be rapidly embellished and/or responded to by others.  These quick fire exchanges can then form a feed of information in the same way that friendfeed streams information.
  • Professional and personal ‘identities’: For some people, the line between professional and private lives is increasingly blurred.  But we can’t assume that professional and personal identities will merge comfortably.  As Doug Cornelius points out, as colleagues and clients become friends, we may want to share information with them that we don’t want to share with others.  In the same way that some people use Facebook to keep in touch with their friends and LinkedIn for their business contacts, people should have the ability to manage ‘professional’ or ‘public’ and ‘private’ profiles in a way which suits their desired level of openness or privacy.
  • Intelligent social networks:  To be even more useful, the networking system needs to give us a little bit of extra information – like a pat on the back for having participated.  For instance, we should see not only ‘who is connected with who’, but also the proximity of people’s connections based on shared attributes, such as tags, groups, communities, and signals based on RSS from social news-reading and interactions (e.g. visits to or comments on posts).  So if we give a little, we get a lot (a minimal sketch of such a proximity measure follows this list).
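
One simple way to express that proximity, offered as an illustrative sketch only: treat each person’s tags and group memberships as a set of attributes and measure the overlap (Jaccard similarity) between two people’s sets. The profiles and attribute names below are hypothetical.

```python
# Hypothetical profiles: attributes each person accumulates through everyday activity.
profiles = {
    "anna": {"tags": {"m&a", "energy"}, "groups": {"london-office", "deals-team"}},
    "ben":  {"tags": {"m&a", "restructuring"}, "groups": {"deals-team"}},
    "cara": {"tags": {"ip"}, "groups": {"sydney-office"}},
}

def proximity(a, b):
    """Jaccard overlap of two people's combined attributes: 0 (nothing shared) to 1 (identical)."""
    attrs_a = profiles[a]["tags"] | profiles[a]["groups"]
    attrs_b = profiles[b]["tags"] | profiles[b]["groups"]
    return len(attrs_a & attrs_b) / len(attrs_a | attrs_b)

print(round(proximity("anna", "ben"), 2))   # 0.4 -- they share 'm&a' and 'deals-team'
print(round(proximity("anna", "cara"), 2))  # 0.0 -- no shared attributes
```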

Adoption issues aside, another sticking point for getting top-down buy-in for a social networking project in the company is the difficulty of measuring the value of that social capital.  This was one of the caveats the IBM research team highlighted in their report, i.e. the results are indicative of a relationship between usage and the measures of social capital they used, but not a causal one.  As Bill Ives rightly points out, the next steps for us will be to see how we can illustrate the relationship with improved performance and bottom-line results.  Your thoughts on this, as always, are welcome!

Thanks to Bill Ives for reporting the IBM research.
