Weak Signals and Text Mining I – An Introduction to Weak Signals

“Weak Signals” is a rather fashionable term in parts of the future-watching community, although it is ill-defined, as evidenced by the lack of a specific entry in Wikipedia (there is only a reference under Futurology). There is an air of mystique and magic about Weak Signals Analysis that turns some people off, me included, but I have come to the conclusion that a sober interpretation of the idea can be provided. This is what we are trying to do in a work package led by the Zentrum für Soziale Innovation (strapline “all innovations are socially relevant”) in the TELMap project. This work combines two approaches: one involving direct engagement with people, our “human sources” track, and one looking at “recorded sources”, i.e. existing written texts. My area of interest, and that of colleagues at RWTH Aachen University, is in the recorded sources. This post provides an introduction to the work and, I hope, a sober interpretation of “weak signals”; a following post will outline some initial ideas about how text mining might be used.

A Weak Signal is essentially a sign of change that is not generally appreciated. In some cases the majority of experts, or people in general, would dismiss it as irrelevant or simply fail to notice it. In these days of social software and ubiquitous, near-instantaneous global communication, the focus is generally on trends, memes, etc. Thought leaders of various kinds – individuals and organisations – wield huge power over the focus of attention of a following majority. The act of anticipating what the next trend or meme will be could be construed as looking for a weak signal. There are a number of problems with identifying weak signals, and a naive approach is bound to fail; for example, asking people to “tell me some weak signals” is equivalent to asking them to describe something they think is irrelevant but which might turn out to be important. Neither can you ask the experts, by definition. The point here is that the person who spots a sign of change may well be an outsider, on the periphery, or part of a despised sub-culture.

In spite of Weak Signals being a problematic concept, the fact remains that anticipating change would give an innovator an advantage and could help an agent in the mainstream avoid being blind-sided. To make even a small contribution here is part of the mission of both TELMap and CETIS. Our intention is to divert some attention away from the hot topics of the day and to discover some neglected perceptions or ideas that are worthy of more attention, both social attention and analytical investigation. This intention, and an assertion that we only ever consider Possible Weak Signals, is my “sober interpretation”. There is no magic here, no shamanic trance leading to revelation.

There is ample literature around the topic of Weak Signals, but I will mention only a couple of sources. Elina Hiltunen is a well-known figure; see, for example, her slides and references (pdf), in which she gives an informal checklist for weak signals (quoted with minor changes to the English) that should be viewed as indicative of necessary rather than sufficient criteria:

  1. Makes your colleagues laugh (ridicule)
  2. Your colleagues are opposing it: no way, it will never happen
  3. Makes people wonder
  4. No one has heard about it before
  5. People would rather that no-one talked about it anymore (a taboo)

Two more Finns, Leena Ilmola and Anna Kotsalo-Mustonen, discuss the importance of filters: “When monitoring their operating environments for weak signals and for other disruptive information companies face filters that hinder the entry of the information to the company”. Substituting “technology enhanced learning community” for “company” gives us our initial problem statement. Ilmola and Kotsalo-Mustonen describe three kinds of filter, following earlier work by Igor Ansoff, who is generally credited with introducing the concept of Weak Signals in the 1970s:

  1. The surveillance filter. Colloquially, “just looking under the street-lamp”. The obvious compensator for the surveillance filter in our situation is diversity of recorded sources.
  2. The mentality filter. We tend to notice only things that are relevant to our immediate context and problems. Information overload, and tendencies to conform to social norms and to be influenced by fashion, compound the effects of people working “in the trenches”. By using text mining approaches we hope to compensate for these problems by filtering information in the recorded sources in a mental-model-agnostic manner (a minimal sketch of this idea follows the list).
  3. The power filter. The signals of change that lead to a change of strategy or action do so through an existing power structure and become filtered according to political considerations. Ideas that challenge the status quo are threatening. As with the previous filter, we hope to avoid some of the effect of the power filter, although not entirely. Most recorded sources have already been subjected to implicit (many bloggers self-censor to protect their job/career) or explicit (e.g. journal or magazine articles) power filters.
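To make the “mental-model-agnostic” idea slightly more concrete, here is a minimal sketch, not part of any TELMap tooling, which makes the simplifying assumption that recorded sources can be sorted into a “peripheral” pile and a “mainstream” pile. It surfaces terms that recur at the periphery but are almost absent from the mainstream, and leaves any judgement of importance to the humans who read the shortlist. The function names and thresholds are illustrative only.

```python
import re
from collections import Counter

def term_counts(docs):
    """Count lower-cased word occurrences across a list of document strings."""
    counts = Counter()
    for doc in docs:
        counts.update(re.findall(r"[a-z][a-z-]{3,}", doc.lower()))
    return counts

def candidate_signals(periphery_docs, mainstream_docs,
                      min_periphery=3, max_mainstream=1):
    """Return terms that recur in peripheral sources but are (nearly) absent
    from mainstream ones: a crude shortlist for human inspection, not a
    verdict on whether anything on it actually matters."""
    periphery = term_counts(periphery_docs)
    mainstream = term_counts(mainstream_docs)
    return sorted(term for term, n in periphery.items()
                  if n >= min_periphery and mainstream.get(term, 0) <= max_mainstream)

# Hypothetical usage: texts gathered from niche blogs vs. mainstream journals.
# print(candidate_signals(peripheral_texts, mainstream_texts))
```

The point of such a crude term-frequency comparison is precisely that it embodies no particular mental model of what matters; anything it throws up still has to survive the human filters discussed above.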

The adoption of a text mining approach over a diverse range of recorded sources offers a promising means to draw out some Possible Weak Signals, although I am clear that text mining will be challenging to apply and that it will only be useful in tandem with human engagement. Given an initial list of possible signals, it will be necessary to apply some heuristics, such as the Hiltunen checklist, to try to reduce “noise”. The surviving candidates can then be used to facilitate discussion and disputation, and be cross-referenced with other studies and with the conclusions of our “human sources” track, leading to ever-shorter lists. If we find a few cases where people say “you know what, that isn’t so crazy after all”, or similar, I will consider the activity to have been a success. The next post summarises mainstream text mining approaches, describes how Weak Signals considerations affect the selection of text mining methods, and outlines some ideas for applying text mining to look for possible weak signals.
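By way of illustration only, the Hiltunen checklist could be turned into a crude noise-reduction heuristic. The field names, reaction flags and threshold below are my own invention, not part of Hiltunen’s work or the TELMap plan; the idea is simply to record how a small panel reacted to each candidate signal, so that the survivors go forward to discussion rather than straight to conclusions.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    """One possible weak signal plus how a (hypothetical) panel reacted to it,
    loosely following Hiltunen's necessary-but-not-sufficient criteria."""
    text: str
    ridicule: bool = False    # it made colleagues laugh
    opposition: bool = False  # "no way, it will never happen"
    wonder: bool = False      # it made people wonder
    novelty: bool = False     # no one had heard of it before
    taboo: bool = False       # people would rather not talk about it

    def score(self) -> int:
        return sum([self.ridicule, self.opposition, self.wonder,
                    self.novelty, self.taboo])

def shortlist(candidates, threshold=2):
    """Keep candidates that meet at least `threshold` checklist criteria."""
    return [c for c in candidates if c.score() >= threshold]

# Hypothetical usage with placeholder candidates:
# kept = shortlist([Candidate("candidate A", ridicule=True, opposition=True),
#                   Candidate("candidate B")])
```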

2 thoughts on “Weak Signals and Text Mining I – An Introduction to Weak Signals”

  1. Interesting area I have never thought about. Obviously part of the challenge is about finding significance in the minor fluctuations. Hiltunen’s criteria seem to me one plausible approach to looking for significance, based perhaps on the subconscious connections of the colleagues observed. But to me, an interesting approach would be as an extension to theory or conjecture. If we hypothesise some connection between things that may not have been thought through before, even a text mining approach could give useful feedback on whether there might be something in it.