Copyright © 1999 Perpetuity Press Ltd Page 67
Crime Prevention and Community Safety: An International Journal
Surfing the Crime Net
by Daniel Gilling
Surfing the Crime Net is intended to be a regular feature of this journal. Its purpose is to
review the sources of information about crime prevention and community safety which are
currently available on the World Wide Web (WWW). The review is intended to be an interna-
tional one, examining English-language material from across the globe.
As thousands of subscribers daily come on-line, plugging in to the Internet and specifically
the WWW, so its status as a source of information appears to grow, and it is regarded less and
less as a solitary refuge for the computer bore. The WWW is proving its worth as a source of
information for teachers, researchers and practitioners alike and this is as true in the field of
criminal justice as it is in many other subjects.
Unfortunately, the WWW is also replete with commercialism and information of dubious
quality (although thankfully not in my experience pornography, as many sceptics appear to
presuppose), and its massive, unregulated expansion inevitably means that if you do not know
what you are looking for, or where to look, you can spend unproductive hours surfing the net
without reward. This feature intends to make the surfing more rewarding and less time-con-
suming, by directing readers towards the more useful and higher quality sources of informa-
tion.
It will do this both by listing sources which may be of interest to crime prevention and com-
munity safety practitioners and academics, as well as by engaging in more detailed reviews of
individual web sites, or web sites grouped around particular themes. In the case of listing
sources of interest, it comes with the health warning that web sites can disappear as fast as
they appear: what is here today may be gone tomorrow, and there is no guarantee of web sites
remaining open indefinitely.
Finally, before we get started, it is worth stressing that this feature is not generally intended as
a technical ‘how to do it’ guide, not least because such a feat lies beyond the skills and knowl-
edge of this novice user. In the main, the technical information will extend only to passing on
the addresses of the Web pages, sometimes referred to as Uniform Resource Locators (URLs),
and generally expressed in the format http://www.blah.blah.htm. That said, for this particular
contribution only, a small number of very simple points or tips may well be worth making at
this juncture. Apologies are made in advance for preaching to the converted:
1. Try to use the Internet when the USA is asleep. When Americans wake up and come on-
line there is a noticeable slowdown in WWW operations, and it takes a lot longer for
documents to download. To some extent you can mitigate the effect of the more
elaborate WWW pages by switching off the pictures (for example, by clicking off ‘auto
load images’ in the options menu in Netscape).
2. When you find particularly good sources of information or links — such as the Univer-
sity of Cambridge’s Institute of Criminology links to other sites of criminological
interest (http://www.law.cam.ac.uk/crim/CRIMLINK.HTM) or Sociorealm’s ‘Criminology
on the WWW’ (http://www.geocities.com/~sociorealm/crime2.htm) — use your book-
mark facility so that you do not have to retrace your steps every time you want to access
them.
3. If you want to print off what you find, then depending upon the capacity of your compu-
ter and printer, and the size of the web document, it is often easier to copy and paste the
WWW pages into a word-processing document first. This has the added advantage that
you are then able to edit the pages for your own use.
4. If you are searching for a specific topic or source of information, then rather than
browsing aimlessly between links, and wasting several hours in the process, use an Internet
directory with a search facility, such as National Information on Software and Services
(NISS at http://www.niss.ac.uk), or a search engine such as Lycos (http://www.lycos.com)
or Alta Vista (http://www.altavista.digital.com).
That’s enough of the technical details. For the remainder of this first review I shall devote my
attention to a single but monumental document, running to several hundreds of pages (http://
www.ncjrs.org/works/index.htm). The full title of the document is Preventing Crime: What
Works, What Doesn’t, What’s Promising, and it was produced in 1997 as a report for the
United States Congress on behalf of the National Institute of Justice, authored by Lawrence
Sherman and a number of other American contributors. The fact that this report has been
available on the Internet since 1997 means that it is well known to many academics. Neverthe-
less, its importance is such that it deserves scrutiny at this time, and particularly in the light of
the UK Government’s passing of the 1998 Crime and Disorder Act.
The purpose of the report is to provide a critical assessment of the state of knowledge regard-
ing the effectiveness of a range of different crime prevention interventions across a range of
seven different but interrelated institutional contexts. These institutional contexts — specifi-
cally communities, families, schools, labour markets, places, policing, and criminal justice —
each merit a substantial chapter in the main body of the report, and in each of these chapters
the scientific evidence is reviewed to determine signs of effectiveness.
In essence, the report sets out to assess the nature and quality of scientific evidence — all of it
US-based — of the effectiveness of crime prevention in different institutional settings, in
order that federal resource allocation decisions might be informed by an unambiguous assess-
ment of where resources, which total close to four billion dollars in annual grants, might most
effectively be deployed. The review is not confined to evaluations of projects where federal
funding has been deployed to date, and also takes in evidence from projects funded through
alternative sources, whether public, private or voluntary.
The approach taken by the report’s authors is first to establish a scientific yardstick against
which project evaluations may themselves be measured. This yardstick is assembled via the
construction of a ranking schema, which ranks evaluations on a scale of 1 to 5, with 5 being
the gold standard. The higher one moves up the ranking, the more rigorous the study in terms
of the number of evaluative prerequisites that it includes. The most important of these relate to
such things as the need for experimental and control groups/areas to be similar in profile and
structure; the need for control groups/areas to be randomly selected; the need for reliable
correlation test methods; and the need for the appropriate ordering of cause and effect.
Having established the yardstick or measurement tool, the authors are then forced to
compromise the very highest scientific standards because — unsurprisingly, for those aware of the
general state of crime prevention evaluations — so few evaluations match the very highest
ranking, with none obtaining the gold standard. Instead, the authors suggest quite reasonably
that where a crime prevention measure has obtained at least two separate 3-ranked evaluations
demonstrating success, it may be assumed to be something that works. By the same token,
those measures obtaining at least two separate 3-ranked evaluations showing failure may be
deemed not to work; those achieving a single 3-ranked evaluation demonstrating success may
be characterised as promising; and those achieving no such ranked evaluation cannot be as-
sumed either to work or not to work.
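The decision rule just described can be captured in a few lines of code. The following Python sketch is purely illustrative and not drawn from the report itself; it reads a ‘3-ranked’ evaluation as one scoring 3 or above on the scientific methods scale, and it sets aside the report’s subtler handling of conflicting bodies of evidence.

```python
# Illustrative sketch (not from the report) of the 'what works' decision rule.
# Each evaluation of a measure is a (rank, succeeded) pair: rank is the 1-5
# scientific-methods score, and succeeded records whether the evaluation
# demonstrated success. 'Rank 3' is treated here as rank 3 or above.

def classify(evaluations):
    successes = sum(1 for rank, ok in evaluations if rank >= 3 and ok)
    failures = sum(1 for rank, ok in evaluations if rank >= 3 and not ok)
    if successes >= 2:
        return "works"
    if failures >= 2:
        return "does not work"
    if successes == 1:
        return "promising"
    return "unknown"  # evidence too weak to judge either way
```

So, for example, two rigorous evaluations showing success would classify a measure as something that works, whereas two evaluations ranked below 3 would leave it unknown, however positive their findings, because neither is rigorous enough to count.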
When this ranking schema is applied to crime prevention measures in each of the institutional
contexts covered in the main body of the report, the result is the finding that not as much is
known about what works and what does not as could and perhaps should have been the case.
It becomes immediately apparent, especially in some of the institutional contexts where
evaluations inevitably tend to be more difficult, that evaluations are frequently not conducted
at all, or not conducted well or with much scientific rigour. This is attributed by the
authors of this report in large part to the limited availability of funding for the purposes of
project evaluations. As they observe, ‘the major limitations on better crime prevention evalu-
ations today are not technical, but statutory’. That is to say that there now exists a considerable
knowledge of and expertise in the science of evaluation, but too frequently projects are resourced
without sufficient regard to or funding of the evaluation dimension, and with predictable con-
sequences.
The authors use this fact to make an important point to the National Institute of Justice. If
federally-funded programmes are resourced enough to facilitate proper and rigorous scientific
evaluations, then they will have a value well in excess of their financial cost, showing more
clearly than hitherto which crime prevention measures, in which institutional contexts, are
most effective. Such information allows lessons to be learnt.
The report’s findings are inevitably equivocal, given the weakness of a large proportion of the
evaluations, or their complete absence, yet there is more than enough here to indicate a few
important conclusions. It is apparent, for example, that crime prevention in institutional set-
tings other than policing and criminal justice shows enough promise to justify a decisive shift
away from the current over-reliance on these two as the mainstays of federally-funded crime
prevention — yet the authors acknowledge that crime prevention is not always the sole goal of
this federal funding. Also, it is apparent that crime prevention in one institutional setting is
interrelated with crime prevention in another, with the one reinforcing the other. This is a
simple, easily illustrated point, helping to explain why multiple strategies are often more
effective than single measures, even though it makes it more difficult for the evaluator to separate
chains of cause and effect.
The findings of the report, about the poor state of evaluations generally, and the resultant lack
of knowledge of ‘what works’, are especially timely for those seeking to devise and imple-
ment crime prevention strategies in the UK in the wake of the 1998 Crime and Disorder Act.
This hastily conceived legislation clearly expects evaluation to be built into the strategies, and
yet with no additional central funding earmarked for crime prevention measures themselves,
let alone their evaluation, the prospects do not look good, and so the sound advice of this
report will probably pass unheeded too often.
This is particularly likely to be the case given the early signs that partnerships are basing their
identification of strategic priorities on very limited data sets, which will make
quality pre-test/post-test comparisons very difficult. In the context of the haste to identify
strategies, data collection has tended to be of the ‘quick and dirty’ variety, although there will
probably be some exceptions.
This kind of approach to crime prevention has been encouraged by governments that, from the
mid-1980s onwards, have increasingly spoken the managerialist language of ‘audit’ rather
than ‘evaluation’. The report reviewed here explicitly states that it is about scientific evalua-
tion, not audit of effort. The two are not the same, with the latter threatening to replace a
complex analysis of cause and effect with a simple counting procedure which may be taken to
imply a great deal about performance, but proves nothing.
The report is undoubtedly a useful contribution to knowledge about crime prevention, and it is
good to see a study that prioritises the science of evaluation, and recognises its intrinsic value
to policy makers. However, in giving more attention to evaluation the authors also explicitly
distance themselves from policy analysis: they say the report is about the former and not the
latter. Yet this is a problematic distinction to make. Evaluations take account of contexts in so
far as comparisons are drawn between experimental and control groups/areas, but the policy
process is also a part of the context, and thus the distinction that the authors make does not
hold. Moving from macro through to micro-levels of analysis, the policy context has a very
strong influence upon what is implemented and what is achieved. It is apparent, for example,
that at the state level different infrastructures facilitate and inhibit different forms of crime
prevention; while at the institutional level inter-organisational relations, and at the individual
level interpersonal relations, are both important contextual variables. As the realist research
agenda of those such as Pawson and Tilley argues and seeks to demonstrate, evaluations need
to take account of the complex contexts in which mechanisms are fired. It is surprising how
often a little policy analysis leads one to change one’s understanding of these mechanisms: the
mechanism is often mutated within the policy process, and thus what one thinks one is evalu-
ating, and what one actually is evaluating, may be two different things. The gap between
claims and deeds may be great indeed.
There is a danger that this report may be read as a compendium of off-the-shelf packages, at
least for those programmes that appear to work. However, while it contains a lot of informa-
tion and good ideas, it needs to be recognised that, as with the probation literature on ‘what
works’, there is no guaranteed formula. Some things work in some conditions at some times.
This does not mean we abandon evaluation as wasted effort. Rather, we recognise the need for
rigour, and we recognise that science, like equal opportunity employers, is always in the proc-
ess of working towards solutions. There are few laurels to rest upon in crime prevention.
Daniel Gilling
Senior Lecturer in Social Policy
University of Plymouth