Closed Environment Testing of ISP-Level Internet Content Filters
Report to the Minister for Broadband, Communications and the Digital Economy
June 2008
Executive summary
This report has been prepared by the Australian Communications
and Media Authority (ACMA) in response to a ministerial direction
received in June 2007 to conduct closed environment testing of
internet service provider (ISP)-level internet content filters. The
purpose of the trial was to assess the current maturity of commercial
filtering products that are suitable for deployment by internet service
providers. The direction is at Appendix A.
A detailed assessment of the state-of-the-art of ISP-level filtering technology is at pages 52—53.

ACMA considers that, under the conditions created for the trial, the state of ISP-level filtering technology has significantly advanced, and stands in contrast with the state of this technology evidenced in the previous trial of filter products commissioned by NetAlert Ltd in 2005. The main indicators of the increasing maturity of ISP-level filtering technology are:
● the number of filter products that are specifically designed to be
deployed by ISPs;
● availability of a number of filter products that produce moderate
to nearly nil performance degradation;
● improvements in accuracy—the products tested exhibited high
levels of successful blocking and low levels of overblocking (that
is, blocking access to content that is intended to be accessible);
● availability of the capability to offer different filtering options to
ISPs’ customers or for customers themselves to customise
filtering; and
● actual deployments of filter products by ISPs in other countries.
The one area showing little sign of advance was the absence in this trial, for the most part, of any capability to filter content carried via non-web protocols.
The findings of this report and the assessment of the state-of-the-art
of ISP-level filter products reflect testing in a controlled laboratory
environment. In particular, the testing simulated a Tier 3 network
(the lowest level of the ISP network hierarchy); different results
might be observed in a real-world Tier 3 network or in networks at
Australian Communications and Media Authority 1
higher levels in the ISP network hierarchy. This is due to variations
in architecture, including hardware used, size and complexity of the
network and traffic demands.
Introduction to the trial
The purpose of the trial was to assess the capability of available
technology to filter illegal or inappropriate content at ISP-level and
advances in filtering technology since the previous trial in 2005.
The report commences with definitions of the four criteria against
which the capability of ISP-level filtering technologies was assessed:
● performance – whether the products degrade internet performance;
● effectiveness – the extent to which the products correctly identify and block illegal content, content that may be inappropriate for minors and innocuous content;
● scope – whether the products are capable of filtering non-web internet traffic; and
● adaptability – whether the products can be customised to apply different levels of blocking according to the preferences of the user.
ACMA was not asked, as part of the trial, to assess the capability of
ISP-level filtering technologies that filter only illegal content.
ACMA was also not asked to investigate the balance of costs and
benefits associated with implementing ISP-level filtering, including:
● capital and operating costs associated with implementing filter
products;
● costs associated with any upgrading of an ISP’s network to
address performance degradation associated with a particular
filter product; and
● the nature and implications of the implementation of ISP-level
filtering for ISPs’ customers.
ACMA also did not assess other matters that may be of relevance to
the efficacy of ISP-level filters in a real-world context, such as:
● the extent to which a filter can be circumvented; and
● the ease with which it is installed, deployed and implemented.
Chapter 1 summarises the findings of previous technical studies of
filtering for the Australian Government and concludes with an
explanation of the different types of filtering techniques.
Project background
Chapter 2 describes the selection of six filter products for testing,
following a public call for expressions of interest from filter vendors.
The method of compiling the Category 1, 2 and 3 lists of URLs is described at pages 20—22.

In accordance with the ministerial direction, the trial was required to test the capability of filter products to distinguish between illegal, inappropriate and innocuous content. To test this capability, three lists of URLs were created as test data:
1. Category 1 was intended to test the extent to which the selected
filter products blocked content on the ACMA prohibited
content list.
2. Category 2 was intended to test the extent to which the selected
filter products underblock, by allowing access to content that
may be regarded as harmful or inappropriate for children but is
not illegal.
3. Category 3 was intended to test the extent to which filter
products overblock by blocking access to content that may be
regarded as innocuous.
Execution of the trial
Chapter 3 describes how an isolated purpose-built network was
established to test performance, simulating both the function of the
internet as a source of content and the function of end users
requesting content. The test network was analogous to a Tier 3 ISP.
Details of the methodology used in testing performance are provided at pages 29—35.

Testing of the effect on network performance of each filter product involved measuring:
● baseline performance of the test network with no filter installed;
● performance of the test network with each filter connected, in
turn, but with no active filtering occurring; and
● performance of the test network with each filter connected, in
turn, and actively filtering.
Three indices representing the performance of each filter product
were calculated from the results of these tests.
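The trial's three performance indices are defined in the methodology chapter. By way of illustration only, the sketch below (the formula shown is an assumption, not the report's actual index definition) computes network degradation as a percentage loss relative to baseline throughput, the quantity the summary results are expressed in.

```python
def degradation_index(baseline_throughput: float, filtered_throughput: float) -> float:
    """Network degradation as a percentage of baseline throughput.

    Assumed formula for illustration; the trial's actual indices are
    defined in Chapter 3 of the report.
    """
    if baseline_throughput <= 0:
        raise ValueError("baseline throughput must be positive")
    loss = baseline_throughput - filtered_throughput
    return 100.0 * loss / baseline_throughput

# A network that drops from 100 Mbit/s to 98 Mbit/s under active
# filtering shows two per cent degradation.
degradation_index(100.0, 98.0)  # 2.0
```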
Details of the methodology used in testing effectiveness are provided at pages 35—37.

Testing of effectiveness involved measuring:
● the effectiveness of a filter in blocking content corresponding to Categories 1 and 2—that is, content that was intended to be blocked; and
● the effectiveness of a filter in distinguishing content from Category 3—that is, content that was not intended to be blocked.
Two indices representing the effectiveness of each filter product
were calculated from the results of these tests.
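The two effectiveness indices fall in the range 0 to 1. As a sketch of the kind of calculation involved (the function and URL lists here are hypothetical; the report's exact index definitions are in Chapter 3), the same rate can be read in both directions: applied to the Category 1 and 2 lists a high value is good, while applied to the Category 3 list a low value is good.

```python
def blocking_rate(blocked_urls: set, test_list: list) -> float:
    """Fraction of a test list that the filter blocked, in the range 0 to 1.

    Hypothetical illustration of the trial's effectiveness indices:
    for Category 1 and 2 lists, higher is better (successful blocking);
    for the Category 3 list, lower is better (little overblocking).
    """
    if not test_list:
        raise ValueError("test list must not be empty")
    hits = sum(1 for url in test_list if url in blocked_urls)
    return hits / len(test_list)

# Hypothetical data: the filter blocked two of four Category 1/2 URLs
# and one of two Category 3 URLs.
blocked = {"http://cat1.example/a", "http://cat2.example/b", "http://cat3.example/x"}
category_1_and_2 = ["http://cat1.example/a", "http://cat2.example/b",
                    "http://cat1.example/c", "http://cat2.example/d"]
category_3 = ["http://cat3.example/x", "http://cat3.example/y"]

success = blocking_rate(blocked, category_1_and_2)   # 0.5
overblocking = blocking_rate(blocked, category_3)    # 0.5
```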
Chapter 3 concludes by describing the methodology used to assess
the scope and adaptability, which involved an expert review of
product documentation and interviews with suppliers of the products
to identify specific features of each product.
Results
Chapter 4 sets out the measurements from the quantitative
performance and effectiveness tests for the filter products and lists
the capabilities of the filter products for scope and adaptability.
Details of the performance results are at pages 39—43.

For the performance test, the percentage results showing the degree of degradation introduced by a filter connected to the test network but not actively filtering (where a low figure indicates a lesser degree of performance degradation) were:
● below 30 per cent for all products; and
● below 10 per cent for five of the six products.
The percentage results showing the degree of degradation introduced
by a filter connected to the test network and actively filtering (where
a low figure again indicates a lesser degree of performance
degradation) were:
● nearly nil (two per cent) for one product;
● in the range 22 to 30 per cent for three products; and
● in excess of 75 per cent for two products.
Details of the effectiveness results are at pages 43—44.

For the effectiveness test, the results showing the degree of success in blocking content corresponding to each of the URLs listed in the Category 1 and 2 lists (where figures fall in the range of 0 to 1 and a high figure indicates a greater degree of success in blocking content that was intended to be blocked) were:
● above 0.88 for all products; and
● 0.94 or above for three products.
The results showing the degree of success in not blocking content corresponding to each of the URLs listed in the Category 3 list (where figures fall in the range of 0 to 1 and a low figure indicates a greater degree of success in not blocking content that was not intended to be blocked) were:
● below 0.08 for all products; and
● below 0.03 for four products.
Details of the scope results are at pages 44—45.

Each of the filter products is able to block traffic entirely across a wide range of non-web protocols, such as instant messaging and peer-to-peer protocols. However, a capability to identify illegal content, and content that may be regarded as inappropriate, carried via such protocols was not found, excepting:
● two products that can identify particular types of content carried
via one email protocol; and
● one product that can identify particular types of content carried
via one streaming media protocol.
Details of the adaptability results are at pages 45—46.

Chapter 4 concludes by reporting that all of the products allow the customisation of filtering policies for groups of users, for individual customers of an ISP and for individual users.
Conclusions
The specific findings for performance and effectiveness in this trial
are substantively different to those of the previous trial.
A comparison of the results for performance in this trial with those of the previous trial is at pages 47—49.

The previous trial reported that, when filters were connected to the test network and actively filtering, performance degradation ranged from 75 per cent to a very high 98 per cent between the best- and worst-performing filter products. In the current trial, the corresponding performance degradation varied across a greater range, from a very low two per cent to 87 per cent between the best- and worst-performing filter products.
[Figure: Network degradation as a percentage of baseline throughput, previous trial vs current trial.]
Although the performance of two of the six products tested in the
current trial was relatively poor, one product generated almost no
network degradation and the remaining three products exhibited low
to moderate levels of degradation in network performance.
The median network degradation of the tested filters dropped markedly, indicating a significant improvement in network performance in the current trial compared with the previous trial. ACMA considers that this improvement in the performance of the filters tested represents a profound advance in ISP-level filtering technology.
A comparison of the results for accuracy in this trial with those of the previous trial is at pages 49—51.

The previous trial reported a difference in the level of successful blocking (that is, the proportion of content that should have been blocked that was actually blocked) between the least and the most accurate filter products in the range 70 to 100 per cent. The corresponding levels measured in the current trial varied across a smaller range, between 88 and 97 per cent, with most achieving over 92 per cent. The median rate of successful blocking improved from the previous trial.
[Figure: Percentage of Category 1 and 2 content successfully blocked, previous trial vs current trial.]
The previous trial reported a difference in the level of overblocking
(that is, the proportion of content that was blocked that should not
have been blocked) between the most and the least accurate filter
products in the range six to 62 per cent. The corresponding levels
measured in the current trial varied across a significantly smaller
range—between one and eight per cent, with most falling under three
per cent. The median overblocking rate was significantly improved
from the previous trial.
[Figure: Percentage of Category 3 content overblocked, previous trial vs current trial.]
An assessment of the ability of ISP-level filters to control non-web content is at pages 44—45.

Despite the general advance in ISP-level filtering technology between the previous trial and the current trial, most filters are not presently able to identify illegal content, or content that may be regarded as inappropriate, carried via the majority of non-web protocols, although development work by filter vendors is underway in this area. This is despite developments in the use of internet technologies that have led to increased use of non-web protocols such as instant messaging and file-sharing.1
1 See Chapter 2 of Australian Communications and Media Authority, Developments in Internet Filtering Technologies and other Measures for Promoting Online Safety, February 2008.
Chapter 1: Introduction
Overview
Chapter 1 sets out the terms of reference for the trial and specifically describes the criteria against
which the filter products tested in the trial were evaluated – performance of filters in an ISP’s
network, effectiveness of the filters in blocking particular content, scope or range of internet
content on which filters were able to operate and adaptability to the specific requirements of ISPs
or their customers. The purposes served by evaluating ISP-level filtering products against these
criteria are set out. The chapter goes on to outline the previous Australian Government technical
studies on internet filtering and presents a summary of their respective findings and concludes by
providing an overview of the contents of each of the following chapters.
This report sets out the results of a study into the maturity of products for filtering internet content
that are suitable for deployment by internet service providers (ISPs). The study was informed by a
technical, laboratory-based trial of a sample of commercial filter products available to ISPs,
conducted in the first half of 2008. The study was undertaken at the direction of the former
Minister for Communications, Information Technology and the Arts for the purpose of assessing:
● the capacity of available technology suitable for deployment by ISPs to filter internet content
that is illegal and inappropriate for minors which consumers may access through their internet
connection; and
● the extent of advances made since a previous trial of ISP-based filter technologies that was
carried out during 2005.
Terms of reference
The Protecting Australian Families Online Direction No. 1 of 2007 (Appendix A) was made under
subsection 14(1) of the Australian Communications and Media Authority Act 2005 in relation to
ACMA’s functions under paragraph 8(1)(d) of that Act (dealing with, reporting to and advising the
Minister in relation to matters affecting consumers of carriage services). The direction instructed
ACMA to conduct a trial of one or more commercial filter products that provide the capability of
filtering of internet content that is illegal and inappropriate for minors at the ISP-level and to
report the findings of the trial to the Minister by 30 June 2008.
Specifically, ACMA was directed to test the products against the following criteria:
1. Performance – whether the products cause delays or otherwise degrade internet performance.
2. Effectiveness – the extent to which the products distinguish illegal content, content that may be inappropriate for minors and innocuous content.
3. Scope – whether the products are capable of filtering internet traffic other than web content, such as peer-to-peer file transfers, chat and instant messaging.
4. Adaptability – whether the products are capable of being customised to apply different levels of blocking appropriate for children of different ages and to target different categories of content.
The performance criterion is intended to quantitatively measure whether ISP-level filtering
products, when deployed and operational within an ISP’s network, adversely affect the
performance of the network such that it would be necessary for an ISP to upgrade its network
sooner than forecast in order to restore performance to a satisfactory level.
The effectiveness criterion is intended to quantitatively measure the extent to which ISP-level
filtering products satisfactorily perform their essential role – that is, successfully identifying and
preventing the delivery of illegal internet content and internet content that is inappropriate for
children, while permitting innocuous internet content.
The scope criterion is intended to indicate the ability of ISP-level filtering products to identify and
block illegal and inappropriate internet content that is transmitted and delivered across the internet
using non-web protocols (for example, instant messaging, file transfers) in addition to web
content.
The adaptability criterion is intended to indicate whether ISP-level filtering products can offer a
range of filtering options that enable a filtered service supplied to an ISP’s customers to be tailored
to meet specific requirements of the ISP or of the customer, as opposed to a ‘one-size-fits-all’
filtering solution.
Accordingly, the assessment of ISP-level filtering products against these four criteria seeks to
update knowledge about the overall maturity of ISP-level filters, not to identify a ‘best buy’
product.
There are some other matters that may also be relevant to the efficacy of a filter in a real-world
context, but which were outside the terms of reference and not assessed by ACMA. These include
the following.
Filters may vary in their resistance to circumvention. Table 1 illustrates some common methods by which filters can be circumvented and how these methods apply to PC-based filters and ISP-level filters.
Method                     PC-based filters       ISP-level filters
Administrator passwords    Can be circumvented    Cannot be circumvented
Boot disks                 Can be circumvented    Cannot be circumvented
Anonymisers                Low probability of circumvention (both filter types)
Translation software       Moderate probability of circumvention (both filter types)
Search engine caching      Low probability of circumvention (both filter types)
Mirrors                    Low probability of circumvention (both filter types)
Additional domain names    Low probability of circumvention (both filter types)

Table 1: Common methods of circumventing filters
Unlike PC-based filters, which are open, ISP-level filters lie on the internal network of an ISP, which is generally firewalled off from public access, and are therefore less open to attack.
Filters may also vary in the administrative complexity of installation, deployment and implementation. Factors such as the compatibility of existing hardware in the ISP’s network in which the product is installed, the availability of skilled personnel and the degree of change that such a device brings into a network make this impossible to assess in a closed trial of this nature.
Previous technical studies
This report is the latest in a series of Australian Government technical studies of internet content
filtering, which are summarised below.
BLOCKING CONTENT ON THE INTERNET: A TECHNICAL PERSPECTIVE,
CSIRO, JUNE 1998
Commissioned by the National Office for the Information Economy, this report examined the
technical aspects of the internet that allow particular content which has already been identified as
illegal or offensive to be blocked, particularly by ISPs and content hosts. The report explored the
technical issues associated with different methods of blocking content, as well as some non-technical issues, and evaluated the pros and cons of each method.
The primary observations of the report were that:
● ISPs could offer differentiated services, including a walled-garden service that provides access
to a limited subset of websites and a filtered internet access service; and
● international cooperation would be needed in order to deal with hosting of illegal (or offensive)
content outside Australia.
EFFECTIVENESS OF INTERNET FILTERING SOFTWARE PRODUCTS, CSIRO,
SEPTEMBER 2001
Commissioned by NetAlert Ltd and the former Australian Broadcasting Authority, this report
presented the findings of testing on 14 filter products for:
● ease of installation, configuration and use;
● ease of bypassing or disabling filter;
● ability to stop access to undesirable content;
● ability to avoid blocking desirable content; and
● ability to track access.
The majority of the products tested were client-side desktop applications, but a number of server-side solutions suitable for enterprise environments were also tested. None of the products were
specifically intended for deployment by ISPs.
The primary observation of the report was that almost all the products were effective in blocking
undesirable content, but all blocked some portion of desirable content.
INTERNET CONTENT FILTERING, OVUM, APRIL 2003
Commissioned by the then Department of Communications, Information Technology and the Arts to evaluate the state of the art in internet content filtering technologies, this study provided:
● technical advice on the emergence of new blocking or filtering methodologies since the
commencement of the online content co-regulatory scheme set out in Schedule 5 to the
Broadcasting Services Act 1992, in 2000;
● information about financial costs and administrative requirements associated with ISP-level
filtering and the potential impacts on the performance and efficiency of the internet; and
● an overview of the application of filter technologies and other access management techniques
in government-administered schemes overseas.
The study was undertaken using a combination of secondary and primary research, including
interviewing leading content filtering vendors, ISPs in Australia and Singapore, government
officials and interested parties involved in internet content filtering in countries covered in three
case studies. No filter products were actually tested in the study.
The major observations of the study were that no major technological developments had occurred
since the commencement of the online content co-regulatory scheme in 2000 and that ISPs who
had adopted filtering had seen a limited impact on throughput. It also noted, however, that the
costs associated with such an implementation at the ISP-level made it unattractive for ISPs.
A STUDY ON SERVER-BASED INTERNET FILTERS: ACCURACY,
BROADBAND PERFORMANCE DEGRADATION AND SOME EFFECTS ON
THE USER EXPERIENCE, RMIT IT TESTLAB, APRIL 2005
Commissioned by NetAlert Ltd to provide a quantitative analysis of the performance impact of applying server-based internet content filtering applications and appliances to an internet feed in both live and controlled environments, this technical trial provided information on:
● the extent of degradation of internet access speed and performance;
● the accuracy of filtering; and
● the effect of filtering on the user experience when using broadband internet services.
RMIT IT TestLab encountered difficulties in finding products for the trial that were specifically designed for ISP-level filtering; consequently, all the products tested were designed for enterprise-level content filtering but had the capability of being deployed in an ISP’s network.
The primary findings of the trial were:
● a reduction of between 18 and 78 per cent in network performance when accessing the internet
through a content filter;
● the filters performed significantly better when blocking pornography and other adult content
but performed less well when blocking other types of content;
● of the filters tested, the most effective in terms of accuracy blocked 76 per cent of the URLs
used in the testing and only one filter blocked 100 per cent of URLs on ACMA’s prohibited
content list;
● only one in six users noticed any degradation in network performance, but this degradation was
regarded as minor and acceptable; and
● all users reported a measure of over-blocking.
The present report updates the body of evidence about the state-of-the-art of ISP-level filtering
technologies in 2008, recognising both the possible developments in filtering technologies and the
changes in internet use since the reports described above.
How filtering technologies operate
In order to put the nature of the products tested in the trial in context, the following is an overview
of the ways in which different filtering technologies operate.
Filter products perform two basic functions in order to limit users’ access to content – they identify content that is to be excluded (or included) and then block (or allow) access to that material. Identification methods are common to most filter products, while blocking methods vary depending on the type of location at which a filter is deployed; for example, a home computer, ISP network or mobile phone network.
Terminology describing different methods of filtering is imprecise and is rarely used among filter vendors in a standard manner; several terms often exist to describe the same filtering methodology.
The following section provides a description of the most common filtering methods.
IDENTIFICATION TECHNIQUES
The two basic methods for identifying content to be filtered are:
1. Index-based filtering – material is included in a list of either ‘good’ or ‘bad’ content.
2. Analysis-based filtering – on examination of the content, it is found to meet a set of criteria intended to determine its acceptability.
Index-based filtering is the process of permitting or blocking access to web pages on the basis of
their inclusion on a list (or index) of web resources. Such filtering can be based on ‘whitelists’
(exclusively permitting specific content while blocking all other content) or ‘blacklists’
(exclusively denying specific content while permitting all other content). Indexes may be
developed manually, by human searches and analysis of content, by category indexes (most
commercial filter vendors provide a database of URLs classified by the nature of the content) or
automatically by analysis-based filtering, as discussed next. Figure 1 illustrates the architecture of
index-based filtering.
Figure 1: Index-based filtering process
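The whitelist and blacklist modes described above can be sketched in a few lines of Python. This is an illustration only; the URLs and function names are hypothetical, and commercial products use far larger, vendor-maintained indexes.

```python
# Minimal sketch of index-based filtering (all URLs hypothetical).
BLACKLIST = {"badsite.example/page", "worse.example/"}
WHITELIST = {"kids.example/", "school.example/home"}

def allow_blacklist(url: str) -> bool:
    """Blacklist mode: deny listed content, permit everything else."""
    return url not in BLACKLIST

def allow_whitelist(url: str) -> bool:
    """Whitelist mode: permit listed content, deny everything else."""
    return url in WHITELIST
```

The two modes differ only in their default: a blacklist fails open for unlisted content, a whitelist fails closed.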
Analysis-based filtering refers to the dynamic classification of content using computer software
and has emerged in response to the shortcomings of index-based filtering—that is, that the latter is
only applicable to web pages that have previously been assessed. Such analysis may be based on
key word, profile, image analysis, file type, link analysis, reputation and deep packet inspection
among other criteria. Such classification may be in real-time or offline. Figure 2 illustrates the
implementation of analysis-based filtering.
[Figure 2: Analysis-based filtering process. Step 1: The user requests content. Step 2: The request for content is forwarded to the web server hosting the requested content. Step 3: The web server returns the content. Step 4: The filter assesses the content. Step 5: If the content is assessed as inappropriate, it is blocked; otherwise it is delivered to the user.]
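The assessment step (step 4 above) can be illustrated with a toy keyword classifier. Real products combine key words with profiles, image analysis, link analysis and other signals; the terms and threshold here are hypothetical.

```python
# Toy analysis-based filter: classify fetched page text by keyword score.
BLOCKED_TERMS = {"gambling", "violence"}  # hypothetical criteria

def assess(page_text: str, threshold: int = 1) -> bool:
    """Return True if the content should be blocked.

    Counts occurrences of blocked terms; a score at or above the
    threshold marks the page as inappropriate.
    """
    words = page_text.lower().split()
    score = sum(words.count(term) for term in BLOCKED_TERMS)
    return score >= threshold
```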
BLOCKING TECHNIQUES
Common blocking techniques used are:
● packet filtering;
● DNS poisoning;
● caching web proxies;
● port blocking;
● pass-by filtering; and
● pass-through filtering.
Packet filtering involves a router or other device determining whether to allow or deny the
passage of a request for content by examining the headers2 of the packets3 as they pass through.
Packet filtering examines the destination IP address in the header and determines whether that IP
2 Header refers to the information contained at the beginning of a packet about the handling of the packet. Analogous to an envelope around a letter, header information includes, among other things, the destination and source IP addresses.
3 A packet is a formatted block of data that can be transmitted over a network. It includes both header information and the content to be delivered.
address is to be blocked according to an index. Where an IP address is to be blocked, the router
will not forward the request for data to the content host and therefore no connection will be made
with the host computer. This method blocks all traffic associated with an IP address.
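The packet-filtering decision described above amounts to a destination-address lookup against an index. A minimal sketch (the blocked address is a reserved documentation address, not a real host):

```python
import ipaddress

# Hypothetical index of blocked destination addresses.
BLOCKED_IPS = {ipaddress.ip_address("192.0.2.10")}

def forward_packet(dest_ip: str) -> bool:
    """Packet-filter decision: drop requests whose destination IP is
    on the index. Because the check is per-address, all traffic to a
    blocked IP is affected, not just particular pages."""
    return ipaddress.ip_address(dest_ip) not in BLOCKED_IPS
```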
DNS poisoning (also referred to as DNS tampering) involves changing the information returned to
a particular user within their DNS server4 when responding to a query of a blocked domain. Users
attempting to request blocked websites will not be directed to the correct IP address.
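DNS poisoning can be sketched as a resolver that answers queries for blocked domains with a sinkhole address instead of the real record. The domain and addresses below are reserved documentation values, used purely for illustration.

```python
# Hypothetical blocked domain and sinkhole (block-page) address.
BLOCKED_DOMAINS = {"blocked.example"}
SINKHOLE_IP = "192.0.2.1"

def resolve(domain: str, real_lookup) -> str:
    """DNS-tampering sketch: for blocked domains, return the sinkhole
    address so the user is never directed to the correct IP; otherwise
    delegate to the normal lookup function."""
    if domain in BLOCKED_DOMAINS:
        return SINKHOLE_IP
    return real_lookup(domain)
```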
Caching web proxies are used in networks to acquire web content from a host on behalf of a user
and deliver it to the user’s web browser. They provide a perceived increase in web browsing
speeds and reduce bandwidth usage by holding copies of frequently requested material so that a
request does not need to be made to the content host every time a user in the network wants to
view the content. The same technology can be used to precisely block the content that is deemed to
be inappropriate, usually on the basis of URL. The content retrieved and stored in the local cache
is inspected and classified offline. Based on the classification of the cached content, proxies may
then modify the communications between user and content host, usually to block requests for
content hosted at a URL on a blacklist, often replacing the returned content with an ‘Access
Denied’ (or similar) message to filter out inappropriate sections.5
Some filters use port blocking to close ports through which particular data is transferred. Different
classes of programs or services on a computer use different ports to send and receive data. For
example, some email programs use port 110, while web traffic is received through port 80 by
default. By blocking the ports that various programs use to access the internet, filters aim to
prevent use of those programs to access content on the internet.
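Port blocking is the simplest of these techniques: a set-membership test on the destination port. The policy below (blocking port 110, commonly used for POP3 email) is a hypothetical example.

```python
# Hypothetical policy: block POP3 email while leaving web traffic open.
BLOCKED_PORTS = {110}

def allow_connection(dest_port: int) -> bool:
    """Permit a connection unless its destination port is blocked.
    Blocking a port prevents use of any program that relies on it."""
    return dest_port not in BLOCKED_PORTS
```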
Used primarily with analysis-based filtering, pass-by filtering allows a requested web page to load
and marks it for later analysis. The page will later be added to the filter vendor’s index of
categorised material and therefore blocked, where appropriate, for users who subscribe to a filtered
service. This means there is no delay to the user in accessing requested material, but that
inappropriate content may be viewed by a user once.
Used primarily with analysis-based filtering, pass-through filtering is often referred to as
‘proxying’. Delivery of a requested website is not permitted until analysis of its content is
complete, introducing a certain amount of delay6 that depends on the processing power of the
hardware on which the filter software is installed.
Most commercial filter products employ both index-based filtering and analysis-based
filtering. This is to provide a robust filtering solution by minimising the respective
limitations of each approach; that is, filtering that relies exclusively on an index may not
appropriately deal with newly created internet content and filtering that relies exclusively
on analysis may consume an excessive amount of processing resources on the computer or
other hardware on which it operates.
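A minimal sketch of this combined approach follows; the blacklist entry and the scoring rule in `analyse_content` are invented for illustration.

```python
# Hybrid filtering: consult the index (blacklist) first, and fall back to
# content analysis only for URLs the index does not cover.

BLACKLIST = {"http://example.invalid/bad-page"}

def analyse_content(content: str) -> bool:
    """Stand-in analysis step: True means the content should be blocked."""
    return content.count("forbidden-term") >= 2

def should_block(url: str, content: str) -> bool:
    if url in BLACKLIST:
        return True                  # index-based: cheap, but misses new content
    return analyse_content(content)  # analysis-based: catches new content

print(should_block("http://example.invalid/bad-page", ""))       # prints: True
print(should_block("http://example.invalid/new", "clean text"))  # prints: False
```

The index lookup keeps the common case fast, while the analysis step handles newly created content, mirroring the trade-off described above.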
4
The domain name system is the system that translates internet domain names such as ‘acma.gov.au’ into IP
addresses. A DNS server is a server that performs this kind of translation. ISPs generally have DNS servers.
5
Dornseif, M. (2003) ‘Government Mandated Blocking of Foreign Web Content’, in von Knop, J., Haverkamp, W.
and Jessen, E. (Eds) Security, E-Learning, E-Services: Proceedings of the 17th DFN-Arbeitstagung über
Kommunikationsnetze, Düsseldorf, available at: http://md.hudora.de/publications/200306-gi-blocking/200306-gi-blocking.pdf, accessed 10 October 2007.
6
Ovum (2003) Internet Content Filtering. A Report to DCITA, p.17, available at:
http://www.dbcde.gov.au/__data/assets/file/10915/Ovum_Report_-_Internet_content_filtering.rtf, accessed
15 October 2007.
Australian Communications and Media Authority 15
Chapter 1: Introduction
Outline of the report
The remainder of this report is structured as follows.
Chapter 2 describes the preparation for the trial, including the appointment of a test agency, the
selection of a sample of ISP-level filtering products for testing and the compilation of test data
comprising URLs linking to content that is illegal, inappropriate for children or innocuous.
Chapter 3 sets out the manner in which the testing of the sample of ISP-level filtering products was
conducted. The methodologies used for testing the performance and effectiveness of the sample
filter products are described, including the design of the test network, the test procedures and the
derivation of several measures of filter performance and effectiveness. Chapter 3 also describes the
manner in which the scope and adaptability of the sample filter products were assessed.
Chapter 4 provides the detailed results for the sample filter products in relation to performance,
effectiveness, scope and adaptability.
Chapter 5 sets out the findings of the trial and provides an assessment of the current state-of-the-art
of ISP-level filtering technologies, both by reference to the capabilities of the sample of filters
tested in this trial and by comparison with the previous trial undertaken by NetAlert Ltd in 2005.
Chapter 5 concludes with observations suggesting the standard that presently available ISP-level
filters are capable of achieving.
Chapter 2: Project background
Overview
Chapter 2 describes the initial steps taken in establishing the trial, which began with the
appointment of Enex TestLab to conduct the trial. It describes the process by which a sample of six
filter products was selected for testing, outlines the range of filtering techniques employed by the
selected filters and provides an overview of the characteristics of each of the filters. Also covered
in the chapter is the process of creating test data for the trial, in the form of three separate indexes
of URLs. The first index, which corresponded to the ACMA prohibited content list, was for the
purpose of assessing the extent to which the selected filters successfully block content on this list;
the second index was for the purpose of assessing the extent to which the filters fail to identify
content that is intended to be blocked; and the third index was for the purpose of assessing the
extent to which filters block content that is intended to be accessible. The chapter concludes with a
description of the selection and establishment of a test facility.
Appointment of test agency
ACMA selected Enex TestLab to conduct the trial following a competitive tender process.
The tender process commenced with the issue of a request for tender, accompanied by the posting
of a notification on the AusTender website7, publication of an advertisement in The Australian and
issuing of a media release by ACMA drawing attention to the request for tender. Three tender bids
were received in response to the request for tender. These were subjected to a careful and thorough
evaluation in order to select the company that best met the following criteria specified in the
request for tender:
● capability of meeting ACMA’s requirements;
● extent to which the tender bid meets the requirements set out in the request for tender for
setting up and configuring the trial, conducting the testing and reporting the results;
● degree of overall compliance with the request for tender, including with the draft contract;
● price; and
● experience and past performance of the tenderer, including the skills and experience of
specified personnel or subcontractors.
7
Link: https://www.tenders.gov.au/?event=public.advert.showClosed&AdvertUUID=520E6196-AFB3-4A6F-774BAF8F2E23144F
Enex TestLab is an established provider of information and communications technology testing
and benchmarking services.
Selection of filter products
In order to conduct the trial, it was necessary to acquire a selection of filter products that exhibited
a broad spectrum of available internet filtering technologies.
Enex TestLab placed an advertisement in the IT section of The Australian seeking filter vendors
interested in participating in the trial. Vendors were asked to submit completed expressions of
interest packages for each individual product that they proposed to offer for testing. Twenty-eight
expressions of interest, representing 26 products, were received from vendors (two products were
each represented by two expressions of interest from separate vendors).
The expressions of interest were considered by an evaluation panel established by Enex TestLab.
The purpose of the evaluation was to arrive at a selection of six products, where the selection:
● included a minimum of two hardware filter products;
● included a minimum of two software filter products;
● included a minimum of one filter product that was a hybrid solution (that is, a combination of
software and hardware); and
● covered as wide a cross-section of filtering technologies as possible.
The first phase of the evaluation assessed if each product was:
● suitable for deployment in an ISP environment; and
● either a software or a hardware solution that was not a vendor-managed service.8
During this initial evaluation phase, nine products were eliminated as they were vendor-managed
solutions. Vendor-managed solutions were excluded from the trial because the bulk of their
functionality generally resides in the facilities of the filter vendor itself; hence there is no way to
test such solutions in a closed environment.
The remaining 17 products progressed on to the second phase of the evaluation, involving a
detailed technical assessment of each product. During this phase, individual products were scored
on the following criteria:
● the product being suitable for an ISP environment;
● the product being generally available (that is, currently in commercial use as opposed to being
an experimental product);
● the product being a hardware and/or software product, rather than a vendor-managed
service;
● the product having realistic hardware specifications to be deployed in an ISP;
8
For internet content filtering, a vendor-managed service is a third-party service, hosted on the internet, which is
offered as a subscription. Such a service does not provide the transport used to access the internet, but filters the
content that an internet subscriber receives. Subscribers either proxy their traffic through the managed service or
establish a VPN to it, through which they retrieve their internet content. Such a service cannot be tested in a closed
environment.
● the product being able to filter content other than web content (that is, internet protocols other
than HTTP and HTTPS);
● for software products, the product vendor being able to supply the recommended hardware
configuration on which they would pre-install their product9; and
● the vendors being able to provide support for their products during the trial.
The six products that were selected comprised two hardware solutions, three software solutions
and one hybrid solution. Vendors for four of the six products were able to cite examples of current
implementations of the products by overseas ISPs.
The six products offered the following filtering methodologies:
● index-based filtering;
● analysis-based filtering;
● use of proxies to substitute content that is to be blocked with warning messages;
● filtering of non-web content;
● restriction of access based on user profiles and time of day;
● IP address-based blocking; and
● URL-based blocking.
Table 2 records the multiple filtering techniques employed by each filter product. It specifically
illustrates the range of filtering methodologies offered by the selected filter products and more
generally illustrates the broad spectrum of filtering methodologies used in currently available filter
products. With a few exceptions, commercial filter products do not exclusively use a single
filtering technique, but use a combination of two or more methodologies. Specifically, most
commercial filtering products, and all of the selected filter products, employ a combination of
index-based filtering and analysis-based filtering.
Similarly, none of the selected products only targeted illegal content. This is because the testing
required under the terms of reference involved assessing the accuracy of the filters in blocking
inappropriate content, which comprises a significantly broader range of material than illegal
content. However, it is possible to configure all of the selected products to filter illegal content
only, using a blacklist such as ACMA’s prohibited content list, and to allow access to all other types
of content. The characteristics of illegal content filtering are discussed further in Appendix C.
In accordance with non-disclosure agreements with the suppliers of the selected filter products,
names of the individual products have been withheld; the products have instead been represented
by the Greek letters Alpha, Beta, Gamma, Delta, Theta and Omega.
9
This requirement was to eliminate any suggestion by software vendors that their products were installed on
incorrectly configured hardware.
Product   Index      Analysis   Packet     DNS        Caching web  Port       Pass-by    Pass-through
          filtering  filtering  filtering  poisoning  proxies      blocking   filtering  filtering
Alpha        ✓          ✓          ✓          –           –           ✓          –           ✓
Beta         ✓          ✓          ✓          –           –           ✓          ✓           ✓
Gamma        ✓          ✓          ✓          ✓           –           ✓          ✓           –
Delta        ✓          ✓          ✓          ✓           ✓           ✓          ✓           –
Theta        ✓          ✓          ✓          –           –           –          –           ✓
Omega        ✓          ✓          –          –           ✓           –          ✓           –
(Index and analysis filtering are identification techniques; the remaining columns are filtering techniques.)
Table 2: Features offered by individual selected filter products
A detailed description of each of the selected products follows:
Product Alpha is a hybrid solution, comprising both hardware and software components. It
operates as an Ethernet bridge, connecting multiple segments of a network within an ISP. The
product employs index-based filtering, analysis-based filtering, packet filtering, port blocking and
pass-through filtering.
Product Beta is a software solution. It is installed within the core network. The vendor also offers
an option where the application is sold as an appliance. For the purpose of this trial, the vendor
provided the software solution pre-installed on its own hardware. The product employs index-
based filtering, analysis-based filtering, packet filtering, port blocking, pass-by filtering and pass-
through filtering.
Product Gamma is a software solution. It is installed within the core network. The vendor
provided its product pre-installed on its own hardware. The product employs index-based filtering,
analysis-based filtering, packet filtering, DNS poisoning, port blocking and pass-by filtering.
Product Delta is a software solution. It is placed within the core network. The vendor also offers
an option where the application is sold as an appliance. For the purpose of this trial, the vendor
provided the software solution pre-installed on its own hardware. The product employs index-
based filtering, analysis-based filtering, packet filtering, DNS filtering, caching web proxies, port
blocking and pass-by filtering.
Product Theta is a hardware appliance. It is a gateway device. The product employs index-based
filtering, analysis-based filtering, packet filtering and pass-through filtering.
Product Omega is a hardware appliance. It is a gateway device. The product employs index-based
filtering, analysis-based filtering, caching web proxies and pass-by filtering.
Compilation of test data
As set out in the minister’s direction, the trial was required to test the ability of filter products to
distinguish between illegal, inappropriate and innocuous content. To this end, a set of URLs
needed to be compiled as test data.
Three indexes of URLs containing illegal, inappropriate and innocuous content were created for
the purpose of the trial, based on:
● classification categories under the National Classification Code; and
● the ACMA prohibited content list.10
The first URL index, Category 1, was intended to test the extent to which the selected filter
products blocked content on the ACMA prohibited content list. The second URL index, Category
2, was intended to test the extent to which the selected filter products underblock, by allowing
access to content that may be regarded as harmful or inappropriate for children but is not illegal.
The third URL index, Category 3, was intended to test the extent to which filter products
overblock, by blocking access to content that may be regarded as innocuous.
Under the National Classification Code, content for films is classified in categories G − General,
PG − Parental Guidance, M − Mature, MA15+ − Mature Accompanied, R18+ − Restricted,
X18+ − Restricted and RC − Refused Classification. The National Classification Code provides a
nationally uniform and well-defined standard for rating content and is the standard that is applied
under Schedule 7 of the Broadcasting Services Act 1992 to classification of internet and mobile
content.
The Category 1 index of URLs was created from the ACMA prohibited content list. In accordance
with Schedule 5 to the Broadcasting Services Act 1992, this list contains URLs that link to internet
content hosted outside Australia that ACMA is satisfied is prohibited or potentially prohibited.
Prohibited and potentially prohibited content is defined in clauses 20 and 21 of Schedule 7 to the
Broadcasting Services Act 1992 and may include content in the range MA15+ to RC. The ACMA
prohibited content list was provided to Enex TestLab, which checked whether each URL was still
live. ACMA approved the Category 1 index, containing 1000 URLs, before it was employed in the
trial.
For Categories 2 and 3, a distinction needed to be made between inappropriate and innocuous
content. The National Classification Code provides a helpful distinction between content that is
legally restricted—MA15+ through to X18+—and that which is not—G through to M.
Accordingly, for the purpose of the testing, this distinction was used to separate inappropriate
content from innocuous content. In a real-world application, filters may allow more granular
distinctions to be made.
The Category 2 index of URLs was drawn from an existing database of URLs held by Enex
TestLab. The content from this list of URLs was intended to be rated in the range from MA15+ to
X18+. A proportion of content rated as strong M, which was regarded as close to the MA15+
classification, was also allowed in this category. To verify that the content assembled in this list
fell into this range, ACMA checked the range of content accessed via the URLs. ACMA approved
the Category 2 index, containing 933 URLs, before it was employed in the trial.
The Category 3 index of URLs was also drawn from an existing database of URLs held by Enex
TestLab. The content from this list of URLs was intended to be rated in the range from G to M. To
verify that the content assembled in this list fell into this range, ACMA checked the range of
10
The ACMA prohibited content list is a list of URLs that have been reported by internet users to ACMA and have
been categorised by ACMA’s Content Assessment team as prohibited content. This list includes content rated RC
1(b)—Child Pornography—as well as other content rated RC, X18+ and R18+.
content accessed via a sample of the URLs. ACMA approved the Category 3 index, containing
1997 URLs, before it was employed in the trial.
Selection of test site
As set out in the minister’s direction, the trial testing was required to be conducted in Tasmania.
For this purpose, Enex TestLab secured the premises of the Telstra Broadband eLab in
Launceston.
Under the agreement between Enex TestLab and Telstra for use of the Telstra Broadband eLab,
neither Telstra nor its employees nor any of its affiliates were permitted to have any input or
influence on the trial conducted within these premises.
Chapter 3: Execution of the trial
Overview
Chapter 3 describes the methodologies used to evaluate the selected filter products for performance,
effectiveness, scope and adaptability. In order to appreciate the relationship between the particular
methodology followed in measuring performance and the real-world manner in which ISP networks
operate, this chapter includes an outline of the architecture of a typical ISP network. The configuration
of the test network used for measurement of performance and effectiveness is then described.
For measuring performance, this chapter describes how, in order to assess the extent to which a filter
introduces any changes to the throughput of an ISP’s network, the trial collected data to enable
comparison of the performance of the test network with no filter installed, with each filter product
installed but not actually filtering content (passive mode) and with each filter product actively filtering
content (active mode).
For measuring effectiveness, this chapter describes how, in order to assess the accuracy of a filter in
identifying and blocking content from categories 1 and 2—while similarly identifying but allowing
access to content from category 3—the trial collected data on whether each URL in the three indexes
described in the previous chapter was correctly identified by each filter product, either for blocking or
permitting access to the corresponding content.
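The three accuracy measures described above can be derived directly from per-URL results, along the following lines; the sample data and field layout are invented for illustration.

```python
# Each result pairs a category (1, 2 or 3) with whether the filter
# blocked the URL. The three measures follow directly from the counts.

def rates(results):
    def pct(category, blocked):
        outcomes = [b for c, b in results if c == category]
        return 100.0 * sum(1 for b in outcomes if b == blocked) / len(outcomes)
    return {
        "blocking (category 1)":      pct(1, True),   # prohibited content blocked
        "underblocking (category 2)": pct(2, False),  # inappropriate content allowed
        "overblocking (category 3)":  pct(3, True),   # innocuous content blocked
    }

# Invented sample: two category-1 URLs (both blocked), two category-2
# URLs (one allowed through), two category-3 URLs (none blocked).
sample = [(1, True), (1, True), (2, False), (2, True), (3, False), (3, False)]
print(rates(sample))
```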
For measuring scope and adaptability, this chapter describes how, in order to evaluate the capabilities
of filters in filtering non-web traffic and in customising filtering policies in accordance with the
specific requirements of an ISP or one of its customers, an expert review captured details of various
capabilities of the selected filter products.
PERFORMANCE
Evaluating the performance impact of filters meant determining the extent to which the operation
of a particular filter product in the test network introduced degradation in network performance.
Before indicating how performance impact was measured in the trial, it is necessary to describe the
typical architecture of an ISP’s network in order to appreciate where and how performance of the
network can be affected.
Operation of an ISP’s network
Figure 3 illustrates a typical layout for the network of an ISP offering ADSL broadband services
(it is not an actual representation of any particular ISP’s architecture).
The links among the various network elements are shown using lines of varying widths and
colours. The thickness of each line indicates the bandwidth of the corresponding link.
Links within the ISP’s internal network are of higher bandwidths (and are accordingly shown with
lines of greater thickness) than the link between the end user and the local exchange digital
subscriber line aggregation module (DSLAM).
[Figure 3 diagram: the internet connects via an internet gateway to the ISP’s internal network (core router, edge router, domain name server (DNS), billing server, news server, mail server, database server and content filter, usually co-located in a central exchange), which constitutes the ‘prime’ network; the central exchange multiplexer connects to local exchange DSLAMs and on to end users over the ‘access’ network. Line speeds: OC48 2.488Gbps; OC12 622Mbps; OC3 155Mbps; DS3 44.736Mbps; ADSL 1.5Mbps.]
Figure 3: A typical ISP's architecture from the ISP to the end user
There are usually multiple users (in the order of hundreds to a few thousand) connected to a local
exchange and multiple local exchanges (in the order of tens to a few hundred) connected to a
central exchange. ISP networks are typically designed in this manner as it offers a scalable and
cost-effective solution for the network demand likely to be generated by end users.
In this typical ISP network, an end user on an ADSL connection is connected via their local
exchange11 DSLAM, through their central exchange multiplexer12, to their ISP, which then routes
their traffic back and forth to the internet via an internet gateway. The bandwidth of the network
links decreases as one gets further away from the internet gateway.
Figure 3 illustrates that the segment that is the ISP’s core network constitutes the ‘prime’13
network, as connections among its respective network elements are assigned a high bandwidth. By
contrast, the segment between the end user and the local exchange DSLAM constitutes the
‘access’14 network. The peak network throughput to the end user is limited to the subscriber’s
bandwidth, which is usually no more than a few megabits per second.
Consequently, a slowdown in performance on the access network does not necessarily indicate any
end-to-end congestion in the network. For example, such degradation in network performance may
11
When dealing with internet traffic, a local exchange is often referred to as a Point of Presence or POP.
12
A multiplexer is a telecommunications device used to break large bandwidth links into smaller links, while keeping
them synchronised.
13
In networking terms, this is often referred to as the ‘fast’ network.
14
In networking terms, this is often referred to as the ‘slow’ network.
be the result of a large demand on bandwidth, such as an end user downloading a video file, which
exceeds the bandwidth of the access network.
Similarly, any actual congestion in the prime network segment that is small or moderate in degree
typically has a less pronounced effect on an end user on the access network. This is because there
is a significantly larger amount of bandwidth on the ‘prime’ network than what the access network
can demand. In the example shown in Figure 3, the bandwidth of the access network is 1.5
megabits per second, whereas that of the prime network is 622 megabits per second—over 400
times greater. As a result, the access network, representing the segment between the ISP and the
end user, has little bearing on the effect of an ISP-level filter on overall ISP scalability.
Measurements conducted on the prime network provide a quantitative gauge of the effect on
network performance of a filter within the ISP’s core network. These measured quantities express
the number of transactions per second and the data rate within the network (measured in megabits
per second) that the ISP’s network is capable of supporting. The effect of a filter on an ISP’s
network is best reflected by the performance seen within the prime network. The measurements
conducted in this trial focus on this area.
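Comparing the prime-network measurements across configurations reduces to a percentage-drop calculation; the throughput figures below are invented for illustration and do not reflect any measured product.

```python
# Degradation relative to the unfiltered baseline, applicable equally to
# data rate (megabits per second) and transactions per second.

def degradation(baseline: float, filtered: float) -> float:
    """Percentage drop from the baseline measurement."""
    return 100.0 * (baseline - filtered) / baseline

baseline_mbps = 500.0  # no filter installed (invented figure)
active_mbps = 400.0    # filter actively filtering (invented figure)
print(f"{degradation(baseline_mbps, active_mbps):.1f}% throughput loss")
# prints: 20.0% throughput loss
```

The same comparison against the passive-mode measurement separates the cost of merely inserting the device from the cost of active filtering.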
The network architecture applicable for ISPs offering connections other than ADSL—for example,
dial-up, cable, satellite or mobile connections to the internet—is broadly similar to that described
above, although the bandwidth available on the access network may differ significantly.
Network performance metrics
In order to understand the metrics used in the trial, a number of concepts that are central to
network performance measurement are clarified below.
Networks are rated based on both:
● bandwidth—a measure of the potential rate that data can be transmitted over a network15; for
example, when an ISP advertises a 1.5 megabits per second internet service, it means that, in
peak conditions, the internet connection will transmit data at 1.5 megabits per second; and
● throughput—the actual speed at which data will be transferred from one point on the network
to another at a particular time16; it can be regarded as the rate at which ‘useful’ data is
transferred.
Users rarely experience throughput higher than 80 per cent of the rated bandwidth.17 This is due to
the inherent design of network protocols – the set of rules by which data is transferred across
networks. Various network protocols are in common use; for example, the IEEE 802.3 standard for
Ethernet and ATM.18 Irrespective of the protocol used, data is split into
‘packets’ before transmission. Each packet is assembled into a pre-defined format (as specified in
the protocol), called a ‘frame’, before being transmitted. A frame typically contains elements such
as the following:
● the header or preamble, which defines the type of protocol being used;
15
Tanenbaum, Andrew S. (2002), Computer Networks, 4th Edition, Prentice Hall
16
http://www.support.psi.com/support/common/networking/diff.html.
17
Spurgeon, Charles E. (2000), Ethernet: The Definitive Guide, O'Reilly
18
ATM: Asynchronous Transfer Mode is a cell relay, packet-switching network and data link layer protocol that
encodes data traffic into small (53 bytes; 48 bytes of data and 5 bytes of header information) fixed-sized cells. This
differs from other technologies based on packet-switched networks (such as the Internet Protocol or Ethernet), in
which variable sized packets (known as frames when referencing Layer 2) are used.
● the start of frame delimiter, which indicates the start of the frame;
● the destination address—the IP address to which the packet is headed;
● the source address—the IP address from which the packet originates;
● the length of the packet, which allows the receiving device to correctly separate one packet
from another;
● the data or payload—the actual useful information that needs to be transmitted, such as the
contents of a web page;
● padding—any dummy bytes required to fulfil minimum frame size requirements; and
● the checksum, which is used for error-checking and correction.
[Figure 4 diagram: frame layout showing the preamble, SOF, destination address, source address, length, data/payload, padding and checksum fields.]
Figure 4: The IEEE 802.3 Ethernet frame format19
Figure 4 shows the frame format for 802.3 Ethernet. The ‘SOF’ field denotes the ‘Start of Frame’
and is 1 byte. This sample frame format will use at least 27 bytes and up to 73 bytes in overhead;
that is, the preamble, destination and source address, length and checksum.
There are two primary factors that affect network efficiency:
1. The amount of overhead, as seen in the above example.
2. The number of retransmissions required to transfer an error-free packet.
Considering Ethernet and the frame format illustrated in Figure 4, a regular MP3 file of
approximately 4MB (or 4,194,304 bytes) would require a total of 2,797 frames, each of which
would contain 27 bytes of overhead. This equates to a network efficiency of 98.23 per cent,
assuming that the 4MB MP3 file is divided into 1500-byte packets and there are no retransmits20
as a result of packet loss in transmission. In reality, however, data communications without
retransmissions rarely occur. If the entire transmission had to be repeated once,
the efficiency would fall to 49.11 per cent. Routing devices attempt to balance overhead and
number of retransmissions by adapting packet sizes, in order to obtain optimum network
performance. As a result, the theoretical efficiency is rarely obtained.
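The arithmetic of the worked example above can be reproduced directly:

```python
import math

# A 4MB file split into 1500-byte packets, each framed with 27 bytes of
# overhead (preamble, SOF, addresses, length and checksum).
FILE_BYTES = 4 * 1024 * 1024  # 4,194,304 bytes
PACKET_BYTES = 1500
OVERHEAD_BYTES = 27

frames = math.ceil(FILE_BYTES / PACKET_BYTES)
bytes_sent = FILE_BYTES + frames * OVERHEAD_BYTES
efficiency = 100.0 * FILE_BYTES / bytes_sent
print(frames, f"{efficiency:.2f}%")  # prints: 2797 98.23%

# Repeating the entire transmission once doubles the bytes sent for the
# same useful payload, roughly halving the efficiency.
print(f"{100.0 * FILE_BYTES / (2 * bytes_sent):.1f}%")  # prints: 49.1%
```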
As a result of the balancing of overhead and number of retransmissions, the throughput of
networks increases with increasing network load until the network reaches a state of saturation;
that is, the network is carrying as much traffic as its theoretical bandwidth. For Ethernet networks,
this is about 80 per cent of the available bandwidth. Beyond this point, as network load increases,
the network efficiency begins to plateau. This characteristic is illustrated in Figure 5. Similar
characteristics have also been observed for other protocols.
19
Tanenbaum, Andrew S. (2002), Computer Networks, 4th Edition, Prentice Hall
20
A retransmit is where the same packet is transmitted more than once to overcome a scenario where the original
packet may have been lost when initially transmitted. It is analogous to repeating oneself in a conversation when
the recipient fails to interpret one’s statements the first time.
[Figure 5 graph: percentage throughput (y-axis, 0–100%) plotted against percentage load applied (x-axis, 0–160%).]
Figure 5: Load versus throughput graph for various protocols21
The performance characteristics of a network are also influenced by the nature of the traffic being
transmitted. Networks exhibit better efficiency when the traffic being transmitted is of a
predictable nature; for example, streaming media. This is because routing and switching devices
require less time to determine the optimum packet size. The performance of Ethernet traffic
degrades as traffic becomes increasingly ‘bursty’; that is, where packet size becomes random.22
Traffic generated by internet chat and games, where packet sizes vary without a predictable
pattern, is an example of such traffic.
Network throughput is also measured in the number of transactions per second. A transaction is
defined as a complete cycle of data interchange. For the purpose of this trial, a transaction starts
with the initiation of a web request and ends when the requested web content is delivered.
Test network and hardware
An isolated test network to simulate an ISP’s network was built to observe the effect of each filter
product on network performance.
The network architecture is shown in Figure 6. The network architecture seen here is analogous to
a Tier 3 ISP; that is, an ISP that purchases outbound transport from other networks in order to
reach the internet (see Appendix B for details).
21
The percentage load applied is a measure of the network demand placed on a network as a percentage of the total
bandwidth available. An array of 20 machines each demanding 10Mbps equates to 200Mbps; such an array would
place a network with an available bandwidth of 100Mbps under a load of 200 per cent.
22
Mazraani, T.Y. and Parulkar, G.M. (1992) ‘Performance analysis of the Ethernet under conditions of bursty traffic’,
Conference Record, IEEE GLOBECOM ’92, 6–9 December 1992, pp. 592–596, vol. 1.
[Figure 6 diagram: an isolated test network in which a web server simulates internet content, a DNS server performs IP address lookups, a vendor-supplied content filter filters internet content, and a gigabit switch acts as edge router; a WebBench controller controls the tests and compiles and collects results from a load-generation array of six WebBench clients, each simulating the load generated by web requests from 0 to 6 users. Content request and return paths and traffic statistics travel over gigabit Ethernet.]
Figure 6: Test network for evaluating network performance of internet content filters
As the network was an isolated one, two core network functions were simulated:
1. The function of the internet as a source of content.
2. The function of end users requesting content.
Simulating the internet
The internet was simulated using a high-end web server. This web server hosted a range of content
from both Category 2 and Category 3 indexes replicated from active web sites published on the
internet. The nature of the content included, but was not limited to:
● static web content in the form of HTML documents; and
● images complementing web content in the form of GIF and JPEG files.
The web server acted as a target host for web requests generated by the array of client machines
(described below), individually processing the requests and delivering the resultant content back to
the requesting client.
Simulating end users
To measure the effect of individual filters on network performance, the function of end users
requesting content was simulated using a tool called WebBench 5.0, a benchmarking and testing
software program developed by VeriTest that measures the performance of web servers and
networks under different load conditions. 23
WebBench 5.0 operates using a client-server architecture. The controller manages the execution of
the tests and compiles the statistics collected by the client machines at the end of a test cycle. This
machine was connected to an array of client machines that generated the web requests. The client
load-generation array comprised six machines, each running the WebBench client software in
23
http://www.veritest.com/benchmarks/webbench/home.asp.