Wednesday, December 30, 2009

A curious case of government pressure


News today of an attempt by the Canadian government to get a German ISP to censor a political parody site that was critical of the Canadian Government.

I'd be wary of making too much of this in relation to the Australian debate for several reasons.

The Canadian Government agency does not have jurisdiction over the German ISP and there is no suggestion, as far as I can tell, that they asked the German Government to assist with the censorship action. In other words: the Canadian Government wasn't using its monopoly on the use of force on Canadian soil to force the German ISP to act in a certain way.

Secondly, the collateral damage caused to the 4500 sites is most likely the result of incompetence on the German ISP's part, rather than the Canadian Government's part.

Yes, it does illustrate the dangers of a Government that thinks it can throw its weight around, but it doesn't really have too much to say about the Australian debate since it wasn't a case of the state acting against a commercial entity within the state. It was a case of an agency of a state asking a commercial entity of another state for assistance. One can argue that the German ISP should have told the Canadian Government to bugger off, but that is a different argument.

10 Questions About The Mandatory ISP-level Filter

Readers might like to consider using this list of questions as the basis of a letter to their local Labor MP or Senator.
  1. Has the probability of inadvertent exposure to Refused Classification material by adults been quantified? If not, is this probability judged to be: low, moderate or high?
  2. Have the consequences of inadvertent exposure to Refused Classification material by adults been measured? Are these thought to be minor, major or serious?
  3. Has the quantity of potentially Refused Classification material in existence on the Internet been estimated in either absolute or relative terms?
  4. Does the Government have an estimate or measure of the percentage of potentially Refused Classification material on the Internet that is currently Refused Classification? What is that estimate?
  5. Does the Government have a coverage goal for the Refused Classification list in terms of the percentage of potentially Refused Classification material that is actually refused classification? What is that goal?
  6. Is the Government concerned that, in exempting X-18+ material from the scope of the mandatory filter, it may be implicitly condoning the consumption of X-18+ rated material by Australian adults?
  7. Does the Government believe it is acceptable for Australian adults who encounter X-18+ or potentially Refused Classification material on the Internet to treat such material as not Refused Classification until such time as ACMA makes a definitive decision otherwise?
  8. Does the Government believe that Australian adults who encounter such X-18+ or potentially Refused Classification material should use their own judgment to decide for themselves whether they should remain exposed to such material?
  9. If the Government does believe that all Australian adults should retain for themselves the responsibility of deciding what material is, and is not, acceptable to view, why is the mandatory filter required?
  10. What political benefit does the ALP gain by successfully shepherding enabling legislation for the mandatory ISP-level filter through both houses of parliament?

If you would like to share a reference to this page, please use: http://tinyurl.com/gr8wallofkrudd-10q

Government blessed smut?

According to the Government's latest FAQ, one of the measures the Government is now proposing is:

(The) Introduction of mandatory internet service provider (ISP) level filtering of content that is rated Refused Classification (RC) in order to reduce the risk of inadvertent exposure.

Without presenting any supporting evidence whatsoever, the Government would have us believe that the probability of inadvertent exposure by adults to refused classification material is high or that the consequence of that exposure, should it occur, is serious. Or both. Otherwise, why would such a draconian measure be necessary?

Suppose we suspend disbelief at the sheer flimsiness of the premise for a second and assume that it is true that adult Australians are, in fact, placed in peril by inadvertent exposure to Refused Classification material like Ken Park or euthanasia texts like "The Peaceful Pill".

Let us suspend disbelief even further and assume that ACMA will eventually discover all the potentially Refused Classification material that exists on the internet. ACMA, however, is not a machine and we are assured that all classification decisions will be subject to due process. Due process, being what it is, takes time. Evidence suggests that ACMA can currently take as much as 64 days to action a complaint.

So, at any given moment, there will be a lot of X-18+ rated material (which is not subject to the mandatory filter) and potentially Refused Classification material that has not yet been classified, to which adults may be inadvertently exposed.

Suppose now that an adult Australian is inadvertently exposed to this material. Should they:

  1. report the material immediately to ACMA and let ACMA decide whether the material is Refused Classification, or,
  2. assume that any material not blocked must, by definition, not be Refused Classification and is therefore OK to view.

An adult electing to report material without first carefully considering whether the material is RC would be highly irresponsible. Apart from anything else, this would flood the ACMA classification machinery with material that probably won't be classified as RC. If the ACMA classification processes are overloaded, they may well fail to detect actually illegal material that should have been referred to police.

Equally, an adult who assumed that any material not blocked is implicitly not Refused Classification is unlikely to find support from a court, which will insist that, as an adult, they are entirely capable of making responsible judgments for themselves.

Furthermore, the Government would surely be distressed if Australians came to believe that X-18+ material, not being subject to a mandatory filter, had an implicit blessing from the Government as being acceptable for Australian adults to view.

Given that abdication of responsible use of the Internet by adults is unacceptable, the only option left (by definition) is for the Government to assume, and indeed require, that all adult Australians who are inadvertently exposed to potentially "Refused Classification" material must act responsibly and decide for themselves whether it is appropriate to remain exposed to such material.

What, then, is the point of the mandatory filter?


If you'd like to share a reference to this page, please use http://tinyurl.com/gr8wallofkrudd-smut.

Friday, December 25, 2009

Revealed: Donald Rumsfeld's influence on Australian censorship policies

As Donald Rumsfeld once said:

There are classified classifiables. These are things we have classified as classifiable. There are the classified unclassifiables. That is to say, there are things that we have classified as unclassifiable. But there are also unclassified unclassifiables. There are things we haven't classified that are unclassifiable.

And Rumsfeld's influence goes deeper. As @frankfil has pointed out, a Rumsfeldian analysis neatly explains the inner psyche of the ALP:

we know you want to view it, we know you know we don't want you to view it, we know you know you think we think we know best

Thursday, December 24, 2009

One very lucid mad cow

This comment on Crikey (http://tinyurl.com/smokinmadcow) from "Mad Cow" was so fricking well done (no pun intended) that I have quoted the original text in full, without permission, so that as many people as possible can read it. If the original author would prefer that I did not, s/he is welcome to contact me and I will remove it.

Well done, Sir/Madam - a damn fine read!

Jon Seymour - who is very definitely not the Mad Cow.


Let me cut through this entire debate in the following way.

Senator Conroy has another option. That is to abandon the classification law and instead to base the blacklist on the following:

1. The blacklist is based on laws that clearly and objectively define illegality. We know exactly what this means and if this were the case, a host of objections to the content of the list would vanish.

2. The blacklist is compiled by law enforcement, subject to independent oversight and maintenance and subject to judicial review. This dispenses with issues of secrecy, personal taste and corruption.

Now, what is striking about this entire debate, Senator Conroy, is the fact that given this alternative, you don’t grab it with both hands. You clearly are sacrificing your Party’s good will, and if you are absolutely honest about wanting to block ONLY illegal content, you have absolutely no escape from this proposition.

Imagine if the Senate were to move an amendment to that effect - to divorce the blacklist from the classification system and to give it to those best able to judge illegality - the police and the judiciary.

How would you vote, Senator Conroy?

Dear reader, it is very obvious that this entire debate is built upon one central lie. And that is that the Senator wants to block only illegal material. No more, no less. But the reality is that if this were the case he would neither inherit the list from the censorship laws, nor would he hand the task to public servants whose job is to judge taste. He would have no option but to abandon classification law and to vote for the above amendment.

The sad thing is, that to the extent that blacklisting child porn and violent web sites has any effect at all (it will have next to none - and I’ll explain why in a moment) the objective of child protection will be endangered by cluttering the same list with what is mostly harmless rubbish that does no harm at all to normal adults. Indeed, the bigger you make the list, the more you make it a target for deliberate leaks, and the more you make it easier to reverse engineer the list - and in so doing you simply raise the probability of publishing the addresses of (what were) child porn sites.

Senator Conroy has all along tried to lead people to the inference that if something is banned, not only is it “bad” but it is “illegal”. Now that he is losing this debate, when it is pointed out that most RC content is legal to view, in most places (with some obvious exceptions), what he does instead is point out that RC content is illegal to distribute.

What that argument actually does is highlight the objectionable nature of the classification law itself. That we have laws that stop adults from seeing things (and in order to see them you have to obtain them) that are not illegal or harmful but are simply what some other people don’t want us to see. This is the core of what censorship means. The internet allowed people to bypass the blocks on distribution of RC material, and this is precisely what offends those who wish to protect us from impure thoughts.

The movement to oppose gay marriage despite overwhelming public support, and the campaign to extend censorship law into the internet, have something in common. They reflect the angst of those whose religion tells them to “correct” the thoughts and behaviour of non-believers and, if necessary, to interfere with the laws of the land.

Senator Conroy, I am sure you have read “The High Price of Heaven” by David Marr: how the Church has repeatedly tried to interfere with the State, about the victimisation of homosexuals, and about the driving forces behind censorship. I suggest you have a long hard think and realise that even if the Christian Lobby has votes, you’re still here to govern for the majority.

Now to the technical detail. It would be a reasonable guess to surmise that within the current blacklist, those sites that are, or more correctly were, of child pornography, were submitted via law enforcement channels. It’s also well known that when such sites become known to law enforcement they are promptly taken down. So to the extent that the blacklist contains child pornography or other sites that are genuinely illegal, it is also a fact that such sites are almost certainly defunct before they even get a chance to be on the list. And to my knowledge, no URL on the leaked blacklists pointed to a functional child porn web site.

The next absurdity in this whole debate is the fact that the blacklist as it stands is really a sample. It’s based mostly on public submissions. The fact is that if you were to gather together the entire content of the web and send it to ACMA (even ignoring the prompt mass resignations) the material that would theoretically be classified as RC (or in this case might merely be written up as potentially RC but not yet submitted for formal classification) would amount, conservatively, to some tens to hundreds of thousands of URLs.

No filter is capable of this task. The reason is simple. Those filters that appeared to do well in the trials rely upon features built into common routers, where traffic to certain IP addresses is directed or copied to a separate interface. The problem here is that every single IP packet address has to at least be compared to all of those IP addresses in the list. As this list grows, the router reaches hard-wired limits. And without going into even further detail (my formal training is in computer engineering) you’re either going to get a massive speed degradation, or the filter will break, or the router will simply cease to function. And the limits will typically be reached at a few thousand to a few tens of thousands of addresses.

The filter trials were purposely designed not to expose such limits. The filter trials were purposely designed not to expose a number of other technical limitations.

Those of us who understand the technology know quite well that the filter will not scale. It cannot even remotely serve its stated purpose.

Senator Conroy, what you are doing robs resources from law enforcement, makes life harder for genuine child protection agencies, muddies the issue, distracts the public debate from important issues such as parental education, and so poisons the well that even if you now propose to merely filter only illegal material and turn your back on the censorship law, it will be hard for people to believe you.

What you are doing, Sir, requires an astonishing act of sheer gall. You’re trying to sell a lie to three groups of people:

You’re telling one group of people that RC material is all illegal, and if you can’t do that you’re trying to fool as many people as possible that RC material, if it isn’t illegal, is at least horribly objectionable, nasty and perhaps immoral. No, it isn’t.

You’re telling another group of people that it will be safe to leave their young kids alone with the internet. This is not only a hoax, but a cruel one.

and at the same time…

You’re telling yet another group of people - those who see it as their god-given right to protect us from ourselves - that the filter will stamp out all the horribly morally objectionable things that a lot of us just plain enjoy. And when they finally figure out that the net is bigger, much much bigger, than your filter - you can guess what the Christian Lobby is going to do next…


Originally written by "Mad Cow" as a comment on Crikey. Reproduced here without permission.
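Mad Cow's point about router limits can be made concrete with a small sketch. What follows is purely illustrative Python with made-up addresses, not a claim about how any particular router or trial filter is built; real devices do this matching in fixed-size hardware tables rather than a software loop, but the constraint is the same: per-packet work (or table space) grows with the size of the list.

```python
# Illustrative only: naive per-packet matching of destination addresses
# against a blocklist (hypothetical data). The point is the linear scan:
# each packet is checked against the whole list, so cost grows with list size.
import random
import time

def random_addresses(n):
    """Return n hypothetical IPv4 addresses as 32-bit integers."""
    return [random.getrandbits(32) for _ in range(n)]

def naive_filter(destinations, blocklist):
    """Count destinations found in the blocklist; each check is a linear scan."""
    return sum(1 for dst in destinations if dst in blocklist)

if __name__ == "__main__":
    traffic = random_addresses(2_000)            # stand-in for live packets
    for size in (1_000, 10_000, 100_000):
        blocklist = random_addresses(size)       # stand-in for the blacklist
        start = time.perf_counter()
        naive_filter(traffic, blocklist)
        print(f"{size:>7} entries: {time.perf_counter() - start:.2f}s for 2,000 packets")
```

In software you can sidestep the scan with a hash table, but the tables routers use to do this at line rate have fixed capacities, and those capacities are the hard-wired limits Mad Cow describes: grow the list past them and you are back to degraded speed, a broken filter, or a dead router.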

Wednesday, December 23, 2009

The Trouble With RC

There is something slightly Pythonesque about Australia's National Classification Code. In addition to the classifications that classify content, there is a classification, paradoxically named "Refused Classification", for classifying content that is otherwise refused classification.

"Wait a minute, didn't you say these things were not to be classified? How come you just classified them? Bertand, I think we may need your help here."

It is interesting to reflect on how other countries address the paradox of classifying the unclassifiable with an oxymoronic classification.

Like us, France rates content such as film with a scheme that roughly correlates with the age of the people for whom the content is suitable. There is a category suitable for everyone (like our G), a category for under 10's, under 12's, under 16's and under 18's. And then there is everything else. Perhaps they name this category, perhaps they don't. Unlike Australia, they don't attempt to classify material suitable for adults any further. They may ban it, they just don't attempt to apply the weight of their film and literature classification bureaucracy to the task of classifying it. Or should that be the task of refusing to classify it? Anyway, they don't spend too much time thinking about it. If a film is just wrong, they just ban it. They are French. They have no need to think further about such vulgarities!

The US is similar. The industry rates films, the market decides. America, being the somewhat puritanical market that it is, will typically not buy anything rated NC-17, so the industry tries very hard not to classify things that way. The idea that there could be something more perverse than NC-17 is a foreign, dare I say French, concept to the average American consumer. Of course, there is something more perverse than NC-17. That's labelled obscene. If your film is labelled obscene, and the court agrees, you will go to jail.

Australia loves its classification without a name (or rather, its classification without a sensible name). More particularly, Australian politicians love it. For them, the Refused Classification non-classification is a secular equivalent of Cardinal Pell. Instead of calling on the Lord to get them out of a fix, Australian politicians call on the great Too Hard Basket in the sky - the "Refused Classification" classification (or is it a non-classification, I keep forgetting).

Politicians can use RC to hide all sorts of trouble. Child porn, slap an RC-rating on it. Adult-oriented games, slap an RC-rating on it. Margaret Pomeranz's taste in American social commentaries - slap an RC-rating on it. They love the fact that innocuous things like adult-oriented games and vile things like child porn have the same label. It makes it so much easier for them to label their opponents as purveyors of filth.

The best thing about RC is that it's all so much easier than making things actually illegal. If you start making things actually illegal, the people are going to get ticked off. Sure, you have made it slightly harder for people to get the material that is classified that way, but the people will still get it. That's not the point. By brandishing your moralistic credentials in front of the public at large you can claim whatever votes you believe that's worth and get back to whatever it is that politicians do on those long, cold nights in Canberra.

So if you wonder why Australia is planning to line its film and literature classifiers up alongside the police forces of the world in the fight against child sexual abuse, wonder no longer. The RC has got Australian politicians out of a fix before and, as long as we let them, it will get them out of a fix again.


Update: In another conversation it has been pointed out to me that in order to protect the integrity of the ACMA complaints system, ACMA may refuse to accept a public complaint. Now, whether this is best characterised as refusal to classify or refusal to refuse to classify, I am not sure. However, it does seem that this is the only way a content item can actually be refused classification by the system. I think you will agree, this is really quite remarkable.

ALP's new campaign messages to support improved content classification (and fight child abuse)

Every good promotional campaign needs a good one-two punch. Hence, these are the messages that the ALP can use to promote the merits of its film and literature classification-led approach to the problems of child sexual abuse:

Fight child sexual abuse: support ALP's decision to divert AFP resources to ACMA

ACMA: fighting child abuse - one content classification decision at a time.

Inadvertently exposed: the ALP's obsession with universal censorship

When you are a Government of a Western nation about to introduce a mandatory censorship regime unlike anything else in the Western world, it is a good idea to try to play up comparisons with social democracies like Denmark, Norway, Sweden and Finland and our Commonwealth cousins the UK and Canada. It is also good to downplay comparisons with authoritarian regimes like Iran, Saudi Arabia or China.

So, naturally enough the Government's FAQ about their current filtering policy attempts to do this by asking the rhetorical question: "How does Australia's approach compare with other western democracies?"

It's a good question, but sadly for the Government the answer only serves to emphasise what is so wrong about the Government's proposal. Of all the countries listed, not one has a mandatory filtering scheme. Of all the countries listed, not one attempts to filter anything other than strictly illegal child abuse material.

Just as revealing is the list they did not enumerate - the list of 30 or so Western democracies which, like Australia, do not presently have any filtering regime.

And, of course, it is no surprise to learn that the Government does not list the countries that do have mandatory filtering regimes like Iran, Saudi Arabia and China.

Comparing Australia's proposed policy with other Western democracies actually highlights how draconian this policy is. Why is it that Australia is the only Western democracy to propose a mandatory filter? Why is it that the scope of Australia's filter is so uniquely broad that it will include material that is actually legal to own and view?

Part of the problem is that Australia is trying to do with 'taste' police what other countries do with real police. Other countries treat child abuse as what it is: a horrific crime against children. Australia is trying to deal with the problem of child abuse by dealing with it as a content classification problem. The Government would have us believe that it can do something meaningful about the problem of child abuse by devoting more effort to content classification and then ensuring that content classification decisions are rigorously enforced at our digital borders (i.e. on the other side of the pipes into your living room).

Or, at least, it used to believe this. It is clear that the Government now understands that a mandatory filter can't contribute to fighting child abuse because it now states that the purpose of the mandatory ISP-level filter is merely to "reduce the risk of inadvertent exposure" to Refused Classification material. It readily admits that a technically competent user with the motivation to do so can circumvent a mandatory filter.

However, even this more modest aim is still far more draconian than those of other Western democracies that have some kind of filtering policy. These countries seek only to minimize inadvertent exposure to illegal child sexual abuse material which is a far more restricted category of material than that which is rated Refused Classification by the Australian National Classification Code.

Consider this: in 2003 Margaret Pomeranz, the ABC's film reviewer, attempted to give the Refused Classification film "Ken Park" a screening before a crowd in Balmain, Sydney. Police physically intervened to prevent her breaking the law. Yet it is exactly material of this kind that will be subject to Conroy's censorwall. Is Stephen Conroy prepared to call Margaret Pomeranz a purveyor of "the worst of the worst" kind of internet filth? Or is she instead a decent person who strongly believes the National Classification Board made an error when it gave "Ken Park" an RC rating?

The ALP's policy on ISP-level filtering has changed on numerous occasions since it was first drafted in 2006. At that time the policy was about mandating that ISPs offer a cleanfeed to families that wanted it. In December 2007, it was about mandating that ISPs impose a cleanfeed that people could opt out of. In 2008, the policy changed again and opt-out ceased to be an option.

All along we were told that a mandatory filter was necessary to prevent Australians who seek child pornography from viewing it.

The Government has since learnt that a filter will be utterly ineffective for that task, primarily because most child pornography is traded on networks that are invisible to an HTTP-based filter. It now, at least, readily admits that the filter can be technically circumvented with ease. So, in recognition of these cold hard facts, the Government now insists that the mandatory filter is no longer about preventing criminal access to illegal material. It is now merely about preventing inadvertent exposure of ordinary citizens to Refused Classification material.

Think about that.

The Government insists that it must filter your Internet connection to prevent you being inadvertently exposed to material, such as the movie "Ken Park", that the National Classification Board has deemed unsuitable for any other classification.

How paternalistic. How patronising.

There would be less (but not much less) disquiet about the mandatory filter if the Australian government chose to target, like the European governments it wants to compare itself to, only strictly illegal material. Yet the Government, despite the wriggle room afforded by changing its position once more, has explicitly decided not to go down this path. It has deliberately chosen to continue down the path of ensuring that the National Classification Code is uniformly and universally applied to citizens as if each and every one of them were themselves film and literature distributors.

People who are "inadvertently exposed" to films such as "Ken Park" are at little risk of abusing children because of that exposure. People who deliberately access child sexual abuse material are. Making it more difficult for Margaret Pomeranz to download "Ken Park" from the web does precisely nothing about the problem of child sexual abuse, irrespective of Minister Conroy's persistent angry insistence otherwise.


People wishing to publish this in other fora should contact the author to obtain permission.

Tuesday, December 22, 2009

On the limitations of using content classification as a crime fighting technique

Originally posted as a comment here.


There are some key differences between the European filters and the proposed Australian filter.

First, the European filters are all currently voluntary. Second, the European filters all approach the problem of child abuse as an extension of law enforcement efforts. The approach for the Australian filter is quite different: it is trying to tackle child abuse as if it were a content classification problem.

Whereas the decision procedure for the European filter is: "is the material to be blocked illegal?", the question for the Australian filter is: "has this material been refused classification by the body that assesses film and literature?"

One would have thought that the European filters are at least tackling a criminal problem as a criminal problem. The Australian Government is attempting to tackle child abuse as a problem of good or bad taste.

Any guesses which approach might be more effective?

Monday, December 21, 2009

The Karma Sutra Of The ALP's filtering policy

So, the Government's mandatory ISP-level filtering policy has morphed once more.

Back in 2006, when Big Kim was the great white hope of all Howard-fearing liberals, the ALP introduced a policy to mandate that all ISPs offer a cleanfeed to families that wanted it.

As far as we can tell, the ALP took this policy to the 2007 election (having changed horses in the meantime).

It came as a shock to us all when the Australian Christian Lobby's representative in cabinet announced, in the dying hours of 2007, that the new Government intended to mandate filtering of all residential ISP connections.

In the early days of this policy the Minister did offer an olive branch to those with free speech concerns - those perverts that were prepared to declare themselves as such could opt out.

When it became apparent that a large portion of the population was prepared to take this risk and opt-out, the flexible Minister changed his position once more and insisted that all Australians would be subject to a filter for which there was no opt-out option.

This was necessary, the Minister insisted, to combat the scourge of child pornography that was flooding, uninvited, into Australian homes.

Oh, really, Minister?

And then there was a lull as the Government set about concocting a trial that would demonstrate to a disbelieving public that the policy might just work. The trial was delayed for various reasons, the main one being a shortage of suitably compliant ISPs. Eventually a trial was established.

In the pressure to get the trial started, the department "forgot" to define success criteria for the trial, an oversight that the Minister thought was irrelevant since he would be in a much better position to determine what the success criteria were once the results were in.

The results came in. And had a nap. For months.

But during the extended nap, the results impregnated the Minister with new understanding. The filter cannot possibly be effective at the stated goal of ridding Australia of child pornography because of the ease of availability of "technical circumventions".

The newly impregnated Minister was worried. How can we sell this turkey to a disbelieving public?

"Change the scope, change the scope!", yelled the pragmatists.

So now we have it: the ALP's current policy on mandatory ISP-level filtering:

"Introduction of mandatory internet service provider (ISP) level filtering of content that is rated Refused Classification (RC) in order to reduce the risk of inadvertent exposure."

Gone is the fantasy that a mandatory filter can do anything about criminal access to illegal material.

In its place we have the objective of: "reducing the risk of inadvertent exposure" to Refused Classification material.

Does anyone have any evidence whatsoever that a) the probability of inadvertent exposure is high or b) the consequences to the individual or society of inadvertent exposure to refused classification material are serious?

Any evidence at all? Let alone quantified evidence that would allow some rational assessment of costs and benefits.

Apparently not. And why should we expect such a thing? The ways of the Karma Sutra are not to be understood by the uninitiated. We are mere citizen-playthings of the single-minded ALP-ACL beast.

jon.

A post to Kate Lundy's blog

[ Originally posted at Kate Lundy's blog. ]

Kate,

The Government's policy with respect to mandatory ISP filtering is currently:

"(The) Introduction of mandatory internet service provider (ISP) level filtering of content that is rated Refused Classification (RC) in order to reduce the risk of inadvertent exposure."

This policy is explicitly not about using filters to prevent criminal access to illegal material as the Government quite freely admits that such filters are easily circumvented with technical measures.

What then is the policy about? Let the words speak for themselves: "in order to reduce the risk of inadvertent exposure" to refused classification material.

Does the Government have any evidence that inadvertent exposure to refused classification material is an actual problem? Is there any evidence that inadvertent exposure to refused classification material causes lasting harm to either the viewer of such material or to society at large?

It is not even clear that the problem the Government is trying to solve is an actual problem in the first place.

Deploying a heavyweight censorship mechanism that has the potential to distort the health of Australian democracy for generations is hardly a rational policy response to a problem that hasn't even been demonstrated, let alone quantified.

Can you point to a single example of another Western democracy where the scope of the filter is broader than strictly illegal material?

What makes Australian citizens unique in the Western world that adults are not entitled to decide for themselves which legal (but refused classification) material they should be able to view?

Does not the Government's insistence on denying Australian adults this choice fly directly in the face of principle 1a) of the National Classification Code:

(a) adults should be able to read, hear and see what they want;

A commentary on the DBCDE FAQ

I've annotated the DBCDE FAQ with commentary.

Click through the image to read more.

Anyone wishing to prepare and publish their own annotated commentary can download a zipped version of the files and make their own edits as desired.

Thursday, February 26, 2009

For Bernadette, it will always be Groundhog Day

For Bernadette McMenamin, it will always be Groundhog Day - perhaps tomorrow, despite all evidence to the contrary, will be the day that delivers the perfect ISP-level 'filter' which blocks only that which is 'right' to block and nothing more.

When she is not accusing opponents of mandatory ISP-level censorship of advocating child pornography, she likes to question why technologists have failed to use their imagination to derive effective technical solutions to the problems of child pornography.

Bernadette, it is not for lack of imagination. We know exactly what an effective technical solution would look like. It is precisely this imagination which forces us to raise our voices in protest and warn those less technically literate than ourselves that we really do not want to go there.

The measures would not be cheap, nor would they perform well, and they would be horribly inaccurate. They would not even be 100% effective at denying access to illegal material, but they would be far, far more effective than what is currently proposed. And surely, if we are to think of the children, effectiveness should be our only concern.

A technical and legislative solution that is effective would have these characteristics:
  1. use of a whitelist to deny access to all sites not positively certified as acceptable;
  2. the outlawing and blocking of unregistered protocols, particularly those that can be used to implement tunnels;
  3. the outlawing and blocking of all unlicensed uses of encryption;
  4. the outlawing and blocking of all use of VPNs and anonymous proxies;
  5. the outlawing of the acquisition and use of server equipment located outside Australia;
  6. the outlawing and blocking of all P2P protocols;
  7. the outlawing of the use or possession of all pornography that depicts persons below the age of 30.

Not one of these technical or legislative measures would be acceptable in a democratic 20th century economy, let alone one that has pretensions to be a "digital" economy - one with a National Broadband Network, or not.

Yet all of these measures would be necessary to effectively deal with the problem of adults that seek child pornography. If just one of these measures is not adopted, the resulting filter would be vulnerable to subversion by those sufficiently motivated to subvert it.
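To make the first of these measures concrete, here is a minimal, hypothetical sketch (placeholder hostnames, not anyone's actual implementation) contrasting the default posture of the blacklist the Government proposes with the whitelist described above. A blacklist permits everything it has not yet examined and listed; a whitelist denies everything that has not been positively certified.

```python
# Illustrative only: hypothetical hostnames, contrasting default-allow
# (blacklist) with default-deny (whitelist) filtering postures.

BLACKLIST = {"refused.example.org"}          # placeholder: sites already listed
WHITELIST = {"certified.example.gov.au"}     # placeholder: positively certified sites

def blacklist_permits(host: str) -> bool:
    # Default allow: anything not yet examined and listed gets through.
    return host not in BLACKLIST

def whitelist_permits(host: str) -> bool:
    # Default deny: only positively certified sites get through.
    return host in WHITELIST

for host in ("refused.example.org", "certified.example.gov.au", "brand-new.example.net"):
    print(f"{host}: blacklist permits {blacklist_permits(host)}, whitelist permits {whitelist_permits(host)}")
```

A brand-new site, unknown to either list, sails straight through the blacklist and is blocked by the whitelist. That default-deny posture is the only thing that makes a filter 'effective' against material nobody has yet seen, and it is also precisely what makes it intolerable for a free society and a functioning digital economy.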

If the current 'filter' is implemented, it will be utterly ineffective: vulnerable children will still be abused by adults; the flow of the associated child pornography will continue unabated; those that 'think of the children' to the exclusion of every other rational consideration will demand yet more obtrusive and draconian controls on Internet usage.

We know the current proposal will fail to achieve the desired objectives; we know that child protection advocates of the kind represented by McMenamin will not cease in their calls for ever greater Government meddling; we know this will be but the first round of futile but increasingly invasive attempts to replace civic morality with technical fixes.

As technologists we are pleading - stop this insanity.

Any attempt to eviscerate the citizenry's moral conscience by replacing it with a set of technical censorwall rules is deeply flawed on a technical level, but also on an ethical one. Citizens should accept responsibility for avoiding illegal material and should be accountable for any transgressions - they should not be freed of this moral responsibility by a paternalistic Government and its supporters.