Crack URL Filter Definition

12/15/2017

Computer Hacking: this category refers to any site promoting questionable or illegal use of equipment and/or software to crack passwords, create viruses, gain access to other computers, and so on, including any site that offers instruction on how to hack. It does not include legitimate security information sites.

Internet censorship in Australia currently consists of a regulatory regime under which the Australian Communications and Media Authority (ACMA) has the power to enforce content restrictions on Internet content hosted within Australia and to maintain a 'black-list' of overseas websites, which is then provided for use in filtering software. The restrictions focus primarily on child pornography, sexual violence, and other illegal activities, compiled as a result of a consumer complaints process. In 2009, the OpenNet Initiative found no evidence of Internet filtering in Australia, although due to legal restrictions the ONI does not test for filtering of child pornography. In October 2008, a policy extending the regime to a system of mandatory filtering of overseas websites which are, or potentially would be, 'refused classification' (RC) in Australia was proposed.

Australia is classified as 'under surveillance' by Reporters Without Borders due to the proposed legislation. If enacted, the legislation would require Internet service providers to block access to such content for all users. The proposal has generated substantial opposition, with a number of concerns raised by opponents and only a few groups strongly in support. On 5 August 2010, the Coalition announced that it would not vote for the policy, making it virtually impossible for the filtering scheme to pass.

In November 2010, the Department of Broadband, Communications and the Digital Economy (DBCDE) released a document indicating that the earliest date any new legislation could reach parliament was mid-2013. In June 2011, two Australian ISPs, Telstra and Optus, confirmed that from mid-year they would voluntarily block access to a list of child abuse websites provided by the ACMA, plus additional websites on a list compiled by unnamed international organisations. In November 2012, the then Labor Communications Minister, Stephen Conroy, withdrew his party's mandatory Internet filter policy.

On the same day, Conroy stated that as a result of notices issued to Australia's largest ISPs, over 90% of Australians using Internet services would have a web filter. The Australian Federal Police would then pursue smaller ISPs and work with them to meet their 'obligation under Australian law'. IiNet and Internode quietly confirmed that the Australian Federal Police's request to block content had gone from voluntary to mandatory under s313 of existing law.

IiNet sought legal advice and accepted the s313 mandatory notice, but would not reveal the legal advice publicly. In June 2015, the country passed an amendment allowing the court-ordered blocking of websites deemed to primarily facilitate copyright infringement.

In December 2016, the Federal Court of Australia ordered more than fifty ISPs to block five sites that infringe the Copyright Act, after rights holders Roadshow Films, Foxtel, Disney, Paramount, Columbia and 20th Century Fox filed a lawsuit. The sites barred include The Pirate Bay, Torrentz, TorrentHound, IsoHunt and SolarMovie. In 2006, Australia had a good flow of information online compared to most other countries. A collection of both federal and state laws applies to Internet content in Australia.

Federal law

While the Australian constitution does not explicitly provide for freedom of speech or of the press, the High Court has held that a right to freedom of expression is implied in the constitution, and the government generally respects these rights in practice.

An independent press, an effective judiciary, and a functioning democratic political system combine to ensure freedom of speech and press. There were no government restrictions on access to the Internet or credible reports that the government routinely monitored e-mail or Internet chat rooms.

Individuals and groups can and do engage in the expression of views via the Internet, including by e-mail.

Broadcasting Services Act 1992

Provisions of Schedule 5 and Schedule 7 of the Broadcasting Services Act 1992, inserted in 1999 and 2007 respectively, allow the ACMA to effectively ban some content from being hosted within Australia. Under this regime, if a complaint is issued about material 'broadcast' on the Internet, the ACMA is allowed to examine the material under the guidelines for film and video.

Content is deemed 'prohibited' where it is (or in the ACMA's judgement likely would be):

• Refused classification, or classified X18+
• Classified R18+, and not protected by a restricted access system
• Classified MA15+ and not protected by an adult verification system, where the user has paid to access the content.

Where content is deemed prohibited, the ACMA is empowered to issue local sites with a takedown notice under which the content must be removed; failure to do so can result in fines of up to $11,000 per day. If the site is hosted outside Australia, the content in question is added to a blacklist of banned URLs.

This list of banned Web pages is then added (in encrypted form) to filtering software, which must be offered to all consumers by their Internet service providers. In March 2009, this blacklist was leaked online. A number of takedown notices have been issued to Australian-hosted websites; in at least one documented case, the hosting was merely shifted to a server in the United States and the records updated, so that consumers may never have noticed the change.

Suicide Related Materials Offences Act 2006

In 2006, the Federal Parliament passed the Suicide Related Materials Offences Act, which makes it illegal to use communications media such as the Internet to discuss the practical aspects of suicide.

Copyright Amendment (Online Infringement) Bill 2015

In June 2015, an amendment was passed to the Copyright Act which allows for the court-ordered blocking of non-domestic websites whose 'primary purpose' is to facilitate copyright infringement.

State and territory laws

Some state governments have laws that ban the transmission of material unsuitable for minors. In New South Wales, Internet censorship legislation introduced in 2001 criminalises online material which is unsuitable for minors.

In 2002, the New South Wales Standing Committee on Social Issues issued a report recommending that the legislation be repealed, and in response the government stated that the legislation 'will be neither commenced nor repealed' until after the review of the Commonwealth Internet censorship legislation had been completed.

Notable examples

In October 2000, Electronic Frontiers Australia (EFA) attempted, under freedom of information (FOI) law, to obtain documents relating to the implementation of the filter. While a few were released, many were not, and in 2003 new legislation, the 'Communications Legislation Amendment Bill (No. 1) 2002', was passed by the Liberal government and four independents, and opposed by The Greens and the Australian Labor Party. While the stated reason for the bill was to prevent people accessing prohibited content by examining the blocked sites, the bill exempted whole documents from FOI, many of which did not reference prohibited content at all. EFA state that the bill was designed to prevent further public scrutiny of filtering proposals. In 2002, a state minister attempted, without success, to shut down three protest websites by appealing to the then communications minister.

These were reported to be Melbourne-based protest websites, and the Australian Broadcasting Authority (the predecessor to the ACMA) cleared them of breaching government regulations on 30 October 2002. Also in 2002, under the terms of the Racial Discrimination Act, the Federal Court ordered Fredrick Töben to remove material from his Australian website which denied aspects of the Holocaust and vilified Jews.

In 2006, Richard Neville published a 'spoof' website containing a fictional transcript of John Howard apologising to Australians for the Iraq war. The website was forcibly taken offline by the government with no recourse. After the devastating bushfires in Victoria in February 2009, details about an alleged arsonist were posted online by bloggers. The Victorian police deputy commissioner asked the state to examine the possibility of removing these blogs from the Internet, as they might jeopardise any court case.

In March 2009, after a user posted a link to a site on the ACMA's blacklist on the Whirlpool forum, Whirlpool's web host, Bulletproof Networks, was threatened with fines of $11,000 per day if the offending link was not removed. The same link in an article on EFA's website was removed in May 2009 after the ACMA issued a 'link-deletion notice', and the EFA took the precautionary step of also removing indirect links to the material in question. The 2009 winner of the George Polk Award for videography shows footage of a 26-year-old woman being shot and dying during the Iranian election protests.

This footage has also been declared 'prohibited content' by the ACMA, attracting fines of $11,000 per day for any Australian website which posts a link to the video. After the Australian government announced plans to mandate filtering in Australia in December 2009, an anti-censorship website hosted on stephenconroy.com.au (the full name of the then Minister for Broadband, Communications and the Digital Economy) was taken offline by auDA after only 24 hours online. On 22 May 2009, it was disclosed in the press that the ACMA had added the online Peaceful Pill Handbook, which deals with the topic of voluntary euthanasia, to the blacklist used to filter access for citizens of Australia. Euthanasia groups planned to hold seminars around Australia teaching how to evade the proposed filter using proxy servers. A spokeswoman for Senator Conroy said that euthanasia would not be targeted by the proposed web filter; however, Stephen Conroy had previously stated that 'while euthanasia remains illegal it will be captured by the RC filter'. In January 2010, an article titled 'Aboriginal' was removed from the search engine results of Google, following a complaint about its content. George Newhouse, the lawyer for the complainant, claims the site is 'illegal' and should be blocked by the mandatory filter.

As the address of the site appeared on the leaked ACMA blacklist, it is likely that the whole site would be blocked by the filter. A search on terms related to the article produces a message that one of the results has been removed following a legal request under Australian law. In 2010, the website of the Australian Sex Party was banned from within several state and federal government departments.

The convenor of the Australian Sex Party has described this ban as 'unconstitutional'. In April 2013, it was revealed that an IP address used by more than 1,200 websites had been blocked by certain Internet service providers.

The block was discovered by the Melbourne Free University, which was one of the sites blocked. It was later revealed that the Australian Securities and Investments Commission (ASIC) had ordered the blocking of the address to target a fraud website, and that the remaining websites were blocked unintentionally. The block was subsequently lifted.

ASIC subsequently revealed that it had used its blocking powers 10 times over the preceding 12 months, and that a separate action taken in March had also caused the inadvertent temporary blocking of around 1,000 untargeted active sites, as well as around 249,000 sites that hosted 'no substantive content' or advertised their domain name for sale. The blocks were carried out under section 313, and blocking notices were sent to four or five ISPs on each occasion.

Proposed mandatory filtering legislation

In October 2008, the governing Labor Party proposed extending Internet censorship to a system of mandatory filtering of overseas websites which are, or potentially would be, 'refused classification' (RC) in Australia. As of June 2010, legislation to enact the proposed policy had not been drafted.

The proposal generated substantial opposition, with a number of concerns raised by opponents and only a few groups strongly supporting the policy. In November 2010, the DBCDE released a document indicating that the earliest date any new legislation could reach parliament was mid-2013. However, voluntary filtering by ISPs remains a possibility.

Terminology

Proposed Australian laws on Internet censorship are sometimes referred to as the Great Firewall of Australia, the Rabbit Proof Firewall (a reference to Australia's rabbit-proof fence), Firewall Australia or the Great Firewall Reef (a reference to the Great Firewall of China and the Great Barrier Reef). The proposed filter has been referred to in the media variously as an Internet filter and a web filter. The World Wide Web is a myriad of pages containing links to each other, hosted on servers around the world. The Internet is the physical network used to convey requests from users' computers to these servers and responses from the servers back to the users. The proposed filter only monitors the traffic used to convey web content. As it aims to monitor the majority of web traffic, it is appropriately referred to as a web filter.

As it is agnostic of the vast majority (99.99%) of connections a user's computer might establish with other computers on the Internet, it is something of a misnomer to refer to it as an Internet filter. Since the proposed filter is situated at the ISP (the junction between users and the Internet at large), introducing such a filter cannot slow down the Internet itself.

It can only (potentially) slow down access to the Internet for users of that ISP. Other considerations aside, communication speed across the Internet for any non-web traffic would be unaffected.

History

In 1995, the Labor federal government began inquiries into regulating access to online content as part of expanding the scope of material subject to classification. In the same year, the Liberal state governments of Victoria and Western Australia and the Country Liberal government of the Northern Territory implemented changes to the law allowing online content to be censored, again by expanding the scope of classifiable material. Queensland introduced similar legislation at the time, but in a case involving an ISP the judge ruled that the act did not apply to online services. In 1996, the Labor state government of New South Wales attempted to propose standard Internet censorship legislation for all Australian states and territories, which would have made ISPs responsible for their customers' communications.

The proposed legislation attracted widespread protests, however, and was postponed in favour of a national scheme. In 1997, the incoming Liberal federal government commissioned further inquiries into a variety of online censorship schemes, including self-imposed censorship by ISPs. By 1999, the then federal government attempted to introduce an Internet censorship regime. Some have suggested this was done to gain support from minority senators for other government legislation, but as noted above, the censorship plan had been in development for several years. In 2001, the CSIRO was commissioned to examine Internet content filtering. The resulting report focused primarily on evaluating the effectiveness of client-side filtering schemes (which were generally ineffective), but also discussed some of the difficulties with ISP-based filtering. In March 2003, the Fairfax papers, among others, reported the results of a survey of 200 children which found that many of them had encountered pornography on the Internet.

Over the next few days there was a storm of media and political attention, with calls for finer Internet filters and tougher censorship laws. Analysis of the report showed little new material: only 2% of girls had admitted being exposed to pornography, while the figure for boys was 38%; such a difference between boys and girls would seem to indicate that inadvertent exposure was rare, contrary to the report's conclusions. After the controversy died down, no new action resulted from the report, the media attention, or the political speeches. In 2003, the Labor Party opposed filtering at the ISP level, with one Labor senator stating: 'Unfortunately, such a short memory regarding the debate in 1999 about Internet content has led the coalition to already offer support for greater censorship by actively considering proposals for unworkable, quick fixes that involve filtering the Internet at the ISP level.'

Shortly before the 2004 federal election, two political parties issued new policies on Internet censorship: one party's policy involved voluntary adherence by users, while the other released a far stricter policy of mandatory filtering at the ISP level. A petition was submitted to the Australian Federal Government in 2004 seeking to further restrict children's access to material via the Internet.

The petition was submitted in December 2004. On 21 March 2006, the Labor Party committed to requiring all ISPs to implement a mandatory system applicable to “all households, and to schools and other public Internet points” to “prevent users from accessing any content that has been identified as prohibited by the Australian Communications and Media Authority”. On the same day, the then communications minister stated that to “filter the Internet will only result in slowing down the Internet for every Australian without effectively protecting children from inappropriate and offensive content”.

Political party policies, positions and statements

Labor Party

On 31 December 2007, Stephen Conroy announced the Federal Government's intention to introduce an ISP-based filter to censor 'inappropriate material' from the Internet to protect children. In this announcement, it was stated that adults could opt out of the filter to receive uncensored access to the Internet. In May 2008, the government commenced an $82 million “cybersafety plan” which included an additional mandatory filter with no opt-out provision. This ISP-based filter aims to stop adults from downloading content that is illegal to possess in Australia, such as child pornography or terrorism-related material.

In March 2009, Stephen Conroy dismissed as 'conspiracy theories' suggestions that the Government would use the filter to crack down on other forms of content. He stated that the filter would only be used to remove 'refused classification' (RC) content, using the same rationale applied to television, radio and print publications, and that the Senate could be relied upon to provide rigorous assessment of any proposed legislation. However, Labor's policy statement on the issue contradicts this, as does an earlier ministerial release from 2008. The most recent explanation of the government's position, provided on the ministry website, clearly states that only ISP-level filtering of (designated) refused classification (RC) material will be mandatory under its policy.

However, ISPs would be encouraged to offer ISP-level filtering of 'adult content' as an optional (commercial) service to their customers. Such an optional extra service is aimed at parents trying to protect their children from 'undesirable' content that would otherwise be available because it is not RC (e.g., it might receive a classification of 'R'). One Labor senator said in January 2010 that she was lobbying within the party for an 'opt-out' filter, describing it as the 'least worst' option, and in February 2010 said she would propose the opt-out option when the filtering legislation went before caucus. Stephen Conroy has stated that 85% of Internet service providers, including Telstra, Optus, iPrimus, and iiNet, welcome the Internet filter. In response, Steve Dalby, iiNet's chief regulatory officer, stated that iiNet as a company does not support the Internet filter, and never has. On 9 July 2010, Stephen Conroy announced that any mandatory filtering would be delayed until at least 2011.

On 9 November 2012, Stephen Conroy shelved the proposed mandatory filter legislation in favour of existing legislation, touting that it had been successful in compelling the largest ISPs to adopt a filter. As a result, 90% of Australian Internet users are blocked from accessing the listed content.

The Liberal/Nationals Coalition

In February 2009, the then opposition communications spokesman obtained independent legal advice confirming that a mandatory censorship regime would require new legislation. In March 2009, after the ACMA blacklist was leaked and iiNet withdrew from the filtering trials, he stated that Stephen Conroy was 'completely botching the implementation of this filtering policy'. In March 2010, shadow treasurer Joe Hockey attacked the filter, saying: 'What we have in the government’s Internet filtering proposals is a scheme that is likely to be unworkable in practice. But more perniciously it is a scheme that will create the infrastructure for government censorship on a broader scale'.

During the 2010 federal election campaign, Liberal communications spokesman Tony Smith announced that 'a Coalition government will not introduce a mandatory ISP level filter', with Joe Hockey also announcing an intention to vote against the policy if Labor were re-elected. This followed the 2010 Federal Conference of the Nationals passing a motion, proposed by the Young Nationals, to 'oppose any mandatory ISP-level internet censorship'. In November 2012, the Coalition communications spokesman welcomed the dropping of the proposed legislation, arguing it had endangered freedom and Internet performance. Some Coalition members nevertheless voiced support for a mandatory filter to protect children and families, but did not propose one, citing a lack of political support at the time. The Coalition has proposed an 'eSafety commissioner' empowered to take down undesirable content from the Internet as a means to protect children; the proposal was criticised as a duplication of existing government efforts and as 'difficult and expensive' to implement.

In September 2013, two days before the federal election, the Coalition announced that it would introduce an opt-out filter for all Internet connections, including both fixed-line and mobile devices. This was quickly retracted as 'poorly worded' in a statement which said: 'The correct position is that the Coalition will encourage mobile phone and Internet service providers to make available software which parents can choose to install on their own devices to protect their children from inappropriate material.'

The Greens

The Greens do not support the Internet filter, and Greens senator Scott Ludlam predicted that, due to obstruction in the Senate, the legislation would not be introduced until after the next federal election. At the end of 2008, he asked questions in parliament related to the filtering trial, for which the Government provided answers in January 2009:

• When asked about the stated public demand for Internet filtering, the government responded that the filtering was an election commitment.
• The Internet filter would be easy to bypass using technological measures.
• 674 of the 1,370 blocked sites on the mandatory list related to child pornography; 506 sites would be classified as R18+ or X18+, despite the fact that such content is legal to view in Australia. The remaining 190 sites on the blacklist can be viewed in the full leaked blacklists on WikiLeaks.

Ludlam believes that the Labor Party may have hit a wall of 'technical impossibility', and that the filter does not suit its purpose: 'This isn't a great advertisement for the workability of any large scale scheme. The proposal has always been unpopular, now perhaps the Government is starting to come to grips with what the industry has been saying all along: if your policy objective is to protect children on-line, this is not the way to go about it.' Despite their lack of support for the filter, The Greens preselected a candidate whose think-tank had first suggested an ISP-based Internet filter for a by-election.

Independents and minor parties

In October 2008, Family First senator Steve Fielding was reported to support the blocking of pornography and fetish material under the government's plans to filter access to the Internet. A Family First spokeswoman confirmed that the party wants such content banned for everyone, including adults. A spokesman for independent senator Nick Xenophon said that, should the filtering plan go ahead, he would look to use it to block Australians from accessing overseas gambling sites, which are illegal to run in Australia. Senator Xenophon has, however, stated that he has serious concerns about the plan, and in February 2009 withdrew all support, stating that 'the more evidence that's come out, the more questions there are on this'.

He believes the money would be better spent educating parents and cracking down on the groups used by paedophiles. A political party associated with the Eros Association, the Australian Sex Party, was launched in November 2008 and plans to campaign on issues including censorship and the federal government's promised Internet filter.

In 2014, the party won a seat in the Victorian Legislative Council.

Two blacklists

As of October 2008, the plan included two blacklists: the first used to filter 'illegal' content, and the second used to filter additional content unsuitable for children. The first filter would be mandatory for all users of the Internet, while the second would allow opting out. The government would not release details of the content on either list, but stated that the mandatory filter would include at least 10,000 sites, drawing on both the ACMA blacklist and the UK's Internet Watch Foundation (IWF) blacklist. In December 2008, the IWF list caused problems when a Wikipedia article was added to it, preventing many people in the UK from editing Wikipedia. The ACMA definitions of 'prohibited content' give some idea of what could potentially be blacklisted.

Online content prohibited by the ACMA includes:

• Any online content that is classified RC or X18+ by the Classification Board. This includes real depictions of actual sexual activity, child pornography, depictions of bestiality, material containing excessive violence or sexual violence, detailed instruction in crime, violence or drug use, and/or material that advocates the doing of a terrorist act.
• Content which is classified R18+ and not subject to a restricted access system that prevents access by children. This includes depictions of simulated sexual activity, material containing strong, realistic violence and other material dealing with intense adult themes.

In answer to a question in Parliament in October 2008, the government stated in January 2009 that of the 1,370 websites on the blacklist, 674 were related to child pornography, and the remainder would be classified as R18+ or X18+. Two websites are known to be on the ACMA blacklist after they were submitted to ACMA for review.

When the ACMA responded with the advice that these sites had been placed on its blacklist, its responses were in turn posted back online by the original submitters, with the purpose of demonstrating that political content would be censored by the mandatory filter. One was an anti-abortion website, with details posted to the Whirlpool forum, and the other was a copy of Denmark's own Internet blacklist, with both the blacklist and ACMA's response posted on WikiLeaks. The web hosting company for Whirlpool, Bulletproof Networks, was threatened with $11,000 in fines per day if the link was not removed, so Whirlpool removed the link to the restricted site.

Civil liberties campaigners regard the inclusion of these sites on the blacklist as a demonstration that it is not difficult to get a site placed on the blacklist, and that the blacklist includes sites which are themselves not illegal to view.

Leaking of the ACMA blacklist

18 March 2009: WikiLeaks publishes a list which is 'derived from the ACMA list for the use of government-approved censorship software in its "ACMA-only" mode.' Included in the list were 'the websites of a Queensland dentist, a tuckshop convener and a kennel operator'.

19 March 2009: Australian media sources report that the ACMA blacklist has been leaked to WikiLeaks: 'The seemingly innocuous websites were among a leaked list of 2300 websites the Australian Communications and Media Authority was planning to ban to protect children from graphic pornography and violence.' The ACMA claimed that the list which appeared on the WikiLeaks website was not the ACMA 'blacklist', as it contained 2,300 URLs; according to the ACMA, its list contained only 1,061 URLs in August 2008 and had at no stage contained 2,300.

The ACMA report on the issue noted the similarities between the two lists, yet addressed only the claim reported in the media that the list was the blacklist. The report contains only the following claims about the two lists:

• 'The list provided to ACMA differs markedly in length and format to the ACMA blacklist.'
• 'The ACMA blacklist has at no stage been 2,300 URLs in length and at August 2008 consisted of 1,061 URLs.'

20 March 2009: WikiLeaks published another list, this time closer to the length published by the ACMA. WikiLeaks believes the list was up to date as of the time of publication.

25 March 2009: Stephen Conroy reportedly stated that this list closely resembles the ACMA list.

26 March 2009: The report of 25 March 2009 was followed the next day by the Minister's statement on an ABC television program that 'the second list which has appeared appears to be closer [to the true black-list]. I don't actually know what's on the list but I'm told by [...] ACMA it appears to be closer to the actual, legitimate list.'

On the program Senator Conroy also explained that the seemingly inexplicable censoring of a dentist's website was due to subversion of the website by the Russian mafia, who had inserted RC material. In the same discussion it emerged that a photographer's website, despite the PG rating given to his photographs by the same body, had appeared on the blacklist due to a technical error, according to Stephen Conroy. The ACMA has since released a statement claiming the technical error was a 'computer system caching error' and further stated that it had found this to be the only URL where a caching error resulted in the URL being incorrectly added to the list.

Live filtering trials

The government committed to trials of the mandatory Internet filter before implementation. On 28 July 2008, a report entitled “Closed Environment Testing of ISP-Level Internet Content Filtering” showed performance and accuracy problems with the six unnamed ISP-based filters trialled.

Scott Mitchell, 4GuysFromRolla.com, March 2004

Applies to: Microsoft® ASP.NET

Summary: Examines how to perform dynamic URL rewriting with Microsoft ASP.NET. URL rewriting is the process of intercepting an incoming Web request and automatically redirecting it to a different URL. Discusses the various techniques for implementing URL rewriting, and examines real-world scenarios of URL rewriting.

Introduction

Take a moment to look at some of the URLs on your website. Do you find long URLs cluttered with querystring parameters? Or maybe you have a bunch of Web pages that were moved from one directory or website to another, resulting in broken links for visitors who have bookmarked the old URLs.

In this article we'll look at using URL rewriting to shorten those ugly URLs to meaningful, memorable ones, replacing long querystring URLs with something like yoursite.com/products/Widgets. We'll also see how URL rewriting can be used to create an intelligent 404 error. URL rewriting is the process of intercepting an incoming Web request and redirecting the request to a different resource. When performing URL rewriting, typically the URL being requested is checked and, based on its value, the request is redirected to a different URL. For example, in the case where a website restructuring caused all of the Web pages in the /people/ directory to be moved to an /info/employees/ directory, you would want to use URL rewriting to check whether a Web request was intended for a file in the /people/ directory and, if so, automatically redirect the request to the same file in the /info/employees/ directory instead, as sketched below.
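Assuming the RewritePath() technique covered later in this article and a hypothetical Global.asax.cs code-behind, that redirect might look roughly like this (the directory names match the example above; everything else is illustrative):

using System;
using System.Web;

// Global.asax.cs (hypothetical): handle BeginRequest for every incoming request
// and serve files requested under /people/ from /info/employees/ instead.
public class Global : HttpApplication
{
    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        string path = Request.Path;   // e.g. "/people/bob.aspx"

        if (path.ToLower().StartsWith("/people/"))
        {
            // Rewrite to the same file name under the new directory.
            Context.RewritePath("/info/employees/" + path.Substring("/people/".Length));
        }
    }
}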

With classic ASP, the only way to utilize URL rewriting was to write an ISAPI filter or to buy a third-party product that offered URL rewriting capabilities. With Microsoft® ASP.NET, however, you can easily create your own URL rewriting software in a number of ways. In this article we'll examine the techniques available to ASP.NET developers for implementing URL rewriting, and then turn to some real-world uses of URL rewriting. Before we delve into the technological specifics of URL rewriting, let's first take a look at some everyday scenarios where URL rewriting can be employed.

Common Uses of URL Rewriting

Creating data-driven ASP.NET websites often results in a single Web page that displays a subset of the database's data based on querystring parameters. For example, in designing an e-commerce site, one of your tasks would be to allow users to browse through the products for sale. To facilitate this, you might create a page called displayCategory.aspx that would display the products for a given category. The category whose products to view would be specified by a querystring parameter; that is, if the user wanted to browse the Widgets for sale, and all Widgets had a CategoryID of 5, the user would visit a URL like displayCategory.aspx?CategoryID=5. There are two downsides to creating a website with such URLs. First, from the end user's perspective, the URL is a mess. Usability expert Jakob Nielsen recommends that URLs be chosen so that they:

• Are short.

• Are easy to type.
• Visualize the site structure.
• Are 'hackable,' allowing the user to navigate through the site by hacking off parts of the URL.

I would add to that list that URLs should also be easy to remember. The querystring URL above meets none of Nielsen's criteria, nor is it easy to remember. Asking users to type in querystring values makes a URL hard to type and makes the URL 'hackable' only by experienced Web developers who understand the purpose of querystring parameters and their name/value pair structure.

A better approach is to allow for a sensible, memorable URL, such as yoursite.com/products/Widgets. By just looking at the URL you can infer what will be displayed—information about Widgets. The URL is easy to remember and share, too. I can tell my colleague, 'Check out yoursite.com/products/Widgets,' and she'll likely be able to bring up the page without needing to ask me again what the URL was. (Try doing that with, say, an Amazon.com page!) The URL also appears, and should behave, 'hackable.' That is, if the user hacks off the end of the URL and types in yoursite.com/products, they should see a listing of all products, or at least a listing of all categories of products they can view.

Note: For a prime example of a 'hackable' URL, consider the URLs generated by many blog engines. To view the posts for January 28, 2004, one visits a URL like yoursite.com/2004/01/28/. If the URL is hacked down to yoursite.com/2004/01/, the user will see all posts for January 2004.

Cutting it down further to yoursite.com/2004/ will display all posts for the year 2004. In addition to simplifying URLs, URL rewriting is also often used to handle website restructuring that would otherwise result in numerous broken links and outdated bookmarks.

What Happens When a Request Reaches IIS

Before we examine exactly how to implement URL rewriting, it's important that we have an understanding of how incoming requests are handled by Microsoft® Internet Information Services (IIS). When a request arrives at an IIS Web server, IIS examines the requested file's extension to determine how to handle the request. Requests can be handled natively by IIS—as are HTML pages, images, and other static content—or IIS can route the request to an ISAPI extension. (An ISAPI extension is an unmanaged, compiled class that handles an incoming Web request.

Its task is to generate the content for the requested resource.) For example, if a request comes in for a Web page named Info.asp, IIS will route the message to the asp.dll ISAPI extension. This ISAPI extension will then load the requested ASP page, execute it, and return its rendered HTML to IIS, which will then send it back to the requesting client.

For ASP.NET pages, IIS routes the message to the aspnet_isapi.dll ISAPI extension. The aspnet_isapi.dll ISAPI extension then hands off processing to the managed ASP.NET worker process, which processes the request, returning the ASP.NET Web page's rendered HTML. You can customize IIS to specify what extensions are mapped to what ISAPI extensions. Figure 1 shows the Application Configuration dialog box from the Internet Information Services Administrative Tool.

Note that the ASP.NET-related extensions—.aspx, .ascx, .config, .asmx, .rem, .cs, .vb, and others—are all mapped to the aspnet_isapi.dll ISAPI extension (Figure 1: configured mappings for file extensions). A thorough discussion of how IIS manages incoming requests is a bit beyond the scope of this article; a great, in-depth discussion can be found in Michele Leroux Bustamante's article on the subject. It's important to understand that the ASP.NET engine gets its hands only on incoming Web requests whose extensions are explicitly mapped to the aspnet_isapi.dll in IIS.

Examining Requests with ISAPI Filters

In addition to mapping the incoming Web request's file extension to the appropriate ISAPI extension, IIS also performs a number of other tasks. For example, IIS attempts to authenticate the user making the request and determine if the authenticated user has authorization to access the requested file. During the lifetime of handling a request, IIS passes through several states.

At each state, IIS raises an event that can be programmatically handled using ISAPI filters. Like ISAPI extensions, ISAPI filters are blocks of unmanaged code installed on the Web server. ISAPI extensions are designed to generate the response for a request to a particular file type; ISAPI filters, on the other hand, contain code to respond to events raised by IIS. ISAPI filters can intercept and even modify the incoming and outgoing data. ISAPI filters have numerous applications, including:

• Authentication and authorization.
• Logging and monitoring.
• HTTP compression.
• URL rewriting.

While ISAPI filters can be used to perform URL rewriting, this article examines implementing URL rewriting using ASP.NET. However, we will discuss the tradeoffs between implementing URL rewriting as an ISAPI filter versus using techniques available in ASP.NET.

What Happens When a Request Enters the ASP.NET Engine

Prior to ASP.NET, URL rewriting on IIS Web servers needed to be implemented using an ISAPI filter. URL rewriting is possible with ASP.NET because the ASP.NET engine is strikingly similar to IIS. The similarities arise because the ASP.NET engine:

• Raises events as it processes a request.
• Allows an arbitrary number of HTTP modules to handle the events that are raised, akin to IIS's ISAPI filters.
• Delegates rendering the requested resource to an HTTP handler, which is akin to IIS's ISAPI extensions.

Like IIS, during the lifetime of a request the ASP.NET engine fires events signaling its change from one state of processing to another. The BeginRequest event, for example, is fired when the ASP.NET engine first responds to a request.

The AuthenticateRequest event fires next, which occurs when the identity of the user has been established. (There are numerous other events— AuthorizeRequest, ResolveRequestCache, and EndRequest, among others. These events are events of the System.Web.HttpApplication class; for more information consult the technical documentation.) As we discussed in the previous section, ISAPI filters can be created to respond to the events raised by IIS. In a similar vein, ASP.NET provides HTTP modules that can respond to the events raised by the ASP.NET engine.

An ASP.NET Web application can be configured to have multiple HTTP modules. For each request processed by the ASP.NET engine, each configured HTTP module is initialized and allowed to wire up event handlers to the events raised during the processing of the request. Realize that there are a number of built-in HTTP modules utilized on each and every request. One of the built-in HTTP modules is the FormsAuthenticationModule, which first checks to see if forms authentication is being used and, if so, whether the user is authenticated or not. If not, the user is automatically redirected to the specified logon page. Recall that with IIS, an incoming request is eventually directed to an ISAPI extension, whose job it is to return the data for the particular request. For example, when a request for a classic ASP Web page arrives, IIS hands off the request to the asp.dll ISAPI extension, whose task it is to return the HTML markup for the requested ASP page.

The ASP.NET engine utilizes a similar approach. After initializing the HTTP modules, the ASP.NET engine's next task is to determine what HTTP handler should process the request. All requests that pass through the ASP.NET engine eventually arrive at an HTTP handler or an HTTP handler factory (an HTTP handler factory simply returns an instance of an HTTP handler that is then used to process the request). The final HTTP handler renders the requested resource, returning the response. This response is sent back to IIS, which then returns it to the user that made the request.

ASP.NET includes a number of built-in HTTP handlers. The PageHandlerFactory, for example, is used to render ASP.NET Web pages. The WebServiceHandlerFactory is used to render the response SOAP envelopes for ASP.NET Web services. The TraceHandler renders the HTML markup for requests to trace.axd.

Figure 2 illustrates how a request for an ASP.NET resource is handled. First, IIS receives the request and dispatches it to aspnet_isapi.dll.

Next, the ASP.NET engine initializes the configured HTTP modules. Finally, the proper HTTP handler is invoked and the requested resource is rendered, returning the generated markup back to IIS and back to the requesting client. Request processing by IIS and ASP.NET Creating and Registering Custom HTTP Modules and HTTP Handlers Creating custom HTTP modules and HTTP handlers are relatively simple tasks, which involve created a managed class that implements the correct interface. HTTP modules must implement the System.Web.IHttpModule interface, while HTTP handlers and HTTP handler factories must implement the System.Web.IHttpHandler interface and System.Web.IHttpHandlerFactory interface, respectively. The specifics of creating HTTP handlers and HTTP modules is beyond the scope of this article. For a good background, read Mansoor Ahmed Siddiqui's article,.

Once a custom HTTP module or HTTP handler has been created, it must be registered with the Web application. Registering HTTP modules and HTTP handlers for an entire Web server requires only a simple addition to the machine.config file; registering an HTTP module or HTTP handler for a specific Web application involves adding a few lines of XML to the application's Web.config file. Specifically, to add an HTTP module to a Web application, add an entry to the <httpModules> section within the Web.config's configuration/system.web section (a sketch follows below). The type value provides the assembly and class name of the HTTP module, whereas the name value provides a friendly name by which the HTTP module can be referred to in the Global.asax file. HTTP handlers and HTTP handler factories are configured by the <httpHandlers> tag in the Web.config's configuration/system.web section. Recall that for each incoming request, the ASP.NET engine determines what HTTP handler should be used to render the request. This decision is made based on the incoming request's verb and path. The verb specifies what type of HTTP request was made—GET or POST—whereas the path specifies the location and filename of the file requested. So, if we wanted to have an HTTP handler handle all requests—either GET or POST—for files with the .scott extension, we'd add a matching entry to the <httpHandlers> section, where type is the type of our HTTP handler.
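As a rough sketch of these registration entries, using the standard ASP.NET 1.x schema (the namespace, class, and assembly names below are placeholders):

<configuration>
  <system.web>
    <!-- Register an HTTP module: type is "Namespace.ClassName, AssemblyName". -->
    <httpModules>
      <add name="SampleModule" type="MyApp.SampleModule, MyApp" />
    </httpModules>

    <!-- Register an HTTP handler for all GET/POST requests to *.scott files. -->
    <httpHandlers>
      <add verb="*" path="*.scott" type="MyApp.ScottHandler, MyApp" />
    </httpHandlers>
  </system.web>
</configuration>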

Note: When registering HTTP handlers, it is important to ensure that the extensions used by the HTTP handler are mapped in IIS to the ASP.NET engine. That is, in our .scott example, if the .scott extension is not mapped in IIS to the aspnet_isapi.dll ISAPI extension, a request for the file foo.scott will result in IIS attempting to return the contents of the file foo.scott. In order for the HTTP handler to process this request, the .scott extension must be mapped to the ASP.NET engine.

The ASP.NET engine, then, will route the request correctly to the appropriate HTTP handler. For more information on registering HTTP modules and HTTP handlers, be sure to consult the relevant documentation.

Implementing URL Rewriting

URL rewriting can be implemented either with ISAPI filters at the IIS Web server level, or with HTTP modules or HTTP handlers at the ASP.NET level. This article focuses on implementing URL rewriting with ASP.NET, so we won't be delving into the specifics of implementing URL rewriting with ISAPI filters. Implementing URL rewriting at the ASP.NET level is possible through the System.Web.HttpContext class's RewritePath() method.

The HttpContext class contains HTTP-specific information about a specific HTTP request. With each request received by the ASP.NET engine, an HttpContext instance is created for that request. This class has properties like: Request and Response, which provide access to the incoming request and outgoing response; Application and Session, which provide access to application and session variables; User, which provides information about the authenticated user; and other related properties.

With the Microsoft® .NET Framework Version 1.0, the RewritePath() method accepts a single string, the new path to use. Internally, the HttpContext class's RewritePath(string) method updates the Request object's Path and QueryString properties. In addition to RewritePath(string), the .NET Framework Version 1.1 includes another form of the RewritePath() method, one that accepts three string input parameters. This alternate overloaded form not only sets the Request object's Path and QueryString properties, but also sets internal member variables that are used to compute the Request object's values for its PhysicalPath, PathInfo, and FilePath properties. To implement URL rewriting in ASP.NET, then, we need to create an HTTP module or HTTP handler that:

• Checks the requested path to determine if the URL needs to be rewritten.
• Rewrites the path, if needed, by calling the RewritePath() method.

For example, imagine that our website had information about each employee, accessible through /info/employee.aspx?empID=employeeID.

To make the URLs more 'hackable,' we might decide to have employee pages accessible via /people/EmployeeName.aspx. Here is a case where we'd want to use URL rewriting: when the page /people/ScottMitchell.aspx is requested, we'd want to rewrite the URL so that the page /info/employee.aspx?empID=1001 is used instead.

URL Rewriting with HTTP Modules

When performing URL rewriting at the ASP.NET level you can use either an HTTP module or an HTTP handler to perform the rewriting. When using an HTTP module, you must decide at what point during the request's lifecycle to check whether the URL needs to be rewritten. At first glance this may seem an arbitrary choice, but the decision can impact your application in both significant and subtle ways, because the built-in ASP.NET HTTP modules use the Request object's properties to perform their duties. A sketch of such a module follows.
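Here is a minimal sketch of a rewriting module for the hypothetical /people/ URLs above. It is not the article's own implementation: the class name is invented, the name-to-ID lookup is hard-coded purely for illustration (a real module would consult a database), and it performs the rewrite in BeginRequest, a choice whose side effects are discussed next.

using System;
using System.Collections;
using System.Web;

// Rewrites /people/EmployeeName.aspx requests to /info/employee.aspx?empID=NNNN.
public class EmployeeUrlRewriterModule : IHttpModule
{
    // Hypothetical lookup table; a real implementation would query a database.
    private static Hashtable employeeIds = new Hashtable();

    static EmployeeUrlRewriterModule()
    {
        employeeIds["scottmitchell"] = 1001;
    }

    public void Init(HttpApplication application)
    {
        application.BeginRequest += new EventHandler(this.OnBeginRequest);
    }

    public void Dispose() { }

    private void OnBeginRequest(object sender, EventArgs e)
    {
        HttpApplication application = (HttpApplication) sender;
        string path = application.Request.Path.ToLower();   // e.g. "/people/scottmitchell.aspx"

        if (path.StartsWith("/people/") && path.EndsWith(".aspx"))
        {
            // Extract "scottmitchell" from "/people/scottmitchell.aspx".
            string name = path.Substring("/people/".Length,
                                         path.Length - "/people/".Length - ".aspx".Length);

            if (employeeIds.ContainsKey(name))
            {
                // .NET 1.1 three-parameter overload; under 1.0 the equivalent call is
                // RewritePath("/info/employee.aspx?empID=" + employeeIds[name]).
                application.Context.RewritePath(
                    "/info/employee.aspx", string.Empty,
                    "empID=" + employeeIds[name].ToString());
            }
        }
    }
}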

Recall that rewriting the path alters the Request object's property values. The germane built-in HTTP modules, and the events they tie into, are listed below.
