Examples
Emulating host-header-based virtual sites on a single site
Suppose you have registered two domains, www.site1.com and www.site2.com, and want to serve two different sites from a single physical site. Add the following rules to your httpd.ini file:
[ISAPI_Rewrite]
#Fix missing slash char on folders
RewriteCond Host: (.*)
RewriteRule ([^.?]+[^.?/]) http://$1$2/ [I,R]
#Emulate site1
RewriteCond Host: (?:www\.)?site1\.com
RewriteRule (.*) /site1$1 [I,L]
#Emulate site2
RewriteCond Host: (?:www\.)?site2\.com
RewriteRule (.*) /site2$1 [I,L]
Now just place your sites in the /site1 and /site2 directories.
Alternatively, you can use more generic rules:
[ISAPI_Rewrite]
#Fix missing slash char on folders
RewriteCond Host: (.*)
RewriteRule ([^.?]+[^.?/]) http://$1$2/ [I,R]
RewriteCond Host: (www\.)?(.+)
RewriteRule (.*) /$2$3
The directory names should match the domain names, e.g. /somesite1.com, /somesite2.info, etc.
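The effect of these generic rules can be illustrated with a small JavaScript simulation (mapToFolder is a hypothetical name used for illustration only, not part of ISAPI_Rewrite):

```javascript
// Simulate the generic rules: capture the Host header with (www\.)?(.+)
// and prepend the domain (without "www.") to the request URI as a folder.
function mapToFolder(host, uri) {
  var m = host.match(/^(www\.)?(.+)$/);
  return "/" + m[2] + uri;
}

mapToFolder("www.site1.com", "/index.htm"); // "/site1.com/index.htm"
```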
Using different namespaces on development and live servers
Suppose you are developing rules for a website that will be deployed to a /livesite/ folder on a live server, while on your development machine the files are stored in the /develop/ folder. In this case you can use the UriMatchPrefix and UriFormatPrefix directives to minimize the number of changes needed when the namespace changes:
[ISAPI_Rewrite]
#specify namespaces with UriMatchPrefix and UriFormatPrefix
UriMatchPrefix /develop
UriFormatPrefix /develop
#rules are going here
#effectively we are checking for /develop/sampleN.htm and rewriting to /develop/sampleN.asp
RewriteRule /sample1.htm /sample1.asp [I,L]
RewriteRule /sample2.htm /sample2.asp [I,L]
RewriteRule /sample3.htm /sample3.asp [I,L]
#reset namespaces to default
UriMatchPrefix
UriFormatPrefix
The only things you will need to change before deploying to the live server are the values of UriMatchPrefix and UriFormatPrefix.
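Conceptually, the two directives simply prepend a prefix to every following rule's pattern and format string. A minimal JavaScript sketch of that idea (applyPrefixes is a hypothetical illustration, not an ISAPI_Rewrite API):

```javascript
// UriMatchPrefix is prepended to the match pattern,
// UriFormatPrefix to the format (result) string of every following rule.
function applyPrefixes(matchPrefix, formatPrefix, pattern, format) {
  return {
    pattern: matchPrefix + pattern, // what is actually matched
    format: formatPrefix + format   // what is actually produced
  };
}

var r = applyPrefixes("/develop", "/develop", "/sample1.htm", "/sample1.asp");
// r.pattern === "/develop/sample1.htm", r.format === "/develop/sample1.asp"
```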
Using loops (Next flag) to convert request parameters
Suppose you wish to access physical URLs like http://www.myhost.com/foo.asp?a=A&b=B&c=C using requests like http://www.myhost.com/foo.asp/a/A/b/B/c/C and the number of parameters may vary from one request to another.
There are at least two possible solutions: you could add a separate rule for each possible number of parameters, or you could use the technique demonstrated by the following example.
[ISAPI_Rewrite]
RewriteRule (.*?\.asp)(\?[^/]*)?/([^/]*)/([^/]*)(.*) $1(?2$2&:\?)$3=$4$5 [NS,I]
Note that this rule may break page-relative links to CSS files, images, etc. This is due to a change in the base path (the parent folder of the page) that the browser uses to compute complete resource URIs. There are three possible solutions:
Use the rule given below. It does not affect the base path.
Directly specify the correct base path for a page with the help of the <base> tag.
Change all page-relative links to either root-relative or absolute form.
This rule extracts one parameter from the request URL, appends it to the end of the request string and restarts rules processing from the beginning. So it will loop until all parameters have been moved to the right place (or until RepeatLimit is exceeded).
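This looping behaviour can be reproduced in a few lines of JavaScript. The sketch below assumes the conditional substitution (?2$2&:\?) inserts "$2&" when a query string has already been captured and "?" otherwise; applyOnce and convertParams are hypothetical names used for illustration:

```javascript
// One application of the rule: move a single /name/value pair
// into the query string. Returns null when the pattern no longer matches.
function applyOnce(url) {
  var m = url.match(/^(.*?\.asp)(\?[^/]*)?\/([^/]*)\/([^/]*)(.*)$/);
  if (!m) return null;
  // (?2$2&:\?)  ->  "$2&" if a query string was captured, otherwise "?"
  return m[1] + (m[2] ? m[2] + "&" : "?") + m[3] + "=" + m[4] + m[5];
}

// Loop like the NS flag does, with a safety limit similar to RepeatLimit.
function convertParams(url, limit) {
  limit = limit || 32;
  var next;
  while (limit-- > 0 && (next = applyOnce(url)) !== null) url = next;
  return url;
}

convertParams("/foo.asp/a/A/b/B/c/C"); // "/foo.asp?a=A&b=B&c=C"
```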
Many variations of this rule with different separator characters are possible. For example, to use URLs like http://www.myhost.com/foo.asp~a~A~b~B~c~C the following rule could be used:
[ISAPI_Rewrite]
RewriteRule (.*?\.asp)(\?[^~]*)?~([^~]*)~([^~]*)(.*) $1(?2$2&:\?)$3=$4$5 [NS,I]
Running servers behind IIS
Assume we have an internet server running IIS and several corporate servers running other platforms. These servers are not directly accessible from the internet; they can be reached only from our corporate network. Here is a simple example of how to map another server into the IIS site's namespace using the proxy flag:
[ISAPI_Rewrite]
RewriteProxy /mappoint(.+) http://sitedomain$1 [I,U]
Moving sites from UNIX to IIS
These rules help change URLs from /~username to /username and from /file.html to /file.htm. This can be useful if you have just moved your site from UNIX to IIS and keep getting hits to the old pages from search engines and other external links.
[ISAPI_Rewrite]
#redirecting to update old links
RewriteRule (.*)\.html $1.htm
RewriteRule /~(.*) http://myserver/$1 [R]
Moving site location
Many webmasters have asked for a solution to the following problem: they want to redirect all requests from one web server to another. Such a problem usually arises when you set up a newer web server that will replace the old one over time. The solution is to use ISAPI_Rewrite on the old web server:
[ISAPI_Rewrite]
#redirecting to update old links
RewriteRule (.+) http://newwebserver$1 [R]
Browser-dependent content
It is sometimes necessary to provide browser-dependent content, at least for important top-level pages: e.g. a full-featured version for Internet Explorer, a minimum-featured version for the Lynx browser and an average-featured version for all others.
We have to act on the HTTP header "User-Agent". The sample code does the following: if the "User-Agent" header contains "MSIE", the target foo.htm is rewritten to foo.IE.htm. If the browser is "Lynx" or "Mozilla" of version 1 or 2, the URL becomes foo.20.htm. All other browsers receive the page foo.32.htm. This is done by the following ruleset:
[ISAPI_Rewrite]
RewriteCond User-Agent: .*MSIE.*
RewriteRule /foo.htm /foo.IE.htm [L]
RewriteCond User-Agent: (?:Lynx|Mozilla/[12]).*
RewriteRule /foo.htm /foo.20.htm [L]
RewriteRule /foo.htm /foo.32.htm [L]
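The decision logic of this ruleset can be expressed in JavaScript as follows (pickPage is a hypothetical name; returning at the first match corresponds to the L flag):

```javascript
// Same conditions as the ruleset: the first matching rule wins (L flag).
function pickPage(userAgent) {
  if (/MSIE/.test(userAgent)) return "/foo.IE.htm";
  if (/^(?:Lynx|Mozilla\/[12])/.test(userAgent)) return "/foo.20.htm";
  return "/foo.32.htm";
}

pickPage("Mozilla/4.0 (compatible; MSIE 6.0)"); // "/foo.IE.htm"
```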
Dynamically generated robots.txt
robots.txt is a file that search engines use to discover which URLs should or should not be indexed. But creating this file for a large site with a lot of dynamic content is a very complex task. Have you ever dreamed of a dynamically generated robots.txt? Let's write a robots.asp script:
<%@ Language=Jscript EnableSessionState=False%>
<%
//The script must return plain text
Response.ContentType="text/plain";
/*
Place generation code here
*/
%>
Now make it available as robots.txt using a single rule:
[ISAPI_Rewrite]
RewriteRule /robots.txt /robots.asp
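The "generation code" placeholder in robots.asp could, for example, build the response body from a list of paths that should not be indexed. A hypothetical sketch of such a generator (buildRobotsTxt is an illustration only, not part of any API):

```javascript
// Build a robots.txt body that excludes the given paths for all robots.
function buildRobotsTxt(disallowedPaths) {
  var lines = ["User-agent: *"];
  for (var i = 0; i < disallowedPaths.length; i++) {
    lines.push("Disallow: " + disallowedPaths[i]);
  }
  return lines.join("\r\n") + "\r\n";
}

// In robots.asp one would Response.Write() the result of such a function.
```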
Making search engines index dynamic pages
Suppose the content of the site is stored in XML files. The /XMLProcess.asp file processes XML files on the server and returns HTML to the end user. URLs to the documents have the form:
http://www.mysite.com/XMLProcess.asp?xml=/somedir/somedoc.xml
But many popular search engines will not index such documents because the URLs contain a question mark (the document is dynamically generated). ISAPI_Rewrite can completely eliminate this problem:
[ISAPI_Rewrite]
RewriteRule /doc(.*)\.htm /XMLProcess.asp?xml=$1.xml
Now access the documents using URLs like http://www.mysite.com/doc/somedir/somedoc.htm. Search engines will never know that there is no physical somedoc.htm file and that the content is dynamically generated.
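The mapping performed by this rule can be checked with a small JavaScript simulation (mapDocUrl is a hypothetical name used for illustration):

```javascript
// Simulate the rule: /doc(.*)\.htm  ->  /XMLProcess.asp?xml=$1.xml
function mapDocUrl(uri) {
  var m = uri.match(/^\/doc(.*)\.htm$/);
  return m ? "/XMLProcess.asp?xml=" + m[1] + ".xml" : uri;
}

mapDocUrl("/doc/somedir/somedoc.htm"); // "/XMLProcess.asp?xml=/somedir/somedoc.xml"
```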
Negative expressions (NOT)
Sometimes you need to apply a rule only when some pattern does not match. In this case you can use so-called negative lookahead assertions in regular expressions.
For example, to move all users not using Internet Explorer to another location:
[ISAPI_Rewrite]
# Redirect all non Internet Explorer users
# to another location
RewriteCond User-Agent: (?!.*MSIE).*
RewriteRule (.*) /nonie$1
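The negative lookahead (?!.*MSIE) can be tried out in any regex-capable language; here is a JavaScript sketch (rewriteNonIE is a hypothetical name used for illustration):

```javascript
// (?!.*MSIE) succeeds only when "MSIE" does NOT occur anywhere in the string.
var nonIE = /^(?!.*MSIE).*$/;

function rewriteNonIE(userAgent, uri) {
  return nonIE.test(userAgent) ? "/nonie" + uri : uri;
}

rewriteNonIE("Mozilla/5.0 (X11; Linux)", "/index.htm"); // "/nonie/index.htm"
```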
Dynamic authentication
Suppose we have a members area on the site and we need to password-protect the files in this area, but we don't want to use the built-in server security. In this case it is possible to create an ASP script (call it proxy.asp) that proxies all requests to the members area and checks for the required permissions. Here is a simple template for this page where you can put your own authorization code:
<%@ Language=Jscript EnableSessionState=False%>
<%
function Authorize()
{
//Check if the user is authorized to view a resource here
//Return true if user has a required permission, otherwise return false
return true;
}
if(!Authorize())
{
//Redirect to the login page
Response.Redirect("http://mysite.com/LoginPage.asp?ref="+Request.QueryString.Item);
Response.End()
}
var WinHttpReq = new ActiveXObject("WinHttp.WinHttpRequest.5");
WinHttpReq.Open(Request.ServerVariables("REQUEST_METHOD").Item, Request.QueryString.Item, true);
var headers=String(Request.ServerVariables("ALL_RAW")).split("\r\n");
for(i=0; i<headers.length && headers[i]; i++)
{
header = headers[i].match(/([\w-.]+):\s*([ \S]*)/);
if(header)
WinHttpReq.SetRequestHeader(header[1],header[2]);
}
if(lngCount = Request.TotalBytes)
{
var data=Request.BinaryRead(lngCount);
WinHttpReq.Send(data);
} else {
WinHttpReq.Send();
}
if(!WinHttpReq.WaitForResponse(15))
{
WinHttpReq.Abort();
Response.Status="408 Request Timeout";
} else {
Response.Status = "" + WinHttpReq.Status + " " + WinHttpReq.StatusText;
headers=String(WinHttpReq.GetAllResponseHeaders()).split("\r\n");
for(i=0; i<headers.length && headers[i]; i++)
{
header = headers[i].match(/([\w-.]+):\s*([ \S]*)/);
if(header)
Response.AddHeader(header[1],header[2]);
}
Response.Write(WinHttpReq.ResponseText);
}
%>
Now we need to configure ISAPI_Rewrite to proxy requests through this page:
[ISAPI_Rewrite]
# Proxy all requests through proxy.asp
RewriteRule /members(.+) /proxy.asp?http://mysite.com/members$1
Blocking inline-images (stop hot linking)
Assume we have some pages with inline GIF graphics under http://www.mysite.com/. These graphics are nice, so others directly incorporate them into their own pages via hyperlinks. We don't like this practice because it adds useless traffic to our server.
While we cannot protect the images from inclusion 100%, we can at least restrict the cases where the browser sends an HTTP Referer header.
[ISAPI_Rewrite]
RewriteCond Host: (.+)
RewriteCond Referer: (?!http://\1.*).*
RewriteRule .*\.(?:gif|jpg|png) /block.gif [I,O]
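What these conditions do can be expressed as a JavaScript sketch (isBlocked is a hypothetical name; note that, like the ruleset itself, it also blocks requests that send no Referer at all, because the empty string satisfies the negative lookahead):

```javascript
// Block .gif/.jpg/.png requests whose Referer does not start with
// "http://" + our own Host header (the back-reference \1 in the rule).
function isBlocked(host, referer, uri) {
  if (!/\.(?:gif|jpg|png)$/i.test(uri)) return false; // rule matches images only
  var own = "http://" + host;
  return (referer || "").indexOf(own) !== 0;
}

isBlocked("www.mysite.com", "http://evil.example/page.htm", "/pic.gif"); // true (hot link)
```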
Regular Expressions Testing Tool
The RXTest utility can be used to simulate rule …