Run 32 bit Apps in IIS on 64 bit OS

So you want to run your web apps in 32-bit mode on your 64-bit Windows operating system.
I've never needed to do this until now: CyberSource, the online payment gateway we have recently integrated with, doesn't support 64-bit when you use their .NET API.

OK, all you need to do is:
1. IIS Manager >> local computer node >> Web Service Extensions >> set the “ASP.NET vx.xxxx (32 bit)” extension to Allowed
2. Run the following batch commands (changing the folder targets if you need to):
—————————————————————————————-
cscript c:\inetpub\adminscripts\adsutil.vbs SET W3SVC/AppPools/Enable32BitAppOnWin64 1
C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\aspnet_regiis.exe -i

NET STOP W3SVC /y & NET START W3SVC
—————————————————————————————-

If things go terribly wrong and you need to revert to 64-bit, run the following batch:
—————————————————————————————-
cscript c:\inetpub\adminscripts\adsutil.vbs SET W3SVC/AppPools/Enable32BitAppOnWin64 0
C:\WINDOWS\Microsoft.NET\Framework64\v2.0.50727\aspnet_regiis.exe -i

NET STOP W3SVC /y & NET START W3SVC
IISRESET
—————————————————————————————-

Display your latest tweet on your WP blog

Copy and paste this code anywhere in your website:
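The snippet is along these lines; it is a reconstruction of the wprecipes-style code rather than the exact original, and it relies on Twitter's old per-user RSS endpoint (since retired). The username is a placeholder.

—————————————————————————————-
<?php
// Fetch the user's timeline RSS (Twitter's old, now-retired endpoint)
// and echo the newest entry. 'yourusername' is a placeholder.
$username = 'yourusername';
$feed     = 'http://twitter.com/statuses/user_timeline/' . $username . '.rss';

$rss = @simplexml_load_file($feed); // @ suppresses warnings if the feed is down
if ($rss !== false && isset($rss->channel->item[0])) {
    // Each item title is "username: tweet text"; strip the name prefix
    $latest = (string) $rss->channel->item[0]->title;
    echo htmlspecialchars(preg_replace('/^[^:]+:\s*/', '', $latest));
}
?>
—————————————————————————————-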
Originally posted at http://www.wprecipes.com/how-to-display-your-latest-twitter-entry-on-your-wp-blog

SQL UPDATE: updating millions of records nicely

Ever needed to update millions of records before?
New to tackling this?
I had this experience when our database contained 17 million records, of which 15.5 million were junk and could be discarded. My mission was to flag the unwanted records and remove them.
This blog is simply about how to go about updating a table with millions of records. Fear not: there is an approach that won't lock your database for hours or exponentially increase the size of your transaction log file. It's known as chunked updates.

The original source is explained here: http://www.sqlusa.com/bestpractices2005/hugeupdate/

In my case I needed to update 17 million records based on a cross-reference match of each record against another table containing 5.5K records; a single update statement would have taken hours and hours.
The chunked approach on an optimised, indexed set of tables took just 35 minutes on a single-licence, quad-core SQL Server 2005 Standard Edition box.
Here is the SQL statement:
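What follows is a sketch of the chunked pattern rather than my statement verbatim; dbo.Visit, dbo.UserAgent and the IsBot column are hypothetical stand-ins for the real schema.

—————————————————————————————-
-- Chunked update: flag bot visits in batches so each pass holds locks
-- only briefly and the transaction log can truncate between chunks.
DECLARE @ChunkSize int
SET @ChunkSize = 500000

DECLARE @RowsAffected int
SET @RowsAffected = 1

WHILE @RowsAffected > 0
BEGIN
    -- Flag at most one chunk of not-yet-processed rows per pass
    UPDATE TOP (@ChunkSize) v
    SET    v.IsBot = 1
    FROM   dbo.Visit AS v
           INNER JOIN dbo.UserAgent AS ua
                   ON ua.AgentString = v.HttpUserAgent
    WHERE  v.IsBot IS NULL

    SET @RowsAffected = @@ROWCOUNT

    -- Under SIMPLE recovery, a checkpoint lets the log space be reused
    CHECKPOINT
END
—————————————————————————————-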

So you can see from the code I was chunking in 500,000-row segments. Tune this to your requirements; I guessed and struck lucky.

For those interested: yes, I did take steps to prevent this from occurring again. I created an insert trigger that cross-references the user agent table, flagging the record. A job then does the clean-up, periodically archiving the unwanted data into a redundant database for historical reference.

HTML embed an audio clip and repeat / loop it

I needed to loop an audio clip for a friend's static web page. I found out that there are so many idiosyncrasies to take into account with various browsers.
Here's how to do it with an MP3.

OK, so:

1. The header contains the <bgsound> tag, enclosed within a <noembed> tag. This is used in older browsers that do not support the <embed> tag.

2. The <embed> tag is used with the correct parameters to call the sound, autoplay it and loop it.

3. The <object> tag, supported by pretty much anything, is there just in case the browser is not IE and does not support the <embed> tag.
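Putting those three pieces together, a minimal sketch of the markup; track.mp3 is a placeholder filename and the exact attribute set is an assumption, as tutorials of the era varied:

—————————————————————————————-
<!-- <object> for standards-ish browsers, <embed> for most others,
     <bgsound> (IE-only) as the last-ditch fallback -->
<object type="audio/mpeg" data="track.mp3" width="0" height="0">
  <embed src="track.mp3" autostart="true" loop="true" hidden="true">
  <noembed>
    <bgsound src="track.mp3" loop="infinite">
  </noembed>
</object>
—————————————————————————————-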

The only other alternative is to do the following

HTML Background image fullscreen to screen size

Have you ever needed to have your background image automatically scale to the browser dimensions?

There are so many different ways to do it, but I found the most reliable way is to use a div container: place an image within it, setting the width and height to 100%.
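A minimal sketch of that approach (bg.jpg is a placeholder filename):

—————————————————————————————-
<!-- Fixed, full-viewport container sitting behind the page content;
     the image is stretched to fill it -->
<div style="position: fixed; top: 0; left: 0; width: 100%; height: 100%; z-index: -1;">
  <img src="bg.jpg" alt="" style="width: 100%; height: 100%;" />
</div>
—————————————————————————————-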

SQL Server exclusive access: prevent connections to a DB except your own

I'm about to embark on a rather large upgrade of our live web server. It includes both DB and ASP.NET updates.
For this I am going to need exclusive access to the DB while I perform the required operations. After trawling the internet for a few minutes I stumbled across a great article with some SQL script.
Here it is; the only change I made was so that when you run the procedure, if it encounters your connection's username in the list, it will ignore it.
So if you are connecting to the SQL instance via Management Studio on your terminal, or remotely using your login credentials, you will be denying access to the DB for everyone except your connection's username.
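The script isn't reproduced verbatim here; below is a sketch in the same spirit, including my tweak of skipping the caller's own login. The procedure name and the cursor-over-sysprocesses approach are assumptions.

—————————————————————————————-
-- Kill every session attached to the target database except those
-- belonging to the login running this procedure.
CREATE PROCEDURE dbo.usp_KillDbConnections
    @DbName sysname
AS
BEGIN
    DECLARE @spid smallint,
            @sql  nvarchar(50)

    DECLARE spids CURSOR LOCAL FAST_FORWARD FOR
        SELECT spid
        FROM   master.dbo.sysprocesses
        WHERE  dbid = DB_ID(@DbName)
          AND  loginame <> SUSER_SNAME()  -- ignore our own login
          AND  spid <> @@SPID             -- never kill ourselves

    OPEN spids
    FETCH NEXT FROM spids INTO @spid
    WHILE @@FETCH_STATUS = 0
    BEGIN
        SET @sql = N'KILL ' + CAST(@spid AS nvarchar(10))
        EXEC (@sql)
        FETCH NEXT FROM spids INTO @spid
    END

    CLOSE spids
    DEALLOCATE spids
END
—————————————————————————————-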

Originally posted by Tony Rogerson
http://sqlblogcasts.com/blogs/tonyrogerson/

IIS, ASP.NET, RSS feed validation (“misconfigured server”), Content-Length, Compression

An interesting problem was escalated to my desk recently.

An RSS feed written in ASP.NET was failing validation at http://validator.w3.org/feed/
The reason given was a misconfigured server.

After looking into this a little further, I discovered the root of the problem was not a misconfigured server; it was a content-length-related issue.
IIS has two forms of compression: dynamic and static. Static content can be a JPG, a static HTML file, or other such content that does not change and has a defined “size and shape”, whereas ASP.NET pages and other such content are considered dynamic.
Now, when compressing static content IIS knows the size of the object and hence passes the content length to the client in the header of the response stream, so the client knows how much is left to receive.
If the content length is not passed down to the client, then it is possible the client may get confused when it starts receiving all these chunks from IIS and doesn't know when they are going to stop, inevitably leading to a splat.

This is also indicative of a response where .ContentLength = -1; in other words, the web server is chunking the response.

Another important note is that IIS will only compress content if the client has notified IIS that it can accept compressed content; this is the Accept-Encoding value buried in the header of the request.

However, in this particular instance it would seem that even though the validation engine was compatible with compression, it didn't like dynamic compression. I'm unsure whether this was a problem with the validation website or an RSS compatibility requirement.

So, in other words, we have to provide IIS with the content length so that it can send this information down to the client.

This is simple enough if you have ever needed to stream any type of content across any medium in .NET.

The RSS was currently being transmitted to the client via the Response.Write method of the output response stream. This may appear adequate in your development test browser (typically IE 7 or 8), but until you look deeper and apply lifecycle testing across all media you won't see that IIS isn't passing certain required information (i.e. the content length) to the client.

So a simple application of binary streaming should fix the problem.

To begin with, the original code is correct up to a point, starting with setting the response type parameters like so:
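Something along these lines (a sketch in VB.NET; the exact content type and encoding values used in the original are assumptions):

—————————————————————————————-
' Reset the response and describe the payload before writing anything
Response.Clear()
Response.ContentType = "text/xml"
Response.ContentEncoding = System.Text.Encoding.UTF8
—————————————————————————————-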

Following on from the original code, we create our TextWriter and pass it into our XmlTextWriter to populate with our XML feed:
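Sketched out, assuming a StringWriter as the backing TextWriter:

—————————————————————————————-
' Build the feed in memory first, so we can measure it before sending
Dim sw As New System.IO.StringWriter()
Dim xmlWriter As New System.Xml.XmlTextWriter(sw)

xmlWriter.WriteStartDocument()
xmlWriter.WriteStartElement("rss")
' ... channel and item elements go here ...
xmlWriter.WriteEndElement()
xmlWriter.WriteEndDocument()
xmlWriter.Flush()
—————————————————————————————-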

Now we can write our XML feed.
Once the XML feed is written, we are ready to deposit the feed to the client. We are going to use the Response.BinaryWrite method of the response stream:
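A sketch of the hand-off, carrying on from the StringWriter above:

—————————————————————————————-
' Convert the finished feed to raw bytes; the buffer size gives IIS a
' definite content length instead of an open-ended chunked response
Dim buffer() As Byte = System.Text.Encoding.UTF8.GetBytes(sw.ToString())

Response.AddHeader("Content-Length", buffer.Length.ToString())
Response.BinaryWrite(buffer)
Response.Flush()
—————————————————————————————-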

Essentially what we have done is pull the XML text into a buffer of a specific size based on the XML length, and passed it directly to the BinaryWrite response method, which in turn writes directly to the current HttpContext response buffer without character conversion. IIS then knows the content length, deduced from the binary buffer size, and passes this information down to the client, followed by however many chunks of data until the transfer is complete.

This is the Reflector disassembly of the BinaryWrite method:
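Paraphrased from memory rather than quoted verbatim, it boils down to something like:

—————————————————————————————-
Public Sub BinaryWrite(ByVal buffer As Byte())
    ' Writes the raw bytes straight to the output stream,
    ' with no character conversion along the way
    Me.OutputStream.Write(buffer, 0, buffer.Length)
End Sub
—————————————————————————————-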

This type of response can also be applied to passing files down to a client request.
Using this approach removes the “browse directory” permissions one needs to apply to the normal style of virtual directory containing static content, such as PDFs that you want made available for download.
It also allows you to hold all of your content in one location (DB or file system), and you can apply rights to users accessing these files via .NET rather than being restricted to NT authentication on the virtual directory, simply because the virtual directory is no longer needed.

WordPress add custom code highlighting tag button to visual toolbar

In my previous post I wanted to add some VB and T-SQL code, and have it appear like T-SQL or VB with appropriate syntax highlighting.

So if you run WordPress, here is how you do it:
Download the “WordPress:CodeHighlighterPlugin” from ideathinking: http://ideathinking.com/wiki/index.php/WordPress:CodeHighlighterPlugin

Follow their install instructions

And here is where my tip comes in

To add custom tag buttons so they appear in your visual editing toolbar as you're posting, locate the following JS files:
wp-includes\js\quicktags-dev.js
wp-includes\js\quicktags.js

Within the dev JS file, add the tag buttons of your choice. I personally added tsql and vb, like so:

Starting at line ~123:
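A sketch of the two buttons; the edButton signature matches the stock quicktags.js of that era, but the exact <pre lang="..."> tags your highlighter plugin expects may differ:

—————————————————————————————-
// new edButton(id, display name, opening tag, closing tag, access key)
edButtons[edButtons.length] = new edButton('ed_tsql', 'tsql', '<pre lang="tsql">', '</pre>', '');
edButtons[edButtons.length] = new edButton('ed_vb', 'vb', '<pre lang="vb">', '</pre>', '');
—————————————————————————————-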

Now you need to put it all on one line and copy and paste it into a similar location in quicktags.js.

SQL index on nvarchar

In SQL 2005, in order to be able to index a field, the field length cannot exceed 900 bytes.
With this in mind, you can have an index on something like a varchar(900) or an nvarchar(450).
This greatly assisted me on my quest to improve performance on our global www platform.
We record every visit to our website, which as you can imagine was increasing our db size exponentially.
I ended up using two locations to download browser user agent information:
http://www.user-agents.org/allagents.xml
http://www.botsvsbrowsers.com/category/1/index.html
and populated a table with this, refreshing it twice a day. I then created an insert trigger on the table that records each visit; this trigger cross-references the user agent table to identify whether the browsing instance was human or a bot of some description, raising a bit-field flag if it was not human.
As you can imagine, on a very busy website it is critical that this happens as fast as possible, hence applying an index on the fields under comparison is ideal and very fast.
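For illustration, a sketch of such a trigger; dbo.Visit, dbo.UserAgent and the column names are hypothetical stand-ins:

—————————————————————————————-
-- Flag any newly inserted visit whose user agent matches a known
-- bot signature in the cross-reference table
CREATE TRIGGER trg_Visit_FlagBots
ON dbo.Visit
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON

    UPDATE v
    SET    v.IsBot = 1
    FROM   dbo.Visit AS v
           INNER JOIN inserted AS i
                   ON i.VisitId = v.VisitId
           INNER JOIN dbo.UserAgent AS ua
                   ON ua.AgentString = i.HttpUserAgent
END
—————————————————————————————-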

To alter the field in the DB and bring it down to the correct size, a simple Transact-SQL statement was needed:
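Along these lines (again with hypothetical table and column names); note that any over-length values have to be trimmed before the column can be shrunk:

—————————————————————————————-
-- Trim existing values that would no longer fit
UPDATE dbo.Visit
SET    HttpUserAgent = LEFT(HttpUserAgent, 450)
WHERE  LEN(HttpUserAgent) > 450

-- Shrink the column under the 900-byte index limit (450 x 2 bytes)
ALTER TABLE dbo.Visit
    ALTER COLUMN HttpUserAgent nvarchar(450)

-- Now the comparison column can be indexed
CREATE NONCLUSTERED INDEX IX_Visit_HttpUserAgent
    ON dbo.Visit (HttpUserAgent)
—————————————————————————————-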

When the website pulls the HTTP user agent out of the web request, it truncates it down to 450 characters, so as to avoid any splat in the DB when the data layer tries to insert the data.

Now that the non-human traffic has been flagged, in our fully normalised referential database we can archive the data and delete the records from the production servers very rapidly, saving space and improving performance.