HTML Background image fullscreen to screen size

Have you ever needed your background image to automatically scale to the full browser dimensions?

There are many different ways to do it, but I found the most reliable is to use a div container: place an image within it and set the width and height to 100%.
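The approach can be sketched like this (a minimal example; the image name and inline styles are my own illustration, not from the original post):

```html
<!-- Fixed-position container pinned to the viewport; the img inside
     stretches to 100% of it, so the image always fills the browser. -->
<div style="position: fixed; top: 0; left: 0; width: 100%; height: 100%; z-index: -1;">
  <img src="background.jpg" alt="" style="width: 100%; height: 100%;" />
</div>
```

The position: fixed and negative z-index keep the image behind the page content while it tracks the window size.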

SQL Server exclusive access, prevent connections to DB except your own

I'm about to embark on a rather large upgrade of our live webserver. It includes both DB and ASP.NET updates.
For this I am going to need exclusive access to the DB while I perform the required operations. After trawling the internet for a few minutes I stumbled across a great article with some SQL script.
Here it is; the only change I made was that when the procedure runs, if it encounters your connection's username in the list it will ignore it.
So if you are connecting to the SQL instance via Management Studio on your terminal, or remotely using your login credentials, you will be denying access to a DB for everyone except your connection's username.

Originally posted by Tony Regerson
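The original script isn't reproduced here, but the idea can be sketched in T-SQL like this (a rough illustration only, using the SQL 2005 sysprocesses view; the database name is hypothetical, and this is not the original article's procedure):

```sql
-- Sketch only: kill every connection to the target DB except those
-- belonging to the current login, then block new connections.
DECLARE @spid INT, @sql NVARCHAR(20);

DECLARE kill_cursor CURSOR FOR
    SELECT spid
    FROM   master.dbo.sysprocesses
    WHERE  dbid = DB_ID('MyDatabase')      -- hypothetical DB name
      AND  loginame <> SUSER_SNAME();      -- skip your own connections

OPEN kill_cursor;
FETCH NEXT FROM kill_cursor INTO @spid;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = N'KILL ' + CAST(@spid AS NVARCHAR(10));
    EXEC sp_executesql @sql;
    FETCH NEXT FROM kill_cursor INTO @spid;
END
CLOSE kill_cursor;
DEALLOCATE kill_cursor;

-- Stop anyone but dbo/sysadmins reconnecting while you work:
ALTER DATABASE MyDatabase SET RESTRICTED_USER WITH ROLLBACK IMMEDIATE;
```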

IIS, ASP.NET, (RSS feed validation misconfigured server), Content-Length, Compression

An interesting problem was elevated to my desk recently.

An RSS feed was failing validation.
The reason given was a misconfigured server.

After looking into this a little further I discovered the root of the problem was not a misconfigured server; it was a content-length related issue.
IIS has two forms of compression: dynamic and static. Static content can be a jpg, a static HTML file, or other such content that does not change and has a defined “size and shape”, whereas pages and other such content are considered dynamic.
When compressing static content, IIS knows the size of the object and hence passes the content length to the client in the header of the response stream, so the client knows how much is left to receive.
If the content length is not passed down to the client, it is possible the client may get confused when it starts receiving all these chunks from IIS and doesn't know when they are going to stop, inevitably leading to a splat.

This is also indicative of a response where .ContentLength = -1; in other words, the webserver is chunking the response.

Another important note is that IIS will only compress content to the client if the client has notified IIS that it can accept compressed content; this is the Accept-Encoding value buried in the headers of the request.

However, in this particular instance it would seem that even though the validation engine was compatible with compression, it didn't like dynamic compression. I'm unsure if this was a problem with the validation website or an RSS compatibility requirement.

So, in other words, we have to provide IIS with the content length so that it can send this information down to the client.

This is simple enough if you have ever needed to stream any type of content across any medium in .NET.

The current way the RSS was being transmitted to the client was via the Response.Write method of the output response stream. This may appear adequate in your development test browser (typically IE 7 or 8), but until you look deeper and apply lifecycle testing across all media you won't see that IIS isn't passing certain required information to the client (i.e. the content length).

So a simple application of binary streaming should fix the problem.

To begin with, the original code is correct to a point, starting with setting the response type parameters like so:
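The original setup isn't shown here; in VB.NET it would typically look something like this (my own sketch of a standard response-header setup, not the original code):

```vb
' Sketch only: prepare the response for an XML (RSS) payload.
Response.Clear()
Response.ContentType = "text/xml"
Response.ContentEncoding = System.Text.Encoding.UTF8
```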

Following on with the original code, we create our TextWriter and pass it into our XmlTextWriter to populate with our XML feed:
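Something along these lines (again a sketch, not the original code; a StringWriter is used as the TextWriter so the finished XML can be read back as a string):

```vb
' Sketch only: build the feed into an in-memory writer.
Dim sw As New System.IO.StringWriter()
Dim xml As New System.Xml.XmlTextWriter(sw)
' ... write the RSS elements with xml.WriteStartElement(...) etc. ...
```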

Now we can write our XML feed.
Once the XML feed is written we are ready to deposit the feed to the client. We are going to use the Response.BinaryWrite method of the response stream:
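A sketch of that final step (my own illustration, assuming a StringWriter named sw was used to build the feed):

```vb
' Sketch only: size a byte buffer from the XML text and hand it to
' BinaryWrite, so IIS can emit a Content-Length header.
Dim xmlText As String = sw.ToString()
Dim buffer() As Byte = System.Text.Encoding.UTF8.GetBytes(xmlText)
Response.BinaryWrite(buffer)
Response.End()
```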

Essentially what we have done is pull the XML text into a buffer of a specific size based on the XML length, and passed it directly to the BinaryWrite method, which in turn writes directly to the current HttpContext response buffer without character conversion. IIS then knows the content length, deduced from the binary buffer size, and passes this information down to the client, followed by however many chunks of data until the transfer is complete.

This is the Reflector disassembly of the BinaryWrite method:

This type of response can also be applied to passing files down to a client request.
Using this approach removes the “browse directory” permissions one needs to apply to the normal-style virtual directory containing your static content, such as PDFs that you want made available for download.
This approach also allows you to hold all of your content in the same location (DB or file system), and you can apply rights to users accessing these files via .NET rather than being restricted to NT authentication on the virtual directory, simply by removing the need for the virtual directory.

WordPress add custom code highlighting tag button to visual toolbar

On my previous post I wanted to add some VB and T-SQL code, and have it appear as T-SQL or VB with the appropriate syntax highlighting.

So if you run WordPress, here is how you do it.
Download the “WordPress:CodeHighlighingPlugin” from ideathinking.

Follow their install instructions

And here is where my tip comes in

To add custom tag buttons so they appear in your visual editing toolbar as you're posting, locate the following JS file

Within the dev JS file, add the tag buttons of your choice; I personally added tsql and vb like so

Starting at line ~123
#Important note: replace the <_pre and <_/pre with the correct syntax (i.e. remove the _); I merely used it here so the site would render the text rather than run it through the highlighting engine.
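For reference, the classic quicktags.js builds its toolbar from an edButton array; custom buttons in that style would look roughly like this (a self-contained sketch, since the constructor arguments in your plugin's file may differ slightly):

```javascript
// Sketch of the classic quicktags.js button structure; not the exact
// file contents, which vary between plugin versions.
function edButton(id, display, tagStart, tagEnd, access) {
    this.id = id;             // button element id
    this.display = display;   // label shown on the toolbar
    this.tagStart = tagStart; // opening tag inserted into the post
    this.tagEnd = tagEnd;     // closing tag
    this.access = access;     // access key
}

var edButtons = [];

// Custom highlighting buttons for T-SQL and VB:
edButtons.push(new edButton('ed_tsql', 'tsql', '<pre lang="tsql">', '</pre>', ''));
edButtons.push(new edButton('ed_vb', 'vb', '<pre lang="vb">', '</pre>', ''));
```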

Now you need to put it all on one line and copy-paste it into a similar location in quicktags.js.

SQL index on nvarchar

In SQL 2005, in order to be able to index a field, the field length cannot exceed 900 bytes.
With this in mind you can have an index on something like a varchar(900) or an nvarchar(450), since nvarchar stores two bytes per character.
This greatly assisted me on my quest to improve performance on our global www platform.
We record every visit to our website, which as you can imagine was increasing our db size exponentially.
I ended up using two locations to download browser user agent information
and populated a table with this data, updating the table twice a day. I then created an insert trigger on the table that records each user; this trigger cross-referenced the user agent table to identify whether the browsing instance was human or a bot of some description, raising a bit-field flag if it was not human.
As you can imagine, on a very busy website it is critical that this happens as fast as possible, hence applying an index on the fields under comparison is ideal and very fast.
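The trigger logic can be sketched like this (the table, column, and trigger names are hypothetical; the original schema was not given):

```sql
-- Sketch only: on each insert, flag the visit as a bot when its user
-- agent matches a known non-human entry in the reference table.
CREATE TRIGGER trg_Visits_FlagBots ON dbo.Visits
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    UPDATE v
    SET    v.IsBot = 1
    FROM   dbo.Visits AS v
    JOIN   inserted   AS i  ON i.VisitId = v.VisitId
    JOIN   dbo.UserAgents AS ua
           ON ua.UserAgent = i.UserAgent   -- both columns nvarchar(450), indexed
    WHERE  ua.IsHuman = 0;
END
```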

To edit the field in the DB and bring it down to the correct size, a simple Transact-SQL statement was needed.
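For example (the real table and column names were not given, so these are placeholders):

```sql
-- Bring the column down to the maximum indexable size, then index it.
ALTER TABLE dbo.Visits ALTER COLUMN UserAgent NVARCHAR(450) NULL;

-- nvarchar(450) = 900 bytes, the SQL 2005 index key limit:
CREATE INDEX IX_Visits_UserAgent ON dbo.Visits (UserAgent);
```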

When the website pulls the HTTP user agent out of the web request, it truncates it down to 450 characters so as to avoid any splat in the DB when the data layer tries to insert the data.

Now that the non-human traffic has been flagged, in our fully normalised referential database we can archive the data and delete the records from the production servers very rapidly, saving space and improving performance.

To host WordPress on multiple domains

Tested with WordPress 2.3.3, 2.5 and 5.2.9

original poster
1. Edit wp-config.php by adding the following code:

2. Open wp-includes/functions.php and find the code:

3. Replace it with:
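The snippets themselves are elided above; for step 1, the widely used trick is to derive the URLs from the incoming host in wp-config.php, along these lines (a sketch of the common approach, not necessarily the original poster's exact code):

```php
// Sketch for step 1: answer on whichever domain the request came in on.
define('WP_SITEURL', 'http://' . $_SERVER['HTTP_HOST']);
define('WP_HOME',    'http://' . $_SERVER['HTTP_HOST']);
```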

ASP.NET IIS 7 Visual Studio 2008


This is a strange post, but I decided a long time ago I should really start collating all my knowledge in one place.
I have so many “mht” files and random txt documents with bug fixes, solutions, how-tos, best practices, etc. I should get round to putting it all in one place so it's easy to find.

Anyway, here is the latest bug I have discovered.

I was reorganising our company's .NET solution namespaces to align them in a more meaningful fashion. I also reorganised the folder structure of the projects so that common projects (projects used in more than one solution) are accessible via relative referencing in the various solutions.

Long story short, I discovered after much work that I could not run a website in debug mode working from the localhost IIS: VS 2008 >> Web >> Settings >> Startup options >> under the server settings, select “Use custom server” and http://localhost, or whatever you have bound to the website on the local machine.

This is the error I was receiving:

Microsoft Visual Studio
Unable to start debugging on the web server. The debugger cannot connect to the remote computer. This may be because the remote computer does not exist or a firewall may be preventing communication to the remote computer. Please see Help for assistance.
OK   Help

Now, after much research and trial and error covering the myriad of possible reasons (none of the documented ones applied to me), I stumbled on the answer by experimentation: if the root folder, or any folder within the path of the website, has the same name as the base namespace you use, this is the error you get.

So simple: just make sure you don't have a folder named after your namespace.
