Wednesday, August 29, 2007

4347.aspx

.NET libraries for writing your own messenger bot

I do not use Instant Messaging as much as some of my friends do, but I can imagine several services that would be useful via IM. There are endless opportunities when you consider that IM can offer push as well as pull services.


It looks like I am not the only one as there are several open source .NET libraries for working with Microsoft Messenger:



  • DotMSN 2.0 supports the MSNP9 protocol.

  • MSNPSharp extends DotMSN 2.0 with support for the MSNP11 protocol.

I am busy working on several other hobby projects at the moment, but I would love to put together an IM bot later this year. What kind of services would you like to access via IM?

4349.aspx

Real PreAuthenticate for .NET Web Services

.NET Web Service clients do not send authentication details with their requests by default, so you get a traffic pattern like this:


Request 1:



Client: GET someUrl
Server: 401 WWW-Authenticate Basic
Client: GET with Authorization header
Server: 200 OK


Request 2:



Client: GET someUrl
Server: 401 WWW-Authenticate Basic
Client: GET with Authorization header
Server: 200 OK


The PreAuthenticate property improves the situation if you make many repeated calls to a web service.


Request 1:



Client: GET someUrl
Server: 401 WWW-Authenticate Basic
Client: GET with Authorization header
Server: 200 OK


Request 2:



Client: GET someUrl with Authorization header
Server: 200 OK


The first request does not contain the Authorization header, but the subsequent requests do. I do not like this behavior, as the systems I work on usually make a single web service call per client request, so every call still pays for the 401 round trip.


You can fix the problem by manually inserting the Authorization header in the requests. Do not modify the Web Service client generated by Visual Studio, as you will lose your changes if you update the Web Reference. Extend the Web Service client instead and override the GetWebRequest method:


using System;
using System.Net;
using System.Text;

// WSTest is the Web Service client generated by Visual Studio from the Web Reference
public class WSTestPreAuthenticate : WSTest
{
    protected override System.Net.WebRequest GetWebRequest(Uri uri)
    {
        HttpWebRequest request = (HttpWebRequest)base.GetWebRequest(uri);

        if (PreAuthenticate)
        {
            NetworkCredential networkCredentials =
                Credentials.GetCredential(uri, "Basic");

            if (networkCredentials != null)
            {
                // Build the Basic Authorization header ourselves so it is
                // also sent with the very first request
                byte[] credentialBuffer = new UTF8Encoding().GetBytes(
                    networkCredentials.UserName + ":" +
                    networkCredentials.Password);
                request.Headers["Authorization"] =
                    "Basic " + Convert.ToBase64String(credentialBuffer);
            }
        }

        return request;
    }
}



The change above means that PreAuthenticate works as it should, since the Authorization header is always sent:


Request 1:



Client: GET someUrl with Authorization header
Server: 200 OK


The client has to set the PreAuthenticate property and provide the credentials:


WSTestPreAuthenticate test = new WSTestPreAuthenticate();
test.PreAuthenticate = true;
test.Url = "http://localhost/WSTest/WSTest.asmx";
test.Credentials =
    new System.Net.NetworkCredential("user", "password");
int ret = test.Method("test");

Tuesday, August 28, 2007

4346.aspx

Performance problem with ADO, stored procedures, BLOBs and SQL Server

There is a known bug in ADO that causes horrible performance if you insert a BLOB via a stored procedure. While working on an (almost) real-time data import I noticed that importing blobs took forever…


I did the right thing and used stored procedures for all data access. In this case that was a mistake, as it took several minutes to insert a small blob of a few hundred KB.


I ended up with the workaround below. It is ugly but the least “dirty” solution I found:


Const adCmdStoredProc = 4   ' ADO constants are not defined when using late binding

Set oCon = CreateObject("ADODB.Connection")
oCon.ConnectionString = CONNECTION_STRING
oCon.Open

' Insert the record via the stored procedure, passing a dummy value for the blob
Set oCmd = CreateObject("ADODB.Command")
Set oCmd.ActiveConnection = oCon
oCmd.CommandType = adCmdStoredProc
oCmd.CommandText = "usp_insert_item"
oCmd.Parameters.Refresh
oCmd.Parameters("@whatever").Value = someDataHere
oCmd.Parameters("@PageData").Value = "dummy"

' The stored procedure returns the ID of the new record as newID
Set oRS = oCmd.Execute
BinaryID = oRS("newID")
oRS.Close

' Update the binary data field directly via a recordset
Set oRS = CreateObject("ADODB.Recordset")
oRS.CursorLocation = 3   ' adUseClient
oRS.CursorType = 1       ' adOpenKeyset
oRS.LockType = 2         ' adLockPessimistic
Set oRS.ActiveConnection = oCon
oRS.Open "select BlobField from TheTable where ID=" & BinaryID
oRS.Fields("BlobField").Value = TheBinaryData
oRS.Update
oRS.Close

Set oRS = Nothing
Set oCmd = Nothing
Set oCon = Nothing
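
For what it is worth: if you can use ADO.NET instead of classic ADO, passing the blob as a normal parameter to the stored procedure is worth trying before falling back on the recordset trick above. This is only a rough sketch reusing the placeholder names from the script; I have not benchmarked it against the setup that triggered the bug, and the parameter types are assumptions:

using System;
using System.Data;
using System.Data.SqlClient;

class BlobInsertSketch
{
    // Sketch only: procedure, parameter and type names are the placeholders from the post
    static void InsertItem(string connectionString, string whatever, byte[] blob)
    {
        using (SqlConnection con = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand("usp_insert_item", con))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.Add("@whatever", SqlDbType.VarChar, 512).Value = whatever;
            // Pass the blob directly instead of inserting a dummy value first
            cmd.Parameters.Add("@PageData", SqlDbType.Image).Value = blob;
            con.Open();
            // The procedure returns the ID of the new record as newID
            object newId = cmd.ExecuteScalar();
        }
    }
}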

Friday, August 24, 2007

4326.aspx

The Sun BlackBox

Wouldn't it be fantastic if the Sun BlackBox could be used in emergency situations like this photo(shop?) shows:


But, using them for Red Cross emergencies is not feasible today as the BlackBox is hungry:



  • Cooling water supply: 60 gallons (~230 liters) per minute of cold water at 13 degrees Celsius, which means either a large cold-water supply or a small water supply with a serious chiller

  • Dual 600 amp power feeds

  • High bandwidth internet connection

Pretty hard to find in emergency situations, in Africa or elsewhere.


It is no wonder the box is hungry considering the possible configurations:



  • A single Project Blackbox could accommodate 250 Sun Fire T1000 servers with the CoolThreads technology with 2000 cores and 8000 simultaneous threads.

  • A single Project Blackbox could accommodate 250 x64-based servers with 1000 cores.

  • A single Project Blackbox could provide as much as 1.5 petabytes of disk storage or 2 petabytes of energy-efficient tape storage.

  • A single Project Blackbox could provide 7 terabytes of memory.

  • A single Project Blackbox could handle up to 10,000 simultaneous desktop users. 

  • A single Project Blackbox currently has sufficient power and cooling to support 200 kilowatts of rackmounted equipment.

Ignoring the humanitarian scenario, it is still an interesting concept for some companies:



  • The black box gives a lot of computing power considering its size

  • It is quick to deploy (just plug in the water, internet connection and power)

  • It is cheaper than building your own data center

  • It gives a 20% improvement in energy saving compared to normal data centers

You can get a look inside the box and more info at YouTube:





Via SciAm August 2007 [pages 74-77]

4316.aspx

Sometimes you just have to say NO!

Saying NO is important many times in life. Kids have to learn limits, you have to say NO to leave time for yourself and your family, but you also have to say NO at work, especially if you are a consultant.


Being a consultant does not mean that you should Con & insult, but that you should help the client reach the best possible solution. It is just too tempting to go with the flow and use XML and open source for everything. It is especially important that architects have "the balls" to say NO when clients/colleagues make requests that do not make any sense. I am the first to admit that saying YES is the simplest option to make "the problem" go away, but you will pay for it in the long run.


Gartner estimates that:



Through 2011, enterprises will waste $100bn buying the wrong networking technologies and services. Enterprises are missing out on opportunities to build a network that would put them at a competitive advantage. Instead, they follow outdated design practices and collectively will waste at least $100bn in the next five years.


Add the cost of using the wrong development technologies and architectures and you get a mind-boggling number.


Just saying NO does not work unless you are an authority figure (so people trust whatever you say) so it is better to teach/guide people to reach the right conclusion themselves. I am not saying it is easy, but having kids to “practice on” daily helps :-)

Thursday, August 23, 2007

4325.aspx

Notte di fiaba - Riva del Garda 23/8-26/8

Notte di Fiaba takes place in Riva del Garda from 23/8 to 26/8. This year's theme is "Pippi Calzelunghe" (Pippi Longstocking). I cannot attend as I will be working this weekend, so please let me know what you think of it if you go.


This picture is from 2005, when the theme was “The Wizard of Oz”:



It starts today and continues until Sunday. You can find the program here.

4315.aspx

WebProxy.Credentials does not work when KeepAlive = false in .NET 1.1

While working on the next version of SocketProxy I came across a strange issue where the proxy NetworkCredentials were not being used.


This is the code:


System.Net.WebProxy uplinkProxy;

uplinkProxy = new System.Net.WebProxy(txtProxy.Text);
uplinkProxy.Credentials = new System.Net.NetworkCredential(txtUser.Text, txtPassword.Text);

System.Net.HttpWebRequest request = (HttpWebRequest)WebRequest.Create(txtSite.Text);
request.KeepAlive = false;
request.Proxy = uplinkProxy;

System.Net.HttpWebResponse response = (HttpWebResponse)request.GetResponse();
response.Close();



Using Wireshark (formerly Ethereal) I noticed that the proxy credentials were never sent:
Request



GET http://www.microsoft.com/ HTTP/1.1
Proxy-Connection: Close
Host: www.microsoft.com


Reply



HTTP/1.1 407 Proxy Authentication Required ( The ISA Server requires authorization to fulfill the request. Access to the Web Proxy filter is denied.  )
Proxy-Authenticate: Negotiate
Proxy-Authenticate: Kerberos
Proxy-Authenticate: NTLM
Proxy-Authenticate: Basic realm="..."



The interesting thing is that it works fine with HTTPS, which is why I did not notice the problem earlier.


Request:



CONNECT www.microsoft.com:443 HTTP/1.1
Proxy-Authorization: Negotiate TlRMT...==
Host: www.microsoft.com:443


Reply



HTTP/1.1 407 Proxy Authentication Required ( Access is denied.  )
Proxy-Authenticate: Negotiate TlRMT..=
Content-Length: 0    


Automatic second request:



CONNECT www.microsoft.com:443 HTTP/1.1
Proxy-Authorization: Negotiate TlRMT...=
Host: www.microsoft.com:443


Second reply



HTTP/1.1 200 Connection established
Connection: Keep-Alive
Proxy-Connection: Keep-Alive



Set KeepAlive to true and it works, since the .NET client automatically makes two requests, just like .NET 2.0 does:


First request:



GET http://www.microsoft.com/en/us/default.aspx HTTP/1.1
Proxy-Authorization: Negotiate TlRMT...==
Host: www.microsoft.com


Reply



HTTP/1.1 407 Proxy Authentication Required ( Access is denied.  )
Proxy-Authenticate: Negotiate TlRMTV...=
Connection: Keep-Alive 


Automatic second request:



GET http://www.microsoft.com/en/us/default.aspx HTTP/1.1
Proxy-Authorization: Negotiate TlRMTV...=
Host: www.microsoft.com


Second reply



HTTP/1.1 200 OK
Connection: Keep-Alive
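
If you really need KeepAlive = false on .NET 1.1, a possible workaround in the spirit of the PreAuthenticate trick above is to add the Proxy-Authorization header yourself. Treat the sketch below as an untested idea: it only covers Basic authentication (the Negotiate/NTLM schemes in the traces above need the challenge/response round trip), and it reuses the txt* fields from the code at the top of this post:

// Sketch: pre-set Basic proxy credentials manually instead of relying on WebProxy.Credentials
System.Net.WebProxy uplinkProxy = new System.Net.WebProxy(txtProxy.Text);

System.Net.HttpWebRequest request = (HttpWebRequest)WebRequest.Create(txtSite.Text);
request.KeepAlive = false;
request.Proxy = uplinkProxy;

// Build the Proxy-Authorization header the same way the Authorization header
// was built in the PreAuthenticate post above (Basic only)
byte[] credentialBuffer = new System.Text.UTF8Encoding().GetBytes(
    txtUser.Text + ":" + txtPassword.Text);
request.Headers["Proxy-Authorization"] = "Basic " + Convert.ToBase64String(credentialBuffer);

System.Net.HttpWebResponse response = (HttpWebResponse)request.GetResponse();
response.Close();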

Wednesday, August 22, 2007

4314.aspx

Windows Update error 0x80072EFD and proxy scripts

Windows Update failed on my PC earlier this week with error 0x80072EFD.  It is a very generic error that just means that the Windows Update client did not receive a response from the Windows Update Web site.


Microsoft KB article #836941 explains the steps that normally resolve the error, but they did not work for me. In my case the error was caused by the proxy auto-configuration (PAC) script I am using.



The PAC script works grand when browsing, but Windows Update refuses to work. Temporarily disabling the PAC script and configuring a static proxy solved it.


The proxy script below is based on one I got from Michele. It has the following features:



  • Disables ads in Live Messenger and Live mail by sending requests for the host rad.msn.com to a proxy that does not exist :-)

  • Uses direct internet access from home

  • Supports different proxy configurations for several clients (I have changed the real IPs and hosts)
// Configure it as file://c:/scripts/proxy.pac
function FindProxyForURL(url, host)
{
    // Set if (true) to show debug messages
    if (false)
    {
        alert(url);
        alert(host);
        alert(myIpAddress());
    }

    if (host == "127.0.0.1" || host == "localhost")
    {
        return "DIRECT";
    }

    // Disable Live Messenger/Mail ads by pointing to a proxy that does not exist
    if (host == "rad.msn.com")
    {
        return "PROXY 127.0.0.1:55555";
    }

    // Home network
    if (isInNet(myIpAddress(), "192.168.10.0", "255.255.255.0"))
    {
        return "DIRECT";
    }

    // Client 1: direct access to the local network and to the Exchange server
    if (isInNet(myIpAddress(), "10.135.160.0", "255.255.252.0"))
    {
        if ((isInNet(host, "10.135.0.0", "255.255.0.0")) || (host == "owa.company.com"))
        {
            return "DIRECT";
        }
        else
        {
            return "PROXY 10.135.160.4:8080";
        }
    }

    // Client 2
    if (isInNet(myIpAddress(), "10.10.0.0", "255.0.0.0"))
    {
        if (isInNet(host, "10.10.0.0", "255.255.0.0"))
        {
            return "DIRECT";
        }
        else
        {
            return "PROXY proxy.company.com:8080";
        }
    }

    // N other clients here..

    // Anything else goes directly
    return "DIRECT";
}

A quick intro to the functions/variables used in the script:



  • The host variable tells the script which site I am accessing

  • myIpAddress() returns my IP address, which changes according to my location

  • I match my IP address against a subnet using isInNet() to know if I am at home, in the office or on site at a client

  • The function returns the string “DIRECT“ if the host can be reached directly, otherwise it returns “PROXY ip-or-host:port“

More info on how Windows Update works with proxies here:


Tuesday, August 21, 2007

4304.aspx

The technology behind Google

I am sure you already know about Google Labs, but if you are into technology you have to check out the papers written by Googlers. There is something for everyone:



  • Algorithms and Theory (33)

  • Artificial Intelligence and Data Mining (21)

  • Audio, Video, and Image Processing (18)

  • Distributed Systems and Parallel Computing (47)

  • Human-Computer Interaction (9)

  • Hypertext and the Web (11)

  • Information Retrieval (22)

  • Machine Learning (38)

  • Natural Language Processing (20)

  • Science (7)

  • Security, Cryptography, and Privacy (12)

  • Software Engineering (10)

I recently read and learned a lot from these papers:



  • Google File System: how Google stores petabyte upon petabyte of data in a redundant and distributed system

  • BigTable: distributed storage of structured data

  • MapReduce: if you think multi-threaded programming is hard, you should try distributed parallel computing (a toy sketch follows below)
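
The core idea behind MapReduce is simple, even if Google's implementation is anything but: a map step emits key/value pairs and a reduce step folds the values for each key. Below is a toy, single-machine word count in C# that only illustrates the programming model; it has nothing to do with Google's actual distributed system:

using System;
using System.Collections.Generic;

class ToyMapReduce
{
    // Map: emit (word, 1) for every word in a document
    static IEnumerable<KeyValuePair<string, int>> Map(string document)
    {
        foreach (string word in document.Split(' '))
            yield return new KeyValuePair<string, int>(word.ToLower(), 1);
    }

    // Reduce: sum the emitted counts per word
    static Dictionary<string, int> Reduce(IEnumerable<KeyValuePair<string, int>> pairs)
    {
        Dictionary<string, int> counts = new Dictionary<string, int>();
        foreach (KeyValuePair<string, int> pair in pairs)
        {
            int current;
            counts.TryGetValue(pair.Key, out current);
            counts[pair.Key] = current + pair.Value;
        }
        return counts;
    }

    static void Main()
    {
        string[] documents = { "the quick brown fox", "the lazy dog and the fox" };

        // In the real system the map calls run in parallel on many machines
        List<KeyValuePair<string, int>> emitted = new List<KeyValuePair<string, int>>();
        foreach (string doc in documents)
            emitted.AddRange(Map(doc));

        foreach (KeyValuePair<string, int> entry in Reduce(emitted))
            Console.WriteLine(entry.Key + ": " + entry.Value);
    }
}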

It is a shame that Microsoft does not have something similar at labs.live.com

Saturday, August 18, 2007

4302.aspx

MonoDevelop - .NET development on Linux, Solaris, Mac OS X (and Windows)

Mono has come a long way since Miguel de Icaza started working on it in 2001. It can run binaries produced by Visual Studio, and it has C# and VB.NET compilers. Mono is still working on full .NET 2.0 support, but there are already a lot of interesting features in the Mono namespace:



  • Gtk#: Bindings for the popular Gtk+ GUI toolkit for UNIX and Windows systems. Other bindings are available: Diacanvas-Sharp and MrProject.
  • #ZipLib: A library to manipulate various kinds of compressed files and archives (Zip and tar).
  • Tao Framework: bindings for OpenGL
  • Mono.Directory.LDAP / Novell.Directory.LDAP: LDAP access for .NET apps.
  • Mono.Data: We ship support for PostgreSQL, MySQL, Firebird, Sybase ASE, IBM DB2, SQLite, Microsoft SQL Server, Oracle, and ODBC data sources.
  • Mono.Cairo: Bindings for the Cairo rendering engine (Our System.Drawing is implemented on top of this).
  • Mono.Posix / Mono.UNIX: Bindings for building POSIX applications using C#.
  • Mono.Remoting.Channels.Unix: Unix socket based remoting
  • Mono.Security: Enhanced security and crypto framework
  • Mono.Math: BigInteger and Prime number generation
  • Mono.Http: Support for creating custom, embedded HTTP servers and common HTTP handlers for your applications.
  • Mono.XML: Extended support for XML
  • Managed.Windows.Forms (aka System.Windows.Forms): A complete and cross platform, System.Drawing based Winforms implementation.
  • Remoting.CORBA: A CORBA implementation for Mono.
  • Ginzu: An implementation on top of Remoting for the ICE stack

You can try it on Windows or download a VMware image with Mono 1.2.4 installed on openSUSE 10.2.
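
To get a feel for how little ceremony is involved, here is the classic Gtk# hello-world window. I am writing it from memory, so treat the exact signatures as approximate:

using Gtk;

class HelloMono
{
    static void Main()
    {
        Application.Init();

        Window window = new Window("Hello from Mono");
        window.SetDefaultSize(250, 100);
        // Quit the main loop when the window is closed
        window.DeleteEvent += delegate { Application.Quit(); };
        window.Add(new Label("Hello, Gtk#!"));
        window.ShowAll();

        Application.Run();
    }
}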


Earlier this month they also released MonoDevelop 0.15, which has all the features you would expect from a modern development IDE, like code completion, project management and add-in support:



I spend my time developing for Windows these days, but it is great to see .NET support growing on other platforms as well. Mono supports embedded HTTP servers and HTTP handlers, but it would be great to see a cross-platform .NET application server.


[Update: made some small formatting changes]

Thursday, August 16, 2007

4271.aspx

A simple queue using tables in Microsoft SQL Server

I needed a fast and reliable queue in SQL when running some distributed tests, but I did not find anything that suited my needs, so I put together the simple system below. My main requirements were:



  • support hundreds of parallel readers and writers

  • each request in the queue must only be returned to one reader

  • make it easy to monitor the performance of my readers and writers.

It consists of two stored procedures:



  • usp_Request_Push adds one request to the queue (RequestsQueue table)

  • usp_Request_pop returns one request from the queue and moves it to the RequestsCompleted table

The RequestsCompleted table is optional but I added it for several reasons:



  • Track all requests to verify that my client does not "lose" messages. Any record in RequestsCompleted with a "RemovedDate" but a NULL "CompletedDate" got lost somewhere.

  • To calculate statistics like average time before the message was removed from the queue and average time before the request was completed by the client.

Tables


CREATE TABLE [dbo].[RequestsQueue](
      [id] [uniqueidentifier] NOT NULL
            CONSTRAINT [DF_RequestsQueue_id] DEFAULT (newid()),
      [request] [varchar](512) NOT NULL,
      [InsertedDate] [datetime] NOT NULL
            CONSTRAINT [DF_RequestsQueue_InsertedDate] DEFAULT (getdate()),
      CONSTRAINT [PK_RequestsQueue] PRIMARY KEY NONCLUSTERED
      (
            [id] ASC
      ) WITH (PAD_INDEX = OFF, IGNORE_DUP_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY]

CREATE TABLE [dbo].[RequestsCompleted](
      [id] [uniqueidentifier] NOT NULL
            CONSTRAINT [DF_RequestsCompleted_id] DEFAULT (newid()),
      [request] [varchar](512) NOT NULL,
      [InsertedDate] [datetime] NOT NULL
            CONSTRAINT [DF_RequestsCompleted_InsertedDate] DEFAULT (getdate()),
      [RemovedDate] [datetime] NOT NULL
            CONSTRAINT [DF_RequestsCompleted_RemovedDate] DEFAULT (getdate()),
      [CompletedDate] [datetime] NULL
) ON [PRIMARY]


 



Stored procedures


CREATE PROCEDURE [dbo].[usp_Request_Pop]
AS
BEGIN
      declare
            @request varchar(512),
            @id uniqueidentifier,
            @InsertedDate datetime

      /*
      The queue works like this:
      - Get the first record in the queue
      - "Move" the record to the RequestsCompleted table
      - Return the request to the caller

      We use a locking cursor so the same record cannot be
      returned to two different clients
      */

      SET NOCOUNT ON;

      set @id = null

      begin transaction

      DECLARE Stack_Cursor CURSOR FOR
            SELECT TOP 1 id, request, InsertedDate
            FROM RequestsQueue with (ROWLOCK, FASTFIRSTROW, XLOCK)

      OPEN Stack_Cursor;

      FETCH NEXT FROM Stack_Cursor INTO @id, @request, @InsertedDate
      IF @@FETCH_STATUS = 0
      BEGIN
            delete RequestsQueue where current of Stack_Cursor
      END;

      CLOSE Stack_Cursor;
      DEALLOCATE Stack_Cursor;

      commit transaction

      if not @id is null
      begin
            -- OPTIONAL: store to the Completed table
            insert into RequestsCompleted (id, request, InsertedDate)
                  values (@id, @request, @InsertedDate)

            -- return data to the caller
            select @id as id, @request as request, @InsertedDate as InsertedDate
      end
END


CREATE PROCEDURE [dbo].[usp_Request_Push]
      @request varchar(512)
AS
BEGIN
      SET NOCOUNT ON;
      insert into RequestsQueue(request) values (@request)
END


 


Note: The queue is not guaranteed to be FIFO, as the SELECT statement does not sort the records in any way. In practice it is normally FIFO, as SQL Server appends records to tables that do not have a clustered index.
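
From a .NET client the two procedures are straightforward to call with ADO.NET. The sketch below shows one writer call and one reader call; the connection string is an assumption and error handling is left out:

using System;
using System.Data;
using System.Data.SqlClient;

class QueueClientSketch
{
    // Assumption: adjust the connection string to your own server/database
    const string ConnectionString = "Server=.;Database=QueueTest;Integrated Security=SSPI;";

    // Writer: add one request to the queue
    static void Push(string request)
    {
        using (SqlConnection con = new SqlConnection(ConnectionString))
        using (SqlCommand cmd = new SqlCommand("usp_Request_Push", con))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            cmd.Parameters.Add("@request", SqlDbType.VarChar, 512).Value = request;
            con.Open();
            cmd.ExecuteNonQuery();
        }
    }

    // Reader: take one request from the queue, or return null if the queue is empty
    static string Pop()
    {
        using (SqlConnection con = new SqlConnection(ConnectionString))
        using (SqlCommand cmd = new SqlCommand("usp_Request_Pop", con))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            con.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                // usp_Request_Pop returns no rows when the queue is empty
                return reader.Read() ? (string)reader["request"] : null;
            }
        }
    }
}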

Wednesday, August 8, 2007

4262.aspx

Leatherman Squirt S4

I just got this in the mail:


Cool or what? It is the tiny Leatherman Squirt S4. Tiny, but “great” when you unfold it:



The Squirt S4 is tiny but it has a lot of useful features that I have missed in the past. In particular:



  • Scissors

  • Straight Knife

  • Tweezers

  • Extra-small and medium Screwdriver

  • Small flat Phillips Screwdriver

My only worry is that I will forget to remove it the next time I go by plane…


I chose the S4 as it has scissors but Leatherman offers two similar models that may suit you better depending on your needs:



P.S. Reminder to myself:



Egil, you have to learn your lesson and stop buying stuff from the USA. The price is great with the current euro-dollar exchange rate, but you pay, and you pay, and you pay again. This time you had to pay 5 euro in VAT, plus another 3.50 euro in postal fees and another 3 in customs expenses, basically doubling the cost. Very, very bad for the WAF.

4258.aspx

How to fix alarms and reminders not working on Pocket PC

This Pocket PC bug comes in several "flavors" and I have had the misfortune of encountering several of them:



  • Reminders for appointments do not work

  • Alarms ring but they do not pop up the notification so there is no way to turn them off!

Only a hard reset fixes the problem :-(


It turns out this is caused by a "feature" of Windows Mobile 2003 and newer devices. Windows Mobile runs a maintenance job at midnight. The job is scheduled to take 15 seconds, but it may take longer if the device is low on memory, has a lot of data, or has a reminder set for midnight. The end result is that the reminder is only half triggered and reminders stop working.


The free WakeUp tweak utility fixed the problem for me. I used it to enlarge the maintenance window to 30 seconds and I have not had any problems in several months. It also has tools to "clean up" the job database if it contains half triggered reminders.



You can also fix this manually by leaving the device on at midnight so the maintenance job can finish properly. But you are likely to have the problem again unless you change the size of the maintenance window.


This article explains the cause in further detail and suggests several ways to fix the problem.

Monday, August 6, 2007

4253.aspx

Compressing and hiding DLLs

Flashback…
Compressed executables were important in the good(?) old DOS days for making reverse engineering more difficult and for reducing disk space and loading time.


The AsmZip utility by Francesco Balena gives similar benefits for .NET projects:



a) simplified deployment: you need to distribute fewer files (often just the main EXE)
b) more robust applications: end users can't break the application by accidentally deleting one of its DLLs
c) fewer bytes on disk: all DLLs are compressed and appended to the main EXE file
d) the ability to "hide" some of your trade secrets, for example which 3rd party controls you've used
e) a slightly better protection of your intellectual property: compressed DLLs can't be decompiled, at least not as easily as uncompressed DLLs.


Very handy compared to deploying many separate DLLs with your .NET projects.


UPX (Ultimate Packer for eXecutables) is your friend for other platforms

4252.aspx

Build your own PC!

Jeff Atwood has a great series of posts regarding building your own PC.


I have not built my own PC from scratch in a decade, but I completely agree with Jeff that you should do it. You get better value for money and a system that is tailor-made to your needs. But even more important, there is nothing like learning by doing. I freed up space, increased my WAF, and had some fun this spring by scavenging several old PCs and putting together a decent PC, which I donated to the local school.


My "server" at home has gone through endless changes and urgent hardware fixes since I decided to host my domains at home. I have learned a lot about hosting, mail, DNS, hardware, web and security. I consider myself lucky, as there has only been one "incident" since I started hosting my domains at home. It was all my fault: I had created a test account on my SMTP server with the password "test" that I forgot to delete after my tests, essentially creating an open SMTP relay. It did not take long for hackers to break the password by trying common account/password combinations and spam the universe. Luckily I was at home, noticed the abnormal internet traffic and closed the hole quickly, but thousands of mails were sent from my IP :-(


Lesson learned. The simple web honeypot I wrote is also useful if you decide to host your own websites.


P.S. Do not expect to save money, and especially not time, if you decide to host your domains at home. Electricity alone costs more than renting resources on a Virtual Server, at least in Italy.