Wednesday, August 29, 2007

4347.aspx

.NET libraries for writing your own messenger bot

I do not use Instant Messaging as much as some of my friends but I can imagine several services that would be useful via IM. There are endless opportunities when you consider that IM can offer push as well as pull services.


It looks like I am not the only one as there are several open source .NET libraries for working with Microsoft Messenger:



  • DotMSN 2.0 supports the MSNP9 protocol.

  • MSNPSharp extends DotMSN 2.0 with support for the MSNP11 protocol.

I am busy working on several other hobby projects at the moment, but I would love to put together an IM bot later this year. What kind of services would you like to access via IM?

4349.aspx

Real PreAuthenticate for .NET Web Services

.NET Web Service clients do not send authentication details with their requests by default, so you get a traffic pattern like this:


Request 1:



Client: GET someUrl
Server: 401 WWW-Authenticate Basic
Client: GET with Authorization header
Server: 200 OK


Request 2:



Client: GET someUrl
Server: 401 WWW-Authenticate Basic
Client: GET with Authorization header
Server: 200 OK


The PreAuthenticate property improves the situation if you make many repeated calls to a web service.


Request 1:



Client: GET someUrl
Server: 401 WWW-Authenticate Basic
Client: GET with Authorization header
Server: 200 OK


Request 2:



Client: GET someUrl with Authorization header
Server: 200 OK


The first request still does not contain the Authorization header, but the subsequent requests do. I do not like this behavior because the systems I work on usually make a single web service call per client request, so I still pay for a 401 round trip every time.


You can fix the problem by manually inserting the Authorization header in the requests. Do not modify the Web Service client generated by Visual Studio, as you will lose your changes if you update the Web Reference. Extend the Web Service client instead and override the GetWebRequest method:


using System;
using System.Net;
using System.Text;

public class WSTestPreAuthenticate : WSTest
{
    protected override WebRequest GetWebRequest(Uri uri)
    {
        HttpWebRequest request = (HttpWebRequest)base.GetWebRequest(uri);

        if (PreAuthenticate)
        {
            NetworkCredential networkCredentials =
                Credentials.GetCredential(uri, "Basic");

            if (networkCredentials != null)
            {
                // Build the Basic Authorization header by hand so that it is
                // sent with the very first request instead of after a 401.
                byte[] credentialBuffer = new UTF8Encoding().GetBytes(
                    networkCredentials.UserName + ":" +
                    networkCredentials.Password);
                request.Headers["Authorization"] =
                    "Basic " + Convert.ToBase64String(credentialBuffer);
            }
        }

        return request;
    }
}



The change above means that PreAuthenticate works as it should, since the Authorization header is always sent:


Request 1:



Client: GET someUrl with Authorization header
Server: 200 OK


The client has to set the PreAuthenticate property and provide the credentials:


WSTestPreAuthenticate test = new WSTestPreAuthenticate();
test.PreAuthenticate = true;
test.Url = "http://localhost/WSTest/WSTest.asmx";
test.Credentials =
    new System.Net.NetworkCredential("user", "password");
int ret = test.Method("test");
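
The Authorization value produced by the override is just the "user:password" pair encoded as UTF-8 and then Base64. A minimal standalone sketch of that encoding step, using the same placeholder credentials as above:

using System;
using System.Text;

class BasicHeaderSketch
{
    static void Main()
    {
        // Same encoding as in GetWebRequest above: "user:password" as UTF-8, then Base64
        string headerValue = "Basic " + Convert.ToBase64String(
            Encoding.UTF8.GetBytes("user" + ":" + "password"));

        // Prints: Authorization: Basic dXNlcjpwYXNzd29yZA==
        Console.WriteLine("Authorization: " + headerValue);
    }
}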

Tuesday, August 28, 2007

4346.aspx

Performance problem with ADO, stored procedures, BLOBs and SQL Server

There is a known bug in ADO that causes horrible performance when you insert a BLOB via a stored procedure. Working on an (almost) real-time data import I noticed that importing BLOBs took forever…


I did the right thing and used stored procedures for all data access. In this case it was a mistake as it took several minutes to insert a small blob of a few hundred KB.


I ended up with the workaround below. It is ugly but the least “dirty” solution I found:


' ADO constants (needed when no type library is referenced)
Const adCmdStoredProc = 4
Const adUseClient = 3
Const adOpenKeyset = 1
Const adLockPessimistic = 2

' Insert the record via the stored procedure, passing a dummy
' value for the BLOB parameter to avoid the performance bug
Set oCon = CreateObject("ADODB.Connection")
oCon.ConnectionString = CONNECTION_STRING
oCon.Open
Set oCmd = CreateObject("ADODB.Command")
Set oCmd.ActiveConnection = oCon
oCmd.CommandType = adCmdStoredProc
oCmd.CommandText = "usp_insert_item"
oCmd.Parameters.Refresh
oCmd.Parameters("@whatever").Value = someDataHere
oCmd.Parameters("@PageData").Value = "dummy"
' The stored procedure returns the ID of the new record as newID
Set oRS = oCmd.Execute
BinaryID = oRS("newID")
oRS.Close

' Update the binary data field directly via a recordset
Set oRS = CreateObject("ADODB.Recordset")
oRS.CursorLocation = adUseClient
oRS.CursorType = adOpenKeyset
oRS.LockType = adLockPessimistic
Set oRS.ActiveConnection = oCon
oRS.Open "select BlobField from TheTable where ID=" & BinaryID
oRS.Fields("BlobField").Value = TheBinaryData
oRS.Update
oRS.Close

Set oRS = Nothing
Set oCmd = Nothing
Set oCon = Nothing

Friday, August 24, 2007

4326.aspx

The Sun BlackBox

Wouldn't it be fantastic if the Sun BlackBox could be used in emergency situations like this photo(shop?) shows:


But using them for Red Cross emergencies is not feasible today, as the BlackBox is hungry:



  • Cooling water supply: 60 gallons (~230 liters) per minute of cold water at 13 degrees Celsius, which means either a large cold-water supply or a smaller supply with a serious cooler

  • Dual 600-amp power feeds

  • A high-bandwidth internet connection

Pretty hard to find in emergency situations, in Africa or elsewhere.


It is no wonder the box is hungry considering the possible configurations:



  • A single Project Blackbox could accommodate 250 Sun Fire T1000 servers with CoolThreads technology, for 2,000 cores and 8,000 simultaneous threads.

  • A single Project Blackbox could accommodate 250 x64-based servers with 1000 cores.

  • A single Project Blackbox could provide as much as 1.5 petabytes of disk storage or 2 petabytes of energy-efficient tape storage.

  • A single Project Blackbox could provide 7 terabytes of memory.

  • A single Project Blackbox could handle up to 10,000 simultaneous desktop users. 

  • A single Project Blackbox currently has sufficient power and cooling to support 200 kilowatts of rackmounted equipment.

Ignoring the humanitarian angle, it is still an interesting concept for some companies due to its benefits:



  • The BlackBox gives a lot of computing power considering its size

  • It is quick to deploy (just plug in the water, internet connection and power)

  • It is cheaper than building your own data center

  • It gives a 20% improvement in energy saving compared to normal data centers

You can get a look inside the box and more info on YouTube.

Via SciAm August 2007 [pages 74-77]

4316.aspx

Sometimes you just have to say NO!

Saying NO is important many times in life. Kids have to learn limits, and you have to say NO to leave time for yourself and your family, but you also have to say NO at work, especially if you are a consultant.


Being a consultant does not mean that you should Con & insult but help the client reach the best possible solution. It is just too tempting to go with the flow and use XML and open source for everything. It is especially important that architects have "the balls" to say NO when clients/colleagues make requests that do not make any sense. I am the first to admit that saying YES is the simplest option to make "the problem" go away but you will pay for it in the long run.


Gartner estimates that:



Through 2011, enterprises will waste $100bn buying the wrong networking technologies and services. Enterprises are missing out on opportunities to build a network that would put them at a competitive advantage. Instead, they follow outdated design practices and collectively will waste at least $100bn in the next five years.


Add the cost of using the wrong development technologies and architectures and you get a mind-boggling number.


Just saying NO does not work unless you are an authority figure (so people trust whatever you say) so it is better to teach/guide people to reach the right conclusion themselves. I am not saying it is easy, but having kids to “practice on” daily helps :-)

Thursday, August 23, 2007

4325.aspx

Notte di fiaba - Riva del Garda 23/8-26/8

Notte di Fiaba takes place in Riva del Garda from 23/8 to 26/8. This year's theme is "Pippi Calzelunghe" (Pippi Longstocking). I cannot attend as I will be working this weekend, so please let me know what you think of it if you go.


This picture is from 2005 when the theme was “The Wizard of Oz”:



It starts today and continues until Sunday. You can find the program here.

4315.aspx

WebProxy.Credentials does not work when KeepAlive = false in .NET 1.1

While working on the next version of SocketProxy I came across a strange issue with proxy NetworkCredentials not being used.


This is the code:


System.Net.WebProxy uplinkProxy;

uplinkProxy = new System.Net.WebProxy(txtProxy.Text);
uplinkProxy.Credentials = new System.Net.NetworkCredential(txtUser.Text, txtPassword.Text);

System.Net.HttpWebRequest request = (HttpWebRequest)WebRequest.Create(txtSite.Text);
request.KeepAlive = false;
request.Proxy = uplinkProxy;

System.Net.HttpWebResponse response = (HttpWebResponse)request.GetResponse();
response.Close();



Using Wireshark (formerly Ethereal) I noticed that my proxy credentials were not being sent:
Request



GET http://www.microsoft.com/ HTTP/1.1
Proxy-Connection: Close
Host: www.microsoft.com


Reply



HTTP/1.1 407 Proxy Authentication Required ( The ISA Server requires authorization to fulfill the request. Access to the Web Proxy filter is denied.  )
Proxy-Authenticate: Negotiate
Proxy-Authenticate: Kerberos
Proxy-Authenticate: NTLM
Proxy-Authenticate: Basic realm="..."



The interesting thing is that it works fine with HTTPS, which is why I did not notice the problem earlier.


Request:



CONNECT www.microsoft.com:443 HTTP/1.1
Proxy-Authorization: Negotiate TlRMT...==
Host: www.microsoft.com:443


Reply



HTTP/1.1 407 Proxy Authentication Required ( Access is denied.  )
Proxy-Authenticate: Negotiate TlRMT..=
Content-Length: 0    


Automatic second request:



CONNECT www.microsoft.com:443 HTTP/1.1
Proxy-Authorization: Negotiate TlRMT...=
Host: www.microsoft.com:443


Second reply



HTTP/1.1 200 Connection established
Connection: Keep-Alive
Proxy-Connection: Keep-Alive



Set KeepAlive to true and it works, since the .NET 1.1 client then automatically makes two requests, just like .NET 2.0 does:


First request:



GET http://www.microsoft.com/en/us/default.aspx HTTP/1.1
Proxy-Authorization: Negotiate TlRMT...==
Host: www.microsoft.com


Reply



HTTP/1.1 407 Proxy Authentication Required ( Access is denied.  )
Proxy-Authenticate: Negotiate TlRMTV...=
Connection: Keep-Alive 


Automatic second request:



GET http://www.microsoft.com/en/us/default.aspx HTTP/1.1
Proxy-Authorization: Negotiate TlRMTV...=
Host: www.microsoft.com


Second reply



HTTP/1.1 200 OK
Connection: Keep-Alive
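
For reference, the working variant is simply the code above with KeepAlive left at its default of true. A minimal sketch, reusing the same form fields as before:

// Same as above, but KeepAlive stays at its default (true) so the
// client answers the 407 challenge with a second, authorized request.
System.Net.WebProxy uplinkProxy = new System.Net.WebProxy(txtProxy.Text);
uplinkProxy.Credentials = new System.Net.NetworkCredential(txtUser.Text, txtPassword.Text);

System.Net.HttpWebRequest request = (HttpWebRequest)WebRequest.Create(txtSite.Text);
request.KeepAlive = true;   // explicit here only to highlight the difference
request.Proxy = uplinkProxy;

System.Net.HttpWebResponse response = (HttpWebResponse)request.GetResponse();
response.Close();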