Wednesday, September 28, 2005

File Upload in ASP.NET

There are a couple of things we need to keep in mind while doing a file-upload in ASP.NET.

If the upload fails, it could be because of one of the following reasons:

ASP.NET limits the size of file uploads for security purposes. The default limit is 4 MB. This can be changed by modifying the maxRequestLength attribute of the <httpRuntime> element in Machine.config (it can also be overridden in Web.config).
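A minimal sketch of what that looks like (the 8192 below is just an illustrative value; maxRequestLength is specified in KB, so 8192 means 8 MB):

<system.web>
    <!-- raise the upload limit from the default 4096 KB to 8192 KB -->
    <httpRuntime maxRequestLength="8192" />
</system.web>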

If we are using the HtmlInputFile HTML control, then we need to set the Enctype property of the HtmlForm to "multipart/form-data" for this control to work properly, or we might get null for the PostedFile property.
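A bare-bones sketch of the markup (the id "fileUpload" is just a placeholder name):

<form id="uploadForm" enctype="multipart/form-data" runat="server">
    <!-- maps to an HtmlInputFile instance on the server side -->
    <input type="file" id="fileUpload" runat="server" />
    <input type="submit" value="Upload" />
</form>

With the enctype set as above, fileUpload.PostedFile will contain the uploaded file on postback.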

How does the Page's IsPostBack property work?

IsPostBack checks to see whether the HTTP request is accompanied by postback data containing a __VIEWSTATE or __EVENTTARGET parameter. If there are none, then it is not a postback.

Friday, September 23, 2005

.NET Remoting fundas


Developers often get confused when they see the following methods (all sound the same).
RegisterWellKnownServiceType()
RegisterWellKnownClientType()
RegisterActivatedServiceType()
RegisterActivatedClientType()

Finally I got hold of an image that explains the above in a lucid manner...

Difference between the "protected" access specifier in Java and .NET

In Java, when a member is declared 'protected', it is also given 'package' access automatically, i.e. a protected member can be accessed by other classes in the same package.

But in .NET, the 'protected' access specifier means that only subclasses can see the member. If we want other types in the assembly to see the 'protected' member, then we need to declare it as 'protected internal'.

Thus 'protected' in Java is roughly equivalent to 'protected internal' in .NET.
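A quick Java sketch of the difference (the class names here are made up for illustration):

// File: pkg/Base.java
package pkg;
public class Base {
    protected int count;   // protected member
}

// File: pkg/SamePackage.java -- not a subclass, just another class in the same package
package pkg;
public class SamePackage {
    void touch(Base b) {
        b.count = 1;  // legal in Java; a plain 'protected' member in C# would not be visible here
    }
}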

Tuesday, September 20, 2005

Improving the performance of Eclipse

I often used to get frustrated with the speed of Eclipse on my machine. Then I heard from my friend how I can increase the memory allocated to Eclipse: you just have to pass values from the command prompt using the -vmargs argument (-Xms sets the initial heap size and -Xmx the maximum).

eclipse -vmargs -Xms256m -Xmx256m

This single change gave me a very good "perceived" performance benefit while using the IDE.

Other tips to increase the performance of Eclipse can be found at:
http://eclipsewiki.editme.com/GeneralFaq

Friday, September 16, 2005

Eating your own dog food :)

I have come across this slang many times over the last few years. So what does "eating your own dog food" mean?

“A company that eats its own dog food sends the message that it
considers its own products the best on the market.”


“This slang was popularized during the dotcom craze when some
companies did not use their own products and thus could "not even eat
their own dog food". An example would've been a software company that
created operating systems but used its competitor's software on its
corporate computers.”

Thursday, September 15, 2005

How do download managers work?

I have often wondered how download managers work. I wanted to know the internal workings that make it possible for them to download files faster.

The main functions of a download manager are:
• Resuming interrupted downloads (i.e., downloading only the rest of the file instead of restarting the process from the very beginning);
• Scheduled operation: connecting to the Internet, downloading a list of specific files and disconnecting according to a user-defined schedule (e.g. at night, when the connection quality is usually higher and connection rates are lower);
• Some download managers have additional functions: searching for files on WWW and FTP servers by name, downloading files in several "streams" from one server or from different mirror servers, etc.

Most download managers use the concept of "Multi-connection downloading" - the file is downloaded in several segments through multiple connections and reassembled at the user's PC.
To understand how this would work, we first need to understand a feature of web-servers.
A lot of web servers (HTTP and FTP) today support the "resume download" function - what this means is that if your download is interrupted or stopped, you can resume downloading the file from where you left off. But the question now arises: how does the client (web browser) tell the server what part of the file it wants, or where to resume the download? Is this a standard or server-proprietary? I was surprised when I found out that it is the HTTP protocol itself that has support for "range downloads", i.e. when you request a resource, you can also specify what portion/segment of the resource you want to download. This information is passed from the client as an HTTP header.

See the HTTP header snippet below:

GET http://lrc.aiha.com/English/Training/Dldmgrs-Eng.pdf?Cache HTTP/1.1
Host: lrc.aiha.com
Accept: */*
User-Agent: DA 7.0
Proxy-Authorization: Basic bmFyZW5kcjpuYXJlbjEyNDM=
Connection: Close
Range: bytes=0-96143

Now what download managers do is start a number of threads that download different portions of the resource. So the download manager will make another request with the header:

GET http://lrc.aiha.com/English/Training/Dldmgrs-Eng.pdf?Cache HTTP/1.1
Host: lrc.aiha.com
Accept: */*
User-Agent: DA 7.0
Proxy-Authorization: Basic bmFyZW5kcjpuYXJlbjEyNDM=
Connection: Close
Range: bytes=96143-192286

This solves the mystery of how the download managers are able to simultaneously download different portions of the resource.
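Just to make the idea concrete, here is a minimal Java sketch of downloading one such segment using the Range header (the URL, file names and byte offsets are only placeholders):

import java.io.FileOutputStream;
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class SegmentDownload {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://example.com/somefile.pdf");  // placeholder URL
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        // ask the server for bytes 0..96143 only
        conn.setRequestProperty("Range", "bytes=0-96143");
        // 206 Partial Content => the server honoured the Range header
        // 200 OK              => the server ignored it and is sending the whole file
        System.out.println("HTTP status: " + conn.getResponseCode());
        InputStream in = conn.getInputStream();
        FileOutputStream out = new FileOutputStream("segment-0.part");
        byte[] buf = new byte[8192];
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
        }
        out.close();
        in.close();
    }
}

A download manager essentially runs several such requests in parallel threads, each with a different byte range, and then stitches the ".part" segments back together into the final file.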

Important note: To resume interrupted downloads, it is not enough to use a download manager: the server from which the file is being downloaded must also support download resumption. Unfortunately, some servers do not support this function, and are called "non-resumable."
So against such servers your download manager won't help (no increase in speed either), as they would ignore the HTTP "Range" header.

But I was still confused about how exactly this increases speed. After all, if the current bandwidth is fully utilized with one connection, how does making more connections help? The answers I found on the net are as below:

Normal TCP connections, as used by HTTP, encounter a maximum connection throughput well below that of the available bandwidth in circumstances with even moderate amounts of packet loss and signal latency (so because of packet loss and latency, the sender has to retransmit some packets and back off). Multiple TCP connections can help to alleviate these effects and, in doing so, provide faster downloads and better utilization of available bandwidth.

Opening more connections means less sharing with others. Web servers are set up to split their bandwidth into several streams to support as many users downloading as possible. As an example, if the download manager created eight connections to the server, the server thinks it is transmitting to eight different users and delivers all eight streams to the same user. Each of the eight requests asks for data starting at a different location in the file.

Here are the links to some good download managers:

www.getright.com
www.internetdownloadmanager.com/
www.netants.com
www.alwaysfreeware.co.uk/dload.html

Accessing Windows Registry using Java

I always wished someone could provide me with a neat API for accessing the Windows registry using Java. I did not want to get into the nitty-gritty of JNI calls.
Fortunately there is an opensource library available at:
http://www.trustice.com/java/jnireg/

Check out the source code to understand it better.

Monday, September 12, 2005

Recursively adding files to ClearCase.

Recently I wanted to add a whole directory structure to a ClearCase VOB and I was surprised to see that the graphical ClearCase Explorer did not have an option to recursively add an entire directory structure to source control. Thankfully I found a command-line tool ("clearfsimport") using which I could import/add all directories and files to the VOB.
The general usage of the command is :

clearfsimport [ -preview ] [ -follow ] [ -recurse ] [ -rmname ]
[ -comment comment ] [ -mklabel label ] [ -nsetevent ] [ -identical ] [ -master ] [ -unco ] source-name [ . . . ] target-VOB-directory
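For example, something like this would preview an import of a whole directory tree (the paths are placeholders; drop -preview to actually perform the import):

clearfsimport -preview -recurse /tmp/mysource /vobs/myproject/src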

For more details RTFM or check out these links:

http://www.cmcrossroads.com/ubbthreads/showflat.php?Cat=&Number=33221
http://www.cmcrossroads.com/ubbthreads/showflat.php?Number=33343

Friday, September 09, 2005

Double-checked locking idiom and the Singleton pattern.

I have seen a lot of programs using the double-checked locking pattern in their singleton classes to avoid the overhead of synchronization for each method call.

But it is now known that the double-checked locking idiom does not work. It is because of the memory model of Java, which allows out-of-order writes to memory, i.e. there exists a window of time when the instance reference is not null but the object is still not fully initialized (its constructor has not returned)!
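For reference, this is the idiom being talked about, in a bare-bones sketch - the check-then-lock-then-check structure is exactly what the reordering breaks:

public class Singleton {
    private static Singleton instance;

    private Singleton() { }

    public static Singleton getInstance() {
        if (instance == null) {                    // first check, without the lock
            synchronized (Singleton.class) {
                if (instance == null) {            // second check, with the lock
                    // the write below may be reordered, so another thread can see a
                    // non-null 'instance' before the constructor has finished running
                    instance = new Singleton();
                }
            }
        }
        return instance;
    }
}

The boring alternatives - a fully synchronized getInstance() or eager initialization in a static field - do not have this problem.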

For more explanation, check out this simple and lucid article:
http://www-128.ibm.com/developerworks/java/library/j-dcl.html

Thursday, September 08, 2005

Immutable Collections in Java

Quite often, we may feel the need for an immutable collection, i.e. one that users can only read and cannot modify and bring into an invalid state.
The Collections class has methods to help us with this. For example, to get an unmodifiable list, use the following code:

List ul = Collections.unmodifiableList(list);

Check out the other methods of the Collections class, which give you similar helpers for making other kinds of collections unmodifiable. Any attempt to modify the returned list, whether direct or via its iterator, results in an UnsupportedOperationException.
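A small sketch of that in action:

import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class UnmodifiableDemo {
    public static void main(String[] args) {
        List list = new ArrayList();
        list.add("one");
        list.add("two");

        List ul = Collections.unmodifiableList(list);
        System.out.println(ul.get(0));  // reads are fine
        ul.add("three");                // throws UnsupportedOperationException
    }
}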

The 'immutability' aspect can also be used as a design pattern for concurrent read-and-write access to a collection (think multiple users or threads): anyone who needs to modify the collection makes a copy of the collection object, modifies the copy, and publishes it as the new version, as sketched below.
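A rough sketch of that copy-then-modify idea (the class and method names are made up): writers work on a private copy and then publish it, while readers only ever see unmodifiable snapshots.

import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class CopyOnWriteHolder {
    // readers always see an unmodifiable snapshot; volatile makes a newly published
    // snapshot visible to reader threads
    private volatile List snapshot = Collections.unmodifiableList(new ArrayList());

    public List read() {
        return snapshot;
    }

    public synchronized void add(Object item) {
        List copy = new ArrayList(snapshot);            // copy the current contents
        copy.add(item);                                 // modify the copy
        snapshot = Collections.unmodifiableList(copy);  // publish the new snapshot
    }
}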