Much of the software I use on a day-to-day basis requires an HTTP connection to the Internet. Unfortunately, not all of this software includes reliable Web proxy support for Windows Authentication (NTLM). Whilst many people connect to the Internet from networks without proxy servers, I'm often connecting from corporate networks through Microsoft ISA Server.
Here is some advice for anyone writing software that uses the Internet:
- Include proxy support in your application. You wouldn't believe how many applications get uninstalled because they don't support proxy servers.
- Ensure that your proxy support handles auto-configuration (.pac) files. If you don't go this far, make it clear how the proxy host name should be specified: whether to include "http://" at the beginning, and what port number to use.
- Provide support for various authentication mechanisms. Many corporate networks use NTLM authentication. If your application runs on the Microsoft CLR, you get support for this with the CredentialCache class. Native applications can use the support available in WinInet or the more recent WinHttp; the latter includes a proxy configuration tool to make life a little easier.
- Respect user credentials. If a user has to explicitly provide their NT logon credentials to your application, make sure you store them securely.
- When requests fail, provide useful error messages and server names to the user. This will help them figure out how to make connections work; setup is often a process of trial and error for users who aren't given any information by their network administrators.
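On the CLR, the proxy-plus-NTLM case mentioned above takes only a few lines. Here's a minimal sketch — the proxy host, port and target URL are placeholders you'd replace with real values (or, better, values read from configuration):

```csharp
using System;
using System.Net;

class ProxyExample
{
    static void Main()
    {
        // Hypothetical proxy address -- substitute the real host and port.
        WebProxy proxy = new WebProxy("http://proxy.example.com:8080");

        // DefaultCredentials supplies the current user's NT logon
        // credentials, so NTLM challenges succeed without prompting.
        proxy.Credentials = CredentialCache.DefaultCredentials;

        HttpWebRequest request =
            (HttpWebRequest)WebRequest.Create("http://example.com/");
        request.Proxy = proxy;

        using (HttpWebResponse response =
            (HttpWebResponse)request.GetResponse())
        {
            Console.WriteLine(response.StatusCode);
        }
    }
}
```

Note that `DefaultCredentials` respects the user's existing logon rather than asking them to re-enter a password, which also covers the "respect user credentials" point: nothing needs to be stored at all.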
One of the things I find most frustrating on .NET projects is working with relational data sources. My experience with DataSets in the 1.x days was far from positive: they proved too inefficient and difficult to debug. This has changed in 2.0 with the many improvements to the API and the introduction of visualizers in the integrated debugger. I'm still not sold on this solution, but at least things are improving ;)
My preference has been to develop a layer of custom objects which get called from the upper layers of the application. This is very flexible and easy to debug. In addition, you can create these objects before any back end exists, which makes prototyping simpler. To be fair, this can be a bit time consuming, and I have tried to augment it with code generation using CodeSmith. Working this way lets me deal with objects in a fashion native to the .NET platform, take advantage of IntelliSense and simplify unit testing.
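To make the custom-object approach concrete, here's a rough sketch of the pattern. The `Customer` class, table and column names are invented for illustration, and this is the hand-written shape that a CodeSmith template would generate for each table:

```csharp
using System;
using System.Collections.Generic;
using System.Data.SqlClient;

// Hypothetical domain object -- the table and columns are invented.
public class Customer
{
    private int id;
    private string name;

    public int Id { get { return id; } set { id = value; } }
    public string Name { get { return name; } set { name = value; } }
}

public class CustomerRepository
{
    private readonly string connectionString;

    public CustomerRepository(string connectionString)
    {
        this.connectionString = connectionString;
    }

    // Upper layers deal with List<Customer> rather than DataSets,
    // which keeps debugging and unit testing straightforward.
    public List<Customer> GetAll()
    {
        List<Customer> customers = new List<Customer>();
        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand(
            "SELECT CustomerId, Name FROM Customers", connection))
        {
            connection.Open();
            using (SqlDataReader reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Mapping is kept in one place so generated code
                    // can replace it later.
                    Customer c = new Customer();
                    c.Id = reader.GetInt32(0);
                    c.Name = reader.GetString(1);
                    customers.Add(c);
                }
            }
        }
        return customers;
    }
}
```

During prototyping, a stub repository returning a hard-coded `List<Customer>` can stand in before the database exists.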
I'm looking at two other solutions - LLBLGen Pro and NHibernate. LLBLGen seems better suited to my needs at present since it has a better user experience. Both of these tools map generated objects to the tables in the database, so you can avoid switching back and forth between programming models. Complex queries are expressed using custom syntax, and this is where the story sours for NHibernate, and for LLBLGen to a lesser extent. LLBLGen makes it simple to wrap existing stored procedures, which is potentially useful when the SQL gets complex. Ideally I'd like to rid myself of the relational model and SQL altogether, but I guess we're going to have to live with it forever.
On this topic, it's worth reading Ted Neward's paper on the object-relational divide and the various technologies that have been developed to bridge it. The paper was written for MSDN, so it covers the LINQ technology that will likely be part of C# 3.0.
I've been evaluating some imaging controls from Atalasoft for a client project. The application uses Windows Forms, which poses licensing issues with many of the imaging components out there. After some searching I ended up on the Atalasoft site and downloaded a trial. What you get in the box is impressive: hybrid managed C++/C# assemblies that don't rely on native code, excellent online help and a number of sample applications that cover useful areas of the API. These haven't been updated to support .NET 2.0 features such as BackgroundWorker, but this is simple, if tedious, to code yourself.
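For anyone wanting to retrofit that themselves, the BackgroundWorker pattern looks roughly like this. I'm deliberately using plain GDI+ (`Image.FromFile`) as a stand-in for the image decoding, since I don't want to misrepresent Atalasoft's own API here — swap in the toolkit's loading calls where appropriate:

```csharp
using System;
using System.ComponentModel;
using System.Drawing;
using System.IO;

public class ThumbnailLoader
{
    private readonly BackgroundWorker worker = new BackgroundWorker();

    public ThumbnailLoader()
    {
        worker.WorkerReportsProgress = true;
        worker.DoWork += new DoWorkEventHandler(OnDoWork);
        worker.ProgressChanged +=
            new ProgressChangedEventHandler(OnProgressChanged);
    }

    public void LoadFolder(string path)
    {
        worker.RunWorkerAsync(path);
    }

    private void OnDoWork(object sender, DoWorkEventArgs e)
    {
        // Runs on a thread-pool thread: decode and scale off the UI thread.
        string path = (string)e.Argument;
        foreach (string file in Directory.GetFiles(path, "*.jpg"))
        {
            using (Image image = Image.FromFile(file))
            {
                Size size = ScaleToFit(image.Width, image.Height, 96);
                Image thumb = new Bitmap(image, size);
                worker.ReportProgress(0, thumb);
            }
        }
    }

    private void OnProgressChanged(object sender, ProgressChangedEventArgs e)
    {
        // Marshalled back to the UI thread: safe to add the thumbnail
        // to a ListView, thumbnail control, etc.
        Image thumb = (Image)e.UserState;
        // ...
    }

    // Pure helper: thumbnail dimensions that preserve aspect ratio.
    internal static Size ScaleToFit(int width, int height, int max)
    {
        double scale = Math.Min((double)max / width, (double)max / height);
        return new Size(
            Math.Max(1, (int)(width * scale)),
            Math.Max(1, (int)(height * scale)));
    }
}
```

The useful property of BackgroundWorker is that `ProgressChanged` fires on the thread that started the worker, so no explicit `Control.Invoke` calls are needed.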
Unlike vendors that have carried a product forward from the COM days, Atalasoft have implemented an object model that stays close to the .NET Framework design guidelines. Base functionality in the toolkit is good, but DotImage Pro is where the cool WinForms bits live. They include ThumbnailView and FolderThumbnailView classes which can load from custom objects or watch the filesystem respectively. I'd imagine most people just need to load thumbnails from disk, but my application needs to load images from a range of sources.
I used the PDF Rasterizer extension to extract thumbnails from an Acrobat document and was pleasantly surprised by the results. Memory consumption was low and didn't increase massively even with large numbers of thumbnails. You can find out more about the memory management on the Atalasoft site. As I use more features of the toolkit I'll probably post some snippets online.