Wednesday, March 29, 2006

Top Ten Tips On How To Become A Rock Star Programmer (NOT)

Mikael Grev's JavaLobby article about the "Top Ten Tips on How to Become a Rock Star Programmer" has gained quite a bit of publicity lately. I first heard about it on the Java Posse podcast, and now Jeff Atwood has posted a response on his weblog as well.

Like several other commentators so far, I tend to disagree with a number of Mikael's statements, for example:

#2 Use a big TFT screen
I still have tube monitors both at work and at home, and my code looks pretty much the same as it would if I were sitting in front of a TFT ;-) Besides, my email client at work chokes on any attempt to run in dual-screen mode (yes, it's Lotus Notes...)

#4 Don't learn APIs too well

Do you want people to have to look up the documentation on every other line? I code most API calls "blindfolded" and rarely consult the JDK or .NET Framework documentation. IntelliSense helps, but only as long as I type component or method names I am already familiar with. And I have met one or two guys who "didn't learn APIs too well". They implemented their own version of String.indexOf() - need I say more?
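Knowing an API well also means knowing what is already in it. In C#, for instance, the substring search those colleagues reimplemented by hand is a single built-in call:

```csharp
using System;

public static class Demo
{
    public static void Main()
    {
        // The framework already covers this - no hand-rolled scan needed.
        Console.WriteLine("hello world".IndexOf("world"));  // 6
        Console.WriteLine("hello world".IndexOf("xyz"));    // -1 (not found)
    }
}
```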

#7 Go back and enhance your old code
Now, my old code might not be pretty, but it works. I know what I would do differently today, but that knowledge stems from somewhere else. And if I know how to do it now, why bother making that change to an otherwise perfectly running system - it might break just for the beauty of a new approach. I'd rather apply what I have learned to current projects than to old ones.

#9 Don't ask people for advice
From personal experience, I have benefited most from working with and learning from great developers. I never hesitated to ask them for advice. And when someone in turn asks me for help, I am happy to assist. Mikael recommends googling instead. Google is great for looking things up, but I wouldn't rely on it exclusively.

Summing up, my recommendations on how to become a Rock Star Programmer go in a slightly different direction:

#1 Get a solid education. Not everyone needs a PhD in computer science, but your local "how-to-become-a-web-developer-in-six-weeks" course won't make you a real programmer either.

#2 Never stop learning. Read books and magazine articles, check developer weblogs and listen to technology podcasts, go to industry conferences, take each training opportunity your employer offers, etc.

#3 Work as a developer during your education already, at least part-time. This allows you to combine theory and practice. Also, try to find a pet project you can pursue in your spare time, in an area apart from your daily field of work.

#4 Make sure you truly understand why things work the way they do (or why they don't). Don't just guess. If a library operates fine in debug mode but breaks in release mode, go and figure out where that flawed memory access happens.

#5 Observe how great developers are doing it. Look at their code and ask them when in doubt. And don't forget to transfer the knowledge you gained to the next one in line.

#6 Eat your own dog food (I agree with Mikael Grev on this one). Don't stay in an ivory tower designing stuff that might not be very useful to others.

#7 Use the best tools you can get, as this will prevent you from wasting your time on mundane tasks.

#8 Be self-critical. Seek perfection, but also be conscious that you will never reach it. At the same time, stay humble - application programming isn't rocket science. Making mistakes is only human; just make sure you don't neglect the possibility that your code contains errors, and act accordingly.

#9 Be curious. Investigate what goes on under the hood: decompile stuff you are interested in, have a look at the memory image in your debugger, use code analysis and profiling tools (like Purify, BoundsChecker, OptimizeIt and Lint), or check at which rate your application opens and closes database connections - things like that.

#10 And finally: Have passion for your domain of work.

Monday, March 20, 2006

"Vanilla DAL" Sample Screenshots Available

Sourceforge encourages the use of screenshots even for component libraries. For a library, what you normally end up doing is capturing some configuration file content, or an IDE screenshot of a few lines of code invoking the library's API. So here are some screenshots depicting a simple Vanilla DAL application.

Sunday, March 19, 2006

The First Item Ever Sold On EBay

Legend has it that the first item ever sold on eBay was a PEZ candy dispenser. The story goes that eBay founder Pierre Omidyar cobbled together the first version of his "AuctionWeb" site (which later became eBay) in order to help his fiancee trade the PEZ dispensers she was collecting. This tale has been retold over and over again, and in the foyer of the German eBay subsidiary in Berlin there is a showcase full of PEZ dispensers seemingly supporting it. eBay CEO Meg Whitman once even posed for a photo with lined-up PEZ dispensers.

Truth is, the PEZ story is an invention of eBay's first public relations manager, Mary Lou Song. Back in 1997, Song was having difficulties attracting media attention for eBay's online auction efforts. When she met Pam Wesley, Omidyar's fiancee, Wesley told her about her hobby of collecting PEZ dispensers. So Song literally made up eBay's founding story, and the journalists loved it.

Adam Cohen revealed what really happened in his book "The Perfect Store", and this has since been confirmed by eBay.

Actually, the first item Pierre Omidyar ever sold on eBay was his broken laser pointer. It sold for 14 dollars.

BTW, my latest eBay purchase just arrived yesterday: a Magnavox Odyssey 2 video game console (known in Europe as the Philips Videopac G7000).

Thursday, March 16, 2006

.NET Explicit Interface Implementations

As most .NET developers are well aware, explicit interface implementations (where the interface name is prefixed to the method name) are a mechanism to avoid naming collisions between interfaces with identical method signatures. The caller then has to cast to whichever interface he intends to refer to, e.g.

public interface ITest1 {
    void Test();
}

public interface ITest2 {
    void Test();
}

public class TestImpl : ITest1, ITest2 {
    void ITest1.Test() { /* ... */ }
    void ITest2.Test() { /* ... */ }
}

TestImpl test = new TestImpl();
((ITest1)test).Test();
((ITest2)test).Test();

Notice that there are no access modifiers on the method implementations, which forces the caller to cast to the interface he refers to - only then does the method implementation become accessible.

But there is more to it. This mechanism also allows you to prefer one implementation over the other, e.g.

public class TestImpl : ITest1, ITest2 {
    public void Test() { /* ... */ }
    void ITest2.Test() { /* ... */ }
}

Now the first method implementation will be invoked on references to TestImpl and ITest1, the second one only on references to ITest2.
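To watch those dispatch rules in action, here is a self-contained variant of the example above (the LastCall field is made up purely so the effect becomes observable):

```csharp
using System;

public interface ITest1 { void Test(); }
public interface ITest2 { void Test(); }

public class TestImpl : ITest1, ITest2
{
    public string LastCall;

    // Serves calls through TestImpl and ITest1 references.
    public void Test() { LastCall = "public/ITest1"; }

    // Serves calls through ITest2 references only.
    void ITest2.Test() { LastCall = "explicit/ITest2"; }
}

public static class Demo
{
    public static void Main()
    {
        TestImpl test = new TestImpl();

        test.Test();
        Console.WriteLine(test.LastCall);   // public/ITest1

        ((ITest1)test).Test();
        Console.WriteLine(test.LastCall);   // public/ITest1

        ((ITest2)test).Test();
        Console.WriteLine(test.LastCall);   // explicit/ITest2
    }
}
```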

.NET uses a similar approach on some framework components, e.g. System.Data.SqlClient.SqlDataAdapter, which implements System.Data.IDbDataAdapter. Part of IDbDataAdapter is the following property declaration:

IDbCommand SelectCommand { get; set; }

... where, as expected, SqlCommand implements IDbCommand. But all that can be found on SqlDataAdapter's public members is:

public SqlCommand SelectCommand { get; set; }

SqlDataAdapter's IDbDataAdapter.SelectCommand implementation has been hidden on references to SqlDataAdapter, and is only accessible after casting it to IDbDataAdapter:

IDbCommand IDbDataAdapter.SelectCommand {
    get {
        return this._selectCommand;
    }
    set {
        this._selectCommand = (SqlCommand)value;
    }
}
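The same trick can be reproduced with a pair of hypothetical stand-in types (ICommand, SpecificCommand and IAdapter are made up here; they just mirror the IDbCommand/SqlCommand/IDbDataAdapter relationship):

```csharp
using System;

public interface ICommand { }
public class SpecificCommand : ICommand { }

public interface IAdapter
{
    ICommand SelectCommand { get; set; }
}

public class SpecificAdapter : IAdapter
{
    private SpecificCommand _selectCommand;

    // The public property is strongly typed...
    public SpecificCommand SelectCommand
    {
        get { return _selectCommand; }
        set { _selectCommand = value; }
    }

    // ...while the explicit implementation satisfies the interface
    // and only surfaces after a cast to IAdapter.
    ICommand IAdapter.SelectCommand
    {
        get { return _selectCommand; }
        set { _selectCommand = (SpecificCommand)value; }
    }
}

public static class Demo
{
    public static void Main()
    {
        SpecificAdapter adapter = new SpecificAdapter();
        adapter.SelectCommand = new SpecificCommand();  // no cast needed

        IAdapter generic = adapter;
        Console.WriteLine(generic.SelectCommand is SpecificCommand);  // True
    }
}
```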

Or have a look at System.Collections.Generic.IEnumerable<T>:

public interface IEnumerable<T> : IEnumerable {
    IEnumerator<T> GetEnumerator();
}

public interface IEnumerable {
    IEnumerator GetEnumerator();
}

Now this seemed like a puzzler to me at first sight. Two methods with the same signature, differing only in their return type? At the implementation level, this is forbidden. But when an interface inherits from a second interface like this, the implementing class simply has to implement at least one of the methods "explicitly" (that is, declare which implementation belongs to which interface).

Looking at .NET's System.Collections.Generic.List<T> code (using Lutz Roeder's Reflector), there are even three versions of GetEnumerator(): one for IEnumerable<T>, one for IEnumerable (both explicit interface implementations), and finally a third, public implementation:

IEnumerator<T> IEnumerable<T>.GetEnumerator() {
    return new List<T>.Enumerator(this);
}

IEnumerator IEnumerable.GetEnumerator() {
    return new List<T>.Enumerator(this);
}

public List<T>.Enumerator GetEnumerator() {
    return new List<T>.Enumerator(this);
}
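Which of the three versions gets called depends on the static type of the reference. A plausible reason for the public variant returning the List<T>.Enumerator value type is that foreach over a List<T> reference can then use the struct enumerator directly, saving an interface dispatch and a boxing allocation. A quick check:

```csharp
using System;
using System.Collections;
using System.Collections.Generic;

public static class Demo
{
    public static void Main()
    {
        List<int> list = new List<int>();
        list.Add(1);
        list.Add(2);

        // Static type List<int>: binds to the public GetEnumerator(),
        // whose return type is the List<int>.Enumerator struct.
        List<int>.Enumerator direct = list.GetEnumerator();
        Console.WriteLine(direct.GetType().IsValueType);              // True

        // Static type IEnumerable<int>: binds to the explicit
        // implementation, declared to return IEnumerator<int>.
        IEnumerable<int> generic = list;
        IEnumerator<int> boxed = generic.GetEnumerator();
        Console.WriteLine(boxed is List<int>.Enumerator);             // True - same struct, boxed

        // Static type IEnumerable: binds to the non-generic version.
        IEnumerable plain = list;
        Console.WriteLine(plain.GetEnumerator() is IEnumerator<int>); // True
    }
}
```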

Wednesday, March 15, 2006

"Vanilla DAL": First Release Available

It took another night of coding, but I decided to give it a try and upload what I would call a technology preview of the Vanilla Data Access Framework. Documentation consists of a three-line readme.txt, but more will follow. My original plan was to spend at least six weeks before publishing anything; it turned out to have been only 72 hours... oh well.

Please note that this release only supports Microsoft SqlServer. SqlServer 2000 desktop engine is provided by Microsoft at no charge. I haven't run Vanilla DAL on SqlServer 2005 express edition yet, but that should work as well.

More Progress On "Vanilla DAL"

Today I added support for transactions and stored procedures to project "Vanilla DAL". This took less time than expected. Transactions are built on top of so-called units of work (despite the name, no nested transactions here), which follow a simple action pattern. So for the caller, a typical transaction looks like this:

UnitOfWorkList list = new UnitOfWorkList(
    new ExecuteNonQueryUnitOfWork(
        new NonQueryParameter(/* ... */)),
    new UpdateUnitOfWork(
        new UpdateParameter(/* ... */)));

... where...

public interface IUnitOfWork {
    void Execute(IDBAccessor accessor, IDbTransaction transaction);
}

So when I think it over, it's more of a visitor pattern than a mere action pattern... ;-)
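The mechanics can be sketched without a database at hand. Everything below is hypothetical stand-in code (FakeTransaction, IWorkUnit, SqlWorkUnit and Executor are invented for this sketch; the real thing would operate on IDbConnection/IDbTransaction): each unit of work receives the shared transaction context, and the executor runs the whole list atomically - commit on success, roll back on failure:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical stand-in for a real ADO.NET transaction.
public class FakeTransaction
{
    public List<string> Statements = new List<string>();
    public bool Committed;
    public void Commit() { Committed = true; }
    public void Rollback() { Statements.Clear(); }
}

public interface IWorkUnit
{
    void Execute(FakeTransaction transaction);
}

public class SqlWorkUnit : IWorkUnit
{
    private readonly string sql;
    public SqlWorkUnit(string sql) { this.sql = sql; }

    public void Execute(FakeTransaction transaction)
    {
        // Would be an ExecuteNonQuery against the shared transaction in reality.
        transaction.Statements.Add(sql);
    }
}

public static class Executor
{
    // Runs all units inside one transaction: commit all, or roll back all.
    public static FakeTransaction ExecuteTransaction(IEnumerable<IWorkUnit> units)
    {
        FakeTransaction transaction = new FakeTransaction();
        try
        {
            foreach (IWorkUnit unit in units)
                unit.Execute(transaction);
            transaction.Commit();
        }
        catch
        {
            transaction.Rollback();
            throw;
        }
        return transaction;
    }
}
```

Calling Executor.ExecuteTransaction with two SqlWorkUnit instances yields a committed transaction holding both statements; an exception inside any unit rolls the whole list back.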

Stored procedures, on the other hand, caused even less effort thanks to the beauty of the ADO.NET API. Actually I didn't have to change any code at all - all that was necessary was editing the configuration file accordingly.

With improved error handling finished as well, I should soon reach a state that allows me to publish a functional prototype on Sourceforge - enough for people to download and play around with, certainly not ready for real-life projects yet. For now, Vanilla DAL comes with a SqlServer reference implementation only - support for other DB vendors will follow later.

So during the next days I have to set up the build process using NAnt, write some documentation and tutorials, and finally downgrade to .NET 1.1. This means giving up generics as well as ADO.NET 2.0 features, but it would be much worse to exclude all developers who still work under .NET 1.1.

By the way, NAnt does not support .NET 2.0 yet either. I also don't intend to learn MSBuild just for the sake of getting this build working.

Tuesday, March 14, 2006

"Vanilla DAL" Preview

I promised I would post some sample code as soon as the emerging "Vanilla DAL" prototype succeeded in its first database roundtrip. So, here we go...

Vanilla requires a configuration XML-file. The configuration file's schema definition (XSD) makes it easy to let a tool like XMLSpy auto-generate the basic element structure. In this case I entered one sample SQL-statement:

<?xml version="1.0" encoding="UTF-8"?>
...
  <ConnectionString>Data Source=(local);Initial Catalog=
  Northwind;Integrated Security=True</ConnectionString>
...
        select  *
        from    customers
        where   city = @city and
        (       select count(*)
                from   orders
                where  orders.customerid = customers.customerid
        )       >= @minordercount
...

The ADO.NET connection string can be specified at runtime as well. So the configuration file basically tells Vanilla which database to connect to, and which SQL statements are available for execution. Additionally, Vanilla supports a simple dataset-to-db mapping, with no need to hand-code any SQL.

Typically the configuration file will be compiled into the client's assembly. At runtime, the client instantiates a so-called DBAccessor:

VanillaConfig config = ...;
IDBAccessor accessor = ...;

IDBAccessor is an interface that every database-specific implementation has to support. Working against this interface, the client will never be contaminated with database-specific code. When connecting to another database, all that is required is a different configuration file. Multiple configurations can be applied at the same time as well.

At this point Vanilla is ready to go and can execute its first command:

accessor.Fill(new FillParameter(
    /* ... */
    new ParameterList(
        new Parameter("city", "Salzburg"))));

Here we populate a datatable with the query's result. All we need to know is the target datatable (here: part of a typed dataset), the name of the database table, and a list of parameters.

Next we will do some in-memory data manipulation, and then update the database accordingly:

foreach (NorthwindDataSet.CustomersRow cust in
    northwindDataSet.Customers) {
    cust.City = "Vienna";
}
accessor.Update(new UpdateParameter(
    northwindDataSet.Customers, "Customers"));

Let's see how we can execute the SQL-statement from our configuration-file:

accessor.Fill(new FillParameter(
    new StatementID("CustomersByCityAndMinOrderCount"),
    new ParameterList(
        new Parameter("city", "Vienna"),
        new Parameter("minordercount", 2))));

And this is IDBAccessor's current interface (subject to change):

public interface IDBAccessor {
    IDbCommand CreateCommand(CommandParameter param);
    IDbConnection CreateConnection();
    IDbDataAdapter CreateDataAdapter();

    void Fill(FillParameter param);
    int Update(UpdateParameter param);
    int ExecuteNonQuery(NonQueryParameter param);
    object ExecuteScalar(ScalarParameter param);

    void ExecuteTransaction(UnitOfWorkList workList);
}

The API is still subject to change, as this is just a tentative draft. Anyway, this completes our little sneak preview of Vanilla DAL. Please let me know about remarks and suggestions on the Vanilla DAL forums.

Monday, March 13, 2006

Hacking On The "Vanilla DAL" Prototype

I have begun working on the "Vanilla DAL" prototype. After a grinding start (mainly because it took me a while to find some motivation when staring at an empty Visual Studio solution), things are running quite smoothly now.

I struggled with the decision, but finally chose Visual Studio 2005 over 2003. All my projects at work are still developed under 2003, and this might be my only opportunity for quite a while to play around with some 2005 features. On the other hand, Vanilla DAL is just a component library, so all those new RAD tools won't be that useful - anyway, the new IntelliSense and refactoring functions have already saved me quite some time.

My plan is to make Vanilla DAL available on .NET 1.1 as well, so I am either going to downgrade the whole codebase one day, or there might be a special 2003 version with a couple of minor API changes. Internal typed collections are implemented using generics - an implementation detail that won't work under 1.1. The exported API functions use custom implementations of typed collections, which I consider more convenient for end-users who might not be too familiar with generics.

I hope to execute the first database roundtrip by tomorrow, so we should soon be able to see a preview of what coding against Vanilla DAL might look like.

Thursday, March 09, 2006

Project "Vanilla DAL" On Sourceforge.Net

Hey, the Vanilla DAL project has been approved by Sourceforge. There is also a first draft document explaining "Motivation and Strategy".

Wednesday, March 08, 2006

Announcing Project "Vanilla DAL"

I just filed my first open source project for hosting at Sourceforge. Project review is still pending, so there is no official site yet, but I guess I can already post my proposal here:

Vanilla DAL is a data access framework on top of ADO.NET. It aims to simplify the developer's job when accessing relational database systems by providing the following functionality:

  • Structured organization of SQL statements within XML-files (as opposed to the way Visual Studio's query builder generates SQL code).

  • Convenience implementations for recurring tasks.

  • Wrapping database-specific components, hence laying the groundwork for database independence.

  • Object-relational mapping of typed datasets and database tables.

  • Automated optimistic locking facility.

  • A simple mechanism for propagating transaction context through data access methods.

It is important to ensure a low learning curve: Vanilla DAL should be understood and applied within the shortest possible amount of time. Design goals: no performance overhead, no constraints on what the developer can do, a lightweight and consistent API.

So far all I have is this idea, and about one to two hours of spare time on weekdays for pursuing the project. My baby son goes to bed around 8pm, so afterwards I can get going. I have done a lot of tech research at night lately, but it's starting to get a little tiring, and I want to get my hands on coding some stuff that has grabbed my interest for quite a while already.

More on Vanilla DAL during the next weeks...

Monday, March 06, 2006

What Makes A Great Programmer

Steve McConnell (via Jeff Atwood's blog):

The intense inwardness of programming makes personal character especially important.


Your employer can't force you to be a good programmer; a lot of times your employer isn't even in a position to judge whether you're good. If you want to be great, you're responsible for making yourself great. It's a matter of your personal character.

Let's face it, programming is difficult - that is, unless you are a coding god of the likes of James Gosling or Linus Torvalds. The rest of us surely have delivered some really bad code once in a while. The step we have to take is to acknowledge our limitations and strive to get better. This takes some guts, especially as many programmers have strong egos. Edsger Dijkstra described that phenomenon more than thirty years ago in his classic paper "The Humble Programmer".

What has also often struck me during my professional career is the huge discrepancy in productivity and quality between the best and the worst programmers. Their job descriptions might look quite similar, but their qualifications differ like a brain surgeon's from a caregiver's. No offense intended - I admire what caregivers do, but you still wouldn't let them operate on your brain, right? Yet each day and in many places, I can assure you, software laymen are performing software surgery, and of course they screw up. And they do so because they think they are surgeons when they are not - which at the same time prevents them from improving.

It partly has to do with our industry's education system. Some developers have been hacking away since their early teens; others have attended a six-week now-I-become-a-web-designer course. And many employers don't care much. The novice might cost them half as much, and the fact that he will produce only one tenth of the value is inconceivable to most managers. In other trades, an apprentice would never be allowed to touch work that only a master craftsman can handle. In the software business there are no such restrictions.

At the same time formal education is no guarantee either. I have seen graduates from my university produce great stuff, and others who were horrible coders. Some of the best developers I know don't even have a degree. On the other hand there are graduates who think they have seen it all, which is exactly their problem.

It's hard to separate the wheat from the chaff up-front. Jeff Atwood has a point when he says:

When interviewing candidates for programming positions, I always look for someone who is brave enough to say "I don't know" when they need to. Candidates who can't or won't do this get red flagged; those types of programmers are dangerous. "Can-do" attitudes have a superficial allure, but they're actually poison in our field.

I might add that letting interviewees do some live coding should be a must - unfortunately this is hardly ever practiced.