Self-Sustaining Systems Wiki


DevelopmentalIdeas

Developing embryos go through many steps to generate their final form. A lot of contextual information is involved: chemical gradients, proteins stored in the egg, etc., all of which coordinate and control development.

Does it make sense for computer systems to self-configure based on the actual hardware/software involved?


Here's an example of system self-tuning in the Java 1.5 release:

(from http://java.sun.com/developer/technicalArticles/Interviews/hamilton_qa.html)

Now, in looking at how customers actually deploy to large systems, we realized we had been suffering from a flawed assumption. Historically, we've provided very detailed configuration flags to let you manually configure the JVM to best exploit server-class systems. If you knew how to set the right flags, you could really boost your performance on large systems. However, we've now realized that we got a little too carried away with all these configuration options, and in practice many customers ignored most of them.

So one of the things that we're introducing in Tiger is what we call performance ergonomics, where at JVM start-up the JVM will analyze the kind of environment it's in, and will now automatically configure these fancy options that you used to have to set manually. For example, if you are on a large server system with lots of memory, it will use aggressive heap options targeted for high throughput. If you want to, you can still manually configure all these options, but my advice would be to initially just try it with the defaults. This may now work even better than doing hand configuration.
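The idea behind that ergonomics step can be sketched in a few lines. This is not Sun's actual implementation; the `isServerClass` helper and its thresholds are illustrative assumptions (Java 5's documentation described a "server-class machine" as roughly 2+ CPUs and 2+ GB of memory), and `Runtime.maxMemory()` is used here only as a crude stand-in for probing the host:

```java
// A minimal sketch of start-up "ergonomics": inspect the host,
// then pick defaults instead of demanding hand-set flags.
public class Ergonomics {
    // Hypothetical classifier; the 2-CPU / 2-GB cutoffs echo the
    // rough "server-class machine" definition from the Java 5 era.
    static boolean isServerClass(int cpus, long memoryBytes) {
        return cpus >= 2 && memoryBytes >= 2L * 1024 * 1024 * 1024;
    }

    public static void main(String[] args) {
        int cpus = Runtime.getRuntime().availableProcessors();
        long mem = Runtime.getRuntime().maxMemory(); // stand-in for physical RAM

        if (isServerClass(cpus, mem)) {
            // A real JVM would now select a throughput collector,
            // a large initial heap, and so on.
            System.out.println("server-class: aggressive heap, high-throughput GC");
        } else {
            System.out.println("client-class: small heap, fast start-up");
        }
    }
}
```

The point is only that the decision moves from the operator's flags into the program's start-up path, where it can see the actual environment.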

Is this trivially obvious, or is it a simple example of a system taking responsibility for tuning itself appropriately?


My opinion? This is trivially obvious and anyone who believed that "the assumption" was anything more than laziness should have been fired. -rpg