
Sunday, September 17, 2006

A few simple thoughts...


  • Artists create to express (Emotion)
  • Designers communicate
  • Engineers solve problems
  • Scientists define boundaries and explore (Abstraction/Exploration)

Effective IT people combine all of the above seamlessly. This entry was inspired by a graphic posted by John Maeda.

Sunday, August 06, 2006

Laws of software systems...

Over the past few months I have been writing notes on software development. The following are "laws" that I've condensed from these notes, and they reflect my current appreciation of software. Some of these laws are based on observation, some come from data I'm analysing as part of my research into software evolution.

So here are the "laws"....

1. If you *notice* a software system, it is probably broken.
-- We do not really notice a DVD player, the interface to a car, or any other device that we use day-to-day. They just work, do their thing and stay out of the users' active mind. This observation is a generalisation from that viewpoint. A more direct metaphor is -- "we only notice a garbage-bin that is full".

2. Software has both 'form' and 'function'.
All systems have 'form' and 'function'. It took me a number of years to actively start seeing this distinction. End-users need not -- but as software developers, we need to be aware of what a software system is, distinct from what it does to help a user.

3. Every function in a system should reduce (remove?) pain, or increase pleasure. Functions that cannot be tied back to a pain-point or a pleasure-point are redundant.
This one is rather obvious -- but I wanted to state it to make sure that we actively acknowledge it. When a team plans and allocates resources, this is probably the one law that should be used as a sanity check before executing the plan.

4. Software systems can only be constructed to help users achieve *definable* goals.
If we cannot state the 'goal' or the end result, we just cannot build the software. The problem unfortunately is not as simple as this law makes it seem, but a more detailed blog post is probably warranted later to explain my views on this one.

5. The form, the function and structure of every software system reveals the *culture* that built it. Evolution of a software system is limited/enhanced by this culture.
The way a system is constructed and managed says a lot about the culture that built it. We may marvel at the pyramids (and to a certain extent the skyline of a city like New York) -- both of which reflect the attributes of the cultures that built them. They may not say much about the motivation, but they certainly do show what a culture considers valuable and worthwhile. In the software world, programming languages, operating systems and even simple text-editors say a lot about the culture that produced them. Culture almost always indicates a set of beliefs, and a guiding philosophy. The evolution of a product will reveal quite a bit about this philosophy -- what do they fight for, what do they abandon, and when do they acknowledge their mistakes?

6. Evolution is a sign of life, activity and usage. Systems that do not evolve are either perfect or irrelevant.
This one has already been stated (in a different way) in Lehman's Laws of Software Evolution. I want to acknowledge it and add it to my set of laws.

7. A holistic appreciation of problem and solution complexity is far more important than any process.
This is probably, to a certain extent, contradicting Descartes' reductionist approach. But a software system does show a number of emergent properties -- especially once we take into consideration that the system includes the 'users'. Many complex systems end up being used in rather odd ways, and features are combined in never-before-imagined ways -- this is essentially what I now consider to be 'humanistic': the final outcome is not predictable, it is however bounded. Processes do not build systems; people do. A process may help capture the current set of values (beliefs, philosophy), but that is about where it stops.

8. History suggests that the solutions that 'last' are those that are 'efficient'.
Efficiency is a beast that works on a number of different levels, but here I would like to take a simplistic argument and suggest that 'efficient' systems are those that use the least amount of 'energy' to achieve a user-goal. Energy includes everything from the time taken to build the system, the CPU cycles it takes to get the job done (i.e. electricity used) and of course the time it took for the user to learn it, *remember it* and use it. All of these are very closely inter-related (see Law 9).
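A rough sketch of this idea (my own formalisation, not a measured model -- the terms E_build, E_run and E_learn are just placeholders for the costs named above):

\[ E_{\text{total}} \approx E_{\text{build}} + \sum_{\text{uses}} E_{\text{run}} + \sum_{\text{users}} E_{\text{learn}} \]

On this reading, an 'efficient' system is one that minimises E_total per user-goal achieved, which is also why build effort, run cost and learning effort cannot be judged in isolation.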

9. Scalability is one attribute that can be predicted based on the complexity of a software system.
Software that takes a long time to build and/or learn will never reach a 'large' user base, limiting its scalability. These systems tend to be rather inefficient. The best systems are those that are flexible enough to allow the users to extend them -- allowing new properties to emerge.

10. Limitations in software are limitations of the collective human mind.
Software systems are built by teams (a collective human mind). If something does not work as expected, chances are it has been over-engineered to perform a specific task and/or takes a long time to learn (with respect to the goal). The systems that seem to engage us humans are, ideally, those that are simple to learn but require a lot of time and effort to master.

There are a few more to add to this list; I'll post them once I have finished adding detail to them. I will try to elaborate on each law in some depth in later posts. Do feel free to comment, contradict, or lend further support to them. These 'laws' will evolve as I change and understand better, so these are not universal statements -- just "Vasa's laws of software systems...."