Slack

is the title of an excellent book by Tom DeMarco. This second snippet (here's the first) continues my tactic of rereading good books several times. As usual I'm going to quote from a few pages:
Talented managers are ... all sense organ, constantly attuned to the effect their leadership is having on their people ... Managers without such talent find themselves relying on formulas and "principles" of management. They reason, "This thing I'm trying to do should work; the fact that it isn't working probably suggests that I'm doing it half-heartedly." And so they do more of whatever they've been doing.
When the new automation is in place, there is less total work to be done by the human worker, but what work is left is harder. That is the paradox of automation: It makes the work harder, not easier.
In my experience, standard processes for knowledge work are almost always empty at their center.
The power you've granted is the power to err. If that person messes up, you take the consequences. Looked at from the opposite perspective, it is this capacity to injure the person above you that makes empowerment work.
When there is neither time nor staff to cope with work that runs more slowly than expected, then the cost of lateness is paid out of quality. There is no other degree of freedom.
... voluminous documentation of everything that will hold still for it.
Successful change can only come about in the context of a clear understanding of what may never change, what the organization stands for... the organization's culture... If nothing is declared unchangeable, then the organization will resist all change. When there is no defining vision, the only way the organization can define itself is its stasis.

the (de)composition fallacy

You've probably heard the saying "the whole is greater than the sum of the parts". You can think about this saying in various ways. For example, if you remove the brain from a man you no longer have a man minus a brain. You have two dead things. A brain is more than just a "part" of a man. A part has relationships to the whole that contribute to the essential wholeness of the whole.

Another way is at a much simpler level, where all the parts are of the same type, aggregating over time. Time patterns them. Human beings are pretty poor at seeing effects over time. We tend to think things are more permanent than they are. We think that the way things are now is how they've always been and how they'll continue to be. I recall showing my children some old Victorian pennies and telling them that's what pennies used to look like. They didn't believe me! Before street lighting it was apparently very common in England to have two sleeps a day. The short one after a midday meal was called the "small sleep". The time we eat our main meal has changed. Eating several courses at a meal is a relatively recent phenomenon. The cutlery we eat with has changed. And so has what we wear. Before the 17th century your left and right shoe (if you had shoes at all) were the same shape and were called "straights".

But things do change, and this matters because

Things that don't matter in isolation often matter in composition.

Suppose you compile some code and you get a few warnings. Do these warnings matter? You might think not, since there are only a few, but they do. Tomorrow you'll be writing some more code, and the day after that some more too, and so will your colleagues. After six months those few warnings have turned into 3000 warnings. That's the composition fallacy: each warning doesn't matter on its own, but together they do.

3000 warnings is a big problem for at least two reasons. But the two reasons I'm thinking of are really the same reason. Let me explain.

The first reason is that if you've got 3000 warnings then any new warnings aren't even noticed in the comparative vastness of the existing 3000. You've become completely desensitized to warnings. The number of warnings inexorably grows but no one notices - you've just got "a lot of warnings", and "a lot of warnings" today looks just like "a lot of warnings" tomorrow.

The second reason is the same reason but in reverse.

Suppose you see what you couldn't see before - that a lot of warnings is a composition problem - and you try to do something about it. You've now got to work hard learning how to do something you don't know how to do - namely, how to write code without warnings. That takes effort. And what do you get for all your effort? Almost nothing, it seems! The number of warnings goes down a tiny bit, but there are so many warnings you've hardly made any difference at all. A lot of warnings remains a lot of warnings.

One thing you can do in this situation is to stop reporting the total number of warnings and instead report the number of warnings added and removed. This is an example of switching from a static measure (the number of warnings is now 2981) to a more dynamic measure (in the last 24 hours the number of warnings has gone down by 19).
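Here's a minimal sketch of that dynamic measure, assuming GCC/Clang-style diagnostic lines and two saved build logs. The filenames, the regex, and the script itself are illustrative assumptions, not any particular tool:

```python
#!/usr/bin/env python3
# warning_delta.py - a sketch: report the change in warnings between
# two builds rather than the (desensitizing) raw total.
# Assumes GCC/Clang-style diagnostics: "file:line:col: warning: message".
import re
import sys

WARNING = re.compile(r"^.*?:\d+:\d+: warning: .*$")

def warnings_in(log_path):
    """Return the set of warning lines found in a build log."""
    with open(log_path) as log:
        return {line.strip() for line in log if WARNING.match(line)}

def main():
    previous = warnings_in(sys.argv[1])  # e.g. yesterday's build log
    current = warnings_in(sys.argv[2])   # e.g. today's build log
    added = current - previous
    removed = previous - current
    # The dynamic measure: deltas, not totals.
    print(f"warnings added:   {len(added)}")
    print(f"warnings removed: {len(removed)}")
    for warning in sorted(added):  # new warnings get named, so they get noticed
        print(f"  new: {warning}")

if __name__ == "__main__":
    main()
```

Run it as something like `python warning_delta.py yesterday.log today.log` in a nightly job. One caveat with matching on exact warning text: an edit that merely shifts line numbers shows up as an add plus a remove, so a real tool would normalize locations. The point here is only the shape of the measure - report the delta, and new warnings get noticed again.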