Hacker News

Yes, it's sad.

For a couple of years I helped develop scheduling software for supply chains in the process industry. We frequently optimized for throughput or resource utilization, but also for just-in-time delivery or minimal latency. So goals differ, but it kind of works in an industrial context.

Now, there has always been a tendency to frame knowledge work like software development as though it were just industrial production. Hence the (mostly futile) attempts to make things predictable, reproducible, and "efficient", where efficiency is bluntly taken to mean optimal utilization.

There's often a tension between efficiency (say, maximising throughput or minimising latency) and robustness (being able to cope with shortages of inputs or demand shocks, and to work around failures). The world got to experience a bunch of logistical examples of this around COVID-19, but there are examples everywhere. Having a whole second engine on a passenger plane seems wasteful, until the first engine fails.
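The efficiency/robustness tension shows up even in the simplest queueing model: as you push utilisation towards 100%, time-in-system explodes, so a "fully utilised" pipeline has no slack to absorb shocks. A toy sketch under M/M/1 assumptions (the rates here are made-up illustration values):

```python
# Toy M/M/1 queueing sketch: with utilisation rho = arrival_rate / service_rate,
# the expected time a job spends in the system is W = 1 / (service_rate - arrival_rate),
# which blows up as rho approaches 1.
def expected_time_in_system(arrival_rate: float, service_rate: float) -> float:
    """W for an M/M/1 queue; only valid while arrival_rate < service_rate."""
    if arrival_rate >= service_rate:
        raise ValueError("queue is unstable: utilisation >= 100%")
    return 1.0 / (service_rate - arrival_rate)

service_rate = 1.0  # one job per unit time (hypothetical)
for utilisation in (0.5, 0.8, 0.9, 0.99):
    w = expected_time_in_system(utilisation * service_rate, service_rate)
    print(f"utilisation {utilisation:.0%}: mean time in system = {w:.0f}x service time")
```

Going from 50% to 99% utilisation multiplies mean time-in-system by fifty; "optimal utilization" and responsiveness pull in opposite directions.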

When attempting to apply a process-optimisation perspective from supply chains or manufacturing to software delivery, one key difference is that software delivery doesn't produce a stream of identical units that are independent of each other.

If we abstract the software situation, we can tell ourselves that it is a repeatable process producing an endless stream of independent features or fixes (sized in "story points", say) that get shipped to production. This mental model maybe works some of the time, until it doesn't.

In reality, each software change is often a bespoke, one-off modification or addition to an existing system. Work to deliver different features or fixes is not fungible, and the work items may not be independent: changes can interfere with each other by touching overlapping components of the existing system and modifying them in incompatible ways. A more realistic mental model needs to acknowledge that there's a system there, that its existing architecture and accumulated cruft may heavily constrain what can be done, and that the system is often a one-off thing being changed in bespoke ways with each item of work that ships.
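One way to see why the "stream of independent units" model breaks: if each work item is annotated with the components it touches, any pair with a non-empty overlap is a potential interference, and shipping order starts to matter. A minimal sketch (the item names and component sets are hypothetical):

```python
# Toy sketch: work items are only independent if the sets of components they
# touch don't overlap; overlapping pairs are where changes can interfere and
# where simple throughput accounting (story points per sprint) breaks down.
def conflicting_pairs(work_items: dict[str, set[str]]) -> list[tuple[str, str]]:
    """Return pairs of work items that touch at least one common component."""
    names = sorted(work_items)
    return [
        (a, b)
        for i, a in enumerate(names)
        for b in names[i + 1:]
        if work_items[a] & work_items[b]  # non-empty intersection => conflict
    ]

items = {
    "FEAT-101": {"billing", "auth"},
    "FEAT-102": {"search"},
    "FIX-17": {"auth", "sessions"},
}
print(conflicting_pairs(items))  # FEAT-101 and FIX-17 both touch "auth"
```

In a real codebase the "components touched" set is itself hard to know up front, which is exactly why treating items as interchangeable units is optimistic.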


That distinction between Industrial Production and Knowledge Work feels like the root cause.

It seems like modern Agile has mutated into a tool for Manufacturing Predictability rather than Software Discovery. We are so obsessed with making the velocity graph look like a straight line that we have stopped asking whether we are even building the right thing.

Do you think that shift happened because non-technical management needed a metric they could understand (tickets closed), or did we do this to ourselves?



