As a comment on http://eekim.com/2018/12/doug-engelbart-human-systems-tribes-and-collective-wisdom/:
I think the claim by the Los Angeles Review of Books is wrong, or at least misleading. I interpret Engelbart's work as a demonstration that it has to be both sides together, so it's not useful to look at them as separate categories. A tool without a human to use it is obviously useless, but a human without a tool can't do much either (some would even argue that there wouldn't be any humans if not for tools; I don't necessarily subscribe to that, but you get the idea). If the claim is that technology improves a lot and fast while human systems don't, OK, in some sense, but a biological system like the human body and man-made systems and machines are fundamentally different. Installing new software on a computer doesn't take very long, while rewiring/rewriting the firmware in the brain can take generations or even ages, both for legitimate reasons and with different results. But that's not to suggest the two can be viewed as separate just because they start at different positions on the scale and follow different functions for their improvement over time, not tracking each other even remotely. Every improvement in the tool system, like writing, the printing press, integrated circuits and the Internet (to use the hugely misleading single narrative of history, as if it were one exclusive, harmonious, consecutive line of improvement on a chart), boosted many improvements on the human system side, and the reverse holds too: go name some major human system improvements in a similarly reductive narrative of only a single aspect and consecutive progression, the counterparts to the tool system improvements I've listed, like accounting, public education, interface and component standardization, and decentralized networking, and they're bound to sound boring, just as the tool system counterparts to improvements on the human system side can arguably be expected to sound somewhat boring as well.
To say that our technology doesn't need to improve is utterly confused, as quite a lot of it in the young digital sector (see, the human system is on a different timescale again; how to change that?) is more or less crap, for stupid reasons, and true, in no small part due to immense deficiencies in the human system. So if the idea is that we face no difficulty in building awesome technology but lack the human system models and support to actually get it going, fine, but that still leaves you without the tools to help improve the human system that would allow and support the creation of the tool system that's wanted and needed. Or the other way around: if the tools were there and were good, human system improvement would be much less needed, as the focus could finally shift to the problems and topics we actually care about as knowledge workers. I can imagine that good technology would be flexible and generic enough to incorporate or support almost any human system, the creation of such systems, and the handling of any situation, but we have absolutely no idea how to build technology that allows just that, which is a serious problem I'm not willing to dismiss. And if it's a question of human system design to come up with a concept for it, please work on it and don't fail to implement it. I don't want to claim that Engelbart provided us with such a design, nor that he didn't, but it's a pity that his tool(s) aren't available to us any more, while his human systems are much more accessible, because they can travel and be transmitted from printed pages and electronic screens, to be consumed by eyes and processed by brains. Yet it doesn't seem like these brains alone are doing a particularly good job of solving our complex, urgent problems, so something is obviously missing here.
I'm fine with whatever the solution might be or where it might come from, and for that very reason I hesitate to artificially restrict the types of potential solutions or the directions they may come from, for no good reason other than a convenient, simple narrative (not to dismiss the value of good narratives, but to point out that there are more complex ones as well).