Like, "The problem with OO is seeing everything as 'is-a' when it should be 'has-a', which tables of attributes can model better" [my paraphrase -- CRC]. He seemed (as usual) deaf to his counterparty's explanations and even (albeit partial) agreement, as when Fowler readily admitted he usually modeled stuff as single-inheritance just because most OO languages can't do multiple.

Fowler's explanation (given while asking Bryce to clarify whether this was what he meant, IIRC) was along the lines of "If you have a class Person, you could subclass it along (Man|Woman) or (Doctor|Nurse); but if you implement both divisions and add a method foo() in both sub-classifications, what will DoctorAlice.foo() do? So I usually subclass only along the 'dominant' (= most used in my code) dimension, and add the rest as ('has-a') attributes" [heavily paraphrased, of course -- CRC].
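
To make that compromise concrete, here's roughly what it looks like in Python (the class and attribute names are mine, purely illustrative -- not anything Fowler actually wrote): the 'dominant' profession dimension gets the subclass hierarchy, and the sex dimension is demoted to an ordinary attribute, so there is no ambiguous foo() left to resolve.

    # A minimal sketch of the compromise described above (all names are my own,
    # purely illustrative): subclass along the "dominant" dimension (profession),
    # and carry the other dimension (sex) as a plain 'has-a' attribute instead of
    # a second subclass hierarchy.

    class Person:
        def __init__(self, name, sex):
            self.name = name
            self.sex = sex              # 'has-a' attribute, not an is-a subclass

        def foo(self):
            return f"{self.name} is a person"

    class Doctor(Person):
        def foo(self):                  # only one inheritance path, so no ambiguity
            return f"Dr. {self.name} ({self.sex}) sees patients"

    class Nurse(Person):
        def foo(self):
            return f"Nurse {self.name} ({self.sex}) tends patients"

    alice = Doctor("Alice", "woman")
    print(alice.foo())                  # no Doctor-vs-Woman diamond to resolve

The price, of course, is that anything varying along the non-dominant dimension now has to be handled with conditionals or delegation rather than overriding -- which is exactly the trade-off the two of them were circling around.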

Not that this affected Bryce in the slightest; he kept yakking as if single inheritance were this huge, insurmountable obstacle that made all OOP near worthless (and as if Fowler were a staunch proponent of SI). Regardless of all that, though, "Top"Mind (and Fowler!) did have a point [finally filling that in three days after starting this comment]: the naïve proponents of "OOP lets you accurately model the real world in code!" (to whom I must, to my shame, admit I belonged -- or at least pretended to belong, just to keep whacking on Bryce) were wrong; in a single-inheritance world, it doesn't. And in OOP as it is practiced today -- and largely was back then -- multiple inheritance is a small and shrinking part. For, I hasten to add, good reasons.