Or at least not based on that article. Stepanov advocates algorithms, which is good if you're writing an ultra-performant program to do one specific task. But most programmers now have to write multi-purpose, highly complex programs. That shopping cart web page is I/O bound anyway and has to handle ever-changing CRM profiling, page styles, auditing regulations and tax calculations, and those requirements change independently. I don't need a generic algorithm; I need frameworks, micro-architectures, analysis patterns and design patterns. Yes, that costs some performance, but CPU time is cheaper than programmer time.
So, languages like Java can't do

    template <class StrictWeakOrdered>
    inline StrictWeakOrdered& max(StrictWeakOrdered& x,
                                  StrictWeakOrdered& y) {
        return x < y ? y : x;
    }
Just use the Comparable interface, typecast and use instanceof as needed. That's your 'covariant signature transformation and an ability to obtain types from types'. It's ugly, but I'm not going to cry over that.
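To make that concrete, here is a minimal sketch of the Comparable-based workaround, assuming pre-generics Java; the MaxDemo class and the Integer example are my own illustration, not from the article. The cast and instanceof check happen at runtime, which is exactly the ugliness I mean:

```java
// Sketch: Comparable in place of a C++ template. Type safety is
// checked at runtime (instanceof + cast), not at compile time.
public class MaxDemo {

    // Returns the larger of x and y via Comparable.compareTo.
    // Raw Comparable is used deliberately to mirror pre-generics Java.
    @SuppressWarnings({"rawtypes", "unchecked"})
    static Comparable max(Comparable x, Comparable y) {
        return x.compareTo(y) < 0 ? y : x;
    }

    public static void main(String[] args) {
        // The caller gets back a plain Comparable and must cast,
        // guarding with instanceof as needed.
        Object m = max(Integer.valueOf(3), Integer.valueOf(7));
        if (m instanceof Integer) {
            System.out.println((Integer) m);
        }
    }
}
```

It works, but every call site pays the cast, and a wrong cast only fails at runtime.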
Now, if I was writing games, all this would be the wrong approach and Java, OO, layered architecture, design patterns and all the rest would be jettisoned and C++ would be in from the cold. But that's too specialist for me.
I don't need a clever language whose third-party libraries come with conflicting garbage-collection schemes. That gets in the way of the architectures and algorithms I need to devise.