"a calculation may happen hundreds of times every second"
I wasn't sure whether this means it needs a "real-time" response. Is it a large batch job, or something closer to real-time image recognition?
Most desktop DB engines cache a lot in RAM, I would note. They only become disk-intensive once they run out of available RAM and can't use indexes or the natural order of rows that matches the grain of the requests. But at least they don't crash at that point.
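To make that concrete, here is a minimal sketch (assuming SQLite as the desktop engine; the table and index names are made up for illustration). It shows the page cache being sized in RAM and how the query plan changes from a full scan to an index search once an index matches the request:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA cache_size = -2000")  # ~2 MB of pages cached in RAM
conn.execute("CREATE TABLE readings (sensor INTEGER, value REAL)")
conn.executemany(
    "INSERT INTO readings VALUES (?, ?)",
    [(i % 100, i * 0.5) for i in range(10_000)],
)

# Without an index, SQLite scans every row; on data larger than the
# cache, that scan is what turns disk-intensive.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM readings WHERE sensor = 42"
).fetchone()[3]
print(plan)  # e.g. "SCAN readings"

# With an index matching the request's grain, the lookup follows the
# B-tree and touches far fewer pages.
conn.execute("CREATE INDEX idx_sensor ON readings (sensor)")
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM readings WHERE sensor = 42"
).fetchone()[3]
print(plan)  # e.g. "SEARCH readings USING INDEX idx_sensor (sensor=?)"
```

The point isn't the exact plan text (it varies by SQLite version), just that whether the engine can use an index decides whether hundreds of lookups per second stay in RAM or hit the disk.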
Does he need something that "degrades gracefully", or is it predictable-speed-or-bust?