Software design involves gambles. Denormalizing may help certain kinds of usage, but at the risk of being unable to handle future requirements that the denormalized structure does not favor. Whether it is worth gambling away future flexibility to get performance for a specific usage pattern today is a business decision, similar to a financial cost-benefit analysis. I would suggest looking for ways to improve performance under normalization before giving in to denormalization for speed.
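To make "improve performance under normalization first" concrete, here is a minimal sketch using SQLite. All table and column names here are hypothetical, invented for illustration: rather than copying customer data into the orders table, an index on the foreign key keeps the join fast while the schema stays normalized.

```python
import sqlite3

# Hypothetical normalized schema: customers and orders in separate tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY,
                     customer_id INTEGER REFERENCES customers(id),
                     total REAL);
""")
cur.executemany("INSERT INTO customers VALUES (?, ?)",
                [(1, "Ada"), (2, "Grace")])
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, 1, 10.0), (2, 1, 25.0), (3, 2, 5.0)])

# Instead of denormalizing (copying the customer name into each order row),
# add an index on the join column so lookups stay fast.
cur.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")

rows = cur.execute("""
    SELECT c.name, SUM(o.total)
    FROM customers c JOIN orders o ON o.customer_id = c.id
    GROUP BY c.id
    ORDER BY c.id
""").fetchall()
print(rows)  # [('Ada', 35.0), ('Grace', 5.0)]
```

The same idea scales up in real engines: indexes, materialized views, and query tuning often recover the speed that denormalization promises, without forfeiting flexibility.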
And as somebody pointed out, historical (read-only) data is sometimes easier to sift through or use when denormalized. This is because most such analysis is done from a customer-sales perspective rather than being concerned with the internal processes behind already-completed transactions.
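A sketch of that pattern, again in SQLite with hypothetical names: completed sales are flattened into one wide, read-only history table that analysts can query without joins, while the live operational tables stay normalized.

```python
import sqlite3

# Hypothetical normalized operational tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, region TEXT);
CREATE TABLE sales (id INTEGER PRIMARY KEY,
                    customer_id INTEGER REFERENCES customers(id),
                    product TEXT, amount REAL);
""")
cur.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                [(1, "Ada", "EU"), (2, "Grace", "US")])
cur.executemany("INSERT INTO sales VALUES (?, ?, ?, ?)",
                [(1, 1, "widget", 10.0), (2, 2, "gadget", 5.0)])

# Flatten completed sales into a denormalized, read-only history table:
# one row carries everything a sales analyst needs, no joins required.
cur.execute("""
CREATE TABLE sales_history AS
    SELECT s.id, c.name, c.region, s.product, s.amount
    FROM sales s JOIN customers c ON c.id = s.customer_id
""")

rows = cur.execute(
    "SELECT name, region, product, amount FROM sales_history ORDER BY id"
).fetchall()
print(rows)
```

Because the history rows describe finished transactions, the usual update-anomaly risks of denormalization largely do not apply.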