I'll throw the dice and bring one more parameter into the discussion.
I think that a physics calculation rate of 100 steps per second may not be enough to approximate the behaviour of a car.
At a speed of 100 km/h (about 27.8 m/s), roughly 28 cm are travelled every 1/100th of a second, so each step demands a large adjustment to the model.
Vector processing and resampling can help, but an increased sampling rate is the only way to capture high-frequency changes in the model.
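As a rough illustration of why the update rate matters, here is a minimal sketch (the suspension parameters are purely illustrative assumptions on my part, not taken from any real simulator): it integrates a wheel-hop style spring-mass system with explicit Euler at 100 Hz and at 1000 Hz and compares both against the exact solution. The coarse step accumulates error far faster precisely because the motion is fast relative to the step size.

```cpp
// Minimal sketch, not from any actual simulator: integrate a wheel-hop
// style spring-mass system with explicit Euler at two step sizes and
// compare the result against the exact solution. All values illustrative.
#include <cmath>
#include <cstdio>

struct State { double x, v; };

// One explicit-Euler step of m*x'' = -k*x  (undamped, for clarity)
static State step(State s, double k, double m, double dt) {
    double a = -k / m * s.x;
    return { s.x + s.v * dt, s.v + a * dt };
}

int main() {
    const double k = 200000.0;          // spring stiffness [N/m] (assumed)
    const double m = 40.0;              // unsprung mass    [kg]  (assumed)
    const double w = std::sqrt(k / m);  // ~70.7 rad/s, ~11 Hz wheel hop
    const double x0 = 0.02;             // 2 cm initial deflection
    const double t_end = 0.5;           // simulate half a second

    for (double dt : {0.01, 0.001}) {   // 100 Hz vs 1000 Hz physics rate
        State s{ x0, 0.0 };
        int n = static_cast<int>(t_end / dt);
        for (int i = 0; i < n; ++i) s = step(s, k, m, dt);
        double exact = x0 * std::cos(w * t_end);
        std::printf("dt=%.3f  x=% .5f  exact=% .5f  error=% .5f\n",
                    dt, s.x, exact, s.x - exact);
    }
}
```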
I do not know whether this issue has already been discussed, so excuse me if it has.
Finally, dynamic look-up tables are a way of caching data whenever spare processing power is available, but I do not think that is the case in driving simulators.
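To make concrete what I mean by a dynamic look-up table, here is a minimal sketch (the tyre curve and its constants are just assumptions of mine for illustration): results of an expensive function are cached the first time each quantized input appears and reused on later physics steps. Whether this actually pays off depends on having spare CPU time, which is exactly my doubt above.

```cpp
// Minimal sketch of a "dynamic look-up table": cache the output of an
// expensive function (here a Pacejka-style tyre curve with made-up
// constants) the first time each quantized input is seen, reuse it after.
#include <cmath>
#include <cstdio>
#include <unordered_map>

static double tyre_force(double slip) {             // the "expensive" model
    const double B = 10.0, C = 1.9, D = 4000.0, E = 0.97;   // assumed values
    double bs = B * slip;
    return D * std::sin(C * std::atan(bs - E * (bs - std::atan(bs))));
}

class TyreLUT {
    std::unordered_map<int, double> cache_;
    double step_;
public:
    explicit TyreLUT(double step = 0.001) : step_(step) {}
    double operator()(double slip) {
        int key = static_cast<int>(std::lround(slip / step_));  // quantize input
        auto it = cache_.find(key);
        if (it != cache_.end()) return it->second;              // cache hit
        double f = tyre_force(key * step_);                     // miss: compute once
        cache_.emplace(key, f);
        return f;
    }
};

int main() {
    TyreLUT lut;
    for (double s = -0.2; s <= 0.2; s += 0.01)
        std::printf("slip=%+.2f  Fx=%8.1f N\n", s, lut(s));
}
```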
For example, in Gran Turismo they apparently use several convolution matrices (impulse tests, i.e. drop pulses applied to real cars) to sample the total deformation specific vehicles undergo under a given set of conditions. The rest is an approximation by a model that draws its data from these matrices!
The result is average.
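For what it's worth, my reading of that impulse-response idea can be sketched as a plain discrete convolution (this is only my interpretation, not any confirmed Gran Turismo implementation, and the numbers are invented): a deformation kernel measured from a real test is convolved with an arbitrary load history to approximate the response.

```cpp
// Minimal sketch of the impulse-response idea: a deformation kernel
// measured from a real impulse test is stored, and the response to an
// arbitrary load history is approximated by discrete convolution.
#include <cstdio>
#include <vector>

// y[n+k] += x[n] * h[k]  -- discrete convolution of load with kernel
static std::vector<double> convolve(const std::vector<double>& x,
                                    const std::vector<double>& h) {
    std::vector<double> y(x.size() + h.size() - 1, 0.0);
    for (size_t n = 0; n < x.size(); ++n)
        for (size_t k = 0; k < h.size(); ++k)
            y[n + k] += x[n] * h[k];
    return y;
}

int main() {
    // "Measured" impulse response (invented numbers standing in for test data)
    std::vector<double> kernel = { 0.6, 0.3, 0.15, 0.05, 0.01 };
    // Load history: a short bump followed by nothing
    std::vector<double> load   = { 0.0, 1.0, 1.0, 0.0, 0.0, 0.0 };

    for (double d : convolve(load, kernel))
        std::printf("%.3f\n", d);          // approximated deformation over time
}
```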
I do not know what kind of magic the creator of LFS has worked on the physics, whether it was just a good guess or a full-blown mathematical model, but I believe he is the only one who can really shed some light on how difficult this can be.
And as time passes, age arrives and the brain slows down, so please consider sharing...