I haven't noticed any recent threads about this, so here goes:
Why are the turbochargers so unrealistic, given the focus on realism everywhere else?
As I understand it, turbos generally have a fairly steep boost threshold (500-1000 rpm wide), then flat boost (limited by the boost controller) up to whatever airflow the turbo is designed to sustain, optionally followed by a drop-off at higher rpm (in the case of engines whose redlines outrun the design range of the turbo, or power-limiting ECUs).
Consequently, there are the phenomena of turbo lag (little to no boost below the threshold), hitting boost (the turbo exponentially spooling up to peak boost once past the threshold, resulting in reduced throttle precision), staying on boost (keeping the engine above a specific rpm so that the turbo spools quickly when getting back on the gas), and finally loss of boost at the limiter.
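To make the shape I'm describing concrete, here's a rough Python sketch of a full-throttle boost target vs rpm curve. Every number in it (threshold, ramp width, plateau, drop-off point) is made up for illustration and isn't taken from any real turbo or from LFS:

```python
# Illustrative only: a steady-state boost target vs engine rpm, shaped like the
# behaviour described above. All numbers are made-up examples, not LFS data.

def boost_target(rpm, threshold=3000.0, ramp_width=800.0,
                 peak_boost=0.8, falloff_start=6000.0, redline=7500.0):
    """Return target boost (bar) at full throttle for a given engine rpm."""
    if rpm <= threshold:
        return 0.0                        # off boost: turbo lag region
    if rpm < threshold + ramp_width:
        # steep ramp over the ~500-1000 rpm wide boost threshold
        return peak_boost * (rpm - threshold) / ramp_width
    if rpm <= falloff_start:
        return peak_boost                 # flat plateau, held by the wastegate/boost controller
    # optional drop-off where the engine outruns the compressor's efficient range
    frac = max(0.0, 1.0 - (rpm - falloff_start) / (redline - falloff_start) * 0.3)
    return peak_boost * frac

if __name__ == "__main__":
    for rpm in range(2500, 8000, 500):
        print(f"{rpm:5d} rpm -> {boost_target(rpm):.2f} bar")
```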
In my opinion, none of these are simulated accurately in LFS.
For example, the XRT will smoothly spool the turbo from 0.2 to 0.8 bar over the course of 5+ seconds between 3000 and 5000 rpm in 3rd, with no noticeable kick. Even kicking from 6000 rpm has boost building linearly for around 3 seconds, when it should explode nonlinearly within 1 second.
Another really noticeable one is the RAC, where the turbo shows only 0.3 bar of boost when brake-torqued at 3500 rpm (the advertised torque peak, generally coincident with the point at which the turbo comes online and the boost controller/wastegate cuts in).
Furthermore, peak boost keeps sloping up between 4000 and 7000 rpm, which in my opinion is inconsistent with how boost controllers work. If anything, the boost pressure should start dropping past 6000 rpm.
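Here's a toy Python sketch of what I mean by the boost controller clamping things: the compressor could push more and more pressure as rpm climbs, but the wastegate bleeds off the excess and holds a setpoint, so boost plateaus instead of sloping upward. All the numbers are invented:

```python
# Rough sketch of why boost should plateau: the compressor could supply ever more
# pressure as rpm climbs, but the wastegate bleeds exhaust past the turbine to
# hold a fixed setpoint. Numbers are invented for illustration.

WASTEGATE_SETPOINT = 0.8   # bar, assumed boost controller target

def compressor_potential(rpm):
    """Pretend maximum boost the compressor could make at this rpm (made up)."""
    return max(0.0, (rpm - 3000.0) / 3500.0) * 1.4   # rises well past the setpoint

def regulated_boost(rpm):
    """Boost after the wastegate clamps the excess."""
    return min(compressor_potential(rpm), WASTEGATE_SETPOINT)

if __name__ == "__main__":
    for rpm in (3500, 4000, 5000, 6000, 7000):
        print(f"{rpm} rpm: potential {compressor_potential(rpm):.2f} bar, "
              f"regulated {regulated_boost(rpm):.2f} bar")
```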
I should not be able to floor a turbo car with less care than an N/A one and leisurely control the tail end as the boost slowly builds (as XRT and RAC generally allow).
To LFS's credit, the inertia is simulated somewhat (boost recovers much more rapidly on a shift than after a long lift), although I am not certain how accurate that is.
I could go on about wastegates, blow-off valves and bypass valves, but admittedly I would be going out of my depth there.
I feel like there is a huge opportunity to give more character and realism to turbocharged cars by reworking the turbo model.
If I had to guess, the big items would be:
1) Dynamic airflow modeling through the engine, to allow simulating exponential buildup behavior and reasonably accurate decay/recirculation through the various valves in the system.
2) Detailed modeling of turbo performance characteristics, using data on some real-world snails as a basis.
Even a simple (using the word loosely) emulation, with realistically defined peak boost values and "spool-up" factors throughout the rpm range coupled with exponential buildup, could be made to feel more realistic than the current implementation (whatever it is).
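To be concrete about what I mean, here's a minimal Python sketch of such an emulation: a made-up target boost table, made-up spool-up factors, and a simple exponential (first-order lag) approach toward the target. None of it reflects LFS's actual internals:

```python
# A minimal sketch of the "simple emulation" idea: per-rpm peak boost targets and
# spool-up factors, with the actual boost chasing the target exponentially.
# Everything here (curve points, time constants) is assumed for illustration.

import math

# (rpm, target boost in bar at full throttle) - hypothetical lookup table
TARGET_CURVE = [(2500, 0.0), (3000, 0.1), (3800, 0.8), (6000, 0.8), (7500, 0.6)]

def lerp_target(rpm):
    """Linear interpolation of target boost from the table."""
    pts = TARGET_CURVE
    if rpm <= pts[0][0]:
        return pts[0][1]
    for (r0, b0), (r1, b1) in zip(pts, pts[1:]):
        if rpm <= r1:
            return b0 + (b1 - b0) * (rpm - r0) / (r1 - r0)
    return pts[-1][1]

def spool_time_constant(rpm):
    """Assumed spool-up factor: the turbo responds faster at higher airflow."""
    return 0.8 if rpm < 3500 else 0.3   # seconds

def step_boost(boost, rpm, throttle, dt):
    """Advance the boost state by dt seconds (first-order lag toward the target)."""
    target = lerp_target(rpm) * throttle
    tau = spool_time_constant(rpm)
    return boost + (target - boost) * (1.0 - math.exp(-dt / tau))

if __name__ == "__main__":
    # floor it at 4500 rpm from zero boost: exponential build, most of it inside ~1 s
    boost, t, dt = 0.0, 0.0, 0.05
    while t <= 1.5:
        print(f"t={t:.2f}s boost={boost:.2f} bar")
        boost = step_boost(boost, 4500, 1.0, dt)
        t += dt
```

Because the state carries over, the same lag model would also give the faster recovery on a shift than after a long lift: the boost hasn't fully decayed, so it has less distance to climb back.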
If anyone is still awake after all that, keep the snails spinning
P.S. I would love to hear about any errors in my own assumptions and arguments above, as few things are as educational as a proper rebuttal.