The helical gears are for noise reduction; race gearboxes use straight-cut gears, which tend to slap against each other full on, whereas helical gears sort of slide into place as they engage. It's also worth noting that the gears themselves don't engage one another when you shift; they're actually always meshed. What happens is that one of the gears spins freely on the main shaft of the gearbox, and a splined collar with side-mounted dogs engages the gear when pushed by the selector fork. The collar is where the synchros are mounted, and their job is really just to match the gear's RPM to the collar's before the dogs engage.
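As a toy illustration of what the synchro has to absorb, here's a little sketch (my own numbers and function names, nothing from LFS) of the RPM gap on a downshift:

```python
def synchro_rpm_mismatch(engine_rpm, old_ratio, new_ratio):
    """RPM gap the synchro must close when swapping gears.

    While the clutch is disengaged, the output shaft speed is fixed by
    the wheels; the input side has to be dragged to the new gear's
    speed before the dogs can engage.
    """
    shaft_rpm = engine_rpm / old_ratio   # output shaft, set by road speed
    target_rpm = shaft_rpm * new_ratio   # input-side speed the new gear wants
    return target_rpm - engine_rpm

# e.g. shifting from a 1.5:1 gear to a 2.0:1 gear at 6000 rpm
# leaves a 2000 rpm gap for the synchro (or your blip) to close
print(synchro_rpm_mismatch(6000, 1.5, 2.0))
```

That gap is also why a throttle blip on downshift makes the synchro's life easier.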
Pardon me if I hijack the thread for a little bit, but I don't really agree with the above, especially the part that says "with some physics understanding, you'll see that the momentum put on the cars then is quite big."
Seems to me that all this applies only when you have rigid, elastic collisions -- then the energy represented by the overlap goes directly into the reactive impulse (generating a large momentum change). But cars aren't rigid -- they crumple, which makes the collision inelastic -- so a vast portion of that energy should really dissipate into the bodywork, suspension, etc., and only a little should go into the reactive force.
Even though LFS doesn't yet have a fully implemented damage model, it seems like it could at least pull that part of the energy out and apply only a small portion of what remains toward the change in momentum. Of course, until there's a complete damage model, you'd have to estimate the dissipated energy instead of calculating it -- but you get the idea.
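To make that concrete, here's a minimal 1-D sketch (my own function, not LFS code). The impulse each car receives scales with (1 + e), where e is the coefficient of restitution, so a crumpling (low-e) collision transfers much less momentum than a rigid (e near 1) one:

```python
def collision_impulse(m1, m2, v1, v2, e):
    """Impulse magnitude on each car in a 1-D head-on collision.

    e is the coefficient of restitution:
      1.0 = perfectly elastic (rigid bodies, maximum rebound)
      0.0 = perfectly plastic (bodies crumple together, no rebound)
    """
    mu = m1 * m2 / (m1 + m2)            # reduced mass of the pair
    return (1.0 + e) * mu * abs(v1 - v2)

# two 1000 kg cars, 20 m/s closing speed:
print(collision_impulse(1000, 1000, 20.0, 0.0, 1.0))  # rigid
print(collision_impulse(1000, 1000, 20.0, 0.0, 0.0))  # fully crumpling
```

The fully plastic case delivers half the impulse of the rigid one -- which is the whole argument: treating cars as rigid bodies overestimates the momentum kick.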
I'm definitely in camp B. I just can't get used to a car with too much forward bias; I pretty much always have to dial it back to around 62% or I'll go understeering off at the first corner. Less forward bias gives me that initial bite to get the car rotated.
BTW: I found my reference to Carroll Smith. It's in Drive to Win, page 2-25. He's got a section in there about rowing through the gears vs. skipping them (kinda tangential), but he does say that we should "Concentrate your efforts on braking efficiently and, when the car has slowed sufficiently, downshift into the required gear (taking care to properly synch the revs) - and get on with the job." I remember, but can't locate offhand, that he also describes a little experiment where you take the car and do some 60 mph to 0 stops, both with and without downshifting. His claim is that by concentrating on modulating the brakes, you'll stop in less distance if you don't row through the gears.
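For a rough feel of his point, here's a back-of-envelope sketch (constant deceleration assumed, the numbers are mine): even a few percent of braking lost to fiddling with downshifts shows up as real distance.

```python
MPH_TO_MPS = 0.44704

def stopping_distance(v0_mps, decel_mps2):
    """Constant-deceleration stop: d = v0^2 / (2a)."""
    return v0_mps ** 2 / (2 * decel_mps2)

v0 = 60 * MPH_TO_MPS                           # 60 mph in m/s
focused = stopping_distance(v0, 9.81)          # ~1 g, full attention on the brakes
distracted = stopping_distance(v0, 0.95 * 9.81)  # 5% worse average decel

print(focused, distracted)   # the distracted stop is a couple of metres longer
```

Whether rowing the gears actually costs you 5% of your average deceleration is exactly the kind of thing his 60-to-0 experiment is meant to measure.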
Just a quick question for the fast drivers out there. I've noticed that many (most? all?) of the really fast setups have a brake bias set quite a bit more forward than I'd expect. I seem to remember reading somewhere that people do this because they're using the engine to slow themselves down.
Why? I've read Carroll Smith saying you want to set the car up for maximum braking with the brakes alone (I'm sure Carroll has some pithy comment about brakes being for stopping and engines for going, but I digress).
So can someone explain why engine braking is so popular?
I'm trying to remember, but don't some of the GT series (Aussie Supercars, maybe?) use ballast to even up the field after each race? It does mix things up a bit and encourages more passing, etc.
How about a server setting to add ballast to the winner? This might work really well for sequences of short races where everyone restarts after each race. It could even remove added ballast if you start losing...
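Sketching what that server rule might look like (all names and numbers here are mine -- nothing LFS actually exposes):

```python
BALLAST_STEP_KG = 20   # assumed per-win penalty
BALLAST_MAX_KG = 100   # assumed cap so nobody becomes undriveable

def update_ballast(current_kg, finish_pos, field_size):
    """Success-ballast sketch: winners gain weight, back-markers shed it."""
    if finish_pos == 1:
        return min(current_kg + BALLAST_STEP_KG, BALLAST_MAX_KG)
    if finish_pos > field_size // 2:   # finished in the bottom half
        return max(current_kg - BALLAST_STEP_KG, 0)
    return current_kg

# winner of a 10-car race picks up 20 kg; a back-marker sheds 20 kg
print(update_ballast(0, 1, 10), update_ballast(40, 8, 10))
```

The step size and cap are exactly the knobs a server admin would want to tune per combo.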
Alternatively, there could be a server setting to fetch the WR information and do some handicapping on that basis.