The specific multiple-monitor feature that was removed (as a built-in Windows feature) in Vista is the one where multiple displays behave as if they're a single screen - meaning fullscreen games etc. will span across both displays.
In Vista/7/8, displays in extended mode still behave mostly independently - the Start menu, taskbar & fullscreen applications load by default only on the primary display, and maximised applications will only occupy a single display, not both simultaneously.
To get the old feature back on Vista/7/8, you have to use hardware (TH2G), software (SoftTH, but that only works for fullscreen 3d programs afaik) or a 3rd party driver feature like Eyefinity.
TBH, I think Vista got way more bad press than it deserved.
Most of the initial problems were caused firstly by the change in driver model, which meant that hardware manufacturers had to get off their arses and update drivers - something that many were incapable of, or slow at, doing. Secondly, Vista removed a lot of the 'hacky' ways of doing things, which meant that a lot of programs that weren't written properly in the first place had problems. Finally, there was all that "Vista capable" crap with PC OEMs - that one probably has equal blame between MS and the OEMs.
Vista with SP2 wasn't actually much different from the early W7 versions. There are a few ridiculous things about it and some inefficiencies, but it runs a lot better than it did originally.
Windows 7 fixed many of the strange design choices, improved UAC and brought better stability and efficiency - basically just tweaks and a few features over Vista. W7 is better than Vista overall, but there's not actually a huge difference between the two.
Something that I think was forgotten by most people who still loudly hate on Vista is how much of a dog the almighty XP was when it first came out. It had a heavily criticised UI, consumer driver/software compatibility problems and was more of a resource hog and more unstable than Win2k. Sound familiar?
XP only really became usable after SP1 and only got good after the big changes made in SP2.
Yeah, that's what I figured for the 10-20ms readings, although I thought the 30ms was a bit high. I guess it's just an obscure combination of timing and maybe CPU load.
The multiple screen feature was indeed removed by Vista, but modern AMD and nVidia graphics cards can do it themselves. AMD's Eyefinity supports up to 6 screens per card and I believe nVidia has some kind of equivalent.
Windows 7 is actually a very solid OS. There are a fair few improvements over XP - its 64bit implementation especially, plus better stability - although it does use more RAM (less than Vista though). That said, with 8GB of RAM costing less than £30, who cares?
Underneath, Windows 8 is basically Windows 7 with some performance improvements and a few other tweaks.
The new tablet ("don't call it Metro"*) interface is the start menu and nothing more. The desktop still exists as it did before, apart from the start menu, so the only time you need to access the tablet interface is when opening programs and shutting down. It is a bit clunky to use with a mouse, but a hell of a lot better than it was in the Developer Preview.
Frankly that's no real hardship, as since Vista most people use the start menu only to begin typing the name of a program then select it from the list of the results - that behaviour hasn't changed.
I haven't properly played with W8 on real hardware yet - only in a VM - so I can't really say what it's like to use in the long term.
I've been using W7 every day for over 3 years now and I personally prefer it to XP. It's certainly much more stable; the only thing that's ever crashed it for me is when the occasional broken version of Flash takes out the graphics driver.
*They got sued over the Metro name for trademark infringement or something
Just tested this on my machine - localhost dedicated server, same version as client, Win7 64bit
B9: 10-20ms; mostly 20ms, about 80% of the time
B11: 10-20ms; mostly 10ms, >90% of the time
I haven't seen more than 20ms to a localhost server, although I did see 30ms (B9, Wine) or so to a LAN server sometimes, which is a little strange as real ping time is ~400µs.
B11 LAN server seems fairly solid at 10ms (>80% of the time), never above 20ms while on track.
Reported latency to Internet servers seems to be about OK in B11, mostly 40-50ms (max 60) to a server with real 36±1ms ping.
In my experience, it is usually the client's load times that cause the disconnects on track change, although IIRC occasionally people with very bad connections do get taken out by others who are loading slowly.
cargame.nl meant that there is literally no way of detecting whether someone has rolled their car - it's impossible to program, as InSim doesn't give that kind of information.
Information about roll & pitch is only available through OutSim, which is client side only.
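For illustration, here's a minimal sketch of pulling pitch/roll out of OutSim in Python. It assumes the classic 64-byte OutSim payload (Time; AngVel×3, Heading, Pitch, Roll, Accel×3, Vel×3; Pos×3, with no optional ID field) and an arbitrary roll threshold of my own choosing - check InSim.txt/cfg.txt for your LFS version before trusting the offsets.

```python
import math
import struct

# Sketch only: assumes the classic 64-byte OutSim packet layout
# (uint Time; float AngVel[3], Heading, Pitch, Roll, Accel[3], Vel[3]; int Pos[3]).
OUTSIM_FMT = "<I12f3i"
OUTSIM_SIZE = struct.calcsize(OUTSIM_FMT)  # 64 bytes

def unpack_pitch_roll(data):
    """Return (pitch, roll) in degrees from a raw OutSim UDP payload."""
    vals = struct.unpack(OUTSIM_FMT, data[:OUTSIM_SIZE])
    heading, pitch, roll = vals[4], vals[5], vals[6]  # radians in the packet
    return math.degrees(pitch), math.degrees(roll)

def looks_rolled(roll_deg, threshold=90.0):
    """Crude 'car is on its side/roof' check; the threshold is arbitrary."""
    return abs(roll_deg) > threshold
```

Feed it the UDP payload received on the OutSim port configured in cfg.txt - and remember this only works for the local client's own car, which is exactly why a server-side InSim can't do it.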
Yeah, that's what I figured - I doubt the OS will be a significant factor. It's probably down to local connections usually having vastly lower latency, so the buffer is being emptied much faster. So fast, it never actually gets full. It also means that internet clients will be susceptible to problems with much fewer buttons than on a LAN.
Using MadCat's ButtonBomb, I got a client disconnect every time I've tried so far, except for a local server + client + InSim. Considering the amount of data being sent (over 54KB of button text alone) I'm not really surprised.
Client & InSim: Windows 7 64bit
Servers tested:
Windows Vista 32bit (LAN) - lost connection
Wine on Ubuntu 10.04 (LAN) - lost connection
Windows 7 64bit (local) - connection OK
Server log:
Nov 01 12:02:35 InSim - TCP : ButtonBomb
Nov 01 12:02:40 FATAL TCP ERROR : BUFFER SIZE
Nov 01 12:02:40 Leave @ 11212 : Degats
Nov 01 12:02:40 Lost connection to Degats^L (Degats)
Interestingly, when testing with localhost, the buttons didn't get displayed on screen if the client was already on the server when the InSim connected. When the client joined with InSim already running, the buttons displayed fine, but the text in the chat history went missing - probably to do with the many overlaid buttons.
On the LAN servers, the buttons caused a disconnect whichever connected first.
I'll try to do some more tests later with my own test app, hopefully with some more OS & network combinations, as well as different numbers of buttons etc.
BTW MadCatX, the "Packet count" option in the ButtonBomb is never used in your code, so it sends out 238 buttons every time.
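For a rough sense of the data volume, here's my own back-of-envelope arithmetic, assuming the IS_BTN limit of 240 text bytes plus a 12-byte fixed part per packet (verify against your InSim.txt):

```python
MAX_BTN_TEXT = 240   # max Text bytes in a single IS_BTN (per InSim docs)
BTN_FIXED = 12       # fixed (non-text) part of an IS_BTN packet
buttons = 238        # what ButtonBomb always sends

text_bytes = buttons * MAX_BTN_TEXT             # 57120 B - the "over 54KB" above
total_bytes = buttons * (BTN_FIXED + MAX_BTN_TEXT)
print(text_bytes, total_bytes)                  # 57120 59976
```

So if every button carries a full-length string, the text alone is ~55.8KB, which lines up with the buffer overrun.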
Just a thought: it's possible that your PSU can't quite cut it. It's an older, cheaper design (shown by the max currents - no one's ever needed 300W of 3.3V & 5V; those high currents are simply a way of making it seem more powerful than it really is).
The manual isn't clear if the PCIe power connectors are on the same 12V rail or not (another sign of the quality level of the PSU). If they're on the same rail, then it's very possible that the GPU is overloading that rail.
Download some software that will display the voltages (eg speedfan) and see if they drop significantly when the GPU is under load. This might also explain why it started behaving when Fraps was running - Fraps caps the framerate, potentially reducing the GPU workload.
Something else to note regarding the SSD partitions - 20GB is not enough for a 64bit Windows installation. My c:\windows\ directory alone is 30GB.
To expand on this, IIRC the player does get renamed in some places, but not all, causing the name shown in the player list/chat/over car to be different. This discrepancy can be detected in the InSim packets, which is why several InSim applications try to deal with the problem, usually by kicking the player.
If I understand you (correct me if I'm wrong) each client - assuming 32 clients - sees 32 rows of 4 buttons before a race start?
If so, the server still sends all 4096 packets to clients even if you're sending the (32x4) buttons once to the global UCID=255, as the server still has to send separately to each client - even if the different clients are receiving the same button ID+Content.
Each single button sent to UCID=255 actually sends 32 packets total (or whatever the current number of connections - could be up to 47).
If you were to make each of those 4096 buttons unique, there would only be extra packets sent InSim -> Server, but the same number Server -> Client.
Example:
(32x4) buttons sent to UCID=255
128 packets InSim -> Server
128 packets Server -> each Client
4096 packets Server -> Clients in total.
(32x4) buttons sent to each client in turn
4096 packets InSim -> Server
4096 packets Server -> Clients in total.
Assuming the InSim is local to the server, the number of packets InSim <-> Server should be a lot less sensitive than Server <-> Client as latency is near zero, so AFAIK it shouldn't matter much if you're sending 128 or 4096 packets locally.
The above is the case in the current 0.6B version, of course the new patch could vastly reduce the number of packets in all cases - except a single button (global or unique) sent to all clients, which would still be 1 packet per client.
tl;dr
In the current version, it doesn't matter whether you're sending a global or unique button to all clients, the server still sends out the same number of packets:
Total packets = buttons per client * number of clients
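That formula can be sanity-checked with a trivial sketch (function names are mine, purely illustrative):

```python
def server_to_clients(buttons_per_client, clients):
    # 0.6B behaviour: the server sends every button to every client
    # individually, whether it was addressed to UCID=255 or not.
    return buttons_per_client * clients

def insim_to_server(buttons_per_client, clients, use_global):
    # UCID=255 only saves packets on the InSim -> Server leg.
    return buttons_per_client if use_global else buttons_per_client * clients

# 32 clients, each seeing 32 rows x 4 buttons:
print(server_to_clients(32 * 4, 32))          # 4096 either way
print(insim_to_server(32 * 4, 32, True))      # 128 (global)
print(insim_to_server(32 * 4, 32, False))     # 4096 (per-client)
```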
That should be fine, as it would only delete the ...\LFS folder.
If anyone's not sure if the location they installed LFS is safe from deletion or not, just untick the "Remove Live for Speed folder" box in the uninstaller and it won't delete anything other than shortcuts and file associations.
You can then manually delete whatever LFS files you need to.
My skins_x/ folder alone is nearly 3.5GB. If I'd had high resolution skin downloads on, skins_y/ could be 4 times that size.
Then there's skins/, high resolution modified textures, autosaved replays etc.
The amount of space a long running LFS install uses can easily mount up.
I have LFS on a 1TB data drive, so I don't really care if it gets big after a while (I do purge mprs occasionally though)
However, if I uninstall a program I don't expect it to be taking up any significant disk space at all - that's usually the only reason why I'd uninstall something in the first place. And by significant space I mean no more than a few MBs for user settings.
AFAIK it can already do that, but LFS generates so many files (including many large ones like skins, replays etc) that the uninstaller won't know about but you may not want taking up space, which is why the 'delete everything' option is there.
That's probably the best way to do it, as you're never going to be able to include all the possible special cases because they could be named pretty much anything (symlinks can be a very good way of getting yourself into trouble)
I'd hazard a guess that most people who don't read what the uninstaller says it's about to do won't have read the readme or the wiki.
It's not uncommon for an application to delete the folder it was installed to, although most that do (including LFS) ask you first.
If the user doesn't click through the installer and goes out of their way to choose a non-default install location, doesn't notice that it's being installed to the root of Program Files, then doesn't bother reading what the uninstaller says it's going to do - even though there's a check box which is always a warning that it's about to do something you might not want it to - then quite frankly it is user error if the directory gets hosed.
Now, here's a real example of programmer error:
The uninstaller for Test Drive Unlimited deletes every file (but thankfully not folder) it has system permissions for in the root of C:\. One of those files is boot.ini, without which many Windows installations can't boot, rendering the computer unusable.
The really sad thing? They knew about this problem in the Beta, didn't fix it for release and didn't tell anyone about it.
I've had something similar sometimes (in 0.6B) when my router forgets what UDP is for a while.
I press ctrl+t to switch to TCP position packets, which makes it so I can see other players. InSim then knows where I am, but (some?) other players still can't see my car.
This could be similar to what Chuck described above, as I've seen this happen sometimes with others: somehow the timing to a player goes out of sync, which causes LFS to show lag when the packets actually arrive on time or impossibly low lag (eg 0.01 between Europe and Australia). What's really strange is that sometimes the amount it's out of sync drifts, causing the reported latency to slowly increase past 0.5 and eventually drop back down to 0.00.
No idea if this is still possible in the test patch though.
He's not just scaling proportionally to distance though, he's also doing horizontal scaling based on the relative plane angle to fake the impression of looking at it from the side.
TC does have buttons that are scaled based on distance (easy) and others that physically track the location/position of other cars (not particularly complex trig).
I don't know if any other cruise servers have done anything similar or taken it further. AFAIK, one or two race InSim programs have implemented buttons that track a specific point on the track, but no scaling.
What MadCatX is doing (so far) is a step further still - a little more complex, and giving a bit more of an impression of '3D'.
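As an illustration of the idea (this is my own toy formula, not MadCatX's or TC's actual code): scale the button by inverse distance, then squash its width by the cosine of the viewing angle so it thins out as you approach edge-on.

```python
import math

def fake_3d_button(base_size, distance, view_angle_rad):
    """Return (width, height) for a button faking a 3D billboard.
    view_angle_rad: 0 = facing the viewer, pi/2 = edge-on."""
    s = base_size / max(distance, 1.0)       # inverse-distance scaling
    w = s * abs(math.cos(view_angle_rad))    # horizontal squash for side view
    return w, s

print(fake_3d_button(100, 10, 0.0))          # (10.0, 10.0) head-on
```

Since IS_BTN sizes are coarse integers on a 0-200 grid, the results would need rounding (and clamping to a minimum size) before being sent, which is part of why the effect only ever approximates real 3D.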
It is indeed less important as far as checking the overall server health is concerned.
However, IMO it is still very important to be able to easily monitor actual latency of individual players somehow, especially when there is close quarters racing going on as a laggy (although I suppose jitter is just as important) player can easily cause havoc.
It's quite surprising how much useful connection info the dancing of the old lag bars could show to the experienced eye.
From my tests, it's sent whenever a position packet from the player arrives so late (or not at all) that the car is made invisible and the "lag (0.00s)" thing shows in place of the player name above the car.
Every MCI packet that's sent while the player is in that invisible/lag state (or has been since the last MCI) has the CCI_LAG bit set.
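Checking that bit in an InSim app is trivial; the flag values below are from the InSim docs as I remember them, so double-check against your version's InSim.txt:

```python
# CompCar Info flags (from InSim.txt - verify against your version)
CCI_BLUE   = 1
CCI_YELLOW = 2
CCI_LAG    = 32
CCI_FIRST  = 64
CCI_LAST   = 128

def is_lagging(info):
    """True if this CompCar has been in the invisible/lag state
    at some point since the previous IS_MCI."""
    return bool(info & CCI_LAG)

print(is_lagging(CCI_LAG | CCI_FIRST))  # True
print(is_lagging(CCI_BLUE))             # False
```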