Dual-core CPUs are essentially two processors shoe-horned into the same chip with a few adjustments. To the software author, and even to the OS, they appear as two distinct CPUs.
For a long time Windows has had multi-tasking, where different programs run concurrently and can be allocated to any CPU core.
Multi-threading is where a software application repeats this process within itself. Just out of interest, the concept is not new: the Amiga OS had support for this back in the dark ages, when cavemen were still using telephones connected to phone sockets on the wall. Windows cottoned on a little later, but MS were not nearly as slow as British Telecom, who took 17 years to prove the transistor worked before the cavemen could have digital exchanges.
Except, at least as far as I know, Windows always decides which cores to run stuff on. The only thing software developers can do is create multiple threads and trust Windows to handle the rest.
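For what it's worth, that really is all there is to it from the programmer's side. A minimal sketch in C on Windows (hypothetical worker function, error checking left out) would look something like the below: you create the threads and the scheduler spreads them across the cores (you can request a specific core with SetThreadAffinityMask, but normally you just leave it alone).

#include <windows.h>
#include <stdio.h>

/* Hypothetical worker - each instance just burns some CPU time. */
static DWORD WINAPI Worker(LPVOID param)
{
    int id = (int)(INT_PTR)param;
    volatile double x = 0.0;
    long i;
    for (i = 0; i < 100000000L; i++)
        x += i * 0.5;
    printf("worker %d done\n", id);
    return 0;
}

int main(void)
{
    HANDLE threads[2];
    DWORD tid;

    /* Create two threads; Windows decides which core each one runs on. */
    threads[0] = CreateThread(NULL, 0, Worker, (LPVOID)(INT_PTR)1, 0, &tid);
    threads[1] = CreateThread(NULL, 0, Worker, (LPVOID)(INT_PTR)2, 0, &tid);

    /* Wait for both to finish, then clean up. */
    WaitForMultipleObjects(2, threads, TRUE, INFINITE);
    CloseHandle(threads[0]);
    CloseHandle(threads[1]);
    return 0;
}

On a dual-core machine, Task Manager should show both cores loaded while the two workers run.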
Ok thanks. Having read through most of the rest of the thread now (specifically yours and JeffR's posts), I get it now. It's pretty much what I was trying to say, but badly. I was thinking more of "threads" as seen from the OS perspective, as you yourself said in one post about creating a multi-threaded application and just letting the OS get on with deciding which CPU or core to allocate it to (thanks for clearing up the point about whether multi-core CPUs are seen as logical or physical by the OS, btw).
Oh, and on your point about BT: it may very well be the case, but that says even more about the other telcos around the world, as BT were the first to go digital. The UK led the telecoms "revolution" of the '80s and '90s.
Poor customer service they may have, but BT have been (and continue to be) a driving force in world telecommunications technology and development.
Does that mean I can have 20 Mbit down, 2 Mbit up now? *listens to sound of pin dropping* Nope. Not leading.
Thatcher granted BT a monopoly so they could invest in infrastructure; that infrastructure is now out of date and BT no longer have their monopoly. On the plus side, we don't pay 10p for a one-minute local call now; on the downside, my internet sucks - and not in a multi-threaded way either.
I've been on BT a while (no Virgin Media out here) and that is comparable to what my old house had, Dan! 0.16k-0.22k down and 0.2-0.4k up! I had 3 reasons for moving and that was one of them - but I'm still a country bumpkin and get 2-4 Mbit down and 0.4 Mbit up.
If I stay settled then I'm seriously considering moving to a big city in the next year or two just for better internet.
The RTC (Real Time Clock) is a different piece of hardware. The RTC is a Motorola 146818 or an equivalent, normally uses a 32.768 kHz crystal, is a low-power CMOS device, and is not as accurate as the 8253/8254 (or equivalent) with its 1.193182 MHz crystal. Its main purpose is to keep track of the time of day while the PC is turned off, running off the washer-sized battery on the motherboard.
On newer systems, a high-precision timer is available, referred to as the "multi-media timer" in Visual Studio. These should be nearly identical on all PCs.
Since LFS is meant to run on older PCs, and perhaps Windows 95, it would use timer 0 on the 8253/8254 or its equivalent, indirectly, but this is an accurate timer.
Numerical integration is used when differential equations can't be solved directly. You start off with a known position, velocity, and acceleration. The only thing you can directly calculate is the acceleration. So, given this, you "integrate" the acceleration over 0.01 second to predict a new velocity. Based on this predicted velocity, you use the average velocity during the 0.01 second to predict a new position. Then the cycle repeats. Different mathematical methods improve the accuracy, but reducing the step size helps the most. Using a step of 0.001 second would be more accurate, but would require a 1000 Hz physics engine if it needed to run in real time.
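To make that concrete, here's a bare-bones sketch of that cycle in C. The constant-acceleration example and the 100 Hz / 0.01 s step are just placeholders for illustration, not anything LFS actually does:

#include <stdio.h>

/* Hypothetical: acceleration as a function of the current state.
   A real model would compute this from the forces acting on the object. */
static double acceleration(double position, double velocity)
{
    return -9.81;   /* constant gravity, purely for illustration */
}

int main(void)
{
    double x = 100.0;        /* known starting position (m)   */
    double v = 0.0;          /* known starting velocity (m/s) */
    const double dt = 0.01;  /* 0.01 s step => 100 Hz physics */
    int i;

    for (i = 0; i < 100; i++)           /* simulate 1 second */
    {
        double a = acceleration(x, v);  /* the only thing we can calculate directly */
        double vNew = v + a * dt;       /* integrate acceleration -> predicted velocity */
        x += 0.5 * (v + vNew) * dt;     /* average velocity over the step -> new position */
        v = vNew;
    }
    printf("after 1 s: x = %f m, v = %f m/s\n", x, v);
    return 0;
}

Shrinking dt reduces the error accumulated per step, which is the point about a 1000 Hz engine with dt = 0.001 s being more accurate than a 100 Hz one.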
This can get very complicated. In space, gravity diminishes with distance, and time dilates a small amount due to relativity. The more accurate the math model is, the less fuel is consumed on course corrections during space travel.
Near the earth's surface, aerodynamic drag is extremely complicated, especially in the case of supersonic objects.
In the case of a car, things like the suspension and tread movements happen too quickly to be properly modeled on current PCs. I've read that you'd need a physics engine running at 10,000 to 100,000 Hz to do an accurate job, and you'd need an extremely good mathematical model to go with it. From what I understand this isn't done for real cars, so the modeling stops and real testing starts with prototypes to discover any hidden problems.
If the modeling were accurate enough and actually used, then things like blow-overs wouldn't happen (the driver was OK):
What do I get on older systems? Hardware that only exists in newer systems is not much use to me personally - yet.
Ah okee, I get yah - yep, exponential calculations and the like. I guess it's all down to how accurate you need to be. LFS's physics are well ahead of anything I've needed to do away from the world of cricket balls (where I think I have LFS beaten), so I've never personally had to tackle the problem to the nth degree of realism; consequently I can get away with far cruder, more basic algorithms for the stuff I do.
A little beyond the simulation scope of most computer games, but I have always been fascinated by the n-dimensional universe model, the relativity of time, and how it relates to the 16-dimensional "spaghetti"-verse. I don't necessarily believe in it (I don't pretend to know the answers), but I do find it really fascinating. Still, I'm not going to incorporate it into a computer game.
Yah, there's this desire to model the car to the nth degree of accuracy, but I've said this before - it's pointless, as arbitrary values are perfectly adequate for the task. What we need in sims is better environmental simulation, because at the moment so little of the dynamism of actual racing environments is modelled. More sim authors need to get themselves on the track and watch as much club racing as they can... Watch how a field of single-seaters maintains a fairly consistent gap between each other at each part of the circuit, and get the 'feel' of what really happens during a race weekend. At the moment all of that is missing - and it's far more important than the car physics being 95 or 100% accurate.
Oh, I know that accident, quite personally in a way... I even wrote a poem that referenced it during this year's Le Mans, although it's a bit crap and I never intended to publish it; I just put it online.
A multi-media timer. The wiki article was wrong. Multi-media timers predate HPETs, can run at 1 ms => 1000 Hz, and have been available since Windows 95:
The HPET spec requires a minimum 10 MHz source clock. http://www.intel.com/hardwaredesign/hpetspec_1.pdf
From my quick web search, HPETs can be run at 100 µs => 10,000 Hz. I don't know what the limit is, other than system overhead.
In the sample code linked to below, setting uDelay to 1ms would result in TimeProc being called once every 1 ms.
It's interesting, and hopefully means that in time this issue will go away - if it is truly accurately calibrated between PCs. For my target audience at the current time, though, it'll give me the same clock I use now. Nonetheless, it seems there's no harm in including this in my existing systems, so I'll package it up and experiment with it. Thanks.
Java provides a couple of mechanisms for getting time information from the underlying OS. System.currentTimeMillis() returns the number of milliseconds between now and midnight on January 1st 1970 (epoch time), and System.nanoTime() returns the current value of the most precise timer available from the OS. There is an important note about System.nanoTime() though:
On topic, I would like to see multiple cores supported by LFS, because multi-core machines make up a much larger share of the new PC market than ever before, and that trend is only going to continue. LFS has traditionally catered for low-end machines, but a future release (maybe the first S3 Alpha) moving to more intensive graphics/physics, along with a move to multi-core support, would be good.
mmtimerf.c - 1ms timer via callback function (like an interrupt routine)
mmtimert.c - 1ms timer via thread (timer sets event that the thread waits for)
I tested these using Visual C 4.0 (1995) and Visual Studio 2005.
I tested on an old 2.0 GHz Pentium 4, D850GB motherboard (2001) system running Windows ME, and they run just fine. This is a system that predates LFS by 1 to 2 years.
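The attached files aren't reproduced here, but a stripped-down sketch of the callback approach (the mmtimerf.c style), assuming a reasonably modern Windows SDK, looks roughly like this:

#include <windows.h>
#include <mmsystem.h>
#include <stdio.h>
#pragma comment(lib, "winmm.lib")

static volatile LONG ticks = 0;

/* Called by the multi-media timer every uDelay ms, like an interrupt routine. */
static void CALLBACK TimeProc(UINT uTimerID, UINT uMsg,
                              DWORD_PTR dwUser, DWORD_PTR dw1, DWORD_PTR dw2)
{
    InterlockedIncrement(&ticks);
}

int main(void)
{
    MMRESULT timerId;

    timeBeginPeriod(1);                      /* request 1 ms timer resolution */
    timerId = timeSetEvent(1,                /* uDelay = 1 ms                 */
                           0,                /* best resolution available     */
                           TimeProc,
                           0,                /* user data passed to callback  */
                           TIME_PERIODIC | TIME_CALLBACK_FUNCTION);

    Sleep(1000);                             /* let it run for one second     */
    printf("callbacks in 1 second: %ld\n", (long)ticks);   /* expect ~1000   */

    timeKillEvent(timerId);
    timeEndPeriod(1);
    return 0;
}

The thread version just replaces the work in TimeProc with SetEvent() on an event that the worker thread waits for.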
Here's a Java example showing a program that uses 3 timers to do different things. One timer switches the background and text colour after a specified delay, another updates the "time remaining until the next colour change" label, and the last updates the fields showing how long the program has been running, in both milliseconds and nanoseconds. The delay between colour changes can be set from 10 milliseconds to 5000 milliseconds. Don't run it low if you have epilepsy (you have been warned!). Unzip the attached archive and double-click the executable JAR file to run the application. Don't run the JAR if you don't trust me. I've attached a sample screenshot of the application. The source can be viewed by extracting the contents of the JAR file and looking at the "tester/MultimediaTimerTester.java" file.
You can't buy a single-core desktop CPU anymore; everything is dual-core and up. Intel has 6- and 8-core CPUs in its future lineups. It's time to stop catering to older systems and start adding awesome DX10 neon light effects and NAWZ.
LFS could easily benefit from multi-core optimization. LFS could benefit from a lot of stuff but we just have to wait for it...
Whether BT has the latest technology actually deployed in its network is irrelevant to the fact that BT's technology labs have played a major role in driving telecommunications forward.
But as a point of fact, within the UK it is BT, not any other operator, that is actually pushing forward the deployment of FTTC and FTTH. So again, despite the fact that people love to hate them, the only reason the UK is ever going to get fast internet access is because BT is prepared to spend the money. No doubt, just as with ADSL, they'll spend the money deploying it and all their competitors will just reap the profits using it under the UK's LLU regulations.