Single- vs. multi-threaded programming on a single core processor


Can someone please explain if there's really any advantage to writing a multi-threaded piece of code that runs on a single processor with a single core? For example, a method that processes document pages, where the pages are independent of one another as far as that code is concerned.

At first glance, it doesn't seem like there'd be an advantage, because true parallel execution isn't possible: the OS still has to context-switch between the threads. I'm wondering whether coding it in a single-threaded manner would actually be more efficient.

Clearly, there are plenty of cases where writing multi-threaded code makes sense, but again, my question is whether there's really an advantage to doing so when the application is running on a single-core processor.

EDIT: Note that I did not say "application" but rather "piece of code"; see my example above. Clearly there are benefits to having a multi-threaded application.


There are 3 answers

Ben Barden (Best Answer)

There are still advantages to be gained, but they're a bit situational.

  • In many cases, giving the process multiple threads will allow it to claim a larger share of processor time from other processes. This is finicky to balance, and each thread you introduce adds a bit of overhead, but it can be a reason.

  • If you are dealing with multiple potentially blocking resources, like file I/O or GUI interaction, then multithreading can be vital (see the sketch below).
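A minimal sketch of that blocking-resource case, in Python since the question doesn't name a language; time.sleep stands in for a blocking read and the page count is arbitrary:

```python
import threading
import time

def fetch_page(page_id, results):
    # Simulated blocking I/O (network or disk). While this thread
    # waits, the OS can schedule another thread, even on one core.
    time.sleep(1.0)
    results[page_id] = f"contents of page {page_id}"

results = {}
threads = [threading.Thread(target=fetch_page, args=(i, results)) for i in range(4)]

start = time.time()
for t in threads:
    t.start()
for t in threads:
    t.join()

# Roughly one second rather than four: the waits overlap even though
# only one thread can ever execute at a given instant.
print(f"fetched {len(results)} pages in {time.time() - start:.1f}s")
```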

Ian Goldby

On a single core processor, an application that uses asynchronous (non-blocking) I/O will be slightly more efficient than one that uses multiple blocking threads, because it avoids the overhead of context switching between threads.

Also, asynchronous I/O scales better than blocking I/O in threads because the overhead per extra I/O operation is minimal compared to the overhead of creating a new thread.
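A minimal single-threaded sketch of that idea using Python's asyncio (the answer doesn't name a language, and the delay simply stands in for a real non-blocking I/O call):

```python
import asyncio

async def fetch(page_id):
    # Stand-in for a non-blocking I/O operation; no extra thread is
    # created per request, so the per-operation overhead is tiny.
    await asyncio.sleep(1.0)
    return f"page {page_id}"

async def main():
    # 100 concurrent I/O waits on one thread, with no thread context
    # switches; total time is roughly one second, not one hundred.
    pages = await asyncio.gather(*(fetch(i) for i in range(100)))
    print(f"fetched {len(pages)} pages")

asyncio.run(main())
```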

Having said that, you shouldn't generally use single-threaded asynchronous I/O in new applications because almost all new processors are multicore. Instead you should still use asynchronous I/O, but split the work amongst a set of worker threads using something like a thread pool. Your system documentation will tell you the ideal number of worker threads; usually it is equal to the number of processing cores available.
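A rough sketch of that combination, again in Python: the event loop handles the I/O while a pool of workers, one per core, does the CPU-heavy page processing. (CPython's interpreter lock means a process pool stands in here for the thread pool the answer describes; the function names and delays are made up.)

```python
import asyncio
import os
from concurrent.futures import ProcessPoolExecutor

def process(page):
    # CPU-bound work on a page (parsing, rendering, ...).
    return page.upper()

async def fetch(page_id):
    await asyncio.sleep(0.1)  # stands in for non-blocking I/O
    return f"page {page_id}"

async def main():
    loop = asyncio.get_running_loop()
    # One worker per core, as suggested above.
    with ProcessPoolExecutor(max_workers=os.cpu_count()) as pool:
        pages = await asyncio.gather(*(fetch(i) for i in range(8)))
        results = await asyncio.gather(
            *(loop.run_in_executor(pool, process, p) for p in pages)
        )
    print(results)

if __name__ == "__main__":
    asyncio.run(main())
```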

Edit: On the Windows platform at least, the async/await pattern in .NET is the modern way to perform asynchronous I/O. It makes this pattern as trivially easy to write as the old blocking I/O pattern. There is almost no excuse for writing blocking I/O now.

BergQuester

Yes, multi-threading is useful on a single core. If one thread in an application gets blocked waiting for something (say, data from the network card, or the disk to finish writing), the CPU can switch to another thread and keep working.
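A tiny illustration of that in Python, with the sleep standing in for the device wait:

```python
import threading
import time

def write_to_disk():
    # Simulated blocking write; while this thread is parked waiting,
    # the single core is free to run the computation below.
    time.sleep(1.0)
    print("write finished")

writer = threading.Thread(target=write_to_disk)
writer.start()

# The CPU keeps doing useful work instead of idling during the write.
total = sum(i * i for i in range(2_000_000))
writer.join()
print(f"computed {total} while the write was in flight")
```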

BeOS was written with pervasive multithreading in mind, even in a time of single core processors. The result was a very responsive OS, though a rather difficult OS to program for.