multitasking
<computer, parallel> (Or "multi-tasking", "multi-processing",
"multiprogramming", "concurrency", "process scheduling") A
technique used in an operating system for sharing a single
processor between several independent jobs. The first
multitasking operating systems were designed in the early
1960s.
Under "cooperative multitasking" the running task decides when
to give up the CPU, whereas under "pre-emptive multitasking"
(probably more common) a system process called the
"scheduler" suspends the currently running task after it has
run for a fixed period known as a "time-slice". In both
cases the scheduler is responsible for selecting the next task
to run and (re)starting it.
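For illustration only, a minimal sketch of a cooperative
scheduler in Python (the task and function names are invented
for this example): each task is a generator that gives up the
processor by yielding, and the scheduler resumes the runnable
tasks in round-robin order.

    from collections import deque

    def task(name, steps):
        """A task that does `steps` units of work, yielding after each one."""
        for i in range(steps):
            print(f"{name}: step {i}")
            yield            # voluntarily relinquish control (cooperative)

    def scheduler(tasks):
        """Resume each task in turn until all of them have finished."""
        ready = deque(tasks)
        while ready:
            current = ready.popleft()
            try:
                next(current)          # run the task until it yields
                ready.append(current)  # still runnable: back of the queue
            except StopIteration:
                pass                   # task finished; do not reschedule it

    scheduler([task("A", 3), task("B", 2)])

Running this interleaves the steps of A and B even though only
one of them executes at any instant.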
The running task may relinquish control voluntarily even in a
pre-emptive system if it is waiting for some external
event.
In either system a task may be suspended prematurely if a
hardware
interrupt occurs, especially if a higher priority
task was waiting for this event and has therefore become
runnable.
The scheduling
algorithm used by the scheduler determines
which task will run next. Some common examples are
round-robin scheduling, priority scheduling and shortest job
first.
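As a rough sketch of the "which task runs next?" decision for
these policies (Python, with an assumed Task record; nothing
here is a real operating system interface):

    from dataclasses import dataclass

    @dataclass
    class Task:
        name: str
        priority: int        # higher number = more urgent
        remaining: int       # estimated work left, used by shortest job first

    def pick_round_robin(ready_queue):
        """Round-robin: take the task that has waited longest."""
        return ready_queue.pop(0)

    def pick_priority(ready_queue):
        """Priority scheduling: take the highest-priority runnable task."""
        best = max(ready_queue, key=lambda t: t.priority)
        ready_queue.remove(best)
        return best

    def pick_shortest_job_first(ready_queue):
        """Shortest job first: take the task with the least remaining work."""
        best = min(ready_queue, key=lambda t: t.remaining)
        ready_queue.remove(best)
        return best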
Multitasking introduces
overheads because the processor
spends some time in choosing the next job to run and in saving
and restoring tasks' state, but it reduces the worst-case time
from job submission to completion compared with a simple
batch system where each job must finish before the next one
starts. Multitasking also means that while one task is
waiting for some external event, the CPU is free to do useful
work on other tasks.
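A small Python sketch of this overlap, using asyncio with
invented task bodies: while one task is suspended waiting for a
(simulated) external event, the other keeps the CPU busy.

    import asyncio

    async def wait_for_device():
        print("I/O task: waiting for external event...")
        await asyncio.sleep(1)          # suspended while it waits
        print("I/O task: event arrived")

    async def crunch_numbers():
        total = 0
        for i in range(5):
            total += i * i
            print(f"compute task: partial result {total}")
            await asyncio.sleep(0)      # yield so other tasks can run
        return total

    async def main():
        # Both tasks make progress; the CPU is not idle during the wait.
        await asyncio.gather(wait_for_device(), crunch_numbers())

    asyncio.run(main())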
A multitasking operating system should provide some degree of
protection of one task from another to prevent tasks from
interacting in unexpected ways such as accidentally modifying
the contents of each other's memory areas.
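One common way to get such protection is to run tasks as
separate processes, each with its own address space. A Python
sketch using the multiprocessing module (variable names invented
for illustration):

    from multiprocessing import Process

    counter = 0   # lives in the parent's address space

    def child_task():
        global counter
        counter += 100           # modifies the child's own copy only
        print(f"child sees counter = {counter}")

    if __name__ == "__main__":
        p = Process(target=child_task)
        p.start()
        p.join()
        # The parent's memory is untouched by the child's modification.
        print(f"parent still sees counter = {counter}")   # prints 0

Because the two tasks never share an address space, neither can
accidentally overwrite the other's data.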
The jobs in a multitasking system may belong to one or many
users. Time-sharing is almost synonymous but implies that there
is more than one user.
Multithreading is a kind of multitasking with low overheads and
no protection of tasks from each other; all threads share the
same memory.
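A Python sketch of that shared memory (names invented for
illustration): several threads update the same variable, so
access has to be coordinated with a lock.

    import threading

    counter = 0
    lock = threading.Lock()

    def worker():
        global counter
        for _ in range(100_000):
            with lock:           # without this, concurrent updates can be lost
                counter += 1

    threads = [threading.Thread(target=worker) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    print(counter)   # 400000: every thread updated the same memory location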
(1998-04-24)