General Instructions. You are expected to work alone on this assignment.
Submit your solution using CMS. No late assignments will be accepted.
To facilitate grading, format your solutions as follows.
Consider a distributed system, where each processor p has a single core and has a hardware clock Cp. We write Cp(t) to denote the time that Cp displays at the instant the omniscient observer knows the real time to be exactly t. For simplicity, assume that the resolution of Cp is very high and, therefore, "ticks" of this clock are short relative to the length of time required to execute an instruction.
Were Cp perfect, then Cp(t) = t would hold for all values t. But actual hardware clocks do not advance at exactly the same rate as real time or, for that matter, at the same rates as each other. For the purposes of this problem, assume hardware clocks are kept synchronized within D of each other. Thus, for all processors p and q and all real times t:
0 <= | Cp(t) - Cq(t) | <= D
where | x | denotes the absolute value of x.
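The synchronization assumption can be checked against a concrete model. The following sketch (not part of the assignment) models two hardware clocks as affine functions of real time; the rates, offsets, and the value of D are invented for illustration.

```python
D = 0.010  # assumed synchronization bound, in seconds (illustrative value)

def C_p(t):
    # Hypothetical hardware clock: runs slightly fast with a small offset.
    return 1.0001 * t + 0.002

def C_q(t):
    # Hypothetical hardware clock: runs slightly slow.
    return 0.9999 * t - 0.003

# The model requires |C_p(t) - C_q(t)| <= D at every real time t;
# these particular clocks satisfy it over the interval checked here.
for t in range(0, 10):
    assert abs(C_p(t) - C_q(t)) <= D
```

Note that because the two clocks drift at different rates, the bound would eventually be violated unless the clocks were periodically resynchronized; the assumption above is that some such mechanism keeps them within D at all times.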
Further, suppose the message-delivery delay for the network that connects processors is bounded from below and from above. Specifically, the delay (as measured by the hardware clock at any of the processors) to deliver a message always lies between DDmin and DDmax.
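One classical consequence of these delay bounds, sketched below for intuition (this is an illustration, not the assignment's question; the numeric values of DDmin and DDmax are invented): a receiver that gets a message stamped with the sender's clock reading at send time can bracket the sender's current clock reading, with worst-case error (DDmax - DDmin)/2 for the midpoint estimate.

```python
DD_MIN, DD_MAX = 0.001, 0.005  # assumed delivery-delay bounds, in seconds

def estimate_remote_clock(T_send):
    # On receipt of a message stamped T_send (the sender's clock at send time),
    # the sender's clock now reads somewhere in [T_send + DD_MIN, T_send + DD_MAX].
    # Taking the midpoint bounds the error by (DD_MAX - DD_MIN) / 2.
    estimate = T_send + (DD_MIN + DD_MAX) / 2
    worst_case_error = (DD_MAX - DD_MIN) / 2
    return estimate, worst_case_error

est, err = estimate_remote_clock(42.0)  # est = 42.003, err = 0.002
```

The tighter the gap between DDmin and DDmax, the more accurately one processor can read another's clock, which is why both bounds matter in synchronization arguments.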