
5   Layer Execution Model

5.1   Callbacks

Layers are implemented as a set of callbacks that handle events passed to the layer by Ensemble from some other protocol layer. These callbacks can in turn invoke callbacks that Ensemble has provided to the layer for passing events on to other layers. Logically, a layer is initialized with one callback for passing events to the layer above it and another callback for passing events to the layer below it. After initialization, a layer returns two callbacks to the system: one for handling events from the layer below it and another for handling events from the layer above it. In practice, these "logical callbacks" are subdivided into several callbacks that handle the cases where different kinds of messages are attached to events.
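For concreteness, the following OCaml sketch shows the shape of this callback structure. The type and function names (event, emit, handlers, forward_layer) are illustrative assumptions, not Ensemble's actual layer interface:

  (* Placeholder event type and callback type for the sketch. *)
  type event = Msg of string
  type emit = event -> unit                 (* passes an event on to an adjacent layer *)

  (* The two callbacks a layer returns to the system after initialization. *)
  type handlers = {
    up_hdlr : event -> unit ;               (* handles events arriving from the layer below *)
    dn_hdlr : event -> unit ;               (* handles events arriving from the layer above *)
  }

  (* A layer is initialized with callbacks for emitting events to the
     layer above (up_out) and the layer below (dn_out) it, and returns
     its two handlers.  This trivial layer forwards every event unchanged. *)
  let forward_layer ~(up_out : emit) ~(dn_out : emit) : handlers = {
    up_hdlr = (fun ev -> up_out ev) ;       (* events from below continue upward *)
    dn_hdlr = (fun ev -> dn_out ev) ;       (* events from above continue downward *)
  }

A real layer would examine the event (and any attached message) in each handler, which is why in practice the two logical callbacks are split into several.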



Figure 4: Layers are executed as I/O automata, with pairs of FIFO event queues connecting adjacent layers.


5.2   Ordering Properties

The system infrastructure that handles scheduling of protocol layers and the passing of events between protocol layers provides the following guarantees:
FIFO ordering: The infrastructure guarantees that events passed between two layers are delivered in order. For instance, if layer A is stacked above layer B, then all events layer A passes to layer B are delivered to layer B in FIFO order, and all events layer B passes up to layer A are delivered to layer A in FIFO order. Note that these ordering properties leave the scheduler some flexibility, because they only constrain the ordering of events within a single channel between a pair of layers.
no concurrency: The system infrastructure that hands events to layers through these callbacks never invokes a layer's callbacks concurrently. It guarantees that at most one invocation of any callback is executing at a time and that the current callback returns before another callback is made to the protocol layer. See Figure 4 for a diagram of layer automata. Note that although a single layer may not be executed concurrently, different layers may be executed concurrently by a scheduler.
The execution of a protocol stack can be visualized as a set of protocol layers with a pair of event queues between each pair of adjacent layers: one queue for events going up and another for events going down. The protocol layers are then automata that are repeatedly scheduled to take a pending event from one of their adjacent incoming queues, execute it, and deposit zero or more events into their two adjacent outgoing queues (see Figure 4).
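The following sketch makes this execution model concrete for a stack of two layers, using plain OCaml queues for the channels; the handler and queue names are illustrative, not Ensemble's:

  type event = Msg of string                       (* placeholder event type *)

  let run () =
    (* One pair of FIFO queues connects the lower layer to the upper layer. *)
    let up_q : event Queue.t = Queue.create () in  (* lower -> upper *)
    let dn_q : event Queue.t = Queue.create () in  (* upper -> lower *)

    (* Each handler consumes one event from its incoming queue and may
       deposit events into its outgoing queues.  Here the upper layer
       echoes an acknowledgment back down; the lower layer prints it. *)
    let upper_up_hdlr (Msg s) = Queue.push (Msg (s ^ "-ack")) dn_q in
    let lower_dn_hdlr (Msg s) = print_endline ("lower layer received " ^ s) in

    (* The scheduler repeatedly picks a non-empty queue and runs the
       receiving layer's handler to completion on the head event.  Each
       queue is drained in order (FIFO delivery), and a handler always
       returns before another handler is invoked (no concurrency within
       a layer). *)
    let rec schedule () =
      if not (Queue.is_empty up_q) then
        (upper_up_hdlr (Queue.pop up_q); schedule ())
      else if not (Queue.is_empty dn_q) then
        (lower_dn_hdlr (Queue.pop dn_q); schedule ())
    in
    Queue.push (Msg "hello") up_q;                 (* inject an event going up *)
    schedule ()

  let () = run ()

A scheduler managing more layers is free to interleave work on different layers, or even run them in parallel, as long as it preserves the per-queue FIFO order and never runs two callbacks of the same layer at once.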