Heterochronous Dataflow

Researchers: Rachel Zhou
Advisor: Edward A. Lee

Synchronous dataflow (SDF) [2] is well suited to modeling dataflow systems with fixed structure, such as signal processing systems. Under the SDF model of computation, actors consume and produce a fixed number of tokens on each firing. A schedule can therefore be pre-computed at compile time, and questions such as deadlock and memory usage are decidable.
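As a concrete illustration (a minimal Python sketch, not Ptolemy II code), SDF scheduling rests on the balance equations: each edge where the source produces prod tokens and the destination consumes cons tokens per firing contributes the constraint q_src * prod = q_dst * cons, and the smallest positive integer solution q is the repetition vector from which a periodic schedule is built.

```python
from fractions import Fraction
from math import lcm

def repetition_vector(edges):
    """Solve the SDF balance equations for the repetition vector.

    edges: list of (src, prod, dst, cons) tuples, meaning actor `src`
    produces `prod` tokens and actor `dst` consumes `cons` tokens per
    firing on that edge.
    """
    # Build an undirected adjacency list carrying the firing-rate ratios.
    adj = {}
    for src, prod, dst, cons in edges:
        adj.setdefault(src, []).append((dst, Fraction(prod, cons)))
        adj.setdefault(dst, []).append((src, Fraction(cons, prod)))

    rates = {}
    for start in adj:
        if start in rates:
            continue
        rates[start] = Fraction(1)  # pick an arbitrary base rate per component
        stack = [start]
        while stack:
            a = stack.pop()
            for b, ratio in adj[a]:
                r = rates[a] * ratio
                if b not in rates:
                    rates[b] = r
                    stack.append(b)
                elif rates[b] != r:
                    # Inconsistent sample rates: the graph has no valid schedule.
                    raise ValueError("inconsistent rates: no valid schedule")

    # Scale the fractional rates to the smallest integer repetition vector.
    scale = lcm(*(r.denominator for r in rates.values()))
    return {a: int(r * scale) for a, r in rates.items()}

# Example chain: A produces 2 per firing, B consumes 1 and produces 1,
# C consumes 3. The minimal solution fires A 3x, B 6x, C 2x per iteration.
print(repetition_vector([("A", 2, "B", 1), ("B", 1, "C", 3)]))
```

Because the rates are compile-time constants, this computation happens once; the resulting vector fixes the buffer sizes and firing order for every iteration.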

However, many modern signal processing systems implement adaptive algorithms, which cause data rates to change during execution. The constraint of fixed rates in SDF limits its usefulness in such applications. In this project, we study heterochronous dataflow (HDF), a heterogeneous composition of SDF and finite state machines (FSMs) [1]. The semantics of HDF allow rate changes through state transitions of the FSM, while within each state the system behaves as an SDF model. HDF therefore retains many of the benefits of SDF while dramatically increasing expressiveness.
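A toy sketch of these semantics (not Ptolemy II code; the actor, its state names, and the guard condition are all invented for illustration): an actor's consumption rate is fixed by the current FSM state for an entire iteration, and only the transition taken at the iteration boundary can change it.

```python
class HdfDecimator:
    """Hypothetical HDF-style actor: in state "pass" it consumes 1 token
    per firing; in state "halve" it consumes 2 and emits 1 (decimation).
    Within one iteration the rate is constant, so the iteration behaves
    as plain SDF; the FSM transition between iterations changes the rate."""

    RATES = {"pass": 1, "halve": 2}  # tokens consumed per firing, per state

    def __init__(self):
        self.state = "pass"

    def iterate(self, tokens):
        """Run one complete iteration at the current state's fixed rate,
        then evaluate the FSM guard to (possibly) switch states."""
        rate = self.RATES[self.state]  # constant for the whole iteration
        n = len(tokens) - len(tokens) % rate  # whole firings only
        out = [tokens[i] for i in range(0, n, rate)]
        # Illustrative guard: decimate after observing a large sample.
        self.state = "halve" if out and max(out) > 10 else "pass"
        return out

dec = HdfDecimator()
print(dec.iterate([1, 2, 3]))        # state "pass": rate 1, passthrough
print(dec.iterate([20, 1, 4, 5]))    # guard fires; next iteration halves
print(dec.iterate([1, 2, 3, 4]))     # state "halve": rate 2
```

Because each state looks like ordinary SDF, a scheduler can pre-compute one schedule per reachable state configuration and merely select among them at run time, which is where HDF keeps most of SDF's static analyzability.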

Currently we are implementing HDF in Ptolemy II. Scheduling HDF involves intensive interaction between the dataflow models and the FSMs, especially in hierarchical HDF models. We will explore optimized scheduling algorithms for HDF while retaining compatibility with the basic SDF and FSM domains. We will further study how HDF can be used to model adaptive signal processing systems, along with its limitations and possible extensions. We will also study the relationship between HDF and other dataflow models such as Boolean dataflow (BDF) and dynamic dataflow (DDF).

[1] A. Girault, B. Lee, and E. A. Lee, "Hierarchical Finite State Machines with Multiple Concurrency Models", IEEE Transactions on Computer-aided Design of Integrated Circuits and Systems, Vol. 18, No. 6, June 1999.

[2] E. A. Lee and D. G. Messerschmitt, "Synchronous Data Flow", Proceedings of the IEEE, Vol. 75, No. 9, pp. 1235-1245, September, 1987.

Last updated 11/20/03