The Distributed-SDF Domain

This demo illustrates the usage and advantages of the Distributed-SDF domain.

The model is a standard SDF model with a Distributed-SDF Director. Any non-hierarchical SDF model can be simulated in a distributed manner, as long as its actors have no external dependencies.

(Note that actors that have GUI dependencies might generate errors!)

To try your own SDF models, simply replace the standard SDF Director with the Distributed-SDF Director.

By configuring the Distributed-SDF Director, you can choose to run the model either sequentially or in parallel (the default).
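For users who build models in Java rather than in Vergil, the sketch below shows the same idea: a tiny two-branch SDF model whose director is the Distributed-SDF Director instead of the standard SDFDirector. This is a minimal illustration, assuming the director class is ptolemy.distributed.domains.sdf.kernel.DistributedSDFDirector and that Sleep's sleepTime parameter is given in milliseconds; the parameter that switches between sequential and parallel execution is only hinted at in a comment, since its exact name should be taken from the director's documentation. The distributed platform must already be running for manager.execute() to succeed.

    import ptolemy.actor.Manager;
    import ptolemy.actor.TypedCompositeActor;
    import ptolemy.actor.lib.Discard;
    import ptolemy.actor.lib.Ramp;
    import ptolemy.actor.lib.Sleep;
    import ptolemy.distributed.domains.sdf.kernel.DistributedSDFDirector;

    public class DistributedSleepDemo {
        public static void main(String[] args) throws Exception {
            TypedCompositeActor top = new TypedCompositeActor();
            top.setName("DistributedSleepDemo");

            // The only change with respect to a standard SDF model:
            // use the Distributed-SDF Director instead of SDFDirector.
            DistributedSDFDirector director =
                    new DistributedSDFDirector(top, "DistributedSDFDirector");
            director.iterations.setExpression("2");
            // The director exposes a parameter to choose between sequential and
            // parallel execution (parallel is the default); see its documentation
            // for the exact parameter name before setting it here.

            // Two independent branches, each stalling 5 seconds per firing.
            Ramp ramp1 = new Ramp(top, "Ramp1");
            Sleep sleep1 = new Sleep(top, "Sleep1");
            sleep1.sleepTime.setExpression("5000L");
            Discard discard1 = new Discard(top, "Discard1");
            top.connect(ramp1.output, sleep1.input);
            top.connect(sleep1.output, discard1.input);

            Ramp ramp2 = new Ramp(top, "Ramp2");
            Sleep sleep2 = new Sleep(top, "Sleep2");
            sleep2.sleepTime.setExpression("5000L");
            Discard discard2 = new Discard(top, "Discard2");
            top.connect(ramp2.output, sleep2.input);
            top.connect(sleep2.output, discard2.input);

            Manager manager = new Manager(top.workspace(), "manager");
            top.setManager(manager);

            // The time printed here is what you would compare between a
            // sequential run and a parallel run.
            long start = System.currentTimeMillis();
            manager.execute();
            System.out.println("Execution time: "
                    + (System.currentTimeMillis() - start) + " ms");
        }
    }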

Make sure that the distributed platform is running before you start a distributed simulation. To do this, we provide a startup script that launches the required services and the distributed servers. If you run the DOS batch file, you will get 6 command windows: the first for the first 2 services, and the remaining 5, one for each server. This allows you to follow each server individually.

We provide 3 more scripts under $PTII/ptolemy/distributed/jini.

In order to illustrate the advantages of distributed simulation to users who only have access to a single machine, this demo uses Sleep actors. Every time a Sleep actor fires, it stalls for 5 seconds. In a sequential simulation this means a 10-second stall per iteration, since the two Sleep actors are executed one after the other. In a parallel simulation, those two actors can stall at the same time, reducing the stall to 5 seconds for both.
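More generally, ignoring communication and scheduling overhead, for N iterations and two Sleep actors that stall for t_1 and t_2 seconds per firing, the total stall time is roughly

    T_{\mathrm{seq}} \approx N\,(t_1 + t_2), \qquad T_{\mathrm{par}} \approx N\,\max(t_1, t_2)

so with t_1 = t_2 = 5 s and a single iteration this gives the 10 s versus 5 s figures above.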

On the standard output for the model, you can check the execution time of the simulation. Try running both a sequential and a parallel simulation and compare the times.

(Note that the more iterations and the longer the sleep times, the greater the advantage you will get from using the distributed platform.)

For a more realistic demo, please run the individual servers on different machines.

(Note that if your network blocks broadcast packets you might need to configure unicast locators.)
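For reference, unicast discovery in Jini is done with lookup locators rather than multicast requests. The fragment below is only a generic Jini illustration of that mechanism, not the actual Ptolemy configuration; the host name is hypothetical, and how the locators are handed to the Ptolemy distributed platform should be taken from the configuration files mentioned below.

    import net.jini.core.discovery.LookupLocator;
    import net.jini.discovery.LookupLocatorDiscovery;

    public class UnicastLocatorSketch {
        public static void main(String[] args) throws Exception {
            // Point directly at the machine that runs the Jini lookup service,
            // bypassing multicast (broadcast) discovery.
            LookupLocator[] locators = new LookupLocator[] {
                new LookupLocator("jini://lookup-host.example.org") // hypothetical host
            };
            LookupLocatorDiscovery discovery = new LookupLocatorDiscovery(locators);
            System.out.println("Using " + discovery.getLocators().length
                    + " unicast locator(s) for discovery.");
            discovery.terminate();
        }
    }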

Look at the documentation of the Distributed-SDF Director in particular, and at the classes and configuration files under ptolemy.distributed, for further information.

You can also contact the author with further questions at kapokasa@kom.aau.dk.