
A Framework for Reuse and Parallelization of Large-Scale Scientific Simulation Code

This paper presents a framework for reusing and parallelizing scientific simulation code, addressing the challenges of integrating legacy code and new code into larger simulations. The framework is applied to Laser Ablation experiments, demonstrating its effectiveness in improving execution timings and enabling parallel implementation. The framework follows a micro-kernel paradigm and utilizes PVM as a parallel library. Overall, the framework enhances code development, separates computation from communication, and improves software architecture in the scientific community.


Presentation Transcript


  1. A Framework for Reuse and Parallelization of Large-Scale Scientific Simulation Code Manolo E. Sherrill, Roberto C. Mancini, Frederick C. Harris, Jr., and Sergiu M. Dascalu, University of Nevada, Reno

  2. Introduction • Software design in the scientific community often fails due to the nature and lifespan of the code being developed. • Today we want to present a framework for reuse and parallelization of scientific simulation code. • Both legacy code and new code

  3. Introduction • What causes the difficulty? • New Code • Algorithms were implemented just to test if they would work – NOT to be used in production. • BUT if it works…. • It gets used and becomes Legacy Code

  4. Introduction • Once you reach this stage it becomes difficult to integrate these code pieces into larger simulations due to • name conflicts, • poor organization, • lack of consistent styles, • …

  5. Motivation • Laser Ablation experiments at the Nevada Terawatt Facility

  6. Motivation • Laser Ablation fits the description just provided: • Pieces of code that have been written once • Then used over and over again • And modified several times over that lifetime. • These modifications occur as new data arrives from the experimentalists.

  7. Laser Ablation • What is Laser Ablation? • It is the process of ablating material from a solid or liquid target • Typically with a low-intensity laser • 1×10⁷ W/cm² to 1×10¹⁰ W/cm²

  8. Laser Ablation • The laser deposits the bulk of its energy in the skin-depth region of the target • The material is heated, then undergoes melting, evaporation, and possible plasma formation

  9. Laser Ablation • The material in the gaseous state forms a plume that moves away from the target • The expansion proceeds at a few tens of mm/ns

  10. Laser Ablation

  11. Laser Ablation • Laser ablation is commonly used • In experimental physics as an ion source • In industry to generate thin films • Recently more exotic systems have entered this arena • Pico- and femtosecond lasers • The shorter pulses allow you to ablate a thinner layer, thereby producing a thinner film

  12. Framework Development • We started with sequential simulation code for Laser Ablation • We were moving to multi-element, multi-spatial zone simulations • Initial execution timings were disappointing, • No, …they were unacceptable. • Therefore a parallel implementation was pursued. • But major issues stopped us (as described previously)

  13. Framework Paradigm • Operating Systems • Monolithic • Micro-Kernel • Monolithic kernels face the same issues we have discussed in scientific code… it becomes hard to make fixes and additions without introducing more errors.

  14. Framework Paradigm • Micro-Kernels separate functionality into separate processes (as opposed to layers) • Inter-process communication happens through message passing via consistent interfaces. • The micro-kernel acts as a centralized point of connection and communication.
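
The slide above describes the micro-kernel idea in words; the following minimal C sketch (entirely hypothetical, with invented names such as Envelope and route) illustrates what a "consistent interface" can look like: every module exchanges the same fixed message envelope, so the central kernel can route traffic without knowing anything about module internals.

```c
/* Hypothetical sketch of a "consistent interface": every module sends
 * the same fixed envelope, so the kernel only reads the header. */
#include <stdio.h>

typedef struct {
    int    src;          /* sending module id          */
    int    dst;          /* receiving module id        */
    int    tag;          /* message type               */
    int    len;          /* number of payload doubles  */
    double payload[64];  /* fixed-size payload buffer  */
} Envelope;

/* The "kernel" inspects only the header and forwards the message. */
static void route(const Envelope *msg) {
    printf("route: %d -> %d (tag %d, %d values)\n",
           msg->src, msg->dst, msg->tag, msg->len);
}

int main(void) {
    Envelope e = { .src = 1, .dst = 2, .tag = 7, .len = 2 };
    e.payload[0] = 3.14;
    e.payload[1] = 2.71;
    route(&e);
    return 0;
}
```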

  15. Framework Paradigm • Micro-Kernels have issues and benefits • Can be slower – due to message passing • Can use RAM more efficiently – due to separate processes

  16. Implementation Overview • Though the low-level message passing used in micro-kernels is not appropriate for scientific programming, • it did prompt us to look at message passing, which leads us back to parallel programming.

  17. Implementation Overview • We used PVM as our parallel library to accommodate our implementation framework • Our framework • acts as a set of utilities • transfers data • manages data structures for accessing processes on local and remote nodes • coordinates I/O (a PVM sketch follows below)
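
As a concrete illustration of the kind of utility layer described above, here is a minimal PVM 3 sketch that spawns one worker task and sends it an array of doubles. The task name "worker", the message tag, and the data are placeholders rather than the paper's actual module names; the calls themselves (pvm_spawn, pvm_initsend, pvm_pkdouble, pvm_send) are the standard PVM 3 API.

```c
/* Minimal PVM 3 sketch: pack an array of doubles and send it to a
 * spawned worker task. Compile and link with -lpvm3. */
#include <stdio.h>
#include <pvm3.h>

#define DATA_TAG 1

int main(void) {
    double zone_data[3] = { 1.0, 2.0, 3.0 };
    int worker_tid;

    /* Enroll in PVM and spawn one worker on any available host. */
    pvm_mytid();
    if (pvm_spawn("worker", NULL, PvmTaskDefault, "", 1, &worker_tid) != 1) {
        fprintf(stderr, "spawn failed\n");
        pvm_exit();
        return 1;
    }

    /* Pack the data into a send buffer and ship it to the worker. */
    pvm_initsend(PvmDataDefault);
    pvm_pkdouble(zone_data, 3, 1);
    pvm_send(worker_tid, DATA_TAG);

    pvm_exit();
    return 0;
}
```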

  18. Framework Instantiation • It is important to note that except for the amount of data transferred between processes and for the topology of the implementation, the modules are independent of the physics code embedded in them. • It does allow us to separate the computation from the communication and thereby isolate the legacy code, making code development easier (see the sketch below).
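
A hedged sketch of this separation, using hypothetical names: the legacy physics routine sees only plain arrays, while all PVM communication lives in a thin wrapper around it. Nothing here is the paper's actual code; it only illustrates the isolation idea.

```c
/* Sketch of a framework module (child process): receive input from
 * the parent, call the untouched legacy routine, send the result back.
 * The legacy code never sees PVM at all. */
#include <pvm3.h>

#define IN_TAG  1
#define OUT_TAG 2
#define N       3

/* Stand-in for an untouched legacy routine: pure computation. */
static void legacy_compute(const double *in, double *out, int n) {
    for (int i = 0; i < n; i++)
        out[i] = 2.0 * in[i];   /* placeholder physics */
}

int main(void) {
    double in[N], out[N];

    pvm_mytid();

    /* Communication layer: receive input from the parent... */
    pvm_recv(pvm_parent(), IN_TAG);
    pvm_upkdouble(in, N, 1);

    /* ...call the isolated legacy code... */
    legacy_compute(in, out, N);

    /* ...and send the result back. */
    pvm_initsend(PvmDataDefault);
    pvm_pkdouble(out, N, 1);
    pvm_send(pvm_parent(), OUT_TAG);

    pvm_exit();
    return 0;
}
```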

  19. Framework Instantiation • The legacy code is represented by the circles • The triangles are the framework around them that handles the message passing • Left-pointing triangles are children • Right-pointing triangles are parents and spawn processes (a spawn sketch follows below)
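
The parent/child topology can be sketched in PVM roughly as follows; the task name "child", the tag, and the counts are illustrative assumptions. A parent (right-pointing triangle in the figure) spawns its children, then gathers one result from each.

```c
/* Hypothetical parent module: spawn NCHILD copies of a child task and
 * gather one result per child. */
#include <stdio.h>
#include <pvm3.h>

#define NCHILD     4
#define RESULT_TAG 2

int main(void) {
    int tids[NCHILD];
    double result;

    pvm_mytid();

    if (pvm_spawn("child", NULL, PvmTaskDefault, "", NCHILD, tids) != NCHILD) {
        fprintf(stderr, "could not spawn %d children\n", NCHILD);
        pvm_exit();
        return 1;
    }

    /* Gather one result per child; tid -1 accepts any sender. */
    for (int i = 0; i < NCHILD; i++) {
        pvm_recv(-1, RESULT_TAG);
        pvm_upkdouble(&result, 1, 1);
        printf("child result: %f\n", result);
    }

    pvm_exit();
    return 0;
}
```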

  20. Framework Instantiation • Once a self-consistent solution is found, data from the SEKM layer is forwarded to the Radiation Transport Module • Here, the data from each zone is combined into a complete synthetic spectrum and then compared with the experimental data

  21. Framework Instantiation • The comparison of synthetic data to the experimental data is done in an automated manner with a search engine • Thousands of data samples (temperature and density) are stored in a Parallel Q, which spawns Layer II members (a work-queue sketch follows below)
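
A speculative sketch of the "Parallel Q" pattern as a classic PVM master/worker queue: the master primes each Layer II member with one (temperature, density) sample and, as each figure-of-merit result returns, immediately hands that worker the next sample. All task names, tags, and values are invented for illustration.

```c
/* Hypothetical work-queue master: distribute (temperature, density)
 * samples to a pool of workers and refill each worker as it finishes. */
#include <pvm3.h>

#define NWORKERS   4
#define NSAMPLES   1000
#define WORK_TAG   1
#define RESULT_TAG 2

static void send_sample(int tid, int idx) {
    double sample[2] = { 10.0 + idx * 0.1,     /* temperature (illustrative) */
                         1e18 + idx * 1e15 };  /* density (illustrative)     */
    pvm_initsend(PvmDataDefault);
    pvm_pkdouble(sample, 2, 1);
    pvm_send(tid, WORK_TAG);
}

int main(void) {
    int tids[NWORKERS], next = 0;
    double merit;

    pvm_mytid();
    pvm_spawn("layer2_member", NULL, PvmTaskDefault, "", NWORKERS, tids);

    /* Prime every worker with one sample. */
    for (int i = 0; i < NWORKERS && next < NSAMPLES; i++)
        send_sample(tids[i], next++);

    /* As results return, hand the sender the next sample. */
    for (int done = 0; done < NSAMPLES; done++) {
        int bufid = pvm_recv(-1, RESULT_TAG);
        int bytes, tag, sender;
        pvm_bufinfo(bufid, &bytes, &tag, &sender);
        pvm_upkdouble(&merit, 1, 1);
        if (next < NSAMPLES)
            send_sample(sender, next++);
    }

    pvm_exit();
    return 0;
}
```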

  22. Framework Instantiation • 4 Zone synthetic spectra compared with experimental data

  23. Framework Instantiation • 6 Zone synthetic spectra compared with experimental data

  24. Conclusions • We have presented the motivation for a change in software architecture in the physics community. • This change has resulted in easier maintenance and increased performance of simulation codes • It has allowed protection and encapsulation of existing legacy code, while also allowing parallelization of that same code

  25. Conclusions • This framework is also beneficial for a variety of reasons: • New experimental data requires changes to only a single module • Physics components can be tested and modified individually before adding them to a larger simulation (even components that were not originally planned for) • It also allows us to change the topology and bring in parallel processing quite easily • It keeps legacy codes separate, thereby providing some protection

  26. Future Work • Applying this framework to other physics codes • Thereby effectively re-using legacy codes, particularly on larger machines • Integration of other groups' code models without having access to the source • A BIG issue in the physics community.
