
Processing Module Development


Presentation Transcript


  1. Processing Module Development Rasmus Munk Larsen, Stanford University rmunk@quake.stanford.edu 650-725-5485

  2. Overview
  • New challenges compared to MDI
  • Module status and MDI heritage
  • Module structure and development strategy
  • Community contributions and collaboration

  3. HMI Data Processing Pipeline
  • New challenges compared to MDI:
    • Real-time and reliability requirements on ground-based observable calculations
    • Level 0-1 of the pipeline (telemetry capture, observable calculations) must support instrument ground testing
    • Real-time requirements on high-level data products for space weather etc. demand dual pipeline paths:
      • Fast algorithms for preliminary ("quick look") data products
      • Slower, more accurate processing for definitive calibrated data products
    • Pipeline-mode generation of high-level data products
    • Automatic on-demand generation of data products, e.g., triggered by VSO queries
      • Requires improved traceability, in particular when using evolving research codes (see the sketch after this slide)
    • Vastly increased data volume
    • New computationally intensive data analyses, such as time-distance analysis, far-side imaging, and vector magnetogram inversion
    • Vector magnetic data products and processing
  • Ameliorating circumstances:
    • Large body of knowledge and software from MDI, GONG & other projects
    • Moore's law (disk density, networking bandwidth, computing power)
    • Maturing computing infrastructures (Web technologies, Grids, software tools)
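  A minimal sketch of the quick-look versus definitive distinction and the traceability requirement above: one way a data product could be tagged with its quality level and with provenance (code version, configuration, timestamp). All type, field, and value names here are invented for illustration and are not taken from the actual HMI pipeline code.

    /* Hypothetical sketch only: tag a data product as quick-look or
     * definitive and record provenance for traceability.  Names are
     * assumptions, not the real HMI pipeline interfaces. */
    #include <stdio.h>
    #include <time.h>

    typedef enum { QUALITY_QUICKLOOK, QUALITY_DEFINITIVE } product_quality_t;

    typedef struct {
        product_quality_t quality;   /* fast preliminary vs. definitive calibrated */
        char module_version[32];     /* version of the processing module used */
        char config_id[64];          /* identifier of the calibration/config set */
        time_t processed_at;         /* when the product was generated */
    } product_provenance_t;

    /* Stamp provenance so a product can later be traced back to the exact
     * code version and configuration that produced it. */
    static void stamp_provenance(product_provenance_t *p,
                                 product_quality_t quality,
                                 const char *version, const char *config)
    {
        p->quality = quality;
        snprintf(p->module_version, sizeof p->module_version, "%s", version);
        snprintf(p->config_id, sizeof p->config_id, "%s", config);
        p->processed_at = time(NULL);
    }

    int main(void)
    {
        product_provenance_t prov;
        stamp_provenance(&prov, QUALITY_QUICKLOOK, "0.1-example", "calib-set-example");
        printf("quality=%d version=%s config=%s\n",
               prov.quality, prov.module_version, prov.config_id);
        return 0;
    }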

  4. Module status and MDI heritage
  [Slide shows a data-flow diagram, flattened in this transcript. Code-status legend: instrument-specific code, Stanford is primary developer; research code exists in the community; new code under development (HAO); research code currently used; standalone "production" code routinely used; MDI pipeline modules exist.]
  • Primary observables: Doppler velocity; Stokes I,V; Stokes I,Q,U,V; continuum brightness
  • Intermediate and high-level data products: heliographic Doppler velocity maps; spherical harmonic time series; mode frequencies and splittings; internal rotation; internal sound speed; ring diagrams; local wave frequency shifts; full-disk velocity and sound-speed maps (0-30 Mm); Carrington synoptic v and cs maps (0-30 Mm); time-distance cross-covariance function; tracked tiles of Dopplergrams; wave travel times; high-resolution v and cs maps (0-30 Mm); egression and ingression maps; wave phase-shift maps; deep-focus v and cs maps (0-200 Mm); far-side activity index; line-of-sight magnetograms; line-of-sight magnetic field maps; full-disk 10-min averaged maps; vector magnetograms (fast algorithm); vector magnetograms (inversion algorithm); vector magnetic field maps; coronal magnetic field extrapolations; tracked tiles; tracked full-disk 1-hour averaged continuum maps; coronal and solar wind models; solar limb parameters; brightness feature maps; brightness images

  5. Module structure
  [Slide shows a block diagram: a computational engine (C/Fortran/IDL/Matlab/…) wrapped on both sides by the PE API. Inputs: input data, meta data, and module-specific parameters. Outputs: the data product and updated meta data*. The DSDS provides storage management, data archiving/logging, and catalogue updating.]
  • Design tasks:
    • Identify intermediate and high-level data products desired by the research community
    • Establish top-level data flow and interface specs to:
      • Isolate module development from pipeline infrastructure (see the sketch after this slide)
      • Allow flexibility for evolving techniques (research codes)
    • Develop/import computational engines in the HMI environment, and verify:
      • Correctness (test suites)
      • Performance requirements (algorithm improvement, code tuning)
      • Traceability and reproducibility (*: version & configuration info in meta data)
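  A minimal sketch of this module structure, assuming a hypothetical PE API: the wrapper isolates a plain C computational engine from the pipeline infrastructure and records version/configuration info in the updated metadata. The names pe_record_t, pe_keyword_t, engine_compute, and module_main are invented for illustration and are not the actual HMI/DSDS interfaces.

    /* Hypothetical sketch: pipeline hands the module input data, metadata,
     * and module-specific parameters; the module calls its engine and
     * returns the data product plus updated metadata. */
    #include <stdio.h>
    #include <stdlib.h>

    typedef struct { double *pixels; long n; } pe_record_t;            /* one data record */
    typedef struct { const char *key; const char *value; } pe_keyword_t;

    /* Computational engine proper: pure C, no knowledge of the pipeline. */
    static int engine_compute(const pe_record_t *in, pe_record_t *out, double gain)
    {
        out->n = in->n;
        out->pixels = malloc((size_t)out->n * sizeof *out->pixels);
        if (!out->pixels) return -1;
        for (long i = 0; i < out->n; i++)
            out->pixels[i] = gain * in->pixels[i];   /* placeholder computation */
        return 0;
    }

    /* Module wrapper: isolates the engine from storage/cataloguing so the
     * engine (possibly evolving research code) can be swapped out freely. */
    int module_main(const pe_record_t *input, pe_record_t *product,
                    pe_keyword_t *meta_out, double gain_param)
    {
        if (engine_compute(input, product, gain_param) != 0)
            return -1;
        /* record version/configuration info in the updated metadata (*) */
        meta_out[0].key = "MODULE_VERSION";  meta_out[0].value = "0.1-example";
        meta_out[1].key = "GAIN_PARAM";      meta_out[1].value = "see config";
        return 0;
    }

    int main(void)
    {
        double in_pixels[] = { 1.0, 2.0, 3.0 };
        pe_record_t input = { in_pixels, 3 }, product = { 0, 0 };
        pe_keyword_t meta[2];
        if (module_main(&input, &product, meta, 2.0) == 0) {
            printf("first output pixel: %f (%s=%s)\n",
                   product.pixels[0], meta[0].key, meta[0].value);
            free(product.pixels);
        }
        return 0;
    }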

  6. Community contributions and collaboration
  • Contributions from co-I teams:
    • Software for intermediate and high-level analysis modules
    • Algorithm descriptions (detailed enough to understand the contributed code)
    • Test data and intended results for verification (see the sketch after this slide)
    • Time:
      • Explain algorithms and implementation
      • Help with verification
      • Collaborate on improvements if required (e.g., performance or maintainability)
  • Contributions from the HMI team:
    • Pipeline execution environment
    • Software resources (development environment, libraries, tools)
    • Time:
      • Collaborate on defining the module interface
      • Help with porting code to target hardware
      • Collaborate on algorithmic improvements, code tuning, parallelization
      • Verification
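  A minimal sketch of how contributed test data and intended results could drive verification: compare the module's output against the intended result within a tolerance. The helper function and the sample values are hypothetical, not part of any contributed code.

    /* Hypothetical verification sketch: count output values that differ
     * from the contributed reference by more than a tolerance. */
    #include <math.h>
    #include <stdio.h>

    static long verify(const double *computed, const double *reference,
                       long n, double tol)
    {
        long mismatches = 0;
        for (long i = 0; i < n; i++)
            if (fabs(computed[i] - reference[i]) > tol)
                mismatches++;
        return mismatches;
    }

    int main(void)
    {
        /* tiny stand-in for contributed test data and intended results */
        double computed[]  = { 1.0000, 2.0001, 3.0000 };
        double reference[] = { 1.0000, 2.0000, 3.0000 };
        long bad = verify(computed, reference, 3, 1e-3);
        printf("%ld values outside tolerance\n", bad);
        return bad == 0 ? 0 : 1;
    }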
