Mike Folk - Academia.edu

Papers by Mike Folk

Research paper thumbnail of A Mathematical Typesetting System

APL Quote Quad, 1980

A translator was written in APL to convert linear expressions into two-dimensional mathematical expressions, which are typed on a versatile hard-copy terminal. The result is a composite of several small rectangles, each containing a part of the overall picture. The language could be embedded within a text editor for writing documents with many mathematical expressions.
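
The composition scheme the abstract describes, building a picture out of small rectangles, can be sketched in a few lines. The example below is in C rather than the paper's APL, handles only single-line operands, and all names and limits are its own assumptions; it stacks a numerator box over a denominator box to render a fraction.

```c
/* Minimal sketch (in C; the paper's implementation was in APL) of the
 * rectangle-composition idea: each subexpression occupies a small box
 * of characters, and a fraction stacks the numerator box over the
 * denominator box with a rule line between them.  One-line operands
 * only; all names and limits are this sketch's own. */
#include <stdio.h>
#include <string.h>

enum { MAXW = 32, MAXH = 8 };
typedef struct { int w, h; char cell[MAXH][MAXW + 1]; } Box;

static Box leaf(const char *s) {
    Box b = {0};
    b.w = (int)strlen(s);
    b.h = 1;
    strcpy(b.cell[0], s);
    return b;
}

static Box frac(Box num, Box den) {          /* num, den must be 1 line */
    Box r = {0};
    r.w = num.w > den.w ? num.w : den.w;
    r.h = 3;
    for (int y = 0; y < r.h; y++) {
        memset(r.cell[y], ' ', (size_t)r.w);
        r.cell[y][r.w] = '\0';
    }
    memcpy(r.cell[0] + (r.w - num.w) / 2, num.cell[0], (size_t)num.w);
    memset(r.cell[1], '-', (size_t)r.w);      /* the fraction rule */
    memcpy(r.cell[2] + (r.w - den.w) / 2, den.cell[0], (size_t)den.w);
    return r;
}

int main(void) {
    Box f = frac(leaf("a+b"), leaf("2"));     /* prints a 2-D fraction */
    for (int y = 0; y < f.h; y++) puts(f.cell[y]);
    return 0;
}
```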

Research paper thumbnail of Section IV Review Questions

Research paper thumbnail of File structures

From the Publisher: Based on the bestselling File Structures, Second Edition, this book takes an object-oriented approach to the study of file structures. It allows students and professionals to acquire the fundamental tools needed to design intelligent, cost-effective, and appropriate solutions to file structure problems. The book begins by presenting the software and hardware characteristics that combine to make file structure design important to application development. It continues with a thorough treatment of the tools that support effective use of files for storing and retrieving information. This book teaches design by putting the hands-on work of constructing and running programs at the center of the learning process. By following the many programming examples included in the book and in the exercise sets, readers will gain a significant understanding of object-oriented techniques and will see how C++ can be an effective software development tool.

Research paper thumbnail of PS: a procedure simulator for dynamic program visualization

Research paper thumbnail of The educational significance of the future

Research paper thumbnail of Atmospheric Sciences and Climate Applications Using HDF and HDF 5

Since 1998, NCSA has developed a more general and robust data format, called HDF5, which was designed to support the future demands of Earth Science, such as large data storage, performance, and flexibility. The NCSA HDF group encourages applications to use HDF5 for better performance and maintenance. See extended abstract P2.42 for more information about HDF5. In summary, HDF5 has the following features that make its use attractive for many important atmospheric science applications (a brief illustration follows the list):
1) Flexible data model
2) Open-source, free software
3) Portability
4) Emphasis on performance:
• MPI-IO support
• Improved subsetting
• In-memory compression
• Alternative storage models
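
As a brief illustration of the chunking and in-library compression features above, the following hedged sketch creates a chunked, gzip-compressed dataset with the HDF5 C API; the file name, dataset name, and sizes are invented for the example.

```c
/* Hedged sketch: a chunked, compressed HDF5 dataset via the HDF5 C
 * API.  All names and dimensions here are illustrative. */
#include <hdf5.h>

int main(void) {
    hid_t file = H5Fcreate("climate.h5", H5F_ACC_TRUNC,
                           H5P_DEFAULT, H5P_DEFAULT);

    hsize_t dims[2]  = {720, 1440};   /* e.g. a 0.25-degree global grid */
    hsize_t chunk[2] = {90, 180};     /* chunking enables subsetting */
    hid_t space = H5Screate_simple(2, dims, NULL);

    /* Dataset-creation property list: chunk layout plus gzip filter. */
    hid_t dcpl = H5Pcreate(H5P_DATASET_CREATE);
    H5Pset_chunk(dcpl, 2, chunk);
    H5Pset_deflate(dcpl, 6);          /* in-library compression, level 6 */

    hid_t dset = H5Dcreate2(file, "temperature", H5T_NATIVE_FLOAT,
                            space, H5P_DEFAULT, dcpl, H5P_DEFAULT);

    H5Dclose(dset);
    H5Pclose(dcpl);
    H5Sclose(space);
    H5Fclose(file);
    return 0;
}
```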

Research paper thumbnail of A Hierarchical Grouping Convention for FITS

This paper describes a grouping convention for FITS that facilitates the construction of hierarchical associations of Header Data Units (HDUs). The grouping convention uses FITS table structures (ASCII or binary) to encapsulate pertinent information about the HDUs belonging to a group. Group members may reside in a single FITS file or be distributed in many FITS files; the FITS files themselves may reside on different computer systems.
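
The CFITSIO library provides routines implementing this grouping convention. Below is a hedged sketch of creating a grouping table and attaching one member HDU; the file and group names are hypothetical, and error handling is reduced to a final status report.

```c
/* Hedged sketch: a FITS grouping table built with CFITSIO's grouping
 * routines.  File names and the group name are placeholders. */
#include <stdio.h>
#include "fitsio.h"

int main(void) {
    fitsfile *gfptr = NULL, *mfptr = NULL;
    int status = 0;

    /* Create a new FITS file to hold the grouping table HDU;
     * the leading '!' tells CFITSIO to overwrite an existing file. */
    fits_create_file(&gfptr, "!group.fits", &status);

    /* Create a grouping table; GT_ID_ALL_URI records both identity
     * and location columns for each member HDU. */
    fits_create_group(gfptr, "OBSERVATION", GT_ID_ALL_URI, &status);

    /* Register the current HDU of a member file in the group; members
     * may live in other FITS files, even on other systems. */
    fits_open_file(&mfptr, "member.fits", READONLY, &status);
    fits_add_group_member(gfptr, mfptr, 0, &status);

    fits_close_file(mfptr, &status);
    fits_close_file(gfptr, &status);
    if (status) fits_report_error(stderr, status);
    return status;
}
```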

Research paper thumbnail of Java (TM) applications using NCSA HDF files

Concurrency: Practice and Experience, 1997

HDF is a library and file format for storing and exchanging scientific data. In the absence of a Java (TM) implementation of the HDF library, Java applications that need to directly access data in HDF files require a Java interface to the native code HDF library. This document describes the design of the Java interface for the HDF library and one prototype application – the Java HDF Viewer (JHV). Performance issues discussed in this paper appear to be inevitable consequences of the technologies that provide the portability and security of Java. The performance penalties are not significant for many applications, but become very significant for data-intensive computing. Just-in-time compiler technology will probably have little impact on the data copying and memory management issues described in this paper. © 1997 John Wiley & Sons, Ltd.
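
The data-copying overhead the abstract attributes to Java's portability model arises at the boundary between the JVM and native code. Below is a hedged sketch of what the native (C) side of such a binding can look like for HDF4's Hopen call; the Java package and class names are assumptions of this sketch, not necessarily the paper's actual interface.

```c
/* Hedged sketch of the C side of a JNI binding to the HDF4 library.
 * The Java class name (ncsa.hdf.HDFLibrary) is assumed for
 * illustration.  Requires the HDF4 and JDK headers. */
#include <jni.h>
#include "hdf.h"   /* HDF4: Hopen, int32, intn */

JNIEXPORT jlong JNICALL
Java_ncsa_hdf_HDFLibrary_Hopen(JNIEnv *env, jclass cls,
                               jstring path, jint access)
{
    /* GetStringUTFChars may copy the Java string into a C buffer --
     * the kind of unavoidable data copying the paper discusses. */
    const char *cpath = (*env)->GetStringUTFChars(env, path, NULL);
    int32 fid = Hopen(cpath, (intn)access, 0);
    (*env)->ReleaseStringUTFChars(env, path, cpath);
    return (jlong)fid;   /* HDF file id, or FAIL (-1) on error */
}
```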

Research paper thumbnail of Reuse Readiness Levels (RRLs)

Objective: Recognizing the need to measure the maturity of software for reuse, the NASA Earth Science Data Systems (ESDS) Software Reuse Working Group (WG) proposes a set of Reuse Readiness Levels (RRLs). The maturity of a particular technology can be measured in various ways, one common method being with Technology Readiness Levels (TRLs) or other similar measurements. However, the ability or readiness of a particular technology to be reused is generally not considered, or plays only a small role if it is considered.

Research paper thumbnail of Libraries and Tools for efficiently computing and analyzing NASA Earth Science Data with HDF and HDF5

The Hierarchical Data Format (HDF) developed at the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign has become the primary standard file format for storing data from NASA's Earth Observing System (EOS). Data from Landsat 7 and Terra (including the CERES, MODIS and MISR instruments) are all stored in HDF and/or HDF-EOS. HDF5 is a new ...
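
For context on the HDF-EOS library this abstract refers to, here is a hedged sketch of reading one field from an HDF-EOS2 grid file with the HDF-EOS2 C API; the file, grid, and field names are placeholders, not a real data product.

```c
/* Hedged sketch: reading one field from an HDF-EOS2 grid file.
 * File, grid, and field names are invented for illustration. */
#include <stdio.h>
#include "mfhdf.h"
#include "HdfEosDef.h"   /* HDF-EOS2: GDopen, GDattach, GDreadfield */

int main(void) {
    int32 gdfid = GDopen("grid_example.hdf", DFACC_READ);
    if (gdfid == FAIL) { fprintf(stderr, "GDopen failed\n"); return 1; }

    int32 gid = GDattach(gdfid, "ExampleGrid");   /* grid name: assumed */
    if (gid != FAIL) {
        static float32 buf[180][360];
        /* NULL start/stride/edge reads the entire field. */
        if (GDreadfield(gid, "Temperature", NULL, NULL, NULL, buf) == FAIL)
            fprintf(stderr, "GDreadfield failed\n");
        GDdetach(gid);
    }
    GDclose(gdfid);
    return 0;
}
```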

Research paper thumbnail of Java™ Applications Using NCSA HDF Files

HDF is a library and file format for storing and exchanging scientific data. In the absence of a Java (TM) implementation of the HDF library, Java applications that need to directly access data in HDF files require a Java interface to the native code HDF library. This document describes the design of the Java interface for the HDF library and one prototype application, the Java HDF Viewer (JHV). Performance issues discussed in this paper appear to be inevitable consequences of the technologies that provide the portability and security of Java. The performance penalties are not significant for many applications, but become very significant for data-intensive computing. Just-in-time compiler technology will probably have little impact on the data copying and memory management issues described in this paper. 1 Introduction: The NCSA Hierarchical Data Format (HDF) is a library and platform-independent data format for the storage and exchange of scientific data [3]. HDF is designed to mana...

Research paper thumbnail of Data Formats

Research paper thumbnail of MEAD (A Modeling Environment for Atmospheric Discovery)

Bulletin of the American Meteorological Society, 2004

Participating institutions: National Center for Supercomputing Applications, University of Illinois at Urbana-Champaign, Illinois; Center for Analysis and Prediction of Storms, University of Oklahoma, Norman, Oklahoma; Rice University, Houston, Texas; Indiana University, Bloomington, Indiana; University of Alabama in Huntsville, Huntsville, Alabama; Rutgers University, New Brunswick, New Jersey; Lawrence Berkeley National Laboratory, Berkeley, California; Georgia Institute of Technology, Atlanta, Georgia; University of Minnesota, Minneapolis, Minnesota; Portland State University, Portland, Oregon; University of Illinois at Urbana-Champaign, Illinois; Pacific Marine Environmental Laboratory, Seattle, Washington; University of Miami, Miami, Florida

Research paper thumbnail of A Critical Look at the Cross Impact Matrix Method. A Research Report

This paper explains some of the problems with, and their importance to the application of, the Cross-Impact Matrix (CIM). The CIM is a research method designed to serve as a heuristic device to enhance a person's ability to think about the future and as an analytical device to be used by planners to help in actually forecasting future occurrences. The author makes no judgment about CIM's usefulness as a heuristic device, but he does fault it as an analytical methodology. He partitions the analytical problems into two categories: (1) theoretical, including questions about underlying assumptions of the model, the meaning of inputs, the ability of experts to perceive accurately, and the ability of mathematics to handle the inputs; and (2) practical, including invalid mathematical formulae and questionable use of simulation techniques. Related documents are EA 004 239 and EA 004 241.

Research paper thumbnail of A Critique of Some Futures Prediction Techniques and Their Implications for Educational Planners

Research paper thumbnail of Easy Access of EOSDIS HDF data via OPeNDAP and Other Tools

HDF is a set of data formats and software libraries for storing scientific data with an emphasis on standards, storage, and I/O efficiency. The HDF-EOS version 2 (HDF-EOS2) profile and library, built on top of HDF version 4 (HDF4), define and implement the standard data format for the NASA Earth Science Data and Information System (ESDIS). More than five hundred data products in NASA data centers are stored in HDF4 and HDF-EOS2. HDF5 is a newer data format that has been embraced as an important format for Earth science; HDF-EOS5, which is built on top of HDF5, is the primary data format for data from the EOS Aura satellite. The OPeNDAP Data Access Protocol (DAP) and its related software have emerged as important components of the earth science data system infrastructure, and the OPeNDAP protocol is widely used to remotely access earth science data. Several third-party visualization and analysis tools that can read data from OPeNDAP servers, such as IDV, Panoply, GrADS, Ferret, NCL, MATLAB, and IDL, are widely used by earth scientists, researchers, and educators to access HDF earth science data; IDL, MATLAB, and NCL are also widely used to access HDF data locally. Making HDF4, HDF5, and HDF-EOS data easily accessible to these tools, whether locally or via an OPeNDAP client, greatly reduces the time HDF users spend getting data into their favorite visualizations and so improves their working efficiency. In the past three years, with the support of the NASA ESDIS and ACCESS projects, The HDF Group implemented the HDF4-OPeNDAP and HDF5-OPeNDAP data handlers so that many NASA HDF and HDF-EOS Swath and Grid data products can be accessed via OPeNDAP by widely used visualization and analysis tools such as IDV, Panoply, GrADS, Ferret, NCL, and IDL. We also provided comprehensive example code and plots showing how to use MATLAB, IDL, and NCL to access NASA HDF-EOS and HDF data products distributed by various NASA data centers. We will demonstrate how one can easily access HDF and HDF-EOS data via OPeNDAP and other tools; a minimal client sketch follows.
(Figure: The Global Atmospheric Temperature from an HDF-EOS2 AIRS Grid File)
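
One simple client route for the remote access described above is the netCDF-C library, which can open OPeNDAP URLs directly when built with DAP support. The sketch below is illustrative only; the URL and variable name are placeholders, not a real NASA endpoint.

```c
/* Hedged sketch: remote access to an OPeNDAP-served dataset through
 * netCDF-C (compiled with DAP support).  URL and variable name are
 * hypothetical placeholders. */
#include <stdio.h>
#include <netcdf.h>

int main(void) {
    int ncid, varid, rc;

    /* nc_open accepts an OPeNDAP URL when DAP support is enabled. */
    rc = nc_open("http://example.org/opendap/AIRS_sample.hdf",
                 NC_NOWRITE, &ncid);
    if (rc != NC_NOERR) { fprintf(stderr, "%s\n", nc_strerror(rc)); return 1; }

    rc = nc_inq_varid(ncid, "Temperature_A", &varid);  /* assumed name */
    if (rc == NC_NOERR) {
        float val;
        size_t idx[3] = {0, 0, 0};     /* first element of a 3-D field */
        if (nc_get_var1_float(ncid, varid, idx, &val) == NC_NOERR)
            printf("first value: %g\n", val);
    }
    nc_close(ncid);
    return 0;
}
```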

Research paper thumbnail of Towards Long-Term Archiving of NASA HDF-EOS and HDF Data - Data Maps and the Use of Mark-Up language

The Hierarchical Data Format (HDF) has been a data format standard in NASA's Earth Observing System Data and Information System (EOSDIS) since the 1990s. Its rich structure, platform independence, full-featured Application Programming Interface (API), and internal compression make it very useful for archiving science data and utilizing them with a rich set of software tools. However, a key drawback for long-term archiving is the complex internal byte layout of HDF files, requiring one to use the API to access HDF data. This ...

Research paper thumbnail of File Structures: A Conceptual Toolkit

From the Publisher: Based on the bestselling File Structures, Second Edition, this book takes an object-oriented approach to the study of file structures. It allows students and professionals to acquire the fundamental tools needed to design intelligent, cost-effective, and appropriate solutions to file structure problems. The book begins by presenting the software and hardware characteristics that combine to make file structure design important to application development. It continues with a thorough treatment of the tools that support effective use of files for storing and retrieving information. This book teaches design by putting the hands-on work of constructing and running programs at the center of the learning process. By following the many programming examples included in the book and in the exercise sets, readers will gain a significant understanding of object-oriented techniques and will see how C++ can be an effective software development tool.

Research paper thumbnail of Data Models for Storage and Retrieval

Data Integrative Approaches in Computation, Analysis, and Modeling, 2005

Research paper thumbnail of Investigation of IO Performance in a Weather Model

WRF (Weather Research and Forecasting Model) is a limited-area weather model that has been used intensively for both weather research and prediction. Because of its heavy computational load, WRF uses a multi-layer domain decomposition method to run in parallel supercomputing environments. Although several performance metrics exist for measuring overall model behavior, I/O performance has never been thoroughly evaluated. The Hierarchical Data Format (HDF) developed at the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign has become the primary standard file format for storing data from NASA's Earth Observing System (EOS). Since 1999, NCSA has developed a more general and robust data format, called HDF5, designed to support the future demands of Earth Science. HDF5 provides chunked storage of data, which improves performance. HDF5 also hooks easily into external compression packages, making data storage more efficient and flexible. Furthermore, HDF5 supports the Message Passing Interface I/O (MPI-IO) standard, which is capable of performing I/O efficiently in parallel computing environments. Currently NCSA is developing an implementation of WRF sequential and parallel I/O modules that read and write HDF5 datasets; the modules will be available to anyone using WRF as a new option for I/O. In this poster, we report three case studies comparing the performance of the netCDF I/O module, the sequential HDF5 I/O module, and the parallel HDF5 I/O module. In other work, Unidata and NCSA are collaborating to design netCDF4, a new netCDF built on top of HDF5. We hope the current study will provide some insights for scientists, researchers, and developers choosing the best file I/O for their computational applications. For more information:
HDF Home Page: http://hdf.ncsa.uiuc.edu
WRF Home Page: http://wrf-model.org
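
The parallel HDF5 I/O module described above rests on HDF5's MPI-IO file driver. Here is a hedged, generic sketch of that setup, not WRF's actual module code; the output file name is a placeholder.

```c
/* Hedged sketch: opening an HDF5 file for parallel MPI-IO access, the
 * mechanism a parallel HDF5 I/O module builds on.  Generic
 * illustration only; the file name is invented. */
#include <mpi.h>
#include <hdf5.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    /* File-access property list routing HDF5 I/O through MPI-IO. */
    hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);
    H5Pset_fapl_mpio(fapl, MPI_COMM_WORLD, MPI_INFO_NULL);

    /* All ranks open the same file collectively. */
    hid_t file = H5Fcreate("wrf_out.h5", H5F_ACC_TRUNC, H5P_DEFAULT, fapl);

    /* ... each rank would then write its subdomain, typically with a
     * collective H5Dwrite using the H5FD_MPIO_COLLECTIVE mode ... */

    H5Fclose(file);
    H5Pclose(fapl);
    MPI_Finalize();
    return 0;
}
```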
