George Ostrouchov - Academia.edu
Papers by George Ostrouchov
2017 IEEE International Conference on Cluster Computing (CLUSTER)
Concurrency and Computation: Practice and Experience
Proceedings of the XSEDE16 Conference on Diversity, Big Data, and Science at Scale (XSEDE16), 2016
Simulation plays a big role in understanding the behavior of building envelopes. With the increasing availability of computational resources, it is feasible to conduct parametric simulations for applications such as software model calibration, building control optimization, or fault detection and diagnostics. In this paper, we present an uncertainty exploration of two types of buildings: (a) the thermal conductivity properties of the envelope of a heavily instrumented residential building involving more than 200 sensors, and (b) a sensitivity analysis of a stand-alone retail building from the U.S. Department of Energy's reference model. For the residential building, 156 input parameters were identified as important by experts and varied using a Markov Order process, generating hundreds of gigabytes of data across tens of thousands of simulations. For the commercial building, 20 parameters were varied using a fractional factorial design requiring just 1024 simulations and generating on the order of a few hundred megabytes of data. These cases span a wide range of ensemble sizes, from a few simulations to tens of thousands. Depending on the number of simulations in an ensemble, the techniques needed to make sense of the results can differ substantially and can be challenging to apply. Additionally, the method of analysis almost always depends on the experimental design. The Markov Order sampling strategy and the fractional factorial design presented here represent two approaches to large-scale sensitivity analysis of buildings at two different scales of simulation. The paper presents the analysis using descriptive statistics as well as multiple analysis-of-variance techniques for comparison and contrast.
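To make the fractional factorial idea concrete, here is a minimal sketch (not the design used in the paper): it builds a small two-level 2^(5-2) design by hand, in the same spirit as the 20-parameter, 1024-run (2^10) design described above, and estimates main effects from a hypothetical simulation response. The factor names, the generators D = AB and E = AC, and the toy response function are illustrative assumptions.

```python
# Minimal sketch of a two-level fractional factorial design: 5 factors in 8 runs,
# built from a full factorial in three base factors plus two generator columns.
# The generators, factor names, and response function are illustrative assumptions.
import itertools
import numpy as np

# Full factorial in the three base factors A, B, C (coded levels -1 / +1).
base = np.array(list(itertools.product([-1, 1], repeat=3)))
A, B, C = base[:, 0], base[:, 1], base[:, 2]

# Generator columns alias the remaining factors with interactions of the base factors.
D = A * B          # generator D = AB
E = A * C          # generator E = AC
design = np.column_stack([A, B, C, D, E])   # 8 runs x 5 factors

# Hypothetical simulation output at each design point (e.g., annual energy use).
rng = np.random.default_rng(0)
true_effects = np.array([3.0, -1.5, 0.5, 0.0, 2.0])
y = design @ true_effects + rng.normal(scale=0.2, size=len(design))

# Estimated main effect of each factor: mean response at +1 minus mean at -1.
for name, col in zip("ABCDE", design.T):
    effect = y[col == 1].mean() - y[col == -1].mean()
    print(f"main effect of {name}: {effect:+.2f}")
```

In a design of this resolution, each estimated main effect is aliased with interactions of other factors, which is one reason the choice of analysis method depends so strongly on the experimental design.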
SIAM Journal on Matrix Analysis and Applications, 2009
SciDAC Review, May 29, 2008
The SciDAC Visualization and Analytics Center for Enabling Technologies (VACET) is a highly productive effort combining the forces of leading visualization researchers from five different institutions to solve some of the most challenging data understanding problems in modern science. ...
Statistical analyses of data from epidemiologic studies of workers exposed to radiation have been based on recorded annual radiation doses (yearly dose of record). It is usually assumed that the dose values are known exactly, although it is generally recognized that the data contain uncertainty due to measurement error and bias. In our previous work with weekly data, a probability distribution was used to describe an individual's dose during a specific period of time, and statistical methods were developed for estimating it from weekly film dosimetry data. This study showed that the yearly dose of record systematically underestimates doses for Oak Ridge National Laboratory (ORNL) workers. This could result in biased estimates of dose-response coefficients and their standard errors. The results of this evaluation raise serious questions about the suitability of the yearly dose of record for direct use in low-dose studies of nuclear industry workers. Here, we extend our previous work to use the full information in pocket-meter data and develop the Data Synthesis for Individual Dose Estimation (DSIDE) methodology. Although the DSIDE methodology in this study is developed in the context of daily and weekly data to produce a cumulative yearly dose estimate, in principle it is completely general and can be extended to other combinations of time periods and measurements. The new methodology takes into account the "measurement error" produced by the film and pocket-meter dosimetry systems, the biases introduced by policies that lead to recording left-censored doses as zeros, and other measurement and recording practices. The DSIDE method is applied to a sample of dose histories obtained from hard-copy dosimetry records at ORNL for the years 1945 to 1955. First, the rigorous addition of daily pocket-meter information shows that the negative bias is generally more severe than was reported in our work based on weekly film data only; however, the amount of bias also varies greatly between person-years. Second, the addition of pocket-meter information reduces uncertainty for some person-years, while increasing it for others. Together, these results suggest that detailed pocket-meter and film dosimetry information is required to obtain unbiased and reliable dosimetry data for use in epidemiologic studies of workers at ORNL.
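As a minimal illustration of the left-censoring bias described above, the sketch below simulates hypothetical weekly doses, records readings below a detection limit as zero, and compares the resulting yearly dose of record with the true cumulative dose. The exponential exposure distribution, the detection limit, and the number of weeks are assumptions chosen for illustration, not values from the ORNL records or the DSIDE methodology itself.

```python
# Minimal sketch of the left-censoring bias: weekly readings below a detection
# limit are recorded as zero, so the yearly "dose of record" understates the
# true cumulative dose. All numbers here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n_workers, n_weeks = 1000, 50
detection_limit = 0.3          # hypothetical minimum detectable weekly dose

# Hypothetical true weekly doses, many of which fall below the detection limit.
true_weekly = rng.exponential(scale=0.25, size=(n_workers, n_weeks))

# Recording policy: readings below the limit are written down as zero.
recorded_weekly = np.where(true_weekly < detection_limit, 0.0, true_weekly)

true_yearly = true_weekly.sum(axis=1)
recorded_yearly = recorded_weekly.sum(axis=1)

bias = recorded_yearly.mean() - true_yearly.mean()
print(f"mean true yearly dose:     {true_yearly.mean():.2f}")
print(f"mean recorded yearly dose: {recorded_yearly.mean():.2f}")
print(f"mean bias (negative = underestimate): {bias:+.2f}")
```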