Martin Gažák - Academia.edu
Papers by Martin Gažák
Applied Radiation and Isotopes, 2013
Large quantities of radionuclides were released to the atmosphere and the ocean in March-April 2011 during the accident at the Fukushima Dai-ichi Nuclear Power Plant. Atmospheric and marine modeling has been carried out to predict the dispersion of radionuclides worldwide, to compare the predicted and measured radionuclide concentrations, and to assess the impact of the accident on the environment. Atmospheric Lagrangian dispersion modeling was used to simulate the dispersion of ¹³⁷Cs over America and Europe. A global ocean circulation model was applied to predict the dispersion of ¹³⁷Cs in the Pacific Ocean. The measured and simulated ¹³⁷Cs concentrations in atmospheric aerosols and in seawater are compared with global fallout and the Chernobyl accident, which represent the main sources of the pre-Fukushima radionuclide background in the environment. The radionuclide concentrations in the atmosphere have been negligible compared with the Chernobyl levels. The maximum ¹³⁷Cs concentration in surface waters of the open Pacific Ocean will be around 20 Bq/m³. The plume will reach the US coast 4-5 years after the accident; however, the levels will be below 3 Bq/m³. The whole North Pacific Ocean will be labeled with Fukushima ¹³⁷Cs 10 years after the accident, with concentrations below 1 Bq/m³.
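As a rough illustration of the Lagrangian dispersion technique mentioned above (not the authors' model: the wind field, eddy diffusivity, and release parameters below are hypothetical), a 2-D particle random walk advects released particles with the mean wind, adds turbulent diffusion, and applies ¹³⁷Cs radioactive decay:

```python
# Minimal sketch (not the authors' model): a 2-D Lagrangian particle random walk
# illustrating the kind of atmospheric dispersion calculation described above.
# The wind field, diffusivity, and release parameters below are hypothetical.
import numpy as np

def lagrangian_dispersion(n_particles=10_000, n_steps=500, dt=600.0,
                          u=(8.0, 1.0), K=500.0, decay_lambda=7.3e-10):
    """Advect particles with a constant wind u (m/s), add random-walk
    diffusion with eddy diffusivity K (m^2/s), and apply 137Cs decay."""
    rng = np.random.default_rng(0)
    pos = np.zeros((n_particles, 2))          # all particles released at the origin
    activity = np.ones(n_particles)           # relative activity carried per particle
    sigma = np.sqrt(2.0 * K * dt)             # random-walk step size
    for _ in range(n_steps):
        pos += np.array(u) * dt                    # advection by the mean wind
        pos += rng.normal(0.0, sigma, pos.shape)   # turbulent diffusion
        activity *= np.exp(-decay_lambda * dt)     # radioactive decay of 137Cs
    return pos, activity

positions, activity = lagrangian_dispersion()
print("mean downwind distance [km]:", positions[:, 0].mean() / 1e3)
```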
Sustainability, 2021
Radar measurements are inherently affected by various meteorological and non-meteorological factors that may degrade their quality, and the unwanted effects are also transferred into composites, i.e., overlapping images from different radars. The paper aimed to answer the research question of whether 'cleaner' radar composites without disturbing features can be created and, if so, how operational practice could take advantage of the improved results. To achieve these goals, the qRad and qPrec software packages, based on the concept of quality indices, were used. The qRad package estimates the true quality of C-band radar volume data using various quality indices and attempts to correct some of the adverse effects on the measurements. The qPrec package uses a probabilistic approach to estimate precipitation intensity, based on heterogeneous input data and the quality-based outputs of the qRad software. The advantages of the qRad software are improved radar ...
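A minimal sketch of the quality-index idea behind such compositing (the weighting scheme, threshold, and array layout are assumptions for illustration, not the qRad/qPrec implementation): each radar pixel carries a quality index in [0, 1], and overlapping radars are merged by quality-weighted averaging.

```python
# Hypothetical illustration of quality-weighted radar compositing; not the
# actual qRad/qPrec algorithm. Each radar provides reflectivity and a quality
# index in [0, 1]; overlapping pixels are merged by quality-weighted averaging.
import numpy as np

def composite(reflectivities, qualities, min_quality=0.1):
    """reflectivities, qualities: sequences of arrays of shape (ny, nx);
    NaN marks pixels outside a radar's coverage."""
    z = np.nan_to_num(np.stack(reflectivities), nan=0.0)
    q = np.nan_to_num(np.stack(qualities), nan=0.0)
    q = np.where(q < min_quality, 0.0, q)      # discard very low-quality pixels
    weight_sum = q.sum(axis=0)
    with np.errstate(invalid="ignore", divide="ignore"):
        merged = (z * q).sum(axis=0) / weight_sum
    merged[weight_sum == 0.0] = np.nan         # no usable radar covers the pixel
    quality_out = q.max(axis=0)                # composite quality = best contributing index
    return merged, quality_out

# Toy usage: two overlapping radars with different data quality.
ny, nx = 4, 4
z1, q1 = np.full((ny, nx), 30.0), np.full((ny, nx), 0.9)
z2, q2 = np.full((ny, nx), 20.0), np.full((ny, nx), 0.3)
merged, quality = composite([z1, z2], [q1, q2])
print(merged[0, 0], quality[0, 0])   # quality-weighted reflectivity and composite quality
```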
The Polish Grid Initiative commenced in 2009 in the context of the PLGrid Project, funded under the framework of the Innovative Economy Operational Programme. The main purpose of the Project is to provide the Polish scientific community with an IT platform based on Grid computer clusters, enabling e-science research in various fields. The Project is establishing a country-wide Polish Grid infrastructure which supports scientific research through the integration of experimental data and results of advanced computer simulations carried out by geographically dispersed teams. PL-Grid aims at significantly extending the amount of computing resources provided to the Polish scientific community and at constructing a Grid system facilitating effective and innovative use of the available resources. The paper outlines some basic facts concerning the PL-Grid Project goals, together with achieved results represented by several examples of innovative grid services and software developed within P...
The Weather Research and Forecasting (WRF) system is a numerical weather prediction model suitable for both research and operational user communities. WRF is designed to run on a variety of platforms, either serially or in parallel, with or without multi-threading. In this paper we describe a performance investigation of a real WRF simulation configured as a workflow of three consecutive jobs (pre-processing, modeling, and post-processing) running locally on a multicore cluster and remotely on the EGEE Grid. The workflow was compiled and validated for two parallel scenarios: pure MPI and hybrid MPI+OpenMP.
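As a rough illustration of such a chained workflow (the executables chosen for the pre- and post-processing steps, the core counts, and the postprocess.py script are assumptions for this sketch, not the configuration evaluated in the paper), the three jobs could be run sequentially so that each stage starts only after the previous one succeeds:

```python
# Hypothetical driver for a three-stage WRF workflow (pre-processing, modeling,
# post-processing) on a multicore node; paths, core counts, and the
# post-processing script are illustrative assumptions, not the paper's setup.
import subprocess

STAGES = [
    ["mpirun", "-np", "4", "./real.exe"],    # pre-processing: initial/boundary condition files
    ["mpirun", "-np", "16", "./wrf.exe"],    # modeling: the WRF forecast integration itself
    ["python", "postprocess.py"],            # post-processing: hypothetical extraction/plotting script
]

def run_workflow(stages=STAGES):
    """Run the stages sequentially; each job starts only if the previous one succeeded."""
    for cmd in stages:
        print("running:", " ".join(cmd))
        subprocess.run(cmd, check=True)      # raises CalledProcessError on failure

if __name__ == "__main__":
    run_workflow()
```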
2010 Seventh International Conference on Fuzzy Systems and Knowledge Discovery, 2010
Prediction of Significant Meteorological Phenomena Using Advanced Data Mining and Integration Methods (FSKD 2010).
Procedia Computer Science, 2010
This paper describes the planned contribution of the Data Mining Meteo (DMM) project to research on parametrized models and methods for the detection and prediction of significant meteorological phenomena, especially fog and low cloud cover. The project is expected to cover methods for integrating the distributed meteorological data necessary for running the prediction models, training the models, and then mining the data in order to predict even randomly occurring phenomena efficiently and quickly. We present the methods and technologies we will use for integrating the input data, which are distributed across different vendors' servers. The meteorological detection and prediction methods are based on statistical and climatological methods combined with knowledge discovery and data mining of meteorological data (SYNOP and METAR messages, weather radar imagery, "raw" meteorological data from stations, satellite imagery, and results of common meteorological prediction models).
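A minimal sketch of the mining step such a project might involve (the feature set, synthetic labels, and the random-forest classifier are illustrative assumptions, not the DMM method): station observations are turned into a feature table and a classifier is trained to flag fog-prone situations.

```python
# Illustrative sketch of mining station observations for fog prediction; the
# feature set and the random-forest model are assumptions, not the DMM method.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Hypothetical training table: one row per station observation.
rng = np.random.default_rng(42)
n = 5000
X = np.column_stack([
    rng.uniform(-5, 25, n),     # air temperature [deg C]
    rng.uniform(-6, 24, n),     # dew point [deg C]
    rng.uniform(0, 10, n),      # wind speed [m/s]
    rng.uniform(950, 1040, n),  # pressure [hPa]
])
# Synthetic label: fog is likely with a small temperature/dew-point spread and weak wind.
y = ((X[:, 0] - X[:, 1] < 1.0) & (X[:, 2] < 3.0)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```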
We present several applications from the domain of meteorology and crisis management that we have developed and/or plan to develop. In particular, we present the IMS Model Suite, a complex software system designed to address the needs of accurate forecasting of weather and hazardous weather phenomena, environmental pollution assessment, and prediction of the consequences of nuclear accidents and radiological emergencies. We discuss the requirements on computational resources and our experience of meeting them with grid computing.
… ranged from 12 mBq/m³ and 1.4 mBq/m³ to 3700 mBq/m³ and 1040 mBq/m³, respectively. Small variations in the ¹³⁴Cs/¹³⁷Cs activity ratio (around 0.9) were found in the measured samples. Results were compared with the data obtained during the Chernobyl accident. Modeling of long-range air mass transport indicated a rather complicated pathway of radioactive cloud transfer across the Pacific and North America to Europe.
Lecture Notes in Computer Science, 1999
An approach to dynamic parallelization of coarse-grained programs is proposed, in which the sources of parallelism are both dataflow analysis and features pointed out in the program by annotation. Program annotation enables two additional types of parallel computation which cannot be found from the analysis of dataflow dependences alone. Firstly, there are speculative computations based on anticipating alternative branches of the program's computational process. Secondly, there are pipeline computations that may sometimes be initiated for operators at the moment when their input data are not yet complete. Automated program analysis for this type of concurrency is either very hard or generates a lot of surplus computation, thus absorbing the benefit of program parallelization. The implementation of the system of dynamic program parallelization for clusters of PCs and the results of some experiments performed on it are described.
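A toy sketch of the speculative-computation idea (plain Python threads rather than the paper's PVM-based system; the task functions are made up): both branches of a conditional are launched before the predicate is known, and only the result of the branch that turns out to be selected is used.

```python
# Toy illustration of speculative branch execution (not the paper's PVM-based
# system): both branches of a conditional start before the predicate is known,
# and only the result of the branch actually selected is used.
from concurrent.futures import ThreadPoolExecutor
import time

def slow_predicate(x):
    time.sleep(0.5)              # stands in for an expensive condition
    return x % 2 == 0

def branch_a(x):
    time.sleep(0.4); return x * 10

def branch_b(x):
    time.sleep(0.4); return x - 10

def speculative_if(x):
    with ThreadPoolExecutor(max_workers=3) as pool:
        fut_a = pool.submit(branch_a, x)     # speculate on the "then" branch
        fut_b = pool.submit(branch_b, x)     # speculate on the "else" branch
        take_a = slow_predicate(x)           # meanwhile evaluate the condition
        chosen = fut_a if take_a else fut_b
        return chosen.result()               # the other branch's work is discarded

print(speculative_if(4))   # predicate true  -> 40
print(speculative_if(3))   # predicate false -> -7
```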
Lecture Notes in Computer Science, 1999
… in PVM Environment. Alexander Godlevsky, Martin Gažák, and Ladislav Hluchý, Institute of Informatics, Slovak Academy of Sciences, Dúbravská cesta 9, 842 37 Bratislava, Slovakia (godlevsky.ui@savba.sk, hluchy.ui@savba.sk).