Dr. Hadis Samadi Alinia | Western University Canada
Books by Dr. Hadis Samadi Alinia
Tehran's urban area is built over twelve active faults and three major ones, leaving this megacity prone and vulnerable to powerful earthquakes. According to historical earthquake data, Tehran experienced large earthquakes in the distant past. The available records show that no destructive earthquake has affected the city since the year 1209 of the solar Hijri calendar (circa 1830 CE), and for this reason seismologists predict a devastating earthquake for Tehran in the future.
Besides the old and non-engineered buildings in Tehran, another major issue that makes the city highly vulnerable to earthquakes is liquefaction caused by the high groundwater table. Since most of the surface soils of Tehran are alluvial and sandy, the absence of sewage channels and the rising water table have formed a loose, saturated sandy layer; in an earthquake this layer behaves plastically and can overturn even a building that would otherwise withstand the horizontal seismic forces.
Determining where, and to what degree, a city is vulnerable to earthquake hazards has always been a complex problem in decisions related to earthquake crisis management. Because the problem depends on many parameters and on the opinions of different experts, it is always accompanied by uncertainties. Failing to manage these uncertainties properly leads to wrong decisions, which is especially important in earthquake crisis management given the all-encompassing nature of the hazard.
Various multi-criteria decision-making methods have been proposed for producing seismic vulnerability maps; each manages the uncertainties in the data and in expert opinions in its own way, so that a vulnerability map with the least uncertainty can be produced. For example, Aghataher combined an improved hierarchical method with fuzzy logic to weight the vulnerability criteria and assess their uncertainties in a seismic vulnerability map of Tehran; that thesis focused on human seismic vulnerability. Another study, Silavi's master's thesis, applied intuitionistic fuzzy set theory to model the uncertainty of expert opinion in producing physical and human vulnerability maps; this theory was introduced to manage inconsistency in concepts or expert opinions, and in it the degree of non-membership of an element in a set is determined independently rather than derived from the degree of membership. In another study, by Amiri, urban zones were classified by their degree of vulnerability using two methods: dominance-based rough set theory, and the analytic hierarchy process combined with Dempster-Shafer theory.
The present thesis uses granular computing to produce the physical seismic vulnerability map of Tehran. This method was chosen to simplify existing approaches to the hierarchical problem and to address the shortcomings of the methods used in earlier research for modelling the uncertainty of expert opinion. A distinctive feature of this method, unlike earlier ones, is the extraction of exact decision rules with zero inconsistency from sets pre-classified by the expert; moreover, unlike the partition-based methods above, it solves the problem as a covering, meaning that, allowing for non-redundancy, an urban zone may be classified by more than one rule.
In this thesis, the physical seismic vulnerability map of Tehran was produced with granular computing theory under the scenario of an activation of the North Tehran Fault, for two study areas: the whole city of Tehran, and the central and southern parts of the city with the groundwater level taken into account. The average accuracy of the resulting vulnerability map relative to the expert's opinion was 72% for the whole city and 69% for the central and southern districts. These accuracies indicate the strength of granular computing in solving the multi-criteria decision-making problem and in extracting correct, exact rules from the expert's view.
Among the important contributions of this thesis is the use of granular computing with a tree structure as a method for producing a vulnerability map that incorporates the opinion of a seismic expert. One advantage of this method is the extraction of correct, exact decision rules with the least uncertainty. In addition, the classification rules are evaluated against one another for overlap and disjointness, so that information is neither lost nor duplicated.
Since, according to the literature reviewed, no production of a vulnerability map based on a granular decision tree appears to have been reported so far, this research takes the first step toward that goal.
Tehran, the capital of Iran, is located on a number of known and unknown faults, which exposes this megacity to huge earthquakes. Based on the available historical earthquake records, Tehran has experienced severe earthquakes a number of times, and seismologists predict that a huge earthquake will strike Tehran in the near future. In addition to the old and non-standard buildings in Tehran, another important factor that makes the city very vulnerable to earthquakes is liquefaction caused by the high groundwater level throughout the south and centre of the city. Specifically, because most of Tehran's area is covered by alluvial, non-cohesive soil, the rising groundwater level and the absence of drainage channels create a saturated sandy layer in which even a building with high earthquake resistance can overturn.
Determining the locations and intensity of the seismic vulnerability of a city is a complicated disaster-management problem. Because this problem generally depends on various criteria and expert opinions, one of the most important challenges is the uncertainty arising from inconsistency in the experts' views. Uncertainty in a seismic vulnerability map biases risk management, with far-reaching effects on decision-making.
To overcome this problem, a granular computing approach is proposed to produce the physical seismic vulnerability map of Tehran. One of the most significant properties of this method is the induction of accurate rules with zero entropy from the classification predefined by the expert on the training datasets. Furthermore, non-redundant covering rules are extracted for consistent classification, so one object may be classified by two or more non-redundant rules. To do this, parameters such as generality, coverage, confidence, and entropy are used to induce the most suitable rules.
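The measures named above can be made concrete with a small sketch. This is a minimal illustration, not the thesis's implementation: the attribute names, class labels, and toy information table below are invented for the example, and zero-entropy consistency corresponds here to a confidence of 1.0.

```python
# Toy information table: each row is one urban statistical zone, and 'vuln'
# is the expert's class label. Attribute names and values are hypothetical.
table = [
    {"soil": "loose", "age": "old", "slope": "steep",  "vuln": "high"},
    {"soil": "loose", "age": "old", "slope": "gentle", "vuln": "high"},
    {"soil": "firm",  "age": "new", "slope": "gentle", "vuln": "low"},
    {"soil": "loose", "age": "new", "slope": "steep",  "vuln": "high"},
    {"soil": "firm",  "age": "old", "slope": "gentle", "vuln": "low"},
]

def measures(rows, attr, value, target):
    """Generality, coverage, and confidence of the rule (attr=value) -> target."""
    granule = [r for r in rows if r[attr] == value]       # m(phi)
    hits = [r for r in granule if r["vuln"] == target]    # m(phi and psi)
    in_class = [r for r in rows if r["vuln"] == target]   # m(psi)
    generality = len(granule) / len(rows)
    confidence = len(hits) / len(granule) if granule else 0.0
    coverage = len(hits) / len(in_class) if in_class else 0.0
    return generality, coverage, confidence

g, cov, conf = measures(table, "soil", "loose", "high")
print(f"soil=loose -> high: generality={g:.2f} coverage={cov:.2f} confidence={conf:.2f}")
# confidence == 1.0 marks a fully consistent rule; a zone matched by several
# such rules illustrates the covering (rather than partition) classification.
```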
In this thesis, two case studies are considered: the first study area covers all of Tehran's statistical zones (3,173, according to the 1996 census), and the second is limited to the south and centre of the city, where the groundwater level is taken into account.
The average accuracy of the seismic vulnerability map produced with granular computing is 72% for the first case study and 69% for the second. This accuracy demonstrates the strength of the granular approach in extracting correct information from the expert's views.
One of the most important contributions of this thesis is the use of multi-level granular computing as an approach to producing a seismic vulnerability map that incorporates the expert's opinion.
It appears that no comprehensive research has been reported on producing a vulnerability map using a granular decision tree; this research is a first step toward that goal.
This book contains all the key points and multiple-choice questions for the courses in land surveying, underground surveying, and the theory of errors.
It is organized so that every point or method that matters for answering the test questions is presented with an example, drawn from past entrance-exam questions or newly written ones on that topic, across 14 chapters and 3 appendices.
At the end of each chapter, candidates can consolidate their learning with further entrance-exam and original questions from the year 1375 of the solar Hijri calendar (1996 CE) onward; in this way they can be confident that everything they study serves their preparation for the exam.
In the new edition, the spelling and technical errors of the previous edition have been corrected, and the multiple-choice entrance-exam questions from 1386 (2007 CE) onward have been added.
Papers by Dr. Hadis Samadi Alinia
Oxford University Press on behalf of The Royal Astronomical Society, 2019
A dominant source of error in space-based geodesy is the tropospheric delay, which results in an excess path length of the signal as it passes through the neutral atmosphere. Many studies have addressed the use of global weather models and local meteorological observations to model the effects of this error in Global Positioning System (GPS) and Differential Interferometric Synthetic Aperture Radar (DInSAR) data. However, modelling zenith tropospheric delay (ZTD) errors in GPS data, particularly in areas of strong topographic relief, is highly problematic because ZTD estimates cannot be captured by low-resolution weather models, and it is often impossible to find a nearby weather station for every GPS station. In this paper, we use DInSAR data with high spatial and temporal resolution from the volcanic island of Hawaii to estimate the seasonal amplitudes of ZTD signals, which are then used to remove this error from GPS data. We observe the seasonal amplitude for more than one million DInSAR pixels for the period between 2014 and 2017 and propose a best-fitting elevation-dependent model. This model is an integration of the exponential refractivity function and is linked to observations from a radiosonde station and a weather station. It estimates seasonal amplitudes ranging from 0.2 cm at the highest elevations to 5.6 cm at the lowest, increasing exponentially from the DInSAR reference elevation. To demonstrate the potential of this model for correcting GPS data, we compare the modelled seasonal amplitude to the observed seasonal amplitudes of the variation of the local ZTD, computed with the Canadian Spatial Reference System precise point positioning (CSRS-PPP) online application, for 21 GPS stations distributed throughout the island. Our results show that this model provides results with root-mean-square error (rmse) values of less than 1 cm for the majority of GPS stations. The computed rmse of the residuals between the modelled seasonal signal and the high-frequency variations of the ZTD signal at each station, relative to the reference GPS station (here PUKA), ranges between 0.7 and 4.1 cm. These values agree well with the rmse of the residuals between the observed seasonal signal and the high-frequency variations of ZTD, which range from zero to 0.3 cm. This confirms the potential of the proposed DInSAR model to accurately estimate the seasonal variation of ZTDs at GPS stations at any arbitrary altitude with respect to the reference station.
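As a rough illustration of the elevation-dependent form described in the abstract, integrating an exponentially decaying refractivity profile upward from height h gives a seasonal ZTD amplitude of the form A(h) = A0 * exp(-h / H). The sketch below fits that curve to synthetic per-pixel amplitudes; A0, H, and all data values are placeholders, not the paper's DInSAR measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def amplitude_model(h, A0, H):
    """Seasonal ZTD amplitude (cm) at elevation h (m above the reference)."""
    return A0 * np.exp(-h / H)

# Synthetic "per-pixel" amplitudes: large near sea level, small near the summit.
rng = np.random.default_rng(0)
elev = rng.uniform(0.0, 4000.0, 500)
amp = 5.6 * np.exp(-elev / 1500.0) + rng.normal(0.0, 0.2, elev.size)

(A0, H), _ = curve_fit(amplitude_model, elev, amp, p0=(5.0, 2000.0))
print(f"fitted A0 = {A0:.2f} cm, scale height H = {H:.0f} m")
print(f"predicted amplitude at 3000 m: {amplitude_model(3000.0, A0, H):.2f} cm")
```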
New precise network solutions for continuous GPS (cGPS) stations distributed in eastern Ontario and western Québec provide constraints on the regional three-dimensional crustal velocity field. Five years of continuous observations at fourteen cGPS sites were analyzed using the Bernese GPS processing software. Several different sub-networks were chosen from these stations, and the data were processed and compared in order to select the optimal configuration for accurately estimating the vertical and horizontal station velocities while minimizing the associated errors. The coordinate time series were then compared to the crustal motions from global solutions, and the optimized solution is presented here. A noise model combining power-law and white noise, which best describes the noise characteristics of all three components, was employed for the GPS time-series analysis. The linear trend, its associated uncertainties, and the spectral index of the power-law noise were calculated using a maximum-likelihood estimation approach. The residual horizontal velocities, after removal of rigid plate motion, have magnitudes consistent with expected glacial isostatic adjustment (GIA). The vertical velocities increase from subsidence of almost 1.9 mm/year south of the Great Lakes to uplift near Hudson Bay, where the highest rate is approximately 10.9 mm/year. The residual horizontal velocities range from approximately 0.5 mm/year, oriented south-southeastward, at the Great Lakes to nearly 1.5 mm/year directed toward the interior of Hudson Bay at stations adjacent to its shoreline. The velocity uncertainties are estimated at less than 0.6 mm/year for the horizontal component and 1.1 mm/year for the vertical component. A comparison between the observed velocities and GIA model predictions, for a limited range of Earth models, shows a better fit to the observations for the Earth model with the smallest upper-mantle viscosity and the largest lower-mantle viscosity. However, the pattern of horizontal deformation is not well explained in the north, along Hudson Bay, suggesting that revisions to the ice-thickness history are needed to improve the fit to the observations.
Keywords: GPS position time series · Bernese GPS processing · Reference frame · Glacial isostatic adjustment (GIA)
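The paper estimates the spectral index by maximum likelihood; a full MLE is beyond a short sketch, so the toy example below instead approximates the index from the log-log slope of the residual periodogram after removing a linear trend, which is a common quick-look substitute. The daily series is synthetic (pure trend plus white noise, so the slope should come out near 0; flicker noise would give a slope near -1).

```python
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(1)
t = np.arange(5 * 365)                             # five years of daily solutions
series = 0.005 * t + rng.normal(0.0, 1.0, t.size)  # mm: linear trend + white noise

# Estimate the velocity (linear trend) by least squares, then detrend.
A = np.column_stack([t, np.ones_like(t)])
coef, *_ = np.linalg.lstsq(A, series, rcond=None)
velocity_mm_per_yr = coef[0] * 365.25
resid = series - A @ coef

# The log-log slope of the residual periodogram approximates the spectral index.
f, pxx = periodogram(resid, fs=1.0)                # frequency in cycles/day
mask = f > 0
slope, _ = np.polyfit(np.log(f[mask]), np.log(pxx[mask]), 1)
print(f"velocity ~ {velocity_mm_per_yr:.2f} mm/yr, spectral index ~ {slope:.2f}")
```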
… International Archives of …, 2008
This paper presents a methodology that combines remote sensing imagery, a GIS-based neuro-fuzzy approach, and a variety of social and environmental factors to simulate land use change. Two historical Landsat images of the Tehran Metropolitan Area, twenty years apart, together with user-selected socio-economic and environmental variables, were employed to simulate land use change. All images were rectified and registered to Universal Transverse Mercator (UTM) WGS 1984 zone 39N. Supervised classification was used to assign the images to different land use categories; four classes were identified: road, residential area, service centre, and administrative area. The work introduces a simulation experiment on urban land use change in which a supervised back-propagation neural network is employed to parameterize the simulation model, while GIS is used to model and monitor land use change and perform spatial analysis on the results. The paper adapts a land use change model parameterized for the Tehran Metropolitan Area and explores how factors such as road, slope, administrative space, service centre, and residential area parameters can influence it. For each cell in the study area, the real change between the two time steps is determined and analyzed against the provided variables in order to produce a land-use-change probability layer. Parts of the two datasets were used to train the neural network, while the full datasets were used for the land use change prediction. In addition, the impact of the training and prediction periods on land use is examined. The creation of the GIS-based neuro-fuzzy land use change model is the major contribution of this paper.
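As a hedged sketch of the back-propagation step, the example below trains a small multilayer perceptron to map per-cell driver variables to a probability of land use change. The driver variables, the synthetic "truth", and the use of scikit-learn's MLPClassifier are illustrative assumptions, not the paper's actual network or data.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(2)
n = 2000
X = np.column_stack([
    rng.uniform(0, 5000, n),   # distance to nearest road (m)
    rng.uniform(0, 30, n),     # slope (degrees)
    rng.uniform(0, 8000, n),   # distance to service centre (m)
])
# Synthetic truth: cells near roads on gentle slopes change more often.
p = 1.0 / (1.0 + np.exp(0.002 * X[:, 0] + 0.1 * X[:, 1] - 3.0))
y = rng.random(n) < p          # True = cell changed between the two image dates

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(X, y)
change_prob = clf.predict_proba(X)[:, 1]   # per-cell change-probability layer
print(f"mean predicted change probability: {change_prob.mean():.2f}")
```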
The San Andreas Fault is one of the most well-known destructive faults, forming the boundary of two moving plates, the Pacific Plate (on the west) and the North American Plate (on the east), which meet in western California; the west side moves northwestward relative to the east side. Huge earthquakes have occurred throughout the long history of this fault, and the displacements they produce in surface features near the fault motivate scientists to attempt to forecast the next large earthquake along it. In 1985, the US Geological Survey predicted that a comparably sized earthquake would strike this community by 1993, but no such event came until September 28, 2004, when a magnitude-6.0 earthquake struck at 17:15:24 Coordinated Universal Time (UTC; epicentre 35.8158°N, 120.3748°W; depth 7.9 km). The hope is that observing the collected data and studying the changes that precede an earthquake of that size will help scientists better understand the physics of earthquakes and faulting, and also issue predictions for major earthquakes along this fault and around the world. Achieving this requires observing earthquakes with a variety of high-resolution geodetic instruments. Among the available instruments, the borehole Gladwin Tensor Strainmeter (GTSM) can record changes in strain and deformation of the surrounding rock at a resolution better than a nanostrain over periods of minutes to months (short term), measuring signals that range from a few nanostrain to many hundreds of nanostrain, for example the strain caused by lunar and solar tides. GTSM strain data comprise three simultaneous measurements in different directions: areal strain and two types of shear strain. The main goal of this project is to determine the slip that occurred in the near-surface region of the hypocentre during, or as a result of, the passage of the seismic waves that trigger tremor near the San Andreas Fault, over a period from two months before to about one month after the event, using continuous strain data recorded on two borehole GTSMs located on opposite sides of the fault within the tremor distribution around the epicentre. In this project the GTSM data are modelled under the assumption that only tidal and drift signals affect them.
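The final sentence suggests a simple observation model: each strain channel is fitted as instrument drift plus tidal sinusoids, and whatever remains is the candidate deformation signal. The sketch below does this by least squares on synthetic nanostrain data, using the M2 and O1 tidal periods; the linear drift form and all numerical values are assumptions for illustration.

```python
import numpy as np

hours = np.arange(0, 24 * 60, 0.5)              # two months at 30-min samples
rng = np.random.default_rng(3)
strain = (0.05 * hours                          # instrument drift (nanostrain)
          + 40 * np.sin(2 * np.pi * hours / 12.4206)   # M2 tide
          + 15 * np.sin(2 * np.pi * hours / 25.8193)   # O1 tide
          + rng.normal(0.0, 2.0, hours.size))

# Design matrix: offset, linear drift, and cos/sin terms at each tidal period.
periods = [12.4206, 25.8193]                    # M2, O1 (hours)
cols = [np.ones_like(hours), hours]
for P in periods:
    w = 2 * np.pi / P
    cols += [np.cos(w * hours), np.sin(w * hours)]
A = np.column_stack(cols)

coef, *_ = np.linalg.lstsq(A, strain, rcond=None)
residual = strain - A @ coef                    # candidate deformation signal
print(f"drift ~ {coef[1]:.3f} nstrain/h, residual rms ~ {residual.std():.2f} nstrain")
```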
To understand earthquakes requires observing them up close and in detail, a difficult task because they are at present largely unpredictable, so knowing where to put the instrumentation needed to make such observations is a challenge. Parkfield is the most closely observed earthquake zone in the world.
Tehran, the capital of Iran, is located on a number of known and unknown faults, which exposes this megacity to huge earthquakes. Determining the locations and intensity of the seismic vulnerability of a city is a complicated disaster-management problem. As this problem generally depends on various criteria, one of the most important challenges is the uncertainty arising from inconsistency in combining those effective criteria. Uncertainty in a seismic vulnerability map introduces biases into risk management, with far-reaching effects on dealing with the consequences of an earthquake. To overcome this problem, this paper proposes a new approach to Tehran's seismic vulnerability classification based on granular computing. One of the most significant properties of this method is the inference of accurate rules with zero entropy from a classification predefined by the expert on the training datasets. Furthermore, non-redundant covering rules are extracted for consistent classification, so one object may be classified by two or more non-redundant rules. In this paper, Tehran's statistical zones (3,173 according to the 1996 census) are the study area. Since the city has not experienced a disastrous earthquake since 1830, the accuracy of this work's results is assessed relative to the results of previous studies.
Proceedings of the IADIS International Conference Informatics 2011, 2011
Pre-determining the locations and intensity of seismicity in an area is a complicated disaster-management problem. All over the world, scientists attempt to predict impending earthquakes from various phenomena such as seismicity patterns. But uncertainty remains a problem in discovering knowledge from large, complex datasets, and it significantly influences the final results. In this research, a granular computing algorithm is employed to solve the problem of rule induction from an existing dataset. As this problem depends on various criteria, one of the most important challenges is the uncertainty arising from inconsistency in combining the influencing criteria and extracting consistent knowledge for future predictions. To overcome this problem, this paper proposes a new approach for seismic-risk knowledge discovery based on granular computing theory. One significant property of this method is the induction of more compatible rules with zero inconsistency from existing databases. Furthermore, non-redundant covering rules are extracted for consistent classification, so one object may be classified by two or more non-redundant rules. This paper illustrates the advantages of using granular computing to discover knowledge from a dataset of the seismic-risk properties of the area between 58° 24' E and 60° 24' E longitude and 27° 45' N and 29° 25' N latitude, near Reygan (Kerman Province) in southeastern Iran, where a devastating earthquake occurred.
Rule induction is an area of machine learning in which formal rules are extracted from a set of observations or a training dataset. The induced rules can be expressed as the final result of a decision tree in which each branch represents a possible decision scenario and its outcome. Existing decision-tree learning algorithms such as Iterative Dichotomiser 3 (ID3) are attribute-centred and may introduce unnecessary attributes into the classification rules. To overcome this problem, coverage and confidence measures are applied to select the most promising attribute-value pair at each step. The proposed approach is granule-centred: instead of focusing on the selection of a suitable partition (a family of granules defined, at each step, by the values of one attribute), it concentrates on the selection of a single granule. The decision-tree learning algorithm ID3 and the granular network are successfully applied to an information table of a test dataset of the seismic vulnerability of urban areas in Tehran, the capital of Iran.
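A minimal sketch of the granule-centred selection step: rather than ID3's attribute-level information gain, each attribute-value granule is scored by confidence (rule accuracy), with coverage as a tie-breaker, and the single best granule is chosen. The toy table and attribute names are invented for illustration.

```python
# Toy decision table; 'vuln' is the class attribute.
rows = [
    {"soil": "loose", "age": "old", "vuln": "high"},
    {"soil": "loose", "age": "new", "vuln": "high"},
    {"soil": "firm",  "age": "old", "vuln": "high"},
    {"soil": "firm",  "age": "new", "vuln": "low"},
]
target = "high"

def score(attr, value):
    """(confidence, coverage) of the rule (attr=value) -> target."""
    granule = [r for r in rows if r[attr] == value]
    hits = sum(r["vuln"] == target for r in granule)
    confidence = hits / len(granule)
    coverage = hits / sum(r["vuln"] == target for r in rows)
    return confidence, coverage

# Every attribute-value pair defines one candidate granule.
candidates = {(a, r[a]) for r in rows for a in r if a != "vuln"}
best = max(candidates, key=lambda av: score(*av))   # tuple order: confidence, then coverage
print("best granule:", best, "-> (confidence, coverage) =", score(*best))
```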
Rule induction is an area of machine learning in which formal rules are extracted from a set of observations or decision trees. The induced rules can be expressed as relationships between concepts in terms of their intensions and extensions, such as sub-concepts and super-concepts, disjoint and overlapping concepts, and partial sub-concepts. Existing algorithms for constructing decision trees cannot effectively deal with missing values. To overcome this limitation, this paper presents the granule network as an improved decision tree that induces more informative classification rules, in which all objects in the universe are classified correctly with minimum uncertainty. To achieve this, quantitative measures such as coverage and support are used to estimate the quality of the granules and their relations at each step. The paper concentrates on seismic vulnerability classification using a granule tree with these quantitative measures. It is applied to 177 urban areas in the north of Tehran, the capital of Iran, and the results are used to classify Tehran's seismic vulnerability. The classification results successfully verify the proposed methodology.
Tehran, the capital of Iran, has several known and unknown active faults, so a huge earthquake would devastate human settlement there. However, more than the disaster itself, it is inefficient and inadequate infrastructure in the aftermath of an earthquake that causes huge loss of life and property. There are substantial problems with the availability of, and access to, reliable, up-to-date, and accurate geospatial data. The need for such data is significant if one is to react to and manage a disaster situation successfully. The required data might include the availability of resources, road access, and required disaster-response operations, all crucial for informed decision-making and disaster management. Thus, appropriate frameworks and technologies need to be developed and implemented to alleviate the current lack of capacity to respond to disasters. This paper focuses on the use of spatial data infrastructure and geospatial information systems to achieve better outcomes from spatial decision-making for the site selection of rescue centres. A spatial decision support system offers the advantage of emphasizing the regions that need the most attention. Among the successful models, this paper uses multi-criteria decision making (MCDM) based on a location model.
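A hedged sketch of the kind of weighted-sum MCDM scoring a location model might use for ranking candidate rescue-centre sites. The criteria, weights, and site scores below are illustrative placeholders, not the paper's calibrated model; scores are assumed normalized to [0, 1] with higher meaning better (cost-type criteria pre-inverted).

```python
import numpy as np

criteria = ["road_access", "dist_from_faults", "open_space", "population_served"]
weights = np.array([0.35, 0.25, 0.15, 0.25])          # expert weights, sum to 1

sites = {                                             # candidate-site criterion scores
    "A": [0.9, 0.4, 0.6, 0.8],
    "B": [0.6, 0.9, 0.7, 0.5],
    "C": [0.8, 0.7, 0.3, 0.9],
}

# Rank sites by their weighted-sum suitability score.
ranking = sorted(sites, key=lambda s: float(np.dot(weights, sites[s])), reverse=True)
for s in ranking:
    print(s, f"score = {np.dot(weights, sites[s]):.3f}")
```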
This paper presents a new approach that allows the triangulation of a set of points to be dynamically updated, without a global reconstruction of the topology, as new points are incrementally inserted into the set. The proposed insertion algorithm is based on a k-nearest-neighbour search of the Voronoi vertices in a 2-dimensional space to find the triangle in which the inserted point lies, while a topological structure is used to store the Voronoi vertices. The points are preprocessed into a data structure so that, for any query point q, the nearest (or, generally, the k nearest) Voronoi vertices can be reported efficiently. K-nearest-neighbour searching is a fundamental computation in triangulation. After the k-nearest-neighbour search on the Voronoi vertices, a walk algorithm starting from the nearest Voronoi vertices finds the triangle containing the inserted point, and the triangulation is updated locally in that region. The algorithm is a rather simple and efficient way to speed up triangulation; it has been implemented and successfully verified.
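The idea can be sketched with SciPy rather than the paper's own data structures: the Voronoi vertices of the point set are the circumcentres of the Delaunay triangles, so a k-d tree over circumcentres yields a good starting triangle, and a short barycentric walk finishes the point location. SciPy's find_simplex performs this search internally; the walk is spelled out here only to mirror the algorithm, and the random points are placeholders.

```python
import numpy as np
from scipy.spatial import Delaunay, cKDTree

rng = np.random.default_rng(4)
pts = rng.random((200, 2))
tri = Delaunay(pts)

# Circumcentres of all triangles (= Voronoi vertices of the point set).
a, b, c = (pts[tri.simplices[:, i]] for i in range(3))
b2, c2 = b - a, c - a                       # translate so vertex 'a' is the origin
d = 2 * (b2[:, 0] * c2[:, 1] - b2[:, 1] * c2[:, 0])
bb, cc = (b2 ** 2).sum(1), (c2 ** 2).sum(1)
centers = a + np.column_stack([(c2[:, 1] * bb - b2[:, 1] * cc) / d,
                               (b2[:, 0] * cc - c2[:, 0] * bb) / d])
tree = cKDTree(centers)                     # k-d tree over the Voronoi vertices

def locate(q, max_steps=100):
    """Walk from the triangle with the nearest circumcentre to the one
    containing q; returns -1 if q is outside the convex hull."""
    s = int(tree.query(q)[1])
    for _ in range(max_steps):
        T = tri.transform[s]
        bary2 = T[:2] @ (q - T[2])
        bary = np.append(bary2, 1.0 - bary2.sum())
        if (bary >= -1e-12).all():
            return s                        # q lies inside simplex s
        # Step across the edge opposite the most negative barycentric coord.
        s = tri.neighbors[s, int(bary.argmin())]
        if s == -1:
            return -1
    return -1

q = np.array([0.5, 0.5])
print("containing triangle:", locate(q), "| scipy agrees:", int(tri.find_simplex(q)))
```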
Iran is a country highly vulnerable to natural disasters, especially earthquakes.
Tehran, the capital of Iran, has several known and unknown active faults, so a huge earthquake would devastate human settlement there. Producing a seismic vulnerability map could help local and national disaster-management organizations create and implement plans to promote awareness of earthquake vulnerability and to implement seismic-vulnerability reduction measures in Tehran.
Knowledge-driven approaches carry inherent uncertainties and inconsistencies in experts' opinions when determining seismically vulnerable areas. To produce a vulnerability map with less uncertainty using the rules induced from the information table, a granular computing model with granule-centred strategies is used to learn the classification rules.
Talks by Dr. Hadis Samadi Alinia
This talk is about granular computing theory and its use in classifying urban zones according to their seismic vulnerability.
Your ideas of my works by Dr. Hadis Samadi Alinia
Powerpoint by Dr. Hadis Samadi Alinia
Geographic Information Systems (GIS) are now widely used in a variety of applications, including urban planning, seismic vulnerability assessment, natural resources, and agricultural management, to better visualize and understand the spatial relationships and patterns of geographic features.
Increasing population density and urbanization, which constrain agriculture, are a major threat to food sources, so securing the future of our food production and our population is critical. GIS plays a significant role in identifying whether adequate food is consistently available to a household to support a healthy lifestyle. Here, we focus on employing GIS techniques to assess food security, to map the areas that are vulnerable in their access to healthy food, and to classify areas from very low to very high food security. The value of this technique for food security lies in using spatial analysis methods to characterize regions at risk from lack of food, water, and natural resources. The results of such an analysis are of great interest to decision-makers and related organizations, as they provide knowledge of food security in different areas: examining the local food situation, evaluating land use changes, determining the relationships between biophysical and socioeconomic factors, and creating 3D terrain models to construct adequate strategies for provisioning food sources.
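A minimal sketch of the weighted-overlay classification such an analysis might use: normalized criterion rasters are combined with expert weights, and the composite score is sliced into five security classes. The layers, weights, and class breaks below are invented for illustration, not values from an actual food-security study.

```python
import numpy as np

rng = np.random.default_rng(5)
shape = (100, 100)                  # toy raster grid
access = rng.random(shape)          # proximity to healthy food outlets (0-1)
income = rng.random(shape)          # normalized household income (0-1)
transit = rng.random(shape)         # transit accessibility (0-1)

# Weighted overlay, then classify into five food-security classes.
score = 0.5 * access + 0.3 * income + 0.2 * transit
classes = np.digitize(score, [0.2, 0.4, 0.6, 0.8])   # 0 = very low ... 4 = very high
labels = ["very low", "low", "moderate", "high", "very high"]
for k, name in enumerate(labels):
    print(f"{name:9s}: {(classes == k).mean():6.1%} of cells")
```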
Computer systems enabling the creation, collection, compilation, analysis, and visualization of geographic or spatially referenced data, and the subsequent generation of hypotheses and/or decision-making using those data, for a wide variety of purposes.
Computer systems enabling the creation, collection, compilation, analysis, and visualization of geographic or spatially referenced data, and the subsequent generation of hypotheses and/or decision-making using those data, for a wide variety of purposes.
Accuracy and Precision. Accuracy: are the answers correct when compared to a known standard (the centre of the circle in the diagrams)? Precision (for surveyors and GPS): how repeatable (or similar) are the results when making multiple measurements of the same thing? (In science more generally): to what degree of exactness is an answer specified? Often, how many decimals are included in a measurement when dealing with quantitative data.
Geoid. The shape of the Earth is most accurately modelled as a geoid, which is "the equipotential surface of the Earth's gravity field which best fits, in a least squares sense, global mean sea level."
Geographic Coordinate System: UTM. Part of the worldwide UTM system (a set of transverse cylindrical projections). Each zone has a distinct central meridian and uses false eastings and northings. This reduces east-west distortions, but extended east-west areas span multiple zones, which makes them difficult to work with in this system; each zone's projection is constructed by adjusting the central meridian. The central meridian (the line of contact between globe and cylinder) runs down the middle of each zone and is a standard line in this case, with consistent scale along the line.
Geographic coordinate systems. Angular coordinates: latitude (north-south), -90° (S) to +90° (N); longitude (east-west), -180° to +180°. These do not transform into consistent linear units over the surface of the Earth: 1° east-west at the equator is much larger than 1° east-west at the poles. Different systems are defined by varying models of the Earth's shape and/or how the ellipsoid model relates to specific positions on the Earth. For different regions and/or mapping purposes, different projections give better or worse results.
Global Positioning System (GPS)
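To make the UTM notes concrete, the sketch below projects geographic coordinates to UTM metres with pyproj (which must be installed). EPSG:32639 is WGS 84 / UTM zone 39N, the zone mentioned elsewhere on this page for Tehran; the coordinates are approximate.

```python
from pyproj import Transformer

# lon/lat in degrees (WGS 84) -> UTM zone 39N metres.
to_utm = Transformer.from_crs("EPSG:4326", "EPSG:32639", always_xy=True)
lon, lat = 51.389, 35.689                      # Tehran, approximate
easting, northing = to_utm.transform(lon, lat)
print(f"E = {easting:.0f} m, N = {northing:.0f} m")  # false easting keeps E positive
```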
Plates are made up of cool rock with an average thickness of 100 km.
Horizontal and vertical deformation of the Earth's crust is due to a variety of geophysical processes that take place on various spatiotemporal scales. The quality of the observations from space-based geodesy instruments, such as Global Positioning System (GPS) and differential interferometric synthetic aperture radar (DInSAR) data, for monitoring these deformations depends on numerous error sources. Therefore, accurately identifying and eliminating the dominant sources of error, such as the troposphere error in GPS signals, is fundamental to obtaining high-quality, sub-centimetre accuracy in positioning results.
In this work, I present the results of double-differenced processing of five years of GPS data, between 2008 and 2012, for sparsely distributed GPS stations in southeastern Ontario and western Québec. I employ Bernese GPS Software Version 5.0 (BSW5.0) and find two optimal sub-networks that provide high-accuracy estimates of the position changes. I demonstrate good agreement between the resulting coordinate time series and the estimates of crustal motion obtained from a global solution. In addition, I analyze the GPS position time series using a complex noise model, a combination of white and power-law noise. The estimated spectral index of the noise model demonstrates that flicker noise is the dominant noise at most GPS stations in our study area. The interpretation of the observed velocities suggests that they provide an accurate constraint on glacial isostatic adjustment (GIA) prediction models.
Based on a deeper analysis of these same GPS stations, I propose a model that accurately estimates the seasonal amplitude of the zenith tropospheric delay (ZTD) error in the GPS data on local and regional spatial scales. I process the data for the period 2008 through 2012 from eight GPS stations in eastern Ontario and western Québec using the precise point positioning (PPP) online analysis available from Natural Resources Canada (NRCan) (https://webapp.geod.nrcan.gc.ca/geod/tools-outils/ppp.php). The model is elevation-dependent and is a function of the decay parameter of refractivity with altitude and the seasonal amplitude of refractivity computed from atmospheric data (pressure, temperature, and water vapour pressure) at a given reference station. I demonstrate that it can accurately estimate the seasonal amplitude of ZTD signals for GPS stations at any altitude relative to that reference station. A comparison of the observed seasonal amplitudes of the differenced ZTD at each station with the estimates from the proposed model shows that it provides an accurate estimate for stations under normal atmospheric conditions. The differenced ZTD is defined as the difference between the ZTD derived from PPP at each station and the ZTD at the reference station. Moreover, I successfully compute five years of precipitable water vapour (PWV) at each GPS site, based on the ZTD derived from meteorological data and GPS processing. The results provide an accurate platform for monitoring long-term climate change and informing future weather predictions.
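The ZTD-to-PWV step mentioned above is commonly done by subtracting a Saastamoinen hydrostatic delay from the total delay and scaling the wet remainder by the dimensionless factor Pi(Tm). The sketch below uses the widely cited Bevis et al. constants; the input values are illustrative, and this is a textbook formulation rather than the thesis's exact processing chain.

```python
import numpy as np

def zhd_saastamoinen(p_hpa, lat_rad, h_km):
    """Zenith hydrostatic delay (m) from surface pressure (Saastamoinen)."""
    return 0.0022768 * p_hpa / (1.0 - 0.00266 * np.cos(2.0 * lat_rad) - 0.00028 * h_km)

def pwv_from_ztd(ztd_m, p_hpa, lat_deg, h_km, tm_k):
    """Precipitable water vapour (m) from the total zenith delay."""
    zwd = ztd_m - zhd_saastamoinen(p_hpa, np.radians(lat_deg), h_km)
    k2p, k3 = 22.1, 3.739e5        # K/hPa and K^2/hPa (Bevis et al. values)
    rho_w, r_v = 1000.0, 461.5     # water density kg/m^3; vapour gas constant J/(kg K)
    pi_factor = 1e8 / (rho_w * r_v * (k3 / tm_k + k2p))  # 1e8 because k's are per hPa
    return pi_factor * zwd

# Illustrative inputs only: ZTD from PPP, surface pressure, latitude, station
# height, and a weighted mean temperature Tm (often derived from surface T).
pwv = pwv_from_ztd(ztd_m=2.40, p_hpa=990.0, lat_deg=45.4, h_km=0.2, tm_k=270.0)
print(f"PWV ~ {pwv * 1000:.1f} mm")
```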
In an extension of this research, I analyze DInSAR data between 2014 and 2017 with high temporal and spatial resolution from the Kilauea volcano in Hawaii in order to derive the spatial and temporal pattern of the seasonal amplitude of ZTD. I propose an elevation-dependent model, constrained by data from a radiosonde station and observations at a surface weather station, for modelling the seasonal amplitude of ZTD at any arbitrary elevation. The results obtained from this model fit the vertical profile of the observed seasonal amplitude of ZTD in the DInSAR data, which increases systematically from the elevation of the DInSAR reference point. I demonstrate that the proposed model can be used to estimate the seasonal amplitude of the differenced ZTD at each GPS station within a local network with high accuracy. This study concludes that employing this model in GPS processing applications eliminates the need for meteorological observations at each GPS site.