kntu
Engineering Journal of Geospatial Information Technology
2008-9635
9
2
2021
10
1
Classification of hyperspectral images by fusion of spectral and spatial features in convolutional neural networks
1
27
FA
Obeid
Sharifi
K.N. Toosi University of Technology
Behnam
Asghari Beirami
K.N. Toosi University of Technology
Mehdi
Mokhtarzade
K.N. Toosi University of Technology
Hyperspectral images are useful for monitoring Earth-surface phenomena because they acquire a large number of spectral bands. Classification is the most important field of hyperspectral data processing, and many attempts have been made to increase its accuracy. Convolutional neural networks (CNNs) and spatial features have played a major role in improving hyperspectral image classification accuracy in recent years. Previous research, however, has paid little attention to combining the capabilities of low-level spatial feature extraction methods with convolutional neural networks. For this reason, this paper introduces a new CNN architecture for hyperspectral image classification that feeds different combinations of spectral features and spatial features, derived from morphological profiles, Gabor filters, and the local binary pattern (LBP), as input vectors to the proposed network. Experiments conducted on two real hyperspectral images, one from an agricultural area and one from an urban area, show the superiority of the proposed method (by about 2.5%) over several recent spectral-spatial classification methods.
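As a hedged illustration of one of the spatial features the abstract names, the sketch below computes the classic 8-neighbour local binary pattern (LBP) for the interior pixels of an image. This is the basic, non-rotation-invariant LBP variant; the paper may use a different variant or neighbourhood, so treat this only as a sketch of the idea.

```python
def lbp_8(img):
    """Basic 8-neighbour LBP for interior pixels.

    img: 2D list (or nested sequence) of numbers.
    Each interior pixel is compared with its 8 neighbours; every
    neighbour that is >= the centre contributes one bit to an
    8-bit code. Returns a (h-2) x (w-2) list of codes.
    """
    # neighbour offsets, clockwise from top-left; the bit order is a
    # convention and varies between LBP implementations
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
            (1, 1), (1, 0), (1, -1), (0, -1)]
    h, w = len(img), len(img[0])
    out = []
    for y in range(1, h - 1):
        row = []
        for x in range(1, w - 1):
            code = 0
            for bit, (dy, dx) in enumerate(offs):
                if img[y + dy][x + dx] >= img[y][x]:
                    code |= 1 << bit
            row.append(code)
        out.append(row)
    return out
```

On a constant image every neighbour ties with the centre, so each code is 255; an isolated bright pixel yields code 0, which is how LBP encodes local texture rather than absolute brightness.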
Simultaneous Location-Allocation of multiple Facilities using Multi-objective Evolutionary Algorithm based on Decomposition
29
49
FA
Sara
Beheshtifar
Tabriz University
Choosing proper locations for service centers can play an important role in reducing users' travel costs, using land desirably, and regulating the interactions among different facilities. When the location-allocation (LA) problem is solved for each new service center independently, only the effects of existing land uses are taken into consideration, whereas establishing one facility, through its impact on the surrounding space, may constrain the establishment of the other required facilities. Locating all the required centers simultaneously therefore yields a better arrangement of the centers in an area. The main objective of this study is to solve the LA problem for several service centers with similar or dissimilar services simultaneously in a GIS environment. For this purpose, the Multi-Objective Evolutionary Algorithm based on Decomposition (MOEA/D) is used to optimize three objective functions: minimizing travel costs, maximizing the suitability of the selected sites, and maximizing the compatibility among the new service centers. The results show that this method obtains acceptable solutions for the arrangement of the service centers in the study area with respect to the defined objectives. A comparison with the Non-Dominated Sorting Genetic Algorithm II (NSGA-II), one of the most common optimization algorithms, on various criteria shows that MOEA/D performs well in finding optimized answers: none of its solutions were dominated by the NSGA-II solutions, while the reverse was not true. Moreover, in terms of the closeness of the answers to the ideal point, MOEA/D generated better solutions (0.16), and its run time was 25% of that of NSGA-II.
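MOEA/D works by decomposing the multi-objective problem into many scalar subproblems, one per weight vector. The abstract does not state which scalarizing function the authors used; the sketch below shows the common Tchebycheff default, purely as an assumption-laden illustration of the decomposition idea.

```python
def tchebycheff(f, weights, ideal):
    """Tchebycheff scalarization used in MOEA/D-style decomposition.

    f:       objective values of one candidate solution (minimization form)
    weights: the weight vector defining this scalar subproblem
    ideal:   the ideal point (best value seen for each objective)

    Each weight vector turns the objective vector into one number; MOEA/D
    then optimizes each such subproblem cooperatively with its neighbours.
    """
    return max(w * abs(fi - zi) for w, fi, zi in zip(weights, f, ideal))
```

For example, with objectives (2, 3), equal weights (0.5, 0.5), and ideal point (0, 0), the subproblem value is max(1.0, 1.5) = 1.5; solutions closer to the ideal point along this weight direction score lower.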
Appropriate Copula Investigation for Modeling the Dependence Structure of Tropospheric Delay Data in the Mountainous Area of Central Europe
51
65
FA
Roya
Mousavian
Masoud
Mashhadi Hossainali
Christof
Lorenz
The troposphere is the lowest and, from the point of view of electromagnetic signal propagation, one of the most complex layers of the Earth's atmosphere. Electromagnetic signals travelling through this medium are affected and arrive at the receivers with a delay. In GNSS applications, regardless of whether this effect is treated as signal or noise, it has to be modeled efficiently. To this end, the problem is first discretized in space and time. To adopt an appropriate temporal and spatial resolution for the model, a-priori information on the dependence structure of the input data is indispensable. This study applies copulas as a mathematical tool for modeling the dependence structure of Zenith Tropospheric Delays (ZTD). For this purpose, four of the most common Archimedean copulas, i.e. Frank, Clayton, Gumbel, and Ali-Mikhail-Haq (AMH), are used. The south-east of Germany, together with the neighboring areas of the Czech Republic and Austria, is selected as the study region. For this evaluation, hourly ZTD time series from April to October 2016 are calculated using three-dimensional meteorological parameters extracted from the Weather Research and Forecasting (WRF) model. The appropriate copula is identified by two common criteria, the Akaike Information Criterion (AIC) and the Bayesian Information Criterion (BIC). In most cases the results suggest the Gumbel copula for the test field. This copula is asymmetric and exhibits tail dependence, especially upper-tail dependence, within the input data, which implies that tropospheric delays are more strongly associated at larger values. Moreover, the results confirm that the Pearson correlation is not always an appropriate measure for analyzing the dependence structure in local-scale troposphere modeling, and they emphasize the necessity of a dynamic model based on the dependence structure of tropospheric delays.
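The copula-selection step the abstract describes reduces, once each candidate copula has been fitted, to comparing AIC and BIC values. A minimal sketch, assuming the maximized log-likelihood and parameter count of each fitted copula are already available (fitting them to the ZTD data is the hard part and is not shown):

```python
import math

def aic(loglik, k):
    """Akaike Information Criterion: 2k - 2*ln(L). Lower is better."""
    return 2 * k - 2 * loglik

def bic(loglik, k, n):
    """Bayesian Information Criterion: k*ln(n) - 2*ln(L). Lower is better."""
    return k * math.log(n) - 2 * loglik

def best_copula(fits, n):
    """Select the best candidate under each criterion.

    fits: dict mapping copula name -> (max log-likelihood, n_parameters).
    n:    number of observations used in the fit.
    Returns (winner by AIC, winner by BIC). The names and numbers used
    below are illustrative placeholders, not values from the paper.
    """
    by_aic = min(fits, key=lambda c: aic(*fits[c]))
    by_bic = min(fits, key=lambda c: bic(fits[c][0], fits[c][1], n))
    return by_aic, by_bic
```

Because all four Archimedean families here are one-parameter copulas, AIC and BIC differ only by a constant and will usually agree, as the abstract's Gumbel result suggests.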
Estimation of parameters (date and magnitude) of two strong earthquakes in Iran by integrating different earthquake precursors
67
81
FA
Mohammad Mahdi
Khoshgoftar
University of Tehran
Mohammad Reza
Saradjian
University of Tehran
Natural hazards cause thousands of deaths and millions of dollars of financial losses around the world every year. The earthquake is one natural hazard that receives special attention because it usually occurs with very little or no warning. Earthquake precursors can serve as an alarm for impending earthquakes. Since a single precursor cannot accurately predict an earthquake, different types of precursors must be integrated. In this paper, the precursors total electron content (TEC), land surface temperature (LST), aerosol optical depth (AOD), and surface latent heat flux (SLHF) are studied and analyzed for two severe earthquakes, in Kermanshah and Bam. The median and interquartile range are used to detect anomalies. When an earthquake-related disturbance is detected, the number of days relative to the earthquake date is estimated based on the type of precursor. Then, according to the deviation of the precursor from its normal state, the magnitude of the impending earthquake is estimated. To evaluate the final earthquake parameters (date and magnitude) for each region, the mean square error (MSE) method is used. The date and magnitude of the earthquake were estimated for each precursor for the Kermanshah and Bam earthquakes. By combining the earthquake parameters obtained from all precursors, the final parameters were estimated for these two earthquakes; according to the obtained results, the estimated dates and magnitudes correspond almost exactly to the recorded ones. It can therefore be concluded that increasing the number and variety of earthquake precursors can lead to more accurate estimation of earthquake parameters.
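The anomaly-detection step based on the median and interquartile range can be sketched as follows. The fence factor k = 1.5 is a common convention, not necessarily the authors' choice, and the quantile interpolation is one of several standard definitions.

```python
def iqr_anomalies(series, k=1.5):
    """Flag indices whose values fall outside [median - k*IQR, median + k*IQR].

    series: list of numbers (e.g. a daily TEC or LST time series).
    Returns the list of anomalous indices. This is a generic sketch of
    median/IQR outlier screening; the paper's exact thresholds may differ.
    """
    s = sorted(series)
    n = len(s)

    def q(p):
        # linear-interpolation quantile on the sorted values
        idx = p * (n - 1)
        lo = int(idx)
        hi = min(lo + 1, n - 1)
        return s[lo] + (idx - lo) * (s[hi] - s[lo])

    med = q(0.5)
    iqr = q(0.75) - q(0.25)
    lo_b, hi_b = med - k * iqr, med + k * iqr
    return [i for i, v in enumerate(series) if v < lo_b or v > hi_b]
```

For the series [1, 2, 3, 4, 100] the median is 3 and the IQR is 2, so the fences are [0, 6] and only the last value is flagged as a potential earthquake-related disturbance.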
Evaluating the usability of citizen-centered geographic information systems in solving urban spatial problems
83
104
FA
University of Tehran
Mohammadreza
Jelokhani-Niaraki
University of Tehran
Majid
Kiavarz
University of Tehran
A wide range of the problems and needs of urban society are spatial in nature and vary from place to place throughout the city. Since citizens know their living environment and its problems better than anyone else, their participation in reporting urban problems is very important. Citizen-centered, or participatory, spatial information systems provide a practical and effective platform for citizens' participation in solving urban problems. For these systems to be accepted by the urban community, they must offer a high level of usability. The present study therefore designs and develops a citizen-centered spatial information system for reporting urban problems and evaluates its usability (the four parameters of effectiveness, efficiency, learnability, and satisfaction) in District 6 of Tehran. In terms of effectiveness, the system scored highest (36) on the "drawing on map" activity and lowest (1) on the "descriptive editing of reports" activity. Regarding efficiency, the time spent completing the first report ranged from a minimum of 11 seconds to a maximum of 76 seconds, which shows the relatively good performance of the system. The results also show that, in general, most users try to learn the system: regarding the learnability parameter, the time spent on the guidelines ranged from zero (not studying the guidelines) to 46 seconds, with an average of 17.95 seconds, which is a good result. Finally, regarding the satisfaction index, the average score assigned to the system (3.95 out of 5) indicates a high level of satisfaction.
Developing a spatial and temporal density-based clustering algorithm to extract stop locations from the user’s trajectory
105
128
FA
Negin
Masnabadi
Shahid Rajaee University
Farhad
Hosseinali
Shahid Rajaee University
Zahra
Bahramian
University of Tehran, College of Engineering
Identifying the stopping points of trajectories is a preliminary and necessary step in studying moving objects and has a major impact on spatial plans and services. In this study we extract stopping points by trajectory clustering. DBSCAN (density-based spatial clustering of applications with noise) is the basic density-based clustering algorithm; despite its advantages, it has shortcomings such as the difficulty of determining its input parameters, the inability to detect clusters with different densities, and inattention to the round-trip problem. The proposed method, which is density-based, uses spatial and temporal indices and several neighborhood radii to extract stopping points. Its advantages are solving the round-trip problem, extracting clusters with different densities, and reducing the dependence of the results on the input parameters. To evaluate the proposed method, it was applied to data collected with a handheld GPS in the city of Arak and to data from the Geolife research project. The results were compared with those of five other algorithms: DBSCAN, ST-DBSCAN, VDBSCAN, DVBSCAN, and K-means. On the handheld GPS route data from Arak, the proposed algorithm and the algorithms just mentioned correctly extracted 100%, 25%, 75%, 50%, 75%, and 50% of the stop locations, respectively, which shows the superiority of the developed method. In addition, after extracting the stopping and moving points, indicators were derived from the Geolife data to distinguish working days from non-working days (holidays), for which the proposed method achieved up to 94.06% success. The results show a reduced dependence of the results on the input parameters, accurate extraction of stopping points, a smaller standard deviation within the clusters, and a larger distance between cluster centers.
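The core building block of spatio-temporal density clustering of the kind described here is a neighborhood test that combines a spatial radius with a temporal one. The sketch below shows that test in its simplest form; the paper's method layers several neighborhood radii and stop/move logic on top, so this is only an illustration of the underlying predicate, with made-up parameter names.

```python
import math

def st_neighbors(points, i, eps_s, eps_t):
    """Indices of trajectory points in the spatio-temporal neighborhood of point i.

    points: list of (x, y, t) trajectory fixes.
    eps_s:  spatial radius (same units as x, y).
    eps_t:  temporal radius (same units as t).
    A point qualifies only if it is close in BOTH space and time, which is
    what separates a genuine stop from a revisit of the same place later
    (the round-trip problem a purely spatial DBSCAN cannot resolve).
    """
    x, y, t = points[i]
    out = []
    for j, (xj, yj, tj) in enumerate(points):
        if j == i:
            continue
        if math.hypot(xj - x, yj - y) <= eps_s and abs(tj - t) <= eps_t:
            out.append(j)
    return out
```

With this predicate, a point whose neighborhood contains at least some minimum number of fixes becomes a core point of a stop cluster, exactly as in DBSCAN, but a spatially identical fix recorded hours later is no longer counted as a neighbor.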