• List of Articles: Maximum Likelihood

      • Open Access Article

        1 - Investigation of forest land use degradation due to dam construction using satellite image processing
        Mandana Azizi, Mohammad Panahandeh
        Identifying land uses and land use changes in order to investigate and monitor sensitive areas is essential for sustainable land planning and management. The main objective of this study is to investigate the land use changes caused by the construction of the Shafarood Dam in the Hyrcanian forests of northern Iran over a 17-year period using Landsat satellite imagery. To do this, three satellite images from the years 2000, 2013, and 2017 were used; geometric and atmospheric corrections were applied to the images, and a land use map for each section of the region was produced using the maximum likelihood classification method. The produced maps have a Kappa coefficient above 86% and a user's accuracy of 0.83. After classification, post-classification comparison was used to monitor the land use changes. The results revealed that in all three years, most of the land cover of the Shafarood watershed belongs to the forest class, followed by the rangeland class. A continuous decline of the forest class occurred, from 63.05 percent to 57.27 and 57.22 percent in the first section for the years 2013 and 2017, respectively. The continuous increase of the rock class (8.15, 9.10, 10.45%) and bare land (3.5, 4.47, 5.08%) confirms this trend in the study area. The environmental challenges of constructing the Shafarood dam further emphasize the importance of conducting advanced, specialized studies based on ecological methodologies, and of increasing decision makers' awareness of the complexity of the Hyrcanian forests, which formed over a very long period.
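The maximum likelihood classifier used in the study above assigns each pixel to the class whose fitted Gaussian model gives the highest likelihood. A minimal sketch (not the authors' implementation; synthetic band values stand in for Landsat spectra):

```python
import numpy as np

def fit_ml_classifier(X, y):
    """Estimate per-class mean and covariance from labeled pixel spectra."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        cov = np.cov(Xc, rowvar=False)
        params[c] = (Xc.mean(axis=0), np.linalg.inv(cov), np.linalg.slogdet(cov)[1])
    return params

def classify_ml(X, params):
    """Assign each pixel to the class with the highest Gaussian log-likelihood."""
    labels = sorted(params)
    scores = []
    for c in labels:
        mu, cov_inv, logdet = params[c]
        d = X - mu
        # Log-likelihood up to a constant: -0.5 * (log|S| + d' S^-1 d)
        scores.append(-0.5 * (logdet + np.einsum('ij,jk,ik->i', d, cov_inv, d)))
    return np.array(labels)[np.argmax(scores, axis=0)]
```

Equal-covariance and well-sampled training classes are assumed; in practice the training pixels come from ground-truth polygons for each land use class.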
      • Open Access Article

        2 - Separating alteration units in the Takht-e-Gonbad district by comparing two classification methods: support vector machine and maximum likelihood
        Davoud Nazari, Neda Mahvash Mohammadi, Adabi, Mohammad Ghavidel-Syooki, Haniyeh Kalani
        Separation of alteration units plays an important role in the exploration of ore deposits. In the past, classical methods were used for this purpose. Recently, the support vector machine (SVM), one of the most important data-driven models, has been applied for geological purposes. This algorithm is a learning system based on constrained optimization theory. In this study, the SVM algorithm with various kernels and the maximum likelihood method were used to separate the alteration units of the Takht-e-Gonbad district, situated in the Chahar Gonbad sheet, using satellite images from the ASTER sensor. The results were analyzed and evaluated against field studies. Based on the achieved results and the field studies, the SVM method with the RBF kernel function had the highest accuracy (89.17%) and kappa coefficient (0.83) compared to the other kernels and the maximum likelihood method. Thus, the SVM method is more accurate for classification of alteration units than the other discussed methods.
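The SVM-with-RBF-kernel comparison described above can be sketched with scikit-learn. This is an illustrative stand-in, not the authors' pipeline: synthetic three-band values replace the ASTER imagery, and the class separation is arbitrary.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score, cohen_kappa_score

# Synthetic stand-in for ASTER band values of two alteration units
rng = np.random.default_rng(1)
X_train = np.vstack([rng.normal(0, 1, (100, 3)), rng.normal(2, 1, (100, 3))])
y_train = np.array([0] * 100 + [1] * 100)
X_test = np.vstack([rng.normal(0, 1, (50, 3)), rng.normal(2, 1, (50, 3))])
y_test = np.array([0] * 50 + [1] * 50)

# RBF kernel, the configuration the study found most accurate
clf = SVC(kernel='rbf', C=10, gamma='scale').fit(X_train, y_train)
pred = clf.predict(X_test)

# The same two figures of merit the abstract reports
acc = accuracy_score(y_test, pred)
kappa = cohen_kappa_score(y_test, pred)
```

Swapping `kernel='rbf'` for `'linear'` or `'poly'` reproduces the kind of kernel comparison the study performed.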
      • Open Access Article

        3 - Low-Complexity Iterative Detection for Uplink Multiuser Large-Scale MIMO
        Mojtaba Amiri, Mahmoud Ferdosizade Naeiny
        In massive Multiple Input Multiple Output (MIMO) or large-scale MIMO systems, uplink detection at the Base Station (BS) is a challenging problem due to the significant increase in dimensionality compared to ordinary MIMO systems. In this letter, a novel iterative method is proposed for detection of the transmitted symbols in uplink multiuser massive MIMO systems. Linear detection algorithms, such as minimum mean square error (MMSE) and zero forcing (ZF), are able to approach the performance of the near-optimal detector when the number of BS antennas is high enough, but the complexity of linear detectors in massive MIMO systems is high because they require the inversion of a large matrix. In this paper, we address the problem of reducing the complexity of the MMSE detector for massive MIMO systems. The proposed method is based on the Gram-Schmidt algorithm, which improves the convergence speed and also provides a better error rate than the alternative methods. It is shown that the complexity order is reduced from O(n_t^3) to O(n_t^2), where n_t is the number of users. The proposed method avoids the direct computation of the matrix inverse. Simulation results show that the proposed method improves the convergence speed and achieves the performance of the MMSE detector with considerably lower computational complexity.
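The key idea above, solving the MMSE system iteratively instead of inverting the Gram matrix, can be illustrated with a generic iterative solver. The sketch below uses Gauss-Seidel sweeps as a stand-in for the paper's Gram-Schmidt-based scheme (the paper's actual algorithm is not reproduced here); each sweep costs O(n_t^2), avoiding the O(n_t^3) inversion.

```python
import numpy as np

def mmse_gauss_seidel(H, y, sigma2, n_iter=20):
    """Iteratively solve the MMSE system (H^H H + sigma^2 I) x = H^H y
    without explicit matrix inversion, using Gauss-Seidel sweeps.
    Converges because the Gram matrix of a tall massive-MIMO channel
    is Hermitian positive definite and strongly diagonally dominant."""
    A = H.conj().T @ H + sigma2 * np.eye(H.shape[1])
    b = H.conj().T @ y
    x = np.zeros(H.shape[1], dtype=complex)
    for _ in range(n_iter):
        for i in range(len(x)):
            # Residual excluding the diagonal term, then update x[i]
            r = b[i] - A[i] @ x + A[i, i] * x[i]
            x[i] = r / A[i, i]
    return x
```

With many more BS antennas than users (e.g. 64 antennas, 8 users), a handful of sweeps already matches the exact MMSE solution closely.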
      • Open Access Article

        4 - Array Processing Based on GARCH Model
        H. Amiri, H. Amindavar, M. Kamarei
        In this paper, we propose a new model for additive noise in array signal processing based on GARCH time series. For reasons such as implementation complexity and computational cost, the probability distribution function of additive noise is usually assumed Gaussian. In various applications, however, scrutiny and measurement of noise show that it can sometimes be significantly non-Gaussian, so methods based on Gaussian noise degrade under actual conditions. A heavy-tailed probability density function (PDF) and time-varying statistical characteristics (e.g., variance) are the main features of the additive noise process. On the other hand, the GARCH process has important properties such as a heavy-tailed PDF (excess kurtosis) and volatility modeling through a feedback mechanism on the conditional variance, so the GARCH model is a good candidate for modeling additive noise in array processing applications. In this paper, we propose a new method based on GARCH using the maximum likelihood approach in array processing, and we verify its performance in estimating the Directions-of-Arrival of sources against other methods and against the Cramer-Rao Bound.
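The two GARCH properties the abstract relies on, heavy tails and variance that feeds back on itself, are easy to see in a simulated GARCH(1,1) process. A minimal sketch with illustrative parameter values (not taken from the paper):

```python
import numpy as np

def simulate_garch11(n, omega=0.1, alpha=0.2, beta=0.7, seed=0):
    """Simulate a GARCH(1,1) process: Gaussian innovations scaled by a
    conditional variance h[t] = omega + alpha*x[t-1]^2 + beta*h[t-1],
    which produces volatility clustering and a heavy-tailed marginal PDF."""
    rng = np.random.default_rng(seed)
    x = np.zeros(n)
    h = np.full(n, omega / (1 - alpha - beta))  # unconditional variance
    for t in range(1, n):
        h[t] = omega + alpha * x[t - 1] ** 2 + beta * h[t - 1]
        x[t] = np.sqrt(h[t]) * rng.standard_normal()
    return x, h

x, h = simulate_garch11(50000)
# Positive excess kurtosis means tails heavier than a Gaussian of equal variance,
# which is exactly why a Gaussian noise assumption underfits such noise.
excess_kurtosis = np.mean(x ** 4) / np.mean(x ** 2) ** 2 - 3
```

Even though each innovation is conditionally Gaussian, the feedback through h[t] makes the marginal distribution leptokurtic, the property the paper exploits for modeling non-Gaussian array noise.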
      • Open Access Article

        5 - DRSS-Based Localization Using Convex Optimization in Wireless Sensor Networks
        Hassan Nazari, M. R. Danaee, M. Sepahvand
        Localization with differential received signal strength (DRSS) measurements has received much attention in recent years. Because the probability density function of the observations is known, the maximum likelihood estimator is used; this estimator is asymptotically the optimal estimate of the location. After forming this estimator, it is observed that the corresponding cost function is highly nonlinear and non-convex and has many local minima, so the global minimum cannot be reached with Newton's method and the localization error will be high. There is no analytical solution for this cost function. To overcome this problem, two methods exist. First, the cost function can be approximated by a linear estimator, but this estimator has poor accuracy. The second method is to replace the non-convex cost function with a convex one using convex optimization techniques, in which case the global minimum is obtained. In this paper, we propose a new convex estimator to solve the cost function of the maximum likelihood estimator. Simulation results show that the proposed estimator performs up to 20 percent better than existing estimators; moreover, its execution time is 30 percent faster than that of other convex estimators.
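The non-convexity described above can be illustrated with the DRSS maximum likelihood cost itself. The sketch below is a hypothetical 2-D setup (four anchors, one source, noise-free measurements), not the paper's convex estimator; it sidesteps the local minima with multistart local search rather than convex relaxation.

```python
import numpy as np
from scipy.optimize import minimize

def drss_cost(p, anchors, drss, gamma=2.0):
    """ML least-squares cost for DRSS: drss[i] is the path-loss difference
    (dB) between anchor i+1 and the reference anchor 0."""
    d = np.linalg.norm(anchors - p, axis=1)
    model = 10 * gamma * np.log10(d[1:] / d[0])
    return np.sum((drss - model) ** 2)

# Hypothetical geometry and path-loss exponent
anchors = np.array([[0., 0.], [10., 0.], [0., 10.], [10., 10.]])
source = np.array([3., 4.])
gamma = 2.0
d = np.linalg.norm(anchors - source, axis=1)
drss = 10 * gamma * np.log10(d[1:] / d[0])  # noise-free measurements

# The cost surface is non-convex, so a single local search can stall in a
# local minimum; restarting from several initial points and keeping the
# best result is a crude workaround for the problem the paper solves
# properly via a convex reformulation.
best = min((minimize(drss_cost, p0, args=(anchors, drss, gamma))
            for p0 in ([1., 1.], [5., 5.], [9., 9.])),
           key=lambda r: r.fun)
```

With noise added to `drss`, single-start Newton-type searches fail much more often, which is the motivation for the convex estimator proposed in the paper.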