• List of Articles: Software

      • Open Access Article

        1 - Presenting an Initial Estimation Method for Logical Transaction-Based Software Projects
        Mehrdad Shahsavari
        The first and most basic requirement for the successful launch of a project is having a realistic and reasonable estimate. In this paper, in order to increase the accuracy of software project estimation and reduce the complexity of the estimation process, we introduce a method called the "Logical Transaction Point (LTP)". Our method is most appropriate for transactional software. Using it, one can estimate the size of each use case and the size of the whole software. We show that the technique is more accurate than the UCP method and, owing to its greater transparency and simplicity, easier to deploy. The main bases for the method are function point analysis (FPA) and use case point (UCP) estimation.
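        As a rough illustration of the UCP baseline the paper compares against, the standard calculation can be sketched as follows. The weights and factor formulas are the classic Karner values, not the paper's LTP-specific adjustments:

```python
# A minimal sketch of standard Use Case Point (UCP) estimation.
# Actor/use-case weights and the TCF/ECF formulas are the classic
# Karner constants; the example inputs are illustrative.

ACTOR_WEIGHTS = {"simple": 1, "average": 2, "complex": 3}
USE_CASE_WEIGHTS = {"simple": 5, "average": 10, "complex": 15}

def use_case_points(actors, use_cases, tf_sum, ef_sum):
    """actors/use_cases: complexity labels; tf_sum/ef_sum: weighted sums
    of the technical and environmental factor ratings."""
    uaw = sum(ACTOR_WEIGHTS[a] for a in actors)         # Unadjusted Actor Weight
    uucw = sum(USE_CASE_WEIGHTS[u] for u in use_cases)  # Unadjusted Use Case Weight
    tcf = 0.6 + 0.01 * tf_sum                           # Technical Complexity Factor
    ecf = 1.4 - 0.03 * ef_sum                           # Environmental Complexity Factor
    return (uaw + uucw) * tcf * ecf

ucp = use_case_points(["simple", "complex"], ["average", "average"], 30, 20)
```

        An effort estimate is then usually obtained by multiplying the UCP by a productivity factor (person-hours per point).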
      • Open Access Article

        2 - QoS Evaluation and Optimization for Mobile Telecommunication Networks Using Key Performance Indicators: A Case Study for Kerman City
        Mohsen Sheikh-Hosseini
        The process of quality of service (QoS) evaluation and optimization for mobile telecommunication networks consists of daily collection of key performance indicator (KPI) reports, determination and fixing of defects, and network testing to ensure the defects have been removed. The drive test is the common testing approach, but it is time-consuming and expensive due to the need for experts and equipment. To overcome these limitations, this paper proposes a new software-based testing method using the OPTIMA software. The performance of the proposed method, together with the drive test approach, has been studied and evaluated for the 70 BTS sites of the Kerman GSM network as the statistical population. The results demonstrate that the proposed method provides performance similar to that of the drive test for verifying defect fixing. However, the software method not only removes the dependency on experts and equipment, but also reduces the testing time by about one-third compared to the drive test.
      • Open Access Article

        3 - Using Grounded Theory in Software Engineering Research: A Study of Iran Web Startups
        Gholamhossein Dastghaibyfard
        Web startups are newly established ventures that, under uncertainty and resource constraints, try to enter fast-growing markets by offering cutting-edge solutions via the web. Despite the economic importance of these businesses and their high failure rate, only a few scientific studies have investigated software engineering in them. This research, using the Grounded Theory methodology and interviews with nine Iranian web startups, provides a set of evidence for understanding how software development in startups is formed. The results show that the top priority of these businesses is to enter the market as quickly as possible so that they can find a product that meets a strong market demand by getting feedback from their customers. However, the accelerated time-to-market, due to the lack of engineering activities, requires rebuilding the product and workflow before future growth can begin.
      • Open Access Article

        4 - Routing Improvement to Control Congestion in Software-Defined Networks Using Distributed Controllers
        Saied Bakhtiyari Ardeshir Azarnejad
        Software defined networks (SDNs) are flexible for determining network traffic routing because they separate the data plane and the control plane. One of the major challenges facing SDNs is choosing the right locations to place and distribute controllers, in such a way that the delay between controllers and switches in wide area networks is reduced. Most of the proposed methods have focused on reducing latency, but latency is just one factor in network efficiency and in the overall cost between controllers and their switches. This article considers additional factors in the cost between controllers and switches, such as the traffic on communication links. A cluster-based algorithm is provided for network segmentation; using it, each part of the network can reduce the maximum cost (including delay and link traffic) between the controller and its switches. Extensive simulations have been performed on real network topologies from the Topology Zoo. The results show that when the probability of congestion in the network increases, the proposed algorithm is able to control congestion by identifying the bottleneck links in the communication paths between each pair of nodes. By considering the two criteria of delay and link busyness, controllers are placed and distributed with higher accuracy during clustering. As a result, the maximum end-to-end cost between each controller and its switches decreases by 41.2694%, 29.2853%, 21.3805% and 46.2829% in the Chinanet (China), Uunet (United States), DFN (Germany) and Rediris (Spain) topologies, respectively.
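        The composite cost described here (delay plus link traffic) can be sketched as a shortest-path edge weight; placing a cluster's controller then amounts to a 1-center choice over those costs. The toy graph and its weights below are illustrative, not taken from the paper:

```python
# A minimal sketch: each link's cost combines delay and load, and a
# controller site is scored by the maximum shortest-path cost to the
# switches it serves (a 1-center problem within one cluster).
import heapq

def dijkstra(adj, src):
    """Shortest composite-cost distances from src; adj[u] = [(v, delay, load)]."""
    dist = {src: 0.0}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue
        for v, delay, load in adj[u]:
            nd = d + delay + load            # composite link cost
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

adj = {
    "a": [("b", 1, 0.5), ("c", 4, 0.1)],
    "b": [("a", 1, 0.5), ("c", 1, 0.2)],
    "c": [("a", 4, 0.1), ("b", 1, 0.2)],
}
# place the controller at the node minimizing the worst-case cost to the rest
controller = min(adj, key=lambda n: max(dijkstra(adj, n).values()))
```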
      • Open Access Article

        5 - Evaluation of Land Use Change Using Remote Sensing Data (Case study: Nukabad watershed, Khash city)
        Hossein Piri Sahragard Mohammad Reza Dahmardeh Mansour Rigi
        Identifying and reviewing land use change can help managers and planners identify the factors driving change and make appropriate management decisions at different levels. The present study was carried out to examine land use change and land use classes using remote sensing in the Nukabad watershed, Khash city, Sistan and Baluchestan province. For this purpose, after determining the boundary of the study area using 1:50,000 maps, the corresponding temporal data were obtained from Landsat imagery of the United States Geological Survey. After geometric, radiometric and atmospheric correction, land use classes were determined from the satellite images for the study periods (1994, 2000, 2005, 2010 and 2016). The accuracy of the produced maps was determined using overall accuracy and the kappa coefficient. After loading the land use maps into the ArcGIS software environment, land use changes between the studied periods (each period against the previous one) were compared. The overall classification accuracies of the produced maps for 1994 (97.45%), 2000 (97.21%), 2005 (98.04%), 2010 (97.09%) and 2016 (97.06%) were rated as relatively good. The trend analysis showed that the greatest changes occurred in mountainous rangelands and the least in rivers, while agricultural lands and residential areas changed moderately; the main causes of these changes can be considered climate change and human intervention. These results indicate that correct recognition of land use changes helps managers identify the strengths and weaknesses of their executive plans and take steps to resolve them.
      • Open Access Article

        6 - Analysis of Climatic Fluctuations Elements and Their Effects on Jarahi River Discharge Operation using Non-Parametric Methods
        Reza Borna
        Changes in river flow over time can reveal the effects of climate change, or its absence, in a region. This research was conducted on part of the Jarahi river basin in the south of Khuzestan province to examine the impact of climate change on river discharge in the region. Accordingly, the variability of climate, especially temperature and rainfall, and of hydrological conditions was investigated with the non-parametric Mann-Kendall test and Sen's slope estimator, using the MAKESENS software. Fluctuations in temperature and precipitation over a thirty-year period (1983-2013) were examined: temperature showed an upward trend, with a Mann-Kendall statistic of about 2.21 and a Sen's slope of 0.3, both significant at the 95% level. Precipitation generally showed a negative trend, indicating a decline in atmospheric precipitation in the region, although these fluctuations were not significant in the non-parametric tests. Discharge showed a downward trend at all studied stations, with a Mann-Kendall statistic averaging about -2.4 and a Sen's slope of -1.22, significant at the 95% and 99% levels. It can be concluded that increased temperature fluctuations and decreased rainfall during this period largely reduced river flows.
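        The Mann-Kendall test at the core of this analysis fits in a few lines; the sketch below omits the tie correction that MAKESENS applies, and the sample series is illustrative:

```python
# A minimal sketch of the non-parametric Mann-Kendall trend test:
# S counts concordant minus discordant pairs, and Z is the
# standardized statistic compared against normal quantiles
# (|Z| > 1.96 for 95% significance).
import math

def mann_kendall_z(x):
    n = len(x)
    # S over all pairs i < j: +1 if increasing, -1 if decreasing
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18  # variance of S, no ties
    if s > 0:
        return (s - 1) / math.sqrt(var_s)
    if s < 0:
        return (s + 1) / math.sqrt(var_s)
    return 0.0

z = mann_kendall_z([1.0, 2.1, 1.9, 3.2, 3.8, 4.5])  # clearly rising series
```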
      • Open Access Article

        7 - Analysis of Discontinuity Orientations Using the Monte Carlo Method and Comparison of Its Results with the Dips Software (Case Study: Saidi Ornamental Quarry Stone, Kerman)
        Hasibi Shahram Shafiei Bafti
        Statistical and probabilistic methods are widely used to evaluate uncertainty in data and to validate the answers obtained from calculations. In this study, the Monte Carlo simulation method is used to determine the orientations of discontinuities precisely and to validate the results of the Dips software calculations. In this method, by considering all possible states of the variables, estimating the probability distribution function, and sampling the variables, the final function is simulated. The discontinuity data used here relate to the Saidi ornamental quarry stone. In the first step, the major joint sets were identified using the Dips software. Then, to calculate the orientation of each joint set, 50,000 random dip and dip-direction values were simulated per joint set with the Monte Carlo method. Next, the statistical characteristics of the dip and dip direction of each joint set were derived from the simulated data. Finally, the simulation results were compared with the results of the Dips software. The results show that the accuracy of the Dips software averages 99.38% for the dip direction of discontinuities and 94.34% for the dip.
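        The Monte Carlo step can be sketched as follows: for one joint set, draw 50,000 random dip / dip-direction samples and recover their statistics. The normal distributions and their means and spreads below are illustrative assumptions, not the Saidi quarry values:

```python
# A minimal sketch of Monte Carlo sampling of one joint set's
# orientation. Distribution parameters are made up for illustration.
import random
import statistics

random.seed(42)
N = 50_000
dip = [random.gauss(62.0, 4.0) for _ in range(N)]       # dip, degrees
dip_dir = [random.gauss(215.0, 8.0) for _ in range(N)]  # dip direction, degrees

# statistics of the simulated joint set, to compare against Dips output
dip_mean = statistics.fmean(dip)
dir_mean = statistics.fmean(dip_dir)
```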
      • Open Access Article

        8 - Enhancing Efficiency of Software Fault Tolerance Techniques in Satellite Motion System
        Hoda Banki Babamir Babamir Azam Farokh Mohammad Mehdi Morovati
        This research shows the influence of using a multi-core architecture to reduce execution time and thus increase the performance of some software fault tolerance techniques. Given the superiority of the N-Version Programming and Consensus Recovery Block techniques over other software fault tolerance techniques, the implementations were based on these two methods. A comparison between the two showed that Consensus Recovery Block is the more reliable. Therefore, in order to improve the performance of this technique, we propose a technique named Improved Consensus Recovery Block. A satellite motion system, a scientific computing system, serves as the basis for our experiments. Since any error in the system's calculations may cause total system failure, the system must be error-free, and its execution time must also be acceptable. In our proposed technique, not only is the performance higher than that of the consensus recovery block technique, but the reliability is equal to it. The performance improvement rests on a multi-core architecture in which each version of the software's key units is executed by one core. As a result, by executing the versions in parallel, execution time is reduced and performance is improved.
      • Open Access Article

        9 - Ten Steps for Software Quality Rating Considering ISO/IEC
        Hassan Alizadeh Bahram Sadeghi Bigham Hossein Afsari
        In the software rating area, it is necessary to apply a measurement reference model to evaluate the quality of software. The ISO/IEC 25030 standard is an example of an evaluation system based on stakeholders' requirements. In this study, an attempt has been made to establish a model in which all implicit and explicit requirements of stakeholders, users and policy makers are taken into account. In addition, the AHP method has been followed to weight the indicators used in the model. The results show the applicability of the model for meeting the requirements of Iranian users.
      • Open Access Article

        10 - A New Set Covering Controller Placement Problem Model for Large Scale SDNs
        Ahmad Jalili Reza Akbari Manijeh Keshtgari
        The Software Defined Network (SDN) is an emerging architecture that can overcome the challenges facing traditional networks. SDN enables the administrator/operator to build a simpler, more manageable network. New SDN paradigms encourage deploying multiple (rather than centralized) controllers to monitor the entire system. The Controller Placement Problem (CPP) is one of the key issues in SDN, affecting every aspect of it, such as scalability, convergence time, fault tolerance and node-to-controller latency. This problem has been investigated in various papers, with major attention paid to optimizing the location of an arbitrary number of controllers. Related work has paid less attention to two important issues: i) the bidirectional end-to-end latency between a switch and its controller, rather than propagation latency alone, and ii) finding the minimal number of controllers, which is a prerequisite for locating them. In this paper, a Set Covering Controller Placement Problem Model (SCCPPM) is proposed to find the least number of required controllers with regard to carrier-grade latency requirements. The new model is applied to a set of 124 graphs from the Internet Topology Zoo and solved with the IBM ILOG CPLEX optimization package. As expected, the results indicate that the number of controllers required for high resiliency depends on topology and network size. Moreover, in order to meet carrier-grade requirements, 86 percent of the topologies need more than one controller.
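        The set-covering view can be sketched as follows: each candidate controller site "covers" the switches within a latency bound, and we seek the fewest sites covering every switch. SCCPPM is solved exactly with CPLEX in the paper; a greedy approximation on a toy latency matrix (values made up) is shown here instead:

```python
# A minimal greedy set-cover sketch of controller placement: choose
# candidate sites so every switch is within the latency bound of some
# chosen controller, using as few controllers as the greedy rule finds.

def greedy_set_cover(latency, bound):
    """latency[c][s]: end-to-end latency from candidate site c to switch s."""
    n_sites, n_switches = len(latency), len(latency[0])
    covers = [{s for s in range(n_switches) if latency[c][s] <= bound}
              for c in range(n_sites)]
    uncovered, chosen = set(range(n_switches)), []
    while uncovered:
        # pick the site covering the most still-uncovered switches
        best = max(range(n_sites), key=lambda c: len(covers[c] & uncovered))
        if not covers[best] & uncovered:
            raise ValueError("some switch cannot meet the latency bound")
        chosen.append(best)
        uncovered -= covers[best]
    return chosen

toy = [[3, 9, 9, 4],   # site 0 reaches switches 0 and 3 within bound 5
       [9, 2, 3, 9],   # site 1 reaches switches 1 and 2
       [4, 9, 2, 9]]   # site 2 reaches switches 0 and 2
placement = greedy_set_cover(toy, bound=5)
```

        The exact ILP (as solved by CPLEX) would minimize the number of chosen sites subject to the same coverage constraints; greedy gives a logarithmic-factor approximation.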
      • Open Access Article

        11 - Confronting DDoS Attacks in Software-Defined Wireless Sensor Networks based on Evidence Theory
        Nazbanoo Farzaneh Reyhaneh Hoseini
        DDoS attacks aim at making authorized users unable to access network resources. In the present paper, an evidence theory based security method is proposed to confront DDoS attacks in software-defined wireless sensor networks. The security model, as a security unit, is placed on the control plane of the software-defined wireless sensor network to detect suspicious traffic. The main purpose of this paper is detecting DDoS attacks using the central controller of the software-defined network: an entropy approach serves as an effective, light-weight and quick solution in the early stages of detection, and Dempster-Shafer theory then provides a more exact, if slower, detection. The evidence obtained from the Dempster-Shafer and entropy modules is combined in order to increase the DDoS detection rate, maximize true positives, decrease false negatives, and confront the attack. The results show that providing a security unit on the control plane of a software-defined wireless sensor network is an efficient method for detecting and evaluating the probability of DDoS attacks and increasing the rate of attacker detection.
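        The entropy stage can be sketched in a few lines: the Shannon entropy of destination addresses in a traffic window drops sharply when a flood concentrates on one victim. The threshold and sample windows below are illustrative assumptions, not the paper's parameters:

```python
# A minimal sketch of entropy-based early DDoS detection: low entropy
# of destination addresses in a window flags it as suspicious, to be
# examined further (by Dempster-Shafer evidence combination in the paper).
import math
from collections import Counter

def shannon_entropy(addresses):
    counts = Counter(addresses)
    total = len(addresses)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

normal = ["10.0.0.%d" % (i % 8) for i in range(64)]  # spread over 8 hosts
attack = ["10.0.0.1"] * 60 + ["10.0.0.2"] * 4        # flood on one host

h_normal = shannon_entropy(normal)
h_attack = shannon_entropy(attack)
suspicious = h_attack < 1.0   # illustrative threshold flags the window
```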
      • Open Access Article

        12 - Secured Access Control in Security Information and Event Management Systems
        Leila Rikhtechi Vahid Rafeh Afshin Rezakhani
        Nowadays, Security Information and Event Management (SIEM) is very important in software. A SIEM stores and monitors events in software, and unauthorized access to its logs can prompt different security threats, such as information leakage and violation of confidentiality. In this paper, a novel method is suggested for secured and integrated access control in SIEM. First, the key points where the SIEM accesses information within the software are specified, and integrated access control policies are developed at them. Accordingly, the threats entering the access control module embedded in this system are carefully detected. By applying the proposed method, a secured and integrated access control module can be provided for SIEM, and the security of the access control module in these systems significantly increases. The method is implemented in three stages: requirements analysis for establishing a secure SIEM system, secure architectural design, and secure coding. The access control module is designed to create a secured SIEM, and a test tool module is designed to evaluate the access control module's vulnerabilities. To evaluate the proposed method, a dataset of ten thousand records is considered and the accuracy is calculated; the outcomes show that the accuracy of the proposed method is significantly improved. The results of this paper can be used to design an integrated and secured access control system in SIEM systems.
      • Open Access Article

        13 - Using Static Information of Programs to Partition the Input Domain in Search-based Test Data Generation
        Atieh Monemi Bidgoli Hassan Haghighi
        The quality of test data has an important effect on the fault-revealing ability of software testing. Search-based test data generation reformulates testing goals as fitness functions, so that test data generation can be automated by meta-heuristic algorithms. Meta-heuristic algorithms search the domain of input variables in order to find input data that cover the targets. The domain of input variables is very large even for simple programs, and this size has a major influence on the efficiency and effectiveness of all search-based methods. Despite the large volume of work on search-based test data generation, the literature contains few approaches that address the impact of search space reduction. In order to partition the input domain, this study defines a relationship between the structure of the program and the input domain. Based on this relationship, we propose a method for partitioning the input domain. Then, to search the partitioned space, we select ant colony optimization, one of the important and successful meta-heuristic algorithms. To evaluate the performance of the proposed approach against previous work, we selected a number of different benchmark programs. The experimental results show that our approach achieves 14.40% better average coverage than the competing approach.
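        The search-based idea can be sketched with a fitness function: a branch distance guides the search toward inputs that take a target branch. The program under test and the random search below are illustrative only; the paper uses ant colony optimization over a partitioned input domain:

```python
# A minimal sketch of fitness-guided test data generation: the branch
# distance is zero exactly when the (hypothetical) target branch
# `x == 2 * y` is taken, so minimizing it drives the search to a
# covering input. Random search stands in for the paper's ACO.
import random

def branch_distance(x, y):
    """Distance to satisfying the target predicate x == 2 * y."""
    return abs(x - 2 * y)

random.seed(1)
best = min(((random.randint(-100, 100), random.randint(-100, 100))
            for _ in range(20_000)),
           key=lambda p: branch_distance(*p))
covered = branch_distance(*best) == 0   # did the search cover the branch?
```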
      • Open Access Article

        14 - An Agent Based Model for Developing Air Traffic Management Software
        Mahdi Yosefzadeh Seyed Reza Kamel Tabbakh Seyed Javad Mahdavi Chabok Maryam Khairabadi
        The Air Traffic Management system is a complex domain involving factors such as aircraft collision avoidance, air traffic controller workload, unpredictable weather conditions, in-flight emergencies, airplane hijacking, and the need for in-flight autonomy. Agent-based software engineering is a new direction in software engineering that can provide such autonomy. Agent-based systems have properties such as cooperation of agents to meet their goals, autonomy of operation, learning, and reliability, which can be used in air traffic management systems. In this paper, we first study agent-based software engineering and its methodologies, and then design an agent-based software model for air traffic management. The proposed model has five modules and is designed for the aircraft, air traffic control, and navigation-aid actors based on the Belief-Desire-Intention (BDI) architecture. The agent-based system was designed using agentTool under the Multi-agent Systems Engineering (MaSE) methodology and was eventually developed with the agent-ATC toolkit. The model includes agents for special situations, such as emergency flights and hijacked airplanes in airport air traffic management areas, which increases the accuracy of the work. It also makes the sequencing of take-offs and landings faster, indicating a relative improvement in air traffic management parameters.
      • Open Access Article

        15 - Self-Organizing Map (SOM) Algorithm for DDoS Attack Detection in Distributed Software Defined Networks (D-SDN)
        Mohsen Rafiee Alireza Shirmarz
        The spread of the internet across the world has increased cyber-attacks and threats. One of the most significant threats is denial-of-service (DoS), which prevents a server or network from providing service. This attack can be carried out by distributed nodes in the network acting in collaboration, in which case it is called a distributed denial-of-service (DDoS) attack. A novel architecture has been offered for future networks to make them more agile, programmable and flexible: the software defined network (SDN), whose main idea is the separation of data and control flows. This architecture allows the network administrator to resist DDoS attacks at the centralized controller. The main issue is detecting DDoS flows in the controller. In this paper, the Self-Organizing Map (SOM) method and Learning Vector Quantization (LVQ) are used for DDoS attack detection in an SDN with a distributed control-layer architecture. To evaluate the proposed model, we use a labelled data set, showing that the proposed model improves DDoS attack flow detection to 99.56%. This research can be used by researchers working on improving SDN-based DDoS attack detection.
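        The second classifier mentioned here, Learning Vector Quantization, can be sketched compactly: a labelled prototype moves toward inputs of its own class and away from inputs of other classes. The 2-D flow features and class layout below are illustrative, not the paper's real features:

```python
# A minimal sketch of LVQ1 for flow classification: class 0 = normal
# traffic, class 1 = DDoS. Prototypes are attracted by same-class
# inputs and repelled by other-class inputs.

def nearest(protos, x):
    """Index of the prototype closest to input vector x."""
    return min(range(len(protos)),
               key=lambda i: sum((p - v) ** 2 for p, v in zip(protos[i][0], x)))

def lvq_train(protos, data, epochs=20, lr=0.3):
    for t in range(epochs):
        eta = lr * (1 - t / epochs)                 # decaying learning rate
        for x, label in data:
            i = nearest(protos, x)
            w, plabel = protos[i]
            sign = 1 if plabel == label else -1     # attract or repel
            protos[i] = ([wj + sign * eta * (vj - wj) for wj, vj in zip(w, x)],
                         plabel)

protos = [([0.4, 0.4], 0), ([0.6, 0.6], 1)]         # (weights, class)
data = [([0.1, 0.2], 0), ([0.2, 0.1], 0),           # normal flows
        ([0.9, 0.8], 1), ([0.8, 0.9], 1)]           # attack flows
lvq_train(protos, data)
predicted = protos[nearest(protos, [0.15, 0.15])][1]  # classify a new flow
```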
      • Open Access Article

        16 - High Performance Computing Necessities: Requirements of Future Generations and Research Directions
        Ehsan Arianyan MohammadMahdi Esnaashari Fatemeh Ehsani Boshla Shaghayeghsadat Hossieni Bayan Masoud Dehyadegari Behnam Samadi
        Nowadays, increasing the processing power of supercomputers is a worldwide race. This race signifies the importance of supercomputers in the current era. They are engines of technological improvement in almost all scientific areas, such as computational biology, earth sciences, cosmology, fluid dynamics, and plasma modeling, to name a few. The next generation of supercomputers can be divided into two broad categories: 1) emerging technologies such as neuromorphic and quantum computing, and 2) Exascale computing. Emerging technologies will be the future of supercomputing, although not in the very near future. Therefore, in this paper, we focus on Exascale computing and try to provide a comprehensive overview of the main requirements for this technology to be achieved and become available. The requirements are reviewed from different aspects: hardware, software, artificial intelligence, and cloud computing. In addition, we attempt to provide a complete taxonomy of hot research topics within this area.
      • Open Access Article

        17 - Identifying and Prioritizing Science and Technology Park Services Offered to Companies Active in the Field of Software Technologies
        Saeed Shavalpour Sadegh Hosseinzadeh Maleki Mehdi Ghafori Fard Mir Saman Pishvaee
        The formation of science and technology parks in Iran dates back more than two decades. However, the mission defined for them faces numerous challenges; the parks cannot significantly serve companies operating in the country's software technology sector. One of these challenges is the lack of clear criteria for determining the optimal services to be provided for the development of companies operating in the field of software technology. Hence, this research examines various models of software-oriented science and technology parks worldwide, along with a field study of the opinions of practitioners in the area, and offers proposals for boosting and developing science and technology park services aimed at expanding and upgrading small and medium-sized enterprises in the field of software technologies. The research process includes a literature review and unstructured qualitative interviews to collect information, concluded with a fuzzy decision-making method. The results suggest that basic services, such as low-interest loans for promoting and developing companies and the transfer of the experience companies require, including analysis of the market and the demographic characteristics of their audience, are needed for the development and success of these companies.
      • Open Access Article

        18 - The Effect of Software Innovation on Identity: A Social Approach According to Technological and Science Rules
        Esmaeil Jahanbakhsh Nasrin Abshahi
        In recent decades, it has been established that innovation rules play a determinant role in the process of technological evolution. Given the dynamic relationships between rules and innovation, a comprehensive look at the analysis of technological processes seems necessary. With the increasing use and effect of technology on society, especially in organizations, socio-cultural studies in the area of science and technology are important. This study sociologically evaluated the information technology field, considering the effect of the software renovation of old processes in the Esfahan telecommunication customer system on the identity component. Based on Anthony Giddens' sociological theories, the research used the survey method and a questionnaire technique with 222 people selected through stratified sampling, and specified the relations among the variables. All descriptive and inferential statistical analyses were done with the Statistical Package for the Social Sciences (SPSS). The results show two significant direct relationships: between the software renovation of old processes and the users' values and theories (p<0.001), and between the software renovation of old processes and the users' basic trust (p<0.001). There are also two further direct and significant relationships: between the users' values and theories and the users' self-identity (p<0.02), and between the users' basic trust and the users' self-identity (p<0.001). The users' basic trust (β=0.26, p<0.001) predicts the users' self-identity.
      • Open Access Article

        19 - Suggesting a Model for Academic Humanities Spin-off Incubators: Moving Toward the Fourth-Generation University
        Taraneh Enayati Alireza Aulipour
        Experts believe that the academic humanities are the platform of all sciences in any country; the academic humanities should therefore provide development platforms ahead of the empirical and technical sciences. The movement from training-based universities to entrepreneurship-based universities was driven by turning theory into practice and by knowledge commercialization, which led to the creation of incubators and university spin-off enterprises. What was neglected, however, was the status of the humanities within those incubators and spin-off enterprises. This paper is a qualitative study in which the researchers examined records and database sources and interviewed experts from science and technology parks and incubators. Open, in-depth interviews were used to describe the challenges and problems of knowledge commercialization in the field of the academic humanities. Finally, the researchers suggest a model for humanities spin-off enterprises that takes software technology into account, in order to move within the post-postmodern world, localize the humanities, and move toward the creation of a fourth-generation university.
      • Open Access Article

        20 - Successful Strategy for Determining the Target Market of Iranian Knowledge-Based System Software Developer Companies: A Case Study
        Ahmad Reza Jafarian-Moghaddam Hamid Reza Jafarian Moghaddam Mehdi Hajimoradi Mohammad Mohammadpour Darzi Naghibi
        The design and development of system software is extremely costly and time-consuming, and system software has its own distinct customers. In Iran, the time between presenting a product to a customer and that customer's requesting and agreeing to buy system software is very long, which has enormously increased the sales and marketing costs of Iranian knowledge-based companies. Accordingly, Iranian knowledge-based system software developers need to increase their sales success rate by correctly determining the product's target market. Using multi-criteria decision making (MCDM), this paper attempts to provide guidance on precisely determining the target market for system software in Iranian knowledge-based companies. The proposed method was implemented as a case study at PDNSoft Co., a knowledge-based company and one of the Iranian developers of system software products. The implementation showed that the target market in Iran should consist of certain groups of government customers, and the method improved the customer attraction rate from 0.06 to 0.13 per month.
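        The abstract does not say which MCDM variant the paper uses. Purely as an illustration of the general idea, a weighted-sum ranking over candidate market segments might look like the sketch below; the segment names, criteria, scores, and weights are all invented for the example.

```python
# Hypothetical weighted-sum MCDM ranking of candidate target markets.
# Criterion scores are on a 1-9 scale; weights should sum to 1.
criteria = ["budget", "need_for_system_software", "sales_cycle_speed"]
weights = [0.5, 0.3, 0.2]

segments = {
    "government": [9, 8, 3],
    "banks": [7, 6, 5],
    "startups": [3, 4, 8],
}

def rank_segments(segments, weights):
    """Return segment names ordered by descending weighted score."""
    scores = {
        name: sum(w * s for w, s in zip(weights, vals))
        for name, vals in segments.items()
    }
    return sorted(scores, key=scores.get, reverse=True)

print(rank_segments(segments, weights))  # government ranks first with these weights
```

        Real MCDM methods (AHP, TOPSIS, etc.) differ mainly in how the weights and scores are elicited and normalized; the ranking step itself stays this simple.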
      • Open Access Article

        21 - Sustainable Value Creation Framework for Stakeholder Management (Case Study: the Native Operating System of Iran)
        Fatemeh Saghafi Kolsoum Abbasi Shahkooh Ehsan Keshtgari
        Value creation is the most important aspect of stakeholder management. It is this activity that guarantees the continuity of the open-source software life cycle, based on business process reengineering and on innovation driven by stakeholder requirements. Stakeholders have varied interests and different goals; each seeks to capture the full benefits and eliminate its risks, so they are in constant competition. Adopting appropriate policies for stakeholders requires identifying their positions and requirements. Various models have been presented for identifying stakeholders from different perspectives. This paper surveys those models and extracts the main stakeholder criteria; based on experts' opinions, the important factors were then identified and a framework for stakeholder identification was proposed. In addition, through a study of sources in the software industry and a questionnaire, the stakeholders of the native operating system were identified and classified according to the proposed framework. Finally, appropriate strategies for stakeholder management are presented.
      • Open Access Article

        22 - Development of a Secure Web Service Using RUPSec
        S. M. Hosseininezhad G. Elahi P. Jaferian
        Security issues are considered major hurdles to the extensive use of web services in enterprises. There has been work on standards, protocols, and technologies to address some of these concerns; nevertheless, problems remain. One needs to recognize security needs before tackling the task of selecting the right standard and mechanism for building a secure system. In this paper, a case study is followed, and through it an approach is offered for applying RUPSec to the development of a secure web service. The main objective has been to provide a way to discover and extract the security needs of web services based on the threats against them. Furthermore, RUPSec's strength in pinpointing threats and security requirements is tested.
      • Open Access Article

        23 - AUT-QPM: A New Framework for Query Evaluation for Data Warehouse Creation
        N. Daneshpour Ahmad Abdollahzadeh Barforoush
        The main reason data warehouse systems fail is the lack of proof that they are justified, so analysis is an important task when deciding whether to create a data warehouse. In this paper, we present a framework for justifying a data warehouse based on the types of input queries. We classify query types and execute them on databases and data warehouses of different sizes, using query response time and the number of I/O operations as evaluation parameters. In the experiments, different types of queries were processed on databases and data warehouses, and the results were compared in terms of time and memory:
        • for answering multidimensional queries and aggregated queries, a data warehouse system is required;
        • for answering nested queries and join queries, a data warehouse system is useful;
        • database systems are adequate for answering simple queries and computational queries.
        We have also built tools that implement these ideas: the software takes a user query and evaluates its processing in order to decide whether a data warehouse is needed.
      • Open Access Article

        24 - Learning Stable Analysis Patterns for Intelligent Software Agents
        S. Vafadar Ahmad Abdollahzadeh Barforoush
        Artificial intelligence (AI) techniques such as learning are widely used in agent-based systems. However, current research does not offer a software engineering view of these techniques that supports the whole software development process. In this paper, we focus on requirement analysis, the first step of the software development process, and present techniques and tools to fill this gap. Specifically, we provide a set of stable analysis patterns for the learning capability of agents. Stable analysis patterns are sets of meta-classes and their relations that analyze a specific issue in a domain-independent manner. Using the stable analysis concepts of Enduring Business Themes (EBT), Business Objects (BO), and Industrial Objects (IO), these patterns represent a conceptual model of learning. We also apply the patterns to two case studies to investigate their applicability; the patterns serve as guidelines during the analysis of learning. The main advantage of stable analysis patterns over conventional analysis methods is that they model the knowledge of the learning analysis in addition to the ordinary classes of the domain. They also generate more stable models by considering different levels of abstraction in the analysis.
      • Open Access Article

        25 - Determination of Formal Methods Capabilities for Software Specification and Analysis
        H. Banki V. Ahmadi Sabet
        Software developers face the problem of adopting a suitable formal method for developing their software. We aim to determine the capability level of formal methods in software specification and analysis in four steps. The first step introduces the criteria by which the formal methods are assessed. The second and third steps categorize kinds of software and formal methods according to their solution approaches. The fourth step determines the fitness of some typical formal methods for specifying and analyzing each software category.
      • Open Access Article

        26 - A Requirement-Based Method for Software Architecture Testing
        S. M. Sharafi
        In this paper, after a review of well-known scenario-based methods of software architecture (SA) evaluation, a different approach to finding architectural defects is introduced. The proposed method first elicits the problems threatening the system's success. Then, based on an analysis of those problems and of the probable defects that could cause them, tests are designed and applied to the system to find the real defects, especially the architectural ones. Results show that the proposed method can find architectural defects that may remain hidden after applying the other methods. It can therefore be used as a means of SA testing and as a complementary mechanism alongside well-known SA evaluation methods. The proposed method and its components are presented in a systematic form; its application to the architecture of a real system is illustrated, and the results are compared with those of applying ATAM to the same architecture.
      • Open Access Article

        27 - A Distance-Based Method for Inconsistency Resolution of Models
        R. Gorgan Mohammadi Ahmad Abdollahzadeh Barforoush
        The model-driven approach to software engineering has attracted attention because of its impact on reducing complexity and improving productivity in software development. Inconsistencies are an important challenge in applying models: an inconsistency occurs when a model contains an undesired structural pattern. The main drawback of current approaches to inconsistency resolution is that they do not consider the difference between the repair and the spoiled model. This work presents a distance-based method for finding the closest repair for a spoiled model. To this end, models and metamodels are represented as directed graphs, and graph transformation rules are employed for inconsistency resolution. A distance metric is defined based on the amount of change in the graph corresponding to the model. Applying the proposed method to a set of BPMN models shows an improvement in the results.
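        The paper defines its metric over graph transformation rules; as a rough illustration of the underlying idea only, a change-count distance between two directed graphs can be taken as the size of the symmetric difference of their node and edge sets. The toy models below are invented for the sketch.

```python
def graph_distance(nodes_a, edges_a, nodes_b, edges_b):
    """Symmetric-difference distance: the number of node and edge
    additions/deletions needed to turn graph A into graph B."""
    return (len(set(nodes_a) ^ set(nodes_b))
            + len(set(edges_a) ^ set(edges_b)))

# Spoiled model A and two hypothetical repairs.
a_nodes, a_edges = {"s", "t", "u"}, {("s", "t"), ("t", "u")}
b1_nodes, b1_edges = {"s", "t"}, {("s", "t")}                       # deletes node u and an edge
b2_nodes, b2_edges = {"s", "t", "u"}, {("s", "t"), ("t", "u"), ("u", "s")}  # only adds an edge

d1 = graph_distance(a_nodes, a_edges, b1_nodes, b1_edges)  # 2 changes
d2 = graph_distance(a_nodes, a_edges, b2_nodes, b2_edges)  # 1 change
```

        Under such a metric the second repair would be preferred as the closer one; the paper's actual metric additionally accounts for how the transformation rules produce those changes.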
      • Open Access Article

        28 - Evaluation of Performance, Reliability and Security for Shared-Data, Object-Oriented and Pipe-and-Filter Styles
        H. Banki
        A desirable software application should provide the quality attributes required by the system as well as its functional requirements. Software architecture styles have a significant effect on the quality attributes of the designed software, as well as on its specification and decomposition. Quantitatively evaluating and analyzing this effect allows the most appropriate style to be selected for the architecture design. In this paper, a method based on Colored Petri Nets is proposed to quantitatively evaluate three candidate quality attributes (performance, reliability, and security) in three candidate styles: the shared-data, object-oriented, and pipe-and-filter software architecture styles. The method does not suffer from the limitations of previous approaches to evaluating quality attributes. The candidate styles are first modeled with Colored Petri Nets; then, following the evaluation rules, CPN Tools is used to analyze the nets and calculate the exact values of the candidate attributes. Finally, the best candidate style is chosen for implementation by ranking the styles according to how well they satisfy the candidate quality attributes. As a practical demonstration of the proposed methodology, an ATM system is used as a case study.
      • Open Access Article

        29 - A Cloud-Based Learnable Agent-Oriented Approach to Control and Improve Pacemaker Operation
        H. Banki Negar Majma A. Monadjemi
        This paper presents a cloud-based, learning agent-oriented approach for verifying pacemaker behavior by monitoring and adjusting the heart rate of an arrhythmic patient. If the pacemaker fails or generates an inappropriate heart rate, the patient is put at risk. Using the proposed approach, the pacemaker rate can be directed to the correct value when it is incorrect. Thanks to a learnable software agent, the approach can learn situations that were not predefined and act accordingly. It is also cloud-based, meaning that it sends a message through the cloud in critical situations. After the pacemaker determines the patient's heart rate, the proposed method verifies this rate against the physician's predefined recommendation and, if there is a conflict, automatically corrects it using a reinforcement learning mechanism. The method was implemented and installed on a tablet serving as the patient's mobile device for monitoring the pacemaker implanted in the patient's chest. Comparing the results of our approach with the expected results in the dataset showed that it improved the pacemaker's accuracy by up to 13.24%. A software agent with reinforcement learning can thus play a significant role in improving medical devices in critical situations.
      • Open Access Article

        30 - How to Identify Requirements under Uncertainty for Self-Adaptive Software Systems Development
        R. Moeinfar Ahmad Abdollahzadeh Barforoush S. M. Hashemi
        One key challenge in software systems development is requirements that change during development or at run-time, which can result from uncertainty in stakeholder requirements. Uncertain requirements call for a flexible and therefore adaptable architecture that manages risks at run-time; modeling uncertainty so that the architecture adapts automatically is an effective solution when requirements change. To evaluate requirements and handle uncertainty through modeling and self-management, it is advantageous to quantify requirements computationally. This study, besides examining the sources of uncertainty, investigates how to quantify requirements and quality attributes. Decision making at all software development phases can then be based on numerical analysis, leading toward autonomic software development.
      • Open Access Article

        31 - A New BGP-based Load Distribution Approach in Geographically Distributed Data Centers
        A. Esmaeili B. Bakhshi
        Today, hosting services in geographically distributed data centers is very common among service provider companies because of more efficient energy consumption, high system availability, and quality-of-service guarantees. Load distribution is the main issue in geographically distributed data centers. On the one hand, several architectures exist for distributing load between clusters, e.g., a central load balancer, DNS-based systems, and IGP-based schemes; on the other hand, optimally balancing traffic between clusters is a very challenging problem. The proposed solutions offer different facilities for distributing incoming traffic; nevertheless, they are vulnerable in terms of propagation delay, failure of the centralized load balancer, and maintaining connections. In this paper, a new architecture based on the BGP and Anycast routing protocols in SDN-based data centers is proposed for distributing traffic load between clusters. Simulation results show an improvement over existing techniques.
      • Open Access Article

        32 - Parity Check Matrix Estimation of k/n Convolutional Coding in Noisy Environment Based on Walsh-Hadamard Transform
        Mohammad khaksar H. Khaleghi Bizaki
        Blind estimation of physical-layer transmission parameters is one of the challenges smart radios face in adapting to network standards. These parameters include the transmission rate, the modulation, and the coding scheme used to combat channel errors. Channel code estimation, including estimating the code parameters, the parity-check matrix, and the generator matrix, is therefore an interesting research topic in the context of software radios. Algebraic methods, such as Euclidean and rank-based methods, are usually applied to the intercepted received sequence to estimate the code; their main drawback is poor efficiency in high-error-probability environments. Transform-based methods, such as the Walsh-Hadamard transform, can also solve the channel code estimation problem. In this paper, a new algorithm based on the Walsh-Hadamard transform is proposed that can reconstruct the parity-check matrix of a convolutional code with a general rate k/n in high-error-probability environments (BER > 0.07), with much better performance than other methods. The algorithm exploits algebraic properties of convolutional codes to form n-k equations for estimating the n-k rows of the parity-check matrix, and then uses the Walsh-Hadamard transform to solve these equations. Simulation results verify the excellent performance of the proposed algorithm in high-error-probability environments compared to other approaches.
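        As a toy illustration of the transform step only (not the paper's algorithm for general k/n convolutional codes): a parity-check vector h satisfies h·c = 0 for every codeword window c, so the Walsh-Hadamard spectrum of a histogram of received windows peaks at dual-code vectors, and the peaks degrade gracefully as bit errors are added. The sketch below uses a trivially small rate-1/2 repetition "code" and a clean stream; both are assumptions made for brevity.

```python
import numpy as np

def fwht(a):
    """Iterative fast Walsh-Hadamard transform (length must be a power of 2)."""
    a = a.astype(np.int64).copy()
    h = 1
    while h < len(a):
        for i in range(0, len(a), 2 * h):
            for j in range(i, i + h):
                x, y = a[j], a[j + h]
                a[j], a[j + h] = x + y, x - y
        h *= 2
    return a

def top_parity_candidates(bits, n, k_top=3):
    """Histogram non-overlapping n-bit windows of the stream, transform,
    and return the candidate parity vectors (spectrum indices) with the
    largest magnitude, excluding the trivial all-zero vector."""
    counts = np.zeros(1 << n, dtype=np.int64)
    for i in range(0, len(bits) - n + 1, n):
        w = 0
        for b in bits[i : i + n]:
            w = (w << 1) | int(b)
        counts[w] += 1
    spectrum = fwht(counts)
    order = np.argsort(-np.abs(spectrum))
    return [int(h) for h in order if h != 0][:k_top]

# Rate-1/2 repetition "code": every 2-bit window is 00 or 11, so the
# parity vector h = [1, 1] (index 0b11 = 3) annihilates every codeword.
stream = [0, 0, 1, 1] * 100
print(top_parity_candidates(stream, 2)[0])  # -> 3, i.e. h = [1, 1]
```

        The paper's contribution lies in forming the right equations for a convolutional code's sliding structure; the transform above is just the solver they feed those equations into.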
      • Open Access Article

        33 - SAHAR: An Architecture to Strengthen the Control Plane of the Software-Defined Network Against Denial of Service Attacks
        mehran shetabi Ahmad Akbari
        The software-defined network (SDN) is the next generation of network architecture: by separating the data plane and the control plane, it enables centralized control with the aim of improving network management and compatibility. However, because of its centralized control policy, this type of network is prone to control-plane unavailability under a denial-of-service (DoS) attack. In reactive mode, a significant increase in events caused by new flows entering the network puts great pressure on the control plane. Recurring events, such as collecting statistical information from the network, also interfere severely with the control plane's basic functionality and can greatly reduce its efficiency. To resist attack and prevent network paralysis, this paper introduces a new architecture called SAHAR, whose control box consists of a coordinator controller, a primary flow-setup controller, and, as needed, one or more secondary flow-setup controllers. Assigning the monitoring and management tasks to the coordinator controller reduces the load on the flow-setup controllers, and having the coordinator divide incoming traffic between the flow-setup controllers distributes the load across the control plane. Thus, by directing the traffic load resulting from a denial-of-service attack to one or more secondary flow-setup controllers, SAHAR keeps the primary flow-setup controller from being impaired and resists DoS attacks. Tests show that SAHAR performs better under a DoS attack than existing solutions.
      • Open Access Article

        34 - Bug Detection and Assignment for Mobile Apps via Mining Users' Reviews
        Maryam Younesi Abbas Heydarnoori F. Ghanadi
        The increasing popularity of smartphones and the enthusiastic reception of mobile apps have turned app stores into massive software repositories, and mining these repositories can help improve program quality. Since the bridge between the users and developers of a mobile app is the reviews users write in app stores, developers' close attention to those reviews can dramatically improve an app's final quality. Hence, in recent years numerous studies have addressed opinion mining, aiming to extract and exploit the important information in user reviews. One shortcoming of these studies is that the information contained in user reviews is not used to speed up and improve the process of fixing software errors. This paper therefore provides an approach, based on user feedback, for assigning program bugs to developers. The approach builds on a program's history, using its commit data, as well as each developer's ability to fix errors, using the bugs the developer has already resolved in the app. By combining these two criteria, each developer receives a score reflecting how appropriate they are for handling each review, and a list of suitable developers is produced for each bug. The evaluations show that the proposed method can identify the right developer to address a review with a precision of 74%.
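        The abstract names two signals, commit history and past bug fixes, without specifying how they are combined. One plausible shape, sketched here with invented developer names, term sets, and an assumed convex-combination weight, is a relevance score per developer per review:

```python
def developer_scores(review_terms, commits, fixed_bugs, alpha=0.5):
    """Score each developer for a user review by mixing two signals:
    term overlap between the review and (1) the developer's commits,
    and (2) the bug reports the developer has previously fixed."""
    review = set(review_terms)
    devs = set(commits) | set(fixed_bugs)
    scores = {}
    for dev in devs:
        commit_hits = sum(len(review & set(c)) for c in commits.get(dev, []))
        bugfix_hits = sum(len(review & set(b)) for b in fixed_bugs.get(dev, []))
        scores[dev] = alpha * commit_hits + (1 - alpha) * bugfix_hits
    return scores

# Hypothetical project history
commits = {
    "alice": [{"login", "oauth", "token"}],
    "bob": [{"player", "codec"}],
}
fixed_bugs = {
    "alice": [{"login", "crash"}],
    "bob": [{"video", "stutter"}],
}
review = ["app", "crash", "login", "screen"]
scores = developer_scores(review, commits, fixed_bugs)
best = max(scores, key=scores.get)  # "alice" for this login-related review
```

        A production system would replace raw term overlap with a proper text-similarity model, but the ranking structure, two evidence sources blended into one score per developer, is the part the abstract describes.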
      • Open Access Article

        35 - Optimal Resource Allocation in Multi-Task Software-Defined Sensor Networks
        S. A. Mostafavi M. Agha Sarram T. Salimian
        Unlike conventional wireless sensor networks, which are designed for a specific application, software-defined wireless sensor networks (SDSNs) can embed multiple sensors on each node and define multiple tasks simultaneously. Each sensor node runs a virtualization program that serves as a common communication infrastructure for several different applications. Different sensor applications in the network can have different objective functions and decision parameters. Because of the resource constraints of sensor nodes, the multiplicity and variety of tasks in each application, the requirements for different levels of quality of service, and the different objective functions of different applications, allocating resources to the tasks on the sensors is complicated. In this paper, we formulate the allocation of resources to sensors in an SDSN with different objective functions as a multi-objective optimization problem and provide an effective solution for it.
      • Open Access Article

        36 - Resource Management in Multimedia Networks Using Software-Defined Network Technology
        Ahmadreza Montazerolghaem
        Nowadays, multimedia networks over the Internet have become a low-cost and efficient alternative to the PSTN, and multimedia transfer applications on the Internet are becoming more and more popular. A multimedia connection consists of two phases, signaling and media: the signaling phase is handled by SIP proxies and the media phase by network switches. One of the most important challenges in multimedia networks is the overload of SIP proxies and network switches in these two phases, which causes a wide range of network users to face a sharp decline in quality of service. In this article, we model the routing problem in multimedia networks to deal with overload, presenting a method based on software-defined networking technology together with a mathematical programming model. The proposed method is simulated under various scenarios and topologies, and the results show that throughput and resource consumption are improved.
      • Open Access Article

        37 - A Prediction-Based Load Distribution Approach for Software-Defined Networks
        Hossein Mohammadi Seyed Akbar Mostafavi
        Software-defined networking is a new network architecture that separates the control layer from the data layer. In this approach, the control layer's responsibility is delegated to controller software, which dynamically determines the behavior of the entire network; the result is a flexible, centrally managed network in which network parameters can be well controlled. Because of the increasing number of users, the emergence of new technologies, the explosive growth of network traffic, the need to meet quality-of-service requirements, and the need to prevent the underloading or overloading of resources, load balancing in software-defined networks is of substantial importance. Load imbalance increases costs and service delay and reduces the network's scalability, flexibility, and efficiency. A number of solutions have been proposed to improve performance and load balancing in the network, taking into account criteria such as power consumption and server response time, but most of them do not keep the system from entering a load-imbalanced state in the first place or avert the risks of load imbalance. In this paper, a predictive load balancing method based on the extreme learning machine (ELM) algorithm is proposed to prevent the system from entering a load-imbalanced state. The evaluation results show that the proposed method outperforms the CDAA and PSOAP methods in terms of controller processing delay, load balance, and response time.
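        The abstract names ELM but not its configuration. The core of an extreme learning machine is a randomly fixed hidden layer followed by a closed-form least-squares output layer, which fits in a few lines; the hidden-layer size, tanh activation, window length, and the synthetic load series below are all assumptions for the sketch, not the paper's setup.

```python
import numpy as np

class ELM:
    """Minimal single-hidden-layer extreme learning machine:
    random input weights, least-squares output weights."""
    def __init__(self, n_hidden=32, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)      # random feature map (never trained)
        self.beta = np.linalg.pinv(H) @ y     # closed-form least squares
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta

# Synthetic controller-load series; windows of 5 past samples predict the next.
load = np.sin(np.linspace(0, 20, 400))
X = np.array([load[i : i + 5] for i in range(395)])
y = load[5:400]
model = ELM().fit(X, y)
err = np.mean(np.abs(model.predict(X) - y))   # small on this smooth series
```

        Because only the output layer is solved for, training is a single pseudo-inverse, which is what makes ELM attractive for fast, repeated retraining on fresh load measurements.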
      • Open Access Article

        38 - Social Media Application in E-Learning Process Management
        Mahdi Shami Zanjani Zahra Taghinia Ahangari
        The current research aims to offer a conceptual framework for using social software in e-learning process management. After reviewing the relevant literature, social software tools were identified based on their relevance to the subject and their frequency of use; an initial conceptual framework was then proposed based on the application of social software at each stage of e-learning process management, and the validity of the framework was assessed by polling experts. This is applied research in which a descriptive survey was used as the data collection method. Through the literature review, eight Web 2.0 applications were identified: blogs, wikis, social networks, RSS, podcasts, mashups, social bookmarking, and instant messaging (IM). The analysis of the data collected from the experts showed that, from an application standpoint, three Web 2.0 applications (wikis, social networks, and social bookmarking) were verified for e-learning process planning; four (social networks, RSS, social bookmarking, and IM) for e-learning process control; and all of the identified applications for e-learning process organization and improvement.
      • Open Access Article

        39 - Three-dimensional geological modeling in two zones on the eastern side of the Ahvaz oil field
        Razeyeh Doosti Irani Maryam Payrovi Mohammadrahim Karimi Mehdi Doosti Irani
        The Ahvaz field is one of the most important oil fields in the Zagros Basin, located in the Dezful Embayment. It trends northwest-southeast, parallel to the Zagros mountains. The purpose of this study is 3D geological (petrophysical) simulation of zones one and two in the eastern part of the Ahvaz oil field. Porosity, water saturation, and shale volume were modeled using Sequential Gaussian Simulation (SGS). First, well logs, cores, well coordinates, and the tops and thicknesses of zone three of the Ilam Formation and zone one of the Sarvak Formation were collected. This information, from 25 wells in the eastern part of the field, was used for 3D reservoir modeling in the Petrel software. To characterize spatial correlation, variograms of water saturation and permeability were computed, and three-dimensional models of the petrophysical parameters and the net-to-gross ratio (NTG) were produced.
      • Open Access Article

        40 - Modeling Mud Loss in Asmari Formation Using Geostatistics in RMS Software Environment in an Oil Field in Southwestern Iran
        Kioumars Taheri Farhad Mohammad Torab
        Studying lost circulation in the Asmari Formation is very important because about 25% to 40% of drilling costs are allocated to drilling mud. Since the studied oil field encounters severe mud loss in the Asmari Formation, the purpose of this study is to recognize the lost-circulation zones and map the mud loss distribution in the formation. Mud loss maps were plotted in the RMS software using a moving-average algorithm. For this purpose, data from 363 wells in the field were prepared and processed for mapping and 3D modeling of 11 different zones in the Asmari Formation. Data processing included elimination of outliers, normal transformation, histogram plotting, variography, estimation, and modeling. Geostatistical kriging was also used for estimation and 3D modeling of mud loss, and its output gave better, more localized results. The resulting 2D and 3D models of mud loss in the Asmari Formation were then analyzed; comparison with the reservoir fault model and production index plots showed that the dominant mud losses are related to fault-zone fractures, and in minor cases increased mud weight is the cause of mud loss. To control mud loss at critical points in the field, appropriate measures are suggested: underbalanced drilling (UBD) with suitable well placement, drilling mud of proper weight at severe loss points, NIF and MMH special drilling muds with the lowest formation damage, or a combination of these methods.
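The mapping step, interpolating scattered well measurements onto a regular grid with a moving-average algorithm, can be illustrated with a small sketch. The well locations, loss values, grid size, and search radius below are synthetic stand-ins, not the field's actual data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for well locations (km) and mud-loss values;
# the real study used 363 wells and RMS's moving-average algorithm.
wx, wy = rng.uniform(0, 10, 50), rng.uniform(0, 10, 50)
loss = 5 + 2 * wx + rng.normal(0, 0.5, 50)

def moving_average_grid(wx, wy, vals, nx=20, ny=20, radius=3.0):
    """Distance-weighted average of all wells within `radius` of each grid node."""
    gx, gy = np.meshgrid(np.linspace(0, 10, nx), np.linspace(0, 10, ny))
    grid = np.full(gx.shape, np.nan)
    for i in range(ny):
        for j in range(nx):
            d = np.hypot(wx - gx[i, j], wy - gy[i, j])
            near = d < radius
            if near.any():
                w = 1.0 / (d[near] + 1e-6)          # inverse-distance weights
                grid[i, j] = np.average(vals[near], weights=w)
    return grid

grid = moving_average_grid(wx, wy, loss)
print("grid range:", np.nanmin(grid), "to", np.nanmax(grid))
```

Because each node is a weighted average of nearby wells, the map smooths local noise; kriging, which the study also applied, additionally uses the variogram to weight wells by spatial correlation rather than distance alone.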
      • Open Access Article

        41 - Biostratigraphy, microfacies and sequence stratigraphy of the Asmari Formation (based on Cyclolog) in the Qaleh Nar Oli field, Zagros Basin
        Adel Neisi Ali Ghobeishav Mohammad Allahkarampour-Dill
        In this research, the biostratigraphy, microfacies, sedimentary environments, and sequence stratigraphy (using the Cyclolog software) of the Asmari Formation are investigated. The studies are based on 580 samples (core and cuttings) from 430 meters of the formation in well #2 of the Qaleh-Nar oilfield. Paleontological studies led to the identification of 23 genera and 28 species of benthic and planktonic foraminifera. Based on these microfossils, four assemblage zones were recognized, confirming an Oligocene (Rupelian to Chattian) and Early Miocene (Aquitanian to Burdigalian) age for the whole formation. Paleoenvironmental studies reveal nine different microfacies deposited on an outer ramp (open marine) in the lower Asmari, a middle ramp (open marine to shoal) in the middle Asmari, and an inner ramp (tidal flat to lagoon) in the upper Asmari. Sequence stratigraphy of well #2 and the auxiliary wells 1, 3, 5, 6, and 7 of the Qaleh-Nar oilfield using the Cyclolog software reveals seven positive and nine negative break levels in alternation. Some of the positive breaks define sequence boundaries, and some of the negative breaks mark maximum flooding surfaces. In addition, a number of positive levels specify major chronozones (stage boundaries). Comparing the five reservoir zones of the Asmari Formation in the Qaleh-Nar oilfield with these break levels shows a good correlation for some levels, although the correlation does not hold for others.
      • Open Access Article

        42 - Drilling mud loss modeling to detect high-risk points and suitable locations for new drilling projects in the Sarvak Formation, Azadegan Oil Field
        Bahman Soleimani Abass  Esmaeli Ehsan Larki
        Drilling mud loss is one of the common problems encountered during drilling. The aim of this study is to evaluate mud loss in the Sarvak Formation (Cenomanian) in the Azadegan oil field, known as one of the most important hydrocarbon-rich reservoirs in the Zagros region, by building a model in the GS+ software environment. The formation consists of a thick sequence of limestone and clayey limestone layers, divided into seven zones based on petrophysical characteristics. Available data, including drilling mud weight, pump pressure, mud loss, and the corresponding depths of nine drilled wells, were investigated. The results showed that even though the operating factors, namely drilling mud weight and pump pressure, were kept constant, fractures in the reservoir rocks cause significant mud loss in zone 3, while the lowest mud loss is observed in zone 7. Based on the patterns in the mud loss data, faults and sedimentary-environment morphology (such as sedimentary channels) appear to play major roles in creating fractures or areas susceptible to mud loss. The differences in the observed loss patterns most likely reflect relocation of channels over time in different parts of the reservoir. In general, the highest rate of mud loss was detected at the northern and southern edges, while the lowest occurred in the middle part of the field. To prevent mud loss hazards in this field, underbalanced drilling is suggested.
      • Open Access Article

        43 - Investigating Social Relations between Eco-Tourists and Local People of Hormoz Island
        Elham Nasrabadi Hannaneh Mohammadi Kangarani Mehdi Mirzadeh Kohshahi
        Ecotourism creates special opportunities to discover remarkable aspects of nature, provides new information for tourists, and improves the living conditions of the host society. Since Hormuz Island has great ecotourism potential, it can attract a large number of eco-tourists. The main aim of this study is to identify and analyze the friendship networks between eco-tourists and the local people of Hormuz Island and to investigate their social effects. After conducting interviews and administering questionnaires to selected respondents, the collected data were analyzed using descriptive analysis, network analysis, and the Visone software. The investigation showed that the formation of informal friendly relationships and cooperation between the inhabitants and eco-tourists of Hormuz Island positively influenced ecotourism. The social networks that formed helped ecotourism continue on Hormuz Island and should have desirable economic and environmental effects for the local people and the island. This can have positive consequences such as greater ecotourism development, more sustainable ecotourism, and better recognition of Hormuz Island inside and outside of Iran.
      • Open Access Article

        44 - Identifying and Analyzing the Driving Forces Involved in Developing Tourism Industry in Kondoleh Village, Sahneh City
        Mitra Jaliliyan Mohammad Akbarpour Jafar Tavakoli
        Tourism is regarded as an influential factor in developing communication between nations in today's world, providing job opportunities in the economic sector and fostering socio-cultural interactions. Rural tourism in particular has been researched by a wide variety of scholars around the world, as it can help create new jobs, increase residents' incomes, and mitigate poverty, especially in villages with cultural attractions. This study therefore investigated the key factors and influential driving forces involved in developing tourism in Kandoleh village using the Delphi method and a futures-studies approach. The study is applied in terms of purpose and draws on futures-studies methods and novel exploratory and analytical techniques, combining quantitative and qualitative methods. The key indices were identified via the Delphi method, and the critical driving forces were detected with cross-impact analysis using the MICMAC software. The results suggested that, of the twenty-eight key factors identified, efficient management and the designation of Kandoleh village as a pilot site for rural tourism were the most influential elements in developing tourism in the village, followed by increased private-sector investment, advertising, registering and preserving historical and cultural monuments, and providing security for tourists.
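The cross-impact step behind MICMAC can be sketched briefly: each factor is scored for its direct influence on every other factor, the matrix is raised to successive powers so indirect influence accumulates, and row sums then rank driving power. The matrix and factor names below are illustrative, not the study's data:

```python
import numpy as np

# Toy cross-impact matrix: entry (i, j) scores the influence of factor i
# on factor j (0 = none ... 3 = strong). Factor names are hypothetical.
factors = ["management", "investment", "advertising", "security"]
M = np.array([
    [0, 3, 2, 2],
    [1, 0, 2, 1],
    [0, 1, 0, 0],
    [2, 1, 1, 0],
])

# MICMAC-style indirect influence: multiply the matrix into itself until
# the ranking of row sums (driving power) stabilizes.
P = M.copy()
prev_rank = None
for _ in range(10):
    P = P @ M
    rank = tuple(np.argsort(-P.sum(axis=1)))
    if rank == prev_rank:
        break
    prev_rank = rank

influence = P.sum(axis=1)    # driving power of each factor
dependence = P.sum(axis=0)   # how strongly each factor is driven by the rest
driver = factors[int(np.argmax(influence))]
print("most influential driver:", driver)
```

Plotting each factor's (dependence, influence) pair gives the familiar MICMAC quadrants; "key" driving forces are those with high influence and low-to-moderate dependence.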
      • Open Access Article

        45 - Investigating the Influence of Human Resources Management Activities on Customer Loyalty in the Hoteling Industry Using a Multilevel Approach
        Zahra Nikkhah-Farkhani Mohammad Shaykhzade
        Given that customer loyalty is one of the most important concerns of hotel managers, this study investigated the influence of human resource management activities on customer loyalty using a multilevel approach. The statistical population comprised managers, staff, and customers of four-star and five-star hotels in Mashhad; the sample consisted of thirty-four managers at the organizational level, one hundred and forty staff, and two hundred and forty customers. Sam's 12-item questionnaire (2008) was used to measure human resource management activities, and the 17-item questionnaire developed by Al-Rafiei et al. (2013) to measure staff satisfaction and loyalty; their validity and reliability were tested using the content validity method and Cronbach's alpha, respectively. HLM 7.02 and the SPSS software were used to analyze the collected data. The results confirmed the positive influence of human resource management activities on staff satisfaction and loyalty and, in turn, on customer satisfaction and loyalty. Hotel managers can therefore increase customer loyalty by emphasizing human resource management activities and improving staff satisfaction and loyalty, which would increase hotel profitability in the long term by reducing marketing costs.
      • Open Access Article

        46 - Long-Term Software Fault Prediction Model with Linear Regression and Data Transformation
        Momotaz  Begum Jahid Hasan Rony Md. Rashedul Islam Jia Uddin
        Validation is essential for ensuring software reliability by determining the characteristics of an implemented software system. Ensuring reliability requires not only detecting and fixing faults that have occurred but also predicting future faults, before any actual testing phase begins; accordingly, much work has been done on software fault prediction. In this paper, we present a software fault prediction model in which different data transformation methods are applied to Poisson fault count data. To pre-process the data from Poisson to Gaussian, the Box-Cox power transformation (Box-Cox_T), the Yeo-Johnson power transformation (Yeo-Johnson_T), and the Anscombe transformation (Anscombe_T) are used. Linear regression is then applied for long-term software fault prediction, modeling the linear relationship between the dependent and independent variables, namely relative error and testing days. For the analysis, three real software fault count datasets are used, on which we compare the proposed approach with the naïve Gauss method, exponential smoothing time-series forecasting, and conventional software reliability growth models (SRGMs), both with data transformation (With_T) and without (Non_T). Our datasets contain days and cumulative software faults in the formats (62, 133), (181, 225), and (114, 189), respectively. The Box-Cox power transformation with linear regression (L_Box-Cox_T) outperformed all other methods in average relative error from the short term to the long term.
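The winning pipeline, a Box-Cox transform of the cumulative fault counts followed by linear regression, can be sketched on synthetic data. The series below is generated, not one of the paper's datasets, and the exact regression variables the paper uses may differ; `scipy.stats.boxcox` fits the transform parameter by maximum likelihood:

```python
import numpy as np
from scipy.stats import boxcox

rng = np.random.default_rng(2)

# Synthetic cumulative fault counts over testing days (Poisson-like growth);
# the paper's real datasets span e.g. 62 days / 133 cumulative faults.
days = np.arange(1, 63, dtype=float)
faults = 1.0 + np.cumsum(rng.poisson(3, days.size))

# Box-Cox makes the skewed count data closer to Gaussian, which better
# suits ordinary least-squares regression.
transformed, lam = boxcox(faults)

# Fit a line: transformed cumulative faults vs. testing day.
A = np.vstack([days, np.ones_like(days)]).T
(slope, intercept), *_ = np.linalg.lstsq(A, transformed, rcond=None)

def predict(day):
    """Predict cumulative faults on a given day, inverting the Box-Cox transform."""
    z = slope * day + intercept
    return (z * lam + 1) ** (1 / lam) if lam != 0 else np.exp(z)

print("predicted cumulative faults on day 70:", predict(70.0))
```

Long-term prediction is then just extrapolation of the fitted line followed by the inverse transform; the paper's relative-error comparison measures how far such extrapolations drift from the actual counts.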
      • Open Access Article

        47 - A New Parallel Method to Verify the Packets Forwarding in SDN Networks
        Rozbeh Beglari Hakem Beitollahi
        The rise of Software-Defined Networking (SDN) has revolutionized network management, offering greater flexibility and programmability. However, ensuring the accuracy of packet forwarding remains paramount for maintaining network reliability and security in SDN environments. Unlike traditional IP networks, SDN separates the control plane from the data plane, creating new challenges for securing data transmission; verification methods designed for IP networks often cannot be applied directly to SDN because of this architectural difference, so new approaches are necessary. This research proposes a novel parallel method for verifying packet forwarding, building upon concepts from DYNAPFV. The proposed approach aims to overcome specific limitations of existing methods, including DYNAPFV, such as scalability issues and slow verification times. Simulations demonstrate significant improvements over DYNAPFV: the proposed parallel method achieves a 92% reduction in the time required to identify malicious nodes in the network. The results also reveal a trade-off between security and verification time; as the probability of packet-integrity confirmation increases from 0.8 to 0.99, system security strengthens, but the time to detect malicious switches also increases.
      • Open Access Article

        48 - Proposing a Detection and Mitigation Approach for DDoS Attacks on SDN-Based IoT Networks
        Fatemeh MotieShirazi Seyedakbar Mostafavi
        The Internet of Things (IoT) is a network of objects that can communicate with one another. IoT systems are currently under constant attack due to technical, legal, and human problems. One of the most important of these is the Denial of Service (DoS) attack, in which normal network services are taken out of service and objects and users cannot access the server and other resources. Existing security solutions have not been able to effectively prevent such attacks on IoT services. Software-Defined Networking (SDN) is a new network architecture based on the separation of the control and data planes. SDN's programmability and network-management capability can be exploited in IoT services, because some IoT devices send data periodically at fixed intervals; properly deployed in the data center, SDN can help reduce or prevent the data floods that IoT generates. In this article, a method for detecting DDoS attacks in SDN-based IoT is presented, followed by an algorithm for mitigating them. The proposed method is based on entropy, one of the most important concepts in information theory, calculated over flow characteristics. Two new components on the controller receive incoming packets and, within a time window, compute entropy and flow rate to detect a possible attack; the attack is then confirmed based on the flow statistics received from the switches. Compared to existing methods, the proposed method improves attack detection time by 12% and false positives/negatives by 26%.
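The entropy criterion at the heart of the detector can be sketched: within a time window, a DDoS flood concentrates traffic on one destination, collapsing the Shannon entropy of the destination-address distribution. The addresses, window contents, and threshold below are illustrative, not the paper's values:

```python
import math
from collections import Counter

def entropy(destinations):
    """Shannon entropy (bits) of the destination-IP distribution in one window."""
    counts = Counter(destinations)
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())

# Normal window: traffic spread over many hosts -> high entropy.
normal = [f"10.0.0.{i % 50}" for i in range(500)]
# Attack window: most packets converge on one victim -> entropy collapses.
attack = [f"10.0.0.{i % 50}" for i in range(100)] + ["10.0.0.7"] * 400

THRESHOLD = 4.0  # illustrative; tuned per network in practice
for window in (normal, attack):
    h = entropy(window)
    print(f"H = {h:.2f} -> {'possible attack' if h < THRESHOLD else 'normal'}")
```

In the proposed two-stage design, an entropy drop like this only flags a *possible* attack; the controller then pulls flow statistics from the switches to confirm before installing mitigation rules.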
      • Open Access Article

        49 - The Effect of Distance Learning through Shad Software on the Academic Achievement of Shahed Primary Special School Students in Alborz Province
        Ehsan Golmehr Mahdi Kalhornia Golkar Meysam Kalhornia Golkar Leila Shiravand
        Background and Aim: Education is one of the most important aspects of children's rights, so the quality of its implementation plays an important role in securing this right. The issue is particularly important and sensitive for children with special needs. With the spread of the Corona pandemic across the world, distance education became inevitable. In Iran, distance learning, including at the primary level, was delivered through the 'Shad' software, which is still in use in many primary schools. The aim of this study was to investigate the effect of this type of education on the academic achievement of children with special needs. Method: Using a descriptive survey method and a researcher-made questionnaire based on the Academic Achievement Questionnaire adapted from Pham and Taylor's 1994 research, data were collected from the statistical population, the teachers of the Shahed Fardis Exceptional School of Karaj, and analyzed with the chi-square and Friedman tests in SPSS version 24. Results: The findings showed that the effect of the 'Shad' software in distance education is positive in terms of efficiency and effectiveness compared to other methods of distance education, and in creating a specific program to prepare for assessments. However, owing to the lack of in-person schooling, the overall judgment was a negative impact on students' end-of-year academic achievement compared to in-person education. This type of education can also harm academic achievement by reducing performance in face-to-face assessment, reducing literacy skills, and distracting students who remain at home.
Conclusion: Distance education, especially once hardware and software defects are resolved and with cultural preparation and mutual understanding among teachers, parents, and students, is in some respects an opportunity to increase academic achievement; nevertheless, face-to-face education, given its functions, should still be considered and emphasized by the educational system.
      • Open Access Article

        50 - Explaining the adoption process of software-defined networks (SDN) using grounded theory and a systems approach
        Elham Ziaeipour ali rajabzadeh ghotri Alireza Taghizadeh
        Software Defined Networking (SDN) is one of the technologies with the most promising role in digital transformation. The dynamic structure of SDN can adapt to the ever-changing nature of future networks and their users. Its impact on the intelligence, agility, management, and control of current network devices, as well as upcoming communication technologies, reduces expenses and creates innovative businesses. Although service providers are very interested in deploying SDN to transform their static infrastructures into dynamic, programmable platforms, they do not consider it a priority and still depend on traditional methods to manage their networks. This study therefore highlights the factors affecting the acceptance of the SDN architecture and its adoption by national telecom operators, and proposes a comprehensive new paradigm model using a systems approach and grounded theory (the Strauss and Corbin model). The model was built by systematically reviewing the theoretical foundations and conducting in-depth interviews with managers and experts in the telecom industry. During the modeling process, more than a thousand initial codes were identified; based on expert review of these codes, a total of 73 open codes, 12 axial codes, and 6 main categories were extracted.
      • Open Access Article

        51 - High Performance Computing: Next Generation Requirements and Research Axes
        Ehsan Arianyan MohammadMahdi Esnaashari Fatemeh Ehsani Boshla Shaghayeghsadat Hossieni Bayan Masoud Dehyadegari Behnam Samadi
        Nowadays, increasing the processing power of supercomputers is a worldwide race, which signifies their importance in the current era. Supercomputers are engines of technological progress in almost all scientific areas, including computational biology, earth sciences, cosmology, fluid dynamics, and plasma modeling, to name a few. The next generation of supercomputers can be divided into two broad categories: 1) emerging technologies such as neuromorphic and quantum computing, and 2) Exascale computing. Emerging technologies will be the future of supercomputing, but not in the very near future. In this paper we therefore focus on Exascale computing and try to provide a comprehensive overview of the main requirements for this technology to be achieved and become available. Requirements are reviewed from different aspects: hardware, software, artificial intelligence, and cloud computing. In addition, we attempt to provide a complete taxonomy of hot research topics in this area.
      • Open Access Article

        52 - Pathology of software export development in Iran
        Maryam Saleh Gorgory Mohsen Gerami Vahid Yazdanian
        Today, new developments in the world economy, including severe fluctuations in raw material prices and rising costs of labor, transportation, storage, and other production factors, have led many developing countries to consider entering the production and trade of goods that depend least on high-risk economic components. The software industry is one of these industries: in addition to its high added value, it has minimal need for raw materials and other cost-generating inputs. The software industry is, in fact, a pure knowledge industry based on research and development. Our country, too, can take steps toward exporting software in order to benefit from this industry. Accordingly, this research addresses the pathology of software export development. The statistical population consists of software companies that are members of the software exporters' union, surveyed on demand, national perspective and strategy, international trust and communication, features of the software industry, and infrastructure and internal factors. The results show that the main obstacles to software export are lack of demand and weaknesses in software infrastructure and internal factors.
      • Open Access Article

        53 - Software-Defined Networking Adoption Model: Dimensions and Determinants
        Elham Ziaeipour Ali Rajabzadeh Ghotri Alireza Taghizadeh
        The recent technical trend in communication networks shows a paradigm shift from hardware to software. Software Defined Networking (SDN), as one of the enablers of digital transformation, can play a prominent role in this shift and in the migration to knowledge-based networks. Telecom operators are interested in deploying SDN to migrate their infrastructure from a static architecture to a dynamic, programmable platform; however, they do not appear to consider SDN a priority and still depend on traditional methods to manage their networks (especially in some developing countries such as Iran). Since the first step in applying new technologies is accepting them, we propose a comprehensive SDN adoption model developed with a mixed-methods research methodology. First, the theoretical foundations related to the research problem were examined. Then, based on grounded theory, in-depth interviews were conducted with 12 experts, including university professors and managers of major telecom operators. More than a thousand initial codes were identified; during review, based on semantic commonalities, a total of 112 final codes, 14 categories, and 6 themes were extracted using open, axial, and selective coding. Next, the fuzzy Delphi method was used to confirm the indicators extracted in the qualitative phase. Finally, SPSS and SmartPLS 3 were used to analyze the questionnaire data, evaluate model fit, and test the hypotheses.
      • Open Access Article

        54 - Estimation of the grain size curve of coarse surface sediments using a designed imaging system
        A.H. Tabee A. Karami-Khaniki A.A. Bidokhti K. Lari
        Sediment characterization is one of the basic topics in coastal and river engineering, and grain size is one of its key identification parameters. Grain size is usually determined by traditional methods such as sieving, which is accurate but time-consuming. Image processing makes it possible to isolate and track targets (sediment grains) in images using the smallest unit of a digital image, the pixel. In this paper, a one-piece system for imaging coarse-grained field sediments and producing a grain size curve was built and tested; sediment image processing and analysis were performed with the ImageJ software, and the results were validated against the sieving method. Image samples were taken from laboratory and natural sand sediments. The results show that the distribution obtained from images of coarse (larger than one millimeter), uniform surface sediments correlates well with the distribution obtained from sieve analysis, while cutting the time to at least one tenth and reducing the total cost.
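The pixel-based idea can be sketched: threshold the photo, label connected components as grains, convert pixel areas to equivalent diameters, and accumulate a grain-size curve. The synthetic binary image below stands in for a thresholded sediment photo, and counting grains by number (rather than by weight, as sieve curves do) is a simplification:

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(3)

# Synthetic binary image: bright discs ("grains") on a dark background,
# standing in for a thresholded photo of coarse surface sediments.
img = np.zeros((200, 200), dtype=bool)
yy, xx = np.mgrid[0:200, 0:200]
for _ in range(15):
    cy, cx = rng.integers(20, 180, 2)
    r = rng.integers(5, 12)
    img |= (yy - cy) ** 2 + (xx - cx) ** 2 <= r ** 2

# Label connected components and convert pixel areas to equivalent diameters.
labels, n = ndimage.label(img)
areas = ndimage.sum(img, labels, index=range(1, n + 1))  # pixels per grain
diam = 2 * np.sqrt(areas / np.pi)                        # equivalent-circle diameter, px

# Cumulative grain-size curve: percent of grains finer than each diameter.
d_sorted = np.sort(diam)
percent_finer = 100 * np.arange(1, n + 1) / n
print(list(zip(d_sorted.round(1), percent_finer.round(1)))[:3])
```

A pixel-to-millimeter scale factor from the imaging geometry would convert `diam` to physical sizes; touching grains merging into one label is the classic failure mode that segmentation steps (e.g. watershed) address in practice.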
      • Open Access Article

        55 - Test Case Selection Based on Test-Driven Development
        Zohreh Mafi Mirian Mirian
        Test-Driven Development (TDD) is a test-first software development method in which the production of each component of the code begins with writing its test case. The method has attracted attention for its many advantages, including readable, well-organized, and short code; increased quality, productivity, and reliability; and the possibility of regression testing thanks to the comprehensive suite of unit tests it produces. The large number of unit test cases produced by this method is a strength for code reliability; however, repeatedly executing all of them lengthens regression testing. The purpose of this article is to present a test case selection algorithm that reduces regression test time in TDD. Various ideas have been proposed for selecting test cases and reducing regression test time, most of them tied to a particular programming language or development method. The idea presented here is based on program differencing and on the nature of TDD: meaningful semantic and structural links are established between unit tests and code blocks, and test cases are selected based on these relationships.
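The selection idea can be sketched: each unit test is mapped to the code blocks it exercises, and after a change only the tests whose blocks were modified are re-run. The mapping and block names below are hypothetical; in a real TDD code base they would be derived from coverage data or from the test-to-code pairing that TDD itself creates:

```python
# Hypothetical test-to-code-block mapping for a small order-processing module.
test_to_blocks = {
    "test_create_order":   {"Order.__init__", "Order.total"},
    "test_apply_discount": {"Order.total", "Discount.apply"},
    "test_ship_order":     {"Order.ship", "Inventory.reserve"},
}

def select_tests(changed_blocks, mapping):
    """Pick every test whose exercised blocks intersect the changed set."""
    return sorted(t for t, blocks in mapping.items() if blocks & changed_blocks)

# Program differencing reports which blocks a commit touched; only the
# tests linked to those blocks need to re-run in the regression suite.
changed = {"Discount.apply"}
selected = select_tests(changed, test_to_blocks)
print(selected)
```

The regression-time saving comes from the intersection test: unchanged blocks contribute no tests, so the selected suite shrinks in proportion to how localized the change is.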