Digitala Vetenskapliga Arkivet

Search results 1 - 50 of 1344
  • 1.
    Abbas, Gulfam
    et al.
    Blekinge Institute of Technology, School of Computing.
    Asif, Naveed
    Blekinge Institute of Technology, School of Computing.
    Performance Tradeoffs in Software Transactional Memory (2010). Independent thesis, Advanced level (degree of Master (Two Years)), Student thesis.
    Abstract [en]

    Transactional memory (TM), a new programming paradigm, is one of the latest approaches to writing programs for next-generation multicore and multiprocessor systems. TM is an alternative to lock-based programming. It is a promising solution to the large and growing problem programmers face when developing programs for Chip Multi-Processor (CMP) architectures, because it simplifies synchronization of shared data structures in a way that is scalable and composable. Software Transactional Memory (STM), a purely software realization of TM, can be defined as a non-blocking synchronization mechanism in which sequential objects are automatically converted into concurrent objects. In this thesis, we present a performance comparison of four STM implementations: RSTM by V. J. Marathe et al., TL2 by D. Dice et al., TinySTM by P. Felber et al., and SwissTM by A. Dragojevic et al. The comparison gives a deeper understanding of the tradeoffs involved and helps in assessing which design choices and configuration parameters may lead to better and more efficient STMs. In particular, the suitability of each STM is analyzed against the others. A literature study is carried out to select STM implementations for experimentation, and an experiment is performed to measure the performance tradeoffs between them. The empirical evaluations conducted as part of this thesis show that SwissTM has significantly higher throughput than the other state-of-the-art STM implementations (RSTM, TL2, and TinySTM), as it consistently outperforms them on execution time and aborts per commit on the STAMP benchmarks. The transaction retry rate measurements show that TL2 performs better than RSTM, TinySTM, and SwissTM.
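
A minimal sketch (not taken from the thesis) of the two headline metrics the comparison above relies on, throughput and aborts per commit, derived from the raw counters a benchmark run such as a STAMP application could report; the counter values below are hypothetical.

```python
# Derive throughput and abort rate from benchmark counters (hypothetical values).

def stm_metrics(commits: int, aborts: int, elapsed_seconds: float) -> dict:
    """Compute throughput and aborts per commit from raw STM benchmark counters."""
    return {
        "throughput_tx_per_s": commits / elapsed_seconds,  # committed transactions per second
        "aborts_per_commit": aborts / commits if commits else float("inf"),
    }

# Example: hypothetical counters from one benchmark run.
print(stm_metrics(commits=1_200_000, aborts=90_000, elapsed_seconds=30.0))
```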

  • 2.
    Abghari, Shahrooz
    et al.
    Blekinge Institute of Technology, School of Computing.
    Kazemi, Samira
    Blekinge Institute of Technology, School of Computing.
    Open Data for Anomaly Detection in Maritime Surveillance (2012). Independent thesis, Advanced level (degree of Master (Two Years)), Student thesis.
    Abstract [en]

    Context: Maritime Surveillance (MS) has received increased attention from a civilian perspective in recent years. Anomaly detection (AD) is one of the many techniques available for improving the safety and security in the MS domain. Maritime authorities utilize various confidential data sources for monitoring the maritime activities; however, a paradigm shift on the Internet has created new sources of data for MS. These newly identified data sources, which provide publicly accessible data, are the open data sources. Taking advantage of the open data sources in addition to the traditional sources of data in the AD process will increase the accuracy of the MS systems. Objectives: The goal is to investigate the potential open data as a complementary resource for AD in the MS domain. To achieve this goal, the first step is to identify the applicable open data sources for AD. Then, a framework for AD based on the integration of open and closed data sources is proposed. Finally, according to the proposed framework, an AD system with the ability of using open data sources is developed and the accuracy of the system and the validity of its results are evaluated. Methods: In order to measure the system accuracy, an experiment is performed by means of a two stage random sampling on the vessel traffic data and the number of true/false positive and negative alarms in the system is verified. To evaluate the validity of the system results, the system is used for a period of time by the subject matter experts from the Swedish Coastguard. The experts check the detected anomalies against the available data at the Coastguard in order to obtain the number of true and false alarms. Results: The experimental outcomes indicate that the accuracy of the system is 99%. In addition, the Coastguard validation results show that among the evaluated anomalies, 64.47% are true alarms, 26.32% are false and 9.21% belong to the vessels that remain unchecked due to the lack of corresponding data in the Coastguard data sources. Conclusions: This thesis concludes that using open data as a complementary resource for detecting anomalous behavior in the MS domain is not only feasible but also will improve the efficiency of the surveillance systems by increasing the accuracy and covering some unseen aspects of maritime activities.
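
The reported 99% accuracy figure is a standard derivation from the verified alarm counts; a minimal sketch with hypothetical counts (the thesis' actual numbers of true/false positives and negatives are not reproduced here):

```python
# Accuracy from true/false positive and negative alarm counts (hypothetical values).

def accuracy(tp: int, tn: int, fp: int, fn: int) -> float:
    """Fraction of correctly classified cases among all verified cases."""
    return (tp + tn) / (tp + tn + fp + fn)

print(f"accuracy = {accuracy(tp=120, tn=870, fp=6, fn=4):.2%}")
```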

  • 3.
    Abualhana, Munther
    et al.
    Blekinge Institute of Technology, School of Computing.
    Tariq, Ubaid
    Blekinge Institute of Technology, School of Computing.
    Improving QoE over IPTV using FEC and Retransmission (2009). Independent thesis, Advanced level (degree of Master (Two Years)), Student thesis.
    Abstract [en]

    IPTV (Internet Protocol Television), an emerging technology focused on delivering cutting-edge high-resolution television, broadcast, and other services, is now easily available, requiring only a high-speed internet connection. Every time a new technology is deployed locally, it faces substantial problems, both in enhancing technical performance and in satisfying customers. This technology has encouraged researchers to experiment with different tools to provide better quality while building on existing ones. Our aim in this dissertation is to present a few interesting facets of IPTV and to introduce the concept of a cache that collects the packets travelling from the streaming server to the end user. This cache would be placed in the access node, and, based on certain assumptions from prior research, we can estimate how quickly retransmission can take place when the end user responds via the RTCP protocol and asks for retransmission of corrupted or lost packets. In the last section, we outline our scenario with the streaming server on one side and the client (end user) on the other, and make assumptions based on throughput, response time, and traffic.
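
A minimal sketch of the access-node cache idea described above; this is an assumed design for illustration, not the thesis implementation, and the class and method names are hypothetical:

```python
# An access-node cache that keeps recently forwarded packets by sequence number
# and re-sends them when the client reports a loss (e.g., via an RTCP NACK).

from collections import OrderedDict

class RetransmissionCache:
    def __init__(self, capacity: int = 2048):
        self.capacity = capacity
        self.packets: "OrderedDict[int, bytes]" = OrderedDict()

    def store(self, seq: int, payload: bytes) -> None:
        """Remember a forwarded packet; evict the oldest when full."""
        self.packets[seq] = payload
        if len(self.packets) > self.capacity:
            self.packets.popitem(last=False)

    def retransmit(self, seq: int):
        """Return the cached payload for a lost packet, or None if already evicted."""
        return self.packets.get(seq)

cache = RetransmissionCache()
cache.store(1001, b"video-frame-slice")
print(cache.retransmit(1001) is not None)  # True: re-sent from the access node, not the server
```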

  • 4.
    Acharya, Mod Nath
    et al.
    Blekinge Institute of Technology, School of Computing.
    Aslam, Nazam
    Blekinge Institute of Technology, School of Computing.
    Coordination in Global Software Development: Challenges, associated threats, and mitigating practices (2012). Independent thesis, Advanced level (degree of Master (Two Years)), Student thesis.
    Abstract [en]

    Global Software Development (GSD) is an emerging trend in today's software world in which teams are geographically dispersed, either in close proximity or globally. GSD provides certain advantages to development companies, such as low development cost and access to cheap and skilled labour. This type of development is noted as more risky and challenging than projects developed with teams under the same roof. The nature of GSD projects is inherently cooperative: many software developers work on a common project, share information and coordinate activities. Coordination is a fundamental part of software development. GSD comprises different types of development arrangements, i.e. insourcing, outsourcing, nearshoring, or farshoring; whatever arrangement a company selects, challenges to coordination exist. Therefore, knowledge of the potential challenges, the associated threats to coordination and the practices to mitigate them plays a vital role in running a successful global project.

  • 5. Adams, Liz
    et al.
    Börstler, Jürgen
    What It's Like to Participate in an ITiCSE Working Group (2011). In: ACM SIGCSE Bulletin, Vol. 43, no. 1. Article in journal (Other academic).
  • 6.
    Adebomi, Oyekanlu Emmanuel
    et al.
    Blekinge Institute of Technology, School of Computing.
    Mwela, John Samson
    Blekinge Institute of Technology, School of Computing.
    Impact of Packet Losses on the Quality of Video Streaming (2010). Independent thesis, Advanced level (degree of Master (Two Years)), Student thesis.
    Abstract [en]

    In this thesis, the impact of packet losses on the quality of received videos sent across a network that exhibits normal perturbations such as jitter, delays, and packet drops has been examined. The dynamic behavior of a normal network was simulated using Linux and the Network Emulator (NetEm). People's perceptions of the quality of the received video were used to rate several videos with differing speeds. In accordance with ITU's guideline of using Mean Opinion Scores (MOS), the effects of packet drops were analyzed. Excel and Matlab were used to analyze the viewers' opinions, which indicate the impact that different loss rates have on the transmitted videos. The statistical methods used for evaluating the data are mean and variance. We conclude that people's opinions converge when losses become extremely high on videos with highly variable scene changes.
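
A minimal sketch of the MOS aggregation described above (mean and variance of viewers' 1-5 opinion scores per loss condition); the ratings and condition labels are hypothetical:

```python
# Aggregate 1-5 opinion scores into a Mean Opinion Score and its variance per condition.

from statistics import mean, pvariance

ratings = {          # hypothetical ratings per packet-loss rate
    "0% loss": [5, 5, 4, 5, 4],
    "5% loss": [3, 2, 3, 2, 3],
}

for condition, scores in ratings.items():
    print(f"{condition}: MOS = {mean(scores):.2f}, variance = {pvariance(scores):.2f}")
```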

  • 7.
    Adolfsson, Elin
    Blekinge Institute of Technology, School of Computing.
    Social Media for Managing Customer Contacts [Sociala medier för att hantera kundkontakter] (2012). Independent thesis, Advanced level (degree of Master (One Year)), Student thesis.
    Abstract [sv]

    CONTEXT. As the Internet grows and more and more people join social media platforms such as Facebook, new ways of interacting and communicating with each other emerge. At the same time, the number of organizations and companies is increasing, and they in turn must find new ways to make their marketing stand out and be noticed. To reach the target audience with a message, the organization must be where the audience is. Facebook has therefore become a new part of companies' marketing toolbox. This study focuses on Malmö stadsbibliotek's use of Facebook as a communication tool for managing customer contacts. AIM. The purpose of this master's thesis is to examine Malmö stadsbibliotek's use of Facebook, with a main focus on customer perception and the customer's experience of the communication, in order to determine how Malmö stadsbibliotek should use Facebook for marketing purposes. METHOD. This study is based on an empirical investigation comprising a content analysis of Malmö stadsbibliotek's Facebook page, together with an online survey of those who "like" and follow the page. RESULTS. The results of the empirical study show that Malmö stadsbibliotek faces a broad target audience in which segmentation is necessary to create and spread the right message to the primary audience. Malmö stadsbibliotek publishes status updates fairly evenly across the days of the month, and it is possible to identify different categories of updates with different audience responses and experiences among the followers. The results of the study are presented as charts and tables with descriptions to improve illustration and thereby understanding. CONCLUSION. The conclusion of this master's thesis is that most respondents are positive toward Malmö stadsbibliotek's use of Facebook. Based on various existing theories, marketing on social networks is a good way to interact, communicate and get feedback from customers in order to build good customer relationships. This is very important at present, when the number of companies and the level of advertising noise are as high as they are. There is no simple road to success with only one correct path; it depends on each specific organization and its specific primary target audience.

  • 8.
    Aftab, Adnan
    et al.
    Blekinge Institute of Technology, School of Computing.
    Mufti, Muhammad Nabeel
    Blekinge Institute of Technology, School of Computing.
    Spectrum sensing through implementation of USRP2 (2011). Independent thesis, Advanced level (degree of Master (Two Years)), Student thesis.
    Abstract [en]

    Scarcity of the wireless spectrum has led to the development of new techniques for its better utilization. Demand for high data rates and better voice quality is driving the development of new wireless standards, making the wireless spectrum more limited than ever. In this era of wireless communication, service providers and telecom operators face a dilemma: they need a large share of the wireless spectrum to meet consumers' ever-increasing quality-of-service requirements. This has led to the development of spectrum sensing techniques to find unused spectrum in the available frequency band. The results presented in this thesis help develop a clear understanding of spectrum sensing techniques and a comparison of different spectrum sensing approaches. The experiments carried out using USRP2 and GNU Radio will help the reader understand the concept of the underutilized frequency band and its importance in cognitive radios.
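
A minimal sketch of energy-detection spectrum sensing, the basic idea behind finding unused bands; this is an illustrative assumption, not the thesis' GNU Radio/USRP2 flow graph, and the threshold value is hypothetical:

```python
# Energy detection: estimate the power of a block of baseband samples and
# declare the band occupied when the power exceeds a threshold.

import numpy as np

def band_occupied(samples: np.ndarray, threshold_db: float) -> bool:
    """Return True if the average power of the complex samples exceeds the threshold."""
    power_db = 10 * np.log10(np.mean(np.abs(samples) ** 2))
    return power_db > threshold_db

# Hypothetical samples: noise only vs. noise plus a tone.
rng = np.random.default_rng(0)
noise = (rng.standard_normal(4096) + 1j * rng.standard_normal(4096)) * 0.01
tone = noise + np.exp(2j * np.pi * 0.1 * np.arange(4096))
print(band_occupied(noise, threshold_db=-30), band_occupied(tone, threshold_db=-30))  # False True
```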

  • 9. Afzal, Wasif
    Search-Based Prediction of Software Quality: Evaluations and Comparisons (2011). Doctoral thesis, comprehensive summary (Other academic).
    Abstract [en]

    Software verification and validation (V&V) activities are critical for achieving software quality; however, these activities also constitute a large part of the costs when developing software. Therefore efficient and effective software V&V activities are both a priority and a necessity considering the pressure to decrease time-to-market and the intense competition faced by many, if not all, companies today. It is then perhaps not unexpected that decisions that affects software quality, e.g., how to allocate testing resources, develop testing schedules and to decide when to stop testing, needs to be as stable and accurate as possible. The objective of this thesis is to investigate how search-based techniques can support decision-making and help control variation in software V&V activities, thereby indirectly improving software quality. Several themes in providing this support are investigated: predicting reliability of future software versions based on fault history; fault prediction to improve test phase efficiency; assignment of resources to fixing faults; and distinguishing fault-prone software modules from non-faulty ones. A common element in these investigations is the use of search-based techniques, often also called metaheuristic techniques, for supporting the V&V decision-making processes. Search-based techniques are promising since, as many problems in real world, software V&V can be formulated as optimization problems where near optimal solutions are often good enough. Moreover, these techniques are general optimization solutions that can potentially be applied across a larger variety of decision-making situations than other existing alternatives. Apart from presenting the current state of the art, in the form of a systematic literature review, and doing comparative evaluations of a variety of metaheuristic techniques on large-scale projects (both industrial and open-source), this thesis also presents methodological investigations using search-based techniques that are relevant to the task of software quality measurement and prediction. The results of applying search-based techniques in large-scale projects, while investigating a variety of research themes, show that they consistently give competitive results in comparison with existing techniques. Based on the research findings, we conclude that search-based techniques are viable techniques to use in supporting the decision-making processes within software V&V activities. The accuracy and consistency of these techniques make them important tools when developing future decision-support for effective management of software V&V activities.

  • 10.
    Afzal, Wasif
    Blekinge Institute of Technology.
    Using faults-slip-through metric as a predictor of fault-proneness (2010). In: Proceedings - Asia-Pacific Software Engineering Conference, APSEC, IEEE, 2010. Conference paper (Refereed).
    Abstract [en]

    The majority of software faults are present in a small number of modules; therefore, accurate prediction of fault-prone modules helps improve software quality by focusing testing efforts on a subset of modules. This paper evaluates the use of the faults-slip-through (FST) metric as a potential predictor of fault-prone modules. Rather than predicting the fault-prone modules for the complete test phase, the prediction is done at the specific test levels of integration and system test. We applied eight classification techniques to the task of identifying fault-prone modules, representing a variety of approaches: a standard statistical classification technique (logistic regression), tree-structured classifiers (C4.5 and random forests), a Bayesian technique (Naïve Bayes), machine-learning techniques (support vector machines and back-propagation artificial neural networks) and search-based techniques (genetic programming and artificial immune recognition systems), on FST data collected from two large industrial projects in the telecommunication domain. Results: (i) Using the area under the receiver operating characteristic (ROC) curve and the location of (PF, PD) pairs in the ROC space, GP showed impressive results in comparison with the other techniques for predicting fault-prone modules at both integration and system test levels, and the faults-slip-through metric in general provided good prediction results at the two test levels. The accuracy of GP is statistically significantly better than that of the majority of the techniques for predicting fault-prone modules at integration and system test levels. (ii) The faults-slip-through metric has the potential to be a generally useful predictor of fault-proneness at integration and system test levels.
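
A minimal sketch of the evaluation measures named above, AUC and a (PF, PD) pair in ROC space, computed for a hypothetical fault-proneness classifier; the labels and scores are made up and this is not the paper's pipeline:

```python
# Area under the ROC curve plus probability of false alarm (PF) and detection (PD).

from sklearn.metrics import roc_auc_score, confusion_matrix

y_true  = [0, 0, 1, 1, 0, 1, 0, 1]                      # 1 = fault-prone module (hypothetical)
y_score = [0.1, 0.4, 0.35, 0.8, 0.2, 0.7, 0.6, 0.9]     # predicted scores (hypothetical)

print("AUC =", roc_auc_score(y_true, y_score))           # 0.875 for these values

y_pred = [int(s >= 0.5) for s in y_score]
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print("PF =", fp / (fp + tn), "PD =", tp / (tp + fn))    # false-alarm rate and detection rate
```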

  • 11. Afzal, Wasif
    et al.
    Torkar, Richard
    On the application of genetic programming for software engineering predictive modeling: A systematic review (2011). In: Expert Systems with Applications, ISSN 0957-4174, Vol. 38, no. 9, p. 11984-11997. Article, review/survey (Refereed).
    Abstract [en]

    The objective of this paper is to investigate the evidence for symbolic regression using genetic programming (GP) being an effective method for prediction and estimation in software engineering, when compared with regression/machine learning models and other comparison groups (including comparisons with different improvements over the standard GP algorithm). We performed a systematic review of literature that compared genetic programming models with comparative techniques based on different independent project variables. A total of 23 primary studies were obtained after searching different information sources in the time span 1995-2008. The results of the review show that symbolic regression using genetic programming has been applied in three domains within software engineering predictive modeling: (i) Software quality classification (eight primary studies). (ii) Software cost/effort/size estimation (seven primary studies). (iii) Software fault prediction/software reliability growth modeling (eight primary studies). While there is evidence in support of using genetic programming for software quality classification, software fault prediction and software reliability growth modeling: the results are inconclusive for software cost/effort/size estimation.

  • 12. Afzal, Wasif
    et al.
    Torkar, Richard
    Feldt, Robert
    A systematic review of search-based testing for non-functional system properties (2009). In: Information and Software Technology, ISSN 0950-5849, E-ISSN 1873-6025, Vol. 51, no. 6, p. 957-976. Article in journal (Refereed).
    Abstract [en]

    Search-based software testing is the application of metaheuristic search techniques to generate software tests. The test adequacy criterion is transformed into a fitness function and a set of solutions in the search space are evaluated with respect to the fitness function using a metaheuristic search technique. The application of metaheuristic search techniques for testing is promising due to the fact that exhaustive testing is infeasible considering the size and complexity of software under test. Search-based software testing has been applied across the spectrum of test case design methods; this includes white-box (structural), black-box (functional) and grey-box (combination of structural and functional) testing. In addition, metaheuristic search techniques have also been applied to test non-functional properties. The overall objective of undertaking this systematic review is to examine existing work into non-functional search-based software testing (NFSBST). We are interested in types of non-functional testing targeted using metaheuristic search techniques, different fitness functions used in different types of search-based non-functional testing and challenges in the application of these techniques. The systematic review is based on a comprehensive set of 35 articles obtained after a multi-stage selection process and have been published in the time span 1996-2007. The results of the review show that metaheuristic search techniques have been applied for non-functional testing of execution time, quality of service, security, usability and safety. A variety of metaheuristic search techniques are found to be applicable for non-functional testing including simulated annealing, tabu search, genetic algorithms, ant colony methods, grammatical evolution, genetic programming (and its variants including linear genetic programming) and swarm intelligence methods. The review reports on different fitness functions used to guide the search for each of the categories of execution time, safety, usability, quality of service and security; along with a discussion of possible challenges in the application of metaheuristic search techniques.

  • 13.
    Afzal, Wasif
    et al.
    Blekinge Institute of Technology, School of Computing.
    Torkar, Richard
    Blekinge Institute of Technology, School of Computing.
    Feldt, Robert
    Blekinge Institute of Technology, School of Computing.
    Resampling Methods in Software Quality Classification (2012). In: International Journal of Software Engineering and Knowledge Engineering, ISSN 0218-1940, Vol. 22, no. 2, p. 203-223. Article in journal (Refereed).
    Abstract [en]

    In the presence of a number of algorithms for classification and prediction in software engineering, there is a need to have a systematic way of assessing their performances. The performance assessment is typically done by some form of partitioning or resampling of the original data to alleviate biased estimation. For predictive and classification studies in software engineering, there is a lack of a definitive advice on the most appropriate resampling method to use. This is seen as one of the contributing factors for not being able to draw general conclusions on what modeling technique or set of predictor variables are the most appropriate. Furthermore, the use of a variety of resampling methods make it impossible to perform any formal meta-analysis of the primary study results. Therefore, it is desirable to examine the influence of various resampling methods and to quantify possible differences. Objective and method: This study empirically compares five common resampling methods (hold-out validation, repeated random sub-sampling, 10-fold cross-validation, leave-one-out cross-validation and non-parametric bootstrapping) using 8 publicly available data sets with genetic programming (GP) and multiple linear regression (MLR) as software quality classification approaches. Location of (PF, PD) pairs in the ROC (receiver operating characteristics) space and area under an ROC curve (AUC) are used as accuracy indicators. Results: The results show that in terms of the location of (PF, PD) pairs in the ROC space, bootstrapping results are in the preferred region for 3 of the 8 data sets for GP and for 4 of the 8 data sets for MLR. Based on the AUC measure, there are no significant differences between the different resampling methods using GP and MLR. Conclusion: There can be certain data set properties responsible for insignificant differences between the resampling methods based on AUC. These include imbalanced data sets, insignificant predictor variables and high-dimensional data sets. With the current selection of data sets and classification techniques, bootstrapping is a preferred method based on the location of (PF, PD) pair data in the ROC space. Hold-out validation is not a good choice for comparatively smaller data sets, where leave-one-out cross-validation (LOOCV) performs better. For comparatively larger data sets, 10-fold cross-validation performs better than LOOCV.
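
A minimal sketch contrasting three of the resampling methods compared above (10-fold cross-validation, leave-one-out cross-validation and hold-out validation) on one public data set and one classifier; the data set and classifier are stand-ins for illustration, not the study's GP/MLR setup:

```python
# Compare resampling methods for one classifier on one data set.

from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import KFold, LeaveOneOut, cross_val_score, train_test_split

X, y = load_breast_cancer(return_X_y=True)
clf = LogisticRegression(max_iter=5000)

print("10-fold CV AUC:", cross_val_score(clf, X, y, cv=KFold(10, shuffle=True, random_state=1),
                                          scoring="roc_auc").mean())
print("LOOCV accuracy:", cross_val_score(clf, X, y, cv=LeaveOneOut()).mean())  # one case left out per fold

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
print("hold-out accuracy:", clf.fit(X_tr, y_tr).score(X_te, y_te))
```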

  • 14. Afzal, Wasif
    et al.
    Torkar, Richard
    Feldt, Robert
    Search-based prediction of fault count data (2009). Conference paper (Refereed).
    Abstract [en]

    Symbolic regression, an application domain of genetic programming (GP), aims to find a function whose output has some desired property, like matching target values of a particular data set. While typical regression involves finding the coefficients of a pre-defined function, symbolic regression finds a general function, with coefficients, fitting the given set of data points. The concepts of symbolic regression using genetic programming can be used to evolve a model for fault count predictions. Such a model has the advantages that the evolution is not dependent on a particular structure of the model and is also independent of any assumptions, which are common in traditional time-domain parametric software reliability growth models. This research aims at applying experiments targeting fault predictions using genetic programming and comparing the results with traditional approaches to compare efficiency gains.

  • 15.
    Afzal, Wasif
    et al.
    Blekinge Institute of Technology.
    Torkar, Richard
    Blekinge Institute of Technology.
    Feldt, Robert
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Gorschek, Tony
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Genetic programming for cross-release fault count predictions in large and complex software projects (2010). In: Evolutionary Computation and Optimization Algorithms in Software Engineering: Applications and Techniques / [ed] Chis, Monica, IGI Global, Hershey, USA, 2010. Chapter in book (Refereed).
    Abstract [en]

    Software fault prediction can play an important role in ensuring software quality through efficient resource allocation. This could, in turn, reduce the potentially high consequential costs due to faults. Predicting faults might be even more important with the emergence of short-timed and multiple software releases aimed at quick delivery of functionality. Previous research in software fault prediction has indicated that there is a need i) to improve the validity of results by having comparisons among number of data sets from a variety of software, ii) to use appropriate model evaluation measures and iii) to use statistical testing procedures. Moreover, cross-release prediction of faults has not yet achieved sufficient attention in the literature. In an attempt to address these concerns, this paper compares the quantitative and qualitative attributes of 7 traditional and machine-learning techniques for modeling the cross-release prediction of fault count data. The comparison is done using extensive data sets gathered from a total of 7 multi-release open-source and industrial software projects. These software projects together have several years of development and are from diverse application areas, ranging from a web browser to a robotic controller software. Our quantitative analysis suggests that genetic programming (GP) tends to have better consistency in terms of goodness of fit and accuracy across majority of data sets. It also has comparatively less model bias. Qualitatively, ease of configuration and complexity are less strong points for GP even though it shows generality and gives transparent models. Artificial neural networks did not perform as well as expected while linear regression gave average predictions in terms of goodness of fit and accuracy. Support vector machine regression and traditional software reliability growth models performed below average on most of the quantitative evaluation criteria while remained on average for most of the qualitative measures.

  • 16. Afzal, Wasif
    et al.
    Torkar, Richard
    Blekinge Institute of Technology, School of Computing.
    Feldt, Robert
    Blekinge Institute of Technology, School of Computing.
    Gorschek, Tony
    Blekinge Institute of Technology, School of Computing.
    Prediction of faults-slip-through in large software projects: an empirical evaluation (2014). In: Software Quality Journal, ISSN 0963-9314, E-ISSN 1573-1367, Vol. 22, no. 1, p. 51-86. Article in journal (Refereed).
    Abstract [en]

    A large percentage of the cost of rework can be avoided by finding more faults earlier in a software test process. Therefore, determination of which software test phases to focus improvement work on has considerable industrial interest. We evaluate a number of prediction techniques for predicting the number of faults slipping through to unit, function, integration, and system test phases of a large industrial project. The objective is to quantify improvement potential in different test phases by striving toward finding the faults in the right phase. The results show that a range of techniques are found to be useful in predicting the number of faults slipping through to the four test phases; however, the group of search-based techniques (genetic programming, gene expression programming, artificial immune recognition system, and particle swarm optimization-based artificial neural network) consistently give better predictions, having a representation at all of the test phases. Human predictions are consistently better at two of the four test phases. We conclude that the human predictions regarding the number of faults slipping through to various test phases can be well supported by the use of search-based techniques. A combination of human and an automated search mechanism (such as any of the search-based techniques) has the potential to provide improved prediction results.

  • 17.
    Afzal, Wasif
    et al.
    Blekinge Institute of Technology.
    Torkar, Richard
    Blekinge Institute of Technology.
    Feldt, Robert
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Wikstrand, Greger
    KnowIT YAHM Sweden AB, SWE.
    Search-based prediction of fault-slip-through in large software projects (2010). In: Proceedings - 2nd International Symposium on Search Based Software Engineering, SSBSE 2010, IEEE, 2010, p. 79-88. Conference paper (Refereed).
    Abstract [en]

    A large percentage of the cost of rework can be avoided by finding more faults earlier in a software testing process. Therefore, determination of which software testing phases to focus improvements work on, has considerable industrial interest. This paper evaluates the use of five different techniques, namely particle swarm optimization based artificial neural networks (PSO-ANN), artificial immune recognition systems (AIRS), gene expression programming (GEP), genetic programming (GP) and multiple regression (MR), for predicting the number of faults slipping through unit, function, integration and system testing phases. The objective is to quantify improvement potential in different testing phases by striving towards finding the right faults in the right phase. We have conducted an empirical study of two large projects from a telecommunication company developing mobile platforms and wireless semiconductors. The results are compared using simple residuals, goodness of fit and absolute relative error measures. They indicate that the four search-based techniques (PSO-ANN, AIRS, GEP, GP) perform better than multiple regression for predicting the fault-slip-through for each of the four testing phases. At the unit and function testing phases, AIRS and PSO-ANN performed better while GP performed better at integration and system testing phases. The study concludes that a variety of search-based techniques are applicable for predicting the improvement potential in different testing phases with GP showing more consistent performance across two of the four test phases.
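
A minimal sketch of the absolute relative error measure mentioned above, applied to one hypothetical actual/predicted fault-slip-through count:

```python
# Absolute relative error between actual and predicted fault-slip-through counts.

def absolute_relative_error(actual: float, predicted: float) -> float:
    return abs(actual - predicted) / actual

# Hypothetical counts for one test phase, purely for illustration.
print(absolute_relative_error(actual=40, predicted=33))  # 0.175
```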

  • 18. Ahmad, A
    et al.
    Shahzad, Aamir
    Padmanabhuni, Kumar
    Mansoor, Ali
    Joseph, Sushma
    Arshad, Zaki
    Requirements prioritization with respect to Geographically Distributed Stakeholders (2011). Conference paper (Refereed).
    Abstract [en]

    Requirements selection for software releases can play a vital role in the success of software product. This selection of requirements is done by different requirements prioritization techniques. This paper discusses limitations of these Requirements Prioritization Techniques (100$ Method and Binary Search Tree) with respect to Geographical Distribution of Stakeholders. We conducted two experiments, in this paper, in order to analyze the variations among the results of these Requirements Prioritization Techniques. This paper also discusses attributes that can affect the requirements prioritization when dealing with Geographically Distributed Stakeholders. We conducted first experiment with 100$ Dollar method and Binary Search Tree technique and second experiment has been conducted with modified 100$ Dollar method and Binary search tree technique. Results of these experiments have been discussed in this paper. This paper provides a framework that can be used to identify those requirements that can play an important role in a product success during distributed development.
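
A minimal sketch of the cumulative-voting idea behind the 100$ Method discussed above: each stakeholder distributes 100 points over the requirements and the per-requirement sums give the ranking. The stakeholder names and point allocations are hypothetical:

```python
# 100-dollar (cumulative voting) prioritization over three requirements.

stakeholder_votes = {
    "stakeholder_A": {"R1": 50, "R2": 30, "R3": 20},
    "stakeholder_B": {"R1": 10, "R2": 60, "R3": 30},
    "stakeholder_C": {"R1": 40, "R2": 20, "R3": 40},
}

totals: dict[str, int] = {}
for votes in stakeholder_votes.values():
    assert sum(votes.values()) == 100          # each stakeholder spends exactly 100 "dollars"
    for req, points in votes.items():
        totals[req] = totals.get(req, 0) + points

for req, points in sorted(totals.items(), key=lambda kv: kv[1], reverse=True):
    print(req, points)   # R2 110, R1 100, R3 90
```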

  • 19. Ahmad, Azeem
    et al.
    Göransson, Magnus
    Shahzad, Aamir
    Limitations of the analytic hierarchy process technique with respect to geographically distributed stakeholders (2010). In: Proceedings of World Academy of Science, Engineering and Technology, ISSN 2010-376X, Vol. 70, no. Sept., p. 111-116. Article in journal (Refereed).
    Abstract [en]

    The selection of appropriate requirements for product releases can make a big difference in a product success. The selection of requirements is done by different requirements prioritization techniques. These techniques are based on pre-defined and systematic steps to calculate the requirements relative weight. Prioritization is complicated by new development settings, shifting from traditional co-located development to geographically distributed development. Stakeholders, connected to a project, are distributed all over the world. These geographically distributions of stakeholders make it hard to prioritize requirements as each stakeholder have their own perception and expectations of the requirements in a software project. This paper discusses limitations of the Analytical Hierarchy Process with respect to geographically distributed stakeholders' (GDS) prioritization of requirements. This paper also provides a solution, in the form of a modified AHP, in order to prioritize requirements for GDS. We will conduct two experiments in this paper and will analyze the results in order to discuss AHP limitations with respect to GDS. The modified AHP variant is also validated in this paper.
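
A minimal sketch of the systematic step AHP is built on, deriving priority weights from a pairwise comparison matrix via its principal eigenvector; the judgement values are hypothetical and this is not the paper's modified AHP variant:

```python
# AHP priority weights from a pairwise comparison matrix.

import numpy as np

# Hypothetical judgements for three requirements:
# A is 3x as important as B and 5x as important as C; B is 2x as important as C.
M = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(M)
w = np.abs(eigvecs[:, np.argmax(eigvals.real)].real)   # principal eigenvector
print(w / w.sum())  # relative weights, roughly [0.65, 0.23, 0.12]
```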

  • 20.
    Ahmad, Azeem
    et al.
    Blekinge Institute of Technology, School of Computing.
    Kolla, Sushma Joseph
    Blekinge Institute of Technology, School of Computing.
    Effective Distribution of Roles and Responsibilities in Global Software Development Teams (2012). Independent thesis, Advanced level (degree of Master (Two Years)), Student thesis.
    Abstract [en]

    Context. Industry is moving from co-located form of development to a distributed development in order to achieve different benefits such as cost reduction, access to skillful labor and around the clock working etc. This transfer requires industry to face different challenges such as communication, coordination and monitoring problems. Risk of project failure can be increased, if industry does not address these problems. This thesis is about providing the solutions of these problems in term of effective roles and responsibilities that may have positive impact on GSD team. Objectives. In this study we have developed framework for suggesting roles and responsibilities for GSD team. This framework consists of problems and casual dependencies between them which are related to team’s ineffectiveness, then suggestions in terms of roles and responsibilities have been presented in order to have an effective team in GSD. This framework, further, has been validated in industry through a survey that determines which are the effective roles and responsibilities in GSD. Methods. We have two research methods in this study 1) systematic literature review and 2) survey. Complete protocol for planning, conducting and reporting the review as well as survey has been described in their respective sections in this thesis. A systematic review is used to develop the framework whereas survey is used for framework validation. We have done static validation of framework. Results. Through SLR, we have identified 30 problems, 33 chains of problems. We have identified 4 different roles and 40 different responsibilities to address these chains of problems. During the validation of the framework, we have validated the links between suggested roles and responsibilities and chains of problems. Addition to this, through survey, we have identified 20 suggestions that represents strong positive impact on chains of problems in GSD in relation to team’s effectiveness. Conclusions. We conclude that implementation of effective roles and responsibilities in GSD team to avoid different problems require considerable attention from researchers and practitioners which can guarantee team’s effectiveness. Implementation of proper roles and responsibilities has been mentioned as one of the successful strategies for increasing team’s effectiveness in the literature, but which particular roles and responsibilities should be implemented still need to be addressed. We also conclude that there must be basic responsibilities associated with any particular role. Moreover, we conclude that there is a need for further development and empirical validation of different frameworks for suggesting roles and responsibilities in full scale industry trials.

  • 21. Ahmad, Ehsan
    et al.
    Raza, Bilal
    Feldt, Robert
    Assessment and support for software capstone projects at the undergraduate level: A survey and rubrics (2011). Conference paper (Refereed).
    Abstract [en]

    Software engineering and computer science students conduct a capstone project during the final year of their degree programs. These projects are essential in validating that students have gained required knowledge and they can synthesize and use that knowledge to solve real world problems. However, the external requirements on educational programs often do not provide detailed guidelines for how to conduct or support these capstone projects, which may lead to variations among universities. This paper presents the results from a survey conducted at 19 different Pakistani universities of the current management practices and assessment criteria used for the capstone project courses at Undergraduate level. Based upon the results of this survey and similar work on Master Thesis capstone projects in Sweden, we present assessment rubrics for software-related undergraduate capstone projects. We also present recommendations for the continuous improvement of capstone projects.

  • 22.
    Ahmad, Nadeem
    et al.
    Blekinge Institute of Technology, School of Computing.
    Habib, M. Kashif
    Blekinge Institute of Technology, School of Computing.
    Analysis of Network Security Threats and Vulnerabilities by Development & Implementation of a Security Network Monitoring Solution (2010). Independent thesis, Advanced level (degree of Master (Two Years)), Student thesis.
    Abstract [en]

    Communication of confidential data over the internet is becoming more frequent every day. Individuals and organizations are sending their confidential data electronically, and it is also common for hackers to target these networks. Now more than ever, protecting data, software and hardware from viruses is a need and not just a concern. What do you need to know about networks these days? How is security implemented to protect a network? How is security managed? In this paper we try to address these questions and give an idea of where we currently stand with network security.

  • 23.
    Ahmad, Waqar
    et al.
    Blekinge Institute of Technology, School of Computing.
    Riaz, Asim
    Blekinge Institute of Technology, School of Computing.
    Predicting Friendship Levels in Online Social Networks (2010). Independent thesis, Advanced level (degree of Master (Two Years)), Student thesis.
    Abstract [en]

    Abstract Context: Online social networks such as Facebook, Twitter, and MySpace have become the preferred interaction, entertainment and socializing facility on the Internet. However, these social network services also bring privacy issues in more limelight than ever. Several privacy leakage problems are highlighted in the literature with a variety of suggested countermeasures. Most of these measures further add complexity and management overhead for the user. One ignored aspect with the architecture of online social networks is that they do not offer any mechanism to calculate the strength of relationship between individuals. This information is quite useful to identify possible privacy threats. Objectives: In this study, we identify users’ privacy concerns and their satisfaction regarding privacy control measures provided by online social networks. Furthermore, this study explores data mining techniques to predict the levels/intensity of friendship in online social networks. This study also proposes a technique to utilize predicted friendship levels for privacy preservation in a semi-automatic privacy framework. Methods: An online survey is conducted to analyze Facebook users’ concerns as well as their interaction behavior with their good friends. On the basis of survey results, an experiment is performed to justify practical demonstration of data mining phases. Results: We found that users are concerned to save their private data. As a precautionary measure, they restrain to show their private information on Facebook due to privacy leakage fears. Additionally, individuals also perform some actions which they also feel as privacy vulnerability. This study further identifies that the importance of interaction type varies while communication. This research also discovered, “mutual friends” and “profile visits”, the two non-interaction based estimation metrics. Finally, this study also found an excellent performance of J48 and Naïve Bayes algorithms to classify friendship levels. Conclusions: The users are not satisfied with the privacy measures provided by the online social networks. We establish that the online social networks should offer a privacy mechanism which does not require a lot of privacy control effort from the users. This study also concludes that factors such as current status, interaction type need to be considered with the interaction count method in order to improve its performance. Furthermore, data mining classification algorithms are tailor-made for the prediction of friendship levels.
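
A minimal sketch of the classification step described above, using Gaussian Naive Bayes (one of the two well-performing algorithms named) on assumed interaction-based and non-interaction-based features; the feature choice and values are hypothetical, not the thesis data:

```python
# Classify friendship level from interaction features with Naive Bayes.

from sklearn.naive_bayes import GaussianNB

# Hypothetical features per friend: [wall posts, comments, mutual friends, profile visits]
X = [
    [25, 40, 120, 30],
    [2, 1, 15, 1],
    [10, 12, 60, 8],
    [0, 0, 5, 0],
    [30, 22, 90, 25],
    [1, 3, 20, 2],
]
y = ["close", "acquaintance", "close", "acquaintance", "close", "acquaintance"]

model = GaussianNB().fit(X, y)
print(model.predict([[12, 9, 70, 10]]))  # predicted friendship level for a new friend
```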

  • 24.
    Ahmed, Israr
    et al.
    Blekinge Institute of Technology, School of Computing.
    Nadeem, Shahid
    Blekinge Institute of Technology, School of Computing.
    Minimizing Defects Originating from Elicitation, Analysis and Negotiation (E and A&N) Phase in Bespoke Requirements Engineering (2009). Independent thesis, Advanced level (degree of Master (Two Years)), Student thesis.
    Abstract [en]

    Defect prevention (DP) in the early stages of the software development life cycle (SDLC) is much more cost effective than in later stages. The requirements elicitation and analysis & negotiation (E and A&N) phases of the requirements engineering (RE) process are critical and are a major source of requirements defects. A poor E and A&N process may lead to a software requirements specification (SRS) full of defects such as missing, ambiguous, inconsistent, misunderstood, and incomplete requirements. If these defects are only identified and fixed in later stages of the SDLC, they cause major rework at extra cost and effort. Organizations spend about half of their total project budget on avoidable rework, and the majority of defects originate from RE activities. This study is an attempt to prevent requirements-level defects from penetrating into later stages of the SDLC. For this purpose, empirical and literature studies are presented in this thesis. The empirical study is carried out with the help of six companies from Pakistan and Sweden by conducting interviews, and the literature study is done through literature reviews. This study explores the most common requirements defect types, their causes, the severity level of defects (i.e., major or minor), DP techniques (DPTs) and methods, defect identification techniques used in the software development industry, and problems with these DPTs. This study also describes possible major differences between Swedish and Pakistani software companies in terms of defect types and the rate of defects originating from the E and A&N phases. On the basis of the study results, some solutions are proposed to prevent requirements defects during the RE process, thereby minimizing defects originating from the E and A&N phases of RE in bespoke requirements engineering (BESRE).

  • 25.
    Razzak, Mohammad Abdur and Ahmed, Rajib
    Blekinge Institute of Technology, School of Computing.
    Knowledge Management in Distributed Agile Projects (2013). Independent thesis, Advanced level (degree of Master (Two Years)), Student thesis.
    Abstract [en]

    Knowledge management (KM) is essential for success in Global Software Development (GSD), Distributed Software Development (DSD), and Global Software Engineering (GSE). Software organizations are managing knowledge in innovative ways to increase productivity, and one of the major objectives of KM is to improve productivity through effective knowledge sharing and transfer. Therefore, to maintain effective knowledge sharing in distributed agile projects, practitioners need to adopt different types of knowledge sharing techniques and strategies. Distributed projects introduce new challenges to KM, so practices that are used in agile teams become difficult to put into action in distributed development. Although informal communication is the key enabler of knowledge sharing, when an agile project is distributed, informal communication and knowledge sharing are challenged by the low communication bandwidth between distributed team members, as well as by social and cultural distance. In the work presented in this thesis, we give an overview of empirical studies of knowledge management in distributed agile projects. Based on the main theme of this study, we have categorized and reported our findings on major concepts that need empirical investigation. We have classified the main research theme of this thesis into two sub-themes: RT1, knowledge sharing activities in distributed agile projects, and RT2, spatial knowledge sharing in a distributed agile project. The main contributions are: C1, empirical observations regarding knowledge sharing activities in distributed agile projects; C2, empirical observations regarding spatial knowledge sharing in a distributed agile project; and C3, process improvement scope and guidelines for the studied project.

  • 26.
    Ahmed, Nisar
    et al.
    Blekinge Institute of Technology, School of Computing.
    Yousaf, Shahid
    Blekinge Institute of Technology, School of Computing.
    For Improved Energy Economy – How Can Extended Smart Metering Be Displayed? (2011). Independent thesis, Advanced level (degree of Master (Two Years)), Student thesis.
    Abstract [en]

    Context: A District Heating System (DHS) uses a central heating plant to produce and distribute hot water in a community. Such a plant is connected to consumers' premises to provide them with hot water and space heating. Variations in the consumption of heat energy depend on different factors, such as differences in energy prices, living standards, environmental effects and economic conditions. These factors can be managed intelligently with advanced Information and Communication Technology (ICT) tools such as smart metering, a new and emerging technology normally used for metering of District Heating (DH), district cooling, electricity and gas. Traditional meters measure overall energy consumption; in contrast, smart meters can frequently record and transmit energy consumption statistics to both energy providers and consumers using their communication networks and network management systems. Objectives: The first objective of the study was to provide energy consumption/saving suggestions on the smart metering display for accepted consumer behavior proposed by the energy providers. Our second objective was to analyze the financial benefits for the energy providers that could be expected from better consumer behavior. The third objective was to analyze the energy consumption behavior of residential consumers and how it can be supported. The fourth objective of the study was to use the extracted suggestions on consumer behavior to propose an Extended Smart Metering Display for improving energy economy. Methods: A background study was conducted to develop a basic understanding of District Heat Energy (DHE), smart meters and their existing displays, consumer behavior and its effects on energy consumption. Moreover, interviews were conducted with representatives of a smart heat meter manufacturer, energy providers and residential consumers. The interview findings enabled us to propose an Extended Smart Metering Display that satisfies the recommendations received from all the interviewees and the background study. Further, a workshop was conducted to evaluate the proposed Extended Smart Metering Display, involving representatives of the smart heat meter manufacturer and residential energy consumers. DHE providers also contributed to this workshop through comments in an online conversation, for which an evaluation request was sent to member companies of the Swedish District Heating Association. Results: The informants in this research have different levels of experience. Through a systematic procedure we obtained and analyzed findings from all the informants. To fulfill energy demands during peak hours, the informants emphasized presenting efficient energy consumption behavior on the smart heat meters' display. According to the informants, efficient energy consumption behavior can be presented through energy consumption/saving suggestions on the meter display. These suggestions relate to daily activities such as taking baths and showers, cleaning, washing and heating usage. We found that the efficient energy consumption behavior recommended by the energy providers can provide financial improvements both for the energy providers and for the residential consumers. On the basis of these findings, we proposed an Extended Smart Metering Display that presents information in a simple and interactive way. Furthermore, the proposed Extended Smart Metering Display can also help measure consumers' energy consumption behavior effectively. Conclusions: After answering the research questions, we concluded that extending the existing smart heat meters' display can effectively help the energy providers and the residential consumers to utilize resources efficiently. That is, it will not only reduce energy bills for the residential consumers, but will also help the energy providers to save scarce energy and enable them to serve consumers better during peak hours. After deployment of the proposed Extended Smart Metering Display, the energy providers will be able to support consumer behavior in a reliable way and the consumers will easily find and follow the energy consumption/saving guidelines.

  • 27.
    Ahmed, Tanveer
    et al.
    Blekinge Institute of Technology, School of Computing.
    Raju, Madhu Sudhana
    Blekinge Institute of Technology, School of Computing.
    Integrating Exploratory Testing In Software Testing Life Cycle, A Controlled Experiment (2012). Independent thesis, Advanced level (degree of Master (Two Years)), Student thesis.
    Abstract [en]

    Context. Software testing is one of the crucial phases in software development life cycle (SDLC). Among the different manual testing methods in software testing, Exploratory testing (ET) uses no predefined test cases to detect defects. Objectives. The main objective of this study is to test the effectiveness of ET in detecting defects at different software test levels. The objective is achieved by formulating hypotheses, which are later tested for acceptance or rejection. Methods. Methods used in this thesis are literature review and experiment. Literature review is conducted to get in-depth knowledge on the topic of ET and to collect data relevant to ET. Experiment was performed to test hypotheses specific to the three different testing levels : unit , integration and system. Results. The experimental results showed that using ET did not find all the seeded defects at the three levels of unit, integration and system testing. The results were analyzed using statistical tests and interpreted with the help of bar graphs. Conclusions. We conclude that more research is required in generalizing the benefits of ET at different test levels. Particularly, a qualitative study to highlight factors responsible for the success and failure of ET is desirable. Also we encourage a replication of this experiment with subjects having a sound technical and domain knowledge.
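
A minimal sketch of a defect-detection effectiveness measure for such an experiment, the share of seeded defects found at each test level; the counts are hypothetical, not the experiment's data:

```python
# Share of seeded defects detected per test level.

seeded = {"unit": 10, "integration": 8, "system": 12}   # hypothetical seeded defects
found  = {"unit": 7,  "integration": 5, "system": 9}    # hypothetical defects detected with ET

for level in seeded:
    print(f"{level}: {found[level] / seeded[level]:.0%} of seeded defects found")
```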

  • 28.
    Ahmed, Zaheer
    et al.
    Blekinge Institute of Technology, School of Computing.
    Shahzad, Aamir
    Blekinge Institute of Technology, School of Computing.
    Mobile Robot Navigation using Gaze Contingent Dynamic Interface2010Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    Using the eyes as an input modality for different control environments is of great interest for enhancing the bandwidth of human-machine interaction and for providing interaction functions when the use of hands is not possible. Interface design requirements in such implementations are quite different from conventional application areas, since the eyes may have to perform command execution and feedback observation simultaneously. To control the motion of a mobile robot by operator gaze interaction, gaze-contingent regions in the operator interface are used to execute robot movement commands, with different screen areas controlling specific directions. Dwell time is one of the most established techniques for performing an eye click analogous to a mouse click, but repeated dwell time while switching between gaze-contingent regions and feedback regions decreases the performance of the application. We have developed a dynamic gaze-contingent interface in which gaze-contingent regions are merged with feedback regions dynamically. This technique has two advantages: first, it improves the overall performance of the system by eliminating repeated dwell time; second, it reduces operator fatigue by providing a bigger area to fixate in. The operator can monitor feedback with more ease while sending commands at the same time. (A small illustrative sketch of dwell-time selection follows this record.)

    Download full text (pdf)
    FULLTEXT01
    Download full text (pdf)
    FULLTEXT02
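    The dwell-time selection described in the abstract above can be illustrated with a short sketch. This is not the thesis implementation: it assumes a hypothetical stream of (timestamp, x, y) gaze samples and rectangular command regions, and simply fires a command once the gaze has rested inside one region for a configurable dwell threshold.

```python
# Minimal dwell-time "eye click" sketch (illustrative only, not the thesis implementation).
# Assumes gaze samples arrive as (timestamp_seconds, x, y) tuples and that each
# command region is an axis-aligned rectangle on screen.

DWELL_THRESHOLD = 0.8  # seconds of continuous fixation needed to trigger a command (assumed)

REGIONS = {            # hypothetical gaze-contingent regions and the commands they issue
    "forward": (300, 0, 500, 150),
    "left":    (0, 150, 200, 450),
    "right":   (600, 150, 800, 450),
    "reverse": (300, 450, 500, 600),
}

def region_at(x, y):
    """Return the name of the region containing the gaze point, or None."""
    for name, (x0, y0, x1, y1) in REGIONS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def detect_dwell_commands(samples):
    """Yield (timestamp, command) whenever gaze dwells long enough in one region."""
    current, entered, fired = None, None, False
    for t, x, y in samples:
        name = region_at(x, y)
        if name != current:               # gaze moved to another region (or off all regions)
            current, entered, fired = name, t, False
        elif name is not None and not fired and t - entered >= DWELL_THRESHOLD:
            fired = True                  # trigger at most once per continuous dwell
            yield t, name

if __name__ == "__main__":
    # Synthetic 10 Hz gaze trace: fixates "forward" long enough, then glances left briefly.
    trace = [(i / 10.0, 400, 80) for i in range(12)] + [(1.3, 100, 300), (1.4, 100, 300)]
    for t, cmd in detect_dwell_commands(trace):
        print(f"t={t:.1f}s -> command: {cmd}")
```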
  • 29.
    Ahmet, Zeynep
    Blekinge Institute of Technology, School of Computing.
    What Are You Doing And Feeling Right Now?2012Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    Understanding and capturing the game play experiences of players has been of great interest for some time, both in academia and industry. Methods used for eliciting game play experiences have involved observations, biometric data and post-game techniques such as surveys and interviews. This works for games that are played in fixed settings, such as computer or video games. Pervasive games, however, pose a greater challenge for evaluation, as they typically engage players in outdoor environments, which may mean constant movement and a great deal of the players' motor skills being engaged for several hours or days. In this project I explored a new method for eliciting different aspects of the game play experience of pervasive game players, specifically focusing on emotional states and different qualities of immersion. I have centered this work on self-reporting as a means of reporting these aspects of the game play experience. This required an approach to self-reporting that is non-obtrusive, does not take too much of the players’ attention away from the game activities, and is easy to use. To understand the challenges in introducing a new method into a gaming experience, I focused my research on understanding experience, which is a subjective concept. Even though there are methods aiming at capturing the physiological changes during game play, they do not capture players’ interpretations of the gaming situation. By combining this with objective measurements, I was able to gain a comprehensive understanding of the context of use. The resulting designs were two tools, iteratively developed and pre-tested in a tabletop role-playing session before a test run in the pervasive game Interference. From my findings I was able to conclude that using self-reporting tools while playing was successful, especially as the data derived from the tools supported post-game interviews. There were, however, challenges regarding the design and functionality, in particular in outdoor environments, that suggest improvements, as well as considerations on the use of self-reporting as an additional method for data collection.

    Download full text (pdf)
    FULLTEXT01
  • 30.
    Aihara, Diogo Satoru
    Blekinge Institute of Technology, School of Computing.
    Study About the Relationship Between the Model-View-Controller Pattern and Usability2009Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    Usability is one of the most important quality attributes in the new generation of software applications and computational devices. Model-View-Controller, on the other hand, is a well-known software architectural pattern that is widely used in its original form or its variations. The relationship between usability and the use of Model-View-Controller, however, is still unknown. This thesis contributes to this research question with the outcomes of a case study in which a prototype was developed in two different versions: one using Model-View-Controller and another using a widely known object-oriented guideline, the GRASP patterns. Both prototypes were developed from a non-functional prototype with a good level of usability. The prototypes were then compared based on their design and on the usability heuristics proposed by Nielsen. From this study, we found that the use of MVC brings more advantages and disadvantages for the usability of the system than those reported in the literature review. In general, the relationship between MVC and usability is beneficial, easing the implementation of usability features such as validation of input data, evolutionary consistency, multiple views, informing the user of the result of actions, and skipping steps in a process. (A minimal MVC sketch follows this record.)

    Download full text (pdf)
    FULLTEXT01
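    As a companion to the abstract above, here is a minimal Model-View-Controller sketch built around a toy counter feature. It is not the prototype developed in the thesis; the class names and the validation example are illustrative assumptions, chosen only to show how input validation, change notification and multiple views fall naturally out of the pattern.

```python
# A minimal MVC sketch (illustrative only; not the thesis prototype).
# The model holds state and notifies observers, the view renders state and shows
# validation feedback, and the controller validates input before updating the model.

class CounterModel:
    def __init__(self):
        self.value = 0
        self._observers = []

    def attach(self, observer):
        self._observers.append(observer)

    def set_value(self, value):
        self.value = value
        for observer in self._observers:   # multiple views stay consistent automatically
            observer.refresh(self)

class ConsoleView:
    def refresh(self, model):
        print(f"Counter is now {model.value}")

    def show_error(self, message):
        print(f"Error: {message}")

class CounterController:
    def __init__(self, model, view):
        self.model, self.view = model, view

    def handle_input(self, raw):
        try:
            number = int(raw)               # input validation lives outside the view and model
        except ValueError:
            self.view.show_error(f"'{raw}' is not a number")
            return
        self.model.set_value(number)

if __name__ == "__main__":
    model, view = CounterModel(), ConsoleView()
    model.attach(view)
    controller = CounterController(model, view)
    controller.handle_input("5")     # valid input updates the model and refreshes the view
    controller.handle_input("abc")   # invalid input is caught by the controller
```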
  • 31.
    Akbar, Zeeshan
    et al.
    Blekinge Institute of Technology, School of Computing.
    Ali, Asar
    Blekinge Institute of Technology, School of Computing.
    Evaluation of AODV and DSR Routing Protocols of Wireless Sensor Networks for Monitoring Applications2009Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    The deployment of sensor networks, whether planned or random, is increasing in order to monitor physical environments in applications such as military, agriculture, medical transport and industry. In the monitoring of physical environments, the most important application of wireless sensor networks is the monitoring of critical conditions, where information must be sensed and delivered from the environment during an emergency. To respond within a fraction of a second to critical conditions such as explosions, fire or leaking toxic gases, the system must be fast enough, and a key challenge for sensor networks is providing a fast, reliable and fault-tolerant channel to the sink (base station) that receives the events. The main focus of this thesis is to discuss and evaluate the performance of two routing protocols, Ad hoc On-demand Distance Vector (AODV) and Dynamic Source Routing (DSR), for the monitoring of critical conditions, using the key metrics of throughput and end-to-end delay in different scenarios. On the basis of the simulation results, a conclusion is drawn comparing these two routing protocols with respect to end-to-end delay and throughput. (A sketch of how these two metrics can be computed from packet traces follows this record.)

    Download full text (pdf)
    FULLTEXT01
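    The two metrics named in the abstract above, end-to-end delay and throughput, can be computed from per-packet records roughly as sketched below. This assumes the simulator exports send/receive timestamps and packet sizes for each packet; the record format here is hypothetical and is not tied to the authors' tool chain.

```python
# Computing average end-to-end delay and throughput from per-packet records
# (illustrative sketch; the trace format is assumed, not taken from the thesis).

def end_to_end_delay(packets):
    """Mean delay in seconds over packets that actually arrived."""
    delays = [p["recv_time"] - p["send_time"] for p in packets if p["recv_time"] is not None]
    return sum(delays) / len(delays) if delays else float("nan")

def throughput_bps(packets):
    """Delivered bits divided by the observation interval (first send to last receive)."""
    delivered = [p for p in packets if p["recv_time"] is not None]
    if not delivered:
        return 0.0
    bits = sum(p["size_bytes"] * 8 for p in delivered)
    duration = max(p["recv_time"] for p in delivered) - min(p["send_time"] for p in packets)
    return bits / duration if duration > 0 else float("nan")

if __name__ == "__main__":
    # Hypothetical trace: the third packet was lost (recv_time is None).
    trace = [
        {"send_time": 0.00, "recv_time": 0.04, "size_bytes": 512},
        {"send_time": 0.10, "recv_time": 0.16, "size_bytes": 512},
        {"send_time": 0.20, "recv_time": None, "size_bytes": 512},
        {"send_time": 0.30, "recv_time": 0.35, "size_bytes": 512},
    ]
    print(f"avg end-to-end delay: {end_to_end_delay(trace) * 1000:.1f} ms")
    print(f"throughput: {throughput_bps(trace) / 1000:.1f} kbit/s")
```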
  • 32.
    Akhlaq, Muhammad
    Blekinge Institute of Technology, School of Computing.
    A Smart-Dashboard: Augmenting safe & smooth driving2010Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    Annually, road accidents cause more than 1.2 million deaths, 50 million injuries, and US$ 518 billion of economic cost globally. About 90% of the accidents occur due to human errors such as poor awareness, distraction, drowsiness, low training and fatigue. These human errors can be minimized by using an advanced driver assistance system (ADAS), which actively monitors the driving environment and alerts the driver to forthcoming danger; examples include adaptive cruise control, blind spot detection, parking assistance, forward collision warning, lane departure warning, driver drowsiness detection and traffic sign recognition. Unfortunately, such systems are provided only with modern luxury cars because the numerous sensors employed make them very expensive. Camera-based ADAS are therefore being seen as an alternative, because a camera has much lower cost and higher availability, can be used for multiple applications, and can be integrated with other systems. Aiming at developing a camera-based ADAS, we performed an ethnographic study of drivers to find out what information about the surroundings could help drivers avoid accidents. Our study shows that information on the speed, distance, relative position, direction, and size and type of nearby vehicles and other objects would be useful for drivers, and sufficient for implementing most ADAS functions. After considering available technologies such as radar, sonar, lidar, GPS and video-based analysis, we conclude that video-based analysis is the fittest technology, providing all the essential support required for implementing ADAS functions at very low cost. Finally, we propose a Smart-Dashboard system that puts technologies such as a camera, a digital image processor and a thin display into a smart system offering all advanced driver assistance functions. A basic prototype, demonstrating three functions only, was implemented to show that a full-fledged camera-based ADAS can be implemented using MATLAB. (A small forward-collision-warning sketch follows this record.)

    Download full text (pdf)
    FULLTEXT01
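    One of the simplest ADAS functions mentioned above, forward collision warning, can be sketched as a time-to-collision (TTC) check. The sketch assumes the vision pipeline has already estimated the distance to the lead vehicle and both vehicle speeds, and the 2.5 s warning threshold is an illustrative assumption, not a value taken from the thesis.

```python
# Forward-collision warning based on time-to-collision (TTC): illustrative sketch only.
# Distance and speed estimation from the camera is assumed to have happened upstream.

TTC_WARNING_S = 2.5   # warn when projected collision is closer than this (assumed threshold)

def time_to_collision(distance_m, closing_speed_mps):
    """Seconds until collision if both vehicles keep their current speeds."""
    if closing_speed_mps <= 0:        # not closing in on the lead vehicle
        return float("inf")
    return distance_m / closing_speed_mps

def forward_collision_warning(distance_m, ego_speed_mps, lead_speed_mps):
    ttc = time_to_collision(distance_m, ego_speed_mps - lead_speed_mps)
    return ttc < TTC_WARNING_S, ttc

if __name__ == "__main__":
    # Ego vehicle at 25 m/s (90 km/h), lead vehicle at 15 m/s, 20 m ahead.
    warn, ttc = forward_collision_warning(distance_m=20.0, ego_speed_mps=25.0, lead_speed_mps=15.0)
    print(f"TTC = {ttc:.1f} s -> {'WARN driver' if warn else 'no warning'}")
```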
  • 33. Akhlaq, Muhammad
    et al.
    Sheltami, Tarek
    Helgeson, Bo
    Blekinge Institute of Technology, School of Computing.
    Shakshuki, Elhadi
    Designing an integrated driver assistance system using image sensors2012In: Journal of Intelligent Manufacturing, ISSN 0956-5515, E-ISSN 1572-8145, Vol. January, p. 1-24Article in journal (Refereed)
    Abstract [en]

    Road accidents cause a great loss to human lives and assets. Most accidents occur due to human errors, such as poor awareness, distraction, drowsiness, low training, and fatigue. An advanced driver assistance system (ADAS) can reduce these human errors by keeping an eye on the driving environment and warning a driver of upcoming danger. However, these systems come only with modern luxury cars because of their high cost and the complexity of the several sensors employed. Camera-based ADAS are therefore becoming an option due to their lower cost, higher availability, numerous applications and ability to combine with other systems. Aiming at designing a camera-based ADAS, we conducted an ethnographic study of drivers to learn what information about the driving environment would be useful in preventing accidents. It turned out that information on speed, distance, relative position, direction, and size and type of the nearby objects would be useful and sufficient for implementing most of the ADAS functions. Several camera-based techniques are available for capturing the required information. We propose a novel design of an integrated camera-based ADAS that puts technologies, such as five ordinary CMOS image sensors, a digital image processor, and a thin display, into a smart system to offer a dozen advanced driver assistance functions. A basic prototype is also implemented using MATLAB. Our design and the prototype testify that all the required technologies are now available for implementing a full-fledged camera-based ADAS.

  • 34.
    Akhlaq, Usman
    et al.
    Blekinge Institute of Technology, School of Computing.
    Yousaf, Muhammad Usman
    Blekinge Institute of Technology, School of Computing.
    Impact of Software Comprehension in Software Maintenance and Evolution2010Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    The need for change is essential for a software system to remain longer in the market. Change is implemented through maintenance, and successful software maintenance gives birth to a new software release that is a refined form of the previous one. This phenomenon is known as software evolution. To move software from a lower to a better form, maintainers have to become familiar with particular aspects of the software, namely its source code and documentation. Because of the poor quality of documentation, maintainers often have to rely on source code alone, so a thorough understanding of the source code is necessary for effective change implementation. This study explores the code comprehension problems discussed in the literature and prioritizes them according to the severity levels assigned by maintenance personnel in industry. Along with prioritizing the problems, the study also presents the methodologies suggested by maintenance personnel for improving code comprehension. Considering these suggestions during development might help shorten maintenance and evolution time.

    Download full text (pdf)
    FULLTEXT01
  • 35.
    Akhter, Adeel
    et al.
    Blekinge Institute of Technology, School of Computing.
    Azhar, Hassan
    Blekinge Institute of Technology, School of Computing.
    Statistical Debugging of Programs written in Dynamic Programming Language: RUBY2010Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    Debugging is an important and critical phase of the software development process, and a demanding practice in function-based, test-driven development. Software vendors encourage their programmers to practice test-driven development during the initial development phases in order to capture bug traces and the code coverage affected by diagnosed bugs. Application source code with fewer threats of bugs or faulty executions is regarded as efficient and stable, especially for real-time software products. Because the development of software projects relies on a great number of users and testers, an effective fault localization technique is required: one that can highlight the most critical areas of the software system at the code as well as the modular level, so that a debugging algorithm can be applied to the application source code. Nowadays many software systems, simple and complex, work with open bug repositories to localize bugs. Any inconsistency or imperfection in the early development phases of a software product results in a less efficient and less reliable system. Statistical debugging of program source code, with visualization of faults, is an efficient way to select and rank suspicious lines of code. This research provides guidelines for practicing statistical debugging on programs coded in the Ruby programming language, and presents the statistical debugging techniques available for dynamic programming languages. First, the statistical debugging techniques were thoroughly reviewed, along with the different predicate-based approaches followed in previous work in the subject area. Second, a new process of statistical debugging for programs coded in Ruby is introduced by generating dynamic predicates. Results were analyzed by instrumenting multiple Ruby programs of different complexity levels. The analysis of the experimentation performed on the candidate programs shows that SOBER is more efficient and accurate in bug identification than the Cause Isolation Scheme. It is concluded that, despite extensive research in the field of statistical debugging and fault localization, it is not possible to identify the majority of bugs. Moreover, SOBER and the Cause Isolation Scheme are found to be the two most mature and effective statistical debugging algorithms for identifying bugs within software source code. (A simplified predicate-ranking sketch follows this record.)

    Download full text (pdf)
    FULLTEXT01
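    To make the idea of predicate-based statistical debugging concrete, the sketch below ranks predicates by how much more often failing runs are observed when the predicate is true than when it is merely observed. This is a simplified, cooperative-bug-isolation-style score for illustration only; it is not the SOBER or Cause Isolation Scheme implementation evaluated in the thesis, and the run data is made up.

```python
# Ranking suspicious predicates: Increase(P) = P(fail | P true) - P(fail | P observed).
# Simplified illustration of predicate ranking, not the thesis algorithms.

def rank_predicates(runs):
    """
    runs: list of dicts like {"failed": bool, "predicates": {"p1": True, ...}},
    where each predicate maps to whether it was ever observed true in that run.
    Returns (score, predicate) pairs sorted from most to least suspicious.
    """
    names = {p for run in runs for p in run["predicates"]}
    scores = []
    for p in names:
        observed = [r for r in runs if p in r["predicates"]]
        true_runs = [r for r in observed if r["predicates"][p]]
        if not observed or not true_runs:
            continue
        failure = sum(r["failed"] for r in true_runs) / len(true_runs)    # P(fail | P true)
        context = sum(r["failed"] for r in observed) / len(observed)      # P(fail | P observed)
        scores.append((failure - context, p))
    return sorted(scores, reverse=True)

if __name__ == "__main__":
    # Hypothetical instrumented runs: "x > len" is true only in failing runs.
    runs = [
        {"failed": True,  "predicates": {"x > len": True,  "y == 0": False}},
        {"failed": True,  "predicates": {"x > len": True,  "y == 0": True}},
        {"failed": False, "predicates": {"x > len": False, "y == 0": True}},
        {"failed": False, "predicates": {"x > len": False, "y == 0": False}},
    ]
    for increase, predicate in rank_predicates(runs):
        print(f"Increase={increase:+.2f}  {predicate}")
```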
  • 36.
    Akinwande, Gbenga Segun
    Blekinge Institute of Technology, School of Computing.
    Signaling Over Protocols Gateways in Next-Generation Networks2009Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    In this thesis I examine various signalling protocols in both wired and mobile networks, with emphasis on SIGTRAN. SIGTRAN is the protocol suite used in current new-generation and next-generation networks, in particular because it enables a service provider to integrate both wireline and wireless services within the same architecture. This concept is an important component of today’s triple-play communication, and this thesis therefore provides a broad view of signalling and protocol gateways in traditional and next-generation networks. Signal flow in a typical new-generation network was examined by carrying out a discrete-event simulation of a UMTS network using OPNET Modeler 14.5. Through both packet-switching (PS) and circuit-switching (CS) signalling, I was able to examine the QoS of a UMTS network. More precisely, I looked at throughput on the UMTS network by implementing the WFQ and MDRR scheduling schemes. (A simplified weighted-fair-queueing sketch follows this record.)

    Download full text (pdf)
    FULLTEXT01
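    The WFQ scheduling mentioned in the abstract above can be illustrated, in much simplified form, with per-flow finish tags: each packet of a flow gets a tag that grows more slowly for flows with larger weights, and packets are served in tag order. This is an illustrative approximation for a fully backlogged snapshot, not the OPNET WFQ or MDRR models used in the thesis.

```python
# Simplified weighted-fair-queueing sketch based on per-flow finish tags
# (illustrative approximation only; all packets are assumed queued at the same instant).

from collections import defaultdict

def wfq_order(backlog, weights):
    """
    backlog: list of (flow_id, packet_size_bytes), all queued at the same instant.
    weights: dict flow_id -> weight (larger weight = larger share of the link).
    Returns packets in the order they would be transmitted (smallest finish tag first).
    """
    last_finish = defaultdict(float)
    tagged = []
    for seq, (flow, size) in enumerate(backlog):
        finish = last_finish[flow] + size / weights[flow]   # higher weight -> tags grow slower
        last_finish[flow] = finish
        tagged.append((finish, seq, flow, size))            # seq breaks ties in arrival order
    return [(flow, size) for _, _, flow, size in sorted(tagged)]

if __name__ == "__main__":
    # Two backlogged flows with equal packet sizes; flow "A" has twice the weight of "B",
    # so it should get roughly two transmissions for every one of flow "B".
    backlog = [("A", 1000), ("B", 1000)] * 3
    print(wfq_order(backlog, weights={"A": 2.0, "B": 1.0}))
```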
  • 37.
    Alahari, Yeshwanth
    et al.
    Blekinge Institute of Technology, School of Computing.
    Buddhiraja, Prashant
    Blekinge Institute of Technology, School of Computing.
    Analysis of packet loss and delay variation on QoE for H.264 andWebM/VP8 Codecs2011Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    The popularity of multimedia services over the Internet has increased in recent years. These services include Video on Demand (VoD) and mobile TV, which are growing rapidly, and user expectations of video quality are gradually increasing. Different video codecs are used for encoding and decoding. Google has recently introduced the VP8 codec, an open source compression format, to compete with the popular H.264/AVC codec developed by the ITU-T Video Coding Experts Group (VCEG), as a license fee will apply to H.264 from 2016. In this work we compare the performance of H.264/AVC and WebM/VP8 in an emulated environment. NetEm is used as an emulator to introduce delay, delay variation and packet loss. We evaluated user perception of the impaired videos using the Mean Opinion Score (MOS), following the International Telecommunication Union (ITU) recommendation for Absolute Category Rating (ACR), and analyzed the results using statistical methods. It was found that both video codecs exhibit similar performance under packet loss, but for delay variation the H.264 codec shows better results than WebM/VP8. Along with the MOS ratings, we also studied how user feelings and prior online video watching experience affect their perception. (A small MOS-aggregation sketch follows this record.)

    Download full text (pdf)
    FULLTEXT01
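    The MOS aggregation used in studies like the one above can be sketched as follows: ACR ratings on a 1-5 scale are averaged per test condition and reported with a confidence interval. The ratings in the example are made up, and the normal-approximation 95% interval is a common simplification rather than the exact statistical analysis used in the thesis.

```python
# Aggregating 5-point ACR ratings into a Mean Opinion Score (MOS) with a
# normal-approximation 95% confidence interval. Illustrative sketch with made-up ratings.

import math

def mos_with_ci(ratings, z=1.96):
    """Return (mean, half-width of the 95% CI) for a list of 1..5 ACR ratings."""
    n = len(ratings)
    mean = sum(ratings) / n
    variance = sum((r - mean) ** 2 for r in ratings) / (n - 1)   # sample variance
    half_width = z * math.sqrt(variance / n)
    return mean, half_width

if __name__ == "__main__":
    # Hypothetical ratings of one impaired clip from 24 viewers.
    ratings = [4, 4, 3, 5, 4, 3, 4, 2, 4, 3, 4, 5, 3, 4, 4, 3, 5, 4, 3, 4, 4, 3, 2, 4]
    mos, ci = mos_with_ci(ratings)
    print(f"MOS = {mos:.2f} +/- {ci:.2f} (95% CI)")
```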
  • 38.
    Alam, Tariq
    et al.
    Blekinge Institute of Technology, School of Computing.
    Ali, Muhammad
    Blekinge Institute of Technology, School of Computing.
    The Challenge of Usability Evaluation of Online Social Networks with a Focus on Facebook2010Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    In today’s era, online social networks enjoy extensive popularity among internet users. People use online social networks for different purposes such as sharing information, chatting with friends and family, and planning to hang out. It is then no surprise that an online social network should be easy to use and easy to understand. Many researchers have previously evaluated different online social networks, but there is no study that addresses usability concerns about online social networks with a focus on Facebook at an academic level (using students as subjects). The main rationale behind this study is to find out the efficiency of different usability testing techniques from a social network’s point of view, with a focus on Facebook, and the related usability issues. To conduct this research, we adopted a combination of qualitative and quantitative approaches. Graduate students from BTH participated in usability tests. Our findings are that although think-aloud is more efficient than remote testing, the difference is not very significant. From the survey we found usability issues in the Facebook profile, media, picture tagging, chatting and other features.

    Download full text (pdf)
    FULLTEXT01
  • 39.
    Alam, Zahidul
    Blekinge Institute of Technology, School of Computing.
    Usability of a GNU/Linux Distribution from Novice User’s Perspective2009Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    The term Open Source Software (OSS) has been around for a long time in the world of computer science. Open source software development is a process by which we can produce economical, good-quality software whose source code can be reused in the improvement of the software. The success of OSS relies on several factors, e.g. usability, functionality and market focus, but in the end the popularity of the software is measured by the number of users downloading it and by how usable it is for those users. Open source software has achieved a good reputation for stability, security and functionality, but most of this software has been used by expert-level IT users, and from the point of view of general, non-expert users the usability of open source software has faced the most criticism [25, 26, 27, 28, 29, 30]. This factor, the usability issues experienced by general users, is also responsible for the limited distribution of open source software [24]. The development process should apply a “user-centered” methodology [25, 26, 27, 28, 29, 30]. This thesis discusses usability issues in OSS development and how the usability of open source software can be improved. Besides this, I investigate the usability of the free, open source, Linux-based operating system Ubuntu and try to find out how this OSS meets usability standards.

    Download full text (pdf)
    FULLTEXT01
  • 40.
    Al-Daajeh, Saleh
    Blekinge Institute of Technology, School of Computing.
    Balancing Dependability Quality Attributes for Increased Embedded Systems Dependability2009Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    Embedded systems are used in many critical applications where a failure can have serious consequences. Achieving a high level of dependability is therefore an ultimate goal, but to reach it we need to understand the interrelationships between the different dependability quality attributes and other quality attributes of embedded systems. This research study provides indicators of the relationship between the dependability quality attributes and other quality attributes of embedded systems by identifying the impact of architectural tactics as candidate solutions for constructing dependable embedded systems.

    Download full text (pdf)
    FULLTEXT01
  • 41.
    Alfallah, Abdulaziz
    et al.
    Blekinge Institute of Technology, School of Computing.
    Alkaabey, Hieder
    Blekinge Institute of Technology, School of Computing.
    Study the Performance of the Use of SIP for Video Conferencing2011Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    Recent decades have witnessed enormous development in the communications world, especially in internet technology, which has played an important role in advancing human communication; with this development, users demand better communication services such as video conferencing. Video conferencing has become popular because it removes constraints on how people in diverse geographical locations can communicate in real time. SIP is the preferred signalling protocol for video conferencing, but using SIP for video conferencing is affected by delay, which reduces user satisfaction as it decreases QoS. This thesis studies the performance of SIP signalling for video conferencing and describes the causes of delay in SIP session establishment and termination. A literature review and two simulation studies were carried out to examine the effect of specific parameters on session setup delay. The first study used ns-2 to simulate different transport protocols and study their effect on session setup delay. The second study used SIPp to achieve two objectives: to study the relationship between the number of simultaneous calls and both session setup delay and call release delay, and to verify the effect of changing the transport protocol on both session setup delay and call release delay. The results of the first simulation showed that using UDP as the transport protocol gives lower session setup delay than TCP and SCTP. The second simulation showed that both session setup delay and call release delay are directly proportional to the number of simultaneous calls, and that using UDP in mono-socket mode gives lower session setup delay and call release delay. (A small sketch of how these delays can be computed from a SIP message log follows this record.)

    Download full text (pdf)
    FULLTEXT01
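    The two delays studied above can be computed from a timestamped SIP message log roughly as sketched below, taking session setup delay as the time from INVITE to its final 200 OK and call release delay as the time from BYE to its 200 OK. The log format is hypothetical and is not the SIPp or ns-2 output used in the thesis.

```python
# Computing SIP session setup and call release delay from a timestamped message log.
# Illustrative sketch; the (timestamp, message, CSeq method) log format is assumed.

def first_time(log, method, message):
    """Timestamp of the first log entry matching (message, CSeq method), or None."""
    for t, msg, cseq_method in log:
        if msg == message and cseq_method == method:
            return t
    return None

def sip_delays(log):
    setup = first_time(log, "INVITE", "200 OK") - first_time(log, "INVITE", "INVITE")
    release = first_time(log, "BYE", "200 OK") - first_time(log, "BYE", "BYE")
    return setup, release

if __name__ == "__main__":
    # (timestamp_seconds, message, CSeq method) for one hypothetical call.
    log = [
        (0.000, "INVITE",      "INVITE"),
        (0.020, "100 Trying",  "INVITE"),
        (0.180, "180 Ringing", "INVITE"),
        (0.420, "200 OK",      "INVITE"),
        (0.430, "ACK",         "ACK"),
        (9.000, "BYE",         "BYE"),
        (9.060, "200 OK",      "BYE"),
    ]
    setup, release = sip_delays(log)
    print(f"session setup delay: {setup * 1000:.0f} ms, call release delay: {release * 1000:.0f} ms")
```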
  • 42.
    Ali, Hassan
    et al.
    Blekinge Institute of Technology, School of Computing.
    Yazdani, Talha Moiz
    Blekinge Institute of Technology, School of Computing.
    Measurement Tools for IP Multimedia Subsystems (IMS)2009Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    is indeed a milestone. The Next Generation Network (NGN) is an IP network that helps researchers achieve this milestone, and the IP Multimedia Subsystem (IMS) makes it possible to deliver IP multimedia services to users. IMS uses the Session Initiation Protocol (SIP) as its signalling protocol. Since IMS involves a lot of signalling between IMS entities and other network elements, the signalling must guarantee quality of service. Test beds and measurement tools help us perform various tests according to our needs; these tests should be carried out in the early stages of development, together with the designer, in a structured way according to well-defined processes and methods. This thesis first aims to identify useful tools for IMS that can support an appropriate IMS service creation environment. The basic task involves decoding SIP messages and gathering response time information. Finally, a proposal is given for which tool is most suitable for measuring the signalling protocol with regard to performance, exportability and hardware requirements.

    Download full text (pdf)
    FULLTEXT01
  • 43.
    Ali, Hazrat
    Blekinge Institute of Technology, School of Computing.
    A Performance Evaluation of RPL in Contiki2012Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    A Wireless Sensor Network is formed of several small devices with the capability of sensing a physical characteristic and sending it hop by hop to a central node via low-power, short-range transceivers. The sensor network lifetime strongly depends on the routing protocol in use, since the routing protocol is responsible for forwarding traffic and making routing decisions. If the routing decisions made are not intelligent, more re-transmissions occur across the network, which consumes the limited resources of the wireless sensor network such as energy, bandwidth and processing. A careful and extensive performance analysis is therefore needed for the routing protocols used in any wireless sensor network. In this study we investigate objective functions and the most influential parameters on the performance of the Routing Protocol for Low-power and Lossy Networks (RPL) in Contiki (a WSN OS), and then evaluate RPL performance in terms of energy, latency, packet delivery ratio, control overhead and convergence time for the network. We have carried out extensive simulations yielding a detailed analysis of different RPL parameters with respect to the five performance metrics. The study provides an insight into the RPL settings suitable for different application areas. Experimental results show that ETX is the better objective function, and that ContikiRPL provides very efficient network convergence (14 s), control traffic overhead (1300 packets), energy consumption (1.5% radio-on time), latency (0.5 s) and packet delivery ratio (98%) in our sample one-hour RPL simulation with 80 nodes, after careful configuration of the DIO interval minimum/doublings, radio duty cycling, and frequency of application messages. (A small sketch of the ETX and packet delivery ratio metrics follows this record.)

    Download full text (pdf)
    FULLTEXT01
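    Two of the quantities mentioned above can be written down directly: the ETX link metric, commonly defined as 1 / (df x dr) where df and dr are the forward and reverse delivery ratios, and the network-wide packet delivery ratio. The sketch below is illustrative; it does not reproduce Contiki's ETX estimator or the thesis measurement scripts, and the sample numbers are made up.

```python
# ETX link metric and packet delivery ratio, in sketch form (illustrative only).

def etx(forward_delivery, reverse_delivery):
    """ETX = 1 / (df * dr); higher means a worse link (more expected transmissions)."""
    if forward_delivery <= 0 or reverse_delivery <= 0:
        return float("inf")
    return 1.0 / (forward_delivery * reverse_delivery)

def packet_delivery_ratio(sent_per_node, received_at_sink):
    """Fraction of application packets sent by all nodes that reached the sink."""
    sent = sum(sent_per_node.values())
    received = sum(received_at_sink.values())
    return received / sent if sent else float("nan")

if __name__ == "__main__":
    # Hypothetical link probe results: 90% delivery forward, 80% reverse.
    print(f"ETX = {etx(0.9, 0.8):.2f}")                      # ~1.39 expected transmissions
    # Hypothetical per-node counters from a one-hour run with three nodes.
    sent = {"node1": 360, "node2": 360, "node3": 355}
    recv = {"node1": 356, "node2": 351, "node3": 348}
    print(f"PDR = {packet_delivery_ratio(sent, recv) * 100:.1f}%")
```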
  • 44.
    Ali, Israr
    et al.
    Blekinge Institute of Technology, School of Computing.
    Shah, Syed Shahab Ali
    Blekinge Institute of Technology, School of Computing.
    Usability Requirements for GIS Application: Comparative Study of Google Maps on PC and Smartphone2011Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    Context: Smartphones are gaining popularity due to their mobility, computing capacity and energy efficiency. Email, text messaging, navigation and visualizing geo-spatial data through browsers are common smartphone features. Geo-spatial data is collected in computable form and made publicly available, so the need for usability evaluation becomes important as demand increases. Identifying usability requirements is as important as identifying conventional functional requirements in software engineering; non-functional usability requirements are objective and testable using measurable metrics. Objectives: Usability evaluation plays an important role in the interaction design process as well as in identifying user needs and requirements. Comparative usability requirements are identified for the evaluation of a geographical information system (Google Maps) on a personal computer (laptop) and a smartphone (iPhone). Methods: The ISO 9241-11 guide on usability is used as an input model for identifying and specifying the usability level of Google Maps on both the personal computer and the smartphone. The authors set target values for the usability requirements of tasks and questionnaires on each device, such as the acceptable level of task completion, the rate of efficiency and the participants’ agreement with each measure, following ISO 9241-11. The usability test was conducted using the co-discovery technique on six pairs of graduate students. Interviews were conducted to validate the test results, and questionnaires were distributed to get feedback from participants. Results: The non-functional usability requirements were tested using five metrics measuring user performance and satisfaction. In the usability test, the acceptable level of task completion and the rate of efficiency were met on the personal computer but not on the iPhone. In the questionnaire, neither device met the participants’ agreement with each measure, except effectiveness on the personal computer. Usability test, interview and questionnaire feedback are included in the results. Conclusions: The authors provide suggestions based on the test results and identify usability issues for the improvement of Google Maps on the personal computer and the iPhone.

    Download full text (pdf)
    FULLTEXT01
  • 45.
    Ali, Muhammad Usman
    Blekinge Institute of Technology, School of Computing.
    Cloud Computing as a Tool to Secure and Manage Information Flow in Swedish Armed Forces Networks2012Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    In the last few years cloud computing has created much hype in the IT world. It has provided new strategies to cut costs and achieve better utilization of resources. Apart from these benefits, cloud infrastructure has long been discussed for its vulnerabilities and security issues. There is a long list of service providers and clients who have implemented different service structures using cloud infrastructure. Despite all these efforts, many organizations, especially those with higher security concerns, have doubts about data privacy and theft protection in the cloud. This thesis aims to encourage Swedish Armed Forces (SWAF) networks to move to cloud infrastructures, as this is the technology that will make a huge difference and revolutionize service delivery models in the IT world. Organizations avoiding it would lag behind, but at the same time organizations should adopt the cloud strategy that is most reliable and compatible with their requirements. This document provides an insight into different technologies and tools implemented specifically for monitoring and security in the cloud. Much emphasis is given to virtualization technology because cloud computing relies heavily on it. The Amazon EC2 cloud is analyzed from a security point of view. An extensive survey was also conducted to understand market trends and people’s perception of cloud implementation, security threats, cost savings and the reliability of different services provided.

    Download full text (pdf)
    FULLTEXT01
  • 46.
    Ali, Muhammad Usman
    et al.
    Blekinge Institute of Technology, School of Computing.
    Aasim, Muhammad
    Blekinge Institute of Technology, School of Computing.
    Usability Evaluation of Digital Library BTH a case study2009Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    Libraries have for hundreds of years been an important entity for every kind of institution, especially in the educational sector. In the age of computers and the internet, people use electronic resources to fulfill their needs and requirements, and libraries have therefore also moved to computerized systems: people can access and use library resources from their own computers over the internet. This modern way of running a library is called a digital library. Digital libraries are becoming popular for their flexibility of use and because more users can be served at a time. As the number of users increases, issues relevant to interaction also arise when using digital library interfaces and their e-resources. In this thesis we evaluate usability factors and issues in digital libraries, taking the existing digital library system at BTH as a case study. This thesis report describes digital libraries and how users are facilitated by them, and discusses usability issues relevant to digital libraries. Users have been the main source for evaluating and judging usability issues while interacting with and using this digital library. The results obtained showed dissatisfaction among users regarding the usability of BTH’s digital library. The authors used usability evaluation techniques to evaluate the functionality and services provided by the BTH digital library system interface. Moreover, based on the results of our case study, suggestions for improvement of BTH’s digital library are presented. Hopefully, these suggestions will help make the BTH digital library system more usable, efficient and effective for users.

    Download full text (pdf)
    FULLTEXT01
  • 47.
    Ali, Nauman bin
    et al.
    Blekinge Institute of Technology, School of Computing.
    Edison, Henry
    Blekinge Institute of Technology, School of Computing.
    Towards Innovation Measurement in Software Industry2010Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    Context: In today’s highly competitive business environment, with shortened product and technology life-cycles, it is critical for the software industry to continuously innovate. To help an organisation achieve this goal, a better understanding and control of the activities and determinants of innovation is required. This can be achieved through an innovation measurement initiative which assesses innovation capability, output and performance. Objective: This study explores definitions of innovation, innovation measurement frameworks, key elements of innovation and metrics that have been proposed in the literature and used in industry. The degree of empirical validation and the context of the studies were also investigated. It also elicited the perception of innovation, its importance, challenges and the state of practice of innovation measurement in the software industry. Methods: In this study, a systematic literature review, followed by an online questionnaire and face-to-face interviews, was conducted. The systematic review used seven electronic databases: Compendex, Inspec, IEEE Xplore, ACM Digital Library, Business Source Premier, Science Direct and Scopus. Studies were subjected to preliminary, basic and advanced criteria to judge the relevance of papers. The online questionnaire targeted software industry practitioners with different roles and firm sizes. A total of 94 completed and usable responses from 68 unique firms were collected. Seven face-to-face semi-structured interviews were conducted with four industry practitioners and three academics. Results: Based on the findings of the literature review, interviews and questionnaire, a comprehensive definition of innovation was identified which may be used in the software industry. The metrics for evaluation of determinants, inputs, outputs and performance were aggregated and categorised. A conceptual model of the key measurable elements of innovation was constructed from the findings of the systematic review. The model was further refined after feedback from academia and industry through interviews. Conclusions: The importance of innovation measurement is well recognised in both academia and industry. However, innovation measurement is not a common practice in industry. Some of the major reasons are the lack of available metrics and of data collection mechanisms to measure innovation. The organisations which do measure innovation use only a few metrics that do not cover the entire spectrum of innovation. This is partly because of the lack of a consistent definition of innovation in industry. Moreover, there is a lack of empirical validation of the metrics and determinants of innovation. Although there are some static validations, full-scale industry trials are currently missing. For the software industry, a unique challenge is the development of alternative measures, since some of the existing metrics are inapplicable in this context. The conceptual model constructed in this study is one step towards identifying the measurable key aspects of innovation, to understand the innovation capability and performance of software firms.

    Download full text (pdf)
    FULLTEXT01
  • 48.
    Ali, Nauman Bin
    et al.
    Blekinge Institute of Technology, School of Computing.
    Petersen, Kai
    Blekinge Institute of Technology, School of Computing.
    A consolidated process for software process simulation: State of the Art and Industry Experience2012Conference paper (Refereed)
    Abstract [en]

    Software process simulation is a complex task, and in order to conduct a simulation project practitioners require support through a process for software process simulation modelling (SPSM), including what steps to take and what guidelines to follow in each step. This paper provides a literature-based consolidated process for SPSM in which the steps and guidelines for each step are identified through a review of the literature and complemented by experience from applying these recommendations in action research at a large telecommunication vendor. We found five simulation processes in the SPSM literature, resulting in a consolidated seven-step process. The consolidated process was successfully applied at the studied company, and the experiences of doing so are reported.

    Download full text (pdf)
    fulltext
  • 49.
    Ali, Nauman Bin
    et al.
    Blekinge Institute of Technology, School of Computing.
    Petersen, Kai
    Blekinge Institute of Technology, School of Computing.
    Mäntylä, Mika
    Testing highly complex system of systems: An industrial case study2012Conference paper (Refereed)
    Abstract [en]

    Systems of systems (SoS) are highly complex and are integrated on multiple levels (unit, component, system, system of systems). Many of the characteristics of SoS (such as operational and managerial independence, integration of systems into the system of systems, and SoS being composed of complex systems) make their development and testing challenging. Contribution: This paper provides an understanding of SoS testing in large-scale industry settings with respect to challenges and how to address them. Method: The research method used is case study research. As data collection methods we used interviews, documentation, and fault slippage data. Results: We identified challenges related to SoS with respect to fault slippage, test turn-around time, and test maintainability. We also classified the testing challenges into general testing challenges, challenges amplified by SoS, and challenges that are SoS specific. Interestingly, the interviewees agreed on the challenges, even though we sampled them with diversity in mind, which meant that the number of interviews conducted was sufficient to answer our research questions. We also identified solution proposals to the challenges, categorized under four classes: developer quality assurance, function test, testing at all levels, and requirements engineering and communication. Conclusion: We conclude that although over half of the challenges we identified can be categorized as general testing challenges, SoS still have their own unique and amplified challenges stemming from SoS characteristics. Furthermore, the interviews and fault slippage data indicated that different areas of the software process should be improved, which suggests that using only one of these methods would have given an incomplete picture of the challenges in the case company.

  • 50.
    Ali, Sajjad
    et al.
    Blekinge Institute of Technology, School of Computing.
    Ali, Asad
    Blekinge Institute of Technology, School of Computing.
    Performance Analysis of AODV, DSR and OLSR in MANET2010Independent thesis Advanced level (degree of Master (Two Years))Student thesis
    Abstract [en]

    A mobile ad hoc network (MANET) consists of mobile wireless nodes that communicate without any centralized control. A MANET is a self-organized and self-configurable network in which the mobile nodes move arbitrarily and can receive and forward packets as routers. Routing is a critical issue in MANETs and is therefore the focus of this thesis, together with a performance analysis of routing protocols. We compared three routing protocols: AODV, DSR and OLSR. Our simulation tool was OPNET Modeler, and the performance of these routing protocols was analyzed using three metrics: delay, network load and throughput. All three routing protocols are explained in depth together with the metrics. A comparative analysis of these protocols is carried out, and finally a conclusion is presented on which routing protocol is best suited for mobile ad hoc networks.

    Download full text (pdf)
    FULLTEXT01