  • 101.
    Alégroth, Emil
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Karlsson, Arvid
    Cilbuper IT, Gothenburg, SWE.
    Radway, Alexander
    Techship Krokslatts Fabriker, SWE.
    Continuous Integration and Visual GUI Testing: Benefits and Drawbacks in Industrial Practice, 2018. In: Proceedings - 2018 IEEE 11th International Conference on Software Testing, Verification and Validation, ICST 2018, Institute of Electrical and Electronics Engineers Inc., 2018, p. 172-181. Conference paper (Refereed)
    Abstract [en]

    Continuous integration (CI) is growing in industrial popularity, spurred on by market trends towards faster delivery and higher quality software. A key facilitator of CI is automated testing that should be executed, automatically, on several levels of system abstraction. However, many systems lack the interfaces required for automated testing. Others lack test automation coverage of the system under test's (SUT) graphical user interface (GUI) as it is shown to the user. One technique that shows promise in solving these challenges is Visual GUI Testing (VGT), which uses image recognition to stimulate and assert the SUT's behavior. Research has presented the technique's applicability and feasibility in industry but only limited support, from an academic setting, that the technique is applicable in a CI environment. In this paper we present an industrial design research study with the objective of helping to bridge the gap in knowledge regarding VGT's applicability in a CI environment in industry. Results, acquired from interviews, observations and quantitative analysis of 17,567 test executions, collected over 16 weeks, show that VGT provides similar benefits to other automated test techniques for CI. However, several significant drawbacks, such as high costs, are also identified. The study concludes that, although VGT is applicable in an industrial CI environment, its severe challenges require more research and development before the technique becomes efficient in practice. © 2018 IEEE.

  • 102.
    Alégroth, Emil
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Matsuki, Shinsuke
    Veriserve Corporation, JPN.
    Vos, Tanja
    Open University of the Netherlands, NLD.
    Akemine, Kinji
    Nippon Telegraph and Telephone Corporation, JPN.
    Overview of the ICST International Software Testing Contest, 2017. In: Proceedings - 10th IEEE International Conference on Software Testing, Verification and Validation, ICST 2017, IEEE Computer Society, 2017, p. 550-551. Conference paper (Refereed)
    Abstract [en]

    In the software testing contest, practitioners and researchers are invited to pit their test approaches against similar approaches to evaluate their pros and cons and determine which is perceived to be the best. The 2017 iteration of the contest focused on Graphical User Interface-driven testing, which was evaluated on the testing tool TESTONA. The winner of the competition was announced at the closing ceremony of the International Conference on Software Testing, Verification and Validation (ICST), 2017. © 2017 IEEE.

  • 103.
    Amaradri, Anand Srivatsav
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Nutalapati, Swetha Bindu
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Continuous Integration, Deployment and Testing in DevOps Environment, 2016. Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    Context. Owing to a multitude of factors like rapid changes in technology, market needs, and business competitiveness, software companies these days are facing pressure to deliver software rapidly and on a frequent basis. For frequent and faster delivery, companies should be lean and agile in all phases of the software development life cycle. An approach called DevOps, which is based on agile principles, has come into play. DevOps bridges the gap between development and operations teams and facilitates faster product delivery. The DevOps phenomenon has gained wide popularity in the past few years, and several companies are adopting DevOps to leverage its perceived benefits. However, organizations may face several challenges while adopting DevOps. There is a need to obtain a clear understanding of how DevOps functions in an organization.

    Objectives. The main aim of this study is to provide researchers and software practitioners with a clear understanding of how DevOps works in an organization. The objectives of the study are to identify the benefits of implementing DevOps in organizations where agile development is in practice, the challenges faced by organizations during DevOps adoption, the solutions/mitigation strategies to overcome the challenges, the DevOps practices, and the problems faced by DevOps teams during continuous integration, deployment and testing.

    Methods. A mixed methods approach having both qualitative and quantitative research methods is used to accomplish the research objectives. A Systematic Literature Review is conducted to identify the benefits and challenges of DevOps adoption, and the DevOps practices. Interviews are conducted to further validate the SLR findings, and to identify the solutions to overcome DevOps adoption challenges and the DevOps practices. The SLR and interview results are mapped, and a survey questionnaire is designed. The survey is conducted to validate the qualitative data, and to identify other benefits and challenges of DevOps adoption, solutions to overcome the challenges, DevOps practices, and the problems faced by DevOps teams during continuous integration, deployment and testing.

    Results. 31 primary studies relevant to the research are identified for conducting the SLR. After analysing the primary studies, an initial list of the benefits and challenges of DevOps adoption, and the DevOps practices is obtained. Based on the SLR findings, a semi-structured interview questionnaire is designed, and interviews are conducted. The interview data is thematically coded, and a list of the benefits, challenges of DevOps adoption and solutions to overcome them, DevOps practices, and problems faced by DevOps teams is obtained. The survey responses are statistically analysed, and a final list of the benefits of adopting DevOps, the adoption challenges and solutions to overcome them, DevOps practices and problems faced by DevOps teams is obtained.

    Conclusions. Using the mixed methods approach, a final list of the benefits of adopting DevOps, DevOps adoption challenges, solutions to overcome the challenges, practices of DevOps, and the problems faced by DevOps teams during continuous integration, deployment and testing is obtained. The list is clearly elucidated in the document. The final list can aid researchers and software practitioners in obtaining a better understanding regarding the functioning and adoption of DevOps. Also, it has been observed that there is a need for more empirical research in this domain.

  • 104. Ambreen, T.
    et al.
    Ikram, N.
    Usman, Muhammad
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Niazi, M.
    Empirical research in requirements engineering: trends and opportunities, 2016. In: Requirements Engineering, ISSN 0947-3602, E-ISSN 1432-010X, p. 1-33. Article in journal (Refereed)
    Abstract [en]

    Requirements engineering (RE), being a foundation of software development, has gained great recognition in the software industry in recent years. A number of journals and conferences have published a great amount of RE research in terms of various tools, techniques, methods, and frameworks, with a variety of processes applicable in different software development domains. The plethora of empirical RE research needs to be synthesized to identify trends and future research directions. To represent the state of the art of requirements engineering, along with various trends and opportunities of empirical RE research, we conducted a systematic mapping study to synthesize the empirical work done in RE. We used four major databases, IEEE, ScienceDirect, SpringerLink and ACM, and identified 270 primary studies published up to the year 2012. An analysis of the data extracted from the primary studies shows that empirical research work in RE has been on the increase since the year 2000. Requirements elicitation with 22% of the total studies, requirements analysis with 19% and the RE process with 17% are the major focus areas of empirical RE research. Non-functional requirements were found to be the most researched emerging area. The empirical work in the sub-area of requirements validation and verification is scarce and shows a decreasing trend. The majority of the studies (50%) used a case study research method, followed by experiments (28%), whereas experience reports are few (6%). A common trend in almost all RE sub-areas is proposing new interventions. The leading intervention types are guidelines, techniques and processes. The interest in empirical RE research is on the rise as a whole. However, the requirements validation and verification area, despite its recognized importance, lacks empirical research at present. Furthermore, requirements evolution and privacy requirements also have little empirical research. These RE sub-areas need the attention of researchers for more empirical research. At present, the focus of empirical RE research is mostly on proposing new interventions. In the future, there is a need to replicate existing studies as well as to evaluate RE interventions in more real contexts and scenarios. Practitioners' involvement in empirical RE research needs to be increased so that they share their experiences of using different RE interventions and also inform us about the current requirements-related challenges and issues that they face in their work. © 2016 Springer-Verlag London

  • 105.
    Ameerjan, Sharvathul Hasan
    Mälardalen University, School of Innovation, Design and Engineering.
    Predicting and Estimating Execution Time of Manual Test Cases - A Case Study in Railway Domain, 2017. Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    Testing plays a vital role in the software development life cycle by verifying and validating the software's quality. Since software testing is considered an expensive activity, and due to the limitations of budget and resources, it is necessary to know the execution time of the test cases for efficient planning of test-related activities such as test scheduling, prioritizing test cases and monitoring the test progress. In this thesis, an approach is proposed to predict and estimate the execution time of manual test cases written in English natural language. The method uses test specifications and historical data that are available from previously executed test cases. Our approach works by obtaining timing information from each and every step of previously executed test cases. The collected data is used to estimate the execution time for non-executed test cases by mapping them using text from their test specifications. Using natural language processing, text is extracted from the test specification document and mapped with the obtained timing information. After estimating the time from this mapping, a linear regression analysis is used to predict the execution time of non-executed test cases. A case study has been conducted at Bombardier Transportation (BT) where the proposed method is implemented and the results are validated. The obtained results show that the predicted execution time of the studied test cases is close to their actual execution time.
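
    The regression step described in this abstract can be illustrated with a minimal sketch: fit an ordinary least-squares line from a per-test-case feature (here, simply the number of specification steps) to the measured execution time, then predict for a non-executed case. The feature choice and the sample data below are hypothetical and are not taken from the thesis.

      // ols_sketch.go - hypothetical illustration of the prediction step.
      package main

      import "fmt"

      // fitLine returns intercept a and slope b of y ≈ a + b*x via ordinary least squares.
      func fitLine(x, y []float64) (a, b float64) {
          n := float64(len(x))
          var sx, sy, sxx, sxy float64
          for i := range x {
              sx += x[i]
              sy += y[i]
              sxx += x[i] * x[i]
              sxy += x[i] * y[i]
          }
          b = (n*sxy - sx*sy) / (n*sxx - sx*sx)
          a = (sy - b*sx) / n
          return
      }

      func main() {
          // Hypothetical historical data: steps per executed test case and measured minutes.
          steps := []float64{5, 8, 12, 20, 31}
          minutes := []float64{7, 11, 15, 26, 40}
          a, b := fitLine(steps, minutes)
          // Predict the execution time of a non-executed test case with 17 steps.
          fmt.Printf("predicted: %.1f minutes\n", a+b*17)
      }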

  • 106.
    Amiri, Javad Mohammadian
    et al.
    Blekinge Institute of Technology, School of Computing.
    Padmanabhuni, Venkata Vinod Kumar
    Blekinge Institute of Technology, School of Computing.
    A Comprehensive Evaluation of Conversion Approaches for Different Function Points, 2011. Independent thesis Advanced level (degree of Master (Two Years)). Student thesis
    Abstract [en]

    Context: Software cost and effort estimation are important activities for planning and estimation of software projects. One major input for cost and effort estimation is the functional size of software, which can be measured by a variety of methods. With several methods for measuring one entity, converting between the outputs of these methods becomes important. Objectives: In this study we investigate different techniques that have been proposed for conversion between different Functional Size Measurement (FSM) methods. We addressed conceptual similarities and differences between methods, empirical approaches proposed for conversion, evaluation of the proposed approaches and improvement opportunities that are available for current approaches. Finally, we proposed a new conversion model based on the accumulated data. Methods: We conducted a systematic literature review to investigate the similarities and differences between FSM methods and the proposed approaches for conversion. We also identified some improvement opportunities for the current conversion approaches. Sources for articles were IEEE Xplore, Engineering Village, Science Direct, ISI, and Scopus. We also performed snowball sampling to decrease the chance of missing any relevant papers. We further evaluated the existing models for conversion after merging the data from publicly available datasets. By bringing in suggestions for improvement, we developed a new model and then validated it. Results: Conceptual similarities and differences between methods are presented along with all methods and models that exist for conversion between different FSM methods. We also made three major contributions regarding existing empirical methods: for one existing method (piecewise linear regression) we used a systematic and rigorous way of finding the discontinuity point; we evaluated several existing models to test their reliability based on a merged dataset; and finally we accumulated all data from the literature in order to find the nature of the relation between IFPUG and COSMIC using the LOESS regression technique. Conclusions: We concluded that many concepts used by different FSM methods are common, which enables conversion. In addition, statistical results show that the proposed enhancement of the piecewise linear regression model slightly improves the model's test results. Even this small improvement can have a large effect on project cost. The results of the model evaluation show that it is not possible to say which model predicts unseen data better than the others; which model should be used depends on the concerns of the practitioner. Finally, the accumulated data confirms that the empirical relation between IFPUG and COSMIC is not linear and is represented better by two separate lines than by other models. We also noted that, unlike the COSMIC manual's claim that the discontinuity point should be around 200 FP, in the merged dataset the discontinuity point is around 300 to 400. Finally, we proposed a new conversion approach using a systematic procedure and piecewise linear regression. When tested on new data, this model shows improvement in MMRE and Pred(25).
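
    The two-line (piecewise) relation between IFPUG and COSMIC described above can be sketched as follows: split the data at a chosen breakpoint and fit a separate least-squares line on each side. The breakpoint and the data points in this sketch are invented for illustration and do not reproduce the thesis's merged dataset or its systematically derived discontinuity point.

      // piecewise_sketch.go - illustrative piecewise linear conversion, hypothetical data.
      package main

      import "fmt"

      func fit(x, y []float64) (a, b float64) { // ordinary least squares, y ≈ a + b*x
          n := float64(len(x))
          var sx, sy, sxx, sxy float64
          for i := range x {
              sx, sy, sxx, sxy = sx+x[i], sy+y[i], sxx+x[i]*x[i], sxy+x[i]*y[i]
          }
          b = (n*sxy - sx*sy) / (n*sxx - sx*sx)
          a = (sy - b*sx) / n
          return
      }

      func main() {
          const breakpoint = 350.0 // hypothetical discontinuity point (FP)
          ifpug := []float64{90, 150, 220, 300, 410, 520, 640}
          cosmic := []float64{80, 140, 200, 290, 430, 560, 700}

          var loX, loY, hiX, hiY []float64
          for i, v := range ifpug { // split the dataset at the breakpoint
              if v <= breakpoint {
                  loX, loY = append(loX, v), append(loY, cosmic[i])
              } else {
                  hiX, hiY = append(hiX, v), append(hiY, cosmic[i])
              }
          }
          aLo, bLo := fit(loX, loY)
          aHi, bHi := fit(hiX, hiY)

          convert := func(fp float64) float64 { // piecewise conversion, IFPUG FP -> COSMIC CFP
              if fp <= breakpoint {
                  return aLo + bLo*fp
              }
              return aHi + bHi*fp
          }
          fmt.Printf("280 FP -> %.0f CFP, 500 FP -> %.0f CFP\n", convert(280), convert(500))
      }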

  • 107.
    Andersson, Alve
    Blekinge Institute of Technology, School of Computing.
    Att sticka ut i mängden: En studie av tekniker för variation av instansierade modeller, 2013. Independent thesis Basic level (degree of Bachelor). Student thesis
    Abstract [sv]

    Despite recent advances in hardware, real-time rendering of large crowds is still no trivial task. This task is referred to as crowd rendering. Efficient crowd rendering is often based on instancing, but instancing comes with a problem: it creates clones. This thesis aims to investigate and evaluate a number of techniques used to create diversity among instanced models. These techniques will collectively be referred to as varied instancing. Another goal is to determine how many models are needed before varied instancing pays off compared to non-instanced rendering. The method used is to measure the time for each update on the GPU for each technique with the help of a measurement instrument. Each technique has been implemented in an application created specifically for this purpose. The analysis of the measurements resulted in three categories: GPU percentage workload rising for instances and decreasing for polygons, falling for instances and decreasing for polygons, and even for instances and polygons. The number of instances needed before varied instancing pays off compared to non-instanced rendering was determined to be somewhere between 100 and 300 models, depending on the number of polygons.

  • 108.
    Andersson, Björn
    et al.
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Persson, Marie
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Software Reliability Prediction – An Evaluation of a Novel Technique, 2004. Independent thesis Advanced level (degree of Master (One Year)). Student thesis
    Abstract [en]

    Along with continuously increasing computerization, our expectations on software and hardware reliability increase considerably. Therefore, software reliability has become one of the most important software quality attributes. Software reliability modeling based on test data is done to estimate whether the current reliability level meets the requirements for the product. Software reliability modeling also provides possibilities to predict reliability. The costs of software development and testing, together with profit issues in relation to software reliability, are among the main motivations for software reliability prediction. Software reliability prediction currently uses different models for this purpose. Parameters have to be set in order to tune the model to fit the test data. A slightly different prediction model, Time Invariance Estimation (TIE), is developed to challenge the models used today. An experiment is set up to investigate whether TIE could be found useful in a software reliability prediction context. The experiment is based on a comparison between the ordinary reliability prediction models and TIE.

  • 109. Andersson, Emma
    et al.
    Peterson, Anders
    Törnquist Krasemann, Johanna
    Blekinge Institute of Technology, School of Computing.
    Quantifying railway timetable robustness in critical points, 2013. In: Journal of Rail Transport Planning and Management, ISSN 2210-9706, Vol. 3, no 3, p. 95-110. Article in journal (Refereed)
    Abstract [en]

    Several European railway traffic networks experience high capacity consumption during large parts of the day, resulting in delay-sensitive traffic systems with insufficient robustness. One fundamental challenge is therefore to assess the robustness and find strategies to decrease the sensitivity to disruptions. Accurate robustness measures are needed to determine if a timetable is sufficiently robust and to suggest where improvements should be made. Existing robustness measures are useful when comparing different timetables with respect to robustness. They are, however, not as useful for suggesting precisely where and how robustness should be increased. In this paper, we propose a new robustness measure that incorporates the concept of critical points. This concept can be used in the practical timetabling process to find weaknesses in a timetable and to provide suggestions for improvements. In order to quantitatively assess how crucial a critical point may be, we have defined the measure robustness in critical points (RCP). In this paper, we present results from an experimental study in which several measures as well as RCP have been benchmarked. The results demonstrate the relevance of the concept of critical points and of RCP, and how it contributes to the set of already defined robustness measures.

  • 110.
    Andersson, Jesper
    Växjö University, Faculty of Mathematics/Science/Technology, School of Mathematics and Systems Engineering. Datalogi.
    Dynamic Software Architectures, 2007. Doctoral thesis, monograph (Other academic)
    Abstract [en]

    Software architecture is a software engineering discipline that provides notations and processes for high-level partitioning of systems' responsibilities early in the software design process. This thesis is concerned with a specific subclass of systems, systems with a dynamic software architecture. They have practical applications in various domains such as high-availability systems and ubiquitous computing.

    In a dynamic software architecture, the set of architectural elements and the configuration of these elements may change at run-time. These modifications are motivated by changed system requirements or by changed execution environments. The implications of change events may be the addition of new functionality or re-configuration to meet new Quality of Service requirements.

    This thesis investigates new modeling and implementation techniques for dynamic software architectures. The field of Dynamic Architecture is surveyed and a common ground defined. We introduce new concepts and techniques that simplify understanding, modeling, and implementation of systems with a dynamic architecture, with this common ground as our starting point. In addition, we investigate practical use and reuse of quality implementations, where a dynamic software architecture is a fundamental design principle.

    The main contributions are a taxonomy, a classification, and a set of architectural patterns for dynamic software architecture. The taxonomy and classification support analysis, while the patterns affect design and implementation work directly. The investigation of practical applications of dynamic architectures identifies several issues concerned with use and reuse, and discusses alternatives and solutions where possible.

    The results are based on surveys, case studies, and exploratory development of dynamic software architectures in different application domains using several approaches. The taxonomy, classification and architecture patterns are evaluated through several experimental prototypes, among others, a high-performance scientific computing platform.

  • 111.
    Andersson, Jesper
    et al.
    Linnaeus University, Faculty of Science and Engineering, School of Computer Science, Physics and Mathematics.
    Bencomo, Nelly
    Baresi, Luciano
    Lemos, Rogerio de
    Gorla, Alessandra
    Inverardi, Paola
    Vogel, Thomas
    Software Engineering Processes for Self-adaptive Systems, 2012. In: Software Engineering for Self-adaptive Software Systems, Springer, 2012. Chapter in book (Refereed)
    Abstract [en]

    In this paper, we discuss how, for self-adaptive systems, some activities that traditionally occur at development-time are moved to run-time. Responsibilities for these activities shift from software engineers to the system itself, causing the traditional boundary between development-time and run-time to blur. As a consequence, we argue how the traditional software engineering process needs to be reconceptualized to distinguish both development-time and run-time activities, and to support designers in taking decisions on how to properly engineer such systems. Furthermore, we identify a number of challenges related to this required reconceptualization, and we propose initial ideas based on process modeling. We use the Software and Systems Process Engineering Meta-Model (SPEM) to specify which activities are meant to be performed off-line and on-line, and also the dependencies between them. The proposed models should capture information about the costs and benefits of shifting activities to run-time, since such models should support software engineers in their decisions when they are engineering self-adaptive systems.

  • 112.
    Andersson, Jesper
    et al.
    Växjö University, Faculty of Mathematics/Science/Technology, School of Mathematics and Systems Engineering.
    de Lemos, Rogerio
    Malek, Sam
    Weyns, Danny
    Katholieke Universiteit Leuven.
    Reflecting on self-adaptive software systems, 2009. In: Software Engineering for Adaptive and Self-Managing Systems, 2009. SEAMS '09. ICSE Workshop on, 2009, Vol. 0, p. 38-47. Conference paper (Refereed)
  • 113.
    Andersson, Jesper
    et al.
    Växjö University, Faculty of Mathematics/Science/Technology, School of Mathematics and Systems Engineering.
    de Lemos, Rogério
    Malek, Sam
    Weyns, Danny
    Katholieke Universiteit Leuven.
    Modeling Dimensions of Self-Adaptive Software Systems, 2009. In: Software Engineering for Self-Adaptive Systems / [ed] Betty H.C. Cheng, Rogério de Lemos, Holger Giese, Paola Inverardi and Jeff Magee, Springer, 2009, Vol. 5525, p. 27-47. Chapter in book (Other academic)
  • 114.
    Andersson, Jesper
    et al.
    Linnaeus University, Faculty of Science and Engineering, School of Computer Science, Physics and Mathematics.
    Heberle, Andreas
    Kirchner, Jens
    Linnaeus University, Faculty of Science and Engineering, School of Computer Science, Physics and Mathematics.
    Löwe, Welf
    Linnaeus University, Faculty of Science and Engineering, School of Computer Science, Physics and Mathematics.
    Service Level Achievements: Distributed Knowledge for Optimal Service Selection, 2011. In: Proceedings - 9th IEEE European Conference on Web Services, ECOWS 2011 / [ed] Gianluigi Zavattaro, Ulf Schreier, and Cesare Pautasso, IEEE, 2011, p. 125-132. Conference paper (Refereed)
    Abstract [en]

    In a service-oriented setting, where services are composed to provide end user functionality, it is a challenge to find the service components with best-fit functionality and quality. A decision based on information mainly provided by service providers is inadequate as it cannot be trusted in general. In this paper, we discuss service compositions in an open market scenario where an automated best-fit service selection and composition is based on Service Level Achievements instead. Continuous monitoring updates the actual Service Level Achievements which can lead to dynamically changing compositions. Measurements of real life services exemplify the approach.

  • 115.
    Andersson, Madelene
    Blekinge Institute of Technology, Department of Human Work Science and Media Technology.
    Systemutveckling i praktiken: konsten att tillmötesgå den okända användarens krav, 2002. Independent thesis Basic level (degree of Bachelor). Student thesis
    Abstract [sv]

    System development has become more and more concentrated on development for the Web, and this has resulted in larger target groups. It will most surely continue to be so, considering that the Web will be the infrastructure of business and services in the future. A big target group means that the owner of a system can earn a lot of money from the paying users, but that assumes that the system can meet user needs. If a system on the Web does not satisfy the users' demands, they will use the competitor's system instead, because it is only a mouse-click away. That is why the business, already during the development process, has to take the role of the users seriously. Even if all users cannot take part in the process, at least some users can, and it would be a shame not to take advantage of this kind of expert knowledge. This report describes how a system development project can be carried out in practice and what a developer can do to satisfy each user's requirements even though he or she is not specified, and also why a useful system is classified as an investment in the future.

  • 116.
    Andersson, Magnus
    Blekinge Institute of Technology.
    Sökmotoroptimering med analysverktyg, 2018. Independent thesis Basic level (university diploma), 10 credits / 15 HE credits. Student thesis
  • 117.
    Andersson, Marcus
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Nilsson, Alexander
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Improving Integrity Assurances of Log Entries From the Perspective of Intermittently Disconnected Devices, 2014. Student thesis
    Abstract [en]

    It is common today in large corporate environments for system administrators to employ centralized systems for log collection and analysis. The log data can come from any device, from smart-phones to large-scale server clusters. During an investigation of a system failure or suspected intrusion these logs may contain vital information. However, the trustworthiness of this log data must be confirmed. The objective of this thesis is to evaluate the state of the art and provide practical solutions and suggestions in the field of secure logging. In this thesis we focus on solutions that do not require a persistent connection to a central log management system. To this end a prototype logging framework was developed, including client, server and verification applications. The client employs different techniques for signing log entries. The focus of this thesis is to evaluate each signing technique from both a security and a performance perspective. This thesis evaluates "Traditional RSA-signing", "Traditional Hash-chains", "Itkis-Reyzin's asymmetric FSS scheme" and "RSA signing and tick-stamping with TPM", the latter being a novel technique developed by us. In our evaluations we recognized the inability of the evaluated techniques to detect so-called 'truncation attacks', therefore a truncation detection module was also developed which can be used independently of and side-by-side with any signing technique. In this thesis we conclude that our novel Trusted Platform Module technique has the most to offer in terms of log security; however, it does introduce a hardware dependency on the TPM. We have also shown that the truncation detection technique can be used to assure an external verifier of the number of log entries that have at least passed through the log client software.
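
    The "Traditional Hash-chains" technique named in this abstract can be shown in a generic, minimal form: each log entry stores a hash over the previous entry's hash and its own message, so modifying or removing an interior entry invalidates every later link. This is a sketch of the general idea only, not the thesis's prototype framework, and (as the thesis observes) plain hash chaining by itself does not detect truncation of the tail.

      // hashchain_sketch.go - generic sketch of hash-chained log entries.
      package main

      import (
          "crypto/sha256"
          "encoding/hex"
          "fmt"
      )

      type entry struct {
          msg  string
          link string // hex(SHA-256(previous link || msg))
      }

      func appendEntry(log []entry, msg string) []entry {
          prev := ""
          if len(log) > 0 {
              prev = log[len(log)-1].link
          }
          sum := sha256.Sum256([]byte(prev + msg))
          return append(log, entry{msg: msg, link: hex.EncodeToString(sum[:])})
      }

      // verify recomputes the chain; any modified or removed interior entry breaks it.
      func verify(log []entry) bool {
          prev := ""
          for _, e := range log {
              sum := sha256.Sum256([]byte(prev + e.msg))
              if hex.EncodeToString(sum[:]) != e.link {
                  return false
              }
              prev = e.link
          }
          return true
      }

      func main() {
          var log []entry
          for _, m := range []string{"boot", "login alice", "logout alice"} {
              log = appendEntry(log, m)
          }
          fmt.Println("intact:", verify(log))
          log[1].msg = "login mallory" // tamper with an interior entry
          fmt.Println("tampered:", verify(log))
      }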

  • 118.
    Andersson, Oskar
    Mid Sweden University, Faculty of Science, Technology and Media, Department of Information and Communication systems.
    Simulations in 3D research: Can Unity3D be used to simulate a 3D display system? 2016. Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis
    Abstract [en]

    Mid Sweden University is currently researching how to capture more of a scene with a camera and how to create 3D images that do not require extra equipment for the viewer. In the process of this research they have started looking into simulating some of the tests that they wish to conduct. The goal of this project is to research whether the 3D graphics engine Unity3D could be used to simulate these tests, and to what degree. To test this, a simulation was designed and implemented. The simulation used a split display system where each camera is directly connected to a part of the screen and, using the position of the viewer, the correct part of the camera feed is shown. Some literature studies were also done into how current 3D technology works. The simulation was successfully implemented and shows that simple simulation can be done in Unity3D; however, some problems were encountered in the process. The conclusion of the project shows that there is much work left before simulation is viable but that there is potential in the technology and that the research team should continue to investigate it.

  • 119.
    Andersson, Patrik
    et al.
    Blekinge Institute of Technology, School of Computing.
    Johansson, Sakarias
    Blekinge Institute of Technology, School of Computing.
    Rendering with Marching Cubes, looking at Hybrid Solutions, 2012. Independent thesis Basic level (degree of Bachelor). Student thesis
    Abstract [en]

    Marching Cubes is a rendering technique that has many advantages in a lot of areas. It is a technique for representing scalar fields as a three-dimensional mesh. It is used for geographical applications as well as scientific ones, mainly in the medical industry to visually render medical data of the human body. But it is also an interesting technique to explore for use in computer games or other real-time applications since it can create some really interesting renderings. The main focus of this paper is to present a novel hybrid solution using marching cubes and heightmaps to render terrain, and to find out whether it is suitable for real-time applications. The paper follows a theoretical approach as well as an implementational one for the hybrid solution. The results across several tests for different scenarios show that the hybrid solution works well for today's real-time applications using a modern graphics card and CPU (Central Processing Unit).

  • 120.
    Andersson, Pierre
    et al.
    Örebro University, School of Science and Technology.
    Norlander, Arvid
    Örebro University, School of Science and Technology.
    Indoor Positioning Using WLAN, 2012. Independent thesis Basic level (professional degree), 10 credits / 15 HE credits. Student thesis
    Abstract [en]

    This report evaluates various methods that can be used to position a smartphone running the Android platform, without the use of any special hardware or infrastructure and in conditions where GPS is unavailable or unreliable, such as indoors. Furthermore, it covers the implementation of such a system with the use of a deterministic fingerprinting method that is reasonably device independent, a method which involves measuring a series of reference points, called fingerprints, in an area and using those to locate the user.

    The project was carried out on behalf of Sigma, a Swedish software consulting company.
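
    The deterministic fingerprinting method mentioned above can be reduced to a compact sketch: store reference points as vectors of received signal strengths and return the location of the nearest reference point in signal space. The access point identifiers and RSSI values below are invented for illustration and do not come from the report.

      // fingerprint_sketch.go - illustrative nearest-neighbour fingerprint lookup.
      package main

      import (
          "fmt"
          "math"
      )

      type fingerprint struct {
          location string
          rssi     map[string]float64 // access point ID -> signal strength (dBm)
      }

      // distance is the Euclidean distance in signal space over the APs seen in the scan;
      // APs missing from a reference fingerprint get a weak default value.
      func distance(scan, ref map[string]float64) float64 {
          const missing = -100.0
          var sum float64
          for ap, v := range scan {
              r, ok := ref[ap]
              if !ok {
                  r = missing
              }
              sum += (v - r) * (v - r)
          }
          return math.Sqrt(sum)
      }

      func locate(scan map[string]float64, db []fingerprint) string {
          best, bestDist := "", math.Inf(1)
          for _, fp := range db {
              if d := distance(scan, fp.rssi); d < bestDist {
                  best, bestDist = fp.location, d
              }
          }
          return best
      }

      func main() {
          db := []fingerprint{
              {"corridor A", map[string]float64{"ap1": -45, "ap2": -70, "ap3": -82}},
              {"room B101", map[string]float64{"ap1": -60, "ap2": -50, "ap3": -75}},
          }
          scan := map[string]float64{"ap1": -58, "ap2": -52, "ap3": -77}
          fmt.Println("estimated position:", locate(scan, db))
      }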

  • 121.
    Andersson, Rickard
    Umeå University, Faculty of Science and Technology, Department of Computing Science.
    Pseudo-optimal strategies in no-limit poker, 2006. In: ICGA Journal, ISSN 1389-6911, Vol. 29, no 3, p. 143-149. Article in journal (Refereed)
    Abstract [en]

    Games have always been a strong driving force in Artificial Intelligence. In the last ten years huge improvements have been made in perfect-information games like chess and Othello. The strongest computer agents can nowadays beat the strongest human players. This is not the case for imperfect-information games such as poker and bridge, where creating an expert computer player has proven to be much harder. Previous research in poker has addressed either limit poker or simplified variations of poker games. This paper tries to extend known techniques successfully used in limit poker to no-limit. No-limit poker increases the size of the game tree drastically. To reduce the complexity an abstracted model of the game is created. Finding an optimal strategy for the new model is then a minimization problem solved using linear programming techniques. The result is a set of pseudo-optimal strategies for no-limit Texas Hold'em. A bot named AGGROBOT was built from these strategies, and it performs well as long as the players' stack sizes are fairly small.

  • 122.
    Andersson, Tobias
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Brenden, Christoffer
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Parallelism in Go and Java: A Comparison of Performance Using Matrix Multiplication, 2018. Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis
    Abstract [en]

    This thesis makes a comparison between the performance of Go and Java using parallelized implementations of the Classic Matrix Multiplication Algorithm (CMMA). The comparison attempts to only use features for parallelization, goroutines for Go and threads for Java, while keeping other parts of the code as generic and comparable as possible to accurately measure the performance during parallelization.

    In this report we ask the question of how programming languages compare in terms of multithreaded performance. In high-performance systems such as those designed for mathematical calculations or servers meant to handle requests from millions of users, multithreading and by extension performance are vital. We would like to find out if and how much of a difference the choice of programming language could make for these systems in terms of parallelism and multithreading.

    Another motivation is to analyze techniques and programming languages that have emerged that hide the complexity of handling multithreading and concurrency from the user, letting the user specify keywords or commands from which the language takes over and creates and manages the thread scheduling on its own. The Go language is one such example. Is this new technology an improvement over developers coding threads themselves, or is the technology not quite there yet?

    To these ends, experiments were done with multithreaded matrix multiplication, implemented using goroutines for Go and threads for Java, and performed with sets of 4096x4096 matrices. Background programs were limited and each set of calculations was then run multiple times to get average values for each calculation, which were then finally compared to one another.

    Results from the study showed that Go had ~32-35% better performance than Java between 1 and 4 threads, with the difference diminishing to ~2-5% at 8 to 16 threads. The difference, however, was believed to be mostly unrelated to parallelization as both languages maintained near identical performance scaling as the number of threads increased, until the scaling flatlined for both languages at 8 threads and up. Java did continue to gain a slight increase going from 4 to 8 threads, but this was believed to be due to inefficient resource utilization on Java's part or due to Java having better utilization of hyper-threading than Go.

    In conclusion, Go was found to be considerably faster than Java when going from the main thread and up to 4 threads. At 8 threads and onward Java and Go performed roughly equally. Regarding performance differences between the number of threads in the languages themselves, no noticeable performance increase or decrease was found when creating 1 thread versus running the matrix multiplication directly on the main thread for either of the two languages. Coding multithreading in Go was found to be easier than in Java while providing greater or equal performance. Go just requires the 'go' keyword while Java requires thread creation and management. This would put Go in favor for those trying to avoid the complexity of multithreading while also seeking its benefits.
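
    The goroutine-based parallelization compared in this thesis can be illustrated with a small sketch that partitions the result rows of a matrix product across a fixed number of goroutines. The matrix size and worker count here are kept tiny for readability; this is not the thesis's benchmark code, which used 4096x4096 matrices.

      // parmul_sketch.go - row-partitioned parallel matrix multiplication with goroutines.
      package main

      import (
          "fmt"
          "sync"
      )

      // mul computes c = a*b for n x n row-major matrices using `workers` goroutines,
      // each handling a contiguous, disjoint block of rows (so no write conflicts occur).
      func mul(a, b []float64, n, workers int) []float64 {
          c := make([]float64, n*n)
          var wg sync.WaitGroup
          rows := (n + workers - 1) / workers
          for w := 0; w < workers; w++ {
              start, end := w*rows, (w+1)*rows
              if end > n {
                  end = n
              }
              if start >= end {
                  continue
              }
              wg.Add(1)
              go func(lo, hi int) {
                  defer wg.Done()
                  for i := lo; i < hi; i++ {
                      for k := 0; k < n; k++ {
                          aik := a[i*n+k]
                          for j := 0; j < n; j++ {
                              c[i*n+j] += aik * b[k*n+j]
                          }
                      }
                  }
              }(start, end)
          }
          wg.Wait()
          return c
      }

      func main() {
          n := 4
          a := make([]float64, n*n)
          b := make([]float64, n*n)
          for i := range a {
              a[i], b[i] = float64(i), 1
          }
          c := mul(a, b, n, 2) // multiply with 2 goroutines
          fmt.Println(c[:n])   // first row of the product
      }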

  • 123.
    Andersson, Ulf
    Linköping University, Department of Electrical Engineering. Linköping University, The Institute of Technology.
    Betalkort: Slutrapport, 1980. Report (Other academic)
    Abstract [sv]

    The idea for the Universal Card arose about three years ago, as the result of a degree project entitled "ID-kort med minne" (ID card with memory), which investigated the possibility of placing a non-volatile memory on a card. The continuation of that project is described briefly under the heading "Existerande hårdvarusystem" (Existing hardware systems).

    The development of the payment card and Universal Card idea has since been carried out in collaboration with staff at the Department of Electrical Engineering (Institutionen för Systemteknik) within the subject area of information theory at LiTH. I would like to extend warm thanks to them, and especially to Rolf Blom, Robert Forchheimer and Ingemar Ingemarsson, for their contributions to this work.

  • 124. Andjelkovic, Igor
    et al.
    Artho, Cyrille
    Trace Server: A Tool for Storing, Querying and Analyzing Execution Traces, 2011. In: Proc. JPF Workshop 2011, 2011. Conference paper (Refereed)
  • 125.
    Andrade, Hugo
    Mälardalen University, School of Innovation, Design and Engineering.
    Software Product Line Architectures: Reviewing the Literature and Identifying Bad Smells, 2013. Independent thesis Advanced level (degree of Master (One Year)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    The Software Product Line (SPL) paradigm has proven to be an effective way to achieve large-scale reuse in different domains. It takes advantage of common aspects between different products, while also considering product-specific features. The architecture plays an important role in SPL engineering, by providing means to better understand and maintain the product-derivation environment. However, it is difficult to evolve such an architecture because it is not always clear where and how to refactor. The contribution of this thesis is twofold. First, the current state of the art of software Product Line Architectures (PLAs) is investigated through a systematic mapping study. It provides an overview of the field through the analysis and categorization of evidence. The study identifies gaps and trends, and provides future directions for research. Furthermore, this thesis addresses the phenomenon of architectural bad smells in the context of SPLs. A case study provides an investigation of the implications of such structural properties in a variability-based environment. Prior to the search for smells, the architecture of a sample SPL in the text editor domain is recovered from the source code.

  • 126. Andræ, A. S. G.
    et al.
    Möller, Patrik
    KTH, Superseded Departments, Microelectronics and Information Technology, IMIT.
    Liu, J.
    Uncertainty estimation by Monte Carlo simulation applied to life cycle inventory of cordless phones and microscale metallization processes, 2004. In: Proc. Int. Conf. Asian Green Electron., 2004, p. 206-217. Conference paper (Refereed)
    Abstract [en]

    This paper addressed the question of whether there is an environmental advantage of using DECT phones instead of GSM phones in offices. The paper also addresses the environmental compatibility of Electrochemical Pattern Replication (ECPR) compared to classical photolithography-based microscale metallization (CL) for pattern transfer. Both environmental assessments consider electricity consumption and CO2 emissions. The projects undertaken were two comparative studies, of DECT phone/GSM phone and of ECPR/CL, respectively. The research method used was probabilistic uncertainty modelling with a limited number of inventory parameters used in the MATLAB tool. Within the chosen system boundaries and with the uncertainties added to the input data, ECPR is with 100% probability better than CL and the DECT phone is with 90% probability better than the GSM phone.

  • 127.
    Andréasson, Dan
    et al.
    Linköping University, Department of Computer and Information Science, Software and Systems. Linköping University, Faculty of Science & Engineering.
    Morja, Daniel
    Linköping University, Department of Computer and Information Science. Linköping University, Faculty of Science & Engineering.
    Rekonstruktion och optimering av laddningstid för en webbsida i Ruby on Rails, 2015. Independent thesis Basic level (degree of Bachelor), 10,5 credits / 16 HE credits. Student thesis
    Abstract [sv]

    Many organizations are today represented on the Internet in an outdated style, which can negatively affect the visitor's perception of the organization. In this work a website has been reconstructed. The website belongs to an association active in gaming and esports. The reconstruction is intended to give visitors a clear picture of what the association's main activity is, and to integrate the streaming service Twitch in order to give visitors one more reason to revisit the site. In addition, the load time of the start page has been optimized to give a better visitor experience. Using Redis and the eager loading technique, the work shows how the load time of a web page can be reduced.

  • 128.
    Angarita Soto, Angie
    Mälardalen University, School of Innovation, Design and Engineering.
    Design Philosophy for User Friendly Parameter Handler, 2012. Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    DCU2 (Drive Control Unit 2) is an important control system used in applications for train systems that are configured by a set of parameters. Traditionally, parameterization is conducted by using an Excel workbook during software development. The parameters are set up and then exported to the compilation step. Such an approach has a number of disadvantages, e.g., delays in the validation and verification steps, system configuration overhead, and suboptimal system reliability generated by the parameter configurations.

    To improve the parameterization process, this thesis implements a model-based software architecture approach and automotive industry standards via rapid prototyping using the Scrum methodology. We do this by using the Matlab/Simulink, TDL (Time Description Language) and UML (Unified Modeling Language) architectural description languages to enable different views of the software architecture. We then develop, in every Scrum sprint, different prototypes that implement ASAM (Association for Standardization of Automation and Measuring Systems) standards like the XCP protocol over Ethernet (code ASAM MCD-1 XCP V1.1.0) and ASAP2 (code ASAM MCD-2 MC). An evaluation then shows that the thesis successfully implements the previously defined standards using commercial tools from, e.g., Vector, proving that the parameter's unit control can be handled via online calibration and measurement, leading to a significant improvement in Bombardier's software development process in a distributed development environment.

  • 129.
    Annergren, Mariette
    et al.
    KTH, School of Electrical Engineering (EES), Automatic Control.
    Larsson, C. A.
    MOOSE2—A toolbox for least-costly application-oriented input design, 2016. In: SoftwareX, ISSN 2352-7110, Vol. 5, p. 96-100. Article in journal (Refereed)
    Abstract [en]

    MOOSE2 is a MATLAB®-based toolbox for solving least-costly application-oriented input design problems in system identification. MOOSE2 provides the spectrum of the input signal to be used in the identification experiment made to estimate a linear parametric model of the system. The objective is to find a spectrum that minimizes experiment cost while fulfilling constraints imposed in the experiment and on the obtained model. The constraints considered by MOOSE2 are: frequency or power constraints on the signal spectra in the experiment, and application or quality specifications on the obtained model.

  • 130.
    Ansari, Rehan Javed.
    et al.
    Blekinge Institute of Technology, School of Computing.
    Dodda, Sandhya Rani.
    Blekinge Institute of Technology, School of Computing.
    The Use of SCRUM in Global Software Development – An Empirical Study, 2010. Independent thesis Advanced level (degree of Master (Two Years)). Student thesis
    Abstract [en]

    The trend for global software development is increasing day by day. Global software development takes into account the development of software globally, bringing in knowledge about the market. There are several challenges that have an impact on developing software globally. In this study we investigate several management challenges faced in globally distributed projects and the Scrum practices that are being implemented by organizations. We also study the benefits of implementing Scrum. For our research, we have performed a literature review to find out the various challenges in managing globally distributed software projects and the various Scrum practices that are discussed. We conducted industrial case studies to find out the challenges being faced by the organizations in globally distributed projects, the various Scrum practices that they follow to overcome those challenges, and the benefits of implementing Scrum in GSD. In order to provide quantitative support for the management challenges and Scrum practices discussed in the literature review, surveys have been conducted. We used grounded theory for analyzing the data gathered during the study. There are several challenges that are being faced by the organizations while developing software globally. There are also several Scrum practices that were identified from the interviews. A few challenges remain that need to be addressed in future research.

  • 131.
    Ansari, Umair Azeem
    et al.
    Blekinge Institute of Technology, Faculty of Engineering, Department of Industrial Economics.
    Ali, Syed Umair
    Blekinge Institute of Technology, Faculty of Engineering, Department of Industrial Economics.
    Application of LEAN and BPR principles for Software Process Improvement (SPI): A case study of a large software development organization, 2014. Independent thesis Advanced level (degree of Master (One Year)). Student thesis
    Abstract [en]

    Background: Like other businesses, the failures and problems faced by the software development industry over time have motivated experts to look for software process improvement approaches to create quality software rapidly, repeatedly, and reliably. Objective: The purpose of this study is to evaluate if and how Lean thinking and principles, primarily associated with the auto manufacturing industry, can be applied to the software development lifecycle for Software Process Improvement (SPI). The secondary aim is to analyze how BPR can be integrated with Lean software development for process improvement. Method: A derived Lean-BPR adoption pattern model is used as the theoretical framework for this thesis. The seven Lean software development principles along with the four-step BPR process are selected as process improvement patterns, which affect the KPIs of a software organization. This research study incorporates both qualitative and quantitative methods and data to analyze the objectives of the study. The methodological framework of Plan-Do-Check-Act is used in the case study to implement process re-engineering incorporating Lean and BPR principles. The impact of adopting the Lean and BPR principles is assessed in terms of cost, productivity, quality of products and resource management. Results: Application of Lean and BPR principles for software process improvement in the organization under study resulted in a 79% improvement in test coverage, a 60% reduction in time for test execution and analysis, and a 44% reduction in the cost of fixing defects that had previously been passed on to the customer. Conclusion: Based on the case study results, it can be concluded that Lean, a bottom-up approach characterized by the empowerment of employees to analyze and improve their own working process, can be effectively combined with the IT-centric, traditionally top-down BPR approach for improving KPIs and software processes.

  • 132.
    Antinyan, Vard
    et al.
    Department of Computer Science and Engineering, Chalmers University, Gothenburg, Sweden.
    Staron, Miroslaw
    Department of Computer Science and Engineering, Chalmers University, Gothenburg, Sweden.
    Derehag, Jesper
    Ericsson, Gothenburg, Sweden.
    Runsten, Mattias
    AB Volvo, Gothenburg, Sweden.
    Wikström, Erik
    Ericsson, Gothenburg, Sweden.
    Meding, Wilhelm
    Ericsson, Gothenburg, Sweden.
    Henriksson, Anders
    AB Volvo, Gothenburg, Sweden.
    Hansson, Jörgen
    University of Skövde, School of Informatics.
    Identifying Complex Functions: By Investigating Various Aspects of Code Complexity, 2015. In: Proceedings of 2015 Science and Information Conference (SAI): July 28-30, 2015, London, United Kingdom, IEEE Press, 2015, p. 879-888. Conference paper (Refereed)
    Abstract [en]

    The complexity management of software code has become one of the major problems in the software development industry. With growing complexity the maintenance effort of code increases. Moreover, various aspects of complexity create difficulties for complexity assessment. The objective of this paper is to investigate the relationships between various aspects of code complexity and propose a method for identifying the most complex functions. We have conducted an action research project in two software development companies and complemented it with a study of three open source products. Four complexity metrics are measured, and their nature and mutual influence are investigated. The results and possible explanations are discussed with software engineers in industry. The results show that there are two distinguishable aspects of complexity of source code functions: internal and outbound complexity. These have an inverse relationship. Moreover, their product does not seem to be greater than a certain limit, regardless of software size. We present a method that permits identification of the most complex functions considering the two aspects of complexity. The evaluation shows that the use of the method is effective in industry: it enables identification of the 0.5% most complex functions out of thousands of functions for reengineering.

  • 133.
    Antinyan, Vard
    et al.
    Computer Science and Engineering, University of Gothenburg, Gothenburg, Sweden / Computer Science and Engineering, Chalmers, Gothenburg, Sweden.
    Staron, Miroslaw
    Computer Science and Engineering, University of Gothenburg, Gothenburg, Sweden / Computer Science and Engineering, Chalmers, Gothenburg, Sweden.
    Sandberg, Anna
    Ericsson, Sweden.
    Hansson, Jörgen
    University of Skövde, School of Informatics.
    A Complexity Measure for Textual Requirements, 2016. In: Proceedings of the 26th International Workshop on Software Measurement (IWSM) and the 11th International Conference on Software Process and Product Measurement (Mensura), IWSM-Mensura 2016 / [ed] Jens Heidrich & Frank Vogelezang, IEEE, 2016, p. 148-158. Conference paper (Refereed)
    Abstract [en]

    Unequivocally understandable requirements are vital for the software design process. However, in practice it is hard to achieve the desired level of understandability, because in large software products a substantial number of requirements tend to have ambiguous or complex descriptions. Over time such requirements decelerate the development speed and increase the risk of late design modifications; therefore finding and improving them is an urgent task for software designers. Manual reviewing is one way of addressing the problem, but it is effort-intensive and critically slow for large products. Another way is using measurement, in which case one needs to design effective measures. In recent years there have been great endeavors in creating and validating measures for requirements understandability: most of the measures have focused on ambiguous patterns. While ambiguity is one property that has a major effect on understandability, there is also another important property, complexity, which also has a major effect on understandability but is relatively less investigated. In this paper we define a complexity measure for textual requirements through an action research project in a large software development organization. We also present its evaluation results from three large companies. The evaluation shows that there is a significant correlation between the measurement values and the manual assessment values of practitioners. We recommend that this measure be used together with previously created ambiguity measures as a means for automated identification of complex specifications.

  • 134.
    Anton, Andersson
    et al.
    Linnaeus University, Faculty of Technology, Department of Computer Science.
    Runbert, Johan
    Linnaeus University, Faculty of Technology, Department of Computer Science.
    Cross-platform Mobile Development and Internet of Things: Developing a cross-platform mobile application using web technologies to interact with smart things2015Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE creditsStudent thesis
    Abstract [en]

    Today more and more objects in our daily lives are getting connected to the Internet. This phenomenon is called the Internet of Things and is a way for physical things such as cars, buildings or even bus stations to access and communicate with other objects over the Internet. The problem is that for every Internet of Things device, an application is often needed in order to communicate with it. Developing mobile applications in a separate programming language for each operating system can be an expensive and time-consuming task.

    In this thesis, we implement and evaluate a cross-platform mobile solution that lets users interact with smart things using the advantages of web technologies. To relate our work to previous findings in this area, two literature reviews have been performed to establish the state of the art in cross-platform mobile development frameworks and in smart-thing technologies used for interacting with physical objects. The result is a mobile application, developed using PhoneGap and jQuery Mobile, that interacts with iBeacons and lets students inside a university building get directions and schedules for different rooms.

    The application received good results in a couple of usability studies and performed well when its performance was measured. The outcome shows that the web technologies that exist today are a viable alternative to native mobile applications for interacting with smart things such as tagging technologies.

  • 135.
    Anwar, Waleed
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Software Quality Characteristics Tested For Mobile Application Development: Literature Review and Empirical Survey2015Independent thesis Advanced level (degree of Master (One Year)), 10 credits / 15 HE creditsStudent thesis
    Abstract [en]

    Smartphone use is increasing day by day, and with it the number of app users. Due to the growing use of apps, the testing of mobile applications should be done correctly and flawlessly to ensure their effectiveness.

  • 136.
    Aouachria, Moufida
    et al.
    Universite du Quebec a Montreal, CAN.
    Leshob, Abderrahmane
    Universite du Quebec a Montreal, CAN.
    Gonzalez-Huerta, Javier
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Ghomari, Abdessamed Réda
    Ecole nationale superieure d'Informatique, DZA.
    Hadaya, Pierre
    Universite du Quebec a Montreal, CAN.
    Business Process Integration: How to Achieve Interoperability through Process Patterns2017In: Proceedings - 14th IEEE International Conference on E-Business Engineering, ICEBE 2017 - Including 13th Workshop on Service-Oriented Applications, Integration and Collaboration, SOAIC 207, Institute of Electrical and Electronics Engineers Inc. , 2017, p. 109-117Conference paper (Refereed)
    Abstract [en]

    Business process integration (BPI) is a crucial technique for supporting inter-organizational business interoperability. BPI allows the automation of business processes and the integration of systems across numerous organizations. The integration of organizations' process models is one of the most frequently addressed and used approaches to achieve BPI. However, this model integration is complex and requires designers to have extensive experience, in particular when the organizations' business processes are incompatible. This paper considers the issue of modeling cross-organization processes out of a collection of organizations' private process models. To this end, we propose six adaptation patterns to resolve incompatibilities when combining organizations' processes. Each pattern is formalized with a workflow net. © 2017 IEEE.
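
    The sketch below shows, under stated assumptions, what "formalized with a workflow net" can mean in practice: a small Petri net with a single source place i and a single sink place o whose transitions are fired in order. The example net is hypothetical; the paper's six adaptation patterns are not reproduced here.

        # Minimal Petri-net machinery; the order-handling net below is an invented example.
        class PetriNet:
            def __init__(self, marking):
                self.marking = dict(marking)   # place -> token count
                self.transitions = {}          # name -> (input places, output places)

            def add_transition(self, name, inputs, outputs):
                self.transitions[name] = (inputs, outputs)

            def enabled(self, name):
                inputs, _ = self.transitions[name]
                return all(self.marking.get(p, 0) > 0 for p in inputs)

            def fire(self, name):
                if not self.enabled(name):
                    raise ValueError(f"transition {name} is not enabled")
                inputs, outputs = self.transitions[name]
                for p in inputs:
                    self.marking[p] -= 1
                for p in outputs:
                    self.marking[p] = self.marking.get(p, 0) + 1

        net = PetriNet({"i": 1})                        # workflow nets start with one token in i
        net.add_transition("receive_order", ["i"], ["ordered"])
        net.add_transition("ship", ["ordered"], ["o"])
        net.fire("receive_order")
        net.fire("ship")
        print(net.marking)                              # {'i': 0, 'ordered': 0, 'o': 1}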

  • 137. Appel, André
    et al.
    Herold, Sebastian
    Clausthal University of Technology.
    Klus, Holger
    Rausch, Andreas
    Modelling the CoCoME with DisCComp2008In: The Common Component Modeling Example: Comparing Software Component Models, Springer, 2008, p. 267-296Chapter in book (Refereed)
  • 138.
    Aravind, Meera
    Mälardalen University, School of Innovation, Design and Engineering.
    Event-Based Messaging Architecture for Vehicular Internet of Things (IoT) Platforms2017Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE creditsStudent thesis
    Abstract [en]

    Internet of Things (IoT) has revolutionized transportation systems by connecting vehicles, consequently enabling their tracking as well as the monitoring of driver activities. The IoT platform for most vehicles typically consists of 1) an on-board system consisting of the communication unit, sensors and a set of ECUs that are interconnected using a CAN network, 2) an off-board system consisting of the applications deployed on servers (e.g., in the cloud) that process the data sent by the communication unit over the Internet, and 3) mobile devices such as a mobile phone or a computer that communicate with the on-board and off-board systems. Such an IoT platform requires a significant amount of data to be sent from the on-board system to the off-board servers, contributing to high network usage. There are two main architectural paradigms for sending data: 1) interval-based architectures, in which data is sent at regular intervals, and 2) event-based architectures, in which data is sent whenever relevant events occur. Currently (e.g., at Scania), the data is sent at regular intervals, i.e., using an interval-based approach. In this case, data is sent even if it is not relevant for reporting, leading to a waste of network resources, e.g., when the data does not change considerably compared to the previously sent value. Sending data in an event-based manner, when the data is relevant for reporting, e.g., when it changes significantly, reduces the network usage compared to the interval-based approach. In this thesis, we investigate the possibility of using an event-based architecture to send data from the on-board system to the off-board system in order to reduce network usage and improve the accuracy of the data available off-board. We first propose an event-based architecture for data transfer in the context of the Internet of vehicles. We then implement a simulator to evaluate our proposed architecture for the specific case of position data. Finally, we perform extensive experiments varying different parameters and compare, for example, the average message size per minute and the average off-board error distance. The results show that our event-based architecture improves the accuracy of the data available at the off-board system through a careful selection of events. Moreover, we found that our event-based architecture significantly decreases the frequency of sending messages, particularly during highway driving, leading to reduced average data transfer rates. Our results enable a customer to perform trade-offs between accuracy and data transfer rates. Future work will aim at implementing the event-based architecture on a real platform, as well as investigating the possibility of using it for more accurate prediction by incorporating additional details such as the final destination of the vehicle and odometer values.
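
    A minimal sketch of the event-based idea for the position-data case: a sample is forwarded only when the vehicle has moved further than a threshold from the last reported position. The threshold value, coordinate format and message contents are assumptions for illustration, not the actual on-board protocol.

        # Event-based position reporting: suppress samples that are too close to the
        # previously sent position, trading some accuracy for less network traffic.
        import math

        class EventBasedReporter:
            def __init__(self, threshold_m=50.0):
                self.threshold_m = threshold_m
                self.last_sent = None

            def on_position(self, x_m, y_m):
                """Return a message to send, or None if the sample should be suppressed."""
                if self.last_sent is None or math.dist((x_m, y_m), self.last_sent) >= self.threshold_m:
                    self.last_sent = (x_m, y_m)
                    return {"x": x_m, "y": y_m}
                return None

        reporter = EventBasedReporter()
        for pos in [(0, 0), (10, 0), (60, 0), (65, 0), (130, 0)]:
            msg = reporter.on_position(*pos)
            if msg is not None:
                print("send", msg)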

  • 139.
    Ardagna, Danilo
    et al.
    Politecnico di Milano, Italy.
    Bernardi, Simona
    University Center for Defense, Spain.
    Gianniti, Eugenio
    Politecnico di Milano, Italy.
    Aliabadi, Soroush Karimian
    Sharif University of Technology, Iran.
    Perez-Palacin, Diego
    Politecnico di Milano, Italy.
    Requeno, José Ignacio
    University of Zaragoza, Spain.
    Modeling performance of Hadoop applications: A journey from queueing networks to stochastic well formed nets2016In: Algorithms and Architectures for Parallel Processing: 16th International Conference, ICA3PP 2016, Granada, Spain, December 14-16, 2016, Proceedings / [ed] Jesus Carretero, Javier Garcia-Blas, Ryan K.L. Ko, Peter Mueller, Koji Nakano, Springer, 2016, p. 599-613Conference paper (Refereed)
    Abstract [en]

    Nowadays, many enterprises commit to the extraction of actionable knowledge from huge datasets as part of their core business activities. Applications belong to very different domains, such as fraud detection or one-to-one marketing, and encompass business analytics and support to decision making in both private and public sectors. In these scenarios, a central place is held by the MapReduce framework and in particular its open source implementation, Apache Hadoop. In such environments, new challenges arise in the area of job performance prediction, with the need to provide Service Level Agreement guarantees to the end-user and to avoid wasting computational resources. In this paper we provide performance analysis models to estimate MapReduce job execution times in Hadoop clusters governed by the YARN Capacity Scheduler. We propose models of increasing complexity and accuracy, ranging from queueing networks to stochastic well-formed nets, able to estimate job performance under a number of scenarios of interest, including unreliable resources. The accuracy of our models is evaluated against the TPC-DS industry benchmark, with experiments run on Amazon EC2 and at the CINECA Italian supercomputing center. The results show that the average accuracy we can achieve is in the range of 9–14%.
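
    For contrast with the models in the paper, a much coarser back-of-the-envelope estimate of a MapReduce job's duration can be written down directly. The waves-of-tasks formula below is a common simplification, not the paper's queueing-network or stochastic well-formed net models, and all task counts, slot counts and average task times are made-up example values.

        # Crude job-time estimate: tasks run in "waves" over the available slots.
        def estimate_job_time(n_map, n_reduce, map_slots, reduce_slots,
                              avg_map_s, avg_reduce_s):
            map_waves = -(-n_map // map_slots)          # ceiling division
            reduce_waves = -(-n_reduce // reduce_slots)
            return map_waves * avg_map_s + reduce_waves * avg_reduce_s

        # 7 map waves * 30 s + 3 reduce waves * 90 s = 480 s
        print(estimate_job_time(n_map=400, n_reduce=40, map_slots=64, reduce_slots=16,
                                avg_map_s=30.0, avg_reduce_s=90.0))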

  • 140.
    Areskoug, Andreas
    Blekinge Institute of Technology, School of Engineering, Department of Systems and Software Engineering.
    Jämförelse av J2EE och .NET från ett Web Services perspektiv.2006Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    This thesis compares the performance of Web Services when hosted on either the J2EE or the .NET platform. It investigates which platform should be chosen to host Web Services, based mainly on performance.

  • 141.
    Arman, Sheikh Ali
    Linköping University, Department of Computer and Information Science, Software and Systems. Linköping University, The Institute of Technology.
    RESTful Mobile Application for Android: Mobile Version of Inspectera Online2014Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE creditsStudent thesis
    Abstract [en]

    Web service-based mobile applications have become increasingly common in recent years. The Representational State Transfer (REST) architectural style introduced the concept of Resource Oriented Architecture (ROA), which has been widely used for building applications for all platforms. This master's thesis designs and develops a Web service-based mobile application for the Android platform following the constraints of the REST architectural style. It also proposes an authentication model for RESTful applications. The master's thesis was completed at the company Inspectera HK AB in Norrköping, Sweden. The developed application is called the "Mobile version of Inspectera Online."
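
    As a rough illustration of the stateless, token-based style such an authentication model can take, here is a hedged Python sketch; the host, paths and field names are invented for the example and do not describe Inspectera Online's actual API or the thesis's Android implementation.

        # Hypothetical token-based REST exchange: authenticate once, then send the token
        # with every subsequent stateless request.
        import requests

        BASE_URL = "https://api.example.com"   # hypothetical server

        def login(username: str, password: str) -> str:
            resp = requests.post(f"{BASE_URL}/auth/token",
                                 json={"username": username, "password": password},
                                 timeout=10)
            resp.raise_for_status()
            return resp.json()["token"]

        def get_resource(token: str, path: str):
            resp = requests.get(f"{BASE_URL}/{path}",
                                headers={"Authorization": f"Bearer {token}"},
                                timeout=10)
            resp.raise_for_status()
            return resp.json()

        if __name__ == "__main__":
            token = login("demo", "secret")
            print(get_resource(token, "inspections"))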

  • 142.
    Arneng, Per
    et al.
    Blekinge Institute of Technology, Department of Software Engineering and Computer Science.
    Bladh, Richard
    Blekinge Institute of Technology, Department of Software Engineering and Computer Science.
    Performance Analysis of Distributed Object Middleware Technologies2003Independent thesis Advanced level (degree of Master (One Year))Student thesis
    Abstract [en]

    Each day new computers around the world connect to the Internet or some other network. The increasing number of people and computers on the Internet has led to a demand for more services, in different domains, that can be accessed from many locations in the network. When computers communicate they use different kinds of protocols to deliver a service. One of these protocol families is remote procedure calls between computers. Remote procedure calls have been around for quite some time, but it is with the Internet that their usage has increased greatly, especially in their object-oriented form, which follows from object-oriented programming having become a popular choice amongst programmers. When a programmer has to choose a distributed object middleware there is a lot to take into consideration, and one of those things is performance. This master thesis aims to give a performance comparison between different distributed object middleware technologies, provide an overview of the performance differences between them, and make it easier for a programmer to choose one of the technologies when performance is an important factor. In this thesis we have evaluated the performance of CORBA, DCOM, RMI, RMI-IIOP, Remoting-TCP and Remoting-HTTP. The results of this evaluation show that DCOM and RMI are the distributed object middleware technologies with the best overall performance in terms of throughput and round-trip time. Remoting-TCP generally generates the least network traffic, while Remoting-HTTP generates the most due to its SOAP-formatted protocol.
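
    To make the kind of measurement concrete, the sketch below times many synchronous request/response round trips over a local TCP echo connection. It is a generic harness under assumed parameters, not the thesis's actual benchmark; real middleware adds marshalling and dispatch costs on top of the raw socket time.

        # Generic round-trip-time harness: a thread runs a TCP echo server, the client
        # sends a small payload repeatedly and records each round trip.
        import socket, statistics, threading, time

        HOST, PORT = "127.0.0.1", 50007        # assumed local endpoint

        def echo_server():
            srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
            srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            srv.bind((HOST, PORT))
            srv.listen(1)
            conn, _ = srv.accept()
            with conn:
                while (data := conn.recv(4096)):
                    conn.sendall(data)

        def measure_rtt(n_calls=1000, payload=b"x" * 128):
            threading.Thread(target=echo_server, daemon=True).start()
            time.sleep(0.2)                    # give the server time to start listening
            rtts = []
            with socket.create_connection((HOST, PORT)) as cli:
                for _ in range(n_calls):
                    start = time.perf_counter()
                    cli.sendall(payload)
                    received = 0
                    while received < len(payload):
                        received += len(cli.recv(4096))
                    rtts.append(time.perf_counter() - start)
            return statistics.mean(rtts), statistics.median(rtts)

        if __name__ == "__main__":
            mean_rtt, median_rtt = measure_rtt()
            print(f"mean {mean_rtt * 1e6:.1f} us, median {median_rtt * 1e6:.1f} us")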

  • 143.
    Arnesson, Andreas
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Codename one and PhoneGap, a performance comparison2015Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE creditsStudent thesis
    Abstract [en]

    Creating smartphone applications for more than one operating system requires knowledge of several programming languages, more code maintenance, higher development costs and longer development time. To make this easier, cross-platform tools (CPTs) exist. But using a CPT can decrease the performance of the application, and applications with low performance are more likely to be uninstalled, which makes developers lose income. There are four main CPT approaches: hybrid, interpreter, web and cross-compiler. Each has different advantages and disadvantages. This study examines the performance difference between two CPTs, Codename One and PhoneGap. The performance measurements CPU load, memory usage, energy consumption, execution time and application size are used to compare the CPTs. Whether cross-compilers have better performance than other CPT approaches is also investigated. An experiment is carried out in which three applications are created with native Android, Codename One and PhoneGap, and performance measurements are made. A literature study of research from IEEE and Engineering Village is conducted on the different CPT approaches. PhoneGap performed best, with the shortest execution time, lowest energy consumption and lowest CPU usage, while Codename One had the smallest application size and used the least memory. The available research on CPT performance is sparse and of low quality. The difference between PhoneGap and Codename One is not large, except for writing to SQLite. No basis was found for the statement that cross-compilers have better performance than other CPT approaches.

  • 144.
    Arnesson, Robin
    Örebro University, School of Science and Technology.
    POS-terminal XGD K3702013Independent thesis Basic level (professional degree), 10 credits / 15 HE creditsStudent thesis
    Abstract [en]

    This thesis comprises the implementation of the basic functionality in a POS terminal (Point of Sale) and the design of a client-server system in which the terminal acts as a client. The thesis was developed as an assignment from IBSP Labs AB, where the goal was to create a system for wireless payments using the POS terminal XGD K370. The assignment mainly comprised the development of two programs: the application in the terminal, which serves as an interface to the customer, and the back-end program that processes incoming transactions from the terminal. This thesis presents the implementation of these programs and depicts the theory associated with the methods and tools used in the implementation.

  • 145.
    Arnklint, Jonas
    Jönköping University, School of Engineering, JTH, Computer and Electrical Engineering.
    Utveckling av publiceringsverktyg för hantering av webbplatser2009Independent thesis Basic level (university diploma), 10 credits / 15 HE creditsStudent thesis
  • 146.
    Aronis, Stavros
    et al.
    Uppsala University, Disciplinary Domain of Science and Technology, Mathematics and Computer Science, Department of Information Technology, Computing Science.
    Papaspyrou, Nikolaos
    Roukounaki, Katerina
    Sagonas, Konstantinos
    Uppsala University, Disciplinary Domain of Science and Technology, Mathematics and Computer Science, Department of Information Technology, Computing Science.
    Tsiouris, Yiannis
    Venetis, Ioannis E.
    A scalability benchmark suite for Erlang/OTP2012In: Proc. 11th ACM SIGPLAN Workshop on Erlang, New York: ACM Press, 2012, p. 33-42Conference paper (Refereed)
    Abstract [en]

    Programming language implementers rely heavily on benchmarking for measuring and understanding performance of algorithms, architectural designs, and trade-offs between alternative implementations of compilers, runtime systems, and virtual machine components. Given this fact, it seems a bit ironic that it is often more difficult to come up with a good benchmark suite than a good implementation of a programming language.

    This paper presents the main aspects of the design and the current status of bencherl, a publicly available scalability benchmark suite for applications written in Erlang. In contrast to other benchmark suites, which are usually designed to report a particular performance point, our benchmark suite aims to assess scalability, i.e., to help developers study a set of performance points that show how an application's performance changes when additional resources (e.g., CPU cores, schedulers, etc.) are added. We describe the scalability dimensions that the suite aims to examine and present its infrastructure and current set of benchmarks. We also report a limited set of performance results in order to show the capabilities of our suite.
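
    In the same spirit, a tiny scalability measurement can be sketched in Python: run an identical CPU-bound workload with an increasing number of workers and report the speedup relative to a single worker. This is only an assumed stand-in for what bencherl does for Erlang applications, with made-up workload sizes.

        # Hypothetical scalability micro-benchmark: time the same workload with 1..N
        # worker processes and print the speedup over the single-worker baseline.
        import time
        from multiprocessing import Pool

        def work(n):
            total = 0
            for i in range(n):       # deliberately CPU-bound loop
                total += i * i
            return total

        def run(workers, tasks=16, size=200_000):
            start = time.perf_counter()
            with Pool(workers) as pool:
                pool.map(work, [size] * tasks)
            return time.perf_counter() - start

        if __name__ == "__main__":
            baseline = run(1)
            for w in (1, 2, 4, 8):
                elapsed = run(w)
                print(f"{w} workers: {elapsed:.2f} s, speedup {baseline / elapsed:.2f}x")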

  • 147.
    Aronis, Stavros
    et al.
    Uppsala University, Disciplinary Domain of Science and Technology, Mathematics and Computer Science, Department of Information Technology, Computing Science.
    Sagonas, Konstantinos
    Uppsala University, Disciplinary Domain of Science and Technology, Mathematics and Computer Science, Department of Information Technology, Computing Science.
    The shared-memory interferences of Erlang/OTP built-ins2017In: Proceedings Of The 16Th Acm Sigplan International Workshop On Erlang (Erlang '17) / [ed] Chechina, N.; Fritchie, SL., New York: Association for Computing Machinery (ACM), 2017, p. 43-54Conference paper (Refereed)
    Abstract [en]

    Erlang is a concurrent functional language based on the actor model of concurrency. In the purest form of this model, actors are realized by processes that do not share memory and communicate with each other exclusively via message passing. Erlang comes quite close to this model, as message passing is the primary form of interprocess communication and each process has its own memory area that is managed by the process itself. For this reason, Erlang is often referred to as implementing "shared nothing" concurrency. Although this is a convenient abstraction, in reality Erlang's main implementation, the Erlang/OTP system, comes with a large number of built-in operations that access memory which is shared by processes. In this paper, we categorize these built-ins, and characterize the interferences between them that can result in observable differences of program behaviour when these built-ins are used in a concurrent setting. The paper is complemented by a publicly available suite of more than one hundred small Erlang programs that demonstrate the racing behaviour of these built-ins.

  • 148.
    Aronis, Stavros
    et al.
    Uppsala University, Disciplinary Domain of Science and Technology, Mathematics and Computer Science, Department of Information Technology, Computing Science.
    Sagonas, Konstantinos
    Uppsala University, Disciplinary Domain of Science and Technology, Mathematics and Computer Science, Department of Information Technology, Computing Science.
    Lystig Fritchie, Scott
    VMware, Cambridge, MA, USA.
    Testing And Verifying Chain Repair Methods For CORFU Using Stateless Model Checking2017Conference paper (Refereed)
    Abstract [en]

    Corfu is a distributed shared log that is designed to be scalable and reliable in the presence of failures and asynchrony. Internally, Corfu is fully replicated for fault tolerance, without sharding data or sacrificing strong consistency. In this case study, we present the modeling approaches we followed to test and verify, using Concuerror, the correctness of repair methods for the Chain Replication protocol suitable for Corfu. In the first two methods we tried, Concuerror located bugs quite fast. In contrast, the tool did not manage to find bugs in the third method, but the time this took also motivated an improvement in the tool that reduces the number of traces explored. Besides more details about all the above, we present experiences and lessons learned from applying stateless model checking for verifying complex protocols suitable for distributed programming.

  • 149.
    Aroseus, Zara
    et al.
    Blekinge Institute of Technology, Department of Human Work Science and Media Technology.
    Langeström, Emmie
    Blekinge Institute of Technology, Department of Human Work Science and Media Technology.
    Lindberg, Tobias
    Blekinge Institute of Technology, Department of Human Work Science and Media Technology.
    Vår utvecklingsprocess: designaspekter på ett befintligt webbsystem2001Independent thesis Basic level (degree of Bachelor)Student thesis
    Abstract [sv]

    This report is the result of a 20-credit bachelor's project. The project described in the report covers the development of a web shop system for several companies, which in turn bought the web system from the company Opti Use, which gave us the assignment. The report discusses how the development process unfolded and what it was like to work with an existing system. It also describes how the solution to the task took shape and how the development process influenced it. The methods used during the project are described in terms of how they were changed in order to achieve the desired result. The title refers to the project group's development process during the work on the project, in relation to earlier experiences from the MDA programme.

  • 150.
    Arrospide Echegaray, Daniel
    KTH, School of Technology and Health (STH), Medical Engineering, Computer and Electronic Engineering.
    Utvärdering av Självstyrandes-utvecklarramverket2016Independent thesis Basic level (professional degree), 10 credits / 15 HE creditsStudent thesis
    Abstract [en]

    Within software engineering there is a diversity of process methods, each with its specific purpose. A process method can be described as a repeatable set of steps whose purpose is to achieve a task and reach a specific result. The majority of process methods found in this study focus on the software product being developed. There seems to be a lack of process methods that individual software developers can use for their own software process improvement. Individual software process improvement refers to how the individual software developer chooses to structure their own work in order to obtain a specific result.

    At the time of writing, the Self-Governance Developer Framework (also called the SGD-framework) is a newly developed process framework whose purpose is to aid the individual software developer in improving his or her own software process. Briefly explained, the framework is intended to contain all the activities that can come up in a software project. The problem is that this tool has not yet been evaluated, and therefore it is unknown whether it is relevant for its purpose. To frame and guide the study, problem questions were formulated: (1) Is the framework complete for a smaller company with regard to its activities? (2) How high is the cost of the SGD-framework in terms of time?

    The goal of the study is to contribute to future studies of the framework by performing an action study in which the Self-Governance Developer Framework is evaluated against a set of chosen evaluation criteria.

    An inductive qualitative research method was used when conducting the study. An inductive method means that conclusions are derived from empirically gathered data and that general theories are formed from that data. Specifically, the action study method was used. Data was gathered by keeping a logbook and by time logging during the action study. To evaluate the framework, a set of evaluation criteria was used: (1) completeness, (2) semantic correctness, and (3) cost. A narrative analysis was conducted on the data gathered for the criteria, taking the problem formulations into account.

    The results of the evaluation showed that the framework was not complete with regard to its activities, although it was nearly complete, as only a few additional activities were needed during the action study. A total of 3 extra activities were added on top of the regular 40 activities. Around 10% of the time spent in activities was spent in activities outside of the Self-Governance Developer Framework. The activities were considered sufficiently fine-grained for the context of a smaller company. The framework was considered highly relevant for improving the individual software developer's own process. The introduction cost in this study reflects the time it took until the usage of the framework was considered consistent; in this study it was approximately 24 working days, with usage of about 3.54% of an eight-hour work day. The total application cost of using the framework in the performed action study was on average 4.143 SEK/hour, or 662.88 SEK/month. The template cost used was 172.625 SEK/hour.
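
    For readers relating the cost figures to each other: under the assumption of roughly 160 working hours per month (an assumption not stated in the abstract), the monthly figure follows from the hourly one:

        4.143 SEK/hour × 160 hours/month = 662.88 SEK/month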
