Digitala Vetenskapliga Arkivet

3001 - 3050 of 6263
  • 3001.
    Karlsson, Gunnar
    et al.
    KTH, School of Electrical Engineering and Computer Science (EECS), Centres, Centre for Cyber Defence and Information Security CDIS.
    Lundén, Paola
    Research Institutes of Sweden (RISE).
    Agile Education Imagined: A report from the Cybercampus workshop on Agile Education, 2023. Report (Other (popular science, discussion, etc.))
    Abstract [en]

    Cybercampus Sweden is a national initiative to provide education, research, innovation and advice in cybersecurity and cyber-defense. This brochure addresses needs for cybersecurity training and education. The contents are fictitious courses created from the outcomes of a planning workshop on agile education, conducted by the planning project for Cybercampus Sweden, held on October 17, 2022.

    Download full text (pdf)
    Agile Education Imagined - A fictitious Course Brochure
  • 3002.
    Karlsson, Ingemar
    et al.
    University of Skövde, School of Engineering Science. University of Skövde, Virtual Engineering Research Environment.
    Bernedixen, Jacob
    University of Skövde, School of Engineering Science. University of Skövde, Virtual Engineering Research Environment.
    Ng, Amos H. C.
    University of Skövde, School of Engineering Science. University of Skövde, Virtual Engineering Research Environment.
    Pehrsson, Leif
    University of Skövde, School of Engineering Science. University of Skövde, Virtual Engineering Research Environment.
    Combining augmented reality and simulation-based optimization for decision support in manufacturing, 2017. In: Proceedings of the 2017 Winter Simulation Conference / [ed] W. K. V. Chan, A. D’Ambrogio, G. Zacharewicz, N. Mustafee, G. Wainer, E. Page, Institute of Electrical and Electronics Engineers (IEEE), 2017, p. 3988-3999. Conference paper (Refereed)
    Abstract [en]

    Although the idea of using Augmented Reality and simulation within manufacturing is not new, improvements in hardware are opening up new application areas. For manufacturing organizations, simulation is an important tool used to analyze and understand their manufacturing systems; however, simulation models can be complex. Nonetheless, using Augmented Reality to display the simulation results and analysis can increase the understanding of the model and the modeled system. This paper introduces a decision support system, IDSS-AR, which uses simulation and Augmented Reality to show a simulation model in 3D. The decision support system uses Microsoft HoloLens, which is head-worn hardware for Augmented Reality. A prototype of IDSS-AR has been evaluated with a simulation model depicting a real manufacturing system on which a bottleneck detection method has been applied. The bottleneck information is shown on the simulation model, increasing the possibility of recognizing interactions between the bottlenecks.

    Download full text (pdf)
    fulltext
  • 3003.
    Karlsson, Isak
    et al.
    Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM).
    Ljungberg, David
    Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM).
    Supplementing Dependabot’s vulnerability scanning: A Custom Pipeline for Tracing Dependency Usage in JavaScript Projects, 2024. Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis
    Abstract [en]

    Software systems are becoming increasingly complex, with developers frequently utilizing numerous dependencies. In this landscape, accurate tracking and understanding of dependencies within JavaScript and TypeScript codebases are vital for maintaining software security and quality. However, there exists a gap in how existing vulnerability scanning tools, such as Dependabot, convey information about the usage of these dependencies. This study addresses the problem of providing a more comprehensive dependency usage overview, a topic critical to aiding developers in securing their software systems. To bridge this gap, a custom pipeline was implemented to supplement Dependabot, extracting the dependencies identified as vulnerable and providing specific information about their usage within a repository. The results highlight the pros and cons of this approach, showing an improvement in the understanding of dependency usage. The effort opens a pathway towards more secure software systems.

    Download full text (pdf)
    fulltext
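
    The pipeline built in the thesis is not part of this record. Purely as a hedged illustration of the general idea in the abstract (tracing where packages flagged as vulnerable are actually imported in a JavaScript/TypeScript repository), a minimal sketch follows; the regular expression, the file extensions and the example package names (lodash, minimist) are assumptions made for the example, not details taken from the thesis.

    ```python
    """Minimal sketch (not the thesis pipeline): given package names flagged as
    vulnerable, list where a JavaScript/TypeScript repository imports them."""
    import re
    from pathlib import Path

    # Matches `import ... from '<module>'` and `require('<module>')`
    IMPORT_RE = re.compile(r"""(?:import\s+.*?from\s+|require\()\s*['"]([^'"]+)['"]""")

    def find_usages(repo_root: str, vulnerable: set[str]) -> list[tuple[str, int, str]]:
        hits = []
        for path in Path(repo_root).rglob("*"):
            if not path.is_file() or path.suffix not in {".js", ".jsx", ".ts", ".tsx"}:
                continue
            for lineno, line in enumerate(path.read_text(errors="ignore").splitlines(), 1):
                for module in IMPORT_RE.findall(line):
                    package = module.split("/")[0]   # 'lodash/merge' counts as 'lodash'
                    if package in vulnerable:
                        hits.append((str(path), lineno, package))
        return hits

    if __name__ == "__main__":
        # Hypothetical input: package names a scanner such as Dependabot reported
        print(find_usages(".", {"lodash", "minimist"}))
    ```
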
  • 3004.
    Karlsson, Joel
    et al.
    Jönköping University, School of Engineering, JTH, Computer and Electrical Engineering.
    Bengtsson, Carl
    Jönköping University, School of Engineering, JTH, Computer and Electrical Engineering.
    Webbaserat resultat och uppföljningsprogram för idrottsförening, 2012. Independent thesis Basic level (university diploma), 10 credits / 15 HE credits. Student thesis
    Abstract [en]

    Students at the School of Engineering at Jönköping University have on behalf of the swimming organization “Nässjö Sim och Livräddningssällskap” conducted research on how to monitor and evaluate swimmers and swimming groups in a simple and efficient manner with a focus on usability and user friendliness. Since no system existed that could meet these requirements, the client had a desire to tailor and develop such a system. The purpose of this work was to develop a system that would work as a tool to be used by coaches and swimmers alike and allow them to eventually follow the development of swimmers and swimming groups. The organization also wanted to have the ability to create tests and exams, both for land and water training, register these in the system and in a structured way know which swimmers took and passed these.

    For this thesis, the following questions functioned as a framework for progress.

    • Usability
      • How to design a usable system and what is required?
    • System
      • What type of system is best suited to meet the client's requirements and preferences?
    • Competition results
      • How can one retrieve all the competition results for swimmers of Nässjö SLS?
    • Database
      • What type of database is best suited to easily share data with an external existing database?

    Usability has been the main focus throughout the project. In order to succeed and stay on track, the students turned to the usability guru Jakob Nielsen, and his 10 heuristic rules have worked as guidelines. The work is also founded on the theories and tests Nielsen conducted on how people interact with a web-based system.

    The result of the project is a working prototype of the system without access to the competition results as the students together with the client decided that the procedure to access the database containing results from competitions became too difficult, time consuming and costly. From the perspective of functionality, focus turned to a system that can display a development curve for swimmers and swimming groups based on training and the results from tests and exams.

    Download full text (pdf)
    fulltext
  • 3005.
    Karlsson, Johanna
    University of Skövde, School of Informatics.
    Sensorteknik som hjälpmedel inom vård och omsorg?: En kvalitativ datavetenskaplig studie kring risker och möjligheter med sensorteknik som hjälpmedel inom vård och omsorg, 2015. Independent thesis Basic level (degree of Bachelor), 15 credits / 22,5 HE credits. Student thesis
    Abstract [sv]

    The purpose of this study is to map potential risks and opportunities of sensor technology as an assistive tool in health and social care. The study uses a qualitative method with quantitative elements, where semi-structured interviews were combined with questionnaires. Eight people with different potential relationships to the sensor technology were interviewed, and the results show that both risks and opportunities can be identified.

    The study shows that what the respondents consider useful for the sensor technology to perform or assist with is not necessarily seen as acceptable. It also shows that staff and management in health and social care do not always agree on, for example, how much staff training is needed when new technology is introduced, or on the quality of that training. The shared view that emerged was how important technology such as sensor technology will become for future health and social care if the municipalities are to meet the increasing need.

    The study contributes to a deeper understanding of attitudes towards sensor technology and can serve as guidance for municipalities when introducing similar technologies in the future. By highlighting the thoughts and ideas that the different roles may have about the sensor technology, it can be used as a form of requirements analysis from an operational perspective, to the benefit of future research on the subject. It can also help municipalities by pointing out where and how fear of and negativity towards the technology may exist, so that these can be countered with additional information to staff, users and relatives.

    Download full text (pdf)
    fulltext
  • 3006.
    Karlsson, Jonas S.
    Linköping University, Department of Computer and Information Science. Linköping University, The Institute of Technology.
    A scalable data structure for a parallel data server, 1997. Licentiate thesis, monograph (Other academic)
    Abstract [en]

    Modern and future applications, such as in the telecommunication industry and real-time systems, store and manage very large amounts of information. This information needs to be accessed and searched with high performance, and it must have high availability. Databases are traditionally used for managing high volumes of data. Currently, mostly administrative systems use database technology. However, newer applications need the facilities of database support. But just applying traditional database technology to these applications is not enough. The high-performance demands and the required ability to scale to larger data sets are generally not met by current database systems.

    Data Servers are dedicated computers which manage the internal data in a database system (DBMS). Modern powerful workstations and parallel computers are used for this purpose. The idea is that an Application Server handles user input and data display, parses the queries, and sends the parsed query to the data server that executes it. A data server, using a dedicated machine, can be better tuned in memory management than a general-purpose computer.

    Network multi-computers, such as clusters of workstations or parallel computers, provide a solution that is not limited to the capacity of one single computer. This provides the means for building a data server of potentially any size. This gives rise to the interesting idea of enabling the system to grow over time by adding more components to meet the increased storage and processing demands. This exhibits the need for scalable solutions that allow data to be reorganized smoothly, unnoticed by the clients and the application servers accessing the data server.

    In this thesis we identify the importance of appropriate data structures for parallel data servers. We focus on Scalable Distributed Data Structures (SDDSs) for this purpose, in particular LH* and our new data structure LH*LH. An overview is given of related work and of systems that have traditionally motivated the need for such data structures. We begin by discussing high-performance databases, and this leads us to database machines and parallel data servers. We sketch an architecture for an LH*LH-based file storage that we will use for a parallel data server. We also show performance measures for LH*LH and present its algorithm in detail. The testbed, the Parsytec switched multi-computer, is described along with experience acquired during the implementation process.
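
    For readers unfamiliar with the LH* family, the sketch below shows the classic linear-hashing addressing rule that LH* and LH*LH build on: a bucket address is computed from a key hash, the current level and the split pointer. This is a generic textbook illustration, not code from the thesis; the function and parameter names are invented for the example.

    ```python
    """Classic linear-hashing (LH) bucket addressing; LH*/LH*LH distribute these
    buckets over server nodes. Generic illustration, not the thesis implementation."""

    def lh_address(key_hash: int, level: int, split_pointer: int, n_initial: int = 1) -> int:
        """Map a hashed key to a bucket when 'level' doublings have completed and
        buckets [0, split_pointer) have already been split one more time."""
        addr = key_hash % (n_initial * 2 ** level)            # candidate bucket, hash h_level
        if addr < split_pointer:                              # that bucket was already split,
            addr = key_hash % (n_initial * 2 ** (level + 1))  # so use the finer hash h_(level+1)
        return addr
    ```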

  • 3007. Karlsson, Karl
    Att ta brottsplatser till nya dimensioner: En jämförande studie mellan 2D-visning och VR-visning av 3D-skannade brottsplatser, 2023. Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis
    Abstract [en]

    This study was conducted in efforts to investigate whether Virtual Reality is useful for presenting crime scenes, for example, to individuals involved in legal proceedings. To answer this, two questions were posed and answered using data obtained from an experiment. The experiment involved participants experiencing (crime) scenes through Virtual Reality and through an interactive view of the 3D model on a computer screen. The first question was whether it was possible to statistically distinguish the responses from two observation groups. One observation group consisted of data collected after a participant experienced a location through Virtual Reality, while the other observation group consisted of data collected after a participant experienced a location through a computer screen. The responses from the observation groups were later analyzed to answer the second question: whether one group could be distinguished from the other in their ability to choose the correct response on the questionnaire.

    Practically, participants alternated between experiencing 3D-scanned (crime) scenes via Virtual Reality and a computer screen. After experiencing a location, they were asked to complete a questionnaire. The experiment was conducted with 24 participants, resulting in 12 responses per questionnaire. This generated quantitative data, which was later used in hypothesis testing. The conducted hypothesis tests did not show any significant difference between the observation groups in 3 out of 4 cases. However, one of the hypothesis tests showed a significant difference in one case, where the observation group that experienced the location through a computer screen also had a higher median. The exact reason for this is difficult to determine, but there are certain differences in the level of detail in the visibility of the 3D models that could partly explain why differences were observed.

    Based on these results, there is no obvious advantage to using VR technology (at least not in this implementation) in legal cases. However, it is important to note that the study and experiment were conducted in a short period and on a small scale. Furthermore, the development in this field is progressing rapidly, and the results cannot be assumed to be the same with other types of implementations.

    Download full text (pdf)
    fulltext
  • 3008.
    Karlsson, Linus
    Linköping University, Department of Computer and Information Science, Human-Centered systems. Linköping University, The Institute of Technology.
    RESTful Cloud Server, 2013. Independent thesis Basic level (degree of Bachelor), 10,5 credits / 16 HE credits. Student thesis
    Abstract [en]

    This report is about the development of a cloud service server that will be used to store user information. The cloud’s purpose is to store information about the user and the devices he or she uses. The cloud has web services that enable users to retrieve and store the data on the server. It is also responsible for user administration and security of the data. A web client is also developed in parallel with the cloud. The web client uses the cloud’s web services to show and modify data for the user.

    Download full text (pdf)
    RCS
  • 3009.
    Karlsson, Magnus
    Mälardalen University, School of Innovation, Design and Engineering.
    Effektiv patchhantering, 2019. Independent thesis Basic level (degree of Bachelor), 5 credits / 7,5 HE credits. Student thesis
    Abstract [en]

    Organizations are exposed to constant security threats from the internet, and penetration tests reveal just how vulnerable networks are when software and hardware patching isn’t up to date. Updates, known in IT as "patches", usually enhance functionality or security. Patch Management is the field under which anything related to patching of software and other equipment falls. As of today, Patch Management faces great challenges, and the purpose of this study is to understand how the process can be made more efficient. Historically, a common issue has been the number of patch releases, which has made it cumbersome for organizations to stay up to date. Standardization bodies, such as IEC and NIST, recommend that patches are tested in test environments before being installed in the production environment, to make sure no unintended consequences arise from faulty patches. Through interviews with professionals working in Patch Management, it became clear that there are ways to stay up to date, but partly through disregarding recommended best practice. Automated tools ease the Patch Management process to a great extent, but there are still areas that remain non-automated. The testing process has been largely ignored by organizations whose networks are connected to the internet, because said process is much too inefficient. Their answer to the problem of staying up to date is to quickly solve problems that arise through faulty patching, rather than to test patches over longer periods of time, their reasoning being that leaving known vulnerabilities unpatched is more damaging to the network.

    Download full text (pdf)
    fulltext
  • 3010.
    Karlsson, Magnus
    et al.
    Linköping University, Department of Science and Technology, Physics, Electronics and Mathematics. Linköping University, Faculty of Science & Engineering.
    Carlsson, Hakan
    Combitech AB, Sweden.
    Idebro, Mats
    Combitech AB, Sweden.
    Eek, Christoffer
    Combitech AB, Sweden.
    Microwave Heating as a Method to Improve Sanitation of Sewage Sludge in Wastewater Plants, 2019. In: IEEE Access, E-ISSN 2169-3536, Vol. 7, p. 142308-142316. Article in journal (Refereed)
    Abstract [en]

    For long-term sustainable agriculture, it is critical that we recycle nutrition to the soil that it came from. One important source is sewage sludge, but it must be sanitized from undesired pathogens before it may be spread on arable land. One common method today is deposition for about six months or more. Not only is such a long deposition time costly due to the required storage space; in the future, usage of the method is also likely to be more restricted from a regulatory perspective. Heating sewage sludge is a known method to speed up the sanitation process. However, achieving an even, guaranteed temperature is not easy with porous sewage sludge. This is mainly due to the limited heat conductivity of the sludge. Microwaves at a frequency of 2.45 GHz have a penetration depth of a few centimeters and therefore have an advantage compared to other heating methods, which only heat the surface. In the proposed system, the sewage sludge is continuously processed through a series of microwave cavities. The pathogen removal effectiveness was studied for different exposure settings, e.g., conveyor speed and applied microwave power in each cavity.

    Download full text (pdf)
    fulltext
  • 3011.
    Karlsson, Marcus
    Linköping University, Department of Electrical Engineering, Communication Systems. Linköping University, Faculty of Science & Engineering.
    Aspects of Massive MIMO, 2016. Licentiate thesis, comprehensive summary (Other academic)
    Abstract [en]

    Next generation cellular wireless technology faces tough demands: increasing the throughput and reliability without consuming more resources, be it spectrum or energy. Massive MIMO (Multiple-Input Multiple-Output) has proven, both in theory and practice, that it is up for the challenge. Massive MIMO can offer uniformly good service to many users simultaneously, using low-end hardware, without increasing the radiated power compared to contemporary systems. In Massive MIMO, the base stations are equipped with hundreds of antennas. This abundance of antennas brings many new, interesting aspects compared to single-user MIMO and multi-user MIMO. Some issues of older technologies are nonexistent in Massive MIMO, while new issues in need of solutions arise. This thesis considers two aspects, and how these aspects differ in a Massive MIMO context: physical layer security and transmission of system information. First, it is shown that a jammer with a large number of antennas can outperform a traditional, single-antenna jammer in degrading the legitimate link. The excess of antennas gives the jammer the opportunity to find and exploit structure in signals to improve its jamming capability. Second, for transmission of system information, the vast number of antennas proves useful even when the base station does not have any channel state information, because of the increased availability of space-time coding. We show how transmission without channel state information can be done in Massive MIMO by using a fixed precoding matrix to reduce the pilot overhead and simultaneously applying space-time block coding to use the excess of antennas for spatial diversity.

    Download full text (pdf)
    Aspects of Massive MIMO
    Download (pdf)
    omslag
    Download (jpg)
    presentationsbild
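
    As a point of reference for the space-time block coding mentioned in the abstract, the snippet below constructs the classic 2x2 Alamouti codeword and checks its orthogonality property, which is what yields spatial diversity with simple linear processing. This is a generic illustration of a space-time block code, not the fixed-precoding scheme developed in the thesis; the symbol values are arbitrary.

    ```python
    import numpy as np

    def alamouti_codeword(s1: complex, s2: complex) -> np.ndarray:
        """2x2 Alamouti space-time block code: rows are time slots, columns are antennas."""
        return np.array([[s1,           s2],
                         [-np.conj(s2), np.conj(s1)]])

    # Orthogonality: C^H C = (|s1|^2 + |s2|^2) I, which enables simple linear decoding.
    C = alamouti_codeword(1 + 1j, 2 - 1j)
    assert np.allclose(C.conj().T @ C, (abs(1 + 1j) ** 2 + abs(2 - 1j) ** 2) * np.eye(2))
    ```
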
  • 3012.
    Karlsson, Matilda
    et al.
    Halmstad University.
    Johansson, Kajsa
    Halmstad University.
    OK2PARK, 2017. Independent thesis Basic level (university diploma), 15 credits / 22,5 HE credits. Student thesis
    Abstract [en]

    Parking fines are one of the most common reasons for people being subjected to official testing today, and in many cases this happens because the driver has difficulty detecting the parking sign. Roads change, traffic signs are renewed or removed, and parking that used to be free may now carry a fee, which results in misunderstandings and ambiguities in traffic. Today, there is still no easy-to-access information site where you can find information about parking spaces or the rules that apply to the particular location you are at. Therefore, this application and its associated database, named OK2PARK, have been developed.

    The purpose of the application is to facilitate and streamline parking for drivers in whatever city they are in. The application contains aids that counteract parking fines and, through a map view, the user can easily see where there are parking spaces in the area and where they are located. With a simple click the user gets the rules and a visual image of the parking sign belonging to the selected parking lot. The user can also choose to get a notification on their phone when the parking time expires. The benefit of the project is to reduce parking fines and increase understanding of parking and parking spaces for drivers.

    Download full text (pdf)
    fulltext
  • 3013.
    Karlsson, Nellie
    et al.
    Halmstad University, School of Information Technology.
    Bengtsson, My
    Halmstad University, School of Information Technology.
    Rahat, Mahmoud
    Halmstad University, School of Information Technology, Center for Applied Intelligent Systems Research (CAISR).
    Sheikholharam Mashhadi, Peyman
    Halmstad University, School of Information Technology, Center for Applied Intelligent Systems Research (CAISR).
    Baseline Selection for Integrated Gradients in Predictive Maintenance of Volvo Trucks’ Turbocharger, 2023. In: VEHICULAR 2023: The Twelfth International Conference on Advances in Vehicular Systems, Technologies and Applications / [ed] Reiner Kriesten; Panos Nasiopoulos, International Academy, Research and Industry Association (IARIA), 2023, p. 29-36. Conference paper (Refereed)
    Abstract [en]

    The new advances in Vehicular Systems and Technologies have resulted in a sheer increase in the number of connected vehicles. These connected vehicles use IoT technologies to communicate operational signals with the OEMs, such as the vehicle’s speed, torque, temperature, load, RPM, etc. These signals have provided an unprecedented opportunity to adaptively monitor the status of each piece of the vehicle’s equipment and discover any possible risk of failure before it happens. This emerging field of study is called predictive maintenance (also known as condition-based maintenance) and has recently received much attention. In this paper, we apply Integrated Gradients (IG), an XAI method until now primarily used on image data, to datasets containing tabular and time-series data in the domain of predictive maintenance of trucks’ turbochargers. We evaluate how the results of IG differ, in these new settings, for various types of models. In particular, we investigate how the change of baseline can affect the outcome. Experimental results verify that IG can be applied successfully to both sequenced and non-sequenced data. Contrary to the opinion common in the literature, the choice of baseline does not affect the results of IG significantly, especially for models such as RNN, LSTM, and GRU, where the data contains time series; the effect is more visible for models like MLP with non-sequenced data. To confirm these findings, and to understand them more deeply, we have also applied IG to SVM models, which showed that the choice of baseline does have a significant impact on the performance of SVM. (c) IARIA, 2023
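
    As background to the baseline question studied in the paper, the following is a minimal, framework-agnostic sketch of how Integrated Gradients attributions are commonly approximated along a straight-line path from a chosen baseline to the input. It is not the paper's implementation; grad_fn, the number of steps and the baselines in the usage comment are placeholders.

    ```python
    import numpy as np

    def integrated_gradients(grad_fn, x, baseline, steps=50):
        """Approximate Integrated Gradients of a scalar model output w.r.t. input x.

        grad_fn(z) must return dF/dz at z (same shape as x). The attribution for
        feature i is (x_i - baseline_i) times the average gradient along the
        straight-line path from the baseline to x (a Riemann approximation)."""
        alphas = np.linspace(0.0, 1.0, steps + 1)           # interpolation coefficients
        path = baseline + alphas[:, None] * (x - baseline)  # points on the straight-line path
        grads = np.stack([grad_fn(z) for z in path])        # model gradient at every path point
        return (x - baseline) * grads.mean(axis=0)

    # Usage sketch: compare a zero baseline with a training-mean baseline
    # attr_zero = integrated_gradients(grad_fn, x, np.zeros_like(x))
    # attr_mean = integrated_gradients(grad_fn, x, x_train.mean(axis=0))
    ```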

  • 3014.
    Karlsson, Olof
    et al.
    Linnaeus University, Faculty of Technology, Department of Informatics.
    Nilsson, Anton
    Linnaeus University, Faculty of Technology, Department of Informatics.
    Effekter av tillväxt genom ett informationslogistiskt perspektiv: En studie som belyser ett tillverkande företags expansiva tillväxt och transporter, 2019. Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis
    Abstract [en]

    Many companies have experienced growth within their organizations over the past few years, which has influenced these companies’ internal and external organizational structure. Companies have been affected by globalization since the geographical boundaries have been erased, which has led to an increase in demand for products and transportation requirements. Previous research shows that there is a significant increase in a company’s transportation costs during periods of growth, which therefore requires improved control over these costs.

    Part of the purpose of the research was to find a suitable theoretical framework in order to identify and visually highlight the effects on information logistics caused by growth. The purpose of this research was to investigate and explore the effects of expansive growth during a specific period of time and how they affect a manufacturing company from an information logistics perspective, with a limitation to transport.

    The research questions in this study have been answered with the help of a qualitative method in which semi-structured interviews have been used to collect empirical data. This data has then been analyzed through three different approaches, namely description, systematisation and categorization, and the answers have been handled ethically.

    The result of the case study showed that the administrative work has not been prioritized during the period of growth. Through the developed theoretical framework TOEI, the result has been analyzed and the effects on information logistics could be identified in the four main categories: technical, organizational, environmental and individual. The case study concluded that companies affected by rapid growth experience the following risk factors: information overflow, increasing work assignments, lack of time and uncontrolled cost support.

    Download full text (pdf)
    fulltext
  • 3015.
    Karlsson, Rickard
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering, Computer Science.
    EzMole: A new prototype for securing public Wi-Fi connections, 2017. Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    When public Wi-Fi networks are being used, it can be hard to know who else is using the same network or is monitoring the traffic that travels across it. If the network is public and unencrypted, anyone can monitor the traffic, and using these networks for work can be very risky. This is a big problem that needs a solution, because the information that travels across the public network might contain organizational secrets or sensitive personal information that shouldn’t be read by outsiders.

    One way to significantly increase the security while using these public networks is to configure and set up a VPN tunnel; all traffic will then be sent encrypted. But nowadays many computers and mobile phones run applications in the background that are actively asking for updates. It can for example be news apps, mail clients or instant messaging services like WhatsApp or Telegram. Since the apps push for updates in the background, there is a big risk that these programs transmit and receive information unencrypted over the public network before the VPN tunnel has been set up.

    People might be unaware of this problem, and this research can be used to explain the problem and offer a solution to it, which is the reason why this research is important. This research tries to solve the problem and find answers to the research questions, "How to design and implement an affordable intermediate device that offers the user secure access to Internet on public Wi-Fi networks?" and "What are the design principles of that method?".

    The proposed solution to this problem was to design and implement a new intermediate device, called EzMole, placed in between the public Wi-Fi and the users’ personal devices. The new device protects the users’ devices from potentially malicious users on the public Wi-Fi while the VPN tunnel is being established. It also creates a new encrypted wireless network that is used to connect the personal devices, for example a mobile phone or laptop, to EzMole.

    The methodology used to design and develop the new EzMole device was the Design Science Research Methodology. It includes six steps that were used during three phases of the project, in an iterative process of development, testing and evaluation until the device met the initial requirements for a successful device. There were tests for both functionality and security to make sure that it worked in the right way and that it didn’t have any known security weaknesses or flaws. This was very important since EzMole is, and represents, an Internet-of-Things (IoT) device, and therefore security was a major focus.

    After the tests, the device was evaluated against the initial requirements; it lived up to 9 of 12 requirements and was therefore classified as successful. The research contributes a universal solution to the research problem, gives answers to the research questions and, in the meantime, reduces the gap in the literature. It also contributes a new piece of hardware that can help people connect to the Internet in a more secure way when they are using public Wi-Fi networks.

    Download full text (pdf)
    fulltext
  • 3016.
    Karlsson, Rikard
    Linköping University, Department of Computer and Information Science, Human-Centered systems.
    Utvärdering av kvalitetsregistret och processtödssystemet Carath, 2017. Independent thesis Basic level (degree of Bachelor), 12 credits / 18 HE credits. Student thesis
    Abstract [sv]

    The study was conducted to evaluate how the workflow in the healthcare system Carath is adapted to the routines of its users at the thoracic clinics that own the system and where it is used. Three workplace observations and eleven interviews were carried out with hospital staff at different clinics in Sweden, using the contextual design method.

    The results of the study show that Carath is adapted to the patient flow at the clinics but to a lesser degree to the users' work routines and roles. The study also shows that the system lacks built-in support for feedback and for checks of registered data; these functions are instead performed manually outside the system. All of the missing functions are also confirmed by the SEIPS model as functions that are recommendable in a healthcare system. Overall, the results from contextual interviews with comments from staff, generated sequence models of workflows, and evaluations against the SEIPS model show that Carath would need to be balanced between the patient flow and the staff's workflow, and that functions for feedback and checks should be integrated into the system. Future work includes an investigation of integration with other systems in healthcare.

    Download full text (pdf)
    fulltext
  • 3017.
    Karlsson, Robin
    Blekinge Institute of Technology, Faculty of Computing, Department of Computer Science and Engineering.
    Cooperative Behaviors Between Two Teaming RTS Bots in StarCraft, 2015. Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    Context. Video games are a big entertainment industry. Many video games let players play against or together with each other. Some video games also make it possible for players to play against or together with computer-controlled players, called bots. Artificial Intelligence (AI) is used to create bots.

    Objectives. This thesis aims to implement cooperative behaviors between two bots and determine if the behaviors lead to an increase in win ratio. This means that the bots should be able to cooperate in certain situations, such as when they are attacked or when they are attacking.

    Methods. The bots' win ratio will be tested with a series of quantitative experiments where, in each experiment, two teaming bots with cooperative behavior play against two teaming bots without any cooperative behavior. The data will be analyzed with a t-test to determine whether the results are statistically significant.

    Results and Conclusions. The results show that cooperative behavior can increase the performance of two teaming Real Time Strategy bots against a non-cooperative team of two bots. However, the performance could either increase or decrease depending on the situation. In three cases there was an increase in performance and in one case the performance decreased; in three cases there was no difference in performance. This suggests that more research is needed for these cases.

    Download full text (pdf)
    fulltext
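
    The abstract mentions analysing the experiment data with a t-test. Purely as an illustration of that analysis step, with made-up match outcomes rather than the thesis data, a two-sample t-test in Python could look like the sketch below.

    ```python
    """Two-sample t-test on hypothetical win/loss outcomes (1 = win, 0 = loss)."""
    from scipy import stats

    cooperative_team     = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1]   # invented outcomes
    non_cooperative_team = [0, 1, 0, 0, 1, 0, 1, 0, 0, 1]   # invented outcomes

    t_stat, p_value = stats.ttest_ind(cooperative_team, non_cooperative_team)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # significant at the 5% level if p < 0.05
    ```
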
  • 3018.
    Karlsson, S.
    et al.
    KTH, School of Information and Communication Technology (ICT), Microelectronics and Information Technology, IMIT.
    Brorsson, Mats
    KTH, School of Information and Communication Technology (ICT), Communication: Services and Infrastucture, Software and Computer Systems, SCS.
    An Infrastructure for Portable and Efficient Software DSM, 1999. Conference paper (Refereed)
  • 3019.
    Karlsson, S.
    et al.
    KTH, School of Information and Communication Technology (ICT), Microelectronics and Information Technology, IMIT.
    Brorsson, Mats
    KTH, School of Information and Communication Technology (ICT), Communication: Services and Infrastucture, Software and Computer Systems, SCS.
    Priority Based Messaging for Software Distributed Shared Memory, 2003. In: Cluster Computing, Vol. 6, p. 161-169. Article in journal (Refereed)
  • 3020.
    Karlsson, S.
    et al.
    KTH, Superseded Departments (pre-2005), Microelectronics and Information Technology, IMIT.
    Brorsson, Mats
    KTH, Superseded Departments (pre-2005), Microelectronics and Information Technology, IMIT.
    Producer-push - a protocol enhancement to page-based software distributed shared memory systems, 1999. In: Proceedings of ICPP’99: 1999 International Conference on Parallel Processing, 1999, p. 291-300. Conference paper (Refereed)
    Abstract [en]

    This paper describes a technique called producer-push that enhances the performance of a page-based software distributed shared memory system. Shared data, in software DSM systems, must normally be requested from the node that produced the latest value. Producer-push utilizes the execution history to predict this communication so that the data is pushed to the consumer before it is requested. In contrast to previously proposed mechanisms to proactively send data to where it is needed, producer-push uses information about the source code location of communication to more accurately predict the needed communication. Producer-push requires no source code modifications of the application and it effectively reduces the latency of shared memory accesses. This is confirmed by our performance evaluation which shows that the average time to wait for memory updates is reduced by 74%. Producer-push also changes the communication pattern of an application making it more suitable for modern networks. The latter is a result of a 44% reduction of the average number of messages and an enlargement of the average message size by 65%.
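
    As a schematic reading of the mechanism described above, the sketch below keeps a history keyed by source-code location and page, and pushes an updated page to the consumers previously observed at that location, instead of waiting for their requests. This is an invented illustration of the idea, not the authors' protocol code; all names are placeholders.

    ```python
    """Schematic producer-push bookkeeping; not the paper's implementation."""
    from collections import defaultdict

    class ProducerPush:
        def __init__(self, send_update):
            self.send_update = send_update     # callback: (node, page) -> None
            self.history = defaultdict(set)    # (source_location, page) -> consumer nodes

        def on_remote_request(self, source_location, page, requesting_node):
            """A consumer had to ask for the page: remember it for next time."""
            self.history[(source_location, page)].add(requesting_node)

        def on_produce(self, source_location, page):
            """The producer wrote the page at this code location: push it to the
            predicted consumers before they ask, hiding the request/reply latency."""
            for node in self.history[(source_location, page)]:
                self.send_update(node, page)
    ```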

  • 3021.
    Karlsson Schmidt, Carl
    Linköping University, Department of Electrical Engineering, Computer Vision. Linköping University, Faculty of Science & Engineering.
    Rhino and Human Detection in Overlapping RGB and LWIR Images, 2015. Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    The poaching of rhinoceros has increased dramatically the last few years and the park rangers are often helpless against the militarised poachers. Linköping University is running several projects with the goal to aid the park rangers in their work. This master thesis was produced at CybAero AB, which builds Remotely Piloted Aircraft Systems (RPAS). With their helicopters, high-end cameras with a range sufficient to cover the whole area can be flown over the parks. The aim of this thesis is to investigate different methods to automatically find rhinos and humans, using airborne cameras. The system uses two cameras, one colour camera and one thermal camera. The latter is used to find interesting objects which are then extracted in the colour image. The object is then classified as either rhino, human or other. Several methods for classification have been evaluated. The results show that classifying solely on the thermal image gives nearly as high accuracy as classifying in combination with the colour image. This enables the system to be used at dusk and dawn or in bad light conditions. This is an important factor since most poaching occurs at dusk or dawn. As a conclusion, a system capable of running on low-performance hardware and placeable on board the aircraft is presented.

    Download full text (pdf)
    fulltext
  • 3022.
    Karlsson, Stefan
    Mälardalen University, School of Innovation, Design and Engineering, Embedded Systems. ABB, Västerås, Sweden.
    Exploratory test agents for stateful software systems, 2019. In: ESEC/FSE 2019 - Proceedings of the 2019 27th ACM Joint Meeting European Software Engineering Conference and Symposium on the Foundations of Software Engineering, Association for Computing Machinery, Inc, 2019, p. 1164-1167. Conference paper (Refereed)
    Abstract [en]

    The adequate testing of stateful software systems is a hard and costly activity. Failures that result from complex stateful interactions can be of high impact, and it can be hard to replicate failures resulting from erroneous stateful interactions. Addressing this problem in an automatic way would save cost and time and increase the quality of software systems in the industry. In this paper, we propose an approach that uses agents to explore software systems with the intention to find faults and gain knowledge.

  • 3023.
    Karlsson, Sven
    et al.
    KTH, School of Information and Communication Technology (ICT), Microelectronics and Information Technology, IMIT.
    Brorsson, Mats
    KTH, School of Information and Communication Technology (ICT), Communication: Services and Infrastucture, Software and Computer Systems, SCS.
    A comparative characterization of communication patterns in applications using MPI and shared memory on an IBM SP2, 1998. Conference paper (Refereed)
    Abstract [en]

    In this paper we analyze the characteristics of communication in three different applications, FFT, Barnes and Water, on an IBM SP2. We contrast the communication using two different programming models: message-passing, MPI, and shared memory, represented by a state-of-the-art distributed virtual shared memory package, TreadMarks. We show that while communication time and busy times are comparable for small systems, the communication patterns are fundamentally different, leading to poor performance for TreadMarks-based applications when the number of processors increases. This is due to the request/reply technique used in TreadMarks, which results in a large fraction of very small messages. However, if the application can be tuned to reduce the impact of small-message communication, it is possible to achieve acceptable performance at least up to 32 nodes. Our measurements also show that TreadMarks programs tend to cause a more even network load compared to MPI programs.

  • 3024.
    Karlsson, Sven
    et al.
    KTH, School of Information and Communication Technology (ICT), Microelectronics and Information Technology, IMIT.
    Brorsson, Mats
    KTH, School of Information and Communication Technology (ICT), Communication: Services and Infrastucture, Software and Computer Systems, SCS.
    Priority Based Messaging for Software Distributed Shared Memory – Model and Implementation, 2001. Conference paper (Refereed)
  • 3025.
    Karlsson, Sven
    et al.
    KTH, School of Information and Communication Technology (ICT), Microelectronics and Information Technology, IMIT.
    Lee, S. -W
    KTH, School of Information and Communication Technology (ICT), Microelectronics and Information Technology, IMIT.
    Brorsson, Mats
    KTH, School of Information and Communication Technology (ICT), Communication: Services and Infrastucture, Software and Computer Systems, SCS.
    A Fully Compliant OpenMP implementation on Software Distributed Shared Memory, 2002. Conference paper (Refereed)
    Abstract [en]

    OpenMP is a relatively new industry standard for programming parallel computers with a shared memory programming model. Given that clusters of workstations are a cost-effective solution for building parallel platforms, it would of course be highly interesting if the OpenMP model could be extended to these systems as well as to the standard shared memory architectures for which it was originally intended. We present in this paper a fully compliant implementation of the OpenMP specification 1.0 for C targeting networks of workstations. We have used an experimental software distributed shared memory system, CVM, to implement a run-time library which is the target of a source-to-source OpenMP translator also developed in this project. The system has been evaluated using an OpenMP microbenchmark suite designed to evaluate the effect of some memory coherence protocol improvements. We have also used OpenMP versions of three Splash-2 applications, achieving reasonable speedups on an IBM SP machine with eight nodes. This is the first study to investigate the subtle mechanisms of consistency in OpenMP on software DSM systems.

  • 3026.
    Karlsson, Vide
    KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Software and Computer systems, SCS.
    Utvärdering av nyckelordsbaserad textkategoriseringsalgoritmer, 2016. Independent thesis Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis
    Abstract [en]

    Supervised learning algorithms have been used for automatic text categorization with very good results. But supervised learning requires a large amount of manually labeled training data and this is a serious limitation for many practical applications. Keyword-based text categorization does not require manually labeled training data and has therefore been presented as an attractive alternative to supervised learning. The aim of this study is to explore if there are other limitations for using keyword-based text categorization in industrial applications. This study also tests if a new lexical resource, based on the paradigmatic relations between words, could be used to improve existing keyword-based text categorization algorithms. An industry motivated use case was created to measure practical applicability. The results showed that none of five examined algorithms was able to meet the requirements in the industrial motivated use case. But it was possible to modify one algorithm proposed by Liebeskind et al. (2015) to meet the requirements. The new lexical resource produced relevant keywords for text categorization but there was still a large variance in the algorithm’s capacity to correctly categorize different text categories. The categorization capacity was also generally too low to meet the requirements in many practical applications. Further studies are needed to explore how the algorithm’s categorization capacity could be improved.

    Download full text (pdf)
    fulltext
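
    For context, a baseline keyword-based categorizer of the general kind evaluated in the thesis can be sketched in a few lines: each category is scored by how many of its keywords occur in the document, and the highest-scoring category wins. This is neither the modified Liebeskind et al. (2015) algorithm nor the thesis code; the keyword lists in the example are invented.

    ```python
    """Baseline keyword-matching categorizer; a generic sketch, not the thesis code."""

    def categorize(document: str, category_keywords: dict[str, set[str]]) -> str | None:
        tokens = set(document.lower().split())
        scores = {cat: len(tokens & {kw.lower() for kw in kws})
                  for cat, kws in category_keywords.items()}
        best = max(scores, key=scores.get)
        return best if scores[best] > 0 else None   # None: no keyword matched at all

    # Hypothetical keyword lists
    print(categorize("The goalkeeper saved a penalty in the final",
                     {"sports": {"goalkeeper", "penalty", "match"},
                      "finance": {"stock", "dividend", "market"}}))
    ```
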
  • 3027.
    Karlström, P.
    et al.
    KTH, School of Information and Communication Technology (ICT), Computer and Systems Sciences, DSV.
    Pargman, T. C.
    KTH, School of Information and Communication Technology (ICT), Computer and Systems Sciences, DSV.
    Ramberg, Robert
    KTH, School of Information and Communication Technology (ICT), Computer and Systems Sciences, DSV.
    Tools, language technology and communication in computer assisted language learning, 2006. In: Writing and Digital Media, Brill Academic Publishers, 2006, p. 189-198. Chapter in book (Other academic)
    Abstract [en]

    In this chapter, we discuss the use of Language Technology (LT) (e.g. spelling and grammar checkers) in tools for Computer Assisted Language Learning (CALL). Attempts at merging research from LT and CALL were popular during the 1960s, but have since stagnated. We argue that this state of affairs is unfortunate, and that it has several causes: technology not living up to expectations, a single-minded focus on either communication or linguistic forms in language teaching, and framing computer systems for learning as "tutors" instead of "tools." Following brief introductions to Computer Aided Language Learning and LT, we provide a framework for designing "tool"-based systems, and argue that these tools should be used in a communicative setting while simultaneously training linguistic forms.

  • 3028.
    Karlström, Petter
    et al.
    KTH, School of Information and Communication Technology (ICT), Computer and Systems Sciences, DSV.
    Cerratto-Pargman, Teresa
    KTH, School of Information and Communication Technology (ICT), Computer and Systems Sciences, DSV.
    Analyzing student activity in computer assisted language learning, 2006. In: Proc. Sixth Int. Conf. Advanced Learn. Technol. ICALT, 2006, p. 222-226. Conference paper (Refereed)
    Abstract [en]

    We study the use of a computer application intended for Computer Assisted Language Learning (CALL). We present an analytical framework for CALL, consisting of technology, interaction with technology, a relationship between technology and students, and a context where technology is situated. The application we study performs several different kinds of state-of-the-art linguistic analyses, and is intended for writing texts while paying attention to linguistic forms. We have conducted a naturalistic field study of how two informants use the tool collaboratively. Our question is in what manner these students put the tool to use. These students let initiative be taken by the CALL application, despite it being designed with student initiative in mind, and despite the students being aware of features and occasions where they could take initiative. We use our framework to point out how this student-system relationship is formed, and to provide guidance for future design.

  • 3029.
    Karnbrink, John
    Karlstad University, Faculty of Health, Science and Technology (starting 2013), Department of Mathematics and Computer Science (from 2013).
    Grafisk översikt för orderflöde i realtid, 2023. Independent thesis Basic level (degree of Bachelor), 10 credits / 15 HE credits. Student thesis
    Abstract [en]

    This thesis describes a project that aimed to create a digital tool for use at a company. The tool performs a useful function that had been requested for a while by both management and the proposed users. The tool shows orders to be packed at the warehouse. It differs from the system currently used in that it can show newly placed orders in real time instead of with the current system's delay. Desired features were ease of use, in terms of quickly understanding the information, as well as information about the total workload. There is no interactivity, since the tool simply displays information. New queries to the existing database were tested, and problem solving was done to get the real-time aspect of the tool working correctly. The tool uses the existing database and technology similar to that of a simple web page, with a backend consisting of Node.js and NginX. The frontend for the project, which is the graphical presentation, uses HTML, CSS and JS.

    The result of this project was the actual tool in use by the company. It is used to see the total amount of orders as well as item counts for each order, on a day by day basis. Overall, the project was quite successful since the results were decent. However, some parts could be improved.

    Download full text (pdf)
    fulltext
  • 3030.
    Karokola, Geoffrey Rwezaura
    Stockholm University, Faculty of Social Sciences, Department of Computer and Systems Sciences.
    A Framework for Securing e-Government Services: The Case of Tanzania, 2012. Doctoral thesis, comprehensive summary (Other academic)
    Abstract [en]

    e-Government services are becoming one of the most important and efficient means by which governments (G) interact with businesses (B) and citizens (C). This has brought not only tremendous opportunities but also serious security challenges. Critical information assets are exposed to current and emerging security risks and threats. In the course of this study, it was learnt that e-government services are heavily guided and benchmarked by e-Government maturity models (eGMMs). However, the models lack built-in security services, technical as well as non-technical, leading to a lack of alignment of strategic objectives between e-government services and security services. Information security has an important role in mitigating the security risks and threats posed to e-government services. Security improves the quality of the services offered.

    In light of the above, the goal of this research work is to propose a framework that would facilitate government organisations to effectively offer appropriate secure e-government services. To achieve this goal, an empirical investigation was conducted in Tanzania involving six government organizations. The investigations were inter-foiled by a sequence of structural compositions resulting in a proposition of a framework for securing e-government services which integrates IT security services into eGMMs. The research work was mainly guided by a design science research approach complemented in parts by systemic-holistic and socio-technical approaches.

    The thesis contributes to the empirical and theoretical body of knowledge within the computer and systems sciences on securing e-government structures. It encompasses a new approach to secure e-government services incorporating security services into eGMMs. Also, it enhances the awareness, need and importance of security services to be an integral part of eGMMs to different groups such as researched organizations, academia, practitioners, policy and decision makers, stakeholders, and the community.

    Download full text (pdf)
    Comprehensive Summary
  • 3031.
    Karp, Martin
    et al.
    KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Computational Science and Technology (CST).
    Podobas, Artur
    KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Computational Science and Technology (CST).
    Jansson, Niclas
    KTH, School of Electrical Engineering and Computer Science (EECS), Centres, Centre for High Performance Computing, PDC. KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Computational Science and Technology (CST).
    Kenter, Tobias
    Paderborn University.
    Plessl, Christian
    Paderborn University.
    Schlatter, Philipp
    KTH, School of Engineering Sciences (SCI), Engineering Mechanics, Fluid Mechanics and Engineering Acoustics. KTH, School of Engineering Sciences (SCI), Centres, Linné Flow Center, FLOW. KTH, Centres, SeRC - Swedish e-Science Research Centre.
    Markidis, Stefano
    KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Computational Science and Technology (CST). KTH, Centres, SeRC - Swedish e-Science Research Centre.
    Appendix to High-Performance Spectral Element Methods on Field-Programmable Gate Arrays, 2020. Other (Other academic)
    Abstract [en]

    In this appendix we display some results we omitted from our article "High-Performance Spectral Element Methods on Field-Programmable Gate Arrays". In particular, we showcase the measured bandwidth for the FPGA we used (Stratix 10) as well as the performance of our accelerator at different stages of optimization. In addition to this, we illustrate more practical aspects of our performance/resource modeling.

    Improvements in computer systems have historically relied on two well-known observations: Moore's law and Dennard's scaling. Today, both these observations are ending, forcing computer users, researchers, and practitioners to abandon the comforts of general-purpose architectures in favor of emerging post-Moore systems. Among the most salient of these post-Moore systems is the Field-Programmable Gate Array (FPGA), which strikes a good balance between complexity and performance. In this paper, we study modern FPGAs' applicability for use in accelerating the Spectral Element Method (SEM) core to many computational fluid dynamics (CFD) applications. We design a custom SEM hardware accelerator that we empirically evaluate on the latest Stratix 10 SX-series FPGAs and position its performance (and power-efficiency) against state-of-the-art systems such as ARM ThunderX2, NVIDIA Pascal/Volta/Ampere Tesla-series cards, and general-purpose manycore CPUs. Finally, we develop a performance model for our SEM accelerator, which we use to project the performance and role of future FPGAs in accelerating CFD applications, ultimately answering the question: what characteristics would a perfect FPGA for CFD applications have?

    Download full text (pdf)
    fulltext
  • 3032.
    Karp, Martin
    et al.
    KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Computational Science and Technology (CST).
    Podobas, Artur
    KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Computational Science and Technology (CST).
    Jansson, Niclas
    KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Computational Science and Technology (CST).
    Kenter, Tobias
    Plessl, Christian
    Schlatter, Philipp
    KTH, School of Engineering Sciences (SCI), Engineering Mechanics, Fluid Mechanics and Engineering Acoustics.
    Markidis, Stefano
    KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Computational Science and Technology (CST).
    High-Performance Spectral Element Methods on Field-Programmable Gate Arrays: Implementation, Evaluation, and Future Projection2021In: Proceedings of the 35th IEEE International Parallel & Distributed Processing Symposium, May 17-21, 2021 Portland, Oregon, USA, Institute of Electrical and Electronics Engineers (IEEE), 2021Conference paper (Refereed)
    Abstract [en]

     Improvements in computer systems have historically relied on two well-known observations: Moore's law and Dennard's scaling. Today, both these observations are ending, forcing computer users, researchers, and practitioners to abandon the general-purpose architectures' comforts in favor of emerging post-Moore systems. Among the most salient of these post-Moore systems is the Field-Programmable Gate Array (FPGA), which strikes a convenient balance between complexity and performance. In this paper, we study modern FPGAs' applicability in accelerating the Spectral Element Method (SEM) core to many computational fluid dynamics (CFD) applications. We design a custom SEM hardware accelerator operating in double-precision that we empirically evaluate on the latest Stratix 10 GX-series FPGAs and position its performance (and power-efficiency) against state-of-the-art systems such as ARM ThunderX2, NVIDIA Pascal/Volta/Ampere Tesla-series cards, and general-purpose manycore CPUs. Finally, we develop a performance model for our SEM-accelerator, which we use to project future FPGAs' performance and role to accelerate CFD applications, ultimately answering the question: what characteristics would a perfect FPGA for CFD applications have? 

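    The two FPGA records above mention a performance model used to project the accelerator's performance on current and future devices. Purely as an illustration of how such a projection can be framed (and not the authors' actual model), the Python sketch below applies a simple roofline-style bound in which attainable throughput is limited by either peak compute or memory bandwidth; all device numbers and the arithmetic intensity are invented placeholder values.

    # Roofline-style projection sketch (illustrative placeholder values, not figures
    # from the paper): attainable performance is bounded by
    # min(peak compute, arithmetic intensity * memory bandwidth).

    def attainable_gflops(peak_gflops: float, mem_bw_gbs: float, arith_intensity: float) -> float:
        """Upper bound on sustained GFLOP/s for a kernel with the given
        arithmetic intensity (FLOPs per byte moved from memory)."""
        return min(peak_gflops, arith_intensity * mem_bw_gbs)

    if __name__ == "__main__":
        # Hypothetical device parameters (not measured Stratix 10 or GPU numbers).
        devices = {
            "fpga_hypothetical": {"peak_gflops": 1000.0, "mem_bw_gbs": 60.0},
            "gpu_hypothetical": {"peak_gflops": 7000.0, "mem_bw_gbs": 900.0},
        }
        ai = 1.5  # assumed FLOPs/byte for a small SEM tensor-product kernel
        for name, d in devices.items():
            bound = attainable_gflops(d["peak_gflops"], d["mem_bw_gbs"], ai)
            limiter = "memory" if ai * d["mem_bw_gbs"] < d["peak_gflops"] else "compute"
            print(f"{name}: <= {bound:.0f} GFLOP/s ({limiter}-bound at AI={ai})")

    Such a bound only gives a ceiling; the records above calibrate their model against measured bandwidth and per-stage accelerator performance.
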
  • 3033.
    Karpashevich, Pavel
    et al.
    Bauhaus-Universität Weimar, Bauhausstr. 11, D-99423 Weimar, Germany.
    Hornecker, Eva
    Bauhaus-Universität Weimar, Bauhausstr. 11, D-99423 Weimar, Germany.
    Kesewaa Dankwa, Nana
    Bauhaus-Universität Weimar, Bauhausstr. 11, D-99423 Weimar, Germany.
    Hanafy, Mohamed
    Bauhaus-Universität Weimar, Bauhausstr. 11, D-99423 Weimar, Germany.
    Fietkau, Julian
    UBW München, Werner-Heisenberg-Weg 39, D - 85577 Neubiberg, Germany.
    Blurring boundaries between everyday life and pervasive gaming: an interview study of ingress2016In: Proceeding MUM '16 Proceedings of the 15th International Conference on Mobile and Ubiquitous Multimedia, ACM Digital Library, 2016, p. 217-228Conference paper (Refereed)
    Abstract [en]

    We present findings from an interview-based study of the pervasive mobile multiplayer game Ingress. Our study focuses on how boundaries between (1) everyday life and play and (2) 'real' and game space blur in pervasive gaming. We present findings on how the game is integrated into everyday life and affects players' mobility patterns, and on how players experience the relation between the real world and the game world, the game 'bleeding' into the everyday (blurring boundaries at least partially) even though it is not explicitly experienced as hybrid. Furthermore, we discuss how notions of play versus ordinary life still affect some players, and how some players are willing to take and create risks and treat the game as consequential in their everyday interactions with (enemy) players. This further blurs the boundaries of the magic circle, but also creates tensions between casual and serious styles of play. Our findings add to the empirical literature on pervasive games by focusing on player experience in a large-scale pervasive game.

  • 3034.
    Karresand, M.
    et al.
    Department of Information Security and Communication Technology, Norwegian University of Science and Technology, Gjovik, Norway.
    Axelsson, Stefan
    Halmstad University, School of Information Technology, Halmstad Embedded and Intelligent Systems Research (EIS).
    Dyrkolbotn, G. O.
    Department of Information Security and Communication Technology, Norwegian University of Science and Technology, Gjovik, Norway.
    Disk Cluster Allocation Behavior in Windows and NTFS2020In: Mobile Networks and Applications, ISSN 1383-469X, E-ISSN 1572-8153, Vol. 5, no 1, p. 248-258Article in journal (Refereed)
    Abstract [en]

    The allocation algorithm of a file system has a huge impact on almost all aspects of digital forensics, because it determines where data is placed on storage media. Yet only basic information is available on the allocation algorithm of the currently most widespread file system: NTFS. We have therefore studied the NTFS allocation algorithm and its behavior empirically. To do this we used two virtual machines running Windows 7 and 10 on NTFS-formatted fixed-size virtual hard disks, the first being 64 GiB and the latter 1 TiB in size. Files of different sizes were written to disk using two writing strategies, and the $Bitmap files were manipulated to emulate file system fragmentation. Our results show that files written as one large block are allocated areas of decreasing size when the files are fragmented. The decrease in size is seen not only within files but also between them; hence a file with smaller fragments than another file was written after the file with larger fragments. We also found that a file written as a stream gets the opposite allocation behavior, i.e. its fragments increase in size as the file is written. The first allocated unit of a stream-written file is always very small and hence easy to identify. The results of the experiment are of importance to the digital forensics field and will help improve the efficiency of, for example, file carving and timestamp verification. © 2019, The Author(s).

  • 3035.
    Karresand, Martin
    et al.
    Norges teknisk-naturvitenskapelige universitet, Trondheim, Norway; Totalforsvarets forskningsinstitut, Stockholm, Sweden.
    Dyrkolbotn, Geir Olav
    Norges teknisk-naturvitenskapelige universitet, Trondheim, Norway; Norwegian Defence Cyber Academy (NDCA), Norway.
    Axelsson, Stefan
    Halmstad University, School of Information Technology, Halmstad Embedded and Intelligent Systems Research (EIS). Högskolan i Halmstad.
    An Empirical Study of the NTFS Cluster Allocation Behavior Over Time2020In: Forensic Science International: Digital Investigation, ISSN 2666-2817, Vol. 33Article in journal (Refereed)
    Abstract [en]

    © 2020 The Author(s). The amount of data to be handled in digital forensic investigations is continuously increasing, while the tools and processes used are not developed accordingly. This especially affects the digital forensic sub-field of file carving. The use of the structuring of stored data induced by the allocation algorithm to increase the efficiency of the forensic process has been independently suggested by Casey and us. Building on that idea, we have set up an experiment to study the allocation algorithm of NTFS and its behavior over time from different points of view. This includes whether the allocation algorithm behaves the same regardless of Windows version or size of the hard drive, its adherence to the best-fit allocation strategy, and the distribution of the allocation activity over the available (logical) storage space. Our results show that space is not a factor, but there are differences in the allocation behavior between Windows 7 and Windows 10. The results also show that the allocation strategy favors filling in holes in the already written area instead of claiming the unused space at the end of a partition, and that the area with the highest allocation activity slowly progresses from approximately 10 GiB into a partition towards the end as the disk fills up.

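    The two NTFS records above study allocation behavior by observing how allocated clusters appear on disk, including manipulating the $Bitmap file. As an illustration of the kind of analysis involved (not the authors' tooling), the sketch below converts a toy allocation bitmap into runs of consecutively allocated clusters, from which fragment sizes and their ordering could be compared.

    # Toy sketch: derive allocated runs (start_cluster, run_length) from an
    # allocation bitmap where bit i set means cluster i is allocated (bits taken
    # LSB-first per byte). A real analysis would read the NTFS $Bitmap; this is
    # only an illustration, not the tooling used in the studies above.

    def allocated_runs(bitmap: bytes):
        """Yield (start_cluster, run_length) for each run of allocated clusters."""
        run_start, run_len = None, 0
        for i in range(len(bitmap) * 8):
            allocated = (bitmap[i // 8] >> (i % 8)) & 1
            if allocated:
                if run_start is None:
                    run_start = i
                run_len += 1
            elif run_start is not None:
                yield run_start, run_len
                run_start, run_len = None, 0
        if run_start is not None:
            yield run_start, run_len

    if __name__ == "__main__":
        # 0b00111100 -> clusters 2-5 allocated; 0b00000111 -> clusters 8-10 allocated.
        toy_bitmap = bytes([0b00111100, 0b00000111])
        print(list(allocated_runs(toy_bitmap)))  # [(2, 4), (8, 3)]
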
  • 3036.
    Kastegård, Sandra
    Linköping University, Department of Computer and Information Science, Software and Systems. Linköping University, Faculty of Science & Engineering.
    Automated testing of a web-based user interface2015Independent thesis Basic level (university diploma), 10,5 credits / 16 HE creditsStudent thesis
    Abstract [en]

    Testing is a vital part of software development, and test automation is an increasingly common practice. Performing automated testing on web-based applications is more complicated than on desktop applications, which is particularly clear when it comes to testing a web-based user interface, as such interfaces are becoming more complex and dynamic. Depending on the goals and the needed complexity of the testing, a variety of frameworks/tools are available to help implement it.

    This thesis investigates how automated testing of a web-based user interface can be implemented. Testing methods and a selection of relevant testing frameworks/tools are presented and evaluated based on given requirements. Out of the selected frameworks/tools, the Selenium WebDriver framework is chosen and used for implementation. The implementation results in automated test cases for regression testing of the functionality of a user interface created by Infor AB.

    Download full text (pdf)
    fulltext
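
    The thesis above implements its regression tests with Selenium WebDriver. A minimal sketch of what such a test can look like with Selenium's Python bindings is shown below; the URL, element IDs, and browser choice are invented placeholders, not details from the thesis or the tested interface.

    # Minimal Selenium WebDriver regression-test sketch (Python bindings).
    # The URL and element locators are hypothetical placeholders.
    import unittest
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    class LoginPageTest(unittest.TestCase):
        def setUp(self):
            self.driver = webdriver.Firefox()  # requires geckodriver on PATH

        def test_login_form_is_present(self):
            self.driver.get("https://example.com/login")  # placeholder URL
            username = self.driver.find_element(By.ID, "username")  # assumed element id
            password = self.driver.find_element(By.ID, "password")  # assumed element id
            self.assertTrue(username.is_displayed())
            self.assertTrue(password.is_displayed())

        def tearDown(self):
            self.driver.quit()

    if __name__ == "__main__":
        unittest.main()
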
  • 3037.
    Kastrati, Muhamet
    et al.
    University of New York Tirana, Albania.
    Biba, Marenglen
    University of New York Tirana, Albania.
    Imran, Ali Shariq
    Norwegian University of Science and Technology, Norway.
    Kastrati, Zenun
    Linnaeus University, Faculty of Technology, Department of Informatics. Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM).
    Sentiment Polarity and Emotion Detection from Tweets Using Distant Supervision and Deep Learning Models2022In: Foundations of Intelligent Systems. ISMIS 2022 / [ed] Ceci, M., Flesca, S., Masciari, E., Manco, G., Raś, Z.W., Springer, 2022, p. 13-23Conference paper (Refereed)
    Abstract [en]

    Automatic text-based sentiment analysis and emotion detection on social media platforms has gained tremendous popularity recently due to its widespread application reach, despite the limited availability of massive labeled datasets. With social media platforms in the limelight in recent years, it is easier for people to express their opinions and reach a larger target audience via Twitter and Facebook. Large volumes of tweets provide researchers with ample data to train deep learning models for analysis and predictions for various applications. However, deep learning-based supervised learning is data-hungry and relies heavily on abundant labeled data, which remains a challenge. To address this issue, we have created a large-scale labeled emotion dataset of 1.83 million tweets by harnessing emotion-indicative emojis available in tweets. We conducted a set of experiments on our distant-supervised labeled dataset using conventional machine learning and deep learning models for estimating sentiment polarity and multi-class emotion detection. Our experimental results revealed that deep neural networks such as BiLSTM and CNN-BiLSTM outperform other models in both sentiment polarity and multi-class emotion classification tasks, achieving an F1 score of 62.21% and 39.46%, respectively, an average performance improvement of nearly 2–3 percentage points over the baseline results.

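    The record above reports results for BiLSTM and CNN-BiLSTM classifiers trained on a distant-supervised tweet corpus. A minimal Keras sketch of a BiLSTM text classifier of this general shape is shown below; the vocabulary size, class count, and dummy data are placeholders, and the model is not the authors' exact architecture.

    # Minimal BiLSTM text-classifier sketch in Keras. Hyperparameters and data are
    # placeholders, not the architecture or corpus from the paper.
    import numpy as np
    import tensorflow as tf

    VOCAB_SIZE, MAX_LEN, NUM_CLASSES = 20000, 50, 6  # assumed values

    model = tf.keras.Sequential([
        tf.keras.layers.Embedding(VOCAB_SIZE, 128),
        tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64)),
        tf.keras.layers.Dropout(0.3),
        tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])

    # Dummy arrays standing in for tokenized, padded tweets and emoji-derived labels.
    x_train = np.random.randint(0, VOCAB_SIZE, size=(256, MAX_LEN))
    y_train = np.random.randint(0, NUM_CLASSES, size=(256,))
    model.fit(x_train, y_train, epochs=1, batch_size=32)
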
  • 3038. Katayama, K.
    et al.
    Takahashi, H.
    Yokoyama, S.
    Gäfvert, Karl
    KTH, School of Information and Communication Technology (ICT).
    Kinoshita, T.
    Evacuation guidance support using cooperative agent-based IoT devices2017In: 2017 IEEE 6th Global Conference on Consumer Electronics, GCCE 2017, Institute of Electrical and Electronics Engineers Inc. , 2017, p. 1-2Conference paper (Refereed)
    Abstract [en]

    It is an important task to prepare for unprecedented natural disasters. When natural disasters occur, indoor and outdoor situations greatly change. Therefore, it is difficult to provide evacuation guidance during sudden and unpredictable disasters. We propose an agent-based evacuation guidance support system, which autonomously determines the evacuation guidance plan based on the situation assessed by cooperating IoT devices, in order to support quick evacuation guidance. We describe an evacuation guidance support system based on agent-based IoT (AIoT), which consists of situation recognition and evacuation route planning. We performed some experiments to confirm the effectiveness of our proposal.

  • 3039.
    Katsikas, Georgios P.
    KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Communication Systems, CoS, Network Systems Laboratory (NS Lab). RISE SICS.
    NFV Service Chains at the Speed of the Underlying Commodity Hardware2018Doctoral thesis, monograph (Other academic)
    Abstract [en]

    Link speeds in networks will in the near-future reach and exceed 100 Gbps. While available specialized hardware can accommodate these speeds, modern networks have adopted a new networking paradigm, also known as Network Functions Virtualization (NFV), that replaces expensive specialized hardware with open-source software running on commodity hardware. However, achieving high performance using commodity hardware is a hard problem mainly because of the processor-memory gap. This gap suggests that only the fastest memories of today’s commodity servers can achieve the desirable access latencies for high speed networks. Existing NFV systems realize chained network functions (also known as service chains) mostly using slower memories; this implies a need for multiple additional CPU cores or even multiple servers to achieve high speed packet processing. In contrast, this thesis combines four contributions to realize NFV service chains with dramatically higher performance and better efficiency than the state of the art.

    The first contribution is a framework that profiles NFV service chains to uncover reasons for performance degradation, while the second contribution leverages the profiler’s data to accelerate these service chains by combining multiplexing of system calls with scheduling strategies. The third contribution synthesizes input/output and processing service chain operations to increase the spatial locality of network traffic with respect to a system’s caches. The fourth contribution combines the profiler’s insights from the first contribution and the synthesis approach of the third contribution to realize NFV service chains at the speed of the underlying commodity hardware. To do so, stateless traffic classification operations are offloaded into available hardware (i.e., programmable switches and/or network cards) and a tag is associated with each traffic class. At the server side, input traffic classes are classified by the hardware based upon the values of these tags, which indicate the CPU core that should undertake their stateful processing, while ensuring zero inter-core communication.

    With commodity hardware, this thesis realizes Internet Service Provider-level service chains and deep packet inspection at a line-rate 40 Gbps and stateful service chains at the speed of a 100 GbE network card on a 16 core single server. This results in up to (i) 4.7x lower latency, (ii) 8.5x higher throughput, and (iii) 6.5x better efficiency than the state of the art. The techniques described in this thesis are crucial for realizing future high speed NFV deployments.

    Download full text (pdf)
    fulltext
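
    The thesis above tags traffic classes in hardware so that each class's stateful processing is handled by exactly one CPU core, avoiding inter-core communication. The toy Python sketch below illustrates only that dispatch idea; the tag values, flows, and worker count are invented, and the real system performs the tagging in programmable switches and NICs.

    # Toy illustration of tag-based dispatch: each traffic class (tag) is pinned to
    # one worker, so its state is only ever touched by a single "core". Tags and
    # flows are hypothetical.
    from collections import defaultdict

    NUM_WORKERS = 4

    def worker_for_tag(tag: int) -> int:
        """Static tag -> worker mapping (stand-in for hardware dispatch rules)."""
        return tag % NUM_WORKERS

    # Per-worker state tables: no locks needed, since a tag never changes worker.
    per_worker_state = [defaultdict(int) for _ in range(NUM_WORKERS)]

    def process_packet(tag: int, flow_id: str) -> None:
        wid = worker_for_tag(tag)
        per_worker_state[wid][flow_id] += 1  # stateful update, local to one worker

    if __name__ == "__main__":
        packets = [(7, "10.0.0.1:80"), (7, "10.0.0.1:80"), (3, "10.0.0.2:443")]
        for tag, flow in packets:
            process_packet(tag, flow)
        for wid, table in enumerate(per_worker_state):
            if table:
                print(f"worker {wid}: {dict(table)}")
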
  • 3040.
    Katsikas, Georgios P.
    KTH, School of Information and Communication Technology (ICT), Communication Systems, CoS, Network Systems Laboratory (NS Lab).
    Realizing High Performance NFV Service Chains2016Licentiate thesis, monograph (Other academic)
    Abstract [en]

    Network functions (NFs) hold a key role in networks, offering in-network services, such as enhanced performance, policy enforcement, and security. Traditionally, NFs have been implemented in specialized, thus expensive hardware. To lower the costs of deploying NFs, network operators have adopted network functions virtualization (NFV), by migrating NFs from hardware to software running in commodity servers. Several approaches to NFV have shown that commodity network stacks and drivers (e.g., Linux-based) struggle to keep up with increasing hardware speed. Despite this, popular networking services still rely on these commodity components. Moreover, chaining NFs (also known as service chaining) is challenging due to redundancy in the elements of the chain. This licentiate thesis addresses the performance problems of NFV service chains.

    The first contribution is a framework that (i) profiles NFV service chains to uncover performance degradation reasons and (ii) leverages the profiler's data to accelerate these chains, by combining multiplexing of system calls with scheduling strategies. These accelerations improve the cache utilization and thereby the end-to-end latency of chained NFs is reduced by a factor of three. Moreover, the same chains experience a multi-fold latency variance reduction; this result improves the quality of highly-interactive services.

    The second contribution of this thesis substantially revises the way NFV service chains are realized. NFV service chains are synthesized while eliminating redundant input/output and repeated elements, providing consolidated stateful cross layer packet operations across the chain. This software-based synthesis achieves line-rate 40 Gbps throughput for stateful and long service chains. This performance is 8.5x higher than the performance achieved by the software-based state of the art FastClick framework. Experiments with three example Internet Service Provider-level service chains show that this synthesis approach operates at 40 Gbps, when the classification of these chains is offloaded to an OpenFlow switch.

    Download full text (pdf)
    fulltext
  • 3041.
    Katsikas, Georgios P.
    et al.
    KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Communication Systems, CoS, Network Systems Laboratory (NS Lab).
    Barbette, Tom
    KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Communication Systems, CoS, Network Systems Laboratory (NS Lab).
    Chiesa, Marco
    KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Communication Systems, CoS, Network Systems Laboratory (NS Lab).
    Kostic, Dejan
    KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Communication Systems, CoS, Network Systems Laboratory (NS Lab).
    Maguire Jr., Gerald Q.
    KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Software and Computer systems, SCS.
    What you need to know about (Smart) Network Interface Cards2021In: Proceedings Passive and Active Measurement - 22nd International Conference, PAM 2021 / [ed] Springer International Publishing, Springer Nature , 2021Conference paper (Refereed)
    Abstract [en]

    Network interface cards (NICs) are fundamental components of modern high-speed networked systems, supporting multi-100 Gbps speeds and increasing programmability. Offloading computation from a server's CPU to a NIC frees a substantial amount of the server's CPU resources, making NICs key to offering competitive cloud services.

    Therefore, understanding the performance benefits and limitations of offloading a networking application to a NIC is of paramount importance. In this paper, we measure the performance of four different NICs from one of the largest NIC vendors worldwide, supporting 100 Gbps and 200 Gbps. We show that while today's NICs can easily support multi-hundred-gigabit throughputs, performing frequent update operations of a NIC's packet classifier, as network address translators (NATs) and load balancers would do for each incoming connection, results in a dramatic throughput reduction of up to 70 Gbps or complete denial of service. Our conclusion is that none of the tested NICs can support high-speed networking applications that require keeping track of a large number of frequently arriving incoming connections. Furthermore, we show a variety of counter-intuitive performance artefacts, including the performance impact of using multiple tables to classify flows of packets.

    Download full text (pdf)
    fulltext
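
    The paper above finds that per-connection classifier updates can throttle or stall a NIC. A back-of-the-envelope way to see why the update path matters is to bound the sustainable new-connection rate by the per-rule insertion latency, assuming inserts are handled serially; the latencies in the sketch below are invented, not the paper's measurements.

    # Back-of-the-envelope sketch: if every new connection requires inserting one
    # rule into the NIC packet classifier, and inserts are handled serially, the
    # sustainable new-connection rate is bounded by the per-rule insertion latency.
    # The latencies below are hypothetical, not measured values.

    def max_new_connections_per_s(rule_insert_latency_us: float) -> float:
        return 1e6 / rule_insert_latency_us

    if __name__ == "__main__":
        for latency_us in (10.0, 100.0, 1000.0):  # assumed per-insert latencies
            rate = max_new_connections_per_s(latency_us)
            print(f"{latency_us:7.1f} us/insert -> at most {rate:9.0f} new connections/s")
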
  • 3042.
    Katsikas, Georgios P.
    et al.
    KTH, School of Information and Communication Technology (ICT), Communication Systems, CoS, Network Systems Laboratory (NS Lab).
    Enguehard, Marcel
    Kuźniar, Maciej
    Maguire Jr, Gerald Q.
    KTH, School of Information and Communication Technology (ICT), Communication Systems, CoS, Radio Systems Laboratory (RS Lab).
    Kostic, Dejan
    KTH, School of Information and Communication Technology (ICT), Communication Systems, CoS, Network Systems Laboratory (NS Lab).
    SNF: synthesizing high performance NFV service chains2016In: PeerJ Computer Science, ISSN 2376-5992, p. 1-30Article in journal (Refereed)
    Abstract [en]

    In this paper we introduce SNF, a framework that synthesizes (S) network function (NF) service chains by eliminating redundant I/O and repeated elements, while consolidating stateful cross layer packet operations across the chain. SNF uses graph composition and set theory to determine traffic classes handled by a service chain composed of multiple elements. It then synthesizes each traffic class using a minimal set of new elements that apply single-read-single-write and early-discard operations. Our SNF prototype takes a baseline state of the art network functions virtualization (NFV) framework to the level of performance required for practical NFV service deployments. Software-based SNF realizes long (up to 10 NFs) and stateful service chains that achieve line-rate 40 Gbps throughput (up to 8.5x greater than the baseline NFV framework). Hardware-assisted SNF, using a commodity OpenFlow switch, shows that our approach scales at 40 Gbps for Internet Service Provider-level NFV deployments.

    Download full text (pdf)
    fulltext
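
    SNF, described above, consolidates a chain's packet operations so that each traffic class is handled with a single read and a single write instead of one rewrite per NF. The sketch below illustrates only that consolidation idea on a toy header dictionary: a list of per-NF field rewrites is composed offline into one merged write applied once per packet. The NFs, field names, and values are invented, and this is a conceptual illustration rather than SNF's actual graph- and set-based synthesis.

    # Conceptual sketch of collapsing a chain of per-NF header rewrites into a
    # single composite write (the "single-read-single-write" idea). The example
    # chain and field values are hypothetical.

    nat = {"src_ip": "192.0.2.1"}                    # hypothetical NAT rewrite
    lb = {"dst_ip": "10.0.0.42", "dst_port": 8080}   # hypothetical load-balancer rewrite
    ttl_rewrite = {"ttl": 63}                        # hypothetical TTL rewrite

    def compose(chain):
        """Merge per-NF writes left-to-right; later NFs override earlier writes
        to the same field, so the packet needs to be rewritten only once."""
        merged = {}
        for nf_writes in chain:
            merged.update(nf_writes)
        return merged

    if __name__ == "__main__":
        composite_write = compose([nat, lb, ttl_rewrite])
        packet = {"src_ip": "198.51.100.7", "dst_ip": "203.0.113.9",
                  "dst_port": 80, "ttl": 64}
        packet.update(composite_write)  # one write instead of three
        print(packet)
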
  • 3043.
    Katsikeas, Sotirios
    et al.
    KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Network and Systems Engineering.
    Hacks, Simon
    KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Network and Systems Engineering.
    Johnson, Pontus
    KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Network and Systems Engineering.
    Ekstedt, Mathias
    KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Network and Systems Engineering.
    Lagerström, Robert
    KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Network and Systems Engineering.
    Jacobsson, J.
    Wällstedt, B.
    Eliasson, P.
    An Attack Simulation Language for the IT Domain2020In: Graphical Models for Security: 7th International Workshop, GraMSec 2020, Boston, MA, USA, June 22, 2020, Revised Selected Papers, Springer Nature , 2020, Vol. 12419, p. 67-86Conference paper (Refereed)
    Abstract [en]

    Cyber-attacks on IT infrastructures can have disastrous consequences for individuals, regions, as well as whole nations. In order to respond to these threats, the cyber security assessment of IT infrastructures can foster a higher degree of security and resilience against cyber-attacks. Therefore, the use of attack simulations based on system architecture models is proposed. To reduce the effort of creating new attack graphs for each system under assessment, domain-specific languages (DSLs) can be employed. DSLs codify the common attack logics of the considered domain. Previously, MAL (the Meta Attack Language) was proposed, which serves as a framework to develop DSLs and generate attack graphs for modeled infrastructures. In this article, we propose coreLang as a MAL-based DSL for modeling IT infrastructures and analyzing weaknesses related to known attacks. To model domain-specific attributes, we studied existing cyber-attacks to develop a comprehensive language, which was iteratively verified through a series of brainstorming sessions with domain modelers. Finally, this first version of the language was validated against known cyber-attack scenarios.

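    coreLang, described above, is used to generate attack graphs from models of IT infrastructures and to analyze attack paths over them. As a much-simplified illustration of what such graphs are used for (and not MAL or coreLang syntax), the sketch below finds the cheapest attack path through a toy graph whose attack steps and effort values are entirely invented.

    # Toy attack-graph sketch: nodes are attack steps, edge weights are assumed
    # effort values (all invented), and we search for the cheapest path from an
    # entry point to a target asset. A generic illustration only, not MAL/coreLang.
    import heapq

    graph = {
        "internet":     [("phish_user", 2), ("exploit_vpn", 5)],
        "phish_user":   [("workstation", 1)],
        "exploit_vpn":  [("internal_net", 1)],
        "workstation":  [("internal_net", 2)],
        "internal_net": [("db_server", 4)],
        "db_server":    [],
    }

    def cheapest_attack_path(src, dst):
        """Dijkstra over the toy attack graph; returns (total_cost, path)."""
        pq, seen = [(0, src, [src])], set()
        while pq:
            cost, node, path = heapq.heappop(pq)
            if node == dst:
                return cost, path
            if node in seen:
                continue
            seen.add(node)
            for nxt, weight in graph[node]:
                if nxt not in seen:
                    heapq.heappush(pq, (cost + weight, nxt, path + [nxt]))
        return float("inf"), []

    if __name__ == "__main__":
        print(cheapest_attack_path("internet", "db_server"))  # phishing route wins (cost 9)
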
  • 3044.
    Katsikeas, Sotirios
    et al.
    KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Network and Systems Engineering.
    Rencelj Ling, Engla
    KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Network and Systems Engineering.
    Johnsson, Pontus
    KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Network and Systems Engineering.
    Ekstedt, Mathias
    KTH, School of Electrical Engineering and Computer Science (EECS), Computer Science, Network and Systems Engineering.
    Empirical evaluation of a threat modeling language as a cybersecurity assessment tool2024In: Computers & security (Print), ISSN 0167-4048, E-ISSN 1872-6208, Vol. 140, article id 103743Article in journal (Refereed)
    Abstract [en]

    The complexity of ICT infrastructures is continuously increasing, presenting a formidable challenge in safeguarding them against cyber attacks. In light of escalating cyber threats and limited availability of expert resources, organizations must explore more efficient approaches to assess their resilience and undertake proactive measures. Threat modeling is an effective approach for assessing the cyber resilience of ICT systems. One method is to utilize Attack Graphs, which visually represent the steps taken by adversaries during an attack. Previously, MAL (the Meta Attack Language) was proposed, which serves as a framework for developing Domain-Specific Languages (DSLs) and generating Attack Graphs for modeled infrastructures. coreLang is a MAL-based threat modeling language that utilizes such Attack Graphs to enable attack simulations and security assessments for the generic ICT domain. Developing domain-specific languages for threat modeling and attack simulations provides a powerful approach for conducting security assessments of infrastructures. However, ensuring the correctness of these modeling languages raises a separate research question. In this study we conduct an empirical experiment aiming to falsify such a domain-specific threat modeling language. The potential inability to falsify the language through our empirical testing would lead to its corroboration, strengthening our belief in its validity within the parameters of our study. The outcomes of this approach indicated that, on average, the assessments generated by attack simulations outperformed those of human experts. Additionally, both human experts and simulations exhibited significantly superior performance compared to random guessers in their assessments. While specific human experts occasionally achieved better assessments for particular questions in the experiments, the efficiency of simulation-generated assessments surpasses that of human domain experts.

  • 3045.
    Katura, Robert
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering.
    Information Classification in Information Security Management and its Challenges2023Independent thesis Advanced level (degree of Master (Two Years)), 80 credits / 120 HE creditsStudent thesis
    Abstract [en]

    Information classification is a prerequisite for carrying out risk management in information security, as the assets worth protecting are identified and the need for protection is determined by the classification categories. Information classification thus has a major impact on the security architecture of systems and organizations. Nevertheless, information classification leads a shadowy existence in the scientific literature, which is reflected in a limited number of scientific publications. This discrepancy between the relevance of information classification in risk management and the low scientific attention it receives was the motivation to take a closer look at the topic. This thesis created an overview of the current state of research in information classification and shed some light on potential problems in order to stimulate new research questions. The results of the work include a current overview of the status of research on information classification in the risk management of information security and its relation to other academic disciplines and practical needs, particularly research on bias and systems engineering. The thesis also summarized a total of 109 individual research gaps in information classification research, derived from the evaluation of the scientific literature and from the conclusions of identified open questions. From the gaps identified, some suggestions for future research in the field of information classification could be made.

    Download full text (pdf)
    Information Classification in Information Security Management and its Challenges
  • 3046.
    Katz, Andrew
    et al.
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre. Moorcrofts LLP Solicitors, Buckinghamshire, United Kingdom.
    Lundell, Björn
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Gamalielsson, Jonas
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Software, copyright and the learning environment: An analysis of the IT contracts Swedish schools impose on their students and the implications for FOSS2016In: International Free and Open Source Software Law Review, ISSN 1877-6922, Vol. 8, no 1, p. 1-28Article in journal (Refereed)
    Abstract [en]

    Free and open source software (FOSS) is commonly made available to students in schools, but the schools do not necessarily take a holistic approach to their provision of IT (including software) which takes into account the nature of FOSS. In particular, we have identified a number of contracts with which Swedish students who are provided with laptops by their schools are required to comply which set out conditions for the use of the laptops, and associated software and content. Many clauses in these contracts are legally incompatible with certain FOSS licences, or contain misconceptions about FOSS, licensing and culture. This paper explores the relationship between the contracts and FOSS licensing and culture, and suggests a number of resolutions to the contradictions and misconceptions, as well as considering related issues.

  • 3047.
    Katzeff, Cecilia
    KTH, School of Architecture and the Built Environment (ABE), Sustainable development, Environmental science and Engineering, Strategic Sustainability Studies.
    Homes in the smart grid2022In: Towards the energy of the future: The invisible revolution behind the electrical socket / [ed] Vetenskap och Allmänhet, Stockholm: Vetenskap och Allmänhet , 2022, 1, p. 119-128Chapter in book (Other (popular science, discussion, etc.))
  • 3048.
    Kauküla, Marcus
    University of Skövde, School of Informatics.
    Analysing Performance Effects of Deduplication on Virtual Machine Storage2017Independent thesis Basic level (degree of Bachelor), 15 credits / 22,5 HE creditsStudent thesis
    Abstract [en]

    Virtualization is a widely used technology for running multiple operating systems on a single set of hardware. Virtual machines running the same operating system have been shown to contain a large amount of identical data; in such cases deduplication has been shown to be very effective in eliminating duplicated data.

    This study aimed to investigate whether the storage savings are as large as shown in previous research, as well as whether there are any negative performance impacts when using deduplication. The selected performance variables are resource utilisation and disk performance.

    The selected deduplication implementations are SDFS and ZFS deduplication. Each file system is tested against its respective non-deduplicated file system, ext4 and ZFS.

    The results show that the storage savings are between 72.5% and 73.65%, while resource utilisation is generally higher when using deduplication. The results also show that deduplication using SDFS has an overall large negative impact on disk performance, while ZFS deduplication generally increases disk performance.

    Download full text (pdf)
    fulltext
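
    The study above reports block-level storage savings of roughly 72-74%. The sketch below shows, on synthetic data, how such a savings figure can be computed: split the data into fixed-size blocks, hash them, and compare the number of unique blocks to the total. The block size and synthetic "disk" contents are invented for illustration.

    # Illustrative block-level deduplication savings calculation on synthetic data.
    # The block size and input are invented; the study above measures savings on
    # real VM storage with SDFS and ZFS.
    import hashlib

    BLOCK_SIZE = 4096  # assumed fixed block size in bytes

    def dedup_savings(data: bytes) -> float:
        """Fraction of storage saved if identical blocks are stored only once."""
        blocks = [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]
        unique = {hashlib.sha256(b).hexdigest() for b in blocks}
        return 1.0 - len(unique) / len(blocks)

    if __name__ == "__main__":
        # Synthetic "disk": 70 identical blocks plus 30 distinct blocks.
        common = b"A" * BLOCK_SIZE
        distinct = b"".join(bytes([i]) * BLOCK_SIZE for i in range(30))
        disk = common * 70 + distinct
        print(f"savings: {dedup_savings(disk):.1%}")  # 69.0% on this toy input
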
  • 3049.
    Kaur, Manshaman
    et al.
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Iftikhar, Saima
    Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
    Why is it not important to follow guidelines when conducting Daily Standup Meetings?2022Independent thesis Advanced level (professional degree), 12 credits / 18 HE creditsStudent thesis
    Abstract [en]

    Background: Agile practices are widely used in the software development process to quickly adapt and respond to the challenges occurring during the whole software development lifecycle. Daily standup meetings are a common practice to share an update about the status and impediments with the team. However, how much value this practice adds to help the team members is uncertain.

    Objective: The purpose of this study is to find out how daily standup meetings should be conducted effectively. It aims to identify the different needs of different teams, depending on factors such as team size, geographic location, and project progress, and how these should be taken into account when planning and conducting daily standup meetings.

    Method: This case study was conducted on two different projects at IT software development companies. A total of 10 people with expert knowledge of Agile ways of working were interviewed, and 34 digital DSMs were attended to observe the meetings. Thematic analysis was performed on the collected data. In addition, feedback was collected from different people working in those teams, and quantitative analysis was performed on the collected data to validate the results of the thematic analysis.

    Results: The results indicate that the DSM is an important meeting in Agile ways of working; it helps not only with status monitoring and work planning but also helps people connect as a team. Impediments are fixed at a fast pace and deviations are tracked without delay. Further, different factors such as team size, duration, frequency, and participants are studied to find out how these should be considered when planning and conducting DSMs. Each team has its own needs, so to make DSMs more effective, those needs should be studied and well understood before conducting them.

    Conclusion: Daily standup meetings are among the crucial meetings that contribute to the success of a project when conducted effectively. A DSM is not just a status update meeting but also a place where the full team comes together to work towards a common goal and to discuss and highlight important information. It is therefore important to consider all factors affecting its effectiveness so that the project can benefit from this meeting and progress better.

    Download full text (pdf)
    Why is it not important to follow guidelines when conducting Daily Standup Meetings?
  • 3050.
    Kavallos, Christos-Sotirios
    Luleå University of Technology, Department of Computer Science, Electrical and Space Engineering.
    Parliament proceeding classification via Machine Learning algorithms: A case of Greek parliament proceedings2023Independent thesis Advanced level (degree of Master (One Year)), 10 credits / 15 HE creditsStudent thesis
    Abstract [en]

    The Greek Parliament is a critical institution of Greek democracy, where important decisions are made that affect the lives of millions of people. It consists of representatives from different political parties, and each party has a unique political ideology, stance, and agenda. The proposed research aims to automatically classify parliamentary proceedings to their respective political parties based on the content of their speeches, debates, and discussions. The goal of this research is to assess the feasibility of classifying Greek parliament proceedings to their respective political party via machine learning and neural network algorithms. By using machine learning algorithms and neural networks, the system can learn from large amounts of data and make accurate predictions about the category of a given proceeding.

    One possible approach is to use supervised learning algorithms, where the system is trained on a dataset of parliamentary proceedings labeled with the respective political parties. The machine learning algorithms can then learn the underlying patterns and features in the text data and accurately classify new proceedings to their respective parties. Another potential approach is to use deep learning neural networks, such as recurrent neural networks (RNNs), to classify the proceedings. These networks can be trained on large amounts of labeled data and can learn the complex relationships between the text features and political parties.

    The results of this research can be used to gain insights into the political landscape and the positions of different parties on various issues. The ability to automatically classify parliamentary proceedings to their political parties can also aid in political analysis, including tracking the voting patterns of different parties and their representatives, and more generally points to the potential to transform research methods in the social and human sciences. Moreover, the proposed research can have implications for policy-making and governance. By analyzing the proceedings and identifying the political parties' positions and priorities, policymakers can better understand the political landscape and craft policies that align with the values and priorities of different parties.

    In conclusion, the classification of parliament proceedings (in our case, Greek) by political party via NLP with machine learning algorithms is a promising research topic with potential applications in political analysis and decision-making. The ability to automatically classify parliamentary proceedings to their respective parties can enhance transparency and accountability in the democratic system and aid in policy-making and governance.

    Download full text (pdf)
    fulltext
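
    The thesis above frames party classification of parliamentary speeches as supervised text classification. A minimal scikit-learn sketch of such a pipeline (TF-IDF features plus logistic regression) is shown below; the example speeches and party labels are invented English stand-ins, and the thesis itself works on Greek proceedings and also evaluates neural models such as RNNs.

    # Minimal supervised text-classification sketch (TF-IDF + logistic regression)
    # of the kind the thesis evaluates. Speeches and party labels are invented
    # stand-ins, not data from the Greek parliament corpus.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    speeches = [
        "We must lower taxes to stimulate private investment.",
        "Public healthcare funding has to increase this year.",
        "Deregulation will help small businesses grow.",
        "Workers deserve stronger collective bargaining rights.",
    ]
    parties = ["party_A", "party_B", "party_A", "party_B"]  # hypothetical labels

    clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                        LogisticRegression(max_iter=1000))
    clf.fit(speeches, parties)

    print(clf.predict(["The state should expand hospital budgets."]))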