1 - 8 of 8
  • 1.
    Alissandrakis, Aris
    et al.
    Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM), Department of Media Technology.
    Reski, Nico
    Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM), Department of Media Technology.
    Using Mobile Augmented Reality to Facilitate Public Engagement (2017). In: Extended Papers of the International Symposium on Digital Humanities (DH 2016) / [ed] Koraljka Golub, Marcelo Milrad, CEUR-WS, 2017, Vol. 2021, p. 99-109. Conference paper (Refereed).
    Abstract [en]

    This paper presents our initial efforts towards the development of a framework for facilitating public engagement through the use of mobile Augmented Reality (mAR); these efforts fall under the overall project title "Augmented Reality for Public Engagement" (PEAR). We present the concept and implementation of a mobile phone app (PEAR 4 VXO), and discuss the results from its deployment. The app was used for a user study in conjunction with a campaign carried out by Växjö municipality (Sweden) that explored how to get citizens more engaged in urban planning actions and decisions; these activities took place during spring 2016. One of the salient features of our approach is that it combines novel ways of using mAR together with social media, online databases, and sensors to support public engagement. In addition, the data collection process and audience engagement were tested in a follow-up limited deployment. The analysis and outcomes of our initial results validate the overall concept and indicate the potential usefulness of the app as a tool, but also highlight the need for an active campaign on the part of the stakeholders. Our future efforts will focus on addressing some of the problems and challenges identified during the different phases of this user study.

  • 2.
    Alissandrakis, Aris
    et al.
    Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM).
    Reski, Nico
    Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM).
    Laitinen, Mikko
    University of Eastern Finland, Finland.
    Tyrkkö, Jukka
    Linnaeus University, Faculty of Arts and Humanities, Department of Languages.
    Levin, Magnus
    Linnaeus University, Faculty of Arts and Humanities, Department of Languages.
    Lundberg, Jonas
    Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM).
    Visualizing dynamic text corpora using Virtual Reality (2018). In: ICAME 39: Tampere, 30 May – 3 June 2018: Corpus Linguistics and Changing Society: Book of Abstracts, Tampere: University of Tampere, 2018, p. 205. Conference paper (Refereed).
    Abstract [en]

    In recent years, data visualization has become a major area in Digital Humanities research, and the same holds true in linguistics. The rapidly increasing size of corpora, the emergence of dynamic real-time streams, and the availability of complex and enriched metadata have made it increasingly important to facilitate new and innovative approaches to presenting and exploring primary data. This demonstration showcases the use of Virtual Reality (VR) in the visualization of geospatial linguistic data from the Nordic Tweet Stream (NTS) project (see Laitinen et al. 2017). The NTS data for this demonstration comprises a full year of geotagged tweets (12,443,696 tweets from 273,648 user accounts) posted within the Nordic region (Denmark, Finland, Iceland, Norway, and Sweden). The dataset includes over 50 metadata parameters in addition to the tweets themselves.

    We demonstrate the potential of using VR to efficiently find meaningful patterns in vast streams of data. The VR environment provides an easy overview of any of the features (textual or metadata) in a text corpus. Our focus will be on the language identification data, which provides a previously unexplored perspective on the use of English and other non-indigenous languages in the Nordic countries alongside the region's native languages.

    Our VR prototype uses the HTC Vive headset for a room-scale VR scenario and is being developed with the Unity3D game engine. Each node in the VR space is displayed as a stacked cuboid, the equivalent of a bar chart in three-dimensional space, summarizing all tweets at one geographic location for a given point in time (see: https://tinyurl.com/nts-vr). Each stacked cuboid represents the three most frequently used languages, appropriately color-coded, enabling the user to get an overview of the language distribution at each location (see the aggregation sketch after this abstract). The VR prototype further encourages users to move between different locations and inspect points of interest in more detail (overall location-related information, a detailed list of all languages detected, the most frequently used hashtags). An underlying map outlines country borders and facilitates orientation. In addition to spatial movement through the Nordic region, the VR system provides an interface to explore the Twitter data based on time (days, weeks, months, or predefined special events), which enables users to explore data over time (see: https://tinyurl.com/nts-vr-time).

    In addition to demonstrating how the VR methods aid data visualization and exploration, we will also briefly discuss the pedagogical implications of using VR to showcase linguistic diversity.
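
    The stacked-cuboid encoding described in this abstract implies a simple aggregation step: group tweets by location and keep the top three language shares per group. A minimal sketch of that step is shown below; the tweet records, field names, and top-3 cut-off are assumptions for illustration, not the project's actual pipeline.

```python
# Sketch: group geotagged tweets by location and compute the top three
# language shares per location -- the data behind one stacked cuboid.
# The records below are invented for illustration.
from collections import Counter, defaultdict

tweets = [
    {"location": "Stockholm", "lang": "sv"},
    {"location": "Stockholm", "lang": "en"},
    {"location": "Stockholm", "lang": "sv"},
    {"location": "Helsinki", "lang": "fi"},
    {"location": "Helsinki", "lang": "en"},
]

by_location = defaultdict(Counter)
for tweet in tweets:
    by_location[tweet["location"]][tweet["lang"]] += 1

# One "stack" per location: languages and their shares, which a renderer
# could map to cuboid segment heights and colours.
for location, counts in by_location.items():
    total = sum(counts.values())
    stack = [(lang, count / total) for lang, count in counts.most_common(3)]
    print(location, stack)
```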

  • 3.
    Reski, Nico
    et al.
    Linnaeus University, Faculty of Technology, Department of Media Technology.
    Alissandrakis, Aris
    Linnaeus University, Faculty of Technology, Department of Media Technology.
    Change Your Perspective: Exploration of a 3D Network Created from Open Data in an Immersive Virtual Reality Environment (2016). In: ACHI 2016: The Ninth International Conference on Advances in Computer-Human Interactions / [ed] Alma Leora Culén, Leslie Miller, Irini Giannopulu, Birgit Gersbeck-Schierholz, International Academy, Research and Industry Association (IARIA), 2016, p. 403-410. Conference paper (Refereed).
    Abstract [en]

    This paper investigates an approach for naturally interacting with and exploring information (based on open data) within an immersive virtual reality environment (VRE) using a head-mounted display and vision-based motion controls. We present the results of a user interaction study that investigated the acceptance of the developed prototype, estimated the workload, and examined the participants' behavior. Additional discussions with experts provided further feedback on the prototype's overall design and concept. The results indicate that the participants were enthusiastic about the novelty and intuitiveness of exploring information in a VRE, and were challenged (in a positive manner) by the applied interface and interaction design. The presented concept and design were well received by the experts, who valued the idea and implementation and encouraged us to be even bolder in making use of the available 3D environment.
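
    As a rough sketch of the kind of structure being explored (the node data, edge list, and choice of layout algorithm below are assumptions for illustration; the paper does not publish its code), a 3D network over linked open-data records could be laid out as follows:

```python
# Hypothetical sketch: laying out a graph of linked open-data records in 3D,
# a plausible starting point for a network like the one shown in the VRE.
# The record names and links are invented.
import networkx as nx

records = {
    "station-42": ["sensor-1", "sensor-2"],
    "station-77": ["sensor-2", "sensor-3"],
}

G = nx.Graph()
for record, linked in records.items():
    for target in linked:
        G.add_edge(record, target)

# Force-directed layout in three dimensions; each node gets an (x, y, z)
# position that a VR engine could use to place it in the scene.
positions = nx.spring_layout(G, dim=3, seed=7)
for node, (x, y, z) in positions.items():
    print(f"{node}: ({x:+.2f}, {y:+.2f}, {z:+.2f})")
```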

  • 4.
    Reski, Nico
    et al.
    Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM).
    Alissandrakis, Aris
    Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM).
    Open data exploration in virtual reality: a comparative study of input technology (2019). In: Virtual Reality, ISSN 1359-4338, E-ISSN 1434-9957. Article in journal (Refereed).
    Abstract [en]

    In this article, we compare three different input technologies (gamepad, vision-based motion controls, room-scale) for an interactive virtual reality (VR) environment. The overall system is able to visualize (open) data from multiple online sources in a unified interface, enabling the user to browse and explore the displayed information in an immersive VR setting. We conducted a user interaction study (n=24; n=8 per input technology, between-group design) to investigate experienced workload and perceived flow of interaction. Log files and observations allowed further insights into, and comparison of, each condition. We identified trends that indicate user preference for a visual (virtual) representation, but no clear trends regarding the application of physical controllers (over vision-based controls), in a scenario that encouraged exploration with no time limitations.
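
    To make the study design concrete, a minimal sketch of how such a three-group workload comparison could be analyzed is shown below. The scores are invented and the choice of the Kruskal-Wallis test is an assumption; the article's actual analysis is not described in this abstract.

```python
# Hypothetical sketch of a between-group comparison with n = 8 per input
# condition. The workload scores below are invented placeholder data.
from scipy.stats import kruskal

workload = {  # one list of overall workload scores per group
    "gamepad":      [42, 55, 38, 47, 51, 44, 60, 39],
    "vision-based": [58, 62, 49, 66, 53, 57, 61, 50],
    "room-scale":   [35, 41, 48, 37, 44, 52, 40, 46],
}

# Kruskal-Wallis is a common choice for three small independent groups,
# since it does not assume normally distributed scores.
h, p = kruskal(*workload.values())
print(f"H = {h:.2f}, p = {p:.3f}")
```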

  • 5.
    Reski, Nico
    et al.
    Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM).
    Alissandrakis, Aris
    Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM).
    Using an Augmented Reality Cube-like Interface and 3D Gesture-based Interaction to Navigate and Manipulate Data (2018). In: VINCI '18: Proceedings of the 11th International Symposium on Visual Information Communication and Interaction, New York: Association for Computing Machinery (ACM), 2018, p. 92-96. Conference paper (Refereed).
    Abstract [en]

    In this paper we describe our work in progress on an interface that enables users to browse and select data within an Augmented Reality environment, using a virtual cube object that can be interacted with through 3D gestural input. We present the prototype design (including the graphical elements), describe the interaction possibilities of touching the cube with the hand/finger, and put the prototype into the context of our Augmented Reality for Public Engagement (PEAR) framework. An interactive prototype was implemented and runs on a typical off-the-shelf smartphone.
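
    A minimal sketch of the face-touch test such an interface needs is given below; the function name, geometry, and tolerance value are assumptions for illustration, not the paper's implementation.

```python
# Hypothetical sketch of face selection on an axis-aligned virtual cube,
# loosely modelled on "touching the cube with the hand/finger".
import numpy as np

FACE_NORMALS = {
    "+x": np.array([1, 0, 0]), "-x": np.array([-1, 0, 0]),
    "+y": np.array([0, 1, 0]), "-y": np.array([0, -1, 0]),
    "+z": np.array([0, 0, 1]), "-z": np.array([0, 0, -1]),
}

def touched_face(fingertip, cube_center, half_size, tolerance=0.02):
    """Return which cube face the tracked fingertip touches, if any.

    fingertip and cube_center are 3D points in the cube's frame;
    half_size is half the cube's edge length (same units).
    """
    local = np.asarray(fingertip, dtype=float) - np.asarray(cube_center, dtype=float)
    for name, normal in FACE_NORMALS.items():
        # Signed distance from the fingertip to the face plane.
        distance = float(normal @ local) - half_size
        on_plane = abs(distance) <= tolerance
        # Tangential components must stay within the face bounds.
        tangential = local - (normal @ local) * normal
        in_bounds = np.all(np.abs(tangential) <= half_size + tolerance)
        if on_plane and in_bounds:
            return name
    return None

print(touched_face(fingertip=(0.0, 0.0, 0.13), cube_center=(0, 0, 0),
                   half_size=0.1, tolerance=0.05))  # -> "+z"
```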

  • 6.
    Reski, Nico
    et al.
    Alissandrakis, Aris
    Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM).
    Tyrkkö, Jukka
    Linnaeus University, Faculty of Arts and Humanities, Department of Languages.
    Collaborative exploration of rich corpus data using immersive virtual reality and non-immersive technologies (2019). In: ADDA: Approaches to Digital Discourse Analysis – ADDA 2, Turku, Finland, 23-25 May 2019; Book of Abstracts, Turku: University of Turku, 2019, p. 7. Conference paper (Other academic).
    Abstract [en]

    In recent years, large textual data sets, comprising many data points and rich metadata, have become a common object of investigation and analysis. Information Visualization and Visual Analytics provide practical tools for visual data analysis, most commonly as interactive two-dimensional (2D) visualizations displayed on normal computer monitors. At the same time, display technologies have evolved rapidly over the past decade. In particular, emerging technologies such as virtual reality (VR), augmented reality (AR), and mixed reality (MR) have become affordable and more user-friendly (LaValle 2016). Under the banner of “Immersive Analytics”, researchers have started to explore novel applications of such immersive technologies for the purpose of data analysis (Marriott et al. 2018).

    By using immersive technologies, researchers hope to increase motivation and user engagement in the overall data analysis activity, as well as to provide different perspectives on the data. This can be particularly helpful in exploratory data analysis, when the researcher attempts to identify interesting points or anomalies in the data without prior knowledge of what exactly they are searching for. Furthermore, the data analysis process often involves the collaborative sharing of information and knowledge between multiple users, with the goal of interpreting and making sense of the explored data together (Isenberg et al. 2011). However, immersive technologies such as VR often offer rather single-user-centric experiences, where one user wears a head-mounted display (HMD) and is thus visually isolated from the real-world surroundings. Consequently, new tools and approaches are needed for co-located, synchronous collaboration in such immersive data analysis scenarios.

    In this software demonstration, we present our VR system that enables two users to explore data at the same time: one inside an immersive VR environment, and one outside VR using a non-immersive companion application. The demonstrated data analysis activity is centered on exploring language variability in tweets from the perspectives of multilingualism and sociolinguistics (see, e.g., Coats 2017 and Grieve et al. 2017). Our primary data come from the Nordic Tweet Stream (NTS) corpus (Laitinen et al. 2018, Tyrkkö 2018), and the immersive VR application visualizes in three dimensions (3D) the clustered Twitter traffic within the Nordic region as stacked cuboids placed according to their geospatial position, where each stack represents a color-coded language share (Alissandrakis et al. 2018). Through 3D gestural input, the VR user can interact with the data using hand postures and gestures in order to move through the virtual 3D space, select clusters and display more detailed information, and navigate through time (Reski and Alissandrakis 2019) ( https://vrxar.lnu.se/apps/odxvrxnts-360/ ). A non-immersive companion application, running in a normal web browser, presents an overview map of the Nordic region as well as other supplemental information about the data that is more suitable for display using non-immersive technologies.

    We will present two complementary applications, each with a different objective within the collaborative data analysis framework. The design and implementation of connectivity and collaboration features within these applications facilitate co-located, synchronous exploration and sensemaking. For instance, the VR user’s position and orientation are displayed and updated in real time within the overview map of the non-immersive application. Conversely, the cluster selected by the non-immersive user is also highlighted for the user in VR. Initial tests with pairs of language students validated the proof of concept of the developed collaborative system and encourage further investigation in this direction.
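
    A minimal sketch of the kind of state sharing described above could look like the following; the transport (WebSockets), message schema, URL, and update rate are all assumptions, as the abstract does not specify them.

```python
# Sketch: the VR application publishes the user's pose so the companion
# web application can draw it on the overview map. Transport and schema
# are assumed; placeholder pose values stand in for real headset data.
import asyncio
import json

import websockets  # third-party: pip install websockets

async def publish_vr_pose(uri="ws://localhost:8765/pose"):
    """Send the VR user's position/orientation roughly 10 times per second."""
    async with websockets.connect(uri) as ws:
        for _ in range(100):  # a real client would loop for the whole session
            pose = {
                "position": [12.3, 1.7, -4.2],  # placeholder coordinates
                "yaw_degrees": 87.5,            # placeholder orientation
            }
            await ws.send(json.dumps(pose))
            await asyncio.sleep(0.1)

asyncio.run(publish_vr_pose())
```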

  • 7.
    Reski, Nico
    et al.
    Linnaeus University, Faculty of Technology, Department of Media Technology.
    Nordmark, Susanna
    Linnaeus University, Faculty of Technology, Department of Media Technology.
    Milrad, Marcelo
    Linnaeus University, Faculty of Technology, Department of Media Technology.
    Exploring New Interaction Mechanisms to Support Information Sharing and Collaboration Using Large Multi-touch Displays in the Context of Digital Storytelling (2014). In: Proceedings of the 14th IEEE International Conference on Advanced Learning Technologies (ICALT 2014), IEEE Press, 2014, p. 176-180. Conference paper (Refereed).
    Abstract [en]

    A wide range of Information and Communications Technologies (ICT) has been used over the last decades to support teaching and to enhance the learning process. With the introduction of large interactive tabletops, multi-touch interaction is taken to the next level, since large displays allow and invite not just one but multiple users to interact and collaborate at the same time. The latter presents designers and developers with new challenges in terms of interaction possibilities to promote active collaboration and information sharing. This paper evaluates the use of novel Tangible User Interface (TUI) approaches for the design of an interactive tabletop application conceived to support co-located collaborative learning in the particular context of Digital Storytelling (DS). We present the results of a user interaction study that considers the users' subjective reaction to and acceptance of these User Interface (UI) paradigms, as well as their level of collaboration and communication while working together. The results of the study indicate that the users adapted very quickly to working in close collaboration using the provided multi-touch functionalities. Furthermore, users appreciated the possibility of closely discussing, conversing, and exchanging information with their peers through simultaneous interactions on the multi-touch display.

  • 8.
    Yousefi, Shahrouz
    et al.
    Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM), Department of Media Technology.
    Kidane, Mhretab
    ManoMotion AB, Stockholm.
    Delgado, Yeray
    ManoMotion AB, Sweden.
    Chana, Julio
    ManoMotion AB, Sweden.
    Reski, Nico
    Linnaeus University, Faculty of Technology, Department of computer science and media technology (CM), Department of Media Technology.
    3D Gesture-Based Interaction for Immersive Experience in Mobile VR (2016). In: 2016 23rd International Conference on Pattern Recognition (ICPR), Cancún Center, Cancún, México, December 4-8, 2016, Cancún: IEEE Press, 2016, p. 2122-2127. Conference paper (Refereed).
    Abstract [en]

    In this paper we introduce a novel solution for real-time 3D hand gesture analysis using the embedded 2D camera of a mobile device. The presented framework is based on forming a large database of hand gestures, including ground-truth information about hand poses and the details of finger joints in 3D. For a query frame captured by the mobile device's camera in real time, the gesture analysis system finds the best match in the database. Once the best match is found, the corresponding ground-truth information is used for interaction in the designed interface. The presented framework performs extremely efficient gesture analysis (more than 30 fps) under flexible lighting conditions and against complex backgrounds, with dynamic movement of the mobile device. The introduced work is implemented on Android and tested on a Gear VR headset.
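
    The database-matching step can be sketched as a nearest-neighbour lookup, as below. Everything here (feature size, joint count, random placeholder data) is an assumption for illustration; the abstract does not describe the actual descriptors or search structure.

```python
# Hypothetical sketch: a query frame is reduced to a feature vector and
# matched against a database whose entries carry ground-truth 3D poses.
import numpy as np

rng = np.random.default_rng(0)

# Pretend database: one feature vector per stored gesture frame, plus the
# ground-truth pose parameters recorded for that frame.
db_features = rng.random((10_000, 128)).astype(np.float32)
db_poses = rng.random((10_000, 21 * 3))  # e.g. 21 hand joints in 3D

def best_match_pose(query_features):
    """Return the ground-truth pose of the nearest database entry."""
    distances = np.linalg.norm(db_features - query_features, axis=1)
    return db_poses[np.argmin(distances)]

query = rng.random(128).astype(np.float32)
pose = best_match_pose(query)
print(pose.shape)  # (63,) -> joint coordinates used to drive the interface
```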
