Digitala Vetenskapliga Arkivet

1 - 30 of 30
  • 1.
    Abedan Kondori, Farid
    et al.
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Liu, Li
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    3D Active Human Motion Estimation for Biomedical Applications (2012). In: World Congress on Medical Physics and Biomedical Engineering May 26-31, 2012, Beijing, China / [ed] Mian Long, Springer Berlin/Heidelberg, 2012, pp. 1014-1017. Conference paper (Refereed)
    Abstract [en]

    Movement disorders prevent many people from enjoying their daily lives. As with other diseases, diagnosis and analysis are key issues in treating such disorders. Computer vision-based motion capture systems are helpful tools for accomplishing this task. However, classical motion tracking systems suffer from several limitations. First, they are not cost-effective. Second, these systems cannot detect minute motions accurately. Finally, they are spatially limited to the lab environment where the system is installed. In this project, we propose an innovative solution to the above-mentioned issues. By mounting the camera on the human body, we build a convenient, low-cost motion capture system that can be used by the patient while practicing daily-life activities. We refer to this system as active motion capture, which is not confined to the lab environment. Real-time experiments in our lab revealed the robustness and accuracy of the system.

  • 2.
    Abedan Kondori, Farid
    et al.
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Yousefi, Shahrouz
    KTH Royal Institute of Technology, Department of Media Technology and Interaction Design.
    Kouma, Jean-Paul
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Liu, Li
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Li, Haibo
    KTH Royal Institute of Technology, Department of Media Technology and Interaction Design.
    Direct hand pose estimation for immersive gestural interaction (2015). In: Pattern Recognition Letters, ISSN 0167-8655, E-ISSN 1872-7344, Vol. 66, pp. 91-99. Journal article (Refereed)
    Abstract [en]

    This paper presents a novel approach for performing intuitive gesture-based interaction using depth data acquired by Kinect. The main challenge in enabling immersive gestural interaction is dynamic gesture recognition. This problem can be formulated as a combination of two tasks: gesture recognition and gesture pose estimation. Incorporating a fast and robust pose estimation method would lessen the burden to a great extent. In this paper we propose a direct method for real-time hand pose estimation. Based on the range images, a new version of the optical flow constraint equation is derived, which can be utilized to directly estimate 3D hand motion without imposing additional constraints. Extensive experiments illustrate that the proposed approach performs properly in real time with high accuracy. As a proof of concept, we demonstrate the system performance in 3D object manipulation on two different setups: desktop computing and a mobile platform. This reveals the system's capability to accommodate different interaction procedures. In addition, a user study is conducted to evaluate learnability, user experience and interaction quality in 3D gestural interaction in comparison to 2D touchscreen interaction.
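The range-image optical-flow constraint described above is, in the range-flow literature, usually written as Zx·U + Zy·V − W + Zt = 0 for a depth map Z(x, y, t) and 3D motion (U, V, W). The sketch below is a minimal illustration of that idea under a pure-translation assumption; it is not the authors' implementation, and the function name and least-squares setup are ours:

```python
import numpy as np

def estimate_translation(z_prev, z_next):
    """Least-squares estimate of a global 3D translation (U, V, W)
    from two depth frames, using the range-flow constraint
    Zx*U + Zy*V - W + Zt = 0 at every pixel."""
    zx, zy = np.gradient(z_prev)           # spatial depth gradients
    zt = z_next - z_prev                   # temporal depth derivative
    # One constraint row [Zx, Zy, -1] per pixel, right-hand side -Zt.
    A = np.stack([zx.ravel(), zy.ravel(), -np.ones(z_prev.size)], axis=1)
    b = -zt.ravel()
    motion, *_ = np.linalg.lstsq(A, b, rcond=None)
    return motion                          # [U, V, W]
```

With a curved synthetic depth surface shifted by a known small translation, the least-squares solution recovers that translation up to linearization and boundary-gradient error.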

  • 3.
    Abedan Kondori, Farid
    et al.
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Yousefi, Shahrouz
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Liu, Li
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Active human gesture capture for diagnosing and treating movement disorders (2013). Conference paper (Other academic)
    Abstract [en]

    Movement disorders prevent many people from enjoying their daily lives. As with other diseases, diagnosis and analysis are key issues in treating such disorders. Computer vision-based motion capture systems are helpful tools for accomplishing this task. However, classical motion tracking systems suffer from several limitations. First, they are not cost-effective. Second, these systems cannot detect minute motions accurately. Finally, they are spatially limited to the lab environment where the system is installed. In this project, we propose an innovative solution to the above-mentioned issues. By mounting the camera on the human body, we build a convenient, low-cost motion capture system that can be used by the patient in daily-life activities. We refer to this system as active motion capture, which is not confined to the lab environment. Real-time experiments in our lab revealed the robustness and accuracy of the system.

    Download full text (pdf)
    Active Human Gesture Capture for Diagnosing and Treating Movement Disorders
  • 4.
    Abedan Kondori, Farid
    et al.
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Yousefi, Shahrouz
    KTH Royal Institute of Technology, Department of Media Technology and Interaction Design.
    Liu, Li
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Li, Haibo
    KTH Royal Institute of Technology, Department of Media Technology and Interaction Design.
    Head operated electric wheelchair (2014). In: IEEE Southwest Symposium on Image Analysis and Interpretation (SSIAI 2014), IEEE, 2014, pp. 53-56. Conference paper (Refereed)
    Abstract [en]

    Currently, the most common way to control an electric wheelchair is to use a joystick. However, some individuals are unable to operate joystick-driven electric wheelchairs due to severe physical disabilities, such as quadriplegia. This paper proposes a novel head pose estimation method to assist such patients. Head motion parameters are employed to control and drive an electric wheelchair. We introduce a direct method for estimating user head motion, based on a sequence of range images captured by Kinect. In this work, we derive a new version of the optical flow constraint equation for range images. We show how the new equation can be used to estimate head motion directly. Experimental results reveal that the proposed system works with high accuracy in real time. We also show simulation results for navigating the electric wheelchair by recovering user head motion.

  • 5.
    Abedan Kondori, Farid
    et al.
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Yousefi, Shahrouz
    Ostovar, Ahmad
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för datavetenskap.
    Liu, Li
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Li, Haibo
    A Direct Method for 3D Hand Pose Recovery (2014). In: 22nd International Conference on Pattern Recognition, 2014, pp. 345-350. Conference paper (Refereed)
    Abstract [en]

    This paper presents a novel approach for performing intuitive 3D gesture-based interaction using depth data acquired by Kinect. Unlike current depth-based systems that focus only on the classical gesture recognition problem, we also consider 3D gesture pose estimation for creating immersive gestural interaction. In this paper, we formulate the gesture-based interaction system as a combination of two separate problems, gesture recognition and gesture pose estimation. We focus on the second problem and propose a direct method for recovering hand motion parameters. Based on the range images, a new version of the optical flow constraint equation is derived, which can be utilized to directly estimate 3D hand motion without imposing additional constraints. Our experiments illustrate that the proposed approach performs properly in real time with high accuracy. As a proof of concept, we demonstrate the system performance in 3D object manipulation. This application is intended to explore the system's capabilities in real-time biomedical applications. Finally, a system usability test is conducted to evaluate the learnability, user experience and interaction quality of 3D interaction in comparison to 2D touch-screen interaction.

  • 6. Chen, Mengying
    et al.
    Shi, Xiaoyan
    Chen, Yinhua
    Cao, Zhaolan
    Cheng, Rui
    Xu, Yuxiang
    Liu, Li
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Li, Xiaonan
    A prospective study of pain experience in a neonatal intensive care unit of China (2012). In: The Clinical Journal of Pain, ISSN 0749-8047, E-ISSN 1536-5409, Vol. 28, no. 8, pp. 700-704. Journal article (Refereed)
    Abstract [en]

    Objectives: To assess pain burden in neonates during their hospitalization in China and thus provide evidence for the necessity of neonatal pain management. Patients and Methods: The Neonatal Facial Coding System was used to evaluate pain in neonates. We prospectively collected data on all painful procedures performed on 108 neonates (term, 62; preterm, 46) recruited from admission to discharge in a neonatal intensive care unit of a university-affiliated hospital in China. Results: We found that during hospitalization each preterm and term neonate was exposed to a median of 100.0 (range, 11 to 544) and 56.5 (range, 12 to 249) painful procedures, respectively. Most of the painful procedures were performed within the first 3 days. Preterm neonates, especially those born at 28 and 29 weeks' gestational age, experienced more pain than those born at 30 weeks' gestation or later (P < 0.001). Among these painful procedures, tracheal aspiration was the most frequently performed on preterm neonates, and intravenous cannulation was the most common for term neonates. Moreover, tracheal intubation and femoral venous puncture were found to be the most painful. Notably, none of the painful procedures was accompanied by analgesia. Conclusions: Neonates, particularly preterm neonates, were exposed to numerous invasive painful procedures without appropriate analgesia in hospitals in China. The potential long-term impacts of poorly treated pain in neonates call for a change in pediatric practice in China and in countries with similar practices.

  • 7. Chen, Mengying
    et al.
    Xia, Dongqing
    Min, Cuiting
    Zhao, Xiaoke
    Chen, Yinhua
    Liu, Li
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Li, Xiaonan
    Neonatal repetitive pain in rats leads to impaired spatial learning and dysregulated hypothalamic-pituitary-adrenal axis function in later life (2016). In: Scientific Reports, E-ISSN 2045-2322, Vol. 6, article id 39159. Journal article (Refereed)
    Abstract [en]

    Preterm birth is a major health issue. As part of their life-saving care, most preterm infants require hospitalization and are inevitably exposed to repetitive skin-breaking procedures. The long-term effects of neonatal repetitive pain on cognitive and emotional behaviors involving hypothalamic-pituitary-adrenal (HPA) axis function in young and adult rats are unknown. From P8 to P85, mechanical hypersensitivity of the bilateral hindpaws was observed in the Needle group (P < 0.001). Compared with the Tactile group, the Needle group took longer to find the platform on P30 than on P29 (P = 0.03), with a decreased number of original platform site crossings during the probe trial of the Morris water maze test (P = 0.026). Moreover, the Needle group spent more time and travelled longer distances in the central area than the Tactile group in the Open-field test, both in prepubertal and adult rats (P < 0.05). The HPA axis function in the Needle group differed from that of the Tactile group (P < 0.05), with decreased stress responsiveness in prepuberty and puberty (P < 0.05) and increased stress responsiveness in adulthood (P < 0.05). This study indicates that repetitive pain occurring during a critical period may cause severe consequences, with behavioral and neuroendocrine disturbances developing through prepuberty to adult life.

    Download full text (pdf)
    fulltext
  • 8.
    Fahlquist, Karin
    et al.
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Karlsson, Johannes
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Ren, Keni
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Liu, Li
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Li, Haibo
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Ur-Rehman, Shafiq
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Wark, Tim
    CSIRO.
    Human animal machine interaction: Animal behavior awareness and digital experience (2010). In: Proceedings of ACM Multimedia 2010 - Brave New Ideas, 25-29 October 2010, Firenze, Italy, 2010, pp. 1269-1274. Conference paper (Refereed)
    Abstract [en]

    This paper proposes an intuitive wireless sensor/actuator-based communication network for human-animal interaction in a digital zoo. In order to enable effective observation of and control over wildlife, we have built a wireless sensor network: 25 video-transmitting nodes are installed for animal behavior observation, and experimental vibrotactile collars have been designed for effective control in an animal park.

    The goal of our research is twofold. Firstly, to provide interaction between digital users and animals, and to monitor animal behavior for safety purposes. Secondly, to investigate how animals can be controlled or trained using vibrotactile stimuli instead of electric stimuli.

    We have designed a multimedia sensor network for human-animal-machine interaction and evaluated the effect of the human-animal-machine state communication model in field experiments.

  • 9.
    Kondori, Farid Abedan
    et al.
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Liu, Li
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Li, Haibo
    KTH Royal Institute of Technology, Department of Media Technology and Interaction Design, School of Computer Science and Communication.
    Telelife: An immersive media experience for rehabilitation (2014). In: Proceedings of the 2014 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA 2014), IEEE, 2014. Conference paper (Refereed)
    Abstract [en]

    In recent years, the emergence of telerehabilitation systems for home-based therapy has altered healthcare systems. Telerehabilitation enables therapists to observe patients' status via the Internet, so a patient does not have to visit rehabilitation facilities for every rehabilitation session. Although telerehabilitation provides great opportunities, two major issues affect its effectiveness: the confinement of the patient at home, and the loss of direct supervision by the therapist. Since patients have no actual interaction with other persons during the rehabilitation period, they become isolated and gradually lose their social skills. Moreover, without direct supervision by therapists, rehabilitation exercises can be performed with poor compensation strategies that lead to a low-quality recovery. To resolve these issues, we propose telelife, a new concept for future rehabilitation systems. The idea is to use media technology to create a totally new immersive media experience for rehabilitation. In telerehabilitation, patients locally execute exercises and therapists remotely monitor the patients' status. In telelife, by contrast, patients remotely perform exercises and therapists locally monitor them. Thus, telelife not only enables rehabilitation at a distance but also improves patients' social competences and provides direct supervision by therapists. In this paper we introduce telelife to enhance telerehabilitation, and investigate technical challenges and possible methods to achieve it.

  • 10.
    Kouma, Jean-Paul
    et al.
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Liu, Li
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Internet of Food (2011). In: 2011 IEEE International Conferences on Internet of Things, and Cyber, Physical and Social Computing / [ed] Feng Xia, Zhikui Chen, Gang Pan, Laurence T. Yang, and Jianhua Ma, Los Alamitos: IEEE Computer Society, 2011, pp. 713-716. Conference paper (Refereed)
    Abstract [en]

    Today, more and more "things" have an IP identity and are thus able to communicate with each other through the Internet. The IP identity of things (or the Internet of Things) refers to uniquely identifiable objects and their virtual representation in an internet-like structure. The paper describes the impact the Internet of Things has on the technology roadmap over time. The author asks how eating habits would be affected if food, too, were equipped with an IP identity.

  • 11.
    Li, Fei
    et al.
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik. Institute of Signal Processing and Transmission, Nanjing University of Posts and Telecommunication, Nanjing, China.
    Liu, Li
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Innovation pedagogy and improving quality in higher education in web-based environments (2011). In: 2011 IEEE 3rd International Conference on Communication Software and Networks, IEEE, 2011, pp. 215-218, article id 6013812. Conference paper (Refereed)
    Abstract [en]

    Innovation is one of the main objectives of higher education in meeting global challenges and delivering high levels of sustainable, knowledge-based growth. It is crucial to promote innovative pedagogy and improve quality. Based on a study of teaching programmes, pedagogies, the web-based course management system and the quality assurance system in engineering education at Umeå University, this paper presents an innovative approach used in teaching at Umeå University: the Outcome-Based Approach. The pedagogical experimentation undertaken by faculty teaching signals and systems is described, and the areas where the approach is successful are highlighted. The process of periodic review and revision of curricula is illustrated, and the systematic quality work at Umeå University is described. Finally, some lessons are drawn.

  • 12.
    Li, Fei
    et al.
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik. Key Lab of Broadband Wireless Communication and Sensor Network Technology, Nanjing University of Posts and Telecommunication, Ministry of Education, Nanjing, China.
    Zhou, Lizhi
    Key Lab of Broadband Wireless Communication and Sensor Network Technology, Nanjing University of Posts and Telecommunication, Ministry of Education, Nanjing, China.
    Liu, Li
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Li, Haibo
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    A quantum search based signal detection for MIMO-OFDM systems (2011). In: 18th International Conference on Telecommunications, ICT 2011, IEEE, 2011, pp. 276-281, article id 5898934. Conference paper (Refereed)
    Abstract [en]

    Multiple input multiple output-orthogonal frequency division multiplexing (MIMO-OFDM) is considered a candidate for future broadband wireless services. In this paper, a novel signal detection scheme based on Grover's quantum search algorithm is proposed for MIMO-OFDM systems. Grover's quantum search algorithm is based on the concepts and principles of quantum computing, such as the quantum bit, the quantum register and quantum parallelism. The theoretical basis of Grover's algorithm is analyzed and its performance is evaluated. A novel signal detector based on Grover's algorithm (GD) for MIMO-OFDM systems is proposed. The simulation results show that the proposed detector achieves a lower bit error rate than the MMSE detector and the VBLAST-MMSE detector. The performance of the proposed GD detector is close to optimal when the failure probability is 0.001. When the failure probability is 0.00001, the performance of the GD detector declines; in this case, our proposed improved Grover's-algorithm-based detector (IGD) remains close to the optimal ML detector. The complexity of GD and IGD is O(√N), which is much lower than that of the classical ML detector, whose complexity is O(N).
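As background, Grover's search itself is easy to simulate classically for small N. The sketch below is our illustration of the standard algorithm (not the paper's detector): it applies the oracle-plus-diffusion iteration ⌊(π/4)√N⌋ times and shows the measurement probability concentrating on the marked state.

```python
import numpy as np

def grover_search(n_qubits, marked):
    """Simulate Grover's search over N = 2**n_qubits basis states and
    return the final measurement probability for each state."""
    N = 2 ** n_qubits
    state = np.full(N, 1 / np.sqrt(N))        # uniform superposition
    iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))
    for _ in range(iterations):
        state[marked] *= -1                   # oracle: flip marked amplitude
        mean = state.mean()
        state = 2 * mean - state              # diffusion: inversion about the mean
    return state ** 2                         # measurement probabilities
```

For N = 8 (two iterations), the marked state ends up with probability about 0.945, which is the sin²((2k+1)θ) value the theory predicts.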

  • 13.
    Li, Liu
    et al.
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    ur Réhman, Shafiq
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Lindahl, Olof
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Li, Haibo
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Vibrotactile chair: A social interface for the blind (2006). In: Proceedings SSBA 2006: Symposium on image analysis, Umeå, March 16-17, 2006 / [ed] Fredrik Georgsson, Niclas Börlin, Umeå: Umeå universitet, Institutionen för datavetenskap, 2006, pp. 117-120. Conference paper (Other academic)
    Abstract [en]

    In this paper we present our vibrotactile chair, a social interface for the blind. With this chair, a blind user can get on-line emotion information from the person he or she is facing. This greatly enhances communication ability and improves the quality of social life of the blind. In the paper we discuss the technical challenges and design principles behind the chair, and introduce the experimental platform: the tactile facial expression appearance recognition system (TEARS).

  • 14.
    Lu, Guanming
    et al.
    College of Telecommunications & Information Engineering, Nanjing University of Posts and Telecommunications, Nanjing, China.
    Shi, Wanwan
    College of Telecommunications & Information Engineering, Nanjing University of Posts and Telecommunications, Nanjing, China.
    Li, Xu
    College of Telecommunications & Information Engineering, Nanjing University of Posts and Telecommunications, Nanjing, China.
    Li, Xiaonan
    Nanjing Children's Hospital Affiliated to Nanjing Medical University, Nanjing, China.
    Chen, Mengying
    Nanjing Children's Hospital Affiliated to Nanjing Medical University, Nanjing, China.
    Liu, Li
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Recognition method for neonatal pain expression based on LBP feature and sparse representation (2015). In: Journal of Nanjing University of Posts and Telecommunications, ISSN 1673-5439, Vol. 35, no. 1, pp. 19-25. Journal article (Refereed)
    Abstract [en]

    Facial expressions are considered a reliable indicator in neonatal pain assessment. This paper proposes a novel recognition method for neonatal pain expression. The method utilizes feature descriptors based on the weighted local binary pattern (LBP) and a classifier based on sparse representation. First, the normalized facial image is described using a feature vector, which is a histogram sequence obtained by concatenating the weighted histograms of the LBP feature maps of all the local blocks. Then, the principal component analysis (PCA) method is used to reduce the dimensionality of the feature vectors of the training and test samples. Finally, an over-complete dictionary is built and the classifier based on sparse representation is used to classify test samples into four classes of facial expression: calm, crying, mild pain, and severe pain. The objective of this study is to assist clinicians in assessing neonatal pain by utilizing computer-based image analysis techniques. Experimental results on a neonatal facial image database show the effectiveness of the proposed method, with a classification accuracy of 84.50%.
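As background, the basic (unweighted, single-block) LBP descriptor that this pipeline builds on can be sketched as follows; this is a generic illustration of the standard 3×3 operator, not the paper's weighted-block implementation:

```python
import numpy as np

def lbp_histogram(img):
    """Basic 3x3 local binary pattern: threshold each interior pixel's
    eight neighbours against the centre, pack the bits into a code in
    0..255, and return the normalised 256-bin code histogram."""
    c = img[1:-1, 1:-1]                       # centre pixels
    # Eight neighbours, clockwise from top-left; bit weights 1, 2, ..., 128.
    shifts = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
              (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(c, dtype=np.uint8)
    for bit, (dr, dc) in enumerate(shifts):
        neigh = img[1 + dr:img.shape[0] - 1 + dr,
                    1 + dc:img.shape[1] - 1 + dc]
        codes |= (neigh >= c).astype(np.uint8) << bit
    hist = np.bincount(codes.ravel(), minlength=256).astype(float)
    return hist / hist.sum()
```

The paper's descriptor would compute such histograms per local block, weight them, and concatenate; a flat image maps every pixel to code 255, which is a quick sanity check.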

  • 15.
    ur Rehman, Shafiq
    et al.
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Liu, Li
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    iFeeling: vibrotactile rendering of human emotions on mobile phones (2010). In: Mobile multimedia processing: fundamentals, methods, and applications, Springer, 2010, pp. 1-20. Conference paper (Refereed)
    Abstract [en]

    Today, mobile phone technology is mature enough to enable us to interact effectively with mobile phones using three of our major senses, namely vision, hearing and touch. Like the camera, which adds interest and utility to the mobile experience, the vibration motor in a mobile phone gives us a new possibility to improve the interactivity and usability of mobile phones. In this chapter, we show that by carefully controlling vibration patterns, more than one bit of information can be rendered with a vibration motor. We demonstrate how to turn a mobile phone into a social interface for the blind so that they can sense the emotional information of others. Technical details are given on how to extract emotional information, design vibrotactile coding schemes, and render vibrotactile patterns, as well as how to carry out user tests to evaluate usability. Experimental studies and user tests have shown that users do receive and interpret more than one bit of emotional information. This shows the potential to enrich communication among mobile phone users through the touch channel.
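The "more than one bit" claim amounts to assigning each emotion class its own distinguishable vibration pattern. A toy sketch of such a coding scheme follows; the pattern table and function are entirely hypothetical, not the chapter's actual design:

```python
# Hypothetical coding table: each emotion gets a distinct on/off vibration
# pattern (durations in milliseconds), so the single motor conveys more
# than one bit per rendered pattern.
PATTERNS = {
    "happy":    [(80, 40)] * 3,         # three short pulses
    "sad":      [(400, 200)],           # one long pulse
    "surprise": [(40, 40), (200, 100)], # short pulse then long pulse
    "neutral":  [(120, 120)] * 2,       # two medium pulses
}

def render(emotion):
    """Flatten an emotion's pattern into the on/off schedule (in ms)
    that a vibration motor driver would play back."""
    schedule = []
    for on_ms, off_ms in PATTERNS[emotion]:
        schedule += [on_ms, off_ms]
    return schedule
```

Distinguishability of the patterns (every emotion maps to a different schedule) is exactly what the chapter's user tests evaluate perceptually.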

  • 16.
    ur Réhman, Shafiq
    et al.
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Khan, Muhammad Sikandar Lal
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Liu, Li
    Nanjing University of Posts and Telecommunications, Nanjing, China.
    Li, Haibo
    Media technology and interaction design, Royal Institute of Technology (KTH), Sweden; Nanjing University of Posts and Telecommunications, Nanjing, China.
    Vibrotactile TV for immersive experience (2014). In: Signal and Information Processing Association Annual Summit and Conference (APSIPA), 2014 Asia-Pacific, 2014. Conference paper (Refereed)
    Abstract [en]

    Audio and video are two powerful media forms for shortening the distance between the audience and the actors or players in TV and film. Recent research shows that people today consume more and more multimedia content on mobile devices, such as tablets and smartphones. Therefore, an important question emerges: how can we render high-quality, personal immersive experiences to consumers on these systems? To give the audience an immersive engagement that differs from 'watching a play', we have designed a study that renders fully immersive media, including 'emotional information', through augmented vibrotactile coding on the back of the user along with the audio-video signal. The reported emotional responses to videos viewed with and without haptic enhancement show that participants exhibited an increased emotional response to media with haptic enhancement. Overall, these studies suggest that our multisensory approach is effective and increases immersion and user satisfaction.

  • 17.
    ur Réhman, Shafiq
    et al.
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Liu, Li
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    iFeeling: Vibrotactile rendering of human emotions on mobile phones (2010). In: Mobile multimedia processing: fundamentals, methods, and applications / [ed] Xiaoyi Jiang, Matthew Y. Ma, Chang Wen Chen, Heidelberg, Germany: Springer Berlin, 2010, 1st edition, pp. 1-20. Book chapter (Other academic)
    Abstract [en]

    Today, mobile phone technology is mature enough to enable us to interact effectively with mobile phones using three of our major senses, namely vision, hearing and touch. Like the camera, which adds interest and utility to the mobile experience, the vibration motor in a mobile phone gives us a new possibility to improve the interactivity and usability of mobile phones. In this chapter, we show that by carefully controlling vibration patterns, more than one bit of information can be rendered with a vibration motor. We demonstrate how to turn a mobile phone into a social interface for the blind so that they can sense the emotional information of others. Technical details are given on how to extract emotional information, design vibrotactile coding schemes, and render vibrotactile patterns, as well as how to carry out user tests to evaluate usability. Experimental studies and user tests have shown that users do receive and interpret more than one bit of emotional information. This shows the potential to enrich communication among mobile phone users through the touch channel.

  • 18.
    ur Réhman, Shafiq
    et al.
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Liu, Li
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Real-time lip tracking for emotion understanding (2006). In: Swedish Symposium on Image Analysis: Symposium on image analysis, Umeå, March 16-17, 2006 / [ed] Fredrik Georgsson, Niclas Börlin, Umeå: Umeå universitet, Institutionen för datavetenskap, 2006, pp. 29-32. Conference paper (Other academic)
  • 19.
    ur Réhman, Shafiq
    et al.
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Liu, Li
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Sensing expressive lips with a mobile phone2008Ingår i: Proceedings of the 1st international workshop on mobile multimedia processing: In conjunction with 19th international conference on pattern recognition, Tampa, Florida, USA, 2008Konferensbidrag (Refereegranskat)
    Abstract [en]

    Considering the potential benefits of vibrations in mobile phones, we propose an intuitive method to render human emotions for the visually impaired. A mobile phone is "synchronized" with emotional information extracted from human lip dynamics. By holding the mobile phone, the user is able to get on-line emotion information about others. Experimental results based on a usability evaluation of the system are encouraging. The user studies show perfect pattern-recognition accuracy on the designed vibration patterns.

  • 20.
    ur Réhman, Shafiq
    et al.
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Liu, Li
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Vibrotactile emotions on a mobile phone2008Ingår i: Proceedings of the 4th IEEE international conference on signal image technology and internet based systems (SITIS '08), 2008, s. 239-243Konferensbidrag (Refereegranskat)
    Abstract [en]

    Adding vibrotactile sensations to sight and sound cues on mobile handsets makes content more realistic and fun, and operation more intuitive. An intuitive method to render human facial features for the visually impaired is evaluated for effectiveness. In the current system, the vibration of a mobile phone is used to provide emotion information based on the shape of the human lips. Using the designed vibration patterns, users are able to tactually perceive the emotion information of others. The evaluation of the current system's efficiency based on usability guidelines shows almost perfect recognition accuracy of emotions based on facial features.

  • 21.
    ur Réhman, Shafiq
    et al.
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Liu, Li
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Vibrotactile rendering of human emotions on the manifold of facial expressions2008Ingår i: Journal of Multimedia, E-ISSN 1796-2048, Vol. 3, nr 3, s. 18-25Artikel i tidskrift (Refereegranskat)
    Abstract [en]

    Facial expressions play an important role in everyday social interaction. To enhance the daily life experience for the visually impaired, we present the Facial Expression Appearance vibroTactile System (FEATS), which uses a vibrotactile chair as a social interface for the visually impaired. An array of vibrating motors is mounted spatially on the back of an office chair. The Locally Linear Embedding (LLE) algorithm is extended to compute the manifold of facial expressions, which is used to control the vibration of the motors to render emotions. Thus, the chair can provide the visually impaired with on-line dynamic emotion information about the person he/she is communicating with. A usability evaluation of the system was carried out. The results are encouraging and demonstrate usability for the visually impaired. The user studies show that perfect recognition accuracy of emotion type is achieved by the FEATS.

  • 22.
    ur Réhman, Shafiq
    et al.
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Liu, Li
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Li, Haibo
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Facial expression appearance for tactile perception of emotions2007Ingår i: Proceedings of Swedish symposium on image analysis, 2007, 2007, s. 157-160Konferensbidrag (Refereegranskat)
    Abstract [en]

    To enhance the daily life experience for visually challenged persons, the Facial Expression Appearance for Tactile system is proposed. The manifold of facial expressions is used for tactile perception. The Locally Linear Embedding (LLE) coding algorithm is implemented for the tactile display and extended to handle real-time video coding. A vibrotactile chair is used as a social interface for the blind to display the facial expression. The chair provides the visually impaired with on-line emotion information about the person he/she is approaching. The preliminary results are encouraging and show that the system greatly enhances the communication ability of the visually impaired person.

  • 23.
    ur Réhman, Shafiq
    et al.
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Liu, Li
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Li, Haibo
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    How to use manual labelers in evaluation of lip analysis systems?2009Ingår i: Visual speech recognition: Lip segmentation and mapping / [ed] Shilin W & Alan Liew, USA: IGI Global , 2009, s. 239-259Kapitel i bok, del av antologi (Övrigt vetenskapligt)
    Abstract [en]

    The purpose of this chapter is not to describe any lip analysis algorithms but rather to discuss some of the issues involved in evaluating and calibrating labeled lip features from human operators. In the chapter we question the common practice in the field: using manual lip labels directly as the ground truth for the evaluation of lip analysis algorithms. Our empirical results using an Expectation-Maximization procedure show that subjective noise in manual labelers can be quite significant in terms of quantifying both human and algorithm extraction performance. To train and evaluate a lip analysis system, one can measure the performance of human operators and infer the "ground truth" from the manual labels simultaneously.
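
    A minimal sketch of the kind of EM procedure the abstract describes, under the assumption that each labeler marks the same scalar feature position with labeler-specific Gaussian noise; the latent "ground truth" and the per-labeler noise variances are then estimated jointly. This is an illustrative stand-in, not the chapter's exact formulation:

    ```python
    # Hedged sketch: EM-like inference of a latent "ground truth" value and
    # per-labeler noise from repeated manual annotations. Assumed model:
    # label[j][i] = truth[i] + Gaussian noise with labeler-specific variance.

    def em_ground_truth(labels, iters=50):
        """labels[j][i]: labeler j's annotation of item i."""
        n_labelers = len(labels)
        n_items = len(labels[0])
        var = [1.0] * n_labelers          # per-labeler noise variance
        truth = [0.0] * n_items
        for _ in range(iters):
            # E-step-like update: precision-weighted mean per item.
            for i in range(n_items):
                w = [1.0 / var[j] for j in range(n_labelers)]
                truth[i] = sum(w[j] * labels[j][i]
                               for j in range(n_labelers)) / sum(w)
            # M-step: each labeler's variance from residuals vs. the estimate.
            for j in range(n_labelers):
                resid = [(labels[j][i] - truth[i]) ** 2 for i in range(n_items)]
                var[j] = max(sum(resid) / n_items, 1e-9)  # floor avoids /0
        return truth, var
    ```

    Noisier labelers end up with larger estimated variance and therefore less weight in the ground-truth estimate, which is the sense in which labeler performance and the ground truth are measured simultaneously.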

  • 24.
    ur Réhman, Shafiq
    et al.
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Liu, Li
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Li, Haibo
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Lip segmentation: performance evaluation criteria2006Rapport (Övrigt vetenskapligt)
    Abstract [en]

    In this work, we determine a measure to compare and evaluate the performance of lip detection techniques. Despite the number of methods used for lip detection/localization, a reliable method for comparing and determining the quality of the results is still missing. The proposed criterion ensures a clear and fair way to report results, so that reported results are comparable and measurable, to enhance the quality of lip detection and/or reduce the error rate. After applying the EM-like algorithm, it is shown that the performance of a specific technique can be evaluated.

  • 25.
    ur Réhman, Shafiq
    et al.
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Liu, Li
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Li, Haibo
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Lipless tracking and emotion estimation2007Ingår i: Proceedings of IEEE 3rd International Conference on Signal Image Technology & Internet based Systems, Shanghai, China: IEEE, 2007, s. 768-774Konferensbidrag (Refereegranskat)
    Abstract [en]

    Automatic human lip tracking is one of the key components of many facial image analysis tasks, such as lip-reading and emotion estimation from lips. It has been a classically hard image analysis problem for decades. In this paper, we propose an indirect lip tracking strategy: 'lipless tracking'. It is based on the observation that many of us don't have clear lips and some even don't have visible lips. The strategy is to select and localize stable lip features around the mouth for tracking. For this purpose, deformable contour-segments are modelled based on lip features, and tracking is done using dynamic programming and the Viterbi algorithm. The strength of the proposed algorithm is demonstrated in the emotion estimation domain. Finally, real-time video experiments performed on private and publicly available data sets (the MMI face database) have shown the robustness of the proposed lipless tracking technique.
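
    The dynamic-programming step mentioned in the abstract can be sketched as a standard Viterbi search over per-frame candidate positions. This is a generic illustration with simple unary and pairwise costs, not the paper's contour-segment model:

    ```python
    # Hedged sketch: Viterbi-style dynamic programming for tracking, choosing
    # one candidate per frame to minimize observation cost plus a smoothness
    # penalty. Costs and candidates are illustrative, not the paper's model.

    def viterbi_track(candidates, obs_cost, smooth=1.0):
        """candidates[t]: list of candidate positions at frame t.
        obs_cost(t, x): how poorly candidate x matches frame t's features."""
        T = len(candidates)
        cost = [[obs_cost(0, x) for x in candidates[0]]]
        back = []
        for t in range(1, T):
            row, ptr = [], []
            for x in candidates[t]:
                best_j, best_c = 0, float("inf")
                for j, xp in enumerate(candidates[t - 1]):
                    c = cost[t - 1][j] + smooth * abs(x - xp)
                    if c < best_c:
                        best_j, best_c = j, c
                row.append(best_c + obs_cost(t, x))
                ptr.append(best_j)
            cost.append(row)
            back.append(ptr)
        # Backtrack from the cheapest final state.
        j = min(range(len(candidates[-1])), key=lambda k: cost[-1][k])
        path = [candidates[-1][j]]
        for t in range(T - 2, -1, -1):
            j = back[t][j]
            path.append(candidates[t][j])
        return path[::-1]
    ```

    The smoothness weight trades off feature evidence against temporal coherence, which is what makes the tracker robust to frames where the lip features are weak.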

  • 26.
    ur Réhman, Shafiq
    et al.
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Liu, Li
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Li, Haibo
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Tactile car warning system2005Ingår i: Proceedings of first joint Euro Haptics conference and symposium on haptic interfaces for virtual environment and teleoperator systems, Pisa, Italy: IEEE , 2005Konferensbidrag (Övrigt vetenskapligt)
    Abstract [en]

    Driving on a busy road is a critical job; drivers need to combine all their senses to handle upcoming events and situations. According to a recent survey, pedestrian-related accidents represent a huge portion of traffic accidents in the EU: more than 200,000 pedestrians are injured and about 9,000 are killed in accidents yearly. An enormous amount of research has been done on the detection of pedestrians from a moving platform using different image processing techniques, such as shape/texture-based methods. Recently, Guilloux and his colleagues pointed out the advantages of using infrared cameras, and a few pedestrian detection systems using infrared video sequences have been experimented with as well. The possibilities of using the human hands as tactile sensory input have been explored by researchers in order to obtain the precise knowledge needed for building tactual displays, and a number of vibrotactile devices have recently become accessible for experimental as well as commercial purposes. We present a driver assistant system that provides a tactual alert on detection of pedestrians. Different issues regarding the development of the driver assistant program are considered. Template-matching-based pedestrian detection in infrared videos is performed. Finally, a 'driver assistant system' experiment is presented.
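
    Template matching of the kind mentioned in the abstract can be sketched as a sum-of-absolute-differences (SAD) search over an intensity image. This is a toy illustration; the paper's infrared pipeline and pedestrian templates are not reproduced here:

    ```python
    # Hedged sketch: sliding-window template matching by sum of absolute
    # differences (SAD). Real infrared pedestrian templates would replace
    # the toy arrays used in testing.

    def sad_match(image, template):
        """Return (row, col) of the window with the lowest SAD score."""
        H, W = len(image), len(image[0])
        h, w = len(template), len(template[0])
        best, best_pos = float("inf"), (0, 0)
        for r in range(H - h + 1):
            for c in range(W - w + 1):
                score = sum(
                    abs(image[r + i][c + j] - template[i][j])
                    for i in range(h) for j in range(w)
                )
                if score < best:
                    best, best_pos = score, (r, c)
        return best_pos
    ```

    In an infrared frame, warm pedestrians appear as bright blobs, which is why intensity-based matching of this kind is feasible at all; a production system would use a pyramid of scales and a detection threshold rather than a single best window.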

  • 27.
    ur Réhman, Shafiq
    et al.
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Raytchev, Bisser
    Department of Information Engineering, Hiroshima University, Japan.
    Yoda, Ikushi
    Information Technology Research Institute (ITRI), National Institute of Advanced Industrial Science and Technology (AIST), Japan.
    Liu, Li
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Vibrotactile rendering of head gestures for controlling electric wheelchair2009Ingår i: Proceedings of IEEE international conference on systems, man and cybernetics, San Antonio, Texas, USA: IEEE , 2009, s. 413-417Konferensbidrag (Refereegranskat)
    Abstract [en]

    We have developed a head-gesture-controlled electric wheelchair system to aid persons with severe disabilities. Real-time range information obtained from a stereo camera is used to locate and segment the face images of the user from the sensed video. We use an Isomap-based nonlinear manifold learning map of facial textures for head pose estimation. Our system is a non-contact vision system, making it much more convenient to use: the user is only required to gesture with his/her head to command the wheelchair. To overcome problems with a non-responding system, it is necessary to notify the user of the exact system state while the system is in use. In this paper, we explore the use of vibrotactile rendering of head gestures as feedback. Three different feedback systems are developed and tested: audio stimuli, vibrotactile stimuli, and audio plus vibrotactile stimuli. We have performed user tests to study the usability of these three display methods. The usability studies show that the method using both audio plus vibrotactile response outperforms the other methods (i.e. audio stimuli alone, vibrotactile stimuli alone).

  • 28.
    ur Réhman, Shafiq
    et al.
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Rönnbäck, Sven
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Liu, Li
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Tongue operated electric wheelchair2010Konferensbidrag (Övrigt vetenskapligt)
    Abstract [en]

    In this paper we propose a tongue-operated electric wheelchair system to aid persons with severe disabilities. Real-time tongue gestures are detected and estimated from a video camera pointing at the face of the user, and are used to control the wheelchair. Our system is a non-contact vision system, making it much more convenient to use: the user is only required to move his/her tongue to command the wheelchair. To make the system easy to drive, it is also equipped with a laser scanner for obstacle avoidance. We also mount a 2D array of vibrators on the chair to give the user feedback on his/her tongue movement and the status of the system. We are carrying out user tests to measure the usability of the system.

  • 29. Yousefi, Shahrouz
    et al.
    Li, Haibo
    Liu, Li
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    3D Gesture Analysis Using a Large-Scale Gesture Database2014Ingår i: Advances in Visual Computing: 10th International Symposium, ISVC 2014, Las Vegas, NV, USA, December 8-10, 2014, Proceedings, Part I / [ed] Bebis, G; Boyle, R; Parvin, B; Koracin, D; McMahan, R; Jerald, J; Zhang, H; Drucker, SM; Kambhamettu, C; ElChoubassi, M; Deng, Z; Carlson, M, 2014, s. 206-217Konferensbidrag (Refereegranskat)
    Abstract [en]

    3D gesture analysis is a highly desired feature of future interaction design. Specifically, in augmented environments, intuitive interaction with the physical space seems unavoidable, and 3D gestural interaction might be the most effective alternative to current input facilities. This paper introduces a novel solution for real-time 3D gesture analysis using an extremely large gesture database. This database includes images of various articulated hand gestures with annotated 3D position/orientation parameters of the hand joints. Our search algorithm is based on hierarchical scoring of low-level edge-orientation features between the query input and the database, retrieving the best match. Once the best match is found from the database in real time, the pre-calculated 3D parameters can instantly be used for gesture-based interaction.
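
    The retrieval idea in this abstract can be sketched as nearest-neighbor search over quantized edge-orientation histograms. This is a simplified stand-in for the paper's hierarchical scoring over a large database; the gesture labels and bin count are assumptions:

    ```python
    # Hedged sketch: gesture retrieval by comparing quantized edge-orientation
    # histograms. Database entries and bin count are illustrative; the paper's
    # hierarchical scoring over a very large database is not reproduced.
    import math

    def orientation_histogram(angles, bins=8):
        """Quantize edge angles (radians, in [0, pi)) into a normalized histogram."""
        hist = [0.0] * bins
        for a in angles:
            hist[min(int(a / math.pi * bins), bins - 1)] += 1.0
        total = sum(hist) or 1.0
        return [v / total for v in hist]

    def best_match(query_hist, database):
        """database: list of (gesture_params, hist). Return the params of the
        entry whose histogram intersection with the query is highest."""
        def intersection(h1, h2):
            return sum(min(a, b) for a, b in zip(h1, h2))
        return max(database, key=lambda e: intersection(query_hist, e[1]))[0]
    ```

    Because the 3D joint parameters are stored alongside each database image, retrieval doubles as pose estimation: finding the best-matching image immediately yields its pre-annotated 3D hand parameters.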

  • 30.
    Zhong, Yang
    et al.
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Liu, Li
    Umeå universitet, Teknisk-naturvetenskapliga fakulteten, Institutionen för tillämpad fysik och elektronik.
    Remote Neonatal Pain Assessment System Based on Internet of Things2011Ingår i: Internet of Things (iThings/CPSCom): 2011 International Conference on and 4th International Conference on Cyber, Physical and Social Computing, IEEE Computer Society, 2011, s. 629-633Konferensbidrag (Refereegranskat)
    Abstract [en]

    The Internet of Things has shown unique advantages for medical services in recent years. In this paper, we present a new idea for a remote pain assessment system based on the Internet of Things. The system detects pain in infants through facial expression sensing, and tunnels a comprehensive and indispensable medical service via the internet to any location. Featuring ubiquitous connectivity and ambient intelligence, the system provides a reliable real-time pain level from infants' facial expressions, and is cost efficient as well.
