Complexity Analysis of Vision Functions for implementation of Wireless Smart Cameras using System Taxonomy
Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Technology and Media. ORCID iD: 0000-0003-1923-3843
Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Technology and Media. ORCID iD: 0000-0002-6484-9260
Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Technology and Media.
Mid Sweden University, Faculty of Science, Technology and Media, Department of Information Technology and Media.
2012 (English) In: Proceedings of SPIE - The International Society for Optical Engineering, Belgium: SPIE - International Society for Optical Engineering, 2012, Art. no. 84370C. Conference paper, Published paper (Refereed)
Abstract [en]

Implementing vision systems on wireless smart cameras using embedded platforms poses a number of challenges caused by the large amount of data and the limited resources, such as memory, processing capability, energy consumption and bandwidth. Research in this field usually focuses on developing a specific solution for a particular problem. There is a need for a tool that can predict the resource requirements for the development and comparison of vision solutions in wireless smart cameras. To accelerate the development of such a tool, we have used a system taxonomy, which shows that the majority of wireless smart cameras share common functions. In this paper, we have investigated the arithmetic complexity and memory requirements of vision functions by using the system taxonomy, and we have proposed an abstract complexity model. To demonstrate the use of this model, we have analysed a number of implemented systems and shown that the complexity model, together with the system taxonomy, can be used for the comparison and generalization of vision solutions. Moreover, it will assist researchers and designers in predicting the resource requirements for different classes of vision systems in less time and with little effort.
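
The record does not reproduce the complexity model itself. As a rough illustration of the idea described above, the Python sketch below estimates the arithmetic operations and working memory of a chain of front-end vision functions from assumed per-pixel costs and the image resolution; the function names and cost figures are illustrative placeholders, not values from the paper.

    # Hypothetical sketch of a per-pixel complexity model for front-end vision
    # functions; all per-pixel costs below are illustrative assumptions.
    VISION_FUNCTIONS = {
        # name: (arithmetic ops per pixel, working memory in bytes per pixel)
        "background_subtraction": (2, 1),  # subtract + threshold, 1-byte background model
        "morphology_3x3":         (9, 1),  # 3x3 neighbourhood test on a binary image
        "segmentation_labeling":  (4, 2),  # simplified two-pass connected components
        "bi_level_coding":        (1, 0),  # run-length style coding of the binary mask
    }

    def estimate(pipeline, width, height):
        """Return (total ops per frame, peak working memory in bytes) for a pipeline."""
        pixels = width * height
        ops = sum(VISION_FUNCTIONS[f][0] for f in pipeline) * pixels
        mem = max(VISION_FUNCTIONS[f][1] for f in pipeline) * pixels
        return ops, mem

    ops, mem = estimate(
        ["background_subtraction", "morphology_3x3", "segmentation_labeling", "bi_level_coding"],
        width=640, height=480,
    )
    print(f"{ops / 1e6:.1f} M ops/frame, {mem / 1024:.0f} KiB working memory")

Per-function estimates of this kind, attached to the classes of a system taxonomy, are what allow candidate implementations to be compared before anything is built.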

Place, publisher, year, edition, pages
Belgium: SPIE - International Society for Optical Engineering, 2012. Art. no. 84370C
Series
Proceedings of SPIE, ISSN 0277-786X ; 8437
Keywords [en]
wireless smart camera, complexity analysis, system taxonomy, comparison, resource requirements
National subject category
Electrical Engineering and Electronics
Identifiers
URN: urn:nbn:se:miun:diva-16036, DOI: 10.1117/12.923797, ISI: 000305693900010, Scopus ID: 2-s2.0-84861946720, Local ID: STC, ISBN: 978-0-8194-9129-9 (print), OAI: oai:DiVA.org:miun-16036, DiVA id: diva2:513137
Conference
Real-Time Image and Video Processing 2012; Brussels; 19 April 2012 through 19 April 2012; Code 90041
Available from: 2012-03-30 Created: 2012-03-30 Last updated: 2016-10-20 Bibliographically approved
Part of thesis
1. Energy Efficient and Programmable Architecture for Wireless Vision Sensor Node
2013 (English) Doctoral thesis, comprehensive summary (Other academic)
Abstract [en]

Wireless Vision Sensor Networks (WVSNs) are an emerging field that has attracted a number of potential applications because of the small per-node cost, ease of deployment, scalability and low-power stand-alone operation. A WVSN consists of a number of wireless Vision Sensor Nodes (VSNs). A VSN has limited resources, such as its embedded processing platform, power supply, wireless radio and memory. With these limited resources, a VSN is expected to perform complex vision tasks for a long duration of time without battery replacement or recharging. Currently, reducing the processing and communication energy consumption is a major challenge for battery-operated VSNs. Another challenge is to propose generic solutions for a VSN so that they suit a number of applications.

To meet these challenges, this thesis focuses on an energy-efficient and programmable VSN architecture for machine vision systems that classify objects based on binary data. To facilitate generic solutions, a taxonomy has been developed together with a complexity model, which can be used to classify and compare systems without the need for actual implementation. The proposed VSN architecture is based on task partitioning between the VSN and a server, as well as task partitioning locally on the node between software and hardware platforms. In relation to this task partitioning, the effects on processing and communication energy consumption, design complexity and lifetime have been investigated.
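
As a rough illustration of the VSN/server partitioning trade-off mentioned above (not the thesis's actual energy model), the sketch below compares the node-side energy per frame of two extreme partitions: transmitting the raw image to the server versus running the front-end tasks locally and transmitting only a small coded result. All energy figures are assumed placeholders.

    # Hypothetical sketch of the VSN/server task-partitioning trade-off.
    # Energy-per-bit and energy-per-operation figures are placeholders.
    E_TX_PER_BIT = 200e-9   # J/bit, radio transmission (assumed)
    E_PROC_PER_OP = 1e-9    # J/operation, local processing (assumed)

    def energy_send_raw(width, height, bits_per_pixel=8):
        """Partition A: the node only captures; the server does all vision processing."""
        return width * height * bits_per_pixel * E_TX_PER_BIT

    def energy_process_locally(width, height, ops_per_pixel=16, coded_bits=4000):
        """Partition B: the node runs the front-end tasks and sends a small coded result."""
        processing = width * height * ops_per_pixel * E_PROC_PER_OP
        communication = coded_bits * E_TX_PER_BIT
        return processing + communication

    w, h = 640, 480
    print(f"send raw:        {energy_send_raw(w, h) * 1e3:.1f} mJ/frame")
    print(f"process locally: {energy_process_locally(w, h) * 1e3:.2f} mJ/frame")

Under these assumed figures, local front-end processing plus a coded output costs the node far less energy per frame than transmitting the raw image, which is the intuition behind the partitioning strategy investigated in the thesis.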

The investigation shows that the strategy in which the front-end tasks up to segmentation, together with bi-level coding, are implemented on a Field Programmable Gate Array (FPGA) with a small sleep power offers a generalized, low-complexity and energy-efficient VSN architecture. Implementing the data-intensive front-end tasks on the hardware-reconfigurable platform reduces the processing energy. However, there is still scope for reducing the communication energy related to the output data. This thesis therefore also explores data reduction techniques, including image coding, region-of-interest coding and change coding, which reduce the output data significantly.
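
The record gives no details of these coding techniques. The minimal sketch below shows one common interpretation of change coding on binary frames, namely transmitting only a run-length-coded difference between consecutive frames; this interpretation and the code are assumptions, not the thesis's implementation.

    # Minimal sketch of change coding on binary frames (assumed interpretation):
    # transmit only the pixels that differ from the previous frame, run-length coded.
    from itertools import groupby

    def change_code(prev_frame, curr_frame):
        """Run-length encode the XOR of two equal-length binary frames (flat lists of 0/1)."""
        diff = [p ^ c for p, c in zip(prev_frame, curr_frame)]
        return [(bit, len(list(run))) for bit, run in groupby(diff)]

    prev = [0, 0, 0, 1, 1, 1, 0, 0]
    curr = [0, 0, 1, 1, 1, 0, 0, 0]
    print(change_code(prev, curr))  # [(0, 2), (1, 1), (0, 2), (1, 1), (0, 2)]

When consecutive binary masks are nearly identical, the run-length-coded difference is much smaller than the mask itself, which is why change coding reduces the output data to be transmitted.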

As a proof of concept, the VSN architecture, together with task partitioning, bi-level video coding, duty cycling and a low-complexity background subtraction technique, has been implemented on real hardware, and its functionality has been verified for four applications: a particle detection system, remote meter reading, bird detection and people counting. The results, based on measured energy values, show that, depending on the application, the energy consumption can be reduced by a factor of approximately 1.5 up to 376 compared with currently published VSNs. The lifetime estimates based on measured energy values show that, for a sample period of 5 minutes, the VSN can achieve a lifetime of 3.2 years with a battery holding 37.44 kJ of energy. In addition, the proposed VSN offers a generic architecture with lower design complexity on a hardware-reconfigurable platform and is easily adapted to a number of applications compared with published systems.
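
As a sanity check on the quoted figures, lifetime is battery energy divided by average power draw. Using only the numbers reported above, 3.2 years on a 37.44 kJ battery corresponds to an average draw of roughly 0.37 mW, or about 0.11 J per 5-minute sample interval:

    # Relating the reported battery energy, lifetime and sample period.
    battery_energy_j = 37.44e3            # 37.44 kJ battery, as reported
    lifetime_s = 3.2 * 365 * 24 * 3600    # 3.2 years, as reported
    sample_period_s = 5 * 60              # 5-minute sample period, as reported

    avg_power_w = battery_energy_j / lifetime_s
    energy_per_sample_j = avg_power_w * sample_period_s

    print(f"average power:     {avg_power_w * 1e3:.2f} mW")   # ~0.37 mW
    print(f"energy per sample: {energy_per_sample_j:.3f} J")  # ~0.11 J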

Place, publisher, year, edition, pages
Sundsvall: Mid Sweden University, 2013. p. 115
Series
Mid Sweden University doctoral thesis, ISSN 1652-893X ; 167
Keywords
Wireless Vision Sensor Node, Smart camera, Wireless Vision Sensor Networks, Architecture, Video coding.
National subject category
Electrical Engineering and Electronics
Identifiers
urn:nbn:se:miun:diva-20179 (URN), STC (Local ID), 978-91-87557-12-5 (ISBN), STC (Archive number), STC (OAI)
Public defence
2013-10-22, M108, Holmgatan 10, SE-851 70 Sundsvall, 10:03 (English)
Opponent
Supervisors
Research funder
KK-stiftelsen
Available from: 2013-11-11 Created: 2013-11-11 Last updated: 2016-10-20 Bibliographically approved
