Available Mote Platforms for Wireless Image Sensors R. Zilan, J. M. Barcelo-Ordinas, B. Tavli


Abstract: The choice of a mote platform for wireless image sensor networks is currently an attractive research subject, so it is important to know which motes are available. To that end, this report presents a literature survey in Section I and a comparison of available motes in Section II.

I. Literature Survey

Downes et al. in [1] focus on the design of a new mote for distributed image sensing applications in wireless sensor networks. The study gives brief background on wireless image sensor networks and summarizes related work on image sensor motes: Panoptes, an image sensor mote with an FPGA for compression, Cyclops, and SensEye. As an image-mote requirement, it is underlined that more local processing is necessary to extract information from the data; thus a combination of a powerful, power-efficient processing unit and supporting memory should also be considered. The type of interface to the image sensor is another important point for image motes. In addition, the suitability of current mote platforms is analyzed briefly: low-bandwidth nodes (Spec, Telos, BTnode) and high-bandwidth nodes (Intel Mote2, etc.), with the Medusa MK-II between them, are compared in terms of performance. Cyclops is mentioned as a suitable image sensor, with the drawback of limiting research into cross-layer optimizations. After the currently available mote platforms are discussed, the necessity of a new mote platform for image sensors is emphasized, owing to the lack of suitable platforms. An image sensor mote requires low resolution (CIF or VGA), ease of interfacing, and low power consumption. It is also stated that a multi-tier image sensor network could be a good choice for advanced data gathering and processing.
In light of this, the authors aim to develop a wireless image sensor mote with capabilities similar to the Medusa MK-II, using IEEE 802.15.4; the goal matches that of the lowest tier of a multi-tier network. The radio choice is based not only on an individual node's performance but also on the whole network's performance. The new mote uses CIF or lower resolution (30x30). Intelligent sensor networks are the aim of further research.

Hengstler and Aghajan in [2] introduce WiSNAP, a MATLAB-based application development platform for wireless image sensor networks. MATLAB is selected as the development environment for its powerful algorithm-development features and graphical tools. WiSNAP consists of two application program interfaces: an Image Sensor API, which makes frame capture from image sensors possible, and a Wireless Mote API, which provides access to wireless motes. Available libraries provide a set of hardware-dependent functions. Two example applications are given: event detection and node localization. The first application attempts to detect the event of a changing counter reading. Event detection is performed by tracking the number of pixels that differ substantially between successive images; once this number exceeds a threshold set above the camera's noise level, the event has occurred and the algorithm can issue a notification for further action. It is difficult to match differing application requirements and platform characteristics to achieve optimal resource usage.

Beutel in [3] gives metrics for sensor network platforms. The motivation is to define suitable and useful metrics for the comparison of sensor network platforms. These metrics are given under two subtitles: general platform metrics and problematic platform metrics. General requirements and metrics are important for the assessment, evaluation, and comparison of platforms and applications.
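The event-detection scheme described above (counting pixels that differ substantially between successive frames and comparing the count against a noise-level threshold) can be sketched as follows. The function name, frame layout, and threshold values are illustrative choices, not part of the actual WiSNAP MATLAB API:

```python
# Sketch of frame-difference event detection as described for WiSNAP.
# Names and threshold values are illustrative, not WiSNAP's actual API.

def detect_event(prev_frame, curr_frame, pixel_delta=25, count_threshold=200):
    """Return True if enough pixels changed between two grayscale frames.

    pixel_delta:     per-pixel intensity change considered "substantial"
    count_threshold: number of changed pixels that signals an event
                     (set above the camera's noise level)
    """
    changed = 0
    for row_prev, row_curr in zip(prev_frame, curr_frame):
        for p, c in zip(row_prev, row_curr):
            if abs(p - c) > pixel_delta:
                changed += 1
    return changed > count_threshold

# Example: a static scene vs. a frame with a bright 16x16 blob.
static = [[10] * 32 for _ in range(32)]
moving = [row[:] for row in static]
for r in range(8, 24):
    for c in range(8, 24):
        moving[r][c] = 200          # 256 pixels change substantially

print(detect_event(static, static))  # no change -> False
print(detect_event(static, moving))  # 256 > 200 changed pixels -> True
```

Setting `count_threshold` above the sensor's noise floor is what keeps thermal noise and flicker from triggering spurious notifications.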
A small form factor, ubiquitous usage, unobtrusive application, programmability, availability, affordability, appropriate development tools, sample applications, documentation, and support are given as general WSN requirements. Power consumption and capacity are regarded as problematic platform metrics. It is also stated that there is no one-size-fits-all solution because of the large application space.

Hengstler and Aghajan in [4] propose a smart camera mote architecture designed for in-node processing, with the aim of facilitating distributed intelligent surveillance. With this application in mind, the suggested mote architecture targets sufficient processing power and an adequate vision system while minimizing component count and power consumption. Low power consumption is an important design objective for enabling mobile surveillance applications using battery-powered camera motes.

Cao, Ji, and Hu in [5] propose an image sensor node with an FPGA for compression in wireless sensor networks; the authors develop an alternative approach for an image sensor mote in which the transmission of compressed images is considered. Defining the characteristics and requirements of image sensor networks and designing a CMOS image sensor device are the motivations of the study. For these purposes, the focus is on high processing power and ample memory, real-time processing of frames, high communication bandwidth, energy efficiency in terms of computation and transmission, robust transmission and QoS, and distributed processing and collaboration.

To support mobility experiments in sensor networks, the XYZ platform [6] takes a step forward in this direction by instantiating a new sensor node platform. The need for an open-source sensing platform to support experimental research in mobile sensor networks, together with the definition of a testbed platform including the sensor, is the motivation. The requirements and characteristics of the testbed are stated as: long-term sleep modes; support for mobility; high computation and large memory for fast and flexible prototyping; multiple operational modes for designing better power-performance protocols; a wide choice of sensing peripherals close to the sensors (to avoid moving data); and cost and form factor.

Szewczyk and Culler [7] present Telos, an ultra-low-power wireless sensor mote for research and experimentation. Telos is the latest in a line of motes developed by UC Berkeley to enable wireless sensor network (WSN) research. The Telos design pursues three major goals to enable experimentation: minimal power consumption, ease of use, and increased software and hardware robustness. It uses the Chipcon CC2420 radio in the 2.4 GHz band, a wideband radio with O-QPSK modulation and DSSS at 250 kbps. The higher data rate allows shorter active periods, further reducing energy consumption.

Park and Chou in [8] introduce Eco, an ultra-wearable and expandable wireless sensor platform. It has 15% of the volume of the Mica2Dot, the smallest of the Berkeley motes. Although other platforms sacrifice either size or expandability, this platform achieves both with a novel flex-PCB extension connector for digital/analog I/O, firmware programming, and battery charging. A brief size comparison is also given with PASTA (bigger than Mica, at 6.5 x 4.5 cm), MICA, and, on the small side, Spart (17 mm x 35 mm without batteries). Eco has a 1 cm³ volume without antennas. The platform consists of the Eco node, a data aggregator, and a development/base-station board.
Experimental results show that radio performance changes depending on the antenna, and with the data rate set to 1 Mbps the measured maximum data rate is around 800 kbps. eCAM [9] consists of a video camera and an Eco wireless sensor node. It can capture 30 fps and uses a JPEG compression engine (the JPEG codec can be set with different parameters, so the compression rate can change). These cameras transmit at low resolution (320x240 or 160x128). Park and Chou propose an ultra-compact, high-data-rate wireless sensor node with a miniature camera. Besides being ultra-compact, the node has much higher performance than sensor nodes in the same spectrum. Eco is an ultra-compact, high-bandwidth wireless sensor node with a size of 13 mm x 11 mm x 7 mm; it weighs 2 grams. The study aims to show that Eco could be a good platform alternative for such small WSNs. Eco has a microcontroller, a radio, a 3-axial accelerometer, a temperature sensor, an infrared sensor, an antenna, and a battery. eCAM also has a higher capacity (around 1 Mbps, with a 10 m transmission range) than several other platforms such as ZigBee (250 kbps).

Another approach to the design of an image sensor mote is undertaken in [10], where an image sensor daughterboard is developed. This work seeks to provide sensor networks with similar capabilities by exploiting emerging, cheap, low-power, small-form-factor CMOS imaging technology. A small camera device called Cyclops is developed that bridges the gap between computationally constrained wireless sensor nodes such as motes and CMOS imagers, which, while low power and inexpensive, are nevertheless designed to mate with resource-rich hosts. Cyclops enables the development of a new class of vision applications that span wireless sensor networks. The hardware and software architectures are described. In this study, a high number of visual sensors helps overcome occlusion effects and provides multiple perspectives and closer views of the phenomena (or objects).
Besides, low-power CMOS camera modules are available, and Cyclops is an electronic interface between a camera module and a lightweight wireless host. Two application areas are explored: object detection (frame-difference thresholding) and hand-posture recognition (histogram matching against known gestures).

Feng et al. in [11] describe Panoptes, a video-based sensor networking architecture, including its design, implementation, and performance. Two video sensor platforms are described that can deliver high-quality video over 802.11 networks with a power requirement of less than 5 watts: the Applied Data Bitsy board and the Crossbow Stargate platform. The Stargate's computational power is twice that of the Bitsy, with lower power requirements. Video acquisition involves two conflicting resources: if a high frame rate is required, the video must be compressed to get the data from the camera to the main board over the USB 1.0 interface, but compression consumes the processing power of the CPU. If images are not compressed, high video frame rates cannot be achieved.

Kulkarni et al. in [12] present a multi-tier multimedia sensor network. To demonstrate its benefits, they implement a surveillance application using SensEye comprising three tasks: object detection, recognition, and tracking. Three main design principles are considered: map each task to the least powerful tier with sufficient resources, exploit wakeup-on-demand, and exploit redundancy in coverage levels. SensEye seeks to provide a low-latency yet energy-efficient camera sensing solution. Latency and energy efficiency are conflicting system goals. To achieve low-latency sensing, sensors need to detect, recognize, and track new objects as they enter and move across the field of view of the camera network, minimizing missed objects.
In contrast, energy-efficient sensing requires that sensors and nodes be switched off as much as possible (duty-cycled), which adversely impacts the latency of sensing and hence its reliability. Duty-cycling a distributed camera network incurs other sources of latency, since wakeup triggers need to be propagated across distributed sensor nodes, and operating-system latency is incurred when switching from the sleep state to the active state. One of the goals of SensEye is to illustrate these tradeoffs while demonstrating the overall benefits of the multi-tier approach. To do so, SensEye assumes a three-tier architecture.
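The latency/energy tension of duty-cycling can be illustrated with a back-of-the-envelope model: average power falls with the duty cycle, while worst-case detection latency grows with the sleep interval. The power figures and timings below are illustrative assumptions, not SensEye measurements:

```python
# Back-of-the-envelope duty-cycling tradeoff (illustrative numbers,
# not SensEye measurements): lowering the duty cycle saves energy but
# lengthens the worst-case time to notice a new object.

def duty_cycle_tradeoff(p_active_mw, p_sleep_mw, t_active_s, t_sleep_s):
    period = t_active_s + t_sleep_s
    duty = t_active_s / period
    avg_power = duty * p_active_mw + (1 - duty) * p_sleep_mw
    worst_latency = t_sleep_s      # object arrives just after the node sleeps
    return duty, avg_power, worst_latency

for t_sleep in (0.0, 1.0, 9.0):    # always-on, 50%, and 10% duty cycle
    duty, p, lat = duty_cycle_tradeoff(
        p_active_mw=60.0, p_sleep_mw=0.03, t_active_s=1.0, t_sleep_s=t_sleep)
    print(f"duty={duty:.0%}  avg power={p:.2f} mW  worst latency={lat:.0f} s")
```

The model makes the conflict explicit: a 10% duty cycle cuts average power roughly tenfold but lets an object go unnoticed for the whole sleep interval, which is exactly the gap SensEye's wakeup-on-demand lower tiers are meant to close.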

The lowest tier of SensEye consists of low-power camera sensors (e.g., Cyclops, CMUcam) attached to Crossbow motes. The second tier consists of a Logitech QuickCam Pro webcam, a Crossbow mote (for communication with tier 1), and an Intel Stargate (which communicates with tier 3 via IEEE 802.11). The third tier consists of a Sony RZ30N PTZ camera and a Mini-ITX embedded PC (which communicates with tier 2 via IEEE 802.11). A tier 0 below tier 1 is also envisioned, consisting of motes equipped with simple sensors such as vibration sensors, which detect motion at low energy cost and wake tier 1 nodes.

Campbell in [13] presents an overview of IrisNet, a sensor network architecture that enables the creation of a planetary-scale infrastructure of multimedia sensors that can be shared by a large number of applications. To ensure the efficient collection of sensor readings, IrisNet enables application-specific processing of sensor feeds on the significant computational resources typically attached to multimedia sensors. It is based on sensing agents and organizing agents and includes heterogeneous sensors. The system supplies core services that permit efficient processing and filtering of data at the source, the sharing and reuse of large sensor deployments, efficient querying and updating of collected data, replication and fault tolerance, and multi-sensor calibration. Each image is processed using an interest operator to identify regions that are highly distinctive and stable under local changes. To find stable local optima in the scale space of the image, IrisNet uses the scale-invariant feature transform (SIFT) interest operator.

Meerkats [14] is a wireless network of battery-operated camera nodes that can be used for monitoring and surveillance of wide areas.
The study describes the Meerkats architecture, including a number of application-level visual sensor acquisition and processing techniques, such as image acquisition policies; visual analysis for event detection, parameter estimation, and hierarchical representation; resource-management strategies that dynamically weigh power against application-level requirements to decide which tasks the system performs; and network-level techniques for bandwidth- and power-adaptive routing as well as media scaling. The study presents results from the power-consumption characterization of the Meerkats sensor node, which is based on the Crossbow Stargate. Each Meerkats node runs all of the Meerkats components. The Visual Processing (VP) module is responsible for detecting and characterizing events within a camera's field of view (FOV). It also performs collaborative sensor processing, in which information from multiple cameras is fused for more accurate event detection.

Falchi et al. give a comparison of Berkeley motes in [15]. The Mica2 and Mica2Dot are compared; as can be seen in Section II, the Mica2Dot consumes less energy than the Mica2, and only at one idle level do both consume equal power.

Canbolat et al. in [16] note that using a number of small cameras instead of one big camera lowers the cost of implementation. Coding concepts such as multi-terminal coding are applied in sensor networks to make the system efficient. The demonstration includes taking images with two cameras and seeing how efficiently they can be transmitted to the base station; coding strategies are then applied to reduce the rate-distortion of the overall system. Although imaging is an information-rich sensing modality, the use of cameras in sensor networks is very often prohibited by factors such as power, computation cost, storage, communication bandwidth, and privacy.
Teixeira et al. in [17] build an address-event imager (dedicated to sensor networks) that detects image features directly from the image; these features are the data transmitted to the network. Instead of providing full images with a high degree of redundancy, the design effort for these imagers specializes in selecting a handful of features from a scene and outputting them in address-event representation. A lightweight classification scheme is also presented to illustrate the computational advantages of address-event sensors. The paper concludes with an evaluation of the classification algorithm and a feasibility study on using COTS components to emulate address-event sensing inside a sensor network.

Kling et al. compare Intel motes in [18] in terms of increased CPU performance and memory capacity, improved radio bandwidth and reliability, as well as cost-effectiveness. They also describe details of the new platform architectures and practical experiences with the new motes regarding CPU and radio performance, networking algorithms, and battery life.

One of the main challenges posed by wireless visual sensor networks is the constant tension between power conservation and performance. Margi et al. in [19] characterize the energy consumption of a visual sensor network testbed. Each node in the testbed consists of a single-board computer, namely Crossbow's Stargate, equipped with a wireless network card and a webcam. They assess the energy consumption of activities representative of the target application using a benchmark that runs basic tasks such as processing, flash memory access, image acquisition, and communication over the network.

Rowe et al. in [20] design a CMOS-based image sensor device that includes a set of small image-processing applications (e.g., color blob tracking). The frame rate is 16.7 fps.
The system utilizes a low-cost CMOS color camera module, and all image data are processed by a high-speed, low-cost microcontroller. This eliminates the need for the separate frame grabber and high-speed host computer typically found in traditional vision systems. The resulting embedded system makes it possible to apply simple color vision algorithms in applications such as small mobile robotics, where a traditional vision system would not be practical. Although there are many examples in the literature, the use of these algorithms is often limited by the cost and complexity of the hardware needed to implement them. Such systems traditionally consist of a camera, a frame grabber, and an associated computer to interface with the frame grabber and execute the algorithm. Recent hardware developments now make it possible to greatly simplify and reduce the cost of these systems. Although the cost of sustainability is the same for CCD and CMOS technologies, CCDs offer good image quality and flexibility at the expense of system size, whereas CMOS imagers offer good integration, power dissipation, and system size at the expense of image quality and flexibility [24].

Kulkarni et al. in [22] examine recent technology trends that have resulted in a broad spectrum of camera sensors, wireless radio technologies, and embedded sensor platforms with varying capabilities. They argue that future sensor applications will be hierarchical with multiple tiers, where each tier employs sensors with different characteristics.

II. Comparison and Technical Specifications of Available Platforms and Cams

This section provides background on image sensing in the field of wireless sensor networks and compares the principal differences between image sensor networks and other types of sensor networks. A number of research groups are working in areas related to the development of image sensor motes; the following list briefly describes a selection of related works. Properties of the current platforms can be traced back to a number of devices called COTS motes, shown in Table 1. These devices were built to approximate the capabilities of a Smart Dust node with off-the-shelf components. Newer designs support custom sensor interfaces and make remote programming possible [7].

Table 1. The family of Berkeley motes preceding Telos and their capabilities [7].

Berkeley motes are other popular motes. Mica, released in 2001, was specifically designed to serve as a general-purpose platform for WSN research. Compared with previous designs, it offers more memory, extensive sensor interfaces, and a flexible radio interface. Mica uses the RFM TR1000 and simple modulation techniques. The radio's primitive interface allows low-power operation and quick turn-on times.
A number of schemes for radio wakeup, low-power asynchronous communication, fairly high-bandwidth protocols, and precise time synchronization are implemented. Mica was practical for development but not suitable for deployments: the boost converter provided a stable voltage but drew excess quiescent current, the radio communication range was short and relatively unreliable, and the extensive I/O connector was not robust to changes in temperature [6]. Both Mica2 and Mica2Dot sensor nodes have a 4-MHz, 8-bit Atmel microprocessor and 512 KB of non-volatile flash memory that can be used for logging and data collection. They also both have a 32-kHz clock that can be synchronized by the operating system; this way a neighboring node can power up and listen for information to be exchanged. The operating system is TinyOS, making the nodes suitable for low power consumption and multiple flows. The channel model of the Mica2 and Mica2Dot motes can be seen in Figure 1 [15].

Figure 1. Channel model for Mica2 and Mica2Dot motes [15].

Mica2 is an improved version of Mica. The boost converter was discarded, and the MCU was replaced with the ATmega128. This lowered the Mica2 standby current to about 17 µA, while waking up the system takes up to 4 ms when using the external crystal. The radio transceiver was changed to the Chipcon CC1000, which allows frequencies to be tuned from 300 to 900 MHz. Although more flexible, the Mica2 had higher energy per bit and an order of magnitude higher wakeup time. Despite these inadequacies, the Mica2 and the smaller Mica2Dot are the standard research platforms in WSN research. In [15], the power consumption of Mica2/Mica2Dot motes in different operating modes is reviewed.

Figure 2. Power consumption in different operating modes [15].

In Figure 2, it can be seen that Mica2 nodes consume more energy than Mica2Dot nodes in all operating modes. In addition, for both types of motes, transmitting is slightly more expensive than receiving. Since the Mica2 and Mica2Dot use the same processor, the power consumption in idle mode and radio-off mode is the same (8 mA) and can be viewed as the basic consumption of the sensor node. The consumption in power-down mode is more than three orders of magnitude lower than in idle mode; therefore, this mode is highly recommended whenever the sensor node has nothing to do. An application that periodically senses the physical environment using different sensors is considered, and the results obtained are summarized in Figure 3. The real power consumption depends on the specific sensor but is generally limited. When there are no messages to be sent, the radio is switched off and the processor enters power-down mode. The results show that when the sensor is sampling, the current drawn is 20 mA, while it is 18 mA during transmissions. When the sensor node is in power-down mode, the current drops to 10 µA.
It is found that the average current (drawn in every cycle) is 0.19 mA. This implies that the power consumption is 0.57 mW (assuming a nominal voltage of 3 V). It is also shown that the transmission range of a wireless system may be influenced by several factors. The most intuitive one is the transmission power: the more energy put into a signal, the farther it should travel. The sensitivity of the receiver, the gain and efficiency of the antenna, and the data transmission rate should also be considered.

Figure 3. Power consumption of various sensors [15].

The Mica2 and Mica2Dot sensor node results are summarized in Table 2. It is also observed that when the distance increases beyond a threshold, the percentage of correctly received packets decreases. This threshold can be taken as an estimate of the transmission range. Assuming the threshold is the distance at which the percentage of received packets drops below 85%, it can be concluded from Table 2 that the transmission range is approximately 55 m for the Mica2 and 135 m for the Mica2Dot [15].

MicaZ continues the evolution of the Mica family: it replaces the CC1000 with the CC2420 and uses an IEEE 802.15.4-compatible radio. A single-chip mote implementation called Spec resulted from analyzing the Mica platform. Just 5 mm² in size, Spec uses a number of dedicated hardware accelerators to perform programmable start-symbol detection, bit serialization, and encryption. The CPU has been optimized to reduce the cost of context switching [7]. Spec's performance is over 1000 times better than Mica's in many applications. Unlike the Mica family, Spec is fully integrated and offers limited interface flexibility. Since it is a research project, it is unlikely to become available in quantity to the research community. Spec provides significant advantages in power consumption because of its integrated design and hardware accelerators.
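The 0.57 mW average-power figure quoted earlier follows directly from P = I x V. A quick check; the 3 V supply comes from [15], while the battery capacity used for the lifetime estimate is a hypothetical example, not a value from the paper:

```python
# Reproducing the average-power figure from [15]: P = I * V.
# The 1000 mAh battery capacity below is a hypothetical example,
# not a value from the paper.

avg_current_ma = 0.19      # average current drawn per cycle (from [15])
voltage_v = 3.0            # nominal supply voltage (from [15])

avg_power_mw = avg_current_ma * voltage_v
print(f"average power = {avg_power_mw:.2f} mW")   # prints 0.57 mW, as reported

# With a hypothetical 1000 mAh battery, the ideal lifetime would be:
battery_mah = 1000.0
lifetime_h = battery_mah / avg_current_ma
print(f"ideal lifetime = {lifetime_h / 24:.0f} days")
```

Such ideal-lifetime figures ignore battery self-discharge and voltage droop, so real deployments last noticeably less.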
The Telos design is similar to that of Spec, but instead of integrating the design into silicon, Telos uses COTS components with hardware accelerators to construct a power-efficient system that does not sacrifice performance. A comparison of Telos, Mica2, and MicaZ is given in Table 3 [7]. Two distinct types of low-power, low-data-rate radios are available: narrowband and wideband, as shown in Table 4. Many narrowband radios provide very fast startup times since they are clocked by the MCU, but they have simple modulation schemes, no spreading codes, and are not robust to noise. Wideband radios must wait for high-speed oscillators to start [7]. Characteristic figures of the Crossbow Mica2 and Mica2Dot, the Moteiv Tmote Sky, the Intel Imote, and the BTnode rev3 have been compared (Tables 5 and 6). The Tmote Sky is a modern, ultra-low-power architecture using standard components, while the Imote represents a completely different approach: a high-performance, custom node-on-a-chip. All of these platforms are designed to run TinyOS applications [3]. All four TinyOS platforms suffer from very scarce data memory resources, which requires extreme care during design and restricts the applicability and flexibility of the Mica family considerably.
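The narrowband/wideband tradeoff can be made concrete by charging each packet for both oscillator startup and airtime. A rough sketch with purely illustrative numbers (none of these parameters are taken from Table 4):

```python
def packet_energy_mj(bits, rate_bps, tx_power_mw, startup_ms, startup_power_mw):
    """Total radio energy (mJ) for one packet: startup cost plus airtime cost."""
    tx_time_ms = bits / rate_bps * 1000.0
    return (startup_power_mw * startup_ms + tx_power_mw * tx_time_ms) / 1000.0

# Illustrative: a slow narrowband radio with near-instant startup vs. a
# faster wideband radio that must wait ~1 ms for its oscillator.
narrow = packet_energy_mj(1000, 38_400, 30.0, startup_ms=0.1, startup_power_mw=30.0)
wide = packet_energy_mj(1000, 250_000, 60.0, startup_ms=1.0, startup_power_mw=60.0)
```

For packets of any reasonable length, the wideband radio's shorter airtime outweighs its startup wait; only for very short, frequent packets does the narrowband radio's fast startup pay off.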

Table 2. Summary of studies for mica2 and mica2dot [15]. Table 3. Measured current consumption of Telos compared to Mica2 and MicaZ motes [7]. Table 4. Capabilities of current COTS radios suitable for WSNs, their features and power profile [7].

Table 5. Platform comparison (system core features) [3]. Table 6. Platform comparison (radio physical properties) [3]. MicaZ and Telos use the same radio as XYZ but include smaller 8-bit processors. These nodes have lower power consumption than XYZ, but also more constraints in terms of memory, computation, and peripherals. The Imote from Intel features an ARM/THUMB processor and a Bluetooth radio. The Stargate node, designed by Intel and manufactured by Crossbow, is a higher-end node: it has a PXA processor with a variable clock speed ranging from 100 to 400 MHz, 32 MB of flash memory, and 16 MB of SDRAM. The Stargate has a mote interface and can also support IEEE 802.11 and Ethernet communication. The PASTA node is a hierarchical collection of hardware modules that can be individually powered down on demand. PASTA features multiple processors and has a dynamic power range of more than 1000×, from below 1 mW to above 10 W [6]. Figures 4 and 5 present different platform evaluations. Figure 6 gives an idea of the bandwidth, memory, and performance of the popular platforms, while Figure 7 shows the fixed energy-per-bit line of short-range transmission systems. The star graph in Figure 7 shows that all modern short-range communication standards lie just above this linear line [3]. Figure 4. Platform comparison for BTnode3, Mica, Mica2Dot, Tmote Sky, and Imote [3].

Figure 5. Overall Platform Evaluation and Comparisons [3]. Figure 6. Processor performance versus memory size for some of the platforms [1].

Figure 7. The fixed energy-per-bit line of short-range transmission systems [23]. Researchers at Yale integrated a COTS camera module into their XYZ sensor node, Figure 8 [6]. The module has higher performance than two earlier camera modules from Agilent and supports VGA (640×480) and QVGA resolutions. Nevertheless, the radio data rate (250 kbps) appears to be too low to transmit acquired frames in real time [4]. XYZ offers similar functionality and features, but with a larger dynamic range of operation and a deep-sleep mode that allows it to hibernate for long periods. At its lowest-power operational mode, XYZ behaves like a MicaZ mote; at its most powerful mode, its functionality lies between that of an Imote and a Stargate. Figure 8. The XYZ sensor node, left to right: instructional XYZ, testbed module, XYZ in motion [6]. Table 7 shows the energy consumption of XYZ during sleep and wakeup modes, and Table 8 shows the current drawn in different operating modes of XYZ. Table 7. Time and energy overhead for transitioning into and out of sleep modes [6].
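The claim that 250 kbps is too slow for real-time video is easy to check: even an 8-bit grayscale QVGA frame needs over two seconds of airtime. A back-of-the-envelope sketch, ignoring packet headers and MAC overhead:

```python
def frame_tx_time_s(width, height, bits_per_pixel, radio_kbps):
    """Seconds of airtime needed to ship one uncompressed frame."""
    return width * height * bits_per_pixel / (radio_kbps * 1000.0)

# 8-bit grayscale QVGA over a 250 kbps 802.15.4 link
print(frame_tx_time_s(320, 240, 8, 250))  # -> ~2.46 s per frame
```

At roughly 0.4 frames per second before any protocol overhead, on-node compression or event-driven transmission is unavoidable for this class of radio.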

Table 8. Breakdown of the current drawn by the XYZ node at different operating modes [6]. Figure 9 shows the current consumption at different frequencies. Characteristics of Mica, XYZ, and Stargate are given in Table 9 [6]. Table 9. Different sensor platforms and their characteristics [12]. Figure 9. Current consumption breakdown at different operating frequencies and for different I/O configurations (Total: 3.3 V, CPU core: 2.5 V, CPU I/O: 3.25 V, Radio: 3.25 V) [6]. The Panoptes platform, Figure 10, is designed as a video sensor platform capable of medium-resolution video at high frame rates. It uses a StrongARM-based embedded Linux board running at 206 MHz and equipped with 64 MB of RAM; the board uses a USB web camera as the video source and 802.11 for wireless communication [1]. The Panoptes sensor hardware is similar to the Stargate hardware but requires more power: when the power requirements are compared, the Stargate provides energy savings of up to 25% [19]. The power consumption of Panoptes video sensors is given in Figure 11 [11]. Compression takes approximately 1/3 and 1/6 of the time using the Intel Performance Primitives, compared with a non-Intel-specific software-optimized algorithm, on the Bitsy and Stargate respectively. Comparing the two platforms shows that the Stargate is able to outperform the Bitsy using the Intel Performance Primitives but not when using the software compression algorithm, Table 10. Figure 10. The Panoptes video sensors: (a) the Applied Data Bitsy platform; (b) the Crossbow Stargate platform [11]. Figure 11. Power consumption of Panoptes [11]. Table 10. Average power requirements in watts [11].

For wireless image sensor networks, several mote architectures with similar objectives have been proposed. In 2005, Cao et al. proposed an image sensor mote architecture in which an FPGA (field-programmable gate array) device connects to a VGA (640×480 pixel) CMOS imager to carry out image acquisition and compression. An ARM7 microcontroller processes images further and communicates with neighboring motes via an ultra-low-power transceiver at data rates up to 76.8 kbaud [1]. The authors develop an alternative approach for an image sensor mote in which the transmission of compressed images is considered: an ARM7 CPU is used in conjunction with an FPGA and additional memory to support the high memory and processing requirements, together with a Chipcon CC1000 radio [4]. Several recent research efforts have focused on developing visual sensing nodes. Rahimi et al. suggested another powerful image sensor mote, which combines Agilent Technologies' Cyclops with Crossbow's Mica2 mote. Cyclops [5] is a low-power, small sensing node composed of a microcontroller, a complex programmable logic device (CPLD), external SRAM, external flash, and a CMOS imager, designed by Agilent Laboratories and the Center for Embedded Network Sensing at UCLA [19][10]. Cyclops was developed as an add-on CIF (320×240 pixel) CMOS camera module board, hosting an on-board 8-bit microcontroller with 64 KB of static RAM and 512 KB of flash memory for pixel-level processing and storage. The board is designed to attach as an external sensor to a mote board from the Mica family, such as the Mica2, and therefore does not include a radio; this allows the network and image-sensing aspects to be separated. To provide a frame buffer, the CPLD interfaces to a shared 512 KB SRAM; the CPLD can also be used for basic operations during frame capture [1]. The operating system is TinyOS and the programming language is nesC.
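The memory pressure driving these designs is easy to quantify: even a modest raw frame dwarfs the on-chip RAM of a typical 8-bit MCU, which is why Cyclops adds an external SRAM frame buffer. A quick sizing sketch:

```python
def frame_bytes(width, height, bits_per_pixel=8):
    """Raw size of one uncompressed frame, in bytes."""
    return width * height * bits_per_pixel // 8

print(frame_bytes(320, 240))  # 76800 bytes - far beyond a small MCU's RAM
print(frame_bytes(64, 64))    # 4096 bytes - fits comfortably in a small buffer
```

This arithmetic also foreshadows why captured Cyclops images end up at low resolutions: a full frame at the imager's native resolution cannot be processed entirely in on-chip memory.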
Since the size of the memory buffer in Cyclops is limited, only images with a resolution of 64×64 pixels can be obtained [16]. The different states of Cyclops are listed in Table 11, where PD refers to power-down, PS refers to power-save mode, and S stands for sleep (only the timer interrupt is in effect when Cyclops is in PS mode) [10]. Figure 12. Cyclops with an attached Mica2 mote [10]. Table 11. Different states of Cyclops [10]. Downes et al. [1] describe the design of a new mote for distributed image sensing applications in wireless sensor networks, built at Stanford's Wireless Sensor Networks Lab. Unlike earlier motes, it embraces the concept of multiple vision sensors: it can host up to four low-resolution (30×30 pixel) image sensors and two CIF CMOS camera modules. Both types of vision sensors feature a serial interface, eliminating the need for additional FPGA or CPLD devices. Researchers at UCLA [4] built a camera node by adding a similar camera module (Agilent ADCM-1700) to the widely used Mica2 platform. The Mica2 uses the ATmega128L 8-bit microcontroller and has a maximum data rate of 38.4 kbps. WiSNAP [3] was developed by researchers at Stanford University. It uses Agilent's ADCM-1670 camera module with a maximum resolution of 352×288 (CIF) at 15 frames per second (fps) and Chipcon's 802.15.4 radio, whose maximum data rate is 250 kbps. SensEye is a multi-tier video surveillance network that uses several different visual nodes: Cyclops, CMUcam, Crossbow Stargate with webcam, and a high-end camera, Figure 13. Kulkarni et al. [12] presented an evaluation of the full SensEye system and compared it to a single-tier implementation of the same algorithms. Table 12 gives the characteristics of these devices [19]. At the lowest tier, the most basic motes are equipped with very low-resolution cameras or other detection hardware; at higher tiers, more capable motes use high-resolution cameras with pan-tilt-zoom capabilities [1]. To measure the latency and power consumption of a recognition algorithm, a neural-network-based face recognition system was used. The measurements: average latency 228 ms, average current draw 244.8 mA, average power consumption 1.23 W, and average energy usage 280.44 mJ [12]. Table 12. Different camera sensors and their characteristics [12]. Figure 13. Prototype of a Tier 1 mote with CMUcam, and a Tier 2 Stargate with webcam and mote [12]. The comparison is done along two axes: energy consumption and sensing reliability, where sensing reliability is the fraction of objects that are accurately detected and recognized. A set of four motes, each connected to a CMUcam, constituted Tier 1, and two Stargates, each connected to a webcam, constituted Tier 2 of SensEye. The object appearance time was 7 seconds and the interval between appearances was 30 seconds. The single-tier system consisted of the two Stargate nodes, which were woken up every 5 seconds for object detection. This differs from SensEye, where a Stargate is woken up only on a trigger from Tier 1. The energy usage and latency of both cameras are given in Tables 13 and 14, respectively [12]. Table 13. SensEye Tier 1 (with CMUcam) latency breakup and energy usage [12].
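The reported face-recognition figures are internally consistent: average energy is average power times latency (watts times milliseconds giving millijoules), and the 1.23 W power figure matches the 244.8 mA draw at an approximately 5 V supply (the supply voltage is our inference, not stated in [12]). A quick check:

```python
def energy_mj(power_w, latency_ms):
    """Energy (mJ) = average power (W) x duration (ms)."""
    return power_w * latency_ms

# 1.23 W average over a 228 ms recognition run
print(round(energy_mj(1.23, 228), 2))  # -> 280.44 mJ, matching the reported value
```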

Table 14. SensEye Tier 1 (with Cyclops) latency breakup and energy usage [12]. Two metrics are used to compare the energy usage of SensEye and the single-tier system: energy usage when awake and energy usage in suspend mode. Tables 15 and 16 show the number of wakeups and details of detection for each component of the single-tier system and SensEye, respectively. The tables show that the Stargates of the single-tier system wake up more frequently than the Stargates at Tier 2 of SensEye, due to the periodic sampling of the region to detect objects. Table 15. Number of wakeups and energy usage of the single-tier system [12]. (Total energy usage for both cases when awake is 2924.9 J; total missed detections: 5.) Table 16. SensEye Tier 2 latency and energy usage breakup. The total latency is 4 seconds and total energy usage is 4.71 J [12]. Table 17 shows the number of wakeups and energy usage of each SensEye component [12].
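The wakeup counts explain most of the gap: a node polled every 5 s wakes six times as often as one triggered by objects appearing roughly every 30 s, and the awake-mode energy scales accordingly. A toy model (the per-wakeup energy cost is illustrative, not taken from [12]):

```python
def hourly_wakeup_energy_j(wakeup_interval_s, energy_per_wakeup_j):
    """Energy per hour spent on wakeups, given the interval between wakeups."""
    return 3600.0 / wakeup_interval_s * energy_per_wakeup_j

polled = hourly_wakeup_energy_j(5, 1.0)      # single tier: wake every 5 s
triggered = hourly_wakeup_energy_j(30, 1.0)  # SensEye Tier 2: wake per appearance
print(polled / triggered)  # -> 6.0x more wakeup energy for periodic polling
```

The real savings in [12] are larger still, because SensEye's Tier 1 motes that do the triggering are far cheaper to keep awake than a Stargate.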

Table 17. Number of wakeups and energy usage of each SensEye component [12]. (Total energy usage when components are awake is 466.8 J with CMUcam and 299.6 J with Cyclops; total missed detections: 8.) Several studies have focused on single-tier camera sensor networks. As mentioned above, Panoptes is one example: a video sensor node built using an Intel StrongARM PDA platform with a Logitech webcam as the vision sensor. The node uses an 802.11 wireless interface and can be used to set up a video-based monitoring network. Panoptes is thus an instance of a single-tier sensor network, not a multi-tier network like SensEye. A Tier 2 node of SensEye is similar, with additional support for network wakeups and an optimized wakeup-from-suspend energy-saving capability. Panoptes also uses compression, filtering, buffering, and adaptation mechanisms for the video stream, which could be adopted by Tier 2 nodes to improve SensEye. The main goal in [12] is the reduction of energy usage; other tradeoffs, such as system cost and coverage, remain to be studied. Eco is an ultra-compact, high-bandwidth wireless sensor node developed in [8], Figure 14. It measures only 13 mm (L) × 11 mm (W) × 7 mm (H) and weighs 2 grams. It draws less than 10 mA in transmission mode (0 dBm) and 22 mA in receiving mode; its maximum data rate and RF range are 1 Mbps and 10 m. Figure 15 shows photos of the Eco hardware and a size comparison with Mica2Dot. Figure 16 shows the power profile of Eco with a 3.3 V supply: Eco consumes 30.8 mA in transmit mode, 24.2 mA in receive mode, and 6.4 mA in standby mode [8]. Figure 14. Photos of the Eco node [8]. Figure 15. Size comparison of Eco with Mica2Dot [8]. Figure 16. Power profile of Eco [8].
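From the measured currents and the 3.3 V supply, Eco's radio energy per bit follows directly, since milliwatts divided by megabits per second yields nanojoules per bit. A quick sketch:

```python
def energy_per_bit_nj(current_ma, voltage_v, rate_mbps):
    """Radio energy per bit (nJ): power (mW) divided by data rate (Mbps)."""
    return current_ma * voltage_v / rate_mbps

# Eco transmitting: 30.8 mA at 3.3 V, 1 Mbps peak rate
print(round(energy_per_bit_nj(30.8, 3.3, 1.0), 2))  # -> ~101.64 nJ per bit
```

This kind of per-bit figure is what Figure 7's fixed energy-per-bit line captures across short-range standards.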
Eco is a suitable platform for miniature WSNs. The complete Eco node is only 1 cm³ in volume, including the microcontroller, radio, a 3-axis accelerometer and temperature sensor, an infrared sensor, an antenna, and a battery. Even after integration with the camera, eCAM is still substantially smaller than a single module in many other sensor platforms. At the same time, eCAM is capable of a much higher data rate than most platforms: its theoretical peak bandwidth is 1 Mbps, higher than comparable nodes, and it has been shown to deliver a live video feed consistently without compromising video quality. eCAM consumes less power and is smaller than some Bluetooth or 802.11b/g modules. It achieves high efficiency in several respects: in-camera hardware compression; a high-speed, low-power wireless communication interface with a simple MAC; a streamlined system-level design from the camera, node, and RF to the base station and uplink; and a highly optimized board-level design for a very compact form factor. It can operate either as a video camera or as a JPEG-compressed still camera, and consists of a lens, an image sensor, and a compression/serial-bridge chip. It supports various image resolutions (VGA/CIF/SIF/QCIF/160×128/80×64) as well as various color formats, can capture up to 30 fps, and its JPEG codec allows the quality setting to be varied across different scales [9]. The Address-Event Representation (AER) imager [17] is a different type of imager. It takes an 8-bit grayscale input stream from a COTS USB camera and outputs a queue of events to a text file. Produced for sensor networks, the imager detects image features directly from the image, and these features are then transmitted as data to the network. The sensors are light sensitive; data are produced depending on an intensity threshold. A main advantage of AER image sensors is that they are not query-based; instead, they push information to the receiver. This property is important because the image sensors can detect objects of interest and inform the sink by themselves.
Another significant feature of AER sensors is that they can rank data according to importance. By sending only data of interest and using prioritization, they can provide compression and reduced latency for recognition systems. It is reported that an AER imaging WSN can stream pictures formed from 500 events at frame rates as high as 6 fps with 90 mW power consumption [17]. The aim of building a more advanced mote platform resulted in the Intel Mote. An integrated wireless microcontroller from Zeevo, Inc. was chosen, consisting of an ARM7TDMI core running at 12 MHz, a Bluetooth 1.1 radio, 64 KB of RAM, and 512 KB of flash memory. This module was selected for its ability to fully support the Bluetooth scatternet mode, which is suitable for building a mesh network. The Intel Mote module is about 3×3 cm with a height of about 5 mm, Figure 17. The Intel Mote runs the TinyOS operating system, and the Intel Mote-specific additions to the TinyOS software stack can be downloaded to build custom applications [18]. Figure 17. Intel Mote main board [18]. The Intel Mote 2 is a newer platform based on an Intel PXA270 XScale CPU that integrates high-performance processing (including DSP capabilities), large amounts of RAM and flash memory, standard and high-speed I/O interfaces, and an advanced security subsystem. It also includes an on-board 802.15.4 radio, with the option to add other wireless standards. The radio is based on the low-power 802.15.4 specification and provides facilities for implementing different QoS levels. It uses TinyOS; its transmission range is between 30 and 60 m and can be extended to 100 m with an antenna [18]. Meerkats [14] is a wireless network of battery-operated camera nodes that can be used for monitoring and surveillance of wide areas. The Meerkats testbed consists of 8 Meerkats nodes and a laptop acting as a sink.
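The event-push model of AER imagers can be sketched as a thresholding step that emits pixel addresses, highest intensity first. This is a toy version of the prioritization described above; the function and ordering rule are illustrative, not the actual chip interface in [17]:

```python
def address_events(frame, threshold):
    """Emit (row, col) address events for pixels above an intensity threshold,
    brightest first - a toy model of AER-style prioritized readout."""
    events = [(value, r, c)
              for r, row in enumerate(frame)
              for c, value in enumerate(row)
              if value > threshold]
    events.sort(reverse=True)          # most important (brightest) events first
    return [(r, c) for _, r, c in events]

frame = [[10, 200],
         [90, 250]]
print(address_events(frame, 100))  # -> [(1, 1), (0, 1)]
```

Only the pixels carrying information cross the threshold and are transmitted, which is where the compression and latency benefits come from.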
Each node consists of a Crossbow Stargate platform, an Orinoco Gold 802.11b PCMCIA wireless card, and a Logitech QuickCam Pro 4000 webcam connected via USB. The major hardware subsystems of a Meerkats node are: the processor core, consisting of the processor itself, memory (RAM and flash), and associated hardware; the sensor core, which includes the sensing devices, i.e. the webcam, together with the USB interface; and the communication core, consisting of the wireless communication card and associated PCMCIA modules. Table 18 shows the average current (over five runs), in milliamperes, drawn by the Meerkats node when running different benchmarks with different combinations of active hardware subsystems in steady state; Figure 18 also shows the steady-state current draw [19]. The energy consumption characterization benchmark consists of a set of basic operations representative of the activities performed by visual sensor nodes, in five main task categories: idle, processing-intensive (FFT), storage-intensive (read, write), communication-intensive (TX, RX), and visual sensing (image). Table 18. Average current (over 5 runs) in mA and standard deviation drawn by the Meerkats node [19]. Figure 18. Steady-state current draw in the Meerkats node [19]. Table 19 shows the transition durations for five different runs, and Table 20 gives the charge, in millicoulombs, required for the transitions, again for five different runs. The charge consumed by transitions from idle to sleep and back to idle of the processor/sensor core is given in Figure 19 [14]. Table 19. Transition durations in msec (for five different runs) [19].
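Reporting transitions as charge (mC) rather than energy is convenient because the supply voltage is roughly constant, so energy is recovered as E = Q · V. A small conversion sketch; the charge and voltage values here are assumed for illustration, not measurements from [19]:

```python
def transition_energy_mj(charge_mc, voltage_v):
    """Energy (mJ) of a transition, from its charge (mC) at a fixed supply voltage."""
    return charge_mc * voltage_v

# e.g. a hypothetical 12 mC idle->sleep->idle transition at an assumed 5 V rail
print(transition_energy_mj(12.0, 5.0))  # -> 60.0 mJ
```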

Table 20. Charge in millicoulombs (for five different runs) required for transitions [19]. Figure 19. Charge consumed by transitions from idle to sleep and back to idle of the processor/sensor core [19]. CMUcam [20, 21], Figure 20, provides simple vision capabilities to small embedded systems in the form of an intelligent sensor. The motivation is the design of an image sensing device based on a CMOS camera and a set of small image processing applications (e.g., color blob tracking); the main goal is to provide a flexible and user-friendly open-source development environment that complements a low-cost hardware platform. The hardware is composed of an OV6620 CMOS camera and a Ubicom SX28 microcontroller with 2048 words of flash and 136 bytes of SRAM; it does not include a radio. Supported features include CIF resolution (352×288) with an RGB color sensor, an image processing rate of 26 frames per second, software JPEG compression, a basic image manipulation library, image downsampling, threshold and convolution functions, RGB, YCrCb, and HSV color spaces, user-defined color blobs, frame differencing, histogram generation, B/W analog video output, a wireless mote networking interface (802.15.4), a Tmote Sky/Telos connection, and 115,200/38,400/19,200/9600 baud serial communication. The camera operates as follows: frames are read raw from the camera, and basic image manipulation is done in the time between pixels. Color blob tracking allows objects to be tracked in real time by identifying boundaries based on color changes between pixels, while color statistics keep a running sum of the individual color channel components. The system can also filter noise, control transfer flow, and modify the CMOS camera's internal image settings. The authors would like to add several new features, such as the ability to compute color histograms of selected image regions and to perform simple frame differencing using the SX52 microcontroller. Figure 20. The microcontroller board mated with the CMOS camera module [20].
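The essence of color blob tracking is a per-pixel channel test followed by a bounding box over the matching pixels. A minimal sketch of that idea (illustrative Python, not the CMUcam firmware, whose tracking runs on the SX28 in the time between pixel clocks):

```python
def track_blob(frame, lo, hi):
    """Bounding box (min_row, min_col, max_row, max_col) of pixels whose
    (r, g, b) values fall within [lo, hi] per channel; None if no match."""
    matches = [(r, c)
               for r, row in enumerate(frame)
               for c, px in enumerate(row)
               if all(l <= v <= h for v, l, h in zip(px, lo, hi))]
    if not matches:
        return None
    rows = [p[0] for p in matches]
    cols = [p[1] for p in matches]
    return (min(rows), min(cols), max(rows), max(cols))

# a tiny frame with a red blob in the left column
frame = [[(255, 0, 0), (0, 0, 0)],
         [(250, 5, 5), (0, 0, 0)]]
print(track_blob(frame, (200, 0, 0), (255, 20, 20)))  # -> (0, 0, 1, 0)
```

The per-channel interval test is what "user-defined color blobs" amounts to; the running min/max is cheap enough to compute on the fly, which is why it fits an 8-bit MCU with 136 bytes of SRAM.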

III. Conclusion: The demand for low-power processors, image sensors, radios, and flash memory storage has increased recently. Although WMSN is a relatively new subject, today's WMSN technology is built on camera sensors and sensor platforms. An overall comparison of the available motes, cameras, and platforms is given in Table 21 as an attachment. Camera sensors: Table 12 reviews the different classes of cameras that are available today, either as prototypes or as commercial products. Cyclops consumes 46 mW and can capture low-resolution video; CMUcams add processing for motion detection, histogram computation, etc.; webcams can capture high-resolution video at full frame rate while consuming 200 mW; and pan-tilt-zoom cameras produce high-quality video while consuming 1 W. Sensor platforms: A variety of sensor nodes have emerged recently, from the resource-limited Mica motes, to intermediate platforms such as the Yale XYZ, to larger PDA-class platforms such as the Intel Stargate. (Table 2 compares the power consumption and the available processing, memory, and storage resources of these platforms.) MicaZ and Telos use the same radio as XYZ but have smaller processors; these nodes have lower power consumption than XYZ, but also more limitations in terms of memory, computation, and peripherals. The Mica motes are highly resource-constrained and very low power, and are thus suitable only for simple sensing and detection tasks. The Yale XYZ platform is more capable, with more memory and processing resources than the motes, but it consumes roughly three times the power of a Mica mote at the highest frequency settings; such nodes are suitable for simple object identification and target localization. At the high-performance end, PDA-class devices such as the Stargate are more powerful than the intermediate nodes but also consume more power; these nodes fit complex tasks (e.g., object identification or other resource-intensive tasks). The selection of a processor depends on the required job, energy efficiency, and sleep and wakeup power consumption. Today, dynamic voltage and frequency scaling techniques are used to lower processor power consumption. Processing costs, given in joules per instruction, are approximately two to three orders of magnitude lower than communication costs, given in joules per bit, on available embedded platforms such as the Mote and the Yale XYZ [22]. The power consumption of radios decreases day by day. As shown in Table 2, at the lowest end of the power spectrum are low bit-rate radios such as 802.15.4 (ZigBee), which consume roughly 50 mW and can transmit at 250 kbps, whereas higher-end 802.11 radios consume more than 1 W but can transmit at 54 Mbps. Having briefly reviewed the available motes, cameras, and platforms, it can be roughly concluded that: - Data-centric sensors are not designed for WMSN; - Imagers have been developed for vision computing systems, not particularly for WMSN; - The platforms described target WMSN but are built from available hardware, not designed specifically for WMSN; - In general, designs concentrate on sink or gateway sensors more than on ordinary nodes, since manipulating multimedia data on current sensors is impractical; - Cyclops, the most appropriate mote for WMSN, has limitations that could be addressed to obtain a better design; - As stated in several studies, such as [1] and [22], multi-tier networks can be considered good solutions for WMSN. In the light of these observations, it can be concluded that new studies on imagers and platforms for WMSN are a promising area.