JOURNAL OF NETWORKS, VOL. 8, NO. 11, NOVEMBER 2013
A Novel Data Detection Method for Data Aggregation Security in WSN

Pinghui Zou
Shenzhen Polytechnic, Shenzhen City, 518055, China
Email: [email protected]

Abstract—Data aggregation technology can effectively save network energy and storage resources, so it is extensively applied in wireless sensor networks. However, data aggregation also brings security problems, and corresponding security schemes must be adopted as a precaution. Starting from the data aggregation security requirements of WSN and the features of sensor networks, current data aggregation security schemes are analyzed. A defect of the SIA scheme is pointed out: it cannot provide an effective message authentication code for the data aggregation results. An improved scheme based on data integrity for data aggregation security in WSN is therefore presented. First, the improved scheme inherits the ideas of SIA in data filtering and cross-layer verification; second, it expands the applicability of the SIA protocol from a single-path data filtering protocol to a complete data aggregation security scheme. Meanwhile, the improved scheme allows intermediate nodes to execute data aggregation operations, which effectively reduces the transmission overhead. It is also adaptable to more data aggregation operations, and shows better performance in saving storage and energy resources.

Index Terms—SIA; Cluster-Head Node; Data Aggregation Tree; Homomorphic Message Identity Code
I. INTRODUCTION
In wireless sensor networks, the battery energy, processing capability, storage capacity and communication bandwidth of a sensor node are limited [1]. Data aggregation technology is an effective way to deal with this resource constraint. Its idea is to aggregate information from different data sources, deleting redundant information and reducing the amount of transferred data. In this way energy is saved and the efficiency and accuracy of data collection are improved, at the expense of delay and robustness [2]. Since sensor nodes are usually deployed outdoors or in battlefield areas and can easily be attacked by various means, a simple data aggregation scheme is often interrupted or destroyed by external attackers. As a result, base station users cannot acquire correct results and consequently tend to make wrong decisions [3-6]. Similar to confidentiality, data integrity is an important security target of any security scheme for wireless sensor network data. If the base station or an intermediate node cannot judge data completeness, serious security risks follow. On one hand, by capturing or forging nodes, malicious attackers can inject illegal data into the network to hinder data aggregation
or cause data loss, making the base station or the users accept inaccurate aggregation results and make wrong decisions. On the other hand, data invalidation or physical faults can also generate inaccurate results. Since data aggregation at the application layer usually adopts lossy aggregation [7, 8, 9], it may also discard some detailed information or lower data quality while reducing data transmission and improving the efficiency of data collection. Due to this loss of intermediate information, intermediate nodes and the base station cannot identify the integrity of the data aggregation results. Therefore, the data aggregation operation faces serious security threats. At present, most attacks mainly involve data tampering, data forging and data discarding; their ultimate purpose is to force the base station or other nodes to accept inaccurate information and influence users' decisions.
Currently, security schemes for the data integrity of data aggregation are mainly divided into four types [10-15]: integrity verification schemes based on delayed data aggregation; integrity verification schemes based on MAC-hash trees; integrity verification schemes based on mutual supervision mechanisms; and integrity verification schemes based on statistics. S. C. Zhu et al. offered a data aggregation security scheme based on the ideas of data filtering and cross-layer verification (SIA for short) in reference [16]. One advantage of this scheme is that it combines security with resilience, tolerating the capture of a certain number of sensor nodes. Another is that it can reduce the influence of inaccurate information on the network while guaranteeing data integrity, avoiding useless expense. However, this scheme is not a pure data aggregation method, since it only applies to data aggregation by in-cluster voting; intermediate nodes cannot process the data, so the method has low adaptability.
From the above introduction we find that the SIA scheme is effective at filtering illegal data. It can also tolerate the capture of a certain number of nodes with good resilience. Furthermore, it guarantees that inaccurate data is not transferred over long distances in the network, which reduces the interference of inaccurate information on the network. However, the scheme is not suitable for most data aggregation operations (such as sum and mean) and has certain restrictions in practice.
Figure 1. The definition of association when t = 3 (nodes u1-u8 and v1-v3 on the path from the cluster head CH to the base station; dotted lines connect associated nodes).
Moreover, the intermediate nodes do not perform aggregation operations on the data, so this step cannot further reduce the amount of stored and transferred data. In order to extend the adaptability of the SIA scheme, this paper provides a data integrity identification scheme for data aggregation (abbreviated as DID), which is suitable for hierarchical multi-hop networks. In this scheme we introduce a homomorphic message identity code algorithm, and the single-path data filtering protocol in SIA is extended into a scheme for the whole network. The scheme allows intermediate nodes and the base station to perform integrity verification on the data aggregation results. It ensures that attackers cannot inject illegal data into the network by capturing or forging nodes. It also further saves the limited resources and broadens the range of applications. Compared with the SIA scheme, on one hand, DID supports additive data aggregation operations, which can be used to compute sums and means; on the other hand, it allows data aggregation to be finished by the intermediate nodes. It therefore saves more storage and energy resources and has better performance in security and resilience.

II. RELATED WORKS
A. Homomorphic Message Identity Code
To make up for the deficiency of the regular message identity code, we introduce the notion of a homomorphic message identity code. A homomorphic message identity code is a message identity code satisfying MAC_k(x + y) = MAC_k(x) * MAC_k(y). Its distinguishing feature is that the aggregate of the individual codes can be used to verify the integrity of the data aggregation result. Because of this property, constructing an effective homomorphic message identity code while balancing security against energy conservation is a difficult problem. K. Minami, A. J. Lee, M. Winslett, et al. presented an example of a homomorphic MAC algorithm in reference [17]. That scheme supposes that each node has one or more paths to the base station and can establish a security relationship with the base station. Each node can not only establish pair-wise keys with its neighbouring nodes, but can also share pair-wise keys with nodes several hops away through secure patterns. The base station knows the identity, location and topology information of all nodes in the network.
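As an illustration only (not the construction of [17]), the following Python sketch shows one simple way such a homomorphic tag can be realized: tags of the form g^(a*x + k) mod M multiply to a tag of the sum under the sum of the keys. The base g, modulus M, group key a and per-node keys are assumptions of this sketch.

    # Illustrative homomorphic MAC sketch (assumed construction, not the one in [17]):
    # tag(x) = g^(a*x + k) mod M, so tag(x) * tag(y) verifies x + y under the key k_x + k_y.

    M = (1 << 61) - 1          # public modulus, assumed prime for the sketch
    g = 3                      # public base (assumption)
    a = 123456789              # group key shared along the path (assumption)

    def pmac(x: int, key: int) -> int:
        """Homomorphic tag of value x under per-node key `key`."""
        return pow(g, (a * x + key) % (M - 1), M)

    def combine(tag1: int, tag2: int) -> int:
        """Aggregate two tags; the result verifies the sum of the tagged values."""
        return (tag1 * tag2) % M

    # Two nodes tag their readings with their own keys.
    k1, k2 = 1111, 2222
    x, y = 17, 25
    aggregated_tag = combine(pmac(x, k1), pmac(y, k2))

    # A verifier that knows both keys checks the aggregate directly.
    assert aggregated_tag == pmac(x + y, k1 + k2)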
We assume that attackers may capture one or more nodes anywhere in the network; once a node is captured, the attacker acquires all of the information stored in it [18-21], including the node ID, the routing information and the stored keys.

B. SIA Scheme
The SIA scheme contains five stages: network initialization, path discovery, data aggregation, data transmission and base station verification.
At the network initialization stage, the nodes are deployed first. After deployment is completed, a shared pair-wise key is established in a secure manner between each node and its neighbouring nodes, so that secure communication can take place among the nodes during data aggregation.
At the path discovery stage, each node finds its upstream and downstream nodes on the path to the base station and stores the relevant node information. Figure 1 gives a path discovery example: in the figure, with t = 3, the points connected by dotted lines are associated nodes; the node closer to the base station is the upstream node of the other, and the node farther from the base station is the downstream node.
At the data aggregation stage, the cluster-head node generates data according to the data request from the base station and sends the data to t in-cluster nodes. Each in-cluster node compares its own data with the received data; if they match, it uses the key it shares with its upstream node to calculate a MAC of the data, which is finally sent back to the cluster-head node.
At the data transmission stage, the cluster-head node waits for the MAC values returned by the t in-cluster nodes. Together with the MAC value calculated by itself, it then generates a MAC table of length t + 1. Finally, the data and the MAC table are sent upward. After an intermediate node receives the data, it first verifies the length of the MAC table and then verifies one of the MAC values. If the verification succeeds, that value is deleted and a newly calculated MAC value is added.
At the base station verification stage, the base station finally verifies the integrity of the data.
The security of the SIA scheme depends on the security threshold t: as long as an attacker cannot capture more than t + 1 nodes, the scheme remains effective. The overhead of the SIA scheme comes from two aspects. First, while each node transfers data, it must add one original MAC and t + 1 verified MAC values; the overhead of this aspect depends on the security threshold t. Second, the node must also transmit its node ID while transferring data. This
overhead depends on path stability: as long as the path does not change, the node does not need to send its node ID each time. Since the SIA scheme verifies data while it is being transferred, errors can be found as early as possible, so the transmission overhead caused by forwarding erroneous data is reduced.
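A minimal sketch of the per-hop MAC-table handling described for the data transmission stage above. HMAC-SHA256 and the list-based table are assumptions of this sketch; SIA [16] does not prescribe these concrete choices.

    import hmac, hashlib

    def mac(key: bytes, data: bytes) -> bytes:
        # Assumed MAC primitive; SIA does not fix a concrete algorithm here.
        return hmac.new(key, data, hashlib.sha256).digest()

    def forward_at_intermediate_node(data: bytes, mac_table: list, t: int,
                                     key_assoc_downstream: bytes,
                                     key_assoc_upstream: bytes) -> list:
        """Verify one MAC, drop it, add a fresh MAC, then forward the table upward."""
        if len(mac_table) != t + 1:                              # check the table length first
            raise ValueError("malformed MAC table")
        expected = mac(key_assoc_downstream, data)               # the one MAC this node can check
        if not hmac.compare_digest(mac_table[-1], expected):
            raise ValueError("MAC verification failed: drop the packet early")
        new_table = mac_table[:-1]                               # delete the verified value ...
        new_table.insert(0, mac(key_assoc_upstream, data))       # ... and add a newly computed one
        return new_table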
III. DATA AGGREGATION SECURITY SCHEME BASED ON DATA INTEGRITY

A. Network Initialization
The network initialization stage mainly contains the following steps:
(1) Establish the data aggregation tree: First, the nodes in the network are divided into n clusters and each cluster elects a cluster-head node (denoted CH_i). Then, based on the n cluster-head nodes, the TAG algorithm [22] is used to establish the data aggregation tree. As shown in Figure 2, on the established tree the intermediate nodes are the cluster-head nodes of the clusters, and the nodes within each cluster constitute the leaf nodes of the data aggregation tree.

Figure 2. Establishing the data aggregation tree: (a) node clustering around the base station; (b) data aggregation tree construction (cluster heads CH1, CH2, ..., CHn and nodes A1, A2, ..., Ai).

Association discovery algorithm:
    BS sends a testing message composed of t + 1 IDs: {BS_1, BS_2, BS_3, ..., BS_(t+1)}
    for each of the N nodes in the aggregation tree
        if the testing message is sent to node i by its parent then
            take the latest ID as the ID of the upstream node
            if node i belongs to cluster CH_i then
                the j-th witness node of cluster i is associated to the ID at place t + 1 - j in the testing message
            else
                delete the latest ID in the testing message
                put the ID of node i at the first place of the testing message
                rebroadcast the testing message to the children nodes
            end if
        end if
    end for

(2) Upstream and downstream node discovery: After the data aggregation tree has been established, each cluster head i selects t supervision nodes and the relevant node information is reported to the base station or to the cluster-head nodes. Each node then finds out its upstream and downstream nodes. The pseudo-code above presents the node discovery algorithm. It is similar to the discovery algorithm in [23]; the difference is that we assume there are t + 1 identities at the base station, which are associated with the nearest t + 1 layers of nodes.

B. Key Distribution and Data Preparation
After the data aggregation tree has been established and the upstream and downstream nodes have been found, the keys are distributed. Key distribution consists of two steps:
Establish pair-wise keys: In order to prevent replay attacks and guarantee a smooth cross-validation process, each node individually shares a key seed K with its upstream node. In each round of data aggregation, the node uses the public one-way function H_K() to generate a new key. A node can also establish a key with a node several hops away by following the key establishment algorithms referred to in [24, 25].
Establish group key: Each intermediate node selects a group key a, which is sent to all of its downstream nodes.
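A minimal sketch of the per-round key derivation used in the pair-wise key step above, assuming SHA-256 plays the role of the public one-way function H_K() and a simple counter identifies the round; these concrete choices are assumptions of the sketch, not requirements of the scheme.

    import hashlib

    def round_key(seed: bytes, round_index: int) -> bytes:
        """Derive the key for one aggregation round from the shared seed K.
        Using a fresh key per round prevents straightforward replay of old PMACs."""
        return hashlib.sha256(seed + round_index.to_bytes(4, "big")).digest()

    # A node and its upstream node, both holding the same seed, derive the same key
    # for round 7 without any further communication.
    seed_K = b"pairwise-seed-shared-at-deployment"   # placeholder value
    assert round_key(seed_K, 7) == round_key(seed_K, 7)
    assert round_key(seed_K, 7) != round_key(seed_K, 8)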
At the stage of data preparation, the base station sends a data query request to all nodes in the network, and each cluster-head node instructs its in-cluster nodes to report data. After the cluster-head node has collected all updated data, it first performs data aggregation. The aggregation result is then sent to the t supervision nodes, which verify it. If the verification passes, each supervision node uses its pair-wise key and the group key to calculate a PMAC value over the aggregation result and sends this value to the cluster-head node. The verification method used by the supervision nodes on the aggregation result is as follows (a compact code sketch is given after these steps):
Step 1: All leaf nodes send their sensed data to the cluster-head node; meanwhile, every leaf node also receives the data of the other nodes;
Step 2: The cluster-head node aggregates all received data and broadcasts the aggregation result to all leaf nodes;
Step 3: Every leaf node compares the broadcast aggregation result with its own calculated result. If they are the same, the leaf node uses the key it shares with the base station to calculate a MAC of the aggregation result and sends this value to the cluster-head node;
Step 4: The cluster-head node performs an XOR operation over all received MAC values, and sends the XOR result together with the aggregation result to the base station;
Step 5: The base station recalculates the verification value for the aggregation result and compares it with the verification code uploaded by the cluster-head node, to determine whether the data is accurate.
This method is simple and effective and has good resilience and security, but it places higher requirements on the network topology and therefore has certain restrictions in practice.
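The five verification steps can be condensed into the following sketch, assuming HMAC-SHA256 as the MAC and byte-wise XOR as the combination at the cluster head; the keys and readings are placeholder values introduced only for the illustration.

    import hmac, hashlib
    from functools import reduce

    def mac(key: bytes, value: int) -> bytes:
        return hmac.new(key, str(value).encode(), hashlib.sha256).digest()

    def xor(a: bytes, b: bytes) -> bytes:
        return bytes(x ^ y for x, y in zip(a, b))

    leaf_keys = {1: b"k1", 2: b"k2", 3: b"k3"}   # keys each leaf shares with the base station
    readings  = {1: 10,    2: 12,    3: 11}      # step 1: leaves report their readings

    aggregate = sum(readings.values())           # step 2: cluster head aggregates and broadcasts

    # Step 3: each leaf re-checks the broadcast aggregate against its own view
    # (here trivially the same sum) and, if it matches, MACs the aggregate.
    leaf_macs = [mac(k, aggregate) for k in leaf_keys.values()]

    # Step 4: the cluster head XORs the returned MACs and forwards them with the aggregate.
    combined = reduce(xor, leaf_macs)

    # Step 5: the base station, knowing every leaf key, recomputes and compares.
    expected = reduce(xor, (mac(k, aggregate) for k in leaf_keys.values()))
    assert expected == combined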
C. Data Aggregation and Information Filtering
We now describe the task of each node at the stage of data aggregation and information filtering, for the cluster-head node, the intermediate node and the base station respectively.
Cluster-head node operation: After the t supervision nodes return their PMAC values, the cluster-head node likewise calculates a PMAC value based on its own key. These t + 1 PMAC values form the PMAC table. Finally, the data aggregation result, the PMAC table and the ID table are sent to the upper node (the ID table includes the IDs of the cluster-head node and the supervision nodes); a sketch of this packet assembly is given after this subsection.
Intermediate node operation: After an intermediate node receives data from multiple child nodes, the received data and PMAC tables are aggregated to obtain the aggregated data and the aggregated PMAC table. Then, according to the downstream nodes appearing in each ID table, the intermediate node computes the PMAC value of the aggregated data and compares it with the last PMAC value in the aggregated PMAC table, to verify the data aggregation result. Once verification succeeds, the intermediate node first deletes the last PMAC value in the PMAC table and then computes a new PMAC value, which is added at the first position of the PMAC table. The aggregated ID table is then updated: the downstream node IDs are deleted and the node's own ID is added. Finally, the data aggregation result, the PMAC table and the ID table are sent to the upper node.
Base station operation: The base station operates in the same way as an intermediate node. However, since the base station is associated with t + 1 layers of nodes, it needs to verify t + 1 PMAC values computed under different identities.
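A small sketch of the cluster-head packet assembly described above; the dictionary layout and the ordering of the PMAC table are assumptions of this sketch, not a specification from the scheme.

    def build_cluster_head_packet(aggregate: int,
                                  supervision_pmacs: list,   # t PMAC values returned by supervisors
                                  supervision_ids: list,     # the supervisors' node IDs
                                  own_pmac: int,
                                  own_id: str) -> dict:
        """Assemble the {data, PMAC table, ID table} message sent upward by a cluster head.
        The PMAC table holds t + 1 entries; placing the head's own tag first is an
        assumption of this sketch."""
        pmac_table = [own_pmac] + list(supervision_pmacs)      # length t + 1
        id_table = [own_id] + list(supervision_ids)
        return {"m": aggregate, "PMAClist": pmac_table, "IDlist": id_table}

    packet = build_cluster_head_packet(42, [101, 102, 103], ["WN11", "WN12", "WN13"], 100, "CH1")
    assert len(packet["PMAClist"]) == 3 + 1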
Figure 3. Example of the network model. The base station is at the root; each node is labelled with its associated ID: [A1, BS4], [A2, BS3], [A3, BS2], [A4, BS1], [A5, A1], [CH1, A2], [CH2, A2], [WN11, A3], [WN21, A3], [WN12, A4], [WN22, A4], [WN13, A5], [WN23, A5].
Suppose the security threshold t = 3; we explain the operation steps of an intermediate node using the example topology of Figure 3:
(1) Cluster-head nodes CH1 and CH2 send data to intermediate node A5:
CH1 -> A5 : {m_CH1, IDlist_CH1, PMAClist_CH1}
CH2 -> A5 : {m_CH2, IDlist_CH2, PMAClist_CH2}
(2) A5 verifies that there are t + 1 PMAC values in PMAClist_CH1 and PMAClist_CH2, and examines whether IDlist_CH1 and IDlist_CH2 contain its downstream nodes;
(3) The following operations are executed:
a) Compute m_A5 = m_CH1 + m_CH2;
b) Compute PMAClist_A5 = PMAClist_CH1 * PMAClist_CH2 = {(PMAC_i1, ..., PMAC_i,t+1) * (PMAC_j1, ..., PMAC_j,t+1)} = (PMAC_i1 * PMAC_j1 mod M, ..., PMAC_i,t+1 * PMAC_j,t+1 mod M);
c) Compute X = PMAC(m_A5, a, K_WN13 + K_WN23, M);
d) Compute PMAC_A5 = PMAC(m_A5, a, K_A5, M);
e) Compare X with the last PMAC value in PMAClist_A5 to verify data integrity. If verification fails, an error alarm is sent; otherwise continue with the following steps;
f) Delete the last PMAC value in PMAClist_A5 and put PMAC_A5 at the first place in PMAClist_A5;
g) Compute IDlist_A5 = IDlist_CH1 ∪ IDlist_CH2, then delete the ID marks WN13 and WN23 from IDlist_A5 and add the ID A5 to it;
(4) Upload the data: A5 -> A4 : {m_A5, IDlist_A5, PMAClist_A5}.
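The worked example can be condensed into the following sketch. The PMAC construction used here is the illustrative exponent-based tag from Section II-A, and the field names, the placement of the witness tags at the end of each child's PMAC list, and the combination of keys by addition are assumptions of this sketch rather than definitions taken from the scheme.

    G = 3                          # public base (assumption of the sketch)
    M = (1 << 61) - 1              # public prime modulus (assumption of the sketch)

    def pmac(m: int, a: int, key: int) -> int:
        # Illustrative homomorphic tag: pmac(m1, a, k1) * pmac(m2, a, k2) mod M
        # equals pmac(m1 + m2, a, k1 + k2), which is the property exercised below.
        return pow(G, (a * m + key) % (M - 1), M)

    def aggregate_at_A5(pkt_ch1: dict, pkt_ch2: dict, a: int,
                        k_wn13: int, k_wn23: int, k_a5: int, t: int = 3) -> dict:
        for pkt in (pkt_ch1, pkt_ch2):
            assert len(pkt["PMAClist"]) == t + 1                      # step (2): length check
        m_a5 = pkt_ch1["m"] + pkt_ch2["m"]                            # step (3a)
        pmaclist = [(x * y) % M for x, y in                           # step (3b): element-wise product mod M
                    zip(pkt_ch1["PMAClist"], pkt_ch2["PMAClist"])]
        x_check = pmac(m_a5, a, k_wn13 + k_wn23)                      # step (3c)
        pmac_a5 = pmac(m_a5, a, k_a5)                                 # step (3d)
        if x_check != pmaclist[-1]:                                   # step (3e): integrity check
            raise ValueError("PMAC verification failed: raise an alarm")
        pmaclist = [pmac_a5] + pmaclist[:-1]                          # step (3f): rotate the table
        idlist = [i for i in pkt_ch1["IDlist"] + pkt_ch2["IDlist"]
                  if i not in ("WN13", "WN23")] + ["A5"]              # step (3g): update the ID list
        return {"m": m_a5, "IDlist": idlist, "PMAClist": pmaclist}    # step (4): forward to A4

    # Minimal usage: each child's last PMAC is the tag its witness computed on the child's data.
    a, k_wn13, k_wn23, k_a5 = 97, 11, 22, 33
    pkt1 = {"m": 40, "IDlist": ["CH1", "WN11", "WN12", "WN13"],
            "PMAClist": [pmac(40, a, 5), pmac(40, a, 6), pmac(40, a, 7), pmac(40, a, k_wn13)]}
    pkt2 = {"m": 60, "IDlist": ["CH2", "WN21", "WN22", "WN23"],
            "PMAClist": [pmac(60, a, 5), pmac(60, a, 6), pmac(60, a, 7), pmac(60, a, k_wn23)]}
    out = aggregate_at_A5(pkt1, pkt2, a, k_wn13, k_wn23, k_a5)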
IV. PERFORMANCE EVALUATIONS
A. Safety Performance Analysis
The DID scheme adopts the idea of the homomorphic message identity code to verify the integrity of the data aggregation result. Without knowledge of the keys, attackers cannot produce a correct security verification code. Meanwhile, since a per-round key mechanism is used, each round of data aggregation uses a different key, so attackers also cannot mount replay attacks. Similar to the SIA protocol, while guaranteeing data integrity verification, the DID scheme provides good resilience and can tolerate a certain number of nodes being captured. The DID scheme becomes invalid only under the following conditions:
Attackers capture the cluster-head node and the t supervision nodes of the same cluster: in this case, attackers can forge data and generate a normal message identity code;
Attackers capture t + 1 consecutive nodes on the same path in the data aggregation tree: in this case, attackers can tamper with data and generate a normal message identity code.
Towards the first case, a rotation strategy for the supervision nodes can be adopted, which increases the difficulty of forging data. Towards the second case, since sensor nodes are usually distributed over a large field in a practical deployment, it is also difficult for attackers to capture t + 1 consecutive layers of nodes on the same path.
As with the SIA protocol, the security overhead of the DID scheme comes from two aspects. First, an intermediate node needs to add t + 1 PMAC values during data transmission; the size of a PMAC value depends on the large number M, and usually we can suppose that it is 1 to 2 bits. Second, overhead comes from transmitting the node IDs. In a practical deployment, after the network topology has stabilized, only the IDs of invalid nodes need to be conveyed, which obviously reduces the transmission and storage overhead.

B. Aggregation Performance Evaluation
We assume the sensor nodes are distributed over a 50 × 50 m2 area and the network is composed of 240 nodes. First we take the effect caused by malicious nodes into account. The ratio of normal sensor nodes captured and turned into malicious nodes is 10%-50%. The rate of successful detection of malicious nodes is shown in Figure 4. It can be seen that when the proportion of malicious nodes is small (less than 30%), the detection rate is rather high and the algorithm performs well. When the proportion of malicious nodes is large (more than 40%), the detection rate decreases obviously and the algorithm loses its effect. That is because the distribution of the sensed data deviates from the assumption when a large number of malicious nodes send forged data, which causes errors in the improved method of this paper.

Figure 4. Detection rate of malicious nodes.

Besides the number of malicious nodes, the node density is another factor that influences the detection rate. If the distribution range of the sensor nodes is fixed, the number of nodes represents the node density. When the density of sensor nodes in the network increases, the information redundancy in the data gathered by neighbouring nodes becomes larger; the values calculated by the nodes are then more accurate and the detection rate of malicious nodes is higher. Otherwise, the detection rate is lower.
From Figure 5 we can conclude that the evaluation of the aggregation results given by the nodes is low at the initial stage, when the data starts to be sent out. Since the nodes are not yet stable during the initial operation of the algorithm, the malicious nodes have not been detected, so the evaluation of the aggregation results is low. With the increase of the number of rounds of data sending, the malicious nodes are detected continuously and the malicious data is dropped, so the evaluation of the data aggregation results gets higher correspondingly.

Figure 5. Opinion of the final data aggregation results (evaluation value versus the number of rounds).
C. Transmitting Overhead
The SIA and DID schemes are simulated under different conditions in which the safety threshold and the number of nodes vary. Only the transmission overhead of data at the application layer is considered; the overhead of data aggregation tree construction and key distribution is ignored, because in an actual deployment the data aggregation tree and the keys do not need to be re-established each time. Simulation A sets the initial data size of each node to 127 (7 bit) and the result is shown in Figure 6(a); simulation B sets the initial data size of each node to 1023 (10 bit) and the result is shown in Figure 6(b); simulation C sets the initial data size of each node to 65535 (16 bit) and the result is shown in Figure 6(c). From these figures we can see that, at the application layer, the data transmission amount of DID is less than that of SIA, and the difference becomes more obvious as the safety threshold increases. In the three simulations, the data transmission amount of SIA depends on the threshold t, whereas the theoretical maximum data size carried by DID is log2(number of nodes × data range) bits, so the advantage of DID is more obvious when the data amount is large. Since DID and SIA both adopt multi-hop data transmission, the underlying overhead of the two schemes is almost the same under the same network configuration; DID therefore has better performance than SIA in terms of network data transmission overhead.
We also compare the application-layer transmission quantity of the DID scheme with that of the classical SIA scheme. It is supposed that the data aggregation tree is a complete ternary tree with 7 layers and 3279 nodes, the initial value range of each node's data is 0-1023 (10 bit), and the message identity code size is 7 bit. Here we only consider the amount of data transmission; the overhead of establishing the data aggregation tree and of ID transmission can be ignored.
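As a quick check of the bound quoted above (the aggregate carried by a DID node never exceeds the number of nodes times the per-node data range), a two-line computation under the simulation parameters just given:

    import math

    nodes = 3279          # complete ternary tree with 7 layers, as in the simulation above
    data_range = 1023     # 10-bit initial readings
    max_aggregate_bits = math.ceil(math.log2(nodes * data_range))
    print(max_aggregate_bits)   # 22: the data field of a DID packet never needs more bits than this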
Table I describes the specific simulation results. From Table I we can see that the network transmission overhead of the DID scheme is far smaller than that of the classical data aggregation security scheme SIA. Furthermore, the network overhead is more balanced and the pressure on the intermediate nodes is far smaller, which means that the DID scheme is more effective when the data quantity is large.

TABLE I. A COMPREHENSIVE COMPARISON
Layer   Amount of nodes   Overhead of DID (bit)   Overhead of SIA (bit)   No security scheme (bit)
1       3                 144                     246                     60
2       9                 414                     711                     162
3       27                1215                    1971                    459
4       81                3483                    5103                    1215
5       243               10206                   15552                   3402
6       729               29160                   12282                   8748
7       2187              37179                   37179                   21870
Total   3279              81801                   103044                  35916
Figure 6. Transmitting overhead comparison: (a) initial data is 7 bit; (b) initial data is 10 bit; (c) initial data is 16 bit.
V. CONCLUSION

According to the data aggregation security requirements of wireless sensor networks, and combined with the features of sensor networks, this paper systematically analyzes the SIA scheme. It finds that the defect of the SIA scheme is that it cannot provide an effective message identity code for the data aggregation results. Therefore, we present a novel data aggregation security protocol based on data integrity, the DID scheme. DID expands the original scheme from a single-path data filtering and intrusion detection protocol to a hierarchical integrity identification scheme for in-network data aggregation. Meanwhile, it allows the intermediate nodes and the base station to verify the integrity of the data aggregation results. It not only strengthens WSN data security but also reduces the transmission overhead and storage consumption of the original scheme.
With further research, there are still problems to be solved. Future work will be carried out along the following lines: enhance the scheme's resilience and further reduce the influence of node capture on the security scheme; reduce the additional cost brought by the security scheme and realize an effective combination of energy saving and security; and improve the adaptability of the algorithm so that it can be better applied to networks with changing topology and the transmission cost of node IDs can be further reduced. In addition, how to construct security schemes for other data aggregation operations will be taken as our research focus in the future.

REFERENCES
[1] Y. Wang, G. Attebury, B. Ramamurthy, "A survey of security issues in wireless sensor networks", Communications Surveys & Tutorials, vol. 8, no. 2, pp. 2-23, 2006.
[2] HUANG Manguo, FAN Shang-chun, ZHENG De-zhi, "Research progress of multi-sensor data fusion technology", Transducer and Microsystem Technologies, vol. 37, no. 3, pp. 1060-1068, 2010.
[3] Zhu Zejun, Huang Tao, Liu Xixia, "Current Research Status and Its Development Direction of Multi-sensor Data Fusion Technology", Ship Electronic Engineering, vol. 8, no. 2, pp. 52-55, 2009.
[4] YANG Geng, WANG An-Q, CHEN Zheng-Yu, "An Energy-Saving Privacy-Preserving Data Aggregation Algorithm", Chinese Journal of Computers, vol. 42, no. 5, pp. 2211-2219, 2011.
[5] Avinash Sridharan, Bhaskar Krishnamachari, "Maximizing network utilization with max-min fairness in wireless sensor networks", Wireless Networks, vol. 14, no. 5, pp. 585-600, 2008.
[6] Ke Xu, Wen Cui, Jun Tie, "An Algorithm for Detecting Group in Mobile Social Network", Journal of Networks, vol. 7, no. 10, pp. 1584-1591, 2012.
[7] W. R. Heinzelman, A. Chandrakasan, H. Balakrishnan, "Energy-efficient communication protocol for wireless microsensor networks", In: Proc. of the Hawaii International Conference on System Sciences. Washington: IEEE Computer Society Press, pp. 1-10, 2000.
[8] Ozdemir Suat, Xiao Yang, "Integrity protecting hierarchical concealed data aggregation for wireless sensor networks", Computer Networks, vol. 55, no. 8, pp. 1735-1746, 2011.
[9] Huawang Shi, "Integration of Unascertained Method with Neural Networks and Its Application", Journal of Networks, vol. 6, no. 11, pp. 1631-1638, 2011.
[10] Bagaa Miloud, Challal Yacine, Ouadjaout Abdelraouf, "Efficient data aggregation with in-network integrity control for WSN", Journal of Parallel and Distributed Computing, vol. 72, no. 10, pp. 1157-1170, 2012.
[11] Galluccio Laura, Palazzo Sergio, Campbell Andrew T., "Modeling and designing efficient data aggregation in wireless sensor networks under entropy and energy bounds", International Journal of Wireless Information Networks, vol. 16, no. 3, pp. 175-183, 2009.
[12] Yen H.-H., Lin C.-L., "Integrated channel assignment and data aggregation routing problem in wireless sensor networks", IET Communications, vol. 3, no. 5, pp. 784-793, 2009.
[13] Wei Guiyi, Ling Yun, Guo Binfeng, "Prediction-based data aggregation in wireless sensor networks: Combining grey model and Kalman Filter", Computer Communications, vol. 34, no. 6, pp. 793-802, 2011.
[14] Di Pietro Roberto, Michiardi Pietro, Molva Refik, "Confidentiality and integrity for data aggregation in WSN using peer monitoring", Security and Communication Networks, vol. 2, no. 2, pp. 181-194, 2009.
[15] Bista Rabindra, Kim Yong-Ki, Song Myoung-Seon, "Improving data confidentiality and integrity for data aggregation in wireless sensor networks", IEICE Transactions on Information and Systems, vol. E95-D, no. 1, pp. 67-77, 2012.
[16] S. C. Zhu, S. Setia, S. Jajodia, et al., "An Interleaved Hop-by-Hop Authentication Scheme for Filtering of Injected False Data in Sensor Networks", In: Proc. of the 2004 IEEE Symposium on Security and Privacy. Washington: IEEE Computer Society Press, pp. 259-271, 2004.
[17] K. Minami, A. J. Lee, M. Winslett, et al., "Secure aggregation in a publish-subscribe system", In: Proc. of the 7th ACM Workshop on Privacy in the Electronic Society. New York: ACM Press, pp. 95-104, 2008.
[18] XU Jin-hong, WANG Wei-ping, "Study on Typical Attack Models and Security Strategies in WSN", Communications Technology, vol. 13, no. 12, pp. 69-72, 2009.
[19] ZHAN Yong-zhao, RAO Jing-yi, WANG Liang-min, "Security Evaluation Model of WSN Based on Routing Attack Effect", Computer Science, vol. 28, no. 7, pp. 812-816, 2010.
[20] LIU Gang, LIU Jun-li, YAN Jun-song, "Extending probabilistic packet marking algorithm to detect DDoS in WSN", Application Research of Computers, vol. 20, no. 6, pp. 297-300, 2010.
[21] Kalyan Kumar Debnath, Sultan Uddin Ahmed, Md. Shahjahan, "A Paper Currency Recognition System using Negatively Correlated Neural Network Ensemble", Journal of Multimedia, vol. 5, no. 6, pp. 560-568, 2010.
[22] H. Çam, S. Özdemir, H. O. Sanli, "Secure Differential Data Aggregation for Wireless Sensor Networks", Sensor Network Operations, vol. 19, no. 2, pp. 422-441, 2004.
[23] WANG Xiao-dong, SUN Yan-qiang, MENG Xiang-xu, "Cluster-based Defending Mechanism for Sybil Attacks in Wireless Sensor Network", vol. 31, no. 15, pp. 132-136, 2009.
[24] D. G. Liu, P. Ning, "Establishing Pairwise Keys in Distributed Sensor Networks", In: Proc. of the 10th ACM Conference on Computer and Communications Security. New York: ACM Press, pp. 41-77, 2003.
[25] Zhu WenTao, Gao Fei, Xiang Yang, "A secure and efficient data aggregation scheme for wireless sensor networks", Concurrency and Computation: Practice and Experience, vol. 23, no. 12, pp. 1414-1430, 2011.