SECURE DEDUPLICATION OF DATA IN CLOUD STORAGE


1 Babaso D. Aldar, 2 Vidyullata Devmane
1,2 Computer Engg. Dept., Shah and Anchor Kutchhi Engg. College, Mumbai University
1 babaaldar@gmail.com, 2 devmane.vidyullata@gmail.com

Abstract

Cloud storage is a service model in which data is maintained, backed up remotely, and made available to users over a network. Deduplication is a method for eliminating duplicate copies of repeating data; it is used in cloud storage to reduce the amount of storage space consumed and to make data management scalable. To better protect the data, deduplication should be authorized: unlike conventional deduplication systems, the differential privileges of users are considered in the duplicate check, and a user is only allowed to perform the duplicate check for files marked with the corresponding privileges. The system can be further strengthened by encrypting files with differential privilege keys. To protect the privacy of sensitive data while supporting deduplication, convergent encryption can be used to encrypt the data before outsourcing: the data is encrypted with a convergent key, which is the hash value of the data itself.

Keywords

Deduplication, Convergent encryption, Authorized duplicate check, Privacy, Authenticity.

I. INTRODUCTION

Storage as a service is a business model in which a cloud service provider rents highly available storage space and massively parallel computing resources to smaller companies or individuals. The advent of cloud storage motivates enterprises and organizations to outsource data storage to third-party cloud providers. An increasing quantity of data is being stored in cloud storage and shared by users with particular privileges, which define the access rights to the stored data.
Google Drive is a familiar example of cloud storage. A major problem for cloud storage services is managing the ever-increasing quantity of data, and deduplication is a technique that makes this management scalable. Data deduplication is a data compression method for eliminating duplicate copies of repeated data in storage [1]. Ideally, IT organizations would protect only the distinctive data in their backups: instead of saving everything again and again, only new or unique data is saved [2]. Data deduplication is also known as single instancing or intelligent compression. Instead of keeping multiple data copies with the same content in cloud storage, deduplication keeps only one physical copy of the data and refers other, redundant data to that copy, which improves storage utilization. Previous deduplication systems cannot support differential-authorization duplicate check [2], which is important in many applications. In an authorized deduplication system, each user is given a set of privileges during system initialization, and each data file uploaded to the storage server is also bound to a set of privileges identifying which users are permitted to perform the duplicate check and access the file. Every data file at the cloud site has a unique token. The token of a new file is compared with the tokens already stored at the cloud site: if it matches an existing token, the file is a duplicate; otherwise, the new file together with its token information is stored at the cloud site. To solve the problem of privacy-preserving deduplication of data in cloud storage, the deduplication system should support differential authorization and authorized duplicate check.
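The token-comparison idea above can be sketched with a toy in-memory store. The SHA-256 token and the class and method names are illustrative only; the paper's actual system derives tokens through a trusted third party, as described later.

```python
import hashlib

class DedupStore:
    """Toy cloud store: keeps one physical copy per unique token."""
    def __init__(self):
        self.blobs = {}       # token -> file content (one physical copy)
        self.references = {}  # file_id -> token (logical files point at a copy)

    def upload(self, file_id, data):
        token = hashlib.sha256(data).hexdigest()
        if token not in self.blobs:       # token unseen: store the physical copy
            self.blobs[token] = data
        self.references[file_id] = token  # duplicate: only add a reference
        return token

store = DedupStore()
t1 = store.upload("alice/report.doc", b"quarterly results")
t2 = store.upload("bob/copy.doc", b"quarterly results")  # same content
assert t1 == t2                # duplicate detected by token comparison
assert len(store.blobs) == 1   # only one physical copy is kept
assert len(store.references) == 2
```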
In differential authorization [2], each authorized user is able to obtain individual tokens for his/her files, based on his/her privileges, to perform the duplicate check. In authorized duplicate check [2], the authorized user generates a query for a certain file according to the privileges he/she owns, with the help of a trusted third party, while the cloud server performs the duplicate check directly and tells the user whether there is any duplicate.

Babaso D. Aldar and Vidyullata Devmane, ijesird, Vol. III, Issue III, September 2016

The security requirements considered are the security of the file token and the security of the data file. Unauthorized users without the proper privileges or the file should be prevented from getting or generating file tokens for the duplicate check of any file stored at the cloud service provider. Any user who has not queried the trusted third party for a file token must not be able to extract any useful information from the token, including file information or privilege information. For the protection of the data file, unauthorized users without the appropriate privileges or files, including the cloud service provider and the trusted third party, should be prevented from accessing the encrypted file stored at the cloud service provider. The convergent encryption [2] key is the hash value of the file itself and is used to encrypt the file. Some commercial cloud storage providers, such as Bitcasa, also deploy convergent encryption [1], [2]. Security and privacy are maintained by uploading encrypted data files and, when needed, downloading the encrypted files and decrypting them with a symmetric encryption algorithm. Deduplication can also decrease the amount of network bandwidth required for backup processes and, in some cases, speed up backup and recovery. Deduplication is actively used by a number of cloud backup providers (e.g. Bitcasa) as well as various cloud service providers (e.g. Dropbox) [1]. As Jin et al. [2] note, today's commercial cloud storage services, such as Mozy and Memopal, apply deduplication to user data to save maintenance cost. Conventional encryption requires different users to encrypt their data with their own keys, so identical data copies of different users lead to different ciphertexts, making deduplication impossible [3].
The solution for balancing secrecy and efficiency in deduplication, called convergent encryption, was described by M. Bellare et al. [4]. It has been proposed to enforce data privacy while enabling deduplication: it encrypts/decrypts a data file with a convergent key, which is derived by computing the cryptographic hash of the content of the data copy itself.

II. REVIEW OF LITERATURE

There are several types of data deduplication: inline deduplication, post-process deduplication, source-based deduplication, target-based deduplication, and global deduplication.

Inline deduplication is performed at the time data is stored on the storage system. It reduces the raw disk space needed and limits the total amount of data ultimately transferred to backup storage, at the cost of increased CPU overhead in the production environment. This technique allows most of the hard work to be done in RAM, which minimizes I/O overhead.

Post-process deduplication refers to systems where software filters the redundant data from a data set only after it has been transferred to the storage location. The backup data is written into a disk cache before the deduplication process starts; the system does not necessarily write the full backup to disk first, since deduplication can begin once data starts to hit the disk. Because the deduplication process is separate from the backup process, data can be deduplicated outside the backup window without degrading backup performance. Table I summarizes the operations of the inline and post-process methods.

TABLE I. SUMMARY OF THE INLINE AND POST-PROCESS METHODS

Source-based (client-side) deduplication uses client software to compare new data blocks on the primary storage device with previously backed-up data blocks [5]. It differs from all other forms of deduplication in that duplicate data is identified before it has to be sent over the network: previously stored data blocks are not transmitted. Source-based deduplication therefore uses less bandwidth for data transmission; it places more burden on the client CPU but reduces the load on the network. In Fig. 1 there are three copies each of files A and B, so only the unique data (one copy of file A and one copy of file B) is sent over the network to cloud storage; network bandwidth is saved at the cost of higher client CPU utilization. This approach requires deduplication-aware backup software on each client, whether the clients are in the local data center, in a remote data center, or even laptops. The client software determines that a file already exists on the storage server, so there is no need to upload it again.

Fig. 1 Source-Based Data Deduplication [5].

Target-based deduplication [5], [6] removes redundancies from a backup transmission as it passes through an appliance sitting between the source and the target. Target deduplication appliances represent the high end of modern deduplication: they sit between the network and the storage and provide fast, high-capacity deduplication. Unlike source deduplication, target deduplication does not reduce the total amount of data that must be transferred across a WAN or LAN during the backup, but it does reduce the amount of storage space required.

Fig. 2 Target-Based Data Deduplication [5].

Because target deduplication provides faster performance for large (multi-terabyte) data sets, it is typically employed by companies with larger data sets and fewer bandwidth constraints. Data deduplication at the source is best suited to remote offices backing up to a central data centre or the cloud.
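The source-side "ask before you send" check can be sketched as follows; the hash choice and the in-memory stand-in for the backup server are our own illustrative assumptions, not part of any surveyed system.

```python
import hashlib

# Server side: the set of tokens for blocks already backed up.
stored_tokens = set()

def server_has(token):
    return token in stored_tokens

def server_store(token, block):
    # In a real system the full block crosses the network only here.
    stored_tokens.add(token)

# Client side: send the cheap token first; transmit only unique blocks.
def backup(blocks):
    sent = 0
    for block in blocks:
        token = hashlib.sha256(block).hexdigest()
        if not server_has(token):
            server_store(token, block)
            sent += 1
    return sent

# Three copies each of files A and B: only two blocks cross the network.
blocks = [b"file A", b"file B", b"file A", b"file A", b"file B", b"file B"]
assert backup(blocks) == 2
```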
Source deduplication is usually a feature of the backup application, while target deduplication is best suited to reducing very large data sets in the data centre and is usually carried out with a hardware appliance.

Global data deduplication [6] is the process of eliminating redundant data when backing up to more than one deduplication device. This situation might involve backing up to multiple target deduplication systems or, in the case of source deduplication, backing up to multiple backup nodes, each of which itself backs up multiple clients. When data is sent from one node to another, the second node recognizes that the first node already holds a copy of the data and does not make an additional copy.

There are two methods of deduplication: file-level deduplication and block-level deduplication. File-level deduplication eliminates duplicate copies of the same file and is commonly known as single-instance storage. A file to be archived or backed up is compared, by checking all its attributes against an index, with files that have already been stored. If the file is unique, it is stored and the index is updated; if not, only a pointer to the existing stored file is created. As a result, only a single instance of the file is saved, and subsequent copies are replaced with a "stub" that points to the original file [7]. For example, Structure-Aware File and Deduplication (SAFE) [8] uses file-level deduplication for unstructured file types such as multimedia and encrypted content, and Duplicate Data Elimination (DDE) is designed for IBM Storage Tank, a heterogeneous, scalable SAN file system [9].

Block-level deduplication operates at the sub-file level: as the name implies, the file is broken into segments (blocks or chunks) that are examined for redundancy against previously stored information. Fixed-size chunking [7] is conceptually simple and fast; chunks can also be of variable size. The popular approach to detecting redundant data is to assign an identifier to each chunk of data using a hash algorithm. Metadata Aware Deduplication [9] is an accurate deduplication network backup service that works at both the file level and the block level.

III. PROPOSED SYSTEM

The proposed system eliminates duplicate data from cloud storage. Its objective is to solve the problem of deduplication of data with differential privileges in cloud storage, and a cloud architecture is introduced for this purpose. The deduplication system supports differential authorization and authorized duplicate check [10] to solve the problem of privacy-preserving deduplication in cloud storage. In differential authorization, each authorized user is able to get an individual token for his/her file to perform the duplicate check; in authorized duplicate check, the authorized user is able to generate a query for a certain file according to the privileges he/she owns, with the help of the trusted third party, while the cloud server performs the duplicate check directly and tells the user whether there is any duplicate. To get a file token, the user sends a request to the trusted third party, which also checks the user's identity before issuing the corresponding file token.
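Assuming a token function of the form H(F_tag, k_p) (the implementation in Section IV uses HMAC-SHA-1), the TTP's privilege check before token issuance might look like this minimal sketch. All key values and user names are hypothetical.

```python
import hashlib, hmac

# TTP state (illustrative): one secret key per privilege, plus the
# privileges granted to each user at system initialization.
privilege_keys = {"engineering": b"k-eng-secret", "finance": b"k-fin-secret"}
user_privileges = {"alice": {"engineering"}, "bob": {"finance"}}

def issue_token(user, privilege, file_tag):
    """TTP: verify the requester's privilege, then issue H(F_tag, k_p)."""
    if privilege not in user_privileges.get(user, set()):
        raise PermissionError("user lacks this privilege")
    k_p = privilege_keys[privilege]
    return hmac.new(k_p, file_tag, hashlib.sha1).hexdigest()

tag = hashlib.sha1(b"file contents").hexdigest().encode()
t_alice = issue_token("alice", "engineering", tag)
# Same file and same privilege give the same token, so the CSP can
# detect a duplicate by simple token comparison:
assert t_alice == issue_token("alice", "engineering", tag)
# Bob holds a different privilege, so he cannot obtain this token:
try:
    issue_token("bob", "engineering", tag)
    assert False
except PermissionError:
    pass
```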
The authorized duplicate check for a file is performed by the user with the cloud server before uploading the file. Based on the result of the duplicate check, the user either uploads the file or runs Proof of Ownership (PoW). An identification protocol Π = (Proof, Verify) is also defined, where Proof and Verify are the proof and verification algorithms, respectively. The security requirements considered are the security of the file token and the security of the data files.

A. SECURITY OF FILE TOKEN

For the security of the file token, two aspects are defined: unforgeability and indistinguishability of the file token. Unforgeability of the file token (duplicate-check token) [10] requires that unauthorized users without the appropriate privileges or file be prevented from getting or generating file tokens for the duplicate check of any file stored at the Cloud Service Provider (CSP). Indistinguishability of the file token (duplicate-check token) [10] requires that a user who has not queried the trusted third party for a file token cannot extract any useful information from the token, including file information or privilege information.

B. SECURITY OF DATA FILES

Unauthorized users without the appropriate privileges or files, including the CSP and the trusted third party, should be prevented from accessing the underlying plaintext stored at the CSP. The convergent encryption technique [11], [12] has been proposed to encrypt the data before outsourcing it to the cloud storage server. M. Bellare et al. [12] design a system, DupLESS, that combines a CE-type (convergent encryption) scheme with the ability to obtain message-derived keys. It encrypts/decrypts a data copy with a convergent key, which is obtained by computing the cryptographic hash of the content of the data copy. After key generation and data encryption, users retain the keys and send the ciphertext to the cloud server.
Since the encryption operation is deterministic and the key is derived from the data content, identical data copies generate the same convergent key and hence the same ciphertext. A user can download the encrypted file with the pointer from the server, and only the corresponding data owners can decrypt it with their convergent keys. Thus, convergent encryption allows the cloud server to perform deduplication on the ciphertexts, while the proof of ownership prevents unauthorized users from accessing the file.
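A minimal sketch of this property: deriving the key from the content makes encryption deterministic, so identical files encrypt to identical ciphertexts, while per-user random keys (conventional encryption) do not. The hash-based stream cipher below is illustrative only; the prototype in Section IV uses AES.

```python
import hashlib, os

def convergent_key(data):
    return hashlib.sha256(data).digest()  # key derived from the content itself

def keystream(key, n):
    # Toy counter-mode keystream over SHA-256 (illustrative, not AES).
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def encrypt(key, data):
    # XOR stream cipher: encryption and decryption are the same operation.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

data = b"identical file contents"
# Convergent encryption: identical plaintexts yield identical ciphertexts,
# so the server can deduplicate without ever seeing the plaintext.
c1 = encrypt(convergent_key(data), data)
c2 = encrypt(convergent_key(data), data)
assert c1 == c2
# Conventional encryption: random per-user keys make ciphertexts differ,
# which is what makes deduplication impossible.
c3 = encrypt(os.urandom(32), data)
c4 = encrypt(os.urandom(32), data)
assert c3 != c4
# The owner can still recover the plaintext with the convergent key.
assert encrypt(convergent_key(data), c1) == data
```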

C. SYSTEM MODEL

There are three entities in the system: the Owner/User, the trusted third party, and the Cloud Service Provider. The CSP performs deduplication by checking whether the contents of two files are the same and storing only one of them. The access right to a file is defined based on a set of privileges; the exact definition of a privilege varies across applications. Each privilege is represented in the form of a short message called a token, and each file is associated with file tokens, which denote the tag with the specified privileges.

The Owner/User is the entity that wants to outsource data and later access it. The owner uploads only unique data, not duplicates, to save upload bandwidth; the data may be owned by the same user or by different users. Users have access to the trusted third party, which aids in performing deduplicable encryption by generating file tokens for requesting users. Each file is protected with the convergent encryption key and privilege keys to realize authorized deduplication with differential privileges.

The Trusted Third Party (TTP) is an entity introduced to facilitate the user's secure usage of the cloud service. Since the computing resources on the data owner/user side are limited and the cloud server is not a fully trusted party, the TTP provides the data owner/user with an execution environment and infrastructure, working as an interface between the owner/user and the CSP. The secret keys for the privileges are managed by the TTP, which answers file token requests from users; tokens are issued to users by the TTP, and no useful information, including file or privilege information, can be extracted from a token. The interface offered by the TTP allows users to submit files and queries to be securely stored and computed, respectively.

The Cloud Service Provider (CSP) is an entity that provides a data storage service.
The CSP provides the data outsourcing service and stores data on behalf of the owner. To reduce storage cost, the CSP eliminates the storage of redundant data via deduplication and keeps only unique data. The Owner/User can upload files to and download files from the CSP, but the TTP provides the security for that data: only authorized persons can upload and download files from the cloud server.

The tag F_tag of a file F is determined by the owner of F. A secret key k_p is bound to a privilege p to generate a file token. Let μ_{F,p} = TokenGenerate(F_tag, k_p) denote the token of F that may only be accessed by the owner/user with privilege p; in other words, the token μ_{F,p} can only be computed by the owner. If a file has been uploaded by a user with a duplicate token μ_{F,p}, then a duplicate check sent from another user will succeed if and only if that user also has the file F and privilege p. Such a token generation function can easily be implemented as H(F, k_p), where H(.) denotes a cryptographic hash function.

(i) File Uploading. The data owner of a file F who wants to upload F to the cloud server with privilege p first computes the tag F_tag = TagGenerate(F).

Fig. 3 Architecture of Secure Deduplication of Data

After calculating the tag value, the owner sends it to the TTP to compute the token; the token value is unique for a file containing unique data. The generated token is then sent by the owner to the cloud service provider for the duplicate check. If a duplicate is found by the CSP, the user runs the proof of ownership for this file with the CSP; if the proof passes, the user is assigned a pointer that allows him to access the file. Otherwise, if no duplicate is found, the user computes the encrypted file C_F = Encryption_CE(k_c, FILE) with the convergent key k_c = KeyGeneration_CE(FILE) and uploads the encrypted file along with the file ID, token and privileges to the cloud server. The convergent key k_c is stored locally by the user.

Fig. 4 Procedure for file upload

(ii) File Downloading. A user who wants to download a file first obtains the privilege key from the TTP and then requests the file from the cloud server.

Fig. 5 Procedure for file download

The cloud server verifies the user's privileges. If the user is entitled to retrieve the file, the CSP provides the encrypted file; otherwise it aborts the communication with that user. If the user succeeds in downloading the encrypted file, he/she decrypts it using the convergent key stored locally and saves the decrypted file on the local disk.

IV. IMPLEMENTATION

We implement a prototype of the proposed secure deduplication system, in which the three entities are modelled as separate JSP programs. An Owner/User program models the data users carrying out the file upload process; a TTP program models the trusted third party, which handles the file token computation; and a CSP program models the cloud storage server, which stores and deduplicates files. Cryptographic hashing and encryption are implemented with the OpenSSL library [13], and communication between the entities is based on HTTP, so users can issue HTTP POST requests to the servers. Our implementation of the Owner/User provides the following methods to support token generation and deduplication along the file upload process.

FileTag (F): The tag of a file F is generated using the SHA-1 algorithm.

TknReq (Tag, UserID): Requests a token by providing the tag and the user's ID.

DupChkReq (Token): Checks whether the file is already duplicated on the cloud server.

FileEncryptAlgo (F): Encrypts the file F with the AES algorithm. The encryption uses the convergent key, which is the hash value of the original file, generated using SHA-256.

FileUpReq (FileID, F, Token): If the file is not a duplicate on the cloud server, requests that the file be stored there by providing the file ID, the encrypted file, the token and the privilege key.

ShareTknReq (Token, Privilege): Requests the trusted third party to share the file token with other users. The other users receive the file token according to their privilege; using this token and privilege, they can run the duplicate check or access the file.

Our implementation of the TTP includes the corresponding request handlers for token generation.

TknGen (Tag, UserID): The TTP takes the tag value received from the owner/user and generates the token using the HMAC-SHA-1 algorithm.

ShareTkn (Token, Priv): On the owner/user's request, the TTP shares the token among the specified users.

Our implementation of the CSP provides deduplication and data storage, and maintains a map between existing files and their associated tokens.

DupChk (Token): Maps the generated token against existing tokens for the duplicate check. If a duplicate is found, it replies "File Duplicate"; otherwise it replies "No File Duplicate" to the owner/user.

FileStore (FileID, F, Token): Stores the encrypted file F, the file ID and the token, and updates the map.

(A) PROOF OF OWNERSHIP (POW)

The notion of proof of ownership addresses the problem of using a small hash value as a proxy for the entire file in data deduplication [10]. PoW is a protocol that enables users to prove their ownership of data copies to the storage server; it is implemented as an interactive algorithm run by a user and a storage server. The storage server derives a short value from a data copy M.
The user needs to run a proof algorithm with the storage server. The formal security definition for PoW roughly follows the threat model in a content distribution network, where an attacker does not know the entire file but has accomplices who do. To avoid using a small hash value as a proxy for the entire file, the client must prove to the server that it indeed has the file: if the value computed by the user equals the value derived by the storage server, the server accepts; otherwise it aborts the communication.

V. EVALUATION

We conducted testbed experiments on this prototype. Our evaluation covers file token generation, share token generation, convergent encryption and the file upload steps, and we measure the overhead while varying factors such as the file size and the number of stored files. The experiments used three machines, each equipped with an Intel Core 2 Duo 1.66 GHz CPU and 1.5 GB RAM, running Windows 7 Home Premium 64-bit, connected by 100 Mbps LAN. The upload process is divided into six steps: tag generation, token generation, duplicate file check, encryption of the file, transfer of the encrypted file, and sharing of the token with users. For each step we recorded the start and end times and took their difference. The following figure shows the average time taken by each step. To measure the time spent on the different steps, we chose files of different sizes, uploaded 50 files, and recorded the time required for each step. The time spent on tag generation, file encryption and file transfer increases linearly with the file size, because these steps involve the actual file content and file I/O.
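The per-step start/end-time measurement described above can be sketched as follows; the 1 MB test file and the SHA-1 tag step mirror the prototype, while the helper name is our own.

```python
import hashlib, time

def timed(fn, *args):
    """Run one upload step and return (result, elapsed seconds)."""
    start = time.perf_counter()
    result = fn(*args)
    return result, time.perf_counter() - start

data = b"x" * (1 << 20)  # a 1 MB test file
timings = {}
# Tag generation (the prototype uses SHA-1 here); the remaining steps,
# i.e. token generation, duplicate check, encryption, transfer and token
# sharing, would be wrapped the same way.
tag, timings["tag"] = timed(lambda d: hashlib.sha1(d).hexdigest(), data)
assert len(tag) == 40          # SHA-1 hex digest length
assert timings["tag"] >= 0.0
```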

Fig. 6 Time required for different file sizes

Token generation, duplicate check and token sharing use only the metadata (data about the file) for computation, so their time increases slowly compared with the other steps. Token checking for the duplicate check is done with hash tables and, to avoid collisions, token searching was linear.

VI. CONCLUSION

The notion of secure data deduplication is to protect and deduplicate files using the differential privileges of users. Owners/users deduplicate files by calculating the tag value; tokens are calculated by the trusted third party. Using the token concept, our system easily identifies duplicate files and stores only the unique ones. If a person has a file and wants to upload it, but the same file has already been stored by another person, the cloud server runs the proof of ownership protocol to verify the ownership and returns a reference to the stored file. To protect the uploaded files from internal and external attacks, they are encrypted using the convergent encryption technique, with the convergent key derived from the file itself. To avoid brute-force attacks, only authorized users are allowed to issue duplicate check requests. As a proof of concept, we implemented a system according to the proposed secure duplicate check scheme and conducted testbed experiments on it. Secure deduplication is done here at the file level, but it can also be performed at the block level: if two files are duplicates, then their blocks are also the same.

REFERENCES

[1] X. Alphonseinbaraj, "Authorized data deduplication check in hybrid cloud with cluster as a service", November.
[2] Jin Li, Yan Kit Li, Xiaofeng Chen, P. Lee, and W. Lou, "A Hybrid Cloud Approach for Secure Authorised Deduplication", IEEE Transactions on Parallel and Distributed Systems.
[3] Boga Venkatesh, Anamika Sharma, Gaurav Desai, Dadaram Jadhav, "Secure Authorised Deduplication by Using Hybrid Cloud Approach", November.
[4] Pasquale Puzio, Refik Molva, Melek Onen, Sergio Loureiro, "ClouDedup: Secure Deduplication with Encrypted Data for Cloud Storage".
[5] T. Y. J. NagaMalleswari, D. Malathi, "Deduplication Techniques: A Technical Survey", International Journal for Innovative Research in Science & Technology, December.
[6] Pooja S. Dodamani, Pradeep Nazareth, "A Survey on Hybrid Cloud with De-Duplication", International Journal of Innovative Research in Computer and Communication Engineering, December.
[7] Usha Dalvi, Sonali Kakade, Priyanka Mahadik, Arati Chavan, "A Secured and Authenticated Mechanism for Cloud Data Deduplication Using Hybrid Clouds", International Journal of Engineering Research and Review, December.
[8] Kim, Sejun Song, Baek-Young Choi, "SAFE: Structure-Aware File and Deduplication for Cloud-based Storage Systems".
[9] Deepak Mishra, Dr. Sanjeev Sharma, "Comprehensive study of data de-duplication", International Conference on Cloud, Big Data and Trust, November 2013.
[10] Jin Li, Xiaofeng Chen, M. Li, J. Li, P. Lee, and W. Lou, "Secure Deduplication with Efficient and Reliable Convergent Key Management", IEEE Transactions on Parallel and Distributed Systems, June.
[11] P. Gokulraj, K. Kiruthika Devi, "Deduplication Avoidance Using Convergent Key Management in Cloud", International Journal On Engineering Technology and Sciences, Volume I, Issue III, July.
[12] M. Bellare, S. Keelveedhi, and T. Ristenpart, "Message-locked encryption and secure deduplication", IACR Cryptology ePrint Archive.
[13] OpenSSL Project, April.


More information

A Secure and Dynamic Multi-keyword Ranked Search Scheme over Encrypted Cloud Data

A Secure and Dynamic Multi-keyword Ranked Search Scheme over Encrypted Cloud Data An Efficient Privacy-Preserving Ranked Keyword Search Method Cloud data owners prefer to outsource documents in an encrypted form for the purpose of privacy preserving. Therefore it is essential to develop

More information

Cloud security is an evolving sub-domain of computer and. Cloud platform utilizes third-party data centers model. An

Cloud security is an evolving sub-domain of computer and. Cloud platform utilizes third-party data centers model. An Abstract Cloud security is an evolving sub-domain of computer and network security. Cloud platform utilizes third-party data centers model. An example of cloud platform as a service (PaaS) is Heroku. In

More information

Encrypted Data Deduplication in Cloud Storage

Encrypted Data Deduplication in Cloud Storage Encrypted Data Deduplication in Cloud Storage Chun- I Fan, Shi- Yuan Huang, Wen- Che Hsu Department of Computer Science and Engineering Na>onal Sun Yat- sen University Kaohsiung, Taiwan AsiaJCIS 2015 Outline

More information

SDD: A Novel Technique for Enhancing Cloud Security with Self Destructing Data

SDD: A Novel Technique for Enhancing Cloud Security with Self Destructing Data SDD: A Novel Technique for Enhancing Cloud Security with Self Destructing Data Kishore K, Ramchand V M.Tech Student, Dept. of CSE, The Oxford College Of Engineering, Bangalore, India Associate Professor,

More information

A SURVEY ON MANAGING CLOUD STORAGE USING SECURE DEDUPLICATION

A SURVEY ON MANAGING CLOUD STORAGE USING SECURE DEDUPLICATION ISSN: 0976-3104 SPECIAL ISSUE: (Emerging Technologies in Networking and Security (ETNS) Keerthana et al. ARTICLE OPEN ACCESS A SURVEY ON MANAGING CLOUD STORAGE USING SECURE DEDUPLICATION K. Keerthana*,

More information

Group User Revocation in Cloud for Shared Data

Group User Revocation in Cloud for Shared Data Group User Revocation in Cloud for Shared Data Mahesh Salunke, Harshal Meher, Ajay Tambe, Sudir Deshmukh,Prof.Sanjay Agarwal Abstract With the excessive use of internet cloud has received much of the attention.

More information

International Journal of Advance Engineering and Research Development. AN Optimal Matrix Approach for virtual load allocation and data sharing

International Journal of Advance Engineering and Research Development. AN Optimal Matrix Approach for virtual load allocation and data sharing Scientific Journal of Impact Factor (SJIF): 5.71 International Journal of Advance Engineering and Research Development Volume 5, Issue 02, February -2018 e-issn (O): 2348-4470 p-issn (P): 2348-6406 AN

More information

Data Deduplication using Even or Odd Block (EOB) Checking Algorithm in Hybrid Cloud

Data Deduplication using Even or Odd Block (EOB) Checking Algorithm in Hybrid Cloud Data Deduplication using Even or Odd Block (EOB) Checking Algorithm in Hybrid Cloud Suganthi.M 1, Hemalatha.B 2 Research Scholar, Depart. Of Computer Science, Chikkanna Government Arts College, Tamilnadu,

More information

Secure Role-Based Access Control on Encrypted Data in Cloud Storage using ARM

Secure Role-Based Access Control on Encrypted Data in Cloud Storage using ARM Secure Role-Based Access Control on Encrypted Data in Cloud Storage using ARM Rohini Vidhate, V. D. Shinde Abstract With the rapid developments occurring in cloud computing and services, there has been

More information

Efficient Auditable Access Control Systems for Public Shared Cloud Storage

Efficient Auditable Access Control Systems for Public Shared Cloud Storage Efficient Auditable Access Control Systems for Public Shared Cloud Storage Vidya Patil 1, Prof. Varsha R. Dange 2 Student, Department of Computer Science Dhole Patil College of Engineering, Pune, Maharashtra,

More information

Lamassu: Storage-Efficient Host-Side Encryption

Lamassu: Storage-Efficient Host-Side Encryption Lamassu: Storage-Efficient Host-Side Encryption Peter Shah, Won So Advanced Technology Group 9 July, 2015 1 2015 NetApp, Inc. All rights reserved. Agenda 1) Overview 2) Security 3) Solution Architecture

More information

Integrity Check Mechanism in Cloud Using SHA-512 Algorithm

Integrity Check Mechanism in Cloud Using SHA-512 Algorithm www.ijecs.in International Journal Of Engineering And Computer Science ISSN:2319-7242 Volume 3 Issue 5 may, 2014 Page No. 6033-6037 Integrity Check Mechanism in Cloud Using SHA-512 Algorithm Mrs.Shantala

More information

Message-Locked Encryption and Secure Deduplication

Message-Locked Encryption and Secure Deduplication Message-Locked Encryption and Secure Deduplication Eurocrypt 2013 Mihir Bellare 1 Sriram Keelveedhi 1 Thomas Ristenpart 2 1 University of California, San Diego 2 University of Wisconsin-Madison 1 Deduplication

More information

DATA INTEGRITY TECHNIQUES IN CLOUD: AN ANALYSIS

DATA INTEGRITY TECHNIQUES IN CLOUD: AN ANALYSIS DATA INTEGRITY TECHNIQUES IN CLOUD: AN ANALYSIS 1 MS. R. K. PANDYA, 2 PROF. K. K. SUTARIA 1 M.E.[Cloud Computing] Student, Computer Engineering Department, V. V. P. Engineering College, Rajkot, Gujarat

More information

Secure Conjunctive Keyword Ranked Search over Encrypted Cloud Data

Secure Conjunctive Keyword Ranked Search over Encrypted Cloud Data Secure Conjunctive Keyword Ranked Search over Encrypted Cloud Data Shruthishree M. K, Prasanna Kumar R.S Abstract: Cloud computing is a model for enabling convenient, on-demand network access to a shared

More information

COMPOSABLE AND ROBUST OUTSOURCED STORAGE

COMPOSABLE AND ROBUST OUTSOURCED STORAGE SESSION ID: CRYP-R14 COMPOSABLE AND ROBUST OUTSOURCED STORAGE Christian Badertscher and Ueli Maurer ETH Zurich, Switzerland Motivation Server/Database Clients Write Read block 2 Outsourced Storage: Security

More information

SECURE MULTI-KEYWORD TOP KEY RANKED SEARCH SCHEME OVER ENCRYPTED CLOUD DATA

SECURE MULTI-KEYWORD TOP KEY RANKED SEARCH SCHEME OVER ENCRYPTED CLOUD DATA Research Manuscript Title SECURE MULTI-KEYWORD TOP KEY RANKED SEARCH SCHEME OVER ENCRYPTED CLOUD DATA Dr.B.Kalaavathi, SM.Keerthana, N.Renugadevi Professor, Assistant professor, PGScholar Department of

More information

International Journal of Advance Engineering and Research Development. Carefree Data Access Solution for Public Cloud Storage

International Journal of Advance Engineering and Research Development. Carefree Data Access Solution for Public Cloud Storage Scientific Journal of Impact Factor (SJIF): 4.14 International Journal of Advance Engineering and Research Development Volume 3, Issue 3, March -2016 Carefree Data Access Solution for Public Cloud Storage

More information

CloudSky: A Controllable Data Self-Destruction System for Untrusted Cloud Storage Networks

CloudSky: A Controllable Data Self-Destruction System for Untrusted Cloud Storage Networks CloudSky: A Controllable Data Self-Destruction System for Untrusted Cloud Storage Networks The material in these slides mainly comes from the paper CloudSky: A Controllable Data Self-Destruction System

More information

TIBX NEXT-GENERATION ARCHIVE FORMAT IN ACRONIS BACKUP CLOUD

TIBX NEXT-GENERATION ARCHIVE FORMAT IN ACRONIS BACKUP CLOUD TIBX NEXT-GENERATION ARCHIVE FORMAT IN ACRONIS BACKUP CLOUD 1 Backup Speed and Reliability Are the Top Data Protection Mandates What are the top data protection mandates from your organization s IT leadership?

More information

AES and DES Using Secure and Dynamic Data Storage in Cloud

AES and DES Using Secure and Dynamic Data Storage in Cloud Available Online at www.ijcsmc.com International Journal of Computer Science and Mobile Computing A Monthly Journal of Computer Science and Information Technology IJCSMC, Vol. 3, Issue. 1, January 2014,

More information

Privacy Preserving Public Auditing in Secured Cloud Storage Using Block Authentication Code

Privacy Preserving Public Auditing in Secured Cloud Storage Using Block Authentication Code Privacy Preserving Public Auditing in Secured Cloud Storage Using Block Authentication Code Sajeev V 1, Gowthamani R 2 Department of Computer Science, Nehru Institute of Technology, Coimbatore, India 1,

More information

IBM Tivoli Storage Manager Version Introduction to Data Protection Solutions IBM

IBM Tivoli Storage Manager Version Introduction to Data Protection Solutions IBM IBM Tivoli Storage Manager Version 7.1.6 Introduction to Data Protection Solutions IBM IBM Tivoli Storage Manager Version 7.1.6 Introduction to Data Protection Solutions IBM Note: Before you use this

More information

ABSTRACT I. INTRODUCTION. Telangana, India 2 Professor, Department of Computer Science & Engineering, Shadan College of Engineering & Technology,

ABSTRACT I. INTRODUCTION. Telangana, India 2 Professor, Department of Computer Science & Engineering, Shadan College of Engineering & Technology, International Journal of Scientific Research in Computer Science, Engineering and Information Technology 2017 IJSRCSEIT Volume 2 Issue 6 ISSN : 2456-3307 Secure Proxy Server Data Sharing Scheme in Hybrid

More information

PRIVACY PRESERVING RANKED MULTI KEYWORD SEARCH FOR MULTIPLE DATA OWNERS. SRM University, Kattankulathur, Chennai, IN.

PRIVACY PRESERVING RANKED MULTI KEYWORD SEARCH FOR MULTIPLE DATA OWNERS. SRM University, Kattankulathur, Chennai, IN. Volume 115 No. 6 2017, 585-589 ISSN: 1311-8080 (printed version); ISSN: 1314-3395 (on-line version) url: http://www.ijpam.eu ijpam.eu PRIVACY PRESERVING RANKED MULTI KEYWORD SEARCH FOR MULTIPLE DATA OWNERS

More information

Network Security Issues and Cryptography

Network Security Issues and Cryptography Network Security Issues and Cryptography PriyaTrivedi 1, Sanya Harneja 2 1 Information Technology, Maharishi Dayanand University Farrukhnagar, Gurgaon, Haryana, India 2 Information Technology, Maharishi

More information

Efficient integrity checking technique for securing client data in cloud computing

Efficient integrity checking technique for securing client data in cloud computing International Journal of Electrical & Computer Sciences IJECS-IJENS Vol: 11 No: 05 43 Efficient integrity checking technique for securing client data in cloud computing Dalia Attas and Omar Batrafi Computer

More information

EFFICIENT DATA SHARING WITH ATTRIBUTE REVOCATION FOR CLOUD STORAGE

EFFICIENT DATA SHARING WITH ATTRIBUTE REVOCATION FOR CLOUD STORAGE EFFICIENT DATA SHARING WITH ATTRIBUTE REVOCATION FOR CLOUD STORAGE Chakali Sasirekha 1, K. Govardhan Reddy 2 1 M.Tech student, CSE, Kottam college of Engineering, Chinnatekuru(V),Kurnool,Andhra Pradesh,

More information

Nasuni UniFS a True Global File System

Nasuni UniFS a True Global File System Nasuni UniFS a True Global File System File systems are the time-proven way to store, share, and protect unstructured data. But traditional device-based file systems no longer meet the needs of the modern

More information

IBM Spectrum Protect Version Introduction to Data Protection Solutions IBM

IBM Spectrum Protect Version Introduction to Data Protection Solutions IBM IBM Spectrum Protect Version 8.1.2 Introduction to Data Protection Solutions IBM IBM Spectrum Protect Version 8.1.2 Introduction to Data Protection Solutions IBM Note: Before you use this information

More information

Operating systems and security - Overview

Operating systems and security - Overview Operating systems and security - Overview Protection in Operating systems Protected objects Protecting memory, files User authentication, especially passwords Trusted operating systems, security kernels,

More information

Operating systems and security - Overview

Operating systems and security - Overview Operating systems and security - Overview Protection in Operating systems Protected objects Protecting memory, files User authentication, especially passwords Trusted operating systems, security kernels,

More information

Security context. Technology. Solution highlights

Security context. Technology. Solution highlights Code42 CrashPlan Security Code42 CrashPlan provides continuous, automatic desktop and laptop backup. Our layered approach to security exceeds industry best practices and fulfills the enterprise need for

More information

Software-defined Storage: Fast, Safe and Efficient

Software-defined Storage: Fast, Safe and Efficient Software-defined Storage: Fast, Safe and Efficient TRY NOW Thanks to Blockchain and Intel Intelligent Storage Acceleration Library Every piece of data is required to be stored somewhere. We all know about

More information

MULTI - KEYWORD RANKED SEARCH OVER ENCRYPTED DATA SUPPORTING SYNONYM QUERY

MULTI - KEYWORD RANKED SEARCH OVER ENCRYPTED DATA SUPPORTING SYNONYM QUERY ISSN: 0976-3104 SPECIAL ISSUE Jayanthi and Prabadevi RESEARCH OPEN ACCESS MULTI - KEYWORD RANKED SEARCH OVER ENCRYPTED DATA SUPPORTING SYNONYM QUERY Jayanthi M.* and Prabadevi School of Information Technology

More information

A Survey on Secure Sharing In Cloud Computing

A Survey on Secure Sharing In Cloud Computing A Survey on Secure Sharing In Cloud Computing Aakanksha maliye, Sarita Patil Department of Computer Engineering, G.H.Raisoni College of Engineering & Management, Wagholi, India ABSTRACT: Cloud computing

More information

Fine-Grained Data Sharing Supporting Attribute Extension in Cloud Computing

Fine-Grained Data Sharing Supporting Attribute Extension in Cloud Computing wwwijcsiorg 10 Fine-Grained Data Sharing Supporting Attribute Extension in Cloud Computing Yinghui Zhang 12 1 National Engineering Laboratory for Wireless Security Xi'an University of Posts and Telecommunications

More information

International Journal of Computer Science Trends and Technology (IJCST) Volume 5 Issue 4, Jul Aug 2017

International Journal of Computer Science Trends and Technology (IJCST) Volume 5 Issue 4, Jul Aug 2017 RESEARCH ARTICLE OPEN ACCESS Optimizing Fully Homomorphic Encryption Algorithm using Greedy Approach in Cloud Computing Kirandeep Kaur [1], Jyotsna Sengupta [2] Department of Computer Science Punjabi University,

More information

Session Based Ciphertext Policy Attribute Based Encryption Method for Access Control in Cloud Storage

Session Based Ciphertext Policy Attribute Based Encryption Method for Access Control in Cloud Storage IOSR Journal of Engineering (IOSRJEN) ISSN (e): 2250-3021, ISSN (p): 2278-8719 Vol. 04, Issue 09 (September. 2014), V3 PP 21-25 www.iosrjen.org Session Based Ciphertext Policy Attribute Based Encryption

More information

Online Version Only. Book made by this file is ILLEGAL. Design and Implementation of Binary File Similarity Evaluation System. 1.

Online Version Only. Book made by this file is ILLEGAL. Design and Implementation of Binary File Similarity Evaluation System. 1. , pp.1-10 http://dx.doi.org/10.14257/ijmue.2014.9.1.01 Design and Implementation of Binary File Similarity Evaluation System Sun-Jung Kim 2, Young Jun Yoo, Jungmin So 1, Jeong Gun Lee 1, Jin Kim 1 and

More information

A Simple Secure Auditing for Cloud Storage

A Simple Secure Auditing for Cloud Storage A Simple Secure Auditing for Cloud Storage Lee-Hur Shing Institute of Information Science Academia Sinica leehurs@iis.sinica.edu.tw Marn-Ling Shing University of Taipei Yu-Hsuan Yeh, Yan-Zhi Hu and Shih-Ci

More information

System Approach for Single Keyword Search for Encrypted data files Guarantees in Public Infrastructure Clouds

System Approach for Single Keyword Search for Encrypted data files Guarantees in Public Infrastructure Clouds System Approach for Single Keyword Search for Encrypted data files Guarantees in Public Infrastructure s B.Nandan 1, M.Haripriya 2, N.Tejaswi 3, N. Sai Kishore 4 Associate Professor, Department of CSE,

More information

Lecture 6: Symmetric Cryptography. CS 5430 February 21, 2018

Lecture 6: Symmetric Cryptography. CS 5430 February 21, 2018 Lecture 6: Symmetric Cryptography CS 5430 February 21, 2018 The Big Picture Thus Far Attacks are perpetrated by threats that inflict harm by exploiting vulnerabilities which are controlled by countermeasures.

More information

SecureDoc Disk Encryption Cryptographic Engine

SecureDoc Disk Encryption Cryptographic Engine SecureDoc Disk Encryption Cryptographic Engine Security Policy Abstract: This document specifies Security Policy enforced by the SecureDoc Cryptographic Engine compliant with the requirements of FIPS 140-2

More information

Proximity-Aware Location Based Collaborative Sensing for Energy-Efficient Mobile Devices

Proximity-Aware Location Based Collaborative Sensing for Energy-Efficient Mobile Devices Volume 03 - Issue 10 October 2018 PP. 30-34 Proximity-Aware Location Based Collaborative Sensing for Energy-Efficient Mobile Devices Pranav Nair 1, Hitesh Patil 2, Tukaram Gore 3, Yogesh Jadhav 4 1 (Computer

More information

Rio-2 Hybrid Backup Server

Rio-2 Hybrid Backup Server A Revolution in Data Storage for Today s Enterprise March 2018 Notices This white paper provides information about the as of the date of issue of the white paper. Processes and general practices are subject

More information

HOW DATA DEDUPLICATION WORKS A WHITE PAPER

HOW DATA DEDUPLICATION WORKS A WHITE PAPER HOW DATA DEDUPLICATION WORKS A WHITE PAPER HOW DATA DEDUPLICATION WORKS ABSTRACT IT departments face explosive data growth, driving up costs of storage for backup and disaster recovery (DR). For this reason,

More information

Research and Design of Crypto Card Virtualization Framework Lei SUN, Ze-wu WANG and Rui-chen SUN

Research and Design of Crypto Card Virtualization Framework Lei SUN, Ze-wu WANG and Rui-chen SUN 2016 International Conference on Wireless Communication and Network Engineering (WCNE 2016) ISBN: 978-1-60595-403-5 Research and Design of Crypto Card Virtualization Framework Lei SUN, Ze-wu WANG and Rui-chen

More information

Cooperative Private Searching in Clouds

Cooperative Private Searching in Clouds Cooperative Private Searching in Clouds Jie Wu Department of Computer and Information Sciences Temple University Road Map Cloud Computing Basics Cloud Computing Security Privacy vs. Performance Proposed

More information

An Efficient Provable Data Possession Scheme based on Counting Bloom Filter for Dynamic Data in the Cloud Storage

An Efficient Provable Data Possession Scheme based on Counting Bloom Filter for Dynamic Data in the Cloud Storage , pp. 9-16 http://dx.doi.org/10.14257/ijmue.2016.11.4.02 An Efficient Provable Data Possession Scheme based on Counting Bloom Filter for Dynamic Data in the Cloud Storage Eunmi Jung 1 and Junho Jeong 2

More information

SECURE CLOUD BACKUP AND RECOVERY

SECURE CLOUD BACKUP AND RECOVERY SECURE CLOUD BACKUP AND RECOVERY Learn more about how KeepItSafe can help to reduce costs, save time, and provide compliance for online backup, disaster recovery-as-a-service, mobile data protection, and

More information

Implementation of Decentralized Access Control with Anonymous Authentication in Cloud

Implementation of Decentralized Access Control with Anonymous Authentication in Cloud Volume-5, Issue-6, December-2015 International Journal of Engineering and Management Research Page Number: 210-214 Implementation of Decentralized Access Control with Anonymous Authentication in Cloud

More information

1 Defining Message authentication

1 Defining Message authentication ISA 562: Information Security, Theory and Practice Lecture 3 1 Defining Message authentication 1.1 Defining MAC schemes In the last lecture we saw that, even if our data is encrypted, a clever adversary

More information

Efficient Data Storage Security with Multiple Batch Auditing in Cloud Computing

Efficient Data Storage Security with Multiple Batch Auditing in Cloud Computing Efficient Data Storage Security with Multiple Batch Auditing in Cloud Computing P. Sukumar [1] Department of Computer Science Sri Venkateswara College of Engineering, Chennai B. Sathiya [2] Department

More information

Executive Summary SOLE SOURCE JUSTIFICATION. Microsoft Integration

Executive Summary SOLE SOURCE JUSTIFICATION. Microsoft Integration Executive Summary Commvault Simpana software delivers the unparalleled advantages and benefits of a truly holistic approach to data management. It is one product that contains individually licensable modules

More information

Scale-out Object Store for PB/hr Backups and Long Term Archive April 24, 2014

Scale-out Object Store for PB/hr Backups and Long Term Archive April 24, 2014 Scale-out Object Store for PB/hr Backups and Long Term Archive April 24, 2014 Gideon Senderov Director, Advanced Storage Products NEC Corporation of America Long-Term Data in the Data Center (EB) 140 120

More information

Alternative Approaches for Deduplication in Cloud Storage Environment

Alternative Approaches for Deduplication in Cloud Storage Environment International Journal of Computational Intelligence Research ISSN 0973-1873 Volume 13, Number 10 (2017), pp. 2357-2363 Research India Publications http://www.ripublication.com Alternative Approaches for

More information

SEGMENT STATURE HASH TABLE BASED COST EFFICIENT DATA SHARING IN CLOUD ENVIRONMENT

SEGMENT STATURE HASH TABLE BASED COST EFFICIENT DATA SHARING IN CLOUD ENVIRONMENT SEGMENT STATURE HASH TABLE BASED COST EFFICIENT DATA SHARING IN CLOUD ENVIRONMENT K. Karthika Lekshmi 1, Dr. M. Vigilsonprem 2 1 Assistant Professor, Department of Information Technology, Cape Institute

More information

Generating A Digital Signature Based On New Cryptographic Scheme For User Authentication And Security

Generating A Digital Signature Based On New Cryptographic Scheme For User Authentication And Security Indian Journal of Science and Technology, Vol 7(S6), 1 5, October 2014 ISSN (Print) : 0974-6846 ISSN (Online) : 0974-5645 Generating A Digital Signature Based On New Cryptographic Scheme For User Authentication

More information

Veritas NetBackup Appliance Family OVERVIEW BROCHURE

Veritas NetBackup Appliance Family OVERVIEW BROCHURE Veritas NetBackup Appliance Family OVERVIEW BROCHURE Veritas NETBACKUP APPLIANCES Veritas understands the shifting needs of the data center and offers NetBackup Appliances as a way for customers to simplify

More information

HP Dynamic Deduplication achieving a 50:1 ratio

HP Dynamic Deduplication achieving a 50:1 ratio HP Dynamic Deduplication achieving a 50:1 ratio Table of contents Introduction... 2 Data deduplication the hottest topic in data protection... 2 The benefits of data deduplication... 2 How does data deduplication

More information

Key Terms: Cloud Computing, cloud Service Provider, Provable Data Possession, Dynamic File Block, Map Version Table.

Key Terms: Cloud Computing, cloud Service Provider, Provable Data Possession, Dynamic File Block, Map Version Table. Volume 6, Issue 6, June 2016 ISSN: 2277 128X International Journal of Advanced Research in Computer Science and Software Engineering Research Paper Available online at: www.ijarcsse.com Dynamic File Block

More information

International Journal of Advance Engineering and Research Development. Duplicate File Searcher and Remover

International Journal of Advance Engineering and Research Development. Duplicate File Searcher and Remover Scientific Journal of Impact Factor (SJIF): 4.72 International Journal of Advance Engineering and Research Development Volume 4, Issue 4, April -2017 Duplicate File Searcher and Remover Remover Of Duplicate

More information

International Journal of Advance Research in Engineering, Science & Technology

International Journal of Advance Research in Engineering, Science & Technology Impact Factor (SJIF): 5.302 International Journal of Advance Research in Engineering, Science & Technology e-issn: 2393-9877, p-issn: 2394-2444 Volume 5, Issue 3, March-2018 Key Aggregate Tagged File Searching(KATFS)

More information

Attribute Based Encryption with Privacy Protection in Clouds

Attribute Based Encryption with Privacy Protection in Clouds Attribute Based Encryption with Privacy Protection in Clouds Geetanjali. M 1, Saravanan. N 2 PG Student, Department of Information Technology, K.S.R College of Engineering, Tiruchengode, Tamilnadu, India

More information

Attribute-based encryption with encryption and decryption outsourcing

Attribute-based encryption with encryption and decryption outsourcing Edith Cowan University Research Online Australian Information Security Management Conference Conferences, Symposia and Campus Events 2014 Attribute-based encryption with encryption and decryption outsourcing

More information

Opendedupe & Veritas NetBackup ARCHITECTURE OVERVIEW AND USE CASES

Opendedupe & Veritas NetBackup ARCHITECTURE OVERVIEW AND USE CASES Opendedupe & Veritas NetBackup ARCHITECTURE OVERVIEW AND USE CASES May, 2017 Contents Introduction... 2 Overview... 2 Architecture... 2 SDFS File System Service... 3 Data Writes... 3 Data Reads... 3 De-duplication

More information

Delegated Access for Hadoop Clusters in the Cloud

Delegated Access for Hadoop Clusters in the Cloud Delegated Access for Hadoop Clusters in the Cloud David Nuñez, Isaac Agudo, and Javier Lopez Network, Information and Computer Security Laboratory (NICS Lab) Universidad de Málaga, Spain Email: dnunez@lcc.uma.es

More information

Deploying De-Duplication on Ext4 File System

Deploying De-Duplication on Ext4 File System Deploying De-Duplication on Ext4 File System Usha A. Joglekar 1, Bhushan M. Jagtap 2, Koninika B. Patil 3, 1. Asst. Prof., 2, 3 Students Department of Computer Engineering Smt. Kashibai Navale College

More information

Computer Security CS 526

Computer Security CS 526 Computer Security CS 526 Topic 4 Cryptography: Semantic Security, Block Ciphers and Encryption Modes CS555 Topic 4 1 Readings for This Lecture Required reading from wikipedia Block Cipher Ciphertext Indistinguishability

More information

Backup Solution. User Guide. Issue 01 Date

Backup Solution. User Guide. Issue 01 Date Issue 01 Date 2017-08-30 Contents Contents 1 Introduction... 1 1.1 What Is the Backup Solution?... 1 1.2 Why Choose the Backup Solution?... 2 1.3 Concepts and Principles...3 1.3.1 Basic OBS Concepts...3

More information

DEDUPLICATION BASICS

DEDUPLICATION BASICS DEDUPLICATION BASICS 4 DEDUPE BASICS 6 WHAT IS DEDUPLICATION 8 METHODS OF DEDUPLICATION 10 DEDUPLICATION EXAMPLE 12 HOW DO DISASTER RECOVERY & ARCHIVING FIT IN? 14 DEDUPLICATION FOR EVERY BUDGET QUANTUM

More information

Deduplication Storage System

Deduplication Storage System Deduplication Storage System Kai Li Charles Fitzmorris Professor, Princeton University & Chief Scientist and Co-Founder, Data Domain, Inc. 03/11/09 The World Is Becoming Data-Centric CERN Tier 0 Business

More information

QUALITY OF SEVICE WITH DATA STORAGE SECURITY IN CLOUD COMPUTING

QUALITY OF SEVICE WITH DATA STORAGE SECURITY IN CLOUD COMPUTING QUALITY OF SEVICE WITH DATA STORAGE SECURITY IN CLOUD COMPUTING ABSTRACT G KALYANI 1* 1. M.Tech Student, Dept of CSE Indira Institute of Engineering and Technology, Markapur, AP. Cloud computing has been

More information

IMPROVING DATA SECURITY USING ATTRIBUTE BASED BROADCAST ENCRYPTION IN CLOUD COMPUTING

IMPROVING DATA SECURITY USING ATTRIBUTE BASED BROADCAST ENCRYPTION IN CLOUD COMPUTING IMPROVING DATA SECURITY USING ATTRIBUTE BASED BROADCAST ENCRYPTION IN CLOUD COMPUTING 1 K.Kamalakannan, 2 Mrs.Hemlathadhevi Abstract -- Personal health record (PHR) is an patient-centric model of health

More information

Boundary control : Access Controls: An access control mechanism processes users request for resources in three steps: Identification:

Boundary control : Access Controls: An access control mechanism processes users request for resources in three steps: Identification: Application control : Boundary control : Access Controls: These controls restrict use of computer system resources to authorized users, limit the actions authorized users can taker with these resources,

More information

IBM řešení pro větší efektivitu ve správě dat - Store more with less

IBM řešení pro větší efektivitu ve správě dat - Store more with less IBM řešení pro větší efektivitu ve správě dat - Store more with less IDG StorageWorld 2012 Rudolf Hruška Information Infrastructure Leader IBM Systems & Technology Group rudolf_hruska@cz.ibm.com IBM Agenda

More information

Data safety for digital business. Veritas Backup Exec WHITE PAPER. One solution for hybrid, physical, and virtual environments.

Data safety for digital business. Veritas Backup Exec WHITE PAPER. One solution for hybrid, physical, and virtual environments. WHITE PAPER Data safety for digital business. One solution for hybrid, physical, and virtual environments. It s common knowledge that the cloud plays a critical role in helping organizations accomplish

More information

Copyright 2010 EMC Corporation. Do not Copy - All Rights Reserved.

Copyright 2010 EMC Corporation. Do not Copy - All Rights Reserved. 1 Using patented high-speed inline deduplication technology, Data Domain systems identify redundant data as they are being stored, creating a storage foot print that is 10X 30X smaller on average than

More information

Virtual WAN Optimization Controllers

Virtual WAN Optimization Controllers acel E RA VA DATAS HEET Virtual WAN Optimization Controllers acelera VA Virtual WAN Optimization Controllers accelerate applications, speed data transfers and reduce bandwidth costs using a combination

More information

Technology Insight Series

Technology Insight Series IBM ProtecTIER Deduplication for z/os John Webster March 04, 2010 Technology Insight Series Evaluator Group Copyright 2010 Evaluator Group, Inc. All rights reserved. Announcement Summary The many data

More information

Multiple forgery attacks against Message Authentication Codes

Multiple forgery attacks against Message Authentication Codes Multiple forgery attacks against Message Authentication Codes David A. McGrew and Scott R. Fluhrer Cisco Systems, Inc. {mcgrew,sfluhrer}@cisco.com May 31, 2005 Abstract Some message authentication codes

More information

Cloud FastPath: Highly Secure Data Transfer

Cloud FastPath: Highly Secure Data Transfer Cloud FastPath: Highly Secure Data Transfer Tervela helps companies move large volumes of sensitive data safely and securely over network distances great and small. Tervela has been creating high performance

More information

An efficient and practical solution to secure password-authenticated scheme using smart card

An efficient and practical solution to secure password-authenticated scheme using smart card An efficient and practical solution to secure password-authenticated scheme using smart card R. Deepa 1, R. Prabhu M.Tech 2, PG Research scholor 1, Head of the Department 2 Dept.of Information Technology,

More information

A New Symmetric Key Algorithm for Modern Cryptography Rupesh Kumar 1 Sanjay Patel 2 Purushottam Patel 3 Rakesh Patel 4

A New Symmetric Key Algorithm for Modern Cryptography Rupesh Kumar 1 Sanjay Patel 2 Purushottam Patel 3 Rakesh Patel 4 IJSRD - International Journal for Scientific Research & Development Vol. 2, Issue 08, 2014 ISSN (online): 2321-0613 A New Symmetric Key Algorithm for Modern Cryptography Rupesh Kumar 1 Sanjay Patel 2 Purushottam

More information

Google Cloud Platform: Customer Responsibility Matrix. December 2018

Google Cloud Platform: Customer Responsibility Matrix. December 2018 Google Cloud Platform: Customer Responsibility Matrix December 2018 Introduction 3 Definitions 4 PCI DSS Responsibility Matrix 5 Requirement 1 : Install and Maintain a Firewall Configuration to Protect

More information