Group of Authors

Cyber Security and Network Security



This chapter examines identity management systems (IDMSs), which manage authentication, authorization, and data interchange over the internet. Traditional IDMSs suffer from a lack of interoperability, single points of vulnerability, and privacy concerns such as bulk data collection and device tracking. Blockchain technology has the potential to alleviate these problems by allowing users to control ownership of their own identifiers and authentication credentials, and by enabling novel information-ownership and administration frameworks with built-in control and consensus mechanisms. As a result, the number of blockchain-based identity management solutions, which can benefit both enterprises and clients, has been expanding rapidly. We classify these frameworks using scientific criteria based on differences in blockchain architecture, governance methods, and other salient features. The resulting taxonomy provides context by depicting significant concepts, evolving standards, and use cases, while highlighting important security and privacy considerations.
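      To make the ownership model concrete, the sketch below simulates the core pattern most blockchain-based IDMSs share: a holder signs a credential, only a hash of it is anchored on a ledger, and a verifier checks both the signature and the on-chain hash. This is a minimal illustration, not any specific framework's API; the MockLedger class and all names are assumptions, and the example requires the third-party "cryptography" package.

# Minimal sketch (assumption, not a specific framework's API): a user-held
# credential is hashed and anchored on a ledger; a verifier then checks the
# hash and the holder's signature without any central identity provider.
import hashlib
import json

from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey


class MockLedger:
    """Stand-in for a blockchain: an append-only set of credential hashes."""
    def __init__(self):
        self._anchored = set()

    def anchor(self, digest: bytes):
        self._anchored.add(digest)

    def is_anchored(self, digest: bytes) -> bool:
        return digest in self._anchored


# Holder creates and signs a credential, anchoring only its hash on-chain.
holder_key = Ed25519PrivateKey.generate()
credential = json.dumps({"subject": "alice", "claim": "age>=18"}).encode()
signature = holder_key.sign(credential)

ledger = MockLedger()
ledger.anchor(hashlib.sha256(credential).digest())

# Verifier checks authenticity (signature) and integrity (on-chain hash).
holder_key.public_key().verify(signature, credential)  # raises if invalid
assert ledger.is_anchored(hashlib.sha256(credential).digest())
print("credential verified without a central identity provider")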

      In Chapter 6, the concept of feed-forward networks is introduced; these serve as the foundation for recurrent neural networks (RNNs). Simple text analysis is a good analogy for an RNN, because the prediction of the next word always depends on prior knowledge of the sentence's contents. An RNN is a form of artificial neural network that recognizes a sequence of data and analyzes it in order to predict an outcome. The LSTM is a type of RNN built from a stack of layers with neurons in each layer. The chapter also discusses the issues each technology faces, along with possible remedies. One section covers optimization algorithms in neural networks, which adjust network parameters such as weights and learning rates to reduce loss. Another section surveys recent in-depth studies on combinations of steganography and neural networks. Finally, the chapter analyzes existing research from the previous five years (2017 to 2021).
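      As a hedged illustration of the ideas above, and not code from the chapter itself, the following PyTorch sketch builds a small LSTM next-word predictor and runs one optimization step; the vocabulary size, dimensions, and dummy batch are arbitrary assumptions.

# Minimal next-word-prediction LSTM sketch (illustrative assumptions only).
import torch
import torch.nn as nn

VOCAB, EMBED, HIDDEN = 100, 32, 64  # assumed toy sizes


class NextWordLSTM(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, EMBED)           # word ids -> vectors
        self.lstm = nn.LSTM(EMBED, HIDDEN, batch_first=True)
        self.head = nn.Linear(HIDDEN, VOCAB)              # hidden state -> logits

    def forward(self, tokens):
        out, _ = self.lstm(self.embed(tokens))
        return self.head(out[:, -1, :])  # predict the word after the sequence


model = NextWordLSTM()
# The optimizer adjusts the weights (with the learning rate controlling
# step size) to reduce the loss, as described above.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

context = torch.randint(0, VOCAB, (8, 5))  # dummy batch: 8 sequences of 5 words
target = torch.randint(0, VOCAB, (8,))     # dummy "next word" labels

loss = loss_fn(model(context), target)
loss.backward()
optimizer.step()
print(f"one training step done, loss={loss.item():.3f}")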

      In Chapter 8, the growing demand for the preservation and transmission of multimedia data, a part of everyday life over the past several decades, is addressed. Images and videos, along with other multimedia data, play an important role in creating an immersive experience. In today's technologically advanced society, data and information must be sent rapidly and securely; at the same time, valuable data must be protected from unauthorized access. In this work, a deep neural network is used to develop a covert communication and textual data extraction strategy based on steganography and image compression. The original input textual image and the cover image are both pre-processed using spatial steganography, and the covert text-based pictures are then separated and embedded into the least significant bits of the cover image's picture elements. The stego-images are then compressed to produce a higher-quality image while saving storage space at the sender's end, after which the stego-image is transmitted to the receiver over a communication link. At the receiver's end, the steganography and compression are reversed. The problem presents numerous challenges, making it an intriguing subject to pursue; the most crucial part of the task is choosing the right steganography and image compression methods. The proposed technique, which combines image steganography and compression, achieves a higher peak signal-to-noise ratio.
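      Below is a minimal NumPy sketch of only the least-significant-bit embedding step described above; the message, image shape, and helper names are assumptions for illustration, and the chapter's actual pipeline adds pre-processing, deep-network extraction, and compression around this core.

# Illustrative LSB steganography sketch (assumed shapes and names).
import numpy as np


def embed_lsb(cover: np.ndarray, message: bytes) -> np.ndarray:
    """Hide message bits in the least significant bit of each pixel value."""
    bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
    flat = cover.flatten()  # flatten() returns a copy, so cover is untouched
    assert bits.size <= flat.size, "cover image too small for the message"
    flat[: bits.size] = (flat[: bits.size] & 0xFE) | bits  # overwrite the LSB
    return flat.reshape(cover.shape)


def extract_lsb(stego: np.ndarray, n_bytes: int) -> bytes:
    """Reverse step at the receiver: read the LSBs back into bytes."""
    bits = stego.flatten()[: n_bytes * 8] & 1
    return np.packbits(bits).tobytes()


cover = np.random.randint(0, 256, (64, 64), dtype=np.uint8)  # dummy cover image
secret = b"covert text"
stego = embed_lsb(cover, secret)

assert extract_lsb(stego, len(secret)) == secret
# LSB changes alter each pixel by at most 1, which keeps distortion
# (and hence the peak signal-to-noise penalty) small.
print("max pixel change:", np.max(np.abs(stego.astype(int) - cover.astype(int))))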

      Chapter 10 explores how the modern Information Technology environment demands ever greater value for money without sacrificing the effectiveness of the deployed components. The rising demand for storage, networking, and computation has fueled the growth of massive, complex data centers, along with the big server businesses that run much of today's internet operations as well as economic, trading, and corporate workloads. A data centre can hold thousands of servers and consume as much electricity as a small city. The enormous computing power required to run such server systems raises a variety of concerns, including energy consumption, greenhouse gas emissions, hardware replacement, and restart overheads. Virtualization, a group of technologies covering a wide range of applications and interests, can be applied to hardware and software alike, along with innovations emerging at virtualization's periphery. This study demonstrates how virtualization technologies can gradually transform a traditional data centre into a green data centre, and it examines the cost benefits of adopting virtualization technology, which is recommended by practically every major company in the market. Virtualization can drastically reduce capital costs while keeping operating costs low over the following three years. We discuss value in terms of cost and space, with space equating to future cost.
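      As a back-of-the-envelope illustration of the consolidation argument (every figure below is an assumption for the sketch, not a number from the chapter), the following estimates the energy and cost effect of virtualizing lightly loaded physical servers onto fewer hosts.

# Hypothetical consolidation estimate; all figures are assumptions.
import math

PHYSICAL_SERVERS = 200     # lightly loaded machines before virtualization
AVG_UTILIZATION = 0.10     # assumed pre-consolidation CPU utilization
TARGET_UTILIZATION = 0.60  # assumed safe utilization per virtualization host
WATTS_PER_SERVER = 400     # assumed average power draw per machine
PRICE_PER_KWH = 0.12       # assumed electricity price (USD)
HOURS_PER_YEAR = 24 * 365

# Total work stays constant, so hosts needed scale with the utilization ratio.
hosts = math.ceil(PHYSICAL_SERVERS * AVG_UTILIZATION / TARGET_UTILIZATION)
saved_servers = PHYSICAL_SERVERS - hosts
kwh_saved = saved_servers * WATTS_PER_SERVER / 1000 * HOURS_PER_YEAR

print(f"hosts after consolidation: {hosts}")
print(f"annual energy saved: {kwh_saved:,.0f} kWh")
print(f"annual cost saved:   ${kwh_saved * PRICE_PER_KWH:,.0f}")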

      The security of big data is studied, along with how to maintain the data's performance while it is transmitted over the network. Various studies have examined the topic of big data; many of them claimed to provide data security but failed to maintain performance. Several encryption techniques, including RSA and AES, have been used in past studies, but these encryption technologies degrade the network system's performance. To address these concerns, the proposed approach employs compression mechanisms to minimize the file size before performing encryption. Furthermore, the data is split to increase the reliability of transmission, and the separated pieces are then transferred over multiple routes.
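      A hedged sketch of the compress-then-encrypt-then-split pipeline described above follows; it uses zlib and the "cryptography" package's Fernet recipe (AES-based authenticated encryption) as stand-ins, since the exact algorithms, chunk size, and routing scheme are not specified here.

# Illustrative compress -> encrypt -> split pipeline (assumed parameters).
import zlib

from cryptography.fernet import Fernet  # AES-based authenticated encryption

CHUNK = 1024  # assumed chunk size per route, in bytes

key = Fernet.generate_key()
cipher = Fernet(key)

payload = b"big data record " * 4096           # dummy payload
compressed = zlib.compress(payload, level=9)   # shrink before encrypting
ciphertext = cipher.encrypt(compressed)

# Split the ciphertext so the pieces can travel over multiple routes.
chunks = [ciphertext[i : i + CHUNK] for i in range(0, len(ciphertext), CHUNK)]
print(f"{len(payload)} bytes -> {len(ciphertext)} bytes in {len(chunks)} chunks")

# Receiver side: reassemble, decrypt, decompress.
restored = zlib.decompress(cipher.decrypt(b"".join(chunks)))
assert restored == payload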

      Acknowledgments

      We express our great pleasure, sincere thanks, and gratitude to the people who significantly helped, contributed to, and supported the completion of this book. Our sincere thanks to Fr. Benny Thomas, Professor, Department of Computer Science