4 Streaming Data and Data Streams
Taiwo Kolajo1,2, Olawande Daramola3, and Ayodele Adebiyi4
1Federal University Lokoja, Lokoja, Nigeria
2Covenant University, Ota, Nigeria
3Cape Peninsula University of Technology, Cape Town, South Africa
4Landmark University, Omu‐Aran, Kwara, Nigeria
1 Introduction
At the dawn of 2020, the volume of data generated worldwide was estimated at 44 zettabytes (roughly 40 times the number of stars in the observable universe), and daily global data generation is projected to reach 463 exabytes by 2025 [1]. Data are growing not only in volume but also in structure and complexity, and at a geometric rate [2]. Such high-volume data, generated at high velocity, are what is called streaming data. Data streams can originate from IoT devices and sensors, spreadsheets, text files, images, audio and video recordings, chat and instant messaging, email, blogs and social networking sites, web traffic, financial transactions, telephone usage records, customer service records, satellite data, smart devices, GPS data, and network traffic and messages.
There are different schools of thought on how to define streaming data and data streams, and it is difficult to draw a firm line between the two concepts. One school of thought defines streaming data as the act of sending data bit by bit rather than as a whole package, while a data stream is the actual source of the data; that is, streaming data is the act, the action, while the data stream is the product. In engineering, streaming data is the process or art of collecting the streamed data; it is the main activity or operation, while the data stream is the pipeline through which the streaming is performed, that is, the engineering architecture, the line-up of tools that carry out the streaming. In data science, streaming data and data stream are used interchangeably. To better understand the concepts, let us first define what a stream is. A stream S is a possibly infinite bag of elements (x, t), where x is a tuple belonging to the schema of S and t ∈ T is the timestamp of the element [3]. A data stream is an unbounded and ordered sequence of instances of data arriving over time [4]. Formally, a data stream is an infinite sequence of tuples S = (x1, t1), (x2, t2), …, (xn, tn), …, where xi is a tuple and ti is a timestamp [5]; a minimal code sketch after Table 1 illustrates this tuple-and-timestamp view. Streaming data can be defined as a frequently changing, and potentially infinite, data flow generated from disparate sources [6]. Formally, streaming data
Table 1 Streaming data versus static data [9, 10]
Dimension | Streaming data | Static data |
---|---|---|
Hardware | Typically a single, constrained amount of memory | Multiple CPUs |
Input | Data streams or updates | Data chunks |
Time | A few moments or even milliseconds | Much longer |
Data size | Infinite or unknown in advance | Known and finite |
Processing | A single or few passes over the data | Processing in multiple rounds |
Storage | Not stored, or only a significant portion kept in memory | Stored |
Applications | Web mining, traffic monitoring, sensor networks | Widely adopted in many domains |
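To make the formal definition above concrete, the following is a minimal Python sketch (illustrative only; the sensor name and helper functions are hypothetical, not from the chapter) that models a stream S as a generator of (xi, ti) pairs and consumes it in a single pass with constant memory, in line with the Processing and Storage rows of Table 1.

```python
import itertools
import random
import time
from typing import Iterator, Tuple

def sensor_stream() -> Iterator[Tuple[Tuple[str, float], float]]:
    """Yield an unbounded sequence of ((sensor_id, reading), timestamp) pairs."""
    while True:
        x = ("sensor-1", random.gauss(25.0, 2.0))  # the tuple xi (hypothetical sensor reading)
        t = time.time()                            # the timestamp ti
        yield (x, t)

def running_mean(stream: Iterator[Tuple[Tuple[str, float], float]], limit: int) -> float:
    """Single-pass aggregation: maintain a running mean without storing the stream."""
    count, mean = 0, 0.0
    for (sensor_id, reading), t in itertools.islice(stream, limit):
        count += 1
        mean += (reading - mean) / count  # incremental mean update
    return mean

if __name__ == "__main__":
    # Consume a finite prefix of the (conceptually unbounded) stream.
    print(running_mean(sensor_stream(), limit=1000))
```

Note that only the running count and mean are held in memory regardless of how many elements arrive, which is the essential constraint separating the streaming column from the static column in Table 1.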