Александр Юрьевич Чесалов

Глоссариум по искусственному интеллекту: 2500 терминов. Том 2



      Cloud computing is an information technology model for providing ubiquitous and convenient access via the Internet to a shared pool of configurable computing resources («cloud»), data storage devices, applications and services that can be rapidly provisioned and released with minimal operating costs or with little or no involvement of the provider216.

      Cloud is a general metaphor that is used to refer to the Internet. Initially, the Internet was seen as a distributed network, and then, with the invention of the World Wide Web, as a tangle of interlinked media. As the Internet continued to grow in both size and the range of activities it encompassed, it came to be known as «the cloud.» The use of the word cloud may be an attempt to capture both the size and nebulous nature of the Internet217.

      Cloud TPU is a specialized hardware accelerator designed to speed up machine learning workloads on Google Cloud Platform218.
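
      As an illustration, a minimal TensorFlow 2.x sketch of connecting to a Cloud TPU and creating a distribution strategy might look like the following; the empty tpu="" argument assumes an environment (such as a Colab or GCE VM) where the TPU address is detected automatically, and the tiny Keras model is purely illustrative.

```python
import tensorflow as tf

# Locate the TPU cluster (address auto-detected in Colab/GCE environments).
resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="")
tf.config.experimental_connect_to_cluster(resolver)
tf.tpu.experimental.initialize_tpu_system(resolver)

# Model building and training placed inside this scope runs on the TPU cores.
strategy = tf.distribute.TPUStrategy(resolver)
with strategy.scope():
    model = tf.keras.Sequential([tf.keras.layers.Dense(10)])
```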

      Cluster analysis is a type of unsupervised learning used for exploratory data analysis to find hidden patterns or groupings in the data; clusters are modeled with a similarity measure defined by metrics such as Euclidean or probability distance.

      Clustering is a data mining technique for grouping unlabeled data based on their similarities or differences. For example, K-means clustering algorithms assign similar data points into groups, where the K value specifies the number of groups and hence the granularity of the partition. This technique is helpful for market segmentation, image compression, etc219.
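
      A minimal K-means sketch using scikit-learn; the two-dimensional points and the choice of K = 2 are illustrative assumptions, not part of the definition above.

```python
import numpy as np
from sklearn.cluster import KMeans

# Six unlabeled 2-D points forming two visually separable groups.
points = np.array([[1.0, 1.1], [0.9, 1.0], [1.2, 0.8],
                   [8.0, 8.2], [7.9, 8.1], [8.3, 7.8]])

# K = 2: the number of clusters to discover.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(points)
print(kmeans.labels_)           # cluster assignment per point, e.g. [0 0 0 1 1 1]
print(kmeans.cluster_centers_)  # the two centroids
```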

      Co-adaptation is when neurons predict patterns in training data by relying almost exclusively on outputs of specific other neurons instead of relying on the network’s behavior as a whole. When the patterns that cause co-adaptation are not present in validation data, then co-adaptation causes overfitting. Dropout regularization reduces co-adaptation because dropout ensures neurons cannot rely solely on specific other neurons220.
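
      A minimal NumPy sketch of the dropout mechanism mentioned above: during training each activation is zeroed with probability p, so downstream neurons cannot co-adapt to any particular unit. The layer values and p = 0.5 are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(activations, p=0.5, training=True):
    """Zero each activation with probability p during training (inverted dropout)."""
    if not training:
        return activations
    mask = rng.random(activations.shape) >= p
    # Scale survivors by 1/(1-p) so the expected activation stays the same.
    return activations * mask / (1.0 - p)

hidden = np.array([0.2, 1.5, -0.7, 0.9])
print(dropout(hidden))                   # training: some units randomly zeroed
print(dropout(hidden, training=False))   # inference: activations unchanged
```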

      COBWEB is an incremental system for hierarchical conceptual clustering. COBWEB was invented by Professor Douglas H. Fisher, currently at Vanderbilt University. COBWEB incrementally organizes observations into a classification tree. Each node in a classification tree represents a class (concept) and is labeled by a probabilistic concept that summarizes the attribute-value distributions of objects classified under the node. This classification tree can be used to predict missing attributes or the class of a new object221.

      Code is a one-to-one mapping of a finite ordered set of symbols belonging to some finite alphabet222.

      Codec is the means by which sound and video files are compressed for storage and transmission purposes. There are two forms of compression, «lossy» and «lossless», and most codecs perform lossy compression because of the much larger data reduction ratios it achieves. Most codecs are software, although in some areas codecs are hardware components of image and sound systems. Codecs are necessary for playback, since they uncompress or decompress the moving image and sound files and allow them to be rendered223.
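
      As a toy illustration of the lossless case, the sketch below compresses and exactly restores a byte string with Python’s standard zlib module; real audio and video codecs, which are typically lossy, are far more elaborate.

```python
import zlib

raw = b"AAAA BBBB AAAA BBBB " * 100      # highly redundant "media" data
compressed = zlib.compress(raw, 9)       # encode (compress)
restored = zlib.decompress(compressed)   # decode (decompress)

print(len(raw), "->", len(compressed), "bytes")
assert restored == raw  # lossless: the original is recovered bit for bit
```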

      Cognitive architecture – the Institute of Creative Technologies defines cognitive architecture as: «hypothesis about the fixed structures that provide a mind, whether in natural or artificial systems, and how they work together – in conjunction with knowledge and skills embodied within the architecture – to yield intelligent behavior in a diversity of complex environments»224.

      Cognitive computing is used to refer to systems that simulate the human brain to help with decision-making. It uses self-learning algorithms that perform tasks such as natural language processing, image analysis, reasoning, and human-computer interaction. Examples of cognitive systems are IBM’s Watson and Google DeepMind225.

      Cognitive Maps are structured representations of decisions depicted in graphical format (variations of cognitive maps are cause maps, influence diagrams, or belief nets). Basic cognitive maps include nodes connected by arcs, where the nodes represent constructs (or states) and the arcs represent relationships. Cognitive maps have been used to understand decision situations, to analyze complex cause-effect representations and to support communication226.
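
      A minimal sketch of a cognitive map as nodes (constructs) connected by signed arcs (cause-effect relationships); the example constructs and weights are invented purely for illustration.

```python
# Nodes are constructs; arcs map (cause, effect) -> signed relationship strength.
nodes = {"advertising", "price", "sales", "revenue"}
arcs = {
    ("advertising", "sales"): +0.6,   # advertising increases sales
    ("price", "sales"): -0.4,         # higher price decreases sales
    ("sales", "revenue"): +0.9,       # sales drive revenue
}

# Trace every construct that directly influences "sales".
influences = {cause: w for (cause, effect), w in arcs.items() if effect == "sales"}
print(influences)  # {'advertising': 0.6, 'price': -0.4}
```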

      Cognitive science – the interdisciplinary scientific study of the mind and its processes227.

      Cohort is a group of participants in a study (for example, a study conducted to evaluate a machine learning algorithm) that is followed prospectively or retrospectively; subsequent status evaluations with respect to a disease or outcome are conducted to determine which of the participants’ initial exposure characteristics (risk factors) are associated with it.

      Cold-Start is a potential issue arising from the fact that a system cannot infer anything for users or items for which it has not gathered a sufficient amount of information yet228.

      Collaborative filtering – making predictions about the interests of one user based on the interests of many other users. Collaborative filtering is often used in recommendation systems229.
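
      A minimal user-based collaborative-filtering sketch: a user’s unknown rating is predicted as the similarity-weighted average of other users’ ratings for the same item. The tiny rating matrix and the cosine similarity measure are illustrative assumptions.

```python
import numpy as np

# Rows = users, columns = items; 0 means "not rated yet".
ratings = np.array([
    [5.0, 4.0, 0.0, 1.0],
    [4.0, 5.0, 4.0, 1.0],
    [1.0, 1.0, 0.0, 5.0],
])

def cosine(u, v):
    return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-9)

def predict(user, item):
    """Similarity-weighted average of other users' ratings for the item."""
    num, den = 0.0, 0.0
    for other in range(ratings.shape[0]):
        if other == user or ratings[other, item] == 0:
            continue
        sim = cosine(ratings[user], ratings[other])
        num += sim * ratings[other, item]
        den += abs(sim)
    return num / den if den else 0.0

print(round(predict(user=0, item=2), 2))  # user 0's predicted rating for item 2
```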

      Combinatorial optimization is a topic in operations research, applied mathematics and theoretical computer science that consists of finding an optimal object from a finite set of objects230.
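
      A toy brute-force illustration of combinatorial optimization (the 0/1 knapsack problem): the optimal object is the best subset chosen from a finite set of items. The item values, weights and capacity are invented for illustration, and real solvers use far more efficient methods than enumerating every subset.

```python
from itertools import combinations

# (value, weight) of each item and the knapsack capacity.
items = [(60, 10), (100, 20), (120, 30)]
capacity = 50

best_value, best_subset = 0, ()
# Enumerate every subset of the finite set of items (2**n candidates).
for r in range(len(items) + 1):
    for subset in combinations(items, r):
        weight = sum(w for _, w in subset)
        value = sum(v for v, _ in subset)
        if weight <= capacity and value > best_value:
            best_value, best_subset = value, subset

print(best_value, best_subset)  # 220 ((100, 20), (120, 30))
```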

      Committee machine is a type of artificial neural network using a divide and conquer strategy in which the responses of multiple neural networks (experts) are combined into a single response. The combined response of the committee machine is supposed to be superior to those of its constituent experts. Compare ensembles of classifiers231.
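
      A minimal sketch of the combination step described above: several «expert» models each produce a prediction and the committee merges them, here by simple averaging; the expert functions are stand-ins for trained neural networks.

```python
import numpy as np

# Stand-ins for three trained expert networks, each mapping an input to a prediction.
experts = [
    lambda x: 0.9 * x + 0.10,
    lambda x: 1.1 * x - 0.20,
    lambda x: 1.0 * x + 0.05,
]

def committee_predict(x):
    """Combine the experts' responses into a single (averaged) response."""
    return np.mean([expert(x) for expert in experts])

print(committee_predict(2.0))  # the committee's combined response
```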

      Commoditization is the process of transforming a product from an elite good into a generally available, comparatively cheap commodity of mass consumption232.

      Common Data Element (CDE) is a tool to support data management for clinical research233.

      Commonsense reasoning is a branch of artificial intelligence concerned with simulating the human ability to make presumptions about the type and essence of ordinary situations they encounter every day234.

      Compiler is a program that translates text written in a programming language into a set of machine codes. AI framework compilers collect the computational data of the frameworks and try to optimize the code of each of them, regardless of the accelerator hardware. The compiler contains programs and blocks with which the framework performs several tasks. The memory resource allocator, for example, allocates resources individually for each accelerator235.

      Composite AI is a combined application of various artificial