

Confidentiality of information is the mandatory requirement that a person who has access to certain information not transfer that information to third parties without the consent of its owner282.


Confirmation bias – the tendency to search for, interpret, favor, and recall information in a way that confirms one’s own beliefs or hypotheses while giving disproportionately less attention to information that contradicts it283.


Confusion matrix is a situational analysis table that summarizes the prediction results of a classification model in machine learning. The records in the dataset are summarized in a matrix according to their real category and the category predicted by the classification model284,285.
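
As a minimal sketch (the labels, predictions, and variable names below are invented for illustration), a binary confusion matrix can be tallied directly from the real categories and the model's predicted categories:

from collections import Counter

y_true = [1, 0, 1, 1, 0, 0, 1, 0]   # real categories
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]   # categories predicted by the model

counts = Counter(zip(y_true, y_pred))
tp = counts[(1, 1)]   # true positives
tn = counts[(0, 0)]   # true negatives
fp = counts[(0, 1)]   # false positives
fn = counts[(1, 0)]   # false negatives

print("            pred=1  pred=0")
print(f"actual=1    {tp:6d}  {fn:6d}")
print(f"actual=0    {fp:6d}  {tn:6d}")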


Consumer artificial intelligence refers to specialized artificial intelligence programs embedded in consumer devices and processes286.


Continuous feature is a floating-point feature with an infinite range of possible values. Contrast with discrete feature287,288.
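
For illustration only (the field names and values are made up), a continuous feature can take any floating-point value in a range, while a discrete feature takes one of a finite set of values:

house = {
    "area_m2": 83.7,      # continuous feature: any floating-point value
    "num_bedrooms": 3,    # discrete feature: one of a small finite set of values
}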


Contributor is a human worker providing annotations on the Appen data annotation platform289.


Convenience sampling – using a dataset not gathered scientifically in order to run quick experiments. Later on, it’s essential to switch to a scientifically gathered dataset290.


Convergence – informally, often refers to a state reached during training in which training loss and validation loss change very little or not at all with each iteration after a certain number of iterations. In other words, a model reaches convergence when additional training on the current data will not improve the model. In deep learning, loss values sometimes stay constant or nearly so for many iterations before finally descending, temporarily producing a false sense of convergence. See also early stopping291,292.
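
As a minimal sketch of such a convergence check (the simulated loss curve and the helper name has_converged are invented for illustration), training can be stopped once the loss stops changing appreciably over recent iterations:

def has_converged(loss_history, window=5, tolerance=1e-4):
    """Return True if the loss changed very little over the last `window` iterations."""
    if len(loss_history) < window + 1:
        return False
    recent = loss_history[-(window + 1):]
    return max(recent) - min(recent) < tolerance

losses = []
for step in range(1, 1000):
    losses.append(1.0 / step)                          # stand-in for the loss after one iteration
    if has_converged(losses):
        print(f"converged after {step} iterations")    # further training is unlikely to help
        break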


Convex function is a function in which the region above the graph of the function is a convex set. The prototypical convex function is shaped something like the letter U.

[Figure: graphs of several convex, U-shaped functions.]

By contrast, a function is not convex when the region above its graph is not a convex set.

[Figure: graph of a non-convex function.]

A strictly convex function has exactly one local minimum point, which is also the global minimum point. The classic U-shaped functions are strictly convex functions. However, some convex functions (for example, straight lines) are not U-shaped. Many of the common loss functions, including the following, are convex functions: L2 loss; Log Loss; L1 regularization; L2 regularization. Many variations of gradient descent are guaranteed to find a point close to the minimum of a strictly convex function. Similarly, many variations of stochastic gradient descent have a high probability (though not a guarantee) of finding a point close to the minimum of a strictly convex function. The sum of two convex functions (for example, L2 loss + L1 regularization) is a convex function. Deep models are never convex functions. Remarkably, algorithms designed for convex optimization tend to find reasonably good solutions on deep networks anyway, even though those solutions are not guaranteed to be a global minimum293,294.
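
As a hedged illustration (the helper name is_probably_convex and the sample functions are chosen only for the example), the defining inequality of convexity, f(t*x + (1 - t)*y) <= t*f(x) + (1 - t)*f(y) for t in [0, 1], can be spot-checked numerically; the check also shows that the sum of two convex functions remains convex:

import random

def is_probably_convex(f, trials=10_000, span=10.0):
    """Randomly test the convexity inequality; any counterexample proves non-convexity."""
    for _ in range(trials):
        x = random.uniform(-span, span)
        y = random.uniform(-span, span)
        t = random.random()
        if f(t * x + (1 - t) * y) > t * f(x) + (1 - t) * f(y) + 1e-9:
            return False
    return True

print(is_probably_convex(lambda x: x ** 2))            # True: an L2-style loss
print(is_probably_convex(lambda x: abs(x)))            # True: an L1-style term
print(is_probably_convex(lambda x: x ** 2 + abs(x)))   # True: sum of two convex functions
print(is_probably_convex(lambda x: -(x ** 2)))         # False: concave, not convex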


Convex optimization – the process of using mathematical techniques such as gradient descent to find the minimum of a convex function. A great deal of research in machine learning has focused on formulating various problems as convex optimization problems and on solving those problems more efficiently. For complete details, see Boyd and Vandenberghe, Convex Optimization295.
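
A minimal sketch of convex optimization by gradient descent, using the illustrative convex function f(w) = (w - 3)**2, whose global minimum is at w = 3 (the function, starting point, and learning rate are chosen only for the example):

def grad(w):
    return 2 * (w - 3)        # derivative of f(w) = (w - 3)**2

w = 0.0                       # arbitrary starting point
learning_rate = 0.1
for _ in range(100):
    w -= learning_rate * grad(w)

print(w)                      # close to 3.0, the global minimum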


Convex set is a subset of Euclidean space such that a line segment drawn between any two points in the subset remains completely within the subset.
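
For illustration (the helper names and example sets are made up), the definition can be checked numerically by sampling points on the segment between two members of a set; a disk passes the test, while a ring does not:

def segment_stays_inside(inside, a, b, steps=100):
    """Sample points on the segment from a to b and test set membership."""
    return all(
        inside(((1 - t) * a[0] + t * b[0], (1 - t) * a[1] + t * b[1]))
        for t in (i / steps for i in range(steps + 1))
    )

def in_disk(p, radius=1.0):
    return p[0] ** 2 + p[1] ** 2 <= radius ** 2    # disk: a convex set

def in_ring(p, inner=0.5, outer=1.0):
    r2 = p[0] ** 2 + p[1] ** 2
    return inner ** 2 <= r2 <= outer ** 2          # ring: not a convex set

print(segment_stays_inside(in_disk, (-0.9, 0.0), (0.9, 0.0)))   # True
print(segment_stays_inside(in_ring, (-0.9, 0.0), (0.9, 0.0)))   # False: the segment crosses the hole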