
Glossary

Artificial intelligence (AI)

Describes a machine acting "smart", not because the "smart" behavior was programmed into it, but because it can assess a situation and come up with a justified reaction.

Machine learning (ML)

A concept that enables a machine to learn from example scenarios and develop appropriate reactions.

Neural networks (NN)

A technology that enables machine learning by storing gathered knowledge in cells (similar to a brain). During learning, these cells form a web (or network) of logic streams that later lead to the desired output. During the decision process, the cells trigger each other (the machine is contemplating).

Deep learning (DL)

Stacking or layering of networks so that more aspects can be considered in the decision process. The output of one network layer is used as the input for the next layer.

Recurrent neural network (RNN)

A type of NN model that not only predicts an outcome, but feeds each outcome back in as the next input. This enables us to generate text or any other kind of sequence.
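
As a rough illustration of this feedback loop, the sketch below uses a hypothetical predict_next_char function as a stand-in for the trained network; a real RNN would return a learned probability distribution over the possible next characters.

    import random

    # Hypothetical stand-in for a trained RNN; a real model would return a
    # learned probability distribution over the possible next characters.
    def predict_next_char(sequence):
        return random.choice("abcdefghijklmnopqrstuvwxyz ")

    generated = "hello "               # seed text
    for _ in range(100):
        # The previous output becomes part of the next input (the recurrent loop).
        next_char = predict_next_char(generated[-30:])
        generated += next_char

    print(generated)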

Long Short Term Memory (LSTM)

A type of RNN model that can remember things beyond its sequence length.
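
A minimal sketch of what such a character-level LSTM model could look like in Keras; the layer size, sequence length and vocabulary size are illustrative assumptions, not the exact configuration used in this workshop.

    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import LSTM, Dense

    SEQ_LEN = 30      # characters per training sequence (see "Sequence" below)
    VOCAB_SIZE = 60   # assumed number of distinct characters in the input text

    model = Sequential([
        # The LSTM layer keeps an internal state, which lets it remember
        # context beyond the current character.
        LSTM(128, input_shape=(SEQ_LEN, VOCAB_SIZE)),
        # The output layer assigns a probability to every possible next character.
        Dense(VOCAB_SIZE, activation="softmax"),
    ])
    model.summary()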

Model

The working procedure of a neural network, i.e. its structure together with the learned weights.

Training

Sequentially predicting output and comparing it to the actual input text. The model gets direct feedback on whether it was right or wrong and adjusts the weights of the network's logic streams accordingly. After several repetitions (epochs), the results improve.
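
A hedged sketch of such a training run in Keras, using random placeholder data only to keep the example self-contained; real training would use the sequences cut from the input text, and the sizes shown are assumptions.

    import numpy as np
    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import LSTM, Dense

    SEQ_LEN, VOCAB_SIZE = 30, 60   # illustrative values
    NUM_SEQUENCES = 1000

    # Random placeholder data: x are input sequences, y the character that
    # should follow each sequence (one-hot encoded).
    x = np.random.rand(NUM_SEQUENCES, SEQ_LEN, VOCAB_SIZE)
    y = np.eye(VOCAB_SIZE)[np.random.randint(0, VOCAB_SIZE, NUM_SEQUENCES)]

    model = Sequential([
        LSTM(128, input_shape=(SEQ_LEN, VOCAB_SIZE)),
        Dense(VOCAB_SIZE, activation="softmax"),
    ])

    # The loss compares each prediction with the expected character; the
    # optimizer adjusts the network's weights accordingly.
    model.compile(loss="categorical_crossentropy", optimizer="adam",
                  metrics=["accuracy"])

    # One epoch is one pass over all sequences; loss and accuracy are
    # reported after every epoch.
    model.fit(x, y, batch_size=64, epochs=5)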

Epoch

One round of training over the input data. The length of one epoch depends on the amount of input text.

Sequence

A text sequence of n characters (30 by default) to train on.

Batch

A group of sequences inside one epoch.
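
A small sketch of how an input text could be cut into sequences and grouped into batches; the inline sample text and the batch size are placeholders for illustration only.

    SEQ_LEN = 30     # characters per sequence, as described above
    BATCH_SIZE = 64  # assumed batch size, for illustration only

    # In the real project the training text would be read from a file;
    # a short inline string keeps this sketch self-contained.
    text = "the quick brown fox jumps over the lazy dog " * 50

    # Overlapping sequences of 30 characters each.
    sequences = [text[i:i + SEQ_LEN] for i in range(len(text) - SEQ_LEN)]

    # Group the sequences into batches.
    batches = [sequences[i:i + BATCH_SIZE]
               for i in range(0, len(sequences), BATCH_SIZE)]

    print(len(sequences), "sequences grouped into", len(batches), "batches")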

Loss

A score that describes the mistakes made during the training process. The aim is to lower the value towards 0, without actually reaching 0, to prevent overfitting. Loss values are not comparable between different projects.

Accuracy

A score that describes the share of correct predictions made by the network during the training process. The aim is to raise the value towards 1, without actually reaching 1, to prevent overfitting.

Checkpoint

A state of the training progress, saved every few epochs. One checkpoint consists of 3 files: .meta, .index and .data-00000-of-00001. Every checkpoint can be played. Unfortunately it is not possible to resume training from a given checkpoint.
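
The three files per checkpoint are what TensorFlow's Saver writes. A rough sketch of how such checkpoints could be produced, assuming TensorFlow 1.x; the paths and the dummy variable are illustrative, not this project's actual code.

    import os
    import tensorflow as tf  # assumes TensorFlow 1.x

    os.makedirs("checkpoints", exist_ok=True)

    # A dummy variable so there is something to save; in the real project
    # these would be the network's learned weights.
    weights = tf.Variable(tf.zeros([128]), name="weights")

    saver = tf.train.Saver()

    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        for epoch in range(6):
            # ... one epoch of training would run here ...
            if epoch % 2 == 0:
                # Writes checkpoints/model-<epoch>.meta, .index and
                # .data-00000-of-00001 (plus a small "checkpoint" bookkeeping file).
                saver.save(sess, "checkpoints/model", global_step=epoch)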

Overfitting

Describes the effect of a prediction being too close or too similar to the actual input. This often happens with small or insufficient input datasets. The model will then simply repeat the input and may fail to respond to new and unknown input.

