MapReduce (Batch Processing)

Glossary Page

The term "Batch Processing" refers to the automatic, sequential, and complete execution of data contained in input files. The program runs independently, without user intervention, after being initiated. The data to be processed is handled one by one. The results can be stored, for example, in files or databases. Apache Hadoop MapReduce is a common example of batch processing in Big Data.

https://www.bitkom.org/sites/default/files/file/import/140228-Big-Data-Technologien-Wissen-fuer-Entscheider.pdf
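To make the map and reduce phases concrete, the following is a minimal sketch of the classic word-count job written against the Hadoop MapReduce Java API: the mapper emits a (word, 1) pair for every token of its input split, and the reducer sums those counts per word. Class names and the input/output paths are illustrative.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Map phase: for each token in the input split, emit (word, 1).
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {

    private final static IntWritable one = new IntWritable(1);
    private final Text word = new Text();

    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reduce phase: sum all counts emitted for the same word.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {

    private final IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    // Configure the batch job: mapper, combiner, reducer, and I/O paths.
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Packaged as a JAR, such a job would typically be submitted with a command along the lines of `hadoop jar wordcount.jar WordCount /input /output`, where both paths refer to HDFS directories. The job then runs unattended until all input splits have been processed, which is exactly the batch-processing behavior described above.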
