Reworked the framework core to support streaming processing: added separate metric processor implementations
for static and streaming data sources.
Added a streaming DQ job along with the corresponding builders in the DQ context.
Refactored error accumulators to work with both batch and streaming processing.
Changed the source configuration model to enable reading sources as streaming dataframes.
Sources that are not streamable are marked accordingly.
Added a trait with a loadDataStream method that is mixed into connections supporting reading data as a stream.
Modified the Kafka connection to implement the loadDataStream method.
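As an illustration, such a trait and its Kafka implementation might look like the sketch below. Only the loadDataStream method name comes from this changelog; the trait name, case-class fields, and signature are assumptions, not the actual Checkita API. The Kafka part uses Spark's standard Structured Streaming Kafka source:

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}

// Hypothetical mix-in trait for connections that can be read as a stream.
trait StreamableConnection {
  def loadDataStream(spark: SparkSession): DataFrame
}

// Illustrative Kafka connection built on Spark's standard Kafka source:
final case class KafkaConnectionExample(brokers: String, topic: String)
    extends StreamableConnection {
  override def loadDataStream(spark: SparkSession): DataFrame =
    spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", brokers)
      .option("subscribe", topic)
      .load() // yields a streaming dataframe with binary key/value columns
}
```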
Modified source readers to support a readStream method for sources.
Added functionality to read streaming virtual sources, but only for those virtual source types that can be built over streaming sources.
Virtual source readers were updated accordingly.
Extended the application configuration with streaming settings.
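A minimal sketch of what such a streaming section could look like in a HOCON application config. This is purely illustrative: all key names and values here are assumptions and may differ from the actual Checkita settings.

```hocon
// Hypothetical fragment; actual Checkita setting names may differ.
jobConfig {
  streaming {
    trigger       = "10s"   // micro-batch trigger interval
    window        = "5m"    // processing window for metric collection
    checkpointDir = "/tmp/checkita/checkpoints"
  }
}
```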
Documentation updated to include description of functionality related to running quality checks over streaming sources.
In addition, logging has been revised: Checkita now uses the Log4j2 library. For earlier versions of Spark (which work with Log4j 1.x),
the Log4j2 dependency is added explicitly, along with the log4j2 -> slf4j bridge.
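For reference, wiring the Log4j2 API through SLF4J in an sbt build is typically done with Log4j2's `log4j-to-slf4j` bridge artifact. The versions below are examples, not the ones pinned by the project:

```scala
// Illustrative sbt fragment; versions shown are examples only.
libraryDependencies ++= Seq(
  "org.apache.logging.log4j" % "log4j-api"      % "2.20.0",
  "org.apache.logging.log4j" % "log4j-core"     % "2.20.0",
  // Routes Log4j2 API calls to SLF4J, so logging still reaches
  // the Log4j 1.x backend bundled with older Spark versions.
  "org.apache.logging.log4j" % "log4j-to-slf4j" % "2.20.0"
)
```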
Other minor fixes and changes related to the implementation of streaming processing.