AbsaOSS / spark-commons
Apache License 2.0 · 7 stars · 0 forks
Issues (sorted newest first)
#117 Add Scala 2.13 to readme (Zejnilovic, opened 1 year ago, 2 comments)
#116 Add Scala 2.13 to the README (Zejnilovic, opened 1 year ago, 0 comments)
#115 #97 Add scala 2.13 support (Zejnilovic, closed 1 year ago, 2 comments)
#114 Rename error handling to error handler: #113 (TebaleloS, closed 1 year ago, 4 comments)
#113 Rename `ErrorHandling` to `ErrorHandler` (benedeki, closed 1 year ago, 0 comments)
#112 #111: Create ErrorHandling implementation that throws an exception on error detected (benedeki, closed 1 year ago, 2 comments)
#111 Create ErrorHandling implementation that throws an exception on error detected (benedeki, closed 1 year ago, 0 comments)
#110 Implement error handling by putting the info into dev null silently ignoring it: #88 (TebaleloS, closed 1 year ago, 2 comments)
#109 #108: Add assignment to project and dependent actions (benedeki, closed 1 year ago, 2 comments)
#108 Add assignment to project and dependent actions (benedeki, closed 1 year ago, 0 comments)
#107 #89: Add implicits for easier usage of the trait (TebaleloS, closed 1 year ago, 3 comments)
#106 Function isOfType might incorrectly reply on nonexistent column: #95 (TebaleloS, closed 1 year ago, 2 comments)
#105 #48: Fix ScalaDoc cross-module links (benedeki, closed 1 year ago, 2 comments)
#104 Add methods to the trait error handling that will return the type/schema of the generated error column and their aggregation: #91 (TebaleloS, closed 1 year ago, 2 comments)
#103 #101: ErrorHandling documentation and fields renames (benedeki, closed 1 year ago, 3 comments)
#102 Create assign_project_to_issue.yml (benedeki, closed 1 year ago, 2 comments)
#101 ErrorHandling documentation and fields renames (benedeki, closed 1 year ago, 0 comments)
#100 Documentation (benedeki, opened 1 year ago, 1 comment)
#99 #98 - Add code coverage support improve (miroslavpojer, closed 1 year ago, 2 comments)
#98 Improve code-coverage & add GH check action (miroslavpojer, closed 1 year ago, 1 comment)
#97 Add support for Scala 2.13 (benedeki, closed 1 year ago, 0 comments)
#96 Add methods to the trait error handling that will return the type/schema of the generated error column and their aggregation: #91 (TebaleloS, closed 1 year ago, 0 comments)
#95 Function `isOfType` might incorrectly reply on nonexistent column (benedeki, closed 1 year ago, 0 comments)
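Issue #95 above reports that `isOfType` may answer incorrectly for a column that does not exist. The library's actual fix is not shown in this listing; a defensive version of such a check can be sketched in plain Spark, with the method name and signature assumed from the issue title:

```scala
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.types.DataType

// Hypothetical sketch: answer false for a nonexistent column instead of
// failing or replying incorrectly; checks top-level columns only.
def isOfType(df: DataFrame, columnName: String, expected: DataType): Boolean =
  df.schema.fields.exists(f => f.name == columnName && f.dataType == expected)
```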
#94 implement error handling that will filter the rows that have any error: #93 (TebaleloS, closed 1 year ago, 2 comments)
#93 Implement error handling that will filter the rows that have any error (benedeki, closed 1 year ago, 0 comments)
#92 alter once per spark session to be instantiatable even without provided spark session: 82 (TebaleloS, closed 1 year ago, 0 comments)
#91 Add methods to the `trait ErrorHandling` that will return the type/schema of the generated `ErrorColumn` and their aggregation (benedeki, closed 1 year ago, 0 comments)
#90 Add support for serialization into `additionalInfo: AdditionalInfo` from `Map`, `case class`, other collections... (benedeki, opened 1 year ago, 0 comments)
#89 Add implicits for easier usage of the `trait` (benedeki, closed 1 year ago, 0 comments)
#88 Implement error handling by putting the info into _Dev Null_ (silently ignoring it) (benedeki, closed 1 year ago, 0 comments)
#87 Implement error handling by putting the info into a log file (benedeki, opened 1 year ago, 0 comments)
#86 Implement error handling by putting the info into _Spark_'s standard error column (`String`) (benedeki, opened 1 year ago, 0 comments)
#85 Introducing Map for error columns and their values (benedeki, closed 1 year ago, 1 comment)
#84 #83: Create a Spike for error handling (benedeki, closed 1 year ago, 3 comments)
#83 Create a Spike for error handling (benedeki, closed 1 year ago, 1 comment)
#82 Alter `OncePerSparkSession` to be instantiatable even without provided `SparkSession` (benedeki, closed 1 year ago, 1 comment)
#81 #80 Add enforceTypeOnNullTypeFields function to DataFrameImplicits (jakipatryk, closed 1 year ago, 3 comments)
#80 Add function casting all NullType fields to a target type (jakipatryk, closed 1 year ago, 0 comments)
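Issues #80 and #81 above add an `enforceTypeOnNullTypeFields` function to `DataFrameImplicits`. The real implementation lives in spark-commons and is not reproduced here; a minimal sketch of the idea, assuming top-level columns only, could look like:

```scala
import org.apache.spark.sql.DataFrame
import org.apache.spark.sql.functions.col
import org.apache.spark.sql.types.{DataType, NullType}

// Hypothetical sketch: cast every top-level NullType column to `target`.
// The library version may also handle nested struct/array fields.
def enforceTypeOnNullTypeFields(df: DataFrame, target: DataType): DataFrame =
  df.schema.fields.foldLeft(df) { (acc, field) =>
    if (field.dataType == NullType)
      acc.withColumn(field.name, col(field.name).cast(target))
    else acc
  }
```

Columns built from bare null literals (e.g. `lit(null)`) get `NullType`, which many sinks reject; casting them to an explicit target type avoids that.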
#79 #78: Move ErrorMessage type from spark data standardization into spar… (TebaleloS, closed 1 year ago, 1 comment)
#78 Move `ErrorMessage` type from spark-data-standardization into spark-commons (benedeki, closed 1 year ago, 0 comments)
#77 #76 add spark.sql.shuffle.partitions to DefaultSparkConfiguration (Zejnilovic, closed 1 year ago, 0 comments)
#76 Add spark.sql.shuffle.partitions setting to the DefaultSparkConfiguration (Zejnilovic, closed 1 year ago, 0 comments)
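Issues #76 and #77 above add `spark.sql.shuffle.partitions` to `DefaultSparkConfiguration`. That key is a standard Spark SQL option controlling the number of partitions used for shuffles in joins and aggregations; setting it directly uses only standard Spark API (the value below is illustrative, not the one the library chooses):

```scala
import org.apache.spark.sql.SparkSession

// Standard Spark API; "4" is an illustrative value for local testing,
// not necessarily what DefaultSparkConfiguration sets.
val spark = SparkSession.builder()
  .master("local[*]")
  .appName("shuffle-partitions-example")
  .config("spark.sql.shuffle.partitions", "4")
  .getOrCreate()
```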
#75 #74 spark 3.3 added to project matrix (dk1844, closed 1 year ago, 1 comment)
#74 Add support for Spark 3.3 (dk1844, closed 1 year ago, 0 comments)
#73 #65 - add code coverage support (miroslavpojer, closed 1 year ago, 0 comments)
#72 #71: Publish fails on scaladoc generation (benedeki, closed 1 year ago, 0 comments)
#71 Publish fails on scaladoc generation (benedeki, closed 1 year ago, 0 comments)
#70 #69: Make it possible to get column value based on column path (benedeki, closed 1 year ago, 0 comments)
#69 Make it possible to get column value based on column path (benedeki, closed 1 year ago, 0 comments)
#68 #67: Make it easy to avoid repeated UDF registration (benedeki, closed 1 year ago, 0 comments)
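Issues #67 and #68 above aim to make it easy to avoid repeated UDF registration, and #82 concerns `OncePerSparkSession`. The library's actual mechanism is not shown in this listing; one generic way to guard a registration so it runs at most once per session can be sketched as follows (all names here are hypothetical):

```scala
import org.apache.spark.sql.SparkSession
import scala.collection.mutable

// Hypothetical sketch: run a registration block at most once per
// (SparkSession, key) pair, e.g. to register a UDF exactly once.
object RegisterOnce {
  private val done = mutable.Set.empty[(SparkSession, String)]

  def apply(key: String)(register: SparkSession => Unit)
           (implicit spark: SparkSession): Unit = synchronized {
    // Set.add returns true only on first insertion, so the block
    // executes once per session/key combination.
    if (done.add((spark, key))) register(spark)
  }
}
```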