Hi, I have a question about the correct configuration for getting spark-tensorflow-distributor to use multiple threads per worker. Since PySpark allocates one task per core on each executor (cores × executors tasks in total), multiple workers can end up running on the same executor. Any idea how to avoid this? Thanks in advance.
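One workaround I've seen suggested (not yet verified on my cluster) is to make each task claim all of an executor's cores by setting `spark.task.cpus` equal to `spark.executor.cores`, so Spark can only schedule one task, and hence one TensorFlow worker, per executor. A rough sketch, where the core count of 8 and the script name `train.py` are placeholders:

```shell
# Hypothetical spark-submit invocation: each task claims all 8 executor cores,
# so only one spark-tensorflow-distributor worker lands on each executor and
# it has the full 8 cores available for TensorFlow's thread pools.
spark-submit \
  --conf spark.executor.cores=8 \
  --conf spark.task.cpus=8 \
  train.py
```

With this setup, a `MirroredStrategyRunner(num_slots=N, use_gpu=False)` run should place its N CPU workers on N distinct executors. Would this be the recommended approach, or is there a cleaner way?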