Problem with Spark assembly JARs
When I try to run my process, composed of these operators: Retrieve from Hive, Set Role, and Decision Tree (Spark), I get this error:
the specified spark assembly jar, archive or lib directory does not exist or cannot be read. Please check the assembly jar location on the advanced connection panel and make sure that the Hadoop user has read permission on it.
Can anyone help me understand why this might be happening? What additional configuration do I need in order to run Spark successfully?
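For reference, the error suggests verifying that the configured jar path exists and is readable. A minimal check along these lines might help narrow it down (the path below is an assumption; substitute the actual assembly jar location from the advanced connection panel):

```shell
#!/bin/sh
# Hypothetical local path -- replace with the Spark assembly jar, archive,
# or lib directory configured in the advanced connection panel.
JAR=/opt/spark/lib/spark-assembly.jar

# Report whether the configured path exists and is readable by this user.
if [ -r "$JAR" ]; then
  echo "readable"
else
  echo "missing or unreadable"
fi
```

If the path lives on HDFS rather than the local filesystem, the analogous check would be listing it with the Hadoop CLI (e.g. `hdfs dfs -ls <path>`) as the Hadoop user, to confirm read permission.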