Access right problem with SparkContext

Hi,

I am having trouble using PySpark: when I try to use the global SparkContext, sc, in the notebook it says it is not defined, and in my Jupyter terminal I get the following error:

21/04/18 13:35:28 ERROR SparkContext: Error initializing SparkContext.
org.apache.hadoop.security.AccessControlException: Permission denied: user=magron, access=WRITE, inode="/user":hdfs:hdfs:drwxr-xr-x

I tried troubleshooting, but every solution I found tells me to create a directory in HDFS to which I have read and write access. Even if that is the right solution, I cannot apply it myself: when I try to access the HDFS /user directory I get a permission denied error.

Thank you

Top comment

I have created a home directory for you in hdfs with your username and provided you with write access.
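For reference, the admin-side fix can be sketched roughly as follows (a sketch, not the exact commands the admin ran; it assumes the username magron from the error message and must be run as the HDFS superuser):

```shell
# Create a home directory for the user in HDFS
hdfs dfs -mkdir -p /user/magron

# Transfer ownership so the user has read/write access to it
hdfs dfs -chown magron:magron /user/magron

# Verify the new directory's owner and permissions
hdfs dfs -ls /user
```

After this, SparkContext can write its scratch data under /user/magron instead of failing on the root-owned /user inode.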

Hello, I have the same problem...
