python - Error initializing SparkContext: java.io.IOException: No space left on device


After executing the following shell command to launch a PySpark session:

    pyspark --master yarn-client --num-executors 16 --driver-memory 16g --executor-memory 6g

I get the following error:

    ERROR SparkContext: Error initializing SparkContext.
    java.io.IOException: No space left on device
        at java.io.FileOutputStream.writeBytes(Native Method)
        at java.io.FileOutputStream.write(FileOutputStream.java:345)
        at java.util.zip.DeflaterOutputStream.deflate(DeflaterOutputStream.java:253)
        ...

It seems the slaves are out of disk space. How can I clear this up?

EDIT: This is the output of running df -h on the virtual machine I want to launch the job on:

    Filesystem                 Size  Used Avail Use% Mounted on
    /dev/mapper/rootvg01-lv01   20G   18G  2.5G  88% /
    devtmpfs                    16G     0   16G   0% /dev
    tmpfs                       16G  5.7G   10G  36% /dev/shm
    tmpfs                       16G  1.6G   15G  11% /run
    tmpfs                       16G     0   16G   0% /sys/fs/cgroup
    /dev/mapper/rootvg01-lv03   20G  933M   19G   5% /var
    /dev/mapper/rootvg01-lv02  2.0G   33M  2.0G   2% /tmp
    /dev/sda1                  997M   92M  905M  10% /boot
    /dev/mapper/rootvg01-lv04  1.6T  892G  734G  55% /data
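The root filesystem (/) is 88% full with only 2.5G free, while /data has 734G available. Spark writes shuffle and spill files to its scratch directory, which defaults to /tmp, so one common approach is to redirect that scratch space to the larger mount. This is a sketch, not a confirmed fix for this cluster: the path /data/spark-tmp is a hypothetical directory you would need to create, and note that when running under YARN, Spark documents that spark.local.dir is superseded by the NodeManager's yarn.nodemanager.local-dirs setting, so the YARN local dirs may also need to point at /data:

    # Hypothetical scratch directory on the large /data mount
    mkdir -p /data/spark-tmp

    # Pass the scratch location at launch time (read when not overridden by YARN)
    pyspark --master yarn-client --num-executors 16 \
        --driver-memory 16g --executor-memory 6g \
        --conf spark.local.dir=/data/spark-tmp

Alternatively, the same property can be set persistently in conf/spark-defaults.conf as a `spark.local.dir /data/spark-tmp` line, so every job picks it up without extra flags.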

