I am trying to establish a development environment to play around with Apache Spark, specifically pyspark, inside a Docker container running Ubuntu 16.04. For the sake of maintaining consistent development environments when different developers contribute to the code, I require all development to take place in a well-defined Docker container.

My issue is that I cannot get around the following Java error when I run the `pyspark` executable:
```
Type "help", "copyright", "credits" or "license" for more information.
Exception in thread "main" java.lang.ExceptionInInitializerError
        at org.apache.spark.SparkConf$.<init>(SparkConf.scala:716)
        at org.apache.spark.SparkConf$.<clinit>(SparkConf.scala)
        at org.apache.spark.SparkConf.$anonfun$getOption$1(SparkConf.scala:389)
        at org.apache.spark.SparkConf.getOption(SparkConf.scala:389)
        at org.apache.spark.SparkConf.get(SparkConf.scala:251)
        at org.apache.spark.deploy.SparkHadoopUtil$.org$apache$spark$deploy$SparkHadoopUtil$$appendS3AndSparkHadoopConfigurations(SparkHadoopUtil.scala:463)
        at org.apache.spark.deploy.SparkHadoopUtil$.newConfiguration(SparkHadoopUtil.scala:436)
        at org.apache.spark.deploy.SparkSubmit.$anonfun$prepareSubmitEnvironment$3(SparkSubmit.scala:334)
        at scala.Option.getOrElse(Option.scala:138)
        at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:334)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:143)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.net.UnknownHostException: linuxkit-025000000001: linuxkit-025000000001: Name or service not known
        at java.net.InetAddress.getLocalHost(InetAddress.java:1506)
        at org.apache.spark.util.Utils$.findLocalInetAddress(Utils.scala:946)
        at org.apache.spark.util.Utils$.localIpAddress$lzycompute(Utils.scala:939)
        at org.apache.spark.util.Utils$.localIpAddress(Utils.scala:939)
        at org.apache.spark.util.Utils$.$anonfun$localCanonicalHostName$1(Utils.scala:996)
        at org.apache.spark.util.Utils$.localCanonicalHostName(Utils.scala:996)
        at org.apache.spark.internal.config.package$.<init>(package.scala:302)
        at org.apache.spark.internal.config.package$.<clinit>(package.scala)
Caused by: java.net.UnknownHostException: linuxkit-025000000001: Name or service not known
        at java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method)
        at java.net.InetAddress$2.lookupAllHostAddr(InetAddress.java:929)
        at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1324)
        at java.net.InetAddress.getLocalHost(InetAddress.java:1501)
conn_info_file: /tmp/tmpiuwhok7q/tmplief2cba
  File "/home/rmarkbio/project/spark-2.4.2-bin-hadoop2.7/python/pyspark/shell.py", line 38, in <module>
  File "/home/rmarkbio/project/spark-2.4.2-bin-hadoop2.7/python/pyspark/context.py", line 316, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway(conf)
  File "/home/rmarkbio/project/spark-2.4.2-bin-hadoop2.7/python/pyspark/java_gateway.py", line 46, in launch_gateway
  File "/home/rmarkbio/project/spark-2.4.2-bin-hadoop2.7/python/pyspark/java_gateway.py", line 109, in _launch_gateway
    raise Exception("Java gateway process exited before sending its port number")
Exception: Java gateway process exited before sending its port number
```
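As far as I can tell, the "Java gateway process exited" exception is only a symptom: the root `Caused by` above is an `UnknownHostException`, meaning the JVM cannot resolve the machine's own hostname (`linuxkit-025000000001`). The same lookup failure can be reproduced without Spark at all; here is a minimal check (my own sketch, assuming Python 3 is available in the container):

```python
import socket

# Ask for our own hostname, then try to resolve it -- this mirrors what
# Java's InetAddress.getLocalHost() does in the stack trace above.
name = socket.gethostname()
print("hostname:", name)

try:
    print("resolves to:", socket.gethostbyname(name))
except socket.gaierror as exc:
    # The failure I would expect here: no /etc/hosts or DNS entry exists
    # for the hostname, so the lookup fails with "Name or service not known".
    print("lookup failed:", exc)
```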
My Dockerfile is defined as follows (the `ENTRYPOINT` arguments are omitted here):

```
FROM ubuntu:16.04
ENTRYPOINT ...

RUN echo '#!/bin/sh' > /usr/sbin/policy-rc.d \
 && echo 'exit 101' >> /usr/sbin/policy-rc.d \
 && dpkg-divert --local --rename --add /sbin/initctl \
 && cp -a /usr/sbin/policy-rc.d /sbin/initctl \
 && sed -i 's/^exit.*/exit 0/' /sbin/initctl \
 && echo 'force-unsafe-io' > /etc/dpkg/dpkg.cfg.d/docker-apt-speedup \
 ...
```
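What I have gathered so far suggests two possible workarounds, though I have not confirmed either in this exact image. First, Spark's `Utils.findLocalInetAddress` (visible in the trace above) checks the `SPARK_LOCAL_IP` environment variable before falling back to `InetAddress.getLocalHost()`, so setting it should sidestep the hostname lookup entirely; the equivalent for the interactive shell would be `export SPARK_LOCAL_IP=127.0.0.1` before running `pyspark`. A sketch, with a hypothetical app name:

```python
import os
from pyspark.sql import SparkSession

# SPARK_LOCAL_IP is a documented Spark environment variable; when it is set,
# Utils.findLocalInetAddress uses it directly instead of resolving the
# container's hostname, which is the call that fails above. It must be set
# before the SparkSession is created, because pyspark's launch_gateway
# copies os.environ into the child JVM's environment.
os.environ["SPARK_LOCAL_IP"] = "127.0.0.1"

spark = (
    SparkSession.builder
    .master("local[*]")
    .appName("docker-dev-smoke-test")  # hypothetical name, anything works
    .getOrCreate()
)
print(spark.range(10).count())  # trivial job to confirm the context is alive
spark.stop()
```

Second, making the hostname itself resolvable should also satisfy the lookup without touching Spark: either add an `/etc/hosts` entry mapping the container's hostname to `127.0.0.1`, or start the container with `docker run --hostname` set to a name that resolves.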