
Error when creating spark session #16

Closed
sambhav opened this issue May 8, 2020 · 6 comments
Comments


sambhav commented May 8, 2020

Exception in thread Thread-4:
Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/Users/skothari44/dev/jupyter-test/.venv/lib/python3.7/site-packages/sparkmonitor/kernelextension.py", line 125, in run
    self.onrecv(msg)
  File "/Users/skothari44/dev/jupyter-test/.venv/lib/python3.7/site-packages/sparkmonitor/kernelextension.py", line 144, in onrecv
    'msg': msg
  File "/Users/skothari44/dev/jupyter-test/.venv/lib/python3.7/site-packages/sparkmonitor/kernelextension.py", line 226, in sendToFrontEnd
    monitor.send(msg)
  File "/Users/skothari44/dev/jupyter-test/.venv/lib/python3.7/site-packages/sparkmonitor/kernelextension.py", line 56, in send
    self.comm.send(msg)
AttributeError: 'ScalaMonitor' object has no attribute 'comm'
itsjafer (Owner) commented

This is a difficult error to debug. The kernel extension is complaining that the Scala listener has no comm; this usually happens because the listener wasn't able to find a Spark instance.

One solution might be to restart your kernel and refresh the page when you encounter this. I've found that installing the extension on a kernel that was already running can sometimes result in wonky errors that are fixed by simply stopping and restarting the kernel and refreshing the page.


metasim commented Aug 27, 2020

I'm also having this problem. @samj1912 Figure anything out?


metasim commented Aug 31, 2020

I note it happens even when the extension is loaded directly from the notebook (in lieu of kernel initialization):

[screenshot attached]


metasim commented Aug 31, 2020

The extension thinks it's loaded:

[screenshot attached]


metasim commented Aug 31, 2020

So my current read is that register_comm is being called, but the self.target_func callback isn't being invoked before comm is dereferenced in send.

def register_comm(self):
    """Register a comm_target which will be used by
    frontend to start communication."""
    self.ipython.kernel.comm_manager.register_target(
        'SparkMonitor', self.target_func)
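
For context, a minimal sketch of the usual ipykernel comm-target handshake (assuming the standard pattern; this is not the exact sparkmonitor source): self.comm is only assigned inside the callback that runs once the frontend opens the 'SparkMonitor' target, which is why send() can race ahead of it.

# Sketch of the comm-open callback, assuming the standard ipykernel pattern.
# The 'comm' argument is created when the frontend opens the 'SparkMonitor'
# target; only from that point does ScalaMonitor have a 'comm' attribute.
def target_func(self, comm, open_msg):
    """Called when the frontend opens the 'SparkMonitor' comm."""
    self.comm = comm                            # attribute exists only from here on
    self.comm.on_msg(self.handle_comm_message)  # route frontend messages back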


footplus commented Nov 4, 2020

It can be worked around by checking whether the frontend has already connected before using the comm object. It might be interesting to queue messages, maybe?

I suspect #18 is another manifestation of this thread crash (but from the Spark listener side).

diff --git a/sparkmonitor/kernelextension.py b/sparkmonitor/kernelextension.py
index 6a94f5e..fbf3c04 100644
--- a/sparkmonitor/kernelextension.py
+++ b/sparkmonitor/kernelextension.py
@@ -41,6 +41,7 @@ class ScalaMonitor:
         ipython is the instance of ZMQInteractiveShell
         """
         self.ipython = ipython
+        self.comm = None

     def start(self):
         """Creates the socket thread and returns assigned port"""
@@ -53,7 +54,8 @@ class ScalaMonitor:

     def send(self, msg):
         """Send a message to the frontend"""
-        self.comm.send(msg)
+        if self.comm:
+            self.comm.send(msg)

     def handle_comm_message(self, msg):
         """Handle message received from frontend
