Fix a silly typo in ein-kernel.el. Querying kernelspecs is slightly more
efficient: results are now cached when the url is either a string or a port
(previously results were only cached for ports).
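A minimal sketch of the caching idea (names here are illustrative, not EIN's
actual internals); the point is that an `equal'-keyed table lets both URL
strings and port numbers serve as cache keys:

    ;; Illustrative sketch only, not EIN's real code.
    (defvar my/ein--kernelspec-cache (make-hash-table :test 'equal)
      "Kernelspec query results keyed by URL string or port number.")

    (defun my/ein-cached-kernelspecs (url-or-port fetch-fn)
      "Return kernelspecs for URL-OR-PORT, calling FETCH-FN only on a cache miss."
      (or (gethash url-or-port my/ein--kernelspec-cache)
          (puthash url-or-port (funcall fetch-fn url-or-port)
                   my/ein--kernelspec-cache)))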
This reduces some of the complexity in testein.py, but unfortunately running
tests is still unreliable. Batch-mode tests do not work at all on Windows, and
tests run from inside Emacs often need to be run multiple times before they
pass. The worst offender is the delete notebook test, which usually passes only
about 1 out of 3 tries.
Testing seems to have revealed a couple of bugs, so win??
There may be situations where a Jupyter server restarts at a new url/port while
a notebook is open. This function allows the user to specify the new url/port
and continue using the notebook (or, at the very least, save any changes made
while the server was down and rebooting). EIN will helpfully try to restart the
kernel once the url is changed. Jupyter may complain about the session not
being found, but this does not seem to affect the running kernel.
Make edit-cell buffer names even more unique, and check that an edit-cell
buffer does not already exist before creating one. Also try to be even more
aggressive in limiting output in backtraces when debugging EIN.
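A hedged sketch of the reuse check, using only standard Emacs primitives (the
buffer-name format is illustrative, not EIN's exact scheme):

    ;; Sketch: reuse an existing edit-cell buffer instead of creating a
    ;; duplicate with the same name.
    (defun my/edit-cell-buffer (notebook-name cell-id)
      "Return the edit-cell buffer for CELL-ID, creating it only if needed."
      (let ((name (format "*edit-cell: %s/%s*" notebook-name cell-id)))
        (or (get-buffer name)            ; buffer already exists: reuse it
            (get-buffer-create name))))  ; otherwise create it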
Modify ein:get-kernel so it can retrieve the kernel from the local
worksheet variable in edit-cell-mode. This is done so that functions
such as ein:completer-complete and tooltips / inline help are
available to be bound by the user.
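For example, a user could bind completion in edit-cell buffers along these
lines; the hook name below is an assumption, so check which hook your EIN
version runs for edit-cell-mode:

    ;; Example user configuration; `ein:edit-cell-mode-hook' is assumed.
    (add-hook 'ein:edit-cell-mode-hook
              (lambda ()
                (local-set-key (kbd "C-c C-i") #'ein:completer-complete)))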
Check that `filename` actually exists before jumping to it.
Without this patch, M-. on an arbitrary word ZZZ creates two empty buffers,
one named "None" and the other named (yes, this is the actual buffer name):
"code.py:456: UserWarning: Argument given (ZZZ) can't be found as a variable or as a filename."
Login and notebooklist requests have a tendency to fail with the curl request
backend when there is no cookie jar file. Request will create the cookie jar,
but with the asynchronous nature of EIN this apparently happens too slowly,
causing content REST queries to fail. My tactic here is to repeat the call
once, and only once, when a failure is detected. The problem is difficult to
reproduce reliably, so I'm not sure whether this is a good fix or not. Time
will tell.
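A synchronous sketch of the retry-once idea; EIN's requests are actually
asynchronous, so the real change threads a retry flag through the error
callback instead, but the control flow is the same:

    ;; Illustration only: call QUERY-FN, and on the first failure try
    ;; exactly one more time before letting the error propagate.
    (defun my/query-with-one-retry (query-fn url-or-port &optional retried-p)
      (condition-case err
          (funcall query-fn url-or-port)
        (error
         (if retried-p
             (signal (car err) (cdr err))   ; second failure: give up
           (my/query-with-one-retry query-fn url-or-port t)))))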
When the customizable variable `ein:enable-keepalive` is non-nil, EIN will
automatically call `ein:notebooklist-enable-keepalive` when calling
`ein:notebooklist-open`. Keepalive has also been tweaked to make sure it does
not create multiple timers.
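For example, in your init file:

    ;; Turn keepalive on for every `ein:notebooklist-open':
    (setq ein:enable-keepalive t)

    ;; It can still be started by hand for the current session:
    ;;   M-x ein:notebooklist-enable-keepalive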
When calling `ein:jupyter-server-stop`, ask the user if they are sure and give
them the option to save any unsaved notebooks before killing the server. Before
closing the server, also close any open notebook buffers.
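The flow, roughly, with standard Emacs primitives standing in for EIN's
notebook-saving and buffer-closing steps (a sketch, not the actual
implementation):

    ;; Rough sketch: confirm, offer to save, then shut down via STOP-FN.
    (defun my/stop-server-politely (stop-fn)
      (when (y-or-n-p "Really stop the notebook server? ")
        (save-some-buffers)   ; stand-in for saving open notebooks
        (funcall stop-fn)))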
For the moment we only support the current version of Jupyter (i.e. 4.3.1 or
greater).
The special commands are `ein:jupyter-server-start` to start a notebook server
and `ein:jupyter-server-stop` to, clearly, stop a notebook server.
On starting a server, EIN will try to determine the url and token for accessing
the notebook server, log in, and open the notebook list automatically.
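Typical usage looks something like the following; the paths are examples and
the argument list for the non-interactive call is an assumption, so consult
`C-h f ein:jupyter-server-start' for your version of EIN:

    ;; Interactively:  M-x ein:jupyter-server-start
    ;; From init.el or a script (argument list is an assumption):
    (ein:jupyter-server-start (executable-find "jupyter") "~/notebooks")

    ;; When finished:  M-x ein:jupyter-server-stop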
Changed the description of manual installation in the "usage" section, making
it more explicit that you only need to manually put files in the load path when
not using MELPA.
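For a manual install that means something along these lines; the path is an
example and the exact feature(s) to require may differ by version (MELPA users
skip this entirely):

    ;; Only needed when EIN is not installed from MELPA.
    (add-to-list 'load-path "~/src/ein/lisp")
    (require 'ein)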
Initial steps toward integrating magit's sections into EIN. Also an attempt to
make notebooklist buffers more "stable" by adding a slight pause during session
queries.
Display the value of an object's __repr__, a link to its source when available,
and its documentation.
Add a command and keybinding (C-ci) to the notebook buffer keymap.
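To move the inspector to a different key, something like the following should
work; the command name is an assumption (verify the current binding with
describe-key first):

    ;; `ein:inspect-object' is assumed; check your EIN version.
    (with-eval-after-load 'ein-notebook
      (define-key ein:notebook-mode-map (kbd "C-c I") #'ein:inspect-object))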
Get rid of an unnecessary API check when rendering the notebooklist buffer.
Fix some mistyped worksheet slot accessors. Correct the
`*running-ipython-version*` hash to accept strings (i.e. URLs) as well as just
port numbers.