
Newpyter - SWAP Jupyter Rewrite
Closed, ResolvedPublic

Description

This parent task is meant to lay out potential designs for making Jupyter Notebooks a first-class citizen of the Analytics environment. This means:

  • Distribution by default - all notebooks should run either in YARN or Kubernetes - not doing this :(
  • Notebook sharing - notebooks should be readable and shareable between engineers - still not sure how :(
  • Easy to use and work with existing Analytics tools and data (Hive, Spark, HDFS, Druid, etc.)
  • Easy to understand, manage, and deploy dependencies
  • Should integrate with TBD ML Pipeline project

This would be a full SWAP rewrite.

Ideas:

Design Document: https://docs.google.com/document/d/1r-oqMXViWvQCqsYz0qzezZBWpip8LvkvCGF6GivFB_8/edit#heading=h.vpanev2oq14b


Final outcome:

We were able to satisfy 2 out of 5 of the stated outcomes:

  • Easy to use and work with existing Analytics tools and data (Hive, Spark, HDFS, Druid, etc.)
  • Easy to understand, manage, and deploy dependencies

The other outcomes were difficult to support for notebooks running inside of WMF production networks. ML integration work is being experimented with in T243089.

Details

Repo                          Branch      Lines +/-
operations/puppet             production  +20 -0
operations/puppet             production  +4 -9
operations/puppet             production  +6 -0
operations/puppet             production  +5 -4
operations/debs/anaconda-wmf  debian      +1 -3
operations/puppet             production  +44 -44
operations/puppet             production  +27 -10
operations/debs/anaconda-wmf  debian      +37 -37
operations/debs/anaconda-wmf  debian      +3 -3
operations/debs/anaconda-wmf  debian      +61 -28
operations/puppet             production  +9 -5
operations/puppet             production  +688 -2
operations/puppet             production  +1 -1
operations/puppet             production  +3 -4
operations/puppet             production  +1 -1
operations/puppet             production  +1 -1
operations/puppet             production  +11 -5
operations/puppet             production  +0 -0
operations/puppet             production  +1 -1
operations/puppet             production  +3 -12
operations/puppet             production  +1 -1
operations/puppet             production  +1 -1
operations/puppet             production  +4 -0
operations/puppet             production  +10 -0

Related Objects

Status       Subtype      Assigned
Resolved                  Ottomata
Duplicate                 None
Declined                  None
Open                      None
Declined                  None
Invalid                   None
Duplicate                 Ottomata
Declined                  None
Resolved                  Ottomata
Resolved                  Ottomata
Resolved                  Ottomata
Resolved                  elukey
Open                      None
In Progress               bking
Open                      bking
Open                      bking
Resolved                  Ottomata
Resolved                  Ottomata
Resolved                  Ottomata
Resolved     BUG REPORT   mpopov
Resolved     BUG REPORT   Ottomata

Event Timeline


Thanks for the design doc @Ottomata. The Product Analytics team discussed this and we are hoping we can find a way to include notebook sharing or some sort of centralized repository for notebooks. We are flexible but having some way to share is high on our list of feature requests. Hoping we can continue to explore options before a plan is finalized.

Thanks! I just added another idea to the sharing section:

We may be able to do this naturally, since users' notebook files will be in HDFS. If we store the notebook files in a globally accessible place, rather than each user's HDFS homedir, perhaps everyone's Notebook Server file browser will be able to navigate and read each other's notebooks, while still isolating conda environments.

@Ottomata our team is super interested in this and we'd love to provide more feedback in addition to the comment Shay made about sharing. What's your timeline/when should we make sure to get feedback to you?

Hiya! I will likely begin working on this in the next week or two. There's a lot of backend infra work to be done before it will be usable, and I'm sure things will change along the way. So feedback is welcome anytime. I'd also love to have a person on your team to bounce ideas off of as I work (hopefully someone in US or EU timezones).

@Ottomata sorry, didn't see that you replied! Looping in @SNowick_WMF; please reach out to her and she'll be available to bounce ideas off of & can help coordinate responses from our team.

In a meeting with @SNowick_WMF @nshahquinn-wmf and @nettrom_WMF today, Neil pointed out a potential problem with the HDFS contents manager approach for notebook files. He uses git to manage his notebook projects, which include both notebook files and any other files that might be part of a project. If notebook files are stored in HDFS, but all other files are not, then using git on the CLI to sync notebook projects to GitHub will be pretty cumbersome. A user would have to manually hdfs dfs -get the .ipynb files in question down to the local filesystem temporarily, git add them, and then git push them.

We discussed a few workarounds, but I think this points out a serious flaw in the Newpyter proposal as is. Using HDFS as a semi-shared filesystem for .ipynb files only is going to be confusing for users. I expect issues beyond this git workflow to pop up because of the HDFS contents manager.
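As a sketch of the round-trip this would impose on users (hypothetical paths and a cluster-side session, so treat this as illustration rather than something runnable standalone):

```shell
# Hypothetical manual sync of an HDFS-stored notebook into a local git checkout.
hdfs dfs -get /user/alice/notebooks/analysis.ipynb ./analysis.ipynb
git add analysis.ipynb
git commit -m "Sync notebook from HDFS"
git push
# ...and any later edits made in Jupyter must be pushed back into HDFS by hand:
hdfs dfs -put -f ./analysis.ipynb /user/alice/notebooks/analysis.ipynb
```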

@elukey I'm beginning to think again that it won't be possible to do what we need to do without a real shared filesystem. If there were an HDFS mounter that worked well, our problems could be solved. Today I found fsspec with FUSE, which uses pyarrow. However, its FUSE support is experimental, and in my quick test it kinda works, but only barely. We could consider using that, or even our existing hadoop-hdfs-fuse, to mount the HDFS user dir into the yarn temporary directory when the notebook server is launched. Then the HDFS mounts would be as ephemeral as the yarn temporary dirs. Maybe that would be ok?

NFS would be even simpler.
@yuvipanda noted to me back in January that he is using NFS for a larger Jupyter userbase than we have. Would using NFS for non-production, analytics-only use cases really be so bad?

@Ottomata, what about the data living in HDFS and the code living in a shared folder mounted through sshfs (or similar)? I understand there might be some security issues with sshfs, but if we just store code, the data-leak problems should be smaller.

Hm, @diego I think the problem is more complex than that. The intention is to run notebooks in Yarn, which means that your Notebook Server will run on any Hadoop worker. By default the Notebook Server will only have access to an empty temporary directory which will only exist while your Notebook Server process is alive. So, even if we could sshfs, where would we sshfs to? :)

@Ottomata, the sshfs will go to a centralized (NAS-like) ~/ folder, with a small quota just to store the code (very similar to what you would store in a git repository). Workers will mount that ~/ from the NAS as the home folder instead of the temporary (local) directory you are proposing. So, instead of destroying the home folder after each session, that folder will live permanently on the NAS.

I see ya, so just using sshfs instead of NFS. Aye either would be functionally equivalent, although I'd expect implementing that securely with sshfs in our stack would be difficult, since we don't allow ssh key forwarding or ssh password authentication.

(Heh, if we had a real NAS, we could just hook it up to all the worker nodes.)

Change 620784 had a related patch set uploaded (by Ottomata; owner: Ottomata):
[operations/puppet@production] Add profile::analytics::jupyterhub

https://gerrit.wikimedia.org/r/620784

Change 620784 merged by Ottomata:
[operations/puppet@production] Add profile::analytics::jupyterhub

https://gerrit.wikimedia.org/r/620784

Change 620942 had a related patch set uploaded (by Ottomata; owner: Ottomata):
[operations/puppet@production] jupyterhub_config.py.erb - check type before rendering

https://gerrit.wikimedia.org/r/620942

Change 620942 merged by Ottomata:
[operations/puppet@production] jupyterhub_config.py.erb - check type before rendering

https://gerrit.wikimedia.org/r/620942

Change 620945 had a related patch set uploaded (by Ottomata; owner: Ottomata):
[operations/puppet@production] Fix typo in jupyterhub::server

https://gerrit.wikimedia.org/r/620945

Change 620945 merged by Ottomata:
[operations/puppet@production] Fix typo in jupyterhub::server

https://gerrit.wikimedia.org/r/620945

Change 620959 had a related patch set uploaded (by Ottomata; owner: Ottomata):
[operations/puppet@production] jupyterhub-conda.systemd.erb - allow to write into @data_path

https://gerrit.wikimedia.org/r/620959

Change 620959 merged by Ottomata:
[operations/puppet@production] jupyterhub-conda.systemd.erb - allow to write into @data_path

https://gerrit.wikimedia.org/r/620959

Change 620962 had a related patch set uploaded (by Ottomata; owner: Ottomata):
[operations/puppet@production] jupyterhub-conda - Always set proxy_pid_file to a writable path

https://gerrit.wikimedia.org/r/620962

Change 620962 merged by Ottomata:
[operations/puppet@production] jupyterhub-conda - Always set proxy_pid_file to a writable path

https://gerrit.wikimedia.org/r/620962

Change 620965 had a related patch set uploaded (by Ottomata; owner: Ottomata):
[operations/puppet@production] jupterhub_config.py.erb - fix missing comma

https://gerrit.wikimedia.org/r/620965

Change 620965 merged by Ottomata:
[operations/puppet@production] jupterhub_config.py.erb - fix missing comma

https://gerrit.wikimedia.org/r/620965

Change 620988 had a related patch set uploaded (by Ottomata; owner: Ottomata):
[operations/puppet@production] jupyterhub - make jupyterhub-singleuser-conda-env.sh executable

https://gerrit.wikimedia.org/r/620988

Change 620988 merged by Ottomata:
[operations/puppet@production] jupyterhub - make jupyterhub-singleuser-conda-env.sh executable

https://gerrit.wikimedia.org/r/620988

Change 620993 had a related patch set uploaded (by Ottomata; owner: Ottomata):
[operations/puppet@production] jupyterhub - Create config files explicitly with proper modes

https://gerrit.wikimedia.org/r/620993

Change 620993 merged by Ottomata:
[operations/puppet@production] jupyterhub - Create config files explicitly with proper modes

https://gerrit.wikimedia.org/r/620993

Change 620999 had a related patch set uploaded (by Ottomata; owner: Ottomata):
[operations/puppet@production] jupyterhub - make sure $config_path is created

https://gerrit.wikimedia.org/r/620999

Change 620999 merged by Ottomata:
[operations/puppet@production] jupyterhub - make sure $config_path is created

https://gerrit.wikimedia.org/r/620999

Change 621009 had a related patch set uploaded (by Ottomata; owner: Ottomata):
[operations/puppet@production] jupyterhub - pass conda_env_path to jupyterhub_singleuser_conda_env_script

https://gerrit.wikimedia.org/r/621009

Change 621009 merged by Ottomata:
[operations/puppet@production] jupyterhub - pass conda_env_path to jupyterhub_singleuser_conda_env_script

https://gerrit.wikimedia.org/r/621009

Change 621011 had a related patch set uploaded (by Ottomata; owner: Ottomata):
[operations/puppet@production] jupyterhub - don't allow named servers

https://gerrit.wikimedia.org/r/621011

Change 621011 merged by Ottomata:
[operations/puppet@production] jupyterhub - don't allow named servers

https://gerrit.wikimedia.org/r/621011

Change 621012 had a related patch set uploaded (by Ottomata; owner: Ottomata):
[operations/puppet@production] jupyterhub - Subscribe jupyterhub to spawners.py

https://gerrit.wikimedia.org/r/621012

Change 621012 merged by Ottomata:
[operations/puppet@production] jupyterhub - Subscribe jupyterhub to spawners.py

https://gerrit.wikimedia.org/r/621012

Change 621315 had a related patch set uploaded (by Ottomata; owner: Ottomata):
[operations/puppet@production] jupyterhub - sort profiles by user name; increase spawner timeout

https://gerrit.wikimedia.org/r/621315

Change 621315 merged by Ottomata:
[operations/puppet@production] jupyterhub - sort profiles by user name; increase spawner timeout

https://gerrit.wikimedia.org/r/621315

I just tried out Newpyter on stat1008 and had one big issue: I can't deactivate my current conda environment.

When I try source deactivate, I get bash: deactivate: No such file or directory. When I try invoking it using the full path[1], I get a never-ending stream of errors like:

bash: [: 1839: unary operator expected
Deactivating /home/neilpquinn-wmf/.conda/envs/2021-02-25T17.18.53_neilpquinn-wmf
bash: conda: command not found
bash: [: 1840: unary operator expected
Deactivating /home/neilpquinn-wmf/.conda/envs/2021-02-25T17.18.53_neilpquinn-wmf
bash: conda: command not found
bash: [: 1841: unary operator expected
Deactivating /home/neilpquinn-wmf/.conda/envs/2021-02-25T17.18.53_neilpquinn-wmf

When I try source conda-deactivate-stacked, I get the same stream of errors. If I try to use the full path, as recommended when I try to activate another environment[2], I get No such file or directory.

[1]: source /home/neilpquinn-wmf/.conda/envs/2021-02-25T17.18.53_neilpquinn-wmf/bin/deactivate
[2]: source /home/neilpquinn-wmf/.conda/envs/2021-02-25T17.18.53_neilpquinn-wmf/bin/conda-deactivate-stacked

Because of the bug above, I updated the Jupyter page on Wikitech to remove the recommendation that all users switch to Newpyter. I look forward to continuing to test it and then helping to spread the word to switch once it's ready 😊

Another observation: I attempted to use wmfdata to avoid replicating spark session code. The wmf base conda env contains an older version, and upgrading it fails with

!pip install --upgrade git+https://github.com/wikimedia/wmfdata-python.git@release
[...]
  Attempting uninstall: wmfdata
    Found existing installation: wmfdata 1.0.4
    Uninstalling wmfdata-1.0.4:
ERROR: Could not install packages due to an EnvironmentError: [Errno 30] Read-only file system: 'WHEEL'

It seems reasonable to have the base env be read-only, though Newpyter should also allow users to upgrade packages that are installed in the base env.

@nshahquinn-wmf are you using the ssh terminal or the Notebook Terminal?

In the ssh terminal I can't reproduce:

00:43:04 [@stat1008:/home/otto] 1 $ source conda-activate-stacked
Activating /home/otto/.conda/envs/2021-02-17T18.50.02_otto stacked on /usr/lib/anaconda-wmf...
Tue 02 Mar 2021 12:43:14 AM UTC Activated user conda environment in /home/otto/.conda/envs/2021-02-17T18.50.02_otto stacked on /usr/lib/anaconda-wmf.

     active environment : 2021-02-17T18.50.02_otto
    active env location : /home/otto/.conda/envs/2021-02-17T18.50.02_otto
            shell level : 2
       user config file : /home/otto/.condarc
 populated config files : /home/otto/.conda/condarc
          conda version : 4.8.4
    conda-build version : 3.18.11
         python version : 3.7.8.final.0
       virtual packages : __glibc=2.28
       base environment : /usr/lib/anaconda-wmf  (read only)
           channel URLs : https://repo.anaconda.com/pkgs/main/linux-64
                          https://repo.anaconda.com/pkgs/main/noarch
                          https://repo.anaconda.com/pkgs/r/linux-64
                          https://repo.anaconda.com/pkgs/r/noarch
          package cache : /home/otto/.conda/pkgs
                          /usr/lib/anaconda-wmf/pkgs
       envs directories : /home/otto/.conda/envs
                          /usr/lib/anaconda-wmf/envs
               platform : linux-64
             user-agent : conda/4.8.4 requests/2.24.0 CPython/3.7.8 Linux/5.8.0-0.bpo.2-amd64 debian/10 glibc/2.28
                UID:GID : 2129:500
             netrc file : None
           offline mode : False


To deactivate the conda environments, run:
  source deactivate
or
  source /usr/lib/anaconda-wmf/bin/conda-deactivate-stacked

00:43:15 [@stat1008:/home/otto] $ source deactivate
Deactivating /home/otto/.conda/envs/2021-02-17T18.50.02_otto
Deactivating /usr/lib/anaconda-wmf
00:43:36 [@stat1008:/home/otto] $ conda info

     active environment : None
            shell level : 0
       user config file : /home/otto/.condarc
 populated config files : /home/otto/.conda/condarc
          conda version : 4.8.4
    conda-build version : 3.18.11
         python version : 3.7.8.final.0
       virtual packages : __glibc=2.28
       base environment : /usr/lib/anaconda-wmf  (read only)
           channel URLs : https://repo.anaconda.com/pkgs/main/linux-64
                          https://repo.anaconda.com/pkgs/main/noarch
                          https://repo.anaconda.com/pkgs/r/linux-64
                          https://repo.anaconda.com/pkgs/r/noarch
          package cache : /home/otto/.conda/pkgs
                          /usr/lib/anaconda-wmf/pkgs
       envs directories : /home/otto/.conda/envs
                          /usr/lib/anaconda-wmf/envs
               platform : linux-64
             user-agent : conda/4.8.4 requests/2.24.0 CPython/3.7.8 Linux/5.8.0-0.bpo.2-amd64 debian/10 glibc/2.28
                UID:GID : 2129:500
             netrc file : None
           offline mode : False

@fkaelin

Ah ah indeed it should. I think this is a pip thing. In my previous tests conda lets me install newer versions of things into my conda env.

For pip, it looks like you need --ignore-installed

pip install --ignore-installed --upgrade git+https://github.com/wikimedia/wmfdata-python.git@release

worked for me.

@nshahquinn-wmf are you using the ssh terminal or the Notebook Terminal?

This was in the notebook terminal. I just tried in a regular SSH terminal, and I don't get these errors there.

Some other initial thoughts:

  • If I try to run conda-activate-stacked while an environment is already activated, I get an error saying I need to use the deactivate command first. Why doesn't it just automatically run that command itself?
  • The read-only base environment seems to behave differently from user-created environments, which increases the learning curve. What's the benefit of defaulting to the base environment instead of to an auto-created "normal" user-created environment?
  • Why can't we automatically set the http_proxy and https_proxy environment variables?

Change 667909 had a related patch set uploaded (by Ottomata; owner: Ottomata):
[operations/debs/anaconda-wmf@debian] Fix bug in conda-deactivate-stacked that would cause infinite loop

https://gerrit.wikimedia.org/r/667909

Change 667913 had a related patch set uploaded (by Ottomata; owner: Ottomata):
[operations/debs/anaconda-wmf@debian] Symlink conda-(de)activate-stacked scripts into user env instead of cp

https://gerrit.wikimedia.org/r/667913

This was in the notebook terminal. I just tried in a regular SSH terminal, and I don't get these errors there.

Ah ha, I see why. Thanks, this is def a bug in the conda-deactivate-stacked script, and I see an improvement I can make in the activate one too. Fix here: https://gerrit.wikimedia.org/r/c/operations/debs/anaconda-wmf/+/667909.
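For readers hitting the same thing, here is a minimal sketch of the shell pitfall that produces errors of the form `bash: [: 1839: unary operator expected`. This is an illustrative guess at the failure class, not the actual script; see the gerrit change above for the real fix.

```shell
#!/usr/bin/env bash
# Hypothetical sketch, NOT the real conda-deactivate-stacked script. If
# CONDA_SHLVL is unset or empty (as it can be in the notebook terminal, which
# also lacks conda on its PATH), the unquoted test below degenerates to
# `[ 1839 -le ]`, and bash prints "[: 1839: unary operator expected". Inside a
# loop that relies on this test to terminate, the error repeats forever.
unset CONDA_SHLVL
i=1839

# Broken pattern (stderr suppressed here so the sketch runs cleanly):
if [ $i -le $CONDA_SHLVL ] 2>/dev/null; then
  echo "would deactivate level $i"
fi

# Safer pattern: quote the variable and supply a default value.
if [ "$i" -le "${CONDA_SHLVL:-0}" ]; then
  echo "deactivating level $i"
else
  echo "nothing to deactivate"
fi
```

With the variable unset, the broken test simply errors out, while the quoted-with-default form behaves predictably and prints "nothing to deactivate".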

The read-only base environment seems to behave differently from user-created environments, which increases the learning curve. What's the benefit of defaulting to the base environment instead of to an auto-created "normal" user-created environment?

By defaulting, do you mean in the Jupyter profile selector UI when you start your server? If you already have a user environment created, it will default to using the most recently created one. But, ya, I suppose we could change the UI default to be 'create new environment' instead of the read-only anaconda-wmf base env. Is that what you mean?

Why can't we automatically set the http_proxy and https_proxy environment variables?

In Jupyter we do, don't we? Do you mean on the ssh CLI?

This was in the notebook terminal. I just tried in a regular SSH terminal, and I don't get these errors there.

Ah ha, I see why. Thanks, this is def a bug in the conda-deactivate-stacked script, and I see an improvement I can make in the activate one too. Fix here: https://gerrit.wikimedia.org/r/c/operations/debs/anaconda-wmf/+/667909.

Nice!

The read-only base environment seems to behave differently from user-created environments, which increases the learning curve. What's the benefit of defaulting to the base environment instead of to an auto-created "normal" user-created environment?

By defaulting, do you mean in the Jupyter profile selector UI when you start your server? If you already have a user environment created, it will default to using the most recently created one. But, ya, I suppose we could change the UI default to be 'create new environment' instead of the read-only anaconda-wmf base env. Is that what you mean?

Essentially—I was thinking about removing the base environment option from the UI altogether. That way, a user who sticks to one environment has the same experience as before, but for a user who switches, all the environments (including the auto-created default) behave the same.

Why can't we automatically set the http_proxy and https_proxy environment variables?

In Jupyter we do, don't we? Do you mean on the ssh CLI?

Yeah, I think I was in an SSH terminal when I noticed this. I don't know if it's set by default in Jupyter—I haven't started from scratch in a long time. It would be nice to do a complete reset of my environment on one of the stat machines; that way I could try the full new-user experience.

It really simplifies things when the terminals behave identically (for example, @elukey's work on making them share Kerberos credentials was great).

Essentially—I was thinking about removing the base environment option from the UI altogether.

Hm, I suppose we could do that. I left it in because it is much faster to start from an existing environment than to create a new one, but perhaps you are right. Once they create their new conda env, it will be the same to start from it every other time. Ok, let's remove it.

It really simplifies things when the terminals behave identically.

In this case, the terminals do! You have to set the http proxy env vars manually (or add them to your ~/.bashrc) in the ssh terminal too. I think if you did add them to your .bashrc, it would work the same in both places.

https://wikitech.wikimedia.org/wiki/HTTP_proxy
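Concretely, that ~/.bashrc addition would look something like the following. This is a sketch: the proxy host and port are assumptions to verify against the wikitech HTTP_proxy page, not confirmed values from this thread.

```shell
# Hypothetical ~/.bashrc snippet; verify the exact proxy host/port on the
# wikitech HTTP_proxy page before using.
export http_proxy=http://webproxy.eqiad.wmnet:8080
export https_proxy=http://webproxy.eqiad.wmnet:8080
# Keep internal traffic off the proxy.
export no_proxy=127.0.0.1,localhost,.wmnet
```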

It really simplifies things when the terminals behave identically.

In this case, the terminals do! You have to set the http proxy env vars manually (or add them to your ~/.bashrc) in the ssh terminal too. I think if you did add them to your .bashrc, it would work the same in both places.

https://wikitech.wikimedia.org/wiki/HTTP_proxy

Hahaha, okay. So, why can't we set the env vars automatically so that things are both consistent and easy? 😊

A good question; I don't think SRE would want us to. But we do set it automatically in Jupyter... so maybe?

For me, I wouldn't want that. I wouldn't want my work to depend on constant access to the internet. For example, the fact that wmfdata (maybe it was the last release?) reaches out to the internet when you import it is a little strange; I'm glad that I know it is doing it. Since Jupyter encourages you to install packages with pip/conda, it makes sense to enable this there, but the Jupyter processes are contained in a systemd service that has a bit more control over what the user can do than a regular shell does. Anyway, maybe it could be done, buuuuut, let's not poke that nest at this time :)

Change 667909 merged by Ottomata:
[operations/debs/anaconda-wmf@debian] Fix bug in conda-deactivate-stacked that would cause infinite loop

https://gerrit.wikimedia.org/r/667909

Change 667913 merged by Ottomata:
[operations/debs/anaconda-wmf@debian] Symlink conda-(de)activate-stacked scripts into user env instead of cp

https://gerrit.wikimedia.org/r/667913

Change 668425 had a related patch set uploaded (by Ottomata; owner: Ottomata):
[operations/debs/anaconda-wmf@debian] Don't install a copy of R in a stacked user conda env

https://gerrit.wikimedia.org/r/668425

Change 668439 had a related patch set uploaded (by Ottomata; owner: Ottomata):
[operations/puppet@production] Exclude the readonly conda base env from the list of Jupyter profiles

https://gerrit.wikimedia.org/r/668439

Change 668445 had a related patch set uploaded (by Ottomata; owner: Ottomata):
[operations/puppet@production] Rename CONDA_BASE_ENV_PATH to CONDA_BASE_ENV_PREFIX

https://gerrit.wikimedia.org/r/668445

Change 668446 had a related patch set uploaded (by Ottomata; owner: Ottomata):
[operations/debs/anaconda-wmf@debian] Rename CONDA_BASE_ENV_PATH to CONDA_BASE_ENV_PREFIX

https://gerrit.wikimedia.org/r/668446

Change 668446 merged by Ottomata:
[operations/debs/anaconda-wmf@debian] Rename CONDA_BASE_ENV_PATH to CONDA_BASE_ENV_PREFIX

https://gerrit.wikimedia.org/r/668446

Change 668439 merged by Ottomata:
[operations/puppet@production] Exclude the readonly conda base env from the list of Jupyter profiles

https://gerrit.wikimedia.org/r/668439

Change 668445 merged by Ottomata:
[operations/puppet@production] Rename CONDA_BASE_ENV_PATH to CONDA_BASE_ENV_PREFIX

https://gerrit.wikimedia.org/r/668445

Change 668466 had a related patch set uploaded (by Ottomata; owner: Ottomata):
[operations/puppet@production] Jupyter - never use webproxy for *.wmnet URLs and use system cacerts

https://gerrit.wikimedia.org/r/668466

Change 668485 had a related patch set uploaded (by Ottomata; owner: Ottomata):
[operations/puppet@production] Spark JVMs inherit system http proxy settings

https://gerrit.wikimedia.org/r/668485

Change 668425 merged by Ottomata:
[operations/debs/anaconda-wmf@debian] Don't install a copy of R in a stacked user conda env

https://gerrit.wikimedia.org/r/668425

Change 668466 merged by Ottomata:
[operations/puppet@production] Jupyter - never use webproxy for *.wmnet URLs and make Java use system cacerts

https://gerrit.wikimedia.org/r/668466

Change 668485 merged by Ottomata:
[operations/puppet@production] Spark JVMs inherit system http settings

https://gerrit.wikimedia.org/r/668485

Because of the bug above, I updated the Jupyter page on Wikitech to remove the recommendation that all users switch to Newpyter. I look forward to continuing to test it and then helping to spread the word to switch once it's ready 😊

@nshahquinn-wmf I reverted this change to put the recommendation to use Newpyter back in the docs.

A new anaconda-wmf is out with the newer version of wmfdata and with some improvements to conda integration with spark and python requests.

Change 676999 had a related patch set uploaded (by Ottomata; author: Ottomata):

[operations/puppet@production] Configure Notebook Terminal to not use login shell

https://gerrit.wikimedia.org/r/676999

Change 677341 had a related patch set uploaded (by Ottomata; author: Ottomata):

[operations/puppet@production] Fix bug in jupyterhub-conda that was not allowing users to start new Notebook servers

https://gerrit.wikimedia.org/r/677341

Change 677341 merged by Ottomata:

[operations/puppet@production] Fix bug in jupyterhub-conda that was keeping users from starting new Notebooks

https://gerrit.wikimedia.org/r/677341

Change 676999 merged by Ottomata:

[operations/puppet@production] Configure Notebook Terminal to not use login shell

https://gerrit.wikimedia.org/r/676999

There are still some child tasks of this Newpyter parent task, but as of today I think we can call the 'Newpyter' project done.

There are still some child tasks of this Newpyter parent task, but as of today I think we can call the 'Newpyter' project done.

Congratulations! \o/