I am migrating a bunch of notebooks and files from a DS server instance to a 7.10 Jupyter container. I have about 40 Jupyter notebooks backed up on S3 that I want to copy or import into this new environment. Based on my limited understanding of how Jupyter containers work, I cannot simply copy the notebooks onto the container’s file system and expect them to work, because the notebooks are persisted as types in the type system rather than as files on the file system.
Option 1) I could download those 40 notebooks and upload them to the Jupyter container one at a time (assuming the upload action triggers an upsert of the .ipynb file into the type system). Naturally, that approach is tedious, and I am wondering if there is another way.
Option 2) Is there, for example, a function on one of the Jupyter types that upserts a Jupyter notebook file (.ipynb) from the container’s local file system into the type system? The function’s input would be a local file path on the container. If that feature exists, I could copy the files from S3 onto the local file system (using the AWS CLI or boto3) and then use a bit of code to upsert the notebooks into the type system, roughly as sketched below.
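To make Option 2 concrete, here is a rough sketch of what I have in mind. The S3 download part uses boto3 and should work as written; the bucket name, prefix, and the `upsert_notebook_from_path` call at the end are all placeholders/assumptions on my part, since that upsert function is exactly what I am asking whether it exists:

```python
import os
import boto3

# Assumed/hypothetical: the bucket and prefix where my notebooks were backed up
BUCKET = "my-backup-bucket"
PREFIX = "notebooks/"
LOCAL_DIR = "/tmp/notebooks"

s3 = boto3.client("s3")
os.makedirs(LOCAL_DIR, exist_ok=True)

# Copy every .ipynb under the prefix onto the container's local file system
paginator = s3.get_paginator("list_objects_v2")
local_paths = []
for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
    for obj in page.get("Contents", []):
        key = obj["Key"]
        if not key.endswith(".ipynb"):
            continue
        local_path = os.path.join(LOCAL_DIR, os.path.basename(key))
        s3.download_file(BUCKET, key, local_path)
        local_paths.append(local_path)

# Hypothetical API: an upsert function on one of the Jupyter types that takes
# a local file path. This is the piece I am asking about -- it may not exist.
# for path in local_paths:
#     jupyter_api.upsert_notebook_from_path(path)
```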
Option 3) Of course, an even better option would be to provide the path to an .ipynb file on S3 (instead of on the local file system), so the file could be upserted directly from S3 without going through the Jupyter container’s file system. Something along these lines is what I am imagining:
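Again, the S3 read below uses boto3 and is real; the bucket, key, and the final `upsert_notebook` call are hypothetical and only illustrate the kind of API I am hoping for:

```python
import json
import boto3

# Assumed/hypothetical bucket and key for one backed-up notebook
BUCKET = "my-backup-bucket"
KEY = "notebooks/analysis.ipynb"

s3 = boto3.client("s3")

# Read the .ipynb straight from S3 into memory -- no local copy needed
body = s3.get_object(Bucket=BUCKET, Key=KEY)["Body"].read()
notebook_json = json.loads(body)

# Hypothetical API: upsert the notebook into the type system directly from its
# content (or from the S3 URI), skipping the container's file system entirely.
# jupyter_api.upsert_notebook(name="analysis.ipynb", content=notebook_json)
```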