Create a managed notebooks instance with a custom container

Vertex AI Workbench managed notebooks is deprecated. On April 14, 2025, support for managed notebooks will end and the ability to create managed notebooks instances will be removed. Existing instances will continue to function, but patches, updates, and upgrades won't be available. To continue using Vertex AI Workbench, we recommend that you migrate your managed notebooks instances to Vertex AI Workbench instances.
This page shows you how to add a custom container to a Vertex AI Workbench managed notebooks instance as a kernel that you can run your notebook files on.
Overview

You can add a custom container for use with your managed notebooks instance. The custom container is then available as a local kernel that you can run your notebook file on.
Custom container requirements

Vertex AI Workbench managed notebooks supports any of the current Deep Learning Containers container images.
To create a custom container image of your own, you can modify one of the Deep Learning Containers container images to create a derivative container image.
To create a custom container image from scratch, make sure the container image meets the following requirements:
Use a Docker container image with at least one valid Jupyter kernelspec. This exposed kernelspec lets Vertex AI Workbench managed notebooks load the container image as a kernel. If your container image includes an installation of JupyterLab or Jupyter Notebook, the installation includes the kernelspec by default. If your container image doesn't have the kernelspec, you can install the kernelspec directly.

Note: If your container has a valid Jupyter kernelspec, it responds to jupyter kernelspec list --json with a list of available kernelspecs.

The Docker container image must support sleep infinity.

To use your custom container with the managed notebooks executor, ensure that your custom container has the nbexecutor extension. Deep Learning Containers container images include the nbexecutor extension by default.
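As a quick sanity check of the kernelspec requirement above, you can parse the JSON that jupyter kernelspec list --json prints inside the container. The sketch below works from sample output; the kernelspec names and paths shown are illustrative, not taken from a real image:

```python
import json

# Sample output of `jupyter kernelspec list --json` (illustrative only;
# a real container image prints its own installed kernelspecs).
sample_output = """
{
  "kernelspecs": {
    "python3": {
      "resource_dir": "/opt/conda/share/jupyter/kernels/python3",
      "spec": {
        "argv": ["python", "-m", "ipykernel_launcher", "-f", "{connection_file}"],
        "display_name": "Python 3",
        "language": "python"
      }
    }
  }
}
"""

kernelspecs = json.loads(sample_output)["kernelspecs"]

# The image can be loaded as a managed notebooks kernel only if it
# exposes at least one valid kernelspec.
assert len(kernelspecs) >= 1
for name, spec in kernelspecs.items():
    print(name, "->", spec["spec"]["display_name"])
```

If the command prints an empty kernelspecs object, install a kernelspec in the image (for example, by installing JupyterLab) before using it with Vertex AI Workbench.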
The following example Dockerfile builds a custom container image based on an Ubuntu image and installs Python tooling and JupyterLab.

FROM --platform=linux/amd64 ubuntu:22.04

RUN apt-get -y update && apt-get install -y --no-install-recommends \
    python3-pip \
    pipx \
    git \
    make \
    jq

RUN pip install \
    "argcomplete>=1.9.4" \
    poetry==1.1.14 \
    jupyterlab==3.3.0

# Create a link that points to the right python bin directory
RUN ln -s /usr/bin/pythonVERSION_NUMBER /usr/bin/python
Replace VERSION_NUMBER with the version of Python that you're using (for example, 3.10, the version that Ubuntu 22.04 provides).
For each custom container image provided, your managed notebooks instance identifies the available Jupyter kernelspec on the container image when the instance starts. The kernelspec appears as a local kernel in the JupyterLab interface. When the kernelspec is selected, the managed notebooks kernel manager runs the custom container as a kernel and starts a Jupyter session on that kernel.
How custom container kernels are updated

Vertex AI Workbench pulls the latest container image for your kernel:
When you create your instance.
When you upgrade your instance.
When you start your instance.
The custom container kernel doesn't persist when your instance is stopped, so each time your instance is started, Vertex AI Workbench pulls the latest version of the container image.
If your instance is running when a new version of a container is released, your instance's kernel isn't updated until you stop and start your instance.
Custom container image availability

Deep Learning Containers container images are available to all users. When you use a Deep Learning Containers container image, you must grant specific roles to your instance's service account so your instance can load the Deep Learning Containers container image as a kernel. Learn more about the required permissions and how to grant them in the Permissions section.
If you want to use your own custom container image, it must be located in Artifact Registry and the container image must be publicly available.
Before you begin

In the Google Cloud console, on the project selector page, select or create a Google Cloud project.

Note: If you don't plan to keep the resources that you create in this procedure, create a project instead of selecting an existing project. After you finish these steps, you can delete the project, removing all resources associated with the project.

Verify that billing is enabled for your Google Cloud project.
Enable the Notebooks and Artifact Registry APIs.
Create an instance with a custom container

To add a custom container to a managed notebooks instance, the custom container image must be specified at instance creation.
To add a custom container while you create a managed notebooks instance, complete the following steps.
In the Google Cloud console, go to the Managed notebooks page.
Click add_box Create new.
In the Name field, enter a name for your instance.
Click the Region list, and select a region for your instance.
In the Environment section, select Provide custom Docker images.
Add a Docker container image in one of the following ways:

Select a Docker container image from the list of available images.

Enter the path to a Docker container image, for example: us-docker.pkg.dev/deeplearning-platform-release/gcr.io/tf-cpu.2-12.py310.

Complete the rest of the Create instance dialog according to your needs.
Click Create.
Vertex AI Workbench automatically starts the instance. When the instance is ready to use, Vertex AI Workbench activates an Open JupyterLab link.
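If you create instances programmatically rather than through the console, the container image is specified in the create request instead. The sketch below assembles a plausible request body for the Notebooks v1 runtimes API; the field names, machine type, and tag are assumptions to verify against the API reference before use:

```python
# Sketch of a request body for creating a managed notebooks runtime with a
# custom container. Field names follow the Notebooks v1 REST API as an
# assumption; verify them against the API reference.
container_repository = (
    # Example image path from this page.
    "us-docker.pkg.dev/deeplearning-platform-release/gcr.io/tf-cpu.2-12.py310"
)

runtime_body = {
    "virtualMachine": {
        "virtualMachineConfig": {
            "machineType": "n1-standard-4",  # placeholder machine type
            "containerImages": [
                # Each entry names a repository and a tag to pull.
                {"repository": container_repository, "tag": "latest"}
            ],
        }
    }
}

print(runtime_body["virtualMachine"]["virtualMachineConfig"]["containerImages"])
```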
Permissions

If you aren't using a Deep Learning Containers container image, skip this section.
To ensure that your instance's service account has the necessary permissions to load a Deep Learning Containers container image from Artifact Registry, ask your administrator to grant your instance's service account the following IAM roles on your instance:
Important: You must grant these roles to your instance's service account, not to your user account. Failure to grant the roles to the correct principal might result in permission errors.

Compute Instance Admin (v1) (roles/compute.instanceAdmin.v1)

Artifact Registry Reader (roles/artifactregistry.reader)

For more information about granting roles, see Manage access to projects, folders, and organizations.
Your administrator might also be able to give your instance's service account the required permissions through custom roles or other predefined roles.
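To illustrate the grant, the snippet below assembles IAM policy bindings for the two roles listed above in the shape used by a setIamPolicy request body. The service account email is a placeholder; in practice your administrator grants the roles through the console or gcloud:

```python
# Build IAM policy bindings that grant the required roles to an instance's
# service account. The email below is a placeholder, not a real account.
service_account = "my-instance-sa@my-project.iam.gserviceaccount.com"
member = f"serviceAccount:{service_account}"  # note the serviceAccount: prefix

required_roles = [
    "roles/compute.instanceAdmin.v1",
    "roles/artifactregistry.reader",
]

# One binding per role, each listing the service account as a member.
bindings = [{"role": role, "members": [member]} for role in required_roles]
print(bindings)
```

The serviceAccount: prefix matters: binding the roles to a user: member instead is the misconfiguration the note above warns about.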
Set up a notebook file to run in your custom container

To open JupyterLab, create a new notebook file, and set it up to run on your custom container's kernel, complete the following steps.
Next to your managed notebooks instance's name, click Open JupyterLab.
In the Authenticate your managed notebook dialog, click the button to get an authentication code.
Choose an account and click Allow. Copy the authentication code.
In the Authenticate your managed notebook dialog, paste the authentication code, and then click Authenticate.
Your managed notebooks instance opens JupyterLab.
Select File > New > Notebook.
In the Select kernel dialog, select the kernel for the custom container image that you want to use, and then click Select. Larger container images may take some time to appear as a kernel. If the kernel that you want isn't there yet, try again in a few minutes. You can change the kernel whenever you want to run your notebook file on a different kernel.
Your new notebook file opens.
What's next

Learn how to access Cloud Storage buckets and files from within JupyterLab.
Learn how to query data in BigQuery tables from within JupyterLab.
Except as otherwise noted, the content of this page is licensed under the Creative Commons Attribution 4.0 License, and code samples are licensed under the Apache 2.0 License. For details, see the Google Developers Site Policies. Java is a registered trademark of Oracle and/or its affiliates.
Last updated 2025-08-07 UTC.