With the Q3 2025 CDP release, AI Workbench notebooks now run in fully isolated environments with fewer bundled dependencies. These changes give developers more control over package management and improve stability.
Each scheduled run uses its own isolated environment. Previously, all scheduled notebooks shared the same environment. Now, each run creates a clean kernel that’s deleted afterward, reducing the chance of unintended breakages.
No more version constraints on package installs. You can install any version of a Python package, even if a different version was pre-installed.
Fewer default packages. Some packages previously included by default are no longer pre-installed unless needed by BlueConic.
Check Python packages
With the current release, the kernels have several packages installed by default. The kernel used for scheduled runs includes a few more packages than the edit kernel. All packages installed in the scheduled-run kernel are listed in the table below; the last column indicates whether each package is also available in the edit kernel (to verify versions from within a notebook, see the example after the table):
Package Name | Version | Available in Edit Kernel |
ansicolors | 1.1.8 | - |
appnope | 0.1.4 | ✅ |
asttokens | 3.0.0 | ✅ |
attrs | 25.3.0 | - |
certifi | 2025.4.26 | ✅ |
charset-normalizer | 3.4.2 | ✅ |
click | 8.2.1 | - |
comm | 0.2.2 | ✅ |
cython | 3.1.1 | ✅ |
debugpy | 1.8.14 | ✅ |
decorator | 5.2.1 | ✅ |
entrypoints | 0.4 | - |
executing | 2.2.0 | ✅ |
fastjsonschema | 2.21.1 | - |
idna | 3.10 | ✅ |
ijson | 3.3.0 | ✅ |
ipykernel | 6.29.5 | ✅ |
ipython | 9.3.0 | ✅ |
ipython-pygments-lexers | 1.1.1 | ✅ |
ipywidgets | 8.1.7 | ✅ |
jedi | 0.19.2 | ✅ |
jsonschema | 4.24.0 | - |
jsonschema-specifications | 2025.4.1 | - |
jupyter-client | 8.6.3 | ✅ |
jupyter-core | 5.8.1 | ✅ |
jupyterlab-widgets | 3.0.15 | ✅ |
matplotlib-inline | 0.1.7 | ✅ |
nbclient | 0.10.2 | - |
nbformat | 5.10.4 | - |
nest-asyncio | 1.6.0 | ✅ |
oauthlib | 3.2.2 | ✅ |
orjson | 3.10.18 | ✅ |
packaging | 25 | ✅ |
papermill | 2.6.0 | - |
parso | 0.8.4 | ✅ |
pexpect | 4.9.0 | ✅ |
platformdirs | 4.3.8 | ✅ |
prompt-toolkit | 3.0.51 | ✅ |
psutil | 7.0.0 | ✅ |
ptyprocess | 0.7.0 | ✅ |
pure-eval | 0.2.3 | ✅ |
pygments | 2.19.1 | ✅ |
python-dateutil | 2.9.0.post0 | ✅ |
pyyaml | 6.0.2 | - |
pyzmq | 26.4.0 | ✅ |
referencing | 0.36.2 | - |
requests | 2.32.3 | ✅ |
requests-oauthlib | 1.3.1 | ✅ |
rpds-py | 0.25.1 | - |
six | 1.17.0 | ✅ |
stack-data | 0.6.3 | ✅ |
tenacity | 9.1.2 | - |
tornado | 6.5.1 | ✅ |
tqdm | 4.67.1 | ✅ |
traitlets | 5.14.3 | ✅ |
types-requests | 2.32.0.20250602 | ✅ |
typing-extensions | 4.14.0 | - |
urllib3 | 2.3.0 | ✅ |
wcwidth | 0.2.13 | ✅ |
wheel | 0.45.1 | ✅ |
widgetsnbextension | 4.0.14 | ✅ |
setuptools | 80.9.0 | ✅ |
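If you want to confirm which versions are present in the kernel you are running on, you can print them from within the notebook. The sketch below uses Python's standard importlib.metadata module; the package names listed are only examples, so replace them with the packages your notebook relies on.

```python
# Minimal sketch: print the installed version of a few packages in the current kernel.
# The package names below are examples; use the ones your notebook actually relies on.
from importlib.metadata import version, PackageNotFoundError

for package in ["requests", "tornado", "pandas"]:
    try:
        print(f"{package}: {version(package)}")
    except PackageNotFoundError:
        print(f"{package}: not installed in this kernel")
```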
Previously, one notebook could install packages that then became available to other notebooks, even if those notebooks did not have a pip install line that explicitly installed them. This is no longer the case. Packages that are no longer installed by default:
paramiko
pandas
matplotlib
pytz
numpy
seaborn
sklearn
Tip: Use CMD+F or CTRL+F to search for "pip install" to locate the lines of code that install Python packages.
Update your packages
If your notebooks use packages that are no longer pre-installed, you'll need to update your code:
Verify your user role includes Notebook editor permissions in AI Workbench.
Go to More > AI Workbench.
Open the notebook editor to check whether your notebooks use packages that are still pre-installed. Use the table above or the Python dependencies page to find the supported packages.
Examine the Run history for an AI Workbench notebook to check for errors caused by missing packages.
Run pip install for the missing packages (see the example after these steps).
Click Save and run the notebook.
Check for errors in the notebook's run history to see if there are additional updates or fixes required based on your custom implementation.
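For example, if the run history shows that pandas and numpy are missing, a single install cell near the top of the notebook resolves it. This is a minimal sketch; the package names and versions are only illustrative, so pin the versions your own code has been tested with.

```python
# Illustrative install cell for a notebook that needs packages no longer pre-installed.
# Package names and versions are examples only; pin the versions your code was tested with.
%pip install pandas==2.2.2 numpy==2.0.2

import pandas as pd
import numpy as np
```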
Each scheduled run starts from a clean environment, so required packages must be explicitly installed during execution. Edit Mode behavior remains unchanged, but is now isolated to prevent conflicts with scheduled runs.
DO's:
If a notebook uses (imports) packages that are not available out of the box, always install these packages explicitly in that notebook.
Always install one specific version of a package. Use "==" to specify the version.
Try to use the latest versions of the packages. Old (4+ years old) versions may cause errors during installation and may also exhibit reduced performance.
DONT’s:
Avoid manually (re-)installing packages that BlueConic installs by default (see the table above). BlueConic may upgrade to a newer version in a future release; if you install an older version, it will remove the newer default version and may reduce performance.
Do not uninstall the default packages (see the table above). BlueConic needs these to run correctly.
Behind the scenes, BlueConic scheduled runs now use 'uv pip install' instead of 'pip install' to install packages. However, do not use uv in your own install lines: although Jupyter supports them, BlueConic scheduled runs do not. For stability, only use '%pip install …' or '!pip install …'.
Never use "--force-reinstall". It may have been necessary in the previous version to install a package that was already present, but scheduled runs now start with a clean slate every time, so "--force-reinstall" is redundant and hurts performance and memory use later in the scheduled run. A combined example follows this list.
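Putting the do's and don'ts together, a typical install cell looks like the first line below; the commented-out lines show patterns to avoid. The package names and versions are placeholders.

```python
# Supported: %pip (or !pip) with a pinned version.
%pip install paramiko==3.4.0

# Avoid: uv install lines are not supported in scheduled runs.
# !uv pip install paramiko==3.4.0

# Avoid: --force-reinstall is redundant on a clean kernel and wastes time and memory.
# %pip install --force-reinstall paramiko==3.4.0

# Avoid: reinstalling packages that BlueConic already provides by default (see the table above).
# %pip install requests==2.25.0
```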
FAQs
How is the isolation improved?
Notebook runs use a Python interpreter to run the Python code. Within Jupyter notebooks, these are called ‘kernels’. Previously, we always used a single kernel for all runs. In this release, we create several kernels, used for different notebook runs. In edit mode, we use one kernel (called the edit mode kernel). When running a notebook via scheduling (or ‘run now’), a kernel is created for that single run. This kernel is deleted after the scheduled run has finished.
How will I notice this isolation?
Previously, you could install Python packages within a notebook during a scheduled run. Once a package was installed during the first run, it remained available for subsequent runs, not just for that specific notebook, but for all scheduled notebooks. This remained true until the next BlueConic release, at which point the system running the notebooks was reset and any previously installed packages had to be reinstalled.

With the current release, each scheduled run starts with a newly created, clean kernel environment. As a result, any packages installed during a previous run will not persist. Therefore, every scheduled notebook must explicitly install any required packages at the beginning of its execution, unless those packages are already included by default.

Packages installed in Edit Mode will remain available for future Edit Mode sessions; this behavior has not changed. However, the kernel used in Edit Mode is now separate from the one used by the Jupyter server. As a result, if there's an issue (such as a faulty package installation), it may affect only the Edit Mode kernel without impacting the Jupyter server or scheduled runs. This isolation helps ensure greater stability across your environment.
How will I notice the absence of constraints?
Previously, any package installed by BlueConic out of the box had a specific version, and it was not possible to install a different version. These constraints prevented potential conflicts between packages required by BlueConic and those needed by the user. With the current release, this is no longer the case. Packages are still installed by default, but installing the same package with a different version will rarely cause a conflict, thanks to kernel isolation (see above), and any conflict that does occur can be fixed easily.
Why did BlueConic make these changes?
In previous versions, one notebook could break another notebook by installing incompatible packages. We’ve now improved isolation by creating a new virtual environment for each scheduled notebook run, which in turn should make notebooks more robust.
Are the package installations required?
Yes. As of June 2025, any package used in your AI Workbench notebook code must be installed by that notebook unless it is included in the table above.
When will my BlueConic server be updated?
These changes were made as part of the Q3 2025 release. Visit status.blueconic.com for current status updates for the BlueConic platform. This page is updated continually and announces major updates to the BlueConic platform.
What will happen to notebooks still using packages but not installing them explicitly?
Any notebook that uses packages it does not install will not run successfully and will throw an error. Check the AI Workbench run history for your notebooks for errors.
The notebook run history will show an error such as: ERROR: ModuleNotFoundError: No module named 'pandas'
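As a hypothetical example, a notebook that imports pandas without installing it fails with the error above on a clean scheduled-run kernel; adding an install line before the import resolves it (the version shown is only an example).

```python
# Before (fails, because pandas is no longer pre-installed):
# import pandas as pd   # -> ModuleNotFoundError: No module named 'pandas'

# After: install the package explicitly, pinned to a known version (example version).
%pip install pandas==2.2.2
import pandas as pd
```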
How can I check if a notebook needs to be updated?
There are two ways to check:
You can examine the Run history for an AI Workbench notebook to check for errors, as shown above.
Or, if your user role has notebook editor privileges (BlueConic Administrators or Data Scientists, for example), you can inspect the notebook code in the Notebook editor window in AI Workbench.
Will existing notebooks automatically use the new kernel?
Existing notebooks will automatically use the new kernel. When you open the notebook, it will detect and display the correct (latest) kernel without requiring you to save or update anything manually.
Who can I contact for questions or help with resolving library or package updates?
Contact BlueConic Support if you have questions. Note that BlueConic Support cannot edit your AI notebooks but can help you understand if there are errors and how they should be resolved.