Commit acb8bcc

Upgrade base image to colab_20250725-060057_RC00, install google-adk (#1498)
A couple of things here: we upgraded our base to the latest image, which is still on py3.11 and uses keras 3.8.x and tf 2.18.x. We encountered some issues upgrading to py3.12, so we'll punt on that for later. In addition, we install google-adk and pyngrok; we will add a more involved test in a later PR. We also diffed the new base image and removed packages that are now installed in it. @jplotts please lmk if pyngrok is needed for b/443054743; we can always remove it at a later point.
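The "diffed the new base image" step can be sketched roughly as below. The file names and package lists are hypothetical sample data, not the repo's actual tooling; in practice the real lists would come from something like `pip list --format=freeze` inside each image.

```shell
# Stand-in package lists (hypothetical sample data):
printf 'Altair\nBabel\nnumpy\n' > base_pkgs.txt   # packages already in the base image
printf 'Altair\nBabel\nBoruta\n' > our_reqs.txt   # packages we pin ourselves
# comm -12 prints lines common to both sorted inputs: these are the
# requirements that duplicate the base image and can be dropped.
comm -12 base_pkgs.txt our_reqs.txt
```

Here the common lines are Altair and Babel, matching the two packages removed at the top of kaggle_requirements.txt in this commit.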
1 parent 506c34b commit acb8bcc

File tree

3 files changed: +7 -12 lines changed


Dockerfile.tmpl

Lines changed: 2 additions & 2 deletions
@@ -25,8 +25,8 @@ RUN uv pip uninstall --system google-cloud-bigquery-storage
 # b/394382016: sigstore (dependency of kagglehub) requires a prerelease packages, installing separate.
 # b/408284143: google-cloud-automl 2.0.0 introduced incompatible API changes, need to pin to 1.0.1,
 # installed outside of kaggle_requirements.txt due to requiring an incompatibile version of protobuf.
-RUN uv pip install --system --force-reinstall --prerelease=allow kagglehub[pandas-datasets,hf-datasets,signing]>=0.3.12 \
-  google-cloud-automl==1.0.1
+RUN uv pip install --system --force-reinstall --prerelease=allow "kagglehub[pandas-datasets,hf-datasets,signing]>=0.3.12" \
+  "google-cloud-automl==1.0.1"

 # uv cannot install this in requirements.txt without --no-build-isolation
 # to avoid affecting the larger build, we'll post-install it.
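The only functional change in this hunk is quoting. A RUN line is parsed by the shell, so an unquoted `>=0.3.12` is treated as an output redirection and the `[...]` extras list as a glob pattern. A minimal sketch of the quoted form (the specifier mirrors the one above):

```shell
# Quoted, the whole requirement survives as a single argument:
spec='kagglehub[pandas-datasets,hf-datasets,signing]>=0.3.12'
printf '%s\n' "$spec"
# Unquoted, the shell would split at ">=": pip would see a truncated
# requirement and an empty file named "=0.3.12" would be created by
# the redirection.
```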

config.txt

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
 BASE_IMAGE=us-docker.pkg.dev/colab-images/public/runtime
-BASE_IMAGE_TAG=release-colab_20250626-060053_RC00
+BASE_IMAGE_TAG=release-colab_20250725-060057_RC00
 CUDA_MAJOR_VERSION=12
 CUDA_MINOR_VERSION=5
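Since config.txt is plain KEY=VALUE lines, the full base-image reference can be assembled by sourcing it. A sketch, assuming the build scripts consume it this way (the values mirror the file above; the repo's real wiring may differ):

```shell
# Recreate config.txt locally for illustration (same values as above):
cat > config.txt <<'EOF'
BASE_IMAGE=us-docker.pkg.dev/colab-images/public/runtime
BASE_IMAGE_TAG=release-colab_20250725-060057_RC00
CUDA_MAJOR_VERSION=12
CUDA_MINOR_VERSION=5
EOF
. ./config.txt                      # KEY=VALUE lines source cleanly
image_ref="${BASE_IMAGE}:${BASE_IMAGE_TAG}"
echo "$image_ref"
# docker pull "$image_ref"          # needs registry access; not run here
```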

kaggle_requirements.txt

Lines changed: 4 additions & 9 deletions
@@ -1,6 +1,4 @@
 # Please keep this in alphabetical order
-Altair>=5.4.0
-Babel
 Boruta
 Cartopy
 ImageHash
@@ -24,7 +22,6 @@ category-encoders
 cesium
 comm
 cytoolz
-dask-expr
 # Older versions of datasets fail with "Loading a dataset cached in a LocalFileSystem is not supported"
 # https://stackoverflow.com/questions/77433096/notimplementederror-loading-a-dataset-cached-in-a-localfilesystem-is-not-suppor
 datasets>=2.14.6
@@ -48,6 +45,8 @@ geojson
 # geopandas > v0.14.4 breaks learn tools
 geopandas==v0.14.4
 gensim
+# b/443054743
+google-adk
 google-cloud-aiplatform
 # b/315753846: Unpin translate package.
 google-cloud-translate==3.12.1
@@ -111,7 +110,6 @@ pyLDAvis
 pycryptodome
 pydegensac
 pydicom
-pydub
 pyemd
 pyexcel-ods
 pymc3
@@ -144,18 +142,15 @@ tensorflow-cloud
 tensorflow-io
 tensorflow-text
 tensorflow_decision_forests
-timm
-torchao
 torchinfo
 torchmetrics
 torchtune
 transformers>=4.51.0
-triton
-tsfresh
 vtk
-wandb
 wavio
 # b/350573866: xgboost v2.1.0 breaks learntools
 xgboost==2.0.3
 xvfbwrapper
 ydata-profiling
+# b/443054743: pinned as newer versions requires protobuf > 3.20.3
+ydf==0.9.0
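The file's header asks that it be kept in alphabetical order. A CI-style check could look like the sketch below; the script is hypothetical and not part of this PR, and the sample file stands in for the real one. Note that `sort`'s idea of order depends on locale, so `LC_ALL=C` keeps the check stable.

```shell
# Hypothetical sample requirements file (not the real one):
cat > reqs.txt <<'EOF'
# Please keep this in alphabetical order
Boruta
Cartopy
google-adk
ydf==0.9.0
EOF
# Drop comments and blank lines, then verify the rest is in byte order.
grep -v '^#' reqs.txt | grep -v '^$' > stripped.txt
if LC_ALL=C sort -c stripped.txt 2>/dev/null; then
  echo sorted
else
  echo not-sorted
fi
```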
