In production systems, the decision to deploy a model usually goes beyond global metrics (e.g. accuracy) computed during training. It is also important to evaluate how your model performs in different scenarios. For instance, does your weather forecasting model perform equally well in summer as in winter? Or does your camera-based defect detector work only under certain lighting conditions? This type of investigation helps ensure that your model can handle different cases. More than that, it can help uncover learned biases that result in a negative experience for your users. For example, if your application is supposed to be gender-neutral, you don't want your model to work well for one gender while performing poorly for another.
In this lab, you will be working with TensorFlow Model Analysis (TFMA) -- a library built specifically for analyzing a model's performance across different configurations. It allows you to specify slices of your data, and it then computes and visualizes how your model performs on each slice. You can also set thresholds that your model must meet before it is marked ready for deployment. These capabilities help you make better decisions about any improvements you may want to make to boost your model's performance and ensure fairness.
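To make this concrete, here is a minimal sketch of a TFMA EvalConfig that declares one data slice and one metric threshold. The label key ('label'), the slice feature ('sex'), and the 0.7 accuracy bound are illustrative assumptions, not values taken from this lab -- you will build the actual configuration during the exercise.

import tensorflow_model_analysis as tfma
from google.protobuf import text_format

# Sketch only: 'label', 'sex', and the 0.7 lower bound are placeholder choices.
eval_config = text_format.Parse("""
  model_specs {
    label_key: "label"
  }
  metrics_specs {
    metrics {
      class_name: "BinaryAccuracy"
      # Threshold the model must meet before being marked ready for deployment.
      threshold {
        value_threshold { lower_bound { value: 0.7 } }
      }
    }
    metrics { class_name: "AUC" }
  }
  slicing_specs {}                        # overall (unsliced) metrics
  slicing_specs { feature_keys: "sex" }   # metrics computed per value of 'sex'
""", tfma.EvalConfig())

Passing a config like this to TFMA's evaluation run is what produces the per-slice metrics and pass/fail validation results discussed above.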
For this exercise, you will use TFMA to analyze models trained on the Census Income dataset. Specifically, you will:
Credits: Some of the code and discussions are based on the TensorFlow team's official tutorial.
In this section, you will first set up your workspace with all the modules and files needed to work with TFMA. You will:
If you are running this notebook locally in Jupyter, these Jupyter extensions must be installed in the environment before launching Jupyter. They are already available in Colab, so the commands are included here only for reference.
jupyter nbextension enable --py widgetsnbextension --sys-prefix
jupyter nbextension install --py --symlink tensorflow_model_analysis --sys-prefix
jupyter nbextension enable --py tensorflow_model_analysis --sys-prefix
This will pull in all the dependencies and will take 6 to 8 minutes to complete.
# Upgrade pip to the latest version and install required packages
!pip install -U pip
!pip install --use-deprecated=legacy-resolver tensorflow_data_validation==1.1.0
!pip install --use-deprecated=legacy-resolver tensorflow-transform==1.0.0
!pip install --use-deprecated=legacy-resolver tensorflow-model-analysis==0.32.0
!pip install apache-beam==2.32.0
(Output abridged: pip downloads and installs the pinned packages. Because these commands use pip's legacy resolver with deliberately pinned versions, the log contains several dependency-conflict errors -- e.g. numpy, protobuf, and grpcio version mismatches -- along with a warning about running pip as the root user.)
google-resumable-media!=0.4.0,<0.5.0dev,>=0.3.1 in /usr/local/lib/python3.7/dist-packages (from google-cloud-bigquery<3,>=1.6.0; extra == "gcp"->apache-beam[gcp]<3,>=2.29->tensorflow-model-analysis==0.32.0) (0.4.1) Requirement already satisfied: overrides<7.0.0,>=6.0.1 in /usr/local/lib/python3.7/dist-packages (from google-cloud-pubsublite<2,>=1.2.0; extra == "gcp"->apache-beam[gcp]<3,>=2.29->tensorflow-model-analysis==0.32.0) (6.2.0) Requirement already satisfied: cached-property; python_version < "3.8" in /usr/local/lib/python3.7/dist-packages (from h5py~=3.1.0->tensorflow!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,<3,>=1.15.2->tensorflow-model-analysis==0.32.0) (1.5.2) Requirement already satisfied: google-auth-oauthlib<0.5,>=0.4.1 in /usr/local/lib/python3.7/dist-packages (from tensorboard~=2.5->tensorflow!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,<3,>=1.15.2->tensorflow-model-analysis==0.32.0) (0.4.6) Requirement already satisfied: werkzeug>=0.11.15 in /usr/local/lib/python3.7/dist-packages (from tensorboard~=2.5->tensorflow!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,<3,>=1.15.2->tensorflow-model-analysis==0.32.0) (1.0.1) Requirement already satisfied: markdown>=2.6.8 in /usr/local/lib/python3.7/dist-packages (from tensorboard~=2.5->tensorflow!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,<3,>=1.15.2->tensorflow-model-analysis==0.32.0) (3.4.1) Requirement already satisfied: tensorboard-data-server<0.7.0,>=0.6.0 in /usr/local/lib/python3.7/dist-packages (from tensorboard~=2.5->tensorflow!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,<3,>=1.15.2->tensorflow-model-analysis==0.32.0) (0.6.1) Requirement already satisfied: tensorboard-plugin-wit>=1.6.0 in /usr/local/lib/python3.7/dist-packages (from tensorboard~=2.5->tensorflow!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,<3,>=1.15.2->tensorflow-model-analysis==0.32.0) (1.8.1) Requirement already satisfied: ptyprocess>=0.5 in /usr/local/lib/python3.7/dist-packages (from pexpect; sys_platform != "win32"->ipython<8,>=7->tensorflow-model-analysis==0.32.0) (0.7.0) Requirement already satisfied: parso<0.9.0,>=0.8.0 in /usr/local/lib/python3.7/dist-packages (from jedi>=0.10->ipython<8,>=7->tensorflow-model-analysis==0.32.0) (0.8.3) Requirement already satisfied: wcwidth in /usr/local/lib/python3.7/dist-packages (from prompt-toolkit<2.1.0,>=2.0.0->ipython<8,>=7->tensorflow-model-analysis==0.32.0) (0.2.5) Requirement already satisfied: uritemplate<4dev,>=3.0.0 in /usr/local/lib/python3.7/dist-packages (from google-api-python-client<2,>=1.7.11->tfx-bsl<1.2.0,>=1.1.0->tensorflow-model-analysis==0.32.0) (3.0.1) Requirement already satisfied: jinja2 in /usr/local/lib/python3.7/dist-packages (from notebook>=4.4.1->widgetsnbextension~=3.6.0->ipywidgets<8,>=7->tensorflow-model-analysis==0.32.0) (2.11.3) Requirement already satisfied: jupyter-core>=4.4.0 in /usr/local/lib/python3.7/dist-packages (from notebook>=4.4.1->widgetsnbextension~=3.6.0->ipywidgets<8,>=7->tensorflow-model-analysis==0.32.0) (4.11.1) Requirement already satisfied: nbformat in /usr/local/lib/python3.7/dist-packages (from notebook>=4.4.1->widgetsnbextension~=3.6.0->ipywidgets<8,>=7->tensorflow-model-analysis==0.32.0) (5.4.0) Requirement already satisfied: nbconvert in /usr/local/lib/python3.7/dist-packages (from notebook>=4.4.1->widgetsnbextension~=3.6.0->ipywidgets<8,>=7->tensorflow-model-analysis==0.32.0) (5.6.1) Requirement already satisfied: terminado>=0.8.1 in /usr/local/lib/python3.7/dist-packages (from notebook>=4.4.1->widgetsnbextension~=3.6.0->ipywidgets<8,>=7->tensorflow-model-analysis==0.32.0) (0.13.3) 
Requirement already satisfied: Send2Trash in /usr/local/lib/python3.7/dist-packages (from notebook>=4.4.1->widgetsnbextension~=3.6.0->ipywidgets<8,>=7->tensorflow-model-analysis==0.32.0) (1.8.0) Requirement already satisfied: pyzmq>=13 in /usr/local/lib/python3.7/dist-packages (from jupyter-client->ipykernel>=4.5.1->ipywidgets<8,>=7->tensorflow-model-analysis==0.32.0) (23.2.1) Requirement already satisfied: pyasn1>=0.1.3 in /usr/local/lib/python3.7/dist-packages (from rsa<5,>=3.1.4; python_version >= "3.6"->google-auth<3,>=1.18.0; extra == "gcp"->apache-beam[gcp]<3,>=2.29->tensorflow-model-analysis==0.32.0) (0.4.8) Requirement already satisfied: requests-oauthlib>=0.7.0 in /usr/local/lib/python3.7/dist-packages (from google-auth-oauthlib<0.5,>=0.4.1->tensorboard~=2.5->tensorflow!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,<3,>=1.15.2->tensorflow-model-analysis==0.32.0) (1.3.1) Requirement already satisfied: importlib-metadata>=4.4; python_version < "3.10" in /usr/local/lib/python3.7/dist-packages (from markdown>=2.6.8->tensorboard~=2.5->tensorflow!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,<3,>=1.15.2->tensorflow-model-analysis==0.32.0) (4.12.0) Requirement already satisfied: MarkupSafe>=0.23 in /usr/local/lib/python3.7/dist-packages (from jinja2->notebook>=4.4.1->widgetsnbextension~=3.6.0->ipywidgets<8,>=7->tensorflow-model-analysis==0.32.0) (2.0.1) Requirement already satisfied: fastjsonschema in /usr/local/lib/python3.7/dist-packages (from nbformat->notebook>=4.4.1->widgetsnbextension~=3.6.0->ipywidgets<8,>=7->tensorflow-model-analysis==0.32.0) (2.16.1) Requirement already satisfied: jsonschema>=2.6 in /usr/local/lib/python3.7/dist-packages (from nbformat->notebook>=4.4.1->widgetsnbextension~=3.6.0->ipywidgets<8,>=7->tensorflow-model-analysis==0.32.0) (4.3.3) Requirement already satisfied: bleach in /usr/local/lib/python3.7/dist-packages (from nbconvert->notebook>=4.4.1->widgetsnbextension~=3.6.0->ipywidgets<8,>=7->tensorflow-model-analysis==0.32.0) (5.0.1) Requirement already satisfied: mistune<2,>=0.8.1 in /usr/local/lib/python3.7/dist-packages (from nbconvert->notebook>=4.4.1->widgetsnbextension~=3.6.0->ipywidgets<8,>=7->tensorflow-model-analysis==0.32.0) (0.8.4) Requirement already satisfied: pandocfilters>=1.4.1 in /usr/local/lib/python3.7/dist-packages (from nbconvert->notebook>=4.4.1->widgetsnbextension~=3.6.0->ipywidgets<8,>=7->tensorflow-model-analysis==0.32.0) (1.5.0) Requirement already satisfied: defusedxml in /usr/local/lib/python3.7/dist-packages (from nbconvert->notebook>=4.4.1->widgetsnbextension~=3.6.0->ipywidgets<8,>=7->tensorflow-model-analysis==0.32.0) (0.7.1) Requirement already satisfied: entrypoints>=0.2.2 in /usr/local/lib/python3.7/dist-packages (from nbconvert->notebook>=4.4.1->widgetsnbextension~=3.6.0->ipywidgets<8,>=7->tensorflow-model-analysis==0.32.0) (0.4) Requirement already satisfied: testpath in /usr/local/lib/python3.7/dist-packages (from nbconvert->notebook>=4.4.1->widgetsnbextension~=3.6.0->ipywidgets<8,>=7->tensorflow-model-analysis==0.32.0) (0.6.0) Requirement already satisfied: oauthlib>=3.0.0 in /usr/local/lib/python3.7/dist-packages (from requests-oauthlib>=0.7.0->google-auth-oauthlib<0.5,>=0.4.1->tensorboard~=2.5->tensorflow!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,<3,>=1.15.2->tensorflow-model-analysis==0.32.0) (3.2.0) Requirement already satisfied: zipp>=0.5 in /usr/local/lib/python3.7/dist-packages (from importlib-metadata>=4.4; python_version < 
"3.10"->markdown>=2.6.8->tensorboard~=2.5->tensorflow!=2.0.*,!=2.1.*,!=2.2.*,!=2.3.*,!=2.4.*,<3,>=1.15.2->tensorflow-model-analysis==0.32.0) (3.8.1) Requirement already satisfied: pyrsistent!=0.17.0,!=0.17.1,!=0.17.2,>=0.14.0 in /usr/local/lib/python3.7/dist-packages (from jsonschema>=2.6->nbformat->notebook>=4.4.1->widgetsnbextension~=3.6.0->ipywidgets<8,>=7->tensorflow-model-analysis==0.32.0) (0.18.1) Requirement already satisfied: attrs>=17.4.0 in /usr/local/lib/python3.7/dist-packages (from jsonschema>=2.6->nbformat->notebook>=4.4.1->widgetsnbextension~=3.6.0->ipywidgets<8,>=7->tensorflow-model-analysis==0.32.0) (22.1.0) Requirement already satisfied: importlib-resources>=1.4.0; python_version < "3.9" in /usr/local/lib/python3.7/dist-packages (from jsonschema>=2.6->nbformat->notebook>=4.4.1->widgetsnbextension~=3.6.0->ipywidgets<8,>=7->tensorflow-model-analysis==0.32.0) (5.9.0) Requirement already satisfied: webencodings in /usr/local/lib/python3.7/dist-packages (from bleach->nbconvert->notebook>=4.4.1->widgetsnbextension~=3.6.0->ipywidgets<8,>=7->tensorflow-model-analysis==0.32.0) (0.5.1) Installing collected packages: tensorflow-metadata, tfx-bsl, tensorflow-model-analysis, jedi Attempting uninstall: tensorflow-metadata Found existing installation: tensorflow-metadata 1.0.0 Uninstalling tensorflow-metadata-1.0.0: Successfully uninstalled tensorflow-metadata-1.0.0 Attempting uninstall: tfx-bsl Found existing installation: tfx-bsl 1.0.0 Uninstalling tfx-bsl-1.0.0: Successfully uninstalled tfx-bsl-1.0.0 ERROR: pip's legacy dependency resolver does not consider dependency conflicts when selecting packages. This behaviour is the source of the following dependency conflicts. tfx-bsl 1.1.1 requires google-cloud-bigquery<2.21,>=1.28.0, but you'll have google-cloud-bigquery 1.21.0 which is incompatible. tensorflow-transform 1.0.0 requires tensorflow-metadata<1.1.0,>=1.0.0, but you'll have tensorflow-metadata 1.1.0 which is incompatible. tensorflow-transform 1.0.0 requires tfx-bsl<1.1.0,>=1.0.0, but you'll have tfx-bsl 1.1.1 which is incompatible. Successfully installed jedi-0.18.1 tensorflow-metadata-1.1.0 tensorflow-model-analysis-0.32.0 tfx-bsl-1.1.1 WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/ Collecting apache-beam==2.32.0 Downloading apache_beam-2.32.0-cp37-cp37m-manylinux2010_x86_64.whl (9.8 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 9.8/9.8 MB 28.0 MB/s eta 0:00:00 Requirement already satisfied: pymongo<4.0.0,>=3.8.0 in /usr/local/lib/python3.7/dist-packages (from apache-beam==2.32.0) (3.12.3) Requirement already satisfied: numpy<1.21.0,>=1.14.3 in /usr/local/lib/python3.7/dist-packages (from apache-beam==2.32.0) (1.19.5) Requirement already satisfied: pytz>=2018.3 in /usr/local/lib/python3.7/dist-packages (from apache-beam==2.32.0) (2022.2.1) Requirement already satisfied: fastavro<2,>=0.21.4 in /usr/local/lib/python3.7/dist-packages (from apache-beam==2.32.0) (1.6.0) Collecting future<1.0.0,>=0.18.2 Downloading future-0.18.2.tar.gz (829 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 829.2/829.2 kB 61.6 MB/s eta 0:00:00 Preparing metadata (setup.py) ... 
done Requirement already satisfied: python-dateutil<3,>=2.8.0 in /usr/local/lib/python3.7/dist-packages (from apache-beam==2.32.0) (2.8.2) Requirement already satisfied: requests<3.0.0,>=2.24.0 in /usr/local/lib/python3.7/dist-packages (from apache-beam==2.32.0) (2.28.1) Requirement already satisfied: hdfs<3.0.0,>=2.1.0 in /usr/local/lib/python3.7/dist-packages (from apache-beam==2.32.0) (2.7.0) Requirement already satisfied: oauth2client<5,>=2.0.1 in /usr/local/lib/python3.7/dist-packages (from apache-beam==2.32.0) (4.1.3) Requirement already satisfied: httplib2<0.20.0,>=0.8 in /usr/local/lib/python3.7/dist-packages (from apache-beam==2.32.0) (0.17.4) Requirement already satisfied: grpcio<2,>=1.29.0 in /usr/local/lib/python3.7/dist-packages (from apache-beam==2.32.0) (1.34.1) Requirement already satisfied: typing-extensions<3.8.0,>=3.7.0 in /usr/local/lib/python3.7/dist-packages (from apache-beam==2.32.0) (3.7.4.3) Collecting avro-python3!=1.9.2,<1.10.0,>=1.8.1 Downloading avro-python3-1.9.2.1.tar.gz (37 kB) Preparing metadata (setup.py) ... done Requirement already satisfied: orjson<4.0 in /usr/local/lib/python3.7/dist-packages (from apache-beam==2.32.0) (3.8.0) Requirement already satisfied: protobuf<4,>=3.12.2 in /usr/local/lib/python3.7/dist-packages (from apache-beam==2.32.0) (3.17.3) Requirement already satisfied: pydot<2,>=1.2.0 in /usr/local/lib/python3.7/dist-packages (from apache-beam==2.32.0) (1.3.0) Requirement already satisfied: dill<0.3.2,>=0.3.1.1 in /usr/local/lib/python3.7/dist-packages (from apache-beam==2.32.0) (0.3.1.1) Requirement already satisfied: pyarrow<5.0.0,>=0.15.1 in /usr/local/lib/python3.7/dist-packages (from apache-beam==2.32.0) (2.0.0) Requirement already satisfied: crcmod<2.0,>=1.7 in /usr/local/lib/python3.7/dist-packages (from apache-beam==2.32.0) (1.7) Requirement already satisfied: six>=1.5.2 in /usr/local/lib/python3.7/dist-packages (from grpcio<2,>=1.29.0->apache-beam==2.32.0) (1.15.0) Requirement already satisfied: docopt in /usr/local/lib/python3.7/dist-packages (from hdfs<3.0.0,>=2.1.0->apache-beam==2.32.0) (0.6.2) Requirement already satisfied: pyasn1-modules>=0.0.5 in /usr/local/lib/python3.7/dist-packages (from oauth2client<5,>=2.0.1->apache-beam==2.32.0) (0.2.8) Requirement already satisfied: pyasn1>=0.1.7 in /usr/local/lib/python3.7/dist-packages (from oauth2client<5,>=2.0.1->apache-beam==2.32.0) (0.4.8) Requirement already satisfied: rsa>=3.1.4 in /usr/local/lib/python3.7/dist-packages (from oauth2client<5,>=2.0.1->apache-beam==2.32.0) (4.9) Requirement already satisfied: pyparsing>=2.1.4 in /usr/local/lib/python3.7/dist-packages (from pydot<2,>=1.2.0->apache-beam==2.32.0) (3.0.9) Requirement already satisfied: charset-normalizer<3,>=2 in /usr/local/lib/python3.7/dist-packages (from requests<3.0.0,>=2.24.0->apache-beam==2.32.0) (2.1.1) Requirement already satisfied: idna<4,>=2.5 in /usr/local/lib/python3.7/dist-packages (from requests<3.0.0,>=2.24.0->apache-beam==2.32.0) (2.10) Requirement already satisfied: urllib3<1.27,>=1.21.1 in /usr/local/lib/python3.7/dist-packages (from requests<3.0.0,>=2.24.0->apache-beam==2.32.0) (1.24.3) Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.7/dist-packages (from requests<3.0.0,>=2.24.0->apache-beam==2.32.0) (2022.6.15) Building wheels for collected packages: avro-python3, future Building wheel for avro-python3 (setup.py) ... 
done Created wheel for avro-python3: filename=avro_python3-1.9.2.1-py3-none-any.whl size=43513 sha256=e02eea6456f389d236cdf4ed51b8d7dab65ec6662ec95cceca85ff23f575b519 Stored in directory: /root/.cache/pip/wheels/bc/49/5f/fdb5b9d85055c478213e0158ac122b596816149a02d82e0ab1 Building wheel for future (setup.py) ... done Created wheel for future: filename=future-0.18.2-py3-none-any.whl size=491070 sha256=fa0225bd905e085c463126fd6e1e2c5ed11860a254ed9b4ee4a40d25228c18d1 Stored in directory: /root/.cache/pip/wheels/56/b0/fe/4410d17b32f1f0c3cf54cdfb2bc04d7b4b8f4ae377e2229ba0 Successfully built avro-python3 future Installing collected packages: future, avro-python3, apache-beam Attempting uninstall: future Found existing installation: future 0.16.0 Uninstalling future-0.16.0: Successfully uninstalled future-0.16.0 Attempting uninstall: apache-beam Found existing installation: apache-beam 2.41.0 Uninstalling apache-beam-2.41.0: Successfully uninstalled apache-beam-2.41.0 ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. tfx-bsl 1.1.1 requires google-cloud-bigquery<2.21,>=1.28.0, but you have google-cloud-bigquery 1.21.0 which is incompatible. tensorflow-transform 1.0.0 requires tensorflow-metadata<1.1.0,>=1.0.0, but you have tensorflow-metadata 1.1.0 which is incompatible. tensorflow-transform 1.0.0 requires tfx-bsl<1.1.0,>=1.0.0, but you have tfx-bsl 1.1.1 which is incompatible. Successfully installed apache-beam-2.32.0 avro-python3-1.9.2.1 future-0.18.2 WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
Note: In Google Colab, you need to restart the runtime at this point to finalize updating the packages you just installed. Please do not proceed to the next section without restarting. You can also ignore the errors about version incompatibility of some of the bundled packages because we won't be using those in this notebook.
Running the code below should show the versions of the packages. Please re-run the install if you are seeing errors and don't forget to restart the runtime after re-installation.
# Import packages and print versions
import tensorflow as tf
import tensorflow_model_analysis as tfma
import tensorflow_data_validation as tfdv
print('TF version: {}'.format(tf.__version__))
print('TFMA version: {}'.format(tfma.__version__))
print('TFDV version: {}'.format(tfdv.__version__))
TF version: 2.5.3 TFMA version: 0.32.0 TFDV version: 1.1.0
Next, you will download the files you will need for this exercise: the dataset splits, the pretrained models, and the dataset schema.
We've also defined some global variables below so you can access these files throughout the notebook more easily.
import os
# String variables for file and directory names
URL = 'https://storage.googleapis.com/mlep-public/course_3/week4/C3_W4_Lab_1_starter_files.tar.gz'
TAR_NAME = 'C3_W4_Lab_1_starter_files.tar.gz'
BASE_DIR = 'starter_files'
DATA_DIR = os.path.join(BASE_DIR, 'data')
CSV_DIR = os.path.join(DATA_DIR, 'csv')
TFRECORD_DIR = os.path.join(DATA_DIR, 'tfrecord')
MODELS_DIR = os.path.join(BASE_DIR, 'models')
SCHEMA_FILE = os.path.join(BASE_DIR, 'schema.pbtxt')
# uncomment this line if you've downloaded the files before and want to reset
# !rm -rf {BASE_DIR}
# Download the tar file from GCP
!wget {URL}
# Extract the tar file to the base directory
!tar xzf {TAR_NAME}
# Delete tar file
!rm {TAR_NAME}
--2022-09-05 02:42:56-- https://storage.googleapis.com/mlep-public/course_3/week4/C3_W4_Lab_1_starter_files.tar.gz Resolving storage.googleapis.com (storage.googleapis.com)... 108.177.98.128, 74.125.197.128, 74.125.135.128, ... Connecting to storage.googleapis.com (storage.googleapis.com)|108.177.98.128|:443... connected. HTTP request sent, awaiting response... 200 OK Length: 1449046 (1.4M) [application/x-gzip] Saving to: ‘C3_W4_Lab_1_starter_files.tar.gz’ C3_W4_Lab_1_starter 100%[===================>] 1.38M --.-KB/s in 0.02s 2022-09-05 02:42:57 (76.7 MB/s) - ‘C3_W4_Lab_1_starter_files.tar.gz’ saved [1449046/1449046]
You can see the top-level file and directories by running the cell below (or just by using the file explorer on the left side of this Colab). We'll discuss what each contains in the next sections.
print("Here's what we downloaded:")
!ls {BASE_DIR}
Here's what we downloaded: data models schema.pbtxt
The data/csv directory contains the test split of the Census Income dataset. We've divided it into several files for this demo notebook:
- data_test.csv - 15000 rows of test data
- data_test_1.csv - first 5000 rows of data_test.csv
- data_test_2.csv - next 5000 rows of data_test.csv
- data_test_3.csv - last 5000 rows of data_test.csv
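If you want to double-check these row counts yourself, a quick sanity check like the one below should work. This is an optional sketch; it assumes pandas is available (it is installed as a dependency above).
import pandas as pd
# Optional sanity check: count the rows of each CSV split
for name in ['data_test.csv', 'data_test_1.csv', 'data_test_2.csv', 'data_test_3.csv']:
    num_rows = len(pd.read_csv(os.path.join(CSV_DIR, name)))
    print(f'{name}: {num_rows} rows')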
You can see the description of each column here (please open the link in a new window if Colab prevents the download). Also, for simplicity, we've already preprocessed the label column as binary (i.e. 0 or 1) to match the model's output. In your own projects, your labels might be in a different data type (e.g. string) and you will want to transform them first so you can evaluate your model properly. You can preview the first few rows below:
# Path to the full test set
TEST_DATA_PATH = os.path.join(CSV_DIR, 'data_test.csv')
# Preview the first few rows
!head {TEST_DATA_PATH}
age,workclass,fnlwgt,education,education-num,marital-status,occupation,relationship,race,sex,capital-gain,capital-loss,hours-per-week,native-country,label 25,Private,226802,11th,7,Never-married,Machine-op-inspct,Own-child,Black,Male,0,0,40,United-States,0 38,Private,89814,HS-grad,9,Married-civ-spouse,Farming-fishing,Husband,White,Male,0,0,50,United-States,0 28,Local-gov,336951,Assoc-acdm,12,Married-civ-spouse,Protective-serv,Husband,White,Male,0,0,40,United-States,1 44,Private,160323,Some-college,10,Married-civ-spouse,Machine-op-inspct,Husband,Black,Male,7688,0,40,United-States,1 18,?,103497,Some-college,10,Never-married,?,Own-child,White,Female,0,0,30,United-States,0 34,Private,198693,10th,6,Never-married,Other-service,Not-in-family,White,Male,0,0,30,United-States,0 29,?,227026,HS-grad,9,Never-married,?,Unmarried,Black,Male,0,0,40,United-States,0 63,Self-emp-not-inc,104626,Prof-school,15,Married-civ-spouse,Prof-specialty,Husband,White,Male,3103,0,32,United-States,1 24,Private,369667,Some-college,10,Never-married,Other-service,Unmarried,White,Female,0,0,40,United-States,0
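As a side note on the label preprocessing mentioned above: if your own labels arrived as strings instead, a minimal sketch of the mapping might look like the one below. The income column name and its string values are hypothetical here, since this dataset has already been binarized for you.
import pandas as pd
# Hypothetical dataframe with string labels (column name and values assumed for illustration only)
raw_df = pd.DataFrame({'income': ['<=50K', '>50K', '<=50K']})
# Map the string labels to the 0/1 integers that the model outputs
raw_df['label'] = (raw_df['income'] == '>50K').astype(int)
print(raw_df)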
You also downloaded a schema generated by TensorFlow Data Validation. You should be familiar with this file type already from previous courses. You will load it now so you can use it in the later parts of the notebook.
# Load the schema as a protocol buffer
SCHEMA = tfdv.load_schema_text(SCHEMA_FILE)
# Display the schema
tfdv.display_schema(SCHEMA)
Feature name | Type | Presence | Valency | Domain
---|---|---|---|---
'age' | INT | required |  | -
'capital-gain' | INT | required |  | -
'capital-loss' | INT | required |  | -
'education' | STRING | required |  | 'education'
'education-num' | INT | required |  | -
'fnlwgt' | INT | required |  | -
'hours-per-week' | INT | required |  | -
'label' | INT | required |  | -
'marital-status' | STRING | required |  | 'marital-status'
'native-country' | STRING | required |  | 'native-country'
'occupation' | STRING | required |  | 'occupation'
'race' | STRING | required |  | 'race'
'relationship' | STRING | required |  | 'relationship'
'sex' | STRING | required |  | 'sex'
'workclass' | STRING | required |  | 'workclass'
/usr/local/lib/python3.7/dist-packages/tensorflow_data_validation/utils/display_util.py:180: FutureWarning: Passing a negative integer is deprecated in version 1.0 and will not be supported in future version. Instead, use None to not limit the column width. pd.set_option('max_colwidth', -1)
Domain | Values
---|---
'education' | '10th', '11th', '12th', '1st-4th', '5th-6th', '7th-8th', '9th', 'Assoc-acdm', 'Assoc-voc', 'Bachelors', 'Doctorate', 'HS-grad', 'Masters', 'Preschool', 'Prof-school', 'Some-college'
'marital-status' | 'Divorced', 'Married-AF-spouse', 'Married-civ-spouse', 'Married-spouse-absent', 'Never-married', 'Separated', 'Widowed'
'native-country' | '?', 'Cambodia', 'Canada', 'China', 'Columbia', 'Cuba', 'Dominican-Republic', 'Ecuador', 'El-Salvador', 'England', 'France', 'Germany', 'Greece', 'Guatemala', 'Haiti', 'Holand-Netherlands', 'Honduras', 'Hong', 'Hungary', 'India', 'Iran', 'Ireland', 'Italy', 'Jamaica', 'Japan', 'Laos', 'Mexico', 'Nicaragua', 'Outlying-US(Guam-USVI-etc)', 'Peru', 'Philippines', 'Poland', 'Portugal', 'Puerto-Rico', 'Scotland', 'South', 'Taiwan', 'Thailand', 'Trinadad&Tobago', 'United-States', 'Vietnam', 'Yugoslavia'
'occupation' | '?', 'Adm-clerical', 'Armed-Forces', 'Craft-repair', 'Exec-managerial', 'Farming-fishing', 'Handlers-cleaners', 'Machine-op-inspct', 'Other-service', 'Priv-house-serv', 'Prof-specialty', 'Protective-serv', 'Sales', 'Tech-support', 'Transport-moving'
'race' | 'Amer-Indian-Eskimo', 'Asian-Pac-Islander', 'Black', 'Other', 'White'
'relationship' | 'Husband', 'Not-in-family', 'Other-relative', 'Own-child', 'Unmarried', 'Wife'
'sex' | 'Female', 'Male'
'workclass' | '?', 'Federal-gov', 'Local-gov', 'Never-worked', 'Private', 'Self-emp-inc', 'Self-emp-not-inc', 'State-gov', 'Without-pay'
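If you prefer to inspect a single feature's domain programmatically instead of scanning the table, TFDV's schema utilities can help. This is an optional check and assumes the tfdv.get_domain() helper available in the TFDV version installed above.
# Optional: fetch the domain of one feature directly from the schema
sex_domain = tfdv.get_domain(SCHEMA, 'sex')
print(f"Accepted values for 'sex': {list(sex_domain.value)}")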
TFMA needs TFRecord files as input, so you need to convert the CSVs in the data directory. If you've done the earlier labs, you will know that this can be easily done with ExampleGen. For this notebook however, you will use the helper function below instead to demonstrate how it can be done outside a TFX pipeline. You will pass in the schema you loaded in the previous step to determine the correct type of each feature.
# imports for helper function
import csv
from tensorflow.core.example import example_pb2
from tensorflow_metadata.proto.v0 import schema_pb2
def csv_to_tfrecord(schema, csv_file, tfrecord_file):
''' Converts a csv file into a tfrecord
Args:
schema (schema_pb2) - Schema protobuf from TFDV
csv_file (string) - file to convert to tfrecord
tfrecord_file (string) - filename of tfrecord to create
Returns:
filename of tfrecord
'''
# Open CSV file for reading. Each row is mapped as a dictionary.
reader = csv.DictReader(open(csv_file, 'r'))
# Initialize TF examples list
examples = []
# For each row in CSV, create a TF Example based on
# the Schema and append to the list
for line in reader:
# Initialize example
example = example_pb2.Example()
# Loop through features in the schema
for feature in schema.feature:
# Get current feature name
key = feature.name
# Populate values based on data type of current feature
if feature.type == schema_pb2.FLOAT:
example.features.feature[key].float_list.value[:] = (
[float(line[key])] if len(line[key]) > 0 else [])
elif feature.type == schema_pb2.INT:
example.features.feature[key].int64_list.value[:] = (
[int(line[key])] if len(line[key]) > 0 else [])
elif feature.type == schema_pb2.BYTES:
example.features.feature[key].bytes_list.value[:] = (
[line[key].encode('utf8')] if len(line[key]) > 0 else [])
# Append to the list
examples.append(example)
# Write examples to tfrecord file
with tf.io.TFRecordWriter(tfrecord_file) as writer:
for example in examples:
writer.write(example.SerializeToString())
return tfrecord_file
The code below will do the conversion and we've defined some more global variables that you will use in later exercises.
# Create tfrecord directory
!mkdir {TFRECORD_DIR}
# Create list of tfrecord files
tfrecord_files = [csv_to_tfrecord(SCHEMA, f'{CSV_DIR}/{name}', f"{TFRECORD_DIR}/{name.replace('csv','tfrecord')}")
for name in os.listdir(CSV_DIR)]
# Print created files
print(f'files created: {tfrecord_files}')
# Create variables for each tfrecord
TFRECORD_FULL = os.path.join(TFRECORD_DIR, 'data_test.tfrecord')
TFRECORD_DAY1 = os.path.join(TFRECORD_DIR, 'data_test_1.tfrecord')
TFRECORD_DAY2 = os.path.join(TFRECORD_DIR, 'data_test_2.tfrecord')
TFRECORD_DAY3 = os.path.join(TFRECORD_DIR, 'data_test_3.tfrecord')
# Delete unneeded variable
del tfrecord_files
files created: ['starter_files/data/tfrecord/data_test.tfrecord', 'starter_files/data/tfrecord/data_test_1.tfrecord', 'starter_files/data/tfrecord/data_test_3.tfrecord', 'starter_files/data/tfrecord/data_test_2.tfrecord']
Lastly, you also downloaded pretrained Keras models, which are stored in the models/ directory. TFMA supports a number of different model types including TF Keras models, models based on generic TF2 signature APIs, as well as TF Estimator-based models. The get_started guide has the full list of model types supported and any restrictions. You can also consult the FAQ for examples on how to configure these models.
We have included three models and you can choose to analyze any one of them in the later sections. These were saved in SavedModel format which is the default when saving with the Keras Models API.
# list model directories
!ls {MODELS_DIR}
# Create string variables for each model directory
MODEL1_FILE = os.path.join(MODELS_DIR, 'model1')
MODEL2_FILE = os.path.join(MODELS_DIR, 'model2')
MODEL3_FILE = os.path.join(MODELS_DIR, 'model3')
model1 model2 model3
As mentioned earlier, these models were trained on the Census Income dataset. The label is 1 if a person earns more than 50k USD and 0 if they earn 50k USD or less. You can load one of the models and look at the summary to get a sense of its architecture. All three models use the same architecture but were trained for a different number of epochs to simulate varying performance.
# Load model 1
model = tf.keras.models.load_model(MODEL1_FILE)
# Print summary. You can ignore the warnings at the start.
model.summary()
WARNING:tensorflow:SavedModel saved prior to TF 2.5 detected when loading Keras model. Please ensure that you are saving the model with model.save() or tf.keras.models.save_model(), *NOT* tf.saved_model.save(). To confirm, there should be a file named "keras_metadata.pb" in the SavedModel directory. WARNING:tensorflow:Inconsistent references when loading the checkpoint into this object graph. Either the Trackable object references in the Python program have changed in an incompatible way, or the checkpoint was generated in an incompatible program. Two checkpoint references resolved to different objects (<tensorflow.python.keras.saving.saved_model.load.TensorFlowTransform>TransformFeaturesLayer object at 0x7f3baf7a6410> and <tensorflow.python.keras.engine.input_layer.InputLayer object at 0x7f3baefd5810>).
Model: "model" __________________________________________________________________________________________________ Layer (type) Output Shape Param # Connected to ================================================================================================== fnlwgt_xf (InputLayer) [(None, 1)] 0 __________________________________________________________________________________________________ education-num_xf (InputLayer) [(None, 1)] 0 __________________________________________________________________________________________________ capital-gain_xf (InputLayer) [(None, 1)] 0 __________________________________________________________________________________________________ capital-loss_xf (InputLayer) [(None, 1)] 0 __________________________________________________________________________________________________ hours-per-week_xf (InputLayer) [(None, 1)] 0 __________________________________________________________________________________________________ concatenate (Concatenate) (None, 5) 0 fnlwgt_xf[0][0] education-num_xf[0][0] capital-gain_xf[0][0] capital-loss_xf[0][0] hours-per-week_xf[0][0] __________________________________________________________________________________________________ dense (Dense) (None, 100) 600 concatenate[0][0] __________________________________________________________________________________________________ dense_1 (Dense) (None, 70) 7070 dense[0][0] __________________________________________________________________________________________________ education_xf (InputLayer) [(None, 21)] 0 __________________________________________________________________________________________________ marital-status_xf (InputLayer) [(None, 12)] 0 __________________________________________________________________________________________________ occupation_xf (InputLayer) [(None, 20)] 0 __________________________________________________________________________________________________ race_xf (InputLayer) [(None, 10)] 0 __________________________________________________________________________________________________ relationship_xf (InputLayer) [(None, 11)] 0 __________________________________________________________________________________________________ workclass_xf (InputLayer) [(None, 14)] 0 __________________________________________________________________________________________________ sex_xf (InputLayer) [(None, 7)] 0 __________________________________________________________________________________________________ native-country_xf (InputLayer) [(None, 47)] 0 __________________________________________________________________________________________________ age_xf (InputLayer) [(None, 4)] 0 __________________________________________________________________________________________________ dense_2 (Dense) (None, 48) 3408 dense_1[0][0] __________________________________________________________________________________________________ concatenate_1 (Concatenate) (None, 146) 0 education_xf[0][0] marital-status_xf[0][0] occupation_xf[0][0] race_xf[0][0] relationship_xf[0][0] workclass_xf[0][0] sex_xf[0][0] native-country_xf[0][0] age_xf[0][0] __________________________________________________________________________________________________ dense_3 (Dense) (None, 34) 1666 dense_2[0][0] __________________________________________________________________________________________________ dense_4 (Dense) (None, 128) 18816 concatenate_1[0][0] __________________________________________________________________________________________________ concatenate_2 (Concatenate) (None, 162) 0 
dense_3[0][0] dense_4[0][0] __________________________________________________________________________________________________ dense_5 (Dense) (None, 1) 163 concatenate_2[0][0] __________________________________________________________________________________________________ transform_features_layer (Tenso multiple 0 ================================================================================================== Total params: 31,723 Trainable params: 31,723 Non-trainable params: 0 __________________________________________________________________________________________________
You can see the code to build these in the next lab. For now, you'll only need to take note of a few things. First, the output is a single dense unit with a sigmoid activation (i.e. dense_5 above). This is standard for binary classification problems.
Another is that the model is exported with a transformation layer. You can see this in the summary above as the bottom row named transform_features_layer, which is not connected to the other layers. From previous labs, you will know that this is taken from the graph generated by the Transform component. It helps avoid training-serving skew by making sure that raw inputs are transformed in the same way the model expects. It is also available as the tft_layer property of the model object.
# Transformation layer can be accessed in two ways. These are equivalent.
model.get_layer('transform_features_layer') is model.tft_layer
True
TFMA invokes this layer automatically for your raw inputs, but we've included a short snippet below to demonstrate how the transformation works. You can see that the raw features are indeed reformatted into an acceptable input for the model: the raw numeric features are scaled, and the raw categorical (string) features are encoded as one-hot vectors.
from tensorflow_transform.tf_metadata import schema_utils
# Load one tfrecord
tfrecord_file = tf.data.TFRecordDataset(TFRECORD_DAY1)
# Parse schema object as a feature spec
feature_spec = schema_utils.schema_as_feature_spec(SCHEMA).feature_spec
# Create a batch from the dataset
for records in tfrecord_file.batch(1).take(1):
# Parse the batch to get a dictionary of raw features
parsed_examples = tf.io.parse_example(records, feature_spec)
# Print the results
print("\nRAW FEATURES:")
for key, value in parsed_examples.items():
print(f'{key}: {value.numpy()}')
# Pop the label since the model does not expect a label input
parsed_examples.pop('label')
# Transform the rest of the raw features using the transform layer
transformed_examples = model.tft_layer(parsed_examples)
# Print the input to the model
print("\nTRANSFORMED FEATURES:")
for key, value in transformed_examples.items():
print(f'{key}: {value.numpy()}')
RAW FEATURES: age: [[25]] capital-gain: [[0]] capital-loss: [[0]] education: [[b'11th']] education-num: [[7]] fnlwgt: [[226802]] hours-per-week: [[40]] label: [[0]] marital-status: [[b'Never-married']] native-country: [[b'United-States']] occupation: [[b'Machine-op-inspct']] race: [[b'Black']] relationship: [[b'Own-child']] sex: [[b'Male']] workclass: [[b'Private']] TRANSFORMED FEATURES: education_xf: [[0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]] workclass_xf: [[1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]] hours-per-week_xf: [0.39795917] sex_xf: [[1. 0. 0. 0. 0. 0. 0.]] fnlwgt_xf: [0.14569008] education-num_xf: [0.4] marital-status_xf: [[0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]] capital-gain_xf: [0.] native-country_xf: [[1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]] age_xf: [[1. 0. 0. 0.]] relationship_xf: [[0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0.]] race_xf: [[0. 1. 0. 0. 0. 0. 0. 0. 0. 0.]] occupation_xf: [[0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]] capital-loss_xf: [0.]
The transformed features can be passed into the model to get the predictions. The snippet below demonstrates this and uses a BinaryAccuracy metric with a lowered threshold (0.3) to compare the true labels against the model predictions.
from tensorflow_transform.tf_metadata import schema_utils
# Load one tfrecord
tfrecord_file = tf.data.TFRecordDataset(TFRECORD_DAY1)
# Parse schema object as a feature spec
feature_spec = schema_utils.schema_as_feature_spec(SCHEMA).feature_spec
# Create a batch from the dataset
for records in tfrecord_file.batch(5).take(1):
# Get the label values from the raw input
parsed_examples = tf.io.parse_example(records, feature_spec)
y_true = parsed_examples.pop('label')
print(f'labels:\n {y_true.numpy()}\n')
# Transform the raw features and pass to the model to get predictions
transformed_examples = model.tft_layer(parsed_examples)
y_pred = model(transformed_examples)
print(f'predictions:\n {y_pred.numpy()}\n')
# Measure the binary accuracy
metric = tf.keras.metrics.BinaryAccuracy(threshold=0.3)
metric.update_state(y_true, y_pred)
print(f'binary accuracy: {metric.result().numpy()}\n')
labels: [[0] [0] [1] [1] [0]] predictions: [[1.6402992e-34] [3.4708142e-02] [5.1936507e-03] [3.3919078e-01] [2.3632433e-15]] binary accuracy: 0.800000011920929
The last thing to note is that the model is also exported with a serving signature. You will learn more about this in the next lab and in later parts of the specialization, but for now, you can think of it as a configuration for how the model is called when it is deployed for inference. For this particular model, the default signature is configured to transform batches of serialized raw features before feeding them to the model input. That way, you don't have to explicitly code the transformations as shown in the snippet above. You can just pass in batches of data directly, as shown below.
# Load one tfrecord
tfrecord_file = tf.data.TFRecordDataset(TFRECORD_DAY1)
# Print available signatures
print(f'model signatures: {model.signatures}\n')
# Create a batch
for records in tfrecord_file.batch(5).take(1):
# Pass the batch to the model serving signature to get predictions
output = model.signatures['serving_default'](examples=records)
# Print results
print(f"predictions:\n {output['output_0']}\n")
model signatures: _SignatureMap({'serving_default': <ConcreteFunction signature_wrapper(*, examples) at 0x7F3BAB8FBAD0>}) predictions: [[1.6402867e-34] [3.4708112e-02] [5.1937401e-03] [3.3919084e-01] [2.3632433e-15]]
TFMA accesses this model signature so it can work with the raw data and evaluate the model to get the metrics. Not only that, it can also extract specific features and domain values from your dataset before it computes these metrics. Let's see how this is done in the next section.
With the dataset and model now available, you can move on to using TFMA. There are a couple of additional things you need to prepare:
- a tfma.EvalConfig protocol message containing details about the models, metrics, and data slices you'd like to analyze
- a tfma.EvalSharedModel that points to your saved models
The tfma.EvalConfig() is a protocol message that sets up the analysis. Here, you will specify:
- model_specs - a message containing at least the label key so it can be extracted from the evaluation/test data
- metrics_specs - containing the metrics you would like to evaluate. A comprehensive guide can be found here and you will use the binary classification metrics for this exercise.
- slicing_specs - containing the feature slices you would like to compute metrics for. A short description of the different types of slices is shown here.
The eval config should be passed as a protocol message, and you can use the google.protobuf.text_format module to parse it from a string, as shown below.
import tensorflow_model_analysis as tfma
from google.protobuf import text_format
# Setup tfma.EvalConfig settings
eval_config = text_format.Parse("""
## Model information
model_specs {
# For keras (and serving models), you need to add a `label_key`.
label_key: "label"
}
## Post training metric information. These will be merged with any built-in
## metrics from training.
metrics_specs {
metrics { class_name: "ExampleCount" }
metrics { class_name: "BinaryAccuracy" }
metrics { class_name: "BinaryCrossentropy" }
metrics { class_name: "AUC" }
metrics { class_name: "AUCPrecisionRecall" }
metrics { class_name: "Precision" }
metrics { class_name: "Recall" }
metrics { class_name: "MeanLabel" }
metrics { class_name: "MeanPrediction" }
metrics { class_name: "Calibration" }
metrics { class_name: "CalibrationPlot" }
metrics { class_name: "ConfusionMatrixPlot" }
# ... add additional metrics and plots ...
}
## Slicing information
# overall slice
slicing_specs {}
# slice specific features
slicing_specs {
feature_keys: ["sex"]
}
slicing_specs {
feature_keys: ["race"]
}
# slice specific values from features
slicing_specs {
feature_values: {
key: "native-country"
value: "Cambodia"
}
}
slicing_specs {
feature_values: {
key: "native-country"
value: "Canada"
}
}
# slice feature crosses
slicing_specs {
feature_keys: ["sex", "race"]
}
""", tfma.EvalConfig())
TFMA also requires an EvalSharedModel instance that points to your model so it can be shared between multiple threads in the same process. This instance includes information about the type of model (keras, etc) and how to load and configure the model from its saved location on disk (e.g. tags, etc). The tfma.default_eval_shared_model() API can be used to create this given the model location and eval config.
# Create a tfma.EvalSharedModel that points to your model.
# You can ignore the warnings generated.
eval_shared_model = tfma.default_eval_shared_model(
eval_saved_model_path=MODEL1_FILE,
eval_config=eval_config)
WARNING:tensorflow:SavedModel saved prior to TF 2.5 detected when loading Keras model. Please ensure that you are saving the model with model.save() or tf.keras.models.save_model(), *NOT* tf.saved_model.save(). To confirm, there should be a file named "keras_metadata.pb" in the SavedModel directory. WARNING:tensorflow:Inconsistent references when loading the checkpoint into this object graph. Either the Trackable object references in the Python program have changed in an incompatible way, or the checkpoint was generated in an incompatible program. Two checkpoint references resolved to different objects (<tensorflow.python.keras.saving.saved_model.load.TensorFlowTransform>TransformFeaturesLayer object at 0x7f3baa360910> and <tensorflow.python.keras.engine.input_layer.InputLayer object at 0x7f3baf032a90>).
# Show properties of EvalSharedModel
print(f'EvalSharedModel contents: {eval_shared_model}')
EvalSharedModel contents: EvalSharedModel(model_path='starter_files/models/model1', add_metrics_callbacks=[], include_default_metrics=True, example_weight_key=None, additional_fetches=None, model_loader=<tensorflow_model_analysis.types.ModelLoader object at 0x7f3ba9e33460>, model_name='', model_type='tf_keras', rubber_stamp=False, is_baseline=False)
With the setup complete, you just need to declare an output directory and then run TFMA. You will pass in the eval config, shared model, dataset, and output directory to tfma.run_model_analysis() as shown below. This will create a tfma.EvalResult which you can use later for rendering metrics and plots.
# Specify output path for the evaluation results
OUTPUT_DIR = os.path.join(BASE_DIR, 'output')
# Run TFMA. You can ignore the warnings generated.
eval_result = tfma.run_model_analysis(
eval_shared_model=eval_shared_model,
eval_config=eval_config,
data_location=TFRECORD_FULL,
output_path=OUTPUT_DIR)
WARNING:absl:Tensorflow version (2.5.3) found. Note that TFMA support for TF 2.0 is currently in beta WARNING:apache_beam.runners.interactive.interactive_environment:Dependencies required for Interactive Beam PCollection visualization are not available, please use: `pip install apache-beam[interactive]` to install necessary dependencies to enable all data visualization features.
WARNING:tensorflow:SavedModel saved prior to TF 2.5 detected when loading Keras model. Please ensure that you are saving the model with model.save() or tf.keras.models.save_model(), *NOT* tf.saved_model.save(). To confirm, there should be a file named "keras_metadata.pb" in the SavedModel directory. WARNING:tensorflow:Inconsistent references when loading the checkpoint into this object graph. Either the Trackable object references in the Python program have changed in an incompatible way, or the checkpoint was generated in an incompatible program. [... the two warnings above are repeated several times as the model is loaded by each worker ...] WARNING:tensorflow:From /usr/local/lib/python3.7/dist-packages/tensorflow_model_analysis/writers/metrics_plots_and_validations_writer.py:113: tf_record_iterator (from tensorflow.python.lib.io.tf_record) is deprecated and will be removed in a future version. Instructions for updating: Use eager execution and: `tf.data.TFRecordDataset(path)`
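Since the results are written to OUTPUT_DIR, you can also reload them later without re-running the analysis. A minimal sketch, assuming the run above completed successfully:
# Reload previously computed results from disk; this is equivalent to `eval_result` above
reloaded_result = tfma.load_eval_result(output_path=OUTPUT_DIR)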
You can also visualize the results using TFMA methods. In this section, you will view the returned metrics and plots for the different slices you specified in the eval config.
You can view the metrics with the tfma.view.render_slicing_metrics() method. By default, the view displays the Overall slice. To view a particular slice, pass a feature name to the slicing_column argument as shown below. You can switch between metrics through the Show dropdown menu, and you can hover over the bar charts to see the exact value measured.
We encourage you to try the different options you see and also to modify the command. Here are some examples:
- Removing the slicing_column argument will produce the Overall slice.
- You can also pass in race (since it was specified in the eval config) to see the results for that particular slice.
- Setting the Examples (Weighted) Threshold slider above 5421 will remove the Female slice because it has fewer examples than that.
- Setting the View dropdown to Metrics Histogram will show the results divided into buckets. For example, if your slicing column is sex and the Histogram Type dropdown is set to Slice Counts, then you will see one slice in two of the 10 (default) buckets since there are only two values for that feature ('Male' and 'Female'). The x-axis shows the values for the metric in the Select Metric dropdown. This is the default view when the number of slices is large.

# Render metrics for a feature
tfma.view.render_slicing_metrics(eval_result, slicing_column='sex')
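As a quick illustration of the first two options in that list, the calls below reuse the same eval_result object:

# Render the Overall slice by omitting the slicing_column argument
tfma.view.render_slicing_metrics(eval_result)

# Render metrics sliced on race (also specified in the eval config)
tfma.view.render_slicing_metrics(eval_result, slicing_column='race')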
If you haven't yet, you can also pass in native-country as the slicing column. The difference in this visualization is that only two of its values were specified in the eval config earlier. This is useful if you just want to study a subgroup of a particular feature and not the entire domain.
# Render metrics for feature. Review EvalConfig message to see what values were selected.
tfma.view.render_slicing_metrics(eval_result, slicing_column='native-country')
TFMA also supports creating feature crosses to analyze combinations of features. Our original settings created a cross between sex and race, and you can pass it in as a SlicingSpec as shown below.
# Render metrics for feature crosses
tfma.view.render_slicing_metrics(
    eval_result,
    slicing_spec=tfma.SlicingSpec(
        feature_keys=['sex', 'race']))
In some cases, crossing the two columns creates a lot of combinations. You can narrow down the results to only look at specific values by specifying them in the slicing_spec argument. The example below shows the results for the sex feature for the Other race.
# Narrow down the feature crosses by specifying feature values
tfma.view.render_slicing_metrics(
    eval_result,
    slicing_spec=tfma.SlicingSpec(
        feature_keys=['sex'], feature_values={'race': 'Other'}))
Any plots that were added to the tfma.EvalConfig as post-training metric_specs can be displayed using tfma.view.render_plot.
As with metrics, plots can be viewed by slice. Unlike metrics, only plots for a particular slice value can be displayed, so a tfma.SlicingSpec must be used and it must specify both a slice feature name and value. If no slice is provided, then the plots for the Overall slice are used.
The example below displays the plots that were computed for the sex:Male slice. You can click on the names at the bottom of the graph to see a different plot type. Alternatively, you can tick the Show all plots checkbox to show all the plots on one screen.
# Render plots
tfma.view.render_plot(
    eval_result,
    tfma.SlicingSpec(feature_values={'sex': 'Male'}))
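If you only want the plots for the Overall slice, you can omit the SlicingSpec entirely, for example:

# Render plots for the Overall slice
tfma.view.render_plot(eval_result)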
Your training dataset is used for training your model, and will hopefully be representative of your test dataset and of the data that will be sent to your model in production. However, while inference data may start out looking much like your training data, it can drift over time enough that your model's performance changes. That means you need to monitor and measure your model's performance on an ongoing basis so that you can be aware of changes and react to them.
Let's take a look at how TFMA can help. You will load three different datasets and compare the model analysis results using the render_time_series() method.
import os

# Put data paths we prepared earlier in a list
TFRECORDS = [TFRECORD_DAY1, TFRECORD_DAY2, TFRECORD_DAY3]

# Initialize output paths list for each result
output_paths = []

# Run eval on each tfrecord separately
for num, tfrecord in enumerate(TFRECORDS):

    # Use the same model as before
    eval_shared_model = tfma.default_eval_shared_model(
        eval_saved_model_path=MODEL1_FILE,
        eval_config=eval_config)

    # Prepare output path name
    output_path = os.path.join('.', 'time_series', str(num))
    output_paths.append(output_path)

    # Run TFMA on the current tfrecord in the loop
    tfma.run_model_analysis(eval_shared_model=eval_shared_model,
                            eval_config=eval_config,
                            data_location=tfrecord,
                            output_path=output_path)
WARNING:tensorflow:SavedModel saved prior to TF 2.5 detected when loading Keras model. Please ensure that you are saving the model with model.save() or tf.keras.models.save_model(), *NOT* tf.saved_model.save(). To confirm, there should be a file named "keras_metadata.pb" in the SavedModel directory.
WARNING:tensorflow:Inconsistent references when loading the checkpoint into this object graph. Either the Trackable object references in the Python program have changed in an incompatible way, or the checkpoint was generated in an incompatible program.
WARNING:absl:Tensorflow version (2.5.3) found. Note that TFMA support for TF 2.0 is currently in beta
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
(Repeated SavedModel and checkpoint-reference warnings from the three evaluation runs truncated.)
First, imagine that you trained and deployed your model yesterday, and now you want to see how it's doing on the new data coming in today. The visualization will start by displaying AUC, and you can add other metric series from the UI.
Note: In the metric series charts, the x-axis is just the model directory name of the model that you're examining.
# Load results for day 1 and day 2 datasets
eval_results_from_disk = tfma.load_eval_results(output_paths[:2])
# Visualize results
tfma.view.render_time_series(eval_results_from_disk)
Now imagine that another day has passed and you want to see how it's doing on the new data coming in today.
# Load results for all three days
eval_results_from_disk = tfma.load_eval_results(output_paths)
# Visualize the results
tfma.view.render_time_series(eval_results_from_disk)
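As a side note, you can also load any single saved run back on its own and inspect its slicing metrics outside the time-series view. A minimal sketch, reusing the output_paths list from above:

# Load one saved evaluation run and render its slicing metrics by itself
single_run_result = tfma.load_eval_result(output_paths[0])
tfma.view.render_slicing_metrics(single_run_result, slicing_column='sex')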
This type of investigation lets you see if your model is behaving poorly on new data, and you can decide whether to retrain your production model based on these results. Retraining, however, might not always produce a better model, and you need a way to detect that as well. You will see how TFMA helps in that regard in the next section.
TFMA can be configured to evaluate multiple models at the same time. Typically, this is done to compare a candidate model against a baseline (such as the currently serving model) to determine what the performance differences in metrics are. When thresholds are configured, TFMA will produce a tfma.ValidationResult record indicating whether the performance matches expectations.
Below, you will re-configure the EvalConfig settings to compare two models: a candidate and a baseline. You will also validate the candidate's performance against the baseline by setting a tfma.MetricThreshold on the BinaryAccuracy metric. This helps in determining if your new model can indeed replace your currently deployed model.
# Setup tfma.EvalConfig setting with metric thresholds
eval_config_with_thresholds = text_format.Parse("""

  ## Model information
  model_specs {
    name: "candidate"
    label_key: "label"
  }
  model_specs {
    name: "baseline"
    label_key: "label"
    is_baseline: true
  }

  ## Post training metric information
  metrics_specs {
    metrics { class_name: "ExampleCount" }
    metrics {
      class_name: "BinaryAccuracy"
      threshold {
        # Ensure that metric is always > 0.9
        value_threshold {
          lower_bound { value: 0.9 }
        }
        # Ensure that metric does not drop by more than a small epsilon
        # e.g. (candidate - baseline) > -1e-10 or candidate > baseline - 1e-10
        change_threshold {
          direction: HIGHER_IS_BETTER
          absolute { value: -1e-10 }
        }
      }
    }
    metrics { class_name: "BinaryCrossentropy" }
    metrics { class_name: "AUC" }
    metrics { class_name: "AUCPrecisionRecall" }
    metrics { class_name: "Precision" }
    metrics { class_name: "Recall" }
    metrics { class_name: "MeanLabel" }
    metrics { class_name: "MeanPrediction" }
    metrics { class_name: "Calibration" }
    metrics { class_name: "CalibrationPlot" }
    metrics { class_name: "ConfusionMatrixPlot" }
    # ... add additional metrics and plots ...
  }

  ## Slicing information
  slicing_specs {}  # overall slice
  slicing_specs {
    feature_keys: ["race"]
  }
  slicing_specs {
    feature_keys: ["sex"]
  }
""", tfma.EvalConfig())
# Create tfma.EvalSharedModels that point to the candidate and baseline
candidate_model_path = MODEL1_FILE
baseline_model_path = MODEL2_FILE

eval_shared_models = [
    tfma.default_eval_shared_model(
        model_name=tfma.CANDIDATE_KEY,
        eval_saved_model_path=candidate_model_path,
        eval_config=eval_config_with_thresholds),
    tfma.default_eval_shared_model(
        model_name=tfma.BASELINE_KEY,
        eval_saved_model_path=baseline_model_path,
        eval_config=eval_config_with_thresholds),
]

# Specify validation path
validation_output_path = os.path.join(OUTPUT_DIR, 'validation')

# Run TFMA on the two models
eval_result_with_validation = tfma.run_model_analysis(
    eval_shared_models,
    eval_config=eval_config_with_thresholds,
    data_location=TFRECORD_FULL,
    output_path=validation_output_path)
WARNING:tensorflow:SavedModel saved prior to TF 2.5 detected when loading Keras model. Please ensure that you are saving the model with model.save() or tf.keras.models.save_model(), *NOT* tf.saved_model.save(). To confirm, there should be a file named "keras_metadata.pb" in the SavedModel directory.
WARNING:tensorflow:Inconsistent references when loading the checkpoint into this object graph. Either the Trackable object references in the Python program have changed in an incompatible way, or the checkpoint was generated in an incompatible program.
WARNING:absl:Tensorflow version (2.5.3) found. Note that TFMA support for TF 2.0 is currently in beta
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
(Repeated SavedModel and checkpoint-reference warnings truncated.)
When running evaluations of one or more models against a baseline, TFMA automatically adds diff metrics for all of the metrics computed during the evaluation. These metrics are named after the corresponding metric but with the string _diff appended to the metric name. A positive value for these _diff metrics indicates an improved performance against the baseline.
Like in the previous section, you can view the results with render_time_series().
# Render results
tfma.view.render_time_series(eval_result_with_validation)
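If you prefer to inspect the results programmatically rather than through the UI, a minimal sketch like the one below lists the auto-generated _diff metric names for the Overall slice. It assumes the validation_output_path and candidate/baseline setup from above, and that the nested dictionary layout of slicing_metrics matches this TFMA version:

# Load the candidate's evaluation results from the validation run
candidate_result = tfma.load_eval_result(validation_output_path, model_name=tfma.CANDIDATE_KEY)

# slicing_metrics is a list of (slice_key, metrics) pairs; an empty slice_key is the Overall slice
for slice_key, metrics_by_output in candidate_result.slicing_metrics:
    if not slice_key:
        for output_name, metrics_by_sub_key in metrics_by_output.items():
            for sub_key, metric_values in metrics_by_sub_key.items():
                diff_metric_names = [name for name in metric_values if name.endswith('_diff')]
                print(diff_metric_names)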
You can use tfma.load_validation_result to view the validation results you specified with the threshold settings. For this example, the validation fails because BinaryAccuracy is below the threshold.
# Print validation result
validation_result = tfma.load_validation_result(validation_output_path)
print(validation_result.validation_ok)
False
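If the boolean flag alone isn't enough, you can print the full ValidationResult protocol buffer to see which thresholds failed and on which slices (the exact fields it contains depend on your TFMA version):

# Print the complete validation result proto for more detail on the failures
print(validation_result)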
Congratulations! You have now explored the different methods of model analysis using TFMA. In the next section, you will see how these can fit into a TFX pipeline so you can automate the process and store the results in your pipeline directory and metadata store.