How do Keras and TensorFlow work together with Pandas and NumPy?

by Andrew Eliasz / Wednesday, 24 December 2025 / Published in Artificial Intelligence, EITC/AI/GCML Google Cloud Machine Learning, First steps in Machine Learning, Plain and simple estimators

Keras and TensorFlow, two well-integrated libraries in the machine learning ecosystem, are often used together with Pandas and NumPy, which provide robust tools for data manipulation and numerical computation. Understanding how these libraries interact is critical for those embarking on machine learning projects, especially when using Google Cloud Machine Learning services or similar platforms.

Keras and TensorFlow: Roles and Integration

Keras is a high-level neural networks API, written in Python, that runs on top of lower-level frameworks such as TensorFlow. Keras simplifies the process of defining, training, and evaluating deep learning models, offering a user-friendly interface and abstracting away many of the complexities of direct TensorFlow usage. TensorFlow, developed by Google, is a comprehensive open-source platform for machine learning. It provides both high-level and low-level APIs for building and deploying machine learning models at scale.

Since TensorFlow 2.x, Keras has become tightly integrated into TensorFlow and is accessible via `tf.keras`. This means that when a user writes model-building code with `tf.keras`, it leverages TensorFlow’s computational backend for model execution, optimization, and deployment.
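A minimal sketch of this integration, assuming TensorFlow 2.x is installed: the model is defined entirely through the `tf.keras` namespace, and the layer sizes here are arbitrary illustrative values.

```python
import tensorflow as tf

# Since TensorFlow 2.x the Keras API ships inside TensorFlow as `tf.keras`;
# models defined this way are executed by TensorFlow's computational backend.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(3,)),
    tf.keras.layers.Dense(4, activation="relu"),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# Parameter count: (3*4 + 4) for the hidden layer + (4*1 + 1) for the output.
print(model.count_params())
```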

NumPy: Foundation for Numerical Computation

NumPy is the foundational scientific computing library in Python. It provides a powerful N-dimensional array object, called ndarray, and a suite of functions for performing fast mathematical operations on large datasets. Many operations in machine learning, such as linear algebra, matrix manipulation, and vectorized computations, are efficiently handled by NumPy.
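The operations just mentioned can be illustrated with a small sketch; the array values are arbitrary and chosen only to make the results easy to check.

```python
import numpy as np

# Vectorized arithmetic on an N-dimensional array: no Python-level loops.
X = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])

col_means = X.mean(axis=0)    # per-feature means
centered = X - col_means      # broadcasting subtracts the means row-wise
gram = X.T @ X                # matrix product (linear algebra)

print(col_means)              # means of each column
print(centered.mean(axis=0))  # centered columns average to zero
print(gram.shape)             # (2, 2) Gram matrix
```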

Pandas: Data Manipulation and Preparation

Pandas complements NumPy by offering high-level data structures such as DataFrames, which are especially useful for tabular data manipulation, cleaning, exploration, and feature engineering. DataFrames make it easy to handle missing values, encode categorical variables, and perform aggregations or groupings, all of which are common tasks in preparing data for machine learning models.
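These tasks can be sketched on a tiny in-memory frame; the column names and values below are illustrative, not taken from any real dataset.

```python
import numpy as np
import pandas as pd

# A small frame with one missing value and one categorical column.
df = pd.DataFrame({
    "age": [25.0, np.nan, 40.0],
    "city": ["Brussels", "Paris", "Brussels"],
})

df["age"] = df["age"].fillna(df["age"].mean())              # impute missing value
df["city_code"] = df["city"].astype("category").cat.codes   # label-encode category

print(df.groupby("city")["age"].mean())                     # simple aggregation
```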

Workflow: Interaction Between the Libraries

When building machine learning models with Keras and TensorFlow, Pandas and NumPy play critical roles throughout the data preparation and model training pipeline. The typical workflow proceeds as follows:

1. Data Ingestion and Initial Exploration (Pandas)
Data is often loaded from external sources (CSV, SQL databases, Google BigQuery, etc.) using Pandas’ powerful `read_*` functions. DataFrames provide an intuitive interface for viewing, filtering, and summarizing the data.

```python
import pandas as pd

df = pd.read_csv('dataset.csv')
print(df.head())
```

2. Data Cleaning and Feature Engineering (Pandas)
DataFrames are used to handle missing values, encode categorical features, normalize numerical features, and generate new features. These steps are important for optimizing model performance.

```python
df['feature_normalized'] = (df['feature'] - df['feature'].mean()) / df['feature'].std()
df['category_encoded'] = df['category'].astype('category').cat.codes
```

3. Conversion to NumPy Arrays
While Pandas DataFrames are efficient for data manipulation, Keras and TensorFlow models expect data as NumPy arrays or TensorFlow tensors. The DataFrame `.to_numpy()` method (preferred over the older `.values` attribute) performs this conversion.

```python
import numpy as np

X = df[['feature1', 'feature2', 'feature_normalized']].to_numpy()
y = df['label'].to_numpy()
```

4. Model Building and Training (Keras/TensorFlow)
The NumPy arrays are passed to Keras model methods to train neural networks or other machine learning estimators. Keras models accept NumPy arrays directly as inputs for training (`fit`), evaluation (`evaluate`), and prediction (`predict`).

```python
import tensorflow as tf
from tensorflow import keras

model = keras.Sequential([
    keras.layers.Dense(64, activation='relu', input_shape=(X.shape[1],)),
    keras.layers.Dense(1, activation='sigmoid')
])

model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
model.fit(X, y, epochs=10, batch_size=32)
```

5. Advanced Data Pipelines (tf.data and Integration)
For larger datasets or production workflows, TensorFlow’s `tf.data` API can be used to create efficient data pipelines. Pandas DataFrames or NumPy arrays can be converted into `tf.data.Dataset` objects, which support batch processing, shuffling, and parallel prefetching.

```python
dataset = tf.data.Dataset.from_tensor_slices((X, y))
dataset = dataset.shuffle(buffer_size=1024).batch(32)
model.fit(dataset, epochs=10)
```

Compatibility and Data Types

Keras and TensorFlow are highly compatible with NumPy, as both are designed for numerical computation and use similar paradigms for multi-dimensional data storage and manipulation. NumPy arrays (`ndarray`) and TensorFlow tensors can often be used interchangeably: `tf.convert_to_tensor` turns an array into a tensor, and in eager execution a tensor's `.numpy()` method returns the underlying array.

Pandas DataFrames are not a native TensorFlow data type, but they convert seamlessly to NumPy arrays. This interoperability lets users leverage the strengths of each library at different stages of the machine learning workflow.
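The round trip between the two representations can be sketched as follows, assuming TensorFlow 2.x with eager execution (its default mode); the array values are arbitrary.

```python
import numpy as np
import tensorflow as tf

arr = np.array([[1.0, 2.0], [3.0, 4.0]])

t = tf.convert_to_tensor(arr)   # ndarray -> tensor (dtype and shape preserved)
back = t.numpy()                # eager tensor -> ndarray

print(np.array_equal(arr, back))
```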

Handling Categorical Data and Feature Engineering

A notable aspect of preparing data for Keras models involves handling categorical variables. Pandas provides methods such as `get_dummies` for one-hot encoding or `.astype('category').cat.codes` for label encoding. These transformations convert categorical columns into numerical representations that Keras can process. For more complex feature engineering, Pandas can be used to create interaction terms, polynomial features, or aggregate statistics.
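A small sketch of `get_dummies` in action; the `colour` column and its values are illustrative.

```python
import pandas as pd

df = pd.DataFrame({"colour": ["red", "green", "red", "blue"]})

# One-hot encoding: one 0/1 column per category, named alphabetically.
one_hot = pd.get_dummies(df["colour"], prefix="colour")

print(one_hot.columns.tolist())  # colour_blue, colour_green, colour_red
```

Each row of the result has exactly one column set, which is what makes the representation suitable as direct numerical input to a Keras model.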

Scaling and Normalization

Standardization and normalization are common preprocessing steps, ensuring that features contribute equally to the model's learning process. Pandas can be used to apply these transformations, or users may employ scikit-learn’s preprocessing utilities and then convert the results to NumPy arrays for model training.

```python
from sklearn.preprocessing import StandardScaler

scaler = StandardScaler()
X_scaled = scaler.fit_transform(X)
```

Model Evaluation and Output Handling

After training, predictions generated by Keras models are returned as NumPy arrays. These can be converted back into Pandas DataFrames for easy analysis, visualization, or further processing.

```python
predictions = model.predict(X_test)
predictions_df = pd.DataFrame(predictions, columns=['predicted_label'])
```

This bidirectional flow—data from Pandas to NumPy to Keras/TensorFlow and back—enables a cohesive machine learning pipeline.

Google Cloud Machine Learning Context

When working with Google Cloud Machine Learning services, these libraries remain foundational. Data may be ingested from Google Cloud Storage, BigQuery, or other cloud data sources into Pandas DataFrames. The subsequent steps of cleaning, processing, and conversion to NumPy arrays remain consistent. TensorFlow and Keras models can be built locally and then deployed to the cloud using tools such as TensorFlow Serving or Google AI Platform. The ability to serialize models and data in formats compatible with these libraries ensures portability and scalability across local and cloud environments.

Plain and Simple Estimators: Bridging Simplicity with Power

While Keras and TensorFlow are often associated with deep learning, they also offer simple estimators suitable for straightforward regression or classification tasks. For example, logistic regression or basic dense networks can be defined with minimal code. The integration with Pandas and NumPy ensures that even users with less experience in machine learning can build, train, and deploy performant models with a clear and concise workflow.

Example: Simple Binary Classification

Suppose a dataset contains features about bank customers and a binary label indicating whether each customer subscribed to a term deposit. Using Pandas, the dataset is loaded and preprocessed; with NumPy, arrays are generated for model input; and with Keras (via TensorFlow), a simple neural network model is trained to predict the outcome.

```python
import pandas as pd
import tensorflow as tf
from tensorflow import keras

# Load and preprocess data
df = pd.read_csv('bank_marketing.csv')
X = df[['age', 'balance', 'duration']].to_numpy()
y = (df['y'] == 'yes').astype(int).to_numpy()

# Build and train model
model = keras.Sequential([
    keras.layers.Dense(1, activation='sigmoid', input_shape=(X.shape[1],))
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
model.fit(X, y, epochs=5, batch_size=16)
```

This example demonstrates the straightforward integration of these tools, allowing users to focus on the logic of the machine learning task rather than the intricacies of data or computation management.

Best Practices and Considerations

1. Data Consistency: Ensure that the data passed to Keras models matches the expected shape and type. Inconsistent dimensions or data types can lead to errors.
2. Memory Management: When working with large datasets, it may be beneficial to use batch generators or the `tf.data` API to avoid loading the entire dataset into memory.
3. Model Saving and Loading: Trained Keras models can be saved and loaded using the `.save()` and `keras.models.load_model()` functions. Inputs and outputs retain compatibility with NumPy arrays, facilitating reproducibility and deployment.
4. Seamless Integration: The interoperability between Pandas, NumPy, Keras, and TensorFlow streamlines the machine learning workflow. Users can freely move data between these libraries as needed, leveraging the unique features of each.
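The save/load round trip from point 3 can be sketched as follows, assuming TensorFlow 2.x is installed; `model.keras` is an illustrative filename written to the working directory, and the model is deliberately trivial.

```python
import numpy as np
from tensorflow import keras

# A minimal model to demonstrate serialization and restoration.
model = keras.Sequential([
    keras.Input(shape=(2,)),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

X = np.random.rand(4, 2).astype("float32")

model.save("model.keras")                         # architecture + weights
restored = keras.models.load_model("model.keras")

# The restored model produces the same NumPy predictions as the original.
same = np.allclose(model.predict(X, verbose=0), restored.predict(X, verbose=0))
print(same)
```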

Example: End-to-End Workflow

A typical machine learning pipeline using these libraries might look as follows:

```python
import pandas as pd
import tensorflow as tf
from tensorflow import keras
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# Step 1: Load data
df = pd.read_csv('housing.csv')

# Step 2: Clean and preprocess
df = df.dropna()
X = df[['feature1', 'feature2', 'feature3']].to_numpy()
y = df['price'].to_numpy()

# Step 3: Train-test split
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Step 4: Scale features
scaler = StandardScaler()
X_train_scaled = scaler.fit_transform(X_train)
X_test_scaled = scaler.transform(X_test)

# Step 5: Build and train model
model = keras.Sequential([
    keras.layers.Dense(32, activation='relu', input_shape=(X_train_scaled.shape[1],)),
    keras.layers.Dense(1)
])
model.compile(optimizer='adam', loss='mse', metrics=['mae'])
model.fit(X_train_scaled, y_train, epochs=10, batch_size=32, validation_split=0.1)

# Step 6: Evaluate model
loss, mae = model.evaluate(X_test_scaled, y_test)
print(f"Test MAE: {mae}")

# Step 7: Make predictions
predictions = model.predict(X_test_scaled)
results_df = pd.DataFrame({'Actual': y_test, 'Predicted': predictions.flatten()})
print(results_df.head())
```

This example illustrates the sequential use of Pandas for data handling, NumPy for numerical arrays, scikit-learn for preprocessing and splitting, and Keras/TensorFlow for model development and inference. Each library plays a distinct and complementary role, contributing to a seamless workflow for developing, training, and deploying machine learning models.

Conclusion

The combination of Keras, TensorFlow, Pandas, and NumPy forms a powerful ecosystem for machine learning tasks. Each library contributes unique capabilities: Keras and TensorFlow for model definition and training; Pandas for data manipulation and cleaning; and NumPy for efficient numerical computation. Their interoperability allows practitioners to construct robust, efficient, and maintainable machine learning workflows, whether for simple estimators or more complex deep learning architectures.
