Does defining a layer of an artificial neural network with biases included in the model require multiplying the input data matrices by the sums of weights and biases?

by Tomasz Ciołak / Friday, 09 August 2024 / Published in Artificial Intelligence, EITC/AI/DLTF Deep Learning with TensorFlow, TensorFlow, TensorFlow basics

Defining a layer of an artificial neural network (ANN) with biases included in the model does not require multiplying the input data matrices by the sums of weights and biases. Instead, the process involves two distinct operations: the weighted sum of the inputs and the addition of biases. This distinction is important for understanding the mechanics of neural networks and their implementation in frameworks such as TensorFlow.

Theoretical Background

In the context of artificial neural networks, a layer is composed of multiple neurons, each of which performs a specific computation. For a given neuron, the input consists of a vector of features \mathbf{x} and a corresponding vector of weights \mathbf{w}. The neuron computes a weighted sum of the inputs and then adds a bias term b. Mathematically, this can be represented as:

    \[ z = \mathbf{w} \cdot \mathbf{x} + b \]

Here, \mathbf{w} \cdot \mathbf{x} denotes the dot product of the weight vector and the input vector, resulting in a scalar value z. The bias b is then added to this scalar to shift the activation function's input.
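This single-neuron computation can be reproduced in a few lines of plain Python. The input, weight, and bias values below are arbitrary, chosen only to illustrate that the bias is added after the weighted sum rather than folded into the weights:

```python
# Single-neuron computation: z = w . x + b
# Illustrative values only; not taken from any trained model.
x = [1.0, 2.0, 3.0]   # input feature vector
w = [0.5, -0.2, 0.1]  # weight vector
b = 0.4               # bias term

# Step 1: weighted sum (dot product) of inputs and weights
weighted_sum = sum(wi * xi for wi, xi in zip(w, x))

# Step 2: the bias is added afterwards, as a separate operation
z = weighted_sum + b  # approximately 0.8 for these values
```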

Implementation in TensorFlow

TensorFlow, a popular deep learning framework, provides the functionality to define and manipulate neural network layers efficiently. When defining a layer in TensorFlow, such as with `tf.keras.layers.Dense`, the framework internally handles the operations involving weights and biases. Consider the following example:

```python
import tensorflow as tf

# Define a dense layer with 10 units
dense_layer = tf.keras.layers.Dense(units=10, use_bias=True)

# Assume input data
input_data = tf.random.normal(shape=(5, 3))  # Batch size of 5, input features of 3

# Forward pass through the dense layer
output = dense_layer(input_data)
```

In this example, the `Dense` layer is initialized with 10 units (neurons) and is configured to use biases (`use_bias=True`). When the `input_data` passes through the `dense_layer`, TensorFlow performs the following operations internally:

1. Weighted Sum: Each neuron's output is computed by taking the dot product of the input data matrix and the weight matrix. If the input data matrix \mathbf{X} has a shape of (5, 3) and the weight matrix \mathbf{W} has a shape of (3, 10), the resulting matrix \mathbf{Z} will have a shape of (5, 10).

    \[ \mathbf{Z} = \mathbf{X} \cdot \mathbf{W} \]

2. Bias Addition: The bias vector \mathbf{b} is added to each row of the resulting matrix \mathbf{Z}. The bias vector \mathbf{b} has a shape of (10), and broadcasting is used to add it to each row of \mathbf{Z}.

    \[ \mathbf{Z} = \mathbf{Z} + \mathbf{b} \]
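These two internal steps can be sketched in NumPy, which mirrors the arithmetic TensorFlow performs under the hood (random inputs and weights here, with shapes matching the example above):

```python
import numpy as np

rng = np.random.default_rng(0)

X = rng.normal(size=(5, 3))   # batch of 5 examples, 3 features each
W = rng.normal(size=(3, 10))  # weights for a layer of 10 neurons
b = rng.normal(size=(10,))    # one bias per neuron

# Step 1: weighted sum via matrix multiplication -> shape (5, 10)
weighted = X @ W

# Step 2: bias addition; b is broadcast across all 5 rows
Z = weighted + b
```

Note that at no point are the weights and biases summed together before touching the inputs: the multiplication involves only `W`, and `b` enters via element-wise addition.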

Detailed Explanation

Weighted Sum

The weighted sum operation involves matrix multiplication. Given an input matrix \mathbf{X} with dimensions (N, d), where N is the batch size and d is the number of input features, and a weight matrix \mathbf{W} with dimensions (d, m), where m is the number of neurons in the layer, the matrix multiplication \mathbf{X} \cdot \mathbf{W} results in a matrix \mathbf{Z} with dimensions (N, m).

Bias Addition

After computing the weighted sum, the bias vector \mathbf{b} is added to each row of the matrix \mathbf{Z}. The bias vector \mathbf{b} has dimensions (m), corresponding to the number of neurons in the layer. This addition is performed using broadcasting, a technique that allows element-wise operations on arrays of different shapes.

Example with Numerical Values

Consider a simple example with numerical values to illustrate the process. Let the input data matrix \mathbf{X} be:

    \[ \mathbf{X} = \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \\ \end{bmatrix} \]

Assume the weight matrix \mathbf{W} is:

    \[ \mathbf{W} = \begin{bmatrix} 0.1 & 0.2 \\ 0.3 & 0.4 \\ 0.5 & 0.6 \\ \end{bmatrix} \]

And the bias vector \mathbf{b} is:

    \[ \mathbf{b} = \begin{bmatrix} 0.1 & 0.2 \\ \end{bmatrix} \]

The weighted sum \mathbf{Z} is computed as:

    \[ \mathbf{Z} = \mathbf{X} \cdot \mathbf{W} = \begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \\ \end{bmatrix} \cdot \begin{bmatrix} 0.1 & 0.2 \\ 0.3 & 0.4 \\ 0.5 & 0.6 \\ \end{bmatrix} = \begin{bmatrix} 2.2 & 2.8 \\ 4.9 & 6.4 \\ 7.6 & 10.0 \\ \end{bmatrix} \]

Next, the bias vector \mathbf{b} is added to each row of \mathbf{Z}:

    \[ \mathbf{Z} = \begin{bmatrix} 2.2 & 2.8 \\ 4.9 & 6.4 \\ 7.6 & 10.0 \\ \end{bmatrix} + \begin{bmatrix} 0.1 & 0.2 \\ \end{bmatrix} = \begin{bmatrix} 2.3 & 3.0 \\ 5.0 & 6.6 \\ 7.7 & 10.2 \\ \end{bmatrix} \]
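The numerical example above can be checked directly in NumPy, again with the weighted sum and the bias addition kept as two separate steps:

```python
import numpy as np

X = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [7.0, 8.0, 9.0]])
W = np.array([[0.1, 0.2],
              [0.3, 0.4],
              [0.5, 0.6]])
b = np.array([0.1, 0.2])

Z = X @ W   # weighted sum: [[2.2, 2.8], [4.9, 6.4], [7.6, 10.0]]
Z = Z + b   # bias broadcast across rows: [[2.3, 3.0], [5.0, 6.6], [7.7, 10.2]]
```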

Activation Function

After computing the weighted sum and adding the bias, the resulting matrix is typically passed through an activation function, such as ReLU (Rectified Linear Unit), sigmoid, or tanh, to introduce non-linearity into the model. For instance, applying the ReLU activation function to the matrix \mathbf{Z} would yield:

    \[ \text{ReLU}(\mathbf{Z}) = \begin{bmatrix} \max(0, 2.3) & \max(0, 3.0) \\ \max(0, 5.0) & \max(0, 6.6) \\ \max(0, 7.7) & \max(0, 10.2) \\ \end{bmatrix} = \begin{bmatrix} 2.3 & 3.0 \\ 5.0 & 6.6 \\ 7.7 & 10.2 \\ \end{bmatrix} \]
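A minimal NumPy sketch of the element-wise ReLU (standing in here for TensorFlow's `tf.nn.relu`); since every entry of this particular matrix is positive, ReLU leaves it unchanged, while any negative entries would be clipped to zero:

```python
import numpy as np

Z = np.array([[2.3, 3.0],
              [5.0, 6.6],
              [7.7, 10.2]])

# ReLU: element-wise max(0, z); all entries are positive, so output == input
relu_Z = np.maximum(0.0, Z)

# With a negative entry, ReLU clips it to zero
mixed = np.array([-1.5, 0.0, 2.0])
relu_mixed = np.maximum(0.0, mixed)  # negative value becomes 0.0
```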

Practical Considerations

When implementing neural networks in TensorFlow, it is essential to understand that the framework abstracts many of these operations, allowing developers to focus on higher-level design aspects. TensorFlow's `Dense` layer, for instance, handles the initialization of weights and biases, the computation of weighted sums, and the addition of biases internally.

Moreover, TensorFlow provides various initializers for weights and biases, such as `tf.keras.initializers.RandomNormal` for weights and `tf.keras.initializers.Zeros` for biases. These initializers can be specified when defining a layer:

```python
dense_layer = tf.keras.layers.Dense(
    units=10,
    use_bias=True,
    kernel_initializer=tf.keras.initializers.RandomNormal(mean=0., stddev=1.),
    bias_initializer=tf.keras.initializers.Zeros()
)
```

The process of defining a layer in an artificial neural network with biases included involves computing a weighted sum of the input data and weights, followed by the addition of biases. This operation does not entail multiplying the input data matrices by the sums of weights and biases. Instead, it consists of two separate steps: matrix multiplication for the weighted sum and element-wise addition for the biases. TensorFlow efficiently handles these operations, abstracting the underlying complexity and providing a high-level interface for neural network design.

