Understanding non_trainable_weights in Keras: A Complete Guide with Examples | by Karthik Karunakaran, Ph.D. | Mar, 2025


Deep learning models often require precise control over which parameters are updated during training. In Keras, the non_trainable_weights property helps manage such parameters efficiently. Whether you are fine-tuning a pre-trained model or implementing custom layers, understanding how to use non_trainable_weights correctly can improve both performance and flexibility.

Keras models and layers have two types of weight attributes:

• Trainable Weights: These are updated during backpropagation.
• Non-Trainable Weights: These remain constant during training, useful for storing fixed parameters such as the moving statistics in batch normalization.

The non_trainable_weights property provides access to these parameters, ensuring they can be used without being modified by gradient updates.
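For example, a BatchNormalization layer keeps its learned scale and offset (gamma, beta) as trainable weights, while its moving mean and variance live in non_trainable_weights. A minimal sketch to inspect this:

from tensorflow import keras

# Build a BatchNormalization layer and check which of its weights
# are trainable vs. non-trainable.
bn = keras.layers.BatchNormalization()
bn.build(input_shape=(None, 8))

print([w.name for w in bn.trainable_weights])      # gamma, beta
print([w.name for w in bn.non_trainable_weights])  # moving_mean, moving_variance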

Non-trainable weights come into play in several common scenarios:

• Pre-trained Model Fine-Tuning: Freezing layers to retain learned features.
• Custom Layers: Defining stateful layers with fixed parameters.
• Efficiency: Reducing the number of trainable parameters speeds up training.

To access the non-trainable weights of a model, use:

from tensorflow import keras

# Load a pre-trained model without its classification head
base_model = keras.applications.MobileNetV2(weights='imagenet', include_top=False)

# Freeze all layers so their weights become non-trainable
for layer in base_model.layers:
    layer.trainable = False

print("Non-trainable weights:", base_model.non_trainable_weights)

Here, all layers are frozen, making their weights part of non_trainable_weights.
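As a quick sanity check (continuing the snippet above), you can confirm that freezing moved every weight out of the trainable list:

# After freezing, nothing is left in trainable_weights
print("Trainable weights:", len(base_model.trainable_weights))        # 0
print("Non-trainable weights:", len(base_model.non_trainable_weights))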

You can also define a custom layer with fixed parameters:

import tensorflow as tf
from tensorflow import keras

class CustomLayer(keras.layers.Layer):
    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        # Fixed weight that gradient descent will never update
        self.fixed_weight = self.add_weight(shape=(1,), initializer="ones", trainable=False)

    def call(self, inputs):
        return inputs * self.fixed_weight

layer = CustomLayer()
print("Non-trainable weights:", layer.non_trainable_weights)

Here, fixed_weight stays unchanged throughout training.
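To verify this, here is a minimal sketch (synthetic data, purely illustrative) that trains a tiny model containing CustomLayer and inspects the weight afterwards:

import numpy as np

# Build a small model around CustomLayer and train it briefly on random data
custom = CustomLayer()
model = keras.Sequential([
    keras.Input(shape=(3,)),
    keras.layers.Dense(4),
    custom,
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

x = np.random.rand(32, 3).astype("float32")
y = np.random.rand(32, 1).astype("float32")
model.fit(x, y, epochs=1, verbose=0)

# The non-trainable weight is untouched by gradient descent
print("fixed_weight after training:", custom.fixed_weight.numpy())  # still [1.]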

    Although they don’t seem to be up to date mechanically, non-trainable weights will be manually modified:

layer.fixed_weight.assign([2.0])
print("Updated non-trainable weight:", layer.fixed_weight.numpy())

A few best practices to keep in mind:

1. Use for Frozen Layers: When fine-tuning pre-trained models, set trainable = False on the layers you want to freeze.
2. Manually Update When Needed: If updates are required, assign values explicitly.
3. Monitor Parameter Count: Use model.summary() to compare trainable vs. non-trainable parameters (see the sketch after this list).
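For instance, continuing the frozen MobileNetV2 example above (the small classification head here is illustrative):

# Stack a small head on the frozen base model and inspect the parameter
# counts reported by model.summary()
model = keras.Sequential([
    base_model,
    keras.layers.GlobalAveragePooling2D(),
    keras.layers.Dense(10, activation="softmax"),  # illustrative 10-class head
])
model.summary()  # reports "Trainable params" vs. "Non-trainable params"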

Understanding and leveraging non_trainable_weights in Keras is essential for optimizing deep learning workflows. Whether you are customizing layers or fine-tuning models, managing trainable and non-trainable weights carefully can significantly improve model efficiency.

Interested in mastering AI and deep learning? Check out my Udemy courses: Karthik K on Udemy

Share your thoughts in the comments! Have you used non_trainable_weights before? How did it affect your model training?


