
    The Math Behind In-Context Learning | by Shitanshu Bhushan | Dec, 2024

By Team_AIBS News | January 1, 2025


In 2022, Anthropic published a paper in which they showed evidence that induction heads might constitute the mechanism for ICL. What are induction heads? As stated by Anthropic: "Induction heads are implemented by a circuit consisting of a pair of attention heads in different layers that work together to copy or complete patterns." Simply put, given a sequence like [..., A, B, ..., A], an induction head completes it with B, on the reasoning that if A was followed by B earlier in the context, it is likely that A is followed by B again. When you have a sequence like "...A, B...A", the first attention head copies previous-token information into each position, and the second attention head uses this information to find where A appeared before and predict what came after it (B).

Recently, a number of works have shown that transformers could be doing ICL via gradient descent (Garg et al. 2022, von Oswald et al. 2023, etc.) by exhibiting the relation between linear attention and gradient descent. Let's revisit least squares and gradient descent.

Source: Image by Author
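The equations in the image above are not reproduced here; based on the surrounding discussion, they likely showed the least-squares objective and its gradient-descent update, along these lines (my reconstruction, with step size η and n examples assumed):

```latex
% Least-squares loss for a linear model w over n examples (x_i, y_i)
L(w) = \frac{1}{2n} \sum_{i=1}^{n} \left( w^\top x_i - y_i \right)^2

% Gradient descent update with step size \eta
w_{k+1} = w_k - \eta \, \nabla L(w_k),
\qquad
\nabla L(w_k) = \frac{1}{n} \sum_{i=1}^{n} \left( w_k^\top x_i - y_i \right) x_i
```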

Now let's see how this links with linear attention.

Here we treat linear attention as the same as softmax attention minus the softmax operation. The basic linear attention formula,

Source: Image by Author
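The formula image is missing; a standard form of linear attention, consistent with the description above (softmax attention with the softmax removed, scaling omitted), is:

```latex
% Softmax attention
\mathrm{Attn}(Q, K, V) = \mathrm{softmax}\!\left( Q K^\top \right) V

% Linear attention: drop the softmax; per query position j,
o_j = \sum_{i} \left( q_j^\top k_i \right) v_i
```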

Let's start with a single-layer construction that captures the essence of in-context learning. Suppose we have n training examples (x₁,y₁)…(xₙ,yₙ), and we want to predict y_{n+1} for a new input x_{n+1}.

Source: Image by Author
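The construction itself was in the image; a plausible sketch, given the discussion that follows (keys and values built from the demonstrations, the query token read out), is a prediction of the form:

```latex
% One linear-attention layer over the n demonstrations, read out at the query:
% keys k_i = x_i, values v_i = y_i, query q = x_{n+1},
% with a learned matrix W inside the key-query product
\hat{y}_{n+1} = \sum_{i=1}^{n} y_i \left( x_i^\top W \, x_{n+1} \right)
```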

This looks very similar to what we got with gradient descent, except in linear attention we have an extra term 'W'. What linear attention is implementing is something known as preconditioned gradient descent (PGD), where instead of the standard gradient step, we modify the gradient with a preconditioning matrix W,

Source: Image by Author
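The update in the missing image is presumably the standard PGD step; spelled out with the least-squares loss from before (my reconstruction), one step from w₀ = 0 gives a prediction matching the attention output above:

```latex
% Preconditioned gradient descent: modify the gradient with a matrix W
w_{k+1} = w_k - \eta \, W \, \nabla L(w_k)

% One step from w_0 = 0, using \nabla L(0) = -\tfrac{1}{n}\sum_i y_i x_i:
\hat{y}_{n+1} = w_1^\top x_{n+1}
             = \frac{\eta}{n} \sum_{i=1}^{n} y_i \, x_i^\top W^\top x_{n+1}
```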

What we have shown here is that we can construct a weight matrix such that one layer of linear attention performs one step of PGD.
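As a sanity check, this equivalence can be verified numerically. The sketch below is my own construction rather than code from any of the cited papers: it compares one PGD step on the least-squares loss (starting from w₀ = 0) against one layer of linear attention, with the step size η folded into a symmetric preconditioner W.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 5, 20

# In-context demonstrations (x_i, y_i) drawn from a hidden linear rule
w_true = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = X @ w_true
x_query = rng.normal(size=d)

# Preconditioner (eta folded in); made symmetric PSD as in PGD analyses
A = rng.normal(size=(d, d))
W = A @ A.T

# --- One step of preconditioned gradient descent from w0 = 0 ---
# L(w) = 1/(2n) * sum_i (w.x_i - y_i)^2, so grad at w0 = 0 is -(1/n) X^T y
grad = -(X.T @ y) / n
w1 = -W @ grad                       # w1 = (1/n) W X^T y
pred_pgd = x_query @ w1

# --- One layer of linear attention over the demonstrations ---
# values = y_i, keys = x_i, query = x_query, score s_i = x_i^T W x_query
scores = X @ W @ x_query
pred_attn = (y * scores).sum() / n

print(np.isclose(pred_pgd, pred_attn))  # True: the two predictions coincide
```

The symmetry of W is what lets the key-query product and the preconditioned gradient line up exactly; with an arbitrary W the attention layer implements the same update with W transposed.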

We saw how attention can implement "learning algorithms": algorithms where, given a number of demonstrations (x, y), the model learns from them to predict the output of any new query. While the exact mechanisms involving multiple attention layers and MLPs are complex, researchers have made progress in understanding how in-context learning works mechanistically. This article provides an intuitive, high-level introduction to help readers understand the inner workings of this emergent ability of transformers.

To read more on this topic, I'd suggest the following papers:

    In-context Learning and Induction Heads

    What Can Transformers Learn In-Context? A Case Study of Simple Function Classes

    Transformers Learn In-Context by Gradient Descent

    Transformers learn to implement preconditioned gradient descent for in-context learning

This blog post was inspired by coursework from my graduate studies during Fall 2024 at the University of Michigan. While the courses provided the foundational knowledge and motivation to explore these topics, any errors or misinterpretations in this article are entirely my own. This represents my personal understanding and exploration of the material.


