
    Build Your Own AI Coding Assistant in JupyterLab with Ollama and Hugging Face

    By Team_AIBS News | March 24, 2025 | 9 Mins Read


    Jupyter AI brings generative AI capabilities right into the JupyterLab interface. Running the assistant locally ensures privacy, reduces latency, and provides offline functionality, making it a powerful tool for developers. In this article, we'll learn how to set up a local AI coding assistant in JupyterLab using Jupyter AI, Ollama and Hugging Face. By the end of this article, you'll have a fully functional coding assistant in JupyterLab capable of autocompleting code, fixing errors, creating new notebooks from scratch, and much more, as shown in the screenshot below.

    Coding assistant in JupyterLab via Jupyter AI | Image by Author

    ⚠️ Jupyter AI is still under heavy development, so some features may break. As of this writing, I've tested the setup to confirm it works, but expect potential changes as the project evolves. The assistant's performance also depends on the model you select, so make sure you choose one that fits your use case.

    First things first: what is Jupyter AI? As the name suggests, Jupyter AI is a JupyterLab extension for generative AI. This powerful tool turns your standard Jupyter notebooks or JupyterLab environment into a generative AI playground. The best part? It also works seamlessly in environments like Google Colaboratory and Visual Studio Code. The extension does all the heavy lifting, providing access to a variety of model providers (both open and closed source) right inside your Jupyter environment.

    Flow diagram of the installation process | Image by Author

    Setting up the environment involves three main components, plus an optional one:

    • JupyterLab
    • The Jupyter AI extension
    • Ollama (for local model serving)
    • [Optional] Hugging Face (for GGUF models)

    Honestly, getting the assistant to solve coding errors is the easy part. The hard part is making sure all the installations are done correctly. It's therefore essential that you follow the steps carefully.

    1. Installing the Jupyter AI Extension

    It's recommended to create a new environment specifically for Jupyter AI to keep your existing setup clean and organised. Once done, follow the next steps. Jupyter AI requires JupyterLab 4.x or Jupyter Notebook 7+, so make sure you have the latest version of JupyterLab installed. You can install/upgrade JupyterLab with pip or conda:

    # Install JupyterLab 4 using pip
    pip install jupyterlab~=4.0

    Next, install the Jupyter AI extension as follows.

    pip set up "jupyter-ai[all]"

    This is the simplest method of installation since it includes all provider dependencies (so it supports Hugging Face, Ollama, etc., out of the box). At the time of writing, Jupyter AI supports the following model providers:

    Supported model providers in Jupyter AI along with their dependencies | Created by Author from the documentation

    If you encounter errors during the Jupyter AI installation, install Jupyter AI manually using pip without the [all] optional dependency group. This way you can control which models are available in your Jupyter AI environment. For example, to install Jupyter AI with support for Ollama models only, use the following:

    pip set up jupyter-ai langchain-ollama

    The dependencies depend on the model providers (see table above). Next, restart your JupyterLab instance. If you see a chat icon in the left sidebar, everything has been installed correctly. With Jupyter AI, you can chat with models or use inline magic commands directly inside your notebooks.

    Native chat UI in JupyterLab | Image by Author
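If you want to confirm the pieces are in place from code rather than by eyeballing the sidebar, a quick sanity check is possible. This is a minimal sketch; the package and binary names match the install commands above, so adjust them if you installed into a different environment.

```python
# Quick sanity check that the stack is visible from the current environment.
import importlib.util
import shutil

def check_setup() -> dict:
    """Report which components of the stack are importable/available."""
    return {
        "jupyterlab": importlib.util.find_spec("jupyterlab") is not None,
        "jupyter_ai": importlib.util.find_spec("jupyter_ai") is not None,
        "ollama_cli": shutil.which("ollama") is not None,
    }

print(check_setup())
```

Any False entry points you back at the corresponding installation step.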

    2. Setting Up Ollama for Local Models

    Now that Jupyter AI is installed, we need to configure it with a model. While Jupyter AI integrates with Hugging Face models directly, some models may not work properly. Instead, Ollama provides a more reliable way to load models locally.

    Ollama is a handy tool for running Large Language Models locally. It lets you download pre-configured AI models from its library. Ollama supports all major platforms (macOS, Windows, Linux), so choose the method for your OS and download and install it from the official website. After installation, verify that it's set up correctly by running:

    ollama --version
    ------------------------------
    ollama version is 0.6.2

    Also, make sure your Ollama server is running, which you can check by calling ollama serve in the terminal:

    $ ollama serve
    Error: listen tcp 127.0.0.1:11434: bind: address already in use

    If the server is already active, you will see an error like the one above, confirming that Ollama is running and in use.
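You can also probe the server from Python instead of reading the error message. A minimal sketch, assuming Ollama is listening on its default port (11434):

```python
# Check whether something is accepting connections on Ollama's default port.
import socket

def ollama_port_open(host: str = "127.0.0.1", port: int = 11434) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=1):
            return True
    except OSError:
        return False

print("Ollama server reachable:", ollama_port_open())
```

If this prints False, start the server with ollama serve before continuing.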


    Option 1: Using Pre-Configured Models

    Ollama provides a library of pre-trained models that you can download and run locally. To start using a model, download it using the pull command. For example, to use qwen2.5-coder:1.5b, run:

    ollama pull qwen2.5-coder:1.5b

    This downloads the model to your local environment. To check whether the model has been downloaded, run:

    ollama list

    This lists all the models you've downloaded and stored locally on your system using Ollama.
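If you want to check for a model programmatically, the listing is easy to parse. A small sketch (the sample output below is illustrative; your actual listing will differ):

```python
# Parse the tabular output of `ollama list` into model names.
sample_output = """\
NAME                    ID              SIZE      MODIFIED
qwen2.5-coder:1.5b      abc123def456    986 MB    2 minutes ago
"""

def parse_model_names(listing: str) -> list[str]:
    """Extract the NAME column, skipping the header row."""
    lines = listing.strip().splitlines()[1:]  # drop the header
    return [line.split()[0] for line in lines if line.strip()]

print(parse_model_names(sample_output))  # ['qwen2.5-coder:1.5b']
```

In practice you would feed this the real output, e.g. from subprocess.run(["ollama", "list"], capture_output=True, text=True).stdout.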

    Option 2: Loading a Custom Model

    If the model you need isn't available in Ollama's library, you can load a custom model by creating a Model File that specifies the model's source. For detailed instructions on this process, refer to the Ollama Import Documentation.

    Option 3: Running GGUF Models Directly from Hugging Face

    Ollama now supports GGUF models directly from the Hugging Face Hub, including both public and private models. This means that if you want to use a GGUF model straight from the Hugging Face Hub, you can do so without a custom Model File as described in Option 2 above.

    For example, to load a 4-bit quantized Qwen2.5-Coder-1.5B-Instruct model from Hugging Face:

    1. First, enable Ollama under your Local Apps settings.

    How to enable Ollama under your Local Apps settings on Hugging Face | Image by Author

    2. On the model page, choose Ollama from the Use this model dropdown as shown below.

    Accessing a GGUF model from the Hugging Face Hub via Ollama | Image by Author

    We're almost there. In JupyterLab, open the Jupyter AI chat interface from the sidebar. At the top of the chat panel, or in its settings (gear icon), there is a dropdown or field to select the model provider and model ID. Choose Ollama as the provider, and enter the model name exactly as shown by ollama list in the terminal (e.g. qwen2.5-coder:1.5b). Jupyter AI will connect to the local Ollama server and load that model for queries. No API keys are needed since everything runs locally.

    • Set the language model, embedding model and inline completion models based on the models of your choice.
    • Save the settings and return to the chat interface.
    Configure Jupyter AI with Ollama | Image by Author

    This configuration links Jupyter AI to the locally running model via Ollama. While inline completions should be enabled by this process, if that doesn't happen you can do it manually by clicking the Jupyternaut icon, located in the bottom bar of the JupyterLab interface to the left of the mode indicator (e.g., Mode: Command). This opens a dropdown menu where you can select Enable completions by Jupyternaut to activate the feature.

    Enabling code completions in the notebook | Image by Author

    Once set up, you can use the AI coding assistant for various tasks like code autocompletion, debugging help, and generating new code from scratch. Note that you can interact with the assistant either via the chat sidebar or directly in notebook cells using %%ai magic commands. Let's look at both ways.

    Coding assistant via the chat interface

    This is quite straightforward. You can simply chat with the model to perform an action. For instance, here is how we can ask the model to explain the error in the code and then fix it by selecting code in the notebook.

    Debugging assistance example using Jupyter AI via chat | Image by Author

    You can also ask the AI to generate code for a task from scratch, just by describing what you need in natural language. Here is a Python function that returns all prime numbers up to a given positive integer N, generated by Jupyternaut.
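The generated function in the screenshot looked roughly like the following. This is a representative reconstruction of that kind of output (a sieve of Eratosthenes), not Jupyternaut's verbatim response:

```python
def primes_up_to(n: int) -> list[int]:
    """Return all prime numbers up to and including n."""
    if n < 2:
        return []
    sieve = [True] * (n + 1)
    sieve[0] = sieve[1] = False
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            for j in range(i * i, n + 1, i):
                sieve[j] = False
    return [i for i, is_prime in enumerate(sieve) if is_prime]

print(primes_up_to(20))  # [2, 3, 5, 7, 11, 13, 17, 19]
```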

    Generating new code from prompts using Jupyter AI via chat | Image by Author

    Coding assistant via a notebook cell or IPython shell:

    You can also interact with models directly inside a Jupyter notebook. First, load the IPython extension:

    %load_ext jupyter_ai_magics

    Now you can use the %%ai cell magic to interact with your chosen language model using a specified prompt. Let's replicate the example above, but this time within notebook cells.
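For example, a cell that sends a prompt to the local model could look like this. The model ID follows Jupyter AI's provider:model format and must match an entry from ollama list:

```
%%ai ollama:qwen2.5-coder:1.5b
Write a Python function that returns all prime numbers up to a given positive integer N
```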

    Generating new code from prompts using Jupyter AI in the notebook | Image by Author

    For more details and options, refer to the official documentation.

    As you can gauge from this article, Jupyter AI makes it easy to set up a coding assistant, provided you have the right installations and setup in place. I used a relatively small model, but you can choose from a variety of models supported by Ollama or Hugging Face. The key advantage is that a local model offers significant benefits: it enhances privacy, reduces latency, and reduces dependence on proprietary model providers. However, running large models locally with Ollama can be resource-intensive, so make sure you have sufficient RAM. With the rapid pace at which open-source models are improving, you can achieve comparable performance even with these alternatives.


