    Machine Learning

    PRACTICAL DEMONSTRATION OF LINEAR REGRESSION | by Ajuruvictor | Jan, 2025

    By Team_AIBS News | January 1, 2025


    Scatter plot of the data:

    # Scatter plot of Height vs Weight
    plot(weight_height$Height, weight_height$Weight,
         main = "Height vs Weight",
         xlab = "Height (inches)", ylab = "Weight (pounds)",
         col = rgb(0.2, 0.4, 0.6, 0.5), pch = 16)

    REGRESSION ANALYSIS:

    # Regression Analysis
    # Simple Linear Regression: Weight ~ Height
    simple_model <- lm(Weight ~ Height, data = weight_height)
    cat("\nSimple Linear Regression Summary:\n")
    ## 
    ## Simple Linear Regression Summary:
    summary(simple_model)
    ## 
    ## Call:
    ## lm(formula = Weight ~ Height, data = weight_height)
    ##
    ## Residuals:
    ##     Min      1Q  Median      3Q     Max 
    ## -51.934  -8.236  -0.119   8.260  46.844 
    ##
    ## Coefficients:
    ##               Estimate Std. Error t value Pr(>|t|)    
    ## (Intercept) -350.73719    2.11149  -166.1   <2e-16 ***
    ## Height         7.71729    0.03176   243.0   <2e-16 ***
    ## ---
    ## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
    ##
    ## Residual standard error: 12.22 on 9998 degrees of freedom
    ## Multiple R-squared:  0.8552, Adjusted R-squared:  0.8552
    ## F-statistic: 5.904e+04 on 1 and 9998 DF,  p-value: < 2.2e-16

    REGRESSION ANALYSIS 2 (MULTIPLE REGRESSION):

    # Multiple Linear Regression: Weight ~ Height + Gender
    multiple_model <- lm(Weight ~ Height + Gender, data = weight_height)
    cat("\nMultiple Linear Regression Summary:\n")
    ## 
    ## Multiple Linear Regression Summary:
    summary(multiple_model)
    ## 
    ## Call:
    ## lm(formula = Weight ~ Height + Gender, data = weight_height)
    ##
    ## Residuals:
    ##     Min      1Q  Median      3Q     Max 
    ## -44.167  -6.786  -0.118   6.800  35.850 
    ##
    ## Coefficients:
    ##               Estimate Std. Error t value Pr(>|t|)    
    ## (Intercept) -244.92350    2.29862 -106.55   <2e-16 ***
    ## Height         5.97694    0.03601  165.97   <2e-16 ***
    ## GenderMale    19.37771    0.27710   69.93   <2e-16 ***
    ## ---
    ## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
    ##
    ## Residual standard error: 10.01 on 9997 degrees of freedom
    ## Multiple R-squared:  0.9027, Adjusted R-squared:  0.9027
    ## F-statistic: 4.64e+04 on 2 and 9997 DF,  p-value: < 2.2e-16
    # Diagnostic Plots for Regression
    par(mfrow = c(2, 2))
    plot(simple_model, which = 1:4)

    The dataset consisted of 10,000 observations, including the variables Gender, Height, and Weight. The mean height was M = 66.37 inches (SD = 3.85), and the mean weight was M = 161.44 pounds (SD = 32.11). Gender distribution was as follows: Male (5,000) and Female (5,000).
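
    A minimal sketch of how these descriptive statistics could be reproduced, reusing the weight_height data frame from the code above:

    # Descriptive statistics for the weight_height data frame
    nrow(weight_height)                                     # 10,000 observations
    mean(weight_height$Height); sd(weight_height$Height)    # mean height ~66.37 in, SD ~3.85
    mean(weight_height$Weight); sd(weight_height$Weight)    # mean weight ~161.44 lb, SD ~32.11
    table(weight_height$Gender)                             # 5,000 Female, 5,000 Male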

    Boxplots revealed that males tended to have higher average heights and weights compared to females. A scatterplot of height against weight suggested a positive linear relationship between these variables.
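
    A quick sketch of those boxplots, assuming the same weight_height data frame (the scatterplot code appears earlier in the post):

    # Boxplots of height and weight by gender
    par(mfrow = c(1, 2))
    boxplot(Height ~ Gender, data = weight_height, main = "Height by Gender", ylab = "Height (inches)")
    boxplot(Weight ~ Gender, data = weight_height, main = "Weight by Gender", ylab = "Weight (pounds)")
    par(mfrow = c(1, 1))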

    A simple linear regression was carried out to examine whether height predicts weight. The regression equation was:

    Weight = b₀ + b₁ · Height

    The model was statistically significant, F(1, 9998) = 59,040, p < .001, and explained 85.5% of the variance in weight (R² = .855). Height was a significant predictor of weight (b = 7.717, p < .001), indicating that for every one-inch increase in height, weight is predicted to increase by about 7.72 pounds.
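
    As a quick sanity check on the fitted equation (a sketch, reusing simple_model from above): at the mean height of 66.37 inches, the predicted weight is roughly -350.737 + 7.717 × 66.37 ≈ 161.5 pounds, which sits right at the sample mean weight.

    # Predicted weight at the mean height, using the fitted simple model
    predict(simple_model, newdata = data.frame(Height = 66.37))
    # about 161.5 pounds, from -350.737 + 7.717 * 66.37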

    A multiple linear regression was carried out to predict weight using height and gender. The regression equation was:

    Weight = b₀ + b₁ · Height + b₂ · Gender

    The model was statistically significant, F(2, 9997) = 46,400, p < .001, explaining 90.3% of the variance in weight (R² = .903). Both height (b = 5.977, p < .001) and gender (b = 19.378 for males, p < .001) were significant predictors of weight. The positive coefficient on the male indicator shows that, controlling for height, males tend to weigh about 19.4 pounds more than females.
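
    To make the gender effect concrete, here is a sketch predicting weight for a male and a female at the same height of 68 inches (that height is an illustrative assumption); the two predictions differ by the GenderMale coefficient, roughly 19.4 pounds:

    # Predicted weights at the same height for each gender; the gap equals the GenderMale coefficient
    predict(multiple_model,
            newdata = data.frame(Height = c(68, 68), Gender = c("Male", "Female")))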

    The analyses confirmed that both height and gender significantly predict weight. The simple linear model demonstrated that height alone accounts for a substantial proportion of the variance in weight. The multiple regression model highlighted the additional explanatory power gained by including gender, underscoring its importance as a predictor.
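
    That additional explanatory power can also be tested formally with a nested-model comparison (a sketch, reusing the two fitted models):

    # F-test comparing the simple and multiple models; a significant result means
    # adding Gender improves the fit beyond Height alone
    anova(simple_model, multiple_model)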

    These findings suggest that height and gender are important factors in determining weight and can be used for predictive purposes in health and anthropometric studies. Further research might explore additional variables or consider non-linear relationships for improved modeling.



