
    Anomaly Detection in Sequential Data using LSTM Autoencoder and KMeans Clustering (unsupervised) | by Falonne KPAMEGAN | Jun, 2025

    By Team_AIBS News · June 21, 2025 · 2 Mins Read


    Let’s visualize our data:

    import matplotlib.pyplot as plt
    import seaborn as sns
    %matplotlib inline

    plt.figure(figsize=(12, 6))
    sns.lineplot(x=df.index, y=df['value'])
    plt.show()

    sns.histplot(df['value'], bins=100, kde=True)

    After verifying the data are properly formatted, we normalize the time series using MinMaxScaler and generate overlapping windows of fixed length (SEQ_LENGTH) to feed into the LSTM.

    import numpy as np
    from sklearn.preprocessing import MinMaxScaler

    scaler = MinMaxScaler()
    scaled_data = scaler.fit_transform(df[['value']])

    def create_sequences(data, seq_length):
        X = []
        for i in range(len(data) - seq_length):
            X.append(data[i:i + seq_length])
        return np.array(X)

    SEQ_LENGTH = 50
    X = create_sequences(scaled_data, SEQ_LENGTH)

    from tensorflow.keras.layers import Input, LSTM, RepeatVector, TimeDistributed, Dense
    from tensorflow.keras.models import Model

    timesteps = X.shape[1]
    input_dim = X.shape[2]

    inputs = Input(shape=(timesteps, input_dim))
    encoded = LSTM(64, activation='relu', return_sequences=False, name="encoder")(inputs)
    decoded = RepeatVector(timesteps)(encoded)
    decoded = LSTM(64, activation='relu', return_sequences=True)(decoded)
    # Project each timestep back to the original feature dimension
    decoded = TimeDistributed(Dense(input_dim))(decoded)

    autoencoder = Model(inputs, decoded)
    autoencoder.compile(optimizer='adam', loss='mse')
    autoencoder.fit(X, X, epochs=50, batch_size=64, validation_split=0.1, shuffle=True)

    To access the latent representation, we define a separate encoder model:

    encoder_model = Model(inputs, encoded)
    latent_vectors = encoder_model.predict(X, verbose=1, batch_size=32)  # shape = (num_samples, 64)

    Rather than thresholding reconstruction errors, we apply KMeans clustering to the compressed latent vectors:

    from sklearn.cluster import KMeans

    kmeans = KMeans(n_clusters=2, random_state=42)
    labels = kmeans.fit_predict(latent_vectors)

    # We assume the larger cluster is "normal"
    normal_cluster = np.bincount(labels).argmax()
    anomaly_mask = labels != normal_cluster
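    For comparison, the reconstruction-error thresholding that this approach sets aside would look roughly like the sketch below. It is a hedged illustration, not code from the article: the arrays stand in for the windowed input and the autoencoder's output, and the 95th-percentile cutoff is an arbitrary assumption.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Stand-ins for the article's arrays: X would be the windowed input and
    # `reconstructions` the autoencoder's output; here we fake both with noise.
    X_demo = rng.normal(size=(200, 50, 1))
    reconstructions = X_demo + rng.normal(scale=0.1, size=X_demo.shape)
    reconstructions[:5] += 2.0  # corrupt a few windows so they reconstruct badly

    errors = np.mean((X_demo - reconstructions) ** 2, axis=(1, 2))  # one MSE per window
    threshold = np.percentile(errors, 95)  # the 95% cutoff is an arbitrary choice
    error_anomaly_mask = errors > threshold

    print(error_anomaly_mask.sum())  # roughly 5% of windows flagged
    ```

    The drawback this article avoids is visible here: the percentile must be picked by hand, whereas the clustering approach lets the data decide which group is "normal".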

    We visualize the latent space with PCA:

    from sklearn.decomposition import PCA

    pca = PCA(n_components=2)
    latent_pca = pca.fit_transform(latent_vectors)

    plt.figure(figsize=(10, 6))
    sns.scatterplot(x=latent_pca[:, 0], y=latent_pca[:, 1], hue=labels, palette='Set1', s=50, alpha=0.7)
    plt.title("KMeans clusters in latent space (PCA 2D)")
    plt.xlabel("Principal component 1")
    plt.ylabel("Principal component 2")
    plt.grid(True)
    plt.show()

    timestamps = df.index[SEQ_LENGTH:]

    plt.figure(figsize=(15, 5))
    plt.plot(timestamps, df['value'][SEQ_LENGTH:], label='Value')
    plt.scatter(timestamps[anomaly_mask], df['value'][SEQ_LENGTH:][anomaly_mask], color='red', label='Detected Anomalies')
    plt.legend()
    plt.title("Anomalies Detected via KMeans on LSTM Latent Space")
    plt.show()

    By combining the sequence-modeling capabilities of LSTM autoencoders with the unsupervised grouping of KMeans, we were able to detect anomalies in time series data effectively, even without labeled anomalies.

    This approach is powerful because:

    • It doesn’t require labeled training data.
    • It adapts to complex sequential patterns.
    • It allows latent-space exploration for clustering and visualization.
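    The "larger cluster is normal" logic at the heart of the approach can be replayed on synthetic data as a minimal, self-contained sanity check (the fabricated 64-dimensional vectors below stand in for the encoder's latent output):

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(42)

    # Fake 64-d latent vectors: 190 "normal" points in one blob, 10 outliers in another.
    normal = rng.normal(loc=0.0, scale=0.5, size=(190, 64))
    outliers = rng.normal(loc=5.0, scale=0.5, size=(10, 64))
    latent = np.vstack([normal, outliers])

    kmeans = KMeans(n_clusters=2, random_state=42, n_init=10)
    labels = kmeans.fit_predict(latent)

    # The larger cluster is assumed "normal"; everything else is flagged.
    normal_cluster = np.bincount(labels).argmax()
    anomaly_mask = labels != normal_cluster

    print(anomaly_mask.sum())  # the 10 outlier rows are flagged
    ```

    Note that this majority-cluster assumption only holds when anomalies are rare; if anomalous windows ever outnumber normal ones, the labels flip.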

    Thanks for reading, I hope it’s useful!


