
ShadowFox

Task 1

🏠 Boston House Price Prediction App

An interactive web app built with Streamlit that predicts housing prices in Boston using a Random Forest Regressor trained on features of the Boston housing dataset.

🔗 Live Demo: https://boston-price-prediction.streamlit.app/

🚀 Features

  • 📥 User-friendly input interface for housing features
  • 🧠 Trained machine learning model (Random Forest)
  • 🔄 Handles missing values via mean imputation
  • 📈 Real-time predictions with performance metrics
  • ☁️ Hosted on Streamlit Cloud

📊 Model Info

  • Algorithm: Random Forest Regressor

  • Train/Test Split: 80/20

  • Evaluation Metrics:

    • Mean Squared Error (MSE)
    • R² Score
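The pipeline above can be sketched as follows. Synthetic data stands in for the Boston housing features (the dataset itself is not bundled here), but the steps match the list: mean imputation, an 80/20 split, a Random Forest Regressor, and the two metrics.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.impute import SimpleImputer
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
X = rng.normal(size=(500, 13))          # 13 features, like the Boston dataset
X[rng.random(X.shape) < 0.05] = np.nan  # inject some missing values
y = np.nansum(X, axis=1) + rng.normal(scale=0.5, size=500)

X = SimpleImputer(strategy="mean").fit_transform(X)  # mean imputation
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)            # 80/20 split

model = RandomForestRegressor(n_estimators=100, random_state=42)
model.fit(X_train, y_train)
pred = model.predict(X_test)

print("MSE:", mean_squared_error(y_test, pred))
print("R2 :", r2_score(y_test, pred))
```

The hyperparameters shown (100 trees, fixed seed) are scikit-learn defaults, not necessarily the app's tuned values.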

🧰 Technologies Used

  • Python 3.11
  • Streamlit
  • pandas, numpy
  • scikit-learn
  • matplotlib, seaborn

Task 2

🏦 Loan Approval Prediction

A machine learning model that predicts loan approval from applicant details using XGBoost. The project covers data preprocessing, feature engineering, model training, and evaluation, with several visualizations to help understand model behavior.

📊 Features

  • 🧠 XGBoost Classifier with hyperparameter tuning
  • 🔄 Handles missing values, label encoding, and feature scaling
  • 📉 Generates:
    • Confusion Matrix
    • Feature Importance
    • ROC Curve
    • Precision-Recall Curve
    • Learning Curve
  • ✅ Evaluates model using cross-validation

📊 Model Info

  • Algorithm: XGBoost Classifier
  • Train/Test Split: 80/20
  • Cross-validation: StratifiedKFold (5-fold)

🧰 Technologies Used

  • Python 3.11
  • pandas, numpy
  • matplotlib, seaborn
  • scikit-learn
  • xgboost

Task 3

Language Model Analysis with GPT-2

An in-depth exploratory project analyzing the behavior and capabilities of the GPT-2 language model using the Hugging Face transformers library, with a focus on how GPT-2 performs in text generation, context understanding, and creativity.

✨ Features

  • 🤖 Uses pre-trained GPT-2 from Hugging Face
  • ✍️ Generates text for diverse prompts
  • 📋 Manual rubric-based evaluation of:
    • Context retention
    • Creativity
    • Coherence
  • 📊 Visualizes:
    • Word clouds of generated text
    • Coherence vs. length graphs
    • Response length by prompt
  • 🧠 Includes research questions and future work
  • 🔍 Bias and safety checks discussed
  • ✅ Follows best practices for reproducibility and ethics

🔬 Model Info

  • Model: GPT-2 (transformers pipeline)
  • Task: Text Generation
  • Evaluation: Manual rubric + visual analysis
  • Prompts Tested: 5 inputs across narrative, scientific, and lifestyle contexts
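The setup above amounts to loading the transformers text-generation pipeline with pre-trained GPT-2. A minimal sketch, with placeholder prompts standing in for the five actually evaluated:

```python
from transformers import pipeline, set_seed

set_seed(42)  # fixed seed for reproducibility
generator = pipeline("text-generation", model="gpt2")

# Illustrative prompts; the project tested 5 across narrative,
# scientific, and lifestyle contexts
prompts = ["Once upon a time", "The theory of relativity states"]
for prompt in prompts:
    out = generator(prompt, max_new_tokens=30, num_return_sequences=1)
    print(out[0]["generated_text"])
```

Each pipeline call returns a list of dicts whose `generated_text` field includes the prompt followed by the model's continuation, which the project then scores manually against the rubric.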

🧰 Technologies Used

  • Python 3.11
  • transformers (Hugging Face)
  • pandas, numpy
  • matplotlib, seaborn
