Right now, I’m doing small freelance projects on the side, reading advanced AI/ML books to deepen my theoretical base, and actively developing my main project: a Security Operations AI Assistant powered by RAG and LLM fine-tuning.
Developed an AI assistant for security operations using Retrieval-Augmented Generation (RAG). Integrated Whisper for speech-to-text, stored embeddings in ChromaDB, and fine-tuned LLMs for context-aware decision support. Deployed via FastAPI.
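The core of the RAG flow is embedding documents, retrieving the most similar ones for a query, and prepending them to the LLM prompt. A minimal sketch of that retrieval step is below; the `embed` function is a deliberately toy character-frequency stand-in for a real embedding model and ChromaDB's vector index, for illustration only:

```python
import numpy as np

def embed(text):
    # Toy bag-of-characters embedding (placeholder for a real embedding model).
    vec = np.zeros(26)
    for ch in text.lower():
        if ch.isalpha():
            vec[ord(ch) - ord('a')] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm > 0 else vec

def retrieve(query, documents, k=2):
    # Rank documents by cosine similarity to the query embedding.
    q = embed(query)
    scores = [float(q @ embed(d)) for d in documents]
    top = np.argsort(scores)[::-1][:k]
    return [documents[i] for i in top]

docs = [
    "Reset a user's password via the admin console.",
    "Block an IP address at the firewall after repeated failed logins.",
    "Rotate API keys quarterly.",
]
context = retrieve("failed login attempts from one IP", docs)
# The retrieved context would then be prepended to the LLM prompt.
```

In the real system, `embed` is an embedding model and the similarity search is handled by ChromaDB's index rather than a linear scan.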
Reproduced the Transformer architecture from the “Attention Is All You Need” paper in PyTorch. Implemented encoder–decoder attention from scratch, trained on benchmark data, and validated results against reported performance.
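The building block of that reimplementation is scaled dot-product attention from the paper, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. A self-contained NumPy sketch (shapes chosen arbitrarily for illustration):

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # subtract max for numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-2, -1) / np.sqrt(d_k)  # similarity of queries to keys
    weights = softmax(scores, axis=-1)              # each row sums to 1
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))   # 4 query positions, d_k = 8
K = rng.normal(size=(6, 8))   # 6 key positions
V = rng.normal(size=(6, 8))
out, w = scaled_dot_product_attention(Q, K, V)
```

In encoder–decoder attention, Q comes from the decoder and K, V from the encoder output; multi-head attention runs this in parallel over projected subspaces.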
Implemented Linear Regression in NumPy and PyTorch, coding gradient descent and backpropagation manually. Ran experiments with learning rates, Ridge regularization, and gradient-descent variants, visualizing convergence and performance on a Kaggle dataset.
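The manual training loop boils down to computing the gradient of the Ridge-regularized squared loss and stepping against it. A minimal sketch on synthetic data (the data, learning rate, and epoch count here are illustrative, not the project's actual settings):

```python
import numpy as np

def fit_ridge_gd(X, y, lr=0.1, lam=0.01, epochs=500):
    # Minimize (1/n)||Xw + b - y||^2 + lam*||w||^2 by batch gradient descent.
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        err = X @ w + b - y
        grad_w = 2 * X.T @ err / n + 2 * lam * w  # manual gradient, incl. Ridge term
        grad_b = 2 * err.mean()
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 3.0 + rng.normal(scale=0.1, size=200)
w, b = fit_ridge_gd(X, y)  # recovers true_w and the 3.0 intercept closely
```

Swapping the update rule here (momentum, per-coordinate steps) is how the GD-variant experiments slot in.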
Built a machine learning pipeline to predict telecom customer churn. Preprocessed features, trained Logistic Regression, Random Forest, and XGBoost models, and achieved 85% ROC-AUC. Identified top churn drivers and provided actionable business insights.
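The shape of that pipeline, fit a classifier on tabular features and score it with ROC-AUC, can be sketched without the real dataset or libraries. Below, synthetic data and a pure-NumPy logistic regression stand in for the telecom data and the sklearn/XGBoost models, and ROC-AUC is computed via its Mann-Whitney rank formulation:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logreg(X, y, lr=0.5, epochs=300):
    # Logistic regression trained by gradient descent on the log loss.
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = sigmoid(X @ w)
        w -= lr * X.T @ (p - y) / len(y)
    return w

def roc_auc(y_true, scores):
    # Mann-Whitney U formulation: AUC = P(score_pos > score_neg).
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = y_true == 1
    n_pos, n_neg = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Synthetic "churn" data: labels drawn from a known logistic model.
rng = np.random.default_rng(2)
X = rng.normal(size=(500, 4))
y = (sigmoid(X @ np.array([1.5, -1.0, 0.5, 0.0])) > rng.uniform(size=500)).astype(int)
w = fit_logreg(X, y)
auc = roc_auc(y, sigmoid(X @ w))
```

The learned weights also hint at feature importance, the same idea behind the churn-driver analysis, though the real project used model-specific importances.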
Implemented a neural network from scratch in NumPy with manual forward and backward propagation, trained on MNIST. Added an interactive demo with a drawing canvas and real-time confidence bar visualization for digits 0–9.
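The manual forward/backward machinery can be shown at small scale. This sketch trains a one-hidden-layer network on XOR as a stand-in for MNIST (architecture, seed, and hyperparameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

W1 = rng.normal(size=(2, 8)); b1 = np.zeros(8)   # hidden layer, 8 tanh units
W2 = rng.normal(size=(8, 1)); b2 = np.zeros(1)   # output layer

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(3000):
    # Forward pass
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    # Backward pass: gradient of mean binary cross-entropy w.r.t. the logits
    dz2 = (p - y) / len(X)
    dW2, db2 = h.T @ dz2, dz2.sum(axis=0)
    dh = dz2 @ W2.T
    dz1 = dh * (1 - h ** 2)  # tanh derivative
    dW1, db1 = X.T @ dz1, dz1.sum(axis=0)
    # Gradient descent updates
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

preds = (p > 0.5).astype(int)
```

For MNIST the same loop scales up to a 784-input layer with a 10-way softmax output; the drawing-canvas demo just feeds the rasterized digit through the trained forward pass and renders `p` as confidence bars.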