Technologies and systems I use to design, deploy, and scale intelligent infrastructure.
Core Engineering
Python
Primary language for building scalable ML systems, APIs, and automation pipelines.
PostgreSQL
Designing and querying production-grade relational data systems.
MySQL
Relational storage and querying for structured data and analytics workflows.
Git
Version control for collaborative development and experiment tracking.
GitHub
Code hosting, CI workflows, and open-source contributions.
FastAPI
High-performance APIs for serving ML models and AI systems.
PyTorch
Deep learning, graph models, and custom neural architectures.
Scikit-learn
Classical ML pipelines and evaluation frameworks.
LangChain
Building LLM-powered pipelines and RAG workflows.
LangGraph
Designing stateful, agentic AI systems with tool orchestration.
OpenAI APIs
Integrating LLMs for reasoning, generation, and automation.
Anthropic (Claude)
Integrating Claude models into agentic workflows and reasoning tasks.
Pandas
Data manipulation, preprocessing, and analysis.
NumPy
Efficient numerical computation and array operations.
Plotly
Interactive dashboards and web-based visualizations.
Apache Spark
Distributed data processing and large-scale analytics.
Apache Kafka
Real-time data streaming and event pipelines.
Google Cloud Platform (GCP)
Cloud-native ML pipelines and infrastructure.
AWS
Compute, storage, and ML experimentation.
Google Dataproc
Managed Spark clusters for big data workflows.
Docker
Containerizing applications and ML pipelines.
Kubernetes
Orchestrating scalable, production deployments.
Apache Airflow
Scheduling and managing data/ML workflows.
CI/CD (GitHub Actions)
Automating testing, deployment, and pipelines.
Linux (WSL/Arch/Ubuntu)
Development environment for ML and backend systems.
Jupyter
Experimentation and iterative model development.
Anaconda
Environment management for ML workflows.