Bridging Software Engineering with Data, AI, ML, and Cloud Innovation
I build robust enterprise software and integrate scalable Data & AI solutions. My focus is on engineering reliable technology that solves complex business problems and drives real-world transformation.
I am an engineer dedicated to building scalable, secure enterprise applications. By combining strong software engineering foundations with expertise in Data Engineering and Applied AI/ML, I deliver high-performance solutions across the healthcare, telecom, and corporate sectors.
I build resilient, full-stack systems using cloud-native technologies. Whether deploying microservices on AWS or managing infrastructure as code with Terraform, I ensure applications are production-ready, highly available, and secure.
Advanced systems depend on reliable data. I design efficient data backbones, from batch processing to real-time streaming, building the pipelines that clean, aggregate, and deliver the high-quality data necessary for analytics and intelligence.
I operationalize intelligent systems at scale. My expertise spans the full lifecycle: deploying classical Machine Learning models for predictive analytics and integrating Generative AI for advanced capabilities. I focus on MLOps, model serving, and automation to ensure these systems are reliable, versioned, and production-ready.
Designing, Building & Scaling AI, ML, Data & Software Systems
Academic Background & Learning Journey
Master of Science, Data Science
Feb 2024 – Dec 2025
Grade: 3.6/4
Core Skills: Generative AI, Data Science, Advanced Machine Learning, Big Data, Data Mining
Bachelor of Technology, Information Technology
Aug 2016 – May 2020
Core Skills: App Development, Software Lifecycle, Networking, DBMS, Frontend & Backend Systems
Verified credentials across AI, Cloud, Data, and Engineering
AWS Certified Solutions Architect – Associate
AWS Certified Machine Learning – Specialty
AWS Certified AI Practitioner
Certified Kubernetes Application Developer
SAS Viya for Learners Challenge Winner 2024
Microsoft PyTorch Fundamentals
Career Essentials in Generative AI
AI Agents Course
Kaggle – Pandas Essentials
Google Code Jam
Google Kick Start
Techgig Code Gladiators
Key focus areas aligned with industry innovation and career vision
Deploying scalable AI solutions on AWS using services like SageMaker, Kinesis, and Bedrock.
Fine-tuning LLMs, building RAG pipelines, and deploying adaptive AI agents for production.
Transforming data into actionable insights with Python, EDA, ML models, and statistical methods.
Building supervised and unsupervised models using PyTorch, TensorFlow, and scikit-learn.
Automating build, test, and deployment pipelines with Docker, Kubernetes, and CI/CD tools.
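As a small illustration of the supervised-modeling work described above, here is a hedged sketch using scikit-learn with its built-in iris dataset — a minimal train/evaluate loop, not production code:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Load a small sample dataset and hold out a test split
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)

# Train a supervised ensemble classifier
model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)

# Evaluate on the held-out test set
accuracy = accuracy_score(y_test, model.predict(X_test))
print(f"test accuracy: {accuracy:.2f}")
```

The same train/split/evaluate pattern carries over to PyTorch and TensorFlow models; only the model definition and training loop change.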
Technical competencies across Generative AI, data systems, and cloud engineering
Consistent learning through problem-solving and algorithmic thinking
Coding daily for one full year, showing commitment and discipline.
Recognized in global competitive rankings among LeetCode users.
Mastered a wide range of data structures and algorithms.
Earned elite badges in consistency, contests, and problem milestones.
I recently delivered a live, hands-on session for DevOps Career Hub, architecting and deploying a real-world AI assistant.
This walkthrough covers the full lifecycle: from Python code to scalable cloud infrastructure using Amazon Bedrock, Lambda, and API Gateway.
Watch Full Webinar
Have a project, question, or just want to chat? I’d love to hear from you!
+61 481 700 945
tayalarajan45@gmail.com