
Shyam Reddy

Atlanta

Summary

AWS Solutions Architect with over 12 years of experience in cloud architecture, security enforcement, big data, data access control, data stewardship, and software development. Expertise in designing scalable, high-performance systems and driving large-scale cloud migration projects. Proficient in Python, Spark, EMR, EKS, DynamoDB, PostgreSQL, Kinesis, SQS, S3, and Airflow. Adept at building data pipelines, automating workflows, and collaborating with cross-functional teams to deliver innovative and cost-efficient solutions.

Overview

13 years of professional experience

Work History

Solutions Architect

Deloitte Consulting
01.2019 - Current
  • Architected and implemented high-performance, scalable AWS cloud solutions for large-scale migration projects.
  • Architected enterprise-level systems, managing change and integration with external vendors.
  • Implemented organization-wide security policies to govern data movement between on-premises and cloud infrastructure.
  • Worked with engineers and development teams to enforce compliance with security frameworks including NIST, FedRAMP, HIPAA, and PCI DSS.
  • Enforced version-upgrade policies to remediate known vulnerabilities.
  • Built and managed data pipelines using Spark, Python, and AWS services like EMR, Kinesis, and S3 for batch and real-time processing.
  • Streamlined organizational cloud policies and architected reusable cloud modules.
  • Automated cloud infrastructure provisioning and management with Terraform, AWS CloudFormation, and Ansible.
  • Designed and maintained EKS clusters, utilizing Kubernetes manifests for containerized application deployments.
  • Established CI/CD pipelines with tools such as Jenkins, ArgoCD, and GitHub Actions, reducing deployment cycles by 30%.
  • Implemented infrastructure monitoring and logging solutions with Prometheus, Grafana, CloudWatch, and Splunk, ensuring high availability and performance.
  • Deployed serverless architectures using AWS Lambda, DynamoDB, and API Gateway for scalable and cost-effective solutions.
  • Configured AWS services like Redshift, DMS, and Qlik Attunity for seamless data replication and transformation.
  • Integrated advanced security practices into CI/CD workflows, achieving compliance with industry standards.
  • Mentored a team of 15 engineers, overseeing end-to-end project execution and fostering technical excellence.
  • Developed and optimized data workflows with Apache Airflow and Oozie for efficient task scheduling and automation.
  • Configured and managed AWS infrastructure, including EMR, EKS, SQS, DynamoDB, and Fargate, for scalable analytics solutions.
  • Designed real-time data streaming architectures using AWS Kinesis, Spark Streaming, and DynamoDB for actionable insights.
  • Automated VM provisioning and OS patching with Ansible and Terraform, enhancing infrastructure reliability and consistency.
  • Configured Splunk and CloudWatch for comprehensive system monitoring and alerting, reducing incident response times.
  • Implemented Blue-Green deployment strategies, improving system availability during application rollouts.

Application Architect

PNC Bank
09.2018 - 01.2019
  • Collaborated with product owners, business analysts, developers, and scrum masters to deliver valuable solutions to clients.
  • Responsible for deploying and operating highly available, scalable, and fault-tolerant systems in AWS.
  • Optimized AWS infrastructure costs by resizing and deleting orphaned resources using the Turbonomic tool.
  • Responsible for writing Terraform scripts to create and automate the infrastructure in both VMware and AWS cloud environments.
  • Worked on the Workload Qualification Framework (WQF) assessment tool to identify resources for migrating on-premises applications to AWS.
  • Worked on Terraform OSS and Terraform Cloud for developing and sharing configuration files.
  • Worked on automating software and dependency package installation from source, following the configure, make, and make install steps with shell scripts.
  • Responsible for creating a Redshift data warehouse to store the data and providing endpoints to the corresponding team.
  • Worked on Terraform implementation for provisioning infrastructure in Dev and UAT environments.
  • Responsible for writing Ansible playbooks for deploying dependencies.
  • Completed a POC on Terraform to demonstrate the feasibility of implementing automated CI/CD on the BMS-DevOps platform.
  • Worked on migrating Node.js applications into the Elastic Beanstalk service in AWS.
  • Responsible for creating golden images in the VMware environment using vRA and vRO.
  • Responsible for maintaining Ansible roles to automate repetitive tasks.
  • Wrote shell scripts for installing required packages on Windows operating systems.
  • Worked on release management and deployment tools such as AWS CodePipeline and AWS CodeDeploy.
  • Created static Auto Scaling groups with scheduled actions and a Classic Load Balancer, sized to each application's workload, for high availability.
  • Created monitors, alarms, and notifications for EC2 hosts using CloudWatch.
  • Responsible for automating and supporting various applications using AWS services including EC2, IAM Surrogator, S3 copier, SNS, CloudWatch, Auto Scaling, and Classic Load Balancers.
  • Created infrastructure through CloudFormation templates based on project requirements and deployed through Auto ptp files.
  • Implemented security controls and compliance measures within AWS CDK infrastructure deployments.
  • Integrated monitoring and logging solutions into AWS CDK infrastructure deployments to track resource usage, performance metrics, and operational logs.
  • Designed AWS CloudFormation templates to create custom-sized VPCs, subnets, and NAT gateways, ensuring successful deployment of web applications and database templates.
  • Worked on provisioning infrastructure for Elastic Beanstalk using CloudFormation templates and deployed a Java application to the Elastic Beanstalk environment using the Tomcat-Java solution stack.
  • Worked on CloudFormation templates for deploying AWS Service Catalog kits in UAT and Prod environments.
  • Worked on AWS Service Catalog kits to deploy resources in Lab and Dev environments.
  • Worked on AWS EMR and Glue jobs for transferring data between various AWS services.
  • Responsible for providing support for ServiceNow requests and incidents assigned to the team.

Big Data Admin / DevOps SRE

Ericsson
01.2016 - 09.2018
  • Responsible for writing Terraform scripts to automate GitLab CI/CD pipelines for creating infrastructure such as SageMaker notebook instances, Lambda functions, Athena, Glue jobs, and S3 buckets.
  • Wrote pytest cases and integration tests for Lambda functions containing the Python code that creates SageMaker notebooks.
  • Worked on migrating users from on-prem Jupyter notebooks to SageMaker notebooks and SageMaker Studio in AWS.
  • Monitored resource usage in Datadog dashboards and took remediation steps for alerts.
  • Created Datadog dashboards for resources across multiple environments.
  • Architected the end-to-end flow for EMR connectivity to the Tecton feature store for claims use cases.
  • Responsible for deploying production models to SageMaker endpoints.
  • Responsible for writing Python Boto3 code for integration tests in our GitLab CI/CD pipelines.
  • Worked on GitLab YAML files to automate stages such as build, pytest, Terrascan, and Niqs scans, and deployment to various environments.
  • Onboarded customers onto the platform and helped them train, validate, and deploy their models.
  • Worked on BYOC (bring-your-own-container) templates so data scientists could easily build, train locally, and deploy their models to batch transform jobs and SageMaker endpoints.
  • Worked on data lifecycle automation using Step Functions, Lambda functions, and Python.
  • Implemented a GitOps strategy for pipelines to automate branching strategies.
  • Created Terraform Enterprise workspaces and used Vault to store variables and pass them to pipelines.
  • Architected infrastructure with the various services used by data scientists.
  • Wrote BYOC templates for containerizing pickle files and deploying them to SageMaker endpoints.
  • Worked on identifying model drift in the production environment.
  • Implemented a GitOps strategy to deploy source code to the production environment.
  • Created federated IAM role policies restricting teams to their own infrastructure resources.
  • Worked on MLflow pipelines to automate machine learning model deployments for multiple teams.
  • Responsible for resolving tenant issues for data scientists, data engineers, and MLEs.
  • Productionized pipelines to deploy resources to the production environment using the GitLab CI/CD Kubernetes runner.
  • Moved data from on-premises systems to the data lake using various AWS services.
  • Automated GitLab CI/CD pipelines to provision infrastructure and deploy pickle files across environments such as Research, Test, and Production.
  • Responsible for debugging pipelines and improving CI/CD automation.

Software Engineer

Buildium
01.2012 - 08.2015
  • Actively involved in analysis, detailed design, development, system testing, and user acceptance testing.
  • Developed an intranet web application on J2EE architecture, using JSP to design user interfaces, JSP tag libraries to define custom tags, and JDBC for database connectivity.
  • Implemented the Struts (MVC) framework: developed the ActionServlet and ActionForm beans, configured the struts-config descriptor, and implemented the Validator framework.
  • Extensively involved in database design with Oracle Database and in building the application on J2EE architecture.
  • Integrated messaging with MQSeries classes for JMS, which provide an XML message-based interface; the application uses the JMS publish-and-subscribe model.
  • Developed an EJB session bean that acts as a facade, accessing business entities through their local home interfaces.
  • Evaluated and worked with the EJB Container-Managed Persistence strategy.
  • Used web services (WSDL and SOAP) to retrieve loan information from a third party, and used SAX and DOM XML parsers for data retrieval.
  • Experienced in writing DTDs for document-exchange XML; generated, parsed, and displayed XML in various formats using XSLT and CSS.
  • Used XPath 1.0 for selecting nodes and XQuery to extract and manipulate data from XML documents.
  • Coded, tested, and deployed the web application using RAD 7.0 and WebSphere Application Server 6.0.
  • Used JavaScript for validating client-side data.
  • Wrote unit tests for the implemented bean code using JUnit.
  • Worked extensively in a UNIX environment.
  • Exchanged data in XML format, aiding interoperability with other software applications.

Education

Master of Science - Computer Science

University of Illinois
Springfield, IL
01.2016

Bachelor of Technology - Information Technology

Jawaharlal Nehru Technological University
Hyderabad, India
01.2012

Skills

  • Cloud Platforms
  • Security Frameworks
  • Big Data Technologies
  • Programming
  • Workflow Orchestration
  • Infrastructure-as-Code
  • Database Management
  • CI/CD & DevOps Tools
  • Monitoring Tools
  • Containerization
  • Version Control
  • Cloud Architecture
  • AWS Services
  • Data Integration
  • Infrastructure Automation
  • Data Processing
  • Security Compliance
  • Scalable Systems
  • Stakeholder Management

Core Competencies

AWS (EMR, EKS, S3, DynamoDB, Kinesis, SQS, Lambda, Fargate, DMS, Redshift, SageMaker, SageMaker Studio, SageMaker JumpStart, Bedrock, Data Wrangler, Athena, Glue, Step Functions, EC2, IAM, ECS, CloudWatch, Comprehend), Cloud Controls Matrix (CCM), CSA STAR, SOC 2, Hadoop, Spark, Kafka, Hive, Sqoop, Python, Java, Node.js, Scala, Shell Scripting, SQL, HiveQL, Apache Airflow, Oozie, Terraform, AWS CloudFormation, PostgreSQL, MongoDB, Jenkins, ArgoCD, GitHub Actions, Ansible, Puppet, Prometheus, Grafana, Splunk, Dynatrace, PagerDuty, Docker, Kubernetes (EKS), GitHub, Bitbucket, Git
