Venkata Rajesh Guttikonda

Atlanta

Summary

With over 9 years of dedicated expertise in Data Warehousing and Data Provisioning, I have excelled as a seasoned Data Management Analyst. My track record includes delivering substantial cost savings, optimizing execution timelines, and maintaining a 100% data security record for diverse clients across their Test Data Management operations. I have demonstrated proficiency across all stages of the software development life cycle (SDLC), from design and development through deployment. I currently serve as a TDM/MDM/ETL Developer, contributing to innovative solutions and driving organizational success.

Test Data Management and ETL:

  • Extensive experience as a TDM lead, developer, and administrator, focusing on data provisioning techniques such as Data Migration, Data Masking, Data Virtualization, Data Cloning, Data Generation, and Data Subsetting using various TDM tools, including Informatica ILM TDM, IBM Optim TDM, Delphix TDM, CA TDM, and Informatica Cloud TDM, as well as ETL tools such as Informatica PowerCenter and SSIS.
  • Deep understanding of Test Data Automation, testing, and Test-Driven Development; played a pivotal role in establishing a self-service test data management flow within the organization.
  • Experienced in Data Profiling, Data Cleansing, and Data Standardization using Informatica PowerCenter and UNIX shell scripts, leveraging reference data/golden copies for optimal outcomes.

Master Data Management:

  • Proficient in developing Master Data Management activities such as Match and Merge, Data Cleansing, and Data Governance using Informatica MDM, Informatica IDQ, and Informatica IDD.
  • Experienced in conducting an MDM Proof of Concept (POC) with the Informatica Salesforce MDM solution (Cloud Customer 360 for Salesforce).

Overview

10 years of professional experience

Work History

TDM Lead/Developer

Intercontinental Hotel Groups
01.2019 - Current
  • Analyze test data requirements and study existing applications.
  • Responsible for detailed design, development, coding, testing, deployment, implementation, and support of major data management applications that provide test data across databases such as Oracle, MS SQL Server, and Teradata.
  • Perform Data Masking, Data Generation, Data Subsetting, and Data Profiling using tools such as CA TDM, IBM Optim TDM, and Optim TDF.
  • Perform Data Virtualization and Data Cloning jobs using the Delphix TDV tool.
  • Create dSources and VDBs for various application teams using the Delphix engine with complex policies, masking the data with the IBM Optim tool.
  • Develop and enhance integrations of Selenium and Tosca automation scripts/frameworks with the CA TDM tool to build a Test Data Management portal that lets test data users generate their own test data.
  • Generate test data using various tools to support Unit, Functional, Agile, Regression, Smoke, Patch, Business Continuity, Performance, Batch Process, and Build Verification testing.
  • Perform data lineage functions to build test data generation applications and data compare functions to find discrepancies in generated test data.
  • Integrate REST APIs with data management tools such as CA Javelin to generate test data for applications without database access.
  • Resolve and escalate issues in a timely fashion and assist in support and maintenance activities.
  • Develop and enhance data load functionality to make it more robust and scalable for CI/CD and DevOps.
  • Coordinate with various teams and technical tracks to achieve overall project objectives and milestones.

MDM Developer

The Coca-Cola Company
08.2018 - 06.2019
  • Coordinate all aspects of software development life cycle, including documentation, user interface (UI), workflow, development, testing, and deployment, requiring knowledge and experience in translating business requirements into technical designs, coding, integration, and solutions.
  • Responsible for detailed design, development, coding, testing, deployment, implementation, and support of major business applications.
  • Work with systems engineers to determine business requirements, and assist in the design, development, and testing of solutions.
  • Define system architectures, write code, review programs, and engineer applications, leveraging scientific principles to achieve the technical requirements.
  • Create and build the data model for Base Objects and Landing and Stage tables based on project requirements. Define relationships among the Base Objects and lookups for the Stage tables. Define trust scores for the source systems as required for trust-enabled columns.
  • Set up and perform Match & Merge according to requirements; analyze the data before consolidation.
  • Build hierarchies for the data model to view product, organization, and client data.
  • Configure the Informatica Data Director application for basic and extended search to view data in Informatica Data 360.
  • Utilize concepts and techniques of advanced mathematics and algorithms, engineering, computer applications, computer science, computer information systems, IT, the technical sciences, and related areas.
  • Coordinate with offshore teams to identify technical challenges, build plans, and provide input on resolving those challenges.

TDM Developer

Ameriprise Financial
02.2017 - 07.2018
  • Perform data management activities such as ETL, Data Archiving, Data Purging, Test Data Management, and Data Profiling based on business requirements.
  • Involved in identifying the software model, requirements analysis, design, development, and unit testing of applications supporting data management activities.
  • Identify sensitive columns based on business requirements; design and develop business logic for de-identifying/masking sensitive columns.
  • Develop masking procedures using SQL procedures and Lua scripting to de-identify sensitive data with IBM InfoSphere Optim.
  • Develop data subsetting solutions that provide data for smaller testing environments based on business requirements while maintaining referential integrity across the database.
  • Provide production support for applications in the operational division, including effective onsite environment support.
  • Analyze the existing environment and database to identify RIM policies and compliance factors.
  • Perform data profiling on the historical database to identify relationships between tables, sensitive columns, and RIM-required columns.
  • Analyze business requirements, then design and define a data archiving strategy for historic data in non-production and production environments.
  • Develop, test, and debug data archiving, purging, and restore solutions based on RIM requirements using the IBM InfoSphere Optim tool, Oracle DB, and DB2.
  • Maintain compliance by performing purge operations on the historical database, and improve Oracle database performance by archiving historic data.
  • Performance-tune the developed archiving and masking strategies to reduce execution time.
  • Design and develop reusable strategies and code to reduce development effort and storage usage on the server.
  • Create blackout plans, then deploy and execute them in the production environment.
  • Work closely with the QA team to design and develop testing strategies and tasks, and perform testing on archive and purge solutions.

TDM Developer

Merck
01.2015 - 08.2015
  • Performed a full-time POC to research ways to reduce cost in the QE&A environment and reduce data breach risk.
  • Analyzed the client's existing test data process in the QE&A environment and proposed a better-optimized TDM solution.
  • Designed and implemented the Test Data Management strategy.
  • Installed the Informatica TDM server on the Amazon AWS cloud; acted as TDM administrator.
  • Created Informatica connections for Oracle and MySQL databases to perform TDM strategies and ETL tasks.
  • Derived physical and logical relations using Informatica Data Discovery. Configured data discovery rules to identify sensitive data such as PII and PCI across the source system data (gold copy).
  • Created OPTIM access definitions, Extract Requests, Table Mappings, and Column Mappings, and applied data masking functions to the identified sensitive columns.
  • Used Substitution, SSN, Date, Random, Advanced, and Key masking rules on string, integer, and date columns.
  • Created Insert, Convert, Compare, and Load jobs based on the masking requirements.
  • Created entities and data models to perform subsetting on the masked data.
  • Loaded data from the MySQL source to the MySQL staging database, from staging to the MySQL target database, and on to the Salesforce testing environment based on test requirements.
  • The project was successful, reducing data costs in the client's QE&A environment by 95% and eliminating data breaches entirely.
  • Worked with databases such as Oracle and MySQL and the Salesforce CRM application.

TDM Analyst

Cognizant Technology Solutions
05.2014 - 12.2014
  • Conducted POCs on Data Masking, Data Subsetting, Data Generation, and Data Profiling using tools such as Informatica ILM TDM and IBM Optim.
  • Analyzed the existing test data processes of different retail clients in QE&A environments and proposed better-optimized TDM solutions and tools as part of the Retail Business Development team.
  • Designed and proposed proofs of concept on Informatica's TDM solution for different Cognizant clients.
  • Acted as database administrator at the TDM CoE, managing Oracle, SQL Server, and MySQL databases holding large volumes of data used for POCs.
  • Transformed unstructured files such as EDI, HIPAA, and XML into structured data using the B2B Data Transformation tool; good knowledge of concepts such as PII and PCI.
  • Significant experience with Oracle, SQL Server, MySQL, and other relational database platforms.
  • Performed data migration and integration using Informatica PowerCenter (version 9.6.1).
  • Highly experienced in designing and creating mappings and workflows in PowerCenter.
  • Experienced in designing and implementing passive and active transformations.
  • Hands-on experience with IBM Optim, SSIS, and Cognizant-owned tools such as DataGen.
  • Validated migrated and masked data using Informatica's Data Validation tool.
  • Experienced in retrieving and analyzing social media data for retail clients from Twitter and Facebook using DataSift.
  • Experienced in Data Migration, Data Cleansing, Data Fencing, and Data Aging.
  • Extensive experience in the enterprise software development life cycle (SDLC), including requirements gathering, design, coding, testing, and deployment.

ETL Developer

Amex
07.2013 - 04.2014
  • Created connections for the source, target, and staging systems in Informatica PowerCenter.
  • Designed and developed mappings in PowerCenter to transform the data to the new application schema.
  • Designed and developed both active and passive transformations based on requirements.
  • Used transformations such as Joiner, Aggregator, SQL, Expression, Router, Normalizer, Filter, Lookup, and Update Strategy in mappings.
  • Loaded the data from the source system (MS SQL Server) to the staging system (Oracle) in the initial phase.
  • Designed and developed mappings for transforming and loading the data from the staging system to the target system, Oracle to Oracle.
  • Created workflows, parameterized them, and automated and scheduled the ETL process on the required time frame.
  • Successfully migrated the data from the old application on MS SQL Server to the new application on Oracle.
  • Significant experience with Oracle, SQL Server, and other relational database platforms.

Education

Master of Science - Data Science

University of North Carolina at Charlotte
Charlotte, NC
12.2016

Bachelor of Science - Computer Science

Amrita University
Coimbatore, India
05.2013

Skills

Technologies:

  • MySQL
  • Oracle
  • Microsoft SQL
  • Teradata
  • DB2
  • Salesforce
  • GCP Big Query

Languages:

  • Groovy
  • Java
  • SQL
  • HTML
  • PHP
  • Python

Timeline

TDM Lead/Developer

Intercontinental Hotel Groups
01.2019 - Current

MDM Developer

The Coca-Cola Company
08.2018 - 06.2019

TDM Developer

Ameriprise Financial
02.2017 - 07.2018

TDM Developer

Merck
01.2015 - 08.2015

TDM Analyst

Cognizant Technology Solutions
05.2014 - 12.2014

ETL Developer

Amex
07.2013 - 04.2014

Master of Science - Data Science

University of North Carolina at Charlotte

Bachelor of Science - Computer Science

Amrita University