Abhiram Kanumilli - Informatica Developer
EXPERIENCE SUMMARY
9+ years of expertise in designing, developing and implementing Data Warehousing applications using Informatica
PowerCenter 9.x/8.x/7.x, PowerExchange and Informatica Data Quality 9.1. Experience in various domains including Healthcare,
Oil and Gas, Retail, Media, Telecom and Banking.
9 years of IT experience in System Analysis, Design and Development in the field of Databases.
8+ years of Functional and Technical experience in Decision Support Systems – Data Warehousing and ETL (Extract, Transform
and Load) using Informatica PowerCenter tools.
Experience working as part of a team through the project's Software Development Life Cycle (SDLC) to analyze and gather project
requirements and to develop, test and maintain the software.
Experience in OLAP concepts, Data Warehousing concepts and Relational & Dimensional Data Modeling (Star and Snowflake
Schema).
Designed customized ETL process to manage Metadata Dependencies and Relationships.
Extensive experience in Power Center Repository Manager, Mapping Designer, Mapplet Designer, Transformation Developer,
Workflow Manager and Workflow Monitor.
Experience in finding and fixing data quality problems using tools like Informatica Analyst and Data Explorer, which enable
profiling and analysis of data along with universal connectivity to all types of data sources.
Knowledge of using PowerCenter to extract data into a Data Warehouse and load data into SAP NetWeaver.
Well acquainted with Performance Tuning of sources, targets, mappings and sessions to overcome bottlenecks in mappings
using Informatica and Database concepts.
Highly proficient in designing and developing complex mappings from varied transformation logic like Lookups, Router, Filter,
Transaction Control, Aggregator, Stored Procedure, Joiner, Update Strategy, SQL, Java, Union etc.
Involved in Parsing, Standardization, Matching, and ETL integration implementations.
Thorough understanding of Match and Merge concepts: defining Match path components and Match rules, and configuring
Match and Merge rules to optimize the Match percentage using IDQ results.
Extensively worked with ETL tools to extract data from relational and non-relational sources, i.e. Oracle, SQL Server, DB2,
and Flat Files.
Extensive experience in various database development using PL/SQL, TSQL, Stored Procedures, Triggers, Functions and
Packages.
Knowledge of preparing SSAS Cubes for data analysis using SQL Server for basic Star and Snowflake Schemas as well as parent-
child schemas.
Experience with DAC administration in executing and monitoring Data warehouse Applications.
Experience in analyzing large databases, searching for patterns and summarizing data.
Strong knowledge and experience in performance tuning both Informatica and Database.
Experience in creating Test cases for different sources like Relational sources, Flat files etc.
Experience in Upgrading, Installing and Configuring PowerCenter 8.x/7.x/6.x in Client/Server environments.
Experience with TOAD as an interface to databases, to analyze and view data.
Experience writing UNIX scripts as pre-sessions and/or post-sessions (a minimal sketch follows this summary).
Excellent analytical, problem-solving and communication skills; ability to interact with individuals at all levels.
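As an illustration of the pre-/post-session scripts noted above, the following is a minimal, hypothetical sketch (the file paths and names are placeholders, not from any actual engagement): it fails the session early if the expected source flat file is missing or empty, and archives a dated copy before the load.

    #!/bin/sh
    # Hypothetical pre-session check: abort before the session runs if the
    # expected source flat file is missing or empty. Paths are placeholders.
    SRC_FILE=/data/inbound/src_extract.dat
    ARCH_DIR=/data/archive

    if [ ! -s "$SRC_FILE" ]; then
        echo "Pre-session check failed: $SRC_FILE missing or empty" >&2
        exit 1    # a non-zero exit fails the pre-session command task
    fi

    # Keep a dated copy before the session consumes the file
    cp "$SRC_FILE" "$ARCH_DIR/src_extract.$(date +%Y%m%d).dat"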
TECHNICAL SKILLS
ETL Tools: Informatica PowerCenter 9.x/8.x/7.x/6.x, Informatica PowerExchange, DataStage 8.1, Cognos 7.x
EIM Tools: Informatica Data Quality 9.1/Informatica Data Explorer
Databases: Oracle 11g/10g/9i, SQL Server, DB2, Teradata V2R5, MySQL
Database Tools: TOAD, SQL*Loader, SQL*Plus, Oracle SQL Developer.
Operating Systems: MS Windows 2000/XP/Win7, UNIX.
Languages: C/C++, PL/SQL, UNIX/Linux Scripting, HTML, SQL, PHP, Visual Basic, JavaScript
Modeling Tools: Toad Data Modeler, Microsoft Visio, Erwin 4.x/7.2
Scheduling Tools: UC4 Operations Manager 8.x/6.x, Informatica Scheduler, Control-M, Tivoli, Crontabs
PROFESSIONAL EXPERIENCE
WellPoint is the largest Commercial Business carrier in the United States. They provide health benefit solutions to Individual, Small
Group and Large Group Business and National Accounts. They have partnered with UST Global (onshore/offshore). EPDS is an
application used by healthcare providers to add, update and query provider demographic data. It is a consolidated provider
data system for the enterprise that enables current and future provider data and network affiliation needs for all states and
serves as a centralized source of truth for provider data. EPDS is a central provider demographic repository supporting provider
validation and provider-details processing while a claim is processed in the claims processing system. EPDSv2 uses DB2 on AIX servers as
the database tier for managing Enterprise Provider Data across WellPoint. Data from various source systems is loaded on a regular
schedule to update the database. The source files are received as flat files and follow a standard format called CFF
(Common File Format). Other databases include Oracle and SQL Server.
Environment: Informatica PowerCenter 9.1, Informatica Data Quality (8.6/9.1), Informatica Data Explorer, DB2, Flat Files, Oracle 11g,
UNIX, DbVisualizer, Control-M, Windows 7.
Responsibilities:
Involved in assessing the technical and business suitability of the requirements for EPDSv2.
Involved extensively in estimates, planning, translating client’s requirements, identifying options for potential risks/data
gaps.
Coordinated with technical architects, data modelers and DBAs on ETL strategies to improve EPDS.
Worked on ETL coding using Informatica tool to extract, transform, cleanse and load data.
Developed Power Center mappings using various transformations like Aggregator, Union, Lookup, Joiner, Expression,
Router, Filter, and Update Strategy.
Involved in implementing change data capture (CDC), business validation and data profiling processes.
Worked as Tech Lead for one of the tracks loading source data into EPDSv2.
Managed an offshore team across the different tracks of the EPDS project.
Worked with IDQ to ensure accurate Address matching/cleansing and data quality for all source data types.
Defined Data Quality standardization & cleansing rules using Informatica Data Quality (IDQ) discovered from the profile
results.
Performed Data Quality checks, cleansed the incoming data feeds and profiled the source systems data as per business
rules using IDQ.
Worked with various developer modules like profiling, standardization and matching.
Designed various mappings using different transformations such as key generator, match, labeler, case converter,
standardize, parser and lookup.
Worked extensively with address validator to cleanse the address elements. Created input and output templates.
Created different types of profiles, such as column-level profiles, summary profiles and drill-downs, using IDE.
Worked on Match and Merge rules, developed address validations and built reusable error-handling rules using IDQ.
Created DQ mapplets and incorporated into PowerCenter mappings to test the standardized and cleansed data.
Handled various issues related to monitoring and performance tuning of data.
Involved in code checks as well as testing to ensure all the requirements defined in the RTM are implemented.
Used the Control-M scheduling tool to automate Informatica workflows for daily batch jobs (a wrapper sketch follows this list).
Designed and set up tasks using Informatica processes for on-call support.
Created, updated and maintained ETL technical documents.
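The Control-M jobs above typically call a small shell wrapper around pmcmd. A minimal sketch, assuming the password is supplied through an environment variable via -pv; the service, domain, folder and workflow names are placeholders:

    #!/bin/sh
    # Hypothetical Control-M wrapper: start a PowerCenter workflow and
    # propagate its result. -pv reads the password from the named
    # environment variable instead of taking it in clear text.
    pmcmd startworkflow -sv INT_SVC -d Domain_EPDS \
          -u "$INFA_USER" -pv INFA_PASSWD \
          -f EPDS_LOADS -wait wf_epds_cff_load
    rc=$?
    if [ $rc -ne 0 ]; then
        echo "wf_epds_cff_load failed (rc=$rc)" >&2
    fi
    exit $rc

With -wait, pmcmd blocks until the workflow finishes and its exit code reflects the run status, so the scheduler can mark the job failed or successful from the exit code alone.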
Devon Energy, Houston, TX August 2012 to January 2013
ETL Consultant/Informatica Developer
Enterprise Data Warehouse (EDW)
Devon Energy Corporation is among the largest U.S.-based independent producers of oil and natural gas. Headquartered in Oklahoma
City, the company's operations are focused on North American onshore exploration and production. The project involved support of
Enterprise Data Warehouse v3.7 and transforming the PPDM model from v3.7 to v3.8. It also involved migrating legacy data for
Vendors, Materials, etc. from several legacy systems to SAP 5.0. The Warehouse is built on SQL Server.
Data from various source systems is loaded on a regular schedule to update the Warehouse. The source systems include
SAP, Oracle database, SQL Server database and Flat File sources. Source data includes employees, drilling, rigs, oil/gas production and
financial data, etc.
Environment: Informatica PowerCenter 9.1, SAP ECC 6.0, SQL Server 2008, Oracle 11g, UNIX, TOAD, Windows 7.
Responsibilities:
Involved in Extraction, Transformation and Loading of data using Informatica.
Involved in ETL coding using the Informatica tool to load data from various source systems according to PPDM (Professional Petroleum
Data Management) standards.
Extracted data from SAP R/3 4.6 and loaded into SAP ECC 6.0.
Generated and Installed ABAP Program/SAP R/3 Code Using Informatica 9.1.
Loaded data into SAP ECC 6.0 using IDocs, BAPIs and Function Modules.
Developed Power Center mappings using various transformations like Stored Procedure, Union, Lookup, Joiner, Expression,
Router, Filter, and Update Strategy.
Involved in implementing change data capture (CDC) processes and data profiling.
Created reusable sessions and commands in the Workflow Manager.
Setup EDW jobs and job schedules using TIDAL scheduler.
Designed and set up tasks using the ABC process for EDW failures for on-call support.
Created, updated and maintained ETL technical documents.
Set up the complete EDW jobs folder hierarchy, naming conventions and job schedule management in UC4 Operations Manager.
Chesapeake Energy Corporation (NYSE:CHK) is the second-largest producer of natural gas, a Top 15 producer of oil and natural gas
liquids and the most active driller of new wells in the U.S. Headquartered in Oklahoma City, the company's operations are focused
on discovering and developing unconventional natural gas and oil fields onshore in the U.S. The project involved building and supporting
the Enterprise Data Warehouse. The Warehouse is built on Oracle. Data loads from various source systems are scheduled to update
the Warehouse on a regular basis. The source systems include Oracle database, SQL Server database, Flat Files, DB2, XML, and Excel
sources. Source data includes employees, budgeting, drilling, rigs, oil/gas production, geoscience, delivery systems, land-related
data, and financial data, etc. Worked in a team to design, develop, test, automate and support ETL processes loading
the Enterprise Data Warehouse using Informatica PowerCenter 9.0.1.
Environment: Informatica PowerCenter 9.x/8.x/7.x, Oracle 11g/10g, SQL Server, UC4, AIX, Windows 7/XP.
Responsibilities:
Extensively involved in Performance tuning of ETL processes in Data Warehouse.
Extensively involved in ETL coding using Informatica tool to extract, transform, cleanse and load data from various source
systems conforming to the PPDM (Professional Petroleum Data Management) standards.
Setup complete EDW jobs folder hierarchy, naming convention and job schedule management in UC4 Operations Manager.
Design, develop and setup call operators in UC4 tool on job failures for EDW on-call support.
Prepare and maintain ETL standards and common practices documents on SharePoint.
Setup EDW on-call support schedule and support the ETL code development in Production environment.
Develop schedules, events, calendars and master job plans in UC4 tool for automation of EDW processes.
Worked on implementing procedures/functions to support daily ETL process miscellaneous tasks.
Created physical data model in the Data Warehouse and designed custom ETL execution plans.
Assisted, as DAC administrator, in managing metadata-driven dependencies and relationships by capturing deleted records and
handling index management.
Involved in implementing data transformation processing for Relational database (Oracle) using Informatica push down
optimization option.
Provided error reporting and email monitoring to isolate bottlenecks as part of daily process.
Used Informatica Data Quality methods on financial, customer and asset data to profile the data, specify and
validate rules, and monitor data quality over time.
Actively involved in testing the Informatica EDW PowerCenter upgrades to versions 7, 8 and 9.
Actively involved in testing the Oracle EDW database upgrade to version 11g.
Provided expertise to Business analysts on how to prepare ETL process documents.
Prepare and maintain documents to support EDW daily UC4 job operations and yearly maintenance procedures.
Create customized metadata reporting process for Informatica Repository.
Involved in implementing change data capture (CDC) processes using triggers on tables, applying status indicators or
timestamp indicators on rows of data (a CDC sketch follows this list).
Develop Packages, Stored Procedures, and Functions using PL/SQL for the automation of database maintenance EDW
processes.
Create customized metadata reporting process for UC4 Operations Manager Repository.
Develop UNIX shell scripts to invoke Informatica jobs using the pmcmd commands.
Design and develop unit test cases for the ETL processes.
Implement Informatica partitioning to improve data load time for various EDW Mappings.
Design and develop customized data profiles using the Power Center Designer.
Developed various complex data cleansing logic using the PowerCenter Designer.
Develop PowerCenter mappings using various transformations like Stored Procedure, Custom, Transaction Control, XML,
SQL, Union, Lookup, Joiner, Expression, Router, Filter, and Update Strategy.
Develop Mapplets to be used in various other Mappings using the Power Center Designer.
Create complex mappings that involve slowly changing dimensions (Type 1, Type 2, and Type 3), target load order and
constraint based loading using the PowerCenter Designer.
Prepared/maintained documentation on aspects of ETL processes to support knowledge transfer to other team
members.
Provide training to support EDW daily ETL activities and expertise on Informatica related issues.
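For the timestamp-indicator flavor of CDC mentioned in this list, a hypothetical sketch of the supporting database object, applied via a sqlplus here-document (the connection string, table and column names are illustrative only):

    #!/bin/sh
    # Hypothetical CDC setup: stamp every inserted/updated row so the daily
    # ETL can pull deltas with last_update_ts > <previous run time>.
    # Credentials and object names are placeholders.
    sqlplus -s "$ORA_USER/$ORA_PASS@$ORA_TNS" <<'EOF'
    CREATE OR REPLACE TRIGGER trg_stg_orders_cdc
      BEFORE INSERT OR UPDATE ON stg_orders
      FOR EACH ROW
    BEGIN
      :NEW.last_update_ts := SYSTIMESTAMP;
    END;
    /
    EXIT;
    EOF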
ProFlowers, a Provide Commerce brand, is a major flower retailer in the United States. It is an e-commerce company that sells
products shipped from growers, suppliers and its own distribution facilities to consumers. The Company's platform combines an
online storefront, proprietary supply chain management technology and established supplier relationships to create a market
platform that includes growers, manufacturers and distribution warehouses.
Environment: Informatica PowerCenter 8.6, DataStage/QualityStage 8.1, Erwin 7.2, Oracle 10g, Teradata V2R5, SQL Server 2008, HP
AIX, Tidal & Windows XP.
Responsibilities:
Created and Imported/Exported various Sources, Targets, and Transformations using Informatica PowerCenter 8.6.
Extracted data from several source systems and loaded data into the EDW (Enterprise Data Warehouse).
Worked on Power Center Designer client tools like Source Analyzer, Warehouse Designer, Mapping Designer and Mapplet
Designer.
Created reusable transformations and mapplets by using Lookup, Aggregator, Normalizer, Update strategy, Expression,
Joiner, Rank, Router, Filter, and Sequence Generator etc. in the Transformation Developer, and Mapplet Designer,
respectively.
Extensively worked with the data conversions, standardizing, correcting, error tracking and enhancing the data.
Develop Packages, Stored Procedures, and Functions using PL/SQL for the automation of database maintenance EDW
processes.
Worked on implementing procedures/functions to support daily ETL process miscellaneous tasks.
Involved in performance tuning with Informatica and database.
Involved in Migration of objects from Dev and Test server to Production servers.
Worked on DataStage/QualityStage 8.1.
Developed parallel jobs using CDC, Aggregator, Complex File Stage, Join, Transformer, Sort, Merge, Filter, Funnel, Peek,
Modify, Row Generator, Column Generator, Surrogate Key Generator, Remove Duplicates and Lookup stages.
Developed parallel jobs in DataStage Designer to extract from SQL Server 2005, DB2 and Flat File sources, transform by
applying business rules, and load into Oracle/Teradata targets.
Worked with Teradata BTEQ and SQL tools for querying and retrieving data (a BTEQ sketch follows this list).
Worked with MultiLoad, FastLoad and FastExport scripts for loading and exporting data from the Teradata database.
Worked with several applications like CheetahMail, Wine.com, Florist Express, etc.
Prepared Source to Target mapping documents and other related support documentation.
Scheduled mappings/jobs using the Tivoli scheduler tool.
Design and develop unit test cases for the ETL processes.
Involved in Development, Testing and Production support.
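A small, hypothetical example of the BTEQ work noted in this list: a shell wrapper exporting yesterday's rows to a flat file in report format (the logon string, paths and object names are placeholders):

    #!/bin/sh
    # Hypothetical BTEQ extract driven from a shell script.
    bteq <<'EOF'
    .LOGON tdprod/etl_user,etl_pass
    .EXPORT REPORT FILE = /data/outbound/daily_orders.txt
    SELECT order_id, order_dt, order_amt
    FROM   edw.fact_orders
    WHERE  order_dt = CURRENT_DATE - 1;
    .EXPORT RESET
    .LOGOFF
    .QUIT
    EOF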
BellSouth is a private telecommunications services company headquartered in Atlanta. The project involved design, development
and maintenance of Data Marts. Data from various centers, held in different source systems, had to be loaded using ETL tools.
Informatica PowerCenter was used to load strategic source data to the data marts. An operational data store was created,
and a metadata build-up was designed for performing data mapping. We extracted data from different data sources and then
created different Data Marts to answer different query reports.
Environment: Informatica PowerCenter 6.0, Erwin 4.0, XML, DB2, SQL Server 2000, Teradata, Oracle 8.x, AIX, HPUX, Solaris and
Windows 2000.
Responsibilities:
Created and Imported/Exported various Sources, Targets, and Transformations using Informatica Power Center, Repository
Manager and Designer.
Developed various Mappings with the collection of all Sources, Targets, and Transformations.
Created Mapplets with the help of Mapplet Designer and used those Mapplets in the Mappings.
Developed and scheduled various pre- and post-sessions and batches for all mappings to load data from source TXT files
and tables to target tables.
Tuned performance of Informatica sessions for large data files by implementing pipeline partitioning and increasing block
size, data cache size, sequence buffer length and target based commit interval and resolved bottlenecks.
Populated the tables for the daily load with initial-load inserts, then updated them using incremental aggregation and the
Update Strategy transformation.
Created and managed the global and local repositories and permissions using Repository Manager.
Created Mappings between Source and Target using Power Center Designer.
Worked Extensively with Slowly Changing Dimensions i.e. Type1 & Type2.
Used Dimensional Modeling Techniques to create Dimensions, Cubes and Fact tables.
Involved in Logical and Physical Database Design, forward engineering & reverse engineering using Erwin tool.
Imported an XML file to Designer, performed multiple operations, used in the Mappings and exported into an XML file.
Optimized the mappings by changing the logic, reducing run time and encouraging the use of reusable objects such as
mapplets.
Performed data cleansing activities like Data Enhancement, Parsing, Matching and Consolidation.
Analyzed and Created Facts and Dimension Tables.
Worked on Teradata BTEQ and SQL tools for querying and retrieving data.
Worked with MultiLoad, FastLoad and FastExport scripts for inserting and exporting data from the Teradata database.
Involved in Development & Testing Phase.
Bank One is a provider of financial services: deposits, loans, insurance, investment, financial planning, credit card and cash
management services. I was involved in developing mappings implementing business logic for their General Ledger Accounts
DataMart (GLA). The sources that fed this DataMart included Oracle and Flat Files, and the target data mart was on Oracle 9i. Daily loads
were scheduled to keep the DataMart updated, and monthly roll-ups were performed. Several reports were generated from this
DataMart displaying customers' monthly checking account information.
Environment: Informatica PowerCenter 7.1.1, Windows NT, Oracle 9i, SQL Server 2000, UNIX (Solaris), Toad V 7.6.
Responsibilities:
Involved in source analysis and profiling of the source data, and in determining whether the source data met the business
requirements.
Understanding of the business requirements and translating them to data warehouse architectural design.
Used Erwin 4.0 to analyze the logical and physical models.
Extensively worked with Designer, Workflow Manager and Workflow Monitor.
Worked with the Lookup, Aggregator, Expression, Router, Filter, Normalizer, Update Strategy, Rank, Stored
Procedure, Union and Joiner transformations.
Developed transformation logic and designed various simple and Complex Mappings in Designer with performance,
accuracy and efficiency to achieve operational objectives.
Involved in performance tuning for previously created mappings.
Worked with the pmcmd command to communicate with the Informatica server and perform operations such as starting/stopping
workflows and retrieving task and workflow details (gettaskdetails, getworkflowdetails), checking the performance details and
session logs (a status-check sketch follows this list).
Involved in monitoring tasks, workflows and performance for mappings in Workflow monitor.
Fine-tuned ETL processes by addressing mapping and session performance issues.
Prepared/maintained documentation on aspects of ETL processes to support knowledge transfer to other team members.
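A minimal sketch of the pmcmd status checks described in this list, using the domain-based syntax of later PowerCenter versions; the service, domain, folder and workflow names are placeholders:

    #!/bin/sh
    # Hypothetical monitoring snippet: print the latest run status of a
    # workflow so a support script can grep for Succeeded/Failed.
    pmcmd getworkflowdetails -sv INT_SVC -d Domain_GLA \
          -u "$INFA_USER" -pv INFA_PASSWD \
          -f GLA_DM wf_gla_daily_load | grep -i "workflow run status"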
IO Media Group provides Website Design & Development, Database Development, Custom Software Programming and technical
solutions for North Texas. Services include the development of corporate sites, extranets, intranets, and CRM solutions.
Environment: Informatica PowerCenter 6.0, UNIX, Erwin, MS Access, SQL, and PL/SQL.
Responsibilities:
Gathered user requirements, defined and prioritized system requirements, and prototyped for feasibility and discovery.
Generated and evaluated alternatives and reviewed recommendations with management.
Developed several forms and reports in the process; also converted several standalone PL/SQL procedures/functions into
packaged procedures for code reusability, modularity and control.
Evaluated completeness and accuracy of new and existing data models.
Involved in the full Software Development Life Cycle (SDLC).
Designed, developed, tuned and optimized databases for maximum performance.
Prepared test cases for data integrity and consistency.
Involved in preparing detailed ETL design documents, unit test plans for the mappings.
Created database objects like tables, views, synonyms, indexes and sequences.
While working under the DBA, maintained the database and answered user requests. Activities included DDL, DML, database
design and backups.
Worked with Informatica PowerCenter 6.0 to perform basic ETL administration functions such as repository
import/export/backup.
EDUCATIONAL QUALIFICATION
Madras University, India, 2003
Bachelor's in Computer Engineering
CERTIFICATIONS