Resume
I have been working as a Technology Analyst at Infosys Ltd., Chandigarh, since September 2014.
EDUCATION
Project:
1. Project Name: CER and Transparency
Client: Investment client (USA) (Nov 2014 - present)
Description:
The CGC engagement comprised two projects. We migrated all CGC data from an Oracle server to Hadoop and a
Microsoft PDW server (Azure cloud), and re-implemented the Informatica logic and Oracle procedures
used in the old architecture. All data was ingested into Hadoop using a file-ingestion approach:
files extracted from the Oracle server for the fundamental tables were ingested into Hadoop, and
external tables were created on top of the ingested data. Using those tables, we rebuilt the same ETL process
as procedures mirroring the Oracle side and loaded all the data onto the Azure cloud. Spark was used for analytics.
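The pattern described above (land extract files, expose them as tables, then re-run the old procedural transformations against those tables) can be sketched in miniature. This is an illustrative Python sketch only, using the standard-library csv and sqlite3 modules in place of Hadoop external tables and PDW; all table names, column names, and values are hypothetical.

```python
import csv
import io
import sqlite3

# Hypothetical extract file, as it might arrive from the Oracle source.
fundamentals_csv = io.StringIO(
    "security_id,sector,market_cap\n"
    "S1,Tech,1200\n"
    "S2,Energy,800\n"
    "S3,Tech,450\n"
)

conn = sqlite3.connect(":memory:")

# "External table" stand-in: a table whose rows come straight from the file.
conn.execute(
    "CREATE TABLE fundamentals (security_id TEXT, sector TEXT, market_cap REAL)"
)
conn.executemany(
    "INSERT INTO fundamentals VALUES (:security_id, :sector, :market_cap)",
    csv.DictReader(fundamentals_csv),
)

# ETL step mirroring an old procedure: aggregate the ingested data
# into a downstream target table.
conn.execute(
    """CREATE TABLE sector_summary AS
       SELECT sector, SUM(market_cap) AS total_cap
       FROM fundamentals
       GROUP BY sector"""
)

summary = dict(conn.execute("SELECT sector, total_cap FROM sector_summary"))
print(summary["Tech"], summary["Energy"])  # 1650.0 800.0
```

The key idea carried over from the project is that the file is the interface: downstream SQL never touches the source system, only the tables built over the ingested files.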
Responsibilities:
Interaction with business partners and clients for status updates on tasks and requirements gathering for
new enhancements.
Gathering and analyzing requirements and proposing improved solutions for implementation.
Preparing high-level design documents based on the requirements.
Development of ETL flows and database solutions; testing, maintenance, and data-quality improvement.
Used Spark for data analytics.
Development of stored procedures to process data/files.
Managing the deployment of code in ITE and UAT environments.
Value Add to the client - Automation Scripts to reduce manual work and save effort.
Ensuring Quality Standards and improving overall quality of deliverables.
Provide technical assistance to team members.
Reviewing the mappings, sessions and workflows for ETL standards.
Identify bugs in the existing flow by analyzing the data flow.
Responsible for writing unit test cases and performing unit testing.
Data is processed using Spark SQL, making use of its core features such as
DataFrames.
2. Project Name: Investment Data Warehouse (IDW)
Client: Capital Group of Companies (USA)
Description:
The project involves the development of Investment Data Warehouse (IDW) for client Capital Group of
Companies. It involves integration of data from disparate source systems into a centralized data warehouse
that helps in making better investment decisions and better management of various funds like mutual funds,
equity funds etc.
This project involves data transformations through Extract, Transform and Load techniques implemented
through Informatica and various database objects. This is an end-to-end application development project
involving requirements elaboration, design, build, testing, implementation and support. The system will be
developed in multiple releases using the staged waterfall development methodology. A prototype will be
developed as part of this effort to prove out the proposed architecture.
Responsibilities:
Interaction with business partners and clients for status updates on tasks and requirements gathering for
new enhancements.
Gathering and analyzing requirements and proposing improved solutions for implementation.
Preparing high-level design documents based on the requirements.
Development of ETL interfaces, Informatica mappings and workflows, UNIX scripts, and database
solutions; testing and maintenance; improving data quality; supporting very high volumes of trades,
accounts, and groups of accounts.
Development of stored procedures and Unix shell scripts to process data/files.
Scheduling the jobs using AutoSys scheduler.
Managing the versioning of the code through Perforce.
Managing the deployment of code in ITE and UAT environments.
Value Add to the client - Automation Scripts / Macros to reduce manual work and save effort.
Ensuring Quality Standards and improving overall quality of deliverables.
Provide technical assistance to team members.
Reviewing the mappings, sessions and workflows for ETL standards.
Identify bugs in existing mappings by analyzing the data flow.
Responsible for writing unit test cases and performing unit testing.
Provided L3 support for batches, implemented fixes in production in case of job failure.
Extraction of Incidents data from Remedy for the purpose of WSR (Weekly Status Report).
CERTIFICATIONS/ACHIEVEMENTS
Big Data University Certifications:
Certificate on “Hadoop fundamentals I – Version 3”
Certificate on “Accessing Hadoop Data Using Hive V2”
Certificate on “Developing Distributed Applications Using ZooKeeper”
Certificate on “Moving Data into Hadoop”
Received the INSTA Award in Financial Year 2014-15 Q4.
Received the INSTA Award in Financial Year 2015-16 Q1.
Received the Dream Team Award for performance in the project team in Financial Year 2015-16 Q3.
Actively participated in coordinating various team-building activities within the team as well as at the
account level.
PERSONAL DETAILS
DECLARATION
I hereby declare that the information furnished above is true to the best of my knowledge and belief.
PLACE: