
Narasimha Reddy Kallam
Business Data Analyst
Email: narsimhareddykallam@gmail.com | Phone: +1 (682) 434-3266
Irving, Texas, USA (75063) | LinkedIn

PROFESSIONAL SUMMARY

Results-driven Business Data Analyst with 4 years of experience in leveraging advanced data analytics, visualization tools,
and cloud technologies to drive strategic decision-making and operational improvements. Expert in designing and
optimizing data pipelines, automating ETL processes, and integrating diverse data sources into comprehensive data
warehouses. Proficient in SQL, Power BI, Excel, and Power Query for real-time reporting and KPI tracking. Adept at utilizing
Agile methodologies, REST APIs, and big data tools like Apache Spark, Kafka, and Snowflake for scalable data solutions.
Demonstrates strong expertise in data governance, testing, and performance evaluation, ensuring data integrity and
actionable insights that align with business objectives.

TECHNICAL SKILLS

Languages / Cloud:      Python, SQL, R, Scala, Azure
Database / OS:          MySQL, Teradata, Oracle, PostgreSQL, Apache HBase; Windows, Linux/Unix
DevOps:                 Agile, Scrum, Jira, Waterfall, Jenkins, Git, GitHub, Aha!
Business Skills:        Defining Business Requirements, Business Process Analysis & Re-engineering, Use Case Modeling, JAD Sessions, Requirements Workshops, GAP Analysis, SWOT Analysis, Impact Analysis, Data Analysis, Change Management
Development Tools / Data Visualization & Reporting: Bash/Shell, Lucidchart, Microsoft Visio, ETL, Tableau, Erwin, Power BI
Data Warehousing:       Informatica PowerCenter 10.x/9.x (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer), Snowflake
Data Modeling:          Star Schema, Snowflake Schema, Extended Star Schema, Physical & Logical Modeling, Dimensional Data Modeling, Fact Tables, Dimension Tables
Big Data Ecosystems:    Hive, Impala, MapReduce

PROFESSIONAL EXPERIENCE

Planck Technology                                                 Dec 2023 to Present
Business Data Analyst
Responsibilities:
• Gathered requirements from remotely based business users and refined and elaborated them through user meetings.
• Analyzed historical documentation, supporting documentation, screen prints, and e-mail conversations; wrote the business requirements document, presented it to the business, and obtained electronic sign-off from the stakeholders.
• Linked data lineage to data quality and business glossary work within the overall data governance program, and performed an end-to-end data lineage assessment and documentation for select critical data elements (CDEs).
• Gathered requirements by working with the business users on the Business Glossary, Data Dictionary, Reference Data, and Policy Management. Performed extensive data validation by writing complex SQL queries, participated in back-end testing, and worked through data quality issues.
• Performed data analysis and profiling using complex SQL on various source systems, including Oracle and Teradata, and used straightforward formats such as PowerPoint presentations when conducting walkthroughs with stakeholders.
• Worked on a data migration project that moved pipelines to the Databricks platform using the Lakehouse architecture.
• Leveraged Azure Data Factory to orchestrate and automate data workflows, ensuring seamless data integration across
multiple sources.
• Conducted GAP analysis to assess the variance between system capabilities and business requirements, and helped define source-to-target data mappings, business rules, and business and data definitions.
• Applied working knowledge of AS-IS and TO-BE business processes to convert requirements into technical specifications, prepare test plans, and write and execute unit, system, integration, and UAT scripts in data warehouse projects. Built Tableau visualizations against various data sources, including flat files (Excel, Access) and relational databases (Oracle, SQL Server, and Teradata).
• Managed and optimized Azure SQL databases, including performance tuning, indexing, and query optimization to
support data-driven decision-making.
• Drew on extensive data warehousing experience, including Slowly Changing Dimensions, Star and Snowflake schemas, data modeling, and Normalization/De-normalization techniques up to 3NF in dimensional database environments; built complex ETL pipelines and data models within the Snowflake and Databricks ecosystems; developed conceptual, logical, and physical designs for OLTP and OLAP systems; conducted market research, feasibility studies, and risk management; and used Tableau Desktop for advanced data visualization and reporting.
• Created report models for ad-hoc reporting and analysis, created the logical data model in Erwin, worked on loading the tables in the data warehouse, and delivered ad-hoc reports to users in Tableau by connecting various data sources.
• Prepared dashboards using calculated fields, parameters, calculations, groups, sets, and hierarchies in Tableau, and extensively designed data mapping, filtering, consolidation, cleansing, integration, ETL, and data mart customization. Used Python libraries such as Pandas, NumPy, and Matplotlib for data cleaning, transformation, and visualization (a sketch of this kind of step follows this section), leading to more accurate and insightful data analysis.
• Utilized Azure Synapse Analytics for building and managing large-scale data warehousing solutions, enabling complex
analytical queries and real-time insights.
• Integrated Azure Machine Learning with business data pipelines to perform predictive analytics, improving forecasting
accuracy and business strategy.
• Created data visualizations for KPIs using Tableau and Excel, built data integration and ETL solutions for data warehousing with SSIS, and managed requirements through the Requirement Traceability Matrix (RTM), working with RDBMS platforms such as Oracle and SQL Server and BI stack components such as SSIS, SSRS, and SSAS.
• Developed and integrated RESTful APIs using Python to connect various business systems, enhancing data accessibility
and interoperability.
• Documented various data quality mapping documents and performed small enhancements (data cleansing/data quality). Conducted User Acceptance Testing (UAT) and worked with the users and vendors who built the system.
• Gathered requirements from the business team for converting manually run SQL query data extract to Business
Intelligence reports and dashboards.
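
A minimal, illustrative sketch of the Pandas-style cleaning and KPI aggregation step referenced above; the file name, column names, and KPI are hypothetical stand-ins rather than artifacts from the actual engagement:

```python
import pandas as pd

# Hypothetical sales extract; file and column names are illustrative only.
df = pd.read_csv("sales_extract.csv", parse_dates=["order_date"])

# Basic cleaning: drop exact duplicates and rows missing key fields.
df = df.drop_duplicates().dropna(subset=["order_id", "order_date", "amount"])

# Normalize a categorical field before aggregation.
df["region"] = df["region"].str.strip().str.title()

# Monthly revenue KPI of the kind surfaced in a Tableau or Power BI dashboard.
monthly_revenue = df.set_index("order_date").resample("M")["amount"].sum()
print(monthly_revenue.head())
```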

Magna Infotech, India                                             Sep 2020 to June 2022
Data Analyst
Responsibilities:
• Worked in data analysis, data profiling, migration, and data integration, with a background in the SDLC and in Agile/Scrum methodologies for web application development. Gathered and analyzed user requirements, prepared BRDs and FRDs, and applied extensive data warehousing experience, including Slowly Changing Dimensions, Star and Snowflake schemas, and data modeling.
• Collaborated with data engineers and the operations team to implement ETL processes, and wrote and optimized SQL queries to extract data that fit the analytical requirements.
• Worked with the ETL team to document the Transformation Rules for Data Migration from OLTP to Warehouse
Environment for reporting purposes.
• Developed normalized logical and physical database models to design OLTP systems for insurance applications; worked hands-on with Python and PySpark scripts (see the sketch after this section) and used Jupyter Notebooks and Zeppelin for data analysis.
• Moved data between S3 and Snowflake, used Snowflake and cloud data warehouses for BI reporting, and performed SWOT and risk analysis to better understand the product's potential impact on the company.
• Created data transformation tasks such as BULK INSERT to import client data. Created a dimensional model for the
reporting system by identifying required dimensions and facts using Erwin.
• Leveraged Python for data mining and extraction from large datasets, uncovering hidden patterns and insights that
drove business strategies.
• Implemented Azure Data Lake Storage for efficient and scalable storage of large datasets, enabling advanced analytics
and reporting. Automated business workflows using Azure Logic Apps, improving efficiency and reducing manual
intervention in data processing tasks.
• Created configuration packages in SSIS to move packages from the development server to the production server, and transferred data from the Oracle database as well as MS Excel into SAS for analysis, applying filters based on the analysis.
• Used parallel processing in SAS aggregation programs to significantly reduce the execution time of applications that fetch aggregated and partitioned data.
• Created and maintained Tableau reports to display the status and performance of the deployed model and algorithm, and resolved data type inconsistencies between source systems and the target system using mapping documents.
• Used the ETL tool Informatica to populate the database and transform data from the old database to the new one. Performed data analysis, data migration, and profiling using complex SQL on various source systems, including Oracle and Teradata.
• Tested the database to check field size validation, check constraints, and stored procedures, and verified field sizes defined within the application against Teradata.
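
A small, hedged sketch of the kind of PySpark profiling work mentioned above; the S3 path, dataset, and columns are hypothetical, and the snippet assumes a configured s3a connector:

```python
from pyspark.sql import SparkSession, functions as F

# Illustrative only: the path and schema are hypothetical placeholders.
spark = SparkSession.builder.appName("policy_profiling").getOrCreate()
policies = spark.read.parquet("s3a://example-bucket/policies/")

# Simple profiling: overall row count and per-column null rates.
total = policies.count()
null_rates = policies.select(
    [(F.sum(F.col(c).isNull().cast("int")) / total).alias(c) for c in policies.columns]
)
null_rates.show()

spark.stop()
```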

Hexaware, India                                                   June 2019 to August 2020
Business Analyst
Responsibilities:
• Translated business needs into data analysis, business intelligence data sources, and reporting solutions for different types of clients. Performed requirements gathering and analysis, including data analysis, GAP analysis (AS-IS to TO-BE), and documentation for end users.
• Worked on data transformation and accessed raw marketing data in varied formats with different methods for analyzing
and processing.
• Used forward engineering to create a physical data model with DDL that best suited the requirements of the logical data model. Worked with data warehouse developers to evaluate the impact of the current implementation and the redesign of all ETL logic. Converted the logical model to the physical model by assigning accurate data types, created indexes on keys, and forward-engineered the DDL.
• Generated comprehensive analytical reports by running SQL queries against current databases to conduct data analysis.
Used complex query statements like sub-queries, correlated queries, derived tables, and case functions to insert the
data.
• Converted and loaded data from flat files into temporary tables in the Oracle database using SQL*Loader. Managed, updated, and manipulated report orientation and structures using advanced Excel functions, including pivot tables and VLOOKUPs.
• Worked with internal architects, assisting in the development of current and target state data architectures, and worked with other teams to ensure that logical and physical ER/Studio data models were developed in line with corporate standards and guidelines.
• Collaborated with data modelers and ETL developers in creating the Data Functional Design documents, and worked extensively on model maintenance projects, bringing non-standard data models up to standard.
• Extensively converted logical models into complete physical models ready to be implemented in the database.
• Created and managed ETL pipelines using Python (a minimal sketch follows this section), ensuring smooth data extraction, transformation, and loading processes for various data sources.
• Worked with developers to analyze the impact of changes on the respective applications, informing the design approach taken, and implemented the same changes in the database with the help of the DBA.
• Designed conceptual, logical, and physical data models using Erwin, and created physical models from logical models for applications that did not have any.
• Used SQL and Python for extensive data analysis, and queried data using PySpark and Spark SQL for faster testing, data processing, and more efficient analysis.
• Wrote implementation plans for rolling database changes into the QA and production environments, coordinated with the DBA on implementing the changes, and updated the data models with changes implemented in development, QA, and production.
• Performed architecture design, data modeling, and implementation of SQL, big data platforms, and analytic applications for consumer products.
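
A minimal sketch of a Python ETL pipeline of the sort described in this role, using SQLite as a stand-in target; the CSV name, column names, and staging table are hypothetical:

```python
import sqlite3

import pandas as pd

def run_etl(source_csv: str, db_path: str) -> None:
    # Extract: read the raw flat-file export.
    raw = pd.read_csv(source_csv)

    # Transform: standardize column names and drop obviously invalid rows.
    raw.columns = [c.strip().lower().replace(" ", "_") for c in raw.columns]
    clean = raw.dropna(subset=["amount"]).query("amount > 0")

    # Load: replace the staging table with the cleaned batch.
    with sqlite3.connect(db_path) as conn:
        clean.to_sql("stg_transactions", conn, if_exists="replace", index=False)

if __name__ == "__main__":
    run_etl("daily_export.csv", "warehouse.db")
```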

EDUCATION

Master of Science, Computer Science, Dec 2023: Kennesaw State University, Atlanta, GA.
Bachelor of Technology, Electronics and Communication, May 2021: LPU, Punjab, India.

ACADEMIC PROJECTS

E-Commerce Transaction Analysis Kennesaw State University, Dec 2022

• Analyzed e-commerce transaction data to identify buying patterns and customer preferences, focusing on purchase
frequency and average order value.
• Used SQL to extract and retrieve accurate data from relational databases, ensuring comprehensive coverage of
relevant information.
• Employed Python for data cleaning and transformation, leveraging libraries such as Pandas and NumPy for in-depth analysis and manipulation (see the sketch after this project).
• Created interactive visualizations in Tableau, including charts, heatmaps, and trend lines, to present insights on
customer behavior and sales performance, and conducted exploratory data analysis (EDA) to uncover key trends and
correlations.
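
An illustrative computation of the two metrics named above (purchase frequency and average order value); the transaction file and columns are hypothetical:

```python
import pandas as pd

# Hypothetical e-commerce transaction extract.
tx = pd.read_csv("transactions.csv", parse_dates=["order_date"])

# Purchase frequency and average order value (AOV) per customer.
per_customer = tx.groupby("customer_id").agg(
    orders=("order_id", "nunique"),
    revenue=("amount", "sum"),
)
per_customer["avg_order_value"] = per_customer["revenue"] / per_customer["orders"]
print(per_customer.sort_values("orders", ascending=False).head())
```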
Real Estate Market Analysis LPU, May 2021

• Analyzed real estate market data to assess property values and identify trends, focusing on location, property type, and
market fluctuations.
• Used Python for data cleaning and preprocessing with libraries like Pandas and NumPy and applied statistical
techniques to evaluate market dynamics.
• Created visualizations with Matplotlib and Seaborn (sketched below) to present insights, and provided actionable recommendations to investors and real estate agents for data-driven decision-making.
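
A brief sketch of the Matplotlib/Seaborn visualization step described above; the listings file and column names are hypothetical:

```python
import matplotlib.pyplot as plt
import pandas as pd
import seaborn as sns

# Hypothetical real estate listings extract.
listings = pd.read_csv("listings.csv")

# Compare price distributions across property types, one of the
# market comparisons described above.
fig, ax = plt.subplots(figsize=(8, 4))
sns.boxplot(data=listings, x="property_type", y="price", ax=ax)
ax.set_title("Price distribution by property type")
plt.tight_layout()
plt.show()
```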
