Sr. Data Analyst Resume
Mount Laurel, NJ
OBJECTIVE
- To work in a challenging role, enrich my professional skills, and contribute to the growth of the organization.
SUMMARY
- Over 5 years of experience in Information Technology with a strong background in database development, data warehousing, and reporting.
- Experienced in working with HDFS, Podium Ingestion, and Talend at Confidential.
- Good knowledge of data warehouse concepts and principles: Star Schema, Snowflake, SCDs, surrogate keys, and normalization/denormalization.
- Experience in integration with various data sources including relational databases like Oracle and SQL Server.
- Extensive experience developing ETL processes for data extraction, transformation, and loading using Informatica PowerCenter.
- Well acquainted with Informatica Designer components: Source Analyzer, Transformation Developer, Mapplet Designer, and Mapping Designer.
- Worked extensively with complex mappings using transformations such as Source Qualifier, Expression, Filter, Joiner, Router, Union, Unconnected/Connected Lookup, and Aggregator.
- Strong experience developing sessions/tasks, worklets, and workflows using Workflow Manager tools: Task Developer, Workflow Designer, and Worklet Designer.
- Experience in using the Informatica command line utilities like pmcmd to execute workflows.
- Extensively used Informatica Repository Manager and Workflow Monitor.
- Experience in debugging mappings; identified bugs in existing mappings by analyzing the data flow and evaluating transformations.
- Hands on experience in Performance Tuning of sources, targets, transformations and sessions.
- Worked with Stored Procedures, Triggers, Cursors, Indexes and Functions.
- Highly motivated to take on independent responsibility, with the ability to contribute as a productive team member.
- Experience in programming with the .Net Framework using C#.Net, ASP.Net and Web Services.
- Hardworking and dedicated, with zeal to learn and adopt new technologies.
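The pmcmd usage noted above typically follows a pattern like the sketch below. The integration service, domain, folder, and workflow names are placeholders, and a stub function stands in for the real pmcmd binary so the flow can be exercised without an Informatica installation:

```shell
#!/bin/sh
# Hedged sketch of launching an Informatica workflow via pmcmd.
# IntSvc, Dom_Prod, FolderX, and wf_load_dim are placeholder names.
# Stub standing in for the real pmcmd binary, for illustration only:
pmcmd() { echo "pmcmd: $1 accepted"; return 0; }

# The real invocation shape: pmcmd startworkflow -sv <service> -d <domain>
#   -f <folder> [-wait] <workflow>; exit status 0 means the start succeeded.
if pmcmd startworkflow -sv IntSvc -d Dom_Prod -f FolderX -wait wf_load_dim; then
  echo "wf_load_dim launched OK"
else
  echo "wf_load_dim launch FAILED" >&2
  exit 1
fi
```

In practice the same pattern is wrapped in scheduler scripts so a non-zero pmcmd exit status fails the job and alerts the on-call team.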
TECHNICAL SKILLS
ETL Tools: Talend, Podium Ingestion, Informatica PowerCenter 9.1/8.6
Big Data: HDFS, Hive
Scheduling Tools: Autosys, Crontab
Databases: Oracle 11g, MS SQL Server 2012/2008
Scripting Languages: Unix Shell Script, JavaScript
Source Control Tools: GitHub, Apache Subversion (SVN), Visual SourceSafe 2005, Bitbucket
Reporting Tools: MS SQL Server Reporting Services
Languages: C#.Net, C, C++, Java 2.0
Internet Technologies: ASP.Net 2.0, ADO.Net, .Net XML Web Services, HTML
IDE Tools: Visual Studio .Net 2005
Operating Systems: Windows, Unix
PROFESSIONAL EXPERIENCE
Confidential, Mount Laurel, NJ
Sr. Data Analyst
Responsibilities:
- Worked on the Podium tool to ingest Oracle, Mainframe, and file-based data and publish it onto HDFS
- Used Talend ETL tool to transform the source data into target staging and dimension tables
- Built re-usable components to be shared across multiple jobs
- Wrote complex, tuned Hive queries to extract data from source systems
- Built a Talend job to replicate data across clustered servers, which the business uses for data mining and analysis
- Took leadership in setting up AutoSys environment by working closely with different teams across different time zones under tight deadlines
- Mentored and trained team members on AutoSys usage and on creating schedule scripts following best practices
- Created Hive queries for unit-testing and for reconciling results with Talend jobs
- Proactively analyzed, troubleshot, and delivered products with minimal to no defects while providing continuous support to team members and Admin teams
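The reconciliation between Hive queries and Talend jobs described above can be sketched as a simple row-count comparison. In practice each count would come from something like `hive -e "SELECT COUNT(*) FROM stg.customer"`; here the counts are read from files (with illustrative values) so the check runs stand-alone:

```shell
#!/bin/sh
# Hedged sketch: reconcile row counts between a source Hive query and the
# Talend-loaded target table. The file contents stand in for the output of
# two Hive COUNT(*) queries; table names and counts are illustrative.
echo "1042" > source_count.txt   # e.g. hive -e "SELECT COUNT(*) FROM stg.customer"
echo "1042" > target_count.txt   # e.g. hive -e "SELECT COUNT(*) FROM dim.customer"

src=$(cat source_count.txt)
tgt=$(cat target_count.txt)

if [ "$src" -eq "$tgt" ]; then
  echo "RECONCILED: $src rows"
else
  echo "MISMATCH: source=$src target=$tgt" >&2
  exit 1
fi
```

A count check like this is only a smoke test; column-level checksums or sampled row diffs catch transformation defects that equal counts would miss.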
Environment: Talend, HDFS, Podium, Informatica, Autosys, Unix, GitHub, Bitbucket, Hive
Confidential, New York City
Database Developer
Responsibilities:
- Developed and Optimized Stored Procedures, Views, and User-Defined Functions for the Application
- Created DDL and DML scripts to create database schema and database objects
- Designed and developed dashDB database objects
- Implemented process to debug errors and troubleshoot problems relating to database objects
- Worked with UI team to perform integration testing of the database objects
Environment: IBM dashDB, GitHub
Confidential, Parsippany, NJ
Database Developer
Responsibilities:
- Developed and Optimized Stored Procedures, Views, and User-Defined Functions for the Application.
- Created DDL scripts to create the database schema and database objects.
- Wrote scripts to validate, extract, transform and load data to data warehouse and data marts.
- Generated periodic reports based on statistical analysis of data across various time frames and divisions using SQL Server Reporting Services (SSRS).
- Developed various operational drill-through and drill-down reports using SSRS.
- Developed different kinds of reports, such as sub-reports, charts, matrix reports, and linked reports.
- Used cascaded parameters to generate a report from two different Data Sets.
- Involved with Query Optimization to increase the performance of the Report.
- Fine-tuned stored procedures to improve performance, analyzing query plans with the tuning advisor.
- Interacted with Business Analysts and Developers to identify requirements and to design and implement the database schema.
Environment: SQL Server 2012, SSRS, MS Visual Studio, SSDT
Confidential, New York City, NY
ETL Developer
Responsibilities:
- Translated business requirements into Informatica mappings using Informatica Designer to build the data warehouse, populating data into the target Star Schema.
- Extracted data from various source systems like Oracle and flat files and loaded into relational data warehouse.
- Designed and developed complex mappings using varied transformation logic such as Unconnected and Connected Lookup, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, and Stored Procedure.
- Modified existing worklets and workflows to accommodate the new sessions and mappings; created breakpoints in the Designer and worked with the Debugger.
- Extensively involved in coding of the Business Rules through PL/SQL using the Functions, Cursors and Stored Procedures.
- Performance tuning of the Informatica mappings using various components like Parameter files, Variables and Dynamic Cache.
- Extensively used debugger to trace errors in the mapping.
- Involved in unit testing and user acceptance testing to verify that data extracted from different source systems was loaded into the targets accurately, per user requirements.
Environment: Informatica PowerCenter 9.1, Oracle 11g, Toad for Oracle 11g, UNIX shell script, Crontab
Confidential, Jersey City, NJ
Informatica Developer
Responsibilities:
- Wrote SQL overrides in source qualifiers for extracting only the required rows for optimal performance. Created complex transformations using connected/unconnected Lookup.
- Created Stored Procedures to transform the Data and worked extensively for various needs of the transformations while loading the data.
- Used Informatica -Designer for developing mappings, used transformations, which includes Router, Filter, Expression, Aggregator, Joiner, Update Strategy.
- Created reusable sets of transformations (mapplets) and used them wherever the same transformation logic was needed across different mappings.
- Extensively used Informatica for loading the historical data from various tables.
- Involved in gathering business requirements, logical modeling, physical database design, data sourcing and data transformation, data loading, SQL and performance tuning.
- Created sessions to run the mappings.
- Involved in unit testing and user acceptance testing to verify that data extracted from different source systems was loaded into the targets accurately, per user requirements.
- Involved in developing test data/cases to verify accuracy and completeness of ETL process.
Environment: Informatica Power Center, AutoSys, SQL Server 2005, UNIX shell Script
Confidential, NY
ETL Developer
Responsibilities:
- Converted functional requirements into technical documents per business standards
- Created database objects such as tables, views, sequences, synonyms, stored procedures, functions, packages, cursors, and triggers per business requirements
- Extensively used an ETL tool to load data from flat files and Excel into tables on the Oracle server
- Involved in writing new and modifying existing packages, procedures, functions, and triggers according to new business needs
- Wrote SQL queries using joins and subqueries to retrieve data from the database.
- Used SQL Loader to upload the data into the database
- Extracted data from various source systems such as Excel and flat files and loaded it into the relational data warehouse.
- Extensively involved in coding of the Business Rules through PL/SQL using the Functions, Cursors and Stored Procedures.
- Involved in unit testing and user acceptance testing to verify that data extracted from different source systems was loaded into the targets accurately, per user requirements.
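The SQL*Loader step mentioned above is driven by a control file along these lines. The data file, table, and column names are illustrative placeholders, not details from the actual project:

```
-- Hedged sketch of a SQL*Loader control file (load_customers.ctl);
-- customers.csv, STG_CUSTOMERS, and the column list are placeholder names.
LOAD DATA
INFILE 'customers.csv'
APPEND
INTO TABLE STG_CUSTOMERS
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(CUST_ID,
 CUST_NAME,
 CITY,
 LOAD_DT DATE "YYYY-MM-DD")
```

Such a control file is invoked with something like `sqlldr userid=<user>/<password> control=load_customers.ctl log=load_customers.log`, with the log file checked afterward for rejected rows.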
Environment: Oracle, UNIX shell Script, Crontab