
Sr. Etl Developer Resume

NY

SUMMARY:

  • 6 years of experience as an ETL Developer in Data Warehousing and Business Intelligence (BI) solution design and development, covering data analysis, solution design, and metadata management.
  • IT experience in data management, data warehouse architecting, business intelligence, managing and maintaining analytics and reporting environment.
  • Experience in creating logging for ETL loads at the package and task level, using SSIS to record the number of records processed by each package and each task.
  • Strong experience in Business and Data Analysis, Data Profiling, Data Migration, Data Conversion, Data Quality, Data Integration and Metadata Management Services and Configuration Management.
  • Extensively used transformations such as Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update Strategy, Sequence Generator and Stored Procedure.
  • Extensively worked on the ETL mappings, analysis and documentation of OLAP reports requirements. Experience in Data Modeling and Database Architecture in OLTP and OLAP environment based on functional requirements.
  • Data Processing experience in designing and implementing Data Mart applications using ETL tool Informatica Power Center 9.0. Experience in Data warehousing, Data Extraction, Transformation and loading (ETL) data from various sources like Oracle, SQL Server, Microsoft Access, Microsoft Excel and Flat files into Data Warehouse and Data Marts using Informatica Power Center.
  • Involved in designing and developing ETL processes using Informatica ETL tool to read the data from XML source system and stage at SQL server database to perform transformation activity and load into Oracle database.
  • Experience in Teradata Enterprise Data Warehouse (EDW) and Data Marts. Designed and developed ETL mappings with the Informatica Mapping Designer in Informatica Power Center, making extensive use of XML sources, Lookups, SQL overrides in Lookups and Source Qualifiers, source filters, and data flow management into multiple targets using Routers.
  • Conducted dashboard performance analysis between the client’s existing visualization application and Tableau. Knowledge of ETL (Extract, Transform and Load) of data into a data warehouse/data mart and of Business Intelligence (BI) tools such as the Business Objects modules (Reporter, Supervisor, Designer, and Web Intelligence).
  • Experienced in conducting requirement analysis, use case design, designing test plans and developed database schemas based on the logical models. Extensively used Informatica debugger to figure out the problems in mapping.
  • Extensive experience in writing UNIX shell scripts and automation of the ETL processes using UNIX shell scripting. Involved in writing UNIX shell scripts to run and schedule batch jobs.
  • Experience in creating tabular reports, charts, parameterized reports, sub-reports, matrix reports, lists, and interactive reports according to business needs in SSRS. Developed test cases and performed unit testing at the ETL (Informatica 10.1) and database (DB2) levels.
  • Extensive experience in strategic development of a Data Warehouse and in performing Data Analysis and Data Mapping from an Operational Data Store to an Enterprise Data Warehouse.
  • Highly motivated team player with excellent Interpersonal and Customer Relational Skills, Proven Communication, Organizational, Analytical, Presentation Skills.
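The ETL load logging and row-count work summarized above can be sketched in shell. This is a minimal, hypothetical example; the file names, task name, and log format are illustrative, not taken from any actual project:

```shell
#!/bin/sh
# Hypothetical sketch of per-task ETL load logging: each load step appends a
# timestamped line with the number of records it processed.
LOG_FILE="etl_load.log"
: > "$LOG_FILE"   # start a fresh log for this run

log_task() {
    # $1 = task name, $2 = records processed
    printf '%s|%s|%s records\n' "$(date '+%Y-%m-%d %H:%M:%S')" "$1" "$2" >> "$LOG_FILE"
}

count_rows() {
    # Row count of a delimited flat file, excluding the header line
    echo $(( $(wc -l < "$1") - 1 ))
}

# Illustrative source file with a header and two data rows
printf 'id,amount\n1,10\n2,20\n' > customers.csv
log_task "stage_customers" "$(count_rows customers.csv)"
cat "$LOG_FILE"
```

In a real load, `count_rows` would be replaced by the row count reported by the ETL tool or a `SELECT COUNT(*)` against the staging table.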

SKILL:

  • SQL
  • Test Cases
  • Parameterization
  • Tableau
  • Scripting Language
  • Business Intelligence
  • Creativity
  • Risk Management
  • Mapplet Designer
  • AWS Cloud
  • Problem Solving
  • Repository
  • Debugging
  • SSIS/SSRS
  • Informatica
  • Data Quality
  • Load Data
  • Data Modelling
  • Data Warehouse
  • Report Writing
  • UNIX
  • Decision Making
  • Aggregator
  • Project Management

PROFESSIONAL EXPERIENCE:

Confidential, NY

Sr. ETL Developer

Responsibilities:

  • Extensively involved in the modeling and development of Reporting Data Warehousing System. Worked closely with the Enterprise Data Warehouse team and Business Intelligence Architecture team to understand repository objects that support the business requirement and process.
  • Debugged ETL job errors and performed ETL sanity checks and production deployments in the Talend Administration Center (TAC) using SVN.
  • Worked on optimizing mappings by creating reusable transformations and Mapplets. Performed debugging and performance tuning of sources, targets, mappings, transformations and sessions. Used the Debugger to test mappings and fix bugs.
  • Extensively used SSIS transformations such as Lookup, Derived Column, Data Conversion, Aggregate, Conditional Split, SQL Task, Script Task and Send Mail Task, and wrote UNIX shell scripts to work with flat files, define parameter files, and create pre- and post-session commands.
  • Responsible for executing User Interface Testing, System Testing, Data Quality Testing on the configuration design and prototypes. Created use cases to depict the interaction between the various actors and the system and created data flow models, data quality analysis and performed cost/benefit analysis.
  • Worked in Oracle SQL and PL/SQL including all database objects: Stored procedures, Stored functions, Packages, TYPE Objects, Triggers, cursors, REF cursors, Parameterized cursors, Views, Materialized Views, PL/SQL collections.
  • Designed and implemented the HR Business Intelligence data warehouse and ETL (extract transform load) utilizing Oracle Business Intelligence Enterprise Edition (OBIEE) and PL/SQL.
  • Designed the ETL processes using Informatica tool to load global complaints data from flat files and Teradata into the target Teradata database. Parameterized the mappings and increased the re-usability.
  • Parsing high-level design specification to simple ETL coding and mapping standards. Designed new database tables to meet business information needs. Designed Mapping document, which is a guideline to ETL Coding. Performed Back End testing by writing SQL queries to check data integrity, row-counts and measures.
  • Extensively used ETL to load data from different source systems like, Flat files etc into the Staging table and load the data into the target database ORACLE. Created Test cases for the mappings developed and then created integration Testing Document.
  • Worked on ETL process consisting of data extraction, data cleansing and conversion, mapping and loading using ETL/BI tools such as SSIS, SSAS, SSRS and Informatica PowerCenter.
  • Performed Tableau server and desktop administration functions, applied patches and upgrades, server high availability, and scripting. Assisted users with questions and issues using Tableau, conducted user training/orientation where needed.
  • Advanced techniques in MS SQL Server, Vertica, OLAP Cubes, Tableau and PivotTable are utilized in daily work. Performed backend testing using SQL queries and analysed the server performance on UNIX.
  • Advanced use of Microsoft SSAS and SSIS in a multi-platform BI environment (Microsoft/Oracle). Held user meetings for analysis, study and development; used Erwin for logical/physical data models and ER diagrams.
  • Created mappings using various transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, Rank, Sorter, Sequence Generator, Normalizer and Update Strategy.
  • Created tabular, matrix, chart, drill down reports, parametric, cascaded reports, dashboards and scorecards reports (SSRS) according to business requirement.
  • Worked on Master Data Management (MDM) for maintaining the customer information and for the ETL rules to be applied. Reviewed the data model and reporting requirements for Cognos Reports with the Data warehouse/ETL and Reporting team.
  • Designed the business requirement collection approach based on the project scope and SDLC methodology. Involved in developing Logical and physical data models using Erwin Data Modeler
  • Used Informatica Power Center 10.2.0/9.6.1 for extraction, transformation and loading (ETL) of data in the data warehouse. Familiar with SQL*Loader for loading data from external sources such as flat files into database tables.
  • Developed business requirement specification documents as well as a high-level project plan.
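The parameter-file and pre-session-command work described above can be sketched as a small shell script. The folder, workflow and parameter names below are hypothetical stand-ins, not a real Informatica configuration:

```shell
#!/bin/sh
# Illustrative sketch: generating an Informatica parameter file from shell as a
# pre-session step. HYPO_FOLDER, wf_load_sales and the paths are made up.
RUN_DATE=$(date '+%Y%m%d')
PARAM_FILE="wf_load_sales.param"

cat > "$PARAM_FILE" <<EOF
[HYPO_FOLDER.WF:wf_load_sales]
\$\$RUN_DATE=$RUN_DATE
\$\$SRC_FILE=/data/inbound/sales_$RUN_DATE.csv
EOF

# Pre-session sanity check: fail fast if the source-file entry is missing
grep -q 'SRC_FILE=' "$PARAM_FILE" || exit 1
cat "$PARAM_FILE"
```

The workflow would then be started with this file passed as its parameter file, so each run picks up the current date's source file.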

Confidential, Princeton, NJ

ETL Developer

Responsibilities:

  • Successfully implemented Slowly Changing Dimensions (SCD) in ETL jobs to load the Data Warehouse while maintaining historical data. Monitored all scheduled sessions, whether running, completed or failed, and debugged the mappings that failed.
  • Developed and maintained sales reporting using MS Excel queries, SQL in Teradata, and MS Access. Responsible for designing, developing and testing the software (Informatica, PL/SQL, UNIX shell scripts) that maintains the data marts (loading data and analyzing it using OLAP tools). Experience in the Extract, Transform and Load (ETL) process using SSIS.
  • Troubleshooting, debugging & altering Talend issues, while maintaining the health and performance of the ETL environment. Used debugger and breakpoints to view transformations output and debug mappings.
  • Worked in scripting T-SQL queries, Complex Stored Procedures, User Defined Functions (UDF), Database Triggers, using tools like SQL Profiler and Database Tuning Advisor (DTA)
  • Used Data Blending, groups, combine fields, calculated fields, and aggregated fields and spotlighting to compare and analyse data in different perspectives.
  • Responsible for improving data quality and for designing or presenting conclusions gained from analyzing data using Microsoft Excel as statistical tool
  • Worked on ETL process consisting of data extraction, data cleansing and conversion, mapping and loading using ETL/BI tools such as SSIS, SSAS, SSRS and Informatica PowerCenter.
  • Design and development of Tableau dashboard for sales and marketing team by working in conjunction with the business users. Involved in designing and developing Data Models and Data Marts that support the Business Intelligence Data Warehouse.
  • Involved in creating multiple parameterized stored procedures which were used by the reports to get the data. Extensively used workflow variables, mapping parameters and mapping variables.
  • Review and analyze data mapping document to determine ETL program design. Designed new database tables to meet business information needs. Designed Mapping document, which is a guideline to ETL Coding.
  • Advanced set analysis, action filter, custom SQL writing and Data Blending in Tableau dashboard. Designs and develops the logical and physical data models to support the Data Marts and the Data Warehouse.
  • Tableau dashboard optimization following reporting best practices (Tableau Data Extract, Materialized View etc). Development of custom reports for Erwin using Crystal Reports.
  • Used the Informatica Power Center Workflow Manager to create sessions, workflows and batches to run with the logic embedded in the mappings. Wrote automated unit test cases for many mappings.
  • Designed and developed complex mappings to move data from multiple sources into a common target area such as Data Marts and Data Warehouse using Lookups, Source Qualifier, Rank, Router, Filter, Expression, Aggregator.
  • Complete study of the in-house requirements for the data warehouse. Analyzed the DW project database requirements from the users in terms of the dimensions they want to measure and the facts for which the dimensions need to be analysed. Created and scheduled Sessions based on demand, run on time and run only once using Informatica Server Manager.
  • Used UNIX console commands to verify database connectivity and related functionality. Coordinated activities with the project manager and various other teams using MS Project.
  • Received manual requests from end users to load data into staging tables, and used Informatica Power Center to load the data into the staging area.
  • Data mapping, logical data modelling, created class diagrams and ER diagrams and used SQL queries to filter data within the Oracle database. Used Shell Scripting to automate the loading process.
  • Initiated installation and configuration, source definitions, match algorithm design, weight generation, threshold analysis, task planning and bulk match job designs.
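The incremental staging loads mentioned above can be illustrated with a minimal shell sketch that isolates the delta between two extracts before staging; the file names and data are made up:

```shell
#!/bin/sh
# Minimal sketch of an incremental (delta) staging load: only rows absent from
# the previous extract are passed on to the staging step.
printf '1,alice\n2,bob\n' > extract_prev.csv
printf '1,alice\n2,bob\n3,carol\n' > extract_curr.csv

# comm -13 keeps lines unique to the second (current) file; inputs must be sorted
sort extract_prev.csv > prev.sorted
sort extract_curr.csv > curr.sorted
comm -13 prev.sorted curr.sorted > delta.csv

cat delta.csv   # the one new row (3,carol) to load into staging
```

In practice the delta file would feed the staging-table load, keeping each incremental run small instead of reloading the full extract.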

Confidential, Saint Paul, MN

ETL Developer

Responsibilities:

  • Performed as an enterprise MDM Analyst, responsible for delivering customer information systems (MDM) across North America.
  • Created data feeds using Oracle, and extensively used Hive to perform transformations and some pre-aggregations before storing the data in HDFS. Updated and maintained the existing test matrix and test cases based on code changes and enhancements.
  • Worked in monitoring and scheduling using Job Conductor (Talend Administration Centre), AutoSys and using UNIX (Korn & Bourn Shell) Scripting. Created sessions, batches for incremental load into staging tables and scheduled them to run daily.
  • Created and used Debugger sessions to debug sessions and created breakpoints for better analysis of mappings. Experience in debugging mappings, identified bugs in existing mappings by analyzing the data flow and evaluating transformations.
  • Worked on many components which are there in the palette to design Jobs & used Context Variables to Parameterize Talend Jobs and developed complex dashboards using Tableau table calculations, quick filters, context filters, hierarchies, parameters, and action filters.
  • Conducted design review of data models, cube designs & ETL architecture. Led DW/BI best practices sessions and suggested design changes to processing/querying bottlenecks.
  • Spearheaded BI solutions development and implemented creative analytical solutions utilizing enterprise wide ETL and reporting strategy involving complex ETL, exceptions handling, compliance tracking and data reconciliation. Create Conceptual and Physical Data Models for the source system using Erwin 7.3.
  • Handled Teradata performance SQL tuning and query optimization (explain plans, collect statistics, primary and secondary indexes). Used the Informatica Power Center Workflow Manager to create sessions, workflows and batches to run with the logic embedded in the mappings.
  • Followed the modeling process to make changes to the Erwin models. Data modeling was performed using the Erwin tool to build logical and physical models.
  • Worked in Data Warehouse and Business Intelligence Projects along with the team of Informatica, Talend (ETL), Cognos 10/11, Impromptu and Powerplay.
  • Wrote complex SQL queries for validating the data against different kinds of reports generated by Cognos. Have Used Informatica Data Quality as ETL tool to transform the data from various sources and bring them into one common format and load them in to target database for the analysis purpose from Data Warehouse.
  • Worked on developing and monitoring SSIS/SSRS Packages and outstanding knowledge of high availability SQL Server solutions, including replication.
  • Worked in building Data Integration, Workflow Solutions and Extract, Transform and Load (ETL) solutions for data warehousing using SQL Server Integration Service (SSIS) and Informatica.
  • Worked extensively with advanced analysis Actions, Table calculations, Parameters, Maps, Trend Lines, Background images, Groups, Hierarchies & Sets to create detail level summary reports and Dashboards.
  • Performed Data mapping, logical data modeling, created class diagrams and ER diagrams and used SQL queries to filter data. Update MDM elements to report and maintains metrics appropriate for tracking performance.
  • Utilized Business Intelligence tools for the design and deployment of Data Marts supporting Enterprise Data Warehouses. Re-designed and updated Erwin logical and physical models of data warehouse structures, identifying and documenting changes for multiple ERP data sources. Developed test strategy and test plans/designs, and executed test cases for the ETL and BI systems.
  • Developed mapping transformations such as Filter, Joiner, Sequence Generator and Aggregator, and performed query overrides in the Lookup transformation when required to improve mapping performance.
  • Tuned Informatica mappings and sessions using techniques like partitioning and pushdown optimization. Developed Mapping to pull data from Source, apply transformations, and load data into target database like Oracle.
  • Developed and maintained sales reporting using MS Excel queries, SQL in Teradata, and MS Access. Produced performance reports and implemented changes for improved reporting.
  • Used Shared Containers and created reusable components for local and shared use in the ETL process. Interacted with business users to conduct thorough requirements analysis.
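Batch jobs scheduled under AutoSys or the Talend Job Conductor, as described above, are typically wrapped in a small shell script that records each run's outcome. This is an illustrative sketch; the job name is hypothetical and `run_job` is a stand-in for a real Talend launcher:

```shell
#!/bin/sh
# Illustrative scheduler wrapper: run an ETL step, capture its exit status,
# and log the outcome so failed runs are easy to spot and restart.
JOB_NAME="jb_stage_orders"
LOG="batch_run.log"
: > "$LOG"

run_job() {
    # Replace with the real job launcher (e.g. the generated Talend run script)
    true
}

if run_job; then
    STATUS="SUCCESS"
else
    STATUS="FAILURE"
fi
printf '%s %s %s\n' "$(date '+%Y-%m-%d %H:%M:%S')" "$JOB_NAME" "$STATUS" >> "$LOG"
cat "$LOG"
```

The scheduler then keys restarts and alerts off the wrapper's exit status and log line rather than off the ETL tool directly.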

Confidential, NJ

ETL Developer/Data Analyst

Responsibilities:

  • Involved in the analysis of the existing credit card processing system, mapping phase according to functionality and data conversion procedure. Performed Gap Analysis to check the compatibility of the existing system infrastructure with the new business requirements.
  • Involved in defining the source to target ETL data mappings, business rules and data definitions. Responsible for defining the key identifiers for each mapping/interface.
  • Created tabular, matrix, chart, drill down reports, parametric, cascaded reports, dashboards and scorecards reports (SSRS) according to business requirement.
  • Knowledge of Teradata utility scripts such as FastLoad and MultiLoad to load data from various source systems into Teradata. Utilized SSIS (SQL Server Integration Services) to produce a Data Warehouse for reporting.
  • Implemented Security Features of Business Objects like row level, object level and report level to make the data secure. Conducted JAD sessions with management, SME, vendors, users and other stakeholders for open and pending issues.
  • Established traceability matrix using Rational Requisite Pro to trace completeness of requirements in different SDLC stages. Created and managed project templates, Use Case project templates, requirement types and traceability relationships in Requisite Pro.
  • Developed Systems Specifications document to define the impact of the new requirements on the existing system. Managed Scope and change throughout the SDLC process of the product.
  • Involved in performance measurement of ongoing data collection to determine if a program is implementing activities and achieving objectives.
  • Strong experience in creating tabular reports, charts, parameterized reports, sub-reports, matrix reports, lists, interactive reports according to the business needs in SSRS.
  • The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity. Used SQL for querying the database in the UNIX environment.
  • Involved in data mapping specifications to create and execute detailed system test plans.
  • Worked on Informatica Data Quality transformations like Address validator, Parser, Labeler, Match, Exception, Association, Standardizer and other significant transformations for data cleansing.
  • Debugged and implemented the best practices in mappings, sessions and workflows for data extraction and loading into Slowly Changing Dimensions type 1 and type 2.
  • Worked extensively with advanced analysis Actions, Table calculations, Parameters, Maps, Trend Lines, Background images, Groups, Hierarchies & Sets to create detail level summary reports and Dashboards.
  • Assisted in on-going process improvement efforts to ensure Test Planning, Execution, and Reporting is effective, efficient, standardized, coordinated, and integrated.
  • Suggested measures and recommendations to improve the current application performance. Reviewed and analysed the business environment and identified process improvements.
  • Assisted the Senior Business Analyst in writing Functional Requirement Specifications (FRS) and User Requirement Specifications (URS), and ensured that all artifacts follow corporate SDLC policies and guidelines. Developed the ETL Informatica mappings for importing data from the ODS into subsequent data marts.
  • Involved in managing a data modeling project from logical design through implementation of a Sybase database. Acted as User Acceptance Testing coordinator, monitored business testing, and interfaced daily with the development team regarding defect status and fixes.
  • Strong Data modelling experience using ER diagram, Dimensional data modelling, Working knowledge of ODBC, OLEDB, T-SQL, SQL, PL/SQL and scripting languages.
  • Worked with database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects and hierarchies.
  • Worked on Debugging and Performance tuning of targets, sources, mappings and sessions. Managed Scheduling of Tasks to run any time without any operator intervention.
  • Created various views in Tableau-like Tree-maps, Heat Maps, Scatter plots, Geographic maps, Line chart, Pie charts etc. Worked on Data warehousing applications using ETL and OLAP tools like Informatica, Cognos with Oracle, SQL Server.
  • Worked closely with the Enterprise Data Warehouse team and Business Intelligence (BI) Architecture team to understand repository objects that support the business requirement and process.
  • Responsible for creating mapping documents required for the ETL team. Created SQL, PL/SQL, SQL Loader control files, functions, procedures, and UNIX Shell scripts.
  • Worked with Memory management for the best throughput of sessions containing Rank, Lookup, Joiner, Sorter and Aggregator transformations and involved in pipeline partitioning. Develop and execute detailed ETL related functional, performance, integration and regression test cases, and prepare test documentation.
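The SQL*Loader control files and shell scripts mentioned above can be combined in a small generator script. This is a hedged sketch: the staging table `stg_customer`, the feed file, and its columns are hypothetical:

```shell
#!/bin/sh
# Illustrative sketch: build a SQL*Loader control file from a flat file's
# header line, so the column list always matches the incoming feed.
SRC=cust_feed.csv
CTL=cust_feed.ctl

# Made-up feed: a header line plus one data row
printf 'cust_id,cust_name,city\n100,Acme,Newark\n' > "$SRC"

# The header line already gives the comma-separated column list
COLS=$(head -1 "$SRC")

cat > "$CTL" <<EOF
LOAD DATA
INFILE '$SRC'
APPEND INTO TABLE stg_customer
FIELDS TERMINATED BY ','
TRAILING NULLCOLS
($COLS)
EOF
cat "$CTL"
```

The control file would then be passed to `sqlldr` (with a SKIP option for the header row); generating it per feed avoids hand-editing the column list when the layout changes.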
