ETL Architect / Data Analyst Resume

Rockville, Maryland

SUMMARY:

  • 13.5 years of IT experience in the analysis, design, development, testing, and implementation of Business Intelligence solutions using data warehouse/data mart design, ETL, OLTP systems, and OLAP client/server applications.
  • Worked with business users and source-system teams to analyze data and to identify and resolve data anomalies.
  • Created data reports for business users detailing data issues.
  • Experience in using Informatica client tools: PowerCenter Designer, Workflow Manager, Workflow Monitor, Repository Manager, and MFT.
  • Worked with cloud interfaces such as the Oracle Eloqua API, the Facebook API, and AWS.
  • Experience with Informatica PowerCenter 9.1, Informatica Cloud, Informatica Cloud Real Time, and IBM CDC 6.5, and with development of data warehouse/data mart applications.
  • Experienced with all phases of the Software Development Life Cycle (SDLC), including development, testing, migration, administration, and production support on platforms such as UNIX and Windows XP/7.
  • Experience in gathering, analyzing and documenting business requirements, functional requirements, designing and developing the mapping based on the requirements.
  • Experienced in implementing business rules by creating transformations (Expression, Aggregator, Unconnected and Connected Lookup, Router, Update Strategy, Filter, Joiner, Union) and developing mappings.
  • Strong knowledge of data warehousing concepts: data marts, Star Schema and Snowflake Schema modeling, fact tables, and dimension tables. Implemented the Slowly Changing Dimension methodology to retain the full history of account and transaction information (a minimal SQL sketch follows this list).
  • Experience building fact tables and dimension tables, and handling Slowly Changing Dimensions and surrogate keys.
  • Experience using Teradata utilities (SQL, FastLoad, MultiLoad, and TPump).
  • Experience using Netezza Bulk Loader utility.
  • Experience in writing complex SQL queries, simple stored procedures, PL/SQL and Unix Shell Scripting.
  • Excellent experience in performance tuning in Informatica PowerCenter and in query optimization.
  • Strong knowledge of relational databases such as Oracle 9i/10g/11g, SQL Server, and Teradata on platforms such as Windows, UNIX, and Linux, using GUI tools such as TOAD, SQL Developer, SQL*Plus, and Microsoft Visual.
  • Experience in OLAP tools such as Business Objects and Crystal Reports (Designer and Reporter). Trained and assisted end users in developing ad hoc reports.
  • Involved in unit testing, integration testing, and QA validation. Experienced in using Informatica Workflow Manager to create sessions, batches, and worklets, and to schedule workflows.
  • Extensive experience in database performance tuning and data imports/exports. Experience in gathering business requirements, establishing functional specifications, and translating them into design specifications.
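
For illustration, a minimal sketch of the SCD Type 2 pattern referenced above, written in Oracle-style SQL. All table, column, and sequence names (dim_account, stg_account, dim_account_seq) are hypothetical, not taken from any specific engagement:

    -- Step 1: expire the current dimension row when a tracked attribute changes.
    UPDATE dim_account d
    SET    d.eff_end_date = CURRENT_DATE - 1,
           d.current_flag = 'N'
    WHERE  d.current_flag = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   stg_account s
                   WHERE  s.account_id = d.account_id            -- natural key
                   AND    s.account_status <> d.account_status); -- tracked attribute

    -- Step 2: insert new versions (and brand-new accounts) with a fresh surrogate key.
    INSERT INTO dim_account
           (account_sk, account_id, account_status,
            eff_start_date, eff_end_date, current_flag)
    SELECT dim_account_seq.NEXTVAL,                              -- surrogate key
           s.account_id, s.account_status,
           CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg_account s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_account d
                       WHERE  d.account_id = s.account_id
                       AND    d.current_flag = 'Y');

Because step 1 flips the changed rows to 'N' first, step 2's NOT EXISTS picks up both changed and brand-new accounts in a single pass.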

TECHNICAL SKILLS:

ETL/DWH Tools: Informatica PowerCenter 9.5/8.6/8.x, DVO, PowerMart 5.1/5.0, IBM InfoSphere CDC 6.5, B2B DX/DT v8.0, Teradata 14.0 (BTEQ, FastLoad, TPump), Netezza

Databases: Oracle 11g, 10g, 9i, 8i

Cloud: Informatica Cloud, Informatica Cloud Real Time, Oracle Eloqua API, Facebook API and AWS.

Other Databases/Tools: MS Access, SQL Server 2005/2008, DB2, Teradata 14, Web Services, Business Intelligence

DBMS/Query Tools: Oracle SQL Developer, TOAD, Rapid SQL, WinSQL, SQL Assistant, SQL Navigator, PL/SQL Developer, PMON

Operating Systems: Microsoft Windows (Vista, XP, 2000, NT 4.0), OS/2, UNIX (Sun Solaris), HP-UX

Programming Lang: SQL, PL/SQL, UNIX Shell Scripting, T-SQL

BI/Reporting Tools: MicroStrategy, Business Objects 5.1/4.x, Cognos, Crystal Reports 8.5

Defect Tracking: Rational Clear Quest (CQ), HP ALM, Jira

Data Analysis: Data Design/Analysis, Business Analysis, User Requirement Gathering, User Requirement Analysis, Gap Analysis, Data Cleansing, Data Transformations, Data Relationships, Source Systems Analysis and Reporting Analysis.

PROFESSIONAL EXPERIENCE:

Confidential, Rockville, Maryland

ETL Architect / Data Analyst

Responsibilities:

  • Working as an onsite Informatica consultant.
  • Worked closely with the business analyst and the data warehouse architect to understand the source data and the needs of the warehouse.
  • Worked with various source systems to find data anomalies in the source data.
  • Generated reports identifying and summarizing data issues in the source data.
  • Worked on solutions to resolve the data anomalies.
  • Worked on solutions based on Informatica Cloud and Informatica Cloud Real Time.
  • Designed an OLAP system to store marketing data for the various health/welfare schemes run by Confidential.
  • Involved in designing of star schema based data model with dimensions and facts.
  • Created reusable ETL jobs driven by an exclusion calendar (specific exclusion dates, day-of-week/day-of-month exclusions, etc.); a gating-query sketch appears after this list.
  • Performed Data Profiling for the employee / contractor data coming from various departments of Confidential using Informatica Developer.
  • Created Interface between PeopleSoft and EDW to exchange data.
  • Designed reports in MicroStrategy.
  • Design and Implementation of Access layer on top of the EDW for Risk Analytics and Reporting Purposes.
  • Created mappings with heterogeneous sources like flat files, MS Access, Oracle databases and created targets in Oracle data warehouse using Informatica PowerCenter 9.6.
  • Created the ETL design to maintain the history of Employee/Contractor data as SCD Type 2.
  • Created Informatica PowerCenter mappings to populate the EDW with Employee/Contractor data from various sources, preserving SCD Type 2 history.
  • Performed data analysis using SQL and PL/SQL in Netezza.
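
As an illustration of the exclusion-calendar gating mentioned above, a pre-load check along these lines could decide whether the day's run should proceed. The exclusion_calendar table and its columns are hypothetical:

    -- Returns 0 when no exclusion rule matches today; the workflow runs only then.
    SELECT COUNT(*) AS excluded
    FROM   exclusion_calendar c
    WHERE  (c.rule_type = 'DATE'         AND c.excl_date  = TRUNC(SYSDATE))
    OR     (c.rule_type = 'DAY_OF_WEEK'  AND c.excl_value = TO_CHAR(SYSDATE, 'DY'))
    OR     (c.rule_type = 'DAY_OF_MONTH' AND c.excl_value = TO_CHAR(SYSDATE, 'DD'));

A pre-session task (or a decision task in the workflow) can evaluate the count and skip the load on excluded days.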

Confidential, New York City, New York

ETL Architect / Data Analyst

Responsibilities:

  • Worked as an onsite Informatica consultant.
  • Worked on OLTP analysis and design for performance and scalability. The OLTP system was required to capture marketing data for campaigns to attract more customers for Confidential.
  • Worked closely with the business analyst and Data warehouse architect to understand the functionality to be implemented in the ETL processes and needs of the Marketing Team which will receive the data.
  • Involved in designing various database tables well suited to the data requirements of the marketing team at Confidential.
  • Analyzed the source data and familiarized the marketing team with the various data issues; based on this analysis, the marketing team was able to provide more refined requirements.
  • Created automated ETL jobs to provide real-time data to the Confidential marketing team. Used Tidal as the tool for scheduling Informatica jobs.
  • Created reusable ETL jobs driven by an exclusion calendar (specific exclusion dates, day-of-week/day-of-month exclusions, etc.).
  • Worked on creating ETL jobs using the Oracle Eloqua API, the Facebook API, and AWS cloud services.
  • Design and Implementation of Access layer on top of the EDW for Risk Analytics and Reporting Purposes.
  • Created mappings with heterogeneous sources like flat files, MS Access, Oracle databases and created targets in Oracle data warehouse using Informatica PowerCenter 9.5.
  • Created Source to Target mapping from design and requirement documents.
  • Built reusable transformations for recurring business logics using mapplets and used them in multiple mappings.
  • Prototyped replication of a Sybase database from HP-UX to Windows NT and Windows 95 databases for a mobile computing solution.
  • Used Perl and awk to create automated data transformation (ETL) and replication.
  • Worked on various projects to pull data using the Bulk API and REST API from Oracle Eloqua objects such as EmailSend, EmailOpen, EmailClickthru, Contacts, FormSubmit, and Web Visit; also used Contact Lists as filters while pulling data.
  • Worked on various projects to push data using the REST API to Oracle Eloqua objects such as Contacts and CDOs.
  • Worked on projects to push email addresses to Facebook through API.
  • Used ILM as a bridge between data and storage management. Worked with the Informatica Data Quality 8.6.1 (IDQ) toolkit and DVO: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 8.6.1.
  • Designed and developed mappings using Source qualifier, Aggregator, Joiner, Lookup, Router, Sequence generator, Expression, Filter and Rank transformations.
  • Managed application issue resolution and changes in BI.
  • Extensively worked in Informatica B2B DT to extract data from unstructured files and online streaming sources.
  • Implemented a big data initiative that created lift in the paid subscription base.
  • Gained good knowledge of big data components such as Hadoop (Hadoop Distributed File System and MapReduce).
  • Installed and configured Hadoop distributions from various vendors, namely Hortonworks, Cloudera, and MapR.
  • Installed and configured different cluster monitoring tools like Ganglia and Nagios.
  • Exposure to Informatica B2B Data Transformation that supports transformation of structured, unstructured, and semi-structured data types while complying with the multiple standards which govern the data formats.
  • Assisted in upgrades of other BI tools from an Informatica perspective.
  • Designed and developed test cases for unit and system testing.
  • Modified Unix Shell Scripts for executing the Informatica workflows.
  • Created a global schema with common source and target tables to be used across different data marts, and implemented shortcuts to these objects within each mart folder so that changes to sources and targets are easy to maintain and control.
  • Created parameters and variables for incremental data loading effectively using Informatica workflow manager.
  • Used the Netezza bulk loader utility to export and load data to/from flat files; a sketch of this kind of bulk load appears after this list.
  • Utilized Informatica IDQ 8.6.1 to complete initial data profiling and to match and remove duplicate data.
  • Created event-raise and event-wait tasks for maintaining dependencies between workflows.
  • Efficiently interacted with UAT team and supported system testing and fixed bugs in mappings during QA phase.
  • Optimized and tuned mappings/sessions during load testing in the QA environment.
  • Designed, developed, and tested various enhancements.
  • Planned implementation of the change data capture (CDC) concept using Informatica PowerExchange, creating registration groups and registrations to extract only the latest changes rather than the entire data set.
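
A sketch of the Netezza bulk-load approach referenced above. External tables are one common way Netezza exposes flat files to SQL for bulk loads and unloads; the file path, delimiter, and table names here are illustrative:

    -- Map a pipe-delimited flat file as an external table (same layout as the staging table).
    CREATE EXTERNAL TABLE ext_contacts
      SAMEAS stg_contacts
      USING (
        DATAOBJECT ('/data/inbound/contacts.dat')
        DELIMITER  '|'
        SKIPROWS   1        -- skip the header row
        MAXERRORS  10       -- tolerate a few bad records
      );

    -- The bulk load is then a plain INSERT ... SELECT.
    INSERT INTO stg_contacts SELECT * FROM ext_contacts;

    -- Export runs the same machinery in reverse:
    -- CREATE EXTERNAL TABLE '/data/outbound/contacts.dat'
    --   USING (DELIMITER '|') AS SELECT * FROM stg_contacts;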

Environment: Informatica PowerCenter 9.5, Netezza, BI tools, Big Data, B2B DX/DT v8.0, IBM InfoSphere CDC 6.5, Oracle 10g, XML, SQL*Plus, PL/SQL, SQL, Shell Scripts, Oracle Eloqua API, Facebook API, AWS

Confidential, Norwalk, CT

ETL Architect / Sr. Informatica Developer

Responsibilities:

  • Worked as an onsite Informatica consultant.
  • Worked closely with the business analyst and business users to understand the source data and the needs of the warehouse. Documented requirements that were not captured in the initial version of the requirements document.
  • Involved in designing of database schemas and facts to provide effective Financial Risk Related information to the Business Users.
  • Design and Implementation of Access layer on top of the EDW for Risk Analytics and Reporting Purposes.
  • Created mappings with heterogeneous sources like flat files, MS Access, Oracle databases and created targets in Oracle data warehouse using Informatica PowerCenter 9.5.
  • Created Source to Target mapping from design and requirement documents.
  • Built reusable transformations for recurring business logics using mapplets and used them in multiple mappings.
  • Prototyped replication of a Sybase database from HP-UX to Windows NT and Windows 95 databases for a mobile computing solution.
  • Used Perl and awk to create automated data transformation (ETL) and replication.
  • Applied ILM best practices, using a more sophisticated data management strategy backed by best-of-breed ILM tools.
  • Used ILM as a bridge between data and storage management.
  • Hands-on experience with IBM InfoSphere CDC for capturing multiple changes.
  • Worked on Informatica MFT for file-transfer monitoring and Autosys for job scheduling. Worked with the Informatica Data Quality 8.6.1 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 8.6.1.
  • Designed and developed mappings using Source qualifier, Aggregator, Joiner, Lookup, Router, Sequence generator, Expression, Filter and Rank transformations.
  • Managed application issue resolution and changes in BI.
  • Have done POC on other Informatica Products like B2B Data Exchange, B2B Data Transformation, Metadata Manager, Data Quality, Data Explorer, and Data Validation Option.
  • Extensively worked in Informatica B2B DT to extract data from unstructured files and online streaming sources.
  • Implemented a big data initiative that created lift in the paid subscription base.
  • Gained good knowledge of big data components such as Hadoop (Hadoop Distributed File System and MapReduce).
  • Installed and configured Hadoop distributions from various vendors, namely Hortonworks, Cloudera, and MapR.
  • Exposure to Informatica B2B Data Transformation that supports transformation of structured, unstructured, and semi-structured data types while complying with the multiple standards which govern the data formats.
  • Assisted in upgrades of other BI tools from an Informatica perspective.
  • Designed and developed test cases for unit and system testing.
  • Modified Unix Shell Scripts for executing the Informatica workflows.
  • Created a global schema with common source and target tables to be used across different data marts, and implemented shortcuts to these objects within each mart folder so that changes to sources and targets are easy to maintain and control.
  • Created parameters and variables for incremental data loading effectively using Informatica workflow manager.
  • Used the BTEQ, FastExport, FastLoad, and MultiLoad Teradata utilities to export and load data to/from flat files; an upsert sketch of the load step appears after this list.
  • Utilized Informatica IDQ 8.6.1 to complete initial data profiling and to match and remove duplicate data.
  • Created event-raise and event-wait tasks for maintaining dependencies between workflows.
  • Efficiently interacted with UAT team and supported system testing and fixed bugs in mappings during QA phase.
  • Optimized and tuned mappings/sessions during load testing in the QA environment.
  • Designed, developed, and tested various enhancements.
  • Planned implementation of the change data capture (CDC) concept using Informatica PowerExchange, creating registration groups and registrations to extract only the latest changes rather than the entire data set.
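
For illustration, the insert/update pairing that a MultiLoad IMPORT task applies can be expressed as a single Teradata MERGE; all object names here are hypothetical:

    -- Upsert the day's delta into the warehouse table.
    MERGE INTO edw.customer AS tgt
    USING stg.customer_delta AS src
      ON (tgt.customer_id = src.customer_id)      -- assumed primary index column
    WHEN MATCHED THEN UPDATE
         SET customer_name = src.customer_name,
             updated_ts    = CURRENT_TIMESTAMP
    WHEN NOT MATCHED THEN INSERT
         (customer_id, customer_name, updated_ts)
         VALUES (src.customer_id, src.customer_name, CURRENT_TIMESTAMP);

In practice the utilities named above apply this logic outside the SQL engine for throughput; the MERGE is simply the set-based equivalent of the same load logic.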

Environment: Informatica PowerCenter 9.5, Teradata 14.0 (BTEQ, FastLoad, TPump), BI tools, Big Data, B2B DX/DT v8.0, IBM InfoSphere CDC 6.5, Oracle 11g, Autosys 4.5, XML, SQL*Plus, PL/SQL, SQL, Shell Scripts

Confidential, New York

ETL Architect / Sr. Informatica Developer

Responsibilities:

  • Worked as an onsite Informatica consultant.
  • Worked closely with the business team and ETL teams to design a data warehouse which provides useful information for Marketing Teams and for the Human Resources Team.
  • Involved in designing of Reporting Access Layer for the Credit Card Risk management team.
  • Created many different mappings to implement Change Data Capture in the Risk data warehouse; a delta-detection sketch appears after this list.
  • Created Source to Target mapping from design and requirement documents.
  • Built reusable transformations for recurring business logics using mapplets and used them in multiple mappings.
  • Used shell scripting to create automated data transformation (ETL) and replication.
  • Worked with Informatica Data Quality 8.6.1 (IDQ) toolkit, Analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities of IDQ 8.6.1.
  • Used Trillium Data Cleansing tool to obtain address information of Customers in correct format. The cleansed address was used as feed for the Customer Mastering Application.
  • Designed and developed mappings using Source qualifier, Aggregator, Joiner, Lookup, Router, Sequence generator, Expression, Filter and Rank transformations.
  • Managed application issue resolution and changes in BI.
  • Extensively worked in Informatica B2B DT to extract data from unstructured files and online streaming sources.
  • Worked on the enhancements to the OLTP systems to capture information about credit card transactions.
  • Exposure to Informatica B2B Data Transformation that supports transformation of structured, unstructured, and semi-structured data types while complying with the multiple standards which govern the data formats.
  • Assisted in upgrades of other BI tools from an Informatica perspective.
  • Designed and developed test cases for unit and system testing.
  • Modified Unix Shell Scripts for executing the Informatica workflows.
  • Created a global schema with common source and target tables to be used across different data marts, and implemented shortcuts to these objects within each mart folder so that changes to sources and targets are easy to maintain and control.
  • Created parameters and variables for incremental data loading effectively using Informatica workflow manager.
  • Used the BTEQ, FastExport, FastLoad, and MultiLoad Teradata utilities to export and load data to/from flat files.
  • Utilized Informatica IDQ 8.6.1 to complete initial data profiling and to match and remove duplicate data.
  • Created event-raise and event-wait tasks for maintaining dependencies between workflows.
  • Efficiently interacted with UAT team and supported system testing and fixed bugs in mappings during QA phase.
  • Optimized and tuned mappings/sessions during load testing in the QA environment.
  • Used UC4 as the tool for scheduling Informatica jobs.
  • Planned implementation of the change data capture (CDC) concept using Informatica PowerExchange, creating registration groups and registrations to extract only the latest changes rather than the entire data set.
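
A minimal sketch of the row-comparison logic behind the CDC mappings mentioned above, with hypothetical staging and warehouse tables. Each staged row is flagged before the apply step:

    -- Classify each staged transaction against the warehouse copy.
    SELECT s.txn_id,
           CASE
             WHEN t.txn_id IS NULL             THEN 'INSERT'
             WHEN s.txn_amount <> t.txn_amount
               OR s.txn_status <> t.txn_status THEN 'UPDATE'
             ELSE 'NOCHANGE'
           END AS cdc_flag
    FROM   stg_card_txn s
    LEFT JOIN dw_card_txn t
           ON t.txn_id = s.txn_id;

In the actual mappings this comparison happens per row through lookups and an Update Strategy transformation; the query is the set-based view of the same decision.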

Environment: Informatica PowerCenter 9.1, Teradata 14.0 (BTEQ, FastLoad, TPump), BI tools, B2B DX/DT v8.0, Oracle 11g, Autosys 4.5, XML, SQL*Plus, PL/SQL, SQL, Shell Scripts

Confidential

Sr. Informatica Developer

Responsibilities:

  • Worked as Senior Informatica Developer.
  • Worked with various databases, extracting files and loading them into different databases.
  • Designing the ETLs and conducting review meets.
  • Played a lead role in managing the offshore team.
  • Worked mainly on troubleshooting errors that occurred during the loading process.
  • Used Teradata utilities (FastLoad, MultiLoad, and TPump) to load data into the Teradata data warehouse from Oracle and DB2 databases.
  • Worked with Informatica Application Information Lifecycle Management (ILM) software to enable the IT organization to handle data growth cost-effectively.
  • Mentored team members in using new features of Informatica PowerCenter 8.6.1 and PowerExchange.
  • Worked with Informatica PowerExchange tools to give business users on-demand access to data.
  • Transferred the mainframe data using Direct connect.
  • Used Erwin, Power Designer and Deft for design work. Created stored procedures, triggers, tables, indexes, rules, etc. as needed to support extraction, transformation and load (ETL) processes. Created a replication process to keep re-engineered database in sync with legacy database.
  • Replication and promotion of the repository within the server and onto a different server.
  • Extensively worked on creating mapping parameters and variables which are used in various workflows for reusability
  • Worked with various active transformations in Informatica Power Center like Filter Transformation, Aggregator Transformation, Joiner Transformation, Rank Transformation, Router Transformation, Sorter Transformation, Source Qualifier, and Update Strategy Transformation
  • Extensively worked with various Passive transformations in Informatica Power Center like Expression Transformation, and Sequence Generator
  • Used Informatica real-time features with Web Services, Salesforce.com, and Change Data Capture.
  • Extensively used SQL Server views when creating and altering tables in the database.
  • Extensively worked with Slowly Changing Dimensions Type1, Type2, for Data Loads
  • Responsible for Performance Tuning at the Mapping Level, Session Level, Source Level, and the Target Level
  • Identified and eliminated duplicates in datasets through the IDQ 8.6.1 components Edit Distance, Jaro Distance, and Mixed Field Matcher; this enables a single view of customers and helps control mailing-list costs by preventing duplicate mail pieces.
  • Designed, developed, implemented and maintained Informatica PowerCenter and IDQ 8.6.1 application for matching and merging process.
  • Involved in writing stored procedures in SQL and extensively used the Stored Procedure transformation in many scenarios as per requirements.
  • In-depth knowledge in PowerCenter web services hub configuration and maintenance.
  • Extensively worked with Source Qualifier Transformation to join the homogeneous sources
  • Extensively worked with Joiner Transformation to join the heterogeneous sources
  • Extensively worked with both Connected and Un-Connected Lookups
  • Extensively worked with Look up Caches like Shared Cache, Persistent Cache, Static Cache, and Dynamic Cache to improve the performance of the lookup transformations
  • Worked with re-usable objects like Re-Usable Transformation and Mapplets
  • Extensively worked with aggregate functions like Avg, Min, Max, First, Last in the Aggregator Transformation
  • Extensively used SQL Override function in Source Qualifier Transformation
  • Extensively used Normal Join, Full Outer Join, Detail Outer Join, and Master Outer Join in the Joiner Transformation.
  • Extensively worked with Incremental Loading using Parameter Files, Mapping Variables, and Mapping Parameters.
  • Wrote queries and procedures, created indexes and primary keys, and performed database testing.
  • Defects were tracked, reviewed and analyzed.
  • Implemented various Performance Tuning techniques on Sources, Targets, Mappings, and Workflows.
  • Used Source Analyzer and Warehouse designer to import the source and target database schemas, and the Mapping Designer to map the sources to the target.
  • Involved in performance tuning and monitoring (both SQL and Informatica) considering the mapping and session performance issues
  • Used workflow manager to create workflows, sessions, and also used various tasks like command, email.
  • Created Workflows, Worklets and Tasks to schedule the loads at required frequency using Workflow Manager.
  • Modified existing Unix Shell Scripts as per the requirements.
  • Utilized SQL*Loader to import data.
  • Designed and implemented Oracle 10g database objects to support interfaces (tables, views, materialized views).
  • Wrote PL/SQL packages, stored procedures, and functions using newer PL/SQL features such as collections, objects, object tables, nested tables, external tables, REF cursors, MERGE, INTERSECT, MINUS, BULK COLLECT INTO, and dynamic SQL; a bulk-processing sketch follows this list.
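
A compact sketch of the BULK COLLECT / FORALL pattern referenced above; cursor, table, and column names are illustrative:

    DECLARE
      CURSOR c_src IS
        SELECT account_id, balance FROM stg_account;
      TYPE t_src_tab IS TABLE OF c_src%ROWTYPE;
      l_rows t_src_tab;
    BEGIN
      OPEN c_src;
      LOOP
        FETCH c_src BULK COLLECT INTO l_rows LIMIT 1000;  -- fetch in batches
        EXIT WHEN l_rows.COUNT = 0;

        FORALL i IN 1 .. l_rows.COUNT                     -- one SQL round trip per batch
          UPDATE dw_account
          SET    balance    = l_rows(i).balance,
                 updated_dt = SYSDATE
          WHERE  account_id = l_rows(i).account_id;

        COMMIT;
      END LOOP;
      CLOSE c_src;
    END;
    /

Batching through collections avoids row-by-row context switches between the PL/SQL and SQL engines, which is the usual motivation for this pattern.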

Environment: Informatica PowerCenter 9.1, Oracle 11g, Web services, Teradata 14, MS Access 2010, SQL*Loader, UNIX, Winscp, Putty, Erwin, SQL, PL/SQL

Confidential

ETL Developer

Responsibilities:

  • Worked as an Informatica Developer.
  • Extensively worked in data Extraction, Transformation and Loading from Source to target.
  • Involved in analysis, design & testing environment.
  • Used Source Analyzer and Warehouse Designer to import the source and target database schemas, and the mapping designer to map source to the target.
  • Used Transformation Developer to create the Joiner, Filter, Router, Lookup, Expression, Aggregator, Update Strategy, and Stored Procedure transformations used in mappings.
  • Created and executed sessions using Workflow Manager.
  • Developed reusable mapplets using mapplet designer.
  • Understanding existing business model and customer requirements.
  • Involved in preparation and execution of the unit, integration and end to end test cases.
  • Created multiple universes and resolved loops by creating table aliases and contexts.
  • Used session partitions, Dynamic cache memory and Index caches for improving performance of Informatica server.
  • Extracted data from SQL server Source Systems and loaded into Oracle Target tables.
  • Involved in writing shell scripts for automating pre-session and post-session processes and batch execution at the required frequency using PowerCenter Server Manager.
  • Involved in the loading and Scheduling of jobs to be run in the Batch process.
  • Optimized and performed Tuning in mappings to achieve higher response times.
  • Involved in the migration of existing ETL process to Informatica Power center.
  • Created effective test data and developed thorough unit test cases to ensure successful execution of the data loading processes; a reconciliation-style check is sketched after this list.
  • Created reports using Business Objects functionality such as queries, slice and dice, drill down, functions, and formulas.
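
For illustration, a reconciliation query of the kind used to validate a load during unit testing; table and column names are hypothetical:

    -- Compare row counts and a control total between source and target.
    SELECT 'SOURCE' AS side, COUNT(*) AS row_cnt, SUM(order_amount) AS amt_total
    FROM   src_orders
    UNION ALL
    SELECT 'TARGET', COUNT(*), SUM(order_amount)
    FROM   dw_orders;

Matching counts and totals indicate a clean load; a mismatch points at rejected or duplicated rows.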

Environment: Informatica PowerCenter 7.1, Windows 2000, Solaris (SunOS 5.8), Oracle 9i, TOAD 7.6, PuTTY, FileZilla, SQL*Plus
