
Sr. Data Warehouse Consultant Resume


Addison, TX

SUMMARY

  • 11+ years of experience in Information Technology, including database development and ETL tools, with a strong background and 2 years of knowledge in Big Data ecosystem technologies.
  • Hands on experience in application development using Java, RDBMS, and Linux shell scripting.
  • Good experience with data warehouse concepts and principles (Kimball/Inmon) - Star Schema, Snowflake, SCD, Surrogate Keys, Normalization/Denormalization.
  • Designed and Developed complex mappings like SCD type1 and type2 applications using Informatica Power Centre Designer.
  • Experience in integration of various data sources with multiple relational databases like Oracle, Teradata and SQL Server, and worked on integrating data from flat files, both fixed-width and delimited.
  • Have extensively worked in developing ETL for supporting Data Extraction, transformations and loading using Informatica Power Center.
  • Experience in using the Informatica command line utilities like pmcmd to execute workflows in non-windows environments.
  • Good knowledge of Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
  • Excellent understanding/knowledge of Hadoop architecture and various components such as HDFS, JobTracker, TaskTracker, NameNode, DataNode and the MapReduce programming paradigm.
  • Good knowledge of IBM InfoSphere.
  • Good Knowledge in configuring and using Hadoop ecosystem components like Hadoop MapReduce, HDFS, Oozie, Hive, Sqoop and Pig.
  • Good knowledge of Flume.
  • Good knowledge of analyzing data using HiveQL, Pig Latin and custom MapReduce programs in Java.
  • Extensively used Informatica Repository Manager and Workflow Monitor.
  • Good Knowledge in Java.
  • Good Knowledge in Informatica MDM.
  • Experience in debugging mappings. Identified bugs in existing mappings by analyzing the data flow and evaluating transformations.
  • Hands on experience in Performance Tuning of sources, targets, transformations and sessions.
  • Worked on Business Objects 6.5, building Universes and InfoView reports.
  • Good experience in documenting the ETL process flow for better maintenance and analyzing the process flow.
  • Worked with Oracle Stored Procedures, Triggers, Cursors, Indexes and Functions.
  • Good Knowledge in optimizing database performance.
  • Highly motivated to take independent responsibility, with the ability to contribute as a productive team member.
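
The SCD Type 2 work mentioned above generally follows an expire-and-insert pattern: when a tracked attribute changes, the current dimension row is closed out and a new current version is inserted. A minimal Python sketch of that pattern (the `apply_scd2` name and record layout are illustrative, not from any specific project):

```python
from datetime import date

def apply_scd2(dimension, incoming, today=None):
    """Expire changed rows and insert new versions (SCD Type 2).

    dimension: list of dicts with keys id, name, eff_from, eff_to, current
    incoming:  list of dicts with the latest source values per natural key
    """
    today = today or date.today()
    current = {row["id"]: row for row in dimension if row["current"]}
    for rec in incoming:
        old = current.get(rec["id"])
        if old is None:
            # brand-new key: insert as the first current version
            dimension.append({**rec, "eff_from": today, "eff_to": None, "current": True})
        elif old["name"] != rec["name"]:        # tracked attribute changed
            old["eff_to"] = today               # close out the old version
            old["current"] = False
            dimension.append({**rec, "eff_from": today, "eff_to": None, "current": True})
        # unchanged rows are left alone
    return dimension
```

In Informatica terms, the comparison corresponds to a Lookup plus Expression, and the two branches to Update Strategy flags (DD_UPDATE for the expired row, DD_INSERT for the new version).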

TECHNICAL SKILLS

ETL Tools: Informatica 9.5.1/8.6.1/8.5.1/7.1/6.2/5.1 (Power Center/Power Mart)

Hadoop /Big Data: HDFS, MapReduce, Hive, Pig, HBase, Sqoop, Flume, Oozie

Data Modeling: Erwin 4.0/3.5, Star Schema Modelling, Snowflake Modelling

Databases: Oracle 11g/10g/9i/8i, MS SQL Server 2005/2000, DB2, Teradata V2R6/V2R5

SAP Tools: ECC 6.0, 4.X, BW3.X

OLAP Tools: Business Objects 6.5/XI R1/R2

Programming Languages: C, C++, Java, ASP, C# and .NET 3.5

Languages: SQL, PL/SQL, Unix Shell Script, Visual Basic

Tools: Toad, SQL*Loader

Operating Systems: Windows 2003/2000/NT, AIX, Sun Solaris, Linux

PROFESSIONAL EXPERIENCE

Confidential, Addison, TX

Sr. Data Warehouse Consultant

Responsibilities:

  • Designed ETL process, load strategy and maintained the release related documents.
  • Extensively worked on Informatica XML source, XML Transformations and XML targets.
  • Created UNIX shell scripts to handle XML files as per the requirement.
  • Implemented data loading using the Oracle SQL*Loader utility.
  • Responsible for creating, importing all the required sources and targets to the shared folder.
  • Implemented Type 2 Slowly Changing Dimensions Methodology to keep track of historical data.
  • Maintained Informatica code using version control.
  • Performed performance tuning to increase throughput at both the mapping and session levels, and optimized SQL queries as well.
  • Extracted data from heterogeneous sources, transformed and loaded into different databases like Oracle, DB2 and flat files.

Environment: Informatica Power center 9.5.1, UNIX, Salesforce(SFDC), DB2, Oracle.

Confidential, Bloomington, IL

Sr. Data Warehouse Consultant/Tech lead

Responsibilities:

  • Involved in business analysis and technical design sessions with business and technical staff to develop requirements documents and ETL specifications.
  • Developed Complex transformations, Mapplets using Informatica to Extract, Transform and load data into Datamarts, Enterprise Data warehouse (EDW).
  • Created mappings using transformations like Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator.
  • Extracted data from flat files and loaded it into Salesforce Standard objects and Custom objects using Informatica.
  • Implemented Mass delete of the salesforce records using Informatica.
  • Implemented Field Level security for sensitive data holder fields.
  • Involved in analyzing system failures, identifying root causes, and recommended course of actions.
  • Documented the systems processes and procedures for future references.
  • Gained good exposure on the State Farm Hadoop POC team to Hadoop ecosystem components like Hadoop MapReduce, HDFS, Oozie, HiveQL, Sqoop and Pig.
  • Good knowledge of Flume.
  • Good knowledge of integrating Oozie with the rest of the Hadoop stack, supporting several types of Hadoop jobs out of the box (such as Java MapReduce, Pig, Hive and Sqoop) as well as system-specific jobs.

Environment: Informatica/PowerExchange 9.5.1, UNIX, Salesforce, Hive, Sqoop, Pig, Oozie, Cloudera Manager, Java, Linux, CentOS
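
A typical Sqoop ingestion of the kind used on the POC can be captured by assembling the `sqoop import` command line before handing it to a scheduler. A sketch using standard Sqoop flags, with the connection details and table names purely hypothetical:

```python
def sqoop_import_cmd(jdbc_url, table, target_dir, mappers=4, user=None):
    """Assemble a `sqoop import` command line (built, not executed, here)."""
    cmd = ["sqoop", "import",
           "--connect", jdbc_url,
           "--table", table,
           "--target-dir", target_dir,
           "--num-mappers", str(mappers)]
    if user:
        cmd += ["--username", user]
    return cmd
```

In practice a wrapper would pass this list to the shell (or an Oozie Sqoop action would carry the same arguments); building it as a list keeps each flag auditable.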

Confidential, Houston, TX

Sr. ETL Developer

Responsibilities:

  • Involved in and designed the System Development Life Cycle, including conducting the preliminary investigation and preparing technical specs.
  • Extracted data from DB2 tables and flat files and loaded it into Salesforce Standard objects and Custom objects using Informatica.
  • Implemented mass delete of Salesforce records using Informatica.
  • Migrated Custom settings data from CSV files to SFDC using Data Loader
  • Involved in the Support activities and provided the design solutions for enhancements.
  • Designed and developed the logic for handling slowly changing dimension table loads by flagging records using the Update Strategy transformation to populate the desired records.
  • Involved in performance tuning of the mappings, sessions and workflows.
  • Performed Type1 and Type2 mappings

Environment: Informatica 9.5.1, Unix, Salesforce and Business Objects

Confidential, Houston, TX

Sr.Informatica Developer

Responsibilities:

  • Involved in business analysis and technical design sessions with business and technical staff to develop requirements documents and ETL specifications.
  • Running conversion validations to keep track of legacy changes for conversion
  • Extracted data from relational databases DB2, Oracle, Flat Files, SAP and OpenLink Endur
  • Developed data Mappings, Transformations between source systems and warehouse.
  • Managed DQR team data schemas, ensuring that day-to-day data change requests are communicated to the Data Architecture team and are implemented in DDL drops.
  • Maintained Informatica code in SVN and communicated with the CM team during deployments.
  • Worked on improving performance of the SAP Magellan extractions and created implementation process document, Test plan, Test Result and Compliance Check list.
  • Implemented Aggregate, Router, Rank, Joiner, Expression, Union, Lookup and Update Strategy transformations.
  • Partnered with data stewards to provide summary results of data quality analysis, which were used to make decisions regarding how to measure business rules and the quality of the data.
  • Documented at a functional level how the procedures work within the data quality (IDQ) applications.
  • Used debugger to test the mapping and fixed the bugs.
  • Troubleshot problems and held discussions with other users to assess impact and fix bugs.
  • Extensively used various Performance tuning techniques to improve the session performance.
  • Loaded the data from stage tables to the OpenLink COV environments using OpenLink scripts.

Environment: Informatica Power Center 8.6.1/8.x, IDQ 9.1.0, Oracle 11g, SQL, PL/SQL, RapidSQL, OpenLink, SAP R/3, DB2, UNIX, Rational Suite, Business Objects XI R2
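
The IDQ matching and duplicate-removal work follows the general pattern of fuzzy-matching normalized strings against already-accepted records. A simplified Python sketch of that idea (the threshold, normalization and `dedupe` helper are illustrative, not IDQ's actual matching algorithm):

```python
from difflib import SequenceMatcher

def dedupe(records, threshold=0.9):
    """Keep the first of each group of near-duplicate name strings.

    Names are lower-cased and whitespace-normalized before comparison;
    a similarity ratio at or above `threshold` counts as a duplicate.
    """
    kept = []
    for rec in records:
        norm = " ".join(rec.lower().split())
        if not any(SequenceMatcher(None, norm, k).ratio() >= threshold
                   for k in kept):
            kept.append(norm)
    return kept
```

Real IDQ plans layer standardization, token-based keys and weighted match rules on top of this, but the profile-normalize-match-survive flow is the same.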

Confidential, Bloomington, IL

Sr.Informatica Developer

Responsibilities:

  • Documented user requirements, Translated requirements into system solutions and developed implementation plan and schedule.
  • Extracted data from relational databases DB2, Oracle and Flat Files.
  • Developed Complex transformations, Mapplets using Informatica to Extract, Transform and load data into Datamarts, Enterprise Data warehouse (EDW) and Operational data store (ODS).
  • Developed data Mappings, Transformations between source systems and warehouse
  • Performed Type1 and Type2 mappings
  • Managed the metadata associated with the ETL processes used to populate the data warehouse.
  • Implemented Aggregate, Filter, Join, Expression, Lookup and Update Strategy transformations.
  • Used debugger to test the mapping and fixed the bugs.
  • Created sessions, sequential and concurrent batches for proper execution of mappings using server manager.
  • Used Power Center Data Masking Options (Random Masking, Blurring, SSN, Credit Card) in mappings.
  • Utilized Informatica IDQ 8.6.1 to complete initial data profiling and matching/removing of duplicate data.
  • Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.
  • Responsible to design, develop and test the software (Informatica, PL SQL, UNIX shell scripts) to maintain the data marts (Extract data, Transform data, Load data).
  • Organized data in the report Inserting Filters, Sorting, Ranking and highlighting data.
  • Included data from different sources like Oracle Stored Procedures and Personal data files in the same report.
  • Responsible for daily verification that all scripts, downloads, and file copies were executed as planned, troubleshooting any steps that failed, and providing both immediate and long-term problem resolution.

Environment: Informatica Power Center 8.6.1/8.x, Power Exchange, Oracle 11g, SQL, PL/SQL, TOAD, Erwin, SQL Server 2005/2008, XML, T-SQL, Shell Scripts, SQL Navigator, Rational Suite, Business Objects XI R2, IDQ 8.6.1
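
The Power Center masking options listed above (random masking, blurring, SSN, credit card) amount to replacing sensitive values with realistic substitutes while preserving format. A hedged Python approximation of the two techniques named, not Informatica's actual implementation:

```python
import random

def mask_ssn(ssn, rng=random):
    """Random masking: replace each digit with a random one,
    preserving the NNN-NN-NNNN layout."""
    return "".join(str(rng.randint(0, 9)) if c.isdigit() else c
                   for c in ssn)

def blur(value, pct=0.1, rng=random):
    """Blurring: shift a numeric value by up to +/- pct so it stays
    realistic but no longer matches the source."""
    return value * (1 + rng.uniform(-pct, pct))
```

Passing a seeded `random.Random` makes masked test data reproducible across runs, which matters when masked extracts feed regression tests.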

Confidential, Houston, TX

Informatica Developer

Responsibilities:

  • Created Technical design specification documents based on the functional design documents
  • Participated in data modeling (logical and physical) for designing the Data Marts for finance, Sales, Billing.
  • Developed standards and procedures for transformation of data as it moves from source systems to the data warehouse.
  • Responsible for development, support and maintenance of the ETL (Extract, Transform and Load) processes using Informatica PowerCenter 8.5.1/7.1.4.
  • Understood the domain and node configuration and used the Informatica Integration Service to run workflows in 8.5.1.
  • Involved in source data profiling & source system analysis.
  • Developed SQL code for data validations and data computation processes on the source DB2 transaction system and on the target warehouse.
  • Involved in massive data cleansing prior to data staging.
  • Developed shell scripts for job execution and automation on server side.
  • Define strategy for ETL processes, procedures and operations
  • Prepared a handbook of standards and Documented standards for Informatica code development.
  • Administrated users and user profiles and maintained the repository server.
  • Tuned complex mappings at the source, target, mapping and session levels.
  • Extensively used pmcmd commands on command prompt and executed Unix Shell scripts to automate workflows and to populate parameter files
  • Extensively used transformations like Router, Lookup, Source Qualifier, Joiner, Expression, Aggregator and Sequence Generator in extracting data in compliance with the business logic developed.
  • Involved in performance tuning of the ETL processes.
  • Created workflows and scheduled the jobs.
  • Debugging invalid mappings using break points, testing of stored procedures and functions, testing of Informatica sessions, batches and the target Data.
  • Developed shell scripts, PL/SQL procedures, for creating/dropping of table and indexes of performance for pre and post session management.
  • Used SQL tools like TOAD 9.5 to run SQL queries and validate the data in warehouse.
  • Extensively used Maestro tool for scheduling batch jobs.

Environment: Informatica Power Center 8.5.1/7.1.4, Business Object XI, SAP R/3,SAP BW, Oracle 10g, Mainframes, DB2, MS SQL Server 2005, Toad, PL/SQL, SQL, XML, AIX

Confidential, Middletown, NJ

Informatica Developer

Responsibilities:

  • Involved in full life cycle design and development of Data warehouse.
  • Implemented the new SQL transformation in Informatica 8 for executing SQL statements.
  • Installed and configured Informatica repository server
  • Created and maintained users and user profiles
  • Maintained repository server and scheduled Workflows
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Involved in business analysis and technical design sessions with business and technical staff to develop Entity Relationship/data models, requirements documents, and ETL specifications.
  • Used Informatica Designer to create complex mappings, transformations, source and target tables.
  • Administer the repository by creating folders and logins for the group members and assigning necessary privileges.
  • Used Workflow Manager for creating, validating, testing and running sequential and concurrent sessions, scheduling them to run at specified times, and reading data from different sources and writing it to target databases.
  • Used Exception handling logic in all mappings to handle the null values or rejected rows
  • Query Optimization for improving the performance of the data warehouse.
  • Developed standard and re-usable mappings and mapplets using various transformations like expression, aggregator, joiner, source qualifier, router, lookup Connected/Unconnected, and filter.
  • Involved in performance tuning of the mappings, sessions and workflows.
  • Performed Type1 and Type2 mappings
  • Used session parameters and parameter files to reuse sessions for different relational sources or targets.
  • Used SQL*Loader for loading bulk data.
  • Developed PL/SQL and UNIX Shell Scripts using VI editor.
  • Created documentation on mapping designs and ETL processes.
  • Maintain Development, Test and Production mapping migration Using Repository Manager.
  • Developed Korn shell scripts for Informatica pre-session, post session procedures
  • Involved in Unit Testing and User Acceptance Testing to check whether data extracted from different source systems was loading into the target according to the user requirements.

Environment: Informatica Power Center 7.1.3/8.0, Mainframes, DB2, Oracle 10g, SQL, PL/SQL, TOAD, Erwin, SQL SERVER 2005, XML, T-SQL, Shell Scripts, SQL Navigator, Rational Suite, Crystal reports XI, AIX
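
The exception-handling logic for null values and rejected rows mentioned above can be illustrated as a simple split: rows missing any required field are routed to a reject stream while the rest continue to the target. A minimal sketch (the field names and `split_rejects` helper are hypothetical):

```python
def split_rejects(rows, required):
    """Route rows missing any required field to a reject list.

    Returns (good_rows, rejected_rows); in a mapping this corresponds
    to a Router sending null-key rows to a reject/bad-file target.
    """
    good, rejects = [], []
    for row in rows:
        if any(row.get(col) in (None, "") for col in required):
            rejects.append(row)
        else:
            good.append(row)
    return good, rejects
```

Keeping the rejects as data (rather than dropping them) preserves an audit trail for reprocessing once the source is corrected.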

Confidential, Detroit, MI

Informatica Developer

Responsibilities:

  • Involved in full life cycle development including Design, ETL strategy, troubleshooting Reporting, and Identifying facts and dimensions.
  • Prepared the required application design documents based on functionality required
  • Installed and configured Informatica 7.1
  • Migrated repository and mapping from Informatica 6.2 to 7.1
  • Created and Maintained users and User profiles
  • Designed the ETL processes using Informatica to load data from SAP R/3, Oracle, Flat Files (Fixed Width), and Excel files to staging database and from staging to the target Teradata Warehouse database.
  • Wrote the Algorithm for ETL (Extract, Transform and Load) team for Data Validation.
  • Provided the BTEQ scripts to validate the Edge Model and to generate Serial Key.
  • Automated Test Scenarios using BTEQ scripts to validate Source System Data against the Data Mart Views generated by Edge.
  • Wrote queries, procedures and functions that are used as part of different application modules.
  • Optimized Teradata SQL and TSQL queries for better performance.
  • Extracted and loaded data using different tools in Teradata like MultiLoad, FastExport, FastLoad, OLE Load and BTEQ.
  • Implemented the best practices for the creation of mappings, sessions and workflows and performance optimization.
  • Created mappings using transformations like Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator.
  • Designed and developed the logic for handling slowly changing dimension table loads by flagging records using the Update Strategy transformation to populate the desired records.
  • Involved in cleansing and extraction of data and defined quality process for the warehouse.
  • Involved in performance tuning and optimization of Informatica mappings and sessions using features like partitions and data/index cache to manage very large volume of data.
  • Documented ETL test plans, test cases, test scripts, test procedures, assumptions, and validations based on design specifications for unit testing, system testing, expected results, preparing test data and loading for testing, error handling and analysis.
  • Involved in migration of mappings and sessions from development repository to production repository
  • Involved in Unit Testing and User Acceptance Testing to check whether data extracted from different source systems was loading into the target according to the user requirements.
  • Involved in production support, working on various mitigation tickets created while users worked to retrieve data from the database.
  • Created Stored Procedures to transform the Data and worked extensively in PL/SQL for various needs of the transformations while loading the data.

Environment: Informatica Power Center 7.1.4, Business Objects 6.5, Teradata v2r6, Oracle 10g/9i, DB2, TOAD, SAP R/3, Erwin 4.5, SQL, PL/SQL, XML, Windows XP, HP UNIX
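
The BTEQ validation scripts mentioned above typically compare staging row counts against the data-mart views generated by Edge. A Python generator for such a script, with the table names and logon string purely illustrative:

```python
def bteq_count_check(tables, logon=".LOGON tdpid/etl_user,password"):
    """Emit a BTEQ script comparing staging vs. data-mart row counts.

    tables: list of (staging_table, datamart_view) pairs.
    """
    lines = [logon]
    for stg, view in tables:
        lines.append(
            f"SELECT '{view}' AS object_name,"
            f" (SELECT COUNT(*) FROM {stg}) AS stg_rows,"
            f" (SELECT COUNT(*) FROM {view}) AS dm_rows;"
        )
    lines.append(".LOGOFF")
    lines.append(".QUIT")
    return "\n".join(lines)
```

Generating the script from a table list keeps the validation set in step with the model: adding a new mart view means adding one pair, not hand-editing BTEQ.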

Confidential, Auburn Hills, MI

DW Developer

Responsibilities:

  • Worked on Informatica Power Center 6.2 tool - Source Analyzer, warehouse designer, Mapping Designer, Workflow Manager, Mapplets, and Reusable Transformations.
  • Using Informatica Designer designed Mappings, which populated the Data into the Target Star Schema
  • Optimized Query Performance, Session Performance and Reliability.
  • Extensively used Normalizer (for COBOL source files from S-390 Mainframe system), Router, Lookup, Aggregator, Expression and Update Strategy Transformations.
  • Translation of Business processes into Informatica mappings for building Data marts.
  • Involved in the Migration process from Development, Test and Production Environments.
  • Report generation using PERL and Microsoft Excel spreadsheet
  • Used SQL tools like TOAD to run SQL queries and validate the data.
  • Wrote stored procedures and triggers in Oracle 8i for managing consistency and referential integrity across data mart.
  • Tuning the Mappings for Optimum Performance, Dependencies and Batch Design.
  • Schedule and Run Extraction and Load process and monitor sessions using Informatica Workflow Manager.
  • Scheduled the batches to be run using the Workflow Manager.
  • Involved in identifying bugs in existing mappings by analyzing the data flow, evaluating transformations and fixing the bugs so that they conform to the business needs.

Environment: Informatica Power Center 6.2/5.1, Sun Solaris 7, Oracle 8i, SQL Server 2000, PL/SQL, SQL*Loader, Erwin 3.5, Windows NT 4.0, Perl 5.x, Microsoft Excel 2000
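
Parsing the fixed-width COBOL extracts fed to the Normalizer reduces to slicing each record by a copybook-derived layout. A minimal sketch (the layout tuples and `parse_fixed` name are hypothetical, standing in for what the copybook would define):

```python
def parse_fixed(line, layout):
    """Slice one fixed-width record into named fields.

    layout: list of (field_name, start, width) tuples with 0-based offsets,
    as would be derived from a COBOL copybook.
    """
    return {name: line[start:start + width].strip()
            for name, start, width in layout}
```

The Normalizer then takes over where a copybook declares OCCURS clauses, pivoting repeating groups within one record into multiple output rows.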

Confidential, CA

DW Developer

Responsibilities:

  • Created Stored Procedures to transform the Data and worked extensively in PL/SQL for various needs of the transformations while loading the data.
  • Used Informatica Designer for developing mappings using transformations including aggregation, update, lookup and summation. Developed sessions using Server Manager and improved their performance.
  • Created transformations like Aggregate, Expression, Filter, Sequence Generator, Joiner, and Stored procedure transformations.
  • Created reusable transformations called mapplets and used them in case of reuse of the transformations in different mappings.
  • Involved in the process design documentation of the Data Warehouse Dimensional Upgrades. Extensively used Informatica for loading the historical data from various tables for different departments.
  • Involved in the development of Data Mart and populating the data marts using Informatica.

Environment: Informatica 5.1, Windows 2000, Oracle 8i, PL/SQL, SQL, Sybase, Cognos

Confidential

Programmer

Responsibilities:

  • Involved in and designed the System Development Life Cycle, including conducting the preliminary investigation and preparing technical specs.
  • Involved in technical and user level documentation and training.
  • Involved in the generation of Forms & Reports.
  • Involved in creation of tables, writing stored procedures, Triggers, PL/SQL Libraries.
  • Performed Normalization and Logical Database Design Unit Testing of the modules.
  • Developed user-friendly forms for easy data entry.

Environment: Oracle 9i, SQL, PL/SQL, Erwin, Windows NT/XP/2000
