Sr. Informatica ETL Developer Resume
WI
SUMMARY
- 8+ years of data warehousing experience using Informatica, including implementation of ETL methodology (data extraction, transformation, and loading), Business Intelligence, distributed computing, analysis, design, test case preparation, development, and implementation of client/server and internet/intranet applications and databases.
- Strong experience in the analysis, design, development, testing, and implementation of Business Intelligence solutions using Data Warehouse/Data Mart design, ETL, OLAP, BI, and client/server applications.
- Strong experience with Ralph Kimball and Inmon data modeling methodologies.
- Strong experience working with ETL tools Informatica and SSIS.
- Strong data warehousing ETL experience using Informatica PowerCenter 9.6/9.5/9.1/8.6.1/8.5/8.1/7.1 client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).
- Experience working with data quality tools Informatica IDQ and Informatica MDM.
- Expertise in Data Warehouse/Data Mart, ODS, OLTP, and OLAP implementations, covering project scope, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation, and production support.
- Strong experience in dimensional modeling using Star and Snowflake schemas, identifying facts and dimensions, and physical and logical data modeling using ERwin and ER-Studio.
- Expertise in working with various sources such as Oracle 12c/11g/10g/9i/8x, SQL Server 2019/2017/2016/2014/2012/2008, DB2, UDB, Netezza, Teradata, flat files, XML, COBOL, and mainframe.
- Extensive experience in developing stored procedures, functions, views, triggers, and complex SQL queries using SQL Server T-SQL and Oracle PL/SQL.
- Utilized AUTOTRACE and EXPLAIN PLAN for monitoring SQL query performance.
- Experience in resolving ongoing maintenance issues and bug fixes, monitoring Informatica sessions, and performance tuning of mappings and sessions.
- Experience with NZLOAD and NZSQL scripts to read and write data in a Netezza database (see the sketch after this list).
- Worked with parameter files to simplify connection management across Dev/QA/Prod environments.
- Experience in all phases of data warehouse development, from requirements gathering through code development, unit testing, and documentation.
- Extensive experience writing UNIX shell scripts and automating ETL processes with them.
- Proficient in the integration of various data sources with multiple relational databases such as Oracle 12c/11g/10g/9i, MS SQL Server, DB2, Teradata, VSAM files, and flat files into the staging area, ODS, Data Warehouse, and Data Mart.
- Experience in using automation scheduling tools such as Autosys, Tidal, Control-M, and Tivoli Maestro scripts.
- Worked extensively with slowly changing dimensions (SCD Type 1 and Type 2).
- Excellent interpersonal and communication skills; experienced in working with senior-level managers, business stakeholders, and developers across multiple disciplines.
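A minimal sketch of the kind of NZLOAD wrapper referenced above; host, credential, table, and file names are placeholders, not values from any engagement:

```
#!/bin/ksh
# Hypothetical NZLOAD wrapper: bulk-loads a pipe-delimited flat file
# into a Netezza table. NZ_USER/NZ_PASS are assumed to come from the
# environment; all other names are placeholders.
NZ_HOST=netezza_host
NZ_DB=edw_stage
NZ_TABLE=stg_sales
DATA_FILE=/data/inbound/sales_20200101.dat

nzload -host "$NZ_HOST" -db "$NZ_DB" -u "$NZ_USER" -pw "$NZ_PASS" \
       -t "$NZ_TABLE" -df "$DATA_FILE" -delim '|' \
       -lf "$NZ_TABLE.nzlog" -bf "$NZ_TABLE.nzbad" -maxErrors 10

if [ $? -ne 0 ]; then
    echo "nzload failed for $DATA_FILE" >&2
    exit 1
fi
```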
TECHNICAL SKILLS
Data Warehousing: Informatica PowerCenter 9.6/9.5/9.1/8.6.1/8.5/8.1/7.1, Informatica Server, PowerMart 6.2/5.x/4.7, PowerExchange, Informatica Designer, Workflow Manager, Workflow Monitor, ETL, Data Mart, data cleansing, data profiling, OLAP, OLTP, Mapplets, Transformations, Autosys, Control-M, Maestro, SQL*Loader, Control Center, Oracle 10g, Toad 8.6, PL/SQL
Dimensional Data Modeling: Dimensional data modeling, data modeling, Star Schema modeling, Snowflake modeling, fact and dimension tables, physical and logical data modeling, ERwin 4.5/4.0, Oracle Designer, Visio.
Business Analysis/Data Analysis: Functional Requirements Gathering, User Interviews, Business Requirements Gathering, Process Flow Diagrams, Data Flow Diagrams, MS Project, MS Access, MS Office, Visio
Databases: Oracle 11g/10g/9i, MS SQL Server 2019/2017/2016/2014/2012, Sybase ASE 12/12.5.3, MS Access.
Languages: SQL, T-SQL, PL/SQL, UNIX Shell Scripting.
Environment: UNIX variants: HP-UX 10/9, Sun Solaris 8/7/2.6/2.5, AS/400, Linux, Windows 95/98/2000/XP, Windows NT 4.0
Reporting Tools: OBIEE, Tableau, BI Apps, Cognos
PROFESSIONAL EXPERIENCE
Confidential, WI
Sr. Informatica ETL Developer
Responsibilities:
- Prepared the technical specification document for the ETL process and design documents for each module.
- Designed, developed, and supported the extraction, transformation, and load (ETL) process with Informatica PowerCenter 9.6.
- Responsible for Business Analysis and Requirements Collection.
- Worked on Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
- Parsed high-level design specifications into simple ETL coding and mapping standards.
- Designed and customized data models for the data warehouse, supporting data from multiple sources in real time.
- Involved in building the ETL architecture and source-to-target mappings to load data into the data warehouse.
- Created mapping documents to outline data flow from sources to targets.
- Involved in dimensional modeling (Star Schema) of the data warehouse and used Erwin to design the business process, dimensions, and measured facts.
- Extracted data from flat files and other RDBMS databases into the staging area and populated the data warehouse.
- Maintained source definitions, transformation rules, and target definitions using Informatica Repository Manager.
- Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
- Developed mapping parameters and variables to support SQL override.
- Executed these with Oracle SQL*Plus and Teradata Queryman or BTEQ.
- Involved in extensive data profiling using Informatica Data Quality (IDQ, Analyst tool) prior to data staging.
- Loaded data into Teradata using Informatica, FastLoad, BTEQ, FastExport, MultiLoad, and Korn shell scripts.
- Worked on workflow tasks such as Session, Event-Raise, Event-Wait, Decision, Email, Command, Worklet, Assignment, and Timer, as well as workflow scheduling.
- Implemented process improvement tools in production support activities.
- Used SCD Type 1 and Type 2 mappings to update slowly changing dimension tables (see the Type 2 sketch after this list).
- Extensively used SQL*Loader to load data from flat files into Oracle database tables.
- Designed an Informatica Data Quality (IDQ) scorecard.
- Used Debugger to test the mappings and fixed the bugs.
- Wrote UNIX shell scripts and pmcmd commands for FTP of files from a remote server and for backup of the repository and folders (see the sketch after this list).
- Involved in performance tuning at the source, target, mapping, session, and system levels.
- Prepared migration document to move the mappings from development to testing and then to production repositories.
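To illustrate the SCD Type 2 pattern mentioned above: in the project this logic was built as Informatica mappings (Lookup plus Update Strategy); the sketch below expresses the equivalent expire-and-insert steps as Oracle SQL run through SQL*Plus, and every table, column, and sequence name is hypothetical.

```
#!/bin/ksh
# Hypothetical SCD Type 2 sketch, run via SQL*Plus. Credentials come
# from the environment; dim_customer/stg_customer are placeholders.
sqlplus -s "$ORA_USER/$ORA_PASS@$ORA_SID" <<EOF
WHENEVER SQLERROR EXIT FAILURE

-- Expire the current version of rows whose tracked attributes changed
UPDATE dim_customer d
SET    d.eff_end_dt = SYSDATE, d.current_flag = 'N'
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1 FROM stg_customer s
               WHERE  s.customer_id = d.customer_id
               AND    s.address <> d.address);

-- Insert a new current version for changed and brand-new customers
-- (both now lack a current row after the update above)
INSERT INTO dim_customer
       (customer_key, customer_id, address, eff_start_dt, eff_end_dt, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, SYSDATE, NULL, 'Y'
FROM   stg_customer s
LEFT JOIN dim_customer d
       ON d.customer_id = s.customer_id AND d.current_flag = 'Y'
WHERE  d.customer_id IS NULL;

COMMIT;
EXIT
EOF
```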
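And a minimal sketch of the FTP-then-load shell wrapper described above; server, folder, workflow, and parameter-file names are placeholders.

```
#!/bin/ksh
# Hypothetical sketch: pull a source file from a remote server via FTP,
# then kick off the load workflow with pmcmd. All names are placeholders;
# FTP_USER/FTP_PASS/PM_USER/PM_PASS come from the environment.
REMOTE_HOST=remote.example.com
SRC_FILE=claims_daily.dat
LANDING_DIR=/etl/landing

ftp -n "$REMOTE_HOST" <<EOF
user $FTP_USER $FTP_PASS
binary
lcd $LANDING_DIR
get $SRC_FILE
bye
EOF

# Abort if the file did not arrive or is empty
[ -s "$LANDING_DIR/$SRC_FILE" ] || { echo "FTP pull failed" >&2; exit 1; }

pmcmd startworkflow -sv INT_SVC -d DOMAIN_DEV -u "$PM_USER" -p "$PM_PASS" \
      -f CLAIMS_FOLDER -paramfile /etl/param/wf_claims_load.par \
      -wait wf_claims_load
```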
Environment: Informatica PowerCenter 9.6, Workflow Manager, Workflow Monitor, Informatica PowerConnect/PowerExchange, Data Analyzer, PL/SQL, Oracle 12c, Erwin, Autosys, IDQ/IDE, Teradata V14.0, BTEQ, SQL Server 2019/2017, Sybase, UNIX AIX, Toad 9.0.
Confidential, Stamford, CT
Sr. Informatica ETL Developer
Responsibilities:
- Extensively worked with the data modelers to implement logical and physical data modeling for an enterprise-level data warehouse.
- Extensively worked on creating Facts and Dimensions tables.
- Designed and Developed Oracle PL/SQL Procedures and UNIX Shell Scripts for Data Import/Export and Data Conversions.
- Involved in exception-handling mappings for Informatica Data Quality (IDQ).
- Extensively used Informatica PowerCenter 9.5 to extract data from various sources and load it into the staging database.
- Extensively worked with Informatica tools (Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager, Workflow Manager, Workflow Monitor, Repository Server, and Informatica Server) to load data from flat files and legacy sources.
- Involved in generating FastLoad, MultiLoad, and TPump scripts to load data into Teradata tables.
- Created transformations using Informatica PowerCenter 8.6 for loading data into targets: Source Qualifier, Java, SQL, Joiner, Update Strategy, Lookup, Rank, Expression, Aggregator, and Sequence Generator transformations.
- Designed the mappings between sources (external files and databases) to operational staging targets.
- Wrote, tested, and implemented Teradata FastLoad, MultiLoad, and BTEQ scripts, DML, and DDL.
- Developed Informatica mappings and mapplets and tuned them for optimum performance; handled dependencies and batch design.
- Created BTEQ (Basic Teradata Query) scripts to generate keys (see the sketch after this list).
- Formulated standards and best practices for creating Navigator data maps for AS/400 non-relational flat files within the Informatica PowerExchange toolset.
- Fine-tuned PL/SQL stored procedures.
- Wrote Oracle stored procedures and SQL scripts and called them from Perl and Korn shell scripts at pre- and post-session; designed mapping templates to specify the high-level approach.
- Involved in the Informatica Data Quality (IDQ) process the team adopted for column profiling.
- Involved in analyzing and building the Teradata EDW using Teradata ETL utilities and Informatica.
- Coded in Teradata BTEQ SQL and wrote UNIX scripts to validate, format, and execute the SQL in the UNIX environment.
- Worked with production support in finalizing scheduling of workflows and database scripts using Autosys.
- Involved in production support and stabilization.
- Worked on migration of the data warehouse between different versions.
- Coordinated between Development, QA and production migration teams.
- Extensively involved in documentation of the project.
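A minimal sketch of a key-generating BTEQ script of the kind mentioned above, assuming a surrogate key derived from the current maximum plus ROW_NUMBER(); the database, table, and column names are hypothetical.

```
#!/bin/ksh
# Hypothetical BTEQ sketch: assign surrogate keys to new staging rows
# by offsetting ROW_NUMBER() with the current max key. TD_USER/TD_PASS
# come from the environment; all object names are placeholders.
bteq <<EOF
.LOGON tdprod/${TD_USER},${TD_PASS};

INSERT INTO edw.dim_customer (customer_key, customer_id, load_dt)
SELECT COALESCE(mx.max_key, 0)
       + ROW_NUMBER() OVER (ORDER BY s.customer_id),
       s.customer_id,
       CURRENT_DATE
FROM   stg.customer s
CROSS JOIN (SELECT MAX(customer_key) AS max_key
            FROM edw.dim_customer) mx
WHERE  s.customer_id NOT IN (SELECT customer_id FROM edw.dim_customer);

.IF ERRORCODE <> 0 THEN .QUIT 1;
.LOGOFF;
.QUIT 0;
EOF
```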
Environment: Informatica PowerCenter 9.6/9.5, Informatica PowerExchange, Oracle 11g, Teradata V13.0, SQL Server 2014/2012, ER-Studio, Autosys, DB2, Sybase, XML, flat files, IDQ/IDE, Windows 7, BTEQ, Linux, Cognos.
Confidential, Pittsburgh, PA
Informatica ETL Developer
Responsibilities:
- Designed and developed the ETL process using Informatica PowerCenter 8.6.1.
- Accomplished automated data extraction from various RDBMS via scripts, ETL processing using Informatica and loading into Oracle Data warehouse.
- Involved in the SDLC from analysis to production implementation, with emphasis on identifying sources and validating source data, developing logic and transformations per requirements, and creating mappings to load data into different targets.
- Applied caching optimization techniques in Aggregator, Lookup, and Joiner transformations.
- Imported Profiles and Rules from IDE to IDQ to fix the issues at mapping level.
- Developed UNIX/Linux shell scripts and PL/SQL procedures.
- Built the data movement process that loads data from SQL Server and Oracle into Teradata, developing Korn shell scripts with Teradata SQL and utilities such as BTEQ, FastLoad, FastExport, and MultiLoad.
- Used Informatica Data Quality (IDQ/IDE) for data quality analysis and measurement.
- Created user-defined functions and developed Informatica variables and parameter files to filter daily source data.
- Used Oracle indexing techniques such as B*Tree and bitmap indexes to improve query performance, and created scripts to update table statistics for better explain plans.
- Worked with data extraction, transformation, and loading using BTEQ, FastLoad, and MultiLoad.
- Created Test cases for Unit Test, System Integration Test and UAT to check the data quality.
- Used pmrep commands to create backups and to copy and delete repositories (see the sketch after this list).
- Developed UNIX Shell scripts for validating the files, running stored procedures & Informatica workflows.
- Coordinated with the client and gathered user requirements.
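A minimal sketch of a pmrep backup call of the kind mentioned above; the repository, domain, and path names are placeholders.

```
#!/bin/ksh
# Hypothetical pmrep sketch: connect to the repository and take a
# dated backup. PM_USER/PM_PASS come from the environment; repository,
# domain, and backup paths are placeholders.
BACKUP_FILE=/etl/backup/REP_DEV_$(date +%Y%m%d).rep

pmrep connect -r REP_DEV -d DOMAIN_DEV -n "$PM_USER" -x "$PM_PASS" \
  && pmrep backup -o "$BACKUP_FILE"

if [ $? -ne 0 ]; then
    echo "Repository backup failed" >&2
    exit 1
fi
```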
Environment: Informatica PowerCenter 8.6.1, Teradata V2R5, UC4, Oracle 10g, IDQ/IDE, BTEQ, SQL Server 2008, PL/SQL, flat files, ERwin data modeling tool, Advanced Query Tool, Linux (Red Hat).
Confidential, Secaucus, NJ
Informatica ETL Developer
Responsibilities:
- Involved in Business analysis and requirements gathering.
- Assisted in implementing fact and dimension tables in a Star Schema model based on requirements.
- Prepared technical specifications and source-to-target mappings.
- Extensively used Informatica PowerCenter for the extraction, transformation, and loading process.
- Created mappings for dimensions and facts.
- Extracted data from various sources such as Oracle, flat files, and DB2.
- Involved in the design, development, rollout, maintenance, and ongoing operations of a series of large OLAP data warehouses and data marts.
- Developed several Mappings and Mapplets using corresponding Sources, Targets and Transformations.
- Synchronized Informatica metadata with data modeling diagrams.
- Optimized and tuned mappings for better performance and efficiency.
- Migrated mappings from Dev to Test and Test to Production repositories.
- Created sessions and workflows in Workflow Manager to run the logic embedded in the mappings.
- Used PowerExchange Change Data Capture (CDC) for data updates.
- Designed and developed UNIX shell scripts as part of the ETL process to automate loading and pulling of data (see the sketch after this list).
- Generated PL/SQL Triggers and Shell scripts for scheduling periodic load processes.
- Created Logical and Physical models for Staging, Transition and Production Warehouses using Erwin.
- Responsible for testing and validating the Informatica mappings against the pre-defined ETL design standards.
- Created various tasks such as Session, Decision, Timer, and Control to design workflows based on dependencies.
- Used workflow manager for session management, database connection management and scheduling of jobs.
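A minimal sketch of the kind of load-automation shell script described above: poll for the source file, validate it, then start the workflow via pmcmd. The paths, folder, and workflow names are hypothetical.

```
#!/bin/ksh
# Hypothetical load-automation sketch: wait for the source file to
# arrive, confirm it is non-empty, then trigger the workflow.
# PM_USER/PM_PASS come from the environment; names are placeholders.
SRC_FILE=/etl/inbound/orders_daily.dat
MAX_WAIT=12   # number of polls, 5 minutes apart

i=0
while [ ! -s "$SRC_FILE" ] && [ $i -lt $MAX_WAIT ]; do
    sleep 300
    i=$((i + 1))
done

if [ ! -s "$SRC_FILE" ]; then
    echo "Source file $SRC_FILE missing or empty" >&2
    exit 1
fi

pmcmd startworkflow -sv INT_SVC -d DOMAIN_PROD -u "$PM_USER" -p "$PM_PASS" \
      -f ORDERS_FOLDER -wait wf_orders_load
```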
Environment: Informatica PowerCenter 8.1.1, Oracle 10g, Windows 2000, PowerExchange 8.1, PL/SQL, TOAD, SQL Server 2008, data modeling, UNIX shell scripting, data marts, DB2, Erwin 4.0.