Sr. ETL/Informatica Developer Resume
Charlotte, North Carolina
SUMMARY:
- Over 10 years of IT experience with Informatica PowerCenter in all phases of analysis, design, development, implementation and support of data warehousing applications.
- Experience in data warehousing using ETL tools such as Informatica PowerCenter 9.6/8.x/7.x and PowerMart 9.x/8.x, with databases such as DB2, Oracle, MS SQL Server and Teradata.
- Extensive experience in the implementation of Extraction, Transformation & Loading (ETL) using Informatica Power Center, Teradata , Oracle 11g/10g/9i, MS SQL Server 2008, SSRS and SSAS.
- Extensively used transformations such as Filter, Expression, Update Strategy, Lookup, Sequence Generator, Joiner, Router, Aggregator, and Stored Procedure.
- Extensively worked on repository configuration, creating mappings, mapplets, workflows, sessions, worklets and processing tasks using the Designer and Workflow Manager to move data from multiple source systems to targets.
- Solid understanding of Data Warehousing concepts and Dimensional modeling (Star schema and Snowflake schema) and Relational Data Modeling and Slowly Changing Dimensions.
- Experience in documenting High-Level and Low-Level Design, Source to Target Mapping (STM), Macro and Micro documents, unit test plans, unit test cases and the Code Migration Report.
- Expertise in Data Warehouse/Data mart, ODS, OLTP and OLAP implementations, Analysis, requirements gathering, data modeling, ETL, Design, development, System testing, Implementation and production support.
- Worked with business analysts and DBAs on requirements gathering, business analysis, design and documentation of the data warehouse.
- Extensively used SQL and PL/SQL to write Stored Procedures, Functions, Packages and Triggers.
- Utilized Informatica Data Quality (IDQ) for data profiling and matching/removing duplicate data, fixing the bad data, fixing NULL values.
- Knowledge in Oracle, BTEQ, SQL, PL/SQL, MS Access, MS Excel on UNIX and Windows platforms.
- Experience in populating the data warehouse tables using SQL Loader, MULTILOAD, FASTLOAD, Oracle Packages, Stored procedures, Stored functions.
- Knowledge of Informatica PowerExchange for capturing changed data.
- Worked in Agile methodology for the SDLC (Software Development Life Cycle) and participated in Agile Scrum meetings.
- Experience in Netezza database administrative tasks such as creating tables, scheduling backups using NZ scripts, migrating tables from server to server, creating user accounts and user groups, and setting up permissions.
- Worked closely with the Production Control team to schedule Informatica workflows and shell scripts in Control-M.
- Experience in automation and scheduling of UNIX shell scripts and Informatica sessions and batches using Tivoli Workload Scheduler (TWS).
- Knowledge of BI tools such as Tableau.
- Experience in managing and reviewing Hadoop log files; tested and reported defects from an Agile methodology perspective.
- Extensive knowledge of different kinds of testing like Database Testing, ETL Testing, Functional Testing, End to End Testing and UAT Testing.
TECHNICAL SKILLS:
Data Warehouse Tools: Informatica PowerCenter 9.6/9.5/8.x/7.x, DataStage, Informatica Power Analyzer/PowerMart 9.x, Informatica Data Quality 9.5 (IDQ)
Operating Systems: Windows XP/Vista/7/8, UNIX, Linux, IBM Mainframes; tools: PuTTY, WinSCP
Databases: Oracle 11g/10g/9i, MySQL, SQL Server 2008/2005, Teradata, MS Access
Reporting Tools: SAP BO, SSRS, SSAS
Dimensional Data Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, Erwin
Testing: UAT, Functional, ETL
Database Tools: Oracle SQL Developer, SQL*Plus, SQL*Loader, TOAD, Netezza 6.0/7.0, MS Office, Microsoft Visio, OLTP/OLAP
Languages: SQL, PL/SQL, XML, UNIX Shell Scripting, COBOL
PROFESSIONAL EXPERIENCE:
Confidential, Charlotte, North Carolina
Sr. ETL/Informatica Developer
Responsibilities:
- Interacted with the Business Users in gathering and analyzing the Business requirements.
- Worked on designing, development and testing of workflows and worklets according to the business process flow.
- Created mappings using various transformations like update strategy, lookup, stored procedure, router, joiner, sequence generator and expression transformation.
- Extensively involved in writing ETL Specifications for Development and conversion projects.
- Created technical design specifications for data Extraction, Transformation and Loading (ETL).
- Extensively worked on Change Data Capture/Incremental loading of SCD Type I/II.
- Designed the ETL process and scheduled the stage and mart loads for the data mart.
- Worked with the Informatica utilities Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer to define sources and targets, and coded the process from source systems to the data warehouse.
- Extensively used transformations like Stored Procedure, Connected and Unconnected lookups, Update Strategy, Filter transformation, Joiner transformations to implement business logic.
- Executed, scheduled workflows using Informatica Cloud tool to load data from Source to Target.
- Involved in data extraction from Oracle and flat files using SQL*Loader; designed and developed mappings using Informatica PowerCenter.
- Extensively created reusable transformations and mapplets to standardize business logic.
- Extracted data from various flat files, loaded it into the data warehouse environment, and wrote UNIX shell scripts to move the files across servers.
- Used Informatica Workflow Manager to create workflows, database connections, sessions and batches to run the mappings.
- ETL included the selection criteria to extract data from source systems (cloud), performing any necessary data transformations or derivations, data quality audits, and cleansing, using the Informatica Web Services connector.
- Used Variables and Parameters in the mappings to pass the values between mappings and sessions.
- Worked on building ETL data flows that run natively on Hadoop.
- Used Unix Shell Scripts to automate pre-session and post-session processes.
- Created shortcuts for reusable source/target definitions, Reusable Transformations, Mapplets in Shared folder.
- Imported the IDQ address-standardization mappings into Informatica Designer as a mapplet.
- Coordinated with testing team to make testing team understand Business and transformation rules that have been used throughout ETL process.
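A typical pre/post-session shell step like the ones above can be sketched as follows; the directory layout and file names are illustrative only, not from an actual project:

```shell
#!/bin/sh
# Post-session archive step (sketch): move processed source files out of
# the landing directory into an archive, suffixed with a timestamp, so
# the next workflow run starts from a clean landing area.

SRC_DIR=/tmp/etl_demo/landing
ARCH_DIR=/tmp/etl_demo/archive
STAMP=$(date +%Y%m%d_%H%M%S)

mkdir -p "$SRC_DIR" "$ARCH_DIR"
echo "1|ACME|2015-01-01" > "$SRC_DIR/customers.dat"   # sample extract file

for f in "$SRC_DIR"/*.dat; do
  [ -e "$f" ] || continue                 # nothing to archive
  base=$(basename "$f" .dat)
  mv "$f" "$ARCH_DIR/${base}_${STAMP}.dat"
done

ls "$ARCH_DIR"
```

In practice such a script would be attached as a post-session command in the Workflow Manager so it runs only after a successful load.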
Environment: Informatica Power Center 9.6, Informatica IDQ 9.1, Oracle 11g, SQL Server 2008, IBM DB2, MS Access, Windows XP, TOAD, SQL Developer.
Confidential, Calabasas, CA
ETL/Informatica Developer
Responsibilities:
- Involved in understanding the business requirements in discussions with Business Analysts, analyzing the requirements and preparing business rules.
- Designed and developed complex mappings using Lookup, Expression, Sequence Generator, Update Strategy, Aggregator, Router and Stored Procedure transformations to implement complex logic.
- Developed mappings using Informatica to load the data from sources such as Relational tables, Flat files, Oracle tables into the target Data warehouse.
- Developed mappings/Transformation/Mapplets by using mapping designer, transformation developer and Mapplets designer using Informatica Power Center.
- Experienced in the SDLC and Agile methodologies such as Scrum.
- Contributed to and actively provided comments in user story review meetings within an Agile Scrum environment.
- Worked closely with the client on planning and brainstorming to migrate the current RDBMS to Hadoop.
- Performed extensive ETL testing based on the ETL mapping document for data movement from source to target.
- Worked on OLTP and OLAP system study, analysis and E-R modeling.
- Extensively worked with transformations like Lookup, Expression, Router, Joiner, Update Strategy, Filter, and Aggregate.
- Extensively worked on SQL override in Source Qualifier Transformation for better performance.
- Hands on experience using query tools like TOAD, SQL Developer, PLSQL developer, Teradata SQL.
- Excellent knowledge of ETL tools such as Informatica and SAP BODS, making various connections to load and extract data to and from Teradata efficiently.
- Designed and developed stored procedures using PL/SQL and tuned SQL queries for better performance.
- Automated ETL workflows using the Control-M scheduler.
- Implemented slowly changing dimensions (SCD Type 1 & 2) in various mappings.
- Created sessions, database connections and batches using Informatica Server Manager/Workflow Manager.
- Extensively used Informatica PowerCenter Workflow Manager to create sessions, workflows and batches to run with the logic implemented in the mappings.
- Coordinated all ETL (Extract, Transformation and loading) activities and enhancements using Agile Methodology.
- Developed Logical and Physical data models that capture current state/future state data elements and data flows using Erwin.
- Extensively used the Informatica Debugger to find problems in mappings; also involved in troubleshooting and rectifying bugs.
- Involved in all phases of the software development life cycle on this project, including but not limited to requirements gathering, analysis, technical design, development, testing and user acceptance testing.
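The source-to-target ETL testing described above often starts with a simple row-count reconciliation. A minimal sketch, using sample files standing in for real extracts (all names hypothetical):

```shell
#!/bin/sh
# ETL reconciliation sketch: compare the record count of a source extract
# (which carries a header row) against a target extract.

WORK=/tmp/etl_counts
mkdir -p "$WORK"

# Sample data standing in for real source/target extracts.
printf 'ID|NAME\n1|A\n2|B\n3|C\n' > "$WORK/source.dat"
printf '1|A\n2|B\n3|C\n'          > "$WORK/target.dat"

src_rows=$(( $(wc -l < "$WORK/source.dat") - 1 ))   # subtract header row
tgt_rows=$(wc -l < "$WORK/target.dat")

if [ "$src_rows" -eq "$tgt_rows" ]; then
  echo "COUNTS MATCH: $src_rows rows"
else
  echo "MISMATCH: source=$src_rows target=$tgt_rows"
fi
```

Column-level validation would then be done with SQL against the target tables, per the mapping document.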
Environment: Informatica Power Center 9.5, Oracle 11g, UNIX, XML, PL/SQL, Erwin 4.0, PuTTY, WinSCP, Windows XP/Vista, Quality Center, Tivoli, Autosys
Confidential, Los Angeles, CA
ETL/Informatica Developer
Responsibilities:
- Analyzed source and target systems for loading data into CDW and to create external data feeds.
- Involved all the phases of SDLC design and development.
- Involved in gap analysis and STM documentation to map flat file fields with relational table data in CDW.
- Developed mappings, Mapplets, re-usable transformations to load external data into CDW.
- Extracted data from CDW to create delimited and fixed width flat file data feeds that go out to external sources.
- Analyzed Design documents and developed ETL requirement specs.
- Created re-usable objects and shortcuts for various commonly used flat file and relational sources.
- Developed a shell script to append a date and time stamp to output XML files, remove empty delta files, and FTP the output XML files to different servers.
- Validated logical data model relationships and entities; determined data lineage by including all associated systems in the data profile.
- Extensive data profiling experience, validating data patterns and formats.
- Integrated data into CDW by sourcing it from different sources like Oracle, Flat Files and Mainframes (DB2 and VSAM) using Power Exchange.
- Performed technical analysis, writing and reviewing technical designs.
- Developed a dashboard solution for analyzing STD statistics by building SSAS cubes and Tableau reports.
- Created UNIX scripts to deal with flat file data like merging delta files with whole files and to concatenate header, detailed and trailer parts of the files.
- Developed mappings that load data into Teradata tables with SAP definitions as sources.
- Created mappings to read parameterized data from tables to create parameter files.
- Used the XML Source Qualifier, which is used only with an XML source definition, to represent the data elements that the Informatica Server reads when it executes a session with XML sources.
- Performed unit testing and functional testing and documented the results in the testing documents in both the development and UAT environments.
- Used the XML Parser transformation to extract XML data from messaging systems.
- Used ODI to perform data integration with ELT processing: data was extracted from multiple sources, sent through several transformation processes and loaded into a final destination target.
- Used Informatica PowerCenter to extract, transform and load data into the Netezza data warehouse from various sources such as Oracle and flat files.
- Used SAS for data entry, Retrieval, Management report writing and Statistical analysis.
- Developed Complex transformations, Mapplets using Informatica to Extract, Transform and load (ETL) data into Data marts, Enterprise Data warehouse (EDW) and Operational data store (ODS).
- Used Message Broker to translate messages between the interfaces.
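The flat-file handling described above (timestamping output files, concatenating header, detail and trailer parts) can be sketched in shell; the feed name and record layout are illustrative only:

```shell
#!/bin/sh
# Feed-assembly sketch: concatenate header, detail and trailer parts into
# one outbound file, then stamp the output file name with date/time.

WORK=/tmp/feed_demo
mkdir -p "$WORK"

# Sample parts standing in for real session output.
printf 'HDR|CUSTFEED\n'  > "$WORK/header.part"
printf '1|A\n2|B\n'      > "$WORK/detail.part"
printf 'TRL|2\n'         > "$WORK/trailer.part"

STAMP=$(date +%Y%m%d_%H%M%S)
OUT="$WORK/custfeed_${STAMP}.dat"

cat "$WORK/header.part" "$WORK/detail.part" "$WORK/trailer.part" > "$OUT"

wc -l < "$OUT"    # header + 2 detail rows + trailer = 4 lines
```

The FTP step to the downstream servers would follow once the assembled file passes a trailer-count check.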
Environment: Informatica Power Center 9.5, Power Exchange, Windows, Oracle, TOAD for Oracle, BODS, UNIX shell scripting, workflow scheduler, Perl, PuTTY, Tableau, WinSCP.
Confidential, Chicago, IL
ETL/Informatica Developer
Responsibilities:
- Interacted with business analysts; analyzed, inspected and translated business requirements into technical specifications.
- Participated in system analysis and data modeling, which included creating tables, views, triggers, functions, indexes, procedures and cursors.
- Involved in creating Fact and Dimension tables using a Star schema.
- Extensively involved working on the transformations like Source Qualifier, Filter, Joiner, Aggregator, Expression and Lookup.
- Used session logs, workflow logs and the Debugger to debug sessions and analyze problems associated with the mappings and generic scripts.
- Designed and developed complex Informatica mappings, including SCD Type 2 (Slowly Changing Dimension Type 2).
- Wrote several complex SQL queries to validate the data transformation rules for ETL testing.
- Extensively worked in Workflow Manager, Workflow Monitor and Worklet Designer to create, edit and run workflows.
- Involved in the design, development and testing of the PL/SQL stored procedures, packages for the ETL processes.
- Developed UNIX Shell scripts to automate repetitive database processes and maintained shell scripts for data conversion.
- Extensively used various data cleansing and data conversion functions in various transformations.
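Data cleansing of the kind described above can also be done at the file level before the load, mirroring LTRIM/RTRIM-style expression logic. A minimal sketch on sample pipe-delimited data (all file names hypothetical):

```shell
#!/bin/sh
# Cleansing sketch: trim stray whitespace from every field and replace
# empty fields with the literal NULL token in a pipe-delimited file.

WORK=/tmp/cleanse_demo
mkdir -p "$WORK"

printf '1| ACME |\n2|Beta| x \n' > "$WORK/raw.dat"   # sample dirty data

awk -F'|' 'BEGIN { OFS = "|" }
{
  for (i = 1; i <= NF; i++) {
    gsub(/^[ \t]+|[ \t]+$/, "", $i)   # trim leading/trailing whitespace
    if ($i == "") $i = "NULL"         # flag empty fields explicitly
  }
  print
}' "$WORK/raw.dat" > "$WORK/clean.dat"

cat "$WORK/clean.dat"
```

In the actual mappings the equivalent work would be done in Expression transformations; this is only the shell-side analogue.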
Environment: Informatica Power Center 9.1/8.6, SQL Server 2008, Oracle 11g/10g, PL/SQL, SAP BO, SSRS, Windows NT, flat files (fixed width/delimited), MS Excel, UNIX shell scripting, PuTTY, WinSCP.
Confidential, CA
ETL/Informatica Developer
Responsibilities:
- Maintaining the data coming from the OLTP systems.
- Developed and maintained complex Informatica mappings.
- Involved in analyzing and development of the Data Warehouse.
- Created complex mappings in Informatica PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Rank, Sorter, Lookup and Joiner transformations.
- Involved in coding and updating UNIX scripts to FTP files from the source system.
- Perform data validation tests using complex SQL statements.
- Worked with data warehouse staff to incorporate best practices from Informatica.
- Worked with Business Analysts, using work sessions, to translate business requirements into technical specifications, including data and process specifications.
- Developed Informatica PowerCenter and Cloud jobs that extract core data from DB2 to Oracle and Netezza (stage and EDW).
- Implemented Type II slowly changing dimensions using date-time stamping.
- Created database structures, objects and their modification as and when needed.
- Investigated and fixed bugs that occurred in the production environment and provided on-call support.
- Performed Unit testing and maintained test logs and test cases for all the mappings.
- Tested for data integrity and consistency.
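The Type II slowly changing dimension with date-time stamping mentioned above follows one pattern: when a tracked attribute changes, the current row is closed out with an end timestamp and a new current row is appended. A sketch with flat files standing in for dimension tables (layout and data are illustrative):

```shell
#!/bin/sh
# SCD Type 2 sketch: expire the current dimension row and append a new
# current version when an attribute value changes.

WORK=/tmp/scd2_demo
mkdir -p "$WORK"
NOW=$(date '+%Y-%m-%d %H:%M:%S')

# Layout: key|attribute|eff_from|eff_to|is_current
printf '1|GOLD|2014-01-01 00:00:00|9999-12-31 23:59:59|Y\n' > "$WORK/dim.dat"

NEW_KEY=1
NEW_VAL=PLATINUM

awk -F'|' -v OFS='|' -v k="$NEW_KEY" -v v="$NEW_VAL" -v now="$NOW" '
$1 == k && $5 == "Y" && $2 != v {
  print $1, $2, $3, now, "N"                      # expire current version
  print $1, v, now, "9999-12-31 23:59:59", "Y"    # new current version
  next
}
{ print }
' "$WORK/dim.dat" > "$WORK/dim.new" && mv "$WORK/dim.new" "$WORK/dim.dat"

cat "$WORK/dim.dat"
```

In the Informatica mappings the same logic is implemented with a Lookup on the current row plus an Update Strategy routing expired and new versions to the target.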
Environment: Informatica Power Center 8.6, Power Exchange, Windows, IBM DB2 8.x, Mainframes, SQL Server, Enterprise Architect, Metadata Manager, ER/Studio, Oracle, SQL*Plus, PL/SQL, Windows 2007.
Confidential, Phoenix, AZ
ETL/ Informatica Developer
Responsibilities:
- Extensively used Informatica to load data from Oracle 9i and flat files into the target Oracle 10g database.
- Used various transformations like Joiner, Aggregator, Expression, Lookup, Filter, Update Strategy and Stored Procedures.
- Used Mapplets and Reusable Transformations to prevent redundancy of transformation usage and maintainability.
- Created and scheduled workflows using Workflow Manager to load the data into the Target Database.
- Involved in performance tuning of Targets, Sources, and Mappings. Improved performance by identifying performance bottlenecks.
- Worked with different tasks in workflows, such as sessions, Event Raise, Event Wait, Email and Timer, for scheduling of the workflow.
- Involved in meetings to gather information and requirements from the clients.
- Involved in designing the ETL process to extract, translate and load data from flat files to the warehouse database.
- Used Debugger to validate mappings and also to obtain troubleshooting information about data by inserting Breakpoints.
- Extensively used variables, breakpoints and sorted input.
- Documented the number of source/target rows, analyzed the rejected rows and worked on re-loading them.
- Created UNIX shell scripts and automated scheduling processes.
- Wrote SQL Queries, PL/SQL Procedures, Functions, and Triggers for implementing business logic and for validating the data loaded into the target tables using query tool TOAD.
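The rejected-row analysis and re-loading above usually starts by summarizing the reject file and extracting the keys to resubmit. A sketch on sample data; the reject-file layout here is illustrative, not Informatica's actual .bad format:

```shell
#!/bin/sh
# Reject-analysis sketch: count loaded vs. rejected rows, then pull the
# rejected keys out into a file driving the re-load.

WORK=/tmp/reject_demo
mkdir -p "$WORK"

# Sample session output standing in for real load/reject files.
printf '101|OK\n102|OK\n'                              > "$WORK/loaded.dat"
printf '103|missing customer name\n104|invalid date\n' > "$WORK/rejected.dat"

loaded=$(wc -l < "$WORK/loaded.dat")
rejected=$(wc -l < "$WORK/rejected.dat")
echo "loaded=$loaded rejected=$rejected"

# Keys to re-load once the source data is corrected.
cut -d'|' -f1 "$WORK/rejected.dat" > "$WORK/reload_keys.dat"
cat "$WORK/reload_keys.dat"
```

The documented row counts then go into the unit-test log, and the key list feeds the corrective re-load run.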
Environment: Informatica Power Center 7.1, ETL, PL/SQL Developer, Oracle 10g, UNIX, Microsoft SQL Server, TOAD.