Sr. ETL Informatica Developer Resume
San Ramon, California
SUMMARY:
- Over 8 years of experience in Information Technology with a strong background in Database development and Data warehousing.
- Good experience in designing and implementing Data Warehousing and Business Intelligence solutions using ETL tools like Informatica Power Center (Designer, Workflow Manager, Workflow Monitor and Repository Manager).
- Created mappings in Mapping Designer to load data from various sources using transformations like Transaction Control, Lookup (Connected and Unconnected), Router, Filter, Expression, Aggregator, Joiner, Update Strategy, SQL, Stored Procedure and more.
- Worked on the Informatica Data Quality 9.1 (IDQ) toolkit; performed data profiling, cleansing and matching, and imported data quality files as reference tables.
- Excellent knowledge of Slowly Changing Dimensions (SCD Type 1, SCD Type 2, SCD Type 3), Change Data Capture, Dimensional Data Modeling, the Ralph Kimball approach, Star/Snowflake modeling, Data Marts, OLAP, Fact and Dimension tables, and Physical and Logical data modeling.
- Experience in developing Transformations, Mapplets and Mappings in Informatica Designer and creating tasks using Workflow Manager to move data from multiple sources to target.
- Good knowledge of data quality measurement using IDQ and IDE.
- Worked in different phases of projects, including Requirements Gathering, Design, Development, Deployment, Testing and Maintenance.
- Good knowledge of the database architecture of OLTP and OLAP applications, and of Data Analysis.
- Worked with a wide variety of sources such as Relational Databases, Flat Files, Mainframes and XML files, and with scheduling tools like Control-M, Autosys and Informatica Scheduler.
- Strong skills and a clear understanding of requirements and of solutions to implementation issues throughout the Software Development Life Cycle (SDLC).
- Hands on experience in PL/SQL (Stored Procedures, Functions, Packages, Triggers, Cursors, Indexes) and UNIX Shell scripting.
- Expertise in Unit Testing, Integration Testing, System Testing and Data Validation for developed Informatica mappings.
- Hands on experience working in LINUX, UNIX and Windows environments.
- Experience in Production Support and in migrating code from DEV to QA to Production.
- Very good exposure to Oracle 11g/10g/9i, MS SQL Server 2008/2005, IBM DB2, Teradata databases.
- Excellent Verbal and Written Communication Skills. Have proven to be highly effective in interfacing across business and technical groups.
TECHNICAL SKILLS:
ETL Tools: Informatica Power Center 9.5.x/9.1/8.6.x/8.5.x, Informatica Power Exchange (PWX) 9.x, DataStage, Pentaho, Informatica Data Explorer (IDE), Informatica Data Quality (IDQ), DTS, SSIS.
Data Modeling: Star Schema, Snowflake Schema, Erwin 4.0, Dimension Data Modeling.
Databases: Oracle 11g/10g/9i, SQL Server 2008/2005, IBM DB2, Teradata 13.1/V2R6/V2R5, Sybase, MS Access
Scheduling Tools: Control-M, CA7 Schedule, CA ESP Workstation, Autosys, Informatica Scheduler.
Reporting Tools: Crystal Reports, Business Objects XI R2/XI 3.3, OBIEE 11g R1 (11.1.5).
Programming: SQL, PL/SQL, Transact SQL, HTML, XML, C, C++, Korn Shell, Bash, Perl, Python.
Operating Systems: Windows 7/XP/NT/95/98/2000, DOS, UNIX and LINUX
Other Tools: SQL*Plus, Toad, SQL Navigator, Putty, WINSCP, MS-Office, SQL Developer.
PROFESSIONAL EXPERIENCE:
Confidential, San Ramon, California
Sr. ETL Informatica Developer
Technical Environment: Informatica 9.6, Oracle 11g, Flat files, Mainframe, Toad for Oracle 11, Oracle SQL Assistant, JIRA, Tidal, SVN, Windows Server, UNIX, IDQ (IDE).
Responsibilities:
- Analyze business and functional specification documents and design Data Warehousing and Business Intelligence objects using ETL tools like Informatica Power Center (Designer, Workflow Manager, Workflow Monitor and Repository Manager).
- Develop new mappings and enhance existing ones using Informatica's various transformations such as Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer and Sequence Generator.
- The mappings involved extensive use of Mapping Parameters/Variables and reusable components such as Mapplets and Worklets.
- Design and conduct Unit Testing, Integration Testing, System Testing and Data Validation for developed Informatica mappings; walked the Business Analyst and Data Warehousing Architect through the ETL process design.
- Manage DevOps tickets and requests, and resolve complex data defects in the data warehouse by developing, modifying and testing ETL (Extract, Transform, Load) programs.
- Enhanced ETL code for better performance using partitioning and Pushdown Optimization in Informatica.
- Worked on Oracle partition swaps and built reusable components in Informatica to load data from the work table to the final target table (see the SQL sketch at the end of this section).
- Perform tuning and optimization on SQL queries, using Explain Plan and tkprof.
- Review and alter database programs to increase operating efficiency.
- Develop database code for the feature enhancement in ALMT project.
- Document, test, implement and provide ongoing support for Oracle and ETL applications.
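Illustrative sketch of the partition-swap load referenced above; all object names are hypothetical and the work table is assumed to match the target's structure:

    -- Load and validate the work table first, then swap it into the target
    -- partition as a (near) metadata-only operation.
    ALTER TABLE sales_fact
      EXCHANGE PARTITION p_2016_q1
      WITH TABLE sales_fact_work
      INCLUDING INDEXES
      WITHOUT VALIDATION;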
Confidential, Columbus, Ohio
Sr. ETL Informatica Developer
Technical Environment: Informatica 9.6, Oracle 11g, Teradata 15.1, Power Exchange 9.6, flat files, Mainframe, Toad for Oracle 11, Teradata SQL Assistant, Oracle SQL Assistant, HP Service Center, Service Now, RTC, Harvest Version Control tool, SVN, Windows Server, UNIX, CA ESP Workstation Scheduler, IDQ (IDE).
Responsibilities:
- Develop complex mappings by efficiently using various transformations, Mapplets, Mapping Parameters/Variables, Mapplet Parameters in Designer. The mappings involved extensive use of Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer, Sequence generator, SQL and Web Service transformations.
- Extensively used Power Center/Power Mart capabilities such as target overrides and Connected, Unconnected and Persistent Lookups.
- Demonstrated the ETL process Design with Business Analyst and Data Warehousing Architect.
- Resolved complex data defects in the data warehouse by developing, modifying, testing ETL (Extract, Transform, Load) programs.
- Conducted Data Analysis, helped Business Leads in understanding and designing new reports.
- Perform whiteboard sessions with the development team, the automation tester and the RA.
- Extensively worked on UNIX shell scripts for server Health Check monitoring such as Repository Backup, CPU/Disk space utilization, Informatica Server monitoring, UNIX file system maintenance/cleanup and scripts using Informatica Command line utilities.
- Automated job processing and established automatic email notifications to the concerned persons.
- Created automated process to monitor the space in data directory using Perl and Korn Shell.
- Coded Teradata BTEQ SQL scripts to load and transform data and to fix defects such as SCD Type 2 date chaining and duplicate rows (see the BTEQ sketch at the end of this section).
- Worked with users to successfully complete user acceptance testing.
- Analyzed data to answer time-critical research questions from business teams and customer-facing agents.
- Converted and redesigned ETL programs from Syncsort/DMExpress to Informatica PowerCenter.
- Worked with Vendor teams to onboard new file feeds, set up and test secure FTP, schedule flows to consume files in batch loads.
- Resolved many Informatica load failures by applying EBFs and patches.
- Migrated Informatica mappings/sessions/workflows and UNIX scripts to QA, UAT and Prod environments using the Harvest and SVN tools.
- Involved in Informatica, Teradata migration testing.
- Worked on retiring code, impact analysis, updating documentation of remediated ETL code.
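A minimal BTEQ-style sketch of the duplicate cleanup referenced above; the logon alias, table and column names are hypothetical, and only the latest row per natural key is kept:

    .LOGON tdprod/etl_user;

    -- Rebuild the dimension from a deduplicated copy, keeping the most
    -- recently loaded row for each (cust_id, eff_start_dt).
    CREATE TABLE cust_dim_dedup AS (
      SELECT *
      FROM   cust_dim
      QUALIFY ROW_NUMBER() OVER (
                PARTITION BY cust_id, eff_start_dt
                ORDER BY load_ts DESC) = 1
    ) WITH DATA PRIMARY INDEX (cust_id);

    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .QUIT 0;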
Confidential, Cleveland, Ohio
Sr. ETL Informatica Developer
Technical Environment: Informatica 9.5.1, Oracle 11g, Power Exchange 9.5.1, flat files, Mainframe, Toad for Oracle 11, Harvest Version Control tool, Windows Server, UNIX, CA7 Scheduler, Autosys, IDQ (IDE)
Responsibilities:
- Analyze business requirements, frame the business logic for the ETL process and maintain the ETL process using Informatica Power Center.
- Work extensively on various transformations like Normalizer, Expression, Union, Joiner, Filter, Aggregator, Router, Update Strategy, Lookup, Stored Procedure and Sequence Generator.
- Develop ETL Informatica mappings to load data into the staging area, extracting from mainframe files and databases and loading into the Oracle 11g target database.
- Create workflows and worklets for Informatica mappings.
- Write Stored Procedures and Functions to do Data Transformations and integrate them with Informatica programs and the existing applications.
- Work on SQL overrides for the generated SQL queries in Informatica.
- Perform unit testing to validate the data from different data sources.
- Design and develop PL/SQL packages, stored procedures, tables, views, indexes and functions; work with partitioned tables and automate partition drop/create in the Oracle database (see the PL/SQL sketch at the end of this section).
- Migrate the ETL application from the development environment to the testing environment.
- Perform data validation in the target tables using complex SQLs to make sure all the modules are integrated correctly.
- Perform Data Conversion/Data migration using Informatica PowerCenter.
- Tune performance to improve the data migration process.
- Analyze session log files to resolve mapping errors, identify bottlenecks and tune them for optimal performance.
- Create UNIX shell scripts for Informatica pre/post session operations.
- Automated the jobs using CA7 Scheduler.
- Worked on Direct Connect process to transfer the files between servers.
- Document and present production/support documentation for the developed components when handing over the application to the production support team.
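A minimal PL/SQL sketch of the partition drop/create automation referenced above; object names are hypothetical and a non-interval, range-partitioned monthly staging table is assumed:

    CREATE OR REPLACE PROCEDURE roll_stg_partitions AS
      v_label VARCHAR2(6)  := TO_CHAR(ADD_MONTHS(TRUNC(SYSDATE, 'MM'), 1), 'YYYYMM');
      v_bound VARCHAR2(10) := TO_CHAR(ADD_MONTHS(TRUNC(SYSDATE, 'MM'), 2), 'YYYY-MM-DD');
    BEGIN
      -- Drop the oldest partition of the staging table.
      FOR p IN (SELECT partition_name
                  FROM user_tab_partitions
                 WHERE table_name = 'STG_TXN'
                   AND partition_position = 1) LOOP
        EXECUTE IMMEDIATE
          'ALTER TABLE stg_txn DROP PARTITION ' || p.partition_name;
      END LOOP;

      -- Add the range partition that will hold next month's data.
      EXECUTE IMMEDIATE
        'ALTER TABLE stg_txn ADD PARTITION p_' || v_label ||
        ' VALUES LESS THAN (DATE ''' || v_bound || ''')';
    END roll_stg_partitions;
    /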
Confidential, Dearborn, MI
Sr. ETL Informatica Developer
Technical Environment: Informatica Power Center 9.1, Power Exchange, PL/SQL, Oracle 11g, SQL Server 2005/2000, Windows Server 2003, UNIX, Control-M, Toad
Responsibilities:
- Worked with Business analysts and the DBA for requirements gathering, business analysis and designing of the data warehouse.
- Involved in creating Logical and Physical Data Models and creating Star Schema models.
- Assisted in designing Star Schema to design dimension and fact tables.
- Involved in developing star schema model for target database using ERWIN Data modeling.
- Created complex mappings using Unconnected and Connected Lookups, Aggregator and Router transformations to populate target tables efficiently.
- Created Mapplets and used them in different mappings.
- Used the Sorter transformation and dynamic Lookups.
- Worked on the Informatica Data Quality 9.1 (IDQ) toolkit and performed data profiling, cleansing and matching.
- Created events and tasks in the workflows using Workflow Manager.
- Developed and tuned Informatica mappings, using PL/SQL procedures/functions to build the business rules that load the data (see the PL/SQL sketch at the end of this section).
- Created Schema objects like Indexes, Views and Sequences.
- Provided daily production support for scheduled jobs and resolved and fixed failed jobs.
- Created Stored Procedures, Functions and Indexes Using Oracle.
- Worked on batch processing and file transfer protocols.
- Performance-tuned long-running workflows.
- Analyzed enhancement requests coming from individual clients and implemented them.
- Created technical documents.
- Developed mappings for policy, claims dimension tables.
- Worked with database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects and hierarchies.
- Simulated a real-time transaction feed by fabricating the transactions required for UAT and manually calculating expected results ahead of the testing process.
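A minimal sketch of the kind of business-rule function the mappings called through a Stored Procedure transformation; the rule, function name and parameters are hypothetical:

    CREATE OR REPLACE FUNCTION f_claim_status (
      p_paid_amt  IN NUMBER,
      p_closed_dt IN DATE
    ) RETURN VARCHAR2 DETERMINISTIC IS
    BEGIN
      -- Derive a claim status code for the claims dimension load.
      IF p_closed_dt IS NULL THEN
        RETURN 'OPEN';
      ELSIF NVL(p_paid_amt, 0) = 0 THEN
        RETURN 'CLOSED_NO_PAYMENT';
      ELSE
        RETURN 'CLOSED_PAID';
      END IF;
    END f_claim_status;
    /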
Confidential, Hartford, CT
ETL Informatica Developer
Technical Environment: Informatica Power Center 9.1, Informatica Power Exchange, Oracle 10g, PL/SQL, DB2, Teradata 13.10, Clear Case, Erwin, Business Objects Info view, Windows, UNIX
Responsibilities:
- Analyzed business requirements, technical specification, source repositories and physical data models for ETL mapping and process flow.
- Created detailed Technical specifications for Data Warehouse and ETL processes.
- Involved in design phase of logical and physical data model using Erwin 4.0.
- Worked on Informatica - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
- Extracted/loaded data from/into diverse source/target systems like Oracle, XML and Flat Files.
- Used most of the transformations such as Source Qualifier, Expression, Aggregator, Filter, Connected and Unconnected Lookups, Joiner, Update Strategy and Stored Procedure.
- Developed mappings to load Fact and Dimension tables, including SCD Type I and SCD Type II dimensions and incremental loads, and unit tested the mappings (see the SQL sketch at the end of this section).
- Prepared the low-level technical design document, participated in the build/review of BTEQ, FastExport, MultiLoad and FastLoad scripts, and reviewed Unit Test Plans and System Test cases.
- Created new and enhanced existing stored procedure SQL used for semantic views and load procedures for materialized views.
- Created BTEQ scripts to extract data from EDW to the Business reporting layer.
- Developed BTEQ scripts for validation and testing of the sessions, data integrity between source and target databases and for report generation.
- Loaded data from various data sources into the Teradata production and development warehouses using BTEQ, FastExport, MultiLoad and FastLoad.
- Used the Teradata Administrator and Teradata Manager tools to monitor and control the systems.
- Addressed many performance issues on ETL jobs, semantic views, stored procedures, Reporting and Ad-hoc SQL.
- Involved in creating different types of reports including OLAP, Drill Down and Summary in BO.
- Worked in the Performance Tuning of SQL, ETL and other processes to optimize session performance.
- Created Reusable transformations, Mapplets and Worklets using Transformation Developer, Mapplet Designer and Worklet Designer.
- Tuned Informatica mappings to improve the execution time by applying suitable Partitioning mechanisms and tuning individual transformations inside the mapping.
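A sketch of the SCD Type II pattern the mappings implemented, expressed as two SQL steps (expire the changed current rows, then insert the new versions); the dimension, staging and column names are hypothetical:

    -- Step 1: close out current rows whose tracked attribute changed.
    UPDATE cust_dim d
       SET d.eff_end_dt  = CURRENT_DATE - 1,
           d.current_flg = 'N'
     WHERE d.current_flg = 'Y'
       AND EXISTS (SELECT 1
                     FROM cust_stg s
                    WHERE s.cust_id = d.cust_id
                      AND s.addr   <> d.addr);

    -- Step 2: insert a new current version for changed and brand-new keys.
    INSERT INTO cust_dim (cust_id, addr, eff_start_dt, eff_end_dt, current_flg)
    SELECT s.cust_id, s.addr, CURRENT_DATE, DATE '9999-12-31', 'Y'
      FROM cust_stg s
     WHERE NOT EXISTS (SELECT 1
                         FROM cust_dim d
                        WHERE d.cust_id     = s.cust_id
                          AND d.current_flg = 'Y'
                          AND d.addr        = s.addr);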
Confidential
Informatica ETL Prod Support
Technical Environment: Informatica Power Center 8.5.x, ETL, Oracle 9i, MS Access, SQL, PL/SQL, Windows NT
Responsibilities:
- Interacted with business analysts, data architects, application developers to develop a data model.
- Designed and developed complex aggregate, join, look up transformation rules (business rules) to generate consolidated (fact/summary) data identified by dimensions using Informatica Power Center 8.5.x tool.
- Created sessions, database connections and batches using Informatica Server Manager/Workflow Manager.
- Optimized mappings, sessions/tasks, source and target databases as part of the performance tuning.
- Designed ETL Mappings and Workflows using Power Center Designer and Workflow Manager.
- Involved in the extraction, transformation and loading of data from source flat files and RDBMS tables to target tables.
- Developed various Mappings, Mapplets and Transformations for data marts and Data warehouse.
- Used Shell Scripting to automate the loading process.
- Used Excel VBA macros to compare data as a proof of concept for unit testing.
- Used Pipeline Partitioning feature in the sessions to reduce the load time.
- Analyzed and Created Facts and Dimension Tables.
- Used Informatica features to implement Type I and II changes in slowly changing dimension tables.
- Used command line program pmcmd to communicate with the server to start and stop sessions and batches, to stop the Informatica server and recover the sessions.
- Designed processes to extract, transform and load data to the Data Mart.
- Performed impact analysis and remedial actions, and documented process and application failures related to Informatica ETL and Power Analyzer.
- Performed regular backups of the data warehouse and of various production and development repositories, including automating and scheduling the backup processes; as part of the optimization process, made design changes to Informatica mappings, transformations and sessions.