
Senior ETL Informatica Developer Resume


Austin, TX

SUMMARY:

  • 7+ years of experience in Information Technology with emphasis on Data Warehouse/Data Mart development, including strategies for extraction, transformation, and loading (ETL) of data from various database sources using Informatica PowerCenter.
  • Experienced with the full Software Development Life Cycle (Planning, Analysis, Design, Development, Testing, Integration and Support).
  • Extensively worked with different connectors (AWS, Workday, Twitter, Glassdoor) to retrieve data using ICS (Informatica Cloud Services).
  • Excellent knowledge of Slowly Changing Dimensions (SCD Type 1, SCD Type 2, SCD Type 3), Change Data Capture, Dimensional Data Modeling, the Ralph Kimball approach, Star/Snowflake modeling, Data Marts, OLAP, Fact and Dimension tables, and Physical and Logical data modeling.
  • Extensive experience writing UNIX shell scripts and automating ETL processes using UNIX shell scripting.
  • Strong work experience across the Data Warehouse life cycle; performed ETL procedures to load data from sources such as SQL Server, Oracle, Mainframe, Teradata and flat files into data marts and the data warehouse using Informatica PowerCenter Designer, Workflow Manager, and Workflow Monitor.
  • Excellent at designing ETL procedures and strategies to extract data from heterogeneous source systems such as Oracle 12c/11g, SQL Server 2013/2009, DB2 10, flat files, XML and SAP R/3.
  • Created mappings in mapping Designer to load data from various sources using transformations like Transaction Control, Lookup (Connected and Un-connected), Router, Filter, Expression, Aggregator, Joiner and Update Strategy, SQL, Stored Procedure and more.
  • Experience with IDQ and MDM, with knowledge of Big Data Edition integration with Hadoop and HDFS.
  • Experience in managing and performing data cleansing, de-duplication and harmonization of data received from on-premise and cloud sources.
  • Developed Oracle PL/SQL stored procedures, functions, packages and SQL scripts to support the functionality of various modules.
  • Extensive knowledge of various Performance Tuning Techniques on Sources, Targets, Mappings and Workflows using Partitions/Parallelization and eliminating Cache Intensive Transformations.
  • Strong RDBMS concepts and experience in creating, maintaining and tuning Views, Stored Procedures, User Defined Functions and System Functions using SQL Server and T-SQL.
  • Involved in the Analysis, Design, Development, Testing and Implementation of business application systems for Health care, Pharmaceutical, Financial, Telecom and Manufacturing Sectors.
  • Hands on experience working in LINUX, UNIX and Windows environments.
  • Good knowledge of data quality measurement using IDQ and IDE.
  • Strong experience in Dimensional Modeling using Star and Snow Flake Schema, Identifying Facts and Dimensions, Physical and logical data modeling using Erwin and ER-Studio.
  • Proficient with Informatica Data Quality (IDQ) for data cleansing and standardization in the staging area.
  • Designed and Developed IDQ mappings for address validation / cleansing, doctor master data matching, data conversion, exception handling, and report exception data.
  • Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.
  • Experience in handling initial/full and incremental loads (a minimal sketch of the incremental pattern follows this summary).
  • Expertise in scheduling workflows using Windows Scheduler, UNIX cron, and scheduling tools like Control-M and Autosys.
  • Designed, Installed, Configured core Informatica/Siperian Hub components such as Informatica Hub Console, Hub Store, Hub Server, Cleanse Match Server, Cleanse Adapter, IDD & Data Modeling.
  • Experience in support and knowledge transfer to the production team.
  • Worked with Business Managers, Analysts, Development, and end users to correlate Business Logic and Specifications for ETL Development.
  • Experienced in Quality Assurance, Manual and Automated Testing Procedures with active involvement in Database/ Session/ Mapping Level Performance Tuning and Debugging.
  • Excellent communication, presentation, project management skills, a very good team player and self-starter with ability to work independently and as part of a team.
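
A minimal sketch of the incremental-load pattern mentioned above, assuming a hypothetical ORDERS source with a LAST_UPDATED column, a STG_ORDERS staging table and an ETL_BATCH_CONTROL high-water-mark table (all object names are illustrative):

    #!/bin/ksh
    # Incremental load sketch: pull only rows changed since the last successful run.
    # STG_ORDERS, ORDERS and ETL_BATCH_CONTROL are illustrative names.
    sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'EOF'
    WHENEVER SQLERROR EXIT FAILURE
    -- Stage only the delta beyond the stored high-water mark
    INSERT INTO stg_orders
    SELECT o.*
      FROM orders o
     WHERE o.last_updated > (SELECT last_load_ts
                               FROM etl_batch_control
                              WHERE job_name = 'ORDERS_INCR');
    -- Advance the high-water mark only after the stage load succeeds
    UPDATE etl_batch_control c
       SET c.last_load_ts = (SELECT NVL(MAX(s.last_updated), c.last_load_ts)
                               FROM stg_orders s)
     WHERE c.job_name = 'ORDERS_INCR';
    COMMIT;
    EOF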

TECHNICAL SKILLS:

ETL Tools: Informatica PowerCenter 10.x/9.5.x/9.1/8.6.x/8.5.x, Informatica PowerExchange 9.x, Big Data Edition 9.6.1, DataStage, Pentaho, Informatica Data Explorer (IDE), Informatica Data Quality (IDQ), SSIS.

Data Modeling: Star Schema, Snowflake Schema, Erwin 4.0, Dimensional Data Modeling.

Databases: Oracle 12c/11g, SQL Server 2013/2009, IBM DB2, Netezza, Sybase, MS Access

Scheduling Tools: Control-M, CA7 Scheduler, CA ESP Workstation, Autosys, Informatica Scheduler

Programming: SQL, PL/SQL, Transact SQL, HTML, XML, C, C++, Korn Shell, Bash, Perl

Operating Systems: Windows 7/XP/NT/95/98/2000, DOS, UNIX and LINUX

Big Data / Hadoop: HDFS, Hive, Spark, HBase, and Sqoop.

Other Tools: AWS, SQL*Plus, Toad, SQL Navigator, Putty, WinSCP, MS Office, SQL Developer.

PROFESSIONAL EXPERIENCE:

Senior ETL Informatica Developer

Confidential, Austin, TX

Responsibilities:

  • Enhanced and added performance measures to the HHS performance management dashboard being developed to bridge the gap between now and the availability of the PMAS solution.
  • Augmented existing staff in the implementation of high priority performance measures across HHS divisions.
  • Used Informatica PowerCenter 10.2.1 for extraction, transformation and loading (ETL) of data into the Data Mart.
  • Involved in handling and selecting heterogeneous data sources like Oracle, DB2, SQL server and Flat Files.
  • Developed Complex database objects like Stored Procedures, Functions, Packages, Triggers, Tables, Views, Indexes, Constraints, Synonyms, Materialized Views, Partition Tables using SQL.
  • Extensively worked in the performance tuning of Teradata SQL, ETL and other processes to optimize session performance.
  • Loaded various Key Performance Indicator (KPI) fact tables and multiple dimension tables for RMS using Informatica.
  • Developed mappings in Informatica to load the data including facts and dimensions from various sources into the Data Warehouse, using different transformations like Source Qualifier, JAVA, Expression, Lookup, Aggregate, Update Strategy and Joiner.
  • Built several BTEQ scripts to load data from staging to base tables, applying several Teradata SQL performance techniques (a minimal sketch follows this list).
  • Created Informatica workflows and IDQ mappings for - Batch and Real Time.
  • Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Erwin to design the business process, dimensions and measured facts.
  • Worked extensively on Informatica Partitioning when dealing with huge volumes of data and also partitioned the tables in Teradata for optimal performance.
  • Extensively worked on Star Schema, Snowflake Schema, data modeling, logical and physical models, data elements, issue/question resolution logs, source-to-target mappings, interface matrix and design elements.
  • Designed and developed logical and physical data models utilizing concepts such as Star Schema, Snowflake Schema and Slowly Changing Dimensions.
  • Involved in AWS Data Migration Services and Schema Conversion with ETL.
  • Developed mappings to load into staging tables and then to Dimensions and Facts.
  • Worked on data modeling: dimensional modeling, E-R modeling, and OLTP and OLAP in data analysis.
  • Developed mapping parameters and variables to support SQL override.
  • Worked extensively in Informatica Designer to design a robust end-to-end ETL process involving complex transformations such as Source Qualifier, Lookup, Update Strategy, Router, Aggregator, Sequence Generator, Filter, Expression, Stored Procedure, External Procedure and Transaction Control for efficient extraction, transformation and loading of data to staging and then to the Data Mart (Data Warehouse), verifying the complex logic for computing the facts.
  • Implemented complex business rules in Informatica Power Center by creating re-usable transformations, and robust Mapplets.
  • Worked extensively in the full System Development Life Cycle, participating in requirement gathering, business analysis and user meetings.
  • Extracted data from DB2 database on Mainframes and loaded it into SET and MULTISET tables in the Teradata database by using various Teradata load utilities. Transferred large volumes of data using Teradata FastLoad, MultiLoad, and T-Pump.
  • Worked on complex SQL using Teradata functions, macros and stored procedures.
  • Extensively worked with Data Analyst and Data Modeler to design and to understand the structures of dimensions and fact tables and Technical Specification Document.
  • Extensively used the reusable transformation, mappings and codes using Mapplets for faster development and standardization.
  • Worked on developing of on-line transactional processing (OLTP), operational data store (ODS) and decision support system (DSS) (e.g., Data Warehouse) databases.
  • Involved in Performance Tuning in Informatica for source, transformation, targets, mapping and session.
  • Implemented Slowly Changing Dimensions Type-1, Type-2 approach for loading the target tables.
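
A minimal sketch of the stage-to-base BTEQ load mentioned in the BTEQ bullet above, with illustrative logon details and table names (stg_db.stg_sales, base_db.sales):

    #!/bin/ksh
    # Stage-to-base load via BTEQ; the NOT EXISTS guard keeps the insert idempotent,
    # and collecting statistics first is one of the performance techniques applied.
    bteq <<'EOF'
    .LOGON tdpid/etl_user,etl_password
    COLLECT STATISTICS ON stg_db.stg_sales COLUMN (sale_id);

    INSERT INTO base_db.sales (sale_id, store_id, sale_dt, amount)
    SELECT s.sale_id, s.store_id, s.sale_dt, s.amount
      FROM stg_db.stg_sales s
     WHERE NOT EXISTS (SELECT 1
                         FROM base_db.sales b
                        WHERE b.sale_id = s.sale_id);

    .IF ERRORCODE <> 0 THEN .QUIT 8
    .LOGOFF
    .QUIT 0
    EOF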

Environment: Informatica PowerCenter 10.2.1, Oracle 12c/11g, Teradata, SQL, Flat Files, DB2, MS SQL Server, Erwin, PL/SQL, UNIX Shell Scripting, Business Objects XIII, AWS.

Senior ETL Informatica Developer

Confidential, Cincinnati, OH

Responsibilities:

  • Gathered user Requirements and designed Source to Target data load specifications based on business rules.
  • Designed, Developed & Supported Extraction, Transformation & Load Process (ETL) for data migration with Informatica Power Center.
  • Used Informatica PowerCenter 10.2.1 and 9.6.1 for extraction, transformation and loading (ETL) of data into the Data Mart.
  • Worked with large scale migration from on-premise applications to cloud platforms (AWS).
  • Worked in designing and developing database solutions using SQL server and/or Cloud database solutions.
  • Participated in the review meetings with functional team to signoff the Technical Design document.
  • Involved in Design, Analysis, Implementation, Testing and support of ETL processes
  • Worked with the Informatica Data Quality 9.6 (IDQ) toolkit for analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 9.6.
  • Developed IDQ mappings using transformations such as Labeler, Standardizer, Case Converter, Match and Address Validator.
  • Extensively worked with MS SQL Server, file-based sources and targets, and data warehouse appliances such as Netezza and Teradata.
  • Worked on writing complex SQL queries on Oracle.
  • Developed various mappings using Mapping Designer and worked with Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.
  • Created complex mappings which involved Slowly Changing Dimensions, implementation of Business Logic and capturing the deleted records in the source systems.
  • Worked extensively with the connected lookup Transformations using dynamic cache.
  • Worked with complex mappings having an average of 15 transformations.
  • Coded PL/SQL stored procedures and successfully used them in the mappings.
  • Coded UNIX scripts to extract data from different relational systems into flat files for use as ETL source files, and to schedule the automatic execution of workflows (a pmcmd sketch follows this list).
  • Scheduled jobs using Informatica Scheduler and Jobtrac.
  • Created and scheduled sessions and jobs to run on demand, on schedule, or only once.
  • Monitored Workflows and Sessions using Workflow Monitor.
  • Performed Unit testing, Integration testing and System testing of Informatica mappings.
  • Involved in enhancements and maintenance activities of the data warehouse including tuning, modifying of stored procedures for code enhancements.
  • Responsible for determining the bottlenecks and fixing the bottlenecks with performance tuning at various levels like mapping level, session level, and database level.
  • Provided production support by monitoring the processes running daily.
  • Participated in weekly status meetings and conducting internal and external reviews as well as formal walk through among various teams and documenting the proceedings.
  • Coordinated with the offshore team and interacted directly with the client for clarifications and resolutions.
  • Introduced and created many project-related documents for future use.
  • Designed and developed ETL Mappings to extract data from Flat files and Oracle to load the data into the target database.
  • Developed several complex mappings in Informatica using a variety of PowerCenter transformations, Mapping Parameters, Mapping Variables, Mapplets and Parameter files in Mapping Designer.
  • Built complex reports using SQL scripts.
  • Created complex calculations, various prompts, conditional formatting and conditional blocking as required.
  • Created complex mappings to load the data mart & monitored them. The mappings involved extensive use of Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer and Sequence generator transformations.
  • Ran the workflows on a daily and weekly basis using workflow monitor.
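
A minimal sketch of the shell-driven workflow scheduling mentioned above, using pmcmd with placeholder service, domain, folder and workflow names:

    #!/bin/ksh
    # Kick off a PowerCenter workflow from cron and alert on failure.
    # Int_Svc, Dom_ETL, DM_SALES and wf_load_sales_dm are placeholders.
    INFA_USER=etl_user
    INFA_PWD=etl_password   # in practice, sourced from a secured file
    pmcmd startworkflow -sv Int_Svc -d Dom_ETL \
        -u "$INFA_USER" -p "$INFA_PWD" \
        -f DM_SALES -wait wf_load_sales_dm
    if [ $? -ne 0 ]; then
        echo "wf_load_sales_dm failed at $(date)" | mailx -s "ETL failure" etl-support@example.com
        exit 1
    fi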

Environment: Informatica PowerCenter 10.2.1/9.6.1/9.5/8.6.1, Informatica Data Quality (IDQ) 9.6, AWS, Oracle 11g/9i, PL/SQL, UNIX, SQL, SSIS, Netezza, Informatica Scheduler, SQL*Loader, SQL Developer, Framework Manager, Transformer, Teradata, TOAD, Windows Server 2008.

Senior Informatica PowerCenter Developer

Confidential, St Louis, MO

Responsibilities:

  • Assisted Business Analyst with drafting the requirements, implementing design and development of various components of ETL for various applications.
  • Worked closely with ETL Architect and QC team for finalizing ETL Specification document and test scenarios.
  • Experience in managed services for data ingestion/processing with hands on experience working in AWS.
  • Extracted data from Oracle databases, spreadsheets and CSV files, staged it in a single location, and applied business logic to load it into the central Oracle database.
  • Designed and developed Informatica Mappings and Sessions based on user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
  • Migrated code between environments and maintained code backups.
  • Worked in handling both on premise and on Cloud (AWS) ETL.
  • Integration of various data sources like Oracle, SQL Server, Fixed Width & Delimited Flat Files, DB2.
  • Involved in the Unit Testing and Integration testing of the workflows developed.
  • Extensively worked with Korn-Shell scripts for parsing and moving files and even for re-creating parameter files in post-session command tasks.
  • Imported Source/Target Tables from the respective databases and created reusable transformations like Joiner, Routers, Lookups, Filter, Expression and Aggregator and created new mappings using Designer module of Informatica.
  • Used Address Doctor to validate addresses and performed exception handling, reporting and monitoring of the system. Created rules as mapplets, Logical Data Objects (LDOs) and workflows; deployed the workflows as an application to run them and tuned the mappings for better performance.
  • Working with database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects and hierarchies.
  • Profiled data on Hadoop to understand the data and identify data quality issues.
  • Imported and exported data between relational databases and the Hadoop Distributed File System using Sqoop (a sketch follows this list).
  • Developed shell scripts for running batch jobs and scheduling them.
  • Handled User Acceptance Testing and System Integration Testing in addition to unit testing, using Quality Center as the bug-logging tool. Created and documented the Unit Test Plan (UTP) for the code.
  • Involved in Production Support
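
A minimal sketch of the Sqoop import mentioned above, with placeholder connection string, table name and target directory:

    #!/bin/ksh
    # Import a relational table into HDFS with Sqoop.
    # The JDBC URL, CLAIMS table and /data/raw/claims path are placeholders.
    sqoop import \
        --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
        --username etl_user \
        --password-file /user/etl/.db_pwd \
        --table CLAIMS \
        --target-dir /data/raw/claims \
        --num-mappers 4 \
        --fields-terminated-by '\t'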

Environment: Informatica PowerCenter 9.6, AWS, Oracle 11g, SQL Server, PL/SQL, UNIX, WinSCP, Big Data Edition 9.6.1, Hadoop, HDFS, Hive, Sqoop.

Informatica Developer

Confidential, Schenectady, NY

Responsibilities:

  • Responsible for meeting with business stakeholders and other technical team members to gather and analyze application requirements.
  • Worked on Source Analyzer, Target Designer, Mapping and Mapplet Designer, Workflow Manager and Workflow Monitor.
  • Created mappings for initial load in Power Center Designer using the transformations Expression, Router and Source Qualifier.
  • Created complex mappings for full load into target in Power Center Designer using Sorter, Connected Lookup, Unconnected Lookup, Update Strategy, Router, Union etc.
  • Created Mapplets to reuse all the set of transformations for all mappings.
  • Created mappings, mapplets and sessions for data loads and data cleansing, and enhanced existing mappings where changes were required using Informatica PowerCenter.
  • Developed system architecture and system design documentation; designed and arranged scalable, highly available and fault-tolerant systems on AWS.
  • Involved in developing logical and physical data models that capture the current state; developed and tested all Informatica data mappings, sessions and workflows involving several tasks.
  • Responsibilities included creating and scheduling the sessions.
  • Worked on SAS Data management.
  • Created various tasks to give various conditions in the workflows.
  • Involved in extracting data from Oracle and flat files. Developed and implemented various enhancements to the application in the form of production and new production rollouts.
  • Created parameters and variables for the reusable sessions.
  • Analyzed the various bottlenecks at source, target, mapping and session level.
  • Tuning of the mappings and SQL Scripts for a better performance.
  • Performed unit testing on the Informatica code by running it in the Debugger and writing simple test scripts in the database (one such reconciliation check is sketched after this list), tuning the code by identifying and eliminating bottlenecks for optimum performance.
  • Assigned work and provided technical expertise for the design and execution of ETL projects to onshore and offshore developers.
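
A minimal sketch of the kind of simple database test script mentioned in the unit-testing bullet above (table names and the load-date column are illustrative):

    #!/bin/ksh
    # Post-load reconciliation: compare staging and dimension row counts
    # for today's load window. stg_policy and dim_policy are illustrative.
    sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'EOF'
    WHENEVER SQLERROR EXIT FAILURE
    SELECT CASE WHEN src.cnt = tgt.cnt THEN 'PASS' ELSE 'FAIL' END AS result,
           src.cnt AS source_rows,
           tgt.cnt AS target_rows
      FROM (SELECT COUNT(*) AS cnt FROM stg_policy WHERE load_dt >= TRUNC(SYSDATE)) src,
           (SELECT COUNT(*) AS cnt FROM dim_policy WHERE load_dt >= TRUNC(SYSDATE)) tgt;
    EOF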

Environment: Informatica 8.6, IDQ 8.6.1, Teradata, Oracle 10g, PL/SQL, DB2, XML, SQL*Plus, MS Excel, UNIX (AIX), UNIX Shell

ETL/Informatica Developer

Confidential, Mclean, VA

Responsibilities:

  • Understanding the Business Design Documents & creating an overall design based on requirements & Business reviews.
  • Designed and reviewed the ETL solutions in Informatica Power Center.
  • Analyzed the requirements to identify the necessary tables that need to be populated into the staging area.
  • Developed Informatica mappings to load data into various fact tables and dimension tables.
  • Created Mappings using Mapping Designer to load data from various sources like Oracle, Flat Files, MS SQL Server and XML.
  • Used various transformations of Informatica, such as Source Qualifier Transformation, Expression Transformation, Look-up transformation, Update Strategy transformation, Filter transformation, Router transformation, Joiner transformation etc. for developing Informatica mappings.
  • Prepared ETL standards, naming conventions and wrote ETL flow documentation.
  • Importing source and target tables from their respective databases.
  • Created and Monitored Workflows using Workflow Manager and Workflow Monitor.
  • Used shortcuts (Global/Local) to reuse objects without creating multiple objects in the repository and inherit changes made to the source automatically.
  • Performed Repository Administration tasks (Creating Repositories, Users, Assigning privileges, creating backups and recovery).
  • Involved in designing the ETL testing strategies for functional, integration & system testing for Data warehouse implementation.
  • Implemented versioning of folders in the Informatica Designer tool.
  • Used parameter files to specify DB connection parameters for sources (a sketch follows this list).
  • Used debugger to test the mapping and fixed the bugs.
  • Created mappings with flat files from different ERP systems, used for Change Data Capture to reduce the load on SAP BI.
  • Prepared unit test cases to meet the business requirements and performed unit testing of mappings.
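
A minimal sketch of a generated parameter file of the kind mentioned above, supplying DB connection parameters per environment (the folder, workflow, connection names and path are placeholders):

    #!/bin/ksh
    # Regenerate the workflow parameter file before each run.
    ENV=${1:-DEV}   # target environment passed as the first argument
    cat > /infa/parms/wf_load_claims.par <<EOF
    [DM_CLAIMS.WF:wf_load_claims]
    \$DBConnection_SRC=Oracle_Claims_${ENV}
    \$DBConnection_TGT=Oracle_DW_${ENV}
    \$\$LoadDate=$(date +%Y-%m-%d)
    EOF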

Environment: Informatica PowerCenter 9.5.1, Teradata 14, Oracle 11g, SQL Server 2012, PL/SQL, T-SQL, Toad, Erwin, Teradata SQL Assistant, SQL Server Management Studio, JIRA, Unix, Win 7.

ETL Developer

Confidential

Responsibilities:

  • Analyzed the project plan to gather the business requirements.
  • Designed and Customized data models for Data Mart supporting data from multiple sources on real time.
  • Extracted data from flat files, Oracle, SQL*Plus and MS SQL Server 2008, and loaded the data into the target database.
  • Extensively used Informatica PowerCenter 7.1, an ETL tool, to extract, transform and load data from remote sources to the DW.
  • Created complex mappings using transformations like Filter, Expression, Joiner, Aggregator, Router and Stored Procedure to populate target tables efficiently.
  • Developed complex joins in the mappings to process data from different sources.
  • Used Informatica debugging techniques to debug the mappings and used session log files and bad files to trace errors of target data load.
  • Developed workflow tasks such as reusable Email, Event Wait, Timer, Command and Decision tasks.
  • Performed unit testing of Informatica sessions, batches and the target Data.
  • Involved in performance tuning of mappings, transformations and (workflow) sessions to optimize session performance.
  • Effectively utilized shared/ persistent caching techniques and incremental aggregation strategies to improve performance.
  • Designed and developed UNIX shell scripts as pre-session and post-session commands to automate loading, pulling, renaming and pushing data from and to different servers (a sketch follows this list).
  • Designed and developed SQL, PL/SQL and UNIX shell scripts.
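
A minimal sketch of a pre-session command script of the kind described above (host names and paths are placeholders):

    #!/bin/ksh
    # Pre-session: pull today's extract from a remote server, stamp it,
    # and point the session's fixed source name at the stamped copy.
    SRC_HOST=filesrv01
    SRC_DIR=/export/feeds
    STAGE_DIR=/infa/srcfiles
    STAMP=$(date +%Y%m%d)

    scp "$SRC_HOST:$SRC_DIR/customers.dat" "$STAGE_DIR/customers_$STAMP.dat" || exit 1
    ln -sf "$STAGE_DIR/customers_$STAMP.dat" "$STAGE_DIR/customers.dat"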

Environment: Informatica PowerCenter Designer 7, Oracle 9.x/10g, TOAD, Flat Files, UNIX, MS SQL Server 2008, SQL, PL/SQL, and SQL*Plus

Informatica ETL Developer

Confidential

Responsibilities:

  • Involved in business analysis and technical design sessions with business and technical staff to develop requirements document and ETL specifications.
  • Involved in designing dimensional modeling and data modeling using Erwin tool.
  • Created high-level Technical Design Document and Unit Test Plans.
  • Developed mapping logic using various transformations like Expression, Lookups (Connected and Unconnected), Joiner, Filter, Sorter, Update strategy and Sequence generator.
  • Wrote complex SQL override scripts at source qualifier level to avoid Informatica joiners and Look-ups to improve the performance as the volume of the data was heavy.
  • Responsible for creating workflows; created Session, Event, Command, Control, Decision and Email tasks in Workflow Manager.
  • Prepared user requirement documentation for mapping and additional functionality.
  • Extensively used ETL to load data with PowerCenter from source systems such as flat files into staging tables, then into the target Oracle database. Analyzed the existing systems and performed a feasibility study.
  • Analyzed the current system and programs and prepared gap-analysis documents.
  • Experience in performance tuning and optimization of SQL statements using SQL trace.
  • Involved in Unit, System integration, User Acceptance Testing of Mapping.
  • Used Transformation Developer to create the reusable Transformations.
  • Used Informatica Powercenter Workflow manager to create sessions, batches to run with the logic embedded in the mappings.
  • Wrote SQL Scripts for the reporting requirements and to meet the Unit Test Requirements.
  • Validated the Mappings, Sessions & Workflows, Generated & Loaded the Data into the target database.
  • Used Informatica's features to implement Type I and Type II changes in slowly changing dimension tables and developed complex mappings to facilitate daily, weekly and monthly loading of data.
  • Extensively worked on Oracle SQL's for Data Analysis and debugging.
  • Handled scripts for pre-validating source file structures before loading into staging by comparing source file headers against the baselined header.
  • Worked on Teradata utilities (MultiLoad, FastLoad, and Export/Import) to improve performance.
  • Wrote shell scripts and control files to load data into staging tables and then into Oracle base tables using SQL*Loader (sketched after this list).
  • Used the pmcmd command to automate PowerCenter sessions and workflows through UNIX.
  • Involved in troubleshooting existing ETL bugs.
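
A minimal sketch of the SQL*Loader staging load mentioned above (control-file contents, table and file names are illustrative):

    #!/bin/ksh
    # Generate a control file and load the flat file into the staging table.
    cat > stg_accounts.ctl <<'EOF'
    LOAD DATA
    INFILE 'accounts.dat'
    APPEND INTO TABLE stg_accounts
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    (account_id, account_name, open_dt DATE "YYYY-MM-DD", balance)
    EOF
    sqlldr userid="$DB_USER/$DB_PASS@$DB_TNS" control=stg_accounts.ctl log=stg_accounts.log
    rc=$?
    # sqlldr exit codes on UNIX: 0=success, 1=fail, 2=warning, 3=fatal
    if [ $rc -ne 0 ] && [ $rc -ne 2 ]; then
        exit 1
    fi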

Environment: Informatica PowerCenter 8.6, ETL, Flat Files, Oracle 10g, MS SQL Server 2008, PL/SQL, Shell Programming, TIBCO, SQL*Loader, Toad, Excel, UNIX scripting, Sun Solaris, Windows 2002
