
ETL Informatica Developer Resume


Kansas City, MO

PROFESSIONAL SUMMARY

  • 7+ years of experience in Information Technology, with emphasis on Data Warehouse/Data Mart development and on developing strategies for Extraction, Transformation, and Loading (ETL) in Informatica PowerCenter … from various database sources.
  • Strong work experience across the ETL development life cycle; built ETL procedures to load data from different sources into data marts and the data warehouse using PowerCenter Designer, Workflow Manager, and Workflow Monitor.
  • Involved in Informatica upgrade projects from one version to another.
  • Involved in understanding Business Processes and identifying Dimensions and Facts for OLAP applications.
  • Strong understanding of Data Modeling.
  • Comprehensive experience working with Type 1 and Type 2 methodologies for Slowly Changing Dimension (SCD) management.
  • Proficient in the Integration of various data sources with multiple relational databases like Oracle, Teradata, MS SQL Server, DB2, Mainframe, MySQL and Flat Files into the staging area, ODS, Data Warehouse and Data Mart.
  • Extensive experience in developing Stored Procedures, Functions, Triggers and Complex SQL queries.
  • Used Informatica Power Connect for SAP to pull data from SAP R/3.
  • Performed data profiling and analysis using Informatica Data Quality (IDQ).
  • Experience using the Debugger to validate mappings and to gather troubleshooting information about data and error conditions.
  • Well versed in Data modeling concepts incorporating dimensional modeling (star and snowflake schema), logical and physical data modeling.
  • Experience with Informatica Designer tools: transformations, reusable transformations, mappings, and mapplets; DT Studio, including DT/Engine, DT/Designer, and DT Console.
  • Experience working in … with database objects like triggers, stored procedures, functions, packages, views, and indexes.
  • Proficient in performance tuning of Informatica mappings, transformations, and sessions; experienced in optimizing query performance.
  • Worked extensively with Informatica Workflow Manager (using tools such as Task Developer, Worklet and Workflow Designer, and Workflow Monitor) to build and run workflows.
  • Experienced in using Informatica Data Quality (IDQ) tools for data analysis.
  • Expert in using Informatica PowerExchange CDC (Change Data Capture) with Oracle databases, including DC (Data Conversion).
  • Good working knowledge of various Informatica Designer transformations like Source Qualifier, Dynamic and Static Lookups (connected and unconnected), Expression, Filter, Router, Joiner, Normalizer, and Update Strategy; also worked extensively on ETL processes using SSIS packages.
  • Extensively used Teradata utilities like TPump, FastLoad, MultiLoad, BTEQ, and FastExport.
  • Involved in SQL performance tuning; worked on identifying and resolving performance bottlenecks in Informatica.
  • Experience in integrating Hadoop with Informatica PowerCenter, including moving Hadoop processes into Informatica; performed end-to-end development for the data the client needed.
  • Experience in task automation using UNIX scripts, job scheduling, and communicating with the server using pmcmd.
  • Extensively used Autosys, Control-M and Maestro for Job monitoring and scheduling along with Production on call support.
  • Used Data Masking transformation to change sensitive production data to realistic test data for non-production environments.
  • Highly motivated and goal-oriented individual with a strong background in SDLC Project Management and Resource Planning using Agile and Waterfall methodologies.
  • Excellent interpersonal and communication skills.
  • Extensive experience in Documentation of Data Assessment, Data Mapping, Code Review, Test plan and operations.
  • Extensive experience on Agile Methodology and the scrum process.
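The Type 1/Type 2 SCD work summarized above follows a standard pattern; a minimal sketch in plain Python (purely illustrative — in practice this was Update Strategy/Lookup logic inside Informatica mappings, and the `cust_id`/`city` field names are hypothetical):

```python
from datetime import date

# Minimal Type 2 SCD sketch: expire the current row and insert a new
# version when a tracked attribute changes. Dimension rows are dicts;
# in a real warehouse this logic lives in an Update Strategy mapping.
def apply_scd2(dimension, incoming, today=date(2024, 1, 1)):
    for row in incoming:
        current = next((d for d in dimension
                        if d["cust_id"] == row["cust_id"] and d["is_current"]),
                       None)
        if current is None:
            # brand-new key: insert the first version
            dimension.append({**row, "eff_date": today, "end_date": None,
                              "is_current": True})
        elif current["city"] != row["city"]:
            # tracked attribute changed: close the old row, open a new one
            current["end_date"] = today
            current["is_current"] = False
            dimension.append({**row, "eff_date": today, "end_date": None,
                              "is_current": True})
        # unchanged rows fall through untouched; a Type 1 column would
        # instead be overwritten in place, keeping no history
    return dimension

dim = [{"cust_id": 1, "city": "KC", "eff_date": date(2020, 1, 1),
        "end_date": None, "is_current": True}]
apply_scd2(dim, [{"cust_id": 1, "city": "STL"}, {"cust_id": 2, "city": "NY"}])
print(len(dim))                        # 3 rows: closed v1, new v2, new customer
print([d["is_current"] for d in dim])  # [False, True, True]
```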

TECHNICAL EXPERTISE

ETL Tools: Informatica Power Center 9.x/8.x/7.x, Power Exchange, Informatica Data Quality, Informatica B2B data exchange, ODI (Oracle Data Integrator)

Methodologies and Modeling tools: SDLC, Waterfall, Agile Methodology, Gap Analysis, JAD, RUP, Quality Center, Quick Test Pro (QTP), Rational Rose, MS Visio, Requisite Pro

DBMS: Oracle 11g/10g/9i/8i, DB2, MS SQL Server 2005/2000, MS Access, Excel, Teradata (Fast-Load, Multi-Load, Fast Export, BTEQ), Teradata V2R6, Teradata 12.0 and 13.0, ODBC

Programming Languages: SQL, PL/SQL, Unix Shell scripting, Java

OLAP/Reporting Tools: Siebel Analytics, OBIEE 11.1.1.x/10.1.3.x, Business Objects 11.1.1.5, OBIA 7.9.x, Cognos, Scheduler (iBots)

Application Servers: Apache Tomcat 5.x/6.0, JBoss 4.0

Operating Systems: Windows XP/2008/2003/2000/NT/98/95, UNIX, LINUX

Other tools: VISIO, ERWIN, Control-M (Batch Scheduling), Toad 9.0/8.5.1, SQL *Loader, Putty, SQL Navigator 5.5, DAC (7.9.6,10.1.3.4, 10.1.3.4.1)

PROFESSIONAL EXPERIENCE

Confidential, Kansas City, MO

ETL Informatica Developer

Responsibilities:

  • Collaborated with Business analysts and DBA for requirements gathering, business analysis and designing of the data marts mainly in health care.
  • Worked on dimensional modeling to design and develop Star schemas, identifying Fact and Dimension tables to provide a unified view for consistent decision making.
  • Integrated data from diverse source systems, including Salesforce, Siebel, MS SQL Server, and flat files, using Informatica PowerCenter.
  • Accessed native database APIs using SQL in PowerExchange to provide high-performance extraction, conversion, and filtering of data without intermediate staging or programming.
  • Created mappings using various Transformations like Aggregator, Expression, Filter, Router, Joiner, Lookup, Update strategy, Source Qualifier, Sequence generator, Stored Procedure and Normalizer.
  • Versioning of the mappings was done in the Development Environment to keep track of the changes in the mappings.
  • Reviewed ETL development prior to production loads.
  • Troubleshot problems by checking session and error logs; also used the Debugger to troubleshoot complex problems.
  • Responsible for testing and validating Informatica mappings against the pre-defined ETL design standards.
  • Implemented performance tuning at the database level and at the Informatica level; reduced load time by using partitions and running concurrent sessions.
  • Expertise in writing SQL, PL/SQL, Stored Procedures, Triggers and Packages in Data Warehouse environments that employ Oracle.
  • Created various Autosys entries for scheduling various data cleansing scripts and loading.
  • Created Informatica folders, global sources, global targets, global ODBC connections, and global FTP connections; scheduled workflows, performed data profiling, and managed metadata.
  • Established Informatica best practices in terms of naming conventions.
  • Developed Interfaces using UNIX Shell Scripts to automate the bulk load, update Processes and Batch processes Scheduled Sessions and Batches in Server.
  • Associated with Production support team in various performances related issues.
  • Wrote shell scripts to automate exporting data into flat files for backup, deleting data from staging tables for a given time period, and sending regular emails with load statistics.
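The export-and-purge automation in the last bullet follows a backup-then-delete pattern; sketched below in plain Python standing in for the original shell scripts (the retention window, field names, and in-memory "flat file" are illustrative assumptions):

```python
import csv, io
from datetime import date, timedelta

# Sketch of the backup-then-purge pattern: export staging rows older than
# the retention window to a flat file, drop them from the staging set, and
# report counts (in production this was shell scripting plus Autosys).
def archive_staging(rows, retention_days=30, today=date(2024, 6, 1)):
    cutoff = today - timedelta(days=retention_days)
    exported = [r for r in rows if r["load_date"] < cutoff]
    kept = [r for r in rows if r["load_date"] >= cutoff]

    buf = io.StringIO()                      # stands in for the backup flat file
    writer = csv.DictWriter(buf, fieldnames=["id", "load_date"])
    writer.writeheader()
    writer.writerows({**r, "load_date": r["load_date"].isoformat()}
                     for r in exported)

    stats = {"exported": len(exported), "remaining": len(kept)}
    return kept, buf.getvalue(), stats       # stats would feed the status email

rows = [{"id": 1, "load_date": date(2024, 4, 1)},
        {"id": 2, "load_date": date(2024, 5, 25)}]
kept, flat_file, stats = archive_staging(rows)
print(stats)   # {'exported': 1, 'remaining': 1}
```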

Environment: Informatica Power Center 8.1, MS SQL Server 2003, Oracle 10g, Teradata, BTEQ, Flat files, Erwin, PL/SQL, UNIX Shell Scripting, Toad, Autosys, UNIX (Sun Solaris), Windows NT.

Confidential, Philadelphia

Informatica ETL Developer

Responsibilities:

  • Interacted with the Product Owners to gain an understanding of the business logic.
  • Participated in Agile team activities including daily standups, sprint planning, and product demos; worked with the Product Owner to understand requirements and translate them into the design, development, and implementation of ETL processes using Informatica 9.x.
  • Worked with business and technical resources to address business information and application needs.
  • Developed SQL and DB2 procedures, packages and functions to process the data for CGR Project (Complete Goods Reporting).
  • Involved in data validation, data integrity, database-related performance, field size validation, check constraints, and data manipulation and updates using SQL.
  • Extensively worked on Informatica data integration platform.
  • Extracted data from flat files, DB2, and SAP, and loaded the data into the respective tables.
  • Implemented various loads, such as Daily, Weekly, and Quarterly Loads, using an Incremental Loading Strategy for CGR.
  • Gained very good knowledge of the FACETS tool and the Healthcare domain; worked on modules such as Subscriber/Member, Groups, Enrollment, Claims, Billing, Accounting, Provider, MTM, and Utilization Management.
  • Developed Slowly Changing Dimension Mappings for Type 1 SCD, Type 2 SCD and fact implementation.
  • Extensively used Mapping Variables, Mapping Parameters, and Parameter Files for capturing delta loads.
  • Extensively used Normal Join, Full Outer Join, Detail Outer Join, and Master Outer Join in the Joiner transformation.
  • Used CSV files and tables as sources and loaded the data into relational tables.
  • Created and Configured Workflows, Worklets, and Sessions to load the data to Netezza tables using Informatica PowerCenter.
  • Extracted the raw data from Microsoft Dynamics CRM to staging tables using Informatica cloud.
  • Developed cloud mappings to extract the data for different regions.
  • Refreshing the mappings for any changes/additions to CRM source attributes.
  • Developed the audit activity for all the cloud mappings.
  • Worked with the business analyst team to analyze the final data and fix any issues.
  • Used pushdown optimization to achieve good performance in loading data into Netezza.
  • Responsible for determining the bottlenecks and fixing the bottlenecks with performance tuning.
  • Worked extensively on Informatica metadata tables and created reusable transformations for error handling and exception reprocessing.
  • Developed DVO table-to-table validation scripts and automated them in production.
  • Worked with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ.
  • Worked on Java code for testing the ETL framework.
  • Conduct code walkthroughs and review peer code and documentation.
  • Wrote UNIX scripts for handling source data and mappings.
  • Validated the ongoing data synchronization process using validation tools to ensure the data in the source and target systems stays synchronized.
  • Extensively used the ESP tool for scheduling Informatica batch jobs and provided production support on rotation.
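The Mapping Variable/Parameter File approach to delta loads mentioned above reduces to a watermark pattern: persist the high-water timestamp of the last run and extract only newer rows. A minimal sketch (plain Python; `updated_at` and the numeric timestamps are hypothetical):

```python
# Watermark-style incremental load: only rows newer than the last captured
# timestamp are extracted, and the watermark advances to the max seen —
# the same role a persisted Informatica mapping variable plays.
def delta_extract(source_rows, last_watermark):
    delta = [r for r in source_rows if r["updated_at"] > last_watermark]
    new_watermark = max((r["updated_at"] for r in delta), default=last_watermark)
    return delta, new_watermark

source = [{"id": 1, "updated_at": 100},
          {"id": 2, "updated_at": 205},
          {"id": 3, "updated_at": 310}]

delta, wm = delta_extract(source, last_watermark=200)
print([r["id"] for r in delta], wm)   # [2, 3] 310

# next run: nothing newer than the stored watermark, so nothing moves
delta2, wm2 = delta_extract(source, last_watermark=wm)
print(delta2, wm2)                    # [] 310
```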

Environment: Informatica Power Center 9.5.1, Business Objects, Oracle 10g, TOAD, Erwin, SQL, PL/SQL, XML, HP UNIX, Test Director/Quality Center

Confidential, Philadelphia

Informatica ETL Developer

Responsibilities:

  • Resolved reported bugs and various technical issues.
  • Designed and configured Informatica web services to automate eID requests using the Web Services Consumer transformation.
  • Assisted the ETL data warehouse and BI teams in generating reports using OBIEE and in writing SQL queries.
  • Designed ETL processes using Informatica to load data from Flat Files, and Excel files to target Oracle Data Warehouse database.
  • Performed data manipulations using various transformations like Joiner, Expression, Lookup, Aggregator, Filter, Update Strategy, Sequence Generator, and Stored Procedure.
  • Wrote pre-session and post-session scripts in mappings.
  • Created sessions and workflows for designed mappings. Redesigned some of the existing mappings in the system to meet new functionality.
  • Worked on Informatica utilities: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
  • Created Workflows and used various tasks like Email, Event-wait and Event-raise, Timer, Scheduler, Control, Decision, Session in the workflow manager.
  • Made use of Post-Session success and Post-Session failure commands in the Session task to execute scripts needed for cleanup and update purposes.
  • Implemented parallelism in loads by partitioning workflows using Pipeline, Round-Robin, Hash, Key Range and Pass-through partitions.
  • Validated Informatica mappings for source compatibility following version changes at the source.
  • Created RPD subject areas in Oracle BI Enterprise Edition 10g and 11g.
  • Used the Catalog manager and maintained the web catalog to manage Dashboards, Answers.
  • Created Repository Groups and Web Groups and assigned users to configure security in the OBIEE.
  • Developed many Reports / Dashboards with different Analytics Views (Drill-Down / Dynamic, Pivot Table, Chart, Column Selector, Tabular with global and local Filters) using OBIEE Presentation server.
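The partition types listed above differ only in how each row is assigned to a pipeline partition. A rough illustration (plain Python, not Informatica session logic; partition counts and key ranges are arbitrary):

```python
from itertools import cycle

# Row-to-partition assignment as used in session partitioning:
# round-robin balances blindly, hash groups equal keys together,
# key-range maps contiguous key intervals to fixed partitions.
def round_robin(rows, n):
    parts = [[] for _ in range(n)]
    for part, row in zip(cycle(parts), rows):
        part.append(row)
    return parts

def hash_partition(rows, key, n):
    parts = [[] for _ in range(n)]
    for row in rows:
        parts[hash(row[key]) % n].append(row)
    return parts

def key_range(rows, key, bounds):
    # bounds [100, 200] -> partitions (-inf,100), [100,200), [200,+inf)
    parts = [[] for _ in range(len(bounds) + 1)]
    for row in rows:
        idx = sum(row[key] >= b for b in bounds)
        parts[idx].append(row)
    return parts

rows = [{"id": i} for i in (5, 150, 250, 160)]
print([len(p) for p in round_robin(rows, 2)])      # [2, 2]
print([[r["id"] for r in p] for p in key_range(rows, "id", [100, 200])])
# [[5], [150, 160], [250]]
```

Pass-through, the remaining type, simply keeps rows in whichever partition they already occupy.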

Environment: Informatica 8.6.1, ODI, Oracle 9i/10g (SQL, PL/SQL), UNIX, UML, Clear Case, MS Visio, Word, PowerPoint and MS Excel.

Confidential, Milwaukee, WI

Informatica ETL Developer

Responsibilities:

  • Created detailed Technical specifications for the ETL processes.
  • Performed ILIT (Irrevocable Life Insurance Trust) implementation and replacement activities.
  • Assisted the team in the development of design standards and codes for effective ETL procedure development and implementation.
  • Used Informatica as the ETL tool, along with stored procedures, to pull data from source systems/files, then cleanse, transform, and load the data into databases.
  • Worked on Informatica Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformation Developer.
  • Developed Informatica mappings using various transformations, sessions, and workflows; SQL Server was the target database, while the sources were a combination of flat files, Oracle tables, PeopleSoft, Excel files, CSV files, etc.
  • Worked with different Caches such as Index cache, Data cache, Lookup cache (Static, Dynamic and Persistence) and Join cache while developing the Mappings.
  • Responsible for Unit Testing, Integration Testing and helped with User Acceptance Testing.
  • Involved with the DBA in performance tuning of Informatica sessions and workflows; created reusable transformations for better performance.
  • Optimized mappings and implemented complex business rules by creating reusable transformations and mapplets.
  • Involved in writing UNIX shell scripts for the Informatica ETL tool to run sessions.
  • Fixed and tracked mapping defects and implemented enhancements.
  • Managed post-production issues and delivered tasks/projects within specified timelines.
  • Involved in the mirroring of the staging environment to production.
  • Worked on modification of Actuate reports to upload and run reports on servers.
  • Worked on Autosys to schedule jobs, define dependencies, etc.
  • Collaborated with teams for migration and Production Deployment activities.
  • Scheduled Informatica jobs and implemented dependencies where necessary using Tidal Scheduler.
  • Responsible for performing SQL query optimization using Hints, Indexes and Explain plan.
  • Played a vital role in requirement gathering and in preparing the engineering requirement specification.
  • Used Perforce as a versioning tool for maintaining revision history for code.
  • Managed production issues and delivered all assignments/projects within specified timelines.
  • Worked on all phases of multiple projects from initial concept through research and development, implementation, QA, to live production, by strict adherence to project timelines.
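The static vs. dynamic lookup cache distinction above comes down to whether the cache is updated as rows pass through: a dynamic cache catches a key repeated within the same load, which a static cache would miss until the next run. A small sketch (plain Python; key names are hypothetical):

```python
# Dynamic lookup cache sketch: the cache starts from the target table and
# is inserted into as rows pass, so a key repeated within the same load is
# flagged as an update rather than a second insert.
def route_with_dynamic_cache(target_keys, incoming):
    cache = set(target_keys)          # built once, like an Informatica lookup cache
    inserts, updates = [], []
    for row in incoming:
        if row["key"] in cache:
            updates.append(row)
        else:
            inserts.append(row)
            cache.add(row["key"])     # the "dynamic" part: cache grows mid-load
    return inserts, updates

ins, upd = route_with_dynamic_cache(
    target_keys={"A"},
    incoming=[{"key": "A"}, {"key": "B"}, {"key": "B"}])
print(len(ins), len(upd))   # 1 2  (the repeated "B" is caught as an update)
```

A persistent cache adds a third axis: the same cache file is reused across session runs instead of being rebuilt.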

Environment: Informatica 8.6.1, ODI, Oracle 9i/10g (SQL, PL/SQL), UNIX, UML, Clear Case, MS Visio, Word, PowerPoint and MS Excel.

Confidential

Informatica ETL Developer

Responsibilities:

  • Involved in all phases of SDLC from requirement gathering, design, development, testing, Production, user training and support for production environment.
  • Created new mapping designs using various tools in Informatica Designer, like Source Analyzer, Warehouse Designer, Mapplet Designer, and Mapping Designer.
  • Created complex mappings that implemented business logic to load data into the staging area.
  • Used Informatica reusability at various levels of development.
  • Developed mappings/sessions using Informatica Power Center 8.6 for data loading.
  • Performed data manipulations using various Informatica Transformations like Filter, Expression, Lookup (Connected and Un-Connected), Aggregate, Update Strategy, Normalizer, Joiner, Router, Sorter and Union.
  • Developed workflows using Task Developer, Worklet Designer, and Workflow Designer in Workflow Manager, and monitored the results using Workflow Monitor.
  • Imported various heterogeneous files using Informatica Power Center 8.x Source Analyzer.
  • Implemented slowly changing dimension methodology for accessing the full history of accounts.
  • Optimized performance through tuning at the source, target, mapping, and session levels.
  • Participated in weekly status meetings, conducted internal and external reviews as well as formal walkthroughs among the various teams, and documented the proceedings.
  • Set up the Topology, including the physical architecture, logical architecture, and context, and created new models for the data sources: flat files, MS SQL Server, and Oracle.
  • Performed reverse engineering for the data sources and targets.
  • Worked extensively with ODI Designer, Operator, and Metadata Navigator.
  • Good understanding of ODI architecture and the installation of Topology Manager, Security Manager, agents, Designer, Operator, and the Master and Work repositories.
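Among the transformations used across these projects, the Router behaves like a set of parallel filters: each row is tested against every group condition, copied to each matching output group, and sent to the default group only if nothing matches. A minimal sketch (plain Python; the group conditions are hypothetical):

```python
# Router transformation sketch: one input pipeline fans out to named
# output groups by condition; rows matching no group land in the default.
def router(rows, groups, default="DEFAULT"):
    out = {name: [] for name in groups}
    out[default] = []
    for row in rows:
        matched = False
        for name, condition in groups.items():
            if condition(row):        # a row may satisfy several groups
                out[name].append(row)
                matched = True
        if not matched:
            out[default].append(row)
    return out

rows = [{"region": "EAST", "amt": 10},
        {"region": "WEST", "amt": 20},
        {"region": "SOUTH", "amt": 30}]

routed = router(rows, {"EAST": lambda r: r["region"] == "EAST",
                       "WEST": lambda r: r["region"] == "WEST"})
print({g: len(v) for g, v in routed.items()})
# {'EAST': 1, 'WEST': 1, 'DEFAULT': 1}
```

This is why a single Router usually replaces a chain of Filter transformations: the input is evaluated once instead of once per filter.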
