
ETL Informatica Developer Resume


Yardley, PA

SUMMARY:

  • 8+ years of professional experience as an ETL Developer in various business domains like Finance, Mortgage & Healthcare.
  • Extensive experience in analyzing large data sets and applying data mining to identify patterns and trends.
  • Proficient in the integration of various data sources such as Oracle, SQL Server, Teradata, DB2, XML Files, Fixed Width Flat Files, and Delimited Flat Files.
  • Good at OLAP data modeling for both Star and Snowflake schemas. Solid understanding of the Data Warehousing life cycle and strong working experience with Data Warehousing applications.
  • Experience in writing daily batch jobs and developing complex UNIX shell scripts for ETL automation.
  • Good at SQL tuning and Informatica performance tuning. Tuned Informatica sessions for large data files by increasing block size, data cache size, sequence buffer length and the target-based commit interval.
  • Extensive experience in Production scheduling.
  • Good command of developing Mappings, Mapplets, Sessions, Workflows, Worklets and Tasks using Informatica Designer and Workflow Manager.
  • Skilled in designing and developing complex mappings using Informatica transformations like Connected/Unconnected Lookup, Router, Filter, Expression, Aggregator, Normalizer, Joiner and Update Strategy.
  • Expertise in writing data clean-up scripts using SQL queries and UNIX scripts (a minimal sketch follows this summary).
  • Extensive experience in writing complex SQL queries in SQL Server, Oracle PL/SQL and MS Access; involved in production support.
  • Good experience in analyzing and resolving bottlenecks for performance tuning and SQL optimization. Used Query Analyzer and Execution Plan to optimize SQL queries.
  • Knowledge of ETL of data into a data warehouse/data mart with Informatica.
  • Proficient in data warehousing techniques such as data cleansing and slowly changing dimensions.
  • Experienced in designing ETL procedures and strategies to extract data from heterogeneous source systems like Oracle 11g/10g, SQL Server … DB2 10, flat files, XML, SAP R/3 etc.
  • Knowledge of Informatica Power Center 8.x, 9.x.
  • Proficient in building ETL interfaces using Designer, Workflow Manager, Workflow Monitor, Repository Manager, Repository Server Administrator and Central Management Console.
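
For illustration, a minimal sketch of the kind of data clean-up script referred to above: a UNIX shell wrapper around a SQL de-duplication step. The table name stg_customer, key column customer_id and the $DB_USER/$DB_PWD/$DB_SID connection variables are hypothetical, not taken from any specific project:

#!/usr/bin/ksh
# Remove duplicate rows from a staging table before the nightly load.
# All object and connection names here are illustrative.
sqlplus -s "$DB_USER/$DB_PWD@$DB_SID" <<'EOF'
WHENEVER SQLERROR EXIT FAILURE
DELETE FROM stg_customer s
 WHERE s.ROWID NOT IN (SELECT MIN(d.ROWID)
                         FROM stg_customer d
                        GROUP BY d.customer_id);
COMMIT;
EXIT
EOF
exit $?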

TECHNICAL SKILLS:

Database Design Tools and Data Modeling: MS Visio, ERWIN, Star Schema/Snowflake Schema modeling, FACT & Dimension tables, Physical & Logical data modeling and De-normalization techniques, Kimball & Inmon Methodologies

Databases: Oracle 10g/9i/8i/7.x, PL/SQL, MS SQL Server, Teradata, MS Access

Environment: UNIX, Windows XP

Others: Microsoft Office, XML, JavaScript, HTML, DHTML

PROFESSIONAL EXPERIENCE:

Confidential, Yardley, PA

ETL Informatica Developer

Responsibilities:

  • Interacted with the Business users to identify the process metrics and various key dimensions and measures for the services provided by DaVita to patients.
  • Developed ETL mappings to load data into data marts such as the Care Engine DataMart (CED), Patient Resistance DataMart (PRD) and Drug Trails DataMart (DTD), which are part of the EDW.
  • Created dataflow process diagrams from upstream systems (outbound) to various downstream (inbound) systems.
  • Coordinated with the functional team to provide input for the FRD (Functional Requirement Document) and the data architecture document, and communicated with the concerned stakeholders. Conducted impact and feasibility analysis.
  • Worked with the data modeling team to provide input for designing and developing a data model that keeps the ETL process simple and efficient.
  • Developed database triggers to enforce complex business logic and integrity constraints and to enhance data security at the database level.
  • Designed and developed Informatica mappings using transformations as needed.
  • Responsible for Pre and Post migration planning for optimizing Data load performance, capacity planning and user support.
  • Partitioned sources and used a persistent data cache for Lookups to improve session performance.
  • Created and debugged Stored Procedures, Functions, Packages, Cursors and Triggers using PL/SQL Developer.
  • Designed and developed UNIX shell scripts as part of the ETL process (a minimal sketch follows this list).
  • Created Stored Procedures to validate incoming data and resolve data discrepancies using data conversions.
  • Created bullet graphs to determine profit generation using measures and dimension data from Oracle, MS Access, SQL Server, PL/SQL and MS Excel.
  • Created various documents including high-level design documents, mapping documents and knowledge transfer documents.
  • Supported successful production deployment of the release.
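
For illustration, a minimal sketch of a pre-load shell step of the kind described above: archive the incoming flat file and run a validation stored procedure before the Informatica session starts. The directory paths, file-naming pattern, procedure name ced_stage.validate_patient_feed and the connection variables are assumptions, not actual project artifacts:

#!/usr/bin/ksh
# Illustrative pre-load step: archive the incoming file, then validate it.
SRC_DIR=/app/etl/inbound                    # assumed landing directory
ARC_DIR=/app/etl/archive                    # assumed archive directory
FILE=patient_feed_$(date +%Y%m%d).dat       # assumed file-naming pattern

# Fail fast if the feed is missing or empty.
[ -s "${SRC_DIR}/${FILE}" ] || { echo "missing or empty ${FILE}"; exit 1; }

# Keep a compressed copy of the raw feed for audit/restart purposes.
cp "${SRC_DIR}/${FILE}" "${ARC_DIR}/" && gzip -f "${ARC_DIR}/${FILE}"

# Run a validation procedure (name is illustrative); WHENEVER SQLERROR
# makes sqlplus return a non-zero code so the batch stops on bad data.
sqlplus -s "$DB_USER/$DB_PWD@$DB_SID" <<'EOF'
WHENEVER SQLERROR EXIT FAILURE
EXEC ced_stage.validate_patient_feed
EXIT
EOF
exit $?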

Environment: Informatica Power Center 9.6/9.5, Oracle 11g, PL/SQL, XML, Toad, UNIX Shell Scripting, SQL, MS Excel, Netezza, Putty.

Confidential, Dublin, OH

Informatica Developer

Responsibilities:

  • Gathered requirements and discussed the design plan with the Architect.
  • Attended daily scrum meetings for sprint work.
  • Exposure to all steps of the Software Development Life Cycle (SDLC); worked in a development environment using Agile methodology.
  • Improved the performance by making use of performance tuning techniques.
  • Developed code that promotes reusability and maintainability.
  • Worked on various source/target object definitions, including flat files (delimited and fixed-width), relational tables, CSV and XML.
  • Extensively used Informatica client tools - Source Analyzer, Warehouse designer, Mapping designer, Mapplet Designer, Transformation Developer, Informatica Repository Manager and Informatica Server Manager.
  • Refactored existing mappings to reflect updated requirements.
  • Implemented various Informatica mappings using transformations such as Normalizer, Joiner, Sorter, Aggregator, Lookup, Filter, Update Strategy and Router.
  • Tuned query performance to handle large volumes of data.
  • Analyzed and resolved bottlenecks for performance tuning and SQL optimization; used Query Analyzer and Execution Plan to optimize SQL queries (one such check is sketched after this list).
  • Created Stored Procedures to validate incoming data and resolve data discrepancies using data conversions.
  • Provided support and quality validation through test cases for all stages of Unit and Integration testing.
  • Created, deployed and scheduled jobs in the Tidal scheduler for the Integration, User Acceptance Testing and Production regions.
  • Raised change management requests to promote code, new tables, grant requests for tables, Linux parameter files and Tidal jobs.
  • Supported QA testing in each region using Health Rules and Health Answers; assisted with QA and Business sign-off.
  • Involved in reviewing business requirements and analyzing the data from various data sources for design, development, testing, and production rollover of reporting and analysis projects within Tableau Desktop.
  • Created views in Tableau Desktop that were published to internal team for review and further data analysis and customization using filters and actions.
  • Created dashboards according to user specifications and prepared stories to provide understandable visualizations. Involved in generating new KPIs for Scorecards.
  • Wrote the technical design document and application workbook, and handed the applications over to the production team.
  • Worked in Production support team.
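
For illustration, a minimal sketch of the kind of plan check used while tuning queries against large data volumes, using Oracle's EXPLAIN PLAN through sqlplus. The query, the table names (stg_claims, dim_member) and the connection variables are hypothetical:

#!/usr/bin/ksh
# Capture the execution plan of a candidate query so full table scans and
# expensive joins can be spotted before the code is promoted.
sqlplus -s "$DB_USER/$DB_PWD@$DB_SID" <<'EOF'
SET LINESIZE 200 PAGESIZE 0
EXPLAIN PLAN FOR
  SELECT m.member_id, SUM(c.paid_amt)        -- illustrative rollup query
  FROM   stg_claims c
  JOIN   dim_member m ON m.member_id = c.member_id
  GROUP BY m.member_id;
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY());
EXIT
EOF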

Environment: Informatica Power Center 9.1, Oracle 10g/9i, SQL Server 2008, Oracle SQL Developer, Tableau Desktop, Tableau Server, Tortoise SVN, Tidal Scheduler, Putty, WinSCP, HA non Production 5.1.5.

Confidential, Milwaukee, WI

ETL Developer

Responsibilities:

  • Performed business analysis, requirement analysis and converted them into technical specifications.
  • Developed mappings to load data from multiple sources across business units like Mortgage Servicing, Mortgage Insurance and Loan Origination.
  • Worked with the data modeling team to arrive at an optimized model for the accounting DataMart.
  • Created the mappings to load the data for activity/balance details for Sub-ledger and GL accounts for accounting needs.
  • Aggregated data for each balance term and activity at the SLS and GL account level.
  • Created ETL solution specifications document explaining the entire process flow of ETL from source to stage to target and reviewed the same with architecture team.
  • Developed mappings to Load the staging tables from flat files which are sent by message manager.
  • Implemented ETL techniques like Update/Insert/Rollup while loading data from staging to target (an upsert sketch follows this list).
  • Optimized the performance of the Informatica mappings by analyzing the session logs and understanding various bottlenecks (source/target/transformations).
  • Updated mappings and applied manual DB mods to add the new attributes to Aggregate tables as needed by Business Objects reporting needs.
  • Created workflows and implemented session dependencies between detail table and aggregate table loads.
  • Configured Sessions properties for better performance.
  • Supported various testing cycles during the SIT & UAT phases.
  • Supported the daily/weekly ETL batches in the Production environment.
  • Fixed bugs/defects, re-ran unit tests and delivered the code.
  • Responded promptly to business user queries and resolved ad hoc requests.
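
For illustration, a minimal sketch of how the Update/Insert logic from staging to target can be expressed outside the mapping, using an Oracle-style MERGE wrapped in a shell script. The gl_balance/stg_gl_balance tables, their columns and the connection variables are assumptions, not the actual accounting model:

#!/usr/bin/ksh
# Illustrative upsert from a staging table into a target balance table.
sqlplus -s "$DB_USER/$DB_PWD@$DB_SID" <<'EOF'
WHENEVER SQLERROR EXIT FAILURE
MERGE INTO gl_balance t
USING stg_gl_balance s
   ON (t.account_id = s.account_id AND t.period_id = s.period_id)
WHEN MATCHED THEN
  UPDATE SET t.balance_amt = s.balance_amt,
             t.updt_dt     = SYSDATE
WHEN NOT MATCHED THEN
  INSERT (account_id, period_id, balance_amt, load_dt)
  VALUES (s.account_id, s.period_id, s.balance_amt, SYSDATE);
COMMIT;
EXIT
EOF
exit $?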

Environment: Informatica 9.1, Sybase, Oracle 10g, Flat files, Business Objects R4, Teradata, FastLoad

Confidential, New York, NY

BI&ETL Developer

Responsibilities:

  • Involved in requirements gathering, analysis, function/technical specifications, development, deploying and testing.
  • Prepared and maintained documentation on all aspects of the ETL processes to support knowledge transfer to other team members.
  • Used Informatica Power Center for migrating data from various OLTP databases to the data mart.
  • Worked with different sources like Oracle, flat files, XML files, DB2, MS SQL Server.
  • Created mappings using Mapping Designer to load data from various sources, making use of Designer transformations like Source Qualifier, Connected and Unconnected Lookup, Expression, Filter, Router, Sorter, Aggregator, Joiner, Normalizer, Rank, Sequence Generator, Union and Update Strategy.
  • Created mapplets using Mapplet Designer and reused those mapplets for common business processes during development.
  • Worked on different tasks in workflows like Session, Event Raise, Event Wait, Decision, Email, Command, Worklet, Assignment and Timer, as well as workflow scheduling.
  • Performance tuning of Informatica designer and workflow objects.
  • Created pre/post session commands and pre/post session SQLs to perform tasks before and after the sessions.
  • Implemented slowly changing dimensions (Type I and Type II) for customer Dimension table loading.
  • Created UNIX KSH shell scripts to kick off Informatica workflows in batch mode.
  • Invoked Informatica using the “pmcmd” utility from the UNIX script (a minimal sketch follows this list).
  • Responsible for Unit testing and Integration testing of mappings and workflows.
  • Provided support for the applications after production deployment to take care of any post-deployment issues.
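
For illustration, a minimal sketch of the kind of KSH wrapper used to kick off a workflow with pmcmd. The integration service, domain, folder and workflow names, the log path, and the $INFA_USER/INFA_PWD variables are illustrative placeholders:

#!/usr/bin/ksh
# Start an Informatica workflow and propagate its status to the scheduler.
WF=wf_cust_dim_load                                   # illustrative name
LOG=/app/infa/logs/${WF}_$(date +%Y%m%d_%H%M%S).log   # assumed log path

# -wait makes pmcmd block until the workflow finishes, so the exit code
# reflects the actual load status; -pv reads the password from INFA_PWD.
pmcmd startworkflow -sv INT_SVC -d DOMAIN_PRD \
      -u "$INFA_USER" -pv INFA_PWD \
      -f FOLDER_DW -wait "$WF" > "$LOG" 2>&1
RC=$?

if [ $RC -ne 0 ]; then
    echo "${WF} failed with return code ${RC}; see ${LOG}"
    exit $RC
fi
echo "${WF} completed successfully"
exit 0
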
Environment: Informatica Power Center 9.1, Oracle 11g, Flat Files, Win7, SQL*Plus, Toad and UNIX

Confidential

Informatica/ETL Developer

Responsibilities:

  • Conducted Q&A sessions with the SMEs; gathered, analyzed and documented the requirements.
  • Extensively involved in analysis, design and data modeling.
  • Coordinated with the data modeler to create the Star Schema data model design using Erwin.
  • Developed ETL procedures to ensure conformity and compliance with CareFirst standards.
  • Created Migration Documentation and Process Flow for mappings and sessions.
  • Developed the transformation logic, identifying and tracking slowly changing dimensions, handling heterogeneous sources and determining the hierarchies in dimensions.
  • Developed complex ETL Mappings and Mapplets in Informatica to load data from various sources to Data warehouse.
  • Used various transformations like Source Qualifier, Lookup, Expression, Aggregator, Update Strategy, Sequence Generator, Joiner, Filter, Rank and Router.
  • Created Informatica server administration automation scripts and shell scripts for data manipulation and process automation using UNIX shell scripting.
  • Developed shell scripts that take care of truncating tables, archiving the source and target files, and cleaning up the log files (a minimal sketch follows this list).
  • Worked extensively on flat files, since the data came from various legacy systems, and parsed the functional specifications into a technical document that provided an efficient platform for peer development.
  • Designed a multi-dimensional Star schema and generated the database scripts and E-R diagrams using ERWIN.
  • Generated server side PL/SQL scripts for data manipulation, validation and provided DDL scripts for creation of database objects such as tables, indexes, views, sequences, object types, collection types and Materialized Views.
  • Generated multidimensional analysis, slice-and-dice and drill-down reports.
  • Created, scheduled and monitored batches and sessions using Power Center Server Manager.
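
For illustration, a minimal sketch of the housekeeping script described above: truncate a staging table, archive processed files and purge old logs. The table name, directory paths, retention window and connection variables are all assumptions:

#!/usr/bin/ksh
# Illustrative ETL housekeeping: truncate staging, archive files, purge logs.
STAGE_TABLE=stg_claims            # assumed staging table
DATA_DIR=/app/etl/data            # assumed processed-file directory
ARC_DIR=/app/etl/archive          # assumed archive directory
LOG_DIR=/app/etl/logs             # assumed log directory

# Truncate the staging table ahead of the next load cycle.
sqlplus -s "$DB_USER/$DB_PWD@$DB_SID" <<EOF
WHENEVER SQLERROR EXIT FAILURE
TRUNCATE TABLE ${STAGE_TABLE};
EXIT
EOF
[ $? -ne 0 ] && exit 1

# Archive processed data files, compress them, and keep logs for 30 days.
find "$DATA_DIR" -name '*.dat' -exec mv {} "$ARC_DIR"/ \;
gzip -f "$ARC_DIR"/*.dat 2>/dev/null
find "$LOG_DIR" -name '*.log' -mtime +30 -exec rm -f {} \;
exit 0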

Environment: Informatica Power Center 8.6, Oracle 11g, DB2, SQL Server, Shell Scripts, Power Exchange, Business Objects, Erwin, Windows 7.

EDUCATION:
