
ETL/Informatica Developer Resume


Los Angeles, CA

SUMMARY:

  • 7+ years of experience in Information Technology, including Data Warehouse/Data Mart development using Informatica PowerCenter across industries such as Healthcare and Pharmacy.
  • Expertise in implementing complex business rules by creating mappings, mapplets, and reusable transformations using Informatica PowerCenter 9.x/8.x/7.x.
  • Worked with a range of DWH source and target systems, including Oracle, Teradata, DB2, MySQL, SQL Server, Sybase, flat files, and XML.
  • Experience with most of the core transformations, including Expression, Router, Data Masking, Joiner, Connected and Unconnected Lookup, Filter, Aggregator, Update Strategy, Rank, Sorter, and Sequence Generator.
  • Strong in SQL and PL/SQL, with extensive hands-on experience in performance tuning of database queries.
  • Good understanding of relational and dimensional data modeling: Star and Snowflake schemas, normalization, and denormalization.
  • Proficient in data warehousing techniques such as data cleansing, Slowly Changing Dimensions (SCD), surrogate key assignment, and change data capture (CDC).
  • Experience in unit testing and in working with QA teams on system testing.
  • Performed regression testing for Informatica and database upgrades.
  • Experience in debugging and error handling of Informatica code.
  • Good experience with Informatica performance tuning, identifying and resolving bottlenecks at the source, target, mapping, and session levels.
  • Experience in writing UNIX shell scripts and enhancing existing ones.
  • Experience with Teradata utilities: BTEQ, MultiLoad, FastLoad, and TPT.
  • Expert in coding Teradata SQL, stored procedures, macros, and triggers (a macro sketch follows this list).
  • Worked with scheduling tools such as Autosys, Control-M, Maestro, and the Informatica scheduler.
  • Experience working with business analysts and data modelers to understand business requirements and physical/logical data models.
  • Experience working under Waterfall and Agile methodologies for DWH implementations.
  • Experience in 24x7 on-call rotation production support using various ticketing systems.
  • Excellent interpersonal and communication skills; experienced in working with senior-level managers, business users, and developers across multiple disciplines.
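
A minimal sketch of the kind of Teradata macro referenced above; all object, column, and parameter names are illustrative assumptions, not taken from an actual engagement:

    -- Hypothetical Teradata macro: return a member's claims for a given plan year.
    REPLACE MACRO edw.get_member_claims (in_member_id INTEGER, in_plan_year INTEGER) AS (
      SELECT claim_id, service_dt, paid_amt
      FROM   edw.claim_fact
      WHERE  member_id = :in_member_id
        AND  EXTRACT(YEAR FROM service_dt) = :in_plan_year;
    );

    -- Usage: EXEC edw.get_member_claims(1001, 2015);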

TECHNICAL SKILLS:

ETL Tools: Informatica PowerCenter 9.6.1/9.5/9.1/8.6/7.x/6.x, Salesforce, Informatica Cloud, Informatica PowerExchange 5.1/4.7/1.7, PowerAnalyzer 3.5, Informatica PowerConnect, Metadata Manager, Informatica Data Quality (IDQ), Informatica Data Services (IDS), and SSIS.

Databases: Oracle 11g/10g/9i/8i/8.0/7.x, Teradata 13, DB2 UDB 8.1, MS SQL Server 2008/2005, Netezza 4.0, and Sybase ASE 12.5.3/15.

Operating Systems: UNIX (Sun Solaris, HP-UX), Windows NT/XP/Vista, MS-DOS

Programming: SQL, SQL*Plus, PL/SQL, UNIX Shell Scripting

Reporting Tools: Business Objects XI R2/6.5/5.0/5.1, Cognos Impromptu 7.0/6.0/5.0, Informatica Analytics Delivery Platform, MicroStrategy.

Modeling Tools: ERwin 4.1 and MS Visio

Other Tools: SQL Navigator, Rapid SQL for DB2, Quest Toad for Oracle, SQL Developer 1.5.1, Autosys, Telnet, MS SharePoint, Mercury Quality Center, Tivoli Job Scheduling Console, JIRA, Toad for Data Analyst, Business Objects

Methodologies: Ralph Kimball dimensional modeling.

PROFESSIONAL EXPERIENCE:

Confidential, Los Angeles, CA

ETL/Informatica Developer

Responsibilities:

  • Analyzed source and target systems to load data into the CDW and to create external data feeds.
  • Involved in all phases of the SDLC, from design through development.
  • Performed gap analysis and wrote source-to-target mapping (STM) documentation to map flat file fields to relational table data in the CDW.
  • Developed mappings, mapplets, and reusable transformations to load external data into the CDW.
  • Extracted data from the CDW to create delimited and fixed-width flat file feeds for external consumers (an extract sketch follows this list).
  • Analyzed design documents and developed ETL requirement specifications.
  • Created reusable objects and shortcuts for commonly used flat file and relational sources.
  • Developed a shell script to append date and time stamps to output XML files, remove empty delta files, and FTP the output XML files to different servers.
  • Validated logical data model relationships and entities; determined data lineage by including all associated systems in the data profile.
  • Extensive data profiling experience, validating data patterns and formats.
  • Integrated data into the CDW from sources such as Oracle, flat files, and mainframes (DB2 and VSAM) using PowerExchange.
  • Performed technical analysis, writing and reviewing technical designs.
  • Developed a dashboard solution for analyzing STD statistics by building SSAS cubes and Tableau visualizations.
  • Created UNIX scripts to handle flat file data, for example merging delta files into full files and concatenating the header, detail, and trailer parts of files.
  • Developed mappings that load data into Teradata tables using SAP definitions as sources.
  • Created mappings that read parameter data from tables to generate parameter files.
  • Used the XML Source Qualifier, which works only with an XML source definition, to represent the data elements the Informatica Server reads when it executes a session with XML sources.
  • Performed unit and functional testing and documented the results in testing documents in both the development and UAT environments.
  • Used the XML Parser transformation to extract XML data from messaging systems.
  • Used ODI for ELT processing: data was extracted from multiple sources, sent through several transformation processes, and loaded into a final target.
  • Used Informatica PowerCenter to extract, transform, and load data into the Netezza data warehouse from sources such as Oracle and flat files.
  • Used SAS for data entry, retrieval, management report writing, and statistical analysis.
  • Developed complex transformations and mapplets using Informatica to extract, transform, and load (ETL) data into data marts, the enterprise data warehouse (EDW), and the operational data store (ODS).
  • Used a message broker to translate messages between interfaces.
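
A minimal sketch of the kind of delimited outbound extract described above, against a hypothetical CDW claims table (all object and column names are illustrative):

    -- Build one pipe-delimited feed row per claim in Oracle SQL.
    SELECT member_id
           || '|' || claim_id
           || '|' || TO_CHAR(service_date, 'YYYYMMDD')
           || '|' || TO_CHAR(paid_amount, 'FM9999990.00') AS feed_row
    FROM   cdw.claim_detail
    WHERE  load_date = TRUNC(SYSDATE)
    ORDER  BY member_id, claim_id;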

Environment: Informatica PowerCenter 9.5, PowerExchange, Windows, Oracle, Toad for Oracle, BODS, UNIX shell scripting, workflow scheduler, Perl, PuTTY, Tableau, WinSCP.

Confidential, Colonia, NJ

ETL/Informatica Developer

Responsibilities:

  • Maintained the data coming from the OLTP systems.
  • Developed and maintained complex Informatica mappings.
  • Involved in the analysis and development of the data warehouse.
  • Created complex mappings in Informatica PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Rank, Sorter, Lookup, and Joiner transformations.
  • Coded and updated UNIX scripts to FTP files from the source system.
  • Performed data validation tests using complex SQL statements.
  • Worked with data warehouse staff to incorporate Informatica best practices.
  • Worked with business analysts, through working sessions, to translate business requirements into technical specifications, including data and process specifications.
  • Developed Informatica PowerCenter and Cloud jobs that extract core membership data from DB2 into Oracle and Netezza (stage and EDW).
  • Implemented Type II slowly changing dimensions using date-time stamping (an SCD sketch follows this list).
  • Created and modified database structures and objects as needed.
  • Investigated and fixed bugs that occurred in the production environment and provided on-call support.
  • Performed unit testing and maintained test logs and test cases for all mappings.
  • Tested for data integrity and consistency.
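
A minimal Type II SCD sketch using date-time stamping, against a hypothetical member dimension (table, sequence, and bind variable names are assumptions):

    -- Expire the current version of a changed member row...
    UPDATE member_dim
    SET    eff_end_ts   = SYSTIMESTAMP,
           current_flag = 'N'
    WHERE  member_id    = :member_id
      AND  current_flag = 'Y';

    -- ...then insert the new version with a fresh surrogate key.
    INSERT INTO member_dim (member_key, member_id, plan_code,
                            eff_start_ts, eff_end_ts, current_flag)
    VALUES (member_dim_seq.NEXTVAL, :member_id, :plan_code,
            SYSTIMESTAMP, NULL, 'Y');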

Environment: Informatica PowerCenter 8.6, PowerExchange, Windows, IBM DB2 8.x, mainframes, MS SQL Server, Enterprise Architect, Metadata Manager, ER/Studio, Oracle, SQL*Plus, PL/SQL.

Confidential, Colorado, CA

Senior ETL Developer

Responsibilities:

  • Translated the business processes/SAS code into Informatica mappings for building the data mart.
  • Used Informatica PowerCenter to load data from sources such as flat files, Oracle, and Teradata into the Oracle data warehouse.
  • Implemented pushdown optimization, pipeline partitioning, and persistent caching for better performance (a pushdown sketch follows this list).
  • Developed reusable transformations and mapplets to use in multiple mappings.
  • Implemented Slowly Changing Dimension (SCD) methodology to keep track of historical data.
  • Assisted the QC team in testing the ETL components.
  • Created pre-session and post-session shell scripts and email notifications.
  • Involved in the complete extraction, transformation, and loading cycle, following Informatica best practices.
  • Created mappings using Data Services to load data into SAP HANA.
  • Involved in data quality checks by interacting with the business analysts.
  • Performed unit testing and tuned the mappings for better performance.
  • Maintained documentation of ETL processes to support knowledge transfer to other team members.
  • Created UNIX shell scripts to schedule data cleansing scripts and automate workflow execution.
  • Participated in production support.
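
With full pushdown optimization, PowerCenter generates set-based SQL that runs inside the database rather than moving rows through the Integration Service. A hypothetical example of the kind of statement pushed down (object names are illustrative):

    -- Aggregate staged sales and load the fact table entirely in-database.
    INSERT INTO dw.sales_fact (product_key, sale_dt, qty, amount)
    SELECT p.product_key,
           s.sale_dt,
           SUM(s.qty),
           SUM(s.amount)
    FROM   stg.sales s
    JOIN   dw.product_dim p ON p.product_id = s.product_id
    GROUP  BY p.product_key, s.sale_dt;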

Environment: Informatica PowerCenter 9.1, Oracle 11g, Teradata, SAP HANA, UNIX shell scripts.

Confidential, Lake Forest, IL

DW ETL - Analyst & Developer

Responsibilities:

  • Performed ETL design and development: created the Informatica mappings, sessions, and workflows that implement the business logic.
  • Created source-to-target mappings, edit and validation rules, transformations, and business rules; analyzed client requirements and designed the ETL Informatica mappings.
  • Worked extensively with Informatica Data Quality (IDQ 9.5.1) for data analysis, data cleansing, data validation, data profiling, and matching/removing duplicate data (a duplicate-check sketch follows this list).
  • Designed and developed IDQ jobs and mapplets using transformations such as Address Validator, Match, and Consolidation, along with rules, for data loads and data cleansing.
  • Prepared technical specifications for developing the extraction, transformation, and loading of data into various stage tables.
  • Created transformations and mappings using Informatica PowerCenter to move data from multiple sources into targets.
  • Scheduled and ran workflows in Workflow Manager and monitored sessions using Informatica Workflow Monitor.
  • Validated and tested the mappings using the Informatica Debugger, session logs, and workflow logs.
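
A minimal sketch of the kind of duplicate check that complements IDQ matching, against a hypothetical staged customer table (all names are assumptions):

    -- Flag customer rows that share a normalized name and address line.
    SELECT UPPER(TRIM(cust_name))  AS name_key,
           UPPER(TRIM(addr_line1)) AS addr_key,
           COUNT(*)                AS dup_count
    FROM   stg.customer
    GROUP  BY UPPER(TRIM(cust_name)), UPPER(TRIM(addr_line1))
    HAVING COUNT(*) > 1;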

Environment: Informatica PowerCenter 9.5.1, Informatica Data Quality (IDQ) 9.5.1, Informatica Developer 9.5.1, Oracle 10g (Toad), UNIX (SSH Secure Shell client).
