
IT App Developer/ETL Developer Resume


Phoenix, AZ

PROFESSIONAL SUMMARY:

  • 7+ years of experience in Information Technology with a strong background in database development, data warehousing (OLAP), and ETL processes using Informatica PowerCenter 9.x/8.x/7.x/6.x, Business Objects, and IDQ with Oracle and SQL Server databases.
  • Experience in analysis, design, and development of enterprise-level data warehouses using Informatica. Experience in the complete Software Development Life Cycle (SDLC): Requirement Analysis, Design, Development, and Testing.
  • Experience in Data Modeling and Data Analysis using Dimensional and Relational Data Modeling, Star Schema/Snowflake Modeling, FACT and Dimension tables, and Physical and Logical Data Modeling.
  • Expertise in developing standard and re-usable mappings using various transformations like Expression, Aggregator, Joiner, Source Qualifier, Lookup, and Router.
  • Experienced in integration of various data sources like Oracle 11g/10g/9i/8i, MS SQL Server 2008/2005/2000, Teradata, Netezza, Sybase, DB2, flat files, XML files, and Salesforce sources into staging areas and different target databases.
  • Designed complex mappings, with expertise in performance tuning and in Slowly Changing Dimension tables and Fact tables.
  • Excellent working experience in the insurance industry, with strong business knowledge of the Auto, Life, and Health Care lines of business.
  • Worked on scheduling tools such as Informatica Scheduler, Autosys, Tivoli/Maestro, and CONTROL-M.
  • Experience in PL/SQL programming and in writing stored procedures, functions, etc.
  • Experience in creating complex mappings using various transformations and in developing strategies for the Extraction, Transformation, and Loading (ETL) mechanism using Informatica 10.x/9.x/8.x/7.x/6.x.
  • Experience in source system analysis and data extraction from various sources like flat files, Oracle 11g/10g/9i/8i, IBM DB2 UDB, and XML files.
  • Extensively worked on Informatica Data Quality (IDQ) and Informatica PowerCenter throughout complete IDQ and MDM projects.
  • Documented the number of source / target rows and analyzed the rejected rows and worked on re-loading the rejected rows.
  • Performed the data profiling and analysis making use of Informatica Data Quality (IDQ).
  • Worked with Master Data Management (MDM) concepts and methodologies, with the ability to apply this knowledge in building MDM solutions.
  • Experience in UNIX shell scripting, Perl scripting and automation of ETL Processes.
  • Prepared user requirement documentation for mapping and additional functionality.
  • Extensively used ETL to load data using PowerCenter/PowerExchange from source systems like flat files and Excel files into staging tables and then into the target Oracle database. Analyzed the existing systems and performed a feasibility study.

TECHNICAL SKILLS:

ETL Tools: Informatica PowerCenter 10.2/9.6/9.5/9.1/8.6/7.x/6.x, Salesforce, Informatica Cloud, Informatica PowerExchange 5.1/4.7/1.7, Power Analyzer 3.5, Informatica Data Quality (IDQ) 9.6.1/9.5.1, Informatica PowerConnect and Metadata Manager, Informatica MDM 9.1.0, Informatica Data Services (IDS) 9.6.1, DataStage

Databases: Oracle 11g/10g/9i/8i/8.0/7.x, Teradata 13, DB2 UDB 8.1, MS SQL Server 2008/2005

Operating Systems: UNIX (Sun-Solaris, HP-UX), Windows NT/XP/Vista, MS-DOS

Programming: SQL, SQL*Plus, PL/SQL, Perl, UNIX Shell Scripting

Reporting Tools: Business Objects XI R2/6.5/5.0/5.1, Cognos Impromptu 7.0/6.0/5.0, Informatica Analytics Delivery Platform, MicroStrategy

Modeling Tools: Erwin 4.1 and MS Visio

Other Tools: SQL Navigator, Rapid SQL for DB2, Quest Toad for Oracle, SQL Developer 1.5.1, Autosys, Telnet, MS SharePoint, Mercury Quality Center, Tivoli Job Scheduling Console, JIRA, Netezza 4.0

Methodologies: Ralph Kimball.

PROFESSIONAL EXPERIENCE:

Confidential, Phoenix, AZ

IT App Developer/ETL Developer

Responsibilities:

  • Involved in business analysis and technical design sessions with business and technical staff to develop requirements document and ETL specifications.
  • Loaded large volumes of data into several tables using UNIX shell scripts and Perl scripts, supporting various groups of employees (see the shell sketch after this list).
  • Extensively used Informatica PowerCenter ETL to load data from source systems like flat files into staging tables and then into the target Oracle database.
  • Scheduled all ETL jobs using CA Workstation by creating events for each environment to run daily at their scheduled times.
  • Involved in designing dimensional modeling and data modeling using Erwin tool.
  • Created high-level Technical Design Document and Unit Test Plans.
  • Developed mapping logic using various transformations like Expression, Lookups (Connected and Unconnected), Joiner, Filter, Sorter, Update strategy and Sequence generator.
  • Wrote complex SQL override scripts at the source qualifier level to avoid Informatica Joiners and Lookups and improve performance, as the volume of data was heavy.
  • Responsible for creating workflows; created Session, Event, Command, Control, Decision, and Email tasks in Workflow Manager.
  • Prepared user requirement documentation for mapping and additional functionality.
  • Created customer hub by consolidating and integrating various sources and storing the golden record as individual identity in the hub.
  • Successfully completed Customer- and Product-centric Master Data Management initiatives using the Informatica Master Data Management product.
  • Used IDQ's standardized plans for address and name cleanups.
  • Worked on IDQ file configuration on users' machines and resolved the issues.
  • Used IDQ to complete initial data profiling and removing duplicate data.
  • Involved in running and scheduling UNIX shell scripts and Informatica jobs using Tidal.
  • Extensively worked on IDQ admin tasks, serving as both IDQ administrator and IDQ developer.
  • Performed performance tuning and optimization of SQL statements using SQL Trace.
  • Involved in Unit, System integration, User Acceptance Testing of Mapping.
  • Prepared the complete data mapping for all the migrated jobs using SSIS.
  • Created and monitored CA Workstation (scheduler) jobs to ensure that jobs ran and delivered data to users.
  • Supported the process steps across development, test, and production environments.
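
The shell sketch below illustrates the kind of UNIX wrapper described in the list above: stage incoming flat files and kick off a PowerCenter workflow with pmcmd. It is a minimal, hypothetical example; the directories, domain/service names, credentials, and the workflow name wf_stg_daily_load are assumed placeholders, not the actual project objects.

#!/bin/ksh
# Illustrative wrapper: stage incoming flat files, then run a PowerCenter workflow.
# All paths, service names, and the workflow name are hypothetical placeholders.
SRC_DIR=/data/incoming
STG_DIR=/data/staging
ARCH_DIR=/data/archive

# Move the day's flat files into the directory read by the workflow's source file connection.
for f in "$SRC_DIR"/*.dat; do
    [ -e "$f" ] || continue            # nothing arrived today
    mv "$f" "$STG_DIR/"
done

# Start the staging workflow and block until it finishes (-wait).
pmcmd startworkflow -sv IS_PROD -d Domain_PROD -u "$INFA_USER" -p "$INFA_PWD" \
    -f STG_FOLDER -wait wf_stg_daily_load
rc=$?

if [ $rc -eq 0 ]; then
    mv "$STG_DIR"/*.dat "$ARCH_DIR/" 2>/dev/null   # archive only after a clean run
else
    echo "wf_stg_daily_load failed, pmcmd rc=$rc" >&2
fi
exit $rc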

Environment: Informatica Power Center 10.1, Informatica Power Center 9.6.1, IDQ, Oracle 11g/10g, Perl, GitHub, TOAD, Business Objects XI3.5, DBeaver, RubyMine, ESP Scheduler, CA Workstation.

Confidential, Miami, FL

Sr. ETL Developer/Analyst

Responsibilities:

  • Involved in business analysis and technical design sessions with business and technical staff to develop requirements document and ETL specifications.
  • Involved in designing dimensional modeling and data modeling using Erwin tool.
  • Created high-level Technical Design Document and Unit Test Plans.
  • Developed Complex database objects like Stored Procedures, Functions, Packages and Triggers using SQL and PL/SQL.
  • Developed mapping logic using various transformations like Expression, Lookups (Connected and Unconnected), Joiner, Filter, Sorter, Update strategy and Sequence generator.
  • Created Teradata external loader connections such as MLoad Upsert, MLoad Update, and FastLoad while loading data into the target tables in the Teradata database.
  • Used the Teradata utilities FastLoad, MultiLoad, and TPump to load data (see the shell sketch after this list).
  • Wrote complex SQL override scripts at the source qualifier level to avoid Informatica Joiners and Lookups and improve performance, as the volume of data was heavy.
  • Worked on different IDQ / Informatica Developer components/transformations like Case, Comparison, Key Generator, Parser, Standardizer, Weight, Exception, Rule Based Analyzer, Lookup, SQL, Expression etc. and created IDQ mappings.
  • Worked on IDQ file configuration on users' machines and resolved the issues.
  • Used IDQ to complete initial data profiling and removing duplicate data.
  • Used Informatica EIC (Enterprise Information Catalog) for exploring assets to verify the quality of data like profiling the information.
  • Performed the tasks to view relationship between the assets by using Informatica EIC.
  • Extensively used ETL to load data using PowerCenter from source systems like flat files into staging tables and then into the target Oracle database. Analyzed the existing systems and performed a feasibility study.
  • Created customer hub (MDM) - by consolidating and integrating various sources and storing the golden record as individual identity in the hub.
  • Designed, documented, and configured informatica MDM Hub to load, cleanse, match, merge, and publish MDM Data.
  • Successfully completed Customer and Product centric Master Data Management initiatives using Informatica Master data management product.
  • Used IDQ’s standardized plans for addresses and names clean ups.
  • Extensively worked on IDQ admin tasks and worked as an IDQ developer.
  • Performed performance tuning and optimization of SQL statements using SQL Trace.
  • Designed and tested packages to extract, transform, and load data (ETL) using SSIS; designed packages using tasks and transformations such as Execute SQL Task, Data Flow Task, Data Conversion, and Foreach Loop Container to map the required data from source to destination.
  • Prepared the complete data mapping for all the migrated jobs using SSIS.
  • Created and monitored TWS (Tivoli Workload Scheduler) jobs to ensure that jobs ran and delivered data to users.
  • Supported the process steps across development, test, and production environments.
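
As one hedged illustration of the Teradata utility loads referenced in the list above, the sketch below drives FastLoad from a UNIX shell script through a here-document. The TDPID, logon, database, table, and column layout are assumed placeholders; the real loads would use generated scripts and MultiLoad/TPump where appropriate, and FastLoad requires the target table to be empty.

#!/bin/ksh
# Illustrative FastLoad run: bulk-load a pipe-delimited flat file into an empty staging table.
# TDPID, logon, database, table, and column layout are hypothetical placeholders.
fastload <<'EOF'
LOGON tdprod/etl_user,etl_password;
DATABASE stg_db;

BEGIN LOADING stg_db.customer_stg
    ERRORFILES stg_db.customer_err1, stg_db.customer_err2;

SET RECORD VARTEXT "|";

DEFINE cust_id   (VARCHAR(10)),
       cust_name (VARCHAR(50)),
       state_cd  (VARCHAR(2))
    FILE=/data/staging/customer.dat;

INSERT INTO stg_db.customer_stg (cust_id, cust_name, state_cd)
    VALUES (:cust_id, :cust_name, :state_cd);

END LOADING;
LOGOFF;
EOF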

Environment: Informatica Power Center 9.6.1, Informatica Power Center 8.6.1, MDM and IDQ, Oracle 11g/10g, TOAD, Business Objects XI3.5, Solaris 11/10, Teradata, clear case, PL/SQL, Tivoli Job Scheduler, SSIS.

Confidential, San Jose, CA

Sr. Informatica Developer

Responsibilities:

  • Prepared High-level Design and Low-Level Design based on Functional and Business requirements of the project.
  • Designing & documenting the functional specs and preparing the technical design.
  • As a team conducted gap analysis and Discussions with subject matter experts to gather requirements, emphasize on problem areas and define deliverables.
  • Supported the development, optimization, and maintenance of Informatica mappings with various source data including Oracle and SQL.
  • Optimized and Tuned SQL queries and PL/SQL blocks to eliminate Full Table scans to reduce Disk I/O and Sorts.
  • Designed and developed UNIX shell scripts for creating and dropping tables used by the scheduled jobs (see the sketch after this list).
  • Developed Complex database objects like Stored Procedures, Functions, Packages and Triggers using SQL and PL/SQL.
  • Developed several complex mappings in Informatica using a variety of PowerCenter transformations, Mapping Parameters, Mapping Variables, Mapplets, and Parameter Files.
  • Developed several complex IDQ mappings using a variety of transformations, Mapping Parameters, Mapping Variables, Mapplets, and Parameter Files in the Mapping Designer using Informatica PowerCenter.
  • Used IDQ to complete initial data profiling and removing duplicate data.
  • Worked with IDQ on data quality tasks such as data cleansing, removing unwanted data, and verifying the correctness of data.
  • Extensive experience in integration of Informatica Data Quality (IDQ) with Informatica PowerCenter.
  • Created MDM mappings using IHA best practices to capture errors and ensure a clean load using Informatica PowerCenter.
  • Created functional document with MDM data model configuration, source system definition, data mapping and cleansing requirements, trust score and matching rule definitions
  • Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, Stored Procedure, and Union to develop robust mappings in the Informatica Designer.
  • Created DataStage jobs (ETL processes) for populating the data warehouse from different source systems like ODS, flat files, and HDFS files, and scheduled them using the DataStage Sequencer for system integration testing.
  • Developed and maintained programs for scheduling data loading and transformations using DataStage.
  • Developed mapping parameters and variables to support SQL override.
  • Worked on import and export of data from sources to Staging and Target using Teradata MLOAD, Fast Export, TPUMP and BTEQ.
  • Developed workflows with Worklets, Event Waits, Assignments, Conditional flows, Email, and Command tasks using Workflow Manager.
  • Expertise in performance tuning by identifying bottlenecks at the source, target, PowerCenter transformation, and session levels; collected performance data for sessions and tuned performance by adjusting Informatica session parameters.
  • Handling all Hadoop environment builds, including design, capacity planning, cluster setup, performance tuning and ongoing monitoring.
  • Worked with the third-party scheduler Autosys for scheduling Informatica PowerCenter workflows; worked with the scheduling team to create and schedule jobs in the Autosys Workload Scheduler.
  • Produced extensive documentation on the design, development, implementation, daily loads, and process flow of the mappings.
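
A minimal sketch of the table create/drop shell scripts mentioned in the list above, driving SQL*Plus from UNIX; the connect string, work table name, and columns are hypothetical placeholders under the assumption that a work table is rebuilt before the nightly jobs run.

#!/bin/ksh
# Illustrative helper: (re)create a work table before the nightly jobs run.
# The connect string and the table DDL are hypothetical placeholders.
ORA_CONN="etl_user/etl_password@DWPROD"

sqlplus -s "$ORA_CONN" <<'EOF'
WHENEVER SQLERROR EXIT SQL.SQLCODE

-- Drop the work table if it already exists; ignore ORA-00942 (table does not exist).
BEGIN
   EXECUTE IMMEDIATE 'DROP TABLE etl_work_claims PURGE';
EXCEPTION
   WHEN OTHERS THEN
      IF SQLCODE != -942 THEN RAISE; END IF;
END;
/

CREATE TABLE etl_work_claims (
   claim_id    NUMBER,
   member_id   NUMBER,
   load_dt     DATE DEFAULT SYSDATE
);

EXIT
EOF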

Environment: Informatica Power Center 9.6, Oracle 11g, SQL, IDQ, Teradata, DataStage, PL/SQL, TOAD, Microsoft Visio, Autosys, Unix, SQL Server 2008.

Confidential, Detroit, MI

Sr. Application Developer

Responsibilities:

  • Prepared Conceptual Solutions and Approach documents and gave Ballpark estimates.
  • Prepared Business Requirement Documents, Software Requirement Analysis and Design Documents (SRD) and Requirement Traceability Matrix for each project workflow based on the information gathered from Solution Business Proposal document.
  • Worked in different subject areas - Medical, Pharmacy, Dental, Vision, Medicare, Medicare Advantage, PPO, HMO, POS, Master Medical and Care Management.
  • Designed SSIS Packages to transfer data from flat files to SQL Server using Business Intelligence Development Studio.
  • Worked with different Control flow tasks in creating SSIS packages.
  • Database development experience with Microsoft SQL Server in OLTP/OLAP environments using integration services (SSIS) for ETL (Extraction, Transformation and Loading).
  • Used External Loaders like Multi Load and Fast Load to load data into Teradata database.
  • Performed data modeling and design of the data warehouse and data marts in star schema methodology with conformed and granular dimensions and FACT tables.
  • Worked on profiling data using IDQ and used the results to create Informatica Data Quality rules for standardization mapplets.
  • Developed mappings in IDQ for Profiling and used in Informatica designer and integrated with existing mappings.
  • Designed, documented, and configured informatica MDM Hub to load, cleanse, match, merge, and publish MDM Data.
  • Performed land process to load data into landing tables of MDM Hub using external batch processing for initial data load in hub store.
  • Coordinated and worked closely with legal, clients, third-party vendors, architects, DBA’s, operations and business units to build and deploy.
  • Migrated data from the Oracle database to the Teradata database using Informatica and Teradata utilities like BTEQ.
  • Worked with various Informatica Transformations like Joiner, Expression, Lookup, Aggregate, Filter, Update Strategy, Stored procedure, Router and Normalizer etc.
  • Worked with Connected and Unconnected Stored Procedure transformations for pre- and post-load sessions.
  • Designed and Developed pre-session, post-session routines and batch execution routines using Informatica Server to run Informatica sessions.
  • Used the pmcmd command to start, stop, and ping the server from UNIX and created shell scripts to automate the process (see the sketch after this list).
  • Created and monitored TWS (Tivoli Workload Scheduler) jobs to ensure that jobs ran and delivered data to users.
  • Worked on production tickets to resolve the issues in a timely manner.
  • Prepared Test Strategy and Test Plans for Unit, SIT, UAT and Performance testing.
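
A hedged sketch of the pmcmd-based automation referenced in the list above: ping the Integration Service and optionally stop a workflow from UNIX. The domain, service, folder, and workflow names are assumptions for the example, not project objects.

#!/bin/ksh
# Illustrative pmcmd automation: check that the Integration Service is up,
# and stop a given workflow if requested. Names and credentials are placeholders.
DOMAIN=Domain_PROD
IS=IS_PROD
FOLDER=CLAIMS_FOLDER

# Ping the Integration Service; a non-zero return code means it is unreachable.
pmcmd pingservice -sv "$IS" -d "$DOMAIN"
if [ $? -ne 0 ]; then
    echo "Integration Service $IS is not responding" >&2
    exit 1
fi

# Optionally stop a workflow passed as the first argument, e.g. ./infa_ctl.ksh wf_claims_load
if [ -n "$1" ]; then
    pmcmd stopworkflow -sv "$IS" -d "$DOMAIN" -u "$INFA_USER" -p "$INFA_PWD" \
        -f "$FOLDER" "$1"
fi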

Environment: Informatica Power Center 9.1.0/8.6.1, Oracle 10g, IDQ, MDM, Teradata, SQL Server 2008, Toad, SQL Plus, SSIS, SQL Query Analyzer, SQL Developer, MS Access, Windows NT, Shell Scripting, Clear Quest, Tivoli Job Scheduler.

Confidential, Windsor, CT

Sr. Informatica Developer

Responsibilities:

  • Involved in business analysis and technical design sessions with business and technical staff to develop requirements document and ETL specifications.
  • Coordinated and worked closely with legal, clients, third-party vendors, architects, DBA’s, operations and business units to build and deploy.
  • Collaborated with Business Analysts by providing analysis, consultation, coding, testing, and documentation in accordance with the specifications and target completion dates.
  • Developed mappings in Power center for Job Monitoring and Data Validation.
  • Developed different mapping logic using various transformations to extract data from different sources like flat files, IBM MQ series, Oracle, IBM DB2 UDB 8 databases that were hosted on HP UX 11i v2 RISC server.
  • Worked with the Informatica Data Quality (IDQ) toolkit for analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
  • Performed data profiling to understand the data pattern using Informatica IDQ.
  • Developed mapping logic using various transformations like Expression, Lookups (Connected and Unconnected), Joiner, Filter, Sorter, SQL, Update strategy, Salesforce Lookup and Sequence generator.
  • Actively implemented Informatica performance tuning by identifying and removing the bottlenecks and optimized session performance by tuning complex mappings
  • Documented Informatica mappings, design and validation rules.
  • Created new SSIS packages to ETL data from Salesforce to SQL Server and vice versa.
  • Extensively involved in unit testing, integration testing and system testing of the mappings and writing Unit and System Test Plan.
  • Developed UNIX scripts for scheduling the delta loads and master loads using the Autosys Scheduler (see the sketch after this list).
  • Migrated objects from the development environment to the QA/testing environment to facilitate the testing of all objects developed and check their consistency end to end on the new environment.
  • Supported the process steps across development, test, and production environments.
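
An illustrative sketch of the kind of Autosys-invoked UNIX wrapper for the delta versus master loads noted in the list above; the parameter-file paths, folder, and the workflow name wf_policy_load are assumptions made for the example.

#!/bin/ksh
# Illustrative Autosys command script: run either the delta or the master load
# by picking the matching parameter file. Paths and names are placeholders.
LOAD_TYPE=$1            # expected: delta | master
PARM_DIR=/apps/infa/parmfiles

case "$LOAD_TYPE" in
    delta)  PARM_FILE=$PARM_DIR/wf_policy_load_delta.parm ;;
    master) PARM_FILE=$PARM_DIR/wf_policy_load_master.parm ;;
    *)      echo "Usage: $0 {delta|master}" >&2; exit 2 ;;
esac

# The parameter file supplies source filters and session overrides for the chosen load type.
pmcmd startworkflow -sv IS_PROD -d Domain_PROD -u "$INFA_USER" -p "$INFA_PWD" \
    -f POLICY_FOLDER -paramfile "$PARM_FILE" -wait wf_policy_load
exit $?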

Environment: Informatica Power Center 9.1.0/8.6.1, Informatica IDQ, MDM, Salesforce, JIRA, SQL Server 2008, Toad, AQL, SQL Developer, MS Access, Clear Quest, Autosys Job Scheduler.

Confidential

Sr. Informatica Developer / ETL Analyst

Responsibilities:

  • Analyzed the functional specs provided by the data architect and created technical specs document for all the mappings.
  • Developed SQL and PL/SQL scripts for relational databases such as Oracle, Microsoft SQL Server, and IBM DB2; experienced in UNIX programming and shell scripting for Informatica pre- and post-session operations and database administration activities.
  • Optimized and tuned Netezza SQL to improve the performance of batch jobs.
  • Used SQL*Loader to load data into database tables from flat files and comma-separated (CSV) files (see the sketch after this list).
  • Worked on Informatica Power Center tool - Source Analyzer, Target designer, Mapping & Mapplet Designer and Transformation Designer.
  • Resolved issues that cause the production jobs to fail by analyzing the ETL code and log files created by the failed jobs on the Informatica server.
  • Used Transformations like Lookup, Joiner, Rank and Source Qualifier Transformations in the Informatica Designer.
  • Worked on slowly changing dimension tables data.
  • Created complex mappings using Unconnected Lookup, Sorter, Aggregator, Stored Procedure, Dynamic Lookup, and Router transformations to populate target Oracle tables in an efficient manner.
  • Tuned Informatica mappings and sessions for optimum performance.
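
A minimal sketch of the SQL*Loader usage noted in the list above: a shell script that writes a control file and invokes sqlldr. The table, column names, and paths are illustrative placeholders.

#!/bin/ksh
# Illustrative SQL*Loader run: load a CSV file into an Oracle staging table.
# The control file, table, and column names are hypothetical placeholders.
cat > /tmp/customer.ctl <<'EOF'
LOAD DATA
INFILE '/data/incoming/customer.csv'
APPEND
INTO TABLE stg_customer
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(cust_id, cust_name, state_cd)
EOF

sqlldr userid=etl_user/etl_password@DWPROD \
       control=/tmp/customer.ctl \
       log=/tmp/customer.log \
       bad=/tmp/customer.bad
rc=$?
# sqlldr returns 0 on success and 2 when some rows were rejected to the .bad file.
[ $rc -eq 0 ] || echo "sqlldr finished with return code $rc, check /tmp/customer.log" >&2
exit $rc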

Environment: Informatica Power Center 8.6, Oracle 10g/9i, SQL, TOAD, Netezza, Business Objects 6.5/XIR2, UNIX, PVCS
