
ETL Informatica Developer Resume

Torrance, CA

PROFESSIONAL SUMMARY:

Strong experience in ETL using Informatica PowerCenter/OBIEE (Mapping Designer, Workflow Manager, Workflow Monitor, Source Analyzer, Transformation Developer, Mapplet Designer, and Repository Manager). Good exposure to SQL optimization and performance tuning using Explain Plan. Thorough understanding of the Software Development Life Cycle (SDLC), including requirements analysis, system analysis, design, development, documentation, training, implementation and post-implementation review.

  • 6+ years of experience as an ETL Developer building ETL processes for Data Warehouse/Data Migration projects using the Informatica PowerCenter and SSIS (SQL Server Integration Services) ETL tools.
  • Strong in SQL, T-SQL, PL/SQL, SQL*Loader, SQL*Plus, MS-SQL and Pro*C.
  • Extensive knowledge of the PowerCenter components, including PowerCenter Designer, PowerCenter Repository Manager, Workflow Manager and Workflow Monitor.
  • Thorough knowledge of creating ETL processes to load data from different data sources, and good understanding of Informatica installation.
  • Experience in integrating business applications with the Informatica MDM hub using batch processes, SIF and message queues.
  • Experienced in using the IDQ tool for profiling, applying rules and developing mappings to move data from source to target systems.
  • Good experience with implementing Informatica B2B DX/DT and Informatica BDE.
  • Hands-on experience with various NoSQL databases such as HBase and MongoDB.
  • Strong hands-on experience using Teradata utilities (SQL, BTEQ, FastLoad, MultiLoad, FastExport, TPump, Visual Explain and Queryman), Teradata parallel support, Perl and UNIX shell scripting.
  • Experience working with MS SQL Server, Oracle 11g, Oracle APEX, Big Data, Oracle NoSQL and Oracle Exadata.
  • Experience in developing stored procedures, functions, views, triggers and SQL queries using SQL Server and Oracle PL/SQL.
  • Experienced in developing mappings and transformations using PowerCenter Designer and executing mappings through Workflow Manager from multiple sources to multiple targets.
  • Knowledge of full life-cycle data warehouse development, with experience ensuring quality and compliance with coding standards.
  • Good experience with data cleansing, data analysis, data profiling and the test plans necessary to ensure successful execution of data loading processes.
  • Used IBM DataStage data quality tools to support cleansing for migration.
  • Experience in implementing OBIEE 11.1.1.3.0/10.1.3.3.x and Oracle BI Applications 7.9.x, including hands-on expertise in RPD development, Siebel/Oracle BI Answers, BI Dashboards, BI Delivers and Reports.
  • Applied the concept of Change Data Capture (CDC) and imported the source from Legacy systems using Informatica Power Exchange (PWX).
  • Good knowledge of EAI, ESB and B2B integration.
  • Exposure to Informatica Cloud Services.
  • Good exposure to big data technologies such as Hadoop, Hive, HBase, MapReduce and HDFS.
  • Strong at developing complex mappings with custom transformations and XML sources.
  • Extensive development, support and maintenance experience working in all phases of the ETL Life Cycle.
  • Involved in ETL testing, test plan preparation and process improvement for ETL development, with good exposure to development, testing, debugging, implementation, documentation, user training and production support.
  • Worked directly with non-IT business analysts throughout the development cycle and provided production support for ETL.
  • Involved in understanding client requirements, analyzing functional specifications, and preparing and reviewing technical specifications.
  • Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.
  • Experience with industry software development methodologies such as Waterfall and Agile within the software development life cycle.
  • Identified bugs and enhanced existing mappings by analyzing the data flow, evaluating transformations and fixing defects so they conformed to business needs, and redesigned existing mappings to improve performance.
  • Excellent communication skills and ability to work effectively and efficiently in teams and individually.

TECHNICAL SKILLS:

Data Warehousing: Informatica Power Center 10/9.6/9.1/8.6.1/8.5/8.1.1/7.1.2/7.1.1/6.1/5.1.2, Power Connect, Power Exchange, Informatica PowerMart 6.2/5.1.2/5.1.1/5.0/4.7.2, Informatica Web Services, Informatica MDM 10.1/9.x, OBIEE 11g/10g, Oracle Data Integrator (ODI) 12c/11g, OBIA/BI Apps 11g/7.9.6.x/7.9.5, Oracle Warehouse Builder (OWB), Informatica CDC, Talend RXT 4.1, Informatica BDE, Informatica B2B DX/DT, SQL*Loader, Informatica On Demand (IOD), Flat Files (fixed-width, CSV, tilde-delimited, XML), IDQ, IDE, Data Transformation Services (DTS), Exadata, Metadata Manager, MS SQL Server Integration Services (SSIS).

Database and related tools: Oracle 10g/9i/8i/8/7.x, MS SQL Server 2000/7.0/6.5, Teradata, Netezza, Amazon S3, Vertica, Sybase ASE, PL/SQL, T-SQL, NoSQL, TOAD 8.5.1/7.5/6.2, DB2 UDB, Amazon Redshift.

Languages: SQL, PL/SQL, SQL*Plus, C, Dynamic SQL, C#, Java; working knowledge of UNIX shell scripting and Perl scripting

Web Technologies: HTML, XHTML and XML

Operating Systems: Microsoft Windows XP/NT/2000/98/95, UNIX, Sun Solaris 5

Cloud Technologies: AWS, Azure, Informatica Cloud

WORK EXPERIENCE:

Confidential, Torrance, CA

ETL Informatica Developer

Responsibilities:

  • Developed mappings/sessions using Informatica Power Center 9.6 for data loading.
  • Used various transformations such as Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup and Update Strategy while designing and optimizing mappings.
  • Led the offshore team and coordinated its activities with the onshore team.
  • Involved in requirements gathering and data modeling, and designed technical, functional and ETL design documents.
  • Implemented mappings to extract data from various sources (Oracle, XML files, mainframe DB2, VSAM files and mainframe flat files) and load it into Oracle.
  • Extracted data from Oracle and SQL Server then used Teradata for data warehousing.
  • Created new database objects such as tables, user-defined functions, triggers, stored procedures, indexes and views, along with SQL joins and statements, per the development team's requests submitted through TeamTrack.
  • Responsible for the design, development and implementation of dynamic SSIS packages for ETL (extract, transform, and load) development following company standards and conventions.
  • Imported/exported and migrated databases such as DB2 and MS Access to SQL Server 2005 using SSIS, and scheduled the packages by creating the corresponding job tasks.
  • Created packages to extract data from flat files, Teradata, Oracle and DB2, transform it according to business requirements and load it into SQL Server tables.
  • Developed SSRS reports per client requirements and used SSIS packages to trigger SSRS reports.
  • Created Complex packages by understanding Informatica Mappings, workflows and sessions.
  • Involved in analyzing, designing, building and testing OLAP cubes with SSAS 2005; identified and defined fact and dimension relationships and partitioned the cubes on a daily, weekly and monthly basis.
  • Worked with DTEXEC and DTUTIL utilities for scheduling in SQL Server Integration Services (SSIS) and jobs in SQL Server.
  • Provided support to resolve issues in existing Informatica mappings and SSIS packages.
  • Created checkpoints, breakpoints, database logging and event handlers wherever necessary, and created XML configuration files to support the SSIS packages in different environments.
  • Followed all SSIS standards to maintain reliability and scalability in the extraction process.
  • Involved in data cleansing to facilitate efficient data transfer between various environments.
  • Documented all the project work in support for maintenance.

Environment: Informatica PowerCenter 10/9.6, EBS, Informatica BDE, Teradata 12, SSRS, Oracle 11g/10g, PL/SQL, Perl scripting, MapReduce, Autosys, TOAD 9.x, Informatica Cloud, Oracle Financials, shell scripting, Dynamic SQL, Oracle SQL*Loader, SSIS 2008, Sun Solaris UNIX, OBIEE, Windows XP.

Confidential, Charlotte, NC

ETL Informatica Developer

Responsibilities:

  • Led the offshore team and coordinated its activities with the onshore team.
  • Involved in requirements gathering and data modeling, and designed technical, functional and ETL design documents.
  • Implemented mappings to extract data from various sources (Oracle, XML files) and load it into Oracle and Teradata, using Teradata utilities such as MultiLoad, TPump and FastExport.
  • Designed and implemented slowly changing dimension mappings to maintain history.
  • Used PowerExchange to capture the changes in the source data.
  • Used the Teradata FastLoad utility for bulk loading, and the TPump and MultiLoad utilities for smaller and larger data volumes respectively.
  • Implemented ETL Balancing Process to compare and balance data directly from source and warehouse tables for reconciliation purposes.
  • Worked with XML sources and targets.
  • Created and worked with generic stored procedures for purposes such as truncating data from stage tables, inserting records into the control table and generating parameter files.
  • Provided architecture oversight to various project teams, a key leadership role covering the architecture redesign, data governance, security, new development and major enhancements.
  • As release architect, managed release activities including environment planning, release planning, project dependencies and architecture review.
  • Designed and developed Staging and Error tables to identify and isolate duplicates and unusable data from source systems.
  • Simplified the development and maintenance of ETL by creating Mapplets, Re-usable Transformations to prevent redundancy.
  • Wrote and implemented generic UNIX and FTP scripts for purposes such as running workflows, archiving files, executing SQL commands and procedures, and moving inbound/outbound files.
  • Designed and executed test scripts to validate end-to-end business scenarios.
  • Used session partitions, dynamic cache memory, and index cache to improve the performance of ETL jobs.
  • Resolved complex technical and functional issues/bugs identified during implementation, testing and post production phases.
  • Identified & documented data integration issues and other data quality issues like duplicate data, non-conformed data, and unclean data.
  • Assisted team members in functional and Integration testing.
  • Automated and scheduled Workflows, UNIX scripts and other jobs for the daily, weekly, monthly data loads using Autosys Scheduler.
  • Created standard reports using Business Objects features like Combined Queries, Drill Down, Drill Up, Cross Tab, Master Detail etc.

Environment: Informatica Power Center 9.1/8.6, Teradata, Oracle 11g/10g, PL/SQL, Autosys, TOAD 9.x, Oracle Financials, shell scripting, Oracle SQL*Loader, SSIS, Sun Solaris UNIX, Windows XP.

Confidential, Richardson, TX

ETL Developer

Responsibilities:

  • Involved in design, development and maintenance of database for Data warehouse project
  • Optimized SSRS reports through aggressive scoping of data and judicious use of aggregate tables, materialized views and caching techniques.
  • Involved in migrating maps from IDQ to PowerCenter.
  • Applied the rules and profiled the source and target table's data using IDQ.
  • Developed the mappings, applied rules and transformation logics as per the source and target system requirements.
  • Implemented out-of-the-box analytics reporting functionality, successfully implemented OBIEE Delivers, configured proactive agents and set up interactive dashboards to alert business/field users as per the requirements.
  • Created SSIS packages to load data coming from various interfaces such as OMS, Orders, Adjustments and Objectives, using multiple transformations to collect data from various sources.
  • Created SQL Server reports, handled sub-reports and defined queries for generating drill-down and drill-through reports using SSRS 2005/2008.
  • Created SSIS packages to export and import data from CSV files, text files and Excel spreadsheets.
  • Developed reports and intelligent dashboards for the Global Sales team.
  • Performed tuning of slow-running reports and designed performance-enhancing structures on the database.
  • Interacted with business users and analysts to create functional specifications.
  • Reviewed and created the designs required to support all reporting needs.
  • Obtained sign-off on report and dashboard templates.
  • Monitored Workflows and Sessions using Workflow Monitor.
  • Performed Unit testing, Integration testing and System testing of Informatica mappings
  • Coded PL/SQL scripts.
  • Built proofs of concept for implementing Informatica scheduling on the UC4 job automation tool and making ETL loads flexible and restartable.
  • Developed mappings to process multiple flat files as sources and stage the data into Teradata and DB2 databases.
  • Responsible for migrating work from the development environment to the testing environment.
  • Coordinated with the database administration team to ensure database scripts executed correctly on staging and production instances before loads could start.
  • Used nested stored procedures with complex control flow logic to feed SSRS reports.
  • Reverse-engineered the data model using Erwin.
  • Used the metadata of Informatica repository tables.
  • Used statistical functions like regression to view the performance trends.
  • Developed metrics to track the current status of loads, current and historical performance and throughput, identify long-running jobs, and view performance trends for individual Informatica sessions.
  • Interfaced daily with cross-functional team members within the EDW team and across the enterprise to resolve issues.
  • Participated and assisted in meetings as a member of the Data Warehouse team
  • Formatted and generated summary, statistical, and presentation reports.

Environment: Informatica PowerCenter 8.6.1, SSIS, SSRS, IDQ 8.6, Teradata, Oracle 10g/11g, TOAD, SQL, SQL Server 2005/2008, Windows 7, UNIX, Linux.

Confidential

ETL Informatica Developer

Responsibilities:

  • Developed a number of complex Informatica mappings, mapplets and reusable transformations for different types of tests on customer information and for monthly and yearly data loads.
  • Analyzed source data coming from different systems and worked with business users and developers to develop the model.
  • Involved in dimensional modeling to design and develop the star schema, using Erwin to design fact and dimension tables.
  • Extracted, Transformed and Loaded OLTP data into the Staging area and Data Warehouse using Informatica mappings and complex transformations (Aggregator, Joiner, Lookup (Connected & Unconnected), Source Qualifier, Filter, Update Strategy, Stored Procedure, Router, and Expression).
  • Used Workflow Manager for workflow and session management, database connection management, and scheduling jobs to run in the batch process.
  • Fixed invalid mappings; performed testing of stored procedures and functions and unit and integration testing of Informatica sessions, batches and target data.
  • Extracted data from various sources like IMS Data Flat Files and Oracle.
  • Extensively used environment SQL commands in workflows prior to extracting the data in the ETL tool.
  • Wrote complex SQL scripts to avoid Informatica Joiners and Lookups and improve performance, as the data volume was heavy.
  • Created Sessions, reusable Worklets and Batches in Workflow Manager and Scheduled the batches and sessions at specified frequency.
  • Used SQL*Loader for bulk loading.
  • Created Stored Procedures for data transformation purpose.
  • Monitored the sessions using Workflow Monitor.
  • Used stored procedures to create a standard Time dimension, drop and create indexes before and after loading data into the targets.
  • Removed bottlenecks at source level, transformation level, and target level for the optimum usage of sources, transformations and target loads.
  • Captured data error records, corrected them and loaded them into the target system.
  • Created mappings, mapplets and transformations to remove duplicate records from the source.
  • Implemented efficient and effective performance tuning procedures and performed benchmarking; these benchmark sessions were used to set a baseline against which improvements were measured.
  • Tuned the source and target systems based on performance details; once source and target were optimized, sessions were run again to determine the impact of the changes.
  • Used calculations, variables, breakpoints, drill-down, slice-and-dice and alerts for creating Business Objects reports.
  • Configured workflows with an Email task to send mail with the session log on session failure and for target failed rows.
  • Used Server Manager to create schedules, monitor sessions and send error messages to the concerned personnel in case of process failures.
  • Designed and developed the logic for handling slowly changing dimension table loads, flagging records using Update Strategy to populate the desired targets.

Environment: Informatica 9.1, Oracle 9i, MS SQL Server 2008, SQL, PL/SQL, UNIX shell scripting.

Confidential

Data Warehouse Developer

Responsibilities:

  • Involved in design, development, administration and maintenance of the Application.
  • Trained in the Open Systems stream on database development in Oracle and SQL Server and on the Crystal Reports reporting tool.
  • Created and maintained several database objects, including stored procedures, database triggers and views.
  • Developed the OBIEE repository (.rpd) across all three layers: the Physical layer, the Business Model and Mapping layer, and the Presentation layer.
  • Developed various mappings using Mapping Designer and worked with Aggregator, Lookup, Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter and Sequence Generator transformations.
  • Involved in designing and customizing several forms.
  • Organized and managed the databases for optimum performance levels.
  • Generated several reports.
  • Involved in code walkthroughs to implement performance tuning.
  • Involved in Analysis and documentation.

Environment: Windows, Oracle, OBIEE, MS SQL Server 2008, Crystal Reports, Java, C, C++, SQL, PL/SQL, UNIX, Microsoft Project/Excel/Word/PowerPoint, TextPad
