Sr ETL Informatica Developer Resume
Hartford, CT
SUMMARY
- 7+ years of IT experience in Environment Setup, Design, Development and Implementation of Data Warehouse and its applications.
- Involved in every stage of the SDLC across a number of projects, from requirements elicitation and estimation through project deployment and support.
- Extensively worked on Extraction, Transformation and Loading (ETL) of data from various sources such as Oracle, MS SQL Server, MS Access, Mainframe and flat files.
- Strong Data Analysis and Data Profiling background using Informatica Analyst, Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
- Expertise in creating stored procedures, PL/SQL Packages, Triggers and Functions. Strong knowledge in Oracle cursor management and exception handling.
- Well versed in writing UNIX shell scripts for running Informatica workflows (pmcmd commands), file manipulation, housekeeping functions and FTP transfers (a representative sketch follows this summary).
- Informatica Upgrade: Upgraded PowerCenter 9.5.1 to PowerCenter 9.6.1.
- Informatica Upgrade: Upgraded PowerCenter 8.6.0 to PowerCenter 9.1.0.
- Informatica Upgrade: Upgraded PowerCenter 8.1.1 to PowerCenter 8.6.0.
- Coordinated with DBAs for Oracle upgrade from 10g to 11g and updated the Informatica Domain.
- Experience in database design, development and data loading for Oracle, SQL Server, Teradata, DB2 and Access.
- Experience installing various Informatica products such as Informatica Support Console, IDQ, IDE, Proactive Monitoring for Informatica and Data Validation Option.
- Extensively worked in Client - Server application development using Oracle 11g, 10g, 9i/8i, PL/SQL and SQL*Plus.
- Used the Teradata BTEQ utility to write scripts for importing and exporting data.
- Informatica MDM experience identifying bad data and filtering it out at an early stage.
- Involved in all phases of ETL projects, including requirement analysis, design, development, testing, production release implementation and stabilization, for end-to-end IT solution offerings.
- Excellent skills in a wide variety of technologies and a proven ability to quickly learn new programs and tools.
- Strong understanding of the Data Warehousing Techniques.
- Very good communication skills and quick adaptability to new technologies and working environments.
- Excellent organizational skills and ability to prioritize workload.
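Representative sketch of the workflow-trigger shell scripts mentioned above; a minimal, illustrative example in which the integration service, domain, folder, workflow and file names are placeholders rather than values from any actual project.

```sh
#!/usr/bin/ksh
# Illustrative pmcmd wrapper; service, domain, folder, workflow and
# parameter-file names below are placeholders, not real project values.
INFA_USER=etl_user
INFA_PWD=$(cat /home/etl/.infa_pwd)          # credentials kept outside the script
PARAM_FILE=/opt/etl/param/wf_daily_load.param

pmcmd startworkflow \
  -sv INT_SVC_DEV -d DOM_DEV \
  -u "$INFA_USER" -p "$INFA_PWD" \
  -f FIN_DW -paramfile "$PARAM_FILE" \
  -wait wf_daily_load

RC=$?
if [ $RC -ne 0 ]; then
    echo "wf_daily_load failed with return code $RC" | \
        mailx -s "ETL failure: wf_daily_load" etl_support@example.com
fi
exit $RC
```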
TECHNICAL SKILLS
Data Warehousing: Informatica PowerCenter 9.6/9.5.1/7.1.2/7.1.1/7.0/6.2/5.1.2/4.7, PowerMart 9.6/9.5.1/6.2/5.1.1/5.0/4.7.2, Informatica PowerAnalyzer, Informatica PowerConnect, ETL, Data Mart, OLAP, OLTP, Mapplets, Transformations, Autosys, SQL*Plus, SQL*Loader
Dimensional Data Modeling: Star Schema Modeling, Snowflake Modeling, Fact and Dimension Tables, Physical and Logical Data Modeling
BI Tools: Crystal Report 8.5/7.0, SSRS, Cognos Series 7.2, MS Access Reports
Databases: Oracle 11g/10g/9i/8i/8.0, MS SQL Server 2008/2005/2000/7.0, IBM DB2 UDB 8.0/7.0, Sybase 12.x/11.x, Teradata, MS Access, SQL, PL/SQL, SQL*Plus, Transact-SQL.
Environment: Win 7/3.x/95/98, Win NT 4.0, Sun Solaris 2.6/2.7, Win 2000, and Win XP.
PROFESSIONAL EXPERIENCE
Confidential, Hartford, CT
Sr ETL Informatica Developer
Responsibilities:
- Interacted with Business Analysts to gather and analyze the Business Requirements.
- Involved in full development life cycle of the project from the requirements to production implementation.
- Involved in creating high-level design documents as well as low-level (source-to-target mapping) documents.
- Worked on Informatica Power Center - Designer, Workflow Manager and Workflow Monitor.
- Developed ETL processes to load data from Oracle, SQL Server, Mainframe and flat-file sources into the target DB2 system by applying business logic.
- Extensively used the Designer to import source and target definitions into the Repository and to build the mappings and mapplets.
- Used Informatica Designer to create complex mappings using different transformations such as Filter, Router, Connected & Unconnected Lookup, Stored Procedure, Joiner, Update Strategy, Expression and Aggregator to pipeline data to the database.
- Created Sessions, Workflows, Link Tasks and Command Tasks using Workflow Manager and monitored the workflows/sessions using Workflow Monitor.
- Extensively used parameter files and automated ETL processes (see the parameter-file sketch after this list).
- Extensively used PL/SQL and created stored procedures, functions and Triggers.
- Involved in Performance Tuning of mappings, sessions and SQL queries.
- Performed unit testing for the mappings and scripts developed.
- Performed Integration testing for the ETL process end to end.
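Representative sketch of the parameter-file automation referenced above; the folder, workflow, session, connection and parameter names are hypothetical, used only to illustrate the pattern.

```sh
#!/usr/bin/ksh
# Illustrative: build a run-specific parameter file, then start the workflow.
# Folder (EDW), workflow, session, connection and parameter names are placeholders.
. /opt/etl/env/infa_env.ksh                  # assumed to export INFA_USER / INFA_PWD

RUN_DT=$(date +%Y%m%d)
PARAM_FILE=/opt/etl/param/wf_db2_load_${RUN_DT}.param

cat > "$PARAM_FILE" <<EOF
[EDW.WF:wf_db2_load.ST:s_m_load_policy]
\$\$LOAD_DATE=$RUN_DT
\$DBConnection_SRC=ORA_SRC
\$DBConnection_TGT=DB2_TGT
\$InputFile1=/opt/etl/srcfiles/policy_${RUN_DT}.dat
EOF

pmcmd startworkflow -sv INT_SVC -d DOM_PROD \
  -u "$INFA_USER" -p "$INFA_PWD" \
  -f EDW -paramfile "$PARAM_FILE" -wait wf_db2_load
```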
Environment: Informatica 9.6.1 & 9.5.1, Windows 7, DB2, Data Studio 4.0.1, UNIX, SQL and PL/SQL, Mainframe and flat files.
Confidential, Boston, MA
Sr ETL Informatica Developer
Responsibilities:
- Designed the ETL processes and conducted design review meetings.
- Worked with various databases, extracting files and loading them into different target databases.
- Defined frameworks for Operational data system (ODS), Central file distribution (CFD) and Data Quality (DQ) and created functional data requirement (FDR) and Master Test Strategy documents.
- Used Informatica IDQ to qualify data content and MDM to filter duplicate data, along with project deployment and metadata management.
- Wrote PL/SQL packages, stored procedures and functions using PL/SQL features such as Collections, Objects, Object Tables, Nested Tables, External Tables, REF Cursors, MERGE, INTERSECT, MINUS, BULK COLLECT and dynamic SQL.
- Extracted data from various sources such as DB2, Teradata, flat files, Oracle and XML, then transformed and loaded the data into the target database using Informatica PowerCenter.
- Upgraded Informatica 9.5.1 Standard Edition to Informatica 9.6.0 Advanced Edition successfully without any defects/issues for 4 Environments. Configured Repository Service, Integration Service, Web Service, Model Repository Service, Data Integration Service, Analyst Service, Content Management Service, Metadata Manager Service.
- Developed database monitoring and data validation reports in SQL Server Reporting Service (SSRS).
- Configured/created mailboxes (MFT objects) with the proper user ID and password.
- Accomplished data movement processes that load data from databases and files into Teradata by developing Korn shell scripts using Teradata SQL and utilities such as BTEQ, FastLoad, FastExport, MultiLoad, WinDDI and Queryman (see the BTEQ sketch after this list).
- Developed mappings to process files from file partners and to extract data from Oracle, MS SQL Server, MySQL, MS Access and XML sources, loading the data into the Oracle warehouse.
- Recommended Best Practice for ETL coding standards, ETL naming standards.
- Cleansed and standardized data using Informatica IDQ transformations such as Standardizer, Parser, Match, Merge and Consolidation.
- Performed the code migrations (Deployment Group and Import/export).
- Coordinated with Oracle DBA and UNIX admins on Oracle upgrade and server related issues respectively.
- Used the Teradata utilities BTEQ, FastLoad, TPump and MultiLoad for bulk data loading.
- Involved in SSRS enhancements based on end-user requests.
- Created users and user groups as requested, and customized groups for migration verification.
- Created ETL connection strings and folders, and provided the appropriate privileges to users.
- Used IDQ transformations such as Parser, Standardizer, Match and Consolidation for data cleansing and loaded the cleansed data into stage tables.
- Oversaw design/development, supported the data models/structures and ETL, performed quality testing prior to UAT, and reported progress throughout the project and enhancement life cycle.
- Handled application release activities and production releases for the application users.
- Provided Level 3 production support as needed, offering timely problem resolution and temporary workarounds until a permanent fix was implemented.
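Representative sketch of the Korn shell/BTEQ export scripts described above; the TDPID, credentials, schema, table and output path are assumptions for illustration only.

```sh
#!/usr/bin/ksh
# Illustrative Korn shell wrapper around BTEQ for a daily pipe-delimited extract.
# The TDPID (tdprod), schema, table and output path are placeholders.
. /opt/etl/env/td_env.ksh                    # assumed to export TD_USER / TD_PWD

EXTRACT_FILE=/opt/etl/outbound/claims_$(date +%Y%m%d).txt

bteq <<EOF
.LOGON tdprod/${TD_USER},${TD_PWD};
.SET SEPARATOR '|';
.EXPORT REPORT FILE=${EXTRACT_FILE};
SELECT claim_id, member_id, claim_amt
FROM   edw.claims
WHERE  load_dt = CURRENT_DATE;
.EXPORT RESET;
.LOGOFF;
.QUIT;
EOF

[ $? -eq 0 ] || { echo "BTEQ export failed" >&2; exit 1; }
```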
Environment: Informatica 9.6.0 & 9.5.1, Windows XP (client), SSRS, Informatica Data Quality 9.6.0 & 9.5.1, Oracle 11g, UNIX, MS SQL Server, MDM, Teradata, MS Access, PL/SQL, MySQL and flat files
Confidential, Longmont, CO
Sr ETL Informatica Developer
Responsibilities:
- Involved in design of database and created Data marts extensively using Star Schema.
- Involved in implementing the data integrity validation checks through constraints and triggers.
- Involved in developing packages for implementing business logic through procedures and functions.
- Responsible for developing complex Informatica mappings using different types of transformations like UNION transformation, Connected and Unconnected LOOKUP transformations, Router, Filter, Aggregator, Expression and Update strategy transformations for Large volumes of Data.
- Extensively involved in application tuning, SQL tuning, memory tuning and I/O tuning using EXPLAIN PLAN and SQL Trace facilities (see the sketch after this list).
- Created high-level and low-level functional and technical specification documents for application development.
- Identified and eliminated duplicates in datasets through IDQ 9.x components such as Edit Distance and Mixed Field Matcher, enabling the creation of a single view of customers.
- Worked across the full project life cycle, from analysis to production implementation, with emphasis on identifying sources, validating source data, developing transformation logic per requirements, and creating mappings to load the data into different targets.
- Developed shell scripts for job automation in the UNIX environment to avoid manual intervention.
- Involved in SSRS enhancements based on end-user requests.
- Created scripts to schedule the process of inbound feeds and outbound extract using Autosys.
- Performed dimensional data modeling using star schema/snowflake modeling, fact and dimension tables, and physical and logical data models.
- Used the Teradata utilities BTEQ, FastLoad, TPump and MultiLoad for bulk data loading.
- Enhanced query performance through optimization techniques such as index creation, table partitioning and coding stored procedures.
- Used TOAD to run SQL queries and validate the data in the warehouse.
- Performed load and integration tests on all programs created and applied version control procedures to ensure that programs are properly implemented in production.
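Representative sketch of the SQL tuning workflow noted above, using Oracle EXPLAIN PLAN with DBMS_XPLAN; the connection variable, schema and table names are hypothetical.

```sh
#!/usr/bin/ksh
# Illustrative tuning helper: capture the execution plan of a candidate query.
# ORA_CONN, the dw schema and table names are placeholders.
. /opt/etl/env/ora_env.ksh                   # assumed to export ORA_CONN (user/pwd@db)

sqlplus -s "$ORA_CONN" <<'EOF'
SET LINESIZE 200 PAGESIZE 100

EXPLAIN PLAN FOR
SELECT d.cust_key, SUM(f.sales_amt)
FROM   dw.fact_sales f
JOIN   dw.dim_customer d ON d.cust_key = f.cust_key
WHERE  f.load_dt = TRUNC(SYSDATE)
GROUP BY d.cust_key;

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
EXIT
EOF
```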
Environment: Informatica PowerCenter 9.1, Windows XP (client), Informatica Data Quality (IDQ), PowerDesigner, PowerExchange, Workflows, ETL, flat files, Oracle 11g, Teradata, XML, SSRS, Web Services, MS SQL Server, TOAD 9.6.1, UNIX shell scripts, Autosys.
Confidential, Cleveland, OH
Sr Informatica Developer
Responsibilities:
- Analysis of requirements and finding the gap between use cases, User interface documents and ETL specifications.
- Creation of ETL Design documents with the help of use cases and User Interface documents, Business rule documentation and PDM.
- Creation of mappings according to the use cases by following the ETL specifications.
- Creation of Informatica mappings to build business rules to load data, using transformations such as Source Qualifier, Expression, Lookup, Filter, Joiner, Union, Aggregator, Sorter, Sequence Generator, Router, Update Strategy and Stored Procedure.
- Extracted data from different types of flat files and from Oracle.
- Implemented Reusable transformations, mappings, User Defined Functions, Sessions.
- Created and tested Power Exchange Data maps for Oracle CDC capture.
- Involved in identifying the Power Exchange Oracle CDC limitations and supplementary solutions.
- Creation of Unit, Functional, Integration and System test cases based on Requirement Specification documents, Use Case documents, PDM and User Interface Specifications.
- Involved in Unit, Functional, Integration and System testing and in preparing review documents for the same.
- Involved in scheduling jobs in Control-M using UNIX scripts and developed PL/SQL stored procedures.
- Involved in loading all batch scheduling tables with the help of UNIX scripts.
- Maintained all documents in ClearCase and CVS for version control.
- Created User defined functions (UDF) to reuse the logic in different mappings.
- Created and tested shell scripts to automate Job scheduling by using commands like pmcmd.
- Data investigation in the analysis of incoming data from the various source systems, documenting the data anomalies and generating Data Quality reports.
- Code migration from development to QA and production environments.
- Developed and tested stored procedures, functions and packages in PL/SQL.
- Involved in database testing, writing complex SQL queries to verify transactions and business logic, such as identifying duplicate rows, using SQL Developer and PL/SQL Developer (see the sketch after this list).
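Representative sketch of the duplicate-row validation queries mentioned above; the connection variable, schema, table and key columns are illustrative assumptions.

```sh
#!/usr/bin/ksh
# Illustrative post-load validation: flag business keys that occur more than once.
# ORA_CONN, stg.policy_stage and the key columns are placeholders.
. /opt/etl/env/ora_env.ksh                   # assumed to export ORA_CONN

sqlplus -s "$ORA_CONN" <<'EOF'
SET PAGESIZE 100

SELECT policy_no, effective_dt, COUNT(*) AS dup_cnt
FROM   stg.policy_stage
GROUP BY policy_no, effective_dt
HAVING COUNT(*) > 1;
EXIT
EOF
```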
Environment: Informatica PowerCenter v9.0.1, Oracle 10g/9i, Tidal Scheduler, Informatica servers on UNIX (Solaris), Netezza, PL/SQL Developer.
Confidential, St Paul, MN
Sr Informatica Developer
Responsibilities:
- Responsible for gathering user requirements and working with Business Analysts to acquire the functional and technical specifications.
- Analyzed the source data coming from Oracle ERP system to create the Source to Target Data Mapping.
- Interacted with the Data Architect in Creating and modifying Logical and Physical data model.
- Used Informatica Repository Manager to create Repositories and Users and to give permissions to users.
- Worked with DBAs to optimize major SQL queries as part of performance tuning.
- Generated reports with parameters, sub reports, cross tabs, charts using Crystal Reports.
- Responsible for identifying the bottlenecks and tuning the performance of the Informatica mappings/sessions.
- Used parallel processing capabilities, Session-Partitioning and Target Table partitioning utilities.
- Designed and developed mapping using various transformations like Source Qualifier, Sequence Generator, Expression, Lookup, Aggregator, Router, Rank, Filter, Update Strategy and Stored Procedure.
- Worked with Tidal Enterprise Scheduler to schedule jobs and batch processes.
- Experience using AppWorx software to schedule and run jobs for custom-written programs.
- Worked extensively with Informatica PowerExchange to extract data from Mainframes.
- Used Autosys and Informatica Scheduler to schedule jobs for the files and other sources to be extracted and load to target EDW on a daily/weekly/monthly basis.
- Created Unit Test plans and involved in primary unit testing of mappings.
- Created UNIX shell scripts for Informatica pre- and post-session operations (see the sketch after this list).
- Provided data loading, monitoring, system support and general trouble shooting necessary for all the workflows involved in the application during its production support phase.
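Representative sketch of the pre/post-session shell scripts noted above, of the kind invoked from a session's pre- and post-session command tasks; the directories and file patterns are hypothetical.

```sh
#!/usr/bin/ksh
# Illustrative pre/post-session helper; paths and file patterns are placeholders.
SRC_DIR=/opt/etl/srcfiles
ARCH_DIR=/opt/etl/archive
FILE_PATTERN="member_*.dat"

case "$1" in
  pre)   # fail fast if the expected source file has not arrived
         ls ${SRC_DIR}/${FILE_PATTERN} >/dev/null 2>&1 || {
             echo "Source file missing in ${SRC_DIR}" >&2; exit 1; }
         ;;
  post)  # archive processed files with a timestamp after a successful load
         for f in ${SRC_DIR}/${FILE_PATTERN}; do
             [ -f "$f" ] && mv "$f" "${ARCH_DIR}/$(basename "$f").$(date +%Y%m%d%H%M%S)"
         done
         ;;
  *)     echo "Usage: $0 {pre|post}" >&2; exit 2 ;;
esac
```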
Environment: Informatica PowerCenter 7.4/8.5, Oracle 10g, Teradata, Windows 7, FastExport, flat files, TOAD 8.6, UNIX.
Confidential
Sr Informatica Developer
Responsibilities:
- Analyzed business requirements and worked closely with various application and business teams to develop ETL procedures that are consistent across all applications and systems.
- Wrote Informatica ETL design documents, established ETL coding standards and performed Informatica mapping reviews.
- Extensively worked on Power Center Client Tools like Repository Admin Console, Repository Manager, Designer, Workflow Manager, and Workflow Monitor.
- Extensively worked on Power Center Designer client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
- Analyzed the source data coming from different sources (Oracle, DB2, XML, QCARE, Flat files) and worked on developing ETL mappings.
- Good experience in installation of Informatica Power Exchange.
- Developed complex Informatica Mappings, reusable Mapplets and Transformations for different types of tests in research studies on daily and monthly basis.
- Implemented mapping level optimization with best route possible without compromising with business requirements.
- Used Teradata FastLoad for truncate-and-load tables and MultiLoad for insert, update and upsert operations (see the sketch after this list).
- Created Sessions, reusable worklets and workflows in Workflow Manager and Scheduled workflows and sessions at specified frequency.
- Responsible for the Performance tuning at the Source Level, Target Level, Mapping Level and Session Level.
- Worked extensively on SQL, PL/SQL, and UNIX shell scripting.
- Generated XML files to deliver to Thomson Reuters.
- Performed Data profiling for data quality purposes.
- Demonstrated accountability, including professional documentation and weekly status reports.
- Performed Quantitative and Qualitative Data Testing.
- Documented flowcharts for the ETL (Extract, Transform and Load) data flow using Microsoft Visio, and created metadata documents for the reports and mappings developed, along with unit test scenario documentation for the same.
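Representative sketch of a FastLoad truncate-and-load script as referenced above (FastLoad expects an empty target table); the TDPID, staging table, record layout and file path are assumptions for illustration.

```sh
#!/usr/bin/ksh
# Illustrative FastLoad wrapper for loading an empty staging table from a
# pipe-delimited file; names, layout and paths are placeholders.
. /opt/etl/env/td_env.ksh                    # assumed to export TD_USER / TD_PWD

fastload <<EOF
LOGON tdprod/${TD_USER},${TD_PWD};
SET RECORD VARTEXT "|";
DEFINE study_id   (VARCHAR(18)),
       subject_id (VARCHAR(18)),
       result_val (VARCHAR(30))
FILE = /opt/etl/srcfiles/study_results.dat;
BEGIN LOADING stg.study_results_ld
      ERRORFILES stg.study_results_e1, stg.study_results_e2;
INSERT INTO stg.study_results_ld VALUES (:study_id, :subject_id, :result_val);
END LOADING;
LOGOFF;
EOF
```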
Environment: PowerCenter 7.4, DB2, Oracle 10g, UNIX, Win XP Pro, TOAD, Autosys, Cognos 7.2
Confidential
ETL Consultant
Responsibilities:
- Worked with various Informatica transformations such as Lookup, Update Strategy, Stored Procedure, Joiner, Filter, Aggregator, Rank, Router and XML Source Qualifier to bring in data from different databases such as DB2 UDB, Sybase and Oracle.
- Created mapplets to reduce development time and mapping complexity, and to improve maintainability.
- Reviewed, analyzed and modified programming systems, including coding, testing, debugging and documenting programs.
- Developed Complex mappings by extensively using Informatica Transformations.
- Developed reusable transformations and aggregations, and created target mappings that contain business rules.
- Created/built and ran/scheduled workflows and worklets using the Workflow Manager.
- Involved in the phases of unit and system testing.
- Extensively used Informatica debugger to validate mappings and to gain troubleshooting information about data and error conditions.
- Worked on performance tuning by creating indexes, using hints and explain plans, and analyzing the database.
- Ran SQL scripts from TOAD to create Oracle objects such as tables, views, indexes, sequences and synonyms (see the sketch after this list).
- Worked with a combination of relational and dimensional data modeling.
- Worked with the offshore team, helping them understand the requirements.
- Coordinated with the offshore team in maintaining the different databases.
- Extensively used UNIX shell scripts and called them from the Workflow Manager through Command tasks.
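Representative sketch of the kind of DDL script run from TOAD/SQL*Plus to create supporting Oracle objects; the connection variable and object names are hypothetical.

```sh
#!/usr/bin/ksh
# Illustrative DDL deployment via SQL*Plus; object names are placeholders.
. /opt/etl/env/ora_env.ksh                   # assumed to export ORA_CONN

sqlplus -s "$ORA_CONN" <<'EOF'
WHENEVER SQLERROR EXIT FAILURE

CREATE TABLE stg_orders (
    order_id    NUMBER(12)   NOT NULL,
    customer_id NUMBER(12),
    order_dt    DATE,
    order_amt   NUMBER(12,2)
);

CREATE INDEX ix_stg_orders_cust ON stg_orders (customer_id);
CREATE SEQUENCE seq_stg_orders START WITH 1 INCREMENT BY 1;
CREATE OR REPLACE SYNONYM orders_stage FOR stg_orders;

EXIT
EOF
```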
Environment: Informatica PowerCenter 7/8.1.1, Business Objects, MS SQL Server 2000, DB2 UDB, TOAD, Oracle 9i, UNIX, Windows 2000, Autosys, SQL*Loader.