ETL / Informatica Developer Resume
Charlotte, NC
SUMMARY
- 8+ years of expertise in Analysis, Design, Development, Implementation, Modeling, Testing, and Support for Data Warehouse applications.
- Extensive experience in the design, development, and production support of Data Warehouses, and in Informatica Power Center 9.x/8.x/7.x/6.x/5.x for carrying out the Extraction, Transformation, and Loading (ETL) process.
- Thorough understanding of Data Warehouse and Business Intelligence concepts, including Master Data Management (MDM), with emphasis on Repository Manager, Designer, Workflow Manager, and Workflow Monitor.
- Extensively worked on Dimensional Modeling, Data Migration, Data Cleansing, and Data Staging of operational sources using ETL processes, and on providing data mining features for data warehouses.
- Experience in Performance Tuning, Error Handling and Production Support.
- Thorough understanding of the complete software development life cycle, including Requirement Gathering, Requirement Analysis, Cost Estimation, Project Management, Design, Development, Implementation, and Testing.
- Expert skills in OLAP and in developing database schemas such as Star Schema and Snowflake Schema, with Conformed Dimensions and Slowly Changing Dimensions, used in relational, dimensional, and multi-dimensional modeling.
- Experience in various versions of Oracle, Teradata, SQL Server, SQL, PL/SQL, stored procedures and functions, triggers, and exceptions. Good experience in Relational Database Concepts and Entity-Relationship Diagrams.
- Experience in UNIX working environment, developing UNIX shell scripts for Informatica pre and post session operations.
- Experience in implementing update strategies, incremental loads and change data capture.
- Very strong knowledge of Relational Databases (RDBMS) and data modeling, and in building Data Warehouses and Data Marts using Star Schema and Snowflake Schema. Knowledge of Data Modeling (Logical, Physical, Star Schemas) with Erwin.
- In-depth understanding of Star Schema, Snowflake Schema, Normalization (1NF, 2NF, 3NF), Fact tables, and Dimension tables.
- Extensively used SQL statements for data processing and query analysis.
- Experience in working with scheduling tools like Autosys and UC4.
- Well acquainted with Informatica Designer Components - Source Analyzer, Warehouse Designer, Transformation Developer, Mapplets and Mapping Designer.
- Used Informatica Data Quality (IDQ) for data discovery, profiling, monitoring, and cleansing of data across the enterprise to achieve better business outcomes.
- Involved in full Life Cycle Development including System Analysis, Design, Data Modeling, Implementation and Support of various applications in Data Warehousing, and OLAP applications.
- Worked extensively with complex mappings using different transformations like Source Qualifiers, Expressions, Filters, Joiners, Routers, Union, Unconnected / Connected Lookups, Aggregators, and Re-usable transformations.
- Data Modeler with strong Conceptual, Logical Data Modeling skills and experience in requirements gathering, source to target mapping, writing functional specifications, queries.
- Excellent communication and interpersonal skills; an enthusiastic, knowledge-hungry self-starter, eager to meet challenges and quickly assimilate the latest technology concepts and ideas.
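The Slowly Changing Dimension work referenced above can be illustrated with a minimal SCD Type 2 sketch (an in-memory SQLite table stands in for the warehouse; the table and column names are hypothetical, not taken from any project described here):

```python
import sqlite3

# Illustrative SCD Type 2 sketch: expire the current dimension row when a
# tracked attribute changes, then insert a new current version.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY AUTOINCREMENT,
    customer_id  TEXT,     -- natural key from the source system
    city         TEXT,     -- tracked attribute
    eff_date     TEXT,     -- date the row became effective
    end_date     TEXT,     -- date the row expired (NULL = still current)
    current_flag INTEGER)""")
cur.execute("INSERT INTO dim_customer (customer_id, city, eff_date, end_date, current_flag) "
            "VALUES ('C100', 'Charlotte', '2020-01-01', NULL, 1)")

def apply_scd2(cur, customer_id, city, load_date):
    """Version the row only when the tracked attribute actually changed."""
    row = cur.execute("SELECT customer_key, city FROM dim_customer "
                      "WHERE customer_id = ? AND current_flag = 1",
                      (customer_id,)).fetchone()
    if row and row[1] == city:
        return  # no change: Type 2 only versions real changes
    if row:
        cur.execute("UPDATE dim_customer SET end_date = ?, current_flag = 0 "
                    "WHERE customer_key = ?", (load_date, row[0]))
    cur.execute("INSERT INTO dim_customer (customer_id, city, eff_date, end_date, current_flag) "
                "VALUES (?, ?, ?, NULL, 1)", (customer_id, city, load_date))

apply_scd2(cur, "C100", "Long Beach", "2021-06-15")  # city changed: old row expires
rows = cur.execute("SELECT city, end_date, current_flag FROM dim_customer "
                   "ORDER BY customer_key").fetchall()
print(rows)  # [('Charlotte', '2021-06-15', 0), ('Long Beach', None, 1)]
```

In an Informatica mapping, the same classification is typically done with a Lookup on the dimension followed by an Update Strategy that routes expired and new rows.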
TECHNICAL SKILLS
ETL Tools: Informatica Power Center 9.5/8.6/8.5.1/8.1/7.1.3/7.0/6.2, Informatica Data Explorer, and Informatica Data Quality.
Databases: Oracle 8i / 9i / 10g, Teradata V2R4 / V2R5 / 13, SQL Server 2000 / 2005, MS Access, Sybase, DB2, VSAM, IMS DB/DC, UDB
Data Modeling Tools: Erwin 4.0 / 3.5, Business Objects 6.0.
Languages: SQL, PL/SQL, C, C++, Java, Python
Scheduling tools: Autosys, UC4, Xeena
Operating Systems: UNIX, Linux, Windows 7/XP/2000/98/95
Web Technologies: HTML, XML, JavaScript, and VB Script.
Business Intelligence Tools: OBIEE 11g, Tableau, Cognos 7.x.
PROFESSIONAL EXPERIENCE
Confidential, Charlotte, NC
ETL / Informatica Developer
Responsibilities:
- Navigated a complex data requirements gathering process and the delivery of a sophisticated, analytics-driven product.
- Designed/Developed IDQ reusable mappings to match patient/provider data based on demographic information.
- Designed the ETL/Data Integration engine by keeping Data quality, performance, scalability and modularity in mind.
- Set up and configured Informatica, Netezza, and Salesforce.com connectivity and the Apex Data Loader environment from scratch.
- Worked closely with the business analyst and Data Warehouse architect to understand the source data and the needs of the Warehouse.
- Developed mappings to load Fact and Dimension tables, SCD Type 1 and SCD Type 2 dimensions and unit tested the mappings.
- Prepared detailed design documentation for the production support department to use as a reference guide for future production runs before code migration.
- Conducted status meetings with project managers, escalated issues when necessary, and conducted meetings for issue resolution.
- Extensively worked on tuning (both database and Informatica side), thereby improving load times.
- Identified performance bottlenecks using Informatica log files and the verbose option, then improved performance using Informatica partitioning, sorted input, pushdown optimization, high availability, and loaders, as well as by tuning SQL queries with analytical functions.
- Foraged through source and EDW data, identifying patterns in order to fulfill business requirements.
- Involved in Data Profiling using Informatica Data Quality (IDQ).
- Analyzed the Data Architecture, ETL migration design, and Netezza database migration design documents, and designed and developed QA automation code accordingly.
- Responsible for managing data coming from different sources.
- Used the Debugger to test mappings and fix bugs; created Debugger sessions and breakpoints for better analysis of mappings.
- Designed and Developed pre-session, post-session routines for Informatica sessions to drop and recreate indexes and key constraints for Bulk Loading.
- Involved in daily operations in Netezza using queries and utilities like NZLoad.
- Worked on performance and tuning Netezza queries.
- Gathered business requirements and actively participated in requirement-gathering sessions.
- Worked with NZSQL scripts, NZLOAD commands to load the data to Netezza target.
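The pre-session/post-session pattern described above (dropping and recreating indexes around bulk loads) can be sketched as follows; SQLite stands in for the warehouse database, and the table and index names are hypothetical:

```python
import sqlite3

# Sketch of the drop-index / bulk-load / recreate-index pattern used around
# bulk-loading sessions: the bulk insert avoids per-row index maintenance,
# and the index is rebuilt once after the load completes.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE fact_claims (claim_id INTEGER, member_id TEXT, amount REAL)")
cur.execute("CREATE INDEX idx_member ON fact_claims (member_id)")

def bulk_load(cur, rows):
    # Pre-session step: drop the index before the bulk insert.
    cur.execute("DROP INDEX IF EXISTS idx_member")
    cur.executemany("INSERT INTO fact_claims VALUES (?, ?, ?)", rows)
    # Post-session step: recreate the index after the load.
    cur.execute("CREATE INDEX idx_member ON fact_claims (member_id)")

bulk_load(cur, [(i, f"M{i % 100}", 10.0 * i) for i in range(1000)])
count = cur.execute("SELECT COUNT(*) FROM fact_claims").fetchone()[0]
print(count)  # 1000
```

In practice the drop/recreate steps would run as pre-session and post-session SQL or shell commands attached to the Informatica session rather than inline Python.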
Environment: Informatica Power Center 9.5, IDQ, Netezza, UNIX Shell Scripts, Salesforce, Tableau, JAMS Scheduler, Aginity Workbench.
Confidential, Long Beach, CA
Informatica Developer
Responsibilities:
- Developed data conversion/quality/cleansing rules and executed data cleansing activities.
- Responsible for the development, support, and maintenance of ETL (Extract, Transform and Load) processes using Informatica Power Center 9.5.
- Involved in the integration of heterogeneous data sources such as Salesforce, SQL Server, and Flat Files (fixed-width & delimited) into the Staging Area.
- Involved in migration of process from Informatica Power Center 9.1 to 9.5.
- Integrated all jobs using complex mappings, including mapplets and workflows, built with Informatica Power Center Designer and Workflow Manager.
- Identified and handled rejected rows after session completion by examining the row and column indicators in the reject files.
- Performed performance tuning to increase throughput.
- Designed and developed mappings using Source Qualifier, Expression, Lookup, Router, Aggregator, Filter, Sequence Generator, Stored Procedure, Update Strategy, joiner and Rank transformations.
- Understood the business requirements from the functional specification in order to design the ETL methodology in the technical specifications.
- Implemented performance tuning of Sources, Targets, Mappings and Sessions by identifying bottlenecks and used Debugger to debug the complex mappings and fix them.
- Addressed and tracked requests for system enhancements and improvements from end users/customers, and resolved production issues.
- Developed in different pipelines before moving data flows to UAT and Production.
- Improved session performance by enabling the Incremental Aggregation property to load incremental data into the target table.
- Provided production support for ongoing issues and troubleshot them to identify bugs.
- Extensively used the SQL override feature in the Source Qualifier transformation.
- Worked with the functional team to ensure required data was extracted and loaded, performed unit testing, and fixed errors to meet the requirements.
- Used session parameters and mapping variables/parameters, and created parameter files to enable flexible workflow runs based on changing variable values.
- Worked with Static, Dynamic and Persistent Cache in lookup transformation for better throughput of Sessions.
- Used the PMCMD command to automate Power Center sessions and workflows from UNIX.
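The parameter-file approach above can be sketched with a minimal reader for an Informatica-style parameter file (the folder, workflow, and variable names below are hypothetical examples, not taken from any project described here):

```python
# Minimal sketch of parsing an Informatica-style parameter file, where a
# [Folder.WF:workflow] header scopes $$mapping variables and $session
# parameters that drive flexible workflow runs.
PARAM_FILE = """\
[DWH_Folder.WF:wf_daily_load]
$$LoadDate=2021-06-15
$$SourceSystem=SFDC
$DBConnection_Tgt=Netezza_DW
"""

def parse_param_file(text):
    """Return {section: {parameter: value}} from an INI-style parameter file."""
    params, section = {}, None
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith(("#", ";")):
            continue  # skip blanks and comments
        if line.startswith("[") and line.endswith("]"):
            section = line[1:-1]
            params[section] = {}
        elif "=" in line and section is not None:
            key, _, value = line.partition("=")
            params[section][key.strip()] = value.strip()
    return params

params = parse_param_file(PARAM_FILE)
wf = params["DWH_Folder.WF:wf_daily_load"]
print(wf["$$LoadDate"])  # 2021-06-15
```

At run time the same file is simply passed to pmcmd with the -paramfile option when starting the workflow; the sketch only shows how the section/variable structure resolves.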
Environment: Informatica Power Center 9.1 & 9.5, SQL Server 2008, SFDC (Salesforce), Apex Data Loader, Flat Files, Apex Data Explorer, SQL Server Management Studio, Autosys, Tectia Client, UNIX and Windows 7, HIPAA.
Confidential
Informatica Developer
Responsibilities:
- Interacted with business users and business analyst to understand reporting requirements and prepared Functional Requirement document.
- Prepared technical specifications for the development of Informatica (ETL) mappings to load data into various target tables and defining ETL standards.
- Designed and developed data model using Erwin.
- Analyzed data throughout each project phase and provided relevant outputs and results from the data quality procedures, including any ongoing procedures that run after project end.
- Partnered with data analysts to provide summary results of data quality analysis, used to make decisions on how to measure business rules and the quality of the data.
- Documented, at a functional level, how the procedures work within the data quality applications.
- Used the data quality tool IDQ to profile the project source data, define or confirm the definition of the metadata, cleanse and accuracy-check the project data, check for duplicate or redundant records, and provide information on how to proceed with ETL processes.
- Developed complex mappings using Lookups connected and unconnected, Rank, Sorter, Joiner, Aggregator, Filter, Router transformations to transform the data as per the target requirements.
- Created Workflows and used various tasks like Email, Event-wait and Event-raise, Timer, Scheduler, Control, Decision, Session in the Workflow Manager.
- Used Workflow Monitor to monitor the jobs, review error logs that were generated for each session, and rectified them.
- Worked on performance tuning of the ETL processes. Optimized/tuned mappings for better performance and efficiency.
- Designed and Developed ETL logic for implementing CDC by tracking the changes in critical fields required by the user using Informatica Power Exchange.
- Coded complex BTEQ scripts to populate data mart tables from the EDW to cater to specific reporting needs.
- Interacted with Teradata DBA team for creation of primary and secondary indexes on Data Warehouse tables.
- Developed Procedures and Functions in PL/SQL.
- Used PMCMD, PMREP and UNIX shell scripts for workflow automation and repository administration.
- Developed Schedules for daily and weekly batches using Unix Maestro.
- Prepared ETL mapping specification document.
- Assisted Testing team in creating test plan and test cases.
- Conducted KT sessions to familiarize the support team with the business rules of the applications and the issues faced during UAT, for ease of future maintenance.
- Involved in creating multi-dimensional universe and reports in Business Objects Environment.
- Involved in writing PL/SQL procedures and functions.
- Deployed objects from Dev to QA and Production day to day during weekly releases.
- Designed user security and folder and metadata synchronization.
- Involved in ETL development using Informatica Power Center and UNIX.
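The CDC work above was done with Informatica Power Exchange, which captures changes from database logs; as a simpler illustration of the same classification step, here is a snapshot-comparison sketch (keys, field names, and sample rows are hypothetical):

```python
# Illustrative change-data-capture sketch: compare a source extract against
# the warehouse image on tracked fields and classify rows as inserts or
# updates. This is snapshot comparison, not log-based capture.
def detect_changes(source_rows, target_rows, key="id", tracked=("status", "amount")):
    """Classify source rows as inserts or updates relative to the target."""
    target_by_key = {row[key]: row for row in target_rows}
    inserts, updates = [], []
    for row in source_rows:
        existing = target_by_key.get(row[key])
        if existing is None:
            inserts.append(row)              # new natural key: insert
        elif any(row[f] != existing[f] for f in tracked):
            updates.append(row)              # tracked field changed: update
    return inserts, updates

source = [{"id": 1, "status": "OPEN", "amount": 100.0},
          {"id": 2, "status": "CLOSED", "amount": 250.0},
          {"id": 3, "status": "OPEN", "amount": 75.0}]
target = [{"id": 1, "status": "OPEN", "amount": 100.0},
          {"id": 2, "status": "OPEN", "amount": 250.0}]

inserts, updates = detect_changes(source, target)
print([r["id"] for r in inserts], [r["id"] for r in updates])  # [3] [2]
```

In a mapping, the same routing is done with an Update Strategy transformation flagging rows DD_INSERT or DD_UPDATE downstream of the comparison.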
Environment: Informatica Power Center 8.x/9.x, IDQ, Oracle 10g/9i, MS SQL 2008, Teradata V2R3, Business Objects, MS Access.
Confidential
ETL Developer
Responsibilities:
- Involved in design, development and maintenance of database for Data warehouse project.
- Designed the ETL processes using Informatica tool to load data from DB2 into the target Oracle 10g database.
- Wrote triggers and stored procedures for cleaning up data and providing underlying structure for reporting.
- Developed mappings/sessions using Informatica Power Center 7.5/6.2 for data loading.
- Scheduled batches and sessions within Informatica using the Informatica scheduler.
- Designed and developed Informatica Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
- Used transformations like Joiner, Expression, Connected and Unconnected Lookups, Filter, Aggregator, Stored Procedure, Rank, Update Strategy, Router, and Sequence Generator.
- Created pre-session and post-session UNIX scripts to disable/enable indexes and run procedures.
- Designed and developed Oracle PL/SQL objects and Shell Scripts for data conversion and data cleansing.
- Implemented Error Handling Strategy in all dimension mappings.
- Created reusable worklets, mapplets, and transformations.
- Extensively worked with mapping parameters and session parameters.
- Created Partitions to concurrently load the data into targets.
- Configured sessions in the workflow with various dependencies using Decision, Command, Event Raise, and Event Wait tasks.
- Tuned mappings and sessions for better performance of the data loads.
- Updated existing procedures, functions, triggers, and packages to synchronize with changes in transformations.
- Extensively used Erwin for forward and reverse engineering, enforcing naming-convention standards, and using conformed dimensions.
- Worked on delimited flat file sources; extracted data from DB2 and Oracle source systems and loaded it into flat files.
- Created Mapping Parameters & Parameter files in the mappings.
- Used Workflow Manager for creating, testing, and running sessions and batches in the workflow.
Environment: Informatica Power Center 7.5/7.1, PL/SQL, Oracle, DB2, Business Objects, Windows XP/NT, UNIX.