Sr. Informatica Developer Resume
Columbus, OH
SUMMARY
- 8+ years of experience in Information Technology with a strong background in database development, data warehousing (OLAP) and ETL processes using Informatica PowerCenter 9.x/8.x/7.x/6.x and Business Objects with Oracle and SQL Server databases, and IDQ.
- Experience integrating various data sources with multiple relational databases such as DB2, Oracle and SQL Server, and integrating data from XML files and flat files (fixed-width and delimited).
- Experience in writing stored procedures and functions.
- Experience in SQL tuning using Hints, Materialized Views.
- Worked on PowerExchange bulk data movement using the PowerExchange Change Data Capture (CDC) method.
- Tuned complex SQL queries by reviewing the explain plan of the SQL.
- Performed data profiling and analysis using Informatica Data Quality (IDQ).
- Worked on Master Data Management (MDM) concepts and methodologies, with the ability to apply this knowledge in building MDM solutions.
- Experience in data mining, finding correlations or patterns among fields in databases.
- Implemented performance tuning techniques at application, database and system levels
- Worked in Agile methodology and Kanban method for faster development.
- Experience in UNIX shell programming.
- Worked with Informatica Cloud to create source and target objects and developed source-to-target mappings.
- Expertise in dimensional data modeling using star and snowflake schemas. Designed data models using Erwin.
- Understanding of relational (ROLAP) and multidimensional (MOLAP) modeling, data warehousing concepts, star and snowflake schemas, database design methodologies and metadata management.
- Experience preparing functional specifications, low-level and high-level design specifications and source-to-target mapping documents for the ETL programs/business processes.
- Experience working with Oracle stored programs, packages, cursors, triggers, database links, snapshots, roles, privileges, tables, constraints, views, indexes, sequences, synonyms, dynamic SQL and SQL*Loader in a distributed environment.
- Experience creating and managing user accounts, security, rights, disk space and process monitoring in Solaris and Red Hat Linux.
- Extensive knowledge of handling slowly changing dimensions (SCD) types 1/2/3.
- Experience using Informatica to populate data into a Teradata DWH.
- Experience in understanding fact tables, dimension tables and summary tables.
- Experience with AWS cloud services such as EC2, S3, RDS, ELB, EBS, VPC, Route 53, Auto Scaling groups, CloudWatch, CloudFront and IAM for installing, configuring and troubleshooting various Amazon images for server migration from physical infrastructure into the cloud.
- Moved the company from a SQL Server database structure to a data warehouse; responsible for ETL and data validation using SQL Server Integration Services.
- Strong experience in Informatica Data Quality (IDQ): creating data profiles, custom filters, data cleansing and developing scorecards.
- Configured and managed Elastic Load Balancing (ELB) to provide fault tolerance and avoid a single point of failure of applications, hence providing high availability and network load balancing.
- Exposure to development, testing, debugging, implementation, user & production support.
- Experience wif 24x7 production support.
- Excellent Analytical, Written and Communication skills.
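As an illustration of the slowly-changing-dimension handling claimed above, the following is a minimal SCD Type 2 sketch in Python (table layout, column names and dates are hypothetical, not taken from any actual project):

```python
from datetime import date

def apply_scd_type2(dimension, incoming, key, tracked, today=None):
    """Apply SCD Type 2: expire the current row and insert a new
    version whenever a tracked attribute changes."""
    today = today or date.today().isoformat()
    current = {r[key]: r for r in dimension if r["is_current"]}
    for row in incoming:
        existing = current.get(row[key])
        if existing is None:
            # Brand-new key: insert as the first current version.
            dimension.append({**row, "eff_date": today,
                              "end_date": None, "is_current": True})
        elif any(existing[c] != row[c] for c in tracked):
            # Attribute changed: close out the old version, add a new one.
            existing["end_date"] = today
            existing["is_current"] = False
            dimension.append({**row, "eff_date": today,
                              "end_date": None, "is_current": True})
    return dimension

dim = [{"cust_id": 1, "city": "Columbus", "eff_date": "2015-01-01",
        "end_date": None, "is_current": True}]
apply_scd_type2(dim, [{"cust_id": 1, "city": "Chicago"}],
                key="cust_id", tracked=["city"], today="2016-06-01")
# dim now holds two versions of customer 1: the expired Columbus row
# and a current Chicago row.
```

In an Informatica mapping this same decision is typically made with a Lookup transformation against the dimension followed by an Update Strategy; the sketch only shows the versioning rule itself.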
TECHNICAL SKILLS
Operating Systems: Linux, UNIX, Windows 2010/2008/2007/2005/NT/XP
ETL Tools: Informatica PowerCenter 10.1/9.x/8.x/7.x (Designer, Workflow Manager, Workflow Monitor, Repository Manager and Informatica Server), IDQ.
Databases: Oracle 11g/10g/9i/8i, MS SQL Server 2008/2005, DB2/UDB, MongoDB, Teradata.
Data Modeling Tools: Erwin, MS Visio, ER Studio, Excel, SAS, R language.
Reporting Tools: Reporting Services (SSRS), SSIS, Tableau and MicroStrategy
Languages: Java, JavaScript, HTML, Perl, SQL, PL/SQL, UNIX, Shell scripts, C++, R.
Scheduling Tools: Autosys, Control-M.
Tools: Selenium, QTP, Win Runner, Load Runner, Quality Center, Test Director, TOAD
PROFESSIONAL EXPERIENCE
Confidential, Columbus, OH
Sr. Informatica Developer
Responsibilities:
- Involved in gathering and analyzing the requirements and preparing business rules from the COBOL code.
- Gathered information from the source system and business documents and prepared the data conversion and migration technical design document.
- Designed and developed complex mappings using Lookup, Expression, Update Strategy, Sequence Generator, Aggregator, Router, Stored Procedure and other transformations to implement complex logic.
- Worked on requirement analysis, ETL design and development for extracting data from the mainframe system.
- Worked with Informatica PowerCenter 10.0.2 Designer, Workflow Manager, Workflow Monitor and Repository Manager.
- Actively involved in the analysis phase of the business requirements and the design of the Informatica mappings. Performed data validation, data profiling, data auditing and data cleansing activities to ensure high-quality Cognos report deliveries.
- Developed and maintained ETL (extract, transform and load) mappings to extract data from multiple source systems such as Oracle, SQL Server and flat files and load it into Oracle.
- Developed Informatica workflows and sessions associated with the mappings using Workflow Manager.
- Created sessions, reusable worklets and workflows in Workflow Manager. Used Event Wait tasks to trigger on file arrival and run the process.
- Used PowerExchange to read source data from mainframe systems, PowerCenter for ETL, and DB2 as the target.
- Analyzed session log files when a session failed in order to resolve errors in mapping or session configuration. Involved in designing ETL testing strategies for functional, integration and system testing for the data warehouse implementation. Created files for Business Objects.
- Scheduled the Informatica workflows using the Control-M and Tivoli scheduling tools and troubleshot the Informatica workflows.
- Involved in creating new table structures and modifying existing tables to fit into the existing data model.
- Designed, developed, implemented and maintained Informatica PowerCenter and IDQ applications for the matching and merging process.
- Worked with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ.
- Responsible for normalizing COBOL files using the Normalizer transformation.
- Used the Debugger to test the data flow and fix the mappings.
- Identified and eliminated duplicates in datasets through the IDQ components Edit Distance, Jaro Distance and Mixed Field Matcher. This enabled the creation of a single view of customers and helped control costs associated with mailing lists by preventing multiple pieces of mail.
- Responsible for testing, modifying, debugging, documenting and implementing Informatica mappings. Performed unit and integration testing and wrote test cases.
- Debugged through session logs and fixed issues, utilizing the database for efficient transformation of data.
- Worked with pre-session and post-session UNIX scripts for automation of ETL jobs. Also involved in migration/conversion of ETL processes from development to production.
- Extracted data from different databases such as Oracle and from external source systems such as flat files using the ETL tool.
- Experience in data warehouse development working with data migration, data conversion and extraction/transformation/loading using Informatica PowerCenter with Oracle, SQL Server, DB2 and Teradata.
- Extracted data from sources such as SQL Server and fixed-width and delimited flat files, transformed the data according to the business requirements and then loaded it.
- Involved in debugging Informatica mappings and testing stored procedures and functions.
- Performed unit testing on deliverables and documented the results.
- Developed mapplets, reusable transformations, source and target definitions and mappings using Informatica 10.0.2.
- Generated SQL queries to check the consistency of the data in the tables and to update the tables per the business requirements.
- Involved in performance tuning of mappings in Informatica.
- Migrated ETL workflows, mappings and sessions to the QA and production environments.
- Created documents with detailed explanations of the mappings, test cases, expected results and actual results.
- Good understanding of source-to-target data mapping and the business rules associated with the ETL processes.
Environment: Informatica PowerCenter 10.2, PL/SQL Developer, IDE, IDQ, XML, Tivoli, SSIS, Oracle 11g, flat files, Teradata, UNIX, Control-M, Shell Scripting.
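The edit-distance matching used above to eliminate duplicates can be sketched as follows. This is a minimal illustration of the general technique, not IDQ's actual implementation; the names and threshold are hypothetical:

```python
def edit_distance(a, b):
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def dedupe(names, max_distance=2):
    """Keep the first occurrence of each name; drop later names
    within max_distance edits of one already kept."""
    kept = []
    for name in names:
        if all(edit_distance(name.lower(), k.lower()) > max_distance
               for k in kept):
            kept.append(name)
    return kept

print(dedupe(["Jon Smith", "John Smith", "Jane Doe", "Jane  Doe"]))
# ['Jon Smith', 'Jane Doe']
```

A production match plan would normally combine several similarity measures (as with the Jaro Distance and Mixed Field Matcher components mentioned above) rather than a single edit-distance cutoff.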
Confidential, North Chicago, IL
Sr. Informatica Developer
Responsibilities:
- Collaborated with business analysts on requirements gathering, business analysis and the design of the enterprise data warehouse. Worked on requirement analysis, ETL design and development for extracting data from heterogeneous source systems such as DB2, MS SQL Server, Oracle, flat files and mainframes and loading it into staging and the star schema.
- Created/Modified Business requirement documentation.
- Created ETL Specifications using GAP Analysis.
- Did Production Support on Monthly rotation basis.
- Worked on Master Data Management (MDM) concepts and methodologies, with the ability to apply this knowledge in building MDM solutions.
- Used Informatica Data Quality (IDQ) to profile the data and applied rules to the Provider subject areas in support of Master Data Management (MDM).
- Involved in migration of the maps from IDQ to PowerCenter.
- Eliminated and prevented duplicate and inconsistent data using Informatica MDM.
- Worked on CISM tickets for Production issues for Trucks and Cars Data Marts.
- Extracted Mainframe data for Local Landscapes like Credit Check, Application etc.
- Created Unit Test cases, supported SIT and UAT.
- Involved in massive data profiling using IDQ (Analyst Tool) prior to data staging.
- Used IDQ's standardized plans for address and name cleanup.
- Applied rules and profiled the source and target tables' data using IDQ.
- Worked with Informatica PowerExchange and Informatica Cloud to integrate and load data into the Oracle database.
- Worked with Informatica Cloud to create source and target objects and developed source-to-target mappings.
- Expertise in dimensional data modeling using star and snowflake schemas. Designed data models using Erwin.
- Involved in dimensional modeling (star schema) of the data warehouse and used Erwin to design the business process, dimensions and measured facts.
- Migrated Informatica code to different environment.
- Developed the mappings following best practices and standards.
- Hands-on experience with various modeling tools including Erwin, PowerDesigner, ER Studio and Oracle Data Modeler.
- Environment included Informatica 9.5.1, Oracle 11.6, Teradata 13.0, UNIX, Hadoop, ER Studio and DB2.
- Implemented Incremental load logic.
- Worked on the PowerExchange bulk data movement process using the PowerExchange Change Data Capture (CDC) method, PowerExchange Navigator and PowerExchange bulk data movement. PowerExchange CDC can retrieve updates at user-defined intervals or in near real time.
- Involved in design and development of technical specifications using Hadoop technology.
- Worked on debugging code to find bottlenecks for performance tuning.
- Extracted data from sources such as SQL Server and fixed-width and delimited flat files. Transformed the data according to the business requirements and then loaded it into Oracle and Teradata databases.
- Experience writing SQL queries and optimizing queries in SQL Server.
- Performed data analysis and data profiling using SQL on various source systems, including SQL Server.
- Designed and developed complex mappings to load the historical and weekly data from the legacy systems into the Oracle database.
- Worked with the Teradata database, writing BTEQ queries and loading utilities using MultiLoad, FastLoad and FastExport.
- Created CISM tickets with HP to fix DB2 issues.
- Modified MORIS (Trucks) data mart load wif enhancements.
- Created Defects in ALM software and assigned to developers.
- Fine-tuned complex SQL by checking the explain plan of the queries.
- Improved ETL job duration from hours to minutes where pushdown optimization and partitioning were unavailable.
- Created mappings to load data from ODS to DWH.
- Used DB2 Loader utility to load large tables efficiently.
- Created session partitions to load data faster.
- Extensively used pmcmd commands from the command line and executed UNIX shell scripts to automate workflows and populate parameter files.
- Used Informatica Metadata Manager to trace data lineage in the mappings.
- Worked in an Agile SDLC methodology.
- Performed analysis for ABC (Automated Balance and Control) for each business area and recommended changes to capture stats for each workflow and its sessions.
- Tuned the ETL load by checking the load on the server CPU.
- Transferred files via FTP to the Informatica server using the session FTP connection property.
- Used Informatica PowerExchange to read mainframe files.
- Documented ETL test plans, test cases, test scripts, test procedures, assumptions, and validations based on design specifications for unit testing, system testing, expected results, preparing test data and loading for testing, error handling and analysis.
Environment: Informatica Power Center 9.6.1, Informatica Metadata Manager, DB2 9.8, Rapid SQL 8.2, IBM Data Studio, Erwin, PLSQL, Linux, Power Exchange 9.5.1, Copy Book, Tivoli, ALM, Shell Scripting.
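The incremental load logic implemented above follows the usual high-water-mark pattern: load only rows changed since the last run, then advance the watermark. A minimal sketch (column names and timestamps are illustrative, not from the actual project):

```python
def incremental_load(source_rows, target, last_watermark):
    """Pull only rows modified since the previous run's high-water mark,
    then advance the watermark to the max timestamp just loaded."""
    new_rows = [r for r in source_rows if r["updated_at"] > last_watermark]
    for r in new_rows:
        target[r["id"]] = r          # upsert by primary key
    new_watermark = max((r["updated_at"] for r in new_rows),
                        default=last_watermark)
    return new_watermark

target = {}
src = [{"id": 1, "updated_at": "2018-01-01"},
       {"id": 2, "updated_at": "2018-03-05"}]
wm = incremental_load(src, target, "2018-02-01")
# Only row 2 is loaded; wm advances to "2018-03-05".
```

In PowerCenter this watermark is typically carried between runs in a parameter file (as with the parameter files populated by the shell scripts above) or a mapping variable.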
Confidential, Orange County, CA
Sr. Informatica Developer
Responsibilities:
- Designed mappings to load data from various sources such as Oracle, flat files, XML files, Teradata, Sybase and MS SQL Server into Oracle, Teradata, XML and SQL Server targets.
- Moved data from source systems into different schemas based on the dimension and fact tables, using slowly changing dimensions type 2 and type 1.
- Experience in data mining, finding correlations or patterns among fields in databases.
- Hands-on working experience building MDM composite services using the Services Integration Framework (SIF), including setup and configuration of the SIF SDK.
- Analyzed the suspects in MDM while loading the data.
- Interacted with the client on the requirements for MDM.
- Worked on Teradata databases, profiling the data from the production tables from different sources.
- Involved in project Agile roadmap estimation, pursuits, solution architecture, Hadoop best practices and architecture design/implementation within the company.
- Involved in column profiling of the tables using the Informatica Analyst tool (IDQ).
- Analyzed the requirements for profiling from MDM.
- Worked with Informatica Cloud to create source/target connections and to monitor and synchronize the data.
- Extensive experience in relational and dimensional data modeling for creating logical and physical database designs and ER diagrams using multiple data modeling tools such as ER Studio and Erwin.
- Strong experience in dimensional modeling using star and snowflake schemas, identifying facts and dimensions, and physical and logical data modeling using Erwin.
- Extensively used Erwin for data modeling and Dimensional Data Modeling.
- Experience in implementing Data Quality rules on Hive Data sources using Informatica Big Data Edition 10.x.
- Environment included Informatica Data Quality (IDQ) 9.6.1 and 10.1.1, Teradata 15.10, Oracle 11g, UC4, Windows, WinSCP, Toad, Big Data Edition 10.1.0, Hadoop Hortonworks and Hive 2.4.
- Worked with Informatica PowerExchange and Informatica Cloud to integrate and load data into the Oracle database.
- Worked with Informatica Cloud to create source and target objects and developed source-to-target mappings.
- Used IDQ to profile the project source data, define or confirm the definition of the metadata, cleanse and accuracy-check the project data, check for duplicate or redundant records, and provide information on how to proceed with the ETL processes.
- Involved in data profiling using IDQ prior to data staging.
- Extensively used XML, Normalizer, Lookup, Expression, Aggregator, Sequence Generator, Sorter and Joiner transformations.
- Developed XML mappings for the ETL team for transformation and loading.
- Defined the code values, made changes according to the values in the mapping and explained the transformation rules to the ETL team.
- Worked on XML Source Files.
- Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier and Stored Procedure transformations.
- Responsible for migration of the work from the development environment to the testing environment.
- Responsible for solving the testing issues.
- Created documents with detailed explanations of the mappings, test cases, expected results and actual results.
Environment: Informatica PowerCenter 9.5, Oracle 11g, Teradata V13.0, FastLoad, MultiLoad, Teradata SQL Assistant, Erwin, ER Studio, MS SQL Server 2012, SQL, PL/SQL, T-SQL, SQL*Plus, IDQ, XML, XSD, Oracle SQL Developer.
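Column profiling of the kind performed above with the Informatica Analyst tool boils down to computing per-column statistics. A minimal sketch of those statistics in Python (the sample column values are made up for illustration):

```python
from collections import Counter

def profile_column(values):
    """Compute the basic column statistics an IDQ-style profile reports:
    row count, null count, distinct values and the most frequent value."""
    non_null = [v for v in values if v is not None]
    counts = Counter(non_null)
    top = counts.most_common(1)
    return {
        "rows": len(values),
        "nulls": len(values) - len(non_null),
        "distinct": len(counts),
        "top_value": top[0][0] if top else None,
    }

stats = profile_column(["OH", "CA", "OH", None, "IL", "OH"])
# {'rows': 6, 'nulls': 1, 'distinct': 3, 'top_value': 'OH'}
```

These are the same counts a profiler surfaces per column to flag unexpected nulls or low-cardinality keys before data staging.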
Confidential - San Diego, CA
ETL Developer
Responsibilities:
- Worked on data mining to analyze data from different perspectives and summarize it into useful information.
- Involved in requirement analysis, ETL design and development for extracting data from heterogeneous source systems such as MS SQL Server, Oracle and flat files and loading it into staging and the star schema.
- Used SQL Assistant to query Teradata tables.
- Experience in various stages of the System Development Life Cycle (SDLC) and its approaches, such as Waterfall, Spiral and Agile.
- Used Teradata Data Mover to copy data and objects such as tables and statistics from one system to another. Involved in analyzing and building the Teradata EDW using Teradata ETL utilities and Informatica.
- Worked with the Teradata MultiLoad and FastLoad utilities to load data from Oracle and SQL Server into Teradata.
- Involved in massive data cleansing prior to data staging.
- Configured workflows with an Email task to send mail with the session log on session failure and for target failed rows.
- Worked on CDC (Change Data Capture) to implement SCD (Slowly Changing Dimensions).
- Worked on Change Data Capture (CDC) using CHKSUM to handle any change in the data when there is no flag or date column present to represent a changed row.
- Worked on reusable code known as tie-outs to maintain data consistency; it compares the source and target after ETL loading completes to validate that no data was lost during the ETL process.
- Cleansed, standardized and enriched customer information using Informatica MDM.
- Worked with data extraction, transformation and loading using BTEQ, FastLoad and MultiLoad.
- Used the Teradata FastLoad/MultiLoad utilities to load data into tables.
- Worked extensively with UNIX shell scripts for job execution and automation.
- Documented ETL test plans, test cases, test scripts, test procedures, assumptions, and validations based on design specifications for unit testing, system testing, expected results, preparing test data and loading for testing, error handling and analysis.
- Involved in unit testing and user acceptance testing to check whether the data extracted from the different source systems loads into the target according to the user requirements.
Environment: Informatica PowerCenter 9.5 (Repository Manager, Mapping Designer, Workflow Manager and Workflow Monitor), Oracle 11g, Teradata V13.0, FastLoad, MultiLoad, Teradata SQL Assistant, MS SQL Server 2012, SQL, PL/SQL, T-SQL, UNIX, Cognos, Tidal.
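The checksum-based change detection described above (comparing a hash of each row's values when no flag or date column exists) can be sketched as follows. MD5 stands in for CHKSUM here, and the key/column names are hypothetical:

```python
import hashlib

def row_checksum(row, columns):
    """Hash the concatenated column values; any changed value changes
    the checksum, so changes are detectable without a flag or date column."""
    payload = "|".join(str(row[c]) for c in columns)
    return hashlib.md5(payload.encode()).hexdigest()

def detect_changes(source, target, key, columns):
    """Compare per-row checksums between source and target; return the
    keys whose data differs (keys new in the source are also flagged,
    since they have no target checksum to match)."""
    target_sums = {r[key]: row_checksum(r, columns) for r in target}
    changed = []
    for r in source:
        if target_sums.get(r[key]) != row_checksum(r, columns):
            changed.append(r[key])
    return changed

src = [{"id": 1, "city": "Columbus"}, {"id": 2, "city": "Chicago"}]
tgt = [{"id": 1, "city": "Columbus"}, {"id": 2, "city": "Detroit"}]
print(detect_changes(src, tgt, key="id", columns=["city"]))  # [2]
```

The flagged keys then drive the SCD update path, while unchanged rows are skipped entirely.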
Confidential
Programmer Analyst
Responsibilities:
- Coordinated with business analysts and the data architecture team to ensure compliance with standards and consistency with long-term infrastructure plans.
- Prepared detailed design specifications and functional specifications for the ETL programs/business processes.
- Involved in design review, code review and test review, and gave valuable suggestions.
- Worked with different caches such as the index cache, data cache, lookup cache (static, dynamic and persistent) and join cache while developing the mappings.
- Performed data reconciliation in various source systems and in Teradata.
- Took part in migration of jobs from UIT to SIT and to UAT.
- Involved in Informatica Code Migration across various Environments.
- Performed troubleshooting on teh load failure cases, including database problems.
- Created database objects like Tables (also global/volatile), macros, views, and procedures.
- Documented all the designs from source to stage, stage to integration and integration to the atomic layer.
- Coordinated with the offshore team and mentored junior developers.
Environment: Informatica PowerCenter 9.5/9.1, Oracle 11g, Teradata V13.0, FastLoad, MultiLoad, Teradata SQL Assistant, MS SQL Server 2012, SQL, PL/SQL, T-SQL, SQL*Plus, TOAD, Erwin, AIX, Shell Scripts, Autosys, UNIX.
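The data reconciliation performed above between source systems and Teradata amounts to a tie-out: after the load, aggregate checks on both sides must agree. A minimal sketch (the measure column and data are illustrative):

```python
def reconcile(source_rows, target_rows, amount_col):
    """Tie out source against target after a load: row counts and the
    sum of a measure column must match, or the load is flagged."""
    checks = {
        "row_count": (len(source_rows), len(target_rows)),
        "amount_sum": (sum(r[amount_col] for r in source_rows),
                       sum(r[amount_col] for r in target_rows)),
    }
    # Keep only the checks where source and target disagree.
    mismatches = {name: pair for name, pair in checks.items()
                  if pair[0] != pair[1]}
    return mismatches  # an empty dict means the load ties out

src = [{"amt": 100}, {"amt": 250}]
tgt = [{"amt": 100}, {"amt": 250}]
print(reconcile(src, tgt, "amt"))  # {}
```

In practice the same counts and sums are computed as SQL aggregates on each database rather than in application code; the sketch shows only the comparison rule.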