
IDQ Analyst & Developer, Informatica Developer Resume

Omaha, NE

PROFESSIONAL SUMMARY:

  • 9+ years of experience in the information technology industry, with extensive experience in the analysis, design, development, testing, and implementation of Data Warehouse/Data Mart, Client/Server, and Mainframe applications in industries such as Healthcare and Retail.
  • Expert knowledge of Informatica IDQ and Informatica Power Center 9.0.1/8.x/7.x (Designer, Repository Manager, Repository Server Administration Console, Server Manager, Workflow Manager, Workflow Monitor).
  • Hands-on experience across all stages of the Software Development Life Cycle (SDLC), including business requirement analysis, data mapping, build, unit testing, systems integration, and user acceptance testing.
  • Proficiency in data modeling and dimensional modeling using Star and Snowflake schemas, identifying facts and dimensions, and physical and logical data modeling.
  • Implemented Slowly Changing Dimensions (SCD) for selected tables as per user requirements.
  • Experience working with Informatica Power Center, Power Exchange, B2B Data Exchange (DX), Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
  • Data modeling knowledge of Ralph Kimball’s dimensional data modeling using Star and Snowflake schemas; worked effectively with bottom-up dimensional modeling across data warehouse requirements.
  • Expertise in implementing complex business rules by creating robust mappings, mapplets, reusable transformations, worklets, and batches using Informatica Power Center.
  • Proficiency in data warehousing techniques such as data cleansing, data validation (DVO), Slowly Changing Dimensions (SCD Types I, II, and III), surrogate key assignment, and Change Data Capture (CDC).
  • Extensively worked with Informatica Power Center tools - Source Analyzer, Warehouse Designer, Mapping & Mapplet Designer, Transformation Developer, and Informatica Repository Manager.
  • Expert-level skills in creating complex mappings using transformations such as Sorter, Filter, Router, Expression, Lookup (connected and unconnected), Aggregator, Sequence Generator, Update Strategy, Joiner, Normalizer, and Union to develop robust mappings in the Informatica Designer.
  • Experience with high-volume data sets from sources such as Oracle, flat files, SQL Server (SSIS), and XML, with data modeling in Erwin.
  • Extracted and transformed data from sources such as Oracle 10g/11g and MS SQL Server and loaded it into a target Oracle database.
  • Experience working with SQL Server Integration Services (SSIS); extensively worked with dimensional data modeling and cube development in SQL Server Analysis Services (SSAS).
  • Extensively used SQL*Loader to load data from delimited flat files into Oracle database tables.
  • Extensive experience in PL/SQL such as developing Functions, Database Triggers, Stored Procedures and Packages.
  • Worked with various SQL Editors such as TOAD, SQL Plus, and Query Analyzer.
  • Knowledge in Data warehousing methodologies and its concepts.
  • Using Informatica Power Center Designer, analyzed source data to extract and transform from various source systems (Oracle 10g, DB2, SQL Server), incorporating business rules using the objects and functions the tool supports.
  • Worked with the Informatica Data Quality (IDQ 10) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, reporting, and monitoring.
  • Hands-on experience with Informatica Data Explorer (IDE) / Informatica Data Quality (IDQ) tools for data analysis, data profiling, and data governance.
  • Good experience with Teradata performance tuning, including collecting statistics, PI, PPI, join indexes, and secondary indexes, and analyzing EXPLAIN plans.
  • Extensive experience in data profiling, data migration from various legacy sources to OLAP and OLTP target applications.
  • Experience in all phases of SDLC like system analysis, application design, development, testing and implementation of data warehouse and non-data warehouse projects.
  • Experience with static and dynamic deployment methodologies to migrate code from Development to Test to Production environments.
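The SCD Type II loads mentioned above come down to two statements per changed record: expire the current row, then insert the new version. A minimal sketch follows; the table, columns, and sequence (CUSTOMER_DIM, CUSTOMER_SEQ, etc.) are hypothetical examples, and in Informatica this logic is implemented with an Update Strategy transformation rather than hand-written SQL:

```shell
#!/bin/sh
# Generate the two SQL statements behind an SCD Type II load.
# All object names are hypothetical placeholders.
cat > scd2_load.sql <<'EOF'
UPDATE CUSTOMER_DIM
   SET EFF_END_DT = CURRENT_DATE, CURRENT_FLAG = 'N'
 WHERE CUSTOMER_ID = :id AND CURRENT_FLAG = 'Y';

INSERT INTO CUSTOMER_DIM
  (CUSTOMER_SK, CUSTOMER_ID, ADDRESS, EFF_START_DT, EFF_END_DT, CURRENT_FLAG)
VALUES
  (CUSTOMER_SEQ.NEXTVAL, :id, :new_addr, CURRENT_DATE, DATE '9999-12-31', 'Y');
EOF
cat scd2_load.sql
```

The surrogate key (CUSTOMER_SK) lets multiple versions of the same business key coexist, with CURRENT_FLAG and the effective-date pair marking which version is live.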

TECHNICAL SKILLS:

ETL Tools: Informatica Power Center 9.6/9.5/9.1/8.6.1/8.5/8.1/7.x/6.x

Data Quality: Informatica Developer (IDQ)

Data modeling tools: Erwin 8.x/7.x/ 4.2/3.x, Microsoft Visio

RDBMS: Oracle 11g/10g/9i/8i, IBM DB2, MS SQL Server, Teradata V2R6/TD10/TD13, MS Access

Web Tools: HTML, DHTML, XML.

Other tools & Utilities: SQL*Plus, TOAD, AutoSys

Languages: SQL, SQL Joins, SQL Scripts

Scripting languages: UNIX Shell Scripting

Operating systems: Windows 2000/2003/NT/XP, UNIX, LINUX

BI Tools: Tableau 8.1, 8.2

PROFESSIONAL EXPERIENCE:

Confidential, OMAHA, NE

IDQ Analyst & Developer, Informatica Developer

Responsibilities:

  • Performed profiling using Informatica Analyst and Informatica Developer.
  • Used the Informatica Data Quality tool (Developer) to scrub, standardize, and match customer addresses against reference tables.
  • Used different IDQ transformations in the Developer and created mappings to meet business rules.
  • Redesigned some of the existing mappings in the system to meet new functionality.
  • Optimized performance by tuning at the mapping and session levels.
  • Used different data profiling techniques for better data analysis, e.g., column profiling and filter options for a better data overview and for identifying data anomalies.
  • Involved in creation of Logical Data Model for ETL mapping and the process flow diagrams.
  • Worked with SQL developer to write the SQL code for data manipulation.
  • Worked on Informatica versioned repository with check in and checkout objects feature.
  • Used Debugger extensively to validate the mappings and gain troubleshooting information about data and error conditions.
  • Provided guidance to less experienced personnel. Conducted quality assurance activities such as peer reviews.
  • Participated in the business analysis process and the development of ETL requirements specifications.
  • Worked with production support systems that required immediate support.
  • Developed, executed, and maintained appropriate ETL development best practices and procedures.
  • Assisted in the development of test plans for assigned projects.
  • Tuned Teradata SQL statements using EXPLAIN, analyzing data distribution among AMPs and index usage; collected statistics and defined indexes.
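The statistics-collection work above is commonly scripted: a small shell loop turns a list of tables and columns into a COLLECT STATISTICS script that can then be run through a Teradata client. A minimal sketch, with hypothetical table and column names:

```shell
#!/bin/sh
# Generate a Teradata COLLECT STATISTICS script from a list of
# "table column" pairs. The EDW.* names below are hypothetical examples
# of columns that back PI/join-index access paths.
OUT=collect_stats.sql
: > "$OUT"
while read -r tbl col; do
  echo "COLLECT STATISTICS ON ${tbl} COLUMN (${col});" >> "$OUT"
done <<EOF
EDW.CUSTOMER_DIM CUSTOMER_ID
EDW.SALES_FACT SALE_DT
EOF
cat "$OUT"
```

In practice the generated file would be fed to a client such as BTEQ after load completion, so the optimizer's EXPLAIN plans reflect current demographics.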

Confidential, Dallas, Texas

Informatica ETL Developer

Responsibilities:

  • Involved in requirement gathering from Business to understand the Data inputs.
  • Developed entity diagrams and data dictionaries to accomplish tasks. Worked closely to develop logical and physical data models that capture current-state/future-state data elements and data flow using Erwin.
  • Analyzed the data model to fit in the ETL to load data into its input tables with proper link of surrogate keys.
  • Derived the dimensions and facts for the given data and loaded them on a regular interval as per the business requirement.
  • Actively involved in the Design and development of the STAR schema data model.
  • Designed and developed ETL routines using Informatica Designer, with extensive use of Lookups, Aggregator, Rank, and Normalizer transformations, mapplets, connected and unconnected stored procedures/functions, SQL overrides in Lookups, source filters in Source Qualifiers, and Routers to manage data flow into multiple targets.
  • Extensively used Global Source/Targets, User Defined Function, Reusable Mapping and transformation.
  • Responsible for data mapping activities from source systems to work tables, pre-stage to stage loads, and stage to Business Information Factory (BIF) tables.
  • Followed Informatica best practices for error handling (logging record-level errors in metadata tables) and auditing (capturing source/target record counts in every phase of the process flow).
  • Wrote SQL commands (pre-session and post-session) and executed them in the target database to drop the index before loading data into the target table and re-create it afterward.
  • Experience in creating Dynamic deployment groups and ETL Query for promoting up to higher environments.
  • Involved in identifying bottlenecks in source, target, mappings and sessions and resolved the bottlenecks by doing Performance tuning techniques like increasing block size, data cache size, buffer length.
  • Extensively used UNIX commands within Informatica for Pre Session and Post Session Data Loading Process.
  • Used UNIX shell scripts to search for a trigger file, looping until it arrives, before scheduling the workflow.
  • Involved in Unit testing, System, User Acceptance Testing to check whether the data is loaded into target.
  • Used FTP connections to read files from various FTP Sources and to place the files in Target system.
  • Extensively used pmcmd commands on command prompt and executed UNIX Shell scripts to automate workflows and to populate parameter files.
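The trigger-file pattern and pmcmd-driven scheduling above can be sketched as a single script: poll for the trigger file, build the workflow parameter file, then launch the workflow. The paths, workflow name, and the pmcmd connection details below are hypothetical placeholders (the pmcmd call is left commented since it needs a live Integration Service):

```shell
#!/bin/sh
# Sketch of the trigger-file wait loop plus parameter-file population.
TRIGGER=/tmp/demo_inbox/orders.trg
PARAM=/tmp/demo_inbox/wf_orders.par
mkdir -p /tmp/demo_inbox
touch "$TRIGGER"               # stand-in for the upstream system's file drop
i=0
until [ -f "$TRIGGER" ]; do    # poll until the trigger file lands
  sleep 5
  i=$((i+1)); [ "$i" -ge 120 ] && { echo "timed out waiting for trigger" >&2; exit 1; }
done
{                              # populate the workflow parameter file
  echo "[WorkflowFolder.WF_LOAD_ORDERS]"
  echo "\$\$LOAD_DT=$(date +%Y-%m-%d)"
} > "$PARAM"
# pmcmd startworkflow -sv INT_SVC -d DOMAIN -u user -p pass \
#   -f WorkflowFolder -paramfile "$PARAM" WF_LOAD_ORDERS
echo "parameter file ready: $PARAM"
```

Keeping the date in a $$ mapping parameter lets the same workflow be rerun for any business date without editing the mapping itself.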

Confidential, Portland, OR

IDQ & Informatica ETL Developer

Responsibilities:

  • Extensively used Informatica Power Center 8.1, working with source definitions, target definitions, mappings, mapplets, transformations, reusable transformations, etc.
  • Involved in design and development of mappings and stored procedures in an optimized manner.
  • Implemented partitioning and bulk loads for loading large volumes of data with better performance and shorter load times.
  • Involved in loading the data from Source Tables to ODS (Operational Data Store) Tables using Transformation and Cleansing techniques using Informatica.
  • Used Informatica debugging techniques to debug the mappings and used session log files and bad files to trace errors occurred while loading.
  • Based on the requirements, used various transformations such as Source Qualifier, Normalizer, Expression, Filter, Router, Update Strategy, Sorter, Lookup, Aggregator, Joiner, and Stored Procedure in the mappings.
  • Implemented weekly error tracking and correction process using Informatica.
  • Involved in performance tuning of mappings, transformations and (workflow) sessions to optimize session performance.
  • Developed Informatica SCD type-I, II and III mappings. Extensively used almost all of the transformations of Informatica including complex lookups, Stored Procedures, Update Strategy, Mapplet and others.
  • Developed workflow tasks like reusable Email, Event wait, Timer, Command, Decision.
  • Created test cases and detailed documentation for unit, system, and integration testing to check data quality.
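The session-log debugging described above usually includes a row-count reconciliation: compare rows read from the source against rows loaded to the target. A minimal sketch follows; the log excerpt is a simplified stand-in for a real session log, and the qualifier/target names are hypothetical:

```shell
#!/bin/sh
# Row-count audit against a (simplified, illustrative) session log.
LOG=/tmp/demo_session.log
cat > "$LOG" <<'EOF'
Source [SQ_ORDERS] rows read: 1250
Target [ORDERS_DIM] rows loaded: 1250
EOF
read_cnt=$(awk '/rows read/ {print $NF}' "$LOG")     # rows out of the source
load_cnt=$(awk '/rows loaded/ {print $NF}' "$LOG")   # rows into the target
if [ "$read_cnt" = "$load_cnt" ]; then
  echo "AUDIT OK: $read_cnt rows"
else
  echo "AUDIT MISMATCH: read=$read_cnt loaded=$load_cnt" >&2
  exit 1
fi
```

A mismatch here points at rejected rows, which the bad file then explains in detail.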

Confidential, Houston, TX

Informatica Developer

Responsibilities:

  • Extensively used Informatica Power Center 8.6.1, working with source definitions, target definitions, mappings, mapplets, and different transformations.
  • Created source-to-target mappings, storage capacity planning, developing ETL processes using the Oracle Data Integrator 10g.
  • Involved in design and development of complex ETL mappings and stored procedures in an optimized manner.
  • Implemented partitioning at the session level and bulk loads for loading large volumes of data.
  • Involved in loading the data from Source Tables to ODS (Operational Data Store) Tables using Transformation and Cleansing Logic using Informatica.
  • Used Informatica debugging techniques to debug the mappings and used session log files and bad files to trace errors occurred while loading.
  • Based on the requirements, used various transformations such as Source Qualifier, Normalizer, Expression, Filter, Router, Update Strategy, Sorter, XML, Lookup, Aggregator, Joiner, and Stored Procedure in the mappings.
  • Developed Informatica SCD type-I and Type-II mappings.
  • Developed workflow tasks like reusable Email, Event wait, Timer, Command, Decision.
  • Created stored procedures and packages in PL/SQL with Oracle to create and update tables such as the order-processing information and audit log tables.
  • Used the pmcmd command to start, stop, and ping the server from UNIX, and created UNIX shell and Perl scripts to automate the process.
  • Created UNIX shell scripts and called them as pre-session and post-session commands.
  • Created test cases and detailed documentation for unit, system, integration, and UAT testing to check data quality.
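The pre/post-session pattern above (drop an index before a bulk load, re-create it afterward) is often maintained as a pair of generated SQL files that the session's pre- and post-SQL steps, or a sqlplus wrapper, execute. A minimal sketch with hypothetical index, table, and column names:

```shell
#!/bin/sh
# Generate the pre-session (drop) and post-session (re-create) index SQL
# run around a bulk load. ORDERS_DIM_IX1/ORDERS_DIM/ORDER_ID are
# hypothetical placeholders.
IDX=ORDERS_DIM_IX1
TBL=ORDERS_DIM
COL=ORDER_ID
echo "DROP INDEX ${IDX};" > pre_session.sql
echo "CREATE INDEX ${IDX} ON ${TBL} (${COL});" > post_session.sql
cat pre_session.sql post_session.sql
```

Dropping the index first keeps the bulk load from maintaining index blocks row by row; rebuilding it once at the end is typically far cheaper.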

Confidential, AR

Informatica Developer

Responsibilities:

  • Identified the flow of information, analyzing the existing systems, evaluating alternatives and choosing the "most appropriate" alternative
  • Developed mappings to extract data from Oracle, Flat files and load into Data Warehouse using the Mapping Designer
  • Used Source Analyzer and Warehouse designer to import the source and target database schemas, and the Mapping Designer to map the sources to the target
  • Understood the Business point of view to implement coding using Informatica Power Center designer
  • Integrated all jobs using complex mappings (including mapplets) and workflows, using Informatica Power Center Designer and Workflow Manager.
  • Migrated historical data as a one-time load and built a migration process for ongoing data loads using Informatica Power Center Designer.
  • Used Informatica Designer to create complex mappings using different transformations like Filter, Router, Connected & Unconnected lookups, Stored Procedure, Joiner, Update Strategy, Expressions and Aggregator transformations to pipeline data to Data Warehouse
  • Involved in versioning the whole process and retiring the old records using the built-in DD UPDATE, DD DELETE and DD INSERT
  • Developed standards and procedures for transformation of data as it moves from source systems to the Data Warehouse
  • Extensively used "PMCMD" commands on command prompt and executed UNIX Shell scripts to automate workflows and to populate parameter files
  • Debugging invalid mappings using break points, testing of stored procedures and functions, testing of Informatica sessions, batches and the target Data
  • Identified sources, targets, mappings and sessions and tuned them to improve performance
  • Written Triggers, Stored Procedures for Complex mappings, Used Debugger in troubleshooting the existing mappings
  • Created and scheduled sessions and jobs to run on demand, on schedule, or only once using Workflow Manager
  • Involved in various testing activities like database testing, unit testing, system testing, performance testing and was also responsible for maintaining of testing metrics, defect tracking.
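The defect-tracking and debugging work above often starts with a quick triage of the session's reject (.bad) file before sign-off. A minimal sketch; the sample reject content is a simplified stand-in for the real bad-file format, and the path is a hypothetical example:

```shell
#!/bin/sh
# Count rejects in a session's bad file and flag any for investigation.
# The two sample lines are simplified stand-ins for real reject rows.
BAD=/tmp/ORDERS_DIM.bad
printf '1,D,1001,D,SMITH\n1,D,1002,N,\n' > "$BAD"
cnt=$(wc -l < "$BAD")
echo "rejects in $(basename "$BAD"): $cnt"
[ "$cnt" -eq 0 ] || echo "investigate $BAD before sign-off" >&2
```

Zero rejects becomes a cheap pass/fail gate in the testing metrics; any nonzero count routes the bad file to defect tracking.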
