
Sr. Informatica/idq Developer Resume


Houston, TX

PROFESSIONAL SUMMARY:

  • 8+ years of IT experience in design, analysis, development, documentation, coding, and implementation, including databases, data warehousing, ETL design, Oracle, PL/SQL, SQL Server databases, SSIS, Informatica Power Center 9.x/8.x/7.x, Informatica Data Quality, etc.
  • Expertise in Master Data Management concepts and methodologies, with the ability to apply this knowledge in building MDM solutions
  • Experience in installation and configuration of core Informatica MDM Hub components such as Hub Console, Hub Store, Hub Server, Cleanse Match Server, and Cleanse Adapter in Windows.
  • Expertise in the ETL Tool Informatica and have extensive experience in Power Center Client tools including Designer, Repository Manager, Workflow Manager/ Workflow Monitor
  • Extensively worked with complex mappings using various transformations like Filter, Joiner, Router, Source Qualifier, Expression, Union, Unconnected/Connected Lookup, Aggregator, Stored Procedure, XML Parser, Normalizer, Sequence Generator, Update Strategy, Reusable Transformations, User Defined Functions, etc.
  • Extensively worked on relational database systems like Oracle 11g/10g/9i/8i, MS SQL Server, and Teradata, and source files like flat files, XML files, and COBOL files
  • Excellent background in implementation of business applications and in using RDBMS and OOP concepts.
  • Worked with the Informatica Data Quality 9.6.1/10.1 (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities
  • Expertise in Data warehousing concepts like OLTP/OLAP System Study, Analysis, and E-R modeling, developing database Schemas like Star schema and Snowflake schema used in relational, dimensional, and multidimensional data modeling.
  • Experienced in mapping techniques for Type 1, Type 2, and Type 3 Slowly Changing Dimensions
  • Hands-on experience in Informatica upgrade from 8.6 to 9.1
  • Extensive experience in debugging mappings, identifying bottlenecks/bugs in existing mappings/workflows by analyzing the data flow and evaluating transformations
  • Solid experience in implementing business requirements, error handling, job control & job auditing using Informatica Power Center tools
  • MicroStrategy professional with extensive experience in BI project implementation, architecture, development, and administration.
  • Strong experience in SQL, PL/SQL, Tables, Database Links, Materialized Views, Synonyms, Sequences, Stored Procedures, Functions, Packages, Triggers, Joins, Unions, Cursors, Collections, and Indexes in Oracle
  • Sound knowledge of Linux/UNIX and shell scripting; experience with command-line utilities like pmcmd to execute workflows in non-Windows environments
  • Proficiency in working with Teradata utilities (BTEQ, FastLoad, FastExport, MultiLoad, Teradata Administrator, SQL Assistant, PMON, Visual Explain).
  • Implemented change data capture (CDC) using Informatica PowerExchange to load data from the Clarity database to the Teradata warehouse.
  • Experience in integration of various data sources with Multiple Relational Databases like DB2, Oracle, SQL Server, MS Access, Teradata, Flat Files, XML files, CSV, and other sources like Salesforce, etc.
  • Experience in Migrating Data from Legacy systems to Oracle database using SQL*Loader
  • Proficient in Oracle Tools and Utilities such as TOAD and SQL*Loader.
  • Experienced in scheduling Sequence and parallel jobs using DataStage Director, UNIX scripts, Linux Scripts and scheduling tool (Control-M v7/v8), Autoflow, CA WA Workstation (ESP)
  • Expert in analyzing Business & Functional Specifications, creating Technical Design Document and Unit Test cases
  • Experience in performance tuning of targets, sources, mappings, workflows, and systems.
  • Identified and fixed bottlenecks and tuned the complex Informatica mappings for better Performance.
  • Involved in the SDLC - software development life cycle (Waterfall, Scrum/Agile) - of building a Data Warehouse on Windows and UNIX platforms.
  • Well versed with onsite/offshore project delivery model and experience in working with offshore teams
  • Designed Applications according to the customer requirements and specifications.
  • Excellent Interpersonal and Communication skills, coupled with strong technical and problem-solving capabilities.
  • Excellent analytical, problem solving, technical, project management, training, and presentation skills.
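
A minimal sketch of the pmcmd-based workflow execution mentioned above, as it is typically wrapped in a UNIX script (the service, domain, user, folder, and workflow names here are placeholders, not values from any actual project):

```shell
#!/bin/sh
# Sketch: invoke an Informatica workflow from a UNIX script via pmcmd.
# All names below are illustrative placeholders.

INFA_SERVICE="IS_DEV"
INFA_DOMAIN="Domain_DEV"
INFA_USER="etl_user"
INFA_PWD_VAR="INFA_PASSWD"          # password taken from an environment variable
FOLDER="DW_LOADS"
WORKFLOW="wf_load_customer_dim"

# Build the pmcmd command; 'startworkflow -wait' blocks until the workflow
# completes, so the calling script or scheduler can act on the exit code.
build_pmcmd_cmd() {
    echo "pmcmd startworkflow -sv $INFA_SERVICE -d $INFA_DOMAIN" \
         "-u $INFA_USER -uv $INFA_PWD_VAR -f $FOLDER -wait $WORKFLOW"
}

# In a real job the command would be executed and checked:
#   eval "$(build_pmcmd_cmd)" || { echo "workflow failed" >&2; exit 1; }
build_pmcmd_cmd
```

The `-wait` flag is what lets an external scheduler treat the workflow like any other batch step with a pass/fail exit status.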

TECHNICAL SKILLS:

Operating System: UNIX, Linux, Windows

Programming and Scripting: C, C++, Java, .Net, Perl Scripting, Shell Scripting, XSLT, PL/SQL, T-SQL.

Specialist Applications & Software: Informatica Power Center 9.5/9.1/8.6/8.1/7.1, Informatica Power Exchange, Metadata Manager, Informatica Data Quality (IDQ), Informatica Data Explorer (IDE), MDM, SSIS, Salesforce, DataStage, etc.

Data Modeling (working knowledge): Relational Modeling, Dimensional Modeling (Star Schema, Snow-Flake, FACT, Dimensions), Physical, Logical Data Modeling, and ER Diagrams.

Databases tools: SQL Server Management Studio (2008), Oracle SQL Developer (3.0), Toad 11.6 (Oracle), Teradata, AQT v9 (Advanced query tool) (Oracle/Netezza), DB Artisan 9.0 (Sybase), SQL Browser (Oracle Sybase), Visio, ERWIN

Scheduling tools: Informatica Scheduler, CA Scheduler (Autosys), ESP, Maestro, Control-M.

Conversion/Transformation tools: Informatica Data Transformation, Oxygen XML Editor (ver.9, ver.10)

Software Development Methodology: Agile, Waterfall.

Domain Expertise: Publishing, Insurance/Finance, HealthCare

Others (working knowledge on some): OBIEE RPD creation, OBIEE, ECM, Informatica Data Transformation XMAP, DAC, Rational Clear Case, WS-FTP Pro, DTD, XML

RDBMS: SQL, SQL*Plus, SQL*Loader, Oracle11g/10g/9i/8i, Teradata, IBM DB2, UDB 8.1/7.0, Sybase 12.5, Netezza v9, MS SQL Server 2000/2005/2008, MS Access 7.0/2000.

PROFESSIONAL EXPERIENCE:

Confidential, Houston, TX

Sr. Informatica/IDQ Developer

Responsibilities:

  • Involved in gathering and analyzing business requirements, writing requirement specification documents, and identifying data sources and targets.
  • Communicated with business customers to discuss the issues and requirements.
  • Designed, documented, and configured the Informatica MDM Hub to support loading and cleansing of data.
  • Involved in implementing the Land Process of loading the customer/product Data Set into Informatica MDM from various source systems.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
  • Developed ETL programs using Informatica to implement the business requirements.
  • Developed several complex IDQ mappings using a variety of Power Center transformations, Mapping Parameters, Mapping Variables, Mapplets, and Parameter files in Mapping Designer using Informatica Power Center.
  • Profiled the data using Informatica Data Explorer (IDE) and performed Proof of Concept for Informatica Data Quality (IDQ)
  • Worked with Informatica Data Quality 10.1 (IDQ) toolkit, Analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities of IDQ 10.1
  • Used Informatica Power Center to load data from different data sources like XML, flat files, CSV, MS Access, Oracle, Teradata, and Salesforce.
  • Imported the IDQ address standardization mappings into Informatica Designer as mapplets.
  • Used relational SQL wherever possible to minimize the data transfer over the network.
  • Effectively used Informatica parameter files for defining mapping variables, workflow variables, FTP connections and relational connections.
  • Involved in maintenance and enhancements activities of the data warehouse including tuning, modifying of stored procedures for code enhancements.
  • Effectively worked in Informatica version based environment and used deployment groups to migrate the objects.
  • Used debugger in identifying bugs in existing mappings by analyzing data flow, evaluating transformations.
  • Worked on CDC (Change Data Capture) to implement SCD (Slowly Changing Dimensions) Type 1 and Type 2
  • Extensively used various active and passive transformations like Filter Transformation, Router Transformation, Expression Transformation, Source Qualifier Transformation, Joiner Transformation, and Look up Transformation, Update Strategy Transformation, Sequence Generator Transformation, and Aggregator Transformation.
  • Assisted in the Clarity ETL set-up and maintenance of Meaningful Use reports. Backup support for Clarity ETL tasks.
  • Clarity ETL Production and Test administrative, technical and maintenance support.
  • Involved in creating UNIX shell scripts for Datastage job and Informatica workflow execution.
  • Designed workflows with many sessions using decision, assignment, event wait, and event raise tasks; used the Informatica Scheduler to schedule jobs.
  • Used the Teradata FastLoad utility to load data into tables
  • Used SQL tools like TOAD to run SQL queries and validate the data.
  • Wrote complex SQL override scripts at the Source Qualifier level to avoid Informatica Joiners and Lookups and improve performance, as the volume of the data was heavy.
  • Converted all the jobs scheduled in Maestro to the Autosys scheduler as per the requirements
  • Worked on maintaining the master data using Informatica MDM
  • Created WebFOCUS reports from user specifications.
  • Managed and controlled flow of Web FOCUS Procedures (FOCEXEC).
  • Wrote UNIX Shell Scripts for Informatica Pre-Session, Post-Session and Autosys scripts for scheduling the jobs (work flows)
  • Used Erwin for Logical and Physical database modeling of the warehouse, responsible for database schema creation based on the logical models
  • Performed tuning of queries, targets, sources, mappings, and sessions.
  • Used Unix scripts and necessary Test Plans to ensure the successful execution of the data loading process
  • Worked with the Quality Assurance team to build the test cases to perform unit, Integration, functional and performance Testing.
  • Developed UNIX Korn shell scripts for archiving, zipping and cleanup bad files data from logs
  • Reviewed and analyzed functional requirements, mapping documents, problem solving and trouble shooting.
  • Analyzed CRs and new JIRA items to be tested for the release
  • Identified problems in existing production data and developed one time scripts to correct them.
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads, and process flow of the mappings
  • Fixed invalid mappings and troubleshot technical problems with the database.
  • Provided 24x7 production support for business users and documented problems and solutions for running the workflows.
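
The Korn shell archiving/cleanup script mentioned above can be sketched roughly as follows (directory layout and the `.bad` reject-file naming are assumptions for illustration, not details from the actual project):

```shell
#!/bin/ksh
# Sketch: gzip session reject (.bad) files and move them to an archive folder,
# keeping the log directory clean between runs. Paths are illustrative.

archive_bad_files() {
    src_dir="$1"
    arc_dir="$src_dir/archive"
    mkdir -p "$arc_dir"

    for f in "$src_dir"/*.bad; do
        [ -e "$f" ] || continue        # no .bad files present: nothing to do
        gzip "$f"                      # compress in place, producing $f.gz
        mv "$f.gz" "$arc_dir"/         # move the compressed file to the archive
    done
}
```

A scheduler (e.g., a post-session command or a nightly cron entry) would call `archive_bad_files /path/to/badfiles` after each load cycle.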

Environment: Informatica Power Center 9.6.1, UNIX, JIRA, SQL, IDQ, IDE, CDC, MDM, Linux, Perl, WinSCP, Shell, PL/SQL, Netezza, Maestro, Teradata, Erwin, MS Access, MS Excel, Salesforce, Microsoft SQL Server 2008, and Microsoft Visual Studio

Confidential, West Des Moines, IA

Sr. Informatica Developer / Analyst

Responsibilities:

  • Involved in all phases of SDLC from requirement gathering, design, development, testing, Production, and support for production environment.
  • Strong expertise in installing and configuring the core Informatica MDM components (Informatica MDM Hub Server, Hub Cleanse, Resource Kit, and Cleanse Adapters like Address Doctor)
  • Defined, configured, and optimized various MDM processes including landing, staging, base objects, foreign-key relationships, lookups, query groups, queries/custom queries, cleanse functions, batch groups and packages using Informatica MDM Hub console.
  • Used Informatica Power Center for (ETL) extraction, transformation and loading data from heterogeneous source systems into target database.
  • Created mappings using Designer and extracted data from various sources, transformed data according to the requirement.
  • Involved in extracting the data from the Flat Files and Relational databases into staging area.
  • Created Stored Procedures for data transformation purpose.
  • Involved in the Dimensional Data Modeling and populating the business rules using mappings into the Repository for Data management
  • Worked with Informatica Power Center 9.x tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplets, and Transformations.
  • Used various kinds of transformations to implement simple and complex business logic.
  • Created numerous Mappings and Mapplets using Transformations like Filters, Aggregator, Lookups, Expression, Sequence generator, Sorter, Joiner, and Update Strategy.
  • Created and configured workflows, worklets & Sessions to transport the data to target warehouse Oracle tables using Informatica Workflow Manager.
  • Worked in building Data Integration and Workflow Solutions and Extract, Transform, and Load (ETL) solutions for data warehousing using SQL Server Integration Service (SSIS).
  • Mapplets and Reusable Transformations were used to prevent redundancy of transformation usage and maintainability.
  • Involved in analyzing existing logical and physical data modeling with STAR and SNOWFLAKE schema techniques using Erwin in Data warehouse
  • Used Teradata Utilities (BTEQ, Multi-Load, and Fast-Load) to maintain the database.
  • Built a reusable staging area in Teradata for loading data from multiple source systems, using template tables for profiling and cleansing in IDQ
  • Actively involved in Data validations and unit testing to make sure data is clean and standardized before loading in to MDM landing tables.
  • Troubleshoot problems by checking sessions and error logs.
  • Configured sessions using Server Manager to have multiple partitions for source data and to improve performance.
  • Actively involved in the exception handling process using the IDQ Exception transformation after loading the data into MDM, notifying the Data Stewards of all exceptions.
  • Worked with Autoflow as the job scheduler, using it to run the created applications and their respective workflows at selected recurring times.
  • Wrote SQL override scripts at SQ level to avoid Informatica joiners and Look-ups to improve the performance
  • Generated PL/SQL and Perl scripts for scheduling periodic load processes.
  • Extensively worked on the Triggers, Functions, and Database Constraints.
  • Tuned Informatica Mappings and Sessions for optimum performance.
  • Designed and developed Linux scripts for creating, dropping tables which are used for scheduling the jobs.
  • Invoked Informatica using "pmcmd" utility from the UNIX script.
  • Wrote pre-session shell scripts to check the session mode (enable/disable) before running/scheduling batches.
  • Participated in problem solving and troubleshooting for the applications implemented with Informatica.
  • Perform process analysis to provide detailed documentation and recommendations to the Load forecasting team for future improvements
  • Involved in supporting a 24x7 rotation system, with a strong grip on the Tivoli scheduling tool.
  • Involved in Production support activities like batch monitoring process in UNIX
  • Involved in Production Support by performing Normal Loads, Bulk Loads, Initial Loads, Incremental Loads, Daily loads and Monthly loads.
  • Transition support of the Load forecasting applications/tools to an IT support organization and implement standard IT support processes and procedures.
  • Prepared Unit test case documents
  • Have performed Peer Reviews within the project.
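
The pre-session enable/disable check described above can be illustrated with a small sketch; the flag-file convention (a file containing ENABLE or DISABLE per session) is an assumed design, not a documented detail of the project:

```shell
#!/bin/sh
# Sketch: pre-session command that aborts the run when a session is disabled.
# A flag file holds ENABLE or DISABLE; location and values are assumptions.

check_session_mode() {
    flag_file="$1"
    mode=$(cat "$flag_file" 2>/dev/null)
    if [ "$mode" = "ENABLE" ]; then
        return 0                       # session may run
    else
        echo "session disabled or flag missing; skipping run" >&2
        return 1                       # non-zero exit fails the pre-session command
    fi
}
```

Because Informatica treats a failing pre-session command as a session failure, flipping the flag file is enough to pause a load without touching the workflow itself.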

Environment: Informatica Power Center 9.6.1, UNIX, Oracle, Linux, Perl, Shell, MS Access, MS Excel, MDM, IDQ, IDS, PL/SQL, Tivoli, Oracle 11g/10g, Teradata 14.0, Maestro.

Confidential, Charlotte, NC

Sr. ETL Developer / Analyst

Responsibilities:

  • Worked with Business Analysts (BA) to analyze data quality issues and find the root cause of each problem, along with the proper solution to fix it.
  • Document the process that resolves the issue which involves analysis, design, construction and testing for Data quality issues
  • Involved in doing the Data model changes and other changes in the Transformation logic in the existing Mappings according to the Business requirements for the Incremental Fixes
  • Worked extensively with Informatica tools like Source Analyzer, Warehouse Designer, Transformation Developer, and Mapping Designer.
  • Extracted data from various heterogeneous sources like DB2, Salesforce, Mainframes, Teradata, and flat files using Informatica Power Center and loaded the data into the target database.
  • Created and Configured Landing Tables, Staging Tables, Base Objects, Foreign key relationships, Queries, Query Groups etc. in MDM.
  • Defined the Match and Merge Rules in MDM Hub by creating Match Strategy, Match columns and rules for maintaining Data Quality
  • Extensively worked with different transformations such as Expression, Aggregator, Sorter, Joiner, Router, Filter, and Union in developing the mappings to migrate the data from source to target.
  • Used Connected and Unconnected Lookup transformations and lookup caches to look up data from relational and flat files. Used the Update Strategy transformation extensively with DD_INSERT, DD_UPDATE, DD_REJECT, and DD_DELETE.
  • Extensively Implemented SCD TYPE 2 Mappings for CDC (Change data capture) in EDW.
  • Code walkthrough and Review of documents which are prepared by other team members.
  • Involved in doing Unit Testing, Integration Testing, and Data Validation.
  • Extensively involved in migrating Informatica maps from Dev to SIT, and from UAT to the Production environment.
  • Developed Unix script to sftp, archive, cleanse and process many flat files
  • Created and ran Pre-existing and debug sessions in the Debugger to monitor and test the sessions prior to their normal run in the Workflow Manager
  • Extensively worked in migrating the mappings, worklets and workflows within the repository from one folder to another folder as well as among the different repositories.
  • Created Mapping parameters and Variables and written parameter files.
  • Implemented various performance tuning techniques by finding the bottlenecks at source, target, mapping, and session levels and optimizing them.
  • Used SQL queries and database programming using PL/SQL (writing Packages, Stored Procedures/Functions, and Database Triggers).
  • Worked on scheduling jobs and monitoring them through Control-M and the CA scheduler tool (Autosys).
  • Used SQL tools like TOAD to run SQL queries and validate the data in warehouse.
  • Worked with the SCM code management tool to move the code to Production
  • Extensively worked with session logs and workflow logs in doing Error Handling and trouble shooting.
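
The flat-file cleanse step mentioned above (before files are handed to the ETL load) can be sketched like this; the specific cleansing rules, normalizing Windows line endings and dropping blank lines, are assumptions for illustration:

```shell
#!/bin/sh
# Sketch: cleanse an inbound flat file before load - strip carriage returns
# (Windows CRLF endings) and remove blank lines. Paths are illustrative.

cleanse_flat_file() {
    in_file="$1"
    out_file="$2"
    # tr deletes carriage returns; grep drops the empty lines left behind
    tr -d '\r' < "$in_file" | grep -v '^$' > "$out_file"
}
```

In a full script this would run after the sftp download and before the archive step, so the loader only ever sees normalized files.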

Environment: Informatica Power Center 9.0.1, Erwin, Teradata, Tidal, MS Access, MS Excel, SQL Assistant, DB2, XML, Oracle 9i/10g/11g, MQ Series, Perl, OBIEE 10.1.3.2, IDQ, MDM, Toad, and UNIX Shell Scripts.

Confidential, New York

Sr. Informatica Developer

Responsibilities:

  • Designed the dimensional model and data load process using SCD Type 2 for quarterly membership reporting purposes.
  • Derived the dimensions and facts for the given data and loaded them on a regular interval as per the business requirement.
  • Generated data feeds from the analytical warehouse using the required ETL logic to handle data transformations and business constraints while loading from source to target layout.
  • Worked on Master Data Management (MDM) Hub development - extracting, transforming, cleansing, and loading the data onto the staging and base object tables
  • Extracted data from multiple sources such as Oracle, XML, and Flat Files and loaded the transformed data into targets in Oracle, Flat Files.
  • Wrote Shell Scripts for Data loading and DDL Scripts.
  • Designed and coded the automated balancing process for the feeds that go out from the data warehouse.
  • Implemented the automated balancing and control process, enabling audit, balance, and control of the ETL code.
  • Improved database access performance by tuning DB access methods such as creating partitions, using SQL hints, and using proper indexes.
  • Integrated all jobs using complex mappings, including mapplets and workflows, using Informatica Power Center Designer and Workflow Manager.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
  • Created ETL mappings, sessions, workflows by extracting data from MDM system, FLASH DC & REM source systems.
  • Designed and created complex source to target mappings using various transformations inclusive of but not limited to Aggregator, Look Up, Joiner, Source Qualifier, Expression, Sequence Generator, and Router Transformations.
  • Used the debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations, and created mapplets that provide reusability in mappings.
  • Analyzed the impact and the changes required to incorporate the standards in the existing data warehouse design.
  • Followed the PDLC process to move the code across environments through proper approvals and source-control environments.
  • Source control using SCM.
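
The automated balancing and control process described above boils down to comparing source and target row counts for each outbound feed. A minimal sketch, with the counts passed in as arguments (in practice they would come from SQL queries or load logs):

```shell
#!/bin/sh
# Sketch: balance check for an outbound feed - compare the row count extracted
# from the source with the row count written to the target, and fail the job
# on a mismatch so the scheduler can alert and hold downstream steps.

balance_check() {
    src_count="$1"
    tgt_count="$2"
    if [ "$src_count" -eq "$tgt_count" ]; then
        echo "BALANCED: $src_count rows"
        return 0
    else
        echo "OUT OF BALANCE: source=$src_count target=$tgt_count" >&2
        return 1                       # non-zero exit fails the batch step
    fi
}
```

Wiring this check into the job stream is what turns a silent data loss into a hard, auditable failure.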

Environment: Informatica Power Center 9.0.1, Erwin 7.2/4.5, Business Objects XI, Unix Shell Scripting, XML, Oracle 11g/10g, DB2 8.0, IDQ, MDM, TOAD, MS Excel, Flat Files, SQL Server 2008/2005, PL/SQL, Windows NT 4.0, Sun Solaris 2.6.

Confidential

Informatica Developer

Responsibilities:

  • Involved in analysis, design, development, test data preparation, unit and integration testing, Preparation of Test cases and Test Results
  • Coordinated with the client, business, and ETL teams on development
  • Developed batch jobs using extraction programs in COBOL, JCL, VSAM, datasets, and FTP to load Informatica tables
  • Involved in full project life cycle - from analysis to production implementation and support with emphasis on identifying the source and source data validation, developing logic, and transformation as per the requirement and creating mappings and loading the data into BI database.
  • Based on the business requirements created Functional design documents and Technical design specification documents for ETL Process.
  • Designed and developed ETL solutions in Informatica Power Center 8.1
  • Designed the ETL process and created ETL design and system design documents.
  • Developed code to extract, transform, and load (ETL) data from inbound flat files and various databases into various outbound files using complex business logic.
  • Used most of the common transformations, like Source Qualifier, Aggregator, Filter, Expression, Unconnected and Connected Lookups, and Update Strategy.
  • Created automated shell scripts to transfer files among servers using FTP, SFTP protocols and download files.
  • Developed Informatica mappings, enabling the ETL process for large volumes of data into target tables.
  • Designed and developed process to handle high volumes of data and high volumes of data loading in each SLA.
  • Created Workflow, Worklets and Tasks to schedule the loads at required frequency using Informatica scheduling tool.
  • Expertise in creating control files to define job dependencies and for scheduling using Informatica.
  • Provided effective support in delivering process and product change improvement solutions.
  • Ensured acceptable performance of the data warehouse processes by monitoring, researching and identifying the root causes of bottlenecks.
  • Acted as a liaison between the application testing team and development team in order to make timely fixes.
  • Involved in jobs scheduling, monitoring and production support in a 24/7 environment.
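
The automated FTP/SFTP transfer scripts described above typically generate an sftp batch file and hand it to `sftp -b`. A hedged sketch, with host, user, and directory names as placeholders:

```shell
#!/bin/sh
# Sketch: build an sftp batch file that uploads every outbound file, then
# (in a real run) invoke sftp in batch mode with it. All names are placeholders.

build_sftp_batch() {
    out_dir="$1"       # directory holding outbound files
    batch_file="$2"    # batch file to generate
    remote_dir="/inbound/feeds"

    : > "$batch_file"                          # create/truncate the batch file
    echo "cd $remote_dir" >> "$batch_file"
    for f in "$out_dir"/*; do
        [ -f "$f" ] && echo "put $f" >> "$batch_file"
    done
    echo "bye" >> "$batch_file"
    # Real transfer (not executed in this sketch):
    #   sftp -b "$batch_file" etluser@remote.host
}
```

Batch mode makes the transfer non-interactive and gives the scheduler a clean exit code, which is why it is preferred over scripted interactive FTP.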

Environment: Informatica Power Center 8.1, ETL, Business Objects, Oracle 10g/9i/8i, HP-UX, PL/SQL
