
Sr. Informatica PowerCenter/IDQ Developer Resume

Houston, TX

SUMMARY

  • Over 8 years of extensive experience using ETL methodologies to support data extraction, data migration, data transformation and master data development using Informatica PowerCenter/IDQ and Teradata.
  • Created mappings in Mapping Designer to load data from various sources using transformations like Transaction Control, Lookup (connected and unconnected), Router, Filter, Expression, Aggregator, Joiner, Update Strategy, SQL, Stored Procedure and more.
  • Worked on the Informatica Data Quality 9.1 (IDQ) toolkit; performed data profiling, cleansing and matching, and imported data quality files as reference tables.
  • Experience with IDQ and MDM, with knowledge of Big Data Edition integration with Hadoop and HDFS.
  • Excellent knowledge of Slowly Changing Dimensions (SCD Type 1, SCD Type 2, SCD Type 3), Change Data Capture, dimensional data modeling, the Ralph Kimball approach, star/snowflake modeling, data marts, OLAP, fact and dimension tables, and physical and logical data modeling.
  • Involved in the analysis, design, development, testing and implementation of business application systems for the Health Care, Pharmaceutical, Financial, Telecom and Manufacturing sectors.
  • Strong understanding of OLAP and OLTP Concepts
  • Excellent at designing ETL procedures and strategies to extract data from heterogeneous source systems such as Oracle 11g/10g, SQL Server 2008/2005, DB2 10, flat files, XML, SAP R/3, etc.
  • Experience in SQL, PL/SQL and UNIX shell scripting.
  • Hands on experience working in LINUX, UNIX and Windows environments.
  • Excellent Verbal and Written Communication Skills. Have proven to be highly effective in interfacing across business and technical groups.
  • Good knowledge of data quality measurement using IDQ and IDE.
  • Extensive ETL experience using Informatica PowerCenter (Designer, Workflow Manager, Workflow Monitor and Server Manager), Teradata and Business Objects.
  • Strong experience in Dimensional Modeling using Star and Snow Flake Schema, Identifying Facts and Dimensions, Physical and logical data modeling using Erwin and ER-Studio.
  • Experience in designing and developing complex mappings using Informatica PowerCenter with transformations such as Lookup, Filter, Expression, Router, Joiner, Update Strategy, Aggregator, XML Generator, XML Parser, Stored Procedure, Sorter and Sequence Generator.
  • Working experience using Informatica Workflow Manager to create sessions, batches, scheduled workflows, worklets and reusable tasks, and to monitor sessions.
  • Experienced in performance tuning of Informatica and tuning the SQL queries.
  • Proficient with Informatica Data Quality (IDQ) for cleanup and massaging of data in the Confidential staging area.
  • Designed and Developed IDQ mappings for address validation / cleansing, doctor master data matching, data conversion, exception handling, and report exception data.
  • Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.
  • Experience in handling initial/full and incremental loads.
  • Expertise in scheduling workflows using the Windows scheduler, UNIX, and scheduling tools like Control-M & Autosys.
  • Designed, installed and configured core Informatica/Siperian Hub components such as the Informatica Hub Console, Hub Store, Hub Server, Cleanse Match Server, Cleanse Adapter, IDD & data modeling.
  • Experience in support and knowledge transfer to the production team.
  • Worked with Business Managers, Analysts, Development, and end users to correlate Business Logic and Specifications for ETL Development.
  • Experienced in Quality Assurance, Manual and Automated Testing Procedures with active involvement in Database/ Session/ Mapping Level Performance Tuning and Debugging.

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 10.x/9.5.x/9.1/8.6.x/8.5.x, Informatica PowerExchange 9.x, Big Data Edition 9.6.1, DataStage, Pentaho, Informatica Data Explorer (IDE), Informatica 9.5.1, Informatica Data Quality (IDQ), SSIS.

Data Modeling: Star Schema, Snowflake Schema, Erwin 4.0, Dimensional Data Modeling.

Databases: Oracle 11g/10g/9i, SQL Server 2008/2005, IBM DB2, Teradata 13.1/V2R6/V2R5, Sybase, MS Access

Scheduling Tools: Control-M, CA-7 Scheduler, CA ESP Workstation, Autosys, Informatica Scheduler.

Reporting Tools: Crystal Reports, Business Objects XI R2/XI 3.3, OBIEE 11g R1 (11.1.5).

Programming: SQL, PL/SQL, Transact SQL, HTML, XML, C, C++, Korn Shell, Bash, Perl, Python.

Operating Systems: Windows 7/XP/NT/95/98/2000, DOS, UNIX and LINUX

Big Data / Hadoop: HDFS, Hive, Spark, Hbase, and Sqoop.

Other Tools: SQL*Plus, Toad, SQL Navigator, Putty, WINSCP, MS-Office, SQL Developer.

PROFESSIONAL EXPERIENCE

Confidential, Houston, TX

Sr. Informatica Powercenter/IDQ Developer

Responsibilities:

  • Analyzed the business requirements, framed the business logic for the ETL process, and maintained the ETL process using Informatica PowerCenter.
  • Parsed high-level design specs into simple ETL coding and mapping standards.
  • Worked in an Agile methodology, participated in daily/weekly team meetings, guided two groups of seven developers in Informatica PowerCenter/Data Quality (IDQ), peer-reviewed their development work and provided technical solutions. Proposed ETL strategies based on requirements.
  • Coded Teradata BTEQ SQL scripts to load and transform data and to fix defects such as SCD Type 2 date chaining and duplicate cleanup (a representative BTEQ sketch follows this list).
  • Worked with the team to convert Trillium processes into Informatica IDQ objects.
  • Extensively used DQ transformations like Address Validator, Exception, Parser and Standardizer; solid experience in debugging and troubleshooting sessions using the Debugger and Workflow Monitor.
  • Extensively worked on UNIX shell scripts for server Health Check monitoring such as Repository Backup, CPU/Disk space utilization, Informatica Server monitoring, UNIX file system maintenance/cleanup and scripts using Informatica Command line utilities.
  • Extensively worked on CDC to capture data changes in the sources and for delta loads. Used the Debugger to validate the mappings and gained troubleshooting information about the data and error conditions.
  • Developed workflows with worklets, event waits, assignments, conditional flows, email and command tasks using Workflow Manager.
  • Proficient in System Study, Data Migration, Data integration, Data profiling, Data Cleansing / Data Scrubbing and Data quality
  • Worked on the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, address standardization, exception handling, reporting and monitoring capabilities of IDQ.
  • Experience in creating ETL mappings and transformations using Informatica PowerCenter to move data from multiple sources into the target area using complex transformations like Expression, Router, Lookup, Source Qualifier, XML Generator, XML Parser, Aggregator, Filter and Joiner.
  • Responsible for preparing logical as well as physical data models and documenting them.
  • Performed ETL code reviews and Migration of ETL Objects across repositories.
  • Developed ETLs for masking data before it was made available to the offshore development team.
  • Developed UNIX scripts for dynamic generation of parameter files and for FTP/SFTP transmission.
  • Monitored day-to-day loads, addressed and resolved production issues in a prompt and timely manner, and provided support for the ETL jobs running in production to meet the SLAs.
  • Integrated IDQ mappings through IDQ web service applications as cleanse functions in Informatica IDQ cleanse Adapters.
  • Migrated code from Dev to Test to Pre-Prod. Created effective unit and integration tests of data on different layers to capture data discrepancies/inaccuracies and ensure successful loading of accurate data.
  • Scheduled Informatica workflows using OBIEE.
  • Involved in implementing change data capture (CDC) and Type I, II and III slowly changing dimensions.
  • Developed functions and stored procedures to aid complex mappings
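For illustration, below is a minimal sketch of the kind of BTEQ script referenced above for re-chaining SCD Type 2 effective dates; the logon string, the TD_PWD environment variable and the table/column names (dim_customer, cust_id, eff_start_dt, eff_end_dt) are hypothetical placeholders, not the actual project objects.

#!/bin/ksh
# Hedged sketch: re-chain SCD Type 2 end dates in a Teradata dimension via BTEQ.
# The logon string, password variable and all object names are illustrative.
bteq <<EOF
.LOGON tdprod/etl_user,${TD_PWD}

/* Set each version's end date to the day before the next version starts;
   the newest version stays open-ended (9999-12-31). */
UPDATE tgt
FROM dim_customer tgt,
     ( SELECT cust_id, eff_start_dt,
              COALESCE(MIN(eff_start_dt) OVER (PARTITION BY cust_id
                                               ORDER BY eff_start_dt
                                               ROWS BETWEEN 1 FOLLOWING
                                                        AND 1 FOLLOWING) - 1,
                       DATE '9999-12-31') AS new_end_dt
       FROM dim_customer ) src
SET eff_end_dt = src.new_end_dt
WHERE tgt.cust_id      = src.cust_id
  AND tgt.eff_start_dt = src.eff_start_dt
  AND tgt.eff_end_dt  <> src.new_end_dt;

.IF ERRORCODE <> 0 THEN .QUIT 8
.LOGOFF
.QUIT 0
EOF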

Environment: Informatica PowerCenter 10.x/9.6, Informatica Data Quality (IDQ) 9.6, Oracle 11g, Teradata, PL/SQL, SQL Developer, TOAD, PuTTY, UNIX

Confidential, Chicago, IL

Sr. Informatica Powercenter Developer

Responsibilities:

  • Assisted the Business Analyst with drafting the requirements and implementing the design and development of various components of ETL for various applications.
  • Worked closely with ETL Architect and QC team for finalizing ETL Specification document and test scenarios.
  • Extracted data from the Oracle database, spreadsheets and CSV files, staged it in a single place, and applied business logic to load it into the central Oracle database.
  • Designed and developed Informatica Mappings and Sessions based on user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
  • Migrated code between the environments and maintained the code backups.
  • Integration of various data sources like Oracle, SQL Server, Fixed Width & Delimited Flat Files, DB2.
  • Involved in the unit testing and integration testing of the workflows developed.
  • Extensively worked with Korn shell scripts for parsing and moving files, and for re-creating parameter files in post-session command tasks.
  • Imported source/target tables from the respective databases and created reusable transformations like Joiner, Router, Lookup, Filter, Expression and Aggregator, and created new mappings using the Designer module of Informatica.
  • Used Address Doctor to validate addresses and performed exception handling, reporting and monitoring of the system. Created different rules as mapplets, Logical Data Objects (LDOs) and workflows, deployed the workflows as an application to run them, and tuned the mappings for better performance.
  • Working with database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects and hierarchies.
  • Profiled data on Hadoop to understand the data and identify data quality issues.
  • Imported and exported data between relational databases and the Hadoop Distributed File System using Sqoop (see the Sqoop sketch after this list).
  • Developed shell scripts for running batch jobs and scheduling them.
  • Handled User Acceptance Testing & System Integration Testing in addition to unit testing, with the help of Quality Center as the bug-logging tool. Created & documented the Unit Test Plan (UTP) for the code.
  • Involved in Production Support
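As an illustration of the Sqoop usage mentioned above, here is a minimal import sketch; the JDBC URL, credentials file, table name and HDFS target directory are assumptions, not the actual project values.

#!/bin/ksh
# Hedged sketch of a Sqoop import from Oracle into HDFS; connection details,
# table name and target directory are illustrative placeholders.
sqoop import \
    --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
    --username etl_user \
    --password-file /user/etl/.ora_pwd \
    --table CLAIMS_STG \
    --target-dir /data/raw/claims_stg \
    --num-mappers 4 \
    --fields-terminated-by '\t'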

Environment: Informatica PowerCenter 9.6, Oracle 11g, SQL Server, PL/SQL, UNIX, WinSCP, Big Data Edition 9.6.1, Hadoop, HDFS, Hive, Sqoop.

Confidential, Denver, CO

Sr. Informatica ETL Developer

Responsibilities:

  • Developed complex mappings by efficiently using various transformations, mapplets, mapping parameters/variables and mapplet parameters in Designer. The mappings involved extensive use of Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer, Sequence Generator, SQL and Web Service transformations.
  • Demonstrated the ETL process design with the Business Analyst and Data Warehousing Architect.
  • Assisted in building the ETL source-to-target specification documents.
  • Effectively communicated with business users and stakeholders.
  • Worked on SQL coding to override the generated SQL query in Informatica.
  • Involved in unit testing for the validity of the data from different data sources.
  • Designed and developed PL/SQL packages, stored procedures, tables, views, indexes and functions. Experience dealing with partitioned tables and automating the process of partition drop and create in the Oracle database.
  • Performed data validation in the target tables using complex SQL to make sure all the modules were integrated correctly.
  • Performed data conversion/data migration using Informatica PowerCenter.
  • Involved in performance tuning for a better data migration process.
  • Analyzed session log files to resolve mapping errors, identified bottlenecks and tuned them for optimal performance.
  • Created UNIX shell scripts for Informatica pre/post-session operations (a representative post-session script sketch follows this list).
  • Automated the jobs using the CA-7 scheduler.
  • Documented and presented the production/support documents for the components developed when handing over the application to the production support team.
  • Created the data model for the data marts.
  • Used materialized views to create snapshots of the history of main tables and for reporting purposes.
  • Coordinated with users to migrate code from Informatica 8.6 to Informatica 9.5.
  • Contacted the Informatica tech support group regarding unknown problems.
  • Provided on-call support during the weekend.
  • Monitored day-to-day loads, addressed and resolved production issues in a prompt and timely manner, and provided support for the ETL jobs running in production to meet the SLAs.
  • Used various transformations like Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup and Update Strategy while designing and optimizing the mappings.
  • Prepared SQL queries to validate the data in both the source and target databases.
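A minimal sketch of the kind of post-session housekeeping script referenced above; the directory paths, file pattern and retention period are hypothetical, not the actual project values.

#!/bin/ksh
# Hedged sketch of a post-session housekeeping script; paths, the file pattern
# and the 30-day retention are illustrative placeholders.
SRC_DIR=/data/inbound/claims
ARC_DIR=/data/archive/claims
STAMP=$(date +%Y%m%d_%H%M%S)

# Archive each processed source file with a load timestamp in its name.
for f in "$SRC_DIR"/*.csv; do
    [ -e "$f" ] || continue                  # nothing to archive
    mv "$f" "$ARC_DIR/$(basename "$f" .csv)_${STAMP}.csv"
done

# Purge archived files older than 30 days to keep the file system tidy.
find "$ARC_DIR" -name '*.csv' -mtime +30 -exec rm -f {} +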

Environment: Informatica 9.5/8.6, Oracle 11g, SQL Server 2008 R2, SQL, T-SQL, PL/SQL, Toad 10.6, SQL*Loader, OBIEE, UNIX, flat files, Teradata

Confidential, Detroit, MI

Informatica ETL Developer

Responsibilities:

  • Logical and physical data modeling was done using Erwin for the data warehouse database in a star schema.
  • Worked in an Agile methodology, participated in daily/weekly team meetings, guided two groups of seven developers in Informatica PowerCenter, peer-reviewed their development work and provided technical solutions. Proposed ETL strategies based on requirements.
  • Worked with health payer related data such as customers, policy, policy transactions, claims.
  • Generated weekly and monthly report status for the number of incidents handled by the support team.
  • Designed and developed complex mappings using Lookup, Expression, Update, Sequence Generator, Aggregator, Router, Stored Procedure, etc. transformations to implement complex ETL logic.
  • Worked with Informatica PowerCenter Designer, Workflow Manager, Workflow Monitor and Repository Manager.
  • Used Source Analyzer and Warehouse Designer to import the source and target database schemas, and the Informatica Mapping Designer to create complex mappings from business requirements.
  • Created various transformations like Filter, Router, Lookup, Stored Procedure, Joiner, Update Strategy, Expression and Aggregator to pipeline data to the Data Warehouse/Data Marts, and monitored the daily and weekly loads.
  • Designed and developed various complex SCD Type 1/Type 2 mappings in different layers and migrated the code from the Dev to Test to Prod environments. Wrote the techno-functional documentation along with different test cases to smooth the transfer of the project and maintain the SDLC.
  • Experience in using stored procedures, TOAD, Explain Plan, ref cursors, constraints, triggers, indexes (B-tree and bitmap), views, inline views, materialized views, database links and export/import utilities.
  • Developed and maintained ETL (Extract, Transformation and Loading) mappings to extract data from multiple source systems like Oracle, SQL Server and flat files and load it into Oracle.
  • Used different algorithms like Bigram, Edit, Jaro, Reverse and Hamming Distance to determine the threshold values to identify and eliminate duplicate datasets and to validate, profile and cleanse the data. Created/modified reference tables for valid data using Analyst tools.
  • Developed Informatica Workflows and sessions for mappings using Workflow Manager.
  • Deployed the Informatica code and worked on code merges between two different development teams.
  • Identified the bottlenecks in the sources, targets, mappings and sessions, and resolved the problems.
  • Created pre- and post-session UNIX scripts to merge the flat files, create and delete temporary files, and rename files to reflect the file generation date (a representative pre-session script sketch follows this list).
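Below is a minimal sketch of the pre-session merge/rename script described above; the directories, file pattern and naming convention are assumptions, not the actual project values.

#!/bin/ksh
# Hedged sketch of a pre-session script that merges the daily flat files into a
# single dated file and removes the temporary extracts; all paths and names
# are illustrative placeholders.
IN_DIR=/data/inbound/members
WORK_DIR=/data/work
TODAY=$(date +%Y%m%d)
MERGED=$WORK_DIR/members_${TODAY}.dat

cat "$IN_DIR"/members_*.dat > "$MERGED"    # merge the individual extracts
rm -f "$IN_DIR"/members_*.dat              # drop the temporary source files
echo "Merged file created: $MERGED"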

Environment: Informatica PowerCenter Designer 9.5/8.6, Informatica Repository Manager, Oracle 10g/9i, DB2 6.1, Erwin, TOAD, UNIX (SunOS), PL/SQL, SQL Developer, Teradata

Confidential

Informatica ETL Developer

Responsibilities:

  • Involved in business analysis and technical design sessions with business and technical staff to develop requirements document and ETL specifications.
  • Involved in designing dimensional modeling and data modeling using Erwin tool.
  • Created high-level Technical Design Document and Unit Test Plans.
  • Developed mapping logic using various transformations like Expression, Lookups (Connected and Unconnected), Joiner, Filter, Sorter, Update strategy and Sequence generator.
  • Wrote complex SQL override scripts at the source qualifier level to avoid Informatica Joiners and Lookups and improve performance, as the volume of the data was heavy.
  • Responsible for creating workflows. Created Session, Event, Command, Control, Decision and Email tasks in Workflow Manager.
  • Prepared user requirement documentation for mapping and additional functionality.
  • Extensively used ETL to load data using PowerCenter from source systems like flat files into staging tables and to load the data into the target Oracle database. Analyzed the existing systems and made a feasibility study.
  • Analyzed current system and programs and prepared gap analysis documents
  • Experience in performance tuning & optimization of SQL statements using SQL Trace.
  • Involved in Unit, System integration, User Acceptance Testing of Mapping.
  • Supported the process steps under the development, test and production environments.
  • Participated in the technical design along with the customer team, preparing design specifications, functional specifications and other documents.
  • Used Transformation Developer to create the reusable transformations.
  • Used Informatica PowerCenter Workflow Manager to create sessions and batches to run with the logic embedded in the mappings.
  • Wrote SQL scripts for the reporting requirements and to meet the unit test requirements.
  • Validated the mappings, sessions & workflows, and generated & loaded the data into the target database.
  • Used Informatica's features to implement Type I and II changes in slowly changing dimension tables, and developed complex mappings to facilitate daily, weekly and monthly loading of data.
  • Extensively worked on Oracle SQL for data analysis and debugging.
  • Handled scripts for pre-validating the source file structures before loading into staging by comparing the source file headers against the baselined header.
  • Worked on Teradata utilities (MultiLoad, FastLoad and Export/Import) to improve performance.
  • Wrote shell scripts and control files to load data into staging tables and then into Oracle base tables using SQL*Loader.
  • Used the pmcmd command to automate the PowerCenter sessions and workflows through UNIX (a representative pmcmd wrapper sketch follows this list).
  • Involved in troubleshooting existing ETL bugs.
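A minimal sketch of the pmcmd-based workflow automation mentioned above; the domain, Integration Service, folder, workflow and user names, and the INFA_PASSWD environment variable, are hypothetical placeholders.

#!/bin/ksh
# Hedged sketch of a pmcmd wrapper script; domain, service, folder, workflow
# and user names are illustrative. The password is read from the environment
# variable named by -pv rather than being hard-coded.
DOMAIN=Domain_ETL
INT_SVC=IS_ETL
FOLDER=FIN_DW
WORKFLOW=wf_daily_load

pmcmd startworkflow -sv "$INT_SVC" -d "$DOMAIN" \
      -u etl_user -pv INFA_PASSWD \
      -f "$FOLDER" -wait "$WORKFLOW"
rc=$?

if [ $rc -ne 0 ]; then
    echo "Workflow $WORKFLOW failed with return code $rc" >&2
    exit $rc
fi
exit 0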

Environment: Informatica PowerCenter 8.6, ETL, flat files, Oracle 10g, MS SQL Server 2008, PL/SQL, shell programming, TIBCO, SQL*Loader, Toad, Excel, UNIX scripting, Sun Solaris, Windows 2002
