
Sr. Informatica PowerCenter/IDQ Developer Resume


Houston, TX

PROFESSIONAL SUMMARY

  • Over 8 years of extensive experience using ETL methodologies to support data extraction, data migration, data transformation, and master data development using Informatica PowerCenter/IDQ and Teradata.
  • Created mappings in Mapping Designer to load data from various sources using transformations like Transaction Control, Lookup (connected and unconnected), Router, Filter, Expression, Aggregator, Joiner, Update Strategy, SQL, Stored Procedure, and more.
  • Worked with the Informatica Data Quality 9.1 (IDQ) toolkit; performed data profiling, cleansing, and matching, and imported data quality files as reference tables.
  • Experienced with IDQ and MDM, with knowledge of Big Data Edition integration with Hadoop and HDFS.
  • Excellent Knowledge on Slowly Changing Dimensions (SCD Type1, SCD Type 2, SCD Type 3), Change Data Capture, Dimensional Data Modeling, Ralph Kimball Approach, Star/Snowflake Modeling, Data Marts, OLAP and FACT and Dimensions tables, Physical and Logical data modeling.
  • Involved in the Analysis, Design, Development, Testing and Implementation of business application systems for Health care, Pharmaceutical, Financial, Telecom and Manufacturing Sectors.
  • Strong understanding of OLAP and OLTP Concepts
  • Excellent at designing ETL procedures and strategies to extract data from heterogeneous source systems like Oracle 11g/10g, SQL Server 2008/2005, DB2 10, flat files, XML, SAP R/3, etc.
  • Experience in SQL, PL/SQL and UNIX shell scripting.
  • Hands on experience working in LINUX, UNIX and Windows environments.
  • Excellent Verbal and Written Communication Skills. Have proven to be highly effective in interfacing across business and technical groups.
  • Good knowledge on data quality measurement using IDQ and IDE
  • Extensive ETL experience using Informatica PowerCenter (Designer, Workflow Manager, Workflow Monitor, and Server Manager), Teradata, and Business Objects.
  • Strong experience in Dimensional Modeling using Star and Snow Flake Schema, Identifying Facts and Dimensions, Physical and logical data modeling using Erwin and ER-Studio.
  • Experience in designing and developing complex mappings using Informatica PowerCenter with transformations such as Lookup, Filter, Expression, Router, Joiner, Update Strategy, Aggregator, XML Generator, XML Parser, Stored Procedure, Sorter, and Sequence Generator.
  • Working experience using Informatica Workflow Manager to create sessions, batches, scheduled workflows, worklets, and reusable tasks, and to monitor sessions.
  • Experienced in Performance tuning of Informatica and tuning the SQL queries.
  • Proficient with Informatica Data Quality (IDQ) for cleanup and massaging of data at the staging area.
  • Designed and Developed IDQ mappings for address validation / cleansing, doctor master data matching, data conversion, exception handling, and report exception data.
  • Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, data mapping, build, unit testing, systems integration and user acceptance testing.
  • Experience in handling initial/full and incremental loads.
  • Expertise in scheduling workflows with the Windows scheduler, UNIX, and scheduling tools like Control-M & Autosys.
  • Designed, Installed, Configured core Informatica/Siperian Hub components such as Informatica Hub Console, Hub Store, Hub Server, Cleanse Match Server, Cleanse Adapter, IDD & Data Modeling.
  • Experience in support and knowledge transfer to the production team.
  • Worked with Business Managers, Analysts, Development, and end users to correlate Business Logic and Specifications for ETL Development.
  • Experienced in Quality Assurance, Manual and Automated Testing Procedures with active involvement in Database/ Session/ Mapping Level Performance Tuning and Debugging.

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 10.x/9.5.x/9.1/8.6.x/8.5.x, Informatica PowerExchange 9.x, Big Data Edition 9.6.1, DataStage, Pentaho, Informatica Data Explorer (IDE), Informatica 9.5.1, Informatica Data Quality (IDQ), SSIS.

Data Modeling: Star Schema, Snowflake Schema, Erwin 4.0, Dimension Data Modeling.

Databases: Oracle 11g/10g/9i, SQL Server 2008/2005, IBM DB2, Teradata 13.1/V2R6/V2R5, Sybase, MS Access

Scheduling Tools: Control-M, CA7 Schedule, CA ESP Workstation, Autosys, Informatica Scheduler

Reporting Tools: Crystal Reports, Business Objects XI R2/XI 3.3, OBIEE 11g R1 (11.1.5).

Programming: SQL, PL/SQL, Transact SQL, HTML, XML, C, C++, Korn Shell, Bash, Perl, Python.

Operating Systems: Windows 7/XP/NT/95/98/2000, DOS, UNIX and LINUX

Big Data / Hadoop: HDFS, Hive, Spark, Hbase, and Sqoop.

Other Tools: SQL*Plus, Toad, SQL Navigator, Putty, WINSCP, MS-Office, SQL Developer.

PROFESSIONAL EXPERIENCE

Confidential, Houston, TX

Sr. Informatica PowerCenter/IDQ Developer

Responsibilities:

  • Analyzed business requirements and framed the business logic for the ETL process; built and maintained the ETL process using Informatica PowerCenter.
  • Translated high-level design specs into simple ETL coding and mapping standards.
  • Worked in an Agile methodology: participated in daily/weekly team meetings, guided two groups of seven developers in Informatica PowerCenter/Data Quality (IDQ), peer-reviewed their development work, and provided technical solutions. Proposed ETL strategies based on requirements.
  • Coded Teradata BTEQ SQL scripts to load and transform data and to fix defects such as SCD Type 2 date chaining and duplicate cleanup.
  • Worked with team to convert Trillium process into Informatica IDQ objects.
  • Extensively used DQ transformations like Address Validator, Exception, Parser, and Standardizer. Solid experience in debugging and troubleshooting sessions using the Debugger and Workflow Monitor.
  • Extensively worked on UNIX shell scripts for server Health Check monitoring such as Repository Backup, CPU/Disk space utilization, Informatica Server monitoring, UNIX file system maintenance/cleanup and scripts using Informatica Command line utilities.
  • Extensively worked on CDC to capture the data changes into sources and for delta load. Used Debugger to validate the Mappings and gained troubleshooting information about the data and error conditions.
  • Developed workflows with worklets, event waits, assignments, conditional flows, and Email and Command tasks using Workflow Manager.
  • Proficient in System Study, Data Migration, Data integration, Data profiling, Data Cleansing / Data Scrubbing and Data quality
  • Worked on the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, address standardization, exception handling, and the reporting and monitoring capabilities of IDQ.
  • Created ETL mappings and transformations using Informatica PowerCenter to move data from multiple sources into the target area using complex transformations like Expression, Router, Lookup, Source Qualifier, XML Generator, XML Parser, Aggregator, Filter, and Joiner.
  • Responsible for preparing logical as well as physical data models and documenting them.
  • Performed ETL code reviews and Migration of ETL Objects across repositories.
  • Developed ETLs for masking the data made available to the offshore development team.
  • Developed UNIX scripts for dynamic generation of parameter files and for FTP/SFTP transmission.
  • Monitored day-to-day loads, addressed and resolved production issues in a prompt and timely manner, and supported the ETL jobs running in production to meet SLAs.
  • Integrated IDQ mappings through IDQ web service applications as cleanse functions in Informatica IDQ cleanse Adapters.
  • Migrated code from Dev to Test to Pre-Prod. Created effective unit and integration tests of data on different layers to capture data discrepancies/inaccuracies and ensure accurate data loading.
  • Scheduled Informatica workflows using OBIEE.
  • Involved in implementing change data capture (CDC) and Type I, II, and III slowly changing dimensions.
  • Developed functions and stored procedures to aid complex mappings
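A minimal sketch of the dynamic parameter-file generation mentioned above: the folder, workflow, parameter names, and paths are all hypothetical placeholders, not taken from any actual project.

```shell
# Sketch: dynamically generate an Informatica parameter file before a workflow
# run. Folder, workflow, parameter names, and paths are hypothetical.

PARM_DIR=${PARM_DIR:-/tmp/parmfiles}
RUN_DATE=$(date +%Y%m%d)
PARM_FILE="$PARM_DIR/wf_load_sales.parm"

mkdir -p "$PARM_DIR"

# PowerCenter resolves $$ parameters under a [folder.WF:workflow] section.
cat > "$PARM_FILE" <<EOF
[FOLDER_SALES.WF:wf_load_sales]
\$\$RUN_DATE=$RUN_DATE
\$\$SRC_FILE=/data/inbound/sales_$RUN_DATE.csv
\$\$TGT_SCHEMA=EDW_STG
EOF

echo "Generated $PARM_FILE"
```

A scheduler or pre-session command task would regenerate this file on each run so the workflow always picks up the current run date and source file.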

Environment: Informatica PowerCenter 10.x/9.6, Informatica Data Quality (IDQ) 9.6, Oracle 11g, Teradata, PL/SQL, SQL Developer, TOAD, Putty, Unix

Confidential, Chicago, IL

Sr. Informatica Powercenter Developer

Responsibilities:

  • Assisted Business Analyst with drafting the requirements, implementing design and development of various components of ETL for various applications.
  • Worked closely with ETL Architect and QC team for finalizing ETL Specification document and test scenarios.
  • Extracted data from an Oracle database, spreadsheets, and CSV files, staged it in a single place, and applied business logic to load it into the central Oracle database.
  • Designed and developed Informatica Mappings and Sessions based on user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
  • Migrated code between environments and maintained code backups.
  • Integration of various data sources like Oracle, SQL Server, Fixed Width & Delimited Flat Files, DB2.
  • Involved in the Unit Testing and Integration testing of the workflows developed.
  • Extensively worked with Korn shell scripts for parsing and moving files, and for re-creating parameter files in post-session command tasks.
  • Imported Source/Target Tables from the respective databases and created reusable transformations like Joiner, Routers, Lookups, Filter, Expression and Aggregator and created new mappings using Designer module of Informatica.
  • Used the Address Doctor to validate the address and performed exception handling, reporting and monitoring the system. Created different rules as mapplets, Logical Data Objects (LDO), workflows. Deployed the workflows as an application to run them. Tuned the mappings for better performance.
  • Worked with database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects, and hierarchies.
  • Profiled data on Hadoop to understand the data and identify data quality issues
  • Imported and exported data between relational databases and the Hadoop Distributed File System using Sqoop.
  • Developed shell scripts for running batch jobs and scheduling them.
  • Handled User Acceptance Testing & System Integration Testing apart from unit testing, using Quality Center as the bug-logging tool. Created & documented a Unit Test Plan (UTP) for the code.
  • Involved in Production Support
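As a sketch of the kind of Sqoop import described above: the JDBC connection string, table, and target directory are hypothetical, and since running the import needs a live cluster, this snippet only assembles and prints the command.

```shell
# Sketch: assemble a Sqoop import from Oracle into HDFS. Connection details,
# table name, and target directory are hypothetical placeholders; the command
# is echoed rather than executed, since it requires a Hadoop cluster.

SQOOP_CMD="sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username etl_user --password-file /user/etl/.pwfile \
  --table CLAIMS_STG \
  --target-dir /data/raw/claims_stg \
  --num-mappers 4 \
  --fields-terminated-by ','"

echo "$SQOOP_CMD"
```

Using `--password-file` instead of an inline password keeps credentials out of process listings, which matters when such jobs run from a scheduler.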

Environment: Informatica PowerCenter 9.6, Oracle 11g, SQL Server, PL/SQL, Unix, WinSCP, Big Data Edition 9.6.1, Hadoop, HDFS, Hive, Sqoop.

Confidential, Denver, CO

Sr. Informatica ETL Developer

Responsibilities:

  • Develop complex mappings by efficiently using various transformations, Mapplets, Mapping Parameters/Variables, Mapplet Parameters in Designer. The mappings involved extensive use of Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer, Sequence generator, SQL and Web Service transformations.
  • Demonstrated the ETL process Design with Business Analyst and Data Warehousing Architect.
  • Assisted in building the ETL source to Target specification documents
  • Effectively communicate with Business Users and Stakeholders.
  • Work on SQL coding to override the generated SQL queries in Informatica.
  • Involve in Unit testing for the validity of the data from different data sources.
  • Design and develop PL/SQL packages, stored procedure, tables, views, indexes and functions. Experience dealing with partitioned tables and automating the process of partition drop and create in oracle database.
  • Perform data validation in the target tables using complex SQLs to make sure all the modules are integrated correctly.
  • Perform data conversion/data migration using Informatica PowerCenter.
  • Involve in performance tuning for better data migration process.
  • Analyze session log files to resolve errors in mappings, identify bottlenecks, and tune them for optimal performance.
  • Create UNIX shell scripts for Informatica pre/post session operations.
  • Automated the jobs using CA7 Scheduler.
  • Document and present the production/support documents for the components developed, when handing-over the application to the production support team.
  • Created Data Model for the DataMarts.
  • Used materialized views to create snapshots of history of main tables and for reporting purpose
  • Coordinated with users to migrate code from Informatica 8.6 to Informatica 9.5.
  • Contacted the Informatica tech support group regarding unknown problems.
  • Provided on-call support during weekends.
  • Monitored day-to-day loads, addressed and resolved production issues in a prompt and timely manner, and supported the ETL jobs running in production to meet SLAs.
  • Used various transformations like Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup, and Update Strategy while designing and optimizing mappings.
  • Prepared SQL Queries to validate the data in both source and target databases.
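One way the post-session validation steps above can be wired together is a reconciliation script that compares a source extract's row count against the count the load step recorded. This is a sketch under assumed file locations; in a real run the two inputs would come from the extract and the load audit step rather than being created in the script.

```shell
# Sketch: post-load row-count reconciliation. The work directory and both
# input files are hypothetical stand-ins created here for illustration.

WORK_DIR=${WORK_DIR:-/tmp/recon_demo}
mkdir -p "$WORK_DIR"

# Stand-ins: a 3-row source extract and the count recorded by the load step.
printf 'row\nrow\nrow\n' > "$WORK_DIR/src_extract.dat"
echo 3 > "$WORK_DIR/tgt_loaded.cnt"

SRC_CNT=$(wc -l < "$WORK_DIR/src_extract.dat")
TGT_CNT=$(cat "$WORK_DIR/tgt_loaded.cnt")

# Fail the job (in a real script, exit nonzero) when counts disagree.
if [ "$SRC_CNT" -eq "$TGT_CNT" ]; then
  STATUS=MATCH
else
  STATUS=MISMATCH
fi
echo "source=$SRC_CNT target=$TGT_CNT status=$STATUS"
```

A scheduler such as CA7 can key the downstream job on this script's exit status, so a mismatch stops the batch before bad data propagates.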

Environment: Informatica 9.5/8.6, Oracle 11g, SQL server 2008 R2, SQL, T-SQL, PL/SQL, Toad 10.6, SQL Loader, OBIEE, Unix, Flat files, Teradata

Confidential, Detroit, MI

Informatica ETL Developer

Responsibilities:

  • Performed logical and physical data modeling using Erwin for the data warehouse database in a star schema.
  • Worked in an Agile methodology: participated in daily/weekly team meetings, guided two groups of seven developers in Informatica PowerCenter, peer-reviewed their development work, and provided technical solutions. Proposed ETL strategies based on requirements.
  • Worked with health payer related data such as customers, policy, policy transactions, claims.
  • Generated weekly and monthly report status for the number of incidents handled by the support team.
  • Designed and developed complex mappings by using Lookup, Expression, Update, Sequence generator, Aggregator, Router, Stored Procedure, etc., transformations to implement complex ETL logics
  • Worked with Informatica powercenter Designer, Workflow Manager, Workflow Monitor and Repository Manager.
  • Used Source Analyzer and Warehouse designer to import the source and target database schemas, and the Mapping Informatica Designer to create complex mappings from Business requirements.
  • Created various transformations like filter, router, lookups, stored procedure, joiner, update strategy, expressions and aggregator to pipeline data to Data Warehouse/Data Marts and monitored the Daily and Weekly Loads.
  • Designed and developed various complex SCD Type 1/Type 2 mappings in different layers and migrated the code from Dev to Test to Prod. Wrote techno-functional documentation along with test cases to smooth project handover and maintain the SDLC.
  • Experience using stored procedures, TOAD, Explain Plan, ref cursors, constraints, triggers, indexes (B-tree and bitmap), views, inline views, materialized views, database links, and Export/Import utilities.
  • Developed and maintained ETL (Extract, Transformation and Loading) mappings to extract the data from multiple source systems like Oracle, SQL server and Flat files and loaded into Oracle.
  • Used different matching algorithms like Bigram, Edit, Jaro, Reverse, and Hamming Distance to determine the threshold values to identify and eliminate duplicate datasets and to validate, profile, and cleanse the data. Created/modified reference tables for valid data using the Analyst tool.
  • Developed Informatica Workflows and sessions for mappings using Workflow Manager.
  • Deployed the Informatica code and worked on code merge between two different development teams.
  • Identified the bottlenecks in the sources, targets, mappings, sessions and resolved the problems.
  • Created pre- and post-session UNIX scripts to merge flat files, create and delete temporary files, and rename files to reflect the generated date.
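The merge-and-rename step in the last bullet can be sketched roughly as follows; the directories and file names are hypothetical, and the incoming part files are created in-script purely for illustration.

```shell
# Sketch: pre-session merge of flat files, renaming the result to carry the
# generation date. Directory and file names are hypothetical placeholders.

IN_DIR=${IN_DIR:-/tmp/merge_demo/in}
OUT_DIR=${OUT_DIR:-/tmp/merge_demo/out}
RUN_DATE=$(date +%Y%m%d)

mkdir -p "$IN_DIR" "$OUT_DIR"

# Stand-ins for the incoming extract files.
echo "rec1" > "$IN_DIR/part_a.dat"
echo "rec2" > "$IN_DIR/part_b.dat"

# Merge all parts, then rename the merged file to reflect the generated date.
cat "$IN_DIR"/*.dat > "$OUT_DIR/merged.dat"
mv "$OUT_DIR/merged.dat" "$OUT_DIR/claims_${RUN_DATE}.dat"

# Remove the temporary part files once the merge has succeeded.
rm -f "$IN_DIR"/*.dat
echo "Created $OUT_DIR/claims_${RUN_DATE}.dat"
```

Stamping the date into the file name lets the session parameter file point at the current day's feed and keeps prior days' files distinguishable in the archive.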

Environment: Informatica PowerCenter Designer 9.5/8.6, Informatica Repository Manager, Oracle 10g/9i, DB2 6.1, Erwin, TOAD, Unix (SunOS), PL/SQL, SQL Developer, Teradata

Confidential

Informatica ETL Developer

Responsibilities:

  • Involved in business analysis and technical design sessions with business and technical staff to develop requirements document and ETL specifications.
  • Involved in designing dimensional modeling and data modeling using Erwin tool.
  • Created high-level Technical Design Document and Unit Test Plans.
  • Developed mapping logic using various transformations like Expression, Lookups (Connected and Unconnected), Joiner, Filter, Sorter, Update strategy and Sequence generator.
  • Wrote complex SQL override scripts at the source qualifier level to avoid Informatica Joiners and Lookups and improve performance, as the volume of data was heavy.
  • Responsible for creating workflows. Created Session, Event, Command, Control, Decision, and Email tasks in Workflow Manager.
  • Prepared user requirement documentation for mapping and additional functionality.
  • Extensively used ETL to load data using Powercenter from source systems like Flat Files into staging tables and load the data into the target database Oracle. Analyzed the existing systems and made a Feasibility Study.
  • Analyzed current system and programs and prepared gap analysis documents
  • Experience in performance tuning & optimization of SQL statements using SQL trace.
  • Involved in Unit, System integration, User Acceptance Testing of Mapping.
  • Supported the process steps under development, test and production environment
  • Participated in the technical design along with customer team, preparing design specifications, functional specifications and other documents.
  • Used Transformation Developer to create the reusable Transformations.
  • Used Informatica Powercenter Workflow manager to create sessions, batches to run with the logic embedded in the mappings.
  • Wrote SQL Scripts for the reporting requirements and to meet the Unit Test Requirements.
  • Validated the Mappings, Sessions & Workflows, Generated & Loaded the Data into the target database.
  • Used Informatica's features to implement Type I and II changes in slowly changing dimension tables, and developed complex mappings to facilitate daily, weekly, and monthly loading of data.
  • Extensively worked on Oracle SQL's for Data Analysis and debugging.
  • Handled scripts for pre-validating source file structures before loading into staging by comparing the source file headers against the baselined header.
  • Worked with Teradata utilities (MultiLoad, FastLoad, and Export/Import) to improve performance.
  • Wrote shell scripts and control files to load data into staging tables and then into Oracle base tables using SQL*Loader.
  • Used the pmcmd command to automate PowerCenter sessions and workflows through UNIX.
  • Involved in troubleshooting existing ETL bugs.
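The header pre-validation described above can be sketched as a small shell check; the paths and the pipe-delimited header layout are hypothetical, and the baseline and incoming files are created in-script as stand-ins for real feeds.

```shell
# Sketch: validate an incoming file's header against a baselined header before
# it is loaded into staging. Paths and the header layout are hypothetical.

BASE_DIR=${BASE_DIR:-/tmp/hdr_demo}
mkdir -p "$BASE_DIR"

# Stand-ins: the baselined header and an incoming feed file.
echo "POLICY_ID|CLAIM_ID|CLAIM_DT|AMOUNT" > "$BASE_DIR/baseline.hdr"
printf 'POLICY_ID|CLAIM_ID|CLAIM_DT|AMOUNT\n1001|55|2015-01-02|250.00\n' \
  > "$BASE_DIR/claims_feed.dat"

EXPECTED=$(cat "$BASE_DIR/baseline.hdr")
ACTUAL=$(head -1 "$BASE_DIR/claims_feed.dat")

if [ "$ACTUAL" = "$EXPECTED" ]; then
  HDR_CHECK=PASS    # layout unchanged: safe to load into staging
else
  HDR_CHECK=FAIL    # layout drifted: reject before it reaches staging
fi
echo "header check: $HDR_CHECK"
```

Rejecting a file on header drift is cheap insurance: it stops a silently re-ordered or widened source layout from loading misaligned columns into staging.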

Environment: Informatica PowerCenter 8.6, ETL, flat files, Oracle 10g, MS SQL Server 2008, PL/SQL, shell programming, TIBCO, SQL*Loader, Toad, Excel, Unix scripting, Sun Solaris, Windows 2002
