ETL/Informatica Technical Lead Resume
Chicago, IL
SUMMARY
- 10+ years of IT experience, including 8 years in the analysis, design, development, maintenance and testing of BI and data warehousing projects.
- Hands-on experience with the Informatica Data Quality (IDQ) toolset; proficient in IDQ development around data profiling, cleansing, parsing, Address Validator, standardization, validation, matching, and data quality exception monitoring and handling.
- Implemented data masking for files and relational databases.
- Extensive experience with the Informatica platform as a whole and with the integration among its components and services, such as IDQ, MDM, B2B and the Analyst tool.
- Good experience in reference data and metadata management.
- Extensive experience in data warehousing and in implementing ETL (Extraction, Transformation, Loading) using Informatica Power Center 9.1/9.0/8.6/8.1/7.2/7.1.
- Strong knowledge of all phases of the Software Development Life Cycle (SDLC), including development, testing and maintenance, on platforms such as UNIX and Windows NT/2000/XP/98.
- Understanding of recent financial regulatory initiatives such as CCAR.
- Functional and Technical expertise in Decision Support Systems - Data Warehousing, ETL (Extract Transform and Load) using Informatica
- Developed complex mappings using varied transformation logic such as unconnected/connected Lookups, Router, Filter, Expression, Aggregator, Joiner, Update Strategy and more in Informatica.
- Expertise in ETL (Extract, Transform, Load) and administration using Informatica.
- Well versed in the architectures of Informatica Power Center 8.x and 9.x.
- Experience with Informatica and Teradata utilities in Extraction, Transformation and Loading (ETL).
- Exposure to ERWIN as a Data Modeling Tool.
- Knowledge of Data warehousing concepts and Dimensional modeling (Star schema, Snowflake schema).
- Demonstrated experience implementing Informatica Data Quality as a service in a SOA or Web Services architecture where it is called from one or more production applications to perform Data Quality services.
- Expertise in OLTP/OLAP System Study, Analysis and E-R modeling, developing Database Schemas like Star schema and Snowflake schema used in relational, dimensional and multidimensional modeling
- Good understanding of dimensional modeling and relational database management systems.
- Experience in Teradata with TPT, FastLoad, MultiLoad, BTEQ and macros.
- Experience in troubleshooting and in implementing performance tuning at various levels, such as source, target, mapping, session and system, in the ETL process.
- Experience with TOAD to analyze the tables, create indexes and import data from different schemas.
- Conducted Functionality, Integration, System and UAT testing and investigated software bugs.
- Worked on UNIX shell scripts used in scheduling Informatica pre/post session operations.
- Scheduled jobs and automated the workload using the Tidal Job Scheduler.
- Proficient with wide variety of sources like Oracle, Teradata and Flat Files
- Proficient in coding Oracle and SQL Server Stored Procedures, Triggers, Cursors and Indexes.
- Strong interpersonal and excellent communication skills. Capable of conveying technical terms and solution proposals clearly
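The UNIX shell scripting around Informatica job scheduling mentioned above typically amounts to a thin wrapper that a scheduler such as Tidal invokes. A minimal sketch follows; the domain, integration service, folder and workflow names are all hypothetical, and the pmcmd call is echoed rather than executed:

```shell
#!/bin/sh
# Sketch of a scheduler-invoked wrapper around Informatica's pmcmd utility.
# All names below (domain, integration service, folder, workflow) are
# placeholders, and the command is echoed instead of run.
INFA_DOMAIN="Domain_DEV"
INFA_SERVICE="IS_DEV"
FOLDER="DW_STAGE"
WORKFLOW="wf_stg_load"

# -wait blocks until the workflow finishes, so the scheduler sees a real exit code
CMD="pmcmd startworkflow -sv $INFA_SERVICE -d $INFA_DOMAIN -f $FOLDER -wait $WORKFLOW"
echo "$CMD"
```

In production the final echo would be replaced by executing the command and propagating pmcmd's exit status back to the scheduler.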
TECHNICAL SKILLS
ETL Tools: Informatica 9.x/8.x/7.x (Power Center), MDM 9.1, IDQ 9.6, B2B Data Transformation, Big Data, Hadoop, Splunk, Hive, Pig, MongoDB, Informatica BDE
Reporting tools: OBIEE 10.x
Languages: C, SQL, PL/SQL, C++, Shell Scripting.
Databases: Oracle 11g/10g/9i, Teradata, Netezza 7.x, DB2
Operating System: Windows 9x/NT/2000/XP, UNIX
Data Modeling Tools: Erwin 4.x
Scheduling tools: TIDAL, Autosys
Office Applications: MS-Office 97/2000/XP
Other Tools: TOAD 8.5 and SQL*Plus (database tools).
PROFESSIONAL EXPERIENCE
Confidential, Chicago, IL
ETL/Informatica Technical Lead
Responsibilities:
- As the lead Data Quality developer on this team, initiated data profiling across different formats of data from different sources and analyzed its dimensions to determine its actual structure and the rules to implement as part of standardization.
- Validated, standardized and cleansed data by implementing various business rules.
- Used IDQ's standardized plans for address and name cleanups.
- Implemented match and merge rules in Informatica MDM to find duplicates and identify the golden record.
- Involved in analyzing the requirements and in the design and development of the stage and target layers.
- Prepared HLD and ETL specs based on mapping documents.
- Developed code based on the specs provided by Data Architects.
- Conducted TSD walkthroughs with client tech leads, team leads, data architects and DBAs.
- Developed code and executed UTCs.
- Conducted weekly status meetings to discuss the week's progress.
- Created the job setup document, with all properties, for the job scheduling tool.
- Implemented performance tuning techniques using partitions, macros, etc.
- Well versed in converting various kinds of unstructured data into standard structured data (XML).
- Implemented Performance Tuning of SQL Queries, Sources, Targets and sessions by identifying and rectifying performance bottlenecks.
- Provided technical clarifications, assigned work to the offshore team and monitored their status daily.
- Worked on Informatica tools like Source Analyzer, Target Designer, Mapping Designer, Workflow Manager, and Workflow Monitor.
- Used various transformations such as Joiner, Aggregator, Expression, Lookup, Filter, Union, Update Strategy, Stored Procedure and Router to implement complex logic while coding a mapping.
- Extracted data from Flat files, SQL Server, Oracle into Teradata.
- Implemented different Tasks in workflows which included Session, Command, E-mail, Event-Wait etc.
- Tested all mappings and sessions in the Development and UAT environments, then migrated them to the Production environment after successful validation.
Environment: Informatica Power Center 9.6, IDQ 9.6.1, MDM, Oracle, SQL Server, Flat Files, Teradata 13.x, UNIX, Windows XP.
Confidential, New York,NY
ETL/Informatica Technical Lead
Responsibilities:
- Involved in massive data profiling using IDQ (Analyst Tool) prior to data staging.
- Used IDQ's standardized plans for address and name cleanups.
- Worked on IDQ file configuration on users' machines and resolved issues.
- Implemented File and Data masking.
- Used IDQ for initial data profiling and duplicate removal.
- Involved in designing the dimensional model and created the star schema using ER/Studio.
- Involved in analyzing the requirements and in the design and development of the stage and target layers.
- Prepared HLD and ETL specs based on mapping documents.
- Worked on various IDQ transformations such as Address Validator and Standardizer.
- Developed code based on the specs provided by Data Architects.
- Conducted TSD walkthroughs with client tech leads, team leads, data architects and DBAs.
- Developed code and executed UTCs.
- Conducted weekly status meetings to discuss the week's progress.
- Created the job setup document, with all properties, for the job scheduling tool.
- Implemented performance tuning techniques using partitions, macros, etc.
- Well versed in converting various kinds of unstructured data into standard structured data (XML).
- Implemented Performance Tuning of SQL Queries, Sources, Targets and sessions by identifying and rectifying performance bottlenecks.
- Provided technical clarifications, assigned work to the offshore team and monitored their status daily.
- Worked on Informatica tools like Source Analyzer, Target Designer, Mapping Designer, Workflow Manager, and Workflow Monitor.
- Used various transformations such as Joiner, Aggregator, Expression, Lookup, Filter, Union, Update Strategy, Stored Procedure and Router to implement complex logic while coding a mapping.
- Extracted data from flat files and DB2 and loaded it into DB2.
- Implemented different Tasks in workflows which included Session, Command, E-mail, Event-Wait etc.
- Implemented solutions supporting recent financial regulatory initiatives such as CCAR.
- Tested all mappings and sessions in the Development and UAT environments, then migrated them to the Production environment after successful validation.
Environment: Informatica Power Center 9.5, IDQ, DB2, Flat Files, UNIX, Windows XP.
Confidential, Los Angeles, CA
Informatica Technical Lead
Responsibilities:
- Involved in analyzing the requirements and in the design and development of the stage and target layers.
- Prepared HLD and ETL specs based on mapping documents.
- Developed code based on the specs provided by Data Architects.
- Conducted TSD walkthroughs with client tech leads, team leads, data architects and DBAs.
- Developed code and executed UTCs.
- Replaced Trillium rules with IDQ rules
- Developed BTEQ and FastLoad scripts, macros and partitions.
- Worked on various IDQ transformations such as Address Validator, Parser and Standardizer.
- Created mapplets and mappings in IDQ.
- Worked extensively on the mapper, parser, serializer, streamer, UDT, etc.
- Conducted weekly status meetings to discuss the week's progress.
- Deployed Metadata Manager capabilities such as metadata connectors for data integration visibility, advanced search and browse of the metadata catalog, data lineage, and visibility into data objects, rules, transformations and reference data.
- Used IDQ to profile the project source data, define or confirm the definition of the metadata, cleanse and accuracy check the project data, check for duplicate or redundant records, and provide information on how to proceed with ETL processes.
- Created the job setup document, with all properties, for the job scheduling tool.
- Implemented performance tuning techniques using partitions, macros, etc.
- Well versed in converting various kinds of unstructured data into standard structured data (XML).
- Implemented performance tuning of SQL queries, sources, targets and sessions by identifying and rectifying performance bottlenecks.
- Integrated MDM with IDQ using web services
- Provided technical clarifications, assigned work to the offshore team and monitored their status daily.
- Framed various IDQ rules based on business requirements.
- Tested all mappings and sessions in the Development and UAT environments, then migrated them to the Production environment after successful validation.
- Moved bulk data from various sources into the Teradata database using BTEQ scripts.
Environment: Informatica Power Center 9.1, IDQ, Teradata 13.x, Flat Files, UNIX, Windows XP.
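Bulk loads via BTEQ, as in the bullets above, are commonly driven from a UNIX shell script that feeds a here-document to the bteq client. The sketch below only generates the script, since actually running it would require a live Teradata system; the logon string, databases and tables are illustrative:

```shell
#!/bin/sh
# Sketch: generate a BTEQ script for an insert-select from a staging table.
# The logon string, databases and tables are placeholders; in production the
# generated file would be fed to the bteq client.
BTEQ_SCRIPT="load_sales.bteq"

cat > "$BTEQ_SCRIPT" <<'EOF'
.LOGON tdprod/etl_user,password;
DATABASE DW_STAGE;
INSERT INTO DW_TGT.SALES_FACT
SELECT * FROM DW_STAGE.SALES_STG;
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

# Production invocation (not executed here):  bteq < "$BTEQ_SCRIPT"
echo "generated $BTEQ_SCRIPT"
```

The `.IF ERRORCODE <> 0 THEN .QUIT 8` line is what lets a scheduler distinguish a failed load from a clean one via the exit code.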
Confidential, New York
ETL/Informatica Technical Lead
Responsibilities:
- Involved in analyzing the requirements, designing and development of data warehouse environment.
- Worked on various IDQ transformations such as Address Validator and Standardizer.
- Created mapplets and mappings in IDQ.
- Used IDQ to profile the project source data, define or confirm the definition of the metadata, cleanse and accuracy check the project data, check for duplicate or redundant records, and provide information on how to proceed with ETL processes.
- Prepared TSD and ETL specs based on the BSD and mapping document.
- Conducted TSD walkthroughs with client tech leads, team leads, data architects and DBAs.
- Developed code and executed UTCs.
- Conducted weekly status meetings to discuss the week's progress.
- Created the job setup document for the job scheduling tool.
- Provided technical clarifications, assigned work to the offshore team and monitored their status daily.
- Worked on Informatica tools like Source Analyzer, Target Designer, Mapping Designer, Workflow Manager, and Workflow Monitor.
- Used various transformations such as Joiner, Aggregator, Expression, Lookup, Filter, Union, Update Strategy, Stored Procedure and Router to implement complex logic while coding a mapping.
- Extracted data from flat files and Oracle and loaded it into Oracle and Netezza.
- Implemented different Tasks in workflows which included Session, Command, E-mail, Event-Wait etc.
- Worked extensively on the mapper, parser, serializer, streamer, UDT, etc.
- Worked with complex queries for data validation and reporting using SQL and PL/SQL.
- Loaded bulk data from flat files into the Netezza database.
- Wrote the BTEQ scripts needed for the business specifications.
- Well versed in converting various kinds of unstructured data into standard structured data (XML).
- Involved in performance tuning using partitions for sessions and relational database tables.
- Involved in Performance Tuning of SQL Queries, Sources, Targets and sessions by identifying and rectifying performance bottlenecks.
Environment: Informatica Power Center 9.1, IDQ 9.1, Netezza 7.x, Oracle 11g/9i, Flat Files, UNIX, Tidal, B2B DT Studio 9.x
Confidential
ETL Technical Lead
Responsibilities:
- Involved in analyzing the requirements, designing and development of data warehouse environment.
- Prepared TSD and ETL specs based on the BSD and mapping document.
- Conducted TSD walkthroughs with client tech leads, team leads, data architects and DBAs.
- Developed code and executed UTCs.
- Conducted walkthroughs and created deployment documents to be run before moving to SIT, UAT and Production.
- Conducted weekly status meetings to discuss the week's progress.
- Created the job setup document for the job scheduling tool.
- Provided technical clarifications, assigned work to the offshore team and monitored their status daily.
- Moved bulk data from various sources into the Teradata database using BTEQ scripts.
- Wrote the BTEQ scripts needed for the business specifications.
- Transferred data using Teradata utilities such as SQL Assistant, FastLoad and MultiLoad.
- Involved in Performance Tuning of SQL Queries, Sources, Targets and sessions by identifying and rectifying performance bottlenecks.
- Worked on Tidal to automate the Workflows.
- Tested all mappings and sessions in the Development and UAT environments, then migrated them to the Production environment after successful validation.
- Monitored the performance of workflows and sessions and suggested performance tuning improvements after consulting with the WellPoint team.
- Provide support for development, testing and implementation for technology upgrade projects (hardware and software)
- Involved in knowledge transitions from Development teams and Lights On team regarding changes to existing applications in Production.
Environment: Informatica Power Center 9.0.1, Teradata 13.x, MDM, Oracle 11g/9i, Flat Files, UNIX, Windows XP
Confidential
ETL/Informatica Developer
Responsibilities:
- Involved in the Requirements gathering, Analysis and implementation of Data Warehousing efforts.
- Prepared Transformation Specifications for each and every mapping.
- Implemented ETL designs.
- Extensively used Informatica to load data from different data sources into the Oracle data warehouse.
- Imported Sources and Targets to create Mappings and developed Transformations using Designer.
- Created mappings and transformations using joiner, expression, aggregator, filter, router, sorter, sequence generator, update strategy and lookup.
- Implemented performance tuning in Oracle queries using different techniques.
- Good experience with performance tuning in Oracle using indexes such as bitmap and B-tree indexes.
- Prepared test cases for unit testing.
- Raised the requests needed to migrate code across environments and provided the required documentation and DB scripts to the concerned teams.
- Coordinated with Developers, Testing team and DBA for the System testing and Performance testing.
- Was always available & accessible to team members in understanding the warehouse and for technical issues.
Environment: Informatica 8.6.1, Oracle 9i, OBIEE 10.1.3.4 and UNIX
Confidential
ETL/Informatica Consultant
Responsibilities:
- Involved in gathering and analyzing the requirements and preparing business rules.
- Developed and maintained ETL (data extraction, transformation and loading) mappings to extract data from multiple source systems, such as SAP and flat files, and load it into Oracle.
- Implemented different Tasks in workflows which included Session, Command and E-mail etc.
- Generated queries using SQL to check for consistency of the data in the tables and to update the tables as per the requirements
- Used pre-session and post-session scripts for dropping and recreating indexes on target table.
- Tuned the Sources, Targets, Transformations and Mapping to remove bottlenecks for better performance.
- Involved in fixing Invalid Mappings, testing of Stored Procedures and Functions, Performance and Unit Testing of Informatica Sessions, Batches and Target Data.
- Worked with developers to troubleshoot and resolve mapping/workflow/session logic as well as performance issues in the Development, Test and Production repositories.
Environment: Informatica 7.1.2, Oracle 8i, Flat Files and UNIX.
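The pre- and post-session scripts for dropping and recreating target-table indexes, mentioned in the bullets above, usually wrap a couple of SQL statements. A minimal sketch follows in which the table, index and column names are hypothetical and the SQL is printed rather than submitted to SQL*Plus:

```shell
#!/bin/sh
# Sketch of pre-/post-session index maintenance around a bulk load.
# Table, index and column names are placeholders; the SQL is echoed
# instead of being piped to sqlplus.
TABLE="ORDERS_TGT"
INDEX="IDX_ORDERS_DT"

PRE_SQL="DROP INDEX $INDEX;"                            # pre-session: drop before the load
POST_SQL="CREATE INDEX $INDEX ON $TABLE (ORDER_DATE);"  # post-session: recreate after the load

echo "pre : $PRE_SQL"
echo "post: $POST_SQL"
```

Dropping the index before a bulk insert and rebuilding it afterwards avoids maintaining the index row by row during the load, which is the performance motivation behind this pattern.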
Confidential
ETL/Informatica Consultant
Responsibilities:
- Involved in gathering and analyzing the requirements and preparing business rules.
- Developed and maintained ETL (data extraction, transformation and loading) mappings to extract data from multiple source systems, such as SAP and flat files, and load it into Oracle.
- Implemented different Tasks in workflows which included Session, Command and E-mail etc.
- Generated queries using SQL to check for consistency of the data in the tables and to update the tables as per the requirements
- Used pre-session and post-session scripts for dropping and recreating indexes on target table.
- Tuned the Sources, Targets, Transformations and Mapping to remove bottlenecks for better performance.
- Involved in fixing Invalid Mappings, testing of Stored Procedures and Functions, Performance and Unit Testing of Informatica Sessions, Batches and Target Data.
- Worked with developers to troubleshoot and resolve mapping/workflow/session logic as well as performance issues in the Development, Test and Production repositories.
Environment: Informatica 7.1.1, Oracle 8i, Flat Files and UNIX.