- Highly proficient, versatile and resolution-focused ETL Architect/Lead Developer offering expertise in data warehousing architecture, especially with the Informatica suite, and experience in software analysis, design, development, implementation and management of technical solutions.
- Around 12 years of professional IT experience in data warehousing and development projects. Proven ability in developing and supporting successful projects using industry-leading data warehouse technologies and tools such as Informatica Power Center 10.x/9.x/8.x/7.x, Informatica Power Exchange 10.x/9.x, Informatica Data Quality (IDQ) 9.x, Teradata 14.x/12.x, Oracle 11g/10g, SQL Server, IBM DB2, Erwin and UNIX shell scripting.
- Worked on projects in the Banking, Insurance, Life Sciences and Retail domains at client locations across the globe.
- 8+ years of extensive experience in requirements gathering, ETL architecture, writing ETL specifications, dimensional data warehouse design, application integration, data migration, data analysis and implementation of data warehousing solutions using ETL concepts and tools.
- 11+ years of end-to-end SDLC experience, including analysis and scoping, design, development, QA, deployment and maintenance.
- 11+ years' experience using data warehousing ETL tools such as Informatica Power Center and Informatica Power Exchange, and 2+ years' experience using Informatica Data Quality (IDQ) tools (Developer and Analyst).
- Good knowledge of Big Data concepts and technologies, including HDFS, MapReduce, Pig and Hive.
- Used the Informatica Data Quality (IDQ) toolkit for profiling, analysis, standardization, cleansing, matching, conversion, exception management and defining DQ rules.
- Good knowledge of Informatica Master Data Management (MDM) components, including:
- Creation and configuration of Landing, Staging, Base Objects, Hierarchies, Foreign Key Relationships, Queries, Query Groups and Packages.
- Identification of Golden Record (BVT) by analyzing data, cleansing the data and merging duplicate records coming from different systems.
- Defining Match and Merge Rules in the MDM Hub by creating match path components, match strategy, match columns and rules.
- Executing Stage, Load, and Match/Merge jobs using the Batch Viewer and Batch Groups.
- Hands-on experience integrating SAP systems such as SAP ALE, SAP BAPI/RFC, SAP BI/BW and SAP R/3 (ECC) using Informatica Power Exchange.
- Hands-on experience integrating CRM and cloud applications (Salesforce and Web Services) using Informatica Power Exchange.
- Knowledge of Data Synchronization, Data Replication and Data Masking in Informatica Cloud.
- Hands-on experience creating mappings/mapplets/configuration tasks and scheduling mappings in Informatica Cloud.
- Hands-on experience using XML sources/targets in Informatica Power Center.
- Hands-on experience writing UNIX shell scripts, PL/SQL procedures/functions, and Teradata stored procedures, macros and BTEQ scripts.
- 5+ years of experience using Teradata SQL and Teradata utilities such as MultiLoad, FastLoad and TPump.
- Familiar with the usage of HP QC for defect tracking.
- 10+ years of experience in RDBMS and performance tuning of queries.
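A minimal sketch of the kind of BTEQ wrapper this Teradata/shell scripting work typically involves; the TDPID, logon values and table name are hypothetical placeholders, and the actual bteq invocation is commented out so the sketch stays self-contained:

```shell
#!/bin/sh
# Write a small BTEQ script that runs a check query and propagates a
# non-zero exit code on SQL errors. All names below are placeholders.
BTEQ_SCRIPT="${BTEQ_SCRIPT:-/tmp/load_check.bteq}"
cat > "$BTEQ_SCRIPT" <<'EOF'
.LOGON tdprod/etl_user,etl_password;
SELECT COUNT(*) FROM edw.sales_fact;
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

# Actual run (requires the Teradata bteq client):
# bteq < "$BTEQ_SCRIPT"
echo "BTEQ script written to $BTEQ_SCRIPT"
```

A scheduler can then treat BTEQ's exit code (8 on error here) as the job status.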
Data Warehousing Tools: Informatica Power Center 10.x/9.x, Informatica Power Exchange 10.x/9.x, Informatica Data Quality (IDQ) 9.x, Informatica Master Data Management (MDM) 9.x
BI Tools: MicroStrategy Developer 10.3.0
RDBMS: Oracle, DB2, Teradata 14/12, and MS-SQL Server
Data Modeling Tools: Erwin 4.2/4.0, MS Visio
Programming Languages: Core JAVA, SQL, PL/SQL and UNIX Shell Scripting
Operating Systems: Linux, AIX, UNIX, HP-UX, Sun Solaris, Windows and MS-DOS
Other Tools: SAP GUI, HP QC, Aqua Data Studio, Force.com (Salesforce), Teradata SQL Assistant, WinSQL, Toad, SQL Developer, Putty and PVCS
Scheduling Tools: IBM TWS, Tidal, StoneBranch and Autosys
Project Planning Tools: Microsoft Project
Sr. ETL Informatica/IDQ Lead/BI Consultant
- Contributed to the development of system requirements and design specifications.
- Created ETL specification documents from functional documents.
- Participated in the design and development of the dimensional model.
- Participated in creating database scripts using Erwin forward- and reverse-engineering techniques.
- Extracted data from SAP R/3 (ECC) using ABAP integration with Informatica Power Exchange.
- Created ETL framework to reuse the code across all plants.
- Created DQ Rules, data/custom profiles and score cards using profile results.
- Created DQ mappings for data cleansing and validations, using different transformations.
- Integrated DQ mappings with power center mappings.
- Created Informatica objects and Scheduled in IBM TWS.
- Created and modified UNIX shell scripts to run the jobs using framework.
- Created database views, materialized views (MVs) and stored procedures to refresh the MVs as and where needed.
- Participated in Database objects, ETL and Tivoli Jobs migration to QA and Production environments.
- Interacted with various business teams, users and provided support in all the phases of project.
Environment: Informatica Power Center 10.x, 9.x, Informatica Power Exchange for SAP NetWeaver, IDQ 10.x, MicroStrategy 10.3.0, Unix Shell Scripting, SQL Server, Oracle, SAP R/3, Putty, WinScp, SQL Developer and Tivoli Workload Scheduler(TWS)
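The framework scripts used to run Informatica jobs in this project typically wrap the PowerCenter pmcmd client; the sketch below builds such a startworkflow command, with the service, domain, user and workflow names as hypothetical placeholders:

```shell
#!/bin/sh
# Hypothetical framework wrapper: builds the pmcmd command that starts a
# PowerCenter workflow and waits for it to finish, so the scheduler (e.g.
# IBM TWS) sees the real exit code. All names are placeholders.
INFA_SERVICE="${INFA_SERVICE:-IS_DEV}"
INFA_DOMAIN="${INFA_DOMAIN:-Domain_DEV}"
INFA_USER="${INFA_USER:-infa_user}"

build_pmcmd_cmd() {
    folder="$1"
    workflow="$2"
    # -pv reads the password from the INFA_PASSWD environment variable,
    # keeping credentials out of the script itself.
    echo "pmcmd startworkflow -sv $INFA_SERVICE -d $INFA_DOMAIN -u $INFA_USER -pv INFA_PASSWD -f $folder -wait $workflow"
}

# Print the command a real framework would eval/execute:
build_pmcmd_cmd SALES_DWH wf_load_sales
```

A real framework would execute the built command and branch on its exit status for restart/notification handling.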
Sr. SAP-Informatica Integration/IDQ Consultant
- Extracted sales files residing on the Burgeon server and transmitted ARReceipt files to Burgeon using SFTP.
- Created/modified UNIX scripts to invoke the SFTP process and to upload/extract data to/from the Informatica server.
- Created ETL Inbound and Outbound mappings to generate IDOCs using SAP/ALE Integration with Informatica Power Exchange.
- Created SAP/ALE IDoc Preparer and Interpreter Transformations to process the segment data from upstream/downstream transformations.
- Extracted data from SAP R/3 systems to generate Checks and Charge back files using Informatica Power Exchange ABAP Integration.
- Extensively used Informatica data quality tools (Developer & Analyst) for analyzing, standardizing, cleansing, matching, conversion, exception management, reporting and monitoring the data.
- Created custom profiles and score cards using Informatica Analyst tool.
- Created DQ rules using expressions and mapplets and reused in multiple profiles.
- Created mappings in Developer for duplicate analysis and profiling.
- Integrated Informatica developer mappings with power center client.
- Created jobs in workflow manager and scheduled using Tivoli.
Environment: Informatica Power Center 9.1.0, Informatica Power Exchange 9.1.0, Informatica Developer Tool 9.1, UNIX Shell Scripting, SAP GUI, Putty, Tivoli and Flat Files.
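The SFTP pull/push pattern described in this project can be sketched with a non-interactive sftp batch file; the host and remote paths below are hypothetical placeholders, and the actual sftp call is commented out:

```shell
#!/bin/sh
# Build an sftp batch file so the transfer runs non-interactively
# (key-based authentication assumed). Host and paths are placeholders.
REMOTE_HOST="${REMOTE_HOST:-burgeon.example.com}"
STAGE_DIR="${STAGE_DIR:-/tmp/infa_stage}"
mkdir -p "$STAGE_DIR"

BATCH="$STAGE_DIR/sftp_batch.txt"
cat > "$BATCH" <<EOF
cd /outbound/sales
lcd $STAGE_DIR
get Sales_*.dat
put $STAGE_DIR/ARReceipt_*.dat /inbound/ar
bye
EOF

# Actual transfer (commented out so the sketch stays self-contained):
# sftp -b "$BATCH" etl_user@"$REMOTE_HOST"
echo "batch file written: $BATCH"
```

With `-b`, sftp aborts on the first failing command, so the wrapper script can detect a failed transfer from the exit code.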
Sr. Informatica Developer
- Designed ETL mappings/sessions/workflows for the monthly feeds to extract data from Core Bank DWH and generate files to provide to EDH.
- Involved in logical/physical data modeling of the Bank's HQA DWH.
- Interacted and worked closely with HQA Business users in data validations and reconciliation of OFSA 4.5 and OFSAA 6.5 systems.
- Involved in SIT, UAT, Pre and Post PROD deployment activities.
- Scheduled and Monitored workflows on the TIDAL server.
- Worked on performance tuning of the existing ETL framework (OFSA 4.5).
Environment: Informatica Power Center 9.1.0, UNIX Shell Scripting, DB2, Oracle, TOAD, Putty and Tidal.
Confidential, Basking Ridge, NJ
Sr. Informatica/IDQ Developer
- Responsible for understanding requirements in Interface Control document (ICD) and creating ETL design documents.
- Responsible for ETL coding, unit testing and UNIX shell scripting.
- Developed data integrity reports through Column Profiling using Informatica Analyst and mappings in Informatica Developer.
- Used different Informatica Developer (IDQ) transformations such as Case, Comparison, Key Generator, Match, Parser, Standardizer, Weight, Exception, Rule-Based Analyzer, Lookup, SQL and Expression to create IDQ mappings.
- Imported mapplets and mappings from Informatica developer (IDQ) to Power Center.
- Generated XML target files using Informatica Power Center.
- Used the Informatica Web Services Consumer transformation and Power Center Web Services Hub to interact with web service cloud applications/clients using WSDL files, in the form of SOAP messages.
- Used Informatica Cloud applications (Data Synchronization and Data Replication) to pull Salesforce data (Accounts and Customers specifically) into the Oracle staging area.
- Responsible for code reviews and deployment activities.
- Responsible for providing support in SIT, Pre-Production and Production environments.
Environment: Informatica Power Center 9.5.1, Informatica Developer\Analyst 9.5.1, Informatica Power Exchange for Web Services, XML, Oracle 11.2.0, UNIX Shell Scripting, Oracle SQL Developer and Putty.
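The Web Services Consumer interaction described in this project amounts to POSTing SOAP envelopes built from a WSDL; a hedged sketch with a hypothetical endpoint, operation and payload:

```shell
#!/bin/sh
# Build a SOAP 1.1 request envelope of the kind a Web Services Consumer
# transformation sends. The namespace, operation, account id and endpoint
# are all hypothetical placeholders, not values from a real service.
ENVELOPE="${ENVELOPE:-/tmp/get_account.xml}"
cat > "$ENVELOPE" <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                  xmlns:acc="http://example.com/account">
  <soapenv:Body>
    <acc:getAccount>
      <acc:accountId>12345</acc:accountId>
    </acc:getAccount>
  </soapenv:Body>
</soapenv:Envelope>
EOF

# Actual call (commented out so the sketch stays self-contained):
# curl -s -X POST -H 'Content-Type: text/xml; charset=utf-8' \
#      -H 'SOAPAction: "getAccount"' \
#      --data @"$ENVELOPE" https://example.com/ws/AccountService
echo "SOAP envelope written to $ENVELOPE"
```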
ETL Architect/Sr. Informatica Developer
- Managed requirements and design phase towards identifying mutually agreed solutions with technical teams and business partners.
- Responsible for providing requirements, functional knowledge and design documents to the ETL team for development.
- Responsible for converting requirements into comprehensive detailed design specifications (integration, databases, data flows, transformations, interfaces, etc.) for solutions.
- Responsible for creating the project plan and working with the customer on a regular basis to show progress at each stage.
- Responsible for all ETL deliverables in the SDLC methodology, such as the integrated functional specification, source-to-target mapping sheet, design, coding, unit testing and deployments.
- Integrated Informatica with the SAP R/3 system using ABAP programs to extract data.
- Integrated Informatica with the SAP BI system to extract data from Open Hubs using Power Exchange.
- Integrated Informatica with the Salesforce cloud to extract customer flags and RFM data.
- Integrated Informatica with Web Services using Power Exchange to extract data.
- Other major source systems used were Teradata, Oracle, SQL Server, XML and flat files.
- Extensively used the Teradata utilities MultiLoad, FastLoad, TPump and TPT, and BTEQ scripts to load data into the data warehouse.
- Created UNIX shell scripts for running Informatica workflows, dynamic parameter file creation, indirect file creation, file watching and email notification.
- Responsible for code reviews and data validations.
- Responsible for driving the team to meet project objectives by making decisions proactively.
- Responsible for proactively identifying risks, assumptions and issues, and providing solutions/suggestions on time.
- Responsible for preparing the migration plan and coordinating with the respective teams.
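Two of the framework utilities mentioned above, dynamic parameter files and indirect source file lists, can be sketched as follows; the folder, workflow and directory names are illustrative placeholders only:

```shell
#!/bin/sh
# Generate (1) a dynamic PowerCenter parameter file and (2) an indirect
# (file-list) source file for one run. All names are placeholders.
RUN_DATE=$(date +%Y%m%d)
PARAM_FILE="${PARAM_FILE:-/tmp/wf_load_sales.par}"
SRC_DIR="${SRC_DIR:-/tmp/inbound/sales}"
mkdir -p "$SRC_DIR"

# Dynamic parameter file: [folder.WF:workflow] is the section header
# format PowerCenter expects; $$ names are mapping parameters.
{
  echo "[SALES_DWH.WF:wf_load_sales]"
  echo "\$\$RUN_DATE=$RUN_DATE"
  echo "\$\$SRC_DIR=$SRC_DIR"
} > "$PARAM_FILE"

# Indirect file: a list of same-layout source files consumed in a single
# session run (source filetype set to Indirect in the session).
LIST_FILE="${LIST_FILE:-/tmp/sales_filelist.txt}"
ls "$SRC_DIR"/sales_*.dat > "$LIST_FILE" 2>/dev/null || : > "$LIST_FILE"

echo "wrote $PARAM_FILE and $LIST_FILE"
```

The workflow is then started with this parameter file (e.g. via pmcmd's `-paramfile` option), so each run picks up the current date and source directory.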