Informatica Analyst/Developer Resume
SUMMARY
- Around 8 years of experience across the complete software development life cycle (SDLC), with a strong background in ETL and in the development and implementation of data warehouses and data marts.
- Extensive experience as an ETL developer using Informatica PowerCenter and Pentaho.
- Extensively worked on dimensional modeling (star/snowflake schemas), data migration, slowly changing dimensions, data cleansing, and data staging of operational sources using ETL processes, providing data mining capabilities for data warehouses.
- Experience integrating Operational Data Stores (ODS) with multiple relational databases such as Oracle and SQL Server, and integrating data from fixed-width and delimited flat files.
- Actively involved in SQL performance tuning, ETL tuning, and error handling.
- Experience in implementing complex business rules by creating transformations and reusable transformations (Expression, Aggregator, Filter, connected and unconnected Lookups, Router, Rank, Joiner, Update Strategy) and developing complex mapplets.
- Experience in data modeling (logical and physical design for distributed databases, reverse engineering and forward engineering using ERwin).
- Involved in designing and building Pentaho ETL for Cognos analytics.
- Excellent knowledge of analyzing data dependencies using metadata stored in the Pentaho/Informatica repository and preparing batches of existing sessions for scheduling multiple sessions.
- Proficient in understanding business processes/requirements and translating them into technical requirements.
- Experience in building Oracle Data Warehouse from different OLTP systems.
- Strong experience with MS SQL, PL/SQL, T-SQL, Stored Procedures, Packages, and Triggers.
- Extensive experience with Change Data Capture (CDC).
- Experience in shell scripting for running ETL mappings and for file handling.
- Experience with databases including Oracle 12c/11g/10g and SQL Server 2005/2008.
- Developed Test Scripts, Test Cases, SQL QA scripts to perform Unit testing, System Testing and Load testing.
- Seamlessly migrated the Code from Development to Testing, UAT and Production.
- Experience with various domain areas such as Health Care, Pharmaceuticals, Finance, Supply Chain and Manufacturing.
- Expertise in preparing report specifications and database designs to support reporting requirements.
- Reviewed session log files to trace the causes of bottlenecks.
- Scheduled Pentaho/Informatica jobs and workflows using cron and Tidal (a sample wrapper script is sketched after this summary).
- Scheduled Informatica workflows using Tivoli.
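The following is a minimal sketch of the kind of cron-driven shell wrapper mentioned above: it stages inbound flat files, launches the ETL run, and archives the files on success. The paths, file patterns, and the run_etl.sh launcher name are illustrative placeholders, not details from any specific project.

#!/bin/sh
# Hypothetical nightly wrapper: stage inbound flat files, run the ETL, archive on success.
# Example crontab entry (2:00 AM daily):
#   0 2 * * * /apps/etl/nightly_load.sh >> /apps/etl/logs/nightly_load.log 2>&1

INBOUND=/data/inbound
STAGE=/data/stage
ARCHIVE=/data/archive
RUN_ETL=/apps/etl/run_etl.sh    # placeholder for the script that launches the mappings

# Move only completed files (those with a matching .done marker) into the staging area.
for f in "$INBOUND"/*.csv; do
    [ -f "$f.done" ] || continue
    mv "$f" "$STAGE"/ && rm -f "$f.done"
done

# Run the ETL; archive the staged files only if the load succeeds.
if "$RUN_ETL"; then
    ts=$(date +%Y%m%d)
    for f in "$STAGE"/*.csv; do
        [ -f "$f" ] || continue
        mv "$f" "$ARCHIVE/$(basename "$f").$ts"
    done
    exit 0
else
    echo "Nightly ETL failed at $(date)" >&2
    exit 1
fi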
TECHNICAL SKILLS
ETL Tools: Informatica PowerCenter, PowerConnect & PowerExchange, Informatica Cloud Services (ICS), Informatica Intelligent Cloud Services (IICS), Pentaho 4.4/5.4, NiFi (Google Cloud)
Data Analysis: Requirement Analysis, Business Analysis, Detailed Design, Data Flow Diagrams, Data Definition Tables, Business Rules, Data Modeling, Data Warehousing, System Integration
Job Scheduling Tools: Tidal, Tivoli, Control-M, cron, shell scripts
Data Modeling: ERwin 7.3/4.0, Rational Rose (UML), Visio 2007
Databases: Oracle 12c/11g/10g/9i/8i, SQL Server 2005, MS Access.
Reporting Tools: Cognos, Business Objects XI, MS SQL Server Reporting Services (SSRS)
Programming Skills: C, SQL, PL/SQL, SQL*Plus, Java, T-SQL, VBScript, JavaScript, DHTML, XML
Office & Version Control Tools: MS Office, MS Project, Subversion, Git
Scripting Languages: UNIX Shell Scripting, JavaScript
SQL Tools: Toad, SQL Developer, SQL*Plus
Operating Systems: Windows XP/NT/2000/2003/7/8, AIX, Linux, Sun Solaris
PROFESSIONAL EXPERIENCE
Confidential
Informatica Analyst/Developer
Responsibilities:
- Installed and configured Informatica 10.2 for a client that had never used Informatica before, setting up the software in their Development and Production environments.
- Took part in a proof of concept (POC) for NiFi and Google Cloud architecture.
- Configured Google Cloud connectivity (Cloud SQL Proxy, MySQL and BigQuery drivers).
- Created NiFi pipelines to load data from CSV files into Google Cloud Storage buckets and BigQuery.
- Created user accounts with different privilege levels, including administrator, development, and execution/monitoring-only access.
- Identified all new servers and databases required at Pixelle to prepare for migration and code deployment between environments.
- Interpreted technical specification documents, such as system design and detailed design documents, to develop Informatica mappings and workflows that load data into various targets.
- Designed and developed ETL processes using Informatica PowerCenter to load data from a wide range of sources such as SQL Server, SAP, flat files, and XML files.
- Migrated folders, mappings, and sessions from the development to the test environment, and created migration documents for moving code between environments.
- Based on the business logic, developed and modified mappings and mapplets to load data from various sources using transformations such as Source Qualifier, Filter, Expression, Lookup, Router, Update Strategy, Sorter, Normalizer, Aggregator, Joiner, SQL, and XML transformations.
- Used connected and unconnected Lookups with static and dynamic caches to implement business logic and improve performance.
- Created test cases for unit, system integration, and UAT testing to validate the data.
- Extensively worked on ETL performance tuning of data loads and worked with DBAs on SQL query tuning.
- Scheduled Informatica jobs through an in-house SQL-based scheduling program that restarts a failed workflow from the failed task/session.
- Developed PL/SQL stored procedures, views, and triggers to implement complex business logic to extract, cleanse, transform, and load data into the SQL Server database.
- Used session partitioning to improve database load times.
- Created sessions and used pre- and post-session properties to execute scripts and handle errors.
- Created BAT scripts to schedule Informatica workflows through the pmcmd command (a sample wrapper is sketched after this section).
- Extensively worked with the Workflow Manager and Workflow Monitor to monitor workflows, worklets, session logs, tasks, etc.
- Used Email, Control, Link, and Command tasks in Informatica workflows.
Environment: Informatica 10.2 (PowerCenter, Designer, Workflow Manager, Repository Manager, Monitor), Google Cloud, NiFi, PowerExchange, MS SQL Server 2016, SAP, flat files, PL/SQL, SQL*Scheduler, SQL
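Below is a minimal sketch of the pmcmd-based scheduling referenced above, written here as a UNIX shell script rather than the BAT scripts actually used; the Integration Service, domain, folder, and workflow names are placeholders. It relies only on pmcmd's documented startworkflow command; restarting from the exact failed task would additionally depend on how workflow recovery is configured.

#!/bin/sh
# Hypothetical wrapper that launches an Informatica workflow with pmcmd and
# retries once on failure. Service, domain, folder, and workflow names are placeholders.

PMCMD=/opt/informatica/server/bin/pmcmd
SV=INT_SVC_DEV          # Integration Service name
DOM=DOM_DEV             # Informatica domain name
FOLDER=FOLDER_SALES
WF=wf_load_sales

run_wf() {
    # -wait makes pmcmd block until the workflow finishes and return a
    # non-zero exit code if the workflow fails.
    "$PMCMD" startworkflow -sv "$SV" -d "$DOM" -u "$INFA_USER" -p "$INFA_PWD" \
             -f "$FOLDER" -wait "$WF"
}

if ! run_wf; then
    echo "$WF failed; attempting one restart..." >&2
    # With workflow recovery enabled, 'pmcmd recoverworkflow' could resume from
    # the failed task instead; here the whole workflow is simply rerun once.
    run_wf || { echo "$WF failed again; notifying support." >&2; exit 1; }
fi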
Confidential
Informatica Developer
Responsibilities:
- Responsible for requirements gathering, analysis, and end-user meetings.
- Responsible for business requirement documents (BRDs) and for converting functional requirements into technical specifications.
- Responsible for mentoring developers and reviewing code for mappings developed by other developers.
- Installed and configured Teradata PowerConnect for FastExport for Informatica.
- Extensively used Teradata utilities such as TPT, FastLoad, and MultiLoad to load data into the target database.
- Implemented change data capture (CDC) using Informatica's MD5 function.
- Created custom IDQ plans and incorporated them into PowerCenter as mapplets.
- Developed SSIS packages and database objects such as tables, triggers, and indexes using T-SQL, Query Analyzer, and Enterprise Manager.
- Developed SSIS packages and migrated them from the Dev to the Test and then to the Production environment.
- Created the physical layer, business model and mapping layer, and presentation layer in OBIEE.
- Extracted data from heterogeneous sources such as Oracle, SQL Server, and flat files and loaded it into data marts using Informatica.
- Worked on resolving data skew in Teradata.
- Used Informatica Data Quality (IDQ) to format data from sources and load it into target databases according to business requirements.
- Worked on creating Business Objects (BO) universes.
- Modified BO universes based on business requirements.
- Created Informatica Data Replication/Fast Clone jobs to extract data from Oracle and load it into Teradata.
- Extensively wrote Teradata BTEQ scripts (a sample invocation is sketched after this section).
- Created custom IDQ plans for product-name discrepancy checks and incorporated them as mapplets into PowerCenter.
- Worked with AutoSys to schedule Informatica jobs using JIL and CA Workload Control Center.
- Placed Informatica jobs on hold in AutoSys during production patching.
- Bulk loaded Teradata tables using the FastLoad, MultiLoad, and TPump utilities.
- Used IDQ's standardized plans for address and name cleanup.
- Extensively used active and passive transformations such as Filter, Router, Expression, Source Qualifier, Joiner, Lookup, Update Strategy, Sequence Generator, Rank, and Aggregator transformations.
- Responsible for best practices such as naming conventions, performance tuning, and error handling.
- Responsible for maintaining data quality and data consistency before loading into ODS.
- Responsible for performance tuning at the source, target, mapping, and session levels.
- Created Business Objects universes.
- Created a denormalized BO reporting layer for BO reports.
- Solid expertise in using both connected and unconnected Lookup transformations.
- Extensively worked with lookup caches such as static, dynamic, and persistent caches.
- Developed reusable transformations and reusable mapplets.
- Worked with shortcuts across shared and non-shared folders.
- Developed slowly changing dimension mappings for Type 1 and Type 2 SCDs.
- Retrieved data from SAP BW and loaded it into Oracle using PowerConnect.
- Responsible for implementing incremental loading mappings using mapping variables and parameter files.
- Responsible for identifying and resolving bottlenecks through performance tuning.
- Used Update Strategy expressions (DD_INSERT, DD_DELETE, DD_UPDATE, and DD_REJECT) to insert, delete, update, and reject records based on requirements.
- Worked with session logs and workflow logs for error handling and troubleshooting in all environments.
- Responsible for Unit Testing and Integration testing of mappings and workflows.
Environment: Informatica PowerCenter 10.2, Salesforce, Force.com, SAP BW, Teradata, Oracle 11g, MS SQL Server 2014, SSIS, TOAD, SQL, PL/SQL, SAP BO, Windows XP, AutoSys, UNIX
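The following is a minimal sketch of the BTEQ usage mentioned above, run from a UNIX shell wrapper so that a scheduler such as AutoSys can react to its return code; the TDPID, credentials, and schema/table names are illustrative placeholders and the statements are examples only.

#!/bin/sh
# Hypothetical wrapper that runs a Teradata BTEQ script via a here-document.
# TDPID, user, and table names below are placeholders.

bteq <<EOF
.LOGON tdprod/etl_user,${TD_PWD};

-- Move staged rows into the target table, then abort with a non-zero
-- return code if the insert failed.
INSERT INTO edw.sales_fact
SELECT * FROM stage.sales_fact_stg;

.IF ERRORCODE <> 0 THEN .QUIT 8;

SELECT COUNT(*) AS loaded_rows FROM edw.sales_fact;

.LOGOFF;
.QUIT 0;
EOF

# Propagate BTEQ's return code so the calling scheduler can flag failures.
exit $?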