Senior ETL Informatica Developer Resume
Mountain View, CA
SUMMARY
- 10 years of strong experience in data warehousing applications, including analysis, architecture, design, development, support and ETL architecture using Informatica Power Center.
- Experience in various stages of the System Development Life Cycle (SDLC) and approaches such as Agile.
- Strong experience in installation and configuration of Informatica Power Center 10.1/9.6.1.
- Working knowledge of Informatica Power Exchange to support Data Integration requirements.
- Experienced with Informatica Power Exchange for Loading/Retrieving data from mainframe systems.
- Experience in using Informatica Power Center tools such as Source Analyzer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Manager, Workflow Monitor and Repository Manager.
- Experience in using PostgreSQL, Oracle 10g/9i/8i, Netezza, MS SQL Server 2005, MS Access 2007, SQL, T-SQL, and PL/SQL.
- Worked with dimensional data warehouses in Star and Snowflake schemas and slowly changing dimensions; created slowly growing target mappings and Type 1/2/3 dimension mappings.
- Worked on web services as part of Informatica and established successful connections to external web service calls. Experienced with Git and GitHub for version control.
- Proficient in transforming data from various sources (flat files, XML, Oracle) to Data warehouse using ETL tools.
- Extensively worked on transformations such as Source Qualifier, Joiner, Filter, Router, Expression, Lookup, Aggregator, Sorter, Normalizer, Update Strategy, Sequence Generator and Stored Procedure transformations.
- Extensive experience in developing Informatica mappings/mapplets using various transformations for extraction, transformation and loading of data from multiple sources to the data warehouse, creating workflows with worklets and tasks, and scheduling workflows.
- Extensively involved in resolving Informatica and database performance issues; expertise in error handling, debugging and problem fixing in Informatica.
- Experience in logical and physical data modeling using E/R Studio. Expertise in using SQL*Loader to load data from external files into Oracle databases.
- Extensive experience in writing PL/SQL program units such as triggers, procedures, functions and packages in Unix and Windows environments.
- Exposure to on-shore and off-shore support activities.
- Excellent verbal and written communication skills, a clear understanding of business procedures and the ability to work individually or as part of a team.
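The Type 2 slowly changing dimension work mentioned above (expire the current row, insert a new version) can be sketched as follows. This is an illustrative example only: it uses Python with SQLite as a stand-in for the warehouse database, and the table and column names (`dim_customer`, `eff_from`, `is_current`) are hypothetical, not taken from any project described here.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table with the effective-dating columns typical of a Type 2 SCD.
cur.execute("""
    CREATE TABLE dim_customer (
        customer_key INTEGER PRIMARY KEY AUTOINCREMENT,
        customer_id  TEXT,
        city         TEXT,
        eff_from     TEXT,
        eff_to       TEXT,      -- NULL means the row is current
        is_current   INTEGER
    )
""")

def apply_scd2(customer_id, city, load_date):
    """Expire the current row if the tracked attribute changed, then insert a new version."""
    row = cur.execute(
        "SELECT customer_key, city FROM dim_customer "
        "WHERE customer_id = ? AND is_current = 1", (customer_id,)
    ).fetchone()
    if row and row[1] == city:
        return                      # no change: nothing to do
    if row:                         # change detected: close out the old version
        cur.execute(
            "UPDATE dim_customer SET eff_to = ?, is_current = 0 WHERE customer_key = ?",
            (load_date, row[0]),
        )
    cur.execute(
        "INSERT INTO dim_customer (customer_id, city, eff_from, eff_to, is_current) "
        "VALUES (?, ?, ?, NULL, 1)",
        (customer_id, city, load_date),
    )

apply_scd2("C001", "Denver", "2020-01-01")
apply_scd2("C001", "Austin", "2020-06-01")   # city change -> a second version
versions = cur.execute(
    "SELECT city, is_current FROM dim_customer WHERE customer_id = 'C001' "
    "ORDER BY customer_key"
).fetchall()
print(versions)  # [('Denver', 0), ('Austin', 1)]
```

In an Informatica mapping the same comparison is typically done with a Lookup transformation on the dimension plus an Update Strategy to route the expire/insert branches.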
TECHNICAL SKILLS
ETL Tools: Informatica Power Center 10.1/9.6.1/9.1/8.6, Informatica Data Analyzer, Informatica Power Exchange, SSIS, IDQ.
Data Modeling: Data Modeling - Logical/Physical/Dimensional Erwin 7.2/7.0, Toad, Oracle Designer, PL/SQL
Databases: Netezza, Oracle 11g/10g/9i/8i, MS SQL 7.0, Teradata, SQL Server 2005/2000, PostgreSQL
Methodologies: OLAP, ROLAP, MOLAP, Star Schema, Snowflake Schema
Programming: C, C++, SQL, PL/SQL, T-SQL, SQL*Plus, HTML, Python, UNIX Shell Scripting (AIX)
Scheduling Tools: Autosys, Control-M, Tidal
Web Services: SOAP UI, Web Services Hub (WSH), GitHub, Bitbucket, Informatica
BI Tools: Business Objects XI 3.1/XI R2, Tableau 9.3, Tableau Server, OBIEE, Crystal Reports 2008/XI, Cognos 8.4
Environment: UNIX, Windows 2000/2003/XP/Vista
PROFESSIONAL EXPERIENCE
Confidential, Mountain view, CA
Senior ETL Informatica Developer
Responsibilities:
- Implemented various loads, such as daily, weekly, quarterly and on-demand loads, using an incremental loading strategy and Change Data Capture (CDC) concepts.
- Involved in analysis of source systems, business requirements and identification of business rules; responsible for developing, supporting and maintaining the ETL process using Informatica.
- Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
- Made use of various Power Center Designer transformations like Source Qualifier, Connected and Unconnected Lookups, Expression, Filter, Router, Sorter, Aggregator, Joiner, Rank, Router, Sequence generator, Union and Update Strategy transformations while creating Mapplets/mappings.
- Automated data transfer processes and mail notifications by using FTP Task and send mail task in Transformations.
- Designed documents for creating ETL pipeline transformations and jobs.
- Maintained a dashboard over the Confidential BigQuery database by cleaning up the metadata of completed courses, in addition to educating category owners on how to classify classes in gLearn.
- Worked on the conversion of an internal learning system (gLearn) from a SQL database to a non-relational database, specifically Confidential BigQuery.
- Worked with stakeholders by providing the business requirements for the ETL layer in BigQuery, testing the ETL, and working with metrics and internal stakeholders to ensure acceptance.
- Created and ran SQL queries on a weekly/monthly basis, enabling internal teams to analyze course-related data.
- Supported QA and various cross functional users in all their data needs.
- Experience in Confidential Cloud Platform services such as API services, storage, Cloud Watch and IAM.
- Scheduled jobs using the Clarinet tool to run all Informatica jobs nightly, weekly and monthly.
- Identify, document and communicate BI and ETL best practices and industry accepted development methodologies and techniques.
- Troubleshoot BI tool problems and provide technical support as needed. Perform other tasks as assigned.
- Worked very closely with Project Manager to understand the requirement of reporting solutions to be built.
- Good experience in designing jobs and transformations and loading data sequentially and in parallel for initial and incremental loads.
- Good experience in using Informatica to cleanse and load data per business needs.
- Good experience in configuring the data integration server to run jobs in local, remote-server and cluster modes.
- Good experience in designing advanced reports, analysis reports and dashboards per client requirements.
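The incremental-loading/CDC strategy in the first bullet above is commonly implemented with a timestamp watermark: each run pulls only source rows newer than the last captured timestamp, then advances the watermark. A minimal sketch, using Python/SQLite purely for illustration; the table names (`src_orders`, `tgt_orders`, `etl_watermark`) are invented for the example:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_orders (order_id INTEGER, amount REAL, updated_at TEXT)")
cur.execute("CREATE TABLE tgt_orders (order_id INTEGER, amount REAL, updated_at TEXT)")
cur.execute("CREATE TABLE etl_watermark (job TEXT PRIMARY KEY, last_ts TEXT)")
cur.execute("INSERT INTO etl_watermark VALUES ('orders', '1900-01-01')")

cur.executemany("INSERT INTO src_orders VALUES (?, ?, ?)", [
    (1, 10.0, "2021-01-01"),
    (2, 25.0, "2021-01-02"),
])

def incremental_load():
    """Pull only rows newer than the stored watermark, then advance it."""
    (last_ts,) = cur.execute(
        "SELECT last_ts FROM etl_watermark WHERE job = 'orders'"
    ).fetchone()
    rows = cur.execute(
        "SELECT order_id, amount, updated_at FROM src_orders WHERE updated_at > ?",
        (last_ts,),
    ).fetchall()
    cur.executemany("INSERT INTO tgt_orders VALUES (?, ?, ?)", rows)
    if rows:
        new_ts = max(r[2] for r in rows)
        cur.execute("UPDATE etl_watermark SET last_ts = ? WHERE job = 'orders'", (new_ts,))
    return len(rows)

first = incremental_load()    # initial load picks up both existing rows
cur.execute("INSERT INTO src_orders VALUES (3, 7.5, '2021-01-05')")
second = incremental_load()   # delta load picks up only the new row
print(first, second)          # 2 1
```

In Power Center the watermark is usually a mapping parameter (e.g. `$$LastRunDate`) refreshed by the workflow rather than a control table, but the filtering logic is the same.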
Environment: Informatica Power Center (Power Center Repository Manager, Designer, Workflow Manager, and Workflow Monitor), Confidential BigQuery, GCP, Plx Visualization Tool, Clarinet, Oracle, Sybase, MySQL, PostgreSQL, Cider, Critique, GUTS, Shell, Unix, Git, Tableau Server, JavaScript.
Confidential, Denver, CO
Senior ETL Informatica Developer
Responsibilities:
- Designed and implemented Change Data Capture (CDC) processes for fact and dimension tables through a combination of timestamps, staging (before/after images) and bridge tables.
- Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
- Used various types of inputs and outputs in Pentaho Kettle including Database Tables, MS Access, Text Files, Excel files and CSV files.
- Automated data transfer processes and mail notifications by using FTP Task and send mail task in Transformations.
- Responsible for creating database objects such as tables, views, stored procedures, triggers and functions using T-SQL to structure stored data and maintain the database efficiently.
- Reported Data Designer bugs in Adaptive Integration to support and created workarounds until the bugs were fixed.
- Identify, document and communicate BI and ETL best practices and industry accepted development methodologies and techniques.
- Troubleshoot BI tool problems and provide technical support as needed. Perform other tasks as assigned.
- Worked very closely with Project Manager to understand the requirement of reporting solutions to be built.
- Good experience in designing jobs and transformations and loading data sequentially and in parallel for initial and incremental loads.
- Good experience in configuring the data integration server to run jobs in local, remote-server and cluster modes.
- Good experience in designing advanced reports, analysis reports and dashboards per client requirements.
- Coordinated and assigned work to the off-shore team, monitoring daily status and addressing any roadblocks.
- Implemented business rules to transform data and load data from staging tables to Data warehouse tables.
- Validated and unit tested data in ODS layer.
- Supported QA and various cross functional users in all their data needs.
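The before/after-image staging referenced in the first bullet above yields CDC deltas through a keyed comparison: rows present only in the after image are inserts, rows whose attributes differ are updates, and rows present only in the before image are deletes. A minimal sketch with invented sample data:

```python
# Before image: the staged snapshot from the previous load.
# After image: the freshly staged snapshot. Keys map to attribute tuples.
before = {1: ("Acme", "NY"), 2: ("Beta", "CO")}
after  = {1: ("Acme", "NY"), 2: ("Beta", "TX"), 3: ("Gamma", "CA")}

inserts = [k for k in after if k not in before]                      # new keys
updates = [k for k in after if k in before and after[k] != before[k]]  # changed rows
deletes = [k for k in before if k not in after]                      # vanished keys
print(inserts, updates, deletes)  # [3] [2] []
```

The same classification is what an Update Strategy transformation's DD_INSERT / DD_UPDATE / DD_DELETE flags encode after a full-outer join of the two images.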
Environment: Informatica Power Center (Power Center Repository Manager, Designer, Workflow Manager, and Workflow Monitor), SSIS, PostgreSQL, Oracle, SQL Server, Shell, Unix, SQL Profiler, XML, SSRS, Tableau, Tableau Server, JavaScript, Jira, YouTrack, Autosys, Tidal
Confidential, Richmond, VA
Sr. Informatica Developer
Responsibilities:
- Designed and implemented Change Data Capture (CDC) processes for fact and dimension tables through a combination of timestamps, staging (before/after images) and bridge tables.
- Implemented CDC by tracking the changes in critical fields required by the user.
- Worked on optimizing and tuning Netezza SQL to improve batch performance.
- Used Informatica Power Exchange for loading/retrieving data from mainframe system.
- Involved in loading data into Netezza from iSeries systems and flat files using complex UNIX scripts.
- Worked with Oracle, Netezza, DB2, SQL Server, Informix and flat file sources.
- Involved in supporting ETL application production jobs and changing existing code based on business needs.
- Monitored Informatica platform performance across all three environments and took necessary action in case of file system/permission/space issues.
- Developed standard and reusable mappings and mapplets using various transformations like Expression, Aggregator, Joiner, Router, Lookup (Connected and Unconnected) and Filter.
- Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
- Executed, scheduled workflows using Informatica Cloud tool to load data from Source to Target.
- Integrated Informatica Power Center with GitHub/Bitbucket.
- Set up batches, configured sessions and scheduled loads per requirements using UNIX scripts.
- Created SSIS packages and generated insert scripts for data feeds using sources such as Excel, flat files, SQL Server and XML files with transformations such as Multicast, Derived Column, Audit, Merge and Union.
- Expertise in Data Warehousing, Data Analysis, Reporting, ETL, Data Modeling, Development, Maintenance, Testing and Documentation.
- Worked on database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects and hierarchies.
- Prepared the documentation for the mappings and workflows.
- Made use of Post-Session Success and Post-Session Failure commands in the Session task to execute scripts needed for cleanup and update purposes. Hands-on experience with Amazon Web Services such as EC2, S3, Elastic Beanstalk, Elastic Load Balancing (Classic/Application), CloudWatch and IAM.
- Performed test cases and various test scenarios to validate seeded OBIEE reports.
- Developed PL/SQL and UNIX Shell Scripts for scheduling the sessions in Informatica.
- Involved in unit testing, integration testing and UAT by creating test cases and test plans, and helped the Informatica administrator deploy code across development, test and production repositories.
- Actively involved in the production support and also transferred knowledge to the other team members.
- Coordinated between different teams across the circle and organization to resolve release-related issues.
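A post-session cleanup command of the kind mentioned above typically removes staging files older than a retention window. A hypothetical sketch in Python (the directory and file names are invented for the demonstration; in practice this would be a shell script invoked from the session's Post-Session Success command):

```python
import os
import tempfile
import time

def cleanup(directory, max_age_seconds):
    """Remove files older than max_age_seconds; return the sorted names removed."""
    removed = []
    now = time.time()
    for name in os.listdir(directory):
        path = os.path.join(directory, name)
        if os.path.isfile(path) and now - os.path.getmtime(path) > max_age_seconds:
            os.remove(path)
            removed.append(name)
    return sorted(removed)

# Demonstrate on a throwaway directory with one "old" and one "fresh" file.
workdir = tempfile.mkdtemp()
old_file = os.path.join(workdir, "stage_20200101.dat")
new_file = os.path.join(workdir, "stage_today.dat")
for p in (old_file, new_file):
    with open(p, "w") as f:
        f.write("x")
os.utime(old_file, (time.time() - 8 * 86400,) * 2)   # backdate mtime by 8 days

removed = cleanup(workdir, 7 * 86400)                # 7-day retention
print(removed)  # ['stage_20200101.dat']
```

The equivalent one-liner in a Unix post-session script would be along the lines of `find "$STAGE_DIR" -type f -mtime +7 -delete`.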
Environment: Informatica Power Center (Power Center Repository Manager, Designer, Workflow Manager, and Workflow Monitor), Power Exchange, IDQ, AWS, B2B, Tableau, Tableau Server, SQL Server 2008, Netezza, Teradata, DB2 8.1, SOAP UI, XML, Autosys, Tidal, Oracle 11g, TOAD, T-SQL, SQL, PL/SQL, GitHub, Bitbucket, UNIX, Jira, Salesforce.com, Control-M
Confidential, Upper Saddle River, NJ
Informatica Developer
Responsibilities:
- Worked closely with client in understanding the Business requirements, data analysis and deliver the client expectation.
- Used Informatica PowerCenter 8.6.1/8.5 and its all features extensively in migrating data from OLTP to Enterprise Data warehouse.
- Extensively used Erwin for Logical and Physical data modeling and designed Star Schemas.
- Extracted data from different sources such as Oracle, flat files, XML, DB2 and SQL Server and loaded it into the DWH.
- Worked on Power Exchange to create data maps, pull data from the mainframe and transfer it into the staging area.
- Automated jobs through scheduling with the Maestro scheduler, which runs every day while maintaining data validations.
- Developed Complex and Optimized ETL packages using SQL scripts and SSIS 2008.
- Worked directly with the chief strategic officer to create multiple SQL queries for ad-hoc reports
- Optimized the performance of queries with modifications in SQL queries, removed unnecessary columns, eliminated redundant and inconsistent data, normalized tables, established joins and created indexes whenever necessary. Also re-wrote some queries as needed.
- Involved in creation of Folders, Users, Repositories and Deployment Groups using Repository Manager
- Developed PL/SQL and UNIX Shell Scripts for scheduling the sessions in Informatica.
- Involved in scheduling the informatica workflows using Autosys.
- Migrated mappings, sessions, and workflows from development to testing and then to Production environments.
- Performed unit testing on the Informatica code by running it in the Debugger and writing simple test scripts in the database thereby tuning it by identifying and eliminating the bottlenecks for optimum performance.
- Wrote PL/SQL stored procedures & triggers, cursors for implementing business rules and transformations.
- Worked extensively with different caches such as Index cache, Data cache and Lookup cache (Static, Dynamic, Persistence and Shared).
- Created deployment groups, migrated the code into different environments. Worked closely with reporting team to generate various reports.
Environment: Informatica Power Center v8.6.1 (Power Center Repository Manager, Designer, Workflow Manager, and Workflow Monitor), SQL Server 2005, T-SQL, DB2 8.1, XML, Autosys, PostgreSQL, Oracle 10g, TOAD, SQL, PL/SQL, UNIX, Active
Confidential
PL/SQL Developer
Responsibilities:
- Interacted with end users for gathering requirements.
- Perform database tuning, monitoring, loading and back up.
- Created prototype reporting models, specifications, diagrams and charts to provide direction to system programmers.
- Developed procedures and functions using PL/SQL.
- Created number of database Triggers according to business rules using PL/SQL.
- Developed SQL for loading metadata from Excel spreadsheets into the database using SQL*Loader.
- Extensively used PL/SQL to implement cursors, triggers and packages.
- Developed SQL script for loading data from existing MS Access tables to Oracle.
- Created record groups for data manipulation; performed unit and system integration testing.
- Involved in database design, development of database, tables and views.
- Involved in application testing, deployment and production support.
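A business-rule audit trigger like those described above can be sketched as follows. Note this is an illustration only: it uses SQLite trigger syntax driven from Python rather than Oracle PL/SQL, and the table and trigger names (`employees`, `trg_salary_audit`) are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE employees (emp_id INTEGER PRIMARY KEY, salary REAL)")
cur.execute("CREATE TABLE salary_audit (emp_id INTEGER, old_salary REAL, new_salary REAL)")

# Row-level AFTER UPDATE trigger, analogous to a PL/SQL audit trigger
# that records the OLD and NEW values of a sensitive column.
cur.execute("""
    CREATE TRIGGER trg_salary_audit
    AFTER UPDATE OF salary ON employees
    FOR EACH ROW
    BEGIN
        INSERT INTO salary_audit VALUES (OLD.emp_id, OLD.salary, NEW.salary);
    END
""")

cur.execute("INSERT INTO employees VALUES (1, 50000)")
cur.execute("UPDATE employees SET salary = 55000 WHERE emp_id = 1")
audit_rows = cur.execute("SELECT * FROM salary_audit").fetchall()
print(audit_rows)  # [(1, 50000.0, 55000.0)]
```

The Oracle equivalent would use `CREATE OR REPLACE TRIGGER ... AFTER UPDATE OF salary ON employees FOR EACH ROW` with `:OLD`/`:NEW` bind references inside a PL/SQL block.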
Environment: Oracle 8i, PL/SQL, TOAD, SQL*Loader, MS Access, Excel, Windows NT.