Tech Lead Informatica Developer Resume
Fountain Valley, CA
SUMMARY:
- Software professional with 11+ years of experience in the IT industry, working on a variety of data warehousing projects using industry-leading tools and technologies such as Informatica PowerCenter, Informatica Data Quality (IDQ), Informatica Intelligent Cloud Services (IICS), Informatica Metadata Manager, Tableau, Oracle PL/SQL, SQL Server Management Studio (SSMS) and Unix.
- Worked on the design and development of ETL methodology supporting data transformations and processing, using Informatica PowerCenter 10.2.0, B2B Data Transformation Studio, PowerExchange, and Unix job schedulers such as Autosys and Control-M.
- Have good understanding of Data Warehouse Architecture and Data Warehousing fundamentals including Star Schema, Snowflake Schema, OLTP and OLAP and have working knowledge of Erwin data modeler.
- Have developed, debugged and tested complex mappings, sessions, worklets and workflows adhering to the defined standards using Informatica PowerCenter.
- Strong knowledge of error handling and parameterization of values, with experience in troubleshooting bottlenecks, performance tuning and optimization of ETL mappings and sessions.
- Extensively worked with ETL tools to extract data from and load data to a variety of sources and targets, including Oracle, flat files (fixed width and delimited), MS SQL Server 2014, SAP ECC, Teradata, Netezza, DB2, JSON, XML, REST APIs, AWS Cloud (AWS S3, Redshift) and AS400.
- Strong knowledge of Software Development Life Cycle (SDLC) including Requirement analysis, Design, Development, Testing, Support and production Implementation.
- Experienced in Waterfall and Agile models, using tools such as JIRA and Rally.
- Experienced in Shell scripting and PL/SQL procedures.
- Actively involved in migrating the data warehouse to Snowflake and re-platforming the ETL to Informatica Intelligent Cloud Services (IICS).
- Excellent communication and strong interpersonal skills with the ability to interact with end-users, managers and technical personnel.
TECHNICAL SUMMARY:
RDBMS: Oracle 12c, MS SQL Server 2014, Teradata 14.0, DB2, MS Access, SQL, PL/SQL, Netezza, AS400, Amazon S3 & Redshift
DW/ETL Tools: Informatica Intelligent Cloud Services, Informatica PowerCenter 10.2.0, B2B Data Transformation Studio, Informatica PowerExchange, Autosys, Control-M, Cucumber, Ruby, WinSCP
Reporting Tools: Business Objects XI R3, MicroStrategy, Web Intelligence, Cognos, Tableau
Languages: SQL, PL/SQL, PERL, Python, Shell Scripting, XML
Database Tools: SQL Developer, TOAD
Data Modeling: Dimensional Data Modeling, FACT and Dimensions Tables, Physical and Logical Data Modeling, ERWIN, Visio
PROFESSIONAL EXPERIENCE:
Confidential, Fountain Valley, CA
Tech Lead Informatica Developer
Responsibilities:
- Involved in leading both onsite and offshore teams.
- Worked closely with the SME and BA to get an understanding of the business requirements.
- Worked in Agile Model, moving through the sprint cycle and using the tracking Tool: JIRA.
- Prepared functional specification documents to convert the business requirements into ETL mappings.
- Responsible for architecting and developing solutions, successful project delivery and process engagements.
- Developed standard and reusable mappings and mapplets using various transformations like Expression, Aggregator, Joiner, Router, Lookup (Connected and Unconnected) and Filter, Sessions and Workflows to load the data into staging, ODS, fact and dimension tables.
- Extracted data from heterogeneous sources, including Oracle tables, flat files (fixed width, delimited) and SQL Server tables, and loaded the data into a relational Oracle warehouse.
- Developed UNIX scripts for jobs scheduling, DDL scripts for creating Oracle tables, Materialized Views for the MicroStrategy Reporting needs.
- Actively involved in implementing large scale ETL processes using Informatica PowerCenter.
- Designed high level ETL process and data flow from source system to target databases.
- Strong experience with Oracle databases and strong SQL and PL/SQL knowledge.
- Worked with cross-functional teams and served as the point of contact for ETL-related questions.
- Developed and unit tested Informatica ETL processes for optimal performance, following best practices.
- Performed detailed data analysis to understand the source system data and designed the ETL process to load the data into the target system for reporting needs.
- Actively involved in migrating the data warehouse to Snowflake and re-platforming the ETL to Informatica Cloud Services.
- Coordinated ETL and database development work with offshore ETL developers and conducted code reviews.
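The flat-file staging work described above can be sketched as follows. This is an illustrative example rather than project code, and the field layout and names are hypothetical assumptions:

```python
# Illustrative sketch: slicing a fixed-width flat-file line into a record,
# as done when staging flat-file sources before an Oracle load.
# The layout (field name, start offset, width) is a hypothetical example.

LAYOUT = [
    ("cust_id", 0, 6),
    ("name", 6, 20),
    ("state", 26, 2),
]

def parse_fixed_width(line, layout=LAYOUT):
    """Slice one fixed-width line into a dict of trimmed field values."""
    return {name: line[start:start + width].strip()
            for name, start, width in layout}

# Example usage with a padded sample line:
record = parse_fixed_width("000123" + "John Doe".ljust(20) + "CA")
```

In practice the same layout metadata would mirror the fixed-width definitions configured in the Source Analyzer.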
Environment: Informatica PowerCenter 10.2, Informatica Intelligent Cloud Services (IICS), SQL Server Management Studio 2014, AS400, Flat Files, Oracle 12c, PL/SQL, JSON, XML, REST API, AWS Cloud (AWS S3, Redshift), Atlassian ITSM, CTRL-M, Toad, Windows 10, UNIX, PuTTY, WinSCP, JIRA.
Confidential, Irvine, CA
Tech Lead Informatica Developer
Responsibilities:
- Worked closely with the SME and BA to get an understanding of the business requirements.
- Worked in Agile Model, moving through the sprint cycle and using the tracking Tool: JIRA.
- Prepared the FSD (Functional Specification Document) to convert the business requirements into ETL mappings.
- Developed UNIX scripts and DDL scripts for creating Oracle tables.
- Used the Remedy tool PAC 2000 for creating change requests and task approvals, and incorporated the modeling changes into Informatica mappings per the change requests.
- Developed standard and reusable mappings and mapplets using various transformations like Expression, Aggregator, Joiner, Router, Lookup (Connected and Unconnected) and Filter, Sessions and Workflows to load the data into staging, ODS, fact and dimension tables.
- Extracted data from heterogeneous sources, including Oracle tables, flat files (fixed width, delimited) and SQL Server tables, and loaded the data into a relational Oracle warehouse.
- Experienced in data analysis, development, SQL performance tuning, data warehousing ETL process and data conversions.
- Performed unit testing & provided the technical support to the QA Team in case of any defects or failures.
- Performed troubleshooting and provided technical support to applications during Migrate to Production activities.
- Presented product demos to the product owner and stakeholders at the end of each sprint cycle to get signoff approval.
- Troubleshot long-running sessions and fixed the underlying issues.
- Worked with Variables and Parameters in the mappings to pass the values between sessions.
- Worked on data profiling, data validation and data analysis.
- Involved in meetings with production team for issues related to Deployment, maintenance, future enhancements, backup and crisis management of DW.
- Improved performance using Oracle Partitions, Indexes and other Performance Tuning techniques.
- Developed reusable components in Informatica, Oracle and UNIX.
- Provided on-call/production support during business hours and off-hours.
- Actively participated in Install/Deployment plan meetings.
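The pattern of passing values between sessions via mapping variables and parameter files, mentioned above, can be sketched like this. This is an illustrative example, not project code; the folder, workflow, and parameter names are hypothetical assumptions:

```python
# Illustrative sketch (not project code) of passing values between sessions
# via a PowerCenter-style parameter file: one session writes the file,
# a later session's mapping reads its $$ parameters at run time.
# Folder, workflow, and parameter names are hypothetical.

def format_param_file(folder, workflow, params):
    """Render a parameter dict as one [Folder.WF:workflow] section."""
    lines = [f"[{folder}.WF:{workflow}]"]
    lines += [f"$${name}={value}" for name, value in params.items()]
    return "\n".join(lines) + "\n"

def parse_param_file(text):
    """Read $$NAME=value lines back into a dict (section headers ignored)."""
    params = {}
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("$$") and "=" in line:
            name, value = line[2:].split("=", 1)
            params[name] = value
    return params
```

A pre-session script would emit the file with `format_param_file`, and the downstream workflow would be pointed at it as its parameter file.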
Environment: Informatica PowerCenter 10.2.0, Repository Manager, Designer, SQL Server Management Studio 2014, MS Access DB, Informatica Metadata Manager, Informatica Data Validation Option (DVO), Flat Files, Oracle 12c, PL/SQL, Confidential ALM, Autosys, MS Visio, Toad, Windows 10, UNIX, PuTTY, IP Switch, JIRA.
Confidential, Los Angeles, CA
Tech Lead Informatica Developer
Responsibilities:
- Worked closely with the SME and BA to get an understanding of the business requirements.
- Worked in Agile Model, as a part of scrum team using the tracking Tool: Rally
- As a technical lead, performed impact analysis on change requests (CRs) to incorporate the modeling changes into Informatica mappings.
- Managed onshore and offshore team members, helping them understand the ETL process and resolving issues.
- Based on the STTM document, generated Informatica mappings, Autosys jobs and config files using the automated code generation tool.
- Implemented the ETL processes using Informatica tool to load data from various source systems (flat files, xml and database) into the target Oracle 11g database.
- Worked on the Data Transformation Studio plugin for parsing the source XMLs by mapping the Sourcing Locator Publication XSD and target XSD to the respective tables; validated the output results and deployed the package as a zip file, converting the unstructured data into a structured format.
- Experienced working with XML source files; upgraded the MISMO and XSD files to the latest version and loaded them into the Data Transformation Studio plugin for parsing the source XML files.
- Performed source-to-target mapping and applied transformation rules using transformations such as Joiner, Aggregator, Expression, Filter, Update Strategy and Lookup, as required for data extraction.
- Developed complex SQL queries and designed Informatica Mappings to load the data into warehouse.
- Actively performed data analysis, data mapping, data loading, and data validation.
- Inserted Metadata into MD Tables.
- Parameterized ETL rules and Derivations by storing into the Metadata tables.
- Implemented Cucumber for automating job execution on the Autosys UNIX box to perform end-to-end testing from Sourcing to Cutover Vending and generating ECF files.
- Performed unit testing and provided technical support to the QA team in case of defects or failures.
- Responsible for communicating with the client manager and server hosting/operations support vendors in case of production load failures.
- Worked on Tokenization of Unix scripts for reusability across higher environments.
- Used Autosys job scheduler to create, schedule and monitor the jobs and send the Email Notification messages in case of process failures.
- Efficiently handled multiple projects during resource crunch.
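The script tokenization idea mentioned above (one script body, environment-specific values swapped in at deploy time) can be sketched as follows. This is an illustrative example, not project code; the token syntax and values are hypothetical assumptions:

```python
# Illustrative sketch of Unix-script "tokenization" for reuse across
# environments: environment-specific values are replaced by @NAME@ tokens,
# then resolved per environment at deployment time.
# The token syntax and the token values below are hypothetical.
import re

TOKENS = {
    "DEV":  {"DB_NAME": "dwdev",  "DATA_DIR": "/data/dev"},
    "PROD": {"DB_NAME": "dwprod", "DATA_DIR": "/data/prod"},
}

def resolve_tokens(script_text, env):
    """Replace every @NAME@ token with the value for the given environment."""
    values = TOKENS[env]
    return re.sub(r"@(\w+)@", lambda m: values[m.group(1)], script_text)
```

The same tokenized script can then be promoted unchanged from DEV through higher environments.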
Environment: Informatica PowerCenter 9.6.1, Repository Manager, Designer, Data Transformation Studio, XML, Flat Files, Oracle 12c, PL/SQL, Confidential ALM, Autosys, MS Visio, Toad, Windows 8, UNIX, Rally
Confidential, Palo Alto, CA
Senior Informatica Developer
Responsibilities:
- As a Senior ETL developer, responsible for the fulfillment of various project tasks. Involved in the entire software development life cycle using agile methodology for data migration, writing stored procedures, product design documentation, code development and testing.
- Worked closely with Business Analysts and Subject Matter Experts from the business team to gather business requirements during scheduled meetings.
- Involved in preparing the application design document, providing a clear understanding of what IT would deliver to meet the business requirements and enough design detail to obtain business signoff.
- As a part of development process, worked on developing Informatica mappings, mapplets, sessions, worklets and workflows.
- Created mappings to load data using Source Qualifier, Joiner, Sorter, Aggregator, Union, Router, Update Strategy, SQL transformation, Stored Procedure transformation, Expression, Connected and Unconnected Lookups.
- Involved in extracting data from different source systems and flat files and loading it into the standard integration environment (target system).
- Developed Informatica mappings using various transformations, sessions and workflows; Teradata was the target database, and the sources were a combination of flat files, Oracle tables, Excel files and a Teradata database.
- Developed mappings to load Fact and Dimension tables, SCD Type 1 and SCD Type 2 dimensions and unit tested the mappings.
- Extracted data from Flat files, DB2, Oracle, XML, and Loaded into Teradata data warehouse.
- Created customized MLoad scripts on the UNIX platform for Teradata loads.
- Wrote several Teradata BTEQ scripts to implement the business logic.
- Worked extensively with Teradata Queryman to interface with Teradata.
- Used Teradata Utilities (Fast Load, Multi Load, Fast Export). Queried the Target database using Teradata SQL and BTEQ for validation.
- Used Transformations like Look up, Router, Filter, Joiner, Stored Procedure, Source Qualifier, Aggregator and Update Strategy extensively.
- Involved in doing error handling, debugging and troubleshooting Sessions using the Session logs, Debugger and Workflow Monitor.
- In the Development Environment, performed Unit testing at granular level. Created test cases which would cover all the unit testing scenarios and uploaded them in ALM (Application Lifecycle Management).
- Performed troubleshooting and provided technical support to QA applications during Migrate to Production activities.
- Used Autosys for Scheduling, created various UNIX Shell Scripts for scheduling various data cleansing scripts and loading process. Maintained the batch processes using Unix Shell Scripts
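The SCD Type 2 dimension loads mentioned above follow a standard pattern: on an attribute change, expire the current row and insert a new current version. A minimal sketch, with simplified, hypothetical column names and date handling:

```python
# Illustrative sketch of the SCD Type 2 pattern used for dimension loads:
# on change, expire the current row and insert a new current version.
# Column names (cust_id, city, start_date, end_date, is_current) and the
# date handling are simplified assumptions, not the project's schema.
from datetime import date

def apply_scd2(dim_rows, incoming, load_date, key="cust_id", attrs=("city",)):
    """Return dim_rows updated with SCD2 logic for one incoming record."""
    current = next((r for r in dim_rows
                    if r[key] == incoming[key] and r["is_current"]), None)
    if current and all(current[a] == incoming[a] for a in attrs):
        return dim_rows                      # no change: nothing to do
    if current:                              # attribute change: expire row
        current["is_current"] = False
        current["end_date"] = load_date
    new_row = {key: incoming[key], **{a: incoming[a] for a in attrs},
               "start_date": load_date, "end_date": None, "is_current": True}
    return dim_rows + [new_row]
```

In the actual mappings, the same compare/expire/insert logic is expressed with Lookup, Expression and Update Strategy transformations.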
Environment: Informatica PowerCenter 9.0.1, Repository Manager, Designer, Oracle 11g, SQL Server 2012, XML Files, Flat Files, CSV files, Teradata, PL/SQL (Stored Procedures, Triggers, Packages), Business Objects, Confidential ALM, AUTOSYS, MS Visio, SQL Developer, TOAD, Windows 7, UNIX
Confidential, Charlotte, NC
Sr. Data Analyst / Sr. Informatica Developer
Responsibilities:
- Used Informatica as ETL Tool. Worked in all phases of Data Integration from heterogeneous sources, legacy systems to Target Database. Used IDQ, IDE for Data Analysis.
- Worked on Informatica Power Center tool - Source Analyzer, designer, Mapping and Mapplet Designer, Transformations, Informatica Repository Manager, Informatica Workflow Manager and Workflow Monitor
- Implemented Slowly Changing Dimensions - Type I & II in different mappings as per the requirements.
- Strong experience in using Informatica Workflow Manager and Workflow Monitor to schedule and monitor session status.
- Extensively worked with various Lookup caches like Static cache, Dynamic cache and Persistent cache.
- Worked with Session logs and Workflow logs for Error handling and troubleshooting in Dev environment.
- Responsible for Unit Testing of Mappings and Workflows.
- Responsible for implementing Incremental Loading mappings using Mapping Variables and Parameter Files.
- Responsible for determining the bottlenecks and fixing the bottlenecks with performance tuning.
- Implemented various loads like Daily Loads, Weekly Loads, and Quarterly Loads using Incremental Loading Strategy
- Extensive SQL querying for Data Analysis.
- Wrote and executed DML and DDL scripts to incorporate database changes on Oracle 10g using TOAD.
- Extracted data from flat files, Oracle and SQL Server and loaded it into Oracle.
- Experienced in database design, data analysis, development, SQL performance tuning, data warehousing ETL process and data conversions.
- Responsible for Validation of Data as per the Business rules.
- Provided Production Support.
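The incremental-loading strategy above (mapping variables plus parameter files tracking a high-watermark) can be sketched like this. This is an illustrative example, not project code; the column name and timestamp format are hypothetical assumptions:

```python
# Illustrative sketch of incremental loading driven by a persisted
# high-watermark, as with Informatica mapping variables + parameter files:
# extract only rows changed since the last run, then advance the watermark.
# The updated_at column name and string timestamps are assumptions.

def incremental_extract(rows, last_watermark):
    """Return rows changed since the last run, plus the new watermark."""
    changed = [r for r in rows if r["updated_at"] > last_watermark]
    new_watermark = max((r["updated_at"] for r in changed),
                        default=last_watermark)
    return changed, new_watermark
```

The returned watermark is what the parameter file would carry forward into the next daily, weekly, or quarterly run.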
Environment: Informatica PowerCenter 9.0.1/8.6, Repository Manager, Designer, Oracle 11g, SQL Server 2012, DB2, XML Files, Flat Files, CSV files, PL/SQL, TOAD, Windows, UNIX
Confidential, San Diego, CA
Informatica Developer
Responsibilities:
- Interacted with business/end users to analyze business requirements & develop detailed requirement(s) document.
- Worked on Informatica Power Center Tool- Source Analyzer, Mapping Designer & Mapplets, and Transformations.
- Used Debugger in the Mapping Designer to troubleshoot logical errors.
- Created Workflows, re-usable Worklets and Sessions using the Workflow Manager.
- Performed Developer testing, Functional testing, and Unit testing for the Informatica mappings.
- Assigned full and incremental load mappings to the DAC and scheduled the execution plans through the DAC.
- Performed end-to-end full load runs through the DAC.
- Configured the DAC (Data Warehouse Administration Console) for running ETL jobs.
- Conducted Performance tuning of application by modifying the SQL statements.
- Involved in Documentation regarding the ETL process.
- Coordinated with the offshore delivery team for the releases.
Environment: Informatica Power Center 8.6.1, Informatica Designer (8.x), Informatica Repository Manager, Import Wizard, PL/SQL, Windows, Oracle 10g, Erwin 4.1, SAP, Flat Files, DAC (Data Warehouse Administration Console)