- 10+ years of experience in system design, development, testing, and production support of Data Warehousing solutions, covering ETL, reporting, and data modeling.
- Experience creating Ralph Kimball dimensional models with Star/Snowflake schema designs, covering analysis and definition, database design, testing, and implementation with a quality process.
- Developed strong professional skills working both independently and as a team member to analyze functional/business requirements and prepare test plans and test scripts. Collaborated with onsite teams, managed various offshore teams, and effectively managed client expectations.
- Involved in the full development lifecycle from requirements gathering through development and support using Informatica Power Center, Repository Manager, Designer, Workflow Manager, and Workflow Monitor.
- Experience in Repository Configuration, creating complex mappings using Informatica Designer, and processing tasks using Workflow Manager to move data from multiple sources into targets.
- DW Methodology: Kimball, Inmon, dimensional and relational data modeling.
- Extracted data from various sources such as Oracle, flat files, SQL Server, and Teradata, and loaded it into an Oracle database.
- Strong experience in designing and developing business intelligence solutions in Data Warehouse/Decision Support Systems using the Informatica PowerCenter 9.6.1/9.5.1/9.0.1/8.6.1 ETL tool against OLTP and OLAP systems.
- Data modeling experience using OLAP/ROLAP tools, Fact and Dimensions tables, Physical and logical data modeling, and Oracle Designer.
- Extensively worked with Oracle databases to implement data cleanup and performance tuning techniques.
- Experience in Installation, Configuration, and Administration of Informatica Power Center 9.x/8.x Client, Server.
- Extensively worked on the Data Warehouse Administration Console (DAC) and ESP.
- Experience in data integration of various data sources from Databases like Oracle, Teradata, SQL Server and formats like flat-files, CSV files and XML files.
- Extensive experience in performance tuning: identified and fixed bottlenecks and tuned complex Informatica mappings for better performance.
- Extensively worked on both Windows and UNIX scripting.
- Experience in implementation of Data Cleanup procedures, transformations, Scripts, Stored Procedures and execution of test plans for loading the data successfully into the targets.
- Strong RDBMS experience with Oracle 10g/11g, Oracle PL/SQL, SQL*Plus, SQL*Loader, Netezza, and MS SQL Server 2000/2005/2008.
Data Warehousing/ETL Tools: Informatica PowerCenter 10.x/9.x/8.x (Source Analyzer, Data Warehousing Designer, Mapping Designer, Mapplet, Transformation, Sessions, Workflow Manager-Workflow, Task, Commands, Worklet, Transactional Control, Constraint Based Loading, SCD I, II and XML transformations), Informatica Data Quality (IDQ), DataStage
Databases: Oracle 11g/10g/9i/8i, MS SQL Server 2000/7.0/6.5, Teradata, Netezza
Data Modeling: Star Schema and Snow-Flake Modeling, FACT and Dimensions Tables, Physical and Logical Data Modeling, ER Studio, Erwin
Tools: SQL*Plus, SQL Assistant, SQL Developer, TOAD, Server Management Studio Express, Putty, Erwin, DAC, ESP
Programming Languages: Unix Shell Scripting, Oracle PL/SQL (Stored Procedures, triggers, Indexes).
Environment: Windows 98/2000/2003/XP, UNIX, LINUX.
Sr. ETL Developer
- Involved in end-to-end development of the implementation and rollout.
- Implemented File Transfer Protocol operations using Talend Studio to transfer files in between network folders.
- Worked on data migration using export/import.
- Created Talend jobs using the dynamic schema feature.
- Created Talend jobs to copy the files from one server to another and utilized Talend FTP components.
- Used a wide range of Talend components across job designs, including tJava, tOracle, tXMLMap, tFileInputDelimited, tLogRow, and tLogCatcher.
- Worked on Joblets (reusable code) & Java routines in Talend
- Implemented Error handling in Talend to validate the data Integrity and data completeness for the data from the Flat File.
- Coordinated with the business to gather requirements and prepare the Functional Specification document.
- Created Talend Development Standards. This document describes the general guidelines for Talend developers, the naming conventions to be used in transformations, and the development and production environment structures.
- Worked on Talend ETL and used features such as Context variables, Database components like tMSSQLInput, tOracleOutput, file components, ELT components etc.
- Involved in automating the FTP process in Talend and FTPing the files in UNIX.
- Optimized the performance of the mappings by various tests on sources, targets and transformations.
- Used Talend Admin Console Job conductor to schedule ETL Jobs on daily, weekly, monthly and yearly basis (Cron Trigger)
- Involved in end-to-end testing of jobs.
- Wrote complex SQL queries to extract data from various sources and integrated it with Talend.
- Involved in designing Logical/Physical Data Models, reverse engineering for the entire subject across the schema.
- Developed over 90 mappings to support the business logic including the historical data for reporting needs.
- Developed complex Talend ETL jobs to migrate the data from flat files to database.
- Used transformations like Router, Update Strategy, Lookups, Normalizer, Filter, Joiner and Aggregator.
- Developed Type-1 and Type-2 mappings for current and historical data.
- Incorporated business logic for Incremental data loads on a daily basis.
- Wrote complex PL/SQL procedures for specific requirements.
- Used Parameter Variables and Mapping variables for incremental data feeds.
- Used Shared folders for Source, Targets and Lookups for reusability of the objects.
- Scheduled the Informatica jobs from third party scheduling tool Autosys Scheduler.
- Involved in migrating Informatica from version 8.6 to 9.6.
- Performed an administrator role in migrating objects across the DEV/QA/PROD environments.
- Provided on-call support for production maintenance.
- Platform: Informatica 9.6, DB2 UDB, UNIX, Autosys, SQL Server 2008.
Environment: Informatica Power Center 8.6.1/9.6.1, Oracle 11g, SQL, PL/SQL, TOAD, MySQL, UNIX, Autosys, OBIEE, XML, flat files.
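The Autosys-driven scheduling of Informatica jobs mentioned above typically wraps the `pmcmd` CLI in a small shell script. A minimal sketch follows; the service, domain, folder, and workflow names are hypothetical placeholders, and the final command is echoed rather than executed so the sketch is safe to run anywhere:

```shell
#!/bin/sh
# Hedged sketch of an Autosys-invoked wrapper around Informatica's pmcmd.
# All names below (INT_SVC_DEV, SALES_DW, wf_daily_load) are illustrative.

INFA_SERVICE="INT_SVC_DEV"      # hypothetical Integration Service name
INFA_DOMAIN="DOM_DEV"           # hypothetical Informatica domain
INFA_FOLDER="SALES_DW"          # hypothetical repository folder
WORKFLOW="${1:-wf_daily_load}"  # workflow to start (default is illustrative)

# Build the pmcmd call; -wait blocks until the workflow finishes so the
# scheduler sees the workflow's real exit status.
CMD="pmcmd startworkflow -sv $INFA_SERVICE -d $INFA_DOMAIN \
 -u \$INFA_USER -p \$INFA_PASS -f $INFA_FOLDER -wait $WORKFLOW"

# Echoed here for illustration; a real wrapper would eval "$CMD" and
# propagate its exit code back to Autosys.
echo "$CMD"
```

In production the credentials would come from a secured environment or an encrypted password (`pmcmd` also accepts `-pv` for a password environment variable), never hard-coded in the script.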
Sr. ETL Developer
Responsibilities:
- Created ETL workflows to meet the business requirements for Sales & Finance and Procurement and Manufacturing Subject Areas.
- Created cross-reference and business control tables, and used persistent variables to implement region-based scheduling.
- Automated Process to replace the manual effort of creating a list file.
- Used the Ralph Kimball methodology, star schemas with conformed dimensions and fact tables.
- Provided production support to resolve ongoing issues and troubleshoot problems.
- Automated process to refresh the cubes from Informatica.
- Converted some of the fact full loads to incremental loads as part of performance improvement.
- Decoupled the workflows to remove dependencies between the different subject areas.
- Effectively used Informatica parameter files for defining mapping variables, workflow variables, and relational connections.
- Involved in enhancements and maintenance activities of the data warehouse including tuning, modifying of stored procedures for code enhancements.
- Created an error strategy to handle records that do not satisfy the data quality rules.
- Effectively worked in Informatica version based environment and used deployment groups to migrate the objects.
- Used debugger in identifying bugs in existing mappings by analyzing data flow, evaluating transformations.
- Reviewed and analyzed functional requirements, mapping documents, problem solving and trouble shooting.
- Performed unit testing at various levels of the ETL and actively involved in team code reviews.
- Identified problems in existing production data and developed one-time scripts to correct them.
Environment: Informatica Power Center 10.1, Oracle 11g, SQL Server, Oracle SQL Developer, JIRA, UNIX shell scripting, MS OFFICE, Putty, Winscp
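The error strategy above, diverting records that fail data quality rules, can be sketched at the flat-file level in shell. The file names, the three-column layout, and the two rules (non-empty key, numeric amount) are illustrative assumptions, not the project's actual rules:

```shell
#!/bin/sh
# Hedged sketch of a flat-file error strategy: rows failing simple data
# quality rules go to a reject file instead of failing the whole load.

IN=stage_feed.csv      # hypothetical input layout: id,name,amount
GOOD=accepted.csv
BAD=rejected.csv

# Tiny illustrative input file.
cat > "$IN" <<'EOF'
101,Alice,250.00
,Bob,99.50
103,Carol,not_a_number
104,Dave,10.25
EOF

# Rule 1: key column must be non-empty; Rule 2: amount must be numeric.
awk -F',' -v good="$GOOD" -v bad="$BAD" '
  $1 != "" && $3 ~ /^[0-9]+(\.[0-9]+)?$/ { print > good; next }
  { print > bad }   # everything else is rejected for later review
' "$IN"

wc -l < "$GOOD"   # 2 accepted rows (ids 101 and 104)
```

The reject file can then feed a reconciliation report, so bad records are visible rather than silently dropped.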
Confidential, San Jose
Sr. ETL Developer
Responsibilities:
- Developed ETL programs using Informatica to implement the business requirements.
- Worked with Exadata database to implement data procedures and performance tuning techniques.
- Customized HR, Procurement and Finance inbuilt mappings.
- Used Informatica file watch events to poll the FTP sites for the external files.
- Provided production support to resolve ongoing issues and troubleshoot problems.
- Performed performance tuning at both the functional and mapping levels.
- Designing and implementing data warehouses and data marts using components of Kimball Methodology, like Data Warehouse Bus, Conformed Facts & Dimensions, Slowly Changing Dimensions, Surrogate Keys, Star Schema, Snowflake Schema, etc.
- Created ETL workflows to load SanDisk and HGST product data.
- Responsible for tuning ETL mappings and sessions via source qualifier/lookup overrides to optimize load and query performance.
- Used Explain Plan to find the bottlenecks in a given query and improve job performance.
- Defined parameters and variables to increase their usability in mappings.
- Created multiple partitions in sessions to allow the data to be loaded concurrently.
- Involved in unit and system testing to verify that the data extracted from different source systems was loaded into the targets accurately, per user requirements.
- Tracked issues that occurred during integration testing; fixes were then deployed to the relevant environments.
- Developed Unix Shell scripts as part of the ETL process to schedule tasks/sessions.
Environment: Informatica Power Center 9.6.1, Oracle Exadata, SQL Server, TOAD, JIRA, UNIX shell scripting, MS OFFICE, Putty, Winscp, DAC
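The file-watch polling described above follows a simple loop pattern: check a landing directory for a trigger file, sleep, retry until a timeout. A minimal local sketch (directory, trigger-file name, and intervals are illustrative; the trigger is pre-created so the sketch terminates immediately):

```shell
#!/bin/sh
# Hedged sketch of a file-wait step: poll a landing directory until a
# trigger file appears or the poll budget is exhausted.

LANDING=./landing
TRIGGER="$LANDING/extract_done.trg"   # hypothetical trigger-file name
MAX_POLLS=5
SLEEP_SECS=1

mkdir -p "$LANDING"
touch "$TRIGGER"        # pre-created here so the demo finds it at once

i=0
FOUND=no
while [ "$i" -lt "$MAX_POLLS" ]; do
    if [ -f "$TRIGGER" ]; then
        FOUND=yes
        break           # the downstream load would be started here
    fi
    sleep "$SLEEP_SECS"
    i=$((i + 1))
done

echo "trigger found: $FOUND"
```

A real watcher would exit non-zero on timeout so the scheduler can raise an alert instead of starting the load against an incomplete file.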
- Involved in gathering and analyzing the requirements and preparing business rules.
- Created various Informatica mappings to validate the transactional data against Business rules, extract look up values and enrich the data as per the mapping documents.
- Developed various Informatica Workflows to load the data from various upstream systems using different methodologies, i.e. trigger-based pull, direct pull, and file-based push.
- Designed the ETL architecture for the Deposits product to process large volumes of Deposits data on a daily basis.
- Developed various SQL queries using joins, sub-queries, and analytic functions to load the data into various relational DBs, i.e. Oracle and SQL Server.
- Created complex DataMart views for the corresponding products.
- Created various complex PL/SQL stored procedures to manipulate/reconcile the data and generate the dashboard reports.
- Performed Unit Testing & prepared the deployment plan for the various objects by analyzing the inter dependencies.
- Developed several UNIX shell scripts for file archival and compression.
- Created various Autosys jobs for the scheduling of the underlying ETL flows.
- Coordinated with various team members across the globe (application teams, business analysts, users, DBAs, and the infrastructure team) to resolve technical and functional issues in UAT and PROD.
- Created various technical documents required for the knowledge transition of the application, which includes re-usable objects (Informatica and UNIX).
- Worked on IDQ for data cleansing, data matching, data conversion and address standardization.
- Integrated workflow changes to both test and enable error handling using Informatica IDQ.
- Created Data objects, Quick Profiles, Custom Profiles and Drill Down on Profile Result using IDQ.
- Used Informatica Data Quality for cleansing and matching the customer data. Real-time address cleansing was achieved using Informatica Power Connect for Web Services
- Created reference tables from profile columns using IDQ.
Environment: Informatica Power Center 9.6.1, Web Services, IDQ, Oracle 11g, SQL Server 2012, MS Access 2010, SQL*Loader, UNIX, Winscp, Putty, Erwin 7.2, SQL, PL/SQL, JIRA, SVN
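The cleansing-and-matching idea behind the IDQ work above can be illustrated outside IDQ with a rule-based match key: trim, collapse whitespace, strip punctuation, and upper-case, so that superficially different customer strings compare equal. This shell sketch is an analogue of that standardization step, not IDQ itself; the rules and sample values are illustrative:

```shell
#!/bin/sh
# Hedged sketch of rule-based cleansing: build a normalized match key so
# near-duplicate customer records can be compared. Rules are illustrative.

normalize() {
    # squeeze internal whitespace, trim ends, drop punctuation, upper-case
    printf '%s' "$1" \
      | tr -s ' ' \
      | sed -e 's/^ *//' -e 's/ *$//' -e 's/[.,]//g' \
      | tr '[:lower:]' '[:upper:]'
}

A=$(normalize "  mr. john   smith ")
B=$(normalize "MR JOHN SMITH")

# Both strings reduce to the same key, so they would match.
[ "$A" = "$B" ] && echo "match" || echo "no match"
```

In IDQ the equivalent is a chain of standardizer and match transformations with reference tables; the point here is only the shape of the rule pipeline.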
Sr. ETL Lead Developer
Responsibilities:
- Involved in migration projects to migrate Workflows from Informatica 9.1 to 9.6.1
- Worked on BTEQ scripts to transform data, and used the Teradata utilities FastLoad, MultiLoad, and TPT.
- Wrote, tested, and implemented Teradata FastLoad, MultiLoad, and BTEQ scripts, DML, and DDL.
- Worked with DBAs to resolve database-related issues.
- Created SFTP Process to move files from Source Server to Informatica servers
- Analyzed existing workflows and made necessary changes while migrating workflows
- Deployed ETL Jobs to higher Environments
- Applied Error handling to the existing workflows as per the Teradata standards
- Provided quick production fixes and proactively involved in fixing production support issues.
- Used SQL Assistant to query Teradata tables.
- Designed complex mappings involving target load order.
- Worked with SET and MULTISET tables for performance evaluation of the scripts.
- Designed the technical specifications for Teradata ETL processing of data into the master data warehouse, and strategized the integration test plan and implementation.
- Designed the database tables and created table- and column-level constraints using the suggested naming conventions.
- Involved with all the phases of Software Development Life Cycle (SDLC) methodologies throughout the project life cycle.
- Worked with ESP Scheduling tool for managing the ETL Jobs
- Worked in Agile environment
- Worked on data warehouses ranging from 30 to 50 terabytes in size.
- Coordinated with the business analysts and developers to discuss issues in interpreting the requirements
- Managed offshore team, assigning and reviewing their work.
Environment: Informatica 9.1/9.6.1, CA ESP 11, UNIX, Teradata 15, Teradata SQL Assistant, FastLoad, BTEQ, MLOAD, ERWIN, Windows Server 2007, JIRA
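BTEQ load steps like those above are commonly driven from a shell wrapper that generates the script from a here-document and pipes it to the `bteq` utility. A hedged sketch follows; the database/table names, logon string, and file name are illustrative assumptions, and the script only generates the BTEQ file rather than running it (no Teradata system is assumed):

```shell
#!/bin/sh
# Hedged sketch of a shell-wrapped BTEQ step. Names are illustrative; the
# generated script would normally be fed to bteq.

BTQ=load_deposits.btq   # hypothetical generated BTEQ script name

cat > "$BTQ" <<'EOF'
.LOGON tdprod/etl_user,change_me;      -- placeholder tdpid/credentials
.SET ERROROUT STDOUT;

-- Incremental insert-select into the target; names are illustrative.
INSERT INTO dw.deposits_fact
SELECT * FROM stg.deposits_daily
WHERE load_dt = CURRENT_DATE;

.IF ERRORCODE <> 0 THEN .QUIT 8;       -- non-zero exit on SQL failure
.LOGOFF;
.QUIT 0;
EOF

# In production this would run:  bteq < "$BTQ"
echo "generated $(wc -l < "$BTQ") BTEQ lines"
```

The `.IF ERRORCODE <> 0 THEN .QUIT 8` check is what lets the surrounding scheduler (ESP here) distinguish a failed load from a clean one.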
Sr. ETL/Informatica Developer
Responsibilities:
- Designed the Data model and Load strategy to get data from different systems and use it for the Online Registration database.
- Extensive experience in ETL development and maintenance using Oracle SQL and PL/SQL; wrote a stored procedure to insert 10 years' worth of data into the date dimension table, and used the Informatica Stored Procedure transformation to update columns in the same table.
- Implemented Star Schema for De-normalizing data for faster data retrieval for Online Systems.
- Designed and Developed Mappings, sessions and workflows in Informatica.
- Extracted, transformed data from various sources such as Flat files, Oracle 11g and transferred data to the target data warehouse.
- Involved in installation and configuration of Informatica and DAC.
- Designed and Deployed UNIX Shell Scripts.
- Responsible for Pre and Post migration planning for optimizing Data load performance, capacity planning and user support.
- Involved in creating STAR Schema for OLAP cubes.
- Designed, developed and improved complex ETL structures to extract transform and load data from multiple data sources into data warehouse and other databases based on business requirements.
- Worked closely with QA team during the testing phase and fixed bugs that were reported.
- Worked with Data / Data Warehouse Architect on logical and physical model designs.
- Performed impact analysis for systems and database modifications.
- Involved in developing of stored procedure for Change Data Capture (CDC), Auditing etc.
- Monitored Incremental and Full load of Data through Data Warehouse Administration Console (DAC) and Informatica Workflow Monitor.
- Created complex mappings using Unconnected Lookup, Sorter, Aggregator, dynamic Lookup, and Router transformations to populate target tables efficiently.
- Maintained Development, Test, and Production mapping migrations using Repository Manager, which was also used to maintain metadata, security, and reporting.
Environment: Informatica Power Center 9.1/8.6.1, Oracle 11g/10g, Oracle EBS, OBIEE, SQL Server, Flat Files, TOAD, SQL Developer, Shell Scripting, Windows 2000, UNIX, Cognos.
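The Change Data Capture work above used stored procedures; the same idea can be shown at the file level by diffing sorted key extracts from consecutive runs. This shell sketch is an analogue of CDC, not the PL/SQL procedure itself; file names and sample keys are illustrative:

```shell
#!/bin/sh
# Hedged sketch of file-level CDC: compare today's sorted key extract with
# yesterday's to find inserted and deleted keys. Data is illustrative.

cat > prev_keys.txt <<'EOF'
1001
1002
1003
EOF
cat > curr_keys.txt <<'EOF'
1001
1003
1004
1005
EOF

# comm requires sorted input (both files are).
# -13: suppress lines unique to file1 and common lines -> new keys
# -23: suppress lines unique to file2 and common lines -> deleted keys
comm -13 prev_keys.txt curr_keys.txt > new_keys.txt
comm -23 prev_keys.txt curr_keys.txt > deleted_keys.txt

echo "new: $(wc -l < new_keys.txt) deleted: $(wc -l < deleted_keys.txt)"
```

In the database-side version, the same comparison is done with a MINUS/EXCEPT query or a timestamp/audit-column filter inside the stored procedure.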