
Etl Tech Lead/sr. Developer Resume


San Diego, CA

SUMMARY

  • 12+ years of ETL and data integration experience developing ETL mappings and scripts using Informatica PowerCenter 10.1.0/9.6.1/9.5.1/9.0/8.6.1/7.1.3/6.2 client tools (Designer, Workflow Manager, Workflow Monitor & Repository Manager). Good knowledge of Informatica DQ 9.6.1 and Informatica MDM 9.6.1.
  • 7+ years of Software Development Life Cycle (SDLC) experience as a consultant implementing Client/Server and Data Warehousing applications in major companies.
  • Experience in all aspects of the project development life cycle, including requirements gathering, mapping documentation, design, development, certification, production implementation and support.
  • Extensive experience with Data Extraction, Transformation, and Loading (ETL) from various data sources such as Oracle, DB2, SQL Server, XML files, flat files and VSAM files into target (Warehouse/Data Mart) databases such as Teradata and Oracle.
  • Experience in performance tuning and debugging of existing ETL processes (sources, targets, mappings and sessions) to overcome bottlenecks in mappings.
  • Experienced in SCD Types 1, 2 & 3, Star schema and Snowflake schema with the logical and physical data modeling design process, normalization and data cleansing.
  • Sound knowledge of RDBMS concepts, with hands-on development exposure using SQL, PL/SQL (stored procedures, functions and triggers), T-SQL, BTEQ and data loaders (SQL*Loader, TPump and MLoad). Strong experience in Unix/Linux shell scripting.
  • Excellent analytical and logical programming skills, strong conceptual understanding, and good presentation and interpersonal skills, with a strong desire to achieve specified goals.

TECHNICAL SKILLS

ETL: Informatica PowerCenter 10.1.0/9.6.1, Informatica DQ 9.6.1 & SSIS 2015

Big Data: Hadoop (HDFS), Hive, Pig & Sqoop

OLAP: Cognos 10.1, Tableau 12.2, SAP BO 14.2 & SSRS 2015

Data Modeling: Erwin 7.0.

Database: Oracle 12c/11g/10g/9i/8i, Teradata 16.0, DB2, SQL Server 2014, MS Access.

Programming languages: C, C++, Java, VB, Unix Shell Scripting, HTML, DHTML, XML, SQL, BTEQ, T-SQL, SQL*Plus, PL/SQL and TOAD.

Environment: UNIX, Linux, Sun Solaris, HP-UX B.11.11 and Windows 2000/XP

PROFESSIONAL EXPERIENCE

Confidential - San Diego, CA

ETL Tech Lead/Sr. Developer

Responsibilities:

  • Analyzed and estimated work effort for development and enhancement requests; identified and captured project requirements and participated in producing work estimates. Maintained a productive relationship with business sponsors and end users.
  • Provided high-quality technology solutions to address business needs by developing applications within mature technology environments. Adhered to coding standards, procedures and techniques, contributed to the technical code documentation, and managed the end-to-end delivery of data warehousing solutions.
  • Assigned functional specifications to each developer, monitored development status against the project timeframe, and provided solutions and simpler approaches for complicated tasks.
  • Mapped source data from the ERP systems (Baan 5C and Baan LN) and other applications into staging, and from staging into the final warehouse models (data marts), through the ETL process.
  • Used all major transformations in these mappings: Normalizer, Lookup (connected and unconnected), Update Strategy, Stored Procedure, XML (Source Qualifier, Parser, Generator), Unstructured Data, Data Masking, Java, SQL, Router, Filter, Joiner, Rank, Source Qualifier and Union, plus reusable transformations (Lookup, Expression), Mapplets and shortcut mappings, along with session-level Worklets, Workflows and major tasks.
  • Implemented SCD Type 1 & 2 methods and fact/dimension modeling concepts in the major mappings. Used four common shortcut mappings to generate a new process key for every insert and update on the target tables, created parameter files to track which tables were loaded, and monitored the workflows for success or failure.
  • Developed pre- and post-session SQL scripts to create/drop indexes, wrote PL/SQL procedures and functions for the Stored Procedure transformation, and analyzed database objects. Also used shell scripts to purge flat files, reject files and session logs.
  • Using PL/SQL (conditionals, loops, implicit and explicit cursors, functions, procedures, packages and exception handling), implemented a new dynamic purge process for the data warehouse tables (Oracle) spanning 25 branches, where any number of tables per branch can be purged according to the configured time period.
  • Performed unit (Dev) testing, certification (CTE) testing, integration (ITE) testing, end-to-end testing with other systems (STE) and user acceptance testing.
  • Developed shell scripts for checking file sizes, splitting, merging and manipulating files, and generating dynamic SQL (DDL, DML, TCL, DCL); used SFTP, SCP and WinSCP for file transfers between servers.
  • Implemented the Change Data Capture (CDC) process: a control table stores the latest timestamp value for every physical source table; the CDC view compares against this value to capture only newly created or modified records, and a common procedure updates the CDC entries with the latest value after the corresponding EDW table is loaded.
  • Provided deployment documentation for all developed ETL processes, including process flow and source-to-target mappings. Scheduled jobs in Workflow Manager and the AutoSys job controller.
  • Worked as part of the production support team and was involved in object migrations and workflow scheduling. Optimized/tuned pushdown optimization, pipeline partitions, dynamic partitions, concurrent workflows, grid deployments and workflow load balancing across various existing Informatica workflows and sessions.
  • Worked with database administrators to determine indexes, statistics, and partitioning to add to tables in the data warehouse, in order to improve performance for end-users.
  • Worked with the Business Intelligence Operations team to ensure control objectives were implemented for all data-movement processes, implemented change control for production data, and established and followed proper incident management procedures.
  • Ensured compliance with all applicable data privacy regulations and policies as they relate to both firm and client/contact data.
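
The file-handling steps above (size checks, split and merge) can be sketched as a small shell routine. The working directory, file names and chunk size here are illustrative assumptions, not the production values.

```shell
#!/bin/sh
# Sketch of the file split/merge handling described above.
# /tmp/etl_split_demo and the file names are hypothetical.
WORK=/tmp/etl_split_demo
mkdir -p "$WORK" && cd "$WORK" || exit 1
# Build a sample 10-row extract file.
i=1; : > extract.dat
while [ "$i" -le 10 ]; do echo "row_$i" >> extract.dat; i=$((i + 1)); done
# File-size check: refuse to process an empty extract.
[ -s extract.dat ] || exit 1
# Split into 4-line chunks (chunk_aa, chunk_ab, chunk_ac), then merge back.
split -l 4 extract.dat chunk_
cat chunk_* > merged.dat
# The merge must reproduce the original byte-for-byte.
cmp -s extract.dat merged.dat && echo "split/merge OK"
```

In production the merged file would then be handed to SFTP/SCP for transfer between servers.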

Environment: Informatica Power Center 9.6.1, Oracle 11g and 12c, Teradata 16.0, Cognos 10.1 and Tableau 12.2.0 for BI, ERWIN 7.0, SQL*Loader, SQL, PL/SQL, TOAD for Oracle, Linux and Windows servers.

Confidential - Roseville, CA.

ETL Sr. Developer

Responsibilities:

  • Worked with the business team to formulate success metrics and a measurement plan for front-store projects, socialized them, and created dashboards/reports to monitor them. Worked with senior leadership and product managers to deliver key online and offline metrics and to support the process of building business cases and estimating expected project benefits.
  • Built Informatica mappings based on the ETL logic previously implemented in SQL*Loader, SQL and PL/SQL scripts in Oracle. Developed Perl scripts to split and manipulate flat files and Unix shell scripts to archive loaded flat files.
  • Used all major transformations in these mappings: Normalizer, Lookup (connected and unconnected), Update Strategy, Stored Procedure, SQL, Router, Filter, Joiner, Rank, Source Qualifier and Union, plus reusable transformations (Lookup, Expression), Mapplets and shortcut mappings, along with session-level Worklets, Workflows and major tasks.
  • Implemented SCD Type 1 & 2 methods and fact/dimension modeling concepts in the major mappings. Used four common shortcut mappings to generate a new process key for every insert and update on the target tables, created parameter files to track which tables were loaded, and monitored the workflows for success or failure.
  • Re-created Oracle-format date functions (TO_DATE, TO_CHAR, ADD_MONTHS) in T-SQL and eliminated most of the view tables using SQL overrides.
  • Implemented and tested new or modified code and mappings, corrected mapping errors, and troubleshot issues as they arose. Performed unit, end-to-end, integration and QA testing, comparing and matching data from Oracle database tables against data from SQL Server tables.
  • Created 125 Informatica mappings and SSIS packages for one-time sync-up data loads from Oracle to SQL Server, out of 650 tables. These mappings implemented insert/update and truncate-and-load patterns in the target tables. Used major tasks such as Data Flow, Execute SQL, File System Task, Script Task and Foreach Loop Container, and transformations such as Sort, Aggregate, Data Conversion, Lookup, Cache Transform, Pivot, Unpivot, Merge, Merge Join, Union All and Multicast in the SSIS packages.
  • Copied Informatica mappings folder by folder from the original project folders, applied code modifications for each project, and changed all connectivity from Oracle to SQL Server at both the session level and the mapping level.
  • Before modifying each project folder, obtained details of the tables to be created for new data modules in the SQL Server database and listed the table names (with schema) not yet created there, covering the source, target and lookup tables used in SQL overrides, stored procedures and functions.
  • In each mapping, converted SQL statements from Oracle to T-SQL syntax: in Source Qualifier and Lookup SQL overrides at the mapping level, and in pre- and post-session SQL at the session level.
  • Developed 40 stored procedures and functions in T-SQL, using multiple cursors, temp tables and looping constructs modeled on the Oracle (SQL and PL/SQL) originals, and implemented OPENQUERY for remote database access via linked servers.
  • Scheduled jobs using the Control-M job scheduler for Informatica batches (workflows with worklets) and crontab for shell scripts on Unix.
  • Monitored, maintained and optimized sessions and ETL processes, removing bottlenecks at the source, target, transformation, mapping, session and system levels.
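
The archive step for loaded flat files might look like the following sketch; the directory layout, `*.dat` pattern and date-stamp convention are assumptions for illustration.

```shell
#!/bin/sh
# Sketch: move loaded flat files into a date-stamped archive and compress.
# /tmp/etl_archive_demo and the file names are hypothetical.
BASE=/tmp/etl_archive_demo
LOADED="$BASE/loaded"
ARCHIVE="$BASE/archive/$(date +%Y%m%d)"
mkdir -p "$LOADED" "$ARCHIVE"
echo "sample data" > "$LOADED/customers.dat"
# Move each loaded file into today's archive folder and gzip it.
for f in "$LOADED"/*.dat; do
    [ -f "$f" ] || continue
    mv "$f" "$ARCHIVE/"
    gzip -f "$ARCHIVE/$(basename "$f")"
done
```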

Environment: Informatica PowerCenter 9.6.0, Oracle 10g and 11g, SQL Server 2008 R2, Business Objects XI for BI, ERWIN 7.0, SQL*Loader, SQL*Plus, SQL, PL/SQL, TOAD for Oracle 9.7, TOAD for SQL Server 5.0, HP-UX B.11.11 and Windows servers.

Confidential - St. Louis, Missouri.

ETL Developer

Responsibilities:

  • Analyzed the Functional Design Requirements (FDR) provided by the data analysts, created Technical Design Documents (TDD) for all the Informatica mappings, and was involved in further requirements gathering.
  • Analyzed the data requirements of multiple projects and the various data sources within and outside the client, and developed the overall technical design to give the target system efficient and timely access to the required data.
  • All data files were delivered to the main server (drop box) via NDM (Network Data Mover); CFD (Central File Distribution) then distributed the files to the appropriate source domain.
  • Designed and developed Informatica mappings to load data from the source systems to the ODS database and then into the Brokerage Data Warehouse (Teradata).
  • Implemented Slowly Changing Dimensions Type I, II & III in different mappings as per the requirements, working extensively with Star and Snowflake schemas.
  • Developed Informatica mappings with complex business logic in the Transformations, reusable transformations and reusable Mapplets.
  • Extensively used the transformations such as Expression, Sorter, Joiner, Filters, Routers, Sequence Generator, Update Strategy, Rank, Lookup (both connected and unconnected), Stored Procedure and Normalizer.
  • Performed unit (Dev), certification (CTE), integration (ITE), end-to-end with other systems (STE) and user acceptance testing.
  • Designed data feeds to load Teradata tables from Oracle and DB2 tables or from flat files, using Fastload, Multiload and Tpump.
  • Tuned sessions to improve performance by eliminating source, target, mapping and session bottlenecks, and used the Macro and EXPLAIN facilities to tune existing Teradata processes.
  • Worked with database administrators to determine indexes, statistics, and partitioning to add to tables in the data warehouse, in order to improve performance for end-users.
  • Developed SQL and PL/SQL (functions, procedures and packages) scripts using TOAD for Oracle and DB2, and used Informatica mappings for data import/export.
  • Created various Unix shell scripts for the loading processes: creating/dropping work tables and indexes, archiving the original files, data cleansing (deleting old flat files, reject files and session logs), and monitoring system availability and job status with pager alerts; these scripts were scheduled as AutoSys jobs.
  • Worked with Business Objects developers to ensure that generated SQL was optimized for Teradata, and gained good exposure to Business Objects.
  • Harvested all the code (shell scripts, SQL scripts, mapping objects and session objects) for deployment to production.
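
The data-cleansing purge described above (deleting old flat files, reject files and session logs) reduces to a `find` sweep over a retention window; the directory and the 7-day window below are illustrative assumptions.

```shell
#!/bin/sh
# Sketch of the purge step: remove *.log and *.bad files older than 7 days.
# /tmp/etl_purge_demo and the file names are hypothetical.
DIR=/tmp/etl_purge_demo
mkdir -p "$DIR"
touch "$DIR/old_run.log" "$DIR/current.bad"
# Back-date one file so it falls outside the retention window.
touch -t 202001010000 "$DIR/old_run.log"
# Delete matching files not modified within the last 7 days.
find "$DIR" -type f \( -name '*.log' -o -name '*.bad' \) -mtime +7 -exec rm -f {} \;
```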

Environment: Informatica PowerCenter 8.6.1, Oracle 10g and 11g, Teradata V2R12, DB2, SQL Server, Netezza, ERWIN 7.0, Business Objects XI for BI, SQL*Loader, SQL*Plus, SQL, PL/SQL, BTEQ, SunOS 5.10, AIX 5.3 and Windows servers.

Confidential, Austin, Texas.

ETL Developer.

Responsibilities:

  • As a developer, interacted with all the business stakeholders to gather business requirements and developed the design strategies required for loading data from various sources into the Corporate Data Warehouse running on Teradata. The Teradata system runs on an Active-Active cluster, and data loads are performed to both nodes in the cluster.
  • The data loads were performed in multiple stages, in compliance with the Corporate Policies, and data audit / archives were performed during the ETL process. Various load strategies were tested to ensure optimal performance of data loads considering the size of each load.
  • Developed several mappings according to the design documents / specs provided and performed unit tests by documenting the test plans and their results.
  • Created mappings for Slowly Changing Dimensions in the SFDC project for implementing date based and flag based versioning logic.
  • Used various transformations like Source Qualifier, Lookup, SQL Transformation, Router, Filter, Update Strategy, Expression, Aggregator and Normalizer etc.
  • Created various tasks like sessions, worklets, and workflows in the workflow manager to test the mapping during development.
  • Created event-based loading of data into Teradata using Unix shell scripts for the SAS CI project. Scripts polled for the arrival of a flag file indicating the availability of source flat files for various mappings and sessions; once the flag file was available, the shell scripts executed Teradata MultiLoad to load data from the flat files into the Teradata database.
  • Developed Informatica mappings to load data from 20 tables to Rollup objects in Teradata for further reporting using Business Objects. Created error log files to capture error messages, number of records processed and session load time etc., for quality checking.
  • Performed Unit, Integration (end-to-end with other systems) and User acceptance testing.
  • Tuned sessions to improve the performance by eliminating Source, Target, Mapping and Session bottlenecks.
  • Developed pre- and post-session SQL scripts to create/drop indexes, wrote PL/SQL procedures and functions for the Stored Procedure transformation, and analyzed database objects. Also used shell scripts to purge flat files, reject files and session logs.
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
  • Worked as part of the Production support team and involved in object migrations, and Workflow scheduling. Optimized/Tuned various Informatica sessions that were previously being used.
  • Worked with database administrators to determine indexes, statistics, and partitioning to add to tables in the data warehouse, in order to improve performance for end-users.
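
The flag-file polling pattern above can be sketched as follows. The paths, the 5-attempt limit and the stubbed load step are illustrative assumptions; in production the load step would invoke Teradata MultiLoad.

```shell
#!/bin/sh
# Sketch: poll for an arrival flag, then hand off to the loader.
# /tmp/etl_flag_demo and the file names are hypothetical.
DIR=/tmp/etl_flag_demo
mkdir -p "$DIR"
touch "$DIR/source.done"          # simulate the upstream flag arriving
attempt=0
while [ "$attempt" -lt 5 ]; do
    if [ -f "$DIR/source.done" ]; then
        # Stub for the real Teradata MultiLoad invocation.
        echo "flag found, starting load" > "$DIR/load.log"
        break
    fi
    attempt=$((attempt + 1))
    sleep 1                       # poll interval
done
```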

Environment: Informatica PowerCenter 8.6.1, Oracle 10g and 11g, Teradata V2R12, SQL Server, ERWIN 7.0, Business Objects XI for BI, SQL*Loader, SQL*Plus, SQL, PL/SQL, BTEQ, SunOS 5.10 and Windows servers.

Confidential - San Antonio, Texas

ETL Developer in Informatica

Responsibilities:

  • Involved in the initial requirements gathering.
  • Analyzed the functional specs provided by the data architect and created technical specs documents for all the mappings.
  • Designed and developed Informatica Mappings to load data from Source systems to ODS and then to Data Warehouse.
  • Developed Informatica mappings with complex business logic in the Transformations, reusable transformations and reusable Mapplets.
  • Extensively used the transformations such as Expression, Sorter, Joiner, Filters, Routers, Sequence Generator, Update Strategy, Rank and Lookup (both connected and unconnected).
  • Used Aggregator transformation to aggregate the daily sales data to weekly and monthly.
  • Troubleshot and fixed invalid mappings, stored procedures and functions.
  • Performed Unit, Integration (end-to-end with other systems) and User acceptance testing.
  • Created partitions through the session wizard in the Workflow Manager to increase the performance.
  • Tuned sessions to improve the performance by eliminating Source, Target, Mapping and Session bottlenecks.
  • Worked with database administrators to determine indexes, statistics, and partitioning to add to tables in the data warehouse, in order to improve performance for end-users.
  • Designed data feeds to load Teradata tables from Oracle tables or from flat files, using Fastload, Multiload, and SQLPlus.
  • Implemented Slowly Changing Dimensions - Type I, II & III in different mappings as per the requirements.
  • Used Macro and Explain facility to tune existing Teradata processes to optimize for performance.
  • Used Erwin to create logical and physical data models that capture current/future state data elements, and additional functionalities like Forward and Reverse Engineering.
  • Worked extensively with Star and Snowflake schemas and defined corporate naming-convention standards.
  • Developed Oracle SQL and PL/SQL scripts using TOAD and used Informatica mappings for data import/export.
  • Developed pre/post-session shell commands to delete old flat files, reject files and session logs and drop/recreate indexes.
  • Created various Unix shell scripts for scheduling data cleansing scripts and loading processes, including creating/dropping the work tables used by the scheduled jobs.
  • Used Unix Shell scripts to monitor systems availability, job status and send alerts to pager.
  • Worked with Business Objects developers to ensure that generated SQL was optimized for Teradata.
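
The system-availability/job-status monitoring bullet above amounts to scanning a job log for a failure marker and raising an alert. In this sketch the log path, the "FAILED" marker and the alert file stand in for the real pager call (e.g. `mailx`); all are illustrative assumptions.

```shell
#!/bin/sh
# Sketch of the job-status monitor with alert.
# /tmp/etl_monitor_demo and the log contents are hypothetical.
DIR=/tmp/etl_monitor_demo
mkdir -p "$DIR"
printf 'wf_daily_load: Execution FAILED\n' > "$DIR/job.log"   # sample log
# If the failure marker is present, record an alert (production: page on-call).
if grep -q 'FAILED' "$DIR/job.log"; then
    echo "ALERT: wf_daily_load failed" > "$DIR/alert.txt"
fi
```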

Environment: Informatica Power Center 7.1.3, Oracle 10g/9i, Teradata V2R6, DB2, SQL, PL/SQL, ERWIN 4.0, BTEQ, TOAD, SunOS, AIX and Windows Servers.

Confidential

ETL Developer in Informatica

Responsibilities:

  • Collaborated with Business analysts and the DBA for requirements gathering, business analysis and designing of the Data Warehouse.
  • Developed and documented Data Mappings/Transformations, and Informatica sessions as per the business requirements.
  • Designed and developed Informatica Mappings to load data from Source systems to ODS and then to Data Warehouse.
  • Worked on Informatica tools Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
  • Worked with PowerCenter Designer tools to develop mappings and Mapplets that extract product, customer and sales data from the sources and load it into the Data Warehouse; extensively used transformations such as Source Qualifier, Joiner, Update Strategy, Lookup, Rank, Expression, Aggregator and Sequence Generator.
  • Wrote PL/SQL stored procedures and functions for the Stored Procedure transformation.
  • Created partitions through the session wizard in the Workflow Manager to increase the performance. Performed Unit and Integration testing.
  • Tuned sessions to improve the performance by eliminating Source, Target, Mapping and Session bottlenecks.
  • Worked on dimensional modeling to Design and develop STAR Schemas, identifying Fact and Dimension Tables for providing a unified view to ensure consistent decision making.
  • Implemented Slowly Changing Dimensions - Type I & III in different mappings as per the requirements.
  • Identified all the conformed dimensions to be included in the target warehouse design and confirmed the granularity of the facts in the fact tables.
  • Used Debugger wizard to remove bottlenecks at source level, transformation level, and target Level for the optimum usage of sources, transformations and target loads.
  • Worked with heterogeneous sources from various channels like Oracle and flat files.
  • Developed UNIX Shell scripts for data extraction, running the pre/post Session processes.
  • Scheduled sessions and batches on Informatica server using Informatica Workflow manager.
  • Implemented performance tuning logic on Targets, Sources, mappings, sessions to provide maximum efficiency and performance.
  • Made extensive use of UNIX Shell scripting, maintaining user profiles, archiving log files etc.
  • Analyzed Session Log files in case the session failed to resolve errors in mapping or session configurations.
  • Provided technical assistance by responding to inquiries regarding errors, problems, or questions with programs/interfaces.
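
Analyzing a session log for failures, as described above, can be reduced to counting error lines; the log file and its line format here are illustrative assumptions rather than actual Informatica log output.

```shell
#!/bin/sh
# Sketch of session-log analysis: count ERROR lines to decide whether a
# failed run needs mapping or session-configuration fixes.
# /tmp/sess_log_demo and the log contents are hypothetical.
DIR=/tmp/sess_log_demo
mkdir -p "$DIR"
cat > "$DIR/s_m_load.log" <<'EOF'
INFO : Start loading table CUSTOMER_DIM
ERROR : Writer run terminated: abort
INFO : Session run completed
EOF
# A non-zero count would trigger investigation of the mapping/session.
ERRS=$(grep -c '^ERROR' "$DIR/s_m_load.log")
echo "error lines: $ERRS"
```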

Environment: Informatica PowerCenter 6.2, Business Objects 5.1.1, Oracle 9i, SQL, PL/SQL, ERWIN 3.5, Unix and Windows Server.

Confidential

Software Engineer

Responsibilities:

  • Analyzed requirements and designed the database accordingly.
  • Carried out design, development, module testing and implementation.

Environment: Visual Basic 6.0, Oracle 8i / Windows 2000.

Confidential

Responsibilities:

  • Designed the database according to the business requirements.
  • Carried out design, development, module testing and implementation.

Environment: Visual Basic 6.0, Oracle 8i / Windows NT.
