
Senior ETL Lead Developer Resume


PROFESSIONAL SUMMARY:

  • Over 10 years of professional IT experience in business analysis, design, data modeling, development, and implementation of various client/server and decision support system environments, with a focus on Data Warehousing, Business Intelligence, and Database Applications.
  • Over 10 years of Ab Initio consulting covering data mapping, transformation, and loading from source to target databases; well versed in Ab Initio parallelism techniques, having implemented graphs using data, component, and pipeline parallelism and Multi File System (MFS) techniques in complex, high-volume data warehousing projects on both UNIX and Windows.
  • Extensive experience in Korn shell scripting to maximize Ab Initio data parallelism and Multi File System (MFS) techniques.
  • Experience providing production support for various Ab Initio ETL jobs and developing UNIX shell wrappers to run Ab Initio and database jobs (a minimal wrapper sketch follows this list); good experience working with very large databases and performance tuning.
  • Good experience with Ab Initio Metadata Explorer, where the major tasks were creating feed files and importing a database's catalog, roles/privileges, Erwin logical model, and UDP relations.
  • Good experience working with various heterogeneous source systems such as Oracle, DB2 UDB, Teradata, MS SQL Server, flat files, and legacy systems.
  • Experience in DBMS Utilities such as SQL, PL/SQL, TOAD, SQL*Loader, Teradata SQL Assistant.
  • Experienced with Teradata utilities: FastLoad, MultiLoad, BTEQ scripting, FastExport, OleLoad, and SQL Assistant.
  • Exploit the OLAP analytical power of Teradata using functions such as RANK, QUANTILE, CSUM, and MSUM, along with GROUP BY GROUPING SETS, to generate detailed reports for marketing teams (illustrated in the sketch after this list).
  • Worked with transform components such as Aggregate, Dedup Sorted, Filter by Expression, Join, Normalize, Reformat, Rollup, and Scan; created the appropriate XFRs and DMLs; automated load processes using Autosys.
  • Extensively worked on several Ab Initio ETL assignments to extract, transform, and load data into tables as part of data warehouse development with highly complex data models of relational, star, and snowflake schemas.
  • Experienced in all phases of Software Development Life Cycle (SDLC).
  • Expert knowledge of Ab Initio components such as Join, Reformat, Scan, Rollup, Normalize, Denormalize, and the partition and departition components.
  • Experience in Data Modeling, Data Extraction, Data Migration, Data Integration, Data Testing and Data Warehousing using Ab Initio.
  • Extensively worked with GDE, EME, Co>Operating System, and Conduct>It (Plans).
  • Configured Ab Initio environment to connect to different databases using DB config, Input Table, Output Table, Update table Components.
  • Experience in using EME for version controls, impact analysis and dependency analysis.
  • Able to interact effectively with members of business engineering, quality assurance, user, and other teams involved in the system development life cycle.
  • Expertise in preparing code documentation in support of application development, including high-level and detailed design documents, unit test specifications, interface specifications, etc.
  • Excellent communication skills, interacting with people at all levels across projects and playing an active role in business analysis.
  • Managed multiple projects/tasks in the mortgage, banking and financial, and pharmaceutical industries in high-transaction processing environments, with excellent analytical, business process, and written and verbal communication skills.
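
As a minimal illustration of the shell wrappers and Teradata OLAP reporting described above, the following Korn shell sketch runs a deployed Ab Initio graph and then a BTEQ report step. The graph name, logon string, and table and column names are hypothetical placeholders, not actual project values.

#!/bin/ksh
# Hedged sketch: wrapper that runs a deployed Ab Initio graph, then a
# BTEQ report using Teradata OLAP functions (CSUM, RANK). All names
# below (paths, logon, table, columns) are illustrative assumptions.

GRAPH_DIR=/apps/etl/run                         # assumed run directory
LOG=/apps/etl/log/daily_load_$(date +%Y%m%d).log

# A deployed Ab Initio graph is an ordinary executable ksh script.
"$GRAPH_DIR/daily_sales_load.ksh" >> "$LOG" 2>&1 || {
    print "Graph failed; see $LOG" >&2
    exit 1
}

# Detail report for marketing: running total plus a top-10 rank.
bteq >> "$LOG" 2>&1 <<'EOF'
.LOGON tdprod/etl_user,etl_password
SELECT sale_date,
       sales_amt,
       CSUM(sales_amt, sale_date) AS running_total,
       RANK(sales_amt DESC)       AS sales_rank
FROM   mktg.daily_sales
QUALIFY RANK(sales_amt DESC) <= 10;
.LOGOFF
.QUIT
EOF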

TECHNICAL SKILLS:

Data Warehousing Tools: Ab Initio (GDE 3.02, 1.13/14/15/16, Co>Operating System 3.02, 2.15/2.14/2.13/2.12/2.11), Ab Initio Metadata Explorer, Informatica 6.1/7.1x, DataStage 7.5/8.0.1

Data Modeling: Star Schema Modeling, Snowflake Modeling, Erwin 4.0, Visio

RDBMS: Oracle 10g/9i/8i, Teradata 13.0/V2R6/4.6.2, DB2, MS SQL Server 2000/2005/2008

Programming: UNIX Shell Scripting, C/C++, Java, Korn Shell, T-SQL, SQL*Plus, PL/SQL, HTML, ASP.NET

Operating Systems: Windows NT/XP/2000, UNIX, Linux (Red Hat)

BI tools: Oracle Enterprise BI Server, Cognos 8.x, Crystal Reports 8.0/8.5

PROFESSIONAL EXPERIENCE:

Confidential

Senior ETL Lead Developer

Responsibilities:

  • Created Ab Initio graphs that transfer data from various sources like Oracle, flat files and CSV files to the Teradata database and flat files.
  • Worked on healthcare projects dealing with Medicaid claims, HIPAA, etc.
  • Worked on Multi file systems with extensive parallel processing.
  • Implemented lookups instead of joins and in-memory sorts to minimize execution times when dealing with huge volumes of data.
  • Extensively used partition components (Broadcast, Partition by Key, Partition by Range, Partition by Round-robin) and departition components (Concatenate, Gather, and Merge) in Ab Initio.
  • Implemented Transform Components such as Aggregate, Dedup Sorted, Filter by Expression, Join, Normalize, Reformat, Rollup and Scan Components and created appropriate XFRs and DMLs.
  • Used Lookup Transformation in validating the warehouse customer data.
  • Performed bulk data loads from multiple data sources (Oracle 10g, legacy systems) to the Teradata RDBMS.
  • Used the Advanced Query Tool (AQT) front end to issue SQL commands against the Teradata RDBMS per business requirements.
  • Coded and unit tested Ab Initio graphs to extract data from Oracle tables and MVS files.
  • Worked on profiling operational data using the Ab Initio Data Profiler/SQL tool to gain a better understanding of the data for analytical use by business analysts.
  • Extensively used UNIX shell scripting to write SQL execution scripts in the data loading process.
  • Produced the mapping document and ETL design document; also used OpenText Exstream software to optimize customer engagement solutions and produce compliant products.
  • Worked closely with the end users in writing the functional specifications based on the business needs.
  • Participated in project review meetings; thorough knowledge of programming concepts, design, procedures, and practices.
  • Expert in Ab Initio architecture and the development of mid- to large-scale IT projects, with hands-on experience in relational database management systems (Oracle, Teradata, SQL Server, DB2, etc.).
  • Adaptable and willing to learn advanced problem-solving skills; worked collaboratively with other departments to resolve complex issues with innovative solutions.
  • Used phases and checkpoints to avoid deadlocks, used multifiles in graphs, and used the Run Program and Run SQL components to run UNIX and SQL commands.
  • Provided application requirements gathering, designing, development, technical documentation, and debugging. Assisted team members in defining Cleansing, Aggregating, and other Transformation rules.
  • Extensively worked with PL/SQL Packages, Stored procedures & functions and created triggers to implement business rules and validations.
  • Responsible for Performance-tuning of Ab Initio graphs.
  • Collected and analyzed the user requirements and the existing application and designed logical and physical data models.
  • Ran scripts through UNIX shell scripts under batch scheduling.
  • Responsible for preparing interface specifications and complete documentation of graphs and their components.
  • Created XML files as an end product as per business requirement.
  • Responsible for testing the graph (Unit testing) for Data validations and preparing the test reports.
  • Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
  • Wrote MapReduce code to process and parse data from various sources, storing the parsed data in HBase and Hive using HBase-Hive integration.
  • Worked with HBase and Hive scripts to extract, transform, and load data into HBase and Hive.
  • Built reusable Hive UDF libraries for business requirements, enabling users to apply these UDFs in Hive queries, and performed data analysis on big data (a minimal sketch follows this list).
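
A minimal sketch of the Hadoop and Hive work described above: submitting a Java MapReduce parsing job, then registering a custom Hive UDF and querying a partitioned table from the shell. The jar names, class names, paths, and schema are hypothetical assumptions, not the project's actual artifacts.

#!/bin/ksh
# Hedged sketch: run a MapReduce parse job, then use a Hive UDF on a
# date-partitioned table. Jar/class/table names are illustrative.

# Submit the Java MapReduce job that parses raw logs into HDFS.
hadoop jar /apps/etl/lib/logparser.jar com.example.LogParseDriver \
    /data/raw/weblogs /data/parsed/weblogs || exit 1

# Register the UDF jar and query a partitioned Hive table.
hive <<'EOF'
ADD JAR /apps/etl/lib/claims-udfs.jar;
CREATE TEMPORARY FUNCTION clean_npi AS 'com.example.udf.CleanNpi';

SELECT clean_npi(provider_npi) AS npi,
       COUNT(*)                AS claim_cnt
FROM   claims.medicaid_claims
WHERE  load_dt = '2015-06-01'        -- partition column
GROUP  BY clean_npi(provider_npi);
EOF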

Environment: Ab Initio (Co>Operating System 3.0.2/2.15/2.14, GDE 3.0.2/1.16/1.15/1.14), Erwin 4.0, UNIX, MVS, SQL, PL/SQL, Oracle 10g, Teradata 13.0, Teradata V2R6, DB2, COBOL, Perl, Autosys

Confidential

Senior Technical Lead Developer / Ab Initio

Responsibilities:

  • Created the DB Catalog, Roles Privs, Erwin Logical, and Erwin UDP feed files to import a database's catalog, roles/privileges, logical model, and UDP relations.
  • Imported the DB catalog, roles, Erwin logical model, and UDPs of various databases into the Metadata Repository.
  • Performed EME dataset, graph, and table imports and lineage/dependency analysis of various databases in the Metadata Repository.
  • Validated the change sets of the different imports in the Metadata Repository and approved them so the business could access the data; also used OpenText Exstream software to optimize customer products.
  • Created Ab Initio graphs that transfer data from various sources like Oracle, flat files and CSV files to the Teradata database and flat files.
  • Derived and modeled the facts, dimensions, and aggregated facts in Ab Initio from the data warehouse star schema to create billing and contracts reports.
  • Worked on Multi file systems with extensive parallel processing.
  • Automated load processes using Autosys.
  • Used Lookup Transformation in validating the warehouse customer data.
  • Prepared logical/physical diagrams of the data warehouse and presented them to business leaders; used Erwin for model design.
  • Performed bulk data loads from multiple data sources (Oracle 8i, legacy systems) to the Teradata RDBMS.
  • Used BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands against the Teradata RDBMS per business requirements.
  • Coded and tested Ab Initio graphs to extract the data from Oracle tables and MVS files.
  • Worked on profiling operational data using the Ab Initio Data Profiler/SQL tool to gain a better understanding of the data for analytical use by business analysts.
  • Extensively used UNIX shell scripting to write SQL execution scripts in the data loading process.
  • Produced mapping document and ETL design document.
  • Worked closely with the end users in writing the functional specifications based on the business needs.
  • Participated in project review meetings.
  • Extensively worked with PL/SQL packages, stored procedures, and functions, and created triggers to implement business rules and validations (a minimal sketch follows this list).
  • Responsible for Performance-tuning of Ab Initio graphs.
  • Collected and analyzed the user requirements and the existing application and designed logical and physical data models.
  • Worked on EME environment.
  • Ran scripts through UNIX shell scripts under batch scheduling.
  • Responsible for preparing interface specifications and complete documentation of graphs and their components.
  • Responsible for testing the graph (Unit testing) for Data validations and preparing the test reports.
  • Implemented Business Objects security features (row-level, object-level, and report-level) to keep the data secure.
  • Worked on the ingestion of web log data into the Hadoop platform and on extensive data integration using big data.
  • Created processes for web log data enrichment, page fixing, sessionization, and session flagging.
  • Responsible for creating Hive tables and partitions, loading data, and writing Hive queries.
  • Migrated existing inbound processes from the legacy system to Hadoop.
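
As a minimal illustration of the PL/SQL trigger work noted above, the sketch below deploys a validation trigger through SQL*Plus. The connection string, schema, table, and business rule are hypothetical assumptions.

#!/bin/ksh
# Hedged sketch: deploy a row-level validation trigger via SQL*Plus.
# Logon, schema, and rule are illustrative placeholders.

sqlplus -s etl_user/etl_password@ORAPROD <<'EOF'
WHENEVER SQLERROR EXIT FAILURE

CREATE OR REPLACE TRIGGER trg_contract_validate
BEFORE INSERT OR UPDATE ON dw.contract
FOR EACH ROW
BEGIN
  -- Business rule: a contract cannot end before it starts.
  IF :NEW.end_dt < :NEW.start_dt THEN
    RAISE_APPLICATION_ERROR(-20001,
      'Contract end date precedes start date');
  END IF;
END;
/
EXIT
EOF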

Environment: Ab Initio (Co>Operating System 2.15/2.14, GDE 1.16/1.15/1.14), Ab Initio Metadata Explorer, Erwin 4.0, UNIX, MVS, SQL, PL/SQL, Oracle 10g, Teradata V2R6, DB2, COBOL, Perl, Autosys

Confidential, Michigan, USA

Senior Software Developer / Ab Initio

Responsibilities:

  • Developed graphs based on data requirements using various Ab Initio components such as Rollup, Reformat, Join, Scan, Normalize, Gather, Broadcast, Merge, etc., making use of statements/variables in the components to create complex data transformations.
  • Performed metadata mapping from the legacy source system to target database fields and was involved in creating Ab Initio DMLs.
  • Involved in creating detailed data flows with source and target mappings and converting data requirements into low-level design templates.
  • Implemented various levels of parameter definition, like project parameters and graph parameters, instead of start and end scripts.
  • Developed UNIX Korn Shell scripts to run various Ab Initio generated scripts.
  • Developed parameterized Ab Initio graphs for increasing the performance of the Project.
  • Worked on improving the performance of Ab Initio graphs by using Various Ab Initio performance techniques.
  • Provided customer support during the warranty period by resolving issues in a timely manner.
  • Experience in the design of the warehouse architecture and database using Erwin.
  • Understood the specifications for the data warehouse ETL process and interacted with designers and end users on informational requirements.
  • Analyzed business requirements and developed metadata mappings and Ab Initio DMLs.
  • Developed subject-area graphs based on business requirements using various Ab Initio components like Filter by Expression, Partition by Expression, Partition by Round-robin, Reformat, Join, Gather, Merge, Rollup, Normalize, Scan, Replicate, etc.
  • Extensively used Ab Initio functions like is_valid, is_error, is_defined, string_substring, string_concat, and other string functions.
  • Developed Strategies for Data Analysis and Data Validation.
  • Used Ab Initio GDE to generate complex graphs for transforming and loading data into the staging and target database areas.
  • Used UNIX environment variables in various .ksh files, which hold the specified locations needed to build Ab Initio graphs (a minimal environment sketch follows this list).
  • Created load-ready files using Ab Initio to load into the database.
  • Experience in Unit testing, System testing and debugging.
  • Developed various Ab Initio graphs for data cleansing using functions such as is_valid, is_error, is_defined, is_null, and various other string functions.
  • Created various Ab Initio Multi File Systems (MFS) to run graphs in parallel.
  • Resolved issues of various severities on time during the testing and production phases.
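
A minimal sketch of the environment-variable setup referenced above: a .ksh file that wrapper scripts source before running deployed graphs. The variable names follow common Ab Initio sandbox conventions (AI_* parameters); the paths are hypothetical.

#!/bin/ksh
# Hedged sketch of an environment file sourced by graph wrappers.
# AI_* names follow common Ab Initio sandbox conventions; all paths
# are illustrative assumptions.

export PROJECT_DIR=/apps/etl/projects/claims
export AI_SERIAL=$PROJECT_DIR/serial     # serial (non-MFS) data files
export AI_MFS=/mfs/claims/mfs8way        # 8-way multifile system
export AI_DML=$PROJECT_DIR/dml           # record-format definitions
export AI_XFR=$PROJECT_DIR/xfr           # transform definitions
export AI_LOG=$PROJECT_DIR/log

# A wrapper would source this file, then run the deployed graph:
#   . /apps/etl/env/claims_env.ksh
#   $PROJECT_DIR/run/customer_cleanse.ksh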

Environment: Ab Initio (GDE 1.12.6.1, Co>Operating System 2.12.2), UNIX 5.2, Teradata V2R5, Perl, SQL/PL-SQL, TOAD, Windows NT/2000/XP.

Confidential, Michigan, USA

Software Developer / Ab Initio

Responsibilities:

  • Performed metadata mapping from the legacy source system to target database fields and was involved in creating Ab Initio DMLs.
  • Involved in creating detailed data flows with source and target mappings and converting data requirements into low-level design templates.
  • Implemented various levels of parameter definition, like project parameters and graph parameters, instead of start and end scripts.
  • Developed graphs based on data requirements using various Ab Initio components such as Rollup, Reformat, Join, Scan, Normalize, Gather, Broadcast, Merge, etc., making use of statements/variables in the components to create complex data transformations.
  • Extensively used string functions, date functions, and error functions for source-to-target data transformations.
  • Well experienced in using partition components (Partition by Key, Partition by Round-robin) and departition components (Concatenate, Gather, Interleave, and Merge) to achieve data parallelism.
  • Created common graphs to perform common data conversions, usable across applications through a parameterized approach with conditional DMLs.
  • Modified Ab Initio Graphs to utilize Data Parallelism and thereby improve the overall performance to fine-tune the execution times by using multi file systems and lookup files whenever required.
  • Implemented phasing and checkpoint approach in ETL process to prevent data loss and to maintain uninterrupted data flow against process failures.
  • Implemented lookups instead of joins and in-memory sorts to minimize execution times when dealing with huge volumes of data.
  • Replicated operational tables into staging tables, then transformed and loaded data into warehouse tables using Ab Initio GDE.
  • Deployed and ran graphs as executable Korn shell scripts on the application system (a minimal run sketch follows this list).
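
A minimal sketch of running a deployed graph with the phasing and checkpoint approach referenced above; the graph and recovery-file names are hypothetical placeholders.

#!/bin/ksh
# Hedged sketch: run a deployed graph; on failure, the Co>Operating
# System leaves a recovery (.rec) file so a rerun resumes from the
# last completed checkpoint. Names below are illustrative.

RUN_DIR=/apps/etl/run
GRAPH=stage_to_warehouse.ksh

cd "$RUN_DIR" || exit 1

./"$GRAPH"
rc=$?

if [[ $rc -ne 0 ]]; then
    print "Graph failed (rc=$rc); rerun to resume from checkpoint" >&2
    # To abandon the failed run instead of resuming, roll it back:
    #   m_rollback -d stage_to_warehouse.rec
    exit $rc
fi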

Environment: Ab Initio (GDE 1.12, Co>Operating System 2.12), UNIX, PL/SQL, Oracle 10g, Teradata V2R5, Queryman, Windows NT/2000.

Confidential, Michigan, USA

Ab Initio Developer

Responsibilities:

  • Created Process Data Flow diagrams in Ab Initio GDE for data conditioning, transformation, and loading.
  • Sent files through the Trillium Converter, Parser, Geocoder, and Winkey processes to cleanse, parse, and validate address information.
  • Generated configuration files, DML files, and XFR files specifying the record formats used in components for building graphs in Ab Initio.
  • Involved in creating Flat files using dataset components like Input file, Output file, Intermediate file in Ab Initio graphs.
  • Developed graphs using multistage components.
  • Extensively Used Transform Components like Join, Reformat, Rollup and Scan Components.
  • Implemented the component level, pipeline and Data parallelism in Ab Initio for ETL process for Data warehouse.
  • Gathered information from different data warehouse systems and loaded it into the One Sprint Financial Information System consolidated model using FastLoad, MultiLoad, BTEQ, and UNIX shell scripts.
  • Extensively used MultiLoad and FastLoad utilities to populate flat-file data into the Teradata database (a minimal FastLoad sketch follows this list).
  • Used FastExport to generate flat files after change data capture, which in turn creates a loadable file used to load the database.
  • Developed JCL to run the MultiLoad, FastLoad, FastExport, and BTEQ scripts from the mainframe against the Teradata machine.
  • Performed batch processing for data downsizing (subsetting).
  • Documented ways to automate manual processes.
  • Maintained the sandbox by storing all work in sequential order.
  • Developed UNIX shell scripts for parsing and processing data files; maintained and troubleshot batch processes for overnight operations.
  • Documented the process procedures and flow for the process.
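
A minimal FastLoad sketch, as referenced above: bulk-loading a pipe-delimited flat file into an empty Teradata staging table. The logon, table, and file names are hypothetical placeholders.

#!/bin/ksh
# Hedged sketch: FastLoad of a pipe-delimited file into an empty
# staging table. Logon/table/file names are illustrative; with
# VARTEXT input, all DEFINEd fields must be VARCHAR.

fastload <<'EOF'
LOGON tdprod/etl_user,etl_password;
DATABASE stg;

SET RECORD VARTEXT "|";

DEFINE acct_id  (VARCHAR(18)),
       bill_amt (VARCHAR(18)),
       bill_dt  (VARCHAR(10))
FILE = /data/in/billing_extract.dat;

BEGIN LOADING stg.billing_stg
      ERRORFILES stg.billing_err1, stg.billing_err2;

INSERT INTO stg.billing_stg
VALUES (:acct_id, :bill_amt, :bill_dt);

END LOADING;
LOGOFF;
EOF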

Environment: Ab Initio (GDE 1.12, Co>Operating System 2.11), UDB DB2, Oracle 9i/8i, UNIX IBM AIX 5.1, Teradata V2R5, shell scripts, FastLoad, MultiLoad, FastExport.

Confidential, Michigan, USA

Ab Initio Developer

Responsibilities:

  • Involved in the full project life cycle, from analysis to production implementation and support, with emphasis on identifying sources and validating source data, developing logic and transformations per requirements, creating mappings, and loading the data into the business intelligence database.
  • Conferred with systems analysts, engineers, programmers, and others to design systems and obtain information on project limitations and capabilities, performance requirements, and interfaces.
  • Modified existing software to correct errors, adapt it to new hardware, or improve its performance.
  • Extensively used various Ab Initio components like Reformat, Input File, Output File, Join, Sort, Normalize, Input Table, Output Table, Load DB Table, Update DB, Gather Logs, and Run DB SQL for developing graphs.
  • Used various Ab Initio built-in Transform components like Rollup, Aggregate, Reformat, Scan, Join to implement the business rules.
  • Used Ab Initio to create summary tables using Rollup and Aggregate components.
  • Generated configuration files (.cfg) using Korn shell scripts and created DMLs specifying the record format with delimiters for building graphs in Ab Initio (a minimal sketch follows this list).
  • Wrote UNIX shell scripts to load data from different sources.
  • Created test data for critical graphs and wrote documentation for these graphs.
  • Responsible for creating test cases to make sure data originating from the source makes it into the target properly and in the right format.
  • Extensively worked with the Ab Initio GDE Component Organizer; designed and implemented Ab Initio graphs and subgraphs using various components such as Dedup Sorted, Partition by Key, Reformat, Filter by Expression, Gather, Merge, etc.
  • Extensively involved in Ab Initio Graph Design, development and Performance Tuning.
  • Developed data transformation, loading, scrubbing and extraction Programs using Ab Initio.
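
A minimal sketch of the configuration-file and DML generation described above; the field names, delimiters, and paths are hypothetical assumptions.

#!/bin/ksh
# Hedged sketch: generate a run-time .cfg file and a pipe-delimited
# Ab Initio DML record format. All names/paths are illustrative.

CFG_DIR=/apps/etl/cfg
DML_DIR=/apps/etl/dml
RUN_DT=$(date +%Y-%m-%d)

# Config file read by graph parameters at run time.
cat > "$CFG_DIR/daily_run.cfg" <<EOF
RUN_DT=$RUN_DT
SRC_FILE=/data/in/accounts_$RUN_DT.dat
TGT_TABLE=dw.accounts
EOF

# Delimited DML describing the source flat-file record format.
cat > "$DML_DIR/accounts.dml" <<'EOF'
record
    string("|") acct_id;
    string("|") acct_name;
    decimal("|") balance;
    string("\n") open_dt;
end;
EOF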

Environment: Ab Initio (GDE 1.12, Co>Operating System 2.11), UDB DB2, Oracle 9i/8i, UNIX IBM AIX 5.1, Teradata V2R5, shell scripts, FastLoad, MultiLoad, FastExport.
