
Sr. Ab Initio Consultant Resume


Columbus, OH

PROFESSIONAL SUMMARY:

  • Around 11 years of IT experience in the Design, Analysis, Development, Implementation and testing of various applications, Decision Support Systems & Data Warehousing applications.
  • Over 8 years of experience in implementing data warehousing solutions using Ab Initio.
  • Worked extensively on Data Warehouse application development using the ETL tools Ab Initio/Informatica/DataStage/Pentaho, DB2, Teradata, Oracle and UNIX shell scripts.
  • Worked extensively with complex, high-volume data sets ranging from 120 million to 1.5 billion records.
  • Extensively used ETL methodologies for data extraction, transformation and load processing using Ab Initio.
  • Implemented end-to-end data lineage for applications across various organizations.
  • Strong experience in designing, developing and testing large scaled applications as per functional requirements using Ab Initio, DB2/Teradata/Oracle and Unix Shell scripting.
  • Well versed with various Ab Initio parallelism techniques and implemented Ab Initio Graphs using Data, Component, pipeline parallelism and Multi File System (MFS) techniques.
  • Configured Ab Initio environment to connect to different databases like DB2/Teradata/Oracle using db config files, Input Table, Output Table, Update table Components.
  • Expert knowledge in using various Ab Initio components such as Join, Reformat, Scan, Rollup, Normalize, De-normalize, Partitioning and De-partitioning components etc.
  • Experience in using EME for version control, impact analysis, dependency analysis and data lineage.
  • Experience in providing production support to various Ab Initio ETL jobs and developing various UNIX shell wrappers to run Ab Initio and Data base jobs.
  • Good Experience working with various Heterogeneous Source Systems like Flat files, DB2, Oracle, Teradata, and Legacy Systems.
  • Very good understanding of Teradata’s MPP architecture such as Shared Nothing, Nodes, AMPs, BYNET, Partitioning, Primary Indexes etc.
  • Extensively used different utilities of Teradata such as BTEQ, Fast export, Fast load, Multiload, SQL Assistant, DDL and DML commands
  • Experience in DBMS Utilities such as DB2 Visualizer and Teradata SQL Assistant.
  • Good knowledge of Teradata RDBMS Architecture, Tools & Utilities.
  • Experienced with Teradata utilities: FastLoad, MultiLoad, BTEQ scripting, FastExport and SQL Assistant.
  • Extensively used metaprogramming with utilities like ab make transform and ab make type.
  • Able to interact effectively with other members of the Business Engineering, Quality Assurance, Users and other teams involved with the System Development Life cycle.
  • Experience in developing reports using market-leading BI products, especially the Pentaho Data Integration/BI Suite.
  • Participated in the end-to-end lifecycle of BI implementations, including design, development, deployment and evolution.
  • Design and develop SQL databases, ETL, and other database components to support Business Intelligence dashboards and compliance reporting.
  • Expertise in preparing code documentation in support of application development, including High level and detailed design documents, unit test specifications, interface specifications, etc.
  • Developed and tested enterprise dashboard solutions using Cognos software and assisted end users.
  • Excellent Communication skills in interacting with various people of different levels on all projects and also playing an active role in Business Analysis.
  • Managed multiple projects/tasks within the Banking & Financial Services industry in high-transaction-processing environments, with excellent analytical, business-process, written and verbal communication skills.
  • Knowledge of data quality and data profiling.
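
The UNIX shell wrapper work summarized above can be sketched in a few lines of portable shell. This is an illustrative example only (the job name and log path are hypothetical, not code from any engagement): a generic wrapper that runs an arbitrary command, logs start/end timestamps and the exit status, and propagates failure so a scheduler such as Control-M or Autosys can react.

```shell
#!/bin/sh
# Hypothetical generic job wrapper: run a command, log timestamps and
# exit status, and return that status to the calling scheduler.
run_job() {
    job_name=$1; shift
    log="/tmp/${job_name}.log"
    echo "$(date '+%Y-%m-%d %H:%M:%S') START ${job_name}" >> "$log"
    "$@" >> "$log" 2>&1          # run the actual job (Ab Initio, BTEQ, ...)
    rc=$?
    echo "$(date '+%Y-%m-%d %H:%M:%S') END ${job_name} rc=${rc}" >> "$log"
    return $rc                   # propagate status so the scheduler sees it
}

run_job demo_job echo "hello"
```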

TECHNICAL KNOWLEDGE:

ETL: Ab Initio (GDE 3.1.2/1.0.x, Co>Operating System 3.1/2.15/14/13), Metadata Hub, Pentaho Data Integration, Informatica, DataStage

Databases: IBM DB2 - UDB 9& z/OS 9, Teradata V2R6, Oracle 11g/10g/8/9i.

OLAP: SAS

Operating Systems: UNIX - Red Hat Linux 2.6, IBM AIX 5.6/5.3, Solaris 5.8/9/10, HP-UX

Scripting/Languages: Korn shell, PL/SQL, Pro*C, Perl, Python, C, C++, Java

DB Tools: Db2 Toad/Visualizer, Teradata SQL Assistant

Version Control: Ab Initio EME, Clear Case, SCP, VSS

Scheduling Tool: Control-M, Autosys, IBM TWS, Conduct>IT, Cron

Reporting Tools: SAP BI 3.5/7, Cognos 10/8.4

PROFESSIONAL EXPERIENCE:

Confidential, Columbus, OH

Sr. Ab Initio Consultant

Responsibilities:

  • Wrote wrappers specifically for performance-benchmarking tasks such as measuring the number of processes (m ps), wall-clock/CPU time, number of phases, etc. for a given graph.
  • Enhanced the common graphs in the COMMON FRAMEWORK to extend their generic functionality through judicious use of conditional enabling/disabling and extensive metaprogramming.
  • Well versed in PDL functions such as vector slice/vector bsearch all/read file and in regular expressions.
  • Took backups of the EME and copied the archive from the EME to the dev environment using air backup-start, air backup-verify and air backup-dump.
  • Automated pset creation using air sandbox parameter in wrappers or graphs with built-in PDL functions such as pset info/parameter info type.
  • Compiled the dependency analysis warnings/errors for all projects and fixed the same in 3.1.
  • Tuned the performance of graphs by reducing excessive parameters to only a few and by using PDL functions instead of shell interpretation.
  • Efficient in writing metaprogramming using PDL, such as dynamic creation of xfrs using make transform/add rule(s)/add statement/make rule, etc.
  • Developed template graphs to pull corrected/adjustment data from front-end sources like DIF, process it and load it into the data mart; reported any validation errors that occurred during processing to users as spreadsheets (BLOBs).
  • Fixed the dependency analysis issues in common graphs and making sure objects are clean to import into metadata hub.
  • Established the data lineage end to end to all the land and load processes.
  • Created tags/save files in the dev EME to make the promotion process easy and simple.

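As a rough illustration of the benchmarking wrappers described above (simplified; a real wrapper would also collect process counts and phase information from Co>Operating System tooling), a minimal timing harness might look like:

```shell
#!/bin/sh
# Simplified benchmarking harness (illustrative only): report the
# wall-clock seconds and exit status of whatever command it is given.
bench() {
    start=$(date +%s)
    "$@" > /dev/null 2>&1
    rc=$?
    end=$(date +%s)
    echo "elapsed=$((end - start))s rc=${rc}"
}

bench sleep 1        # prints something like: elapsed=1s rc=0
```
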
Environment: Ab Initio (GDE 3.1.2, Co>Op 3.1.2), IBM AIX 6, Oracle 11g, DB2, Shell scripting, Conduct>IT, EME, Metadata hub, Control M

Confidential, Columbus, OH

Sr. Ab Initio Developer

Responsibilities:

  • Analyzed business requirements, set up design reviews with the platform team and prepared Mini Design Documents (MDDs).
  • Actively participated in data model changes and assisted the Data Modeler with table structure changes and possible ranges/values for new fields.
  • Created standardized generic/template graphs using components like Write Multiple Files across all regions, thereby reducing redundant graphs and Autosys boxes/jobs.
  • Creation of xfr’s for the commonly used functions in code.
  • Extensively used air commands for Ab Initio code migrations, check-ins, check-outs, finding sandboxes, object uses, lock info, common project info, etc.
  • Involved in migration projects such as the AIX migration (conversion from Solaris to AIX) and the double-byte migration (ASCII to UTF-8).
  • Developed generic graphs to upsert data in API/Utility mode depending on the input parameters.
  • Written generic wrapper to extend the date driver functionality for all the jobs depending on the cycle day set by client.
  • Extensive knowledge of converting data across character sets like ISO-8859-1/UTF-8/ASCII and of functions like string truncate explicit, string convert explicit and string cleanse.
  • Thorough knowledge of converting files across formats such as ASCII to COBOL and ASCII to XML.
  • Performance tuning the complex Ab Initio graphs, and Ab Initio best practice approach to implement the ETL process.
  • Extensive knowledge of XML parsing capabilities using Ab Initio.
  • Written generic graphs to get/post data from web services using WebServices Component.
  • Helped Production Support fix failed jobs on the daily, weekly and monthly schedules.
  • SME for the projects like LLR Basel Compliance, COFRS, TSYS, CEEMEA, Global Concurr Card Member, Global Artemis, Global CDF, legacy DOD projects.

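The date-driven wrapper mentioned above boils down to a gate that compares today's date with a client-set cycle day. A minimal sketch (the cycle-day source and comparison logic are assumptions; real wrappers would read the cycle day from client-set parameter files):

```shell
#!/bin/sh
# Illustrative date-driven gate: run the job only when today matches the
# client-set cycle day (here passed in as an argument; assumed logic).
should_run() {
    cycle_day=$1
    today=$(date +%d | sed 's/^0//')   # day of month without leading zero
    [ "$today" -eq "$cycle_day" ]
}

if should_run "$(date +%d | sed 's/^0//')"; then
    echo "cycle day reached - job would be launched"
fi
```
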
Environment: Ab Initio (GDE 3.1.0/1.16, Co>Op 3.1/2.15), IBM DB2 UDB 9, IBM AIX 6, SunOS 5.10, ksh shell scripting, Conduct>IT, EME, Metadata Hub, Autosys, Pentaho Kettle/Reporting 3.7, Cognos 8/8.4.

Confidential, Richmond, VA

Ab Initio Designer

Responsibilities:

  • Responsible for analyzing and understanding the business/system requirements.
  • Preparation of High Level design and detailed level design documents.
  • Involved in Data modeling and Data model reviews
  • Extensively used Ab Initio to load data from Teradata and flat files into IBM DB2 (z/OS).
  • Set up the Development environment (creation of EME projects, Batch & Utility Connect IDs) as well as the QA and UAT environments.
  • Developed Ab Initio graphs for various LOB processes such as Account Detail, Account Detail2 and Rewards Detail.
  • Extracted 120M records of data from Teradata to DB2 using Utilities like Fast Export, DSNUTILB for each month as part of historic loads.
  • Extensively used dynamic script generation techniques to improve graphs performance.
  • Used metaprogramming to dynamically build xfr/dml using the ab make transform and ab make type utilities.
  • Daily loads were sourced from flat files sent by TSYS.
  • Transmitted load-ready files from/to MVS using the FTP To / FTP From components.
  • Designed generic graph to extract data from mainframe data sets and parse them using cobol-to-dml utilities
  • Developed generic graph to split a huge file into multiple files based on the partition by expression.
  • Designed and developed various Ab Initio graphs and a business rules engine using the above components and Ab Initio functions like ABLOCAL(), decimal strip, decimal lpad, decimal lrepad, is bzero, re get match, re index, is blank, is defined, is null, is valid, etc. in the transformations to create xfrs.
  • Responsible for Designing and developing shell scripts (wrapper) to validate data.
  • Developed a generic wrapper to check out EME projects from UNIX rather than through the GDE.
  • Responsible for writing SQL in Teradata SQL Assistant/DB2 Visualizer to validate that the data loaded was correct.
  • Designed and Created various Ab Initio Multi File System’s (MFS) to run graphs in parallel
  • Design and development of Ab Initio load graphs to load tables in DB2.
  • Familiar with the sandbox concept and the EME check-in/check-out process; involved in unit testing of the Ab Initio graphs.
  • Extensively used air commands for Ab Initio code migrations, check-ins and check-outs.
  • Extensively used Enterprise Meta Environment (EME) for version control
  • Wrote SQL scripts (.sql files), including joins, used in Ab Initio database components to extract data from different source tables and load the target table using the Join with DB, Update DB and Load DB Table components, supported by dbc files in the graphs.
  • Responsible for fully understanding the source systems, documenting issues, following up on issues and seeking resolutions.
  • Extensively used Control M to set up the job streams.
  • Confirmed the field mapping from the old data warehouse to the new data warehouse, ensuring that one-to-many and many-to-one fields were properly matched and resolved.
  • Tested Ab Initio graphs in the development and migration environments using test data, fine-tuned the graphs for better performance and migrated them to the Production environment.
  • Created test data for the testing team to test the mapping rules.
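
The "split a huge file into multiple files based on a partition expression" graph mentioned above can be mimicked outside Ab Initio with awk, with field 1 standing in for the partition key (file names and data are made up for illustration):

```shell
#!/bin/sh
# Sketch of partition-by-expression using awk: route each record to an
# output file named after its first field (the stand-in partition key).
printf 'east,100\nwest,200\neast,300\n' > /tmp/pbe_input.csv

awk -F',' '{ print > ("/tmp/pbe_part_" $1 ".csv") }' /tmp/pbe_input.csv
```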

Environment: Ab Initio (GDE 1.15.7.2, Co>Op 2.15), IBM DB2 9 (z/OS), Red Hat Linux 2.6, Teradata V2R6, UNIX, Korn shell scripting, Control M, SQL, MVS, EME, Metadata Hub, SAS.

Confidential, Richmond, VA

Ab Initio Developer

Responsibilities:

  • Understanding system requirements and preparation of system requirement document.
  • Analyzing the technical requirements and preparation of detailed design documents.
  • Developed Ab Initio graphs for Data validation using validate components.
  • Implementation of full life cycle software development with Business specific customization
  • Created technical detail design documents and conducted code reviews within the team.
  • Extensively involved in Ab Initio Graph Design, development and Performance tuning.
  • Developed a number of Ab Initio graphs based on business requirements using various Ab Initio components such as Partition by Key, Partition by Round Robin, Reformat, Rollup, Join, Scan, Normalize, Gather, Merge, etc.
  • Developed generic graphs for FTPing files from MVS to DDE and vice versa, thereby reducing development time.
  • Developed a generic graph to load data into lookup tables (DB2).
  • Unloaded huge volumes of data from DB2 tables using the DSNUTILA and UNLOAD utilities.
  • Used Normalize components to break up the data into multiple records as per the requirements.
  • Performed validations, data quality checks and Data profiling on incoming data.
  • Created DBC files for db2 table loads/unloads
  • Responsible for writing SQL in Teradata SQL Assistant/DB2 Visualizer to validate that the data load was correct.
  • Implemented data parallelism using the Multi-File System and Partition and De-partition components, and performed repartitioning to improve overall performance.
  • Developed graphs separating the Extraction, Transformation and Load process to improve the efficiency of the system
  • Involved in designing Load graphs using Ab Initio and tuned the performance of the queries to make the load process run faster.
  • Developed shell scripts for Archiving, Data Loading procedures and Validation
  • Extensively used Partition components and developed graphs using Write Multi-Files, Read Multi-Files, Filter by Expression, Run Program, Join, Sort, Reformat, and Dedup.
  • Implemented Lookups, lookup local, In-Memory Joins to speed up various Ab Initio Graphs.
  • Extensively used the MultiLoad and FastLoad utilities to populate flat-file data into Teradata.
  • Used EME for version controlling and job tracking using web EME.

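The round-robin partitioning used above for data parallelism distributes records cyclically across partitions; the idea can be sketched with awk over stand-in data (a 4-way split; all file names are illustrative):

```shell
#!/bin/sh
# Sketch of Partition by Round Robin: NR % 4 cycles 1,2,3,0,... so each
# of the 4 stand-in partitions receives an equal share of the rows.
seq 1 8 > /tmp/rr_input.txt

awk '{ print > ("/tmp/rr_part_" NR % 4) }' /tmp/rr_input.txt
```
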
Environment: Ab Initio (GDE 1.15.7.2, Co>Op 2.15), DB2 9 (z/OS), Linux 2.6, Teradata V2R6, UNIX, Korn shell scripting, Control M, SQL, EME, web EME.

Confidential, Newark, DE

Ab Initio Developer

Responsibilities:

  • Used Ab Initio as ETL tool to pull data from source systems, cleanse, transform, and load data into databases.
  • Extensively involved in Ab Initio Graph Design, development and Performance tuning.
  • Developed a number of Ab Initio graphs based on business requirements using various Ab Initio components such as Partition by Key, Partition by Round Robin, Reformat, Rollup, Join, Scan, Normalize, Gather, Merge, etc.
  • Developed Ab Initio graphs for Data validation using validate components.
  • Enhance performance using data parallelism.
  • Extensively used the Ab Initio tool’s feature of Component, Data and Pipeline parallelism.
  • Implemented data parallelism using the Multi-File System and Partition and De-partition components, and performed repartitioning to improve overall performance.
  • Developed graphs separating the Extraction, Transformation and Load process to improve the efficiency of the system
  • Involved in designing Load graphs using Ab Initio and tuned the performance of the queries to make the load process run faster.
  • Extensively used Partition components and developed graphs using Write Multi-Files, Read Multi-Files, Filter by Expression, Run Program, Join, Sort, Reformat, and Dedup.
  • Implemented Lookups, lookup local, In-Memory Joins to speed up various Ab Initio Graphs.
  • Extensively used Enterprise Meta Environment (EME) for version control
  • Involved in unit testing and assisted in system testing, integrated testing, and user acceptance testing.
  • Used Data profiling task to identify problems in the data that have to be fixed.
  • Performed validations, data quality checks and Data profiling on incoming data.
  • Followed the best design principles, efficiency guidelines and naming standards in designing the graphs
  • Used Enterprise Meta Environment (EME) for version control, Control-M for scheduling purposes.
  • Testing and tuning the Ab Initio graphs and Teradata SQL’s for better performance
  • Extensively used the MultiLoad and FastLoad utilities to populate flat-file data into Teradata.
  • Enabled MVS input via DB2 and output to Teradata.
  • Designed and built Ab Initio generic graphs for unloading data from source systems; validated the unload process by comparing the row count of the landed file with the source table's row count.
  • Designed Ab Initio graphs using GDE conditional components.
  • Developed shell scripts for Archiving, Data Loading procedures and Validation
  • Provided 24x7 extended support during the production rollout.
  • Involved in Unit testing, System testing and debugging during testing phase.

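The unload validation described above (landed-file row count vs. source-table row count) reduces to a simple comparison. A sketch, with a hard-coded count standing in for what would really come from a `SELECT COUNT(*)` against the source table:

```shell
#!/bin/sh
# Sketch of unload validation: in reality src_count would come from a
# SELECT COUNT(*) against the source table; here it is hard-coded.
src_count=3
printf 'r1\nr2\nr3\n' > /tmp/landed.dat
landed_count=$(wc -l < /tmp/landed.dat)

if [ "$landed_count" -eq "$src_count" ]; then
    echo "counts match: $landed_count"
else
    echo "MISMATCH: source=$src_count landed=$landed_count" >&2
    exit 1
fi
```
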
Environment: Ab Initio (GDE 1.15, Co>Op 2.15), IBM AIX 5.3, SCP, IBM DB2 8/9, EME.

Confidential, Addison, TX

Ab Initio Developer

Responsibilities:

  • Developed and supported the extraction, transformation and load (ETL) process for a data warehouse from legacy systems using Ab Initio, and provided technical support and hands-on mentoring in the use of Ab Initio.
  • Mapped metadata from legacy source-system fields to target database fields and created Ab Initio DMLs.
  • Developed a number of Ab Initio graphs based on business requirements using various Ab Initio components such as Rollup, Join, Scan, Denormalize, Normalize, Gather, Merge, etc.
  • Well versed in partition techniques based on requirements.
  • Developed various Ab Initio Graphs for data cleansing using Ab Initio function such as is valid, is error, is defined, string * functions etc.
  • Developed Ab Initio graphs for Data validation using validate components.
  • Deduplicated and consolidated customer records from various sources to create a master customer list.
  • Developed Complex Ab Initio XFRs to derive new fields and solve various business requirements.
  • Improved the performance of Ab Initio graphs using various techniques, such as using lookups instead of joins.
  • Implemented Lookups, lookup local, in-memory joins and rollups to speed up various Ab Initio graphs.
  • Involved in AB Initio BRE implementation.
  • Used Plans and Conduct>It for job sequencing.
  • Built various parameterized graphs used for data transport (e.g., Teradata MLoad Stage Select Insert to Prod), data set management (e.g., File Control Write), data transformations and data quality (e.g., Data File Validation with Signal File), each pulling from a linked sub-graph.
  • Worked constantly with the test team to clarify different test cases and help them understand the business process in terms of implementation details.
  • Served as a key decision maker on code enhancements vs. requirement changes, based on discussions with the requirements management team.
  • Played a key role in the regression testing strategy for the Ab Initio 2.15 upgrade and helped resolve issues around it.
  • Developed the application using UNIX and ANSI SQL with Oracle 9i as the backend database.
  • Developed mapping docs using Data Profiler.
  • Developed Autosys JIL scripts.
  • Debugging Ab Initio graphs using Flow Watchers.
  • Creating load ready files using Ab Initio ETL tool to load into database.

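The "lookups instead of joins" tuning mentioned above trades a sort-based join for an in-memory map of the small side. The same idea expressed in awk (file names and fields are made up for illustration; Ab Initio's lookup files serve the analogous role):

```shell
#!/bin/sh
# Sketch of the lookup-file idea: load the small reference side into an
# in-memory map, then stream the large side past it - no sort needed.
printf '1,Gold\n2,Silver\n' > /tmp/tiers.csv       # small reference data
printf 'a,1\nb,2\nc,1\n'    > /tmp/accounts.csv    # large driving data

awk -F',' '
    NR==FNR { tier[$1] = $2; next }   # first file: build in-memory map
    { print $1 "," tier[$2] }         # second file: probe the map
' /tmp/tiers.csv /tmp/accounts.csv > /tmp/joined.csv
```
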
Environment: Ab Initio (Co>Op 2.15, GDE 1.15), IBM AIX, ksh & Perl scripting, ANSI SQL, IBM DB2 8, TWS, EME, PL/SQL, SAS

Confidential

Ab Initio/ ETL developer

Responsibilities:

  • Used Ab Initio as the ETL tool to extract data from source systems, cleanse and transform it, and load it into databases.
  • Responsible for creating Ab Initio graphs for landing the validated source data received from various customers in multifiles and for creating cross-reference lookups.
  • Used Ab Initio components like Sort, Partition by Key, Partition by Round Robin, Partition by Expression, Reformat, Join, Normalize, Gather, Concatenate, Replicate, Filter by Expression, Rollup, Dedup Sorted, etc. for developing graphs and a business rules engine.
  • Designed and developed Ab Initio graphs for transforming various data feed to fit into dimensional model of the target, using complex components.
  • Also used Ab Initio Partition and Departition components to enable parallelism.
  • Responsible for code review and performance tuning of Ab Initio graphs designed by various developers.
  • Designed and developed various Ab Initio graphs for data cleansing using Ab Initio functions like is valid, is error, is defined, string functions and date functions, and performed scrubbing and bashing operations.
  • Performing transformations of source data with Transform components like Join, Dedup, Sort, Filter, Reformat, Filter-by-Expression, Rollup.
  • Developed Ab Initio Multi File Systems (MFS) to monitor performance in parallel layout
  • Implemented Lookups, In-memory Joins and Rollups to speed various Ab Initio graphs.
  • Made wide use of lookup files when fetching data from multiple sources where the data volume was limited.
  • Modified Ab Initio component parameters to utilize data parallelism, thereby improving overall performance and fine-tuning execution times.
  • Debugged Ab Initio graphs using File Watchers.
  • Responsible for writing shell scripts (wrapper) to schedule the jobs in the development environment.
  • Developed UNIX scripts and Ab Initio components.
  • Used sub graphs and parallel execution features using multifile systems.

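Wrapper scripts like the scheduling wrappers above often wait on a trigger/signal file before launching a job. A generic polling watcher with a timeout (all names and the timeout are illustrative) can be sketched as:

```shell
#!/bin/sh
# Illustrative signal-file watcher: poll for a trigger file and give up
# after `timeout` seconds so the wrapper never hangs forever.
wait_for_file() {
    file=$1; timeout=$2; waited=0
    while [ ! -f "$file" ]; do
        [ "$waited" -ge "$timeout" ] && return 1
        sleep 1
        waited=$((waited + 1))
    done
    return 0
}

touch /tmp/trigger.done
wait_for_file /tmp/trigger.done 5 && echo "trigger received"
```
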
Environment: Ab Initio (GDE 1.14/1.13, Co>Op 2.13), Oracle 10g/9i/8.1.6, TOAD, UNIX Shell Scripting, Visio, Windows NT, Tivoli

Confidential

Pl/SQL Developer

Responsibilities:

  • Involved in preparation of Design Documents.
  • Built external mappings to transfer old data from the database to the Oracle data warehouse, including writing PL/SQL procedures.
  • Gathering Software Requirement Specification from the users.
  • Identified and reviewed the design outputs against the Design input requirements.
  • Version configuration using Visual SourceSafe.
  • Developed procedures to ensure conformity, compliance with standards and lack of redundancy, translating business rules and functionality requirements.
  • The Ops team used the GUI to transfer data files across 24 cities in India.
  • Also designed a forecasting data mart and extracted data from the legacy system.
  • Involved in production support of the system for 6 months to stabilize it and ensure efficient uploading/downloading of data across various cities.

Environment: Oracle 8, Solaris 8, Visual Basic 5.0, PL/SQL, & Visual SourceSafe.
