
Lead Ab Initio Consultant Resume

PA

Experience Summary:

  • Total of 10 years of IT experience in analysis, design, development, and implementation of various data warehousing technologies.
  • 6 years of experience in Ab Initio with ETL, data mapping, transformation, and loading from source to target databases in complex, high-volume environments.
  • Over 1 year of experience in DMExpress with ETL and data loading to different databases.
  • Extensive experience in SQL, PL/SQL, Oracle, SQL Server, and Netezza.
  • Experience in converting Ab Initio graphs to Netezza.
  • Experience in troubleshooting and improving the performance of Ab Initio graphs.
  • Wrote setup scripts that define environment variables and wrappers to run Ab Initio deployed scripts.
  • Experience working with various source systems such as Oracle, SQL Server, DB2 UDB, and flat files.
  • Extensively involved in all phases of the SDLC, including working closely with business analysts and software development teams to ensure a clear understanding of the ETL requirements.
  • Strong skills in UNIX shell scripting and wrapper scripts, Oracle, DB2, and flat files.
  • Ability to work in a team as well as independently, including in high-intensity environments, with excellent interaction and communication skills.
  • Created PL/SQL stored procedures for accessing and manipulating data from databases.
  • Experienced in back-end data validation; strong in SQL queries for data verification.
  • Microsoft Certified: SQL Server 2000.

TECHNICAL PROFILE:

ETL TOOLS : Ab Initio 2.13/2.14/2.15/3.0.2.10; DMExpress 7.5

RDBMS : Oracle, SQL Server, Netezza, and DB2.

Scripting : UNIX Shell Scripting and VB Script

B.I. Tools : Business Objects

Operating Systems : UNIX, Windows NT, MS DOS and Windows XP

Scheduling Tools : Control-M, Mainframe Cybermation

QA TOOLS : Quality Center and Test Director

PROFESSIONAL EXPERIENCE:

Confidential, Collegeville, PA

Lead Ab Initio Consultant

1. Pharmacy Benefit Management (PBM)

Sizing: The new STAT methodology, implemented as part of the Project Independence (PI) project, is required when a retail Rx supplier is lost to IMS. When IMS loses a supplier, the TRx volume must be estimated at the outlet and outlet x product levels. PBM data is used to estimate the TRx volume in the Sizing process.

Responsibilities:

  • Worked with business users and technical architects on the Ab Initio design for the PBM Sizing project (at the outlet/product level) and the PBM QC project.
  • Involved in designing Ab Initio applications for client needs without supervision.
  • Designed and developed Ab Initio graphs per business requirements; implemented and deployed them across different environments. Performed code reviews, unit testing, and defect fixes.
  • Developed graphs used to load data into the reference and fact tables.
  • Wrote various UNIX shell scripts for generic functionality, created parameterized graphs for different interfaces, and worked on linked graphs.
  • Extensively used Ab Initio's parallelism and multifile features in developing the PBM Sizing graphs.
  • Developed graphs for the ETL processes, extensively using the Join, Rollup, Reformat, Dedup, Scan, Update Table, Replicate, Gather, Partition, and Departition components.
  • Performed dependency analysis, tracing, and field-by-field data lineage of generic components between the sourcing and distribution projects using Ab Initio.
  • As part of project design, considered design techniques such as naming patterns, application type, deployment strategies, appropriate technology, and quality attributes.
  • As lead, coordinated with the team and distributed work among team members.
  • Guided the team on IMS standards and on understanding the IMS business, functionality, and rules.
  • Worked in an onsite-offshore model, getting work done offshore and completing tasks on time.
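The setup and wrapper scripts mentioned above typically export sandbox locations and then invoke the deployed graph script, checking its exit status. A minimal sketch; all paths and variable names are illustrative assumptions, not taken from the actual project:

```shell
#!/bin/sh
# Hypothetical wrapper for a deployed Ab Initio graph.
# Paths and variable names are illustrative assumptions.

AI_PROJECT=${AI_PROJECT:-/opt/sand/pbm_sizing}   # sandbox root (assumed)
AI_SERIAL=$AI_PROJECT/serial                     # serial data location
AI_MFS=$AI_PROJECT/mfs                           # multifile system location
export AI_PROJECT AI_SERIAL AI_MFS

run_graph() {
    # Run a deployed graph (a generated ksh script) and report its status.
    graph=$1; shift
    "$graph" "$@"
    rc=$?
    if [ "$rc" -ne 0 ]; then
        echo "ERROR: $graph failed with exit code $rc" >&2
        return "$rc"
    fi
    echo "OK: $graph completed"
}
```

In practice such a wrapper would also write a dated log file and signal the scheduler (Cybermation/Control-M) through its exit code.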

Environment: Ab Initio 2.14, Mainframe Cybermation, Netezza, Oracle, TOAD, UNIX, Quality Center 10.0, and Business Objects.

NDW RxEncoding Data:

  • Designed and developed DMExpress tasks/jobs to extract data from the Oracle database, transform it, and load it into Oracle tables and flat files.
  • Developed DMExpress tasks to read mainframe files over a remote connection and load them into database tables.
  • Resolved special-character issues in DMExpress jobs on non-UTF-8 databases.

Environment: DMExpress 7.5, Mainframe Cybermation, Oracle, UNIX, Quality Center 10.0.

Confidential, Columbus, OH

Ab Initio & ControlM Developer

Project Description: Basel II is a regulatory capital framework created by the Basel Committee on Banking Supervision of the Bank for International Settlements (BIS).

The Committee formulates broad supervisory standards and guidelines, recommends best practices, and expects that individual authorities will take steps to implement these best practices through national systems.

The Basel II capital guidelines comprise capital adequacy requirements, supervisory review, and market discipline.

Responsibilities:

  • Involved in performance tuning for various ETL applications across the company (GFT).
  • Worked with business users and business analysts to create the MD50 and MD70 technical documents for various data warehouse applications.
  • Wrote and maintained documentation describing program development, logic, coding, testing, changes, and corrections.
  • Modified various data model objects and transform objects, and wrote logical functions as part of Ab Initio enhancement projects across GFT.
  • Implemented the Ab Initio projects for various applications and migrated the objects to different environments, including production.
  • Involved in building and testing the Control-M jobs for various applications and supporting them in the production environments.
  • Involved in designing application solutions for user department needs without supervision; maintained current working knowledge.
  • Worked on EME for the check-in and check-out process and for metadata on business rules, data, and definitions.
  • Used the Run Program and Run SQL components to run UNIX and SQL commands in Ab Initio, along with the Filter by Expression, Partition by Expression, Replicate, Partition by Key, and Sort components.
  • Worked with the Enterprise Meta Environment (EME) to track the data processing system from design information to operations data, and maintained versions of graphs.
  • Worked with departition components such as Gather and Interleave to departition and repartition data from multifiles.
  • Performed transformations of source data with transform components such as Join, Match Sorted, Dedup Sorted, Denormalize, Reformat, and Filter by Expression.
  • Created summary tables using Rollup, Scan, and Aggregate.
  • Performed research and consulting tasks for production control and scheduling and for administrative programming.
  • Developed, tested, and implemented Ab Initio graphs; developed generic and conditional components; and debugged Ab Initio graphs using file watchers.
  • Involved in project implementation tasks for the Basel reporting applications.
  • Created .mp graphs, ksh scripts, DMLs, XFRs, wrapper scripts, backups, dei snapshots, stored procedures, and database partitioning.

Environment: Ab Initio 2.14/3.0.2, Control-M 6.4, SQL Server 2005, SQL*Plus, TOAD, Oracle SQL Developer, UNIX, Quality Center 10.0, and Business Objects.

Confidential, Forest Hills, NY

Ab Initio Developer

Project Description: The system involves building an enterprise data warehouse to improve the overall quality and productivity of existing processes and to deploy reliable, accurate information. Data is extracted from different sources such as Sybase, DB2, Oracle, and MS SQL Server, transformed based on user requirements using Ab Initio's component container, and loaded into the target by running the graphs. This information is delivered to various data marts to formulate analytical information, and reporting groups generate reports using Business Objects.

Responsibilities:

  • Worked closely with business analysts on analysis of the existing available information.
  • Worked on EME for the check-in and check-out process and for metadata on business rules, data, and definitions.
  • Performed research and consulting tasks for production control and scheduling and for administrative programming.
  • Developed, tested, and implemented Ab Initio graphs; used generic and conditional components in Ab Initio.
  • Used the Watcher and leading-records components in the testing phase.
  • As part of unit testing, rewrote functions used in several components to improve performance.
  • Implemented lookups, in-memory joins, and rollups to speed up various Ab Initio graphs.
  • Used UNIX environment variables comprising specified locations to build Ab Initio graphs; used Ab Initio partition and departition components to enable parallelism.
  • Prepared test plan and test case documents aligned with the requirement documents.
  • Involved in preparing a run book to track all incident failure resolutions in production support.
  • As part of production support, communicated with source system groups and UNIX and Oracle DBAs to resolve incidents.
  • Developed UNIX shell scripts to automate file manipulation and data loading procedures.
  • Involved in writing UNIX wrapper scripts to execute the Ab Initio deployed scripts.
  • Worked on Documentation of release notes, code changes, unit test cases and UAT test cases.
  • Worked on Source to Target mapping documents with business users.
  • Worked on low level design documents to provide better understanding of graphs.
  • Responsible for primary second-level support for problems encountered in production processing, including off-hours support.
  • Performed daily monitoring and scheduling tasks, and troubleshot failed jobs in production.
  • Developed unit test cases and quality assurance test plans.
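The file-manipulation and data-loading automation scripts described above usually poll for a source file's arrival and archive it once processed. A minimal sketch, with all paths and retry settings as illustrative assumptions:

```shell
#!/bin/sh
# Hypothetical file-arrival check and archive step; paths are assumptions.
INBOX=${INBOX:-/data/inbox}
ARCHIVE=${ARCHIVE:-/data/archive}

wait_for_file() {
    # Poll until the file exists and is non-empty, or give up.
    file=$1; tries=${2:-10}; interval=${3:-30}
    i=0
    while [ "$i" -lt "$tries" ]; do
        [ -s "$file" ] && return 0
        sleep "$interval"
        i=$((i + 1))
    done
    echo "ERROR: file $file did not arrive" >&2
    return 1
}

archive_file() {
    # Timestamp and move a processed file so reruns start from a clean inbox.
    mv "$1" "$ARCHIVE/$(basename "$1").$(date +%Y%m%d%H%M%S)"
}
```

A production-support run book would pair functions like these with the incident resolutions logged for each failure mode.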

Environment: Ab Initio, Control-M, MySQL, SQL Server, SQL*Plus, UNIX, Test Director, WinSQL, Business Objects, and Quality Center 9.0

Confidential, St Louis, MO

Ab Initio Developer

Project Description: The Confidential data warehouse will allow Equity Middle Office staff to track a trade through its life cycle from order execution through trade completion, confirmation, and settlement. It will be delivered by collecting data from a number of trade captures and booking systems, correlating and transforming the data into multidimensional views that will be accessed by business intelligence tools such as Business Objects.

Responsibility:

  • Gathered Business requirements and Mapping documents.
  • Built multiple graphs to unload all the data needed from different source databases by configuring the .dbc file in the Input Table component.
  • Developed graphs using the GDE with components such as Partition by Round-robin, Partition by Key, Rollup, Sort, Scan, Dedup Sorted, Reformat, Join, Merge, Gather, and Concatenate.
  • Deployed and test-ran graphs as executable Korn shell scripts on the application system.
  • Modified Ab Initio component parameters to utilize data parallelism, thereby improving overall performance and fine-tuning execution times.
  • Involved in performance testing and tuning of graphs by adding new components.
  • Used the Watcher and leading-records components in the testing phase.
  • As part of unit testing, rewrote functions used in several components to improve performance.
  • Used data parallelism, pipeline parallelism, and component parallelism in graphs, where huge data files are partitioned into multifiles and each segment is processed simultaneously.
  • Used built-in Ab Initio functions to build custom components that help implement complex business logic.
  • Performed complex queries involving large volumes of data; some of these queries run in parallel against the same table to improve performance.
  • Validated and reviewed code against the requirements.
  • Worked on Documentation of release notes, code changes, unit test cases and UAT test cases.
  • Verified and changed functional, systems documents for changes in requirements.
  • Worked on Source to Target mapping documents with business users.
  • Developed wrapper scripts to execute Ab Initio graphs, followed by error-handling routines and post-audit checks/data reconciliation after graph execution.
  • Executed the test cases and verified the expected results against actual results. Used Test Director to log, monitor and update defects during test.
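A post-audit/reconciliation step of the kind described above typically compares the source record count against the count loaded into the target. A minimal sketch; in the real job the target count would come from a database query, but here it is passed in as a parameter for illustration:

```shell
#!/bin/sh
# Hypothetical post-load audit: reconcile source rows against loaded rows.

audit_counts() {
    src_file=$1
    target_count=$2   # in practice, fetched from the target table
    src_count=$(($(wc -l < "$src_file")))   # arithmetic expansion trims padding
    if [ "$src_count" -ne "$target_count" ]; then
        echo "AUDIT FAIL: source=$src_count target=$target_count" >&2
        return 1
    fi
    echo "AUDIT OK: $src_count records reconciled"
}
```

The wrapper would call this after the graph finishes and propagate a non-zero exit code so the scheduler flags the run.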

Environment: Ab Initio 2.14, Control-M, Oracle, SQL Server, SQL*Plus, UNIX, Quality Center 9

Confidential, South Brunswick

ETL Developer

Project Description: Responsible for building the data warehouses/data marts for the organization's credit risk and financial data by transforming the business requirements into system specifications; detailed design; development in Ab Initio, Korn shell, and Teradata/Oracle SQL; and unit and integration testing of ETL code.

Responsibility:

  • Extracted data from Oracle and used it to populate the Oracle data warehouse tables.
  • Created Korn shell scripts and cron jobs to refresh the load on a weekly basis.
  • Performed impact analysis of all the interfaces to set up new clients, generate new extracts, and develop new interfaces.
  • Developed complex Ab Initio XFRs to derive new fields and address various business requirements.
  • Developed a number of Ab Initio graphs based on business requirements using various Ab Initio components such as Partition by Key, Partition by Round-robin, Reformat, Rollup, Join, Scan, Normalize, Gather, Broadcast, and Merge.
  • Created test scenarios that were used to validate the Ab Initio graphs.
  • Designed and developed complex Ab Initio graphs using the Aggregate, Join, Rollup, Scan, and Lookup transformations.
  • Used subgraphs to increase the clarity of graphs and to impose reusable business restrictions.
  • Developed several partition-based Ab Initio graphs for a high-volume data warehouse; performed UAT, unit, and system testing.
  • Reviewed newly developed Ab Initio graphs for new interfaces and extracts, and modified graphs (new and modified transformation logic, new database procedures, etc.).
  • Reviewed unit testing results, detailed test case documents, and testing-related deliverables.
  • Involved in all phases of the System Development Life Cycle (SDLC).
  • Assigned phases and set up checkpoints in complex graphs with large numbers of components to protect against failure, avoid deadlock, and allow easy recovery of failed graphs.
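The weekly refresh described above pairs a crontab entry with a Korn shell script. A minimal sketch; the schedule, paths, and the stage-and-swap refresh pattern are illustrative assumptions:

```shell
#!/bin/sh
# Hypothetical weekly refresh; schedule and paths are assumptions.
# Example crontab entry: run every Sunday at 02:00, appending to a log:
#   0 2 * * 0 /apps/etl/bin/weekly_refresh.ksh >> /apps/etl/logs/refresh.log 2>&1

weekly_refresh() {
    # Stage the new extract, then atomically swap it into place so readers
    # never see a half-written file.
    extract=$1; target=$2
    cp "$extract" "$target.tmp" && mv "$target.tmp" "$target"
}
```

The stage-and-swap pattern is the file-system analogue of the truncate-and-reload refresh commonly used for weekly warehouse loads.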

Environment: Ab Initio, SAS, Control-M, UNIX, Oracle.

Confidential

Software Programmer/Analyst

Project Description: This system manages life and annuity policies and also maintains accounting and transaction records for those policies. The system helps generate required reports and client correspondence, such as data pages, confirmation statements, and annuity statements, whenever the policies require them. This system provides input to SAP Finance.

Responsibilities:

  • Involved in creating forms and reports using Developer.
  • Extensively involved in coding triggers and packages at the application level for every transaction.
  • Ran EXPLAIN PLAN over every query in the application to estimate the cost of each query.
  • Used SQL*Loader to load the input data into the database tables.
  • Created indexes to improve performance and discussed SQL query tuning with the DBA.
  • Involved in writing procedures and triggers.
  • Created back-end procedures using PL/SQL.
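A SQL*Loader load of the kind mentioned above is driven by a control file describing the flat-file layout. A minimal sketch; the table name, columns, and file names are hypothetical, not taken from the project:

```shell
#!/bin/sh
# Hypothetical SQL*Loader setup; table, columns, and file names are illustrative.

make_ctl() {
    # Write a control file mapping a comma-delimited file onto a table.
    cat > "$1" <<'EOF'
LOAD DATA
INFILE 'policies.dat'
APPEND
INTO TABLE policies
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(policy_no, holder_name, premium)
EOF
}

# Typical invocation (requires an Oracle client, so it is not run here):
#   sqlldr userid=$DB_USER/$DB_PASS control=policies.ctl log=policies.log
```

APPEND adds rows to the existing table; REPLACE or TRUNCATE would be used for a full reload.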

Environment: Oracle, SQL, PL/SQL, ASP/VB, SQL*Loader, Windows 95/98.

Confidential

Software Programmer Analyst

Project Description: Confidential, a web-enabled CRM, was developed in ASP.NET and proposed as an integrated web-enabled solution aimed at consolidating the sales and marketing efforts within CGS. Confidential addresses the requirements of management and the sales and marketing personnel in their day-to-day activities. A hierarchy-based security system, reports, and an admin module controlling the entire system are some of the highlights of the system.

Responsibilities:

  • Reviewed the business requirements documents and the technical specifications.
  • Created the Dev and QA environments and developed the application from the start of the project.
  • Responsible for developing the application per the business requirements.
  • Used Test Director for requirement management, planning, and execution.
  • Recorded, maintained, and tracked defects; assigned type and severity levels.
  • Used Mercury Test Director for defect tracking, and coordinated the defect resolution process and test reports.
  • Compatibility checking: tested web performance and security; involved in testing the application against different browsers (Netscape Navigator and Internet Explorer).
  • Tested pipeline analysis, stage history in Opportunity, the Opportunity competitors list, quotas, and competitor analysis.
  • Wrote test procedures and documented results, including any exceptions involved in testing.

Environment: Windows 2000 Professional, IIS, ASP.NET, SQL Server 2000, and Test Director 6.0
