Ab Initio Team Lead/Architect Resume

NC

SUMMARY

  • Nine years of experience using the Ab Initio tool to develop ETL (Extraction, Transformation and Loading) strategies in complex, high-volume Data Warehousing projects.
  • Strong development skills, including the ability to work through the entire software development life cycle (SDLC), from requirements gathering through implementation, development, production support, and documentation of the complete project.
  • Worked extensively in the GDE (Graphical Development Environment) configuring, connecting, and testing the components to produce executable flow graphs on Unix environment.
  • Proficient with various Ab Initio Data Cleansing, Parallelism, Transformation and Multi File System techniques.
  • Possess excellent working knowledge of the ACE framework, by which one can create “pset” values for multiple generic graphs in one instance.
  • Expert in writing UNIX shell scripts, including Korn shell and Bourne shell scripts.
  • Extensively wrote wrapper scripts in Korn shell for job scheduling.
  • Practical experience working in multiple environments such as production, development, and testing.
  • Expert knowledge of dimensional data models using Star and Snowflake schemas in relational, dimensional, and multidimensional modeling; creation of fact and dimension tables; OLAP and OLTP; and a thorough understanding of data-warehousing concepts.
  • Working knowledge of Oracle Database Administration (DBA), with strong SQL, PL/SQL, SQL*Loader, and query-optimization programming skills.
  • Ability to quickly grasp new concepts and apply software methodologies as per business needs.
  • Excellent interpersonal and professional communication skills; have worked extensively in team-oriented environments with a strong desire to achieve specified goals.
  • Extensively used Ab Initio EME Data store/Sandbox for version control, code promotion, and impact analysis.
  • Provided 24x7 support for production and testing of daily, weekly, and monthly data refreshes, and fixed complex/critical production problems.
  • Hands-on experience with large-volume data marts.
  • Deployed CPU-heavy ETL processes and tuned complex Ab Initio processes.
  • Worked with the testing team and developed unit test plans and test cases for component, data, stress, and regression testing.
  • Solid experience with database query tools such as TOAD, SQL Navigator, SQL Assistant, and SQL*Plus.
  • Experienced in using Oracle SQL*Loader to bulk-load data from flat files into database tables.
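
The Korn shell wrapper scripts mentioned above typically look something like this minimal sketch; all paths and names here are hypothetical, not taken from a specific project:

```shell
#!/bin/ksh
# Minimal wrapper-script sketch (names hypothetical): runs a deployed
# Ab Initio graph script, logs start/end, and propagates the exit code
# so a scheduler (Control-M, CA7, TWS) can detect failure.
run_graph() {
    graph_ksh=$1
    log=$2
    echo "$(date '+%Y-%m-%d %H:%M:%S') START $graph_ksh" >> "$log"
    "$graph_ksh"          # the deployed graph is itself a generated ksh script
    rc=$?
    echo "$(date '+%Y-%m-%d %H:%M:%S') END $graph_ksh rc=$rc" >> "$log"
    return $rc
}

# e.g. run_graph /apps/edw/run/load_claims.ksh /apps/edw/log/load_claims.log
```

Returning the graph's own exit code, rather than swallowing it, is what lets the scheduler distinguish success from failure.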

TECHNICAL SKILLS

Database: Oracle 11G, DB2, SQL Server, Teradata 14

Languages: SQL, PL/SQL

Operating Systems: Windows XP/Vista/7, UNIX (HP-UX, AIX, Sun Solaris), MS-DOS, Linux

Data Warehouse Tools: Ab Initio (Co-Op - 3.0, GDE 3.15), EME, Informatica (7.0/8.0)

Scripting: UNIX, Korn shell, JavaScript

Scheduling Utilities: Control-M, CA7, TWS

PROFESSIONAL EXPERIENCE

Confidential, NC

Ab Initio Team Lead/Architect

Responsibilities:

  • Review and publish feedback on “ETL Code” developed by Ab Initio developers and ensure that they adhere to the coding standards of BCBSNC.
  • Understand the Information Management (IM) software development life cycle (SDLC) and adhere to development team software standards and best practices to write code effectively for projects.
  • Participate in regular technical peer review sessions to identify non-adherence to standards, design and performance issues/improvements in ETL code.
  • Also implemented the ACE framework to test the psets (created for multiple generic graphs) by running the graphs with injected test data.
  • Used the ACE framework (Ab Initio Configuration Environment) to build a generic Ab Initio-based application platform, which was integrated with the BRE, Data Profiler, and Metadata Hub to form the DQE (Data Quality Environment).
  • Wrote UNIX shell scripts and wrapper scripts for unique functionalities, and wrote Teradata queries for condition-based loading and unloading from the source tables.
  • Worked in all the phases of the project including Design, Code, SIT, UAT, and deployment.
  • Monitor processes on the Tivoli job scheduler, provide prompt resolutions to production failures adhering to the SLAs, and make minor code changes as required.
  • Provide team status in various projects, escalate issues as needed, assess and communicate risks to the development schedule and project to represent the Data Integration Development team’s interests in cross-functional project teams and ensure project success.
  • Facilitate cross-functional problem solving sessions to arrive at optimal solutions accounting for stakeholder needs (e.g. business requirements and delivery dates) to ensure successful project delivery.
  • Seek out best practices and make recommendations for BCBSNC’s implementation to find opportunities for improvements in development processes.
  • Analyze several aspects of code prior to release to ensure that it will run efficiently and can be supported in the production environment.
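
The condition-based Teradata unloading mentioned in the responsibilities above amounts to building a dated SELECT and feeding it to a client utility; a hedged sketch, where the table name, column name, and cutoff date are all placeholders:

```shell
# Sketch: build a conditional-unload SELECT for a Teradata source table.
# Table, column, and date are illustrative; the generated SQL would
# normally be piped to a utility such as bteq (logon details omitted).
build_unload_sql() {
    tbl=$1
    as_of=$2
    echo "SELECT * FROM ${tbl} WHERE load_dt >= DATE '${as_of}';"
}

# e.g. build_unload_sql member_stg 2012-01-01 | bteq
```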

Environment: Ab Initio (Co-Op - 3.16, GDE - 3.17), Control-M, Ab Initio Data Profiler, Teradata 13.0.

Confidential, Lincolnshire, IL

Ab Initio Team Lead

Responsibilities:

  • Design Ab Initio graphs whose core functionality is to extract data from source systems, cleanse the extracted data, and transform it according to the business logic before loading it into the target Netezza database.
  • Develop complicated graphs with multiple Ab Initio components such as Join, Rollup, Lookup, Gather, Merge, Interleave, and Dedup Sorted.
  • Responsible for effective implementation of data parallelism; develop all processes in a multi-file system environment.
  • Design Ab Initio graphs for data cleansing and validation, changed-data-capture processes, and loading from Ab Initio into the target Teradata database.
  • Deploy, test and run the Ab Initio graph as executable UNIX shell scripts.
  • Modify Ab Initio component parameters and utilize data parallelism and thereby improve the overall performance of the graph.
  • Perform unit testing, integrating testing, regression testing and system testing for Customer related applications.
  • Write UNIX shell scripts and wrapper scripts for unique functionalities, and write Teradata queries for condition-based loading and unloading from the source tables.
  • Work in all the phases of the project, including Design, Code, SIT, UAT, and Deployment.
  • Monitor the process on the Tivoli job scheduler, provide prompt resolutions to production failures adhering to the SLAs, and make minor code changes as required.
  • Work on change requests in the existing project by understanding the code and doing impact analysis of each change across the entire Customer Data Mart system, then developing, testing, and implementing the change requests.
  • Enhance Ab Initio graphs, incorporating parameterization in a few of the graphs to make them reusable.
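
Parameterizing graphs for reuse, as described above, usually means driving one generic deployed graph with per-feed parameters. A sketch with entirely hypothetical names (the echo stands in for invoking the deployed graph script, e.g. ./generic_load.ksh):

```shell
# Sketch: one parameterized graph reused across feeds. The deployed
# generic graph would read SRC_FILE/TGT_TABLE from the environment;
# all feed names and paths here are illustrative only.
load_feed() {
    feed=$1
    export SRC_FILE="/data/in/${feed}.dat"
    export TGT_TABLE="stg_${feed}"
    echo "loading $SRC_FILE into $TGT_TABLE"   # stand-in for ./generic_load.ksh
}

for feed in customer account claim; do
    load_feed "$feed"
done
```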

Environment: Ab Initio (Co-Op - 2.0, GDE - 2.0), Ab Initio Data Profiler, UNIX, Windows Server 2003, Oracle 11g, Teradata 13.0, Netezza 6.3.

Confidential, Northbrook, IL

Ab Initio Team Lead

Responsibilities:

  • Involved in analysis of existing Ab Initio Graphs, Korn Shell Scripts, and deployment procedures and documenting them.
  • Participated in Software Requirements Specifications (SRS) meetings with various downstream businesses, technical teams.
  • Developed a number of Ab Initio graphs for the ETL processes based on business requirements, using various Ab Initio components like Partition by Key, Partition by Round Robin, Reformat, Join, Rollup, Gather, Replicate, etc.
  • Involved in gathering and documenting requirements from Business users.
  • Enhanced ETL performance using data parallelism (multifile systems), component parallelism, pipeline parallelism, and in-memory sorting.
  • Implemented phasing and checkpoint approach in ETL process to prevent data loss and to maintain uninterrupted data flow against process failures.
  • Redesigned the existing graphs and documented all the new and enhancement requests.
  • Identified critical fields by analyzing cross-field and functional relationships along with pattern-matching techniques, using dependency analysis through the Ab Initio EME web interface and some custom-prepared Ab Initio generic graphs.
  • Created Ab Initio graphs to search a dataset for required data patterns using regular expressions.
  • Brought much-needed domain experience to the offshore team.
  • Involved in the estimation of project timelines and completed all the tasks in the estimated timeline.
  • Involved in preparing project documentation outlining the whole process and project reports.
  • Created UNIX scripts to generate DML expressions for database tables from their DDL using the m_db gendml utility.
  • Created test plans and test cases and responsible for unit testing.
  • Created and automated a comparison process to compare the results generated by the old application (Mainframes) and the new application and captured the unmatched records.
  • Analyzed the issues with the unmatched records and provided code fix to the problems.
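
The mainframe-vs-new-application comparison described above can be sketched in plain UNIX terms: sort both extracts, then capture the unmatched records on each side. File names are placeholders, and whole-record comparison is assumed (a keyed comparison would cut the key fields out first):

```shell
# Reconciliation sketch: capture records present in only one of the two
# extracts, using sorted input and comm's column suppression.
compare_extracts() {
    old=$1
    new=$2
    outdir=$3
    sort "$old" > "$outdir/old.sorted"
    sort "$new" > "$outdir/new.sorted"
    comm -23 "$outdir/old.sorted" "$outdir/new.sorted" > "$outdir/only_in_old.dat"
    comm -13 "$outdir/old.sorted" "$outdir/new.sorted" > "$outdir/only_in_new.dat"
}
```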

Environment: Ab Initio (Co-Op - 2.8, GDE - 3.1), TWS Maestro, Ab Initio Data Profiler, UNIX, SQL, EME, Windows, Oracle 10g, Teradata, SQL Server.

Confidential, Memphis, TN

Ab Initio Team Lead

Responsibilities:

  • Analyzed functional requirements and interacted with end users and source-system programmers for business understanding.
  • Led the complete cycle of SQL-to-Ab Initio conversion projects; developed graphs to replicate the SQL and PL/SQL code and implemented them in production.
  • Created mapping documents based on the requirements from Business Analysts.
  • Prepared detailed design document for the developers to develop the graphs.
  • Unloaded data from various source systems: flat files (fixed-length, delimited), XML files, and various databases.
  • Developed Ab Initio graphs to unload the data from various source systems and applied the transformations on data according to business rules and load the data into target database.
  • Performed transformations of source data with Transform components like Join, Dedup Sorted, Normalize, Reformat, Filter by Expression, Rollup, etc.
  • Worked on Multi file systems to implement the data parallelism.
  • Developed Generic graphs for the error validations.
  • Used EME for Version Control.
  • Part of my responsibility was to improve offshore efficiency: as the onsite coordinator for the offshore teams, I bridged the gap so that team members had direct contact with the customer and were more motivated to perform up to the client's productivity expectations.
  • Reviewed code for the other developers and made performance improvements.
  • Performed unit testing, system testing and followed all the deployment process.

Environment: Ab Initio (Co-Op - 2.15, GDE - 2.8), Control-M, Ab Initio Data Profiler, UNIX, SQL, EME, Windows.

Confidential, Parsippany, NJ

Ab Initio Developer

Responsibilities:

  • Involved in System Study & Business Requirements Analysis & Documentation.
  • Developed and supported the extraction, transformation and load (ETL) process for a Data Warehouse from legacy systems using Ab Initio, and provided technical support and hands-on mentoring in the use of Ab Initio.
  • Worked with various Components like Reformat, Join, Filter-By Expression, Sort, Rollup, Scan etc., to transform the data.
  • Replicated operational data into the staging area and transformed and loaded data into warehouse tables using Ab Initio GDE; responsible for automating the ETL process through scheduling and exception-handling routines.
  • Used database query optimization and I/O tuning techniques for performance enhancements.
  • Generated database configuration (.dbc) files for source and target databases, which are used in Ab Initio graphs.
  • As the onsite-offshore coordinator, I was instrumental in improving communications; with the offshore team in India, communication had previously been difficult because of the available channels (phone, chat, and email rather than face to face) and cultural differences.
  • Developed graphs to extract Historical and Incremental Data.
  • Responsible for extracting daily text files from ftp server and historical data from DB2 Tables, cleansing the data and applying transformation rules and loading to staging area.
  • Responsible for creating parameterized graphs, which are used to create common DMLs, XFRs during data transformation phase prior to creating load ready files.
  • Divided Ab Initio graphs into phases with checkpoints to safeguard against failures.
  • Extensively wrote and used UNIX shell scripts as wrapper scripts.
  • Developed various graphs for data cleansing using Ab Initio functions like is_valid, is_error, and is_defined, and various string and math functions.
  • Developed the Ab Initio ETL process, interpreted the transformation rules for all target data objects, developed the software components to support the transformation, and estimated task durations.
  • Responsible for consolidating customer records from various sources to create master customer list.

Environment: Ab Initio (Co-Op - 2.16, GDE - 2.16), Autosys, Ab Initio Data Profiler, UNIX, SQL, EME, Windows.

Confidential, Columbus, OH

Ab Initio Developer

Responsibilities:

  • Involved in analyzing business needs and document functional and technical specifications based upon user requirements with extensive interactions with business users.
  • Involved as designer and developer for the Enterprise data warehouse.
  • Created the mini-specs for different applications.
  • Automated the development of Ab Initio graphs and functions utilizing the metadata from the EME.
  • Developed various Ab Initio graphs to validate data using the Data Profiler, comparing the current data with the previous month's data and applying the AMS standards.
  • Used different Ab Initio components like Partition by Key and Sort, Dedup, Rollup, Scan, Reformat, Join, and Fuse in various graphs.
  • Written DML expressions for Mainframe data (COBOL copy book).
  • Used checkpoint and phasing to avoid deadlocks and to re-run the graph in case of failure.
  • Also used components like Run Program and Run SQL to run UNIX and SQL commands in Ab Initio.
  • Performed transformations of source data with transform components like Join, Match Sorted, Dedup Sorted, Reformat, and Filter by Expression.
  • Made wide use of lookup files when getting data from multiple sources where the data size is limited.
  • Involved in project promotion from development to UAT and from UAT to production.
  • Involved in Production implementation best practices.
  • Implemented data parallelism utilizing MFS in the graphs, dividing data into segments and operating on each segment simultaneously through the Ab Initio partition components.
  • Modified the Ab Initio EME to house the required redaction metadata.
  • Used the Ab Initio Data Profiler to analyze and validate source data and data types, and to determine join datasets by analyzing cross-functional relations.
  • Used Control-M for scheduling jobs in the development environment.
  • Used different EME air commands in project promotion, such as air tag create, air save, air load, and air project export.

Environment: Ab Initio GDE 1.15, Co>Operating System 2.15, Control-M, Ab Initio Data Profiler, UNIX, Windows, Teradata, Teradata SQL Assistant V2R6, Oracle, DB2, SQL Server.

Confidential, St. Louis, MO

Ab Initio Developer

Responsibilities:

  • Translated business-reporting requirements into data warehouse architectural designs and analyzed source and target data models and made necessary changes.
  • Implemented Ab Initio configuration and set up well-tuned environments for Production and Development.
  • Responsible for creating generic graphs using the Ab Initio components for the ODS and DSS environments.
  • Involved in the complete ETL process life cycle.
  • Develop preprocessing and validation graphs.
  • Involved in populating huge data from ODS, OLTP systems.
  • Develop, implement, and maintain data feeds into and out of the data warehouse using Ab Initio.
  • Prepare business and technical documentation.
  • DB2 was utilized as source system.
  • Production support on weekly rotation basis to monitor daily and weekly jobs.
  • Create and maintain Slowly Changing Dimension (SCD) Type I and Type II tables in the database.
  • Store output from the file directly into the reporting application tables using Load DB Table.
  • Create Data and Type validation graphs for performing data cleansing.
  • Interpret the transformation rules for all target data objects and develop the components to support the transformation.
  • Create graphs for Data standardization and Data integration into mart.
  • Performance tuning of Ab Initio ETL graphs and processes using various parallelism techniques, proper memory usage, planned phasing and layout.
  • Performance testing of ETL routines.
  • Perform various Database operations like create, drop and update tables and views.
  • Developed Korn shell scripts for file processing.
  • Utilize various Ab Initio components like Lookup, Join, Rollup and Reformat to process data.
  • Analyze and develop production schedule for ETL processes in the system.
  • Study relationship between tables to determine dependencies for individual jobs schedule.
  • Scheduling of Shell Scripts was done by Autosys.
  • Developed JIL Files for running Autosys Jobs.
  • Set up development, QA & Production environments.
  • Migrated jobs from development to QA to Production environments.
  • Created Process Flow diagrams using MS VISIO.
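
A JIL definition for one of the Autosys jobs above would look roughly like this sketch; the job names, machine, owner, and paths are hypothetical, and exact attributes vary by Autosys version:

```
/* Hypothetical JIL sketch: a command job that runs the Korn shell
   wrapper once its upstream extract job succeeds. */
insert_job: EDW_DAILY_LOAD
job_type: c
command: /apps/edw/bin/daily_load.ksh
machine: etlprod01
owner: abinitio
condition: s(EDW_DAILY_EXTRACT)
std_out_file: /apps/edw/log/daily_load.out
std_err_file: /apps/edw/log/daily_load.err
alarm_if_fail: 1
```

The condition attribute is what encodes the inter-job dependencies derived from the table-relationship study.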

Environment: Ab Initio (Co-Op - 2.15, GDE - 1.15), Korn Shell, Oracle 10g, Teradata, SQL Server, TWS Maestro, PVCS.
