
Ab Initio Developer Resume


Austin, TX

SUMMARY

  • Over eight years of professional IT experience in Business Analysis, Design, Data Modeling, Development and Implementation of various client/server and decision support system environments, with a focus on Data Warehousing, Business Intelligence and Database Applications.
  • Over six years of Ab Initio consulting covering data mapping, transformation and loading from source to target databases; well versed in Ab Initio parallelism techniques, and implemented Ab Initio graphs using Data, Component and Pipeline parallelism and Multi File System (MFS) techniques in complex, high-volume Data Warehousing projects on both UNIX and Windows.
  • Extensive experience in Korn shell scripting to maximize Ab Initio data parallelism and Multi File System (MFS) techniques.
  • Experience in providing production support for various Ab Initio ETL jobs and in developing UNIX shell wrappers to run Ab Initio and database jobs (a minimal wrapper sketch follows this list).
  • Good experience working with very large databases and performance tuning.
  • Good experience working with heterogeneous source systems like Oracle, DB2 UDB, Teradata, Netezza, MS SQL Server, flat files and legacy systems.
  • Very good understanding of Teradata's MPP architecture, including Shared-Nothing design, Nodes, AMPs, BYNET, Partitioning, Primary Indexes, etc.
  • Experience in DBMS Utilities such as SQL, PL/SQL, TOAD, SQL*Loader, Teradata SQL Assistant.
  • Good knowledge of Teradata RDBMS Architecture, Tools & Utilities.
  • Experienced with the Teradata utilities FastLoad, MultiLoad, BTEQ scripting, FastExport, OleLoad and SQL Assistant.
  • Skillfully exploited the OLAP analytical power of Teradata, using OLAP functions such as RANK, QUANTILE, CSUM, MSUM and GROUP BY GROUPING SETS to generate detailed reports for the marketing team.
  • Worked with Transform components such as Aggregate, Dedup Sorted, Filter by Expression, Join, Normalize, Reformat, Rollup and Scan; created the appropriate XFRs and DMLs; automated load processes using Autosys.
  • Extensively worked on several ETL Ab Initio assignments to extract, transform and load data into tables as part of Data Warehouse development with highly complex data models using Relational, Star and Snowflake schemas.
  • Experienced in all phases of Software Development Life Cycle (SDLC).
  • Experience in feed integration and automated data reconciliation.
  • Expert knowledge of various Ab Initio components such as Join, Reformat, Scan, Rollup, Normalize and Denormalize, and the Partition and Departition components.
  • Experience in Data Modeling, Data Extraction, Data Migration, Data Integration, Data Testing and Data Warehousing using Ab Initio.
  • Experience in application tuning and debugging strategies.
  • Exposure to the Conduct>It, BRE and Data Profiler products.
  • Knowledge in analyzing data using the Ab Initio Data Profiler to examine patterns in the data and to identify duplicates, frequency, consistency, accuracy, completeness and referential integrity.
  • Knowledge of transformation rules management using the Business Rules Engine (BRE).
  • Worked with ODS (Operational Data Store) and DSS (Decision Support System) environments to perform data profiling, data validation and cleansing using Ab Initio.
  • Experience using the Metadata Importer to import metadata from an EME Technical Repository and other sources such as ETL tools (Informatica), reporting tools (Cognos, SAS, Business Objects, etc.) and databases (Oracle, Teradata, DB2, etc.).
  • Hands-on experience with Metadata Hub administration tools and utilities for creating Metadata Hub data stores.
  • Experience in creating and deploying Metadata Hub web applications and loading Metadata Hub customizations.
  • Configured the Ab Initio environment to connect to different databases using DB config files and the Input Table, Output Table and Update Table components.
  • Experience in using EME for version control, impact analysis and dependency analysis.
  • Able to interact effectively with other members of the Business Engineering, Quality Assurance, User and other teams involved in the System Development Life Cycle.
  • Expertise in preparing code documentation in support of application development, including high level and detailed design documents, unit test specifications, interface specifications, etc.
  • Excellent communication skills in interacting with people at different levels on all projects, and an active participant in Business Analysis.
  • Managed multiple projects/tasks within the Mortgage, Banking & Financial Services industries (including Mantas) in high-transaction-processing environments, with excellent analytical, business-process, written and verbal communication skills.
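
As a minimal sketch of the wrapper scripts referenced above: the script below runs a deployed Ab Initio graph and a follow-on database job, failing fast on a non-zero exit status. All paths, the graph name and the stored-procedure call are illustrative assumptions, not artifacts of any project listed here.

#!/usr/bin/ksh
# Hypothetical wrapper: run a deployed Ab Initio graph, then a database job.
AI_RUN=/apps/etl/run          # deployed-graph directory (assumed layout)
LOG=/apps/etl/log/load_customers.$(date +%Y%m%d%H%M%S).log

# Ab Initio deploys graphs as executable .ksh scripts; run one, check status.
$AI_RUN/load_customers.ksh >> $LOG 2>&1
rc=$?
if [ $rc -ne 0 ]; then
    echo "Graph load_customers failed with rc=$rc" >> $LOG
    exit $rc
fi

# Follow-on database job via BTEQ; logon file and procedure are placeholders.
bteq <<EOF >> $LOG 2>&1
.RUN FILE = /apps/etl/cfg/logon.btq
CALL edw.post_load_customers();
.QUIT
EOF
exit $?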

TECHNICAL SKILLS

Data warehousing Tools: Ab Initio (GDE 3.1.2/3.0.4/3.0.2/1.15/1.14, Co>Operating System 3.0.5/2.15/2.14), Informatica 6.1/7.1x, SSIS, DTS

Data Modeling: Star-Schema Modeling, Snowflake Modeling, Erwin 4.0, Visio

RDBMS: Oracle 10g/9i/8i, Teradata 13.0, Netezza 4.6.2, DB2, MS SQL Server 2000/2005/2008

Programming: UNIX Shell Scripting, C/C++, Java, Korn Shell, T-SQL, SQL*Plus, PL/SQL, HTML, ASP.Net

Operating Systems: Windows NT/XP/2000, UNIX, Linux (Red Hat)

BI tools: OBIEE 10.1.3.x, Crystal Reports 8.0/8.5

PROFESSIONAL EXPERIENCE

Confidential, Austin, TX

Ab Initio Developer

Responsibilities:

  • Used Ab Initio components to extract and transfer data from multiple operational data sources like Teradata, DB2 UDB, SQL Server and Oracle to destination data marts in Oracle.
  • Expertise with various Ab Initio components such as Join, Rollup, Lookup, Replicate, Partition by Expression, Partition by Key, Partition by Round-robin, Gather, Merge, Interleave, Dedup Sorted, Sort, Filter by Expression, Scan, Validate, Reformat, FTP, Compare Records, etc.
  • Implemented a number of Ab Initio graphs using Data parallelism and Multi File System (MFS) techniques.
  • Extensive experience in developing transformations between source and target using Ab Initio data mappings, cleansing the data, applying transformations and loading into a complex, high-volume environment.
  • Extensively used UNIX Shell Scripting for writing SQL execution scripts in Data Loading Process.
  • Wrote SQL scripts used in the database components of Ab Initio graphs to extract data from different source tables and load the target tables through the Update Table and Output Table components, supported by config (.cfg) files.
  • Used Ab Initio components like Reformat, Input File, Output File, Join, Sort, Partition by Key, Normalize, Input Table, Output Table, Update Table, Gather Logs and Run SQL for developing graphs.
  • Performed data cleansing operations on the data using transformation functions like is_valid, is_defined, is_null, is_blank, string_lrtrim, re_index, re_interpret_as, string_concat, string_substring, lookup_count, lookup_first, now(), decimal_strip, re_replace, decimal_lpad, next_in_sequence(), length_of, test_characters_all(), force_error(), switch(), first_defined(), lookup_match(), conditional DML, and the cobol-to-dml and xml-to-dml utilities.
  • Used phases and checkpoints to avoid deadlocks and used multifiles in graphs; also used the Run Program and Run SQL components to run UNIX and SQL commands.
  • Excellent understanding of the System Development Life Cycle, with a clear and thorough understanding of business processes and workflow. Involved in all four phases: planning, analysis, design and implementation. Experienced in testing, documentation and requirements gathering.
  • Provided application requirements gathering, designing, development, technical documentation, and debugging. Assisted team members in defining Cleansing, Aggregating, and other Transformation rules.
  • Able to interact effectively with other members of the Business Engineering, Quality Assurance, Users and other teams involved with the System Development Life cycle.
  • Used UNIX environment variables in various .ksh files, which hold the specified locations used to build Ab Initio graphs.
  • Extensively used the Teradata utilities BTEQ, FastLoad and MultiLoad along with DDL and DML (SQL) commands. Created various Teradata macros in SQL Assistant to serve the analysts.
  • Helped business users by writing complex, efficient Teradata SQL to produce detailed data for data mining, and automated these extracts using BTEQ and UNIX shell scripting (see the sketch after this list).
  • Used Enterprise Meta Environment (EME) for version control.
  • Written complex SQLs using joins, sub queries and correlated sub queries. Expertise in SQL Queries for cross verification of data.
  • Creating Metadata Hub data stores using utilities.
  • Creating and deploying Metadata Hub web applications; customizing the Metadata Explorer so business users can explore and analyze the metadata, see the contents of systems and applications, and drill down into the details of an object.
  • Creating new feed files for importing metadata on the command line as well as in the Metadata Portal; creating rule files for transformations and importing the feeds.
  • Creating Data Source Connection files for connecting to the graphs in order to extract the metadata.
  • Generating Metadata Reports and auditing.
  • Adding and exposing the Divisions in the Metadata Portal.
  • Exposing the Notes tab and the various note types in the Metadata Portal.
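
A minimal sketch of the kind of automated BTEQ extract referenced above, assuming a logon file at a placeholder path and an illustrative edw.acct_txn table; the RANK OLAP function mirrors the reporting-style SQL described in the summary.

#!/usr/bin/ksh
# Hypothetical nightly extract: top accounts by transaction total.
OUT=/data/extracts/acct_summary.$(date +%Y%m%d).txt

bteq <<EOF
.RUN FILE = /apps/etl/cfg/logon.btq
.EXPORT REPORT FILE = $OUT
SELECT acct_id
     , SUM(txn_amt) AS tot_amt
FROM   edw.acct_txn
GROUP  BY acct_id
QUALIFY RANK() OVER (ORDER BY SUM(txn_amt) DESC) <= 100;
.EXPORT RESET
.QUIT
EOF

[ $? -eq 0 ] || { echo "acct_summary extract failed" >&2; exit 1; }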

Environment: Ab Initio GDE 3.2, Co-Op 3.1.4.4, EME, Korn shell scripting, UNIX, Teradata 15, SQL Server Navigator 5.0, Windows NT/2000.

Confidential

Sr Ab Initio Developer

Responsibilities:

  • Designed and deployed the Extract, Transform and Load process using Ab Initio, studying the business requirements with the business users.
  • Developed Ab Initio graphs with complex transformation rules through the GDE.
  • Developed complex Ab Initio XFRs to derive new fields and satisfy various business requirements.
  • Developed a graph using the XML write component to take a stream of data and convert it into an XML document.
  • Extensively used Ab Initio components like Join, Rollup and Reformat as well as Partition and Departition components, and functions like is_valid, is_error, is_defined, string_substring, string_concat and other string functions.
  • Implemented lookups, lookup local, in-memory joins and rollups to speed up various Ab Initio graphs.
  • Implemented 4-way and 6-way multifile systems composed of individual files on different nodes, partitioned and stored in distributed directories, utilizing Ab Initio parallelism techniques. Extensively used Ab Initio's Component, Data and Pipeline parallelism.
  • Responsible for Performance-tuning of Ab Initio graphs.
  • Collected and analyzed the user requirements and the existing application and designed logical and physical data models.
  • Worked on the EME metadata environment.
  • Ran scripts through UNIX shell scripts under batch scheduling.
  • Responsible for preparing interface specifications and complete documentation of graphs and their components.
  • Performed data validation before moving the data into staging areas using built-in functions like is_valid, first_defined, is_blank, is_defined, string_length and string_index.
  • Developed Strategies for Data Analysis and Data Validation.
  • Ensured ongoing data quality, including data quality audit benchmarks; communicated monthly data quality metrics and followed prescribed data quality methodologies.
  • Provided guidance and quality assurance for all data masking activities; profiled the source system data to identify potential data issues.
  • Developed and executed complex SQL for data validation (a sketch follows this list).
  • Extensively worked on the Continuous Flow technologies like Database Replication and Message Queuing.
  • Updated and inserted transactional data according to the business changes using Continuous Flows.
  • Responsible for testing the graph (Unit testing) for Data validations and preparing the test reports.
  • Implemented Security Features of Business Objects like row level, object level and report level to make the data secure.
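
A sketch of the kind of validation SQL referenced above, run here through a BTEQ heredoc; the staging and warehouse table names are illustrative assumptions.

#!/usr/bin/ksh
# Hypothetical validation checks: orphaned target rows and duplicate keys.
bteq <<EOF
.RUN FILE = /apps/etl/cfg/logon.btq

-- Orphan check: warehouse rows with no matching row in the source feed
SELECT t.cust_id
FROM   edw.dim_customer t
WHERE  NOT EXISTS (SELECT 1
                   FROM   stg.customer_feed s
                   WHERE  s.cust_id = t.cust_id);

-- Duplicate-key check on the staging feed
SELECT cust_id, COUNT(*) AS dup_cnt
FROM   stg.customer_feed
GROUP  BY cust_id
HAVING COUNT(*) > 1;
.QUIT
EOF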

Environment: Ab Initio GDE 3.1.3.2, Co-Op 3.1.4.4, EME, Korn shell scripting, UNIX, Teradata 14, SQL Server Navigator 5.0, Windows NT/2000.

Confidential, OR

Sr Ab Initio Developer

Responsibilities:

  • Designed and deployed the Extract, Transform and Load process using Ab Initio, studying the business requirements with the business users.
  • Developed Ab Initio graphs with complex transformation rules through the GDE.
  • Developed complex Ab Initio XFRs to derive new fields and satisfy various business requirements.
  • Developed a graph using the XML write component to take a stream of data and convert it into an XML document.
  • Extensively used Ab Initio components like Join, Rollup and Reformat as well as Partition and Departition components, and functions like is_valid, is_error, is_defined, string_substring, string_concat and other string functions.
  • Implemented lookups, lookup local, in-memory joins and rollups to speed up various Ab Initio graphs.
  • Implemented 4-way and 6-way multifile systems composed of individual files on different nodes, partitioned and stored in distributed directories, utilizing Ab Initio parallelism techniques.
  • Extensively used Ab Initio's Component, Data and Pipeline parallelism.
  • Profiled several data sets (serial files, multifiles, database tables) and categorized them into different projects and directories using the Ab Initio Data Profiler.
  • Used Ab Initio functions to improve the performance of Ab Initio graphs.
  • Developed parameterized Ab Initio graphs to increase the performance of the project.
  • Used check-in and check-out of graphs from the EME for graph modification and development.
  • Developed UNIX Korn shell scripts to run various Ab Initio generated scripts. Prepared and implemented data verification and testing methods for the Data Warehouse (see the sketch after this list).
  • Creating Metadata Hub data stores using utilities.
  • Creating and deploying Metadata Hub web applications; customizing the Metadata Explorer so business users can explore and analyze the metadata, see the contents of systems and applications, and drill down into the details of an object.
  • Creating new feed files for importing metadata on the command line as well as in the Metadata Portal; creating rule files for transformations and importing the feeds.
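
One simple data-verification method of the kind referenced above: compare the extract file's record count with the loaded table's row count and fail the run on a mismatch. The file, table and logon paths are illustrative assumptions.

#!/usr/bin/ksh
# Hypothetical post-load verification: file count vs. table count.
SRC=/data/in/orders.dat
FILE_CNT=$(wc -l < $SRC)

bteq <<EOF > /tmp/cnt.out 2>&1
.RUN FILE = /apps/etl/cfg/logon.btq
SELECT COUNT(*) FROM edw.orders_stg;
.QUIT
EOF

# Pull the last all-digit line out of the BTEQ report output.
TBL_CNT=$(grep -E '^ *[0-9]+ *$' /tmp/cnt.out | tail -1 | tr -d ' ')

if [ "$FILE_CNT" -ne "$TBL_CNT" ]; then
    echo "Count mismatch: file=$FILE_CNT table=$TBL_CNT" >&2
    exit 1
fi
echo "Verification passed: $FILE_CNT records"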

Environment: Ab Initio GDE 3.1.3.2, Co-Op 3.1.4.4, EME, Korn shell scripting, UNIX, Teradata 14, SQL Server Navigator 5.0, Windows NT/2000.

Confidential, San Francisco, CA

Sr. Ab Initio Developer

Responsibilities:

  • Created Ab Initio graphs that transfer data from various sources like Oracle, flat files and CSV files to the Teradata database and flat files.
  • Derived and modeled the facts, dimensions and aggregated facts in Ab Initio from the data warehouse star schema to create billing and contracts reports.
  • Worked on Multi file systems with extensive parallel processing.
  • Automated load processes using Autosys.
  • Used Lookup Transformation in validating the warehouse customer data.
  • Prepared the logical/physical diagram of the data warehouse and presented it to business leaders; used Erwin for model design.
  • Performed bulk data loads from multiple data sources (Oracle 8i, legacy systems) to the Teradata RDBMS.
  • Used the BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements against the Teradata RDBMS.
  • Coded and tested Ab Initio graphs to extract the data from Oracle tables and MVS files.
  • Enhancements were done to the existing System as specified by the customer using COBOL, DB2, and JCL.
  • Worked on profiling of operational data using Ab Initio Data Profiler/SQL Tool to get better understanding of the data that can be used for analytical purpose for business analysts.
  • Extensively used UNIX Shell Scripting for writing SQL execution scripts in Data Loading Process.
  • Produced mapping document and ETL design document.
  • Worked closely with the end users in writing the functional specifications based on the business needs.
  • Extensively used FastLoad, TPump and TPT as load utilities (an illustrative FastLoad invocation follows this list).
  • Participated in project review meetings.
  • Extensively worked with PL/SQL Packages, Stored procedures & functions and created triggers to implement business rules and validations.
  • Responsible for Performance-tuning of Ab Initio graphs.
  • Collected and analyzed the user requirements and the existing application and designed logical and physical data models.
  • Worked on EME environment.
  • Ran scripts through UNIX shell scripts under batch scheduling.
  • Responsible for preparing interface specifications and complete documentation of graphs and their components.
  • Extensively worked on the Continuous Flow technologies like Database Replication and Message Queuing.
  • Updated and inserted transactional data according to the business changes using Continuous Flows.
  • Responsible for testing the graph (Unit testing) for Data validations and preparing the test reports.
  • Implemented Security Features of Business Objects like row level, object level and report level to make the data secure.
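
An illustrative FastLoad invocation of the kind referenced above, wrapped in a ksh heredoc; the TDP id, credentials, staging table and record layout are placeholders, not project artifacts.

#!/usr/bin/ksh
# Hypothetical bulk load of a pipe-delimited file into an empty staging table.
fastload <<EOF
LOGON tdprod/etl_user,etl_pass;

DROP TABLE stg.billing_err1;
DROP TABLE stg.billing_err2;

SET RECORD VARTEXT "|";
DEFINE acct_id   (VARCHAR(18))
     , bill_amt  (VARCHAR(18))
     , bill_date (VARCHAR(10))
  FILE = /data/in/billing.dat;

BEGIN LOADING stg.billing
      ERRORFILES stg.billing_err1, stg.billing_err2;

INSERT INTO stg.billing (acct_id, bill_amt, bill_date)
VALUES (:acct_id, :bill_amt, :bill_date);

END LOADING;
LOGOFF;
EOF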

Environment: Ab Initio (Co>Operating System 2.15/2.14, GDE 1.15/1.14), Erwin 4.0, UNIX, MVS, SQL, PL/SQL, Oracle 10g, Teradata V2R6, DB2, COBOL, Perl, Autosys.

Confidential, TX

Ab Initio Developer

Responsibilities:

  • Developed UNIX Korn Shell scripts to run various Ab Initio generated scripts.
  • Developed parameterized Ab Initio graphs for increasing the performance of the Project.
  • Worked on improving the performance of Ab Initio graphs by using various Ab Initio performance techniques.
  • Provided customer support during the warranty period by resolving issues in a timely manner.
  • Experienced in designing the warehouse architecture and the database using Erwin.
  • Developed an understanding of the specifications for the Data Warehouse ETL process and interacted with the designers and end users on informational requirements.
  • Analyzed business requirements and developed metadata mappings and Ab Initio DMLs.
  • Developed subject-area graphs based on business requirements using various Ab Initio components like Filter by Expression, Partition by Expression, Partition by Round-robin, Reformat, Join, Gather, Merge, Rollup, Normalize, Scan, Replicate, etc.
  • Extensively used Ab Initio functions like is_valid, is_error, is_defined, string_substring, string_concat and other string functions.
  • Performed data validation before moving the data into staging areas using built-in functions like is_valid, first_defined, is_blank, is_defined, string_length and string_index.
  • Developed Strategies for Data Analysis and Data Validation.
  • Ensured ongoing data quality, including data quality audit benchmarks; communicated monthly data quality metrics and followed prescribed data quality methodologies.
  • Provided guidance and quality assurance for all data masking activities; profiled the source system data to identify potential data issues.
  • Used the Ab Initio GDE to generate complex graphs for transformation and loading of data into the staging and target database areas.
  • Used UNIX environment variables in various .ksh files, which hold the specified locations used to build Ab Initio graphs.
  • Responsible for writing shell scripts (wrappers) to schedule the jobs in the development environment.
  • Developed graphs for the ETL processes using Join, Rollup and Reformat transform components as well as Partition and De-partition components extensively.
  • Created load-ready files using Ab Initio to load into the database (a pre-load sanity-check sketch follows this list).
  • Experience in Unit testing, System testing and debugging.
  • Provided 24/7 production support for a wide range of applications (ZWP/ZLP).
  • Developed various Ab Initio graphs for data cleansing using Ab Initio functions such as is_valid, is_error, is_defined, is_null and various other string functions.
  • Resolved issues of various severities during testing and production phases on time.
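
As a sketch of the pre-load sanity check mentioned above: confirm that a load-ready file's detail-record count matches the count carried in its trailer record. The layout (a final trailer line of the form T|<count>) is an assumption for illustration.

#!/usr/bin/ksh
# Hypothetical load-ready file check: trailer count vs. detail count.
F=/data/loadready/claims.dat

TRAILER_CNT=$(tail -1 $F | cut -d'|' -f2)
DETAIL_CNT=$(( $(wc -l < $F) - 1 ))        # every line except the trailer

if [ "$DETAIL_CNT" -ne "$TRAILER_CNT" ]; then
    echo "Load-ready check failed: trailer=$TRAILER_CNT details=$DETAIL_CNT" >&2
    exit 1
fi
echo "Load-ready file OK: $DETAIL_CNT detail records"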

Environment: Ab Initio (GDE 1.12.6.1, Co>Operating System 2.12.2), UNIX 5.2, Oracle 8.x, Perl, SQL/PL-SQL, TOAD, Windows NT/2000/XP.

Confidential, Atlanta, GA

Ab Initio Developer

Responsibilities:

  • Performed Metadata Mapping from legacy source system to target database fields and involved in creating Ab Initio DMLs.
  • Involved in creating detail data flows with Source and Target Mappings and convert data requirements into low level design templates.
  • Responsible for setting up Repository projects using Ab Initio EME for creating a common development environment that can be used by the team for source code control.
  • Implemented various levels of parameter definition like project parameters and graph parameters instead of start and end scripts.
  • Developed graphs based on data requirements using various Ab Initio components such as Rollup, Reformat, Join, Scan, Normalize, Gather, Broadcast, Merge, etc., making use of statements/variables in the components for creating complex data transformations.
  • Used various Teradata utilities such as MultiLoad, API and FastLoad with the Input Table and Output Table components, depending on the volume of data and the status of the target database table.
  • Created generic graphs for loading and unloading Teradata tables, using pre- and post-Run SQL components to clean up the work/error/log (WEL) tables created by intermediate process failures.
  • Performed data cleansing using Ab Initio functions such as is_valid, is_error and is_defined.
  • Extensively used string functions, date functions and error functions for source-to-target data transformations.
  • Well experienced in using Partition components (Partition by Key, Partition by Round-robin) and Departition components (Concatenate, Gather, Interleave and Merge) to achieve data parallelism.
  • Created common graphs to perform common data conversions usable across applications, using a parameterized approach with conditional DMLs.
  • Modified Ab Initio graphs to utilize data parallelism and thereby improve overall performance, fine-tuning execution times by using multifile systems and lookup files whenever required.
  • Implemented phasing and checkpoint approach in ETL process to prevent data loss and to maintain uninterrupted data flow against process failures.
  • Implemented lookups instead of joins and in-memory sorts to minimize execution times while dealing with huge volumes of data.
  • Replicated operational tables into staging tables; transformed and loaded data into warehouse tables using the Ab Initio GDE.
  • Deployed and ran the graphs as executable Korn shell scripts in the applications system.
  • Developed UNIX Korn shell script wrappers to run Ab Initio deployed scripts and to perform audit checks, data reconciliation and error handling to ensure data accuracy (a reconciliation sketch follows this list).
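
A reconciliation sketch of the kind referenced above: compare a control total computed from the source feed against the total loaded into the target table. The feed layout (pipe-delimited, amount in field 3) and all names are illustrative assumptions.

#!/usr/bin/ksh
# Hypothetical audit check: source control total vs. target table total.
SRC=/data/in/txn.dat
SRC_TOT=$(awk -F'|' '{ s += $3 } END { printf "%.2f", s }' $SRC)

bteq <<EOF > /tmp/recon.out 2>&1
.RUN FILE = /apps/etl/cfg/logon.btq
SELECT CAST(SUM(txn_amt) AS DECIMAL(18,2)) FROM edw.txn_fact;
.QUIT
EOF

# Last numeric line of the BTEQ report is the total (layout assumption).
TGT_TOT=$(grep -E '^ *[0-9]+\.[0-9]+ *$' /tmp/recon.out | tail -1 | tr -d ' ')

if [ "$SRC_TOT" != "$TGT_TOT" ]; then
    echo "Reconciliation failed: source=$SRC_TOT target=$TGT_TOT" >&2
    exit 2
fi
echo "Reconciliation passed: total=$SRC_TOT"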

Environment: Ab Initio (GDE 1.12, Co-op 2.12), UNIX, PL/SQL, Oracle 10g, Teradata V2R6, Queryman, Windows NT/2000.

Confidential

Teradata Developer

Responsibilities:

  • Managing databases, tables, indexes, views, stored procedures.
  • Enforcing business rules with triggers and user-defined functions; troubleshooting and replication.
  • Writing the Stored Procedures, checking the code for efficiency.
  • Daily monitoring of database performance and network issues.
  • Administering the Teradata server by creating user logins with appropriate roles, dropping and locking logins, monitoring user accounts, creating groups, and granting privileges to users and groups, including SQL authentication (a sketch follows this list).
  • Rebuilding indexes on various tables.
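
A sketch of the kind of Teradata administration SQL described above, run through a BTEQ heredoc: create a role, grant it privileges, and attach a new user to it. All names, space allocations and the password are placeholders.

#!/usr/bin/ksh
# Hypothetical user/role setup on the Teradata server.
bteq <<EOF
.RUN FILE = /apps/dba/cfg/logon.btq

CREATE ROLE analyst_r;
GRANT SELECT ON edw TO analyst_r;

CREATE USER jdoe AS
       PERM = 0
     , SPOOL = 500e6
     , PASSWORD = TempPass123;

GRANT analyst_r TO jdoe;
MODIFY USER jdoe AS DEFAULT ROLE = analyst_r;
.QUIT
EOF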

Environment: Teradata RDBMS, BTEQ, FastLoad, MultiLoad, FastExport, Teradata Manager, Teradata SQL Assistant, Rational ClearQuest, UNIX, MQ, NDM, FTP
