Ab Initio Developer Resume
Atlanta, GA
SUMMARY:
- Over seven years of professional IT experience in Business Analysis, Design, Data Modeling, Development and Implementation of various client server and decision support system environments with focus on Data Warehousing, Business Intelligence and Database Applications.
- Over 5 years of Ab Initio Consulting with Data mapping, Transformation and Loading from Source to Target Databases in a complex, high volume environment.
- Extensively worked on several ETL Ab Initio assignments to extract, transform and load data into tables as part of Data Warehouse development with highly complex data models of Relational, Star, and Snowflake schemas.
- Expert knowledge in using various Ab Initio components such as Join, Reformat, Scan, Rollup, Normalize, Denormalize, and Partitioning and De-partitioning components.
- Experience in Data Modeling, Data Extraction, Data Migration, Data Integration, Data Testing and Data Warehousing using Ab Initio.
- Well versed with various Ab Initio parallelism techniques and implemented Ab Initio Graphs using Data, Component, pipeline parallelism and Multi File System (MFS) techniques.
- Configured Ab Initio environment to connect to different databases using DB config, Input Table, Output Table, Update table Components.
- Very good understanding of Teradata MPP architecture such as Shared Nothing, Nodes, AMPs, BYNET, Partitioning, Primary Indexes etc. Extensively used different features of Teradata such as BTEQ, Fast load, Multiload, SQL Assistant, DDL and DML commands.
- Experience in using EME for version controls, impact analysis and dependency analysis.
- Extensive experience in Korn Shell Scripting to maximize Ab-Initio data parallelism and Multi File System (MFS) techniques.
- Experience in providing production support for various Ab Initio ETL jobs and developing various UNIX shell wrappers to run Ab Initio and database jobs.
- Good experience working with very large databases and performance tuning.
- Good Experience working with various Heterogeneous Source Systems like Oracle, DB2 UDB, Teradata, MS SQL Server, Flat files and Legacy Systems.
- Experience in DBMS Utilities such as SQL, PL/SQL, TOAD, SQL*Loader, Teradata SQL Assistant.
- Experienced with Teradata utilities Fast Load, Multi Load, BTEQ scripting, FastExport, OleLoad, SQL Assistant.
- Able to interact effectively with other members of the Business Engineering, Quality Assurance and user teams, and others involved in the System Development Life Cycle.
- Expertise in preparing code documentation in support of application development, including High level and detailed design documents, unit test specifications, interface specifications, etc.
- Excellent communication skills, interacting with people at all levels on all projects and playing an active role in Business Analysis.
- Managed multiple projects/tasks within the mortgage, banking and financial services industries in high-transaction-processing environments, with excellent analytical, business process, written and verbal communication skills.
TECHNICAL SKILLS:
ETL Tools: Ab Initio (GDE 1.15/3.2/3.3.4, Co>Operating System 2.15/3.0/3.3)
RDBMS: Teradata 15/15.10/V13/V12/V2R6, Oracle 8i/9i/10g/12c, SQL Server, DB2, MS Access.
DB Tools: SQL*Loader, import/Export, TOAD, Teradata SQL Assistant
Languages: C, C++, SQL, PL/SQL, XML, Perl Scripting and Korn Shell Scripting.
Platforms: Sun Solaris 7.0, HP-UX, AIX, Windows NT/XP/2000/98/95, Mainframes, MS-DOS.
Modeling Tools: Erwin, MS Visio
EXPERIENCE:
Confidential, Atlanta, GA
Ab Initio Developer
Responsibilities:
- Ab Initio Developer for the creation of a Data Mart implementation that includes products for analytical and reporting use.
- Performed detailed profiling of operational data using the Ab Initio Data Profiler and SQL to give business analysts a better understanding of the data available for analytical use.
- Participated in several JAD sessions with business analysts to refine the requirements.
- Developed and managed processes and workflows involving data moving within, into, and out of the Hadoop environment.
- Created a Data Mart POC in user Teradata space to validate the requirements with users and to refine the mapping document with the right transformations.
- Participated in an Agile iterative methodology with the help of the BT Project Manager.
- Automated load processes using the CA-7 Workload Automation tool.
- Implemented data marts in ODS, EDW, DM and ADB (Application Database). Coordinated with Enterprise Warehouse architects to follow the corporate standards for the implementation. Used existing Metadata, Audit and ETL frameworks.
- Involved in creation of Logical and Physical models using Erwin for ODS, EDW, DM and ADB, and created DDLs for the DBA to create structures in the Teradata development, staging and production environments. The modeling was done through JAD sessions with involvement from Enterprise Architects and Business Users.
- Evaluated existing Teradata Industry logical data model (ILDM) related to insurance to be used for the Data Mart.
- Created mapping document for all the above 4 environments and ETL design document for the ETL developers to code.
- Involved in walkthroughs of the Data Models and ETL design documents with ETL developers before each ETL coding iteration. Did the integration testing and was involved in UAT with business users. Did the Ab Initio ETL code walkthrough and made some performance improvements.
- Automated the entire process using UNIX shell scripts and scheduled the process using Autosys after dependency analysis (a representative wrapper sketch appears after this list).
- Extensively used the Ab Initio components like Reformat, Join, Partition by Key, Partition by Expression, Merge, Gather, Sort, Dedup Sort, Rollup, Scan, FTP, Lookup, Normalize, Denormalize, Input, Output and Join with DB. Used Ab Initio features like MFS (8-way), checkpoints, phases etc.
- Well versed with Ab Initio parallelism techniques and implemented Ab Initio graphs using data parallelism and MFS techniques.
- Good understanding of new Ab Initio features like Component Folding, Parameter Definition Language (PDL), Continuous flows, Queues, publisher and subscriber components.
- Worked extensively to create, schedule, and monitor the workflows and to send the notification messages to the concerned personnel in case of process failures.
- With the help of the Enterprise Metadata team, uploaded the technical and business metadata to the enterprise-level Metacenter. Defined audit thresholds for the Balance and Control rejections during the ETL process.
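The UNIX shell automation above refers to wrapper scripts of the following kind. This is a minimal, hypothetical sketch in Korn shell, assuming the graph was deployed from the GDE as a .ksh script; the script name, paths, environment file and mail address are illustrative placeholders, not the actual project code.

#!/bin/ksh
# Hypothetical wrapper: run a deployed Ab Initio graph, capture its log, and
# return a non-zero exit code so the scheduler (Autosys/CA-7) can flag failures.
GRAPH=load_datamart.ksh                          # deployed graph script (assumed name)
LOGDIR=/app/etl/logs                             # log directory (assumed path)
RUNLOG=$LOGDIR/${GRAPH%.ksh}_$(date +%Y%m%d%H%M%S).log

. /app/etl/env/abinitio.env                      # assumed env file setting AB_HOME, PATH, DB configs

/app/etl/bin/$GRAPH > $RUNLOG 2>&1
rc=$?

if [ $rc -ne 0 ]; then
    echo "$GRAPH failed with return code $rc, see $RUNLOG" |
        mailx -s "ETL FAILURE: $GRAPH" dw-support@example.com
    exit $rc
fi
echo "$GRAPH completed successfully" >> $RUNLOG
exit 0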
Environment: Ab Initio (GDE 3.3.4, Co>Operating System 3.3.4), Oracle 12c, SQL Server, IBM DB2 11.1, Hadoop 2.0, HDFS, Unix Shell Scripts, CA-7, TOAD.
Confidential, Charlotte, NC
Ab Initio Developer
Responsibilities:
- Created Ab Initio graphs that transfer data from various sources like Oracle, flat files and CSV files to the Teradata database and flat files.
- Derived and modeled the facts, dimensions and aggregated facts in Ab Initio from the data warehouse star schema to create billing and contracts reports.
- Worked on Multi file systems with extensive parallel processing.
- Extensively used Partitioning Components: Broadcast, Partition by Key, Partition by Range, Partition by Round Robin, and De-partition components like Concatenate, Gather and Merge in Ab Initio.
- Implemented Transform Components such as Aggregate, Dedup Sorted, Filter by Expression, Join, Normalize, Reformat, Rollup and Scan Components and created appropriate XFRs and DMLs.
- Automated load processes using Autosys.
- Used Lookup Transformation in validating the warehouse customer data.
- Prepared logical/physical diagrams of the data warehouse and presented them to business leaders. Used Erwin for model design.
- Performed bulk data loads from multiple data sources (Oracle 11i, legacy systems) to the Teradata RDBMS.
- Used the BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS (see the BTEQ sketch after this list).
- Coded and tested Ab Initio graphs to extract the data from Oracle tables and MVS files.
- Enhancements were done to the existing System as specified by the customer using COBOL, DB2, and JCL.
- Generated COBOL applications and corresponding custom JCL scripts.
- Worked closely with the end users in writing the functional specifications based on the business needs.
- Used SFTP for file transfers in project delivery.
- Participated in project review meetings.
- Extensively worked with PL/SQL Packages, Stored procedures & functions and created triggers to implement business rules and validations.
- Responsible for Performance-tuning of Ab Initio graphs.
- Collected and analyzed the user requirements and the existing application and designed logical and physical data models.
- Created reports using Crystal Reports.
- Ran scripts through UNIX shell scripts under batch scheduling.
- Responsible for preparing interface specifications and complete documentation of graphs and their components.
- Responsible for testing the graph (Unit testing) for Data validations and preparing the test reports.
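The BTEQ usage above followed the common pattern of embedding SQL in a here-document inside a Korn shell script. A minimal sketch, assuming hypothetical logon, database, table and column names:

#!/bin/ksh
# Hypothetical BTEQ batch call: run business SQL against Teradata and fail the
# step if BTEQ reports an error.
bteq <<EOF > billing_extract.log 2>&1
.LOGON tdprod/etl_user,password;
DATABASE billing_dm;

SELECT contract_id, billing_month, SUM(billed_amt)
FROM   fact_billing
GROUP  BY 1, 2
ORDER  BY 1, 2;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF
exit $?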
Environment: Ab Initio (GDE 3.2, Co>Operating System 3.2), Oracle 12c, Teradata 15/15.10 (FastLoad, MultiLoad, FastExport, BTEQ, SQL Assistant), Shell Scripts, JCL, Crystal Reports, Autosys.
Confidential, Northbrook, IL
Ab Initio Developer
Responsibilities:
- ETL Developer involved in creation of Enterprise Project Management (EPM) Data Mart for the enterprise project level reporting and analytics.
- Conducted user requirement sessions to get the requirements for the EPM Analytics and reporting.
- Had a walkthrough of the Prism (EPM vendor software) tables to gain a better understanding of the attributes/elements present in the software related to projects, project requests, service requests and tasks.
- Followed the enterprise standard of creating Normalized Standard Layer and Dimensional Presentation Layer for the Data Mart creation.
- Did the data profiling using SQL and PL/SQL code to understand the data and relationships on the operational system.
- Involved in the creation of Logical and Physical Models for the EPM Standard Layer and Presentation Layer using Erwin Modeling tool and also created DDLs for DBA to create Oracle Structures.
- Extensively involved in various data modeling tasks including forward engineering, reverse engineering, complete compare, match and merge, creating subject areas, creating DDL scripts, publishing the model to PDF and HTML format, generating various data modeling reports etc.
- Identified and tracked the slowly changing dimensions and determined the hierarchies in dimensions. Used Kimball methodologies for Star Schema Implementation.
- Created Mapping Documents and ETL design documents for both Presentation Layer and Standard Layer. Followed Enterprise Level Metadata and Audit standards while creating design and source-to-target mapping documents.
- Coordinated with ETL developers from the offshore team in India to develop the Ab Initio ETL process to populate both the Standard and Presentation Layers. Reviewed the entire ETL process for performance tuning, audit, backfill and go-forward processing. Did the integration testing in both layers, reviewed the audit checks, and validated the data against the operational EPM reports.
- Created several automated UNIX shell scripts to create wrappers for the Ab Initio code to do parameter passing and to load Balance and Control files to the Oracle B&C database (see the SQL*Loader sketch after this list).
- Handed over the production maintenance phase to Offshore Team.
- Coordinated with the BO developer on the creation of the BO Universe, created templates for the daily/monthly reports, and helped create dashboards with useful KPIs.
- Coordinated with BO developer to create several OLAP cubes for the dimensional analysis.
- Conducted user trainings to help them understand the Presentation Layer structures and available Cubes and Reports for analysis.
- Created complete metadata (data lineage) from ETL to Reporting and loaded to Enterprise Metadata Repository.
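The Balance and Control load mentioned above can be illustrated with a SQL*Loader call from a shell wrapper. This is a hedged sketch; the table, file and connection names are made up for illustration:

#!/bin/ksh
# Hypothetical load of a Balance & Control counts file into an Oracle B&C table.
cat > bc_load.ctl <<'EOF'
LOAD DATA
INFILE 'bc_counts.dat'
APPEND INTO TABLE bc_audit
FIELDS TERMINATED BY ','
(job_name, run_date DATE "YYYY-MM-DD", source_count INTEGER EXTERNAL, target_count INTEGER EXTERNAL)
EOF

sqlldr userid=bc_user/password@ORCL control=bc_load.ctl log=bc_load.log
exit $?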
Environment: Ab Initio (GDE 1.15, Co>Operating System 2.15), Oracle 10g, TOAD, SQL, PL/SQL, IBM 390 Mainframes, UNIX, JCL, COBOL, Teradata V12, Autosys, XPIO, PVCS, Erwin, Ascential DataStage 7.5
Confidential, Los Angeles, CA
Ab Initio Developer
Responsibilities:
- ETL Ab Initio Developer involved in the migration of HRM (Household Relationship Management), which is the Master Data Management (MDM) system for the entire company, to a SOA (Service Oriented Architecture) application called HTS.
- The new platform uses J2EE framework, Websphere Application Server, TIBCO, Ab Initio, I/Lytics and Oracle to implement above architecture.
- With this new SOA architecture, various Web Services (WS) like GetParty, GetHousehold, and GetPolicy etc., will be provided so that enterprise wide applications like SIEBEL, IRMS and NOVA etc. can use HRM data by calling these Web Services using SOAP/XML over HTTP.
- Involved in the data migration process from Mainframe to the Oracle database. Involved in the creation of logical and physical models for the new architecture.
- Involved in the Bulk Data Retrieval to extract data from the SOA application using an ETL batch process, as an alternative to Web Services.
- Using the RUP methodology, created various use case specification documents and diagrams from the client integration document for the Bulk Data Retrieval.
- Did the benchmarking and selection of ETL tool for the Bulk processing, which is Ab Initio.
- Did the Proof of Concept (POC) using Ab Initio to show that Ab Initio is the right solution for the HTS Bulk Data Service. Used various Ab Initio components like Call Web service, Read XML, Write XML, xml-to-dml utility to test HTS provided Web Services. Also did the POC with Ab Initio/Oracle Stored Procedures (PL/SQL) to evaluate the performance.
- Performed UNIX shell scripting for the manipulation of XML files as well as for automation (see the sketch after this list).
- Extensively used the Ab Initio components like Reformat, Join, Partition by Key, Partition by Expression, Merge, Gather, Sort, Dedup Sort, Rollup, Scan, FTP, Lookup, Normalize, Denormalize, Input, Output and Join with DB. Used Ab Initio features like MFS (8-way), checkpoints, phases etc.
- Well versed with Ab Initio parallelism techniques and implemented Ab Initio graphs using data parallelism and MFS techniques.
- Used the HTS provided Web Services within ETL tool so that underlying logic won’t be duplicated within batch process.
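The XML manipulation mentioned above can be sketched as a small Korn shell step that combines per-record Web Service responses into one file for the graph to read; the directory, file pattern and root element are assumptions, not the actual HTS layout:

#!/bin/ksh
# Hypothetical pre-processing: strip per-file XML declarations and wrap the
# responses in a single root element before the Ab Initio graph consumes them.
OUT=hts_bulk_$(date +%Y%m%d).xml
echo '<Parties>' > $OUT
for f in responses/party_*.xml; do
    sed '/<?xml/d' "$f" >> $OUT          # drop the XML declaration line
done
echo '</Parties>' >> $OUT
echo "Combined $(ls responses/party_*.xml | wc -l) response files into $OUT"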
Environment: Ab Initio (GDE 1.13, Co>Operating System 2.13), Teradata V2R6 (FastLoad, MultiLoad, FastExport, BTEQ), DB2, UNIX (IBM AIX 5.1), Oracle 9i.
Confidential, Dallas, TX
Ab Initio/Teradata Developer
Responsibilities:
- ETL Developer on CIMS warehouse implementation using Kimball’s dimensional BUS Architecture and methodologies to ETL data from EDS to Data Marts.
- Implemented Star Schema Analytical Data Marts for the following subject areas under CIMS enterprise data warehouse:
- ACAPS
- Collections
- IVR/Avaya Call and Agent Data
- Conducted several JAD sessions with key business analysts to gather the requirements related to reports, KPIs and data mining.
- Performed extensive Data Profiling on the source data by loading the data sample into Database using Database Load Utilities.
- Worked with Data Modelers to create a star schema model for the above subject areas and made sure that all the requirements can be generated from the models created.
- Coordinated with the EDS team to make sure all the data required for the above data marts was in the enterprise data store; otherwise, worked with the EDS team and operational teams to push the required data into the data store (ODS).
- Created a mapping document for each of the tables involved in the above models and explained the mapping documents to the ETL developers.
- Created High level and detail design documents for above data mart projects containing process flows, mapping document, initial and delta load strategies, Surrogate key generation, Type I and Type II dimension loading, Balance and control, Exception processing, Process Dependencies and scheduling.
- Coordinated the ETL efforts with ETL developers through the successful implementation of the above data marts. Involved in day-to-day issue resolution related to data quality, mapping and ETL designs.
- Followed the enterprise standards on mapping documents, design documents, Metadata framework, Balance and Controls.
- Involved in the integration testing with ETL developers and User Acceptance Testing (UAT) with Business Analysts.
- Identified the required dependencies between ETL processes and triggers to schedule the Jobs to populate Data Marts on scheduled basis.
- Did the performance tuning on the Ab Initio graphs to reduce the process time.
- Designed and coordinated the backfill process to load 3 to 5 years of the data into Data Marts.
- Extensively used the Ab Initio components like Reformat, Join, Partition by Key, Partition by Expression, Merge, Gather, Sort, Dedup Sort, Rollup, Scan, FTP, Lookup, Normalize, Denormalize, Input, Output and Join with DB. Used Ab Initio features like MFS (8-way), checkpoints, phases etc.
- Well versed with Ab Initio parallelism techniques and implemented Ab Initio graphs using data parallelism and MFS techniques. Experience in using Conditional Components and Conditional DML.
- Participated in the evaluation of the Teradata DBMS to replace DB2/UDB 7 for the EDW. Created several large tables with real card portfolio data totaling 4 TB for the POC. With the help of the Teradata team in San Diego, created tables with the right primary index and partitioning index on the 4-node Teradata system. Created several complex queries and ran them to get performance measurements, compared the results with those from running the same queries on the UDB DB2 system, and presented the results to management.
- Worked with the DBA team to ensure implementation of the databases for the physical data models intended for the above data marts. Created proper Teradata Primary Indexes (PI) taking into consideration both planned access of data and even distribution of data across all available AMPs. Considering both the business requirements and these factors, created appropriate Teradata NUSIs for fast and easy access to data.
- Extensively used the Teradata utilities like BTEQ, FastLoad, MultiLoad, DDL commands and DML commands (SQL). Created various Teradata macros in SQL Assistant to serve the analysts.
- Created several BTEQ scripts involving derived tables and volatile/global temporary tables for ad hoc purposes to extract data for several business users on a scheduled basis (see the BTEQ sketch after this list).
- Involved in the collection of statistics on important tables to get better plans from the Teradata Optimizer.
- Did performance tuning of user queries by analyzing the explain plans, recreating the user driver tables with the right primary index, scheduling collection of statistics, and adding secondary or join indexes.
- Supported several business areas by developing around 25 complex reports of several types and dashboards using Cognos Report Studio. Validated these reports against operational reports for better confidence.
- Implemented data level, object level and package level security in framework manager to achieve highly complex security standards.
- Created Master Detail reports, drill through and custom prompting reports, and Scheduled reports for efficient resource utilization.
- Created query prompts, calculations, conditions and filters; developed prompt pages and conditional variables. Involved in testing and improving report performance.
- Trained Users on Query Studio for Ad-Hoc Reporting.
- Helped business users by writing complex, efficient Teradata SQL to get detailed extracts for data mining. Automated these extracts using BTEQ and UNIX shell scripting.
- Interacted with senior management from the risk, marketing, finance and fraud teams to create key metrics such as receivables, charge-offs, fraud write-offs, card usage etc.
- Modeled metadata (Facts and Dimensions) in framework manager and published packages to Cognos connection for implemented data marts. Created correct access paths while designing Framework Model.
- Used Cognos Transformer to create/model (DMR) OLAP cubes for multi-dimensional analysis with appropriate alternate drill-down paths and dimensions to enable slicing and dicing to explore data comprehensively.
- Proficiency in Cognos Framework Manager, Transformer, Report Studio, Metric Studio and Analysis studio.
- Helped business users understand the power and limitations of the Cognos reporting tool by conducting training sessions with them.
- Converted several existing highly critical reports, like the yellow book and blue book, which were developed in SAS, into the Cognos framework.
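The ad hoc BTEQ extracts with volatile tables described above can be illustrated with a sketch like the following; the logon, database, table and column names are hypothetical:

#!/bin/ksh
# Hypothetical scheduled BTEQ extract: stage rows in a volatile table, then
# export a report file for business users.
bteq <<EOF > chargeoff_extract.log 2>&1
.LOGON tdprod/etl_user,password;

CREATE VOLATILE TABLE vt_chargeoffs AS
( SELECT acct_id, chargeoff_dt, chargeoff_amt
  FROM   cims_dm.fact_chargeoff
  WHERE  chargeoff_dt >= DATE - 30 )
WITH DATA ON COMMIT PRESERVE ROWS;

.EXPORT REPORT FILE = chargeoffs_last30.txt;
SELECT * FROM vt_chargeoffs ORDER BY chargeoff_dt;
.EXPORT RESET;

.LOGOFF;
.QUIT 0;
EOF
exit $?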
Environment: Ab Initio (GDE 1.12, Co>Operating System 2.12), Mainframes MVS, Business Objects Suite 6.0, Erwin 3.5.2, Oracle 8i, SQL Server 2000, SQL, SQL*Loader, PL/SQL, UNIX.
Confidential
SQL Programmer / Analyst
Environment: Oracle 9.x, SQL, PL/SQL, Developer 2000, Reporter 4.5, UNIX and Windows NT, Korn Shell.
Responsibilities:
- Extensively involved in requirements gathering and data gathering to support developers in handling the design specification
- Involved in designing and coding of functional specifications for the development of user interfaces
- Created tables, indexes, sequences, constraints and snapshots
- Developed packages, procedures, functions and PL/SQL blocks for data validation
- Developed PL/SQL scripts to validate and load data into interface tables (a minimal sketch follows this list)
- Fixed software bugs and interacted with developers to resolve technical issues
- Designed and developed a Generic Billing system for a Telecommunication company
- Responsible for all pre-ETL tasks upon which the Warehouse depends, including managing and collecting various existing data sources.
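A minimal sketch of the PL/SQL validation-and-load pattern referenced above, run from a Korn shell script via SQL*Plus; the connection string, staging and interface table names, and validation rules are hypothetical:

#!/bin/ksh
# Hypothetical interface-table load: validate staged rows in PL/SQL and commit.
sqlplus -s billing_user/password@ORCL <<'EOF'
WHENEVER SQLERROR EXIT FAILURE
BEGIN
  INSERT INTO bill_interface (cust_id, bill_amt, load_dt)
  SELECT cust_id, bill_amt, SYSDATE
  FROM   stg_bill
  WHERE  bill_amt IS NOT NULL                          -- basic validation rules
    AND  cust_id IN (SELECT cust_id FROM customers);
  COMMIT;
END;
/
EXIT
EOF
exit $?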