
Senior Ab Initio Developer Resume


Somerset, NJ

SUMMARY:

  • Over 7 years of IT experience in the design, analysis, development, and implementation of Decision Support Systems and Data Warehousing applications, including integration and migration of Extract, Transform, and Load (ETL) processes using Ab Initio software.
  • Hands-on experience in development of Data Warehouses/Data Marts using Ab Initio Co>Op, GDE, Component Library, Oracle, and UNIX, primarily for the Banking/Financial/Insurance industries.
  • Solid experience in Extraction, Transformation and Loading (ETL) mechanism using Ab Initio. Knowledge of full life cycle development for building a data warehouse.
  • Actively involved in ETL design, coding using Ab Initio ETL tool to meet requirements for extract, transformation, cleansing, and loading of data from source to target data structures.
  • Excellent skills in Ab Initio graph/plan development; implemented a number of Ab Initio graphs using ETL techniques.
  • Extensively worked in creating Data Marts using Ab Initio GDE 3.3.2/Co>Op 3.3.2.7.
  • Designed and developed graphs in the GDE with components such as Partition by Round Robin, Partition by Key, Rollup, Sort, Scan, Dedup Sorted, Reformat, Join, Merge, Gather, and Normalize.
  • Good knowledge of UNIX shell scripting and AutoSys JIL scripts.
  • Experience in integration of various data source systems like Flat files, CSV files, mainframes, Teradata, SQL server and Oracle.
  • Extensively used air commands to access the EME; well versed in OLTP and OLAP data modeling, data warehousing concepts, and RDBMS.
  • Experience in version control: creating tags and snapshots.
  • Experience in migrating code to higher environments (DEV to UAT, UAT to PROD) and providing support during migration.
  • Hands-on experience with AutoSys: monitoring jobs and fixing job-dependency issues.
  • Well versed with various Ab Initio components such as Round Robin, Join, Rollup, Partition by key, gather, merge, interleave, Dedup sorted, Scan, Validate, FTP.
  • Experience with the Ab Initio Co>Operating System, application tuning, and debugging strategies.
  • Proficient with the various Ab Initio parallelism and Multi File System (MFS) techniques.
  • Good understanding of Ab Initio SANDBOX and Graphs parameters to make the graph generic.
  • Configured the Ab Initio environment to talk to databases using database configuration (.dbc) files and the Input Table, Output Table, and Update Table components.
  • Worked on slowly changing dimension models SCD1, SCD2, and SCD3.
  • Experience building technical solutions with an emphasis on Oracle and PL/SQL.
  • Developed numerous Ab Initio jobs to extract data from mainframe datasets into UNIX server.
  • Experience in data profiling and data quality.
  • Expertise in technical/functional requirements gathering and development of flows/processes with Oracle and PL/SQL.
  • Experienced in writing SQL query and PL/SQL Procedures, Triggers, and Functions necessary for maintenance of the databases in the data warehouse development lifecycle.
  • SQL/database developer experience in fine-tuning queries; wrote several SQL queries for ad-hoc reporting.
  • Worked on wrapper scripting with UNIX shell programming and scheduling of Ab Initio jobs with Arrow and Control-M.
  • Ability to co-ordinate effectively with development team, business partners, end users and management.
  • Experience in work environment with a mix of onsite-offshore Global Delivery model.
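The wrapper-scripting pattern mentioned above can be sketched roughly as follows; the sandbox path, the graph name, and the run_graph stand-in are all hypothetical — in a real project the wrapper would instead call the .ksh script produced by deploying the graph from the GDE.

```shell
#!/bin/sh
# Minimal wrapper sketch (illustrative names only). The run_graph function
# stands in for the deployed graph script so this sketch runs anywhere.
AI_SANDBOX=/tmp/ai_sandbox_demo          # assumed sandbox root
LOG_DIR="$AI_SANDBOX/log"
mkdir -p "$LOG_DIR"

GRAPH_NAME=load_customer_dim             # hypothetical graph name
LOG_FILE="$LOG_DIR/${GRAPH_NAME}_$(date +%Y%m%d).log"

# Stand-in for: "$AI_SANDBOX/run/$GRAPH_NAME.ksh" (the deployed graph)
run_graph() { echo "graph $GRAPH_NAME finished"; }

# Run the graph, capture its output, and record success or failure.
if run_graph >> "$LOG_FILE" 2>&1; then
    echo "SUCCESS: $GRAPH_NAME" >> "$LOG_FILE"
else
    rc=$?
    echo "FAILURE: $GRAPH_NAME rc=$rc" >> "$LOG_FILE"
    exit "$rc"
fi
```

A scheduler such as AutoSys or Control-M would then key the job's status off the wrapper's exit code.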

TECHNICAL SKILLS:

Ab Initio: GDE 1.15/3.1.5, Co>Op Sys (3.0 to 2.15)

Databases: Oracle, Microsoft SQL Server, IBM DB2.

Operating System: UNIX, Windows 7, Windows XP.

Languages: SQL, PL/SQL, Unix shell scripting.

Scheduling Tools: AutoSys, Tivoli Workload Scheduler, Control-M.

Version Control Tool: EME, GIT.

Testing Tools: HP Mercury Quality Center.

Data Modeling: Star Schema and Snowflake schema, Erwin tool.

PROFESSIONAL EXPERIENCE:

Confidential, Somerset, NJ

Senior Ab Initio Developer

RESPONSIBILITIES:

  • Developed and enhanced Ab Initio graphs and business rules in Ab Initio functions.
  • Provided support in migrating Ab Initio business logic to the cloud using SPARQL.
  • Performed Data Profiling to assess the risk involved in integrating data for new applications, including the challenges of joins and to track data quality.
  • Performed data validations between on-prem and cloud environments.
  • Involved in all stages of the SDLC during the projects. Analyzed, designed, and tested the new system for performance, efficiency, and maintainability using the ETL tool Ab Initio.
  • Review the requirement specifications with the client and provide comments to the manager.
  • Involved in analyzing business needs and document functional and technical specifications based upon user requirements with extensive interactions with business users.
  • Worked with the project manager to determine needs and applied/customized existing technology to meet those needs.
  • Involved in Designing the ETL process to Extract, transform and load data.
  • Developed generic graphs and plans using Ab Initio; used active transformations such as Transform, Partition, Departition, and Sort components and different lookup functions.
  • Developed a PSET-generation UNIX shell script that takes its values from a list file.
  • Used checkpoint and phasing to avoid deadlocks and re-run the graph in case of failure.
  • Performed transformations of source data with transform components like Join, Match Sorted, Reformat, Dedup Sorted, and Filter by Expression.
  • Wrote and modified several application-specific UNIX scripts to pass the environment variables.
  • Worked on improving the performance of Ab Initio graphs using various Ab Initio performance techniques such as lookups (instead of joins), in-memory joins, and rollups to speed up various Ab Initio graphs.
  • Involved in monitoring the Ab Initio jobs and schedules through Control-M and Arrow.
  • Worked with the production support team to debug issues in production and on migrations from pre-prod to production.
  • Developed Ab Initio graphs for Data validation using validate components like compare records, compute checksum etc.
  • Worked on wrapper scripting with UNIX shell programming and scheduling of Ab Initio jobs with Arrow.
  • Used inquiry and error functions like is_valid, is_defined, and is_error, and string functions like string_substring, string_concat, and other string_* functions in developing Ab Initio graphs to perform data validation and data cleansing.
  • Worked on testing AB INITIO jobs in DDE as well as in AIC (Ab initio in cloud), which was a part of the migration to cloud.
  • Checked the data flow from front end to back end and used SQL queries to extract data from the database to validate it at the back end.
  • Generated Quick Reports for users for data analysis on numerous occasions.
  • Involved in setting up the routes in EFG (EXTERNAL FILE GATEWAY) tool for different vendors.
  • Worked in various services in AWS like Step functions, lambda, EC2, S3, IAM, SNS.
  • Created high level and low-level technical design documents.
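The list-file-driven PSET generation described above might look like this sketch; the list-file layout, the parameter names (FEED_NAME, SOURCE_FILE), and all paths are invented for illustration.

```shell
#!/bin/sh
# Sketch of PSET generation driven by a list file (one feed per line).
BASE=/tmp/pset_demo
PSET_DIR="$BASE/pset"
LIST_FILE="$BASE/feeds.lst"
mkdir -p "$PSET_DIR"

# One "feed|source-file" pair per line, as the driving list file might look.
cat > "$LIST_FILE" <<'EOF'
customer|/data/in/customer.dat
account|/data/in/account.dat
EOF

# Emit one .pset per entry, assigning values to the assumed graph parameters.
while IFS='|' read -r feed src; do
    cat > "$PSET_DIR/$feed.pset" <<EOF
FEED_NAME $feed
SOURCE_FILE $src
EOF
done < "$LIST_FILE"

ls "$PSET_DIR"
```

Each generated .pset would then drive one run of the same generic graph, so adding a feed only means adding a line to the list file.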

ENVIRONMENT: Ab Initio (GDE 3.1, Co>Op Sys 3.1), UNIX, SQL, Arrow, Control-M, ServiceNow, Nebula.

Confidential, Alpharetta, GA

Ab Initio/ETL Developer

RESPONSIBILITIES:

  • Extracted data from various sources like databases, delimited flat files.
  • Extensively Used Ab Initio components like Reformat, Scan, Rollup, Join, Sort, Partition By key, Normalize, Input Table, Output Table, Update Table, Gather Logs and Run SQL for developing graphs.
  • Implemented procedures for management of Ab Initio applications and Citrix servers.
  • Developed and enhancement of Ab Initio Graphs and business rules using Ab Initio Functions.
  • Performed validations, data quality checks and Data profiling on incoming data.
  • Generated configuration files, DML files, and XFR files for specific record formats, which are used in components for building graphs in Ab Initio.
  • Knowledge in translating business requirements into workable functional and non-functional requirements at detailed production level using Workflow Diagrams, Sequence Diagrams, Activity Diagrams and Use Case Modeling.
  • Developed various BTEQ scripts to create business logic and process the data using Teradata database.
  • Involved in design of database and creation schemes and tables in normalized form.
  • Extensively used MultiLoad and FastLoad utilities to populate flat-file data into the Teradata database.
  • Performed evaluations and made recommendations for improving graph performance: minimizing the number of components in a graph, tuning the Max Core value, using Lookup components instead of joins for small tables and flat files, filtering the data at the beginning of the graph, etc.
  • Created high-level and low-level detailed technical design documents for the project.
  • Involved in creating Ab Initio generic graphs to extract/load data and processing graphs to apply the transformations on the data to apply business rules on the data.
  • Responsible for maintaining the prod/non-prod environments, watching for failures and applying quick fixes.
  • Participated in designing the scheduling process for the module and made sure all the jobs run without any contention.
  • Extensively used Ab Initio GDE 3.3.2 to develop Graphs.
  • Participated in providing support to the releases until the system became stable.
  • Coordinated with the support teams for roll-out and created the necessary documents for production migration.
  • Supported bug fixes and reported issues.
  • Validated that data movement and transformations were done correctly per the requirements.
  • Extensively used file-management commands like m_ls, m_wc, m_dump, m_cp, m_mkfs, etc.
  • Responsible for deploying Ab Initio graphs and running them through the Co>Operating System's mp shell command language, and for automating the ETL process through scheduling.
  • Generated SQL queries for data verification and backend testing; detected and corrected data quality issues.
  • Worked with data mapping from source to target and data profiling to maintain the consistency of the data. Experienced with SQL queries.
  • Wrote stored procedures and packages on the server side and developed libraries.
  • Wrote UNIX scripts to perform certain tasks, assisted developers with problems, and optimized SQL.
  • Implemented phasing and checkpoint approach in ETL process to prevent data loss and to maintain uninterrupted data flow against process failures.
  • Automated the complete daily, weekly, and monthly refresh using custom-built UNIX shell scripts.
  • Worked with production support team to debug issues in production and for migrations from pre-prod to production.
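The data-verification work described above often reduces to a count reconciliation between the source extract and the target load. A minimal sketch follows; the staging directory and both data files are simulated, and a real job would query the source and target databases instead.

```shell
#!/bin/sh
# Illustrative count reconciliation between a source extract and the
# target load output (all paths and files simulated for this sketch).
STAGE=/tmp/etl_validate_demo
mkdir -p "$STAGE"

printf 'r1\nr2\nr3\n' > "$STAGE/source_extract.dat"   # simulated source
printf 'r1\nr2\nr3\n' > "$STAGE/target_load.dat"      # simulated target

src_cnt=$(wc -l < "$STAGE/source_extract.dat")
tgt_cnt=$(wc -l < "$STAGE/target_load.dat")

# Record the outcome; a real job would alert or fail the batch on mismatch.
if [ "$src_cnt" -eq "$tgt_cnt" ]; then
    echo "RECONCILED: $src_cnt records" > "$STAGE/reconcile.status"
else
    echo "MISMATCH: source=$src_cnt target=$tgt_cnt" > "$STAGE/reconcile.status"
    exit 1
fi
```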

ENVIRONMENT: Ab Initio (GDE 3.1, Co>Op Sys 3.1), UNIX, SQL, IBM DB2, Control-M, Teradata V2R6 (FastLoad, MultiLoad, FastExport), BTEQ.

Confidential, Cleveland, OH

ETL/ Ab Initio Developer

RESPONSIBILITIES:

  • Extensively used Ab-Initio ETL tool in designing & implementing Extract Transformation & Load processes. Different Ab Initio components were used effectively to develop and maintain the database.
  • Understood the business requirements with extensive interaction with users and reporting teams and assisted in developing the low-level design documents.
  • Provide support to SQL, MYSQL, Oracle, Data guard and DB2 database while coordinating with database teams.
  • Worked in a sandbox environment while extensively interacting with EME to maintain version control on objects. Sandbox features like check-in and checkout were used for this purpose.
  • Used Ab Initio components to extract, transform, and load data from multiple data sources like Teradata, DB2, flat files, XML, etc. to target data marts.
  • Cleansed the data using various Ab Initio components like Join, Rollup, Dedup Sorted, Reformat, Replicate, Partition by Expression, Partition by Key, Partition by Round Robin, Scan, Filter by Expression, Gather, and Merge.
  • Involved in extracting the data from the Flat Files and Relational databases into staging area.
  • Worked with UNIX commands in command prompt, created and executed different types of shell scripts.
  • Implemented phasing and checkpoint approach in ETL process to prevent data loss and to maintain uninterrupted data flow against process failures.
  • Extensively used file-management commands like m_ls, m_dump, m_cp, m_mkfs, etc. to operate on multifiles.
  • Worked on improving the performance of Ab Initio graphs using various Ab Initio performance techniques such as lookups (instead of joins), in-memory joins, and rollups to speed up various Ab Initio graphs.
  • Part of remediation (performance-tuning) work on complex SQL and client SQL; improved query performance by checking explain plans and statistics on tables, improving join strategies, etc.
  • Implemented slowly changing dimension methodology for accessing the full history of accounts.
  • Used debugger in identifying bugs in existing graphs by analyzing data flow, evaluating transformations.
  • Participated in weekly status meetings, conducted internal and external reviews as well as formal walkthroughs among various teams, and documented the proceedings.
  • Created sandbox and EME Datastore settings in order to access the Metadata.
  • Involved in overall OMS design and code review; understood the working of the OMS.
  • Maintained locks on objects while working in the sandbox to maintain the privacy.
  • Developed Complex XFRs to derive new fields and solve various business requirements.
  • Converted user defined functions and complex business logic of an existing application process into Ab Initio graphs using Ab Initio components such as Reformat, Join, Transform, Sort, Partition to facilitate the subsequent loading process.
  • Used Teradata SQL Assistant front-end tool to issue SQL commands matching the business requirements to Teradata RDBMS.
  • Worked with DBA to identify gaps in data modeling, data mapping in Teradata and provide Teradata performance knowledge to users.
  • Implemented a 6-way multifile system in the test environment, composed of individual files on different nodes that are partitioned and stored in distributed directories. Partition components (Partition by Key, Partition by Expression, Partition by Round Robin) were used to partition large data files into multiple data files.
  • Used UNIX environment variables in various .ksh files, which comprise the specified locations needed to build Ab Initio graphs.
  • Responsible for deploying Ab Initio graphs and running them through the Co>Operating System's mp shell command language, and for automating the ETL process through scheduling.
  • Participated in agile trainings and meetings as part of the agile team.
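The environment-variable pattern in the .ksh files mentioned above can be sketched as follows; the variable names (AI_PROJECT, AI_SERIAL, AI_MFS) and directories are assumptions, not from the original project.

```shell
#!/bin/sh
# Sketch of the per-environment variable-file pattern used by job scripts.
ENV_DIR=/tmp/ai_env_demo
mkdir -p "$ENV_DIR"

# A per-environment file like dev.env.ksh might set the project locations:
cat > "$ENV_DIR/dev.env.ksh" <<'EOF'
export AI_PROJECT=/tmp/ai_env_demo/project
export AI_SERIAL=$AI_PROJECT/serial
export AI_MFS=$AI_PROJECT/mfs
EOF

# A job script sources the file and builds its paths from the variables,
# so the same script runs unchanged in DEV, UAT, and PROD.
. "$ENV_DIR/dev.env.ksh"
mkdir -p "$AI_SERIAL" "$AI_MFS"
echo "serial dir: $AI_SERIAL"
```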

ENVIRONMENT: Ab Initio GDE 1.13, Co>Op 2.13, UNIX, PL/SQL, Oracle 8i/9i, Queryman, Windows NT/2000.

Confidential, Hartford, CT

ETL/Ab Initio Developer

RESPONSIBILITIES:

  • Developed Ab Initio graphs using different components for extracting, loading and transforming external data into data mart.
  • Developed a number of Ab Initio graphs based on business requirements using various Ab Initio components like Filter by Expression, Partition by Expression, Partition by Round Robin, Reformat, Rollup, Join, Scan, Normalize, Gather, Replicate, Merge, etc.
  • Extensively used Ab-Initio ETL tool in designing & implementing Extract Transformation & Load processes. Different Ab Initio components were used effectively to develop and maintain the database.
  • Understood the business requirements with extensive interaction with users and reporting teams and assisted in developing the low-level design documents.
  • Responsible for creating datasets for all system-of-record tables and files.
  • Collected data statistics (null, valid, invalid, distinct, maximum, minimum, average, count) for all system-of-record tables and files.
  • Extensively used Ab Initio GDE 3.0/3.1 to develop Graphs.
  • Used Ab Initio web EME to monitor the data mapping across graphs in the project.
  • Used Metadata Hub in checking Data Lineage, Impact analysis before and after monthly release.
  • Documented complete graphs and their components.
  • Implemented Data Parallelism through graphs, which deals with data, divided into segments and operates on each segment simultaneously through the Ab Initio partition components to segment data.
  • Extensively used UNIX shell scripting for writing SQL execution scripts in the data-loading process.
  • Involved in Ab Initio design and configuration: ETL, data mapping, transformation, and loading in a complex, high-volume environment with data processing at the terabyte level.
  • Checked the data flow from front end to back end and used SQL queries to extract data from the database to validate it at the back end.
  • Designed automated reports in SQL Server Reporting Services (SSRS)
  • Involved in automating the ETL process through scheduling.
  • Debugged and modified shell scripts using edit script and vi editor in UNIX environment in determining various graph paths and run file path for job request engine.
  • Responsible for cleansing the data from source systems using Ab Initio components such as reformat and filter by expression.
  • Developed psets to impose reusable business restrictions and to improve the performance of the graph.
  • Extensively used m_db commands to query the Oracle databases for reporting purposes.
  • Developed several partition-based Ab Initio Graphs for high volume data warehouse.
  • Checked the accuracy of data loaded into Teradata & assured the linkage of keys on the tables for various applications.
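A rough sketch of the shell-generated SQL execution scripts mentioned above follows; the staging table, columns, and file names are invented, and the call that would actually execute the file (sqlplus or bteq) is deliberately omitted.

```shell
#!/bin/sh
# Sketch: a shell script that writes the SQL run file for a daily load.
SQL_DIR=/tmp/sql_gen_demo
mkdir -p "$SQL_DIR"
LOAD_DATE=$(date +%Y-%m-%d)

# Generate the SQL with today's load date substituted in; a real script
# would then feed this file to sqlplus/bteq and check the return code.
cat > "$SQL_DIR/daily_load.sql" <<EOF
-- generated for load date $LOAD_DATE
INSERT INTO stg_customer (cust_id, load_dt)
SELECT cust_id, DATE '$LOAD_DATE' FROM ext_customer;
COMMIT;
EOF

echo "wrote $SQL_DIR/daily_load.sql"
```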

ENVIRONMENT: Ab Initio GDE 2.14.22 with Co>Op 2.13.1, EME 2.13.1, SQL Server 2005, UNIX, Windows XP, AutoSys.

Confidential, San Antonio, TX

PL/SQL & Ab Initio Developer

RESPONSIBILITIES:

  • Involved in high-level and detailed-level design.
  • Developed Ab Initio graphs using database, dataset, departition, transform, sort, and partition components for extracting, loading, and transforming external data to feed PARS, creating DMLs, XFRs, DB configs, SQLs, and .mp scripts.
  • Design-team member for integrating batch and online architectures: defined reusable components, set programming and naming conventions, assigned sub-tasks, and gave input on and monitored scope/business/technical requirements.
  • Used the Ab Initio Data Profiler for data cleansing and analysis, and to analyze test output, making the data warehouse more robust by identifying highly secured information such as personal account information.
  • Usage of Ab Initio EME for dependency analysis and configuration management.
  • Performance tuning of Ab Initio Load processes. Participating in various data cleansing and data quality exercises.
  • Design & development of data load components using Ab-Initio and UNIX shell scripts.
  • Understanding the reporting requirements of users in marketing department.
  • Developed UNIX shell scripts using SQL programs for daily and weekly data loads and involved in creating Database Configuration files (.dbc), used in transformations to extract data from different sources and load into target tables.
  • Extensively used lookups to increase the performance of the graph.
  • Familiar with the sandbox concept for the check-in and check-out process.
  • Used the Ab Initio Web Interface to Navigate the EME to view graphs, files and datasets and examine the dependencies among objects.
  • Wrote stored procedures and packages on the server side and developed libraries.
  • Wrote wrapper scripts for date-logic implementation and the file-archiving process.
  • Tested Ab Initio graphs in development and migration environments using test data, fine-tuned the graphs for better performance, and migrated them to the production environment.
  • Actively involved in writing the code in Oracle 11g using SQL and PL/SQL and in developing ETL Transformations using ETL tool.
  • Experience using sprint-developed application tools for preload, load, and post-load into partitioned tables.
  • Database tuning, backup and recovery and general day to day running of the system.
  • Involved in a hardware project changing the DB server from Sun to AIX to reduce the project's footprint and improve performance.
  • Used Ab Initio components like Reformat, Scan, Rollup, Join, Sort, Normalize, Input Table, Output Table, Update Table, Gather Logs, and Run SQL for developing graphs.
  • Expertise in handling Pipeline Jobs.
  • Supporting production, maintaining nightly jobs.
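The date-logic and file-archiving wrapper scripts mentioned above might follow a pattern like this sketch; the directory layout and the run-date suffix convention are assumptions for illustration only.

```shell
#!/bin/sh
# Sketch of a date-logic/archiving wrapper (invented paths and names).
BASE=/tmp/archive_demo
IN_DIR="$BASE/incoming"
ARC_DIR="$BASE/archive"
mkdir -p "$IN_DIR" "$ARC_DIR"

RUN_DT=$(date +%Y%m%d)                   # today's run date
echo "sample feed" > "$IN_DIR/feed.dat"  # simulated processed file

# Move each processed file into the archive with a run-date suffix so
# reruns on a later date never overwrite an earlier day's input.
for f in "$IN_DIR"/*.dat; do
    [ -e "$f" ] || continue              # skip if the glob matched nothing
    mv "$f" "$ARC_DIR/$(basename "$f" .dat).$RUN_DT.dat"
done
ls "$ARC_DIR"
```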

ENVIRONMENT: Ab Initio (GDE 1.13.9, Co>Op 2.12), Oracle 9i, Teradata V2R5, AIX 4.0, MS Visio, EME, Korn shell scripts.
