
Senior Ab Initio Developer Resume


Chicago, IL

SUMMARY

  • Over 7 years of IT experience in the Data Analysis, Design, Development, Implementation and Testing of database/data warehousing and legacy applications for the Financial, Manufacturing and Health Care Insurance industries, covering Data Modeling, Data Extraction, Data Transformation, Data Loading and Data Analysis.
  • Over 5 years of experience with the Ab Initio ETL tool, performing data mapping, transformation and loading in complex, high-volume environments.
  • Expertise in all GDE components for creating, executing, testing and maintaining Ab Initio graphs; experienced with the Ab Initio Co>Operating System in application tuning and debugging strategies.
  • Well versed in the various Ab Initio Transform, Partition, Departition, Dataset and Database components.
  • Experience integrating various data sources with multiple relational databases such as Oracle and Teradata, and with data from flat files, XML and COBOL files.
  • Strong knowledge of Business Rules Engine (BRE) rules and the Application Configuration Environment (ACE).
  • Worked with Teradata SQL Assistant and various load utilities including BTEQ, FastLoad, MultiLoad, TPump and FastExport.
  • Experienced in all phases of Software Development Life Cycle (SDLC).
  • Strong knowledge of Data Warehousing concepts and Dimensional modeling like Star Schema and Snowflake Schema.
  • Experience in Data Modeling, Data Extraction, Data Migration, Data Integration, Data Testing and Data Warehousing using Ab Initio.
  • Implemented Ab Initio Graphs using Data, Component, pipeline parallelism and Multi File System (MFS) techniques.
  • Experienced in designing applications with Ab Initio meta-programming and using vectors to solve complex business requirements.
  • Managed and conducted System testing, Integration testing, Functional testing, UAT and Regression testing.
  • Developed complex database objects like Stored Procedures, Functions, Packages and Triggers using SQL and PL/SQL.
  • Experience in using Automation Scheduling tools like Autosys and Control-M.
  • Worked with EME for jobs migration, version control, and dependency analysis.
  • Experience providing production support for various Ab Initio ETL jobs and developing UNIX shell wrappers and Oracle PL/SQL programs to run Ab Initio and database jobs.
  • Excellent communication skills, interacting with people at all levels across projects; also play an active role in Business Analysis.
  • Allocate activities and tasks to team members onsite and offshore; guide the team and provide solutions to any problems they face.
  • Self-motivated and proactive leader with technical and exemplary communication skills. Exceptional ability to create, implement and improve IT standards, policies, and procedures.
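The bullets above mention UNIX shell wrappers used to run Ab Initio and database jobs under a scheduler. A minimal sketch of such a wrapper (the job command and log path are illustrative assumptions, not from an actual project):

```shell
#!/bin/sh
# Minimal wrapper sketch: run a job command (e.g. a deployed Ab Initio
# graph script), capture its output in a log, and propagate the exit
# code so a scheduler such as Autosys or Control-M can detect failure.

run_job() {
    job_cmd=$1      # e.g. ./load_customers.ksh (hypothetical deployed graph)
    log_file=$2

    echo "$(date) starting: $job_cmd" >> "$log_file"
    $job_cmd >> "$log_file" 2>&1
    rc=$?
    echo "$(date) finished: $job_cmd rc=$rc" >> "$log_file"
    return $rc      # non-zero exit signals job failure to the scheduler
}
```

A scheduler entry would then call something like `run_job ./load_customers.ksh /var/log/etl/load_customers.log` and alert on a non-zero return code.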

TECHNICAL SKILLS

Primary Skills: Ab Initio (GDE 3.0, Co>Op 3.0), Data Warehouse Development, Oracle/Teradata SQL, Plan>IT

Secondary Skills: Map Reduce, Data Modeling, Data Warehouse design and Unix shell scripting

Operating Systems: Windows, UNIX

Languages: SQL, PL/SQL, Unix shell scripting

Scheduling Tools: Tivoli workload scheduler, Control-M and Autosys

Databases: Oracle, Teradata V2R5 (TTU 12.x/13.x), DB2

SCM Tools: CVS and EME

Domain Knowledge: Banking, Health Care Insurance, Media

Documentation: LLD’s, HLD’s, Design Documents, MS Office

PROFESSIONAL EXPERIENCE

Confidential, Chicago IL

Senior Ab Initio Developer

Responsibilities:

  • Designed and deployed ETL graphs by studying the business requirements from the functional design document and from the business users.
  • Designed and developed ETL jobs that extract information from flat files and load it into the Teradata data warehouse.
  • Called web services from Ab Initio to get service responses and exchange information with remote servers.
  • Deployed and executed Ab Initio and Data Profiler jobs on both Windows and UNIX environments.
  • Prepared business, detailed design and technical documentation for ETL standards, procedures and naming conventions as part of the ETL process.
  • Worked heavily with PLAN, PSET and PDL to make the ETL flow more dynamic and versatile and to set dependencies.
  • Used various Ab Initio components of Partition, De-partition, Database, Datasets, Transform, FTP and Sort to build Graphs in GDE.
  • Helped in Configuring Business Rules Engine (BRE) Version 3.0.3.1 with Ab Initio Application Hub Version 3.0.3.
  • Created PL/SQL stored procedures, functions and packages for moving the data from staging area to data mart.
  • Created XFRs during the data transformation phase prior to creating load ready files.
  • Implemented Partition techniques using Multi file system.
  • Created Database Configuration files (.dbc) and involved in creating SQL statements with .sql extensions, which will be used in Database transformations to extract the data from different source tables and to load into the target table.
  • Involved in creating proper test data to satisfy all required test cases, performed unit testing on all deliverables, and was part of the code review team.
  • Worked as release captain for the ETL team, creating Ab Initio tags (project-level or object-level) with EME air commands and moving code from one environment to another.
  • Worked on Autosys, including the creation and execution of Autosys jobs.
  • Designed process with Ab Initio plan resource pool to run and control number of parallel processes running on the server and at the same time optimize the performance of the process.
  • Used phasing and checkpointing in graphs to avoid deadlock and to recover completed stages of a graph in case of failure.
  • Worked on automating the ETL processes using Tivoli Workload Scheduler.
  • Built a production reporting process on top of the client process and automated the distribution of reports to the business team.
  • Used SQL, PL/SQL and the Oracle database; developed PL/SQL triggers and master tables for automatic creation of primary keys.
  • Made wide use of lookup files when getting data from multiple sources where the data volume is limited.
  • Developed complex Ab Initio XFR’s to derive new fields and solve various business requirements.
  • Developed DMLs for data from different sources such as flat files, including conditional DMLs for data with header, body and trailer records.
  • Coordinated a 5-member team across onsite and offshore; estimated the effort required for BAU (Business as Usual) and new project activities.
  • Allocated activities and tasks to team members onsite and offshore; guided the team and provided solutions to any problems they faced.
  • Performed UNIX shell scripting to format complex files before loading them into the database.
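The shell-formatting and conditional-DML bullets above describe feeds that mix header, body and trailer records. A minimal sketch of that kind of pre-load split, assuming a hypothetical layout where the first character tags each record (H/D/T):

```shell
#!/bin/sh
# Split a mixed feed into header, detail and trailer files before load.
# The one-character record-type tag and the file names are illustrative
# assumptions, not a real feed layout.

split_feed() {
    in=$1; out_dir=$2
    awk -v d="$out_dir" '
        /^H/ { print substr($0, 2) > (d "/header.dat");  next }
        /^T/ { print substr($0, 2) > (d "/trailer.dat"); next }
        /^D/ { print substr($0, 2) > (d "/detail.dat") }
    ' "$in"
}
```

Inside a graph this role would typically fall to a conditional DML; a shell pre-step like this is useful when the file must be cleaned before Ab Initio reads it.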

Environment: Ab Initio GDE 3.4, 3.3, 3.1, UNIX, Oracle 11i, Toad for Oracle, Windows 2003, Mainframe, CBM front-end tool, Harvest CA Workbench, EME Console, Tivoli Workload Scheduler and JS-Console.

Confidential, Richmond, VA

Ab Initio Developer

Responsibilities:

  • Gathered requirements from business analysts; designed and developed the ETL application.
  • Responsible for high-level design and detailed design documents for specific ETL processes.
  • Created the sandbox, sandbox parameters and graph parameters.
  • Involved in analysis to understand the new data model for the application.
  • Involved in implementing the business requirements into Ab Initio graphs after impact analysis.
  • Developed complex mappings using varied transformation logic such as unconnected/connected lookups, Router, Filter, Expression, Aggregator, Joiner, Union, Update Strategy and more.
  • Created solution specification documents (SSDs) based on the BRD for client approval prior to initiating development tasks.
  • Prepared Unit test plans (positive and negative test cases) for all the business requirements.
  • Extensively used AIR commands to check in/check-out, perform dependency analysis and other EME related operations.
  • Well versed in various Ab Initio parallelism techniques; implemented a number of Ab Initio graphs using data parallelism.
  • Worked with DBA to setup a new Oracle database for Integration testing and configured dbc file to connect to the database.
  • Worked on loading of data from several flat files sources to Staging using Teradata MLOAD, FLOAD and TPump.
  • Worked on tuning the Ab Initio graphs for better performance.
  • Supported the testing team by providing data, loading data into the test environment, and supplying SQL for testing the test cases.
  • Created various SQL and PL/SQL scripts for verification of the required functionalities.
  • Implemented 4-way Multi file system and used Partition components to store the large amount of data in multiple files.
  • Prepared UNIX shell scripts that were scheduled in Autosys for automatic execution at specific times.
  • Designed and developed common component graphs for all loading and unloading.
  • Provided support documents such as the Detailed Design Document, Technical Support Document and job scheduling details to the production support team, and gave them knowledge transfer (KT) to support the graphs in production.
  • Designed and developed generic graph for loading data feeds.
  • Possess specific experience performing testing including Backend, Frontend, Regression, Smoke and Functional Testing.
  • Wrote PL/SQL Database triggers to implement the business rules in the application.
  • Designed, implemented and tuned interfaces and batch jobs using PL/SQL.
  • Created packages and deployed them to the UAT and PROD environments; involved in end-to-end data validation by interacting with the business (UAT).
  • Ran SQL queries on the tables in the Oracle database to verify the data and mapping documents.
  • Responsible for Allocating tasks to the team members.

Environment: Ab Initio GDE 1.14, Co-op 2.14, PL/SQL, Oracle 10g, Autosys, IBM Rational tools (RequisitePro, Clear Case, Clear Quest), Teradata V2R5/R6.

Confidential, Glastonbury, CT

Ab Initio Developer

Responsibilities:

  • Studied and understood all the functional and technical requirements to better serve the project.
  • Developed Ab Initio graphs using different components for extracting, loading and transforming external data into data mart.
  • Involved in designing and developing data load components using Ab Initio DB2 native utilities for loading the data.
  • Design the ETL application following all ETL standards and architecture.
  • Developed various Ab Initio graphs based on business requirements using components such as Partition by Key, Reformat, Rollup, Join, Gather, Replicate and Merge.
  • In the design phase, worked on the mapping document and technical specifications.
  • Responsible for detailed low-level design and construction for specific ETL processes.
  • Developed detailed ETL specifications based on business requirements; set up the sandbox, sandbox parameters and graph parameters.
  • Involved in PL/SQL code review and modification for the development of new requirements.
  • Developed UNIX scripts to FTP generated mail files to a different server, following the file naming conventions of the target server.
  • Checked the project in and out of the EME, maintaining version control of the source code.
  • Responsible for extracting daily text files from ftp server and historical data from DB2 Tables, cleansing the data and applying transformation rules and loading to staging area.
  • Worked on testing and tuning the Ab Initio graphs for optimizing performance.
  • Designed and developed common component graphs for all loading and unloading.
  • Developed generic graphs to load and unload data, and process data.
  • Involved in uploading of the data from flat files into Databases and validated the data with PL/SQL procedures.
  • Scheduled the tasks using Autosys and Control-M.
  • Created packages and deployed them to the test and production environments.
  • Used TOAD, PL/SQL developer tools for faster application design and development.
  • Converted data from flat files to intermediate tables using SQL*Loader and data mapping.
  • Query optimization (explain plans, collect statistics, Primary and Secondary indexes).
  • Involved in end-to-end data validation by interacting with the business (UAT); provided support during the SIT and UAT phases.
  • Worked on loading of data from several flat files sources to Staging using Teradata MLOAD, FLOAD and TPump.
  • Worked on exporting data to flat files using Teradata FEXPORT.
  • Allocated activities and tasks to team members onsite and offshore; guided the team and provided solutions to any problems they faced.
  • Worked extensively on performance tuning of transformations, sources, sessions, mappings and targets.
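One bullet above mentions converting flat files to intermediate tables with SQL*Loader. A minimal sketch of that step: generate a control file for a comma-delimited feed (the table, column and connection names are placeholders, not from the actual project):

```shell
#!/bin/sh
# Write a SQL*Loader control file for a hypothetical staging table.
# On a host with the Oracle client installed it would be executed
# with sqlldr (commented out below, since the logon is a placeholder).

make_ctl() {
    cat > "$1" <<'EOF'
LOAD DATA
INFILE 'customer_feed.dat'
APPEND
INTO TABLE stg_customer
FIELDS TERMINATED BY ','
( cust_id, cust_name, load_dt DATE "YYYY-MM-DD" )
EOF
}

make_ctl /tmp/stg_customer.ctl
# sqlldr userid=etl_user@etl_db control=/tmp/stg_customer.ctl log=/tmp/stg_customer.log
```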

Environment: Ab Initio GDE 1.14, Co-op 2.14, PL/SQL, Oracle 10g, DB2, Autosys, IBM Rational tools (RequisitePro, Clear Case, Clear Quest), Teradata V2R5/R6.

Confidential

Teradata ETL developer

Responsibilities:

  • Communicated with business users and analysts on business requirements; gathered and documented the technical and business metadata about the data.
  • Used the Teradata EXPLAIN facility, which describes to end users how the database system will perform any request.
  • Worked with the TS/API product, a system that allows products designed for SQL/DS to access the Teradata database machine without modification.
  • Analysis of the specifications provided by the clients.
  • Worked with various system interfaces to gather requirements for migration and implementation.
  • Coded using Teradata BTEQ SQL; wrote UNIX scripts to validate, format and execute the SQL in the UNIX environment.
  • Compared the actual test results with expected results using FTP with UNIX scripts.
  • Designed database, tables and views structure for the new data mart.
  • Used Teradata Manager, Index Wizard and PMON utilities to improve performance.
  • Populated and refreshed Teradata tables using the FastLoad, MultiLoad and FastExport utilities for user acceptance testing and for loading history data into Teradata.
  • Involved in unit testing & integration testing.
  • Wrote functions and stored procedures in the Teradata database; good experience developing stored procedures and views.
  • Involved in creating UNIX scripts to trigger the stored procedures and macros.
  • Performance tuning of the long running queries.
  • Worked on complex queries to map the data as per the requirements.
  • Reduced Teradata space used by optimizing tables - adding compression where appropriate and ensuring optimum column definitions.
  • Preparing Test Cases and performing Unit Testing.
  • Review of Unit and Integration test cases.
  • Production Implementation and Post Production Support.
  • Generated weekly and monthly reports for internal and external database and system audits.
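The BTEQ bullets above describe UNIX scripts that validate and execute SQL. A minimal sketch of such a wrapper, which generates a BTEQ input script via a heredoc (the host, logon string and table name are placeholders, not real values):

```shell
#!/bin/sh
# Generate a BTEQ script that runs a validation query and quits with a
# non-zero code on SQL error, so the calling shell can detect failure.
# Logon string and table name are illustrative placeholders only.

make_bteq() {
    cat > "$1" <<'EOF'
.LOGON tdhost/etl_user,etl_password;
SELECT COUNT(*) FROM staging.customer;
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF
}

make_bteq /tmp/validate.bteq
# On a host with the Teradata tools installed:
#   bteq < /tmp/validate.bteq > /tmp/validate.out 2>&1 || echo "validation failed"
```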

Environment: Teradata V2R5, Teradata SQL Assistant, BTEQ, FLOAD, FEXP, MLOAD, FTP, Windows XP, Cognos, Visual Basic 6.
