
Ab Initio Developer Resume


SUMMARY

  • Over nine years of experience developing complex, high-volume Ab Initio ETL applications, including requirement gathering, design, implementation, unit testing, debugging, and documentation in a UNIX environment.
  • Hands-on experience with generic graph implementation, advanced Ab Initio components, psets, plans, and Ab Initio product tools including the Co>Operating System, the EME technical repository, PDL meta-programming, and XML data processing.
  • Expertise in data warehouse/data mart, ODS, OLTP, and OLAP implementations, spanning project scoping, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation, and production support.
  • Extensive experience developing stored procedures, functions, views, triggers, and complex SQL queries using SQL Server and Oracle PL/SQL.
  • Experience in designing and implementing Ab Initio EME Metadata projects.
  • Experienced with Ab Initio tools such as Metadata Hub and Data Profiler.
  • Designed partition-parallel Ab Initio graphs for a high-volume data warehouse.
  • Involved in Ab Initio graph design and performance tuning of load graph processes.
  • Responsible for fully understanding the source systems, documenting issues, following up on issues and seeking resolutions
  • Extensively used ETL methodologies to support data extraction, transformation, and loading using Ab Initio.
  • Experienced in writing SQL queries and the PL/SQL procedures, triggers, and functions necessary for maintaining databases across the data warehouse development lifecycle.
  • Good exposure to SQL, PL/SQL stored procedures, Triggers and Packages.
  • Deep understanding and capability in troubleshooting of the common issues faced during the development lifecycle including coding, debugging, testing and final roll-out.
  • Strong experience in the Analysis, design, development, testing and Implementation of Business Intelligence solutions using Data Warehouse/Data Mart Design, ETL, BI, Client/Server applications.
  • Solid experience in Extraction, Transformation and Loading (ETL) mechanism using Ab Initio. Knowledge of full life cycle development for building a data warehouse.
  • Extensive experience with data warehousing concepts including ETL best practices, dimensional modeling and business intelligence support.
  • Performed Informatica administrative tasks such as stopping/starting Informatica Repository and Server processes, performing code migration from Development to Test to Production, backup and restoring of Informatica Repository, upgrades and security administration.
  • Experienced in using Informatica workflow manager, Workflow monitor to create, schedule and control workflows, tasks, and sessions.
  • Developed reusable transformations and mapplets to maximize ETL load performance.
  • Migrated code from development to testing to production servers as part of the ETL process.
  • Experienced in planning and coordinating Disaster recovery solution for Teradata.
  • Experience in work environment with a mix of onsite-offshore Global Delivery model.
  • Self-motivated, excellent written and oral communication skills
  • Strong analytical & problem solving skills.
  • Worked with MVS, VSAM, GDG, DB2, flat files, and Excel sheets as inputs for the graphs.

TECHNICAL SKILLS

Operating Systems: Windows XP, Linux, UNIX, MVS, z/OS

Languages: COBOL, JCL, XML, C, C++, shell scripting, Easytrieve

Modeling Tools: MS Visio, ERwin

Tools: Control-M, Op-Console, Job Track, ChangeMan, Endevor, Xpediter, Quality Center 10, AutoSys

Databases: DB2, SQL, Oracle and IMS DB

ETL: Ab Initio GDE (1.13, 1.15, and 3.03), Ab Initio Co>Operating System (2.13, 2.15, and 3.15), BRE, Express>IT.

Business Domains: Banking and Finance, Insurance

Distributed Technologies: Web Services, WCF, WF (Windows Workflow Foundation).

Software Engineering: Scrum.

Version Control Tools: Ab Initio EME

PROFESSIONAL EXPERIENCE

Confidential

Ab Initio Developer

Responsibilities:

  • Worked in the Data Management Team on data extraction, subsetting, data cleansing, and data validation.
  • Involved in all stages of the SDLC; analyzed, designed, and tested the new system for performance, efficiency, and maintainability using the ETL tool Ab Initio.
  • Supported daily/monthly cycle execution from a data perspective.
  • Responsible for requirement gathering and development of the Expiration Date Fictionalization project.
  • Coordinated with different testing groups to accommodate their test-data requirements and translated them into data-selection criteria in Ab Initio format.
  • Document ways to automate manual processes.
  • Designed a STAR schema for sales data involving shared dimensions (Conformed) for other subject areas using Erwin Data Modeler.
  • Used UNIX environment variables in various .ksh files, which specify the locations used to build Ab Initio graphs.
  • Responsible for performance tuning of Ab Initio graphs; wrote UNIX shell scripts for batch scheduling.
  • Maintained the sandbox, storing all work in sequential order.
  • Developed UNIX shell scripts to parse and process data files; maintained and troubleshot batch processes for overnight operations.
  • Used SQL*Loader to load data from external system and developed PL/SQL programs to dump the data from staging tables into Base Tables.
  • Converted user defined functions and complex business logic of an existing application process into Ab Initio graphs using Ab Initio components such as Reformat, Join, Transform, Sort, Partition to facilitate the subsequent loading process.
  • Quick reports generated for the users and data analysis on Test Beds on numerous occasions.
  • Co-ordinate with data team for the development of future changes in the file or table structures to accommodate future testing requirements.
  • Worked with MVS, VSAM, GDG, DB2, Flat files and excel sheets as the inputs for the graphs.
  • Experience in creating bases for the VSAM and generations for the GDG file.
  • Experienced in the use of AGILE approaches including test driven development and scrums.
  • Worked with Ab Initio date, string, and user-defined functions as per requirements.
  • Designed and developed ETL batch jobs using Ab Initio; worked with the EME to promote Ab Initio code to higher environments, and with continuous flows, the standard environment, and Metadata Hub to build lineage.
  • Experienced in creating and deploying Metadata Hub Web applications, and loading Metadata Hub customizations.
  • Created database objects like tables, views, procedures, packages using Oracle tools like PL/SQL, SQL* Plus, SQL Loader.
  • Implemented data parallelism in graphs by using Ab Initio partition components to divide data into segments and operate on each segment simultaneously.
  • Performed transformations of source data with transform components such as Join, Match Sorted, Dedup Sorted, Denormalize, Reformat, and Filter by Expression.
  • Create Summary tables using Rollup, Scan & Aggregate.
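The .ksh environment-variable pattern described above can be sketched roughly as follows; all paths, variable names, and the graph name are hypothetical, since the actual project setup is not shown:

```shell
#!/bin/ksh
# Hypothetical project environment settings; a real project would source a
# shared setup file. Every location the graphs need is exported up front.
export AI_PROJECT=/data/etl/sales_mart
export AI_SERIAL=$AI_PROJECT/serial   # landing area for flat files
export AI_MFS=$AI_PROJECT/mfs         # multifile system for parallel data
export AI_LOG=$AI_PROJECT/log

# Build the command line for a deployed graph so a wrapper can log it
# before running it (the graph name is illustrative).
graph_cmd() {
  echo "$AI_PROJECT/run/$1.ksh 1>$AI_LOG/$1.out 2>$AI_LOG/$1.err"
}

graph_cmd load_sales
```

A wrapper script would typically eval the returned command and check its exit status before moving to the next graph.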

Environment: Ab Initio, Linux, Oracle, BRE, POC, XML, TOAD, JIRA, MDM, Business Objects, UNIX shell scripting

Confidential, Hoover, AL

Ab Initio Developer

Responsibilities:

  • Create Summary tables using Rollup, Scan & Aggregate.
  • Analyzed business requirements, technical specification, source repositories and physical data models for ETL mapping and process flow
  • Worked extensively with mappings using expressions, aggregators, filters, lookup, joiners, update strategy and stored procedure transformations
  • Extensively used Pre-SQL and Post-SQL scripts for loading the data into the targets according to the requirement
  • Developed mapping to load Fact and Dimension tables, for type 1 and type 2 dimensions and incremental loading and unit tested the mappings.
  • Involved in designing the Data Mart models with Erwin using Star schema methodology
  • Used Repository Manager to create the repository and user groups, and managed users by setting up privileges and profiles
  • Optimized, improved, and enhanced Ab Initio graphs; performed analysis and design, and prepared functional and technical design documents and code specifications.
  • Performed Database tasks such as creating database objects (tables, views, procedures, functions)
  • Responsible for debugging and performance tuning of targets, sources, mappings and sessions.
  • Optimized mappings and implemented complex business rules by creating reusable transformations and mapplets
  • Used Informatica Workflow Manager for creating, running the Batches and Sessions and scheduling them to run at specified time
  • Executed sessions, sequential and concurrent batches for proper execution of mappings and set up email delivery after execution.
  • Implemented and documented all the best practices used for the data warehouse
  • Implemented Teradata-recommended best practices when defining profiles, roles, alerts, and multi-value compression, and worked with TARA GUI, Teradata Data Mover, and NetBackup.
  • Involved in implementation projects such as Oracle-to-Teradata and SQL Server-to-Teradata DB/DW conversions.
  • Involved in Teradata query tuning; tuned complex queries and views and implemented macros to reduce parsing time. Developed stored procedures using PL/SQL and driving scripts using UNIX Korn shell.
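The Korn-shell driving scripts mentioned in the last bullet usually follow a step/return-code pattern; a minimal sketch, with hypothetical step names, might look like:

```shell
#!/bin/ksh
# Minimal driving-script sketch: report each step's return code and decide
# whether the batch may continue (step names are illustrative).
check_rc() {
  step=$1
  rc=$2
  if [ "$rc" -ne 0 ]; then
    echo "$step failed with rc=$rc"
    return 1
  fi
  echo "$step ok"
}

# e.g. after running a BTEQ or sqlplus step, pass its exit status along:
check_rc load_stage 0
```

On a non-zero return code the script reports the failure and stops, so the scheduler sees a failed job instead of silently loading bad data.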

Environment: Ab Initio, UNIX, DB2, Teradata, shell scripting, mainframes, Oracle 10g/11g/12c, PL/SQL, SQL*Loader, reporting dashboards, Cognos and Crystal Reports, batch jobs.

Confidential, Myrtle Beach, SC

Ab Initio Developer

Responsibilities:

  • Responsible for converting business requirements into technical specifications. Created high level and detailed designs using Ab Initio.
  • Interacted with Business Analysts, DBAs, third party consultants and other developers to refine requirements and develop solutions to business problems.
  • Designed, developed, tested and implemented software components of ETL systems and participated in all phases of the software development life cycle. Developed and modified configuration scripts in UNIX shell scripts and used SQL extensively to investigate data problems and to support the functionalities of the ETL process.
  • Worked with users to know their needs.
  • Performed Unit testing, implemented enhancements, fixed defects and maintained code.
  • Used Ab Initio as ETL tool to pull data from source systems, cleanse, transform, and load data into databases.
  • Worked in the Data Management Team on Data Extraction, Fictionalization, Subset, Data Cleansing, and Data Validation.
  • As workstation POC, communicated with the client regarding hardware inventory and troubleshot workstation issues.
  • Created batch jobs on the mainframe to run the graphs.
  • Used DCLGEN in DB2 to create copybooks for the DB2 tables.
  • Responsible for creating the parameters for graphs run in the UNIX environment.
  • Responsible for writing the wrapper scripts.
  • Coordinated with different testing groups to accommodate their test-data requirements and translated them into data-selection criteria in Ab Initio format.
  • Developed executive summary level metadata reports in excel via web service extracts from the Ab Initio Metadata Hub.
  • Designed a STAR schema for sales data involving shared dimensions (Conformed) for other subject areas using Erwin Data Modeler.
  • Used UNIX environment variables in various .ksh files, which comprises of specified locations to build Ab Initio Graphs.
  • Experienced in creating and deploying Metadata Hub Web applications, and loading Metadata Hub customizations.
  • Created database objects like tables, views, procedures, packages using Oracle tools like PL/SQL, SQL* Plus, SQL Loader.
  • Implemented data parallelism in graphs by using Ab Initio partition components to divide data into segments and operate on each segment simultaneously.
  • Performed transformations of source data with transform components such as Join, Match Sorted, Dedup Sorted, Denormalize, Reformat, and Filter by Expression.
  • Create Summary tables using Rollup, Scan & Aggregate.
  • Wrote and modified several application-specific config scripts in UNIX to pass environment variables.
  • Implemented the AutoSys interface database using Oracle and Ab Initio technologies.
  • Reviewed applications through the SDLC (Software Development Life Cycle) process to conform to corporate standards with the best and most optimized practices.
  • Provided consulting to users by translating change requests into implementations involving database schemas, SQL, and program code using Ab Initio and Oracle PL/SQL.
  • Extracted data from different sources to analyze data accuracy and developed reports as needed to support the business.
  • Developed a process to execute Ab Initio graphs/scripts in serial or parallel mode with full restartability.
  • Performed unit testing at various levels of the ETL and actively involved in team code reviews.
  • Identified problems in existing production data and developed one time scripts to correct them.
  • Fixed invalid mappings and troubleshot technical problems in the database.
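The serial/parallel execution process with full restartability mentioned above can be approximated with a checkpoint file; everything here, including the checkpoint path and step names, is a hypothetical sketch rather than the project's actual implementation:

```shell
#!/bin/ksh
# Restartability sketch: each completed step is appended to a checkpoint
# file, and a rerun of the batch skips steps already recorded there.
CKPT=${CKPT:-/tmp/batch.ckpt}

step_done() { grep -q "^$1\$" "$CKPT" 2>/dev/null; }

run_step() {
  name=$1; shift
  if step_done "$name"; then
    echo "skip $name"          # already completed in a previous run
  else
    "$@" && echo "$name" >> "$CKPT" && echo "ran $name"
  fi
}
```

A restart simply re-executes the same script: completed steps are skipped, while failed or pending steps run again.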

Environment: AB Initio, Teradata, Power Exchange, Oracle 10g, Oracle SQL developer, PL/SQL, Unix, ESPM, Control- M

Confidential, Chadds Ford, PA

ETL Developer

Responsibilities:

  • Developed ETL procedures to transform the data in the intermediate tables according to the business rules and functionality requirements.
  • Involved in the development of Informatica mappings and tuned them for better performance.
  • Created Mapplets and used them in different Mappings.
  • Created Sub Graphs to impose application/business restrictions.
  • Experience using sprint-developed application tools for preload, load, and post-load into partitioned tables.
  • Database tuning, backup and recovery and general day to day running of the system.
  • Involved in a hardware project migrating the DB server from Sun to AIX to reduce the project's footprint, improving efficiency and performance.
  • Created psets to run a generic graph with different parameters.
  • Used PDL to compute values for local parameters.
  • Used meta-programming to develop generic and reusable graphs.
  • Designed and developed a generic graph that generates extract psets from an extract metadata table.
  • Used Read XML and Write XML components to send and receive data and messages from downstream applications.
  • Used the Designer debugger to test data flow and fix mappings; tuned Informatica mappings and sessions for optimum performance.
  • Worked with mapping parameters and variables to load data from different sources to corresponding partition of the database table.
  • Created Schema objects like Indexes, Views and Sequences.
  • Extracted data from flat files and Oracle and loaded it through Informatica.
  • Involved in Production Support and issue resolutions.
  • Involved in Unit testing of Mappings, Workflows and Debugging mappings for failed sessions.
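Workflow runs of the kind described above are typically driven from the shell with Informatica's pmcmd utility; the sketch below only builds the command string, and the service, domain, folder, and workflow names are hypothetical:

```shell
#!/bin/ksh
# Sketch of launching an Informatica workflow via pmcmd. The -uv/-pv flags
# name environment variables holding credentials, so no password appears
# on the command line. All names here are illustrative.
start_workflow() {
  echo "pmcmd startworkflow -sv INT_SVC -d DOM_DEV -uv PMUSER -pv PMPASS -f SALES -wait $1"
}

start_workflow wf_load_dim_customer
```

With -wait the shell blocks until the workflow finishes, so the script's exit status can drive failure handling in the scheduler.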

Environment: Informatica Power Center 8.6, IDQ 9.1, Oracle 10g, SQL Server 2005, Oracle Designer, Toad, PL/SQL, Linux, Erwin and Windows 2000 / XP.
