
Ab Initio Developer Resume

New York City, New York


  • Over 8 years of IT experience in the Design, Analysis, Development, Modeling, Implementation and testing of various applications, Decision Support Systems & Data Warehousing applications.
  • Strong experience in the Analysis, design, development, testing and Implementation of Business Intelligence solutions using Data Warehouse/Data Mart Design, ETL, BI, Client/Server applications.
  • Strong Experience in Ab Initio Architecture, GDE and Co>Operating System.
  • Solid experience in Extraction, Transformation and Loading (ETL) mechanism using Ab Initio. Knowledge of full life cycle development for building a data warehouse.
  • Expertise in Data Warehouse/Data mart, ODS, OLTP and OLAP implementations teamed with project scope, Analysis, requirements gathering, data modeling, Effort Estimation, ETL Design, development, System testing, Implementation and production support.
  • Extensive experience in developing Stored Procedures, Functions, Views and Triggers, Complex SQL queries using SQL Server and Oracle PL/SQL.
  • Experience in designing and implementing Ab Initio EME Metadata projects.
  • Designed Parallel Partition Ab Initio Graphs for a high volume data warehouse.
  • Involved in Ab Initio graph design and performance tuning of the load graph processes.
  • Experience in creating custom schema from Ab Initio EME base schema and creating EME custom views.
  • Responsible for fully understanding the source systems, documenting issues, following up on issues and seeking resolutions
  • Extensively used ETL methodologies for supporting data extraction, transformations and loading processing using Ab Initio.
  • Experienced in writing SQL query and PL/SQL Procedures, Triggers, and Functions necessary for maintenance of the databases in the data warehouse development lifecycle.
  • Experienced in Oracle Database, SQL, PL/SQL, SQL Tuning, Replication, ETL, Database Design, Data Modeling, Data Warehousing.
  • Extensive experience with data warehousing concepts including ETL best practices, dimensional modeling and business intelligence support.
  • Performed Informatica administrative tasks such as stopping/starting Informatica Repository and Server processes, performing code migration from Development to Test to Production, backup and restoring of Informatica Repository, upgrades and security administration.
  • Good knowledge in creating and scheduling AutoSys jobs to automate ETL processes; also possesses Control-M job scheduling knowledge.
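In practice, a scheduled ETL job like the AutoSys jobs mentioned above is usually a thin shell wrapper that logs the run and surfaces the exit status back to the scheduler. A minimal sketch, with the wrapped command and log location as illustrative assumptions:

```shell
#!/bin/sh
# Minimal ETL job wrapper of the kind an AutoSys job would invoke.
# The wrapped command and log directory are illustrative assumptions.
run_etl_job() {
    job_name=$1; shift
    log_file="/tmp/${job_name}.log"          # real jobs log under a dedicated area
    echo "$(date '+%Y-%m-%d %H:%M:%S') START ${job_name}" >> "$log_file"
    "$@" >> "$log_file" 2>&1                 # run the actual ETL command
    rc=$?
    echo "$(date '+%Y-%m-%d %H:%M:%S') END ${job_name} rc=${rc}" >> "$log_file"
    return $rc                               # the scheduler reads this as success/failure
}
```

An AutoSys JIL definition (or a Control-M job) would then point its command attribute at such a wrapper.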


ETL Tools: Ab Initio GDE 3.1.4/3.04/1.15, Co>Operating System 3.x/2.15, SSIS, SSRS, Informatica PowerCenter.

Database: MySQL, PL/SQL, Oracle 9i, 10g, 11g

Scripting Languages: Unix Shell Script, JavaScript, DHTML, XML

Operating Systems: Windows XP, 2003, 7, 8, 10, NT, Unix, SUSE Linux, Red Hat 7.2

Methodologies: SDLC, STLC, Agile, Scrum, Waterfall

Testing Tools: Load Runner, QTP, Quality Center

Defect Tracking Tools: Rational ClearQuest, Test Director

Version Control Tools: Microsoft Visual SourceSafe, SVN and GIT

Office Tools: MS Excel, MS Word, MS Power Point, MS Project, MS Outlook


Confidential, New York City, New York

Ab Initio Developer


  • Worked in the Data Management Team on Data Extraction, Fictionalization, Subset, Data Cleansing, and Data Validation.
  • Responsible for requirement gathering and development of Expiration Date Fictionalization project.
  • Creating the batch jobs through mainframe to run the graphs.
  • Responsible for creating the parameters for the graphs to run through the UNIX environment.
  • Assisted in batch processes using FastLoad, UNIX shell and Teradata SQL to transfer, clean up and summarize data.
  • Used AutoSys for Scheduling the Jobs and setting up the Jobs for execution.
  • Coordinated with different testing groups to accommodate their testing data requirements and translated them into data selection criteria in Ab Initio format.
  • Developed the scripts to automate the execution of Ab Initio graphs under UNIX/LINUX environment
  • Efficiently performed table extracts using Ab Initio and Teradata utilities as required by the process.
  • Provided technical expertise on data modeling, data warehousing and data integration and assisted Ab Initio in developing the Data Profiler.
  • Performed Data mapping between source systems to Target systems, logical data modeling, created class diagrams and ER diagrams and used SQL queries to filter data.
  • Experienced in using Ab Initio tools such as GDE 3.1.5, GDE 3.2.2, Metadata Hub, and Data Profiler.
  • Designed and developed ETL batch jobs using Ab Initio. Worked on EME to promote Ab Initio code to higher environments. Also worked on continuous flows, the standard environment, and Metadata Hub in order to build lineage.
  • Coordinated with UAT/Business team to help them to run the AutoSys jobs before getting code into production.
  • Implemented various Teradata recommended best practices while defining Profiles, Roles, Alerts, Multi-Value Compression, TARA GUI, Teradata Data Mover, and Net Backup.
  • Involved in implementation projects such as Oracle-to-Teradata DB/DW and SQL Server-to-Teradata conversion projects.
  • Converted user defined functions and complex business logic of an existing application process into Ab Initio graphs using Ab Initio components such as Reformat, Join, Transform, Sort, Partition to facilitate the subsequent loading process.
  • Experienced in creating and deploying Metadata Hub Web applications, and loading Metadata Hub customizations.
  • Used AutoSys 4.5.1 and latest version 11.3.5, WA AE (CA Work Load Automation AE), for scheduling the Jobs and setting up the Jobs for execution.

Environment: Ab Initio (GDE 1.15, Co>Operating System 2.15), Teradata 12, AutoSys, FastLoad, MultiLoad, FastExport, BTEQ, UNIX (IBM AIX 5.1), Control-M, UNIX shell scripts, SQL, PL/SQL.
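The parameter-file and graph-automation work above typically involves loading a KEY=VALUE parameter file before invoking a deployed graph script. A minimal sketch; the file format, variable names, and the commented-out graph invocation are assumptions:

```shell
#!/bin/sh
# Load a KEY=VALUE parameter file and export each entry, the way a
# deployed Ab Initio graph's wrapper script picks up its run parameters.
# File format and variable names here are illustrative assumptions.
load_params() {
    params_file=$1
    while IFS='=' read -r key value; do
        case $key in
            ''|\#*) continue ;;              # skip blank lines and comments
        esac
        export "$key=$value"
    done < "$params_file"
}
# A real wrapper would then run the deployed graph, e.g.:
#   . /path/to/sandbox/ab_project_setup.ksh   # hypothetical environment setup
#   my_graph.ksh                              # deployed graph script (assumption)
```

A usage example: `load_params /etl/params/daily_load.params` before kicking off the graph, so every downstream component sees the same exported environment.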

Confidential, Chicago, Illinois

Ab Initio Developer


  • Responsible for converting business requirements into technical specifications. Created high level and detailed designs using Ab Initio.
  • Interacted with Business Analysts, DBAs, third party consultants and other developers to refine requirements and develop solutions to business problems.
  • Designed, developed, tested and implemented software components of ETL systems and participated in all phases of the software development life cycle. Developed and modified configuration scripts in UNIX korn shell scripts and used SQL extensively to investigate data problems and to support the functionalities of the ETL process.
  • Performed Unit testing, implemented enhancements, fixed defects and maintained code.
  • Used Control M and Autosys for Scheduling the Jobs and setting up the Jobs for execution.
  • Tuning of Ab Initio Graphs for better performance.
  • Developed complex Ab Initio XFRs to derive new fields and solve rigorous business requirements.
  • Worked Extensively on Teradata SQL Assistant to analyze the existing data and implemented new business rules to handle various source data anomalies.
  • Automated the Data Loads using UNIX korn shell scripting for Production, Testing and development environment.
  • Developed executive summary level metadata reports in excel via web service extracts from the Ab Initio Metadata Hub.
  • Designed a STAR schema for sales data involving shared dimensions (Conformed) for other subject areas using Erwin Data Modeler.
  • Involved in Teradata query tuning; tuned complex queries and views, and implemented macros to reduce parsing time.
  • Handled Teradata performance SQL Tuning, Query optimization (Explain plans, Collect statistics, Primary and Secondary indexes).
  • Experienced in planning and coordinating Disaster recovery solution for Teradata.
  • Implemented data parallelism through graphs that divide data into segments and operate on each segment simultaneously, using the Ab Initio partition components to segment the data.
  • Designed the Physical Data Model (PDM) using the IBM InfoSphere Data Architect data modeling tool and Oracle PL/SQL.
  • Experienced in developing multiple Data Quality configuration, Templates in Express IT and imports into Metadata Hub.
  • Expertise in Ab Initio Product Data Quality and Metadata Hub upgrades, view customization, extractor registration, Metadata Hub lineage, and Metadata Hub import creation.
  • Provided daily reporting of activities and trends by implementing the AUTOSYS Interface Database using Oracle, and Ab Initio technologies.
  • Reviewed applications through the SDLC (Software Development Life Cycle) process to conform to corporate standards with the best and most optimized practices.
  • Provided consulting to users by translating change requests to implementation involving database schema, SQL and program codes using various technologies, AUTOSYS, Ab Initio and oracle PL/SQL.

Environment: Ab Initio (Co>Operating System), Oracle 10g, PL/SQL, Teradata, AutoSys, Control-M, UNIX Korn shell.
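The Explain-plan / Collect-statistics tuning loop described above is often scripted so statistics stay fresh after each load. A sketch that generates a BTEQ input file; the table and column names are made up for illustration, and the actual `bteq` invocation is left as a comment:

```shell
#!/bin/sh
# Generate a BTEQ input file that refreshes statistics on a list of
# table/column pairs -- part of the Explain-plan / Collect-statistics
# tuning loop. Table and column names are illustrative assumptions.
gen_collect_stats() {
    out_file=$1; shift
    : > "$out_file"                          # truncate the output file
    for spec in "$@"; do                     # each spec is db.table:column
        tbl=${spec%%:*}
        col=${spec#*:}
        echo "COLLECT STATISTICS ON $tbl COLUMN ($col);" >> "$out_file"
    done
    echo ".QUIT" >> "$out_file"
}
# A real job would then feed the file to BTEQ, e.g.: bteq < "$out_file"
```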

Confidential, Deerfield, Illinois

Ab Initio Developer


  • Analyzed business requirements, technical specification, source repositories and physical data models for ETL mapping and process flow
  • Worked extensively with mappings using expressions, aggregators, filters, lookup, joiners, update strategy and stored procedure transformations
  • Extensively used Pre-SQL and Post-SQL scripts for loading the data into the targets according to the requirement.
  • Involved in Teradata query tuning; tuned complex queries and views, and implemented macros to reduce parsing time.
  • Handled Teradata performance SQL Tuning, Query optimization.
  • Responsible for writing automated scripts that update information into the Teradata Database for our Web Applications access.
  • Extensively used the Ab Initio components like Reformat, Join, Partition by Key, Partition by Expression, Merge, Gather, Sort, Dedup Sort, Rollup, Scan, FTP, Lookup, Normalize and Denormalize.
  • Developed mapping to load Fact and Dimension tables, for type 1 and type 2 dimensions and incremental loading and unit tested the mappings.
  • Responsible for Performance-tuning of Ab Initio graphs. Written UNIX shell scripts in Batch scheduling.
  • Involved in designing the Data Mart models with Erwin using Star schema methodology
  • Used Repository Manager to create the repository, users and groups, and managed users by setting up privileges and profiles.
  • Used Ab Initio web EME to monitor the data mapping across graphs in the project.
  • Performed Database tasks such as creating database objects (tables, views, procedures, functions)
  • Responsible for debugging and performance tuning of targets, sources, mappings and sessions
  • Optimized the mappings and implemented the complex business rules by creating re-usable transformations and Mapplets.
  • Used Informatica Workflow Manager for creating, running the Batches and Sessions and scheduling them to run at specified time
  • Implemented and documented all the best practices used for the data warehouse
  • Created Workflows, tasks, database connections, FTP connections using workflow manager
  • Responsible for identifying bugs in existing mappings by analyzing data flow, evaluating transformations and fixing bugs
  • Developed stored procedures using PL/SQL and driving scripts using Unix Shell Scripts.

Environment: Ab Initio, Informatica PowerCenter 9.1, Power Exchange, Oracle 10g, Oracle SQL developer, PL/SQL, Unix, ESPM, Control- M
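The pre-SQL / load / post-SQL pattern described above amounts to running a fixed sequence of steps and stopping at the first failure. A minimal sketch of such a driver; the individual step commands (the SQL runner, the load itself) are passed in, so nothing here is a specific project command:

```shell
#!/bin/sh
# Driver that runs a pre-SQL step, the load itself, and a post-SQL step
# in order, aborting at the first failure -- the shape of the
# pre-/post-SQL load processing described above. The step commands are
# passed in, so the SQL runner (sqlplus, bteq, ...) is an assumption.
run_load() {
    for step in "$@"; do
        if ! eval "$step"; then
            echo "load aborted at step: $step" >&2
            return 1
        fi
    done
    echo "load complete"
}
```

Usage would look like `run_load "sqlplus ... @pre.sql" "my_load.ksh" "sqlplus ... @post.sql"` (hypothetical commands), so a pre-SQL failure prevents the load from ever starting.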

Confidential, Plano, TX

Ab Initio/ ETL Developer


  • Populate foundation data within the EDW in order to support weekly downstream data requirements.
  • Leverage contributions of the DSAS (Direct Sales Accounting System) project
  • Build reconciliation mechanism to ensure no discrepancy in data from source and target systems.
  • Design and develop graphs, plans, and PDL using Ab Initio, with transform, partition, departition, and sort components and various lookup functions.
  • Help build the strategy to implement the customer and transaction level data warehouse on DB2.
  • Document user requirements and translate requirements into system solutions.
  • Architect Star & Snowflake based logical & physical data models for Data Warehouse systems using data modeling tools such as Erwin.
  • Create the functional specification documents for ETL interfaces.
  • Design, develop, deploy and support integration processes across the enterprise by utilizing Informatica V9.1.
  • Designed and developed database schemas using Teradata Relational Data Warehouse.
  • Develop test plans, test cases, test scripts and test validation data sets for Data Mart, Data Warehouse integration/ETL processes.
  • Created several packages to set up and share the local, global variables and transforms which were extensively used for many Ab Initio graphs.
  • Perform software testing including Unit Testing, Functional Testing, Database Testing, Load Testing, Performance Testing and User Acceptance Testing.
  • Provide platform for testing team to perform White/Black Box Testing, System Testing, Regression Testing, Integration Testing, and End to End Testing.
  • Design and Implement ETL processes for History load and Incremental loads of EDW, Customer and Transaction level data warehouse.
  • Document all the interface processes in current data warehouse system and translate them into new ETL processes using Informatica.
  • Strong knowledge of developing and implementing ETL processes using Ab Initio, DQE, UNIX shell, DB2, and Oracle.
  • Expert in Designing SSIS Packages to Extract, Transform and Load (ETL) / DTS data into the data warehouse from heterogeneous databases such as Oracle, DB2, MS Access.
  • Extensively used Ab Initio client tools: Graphical Development Environment (GDE), Co>Operating System, and Enterprise Meta>Environment (EME).
  • Involved in Data Assessment to identify key data sources and run the system extracts and queries.
  • Automate all the new ETL jobs using UNIX korn shell scripts and add data validation checks including business rules and referential integrity checks.
  • Perform troubleshooting, performance tuning and performance monitoring for enhancement of jobs.
  • Maintain warehouse metadata, naming standards and warehouse standards for future application development.

Environment: Ab Initio, UNIX Shell scripting, Informatica PowerCenter Client 9.1, SVN, DB2
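The source-to-target reconciliation mechanism mentioned above reduces, at its simplest, to comparing row counts from both systems and flagging any mismatch. A minimal sketch; in a real job the counts would come from SQL against the source and target databases, while here they are passed in directly:

```shell
#!/bin/sh
# Compare source and target row counts and flag any discrepancy --
# a minimal version of the source/target reconciliation described above.
# In practice the counts would come from SQL against both systems.
reconcile() {
    src_count=$1
    tgt_count=$2
    if [ "$src_count" -eq "$tgt_count" ]; then
        echo "RECONCILED: $src_count rows"
        return 0
    else
        echo "DISCREPANCY: source=$src_count target=$tgt_count" >&2
        return 1
    fi
}
```

The nonzero exit status on a discrepancy lets the scheduler fail the job and alert on it rather than silently loading incomplete data.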

Confidential, Lynchburg, VA

Ab Initio Developer


  • Worked with a team to design the database schema for the metadata that stores the dynamically generated informative queries.
  • Developed ETL code based on Business requirements using various Ab Initio components.
  • Involved in writing the Triggers which internally calls procedures and functions.
  • Tested the application to ensure proper functionality, data accuracy, and that modifications have no adverse Impact on integrated system environment
  • Experience with Data flow diagrams, Data dictionary, Database normalization theory techniques, Entity relation modeling and design techniques.
  • Installed and configured Teradata 13.10/14.10 tools such as SQL Assistant, Teradata Administrator, Viewpoint, Visual Explain, and Stats Wizard.
  • Used Teradata manager to query Teradata RDBMS status and CPU utilization in reports and graphs.
  • Expertise in Client-Server application development using Oracle, PL/SQL, SQL *PLUS, TOAD and SQL*LOADER.
  • Effectively made use of Table Functions, Indexes, Table Partitioning, Collections, Analytical functions, Materialized Views, Query Re-Write and Transportable table spaces.
  • Strong experience in Data warehouse concepts, ETL.
  • Good knowledge on logical and physical Data Modeling using normalizing Techniques.
  • Worked on configuring Ab Initio environment to connect to database using DB configuration file, Input Table, Output Table, and Update Table components and other custom components.
  • Developed Complex database objects like Stored Procedures, Functions, Packages and Triggers using SQL and PL/SQL.
  • Developed Ab Initio graphs as daily and monthly cycle for loading, partitioning, cleaning and populating the data based on legal and business requirements.
  • Worked with different sources such as DB2, SQL Server, Excel, and flat files.
  • Used Informatica to extract data into Data Warehouse.

Environment: Ab Initio, Informatica PowerCenter 8.6, IDQ 9.1, Oracle 10g, SQL Server 2005, Oracle Designer, Toad, PL/SQL, Linux, Erwin and Windows 2000 / XP.
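Stored-procedure development like that described above is commonly scripted as a shell step that writes the PL/SQL to a deployment file for SQL*Plus to run. A sketch; the procedure body and object names are made-up illustrations, not actual project code:

```shell
#!/bin/sh
# Emit a PL/SQL procedure definition to a deployment file, to be run
# later via SQL*Plus. The procedure body is a made-up illustration of
# the stored-procedure work described above, not actual project code.
gen_procedure() {
    out_file=$1
    cat > "$out_file" <<'EOF'
CREATE OR REPLACE PROCEDURE upd_load_audit (p_job IN VARCHAR2) AS
BEGIN
  -- hypothetical audit table; real objects differ
  UPDATE load_audit SET last_run = SYSDATE WHERE job_name = p_job;
  COMMIT;
END upd_load_audit;
/
EOF
}
# Deployment would then be e.g.: sqlplus user/pass @"$out_file"
```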

Confidential, Pittsburgh, PA

ETL Developer


  • Worked closely with business analysts to understand and document business needs for Decision support data.
  • Used several Informatica transformations such as Lookup, Expression, Aggregator, Joiner, Filter, and Normalizer.
  • Used the Update Strategy Transformation to update the Target Dimension tables.
  • Tested the ETL Ab Initio graphs and other ETL Processes (Data Warehouse Testing).
  • Developed ETL procedures to transform the data in the intermediate tables according to the business rules and functionality requirements.
  • Involved in communicating with support, business and development teams to resolve issues during test execution. Tested initial and daily loads for various Ab Initio graphs.
  • Created Mapplets and used them in different Mappings.
  • Used designer debugger to test the data flow and fix the mappings. Tuned Informatica, Mappings and sessions for optimum performance.
  • Worked with mapping parameters and variables to load data from different sources to corresponding partition of the database table.
  • Created Schema objects like Indexes, Views and Sequences.
  • Extracted the data from flat files and Oracle and loaded it through Informatica.
  • Performed data validations for Business Objects reports. Extensively used the Ab Initio Co>Operating System to execute the graphs and verify the logs.
  • Involved in Unit testing of Mappings, Workflows and Debugging mappings for failed sessions.
  • Created partitions, SQL overrides in Source Qualifier, and session partitions for improving performance.
  • Wrote Unix korn Shell Scripting based on requirements.

Environment: Ab Initio, Informatica PowerCenter 8.6, Oracle 11i/10g, SQL Server 2005, UNIX, Shell Scripting, Toad, Netezza.
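Routing data to the corresponding partition of a table, as described above, is often done by deriving the partition name from the load date in the driving shell script. A minimal sketch; the `PYYYYMM` monthly naming convention is an assumption:

```shell
#!/bin/sh
# Derive the target partition name from a load date, the way mapping
# parameters were used to route rows to the corresponding monthly
# partition. The PYYYYMM naming convention is an assumption.
partition_for() {
    load_date=$1                             # expected as YYYY-MM-DD
    yyyymm=$(echo "$load_date" | cut -c1-7 | tr -d '-')
    echo "P${yyyymm}"
}
```

The derived name can then be passed to the session as a mapping parameter or interpolated into a partition-extended INSERT.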

Confidential, Columbus, OH

Informatica Developer


  • Involved in detailed analysis of requirements and creation of requirement specifications and functional specifications.
  • Analyzed business requirements and worked closely with various application teams and business.
  • Prepared technical specifications for the development of extraction, transformation and loading of data into various stage tables.
  • Developed ETL procedures that are consistent across all applications and systems.
  • Involved in Creating Fact and Dimension tables using Star schema
  • Built Repositories using Informatica PowerCenter Repository Manager to store Source, Target, Transformations, Mapplets and Mappings Meta Data Information.
  • Worked on Informatica tools like Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
  • Modified various objects like Materialized Views, Stored Procedures, Packages, Functions wherever the change has the impact.
  • Developed various Mappings, Mapplets and Transformations for migration of data from various existing systems to the new system using Informatica PowerCenter Designer.
  • Extracted and loaded data from flat file and Oracle sources to the Oracle database (target warehouse) using transformations in the mappings.
  • Created Informatica mappings with SQL, PL/SQL procedures and Functions to build business rules to load data.
  • Wrote shell scripts and control files to load data into staging tables and then into Oracle base tables using SQL*Loader.
  • Performed tuning of sessions in Target, Source, Mappings and Session areas.

Environment: Informatica Power Center, Oracle 10g, DB2, PL/SQL, SQL*Loader, Windows NT and UNIX, Toad, SQL Server 2008.
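The shell-scripts-plus-control-files pattern for SQL*Loader staging loads mentioned above usually generates the control file from the script so the data file path stays in one place. A sketch; the staging table and column names are illustrative assumptions:

```shell
#!/bin/sh
# Generate a SQL*Loader control file for a delimited staging load --
# the shell-plus-control-file pattern described above. Table and
# column names are illustrative assumptions.
gen_ctl() {
    data_file=$1
    out_file=$2
    cat > "$out_file" <<EOF
LOAD DATA
INFILE '$data_file'
APPEND INTO TABLE stg_customer
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(cust_id, cust_name, load_dt DATE 'YYYY-MM-DD')
EOF
}
# The load itself would then be: sqlldr userid=... control="$out_file"
```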
