
Sr. Ab Initio Developer Resume


Wilton, CT

SUMMARY:

  • 9+ years of IT experience in the Design, Analysis, Development, Modeling, Implementation and Testing of Decision Support Systems and Data Warehousing applications.
  • Solid experience in Extraction, Transformation and Loading (ETL) processes using Ab Initio. Knowledge of full life cycle development for building a Data Warehouse.
  • Extensively used ETL methodologies to support data extraction, transformation and loading with Ab Initio.
  • Experience in developing Ab Initio Graphs, Conduct>It Plans, Express>It Templates, Metadata Hub customizations/importers, and Job/Schedule creation in Control Center.
  • Experience with Ab Initio products including Express>It, Metadata Hub, Control Center and Data Profiler.
  • Configured Graph parameters, Sandbox parameters, Environment variables, EME Repository Environment for Production/Testing/Development.
  • Experience in Performance Tuning of the Graphs.
  • Experience in Creation of BRE Templates in Express>It.
  • Hands on experience with Metadata Hub Administration Tools, utilities for creating Metadata Hub data stores.
  • Experience using Metadata Importer for importing metadata from an EME Technical Repository and other sources like ETL tools (Informatica), Reporting tools (Cognos, SAS, Business Objects etc.) and databases (Oracle, Teradata, DB2 etc.) for building data lineage in Metadata Hub.
  • Experience in Creating and Deploying Metadata Hub Web applications.
  • Generated stats (start time, end time, duration, job status, file receive time, etc.) from the OPDB database (Ab Initio Control Center) while jobs were in progress and sent status reports (see the sketch after this list).
  • Experience in Jobs/Schedule creation in Control Center.
  • Knowledge in analyzing data using Ab Initio Data Profiler to identify data patterns, duplicates, frequency, consistency, accuracy, completeness and referential integrity.
  • Set up Development, QA & Production Environments.
  • Experience in UNIX Shell Scripts including Korn Shell Scripts, Bourne Shell Scripts.
  • Worked with Different Source/Target Systems like Oracle, MS SQL Server, Teradata, and DB2.
  • Excellent Knowledge in Data Warehousing Concepts, Dimensional modeling like Star Schema and Snowflake Schema.
  • Experienced in writing SQL queries and PL/SQL Procedures, Triggers and Functions necessary for maintaining the databases in the Data Warehouse development lifecycle.
  • Ability to coordinate effectively with Development Teams, Business, End Users and Management.
  • Deep understanding and capability in Troubleshooting of the common issues faced during the Development Lifecycle including Coding, Debugging, Testing and Final rollout.
  • Experience in reviewing and monitoring Project Progress, Planning & Managing Dependencies and Risks, Resource Planning, Project Tracking and Forecasting.
  • Self-motivated, with excellent written and oral communication skills.
  • Strong analytical and problem-solving skills.
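To illustrate the OPDB status reporting mentioned above, here is a minimal ksh sketch of that kind of report. It assumes an Oracle-hosted OPDB and uses invented placeholder names (the JOB_RUNS table and its columns, the connect string, the mail list); the actual Control Center reporting schema will differ.

```ksh
#!/bin/ksh
# Hedged sketch: pull today's job stats from an OPDB-style reporting table
# and mail them as a status report. Table, columns, connect string and the
# distribution list are hypothetical placeholders.
OPDB_CONN="opdb_user/****@OPDB"                    # placeholder connect string
REPORT=/tmp/job_status_$(date +%Y%m%d%H%M).txt
MAILTO="etl-support@example.com"                   # placeholder distribution list

sqlplus -s "$OPDB_CONN" <<'EOF' > "$REPORT"
SET PAGESIZE 200 LINESIZE 200
SELECT job_name,
       start_time,
       end_time,
       ROUND((NVL(end_time, SYSDATE) - start_time) * 24 * 60) AS duration_min,
       job_status,
       file_receive_time
FROM   job_runs                                    -- hypothetical reporting table
WHERE  run_date = TRUNC(SYSDATE)
ORDER  BY start_time;
EOF

mailx -s "ETL job status $(date +%Y-%m-%d)" "$MAILTO" < "$REPORT"
```

In practice a script like this would be scheduled alongside the jobs themselves so the report also covers runs that are still in flight.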

TECHNICAL SKILLS:

ETL Tools: Ab Initio GDE 3.3.x, Co>Op 3.3.x, Express>It (ACE/BRE), Metadata Hub, Data Profiler, Ab Initio Control Center (Operational Console), Informatica

Scheduling Tools: Ab Initio Control Center, Maestro, Autosys, Control-M, Tivoli Workload Scheduler

OS: UNIX/LINUX, Windows

Databases: Oracle 11g/10g/9i/8i/8/7.x, MS SQL Server 2008/2005/2000, Teradata V2R6.1, MS Access, IBM DB2.

Languages: UNIX Shell Scripting, SQL, PL/SQL, T-SQL, C.

Scripting: UNIX Shell (Ksh), JavaScript.

Web: HTML, DHTML, XML, ASP 2.0/3.0

PROFESSIONAL EXPERIENCE:

Confidential, Wilton, CT

Sr. Ab Initio Developer

Responsibilities:

  • Involved in meetings with Business System Analysts and Business Users to understand the Requirements.
  • Developed Ab Initio Graphs/Plans for reading from and writing to the HDFS (Hadoop) data lake.
  • Created BRE Templates, AppConfs in Express>It.
  • Created linked subgraphs to reuse functionality across graphs.
  • Created Graphs using components like Reformat, DedupSorted, Filter by expression, Partition By Expression, Partition by Key and Sort, Sort, Replicate, Join, Merge, Concatenate, Gather, Rollup, Normalize, Run Program, Read/Write HDFS, Write Block Compressed Lookup, Merge Block Compressed Data, Lookup Template.
  • Created subplans in Conduct>It by grouping Ab Initio Graphs by functionality and determined when they should be enabled or disabled.
  • Built PDL-driven Ab Initio Plans and Graphs using various Dataset and Program components.
  • Wrote BRE Rulesets in the Express>It AppConf mapping transformations.
  • Tuned the Graphs by using Lookup files, Memory sort and Max-core parameters for maximum usage of cache memory and to enhance the performance.
  • Used Dynamic Lookup to fetch values from the Lookup file in the Reformat Transformation.
  • Worked on the Orchestration of Jobs using Ab Initio Control Center.
  • Used Control Center to schedule and configure Ab Initio Jobs by calling Plans/Graphs/Scripts.
  • Imported and exported Control Center (Operational Console) jobs from Development into higher environments (SIT/UAT/PROD).
  • Handled Deployments and Migration of the Code to Prod/Test regions.
  • Wrote SQL queries.
  • Created mockup data for the Development Environment to test the business functionality (a mock-data sketch follows this list). Supported System testing, UAT testing and preproduction testing. Handled critical data issues in System Testing and UAT environments during recovery.
  • Worked with the Team in carrying out unit testing to identify errors in the Ab Initio code.
  • Wrote Production Handoff Documents.
  • 24x7 Production Support for ETL jobs for Daily, Monthly and Weekly schedules.
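As a rough illustration of the mock-data bullet above, the following ksh sketch writes a small pipe-delimited test file. The record layout (cust_id|cust_name|txn_amount|txn_date) and the output location are made-up examples, not the project's actual record format.

```ksh
#!/bin/ksh
# Generate a small pipe-delimited mock file for development testing.
# Layout and location are illustrative assumptions only.
OUTFILE=${AI_SERIAL:-/tmp}/mock_customer_txn.dat   # AI_SERIAL = sandbox serial dir, if set
ROWS=${1:-100}

: > "$OUTFILE"                                     # truncate/create the output file
i=1
while [ "$i" -le "$ROWS" ]; do
    printf "%08d|CUSTOMER_%05d|%d.%02d|%s\n" \
        "$i" "$i" $((RANDOM % 1000)) $((RANDOM % 100)) "$(date +%Y-%m-%d)" >> "$OUTFILE"
    i=$((i + 1))
done
echo "Wrote $ROWS mock records to $OUTFILE"
```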

Environment: co>op 3.3.1.2, GDE 3.3.1.1, Express>It, Hadoop, UNIX, SQL.

Confidential, Washington, DC/Charlotte, NC

Sr. Ab Initio Metadata Hub Developer/Admin

Responsibilities:

  • Used Metadata Importer for importing Metadata from an EME Technical Repository and other sources like ETL tools (Informatica), reporting tools (Cognos, Business Objects etc.) and DB2 databases.
  • Imported the Catalog, Custom Catalog, Roles and Privileges feeds for the databases (Oracle and Teradata) so the databases appear in the Physical Assets hierarchy.
  • Imported the Erwin Logical and Erwin UDPs Feeds for the Logical Models.
  • Loaded Metadata Hub customizations.
  • Imported the EME Datasets and EME Graph Imports for building the Data Lineage.
  • Imported EME Datasets and Data Profiler results from the EME into the Technical Repository so the results appear in the Metadata Portal.
  • Loaded the customizations for Valid Value Combinations, which include Master Values, Business Data Element Groups and Domain Code Sets.
  • Created Metadata Hub data stores using utilities.
  • Created and deployed Metadata Hub Web applications.
  • Customized the Metadata Explorer so business users can explore and analyze the metadata, see the contents of systems and applications, and drill down into the details of an object.
  • Created new feed files for importing metadata, both on the command line and in the Metadata Portal.
  • Created rule files for transformations and for importing the feeds.
  • Created Data Source Connection files for connecting to the graphs in order to extract the Metadata.
  • Generated Metadata Reports (EME Error Reports).
  • Generated Dependency Analysis Reports.
  • Added and exposed Divisions in the Metadata Portal.
  • Imported Datasets in to EME and MDHUB for Data lineage.
  • Installing GDE Keys and ETL server keys.
  • Provisioned user access to the EME and Metadata Hub.
  • Maintaining the Prod/Test/Dev Environments.
  • Generated reports showing when a user last used the GDE.
  • Handling Code Deployments.
  • Masked production data in files copied down to Dev.
  • Shut down and restarted the EME and Metadata Hub servers.
  • Collected environment information such as tag names, DB names, schema names, etc. for all environments.
  • Set up new environments and performed shakeout tests (checking out sandboxes, running graphs through the scheduler, etc.).

Environment: co>op 3.1.5, GDE 3.1.7.4, Metadata Hub 3.1.12, Express>It (ACE/BRE), DB2, UNIX.

Confidential, Warren, NJ

Sr. Ab Initio Developer

Responsibilities:

  • Involved in meetings with Business System Analysts and Business users to understand the functionality.
  • Developed Ab Initio Graphs/Plans for loading and extracting data from various Oracle database schemas.
  • Created Graphs using components like Input/Output File/Table, Lookup, Reformat, Redefine Format, Dedup Sorted, Filter by Expression, Partition by Expression, Partition by Key and Sort, Sort, Broadcast, Replicate, Join, Merge, Concatenate, Gather, Rollup, Normalize, Read Excel Spreadsheet, Fuse, Run Program, Run SQL, Update Table, Multi Update Table, Send Message, Send Mail.
  • Performed data cleansing operations using transformation functions like is_valid, is_defined, is_null, is_blank, string_lrtrim, re_index, reinterpret_as, string_concat, string_substring, lookup_count, lookup_first, now(), decimal_strip, re_replace, decimal_lpad, next_in_sequence(), length_of, test_characters_all(), force_error(), switch(), first_defined(), lookup_match(), conditional DML, the cobol-to-dml utility, the xml-to-dml utility, etc.
  • Worked on the Orchestration of Jobs using Ab Initio Control Center (Operational Console tool).
  • Used Control Center (Operational Console Tool) to schedule and configure Ab Initio jobs by calling Plans/Graphs/Scripts.
  • Imported and exported Control Center (Operational Console) jobs from Development into higher environments (SIT/UAT/PROD).
  • Hands-on experience using Conduct>It to create complete production-ready plans or subplans consisting of Ab Initio graphs and custom scripts.
  • Created subplans in Conduct>It by grouping Ab Initio graphs by functionality and determined when they should be enabled or disabled.
  • Used Express>It (ACE/BRE) to create the framework jobs that are imported in to Operational Console.
  • Created Templates, AppConfs in Express>It (ACE/BRE).
  • Used PDL functions in Graphs/Plans.
  • Wrote mail scripts for notifying successful processing of files, file-watcher scripts, run .ksh scripts for executing the graphs, housekeeping scripts, etc. (see the file-watcher sketch after this list).
  • Handled Deployments and Migration of the Code to Prod/Test regions.
  • Generated DB configuration files (.dbc) for source and target tables.
  • Wrote SQL queries.
  • Worked with the team in carrying out unit testing to identify errors in Ab Initio code handling customer data from everyday transactions in a real-time transaction environment.
  • Wrote Production Handoff Documents.
  • Coordinating with Offshore Team.
  • 24x7 Production Support for ETL jobs for daily, Monthly and Weekly schedules.
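A bare-bones sketch of the kind of file-watcher/run wrapper described above. The landing directory, trigger file, graph name and mail address are hypothetical placeholders; it assumes the graph has been deployed from the GDE as an executable .ksh in the sandbox run directory ($AI_RUN).

```ksh
#!/bin/ksh
# File-watcher + run wrapper (illustrative; all names are placeholders).
# Waits for a trigger file, then runs a deployed graph and alerts on failure.
LANDING_DIR=/data/landing/customer            # hypothetical landing area
TRIGGER="$LANDING_DIR/customer_feed.done"     # hypothetical trigger file
GRAPH="$AI_RUN/load_customer.ksh"             # hypothetical deployed graph script
MAILTO="etl-support@example.com"              # hypothetical distribution list
MAX_WAIT_MIN=60

waited=0
while [ ! -f "$TRIGGER" ]; do                 # poll once a minute for the trigger
    sleep 60
    waited=$((waited + 1))
    if [ "$waited" -ge "$MAX_WAIT_MIN" ]; then
        echo "Trigger file not received within $MAX_WAIT_MIN minutes" \
            | mailx -s "customer feed late" "$MAILTO"
        exit 1
    fi
done

"$GRAPH"                                      # run the deployed graph
rc=$?
if [ "$rc" -ne 0 ]; then
    echo "load_customer.ksh failed with exit code $rc" \
        | mailx -s "customer load FAILED" "$MAILTO"
    exit "$rc"
fi
echo "load_customer.ksh completed successfully"
```

A scheduler or Control Center job would typically invoke a wrapper like this rather than calling the graph script directly.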

Environment: co>op 3.1.5, GDE 3.1.7.4, Express>It (ACE/BRE), Oracle, UNIX and HP Quality Center.

Confidential, Herndon, Virginia

Ab Initio Developer

Responsibilities:

  • Involved in Development of Ab Initio graphs for loading and extracting data from various schemas relating to Oracle Database.
  • Dealt with ASCII (fixed, delimited, multiple headers/trailers), EBCDIC and XML files.
  • Created ETL Solution Spec, Supplemental Documents (mapping) based on the requirements in DOORS, had review meetings with Business and Architects to finalize the documents.
  • Wrote Detailed Design, Application Control Design and Interface Detailed Design documents.
  • Created graphs using components like Input/Output File/Table, Lookup, Reformat, Redefine Format, Dedup Sorted, Filter by Expression, Partition by Expression, Partition by Key and Sort, Sort, Broadcast, Replicate, Join, Merge, Concatenate, Gather, Rollup, Scan, Read Excel Spreadsheet, FTP To, Fuse, Run Program, Run SQL, Update Table.
  • Performed data cleansing operations using transformation functions like is_valid, is_defined, is_null, is_blank, string_lrtrim, re_index, reinterpret_as, string_concat, string_substring, lookup_count, lookup_first, now(), decimal_strip, re_replace, decimal_lpad, next_in_sequence(), length_of, test_characters_all(), force_error(), switch(), first_defined(), lookup_match(), conditional DML, the cobol-to-dml utility, the xml-to-dml utility, etc.
  • Worked on Improving the performance of Ab Initio graphs by employing Ab Initio performance components like Lookups (instead of joins), In-Memory Joins, Rollup and Scan components to speed up execution of the graphs.
  • Used Autosys Scheduling tool for Orchestration of the jobs.
  • Handled Deployments and Migration of the Code to Prod/Test regions using Clear Case Tool.
  • 24x7 Production Support for ETL jobs for daily, Monthly and Weekly schedules.
  • Used Ab Initio Components like Partition by Key, reformat, rollup, join, scan, normalize, gather, replicate, merge etc.
  • Checked in/checked out existing applications using the EME in order to perform the necessary modifications.
  • Wrote Release Notes and Run Books for migrating code from Development to higher environments and baselined the documents in DOORS.

Environment: co>op 2.15/3.0.1, GDE 3.0.1, Oracle, Erwin, DOORS, Autosys, Clear Case, UNIX and HP Quality Center.

Confidential, Wilmington, Delaware

Ab Initio & Metadata Hub Developer/Module Lead

Responsibilities:

  • Used Metadata Importer for importing Metadata from an EME Technical Repository and other sources like ETL tools (Informatica), reporting tools (Cognos, SAS, Business Objects etc.) and databases (Oracle, Teradata, DB2 etc.).
  • Imported Catalog, Custom Catalog, Roles and Privileges feeds for the databases (Oracle and Teradata) so the databases appear in the Physical Assets hierarchy.
  • Imported Erwin Logical and Erwin UDPs Feeds for the Logical Models.
  • Loaded Metadata Hub customizations.
  • Imported EME Datasets and EME Graph Imports for building the Data Lineage.
  • Imported Data Profiler results from the EME into the Technical Repository so the results appear in the Metadata Portal.
  • Imported Business Data Definition Matrix feeds.
  • Loaded the customizations for Valid Value Combinations, which include Master Values, Business Data Element Groups and Domain Code Sets.
  • Created Metadata Hub data stores using utilities.
  • Created and deployed Metadata Hub Web applications.
  • Customized the Metadata Explorer so business users can explore and analyze the metadata, see the contents of systems and applications, and drill down into the details of an object.
  • Created new feed files for the metadata, both on the command line and in the Metadata Portal.
  • Created rule files for transformations and for importing the feeds.
  • Created Data Source Connection files for connecting to the graphs in order to extract the Metadata.
  • Generated Metadata Reports and performed auditing.
  • Added and exposed Divisions in the Metadata Portal.
  • Exposed the Notes tab and configured the various note types in the Metadata Portal.
  • Involved in Development of Ab Initio Graphs for loading and extracting data from various schemas relating to Oracle Database.
  • Performed data cleansing operations using transformation functions like is_valid, is_defined, is_null, is_blank, string_lrtrim, re_index, reinterpret_as, string_concat, string_substring, lookup_count, lookup_first, now(), decimal_strip, re_replace, decimal_lpad, next_in_sequence(), force_error(), switch(), first_defined(), lookup_match(), conditional DML, the cobol-to-dml utility, the xml-to-dml utility, etc.
  • 24x7 Production Support for ETL jobs for daily, Monthly and Weekly schedules.

Environment: co>op 2.15/3.0.1, GDE 3.0.1, Metadata Hub 3.0.4, Oracle, Erwin, UNIX and HP Quality Center.

Confidential, Irving, TX

Ab Initio Developer

Responsibilities:

  • Dealt with ASCII (fixed, delimited, multiple headers/trailers), EBCDIC, spreadsheet and XML files.
  • Involved in Development of Ab Initio graphs for loading and extracting data from various schemas relating to Oracle Database.
  • Created graphs using components like Input/Output File/Table, Lookup, Reformat, Redefine Format, Dedup Sorted, Filter by Expression, Partition by Expression, Partition by Key and Sort, Sort, Broadcast, Replicate, Join, Merge, Concatenate, Gather, Rollup, Scan, Read Excel Spreadsheet, FTP To, Publish, Subscribe, Fuse, Run Program, Run SQL, MQ Publish Headers, Read/Write XML, Batch Subscribe, Update Table.
  • Worked on Continuous Flows for sending the XML messages to Real Time application (TIBCO).
  • Used Continuous Flows for data enrichment with the mainframe.
  • Developed and tested wrapper scripts for Graphs and Maestro jobs/schedules.
  • Performed data cleansing operations using transformation functions like is_valid, is_defined, is_null, is_blank, string_lrtrim, re_index, reinterpret_as, string_concat, string_substring, lookup_count, lookup_first, now(), decimal_strip, re_replace, decimal_lpad, next_in_sequence(), length_of, test_characters_all(), force_error(), switch(), first_defined(), lookup_match(), conditional DML, the cobol-to-dml utility, the xml-to-dml utility, etc.
  • Wrote NDM scripts, mail scripts for notifying successful processing of files, file-watcher scripts, run .ksh scripts for executing the graphs, and housekeeping scripts (see the housekeeping sketch after this list).
  • Processed the source feed files according to the business logic and NDM'ed the files to target systems.
  • Tuned the Graphs by creating Lookup files, Memory sort and Max-core parameters for maximum usage of cache memory and to enhance the performance.
  • Handled Deployments and Migration of the Code to Prod/Test regions.
  • Involved in various EME data store operations like creating sandbox, code check-in, code checkout, creating project parameters according to the environment settings for this application.
  • Created checkpoints and phases to avoid deadlocks and tested the graphs with sample data.
  • Wrote SQL queries.
  • Wrote Production Handoff Documents.
  • Coordinating with Offshore Team.
  • 24x7 Production Support for ETL jobs for daily, Monthly and Weekly schedules.
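A small housekeeping sketch of the sort mentioned above: compress processed feed files and purge old archives. The directories and the retention window are illustrative assumptions, not the project's actual settings.

```ksh
#!/bin/ksh
# Housekeeping sketch (illustrative): archive processed feeds and purge old
# archives. Paths and retention window are assumptions.
PROCESSED_DIR=/data/processed        # hypothetical processed-feed directory
ARCHIVE_DIR=/data/archive            # hypothetical archive directory
RETENTION_DAYS=30

# Compress processed feed files older than one day and move them to the archive.
find "$PROCESSED_DIR" -type f -name "*.dat" -mtime +1 | while read -r f; do
    gzip "$f" && mv "${f}.gz" "$ARCHIVE_DIR/"
done

# Purge archived files beyond the retention window.
find "$ARCHIVE_DIR" -type f -name "*.gz" -mtime +"$RETENTION_DAYS" -exec rm -f {} \;
```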

Environment: co>op 2.15/3.0.1, GDE 1.14/1.15/3.0.1, Oracle, Erwin, UNIX and HP Quality Center.
