Ab Initio Developer Resume
Franklin Lakes, NJ
SUMMARY
- Over 7 years of experience in Information Technology, with in-depth knowledge of the Analysis, Design, Development, Testing, and Production/Maintenance of Client/Server (OLTP) and Data Warehouse (DSS) applications.
- 6+ years of experience in the Analysis, Design, and Development of data warehousing solutions, developing strategies for Extraction, Transformation, and Loading (ETL) with Ab Initio across the full lifecycle of building a data warehouse.
- Expertise in all phases of the Systems Development Life Cycle (SDLC), including project definition, analysis, design, coding, testing, implementation, and support, working with both Waterfall and Agile/Scrum methodologies.
- Extensive experience in Data Modeling, Database Design, and Development as a Developer/Analyst using Oracle (8i/9i/10g/11g), Teradata, DB2 UDB, SQL, flat files, Excel files, JCL, and PL/SQL procedures, functions, and packages.
- Extensive work with data warehousing concepts and dimensional modeling, including Star and Snowflake schemas.
- Specialized in ETL methodology supporting Data Analysis, Extraction, Transformation, and Loading in a corporate-wide ETL solution using Ab Initio. Strong experience with the Ab Initio GDE, Co>Operating System, and EME.
- Created complex Ab Initio graphs for data processing, data migration, and data analysis purposes.
- Expertise in the checkpointing/phasing, partition, departition, normalize, sort, replicate, assign keys, rollup, aggregate, dedup, reformat, join, Read/Write Excel, FTP, and other miscellaneous component groups.
- Experience in using EME for version control and Project Management.
- Expertise in all components of the Ab Initio GDE for creating, executing, testing, and maintaining graphs, along with experience using the Ab Initio Co>Operating System for application tuning and debugging strategies.
- Developed several Ab Initio complex graphs for transforming, cleansing & loading Data marts.
- Well versed with various Ab Initio parallelism techniques and implemented Ab Initio Graphs using Component Parallelism, Pipeline Parallelism, Data parallelism and Multi File System (MFS) techniques.
- Improved the performance of Ab Initio graphs using techniques such as lookups and in-memory sorts in joins and rollups, and tuned the max-core parameter for memory optimization.
- Worked with multi files, partitions and joins in Massively Parallel Processing (MPP) environment with very large Databases (VLDB).
- Strong working experience with Teradata utilities (BTEQ, FastLoad, FastExport, MultiLoad, SQL Assistant, PMON), UNIX scripting, PL/SQL, SQL, SQL*Plus, SQL*Loader, Bulk Copy (BCP), stored procedures, functions, and packages.
- Designed and developed load/unload Teradata utility scripts: MLOAD, FASTLOAD, FASTEXPORT.
- Worked with Very Large Databases (VLDBs), massive data volumes, table partitions, tablespaces, and capacity and I/O management.
- Extensive experience writing efficient, well-tuned SQL queries and UNIX shell scripts (K-shell) with awk on Sun Solaris to provide reporting, automate processes, and validate data on DB2, Teradata, and Oracle.
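A minimal sketch of the kind of K-shell/awk data validation described above; the file name, delimiter, and five-field layout are hypothetical examples:

```shell
#!/bin/ksh
# Validate a pipe-delimited extract before loading: check the field count
# and a non-empty key column, then report counts for reconciliation.
# (extract.dat and its layout are hypothetical stand-ins.)

cat > extract.dat <<'EOF'
1001|ACME|2015-01-02|100.50|A
1002|GLOBEX|2015-01-03|200.75|A
1003||2015-01-04|50.00|A
EOF

awk -F'|' '
    NF != 5  { bad_layout++ ; next }   # wrong number of fields
    $2 == "" { bad_key++    ; next }   # missing key column
             { good++ }
    END {
        printf "good=%d bad_layout=%d bad_key=%d\n", good, bad_layout, bad_key
    }
' extract.dat > validate_report.txt
cat validate_report.txt
```

In practice a report like this would be emailed to the team or used to gate the downstream load.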
TECHNICAL SKILLS
ETL Tools: AB INITIO (Co>Operating System 3.1.4/3.0/2.15/2.14/2.13x/2.10x, GDE 3.1.4/3.0/1.15/1.14/1.13x/1.11x/1.10, EME), Informatica (PowerMart/PowerCenter) 8.x
Operating Systems: Windows 2000/2003/Vista/NT 4.0/XP, Sun Solaris (8, 9, 10), Linux, HP-UX, Mainframe
Database Management Systems: Teradata, Oracle (11g, 10g, 9i, 8i), DB2 UDB, SQL Server (2000, 2005, 2008), MS Access, Mainframe
Programming/Scripting Languages: C/C++, PL/SQL, VB 6.0, JavaScript, Perl, PHP, UNIX scripting, HTML, XML, MS Excel & VBA
Others: AutoSys, Erwin, IBM ClearCase/ ClearQuest 7.0.1.1, HP QC, Putty, Edit Plus, WinSCP, GIS, File Zilla
PROFESSIONAL EXPERIENCE
Confidential, Franklin Lakes, NJ
Ab Initio Developer
Responsibilities:
- Participate in meetings with the business and business analysts to gather information and requirements.
- Review mapping documents, implement the business logic embedded in them in Ab Initio graphs, load/unload tables needed for data validation, and create summary reports for business auditing purposes.
- Design and develop Ab Initio graphs to FTP data from the mainframe server to the UNIX server, transform the data, load it into the Teradata database, and send reports/status per business requirements.
- Created summary tables and Excel files using the Normalize, Denormalize, Rollup, Scan, and Read/Write Excel components.
- Frequently use components such as reformat, join, rollup, scan, dedup, sort, dedup sorted, filter by expression (FBE), partition by round robin, partition by key, merge, gather, concatenate to develop Ab Initio graphs.
- Use components like normalize and denormalize to develop ETL transformation logic.
- Also used components such as run program, run sql, read/write Excel, partition by expression, replicate, xml split, call web service, input table, output table, leading records, and ftp from/to.
- Modified Ab Initio parameters and utilized data parallelism to improve overall performance and fine-tune execution times.
- Reduced run times of existing Ab Initio graphs by avoiding unnecessary in-memory sorts and phase breaks, using high parallelism for high-volume data, and optimizing the use of sort components that break pipeline parallelism.
- Developed an Ab Initio plan that calls a framework graph with configurable parameters and performs loop operations.
- Perform Unit Testing and coordinate with QA/QC team and Business analyst for functional, regression and User Acceptance testing.
- As an SME for the ETL process, prepare ETL design documents and SOPs, perform knowledge transfer to the offshore team and new recruits, and resolve database and Ab Initio code-related issues.
- Also work with Claim Processing Tools (CRT), MNM Screen, ESD, and Mainframe utilities JCL, TSO, CICS, DB2.
- Use advanced Excel functions and write macros to analyze complex database defects and generate mass SQL to fix the issues.
- Analyze, test, fix, and perform root cause analysis (RCA) of production defects that directly impact medical claims processing; also modify Ab Initio graphs when code bugs are discovered.
- Perform code testing/peer review and code migration to production, and frequently coordinate with the DBA for data refreshes and data model changes across different DB2 environments.
- Developed wrapper scripts to automate the daily incremental process and send the status report to business.
- Used EME air commands such as air run, air tag create, air save, air load, air project import/export, etc.
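The daily-incremental wrapper scripts mentioned above can be sketched as follows; the graph name, paths, and distribution list are hypothetical, with a stand-in function in place of the deployed graph's .ksh and the mailx step shown commented out:

```shell
#!/bin/ksh
# Daily-incremental wrapper sketch: run a deployed graph, capture its return
# code, and build a status report for the business.  run_graph is a stand-in
# for invoking the real deployed script (e.g. /prod/bin/daily_claims.ksh).

RUN_DATE=$(date +%Y%m%d)
LOG=run_${RUN_DATE}.log
REPORT=status_${RUN_DATE}.txt

run_graph() {                 # hypothetical stand-in for the deployed graph
    echo "rows loaded: 12345"
    return 0
}

echo "START $(date)" > "$LOG"
run_graph >> "$LOG" 2>&1
RC=$?
echo "END $(date) rc=$RC" >> "$LOG"

{
    echo "Daily incremental load status for $RUN_DATE"
    if [ $RC -eq 0 ]; then
        echo "STATUS: SUCCESS"
    else
        echo "STATUS: FAILED (rc=$RC)"
    fi
    grep "rows loaded" "$LOG"
} > "$REPORT"
# mailx -s "Daily load $RUN_DATE" dl-business@example.com < "$REPORT"
```

The return-code check is what lets a scheduler such as CA-7 or AutoSys treat the wrapper as a single success/failure job.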
Environment: Ab Initio (GDE 3.1.4, Co-Op 3.1.4), DB2, Teradata, UNIX Shell (ksh), CA-7, AutoSys, Queryman, Rumba, SharePoint
Confidential, Hartford, CT
Ab Initio Developer
Responsibilities:
- Worked in various Ab Initio components to transform and load data into warehouse tables using Ab Initio GDE.
- Automated the ETL process and filtered out header and trailer records to implement source balancing in the graphs and generate control-balancing reports.
- Developed graphs in the GDE with components such as partition by round robin, partition by key, rollup, sort, scan, dedup sorted, reformat, join, merge, gather, concatenate, call web service, and xml split.
- Designed and developed multiple graphs to load and unload all the data needed from different source databases by configuring the dbc file in the Input Table component.
- Used the run program and run sql components to run UNIX and SQL commands within Ab Initio.
- Also used the filter by expression, partition by expression, replicate, and partition by key components.
- Worked with partition and departition components such as Gather and Interleave to departition and repartition data in the Multi File System as needed.
- Performed transformations of source data with transform components such as Join, Match Sorted, Dedup Sorted, Reformat, and Filter by Expression; created summary tables using Rollup, Scan, and Aggregate.
- Modified Ab Initio parameters and utilized data parallelism to improve overall performance and fine-tune execution times.
- Improved the performance of Ab Initio jobs by avoiding unnecessary phase breaks and optimizing the use of sort components that break pipeline parallelism.
- Developed wrapper scripts and Korn shell scripts to perform different routine jobs and scheduled processes using AutoSys.
- Used Data Parallelism, Pipeline Parallelism and Component Parallelism in Graphs, where huge data files are partitioned into multi-files and each segment is processed simultaneously.
- Developed ETL design documents as well as materialized views for business users to generate reports.
- Coordinated with the QC testing team for the functional, regression and integrated testing.
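The header/trailer source-balancing check described above can be sketched outside Ab Initio with awk; the HDR/DTL/TRL layout and file names are hypothetical examples:

```shell
#!/bin/sh
# Control-balancing sketch: strip the header and trailer from a delimited
# feed, keep the detail records, and verify that the detail count matches
# the record count carried in the trailer.

cat > feed.dat <<'EOF'
HDR|20150101
DTL|1001|100.00
DTL|1002|250.00
DTL|1003|75.25
TRL|3
EOF

awk -F'|' '
    $1 == "DTL" { n++; print > "details.dat" }   # keep detail records
    $1 == "TRL" { expected = $2 }                # trailer carries the count
    END {
        status = (n == expected) ? "BALANCED" : "OUT_OF_BALANCE"
        printf "details=%d trailer=%d %s\n", n, expected, status
    }
' feed.dat > balance_report.txt
cat balance_report.txt
```

An OUT_OF_BALANCE result would abort the load before the warehouse tables are touched.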
Environment: Ab Initio (GDE 3.0.5, 3.0, Co-Op 3.0.5, 3.0), Oracle 11g, Teradata, UNIX Shell (ksh), AutoSys, HP-UX, AQT
Confidential, Wilmington, DE
Ab Initio Developer
Responsibilities:
- Worked on the design of the MDD (Minor Development Document) from business requirements and information gathered from the business.
- Designed and deployed several Ab Initio graphs from the low level design document.
- Developed daily Ab Initio jobs for daily transactional data, automated with UNIX shell scripts.
- Developed Ab Initio framework graphs, reusing the same graph to load data into the same dimension tables in different marts.
- Worked on the deployment plan to be provided for production before the release of the project.
- Used subgraphs and graph-level and project-level parameters wherever required in the graphs.
- Worked with PSETs to execute the same job for different target systems with different sets of parameter values.
- Developed an Ab Initio plan that will call framework graph with configurable parameters.
- Responsible for creating test cases to make sure the data originating from source makes it to the target in the proper format.
- Worked with components such as Reformat, Join, Sort, Input Table, Output Table, Update Table, and Join with DB to develop graphs.
- Improved the performance of Ab Initio jobs by avoiding unnecessary phase breaks and optimizing the use of sort components that break pipeline parallelism.
- Worked with Ab Initio package creation and Hermes tag generation for promoting code to the UAT and Production environments.
- Checked in/checked out existing applications using EME in order to perform the necessary modifications.
- Used different EME air commands in project promotion, such as air tag create, air save, air load, air project export, etc.
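A sketch of the promotion sequence built from these air commands; the tag name, project path, and sandbox directory are hypothetical, and DRY_RUN (on by default here) only prints each command so the sequence can be reviewed before a release:

```shell
#!/bin/ksh
# EME promotion wrapper sketch: tag the project, export a save file, and
# load it on the target EME.  With DRY_RUN=1 each air command is printed
# and logged instead of executed.

DRY_RUN=${DRY_RUN:-1}
TAG="REL_2015_06"                     # hypothetical release tag
PROJECT="/Projects/claims_dw"         # hypothetical EME project path

: > promote.log
run() {
    if [ "$DRY_RUN" -eq 1 ]; then
        echo "DRYRUN: $*" | tee -a promote.log
    else
        "$@"
    fi
}

run air tag create "$TAG" "$PROJECT"
run air save /tmp/${TAG}.save "$PROJECT"
# on the target EME host:
run air load /tmp/${TAG}.save
run air project export "$PROJECT" -basedir /var/abinitio/sand/claims_dw
```

Running with DRY_RUN=0 would execute the same sequence for real against the EME datastore.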
Environment: Ab Initio GDE 3.0, Co-Op 3.0, EME, Sun Solaris, Oracle (9i/10g), SQL Developer, Toad, Windows NT/2000
Confidential, Jersey City, NJ
Ab Initio Developer
Responsibilities:
- Involved in analyzing business needs and documenting functional and technical specifications based on user requirements, with extensive interaction with business users.
- Performed source data profiling and analysis; reviewing data content and metadata facilitated data mapping and validated assumptions made in the business requirements.
- Automated the development of Ab Initio graphs and functions using metadata from the EME; also served as designer and developer for the enterprise data warehouse.
- Developed various Ab Initio graphs for validation using the data profiler, comparing current data with the previous month's data and applying AMS standards.
- Used Different Ab-Initio components like partition by key & sort, dedup, rollup, scan, reformat, join and fuse in various graphs.
- Also used components like run program and run sql components to run UNIX and SQL commands in Ab-Initio.
- Performed transformations of source data with components like join, match sorted, dedup sorted, reformat and FBE.
- Made wide use of lookup files when getting data from multiple sources where the data size is limited.
- Involved in project promotion from development to UAT and from UAT to production.
- Involved in production implementation best practices.
- Implemented data parallelism utilizing MFS in the graphs, dividing data into segments and operating on each segment simultaneously through the Ab Initio partition components.
- Used phases and checkpoints in Ab Initio graphs to facilitate recovery after failures.
- Used different EME air commands in project promotion, such as air tag create, air save, air load, air project export, etc.
Environment: Ab Initio GDE 1.15.7, Co>Operating System 2.15, UNIX, Windows, Teradata, Teradata SQL Assistant v6.5, Oracle
Confidential, New York, NY
Ab Initio Developer
Responsibilities:
- Created Ab-Initio graphs to load large volumes of data, ranging from several GB to TB.
- Used the Ab-Initio EME Website to view graphs, files and datasets and examine the dependencies among objects.
- Extracted data from Oracle and used them to populate Teradata Data Warehouse tables associated with Data Mart.
- Created Korn Shell scripts and cron jobs to refresh the load on weekly basis.
- Developed complex Ab Initio XFRs to derive new fields and meet various business requirements.
- Used inquiry and error functions such as is_valid, is_error, is_defined, and is_null, and string functions such as string_substring, string_concat, and other string_* functions in developing Ab Initio graphs to perform data validation and data cleansing.
- Created test scenarios that were used to validate the Ab-Initio graphs.
- Designed and developed complex Ab-Initio graphs using Aggregate, Join, Rollup, Scan, and lookups to generate consolidated (fact/summary) data identified by dimensions. Developed complex graphs normalizing 3-D data in Excel spreadsheets into 2-D flat files; these flat files were later loaded into Teradata data marts.
- Created various Ab-Initio Multi File Systems (MFS) and used Component Parallelism, Pipeline Parallelism and Data Parallelism technique to run graphs in parallel.
- Used different partition and de-partition components to make graph to run parallel and improve performance of the graph.
- Responsible for cleansing the data from source systems using Ab-Initio components such as reformat and filter by expression.
- Used sub graphs to increase clarity of graph and to impose reusable business restrictions and tested various graphs for their function.
- Developed several partition based Ab-Initio Graphs for high volume data warehouse.
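The sort-and-rollup pattern behind the summary graphs above can be sketched outside Ab Initio with sort and awk; the region/amount layout is a hypothetical example:

```shell
#!/bin/sh
# Rollup sketch: sort detail records on the key, then aggregate per key,
# mimicking Ab Initio's sort + rollup on {region} to build a summary file
# with a count and a sum per group.

cat > detail.dat <<'EOF'
EAST|100
WEST|50
EAST|25
WEST|75
EAST|10
EOF

sort -t'|' -k1,1 detail.dat |
awk -F'|' '
    $1 != key {                        # key change: emit finished group
        if (key != "") printf "%s|%d|%d\n", key, cnt, sum
        key = $1; cnt = 0; sum = 0
    }
    { cnt++; sum += $2 }
    END { printf "%s|%d|%d\n", key, cnt, sum }
' > summary.dat
cat summary.dat
```

Like an Ab Initio rollup, this relies on the input being sorted on the group key, which is why the sort precedes the aggregation.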
Environment: Ab Initio GDE 1.14, Co-Op 2.14, UNIX, Windows, PL/SQL, Oracle 10g, Teradata V2R5
Confidential
Software Developer
Responsibilities:
- Worked with various clients to gather business requirements for software and data warehouse development, testing, and administration. Designed, developed, and tested per client requirements using test-driven development techniques.
- Worked in the full Software Development Life Cycle, including project definition, analysis, LOE estimation, design, coding, testing, implementation, and support per test-driven development standards.
- Developed full life-cycle databases, including data profiling, logical/physical data design, data modeling, and implementation/administration for various clients.
- Performed technical research and prepared detailed reports concerning project specifications and activities.
- Developed user web-interface programs and software to manage a customer banking system using VB 6.0, JavaScript, XML, web services, and SQL Server 2000.
- Configured and maintained Windows Server 2003 to host Java/J2EE, ASP.NET, C#/.NET, and PHP-based web applications.
- Created database objects including tables, indexes, clusters, sequences, roles, and privileges; also designed and implemented backup & recovery strategies.
- Wrote UNIX/Linux scripts to run batch processes, automate tasks, and send status emails and summary reports.
Environment: VB 6.0, UNIX, DHTML, XML, PHP, JavaScript, C/C++, FORTRAN, SQL Server 2000, Windows Server 2003.
