- Over 7 years of professional IT experience in Analysis, Design, Development, Testing and Implementation of various Data Warehousing, Reporting and Software Applications.
- Over 7 years of hands-on experience in Ab Initio (GDE, Co>Operating System).
- Expertise in ETL tools including Ab Initio (GDE, Co>Operating System), DataStage and SQL*Loader for data migration to the Enterprise Data Warehouse.
- 3+ years of strong Teradata experience, including SQL Assistant and BTEQ. Expert knowledge of Teradata V2R5.x/6.x/13.x and of Teradata Parallel Transporter, which includes utilities such as TPump, MultiLoad, FastLoad and FastExport.
- Strong experience developing reusable, generic applications for a high-volume DWH environment (~80 TB).
- Thorough knowledge of EME: check-ins, check-outs, the command-line interface, air commands and dependency analysis.
- Excellent hands-on experience in OLAP application development using Ab Initio, Sybase, Netezza, Oracle and Teradata stored procedures.
- 5+ years of experience developing and maintaining star schemas, fact tables, dimension tables (Type 1, Type 2, Type 3) and surrogate keys.
- Calculated Data Quality metrics such as completeness, accuracy, consistency and stability.
- Hands-on experience analyzing data, data lineage and impact analysis using Metadata Hub.
- Administered management group settings to generate and view reports using Operations Console.
- Good knowledge of the Payment Card Industry, Credit Cards and Insurance domains, with strong experience in the Financial/Banking industry.
- Developed various UNIX shell wrapper scripts to run Ab Initio jobs, and used shell scripts within graphs and plans.
- 4+ years of experience with Ab Initio Conduct>It plans, using various task components to orchestrate applications.
- Experience with Ab Initio EME/sandbox for version control and impact analysis, managing various projects across the organization.
- Excellent knowledge on Business Rule Engine (BRE) and Application Configuration Environment (ACE).
- Extensively used Express>It and Conduct>It.
- Extensive experience developing Ab Initio batch and continuous flow applications.
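A minimal sketch of the kind of UNIX wrapper script used to run Ab Initio jobs; the graph path, log directory and status handling below are hypothetical placeholders, not an actual production wrapper:

```shell
#!/bin/sh
# Hypothetical wrapper for a deployed Ab Initio graph (.ksh).
# Runs the graph, captures its output in a timestamped log and
# reports SUCCESS/FAILURE; paths and names are placeholders.
run_graph() {
    graph=$1
    log_dir=${LOG_DIR:-/tmp}
    log_file="$log_dir/$(basename "$graph" .ksh)_$(date +%Y%m%d%H%M%S).log"
    if sh "$graph" > "$log_file" 2>&1; then
        status=SUCCESS
    else
        status=FAILURE   # a real wrapper would page/mail the on-call group here
    fi
    echo "finished $graph: $status" >> "$log_file"
    echo "$status"
}
```

In practice such a wrapper would also export the Co>Operating System environment (e.g. AB_HOME, sandbox parameters) before invoking the deployed script.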
ETL Tools: Ab Initio (GDE 3.2.5/3.0.x/2.1x, Co>Operating System 3.2.x/2.1x), Pentaho Data Integration 6.01, Informatica PowerCenter 8.x/9.1/9.6.1, DataStage, SQL*Loader and Teradata Parallel Transporter utilities
Database: Teradata 15.00/14.10, Oracle 12c/11g, Oracle Exadata X2/X3/X4 full-rack machines, MS SQL Server 2014/2008 and DB2 UDB
Reporting Tools: Tableau, MicroStrategy 8.1.1/8.0.2/7.x (Desktop, Web, Intelligence Server, Report Services, Narrowcast Server, Enterprise Manager)
Testing Tools: WinRunner, LoadRunner, TestDirector and QuickTest Pro
Operating Systems: Microsoft Windows XP/2003/2000/98/NT, IBM AIX 4.2, Red Hat Linux 8, IBM OS/390 V2.8, Sun Solaris 8 and DOS
Languages: Perl, SQL, PL/SQL, Transact-SQL, UNIX shell scripting, HTML and XML; application servers: JBoss 4.2.x and JBoss 5.1 EAP
Confidential, Riverwoods, IL
Ab Initio Production Support / Developer
- Working as L3 production support for the Payment Data Services, Finance Data Services and Corporate Data Services teams, handling urgent issues as they arise.
- Developed critical business projects from initial requirements gathering through installation in the production environment.
- Guiding an offshore team of six through daily on-call meetings to coordinate development activities.
- Helping the offshore team optimize code by following Ab Initio best practices and tuning database queries wherever possible.
- Worked with the Teradata Unity tool to set up disaster recovery, and helped move the existing data, tables, schemas and configuration from the DR server to the Unity server for optimal disaster recovery.
- Developed complex Unix shell scripts to handle the requirements from the business.
- Served as the primary production issue resolver on alternating weeks.
- Developed MapReduce programs to parse the raw data, populate staging tables and store the refined data in the partitioned tables in the EDW.
- Created Hive queries that helped market analysts spot emerging trends by comparing fresh data with EDW reference tables and historical metrics.
- Provided design recommendations and thought leadership to sponsors/stakeholders that improved review processes and resolved technical problems.
- Enabled speedy reviews and first-mover advantage by using Oozie to automate data loading into the Hadoop Distributed File System and Pig to pre-process the data.
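The trend-spotting Hive queries mentioned above can be sketched as a parameterized shell step; the table and column names (daily_txn, ref_merchant) are invented for illustration, and the query is printed rather than submitted to Hive:

```shell
#!/bin/sh
# Hypothetical: build a HiveQL statement comparing fresh daily data
# with an EDW reference table; all table/column names are invented.
build_trend_query() {
    load_date=$1
    cat <<EOF
SELECT r.category, COUNT(*) AS txn_cnt
FROM daily_txn d
JOIN ref_merchant r ON d.merchant_id = r.merchant_id
WHERE d.load_date = '$load_date'
GROUP BY r.category;
EOF
}
# A real job would submit it with: hive -e "$(build_trend_query 2016-01-31)"
```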
Environment: Ab Initio 3.2.5 with Co>Op 3.2.x, Teradata 15.00, Teradata utilities, Oracle 12c, SecureCRT (UNIX), Tivoli Workload Scheduler (TWS), Windows 7/10
Confidential, Atlanta, GA
Sr. Ab Initio Developer
- Responsible for interacting with the business team to understand the business process and gather requirements. Developed and documented a high-level conceptual data process design for the projects.
- Responsible for checking the data lineage, data profiling and data management for Risk application.
- Worked towards the optimization of data by following the best practices for data quality.
- Responsible for managing technical resources (onshore and offshore) within the project schedule.
- Used 14-way multifiles for processing millions of records, and used MFS and serial lookup files for lookups.
- Specified the application's business rules and other metadata in Express>It.
- Tested the application with test data in Express>It to ensure that features are configured as expected.
- Excelled at guiding and managing technical resources within the project schedule, scheduling status meetings to collect updates and discuss ongoing issues and blockers.
- Continued to make changes to the application in Express>It and published the application from Express>It for promotion or release.
- Responsible for updating project manager regarding status of development efforts.
- Liaised with business and functional owner during risk engineering and high-level review sessions to derive and execute action plans, meeting deadlines and standards.
- Handled the tasks of identifying system deficiencies and implementing effective solutions
- Handled the responsibilities of managing technical risks throughout the project.
- Drove measurable improvements across the business as a result of IT implementations.
- Created cost-benefit analyses and ROI assessments that were used as the basis for decision-making on proposed IT implementation projects.
- Created High-Level Design (HLD) and Detailed Design (DD) documents in the initial stages of each project.
- Developed graphs using Ab Initio components and applied Ab Initio parallelism and MFS techniques with conditional components and conditional DML, improving overall performance by optimizing execution time.
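A multifile system like the 14-way layout mentioned above is created with the Co>Operating System's m_mkfs utility (a control directory followed by one data directory per partition). The sketch below only assembles the command for hypothetical directories as a dry run rather than executing it:

```shell
#!/bin/sh
# Hypothetical dry run: assemble an m_mkfs command for an N-way
# multifile system; the directory names are placeholders.
make_mfs_cmd() {
    ways=$1
    ctrl_dir=$2
    cmd="m_mkfs $ctrl_dir"
    i=1
    while [ "$i" -le "$ways" ]; do
        cmd="$cmd ${ctrl_dir}_p$i"   # one data partition directory per way
        i=$((i + 1))
    done
    echo "$cmd"
}
```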
Environment: Ab Initio 3.2.5.x with Co>Op 3.2.6, Oracle, DB2, UNIX, MS Office, Mainframe, Windows 7, CA7
Confidential, Chicago, IL
- Developed ETL processes for extracting data from Oracle and DB2 sources and loading DB2 pureScale and UDB tables.
- Worked end to end on the project, from requirements through implementation and PIV.
- Used XML components such as XML Combine and XML Split to create various business records using XSDs.
- Developed graphs using the RPC component and web processes for creating and retrieving the hash dynamically using a continuous graph.
- Good experience using Metadata Hub for browsing data and adding metadata to the hub.
- Integrated Exadata monitoring into the existing IT monitoring framework; provided and implemented performance and tuning recommendations for all components of the Exadata machines.
- Duplicated all non-production databases on the Exadata server according to the data warehouse development project's requirements, and helped compress large-volume partitioned tables using Oracle Hybrid Columnar Compression (HCC).
- Developed shell scripts within start and end scripts to send communications to users and meet various business needs for weekly and daily jobs.
- Good experience with air commands, m_* commands and UNIX commands for day-to-day activities, and with creating tags and save files for code promotion.
- Good experience using most Ab Initio components, such as Rollup, Scan, Normalize, Reformat, Multi Update, and the Partition and Repartition components, as well as creating common functions and working with vectors.
- Used Control-M as a job scheduler to automate the daily and monthly jobs.
- Used Begin and End Transaction components for processing referential integrity (RI) tables.
- Used 4-way multifiles for processing millions of records, and used MFS and serial lookup files for lookups.
- Have worked with and have good knowledge of continuous flows.
- Good experience loading DB2 tables on the mainframe and DB2 pureScale tables, and loading Oracle tables in API and utility modes.
- Developed an ETL application for extracting data from DB2 sources and pushing files via Secure Transfer to various third-party vendors.
- Worked on enhancements to the existing ETL application, tuned ETL jobs for better performance, and moved the ETL apps from batch jobs to hourly refreshes.
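The user-notification start/end scripts described above can be sketched as follows; the job name, wording and mail step are placeholders (the actual mailx call is left as a comment):

```shell
#!/bin/sh
# Hypothetical end-script step: build the notification text sent to
# users after a daily/weekly job; names and wording are placeholders.
notify_text() {
    job=$1
    rc=$2
    if [ "$rc" -eq 0 ]; then
        echo "$job completed successfully at $(date '+%Y-%m-%d %H:%M')"
    else
        echo "$job FAILED (rc=$rc); support has been notified"
    fi
    # A real script would pipe this into: mailx -s "$job status" "$recipients"
}
```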
Environment: Ab Initio GDE 184.108.40.206 (Batch, Continuous), Co>Op 220.127.116.11, AIX, EME 18.104.22.168, Oracle 11g Enterprise Edition (64-bit), DB2 pureScale, DB2 on Mainframes V10.5, Windows 7 Professional, Control-M, MS SharePoint, Agile methodology (Rally) and HP Quality Center.
Data Warehousing Developer
- Scheduled status meetings to collect status updates and discuss ongoing issues and blockers.
- Developed continuous flow applications using Ab Initio continuous components such as Subscribe, MQ Subscribe, Multi Publish, Publish, Continuous Join with DB and Continuous Update Table.
- Developed Conduct>It plans and sub-plans by integrating graphs and scripts.
- Extensively used Teradata and its tools to extract and load data.
- Created numerous MultiLoad and FastLoad scripts for loading data into Teradata.
- Extensively applied Ab Initio parallelism and MFS techniques with conditional components and conditional DML, improving overall performance by optimizing execution time.
- Designed and developed a process for extracting the data from All Bridge Data Sources.
- Developed the technical design document and the field mapping documents for various sources, embedding all system constraints.
- Designed various Oracle tables at different business levels, such as detail level, agent level and leader level.
- Developed a generic process for versioning records using SCD Type 2, and extensively used various metaprogramming and PDL functions.
- Constructed various control-balancing methodologies at source and target level, balancing between files and between file and table.
- Extensively used multifile partitions (8-way) and various multifile functions.
- Used various lookup functions, such as lookup interval, lookup local, lookup match, lookup count and lookup next.
- Developed various wrappers, from start job to end job, and developed PL/SQL scripts to load data into Oracle tables using parallel hints.
- Used various Ab Initio performance techniques to improve graph performance and eliminate overheads.
- Developed generic ETL graphs for loading data to Oracle DB tables from load-ready files; good experience building surrogate keys in serial and multifile layouts and chaining records.
- Good experience using air commands, m commands and UNIX commands.
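A FastLoad script of the kind mentioned above can be generated from a shell template; the logon string, table, columns and data file below are illustrative placeholders, and the script is printed rather than piped into the fastload utility:

```shell
#!/bin/sh
# Hypothetical: emit a Teradata FastLoad control script for a simple
# two-column pipe-delimited load; logon, table and file names are
# placeholders ($TD_PWD would hold the real password).
gen_fastload() {
    tbl=$1
    datafile=$2
    cat <<EOF
LOGON tdprod/loaduser,\$TD_PWD;
BEGIN LOADING $tbl ERRORFILES ${tbl}_e1, ${tbl}_e2;
SET RECORD VARTEXT "|";
DEFINE acct_id (VARCHAR(18)), txn_amt (VARCHAR(20)) FILE=$datafile;
INSERT INTO $tbl VALUES (:acct_id, :txn_amt);
END LOADING;
LOGOFF;
EOF
}
# In practice: gen_fastload stg.daily_txn /data/daily.txt | fastload
```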
Environment: Ab Initio GDE 3.2.2, Co>Op 3.2.2, Linux, EME 22.214.171.124, Oracle 11.2 Enterprise Edition, Windows 7 Professional, IBM Tivoli, MS SharePoint, SAP BO and HP Quality Center.
Data Warehousing Consultant
- Prepared ETL design documents, evaluated and recommended ETL solutions to meet user requirements, and defined and implemented solutions to load and extract data from source systems.
- Received data from sources and cleansed, transformed and loaded it into target databases using the Ab Initio ETL tool.
- Developed shell scripts to customize Ab Initio graphs built in the GDE graphical interface, processing multiple queries on large data sets.
- Responsible for replicating operational tables and transforming and loading data into operational data stores using ETL techniques.
- Designed and created several scripts to load the staging/target tables.
- Configured the Ab Initio environment to connect to the Teradata database using db config and the Input Table and Output Table components.
- Responsible for scheduling and exception-handling routines for the designed ETL processes.
- Modified Ab Initio component parameters and utilized data parallelism, improving overall performance by optimizing execution time.
- Extensive experience working with branches in EME and coordinating with other teams to merge them.
- Designed and developed Ab Initio graphs for large projects in the Retention Marketing domain.
- Worked on modified DML and supported graph changes for the UNIX-to-Linux migration.
- Worked on the migration of Ab Initio based applications.
- Developed Ab Initio graphs for fast-track ad-hoc projects in marketing and legal support.
- Used Teradata utilities such as BTEQ, MultiLoad and FastLoad.
- Conducted and participated in design reviews to support Best Practices.
- Analyzed financial data reports intended for executive consumption.
- Debugged shell scripts and Perl code that worked in conjunction with AutoSys.
- Involved in creating Ab Initio graphs to extract, transform and load data from mainframes.
- Worked extensively with web service components such as Call Web Service.
- Performed data cleansing operations in graphs using data-cleansing functions.
- Configured the Ab Initio environment to connect to the Oracle database using db config and the Input Table and Output Table components.
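A batch BTEQ run like those referenced above is typically driven from a shell heredoc; the logon, table and return-code convention below are placeholders, and the generated script is printed rather than fed to bteq:

```shell
#!/bin/sh
# Hypothetical: build a BTEQ input script that logs on, runs a row
# count and quits with a nonzero code on error; names are placeholders.
gen_bteq() {
    tbl=$1
    cat <<EOF
.LOGON tdprod/etl_user,\$TD_PWD
SELECT COUNT(*) FROM $tbl;
.IF ERRORCODE <> 0 THEN .QUIT 8
.QUIT 0
EOF
}
# In practice: gen_bteq edw.customer | bteq
```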
Environment: Oracle 8i/9i/10g, ERWIN, Ab-Initio 1.14.15 GDE with Co>op 2.13.1 patch level 40 built on AIX5-2-n32mt, Unix - Solaris, HP superdome/RP 8400, Perl, Brio/Hyperion, Windows NT/2000, XML, COBOL, DB2, RCS Version 5.7, Teradata Utilities (BTEQ, Fastload, Multiload, Tpump).