Senior Ab Initio ETL Developer Resume
Cleveland, OH
SUMMARY:
- Around 5 years of IT experience in the design, analysis, development, and implementation of Decision Support Systems and Data Warehousing applications, including integration and migration of Extract, Transform and Load (ETL) processes using Ab Initio software.
- Hands-on experience in the development of data warehouses/data marts using Ab Initio Co>Op, GDE, the component library, Oracle and UNIX, mainly for the Banking/Financial/Insurance industries.
- Solid experience with Extraction, Transformation and Loading (ETL) mechanisms using Ab Initio. Knowledge of the full development life cycle for building a data warehouse.
- Well versed with various Ab Initio components such as Round Robin, Join, Rollup, Partition by Key, Gather, Merge, Interleave, Dedup Sorted, Scan, Validate, and FTP.
- Experience with Ab Initio Co Operating System, application tuning and debugging strategies.
- Proficient with various Ab Initio parallelism and Multifile System (MFS) techniques.
- Good understanding of Ab Initio sandbox and graph parameters for making graphs generic.
- Configured the Ab Initio environment to connect to databases using .dbc configuration files and the Input Table, Output Table, and Update Table components.
- Worked on different slowly changing dimension models: SCD Type 1, Type 2 and Type 3.
- Developed numerous Ab Initio jobs to extract data from mainframe datasets onto the UNIX server.
- Experience in data profiling and data quality.
- Experienced in writing SQL queries and PL/SQL procedures, triggers, and functions necessary for maintenance of the databases in the data warehouse development lifecycle.
- AWS Certified Solutions Architect with proficiency in cloud services such as EC2, S3, CloudFront, Elastic Beanstalk, SQS, Elastic Load Balancer (Classic/Application), Auto Scaling, RDS, VPC, Route 53, CloudWatch and IAM.
- SQL/database developer experience in fine-tuning queries; wrote several SQL queries for ad hoc reporting.
- Worked on wrapper scripting with UNIX shell programming and scheduling of Ab Initio jobs with AutoSys.
- Ability to coordinate effectively with development teams, business partners, end users and management.
- Experience in work environment with a mix of onsite-offshore Global Delivery model.
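The wrapper-scripting and AutoSys scheduling work above can be sketched as a minimal job wrapper: log the run, execute the job command, and hand the exit status back to the scheduler. The log path and demo command below are illustrative placeholders, not taken from any actual project.

```shell
#!/bin/sh
# Minimal job-wrapper sketch: log start/end around a job command and
# propagate its exit status to the scheduler (e.g. AutoSys).
# LOG_FILE and the demo command are illustrative placeholders.
LOG_FILE="${LOG_FILE:-/tmp/etl_wrapper.log}"

run_job() {
    job_cmd="$1"
    echo "START $(date '+%Y-%m-%d %H:%M:%S') $job_cmd" >> "$LOG_FILE"
    sh -c "$job_cmd" >> "$LOG_FILE" 2>&1
    rc=$?
    echo "END rc=$rc" >> "$LOG_FILE"
    return $rc
}

# Demo: stand-in for a deployed graph .ksh script
run_job "echo graph-ran" && echo "wrapper: success"
```

In practice the scheduler would call such a wrapper with the deployed graph's .ksh path; a non-zero return code lets AutoSys mark the job failed and raise an alarm.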
TECHNICAL SKILLS:
ETL Tools: Ab Initio GDE 1.15/3.1.5, Co>Operating System (2.15 to 3.0)
Databases: Oracle, Microsoft SQL Server, IBM DB2.
Operating System: UNIX, Windows 7, Windows XP.
Languages: SQL, PL/SQL, Unix shell scripting.
Scheduling Tools: AutoSys, Tivoli workload scheduler, and Control-M
Version Control Tools: EME, Git.
Defect Tracking Tool: HP-Mercury Quality Center
Data Modeling: Star Schema and Snowflake schema, Erwin tool.
PROFESSIONAL EXPERIENCE:
Senior Ab Initio ETL Developer
Confidential, Cleveland, OH
Responsibilities:
- Extracted data from various sources such as databases and delimited flat files.
- Extensively used Ab Initio components such as Reformat, Scan, Rollup, Join, Sort, Partition by Key, Normalize, Input Table, Output Table, Update Table, Gather Logs and Run SQL for developing graphs.
- Implemented procedures for management of Ab Initio applications and Citrix servers.
- Developed and enhanced Ab Initio graphs and business rules using Ab Initio functions.
- Performed validations, data quality checks and Data profiling on incoming data.
- Generated configuration files, DML files and XFR files for specific record formats, which are used in components for building graphs in Ab Initio.
- Knowledge in translating business requirements into workable functional and non-functional requirements at detailed production level using Workflow Diagrams, Sequence Diagrams, Activity Diagrams and Use Case Modeling.
- Developed various BTEQ scripts to create business logic and process the data using Teradata database.
- Involved in database design and the creation of schemas and tables in normalized form.
- Extensively used MultiLoad and FastLoad utilities to populate flat-file data into the Teradata database.
- Performed evaluations and made recommendations for improving graph performance: minimizing the number of components in a graph, tuning the MAX CORE value, using Lookup components instead of Joins for small tables and flat files, and filtering data at the beginning of the graph.
- Extensively used multifile management commands such as m_ls, m_wc, m_dump, m_cp and m_mkfs.
- Responsible for deploying Ab Initio graphs and running them through the Co>Operating System's mp shell command language, and for automating the ETL process through scheduling.
- Generated SQL queries for data verification and back-end testing; detected and corrected data quality issues.
- Worked on data mapping from source to target and on data profiling to maintain data consistency. Checked the data flow from front end to back end and used SQL queries to extract data from the database and validate it at the back end.
- Wrote stored procedures and packages on the server side and developed libraries.
- Performed Data Profiling to assess the risk involved in integrating data for new applications, including the challenges of joins and to track data quality.
- Wrote UNIX scripts to perform routine tasks and assisted developers with problems and SQL optimization.
- Implemented phasing and checkpoint approach in ETL process to prevent data loss and to maintain uninterrupted data flow against process failures.
- Automated the complete daily, weekly and monthly refreshes using custom-built UNIX shell scripts.
- Worked with production support team to debug issues in production and for migrations from pre-prod to production.
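The BTEQ scripting described above is typically driven from a shell wrapper that writes out the script and submits it to Teradata. A minimal sketch follows; the database, table and logon names are hypothetical, and the script is only written to disk here rather than submitted to bteq.

```shell
#!/bin/sh
# Sketch: generate a BTEQ script from a shell wrapper, as used for
# Teradata processing. Database/table names and the logon (tdprod,
# etl_user, placeholder password) are hypothetical; in a real job the
# credentials would come from a protected logon file, and the script
# would then be piped to the bteq utility.
BTEQ_SCRIPT="${BTEQ_SCRIPT:-/tmp/load_customer.bteq}"

cat > "$BTEQ_SCRIPT" <<'EOF'
.LOGON tdprod/etl_user,placeholder_password;
.SET ERRORLEVEL UNKNOWN SEVERITY 8;

/* Business logic: refresh the daily summary from the staging table */
DELETE FROM dw.customer_daily WHERE load_dt = CURRENT_DATE;
INSERT INTO dw.customer_daily (cust_id, txn_amt, load_dt)
SELECT cust_id, SUM(txn_amt), CURRENT_DATE
FROM   stg.customer_txn
GROUP  BY cust_id;

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF
echo "wrote $BTEQ_SCRIPT"
```

The `.IF ERRORCODE <> 0 THEN .QUIT 8` line is what lets the calling shell script, and in turn the scheduler, detect a failed load.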
Environment: Ab Initio (GDE 3.1, Co>Op Sys 3.1), UNIX, SQL, IBM DB2, Control-M, Teradata V2R6 (FastLoad, MultiLoad, FastExport), BTEQ.
Senior Ab Initio ETL Developer
Confidential, Cleveland, OH
Responsibilities:
- Extensively used the Ab Initio ETL tool in designing and implementing Extract, Transform and Load processes. Different Ab Initio components were used effectively to develop and maintain the database.
- Understood business requirements through extensive interaction with users and reporting teams, and assisted in developing the low-level design documents.
- Provided support for SQL Server, MySQL, Oracle, Data Guard and DB2 databases while coordinating with database teams.
- Worked in a sandbox environment while extensively interacting with EME to maintain version control on objects. Sandbox features like check-in and checkout were used for this purpose.
- Maintained locks on objects while working in the sandbox to prevent conflicting changes.
- Used inquiry and error functions such as is_valid, is_defined and is_error, and string functions such as string_substring, string_concat and other string_* functions, in developing Ab Initio graphs to perform data validation and data cleansing.
- Developed Complex XFRs to derive new fields and solve various business requirements.
- Converted user defined functions and complex business logic of an existing application process into Ab Initio graphs using Ab Initio components such as Reformat, Join, Transform, Sort, Partition to facilitate the subsequent loading process.
- Wrote and modified several application-specific scripts in UNIX in order to pass the environment variables.
- Used Teradata SQL Assistant front-end tool to issue SQL commands matching the business requirements to Teradata RDBMS.
- Ran scripts through UNIX shell wrappers under batch scheduling.
- Worked with DBA to identify gaps in data modeling, data mapping in Teradata and provide Teradata performance knowledge to users.
- Implemented a 6-way multifile system in the test environment, composed of individual files on different nodes that are partitioned and stored in distributed directories. Used partition components (Partition by Key, Partition by Expression, Partition by Round-robin) to split large data files into multiple data files.
- Used UNIX environment variables in various .ksh files, which contain the specified locations used to build Ab Initio graphs.
- Responsible for deploying Ab Initio graphs and running them through the Co>Operating System's mp shell command language, and for automating the ETL process through scheduling.
- Involved in monitoring the jobs and schedules through Maestro and Autosys Scheduler.
- Participated in agile trainings and meetings as part of the agile team.
- Improved the performance of Ab Initio graphs by using various techniques such as lookups instead of Joins, in-memory Joins, and Rollups to speed up graphs.
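The environment-variable .ksh files described above usually centralize the sandbox locations that deployed graphs resolve at run time. A sketch follows; the variable names mirror common Ab Initio sandbox parameters, but the exact names and paths are project-specific assumptions.

```shell
#!/bin/sh
# Sketch: a project environment file sourced by graph wrapper scripts.
# Variable names (AI_SERIAL, AI_MFS, AI_DML, AI_XFR) and paths are
# illustrative assumptions, not from any actual project.
ENV_FILE="${ENV_FILE:-/tmp/project_env.ksh}"

cat > "$ENV_FILE" <<'EOF'
# Project-level locations consumed by deployed graphs
export PROJECT_DIR=/data/proj/etl
export AI_SERIAL=$PROJECT_DIR/serial   # serial (non-partitioned) files
export AI_MFS=$PROJECT_DIR/mfs         # multifile system root
export AI_DML=$PROJECT_DIR/dml         # record formats
export AI_XFR=$PROJECT_DIR/xfr         # transform functions
EOF

# Wrapper scripts source the file so every graph sees the same locations
. "$ENV_FILE"
echo "AI_MFS=$AI_MFS"
```

Keeping these locations in one sourced file is what makes graphs generic: the same deployed graph runs in development, test and production by switching the environment file.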
Environment: Ab Initio GDE 1.13, Co>Op 2.13, UNIX, PL/SQL, Oracle 8i/9i, Queryman, Windows NT/2000
Ab Initio ETL Developer
Confidential, Long Beach, CA
Responsibilities:
- Study and understand all the functional and technical requirements to better serve the project.
- Developed Ab Initio graphs using different components for extracting, loading and transforming external data into data mart.
- Developed a number of Ab Initio graphs based on business requirements using various components such as Filter by Expression, Partition by Expression, Partition by Round-robin, Reformat, Rollup, Join, Scan, Normalize, Gather, Replicate and Merge.
- Developed Ab Initio graphs for data validation using validate components such as Compare Records and Compute Checksum.
- Implemented Data Parallelism through graphs, by using Ab Initio partition components.
- Documented complete graphs and their components.
- Extensively used UNIX Shell Scripting for writing SQL execution scripts in Data Loading Process.
- Gained Ab Initio design and configuration experience in ETL, data mapping, transformation and loading in complex, high-volume environments, processing data at the terabyte level.
- Checked the data flow from front end to back end and used SQL queries to extract data from the database and validate it at the back end.
- Designed automated reports in SQL Server Reporting Services (SSRS)
- Involved in automating the ETL process through scheduling.
- Used Autosys for scheduling the jobs.
- Debugged and modified shell scripts using edit script and vi editor in UNIX environment in determining various graph paths and run file path for job request engine.
- Responsible for cleansing the data from source systems using Ab Initio components such as reformat and filter by expression.
- Developed psets to impose reusable business restrictions and to improve the performance of the graph.
- Extensively used m_db commands to query the Oracle databases for reporting purposes.
- Developed several partition-based Ab Initio graphs for a high-volume data warehouse.
- Checked the accuracy of data loaded into Teradata and assured the linkage of keys on the tables for various applications.
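A load-accuracy check like the one above often starts as a simple row-count reconciliation between the source flat file and what the load reported. A sketch follows; the file contents and the loaded count are illustrative stand-ins (in a real check the loaded count would come from the database or load log).

```shell
#!/bin/sh
# Sketch: post-load row-count reconciliation. The sample file and the
# hard-coded loaded_count are illustrative; a real check would query
# the target table (or parse the load utility's log) for the count.
SRC_FILE="${SRC_FILE:-/tmp/customers.dat}"

# Build a small sample source file (pipe-delimited, with header)
cat > "$SRC_FILE" <<'EOF'
cust_id|name|balance
101|Smith|250.00
102|Jones|99.50
103|Brown|0.00
EOF

src_count=$(( $(wc -l < "$SRC_FILE") - 1 ))   # exclude header row
loaded_count=3                                 # stand-in for DB count

if [ "$src_count" -eq "$loaded_count" ]; then
    echo "RECONCILED: $src_count rows"
else
    echo "MISMATCH: source=$src_count loaded=$loaded_count"
    exit 1
fi
```

Exiting non-zero on a mismatch lets the scheduler halt downstream jobs before bad counts propagate into the warehouse.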
Environment: Mainframe, Ab Initio 2.14.22 GDE with Co>Op 2.13.1, EME 2.13.1, SQL Server 2005, UNIX, Windows XP, AutoSys.
Ab Initio consultant
Confidential
Responsibilities:
- Involved in High level designing and detailed level designing.
- Developed various Ab Initio graphs based on business requirements using components such as Reformat, Filter by Expression, Run Program, Run SQL, FTP, Rollup, Join, Gather and Replicate.
- Extensively used lookups to increase the performance of graphs.
- Developed various Ab Initio graphs for data cleansing using functions such as is_valid, is_defined, is_error, string_substring, string_concat and other string_* functions.
- Familiar with the sandbox concept for the check-in and checkout process.
- Used the Ab Initio Web Interface to Navigate the EME to view graphs, files and datasets and examine the dependencies among objects.
- Wrote stored procedures and packages on the server side and developed libraries.
- Used FTP components to fetch GDG files from mainframe systems.
- Wrote wrapper scripts for date-logic implementation and the file-archiving process.
- Tested Ab Initio graphs in development and migration environments using test data, fine-tuned the graphs for better performance, and migrated them to the production environment.
- Created Sub Graphs to impose application/business restrictions.
- Experience in using sprint-developed application tools for preload, load and post-load into partitioned tables.
- Performed database tuning, backup and recovery, and general day-to-day running of the system.
- Involved in a hardware project changing the DB server from Sun to AIX, reducing the project's footprint and improving its efficiency and performance.
- Involved in the Ab Initio Co>Op version upgrade from 2.11 to 2.12.
- Expertise in handling Pipeline Jobs.
- Supported production and maintained nightly jobs.
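The date-logic and file-archiving wrapper scripts mentioned above can be sketched as a small script that stamps processed feed files with the run date and moves them into an archive directory. Directory names and feed files below are illustrative placeholders.

```shell
#!/bin/sh
# Sketch: date-stamp processed feed files and move them to an archive
# directory, as a wrapper script would after a successful load.
# LANDING/ARCHIVE paths and the feed files are illustrative.
LANDING="${LANDING:-/tmp/landing}"
ARCHIVE="${ARCHIVE:-/tmp/archive}"
RUN_DATE=$(date +%Y%m%d)

mkdir -p "$LANDING" "$ARCHIVE"
touch "$LANDING/feed_a.dat" "$LANDING/feed_b.dat"   # stand-in input feeds

for f in "$LANDING"/*.dat; do
    base=$(basename "$f" .dat)
    mv "$f" "$ARCHIVE/${base}.${RUN_DATE}.dat"
done
echo "archived feeds for $RUN_DATE"
```

Embedding the run date in the archived filename gives the nightly jobs an audit trail and makes reprocessing a specific business date straightforward.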
Environment: Ab Initio (GDE 1.13.9, Co>Op 2.12), Oracle 9i, Teradata V2R5, AIX 4.0, MS Visio, EME, Korn shell scripts.