ETL Ab Initio Developer Resume
Boca Raton, FL
SUMMARY
- Around 6 years of professional experience in data warehousing projects, especially in the banking, retail, and healthcare services domains.
- Good experience in developing ACE app configs and Ab Initio Ops Console jobs.
- Extensive experience in developing well-tuned Ab Initio graphs (Generic and Custom) for UNIX environment.
- Extensive experience in business application software development using Ab Initio products such as GDE (3.2.2 and 3.1) and Co>Operating System (3.2.4 and 3.1).
- Good experience in various Ab Initio components such as Reformat, Redefine Format, Join, Rollup, Scan, Join with DB, Update Table, Partition by Expression/Key/Round Robin, Lookup File/Template, Read/Write Separated Values, Batch Subscribe, Publish, Publish to Plan, Gather Logs, Handle Logs/Errors, etc.
- Experience with various Ab Initio parallelism techniques (pipeline, component, and data). Implemented 2-way to 4-way partitioning and de-partitioning in Ab Initio graphs.
- Well versed with phasing and checkpoint concepts of Ab Initio.
- Good Knowledge in optimization techniques such as PDL, dynamic script generation, controlling memory usage, file system layout, component folding and micrographs.
- Good Knowledge in developing Ab Initio Plans and continuous flows.
- Good experience in UNIX shell scripting to bind Ab Initio graphs and execute in sequence.
- Good knowledge in creating and scheduling Autosys jobs to automate ETL processes. Possesses Control-M job scheduling knowledge as well.
- Managed graphs and other components in EME (Enterprise Meta Environment) and experienced with version control tools such as SVN and ClearCase.
- Strong knowledge of data warehousing methodologies and concepts, including star schemas, snowflake schemas, ETL processes, and reporting tools.
- Good experience in development of PL/SQL stored procedures, functions and packages.
- Good experience in development of various types of reports using Actuate Report Designer Professional.
- Quick adaptability to new work environments and new technologies. Have worked on multiple ETL tools and was able to ramp up without training from the tool vendor.
- Possesses excellent analytical, communication, interpersonal, and problem-solving skills for client-facing roles and end-user communications.
- Handled responsibilities including requirement analysis, development, design innovation (providing optimal technical solutions), reviews, client interactions, offshore team anchoring, and onsite-offshore coordination.
- Strong experience in managing quality processes such as process audits and work product audits.
- Strong ability in multi-tasking and adapting to new work opportunities/agile environments.
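The shell-scripted graph sequencing mentioned above can be sketched roughly as follows; all script names and paths here are hypothetical, and the deployed `.ksh` scripts would come from the GDE, not from this resume:

```shell
# Minimal sketch of a wrapper that runs deployed Ab Initio graph scripts
# in sequence, stopping at the first failure. All names are hypothetical.
run_sequence() {
    for script in "$@"; do
        echo "Running ${script}"
        "$script"
        rc=$?
        if [ "$rc" -ne 0 ]; then
            echo "ERROR: ${script} failed with exit code ${rc}" >&2
            return "$rc"
        fi
    done
    return 0
}

# Example invocation (deployed .ksh scripts are produced by the GDE):
# run_sequence ./extract_accounts.ksh ./transform_accounts.ksh ./load_accounts.ksh
```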
TECHNICAL SKILLS
ETL Tools: Ab Initio, Informatica PowerCenter
Languages: SQL, Java, C, Unix Shell Scripting
Databases: Teradata, Oracle, SQL Server
Operating Systems: Unix, Linux, Windows 98/NT/2000/XP, Windows Server 2003/2008
Application Packages: MS Office, MS Outlook (MS Office Suite)
PROFESSIONAL EXPERIENCE
Confidential, Boca Raton, FL
ETL Ab Initio Developer
Responsibilities:
- Met with business groups to understand the business process and gather requirements.
- Extracted and analyzed the sample data from operational systems (OLTP system) to validate the user requirements.
- Participated in data model (Logical/Physical) discussions with Data Modelers and created both logical and physical data models.
- Extensively used the Ab Initio components like Reformat, Join, Partition by Key, Partition by Expression, Merge, Gather, Sort, Dedup Sort, Rollup, Scan, FTP, Lookup, Normalize and Denormalize.
- Responsible for performance tuning of Ab Initio graphs. Wrote UNIX shell scripts for batch scheduling.
- Used Ab Initio features like MFS (8-way partitioning), checkpoints, phases, etc.
- Extensively used Teradata utilities like BTEQ, FastLoad, MultiLoad, and TPump, along with DDL and DML commands (SQL).
- Wrote complex SQL queries based on the given requirements, created a series of Teradata macros for various applications in Teradata SQL Assistant, and tuned Teradata SQL statements using the Teradata EXPLAIN command.
- Created SQL queries and reports against the above data mart for UAT and user reporting. Used SQL features such as GROUP BY, ROLLUP, CASE, UNION, subqueries, EXISTS, COALESCE, NULL handling, etc.
- Involved in post-implementation support, user training, and data model walkthroughs with business/user groups.
- Coded and tested Ab Initio graphs to extract data from Oracle tables and MVS files.
- Collected, analyzed the user requirements of the existing application and designed logical, physical data models.
Environment: Ab Initio (GDE 3.2.2, Co>Operating System 3.2.4), Teradata V14, UNIX, Control-M, UNIX shell scripts.
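As one illustration of the BTEQ usage noted above, a load script could be generated and run from a shell wrapper along these lines; the database, table, and logon values are placeholders, not actual project details:

```shell
# Sketch: write out a BTEQ script that builds a monthly summary table.
# All object names and the logon string are hypothetical placeholders.
cat > load_monthly_summary.bteq <<'EOF'
.LOGON tdprod/etl_user,etl_password;
DATABASE edw_mart;

INSERT INTO monthly_summary
SELECT acct_id
     , EXTRACT(MONTH FROM txn_dt) AS txn_month
     , SUM(txn_amt)               AS total_amt
FROM   daily_txn
GROUP  BY acct_id, EXTRACT(MONTH FROM txn_dt);

.LOGOFF;
.QUIT;
EOF

# Run it (requires Teradata Tools and Utilities on the host):
# bteq < load_monthly_summary.bteq
```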
Confidential, Houston, TX
ETL Ab Initio Developer
Responsibilities:
- Developed Ab Initio graphs in daily and monthly cycles for loading, partitioning, cleansing, and populating data based on legal and business requirements.
- Worked with partition components like Partition by Range, Partition by Round Robin, and Partition by Expression, making efficient use of the multifile system for data parallelism.
- Performed transformations of source data with transform components like Replicate, Denormalize, Redefine Format, Reformat, Filter by Expression, Rollup, etc.
- Used lookups with the Reformat component to fetch matching records for the downstream process.
- Used the Sort component to sort data and the Dedup Sorted component to remove duplicate records.
- Used Rollup component to populate monthly, quarterly and annual summary tables.
- Used Data Parallelism, Pipeline Parallelism and Component Parallelism in Graphs, where huge data files are partitioned into multifiles and each segment is processed simultaneously.
- Worked in a sandbox environment while interacting extensively with the EME to maintain version control on objects, using sandbox features like check-in and check-out.
- Worked with de-partition components such as Gather and Merge to recombine partitioned files after parallel processing of the data.
- Used Ab Initio Components like Sort, Partition, Rollups, Reformat and Merge to build complex graphs.
Environment: Ab Initio (Co>Operating System 3.x, GDE 3.1), DB2, SQL Server 2012, UNIX shell programming
Confidential
ETL Ab Initio Developer
Responsibilities:
- Analysis and Estimation of Business Requirements as per User Specifications.
- Preparation of design documents for business requirements through continual interaction with application stakeholders.
- Converting business requirements into a detailed technical structure for data warehouse projects.
- Creating jobs using Ab Initio as the ETL tool, along with UNIX, Oracle, and DB2.
- Developing Ab Initio graphs for batch processing using Ab Initio components like Input File, Output File, Output Table, Input Table, Update Table, Reformat, Join, Lookup, Join with DB, Merge, Gather, Rollup, Scan, etc.
- Responsible for setting up Repository projects using Ab Initio EME for creating a common development environment that can be used by the team for source code control.
- Involved in code check-in/check-out from EME using the GDE and from UNIX using Ab Initio air commands.
- Creating Ab Initio tags using air commands.
- Responsible for Unit Testing and system testing in development/QA environments.
- Responsible for analyzing, debugging and fixing production issues.
- Providing resolutions to any kind of time critical issues in production according to the SLA.
- Executing Autosys jobs and maintaining JIL in development, QA, and production environments.
- Responsible for production support and maintenance.
- Supported other applications by resolving critical code bugs and analyzing production-critical issues.
Environment: Windows NT/XP, Java, Ab Initio, UNIX, Oracle, DB2, SQL Server 2008.
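The Autosys JIL maintenance mentioned above typically involves job definitions along these lines; this is only a hedged sketch, and every name, path, and schedule below is a hypothetical placeholder:

```shell
# Sketch: generate an Autosys JIL definition for a daily command job that
# runs a deployed Ab Initio graph script after an upstream extract succeeds.
# All job names, paths, and times are hypothetical.
cat > daily_acct_load.jil <<'EOF'
insert_job: daily_acct_load
job_type: c
command: /apps/etl/bin/daily_acct_load.ksh
machine: etl_prod01
owner: etlbatch
start_times: "02:00"
condition: s(daily_acct_extract)
std_out_file: /apps/etl/logs/daily_acct_load.out
std_err_file: /apps/etl/logs/daily_acct_load.err
alarm_if_fail: 1
EOF

# Apply the definition (requires the Autosys client tools):
# jil < daily_acct_load.jil
```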
Confidential
ETL Developer
Responsibilities:
- Analyzed the functional specifications provided by the data architect and created Technical System Design Documents and Source to Target mapping documents.
- Good proficiency in Informatica Data Quality 9.5.1/9.6.1 and Informatica PowerCenter 9.x/8.x/7.x.
- Strong in implementing data profiling, creating scorecards and reference tables, and documenting data quality metrics/dimensions like accuracy, completeness, duplication, validity, and consistency.
- Strong expertise in installing and configuring the core Informatica MDM components. Defined, configured, and optimized various MDM processes including landing, staging, base objects, foreign-key relationships, lookups, query groups, queries/custom queries, cleanse functions, batch groups, and packages using the Informatica MDM Hub Console.
- Converted the data mart from logical to physical design; defined data types, constraints, and indexes; generated the schema in the database; created automated scripts; and defined storage parameters for the database objects.
- Performed Source System Data Profiling using Informatica Data Explorer (IDE).
- Involved in designing staging and data mart environments, and built DDL scripts and reverse-engineered the logical/physical data model using Erwin.
- Extracted data from SAP using Power Exchange and loaded data into SAP systems.
- Informatica Power Exchange for Mainframe was used to read/write VSAM files from/to the Mainframe.
- Performed basic Informatica administration such as creating folders, users, privileges, and deployment groups, and optimizing server settings.
- Designed the audit table for ETL and developed error handling processes for bureau submission.
- Created various UNIX Shell Scripts for scheduling various data cleansing scripts and automated execution of workflows.
- Used Informatica IDQ to do data profiling of the source and check for the accuracy of data using dashboard.
- Managed Change Control Implementation and coordinating daily, monthly releases and reruns.
- Responsible for Code Migration, Code Review, Test Plans, Test Scenarios, Test Cases as part of Unit/Integrations testing, UAT testing.
- Used Teradata utilities such as MultiLoad, FastLoad, and TPump.
- Created BTEQ scripts.
- Used UNIX scripts for automating processes.
Environment: Informatica Power Center, Informatica Developer Client, IDQ, MDM, Power Exchange, SAP, Oracle, PL/SQL, TOAD, SQL SERVER 2005/2008, XML, UNIX, Windows XP, OBIEE, and Teradata.
Confidential
ETL Developer
Responsibilities:
- Analyzed the functional specs provided by the data architect and created technical specs document for all the mappings.
- Worked with Informatica PowerCenter tools - Source Analyzer, Target Designer, Mapping and Mapplet Designer, and Transformation Developer.
- Resolved issues that caused production jobs to fail by analyzing the ETL code and the log files created by the failed jobs on the Informatica server.
- Used Transformations like Lookup, Joiner, Rank and Source Qualifier Transformations in the Informatica Designer.
- Worked on slowly changing dimension (SCD) tables.
- Created complex mappings using Unconnected Lookup, Sorter, Aggregator, Stored Procedure, Dynamic Lookup, and Router transformations to populate target Oracle tables efficiently.
- Tuned Informatica mappings and sessions for optimum performance.
Environment: Informatica PowerCenter, SQL, TOAD for Data Analysts, Business Objects 6.5/XI R2, UNIX, TOAD, PVCS, Teradata, Metadata Management (MDM).