Ab Initio Consultant Resume
SUMMARY
- 3+ years of experience in Information Technology with a strong background in data warehousing, including ETL processes built with Ab Initio and the Enterprise Meta>Environment (EME).
- Strong working experience in data warehouse development using data Extraction, Transformation and Loading (ETL).
- Proficient in understanding business processes/requirements and translating them into technical requirements.
- Good experience across the data warehouse application development life cycle, from requirement gathering through design, implementation, testing and debugging.
- Good experience with source and target definitions, mappings, transformations and graph development.
- Extensively worked with Ab Initio transform components: Sort, Filter by Expression, Reformat, Join, Rollup and Dedup Sorted.
- Well versed in Ab Initio parallelism techniques and implemented Ab Initio graphs using data parallelism.
- Good understanding of check-in and check-out concepts.
- Good knowledge of data warehousing concepts.
- Good technical and client-interaction skills.
- Eager to learn and adapt to new and emerging technologies.
- Good problem-solving, research, development and testing skills.
- Excellent organizational skills and ability to work under pressure.
TECHNICAL SKILLS
ETL Tool: Ab Initio GDE 1.15, 3.0; Co>Operating System 2.15, 3.0
Database: Oracle, Teradata
Scripting Language: Shell Scripting
Operating System: Windows, UNIX.
PROFESSIONAL EXPERIENCE
Confidential
Ab Initio Consultant
Environment: Ab Initio (GDE 3.0/3.1 & Co>Op 3.0/3.1), UNIX.
Responsibilities:
- Involved in understanding the source systems and the business requirements.
- Analyzed the technical design specifications.
- Involved in the deployment process.
- Created low-level designs (LLDs) based on the high-level design (HLD) and prepared implementation plans for the deployment process.
- Prepared RCA documents as part of requirement analysis.
- Developed Ab Initio graphs to process data from source systems: extracting the data, applying business rules to transform it, and creating the final data files later loaded into Teradata.
- Implemented change requests raised by the business across all phases of the project.
- Performed impact analysis for change requests raised after the first release.
- Enhanced existing Ab Initio graphs.
- Developed Ab Initio graphs and tuned them for better performance.
- Worked with multiple source systems to retrieve source data.
- Prepared and executed unit test cases.
- Key designer for reconciliation report generation.
- Analyzed source systems and the existing tactical solution.
- Developed generic Ab Initio graphs reused across different projects.
- Prepared test plans (unit/system/integration) and performed testing.
- Performed SIT/UAT runs for users' data validation.
Confidential
Ab Initio Developer
Environment: Ab Initio (GDE 1.15/3.0 & Co>Op 2.15/3.0), UNIX.
Responsibilities:
- Involved in understanding the source systems and the business requirements.
- Analyzed the technical design specifications.
- Involved in identifying fact and dimension tables.
- Developed databases and created ER relationships based on the business requirements.
- Developed Ab Initio graphs and tuned them for better performance.
- Prepared and executed unit test cases.
- Created low-level design documents and technical specification documents.
- Key designer for reconciliation report generation.
- Performed peer reviews.
- Analyzed source systems and the existing tactical solution.
- Developed generic Ab Initio graphs reused across different projects.
- Prepared test plans (unit/system/integration) and performed testing.
- Performed SIT/UAT runs for users' data validation.
- Answered queries raised by the business during UAT.
- Performed impact analysis for change requests raised after the first release.
Confidential
Ab Initio Developer
Environment: Ab Initio (GDE 1.15 & Co>Op 2.15), UNIX.
Responsibilities:
- Prepared PS documents to aid understanding of the Ab Initio graphs.
- Worked with partition components such as Partition by Key (PBK), Partition by Round-Robin (PBRR) and Partition by Expression (PBE), making efficient use of the multifile system.
- Performed transformations on source data with transform components such as Reformat, Filter by Expression and Rollup.
- Used lookups with the Reformat component to fetch matching records for the downstream process.
- Developed Ab Initio graphs on daily and monthly cycles for loading, partitioning, cleansing and populating data based on legal and business requirements.
- Used the Rollup component to populate monthly, quarterly and annual summary tables.
- Used data, pipeline and component parallelism in graphs, partitioning huge data files into multifiles so each segment is processed simultaneously.
- Worked in the sandbox environment, interacting extensively with the EME to maintain version control on objects through sandbox check-in and check-out.
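The data-parallel pattern described in the bullets above (partition a file into segments, process each segment concurrently, then gather the results) can be sketched with plain UNIX shell tools. This is an illustrative sketch only, not Ab Initio itself; all file names here are hypothetical:

```shell
#!/bin/sh
# Sketch of the partition -> parallel process -> gather pattern.
set -e

# Sample input file: one record per line (stand-in for a large data file).
printf 'a\nb\nc\nd\ne\nf\n' > input.dat

# Partition: split the input into fixed-size segments (like a multifile).
split -l 2 input.dat part_

# Process each partition concurrently (here the "transform" is uppercasing).
for p in part_*; do
  tr 'a-z' 'A-Z' < "$p" > "out_$p" &
done
wait   # barrier: all partitions finished

# Gather: concatenate partition outputs into the final file.
cat out_part_* > output.dat
cat output.dat
```

In a real graph the partitioning would be by key, round-robin or expression, and the per-partition transform would be a Reformat/Rollup rather than `tr`, but the fan-out/fan-in shape is the same.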
