Senior Ab Initio (ETL) Developer Resume
SUMMARY
- Around 9 years of experience in designing, developing, and maintaining large business applications, including integration, conversion, testing, and data migration.
- 9 years of hands-on experience designing and developing applications with the Ab Initio ETL tool.
- 3+ years of experience designing and developing applications in the Hadoop ecosystem.
- Worked extensively on Hadoop migration projects and POCs.
- Migrated Sqoop jobs into Ab Initio ingestion graphs.
- Worked on POCs integrating Ab Initio with AWS S3, and with GCP and BigQuery.
- Good hands-on experience creating generic graphs and psets.
- Created an Ab Initio framework to migrate Sqoop jobs that ingest data from RDBMS to HDFS (see the Sqoop sketch after this list).
- Played a major role in creating another generic Ab Initio framework that generates all objects (artifacts such as DMLs, XFRs, psets, Hive HQLs, etc.) from a source-to-target mapping document.
- Expertise in advanced Ab Initio tools such as Express>It, Acquire>It, and Conduct>It.
- Good experience with scheduling tools such as Control Center and Autosys.
- Good hands-on experience with Unix shell scripting.
- Maintained metadata lineage of end-to-end Ab Initio flows.
- Involved in POCs of Sqoop, Hive, and Spark technologies.
- Hands-on experience using Ab Initio as a data ingestion tool.
- Exposure to design and development of Big Data solutions using Hadoop ecosystem technologies (HDFS, Hive, Sqoop).
- Drove simplification and optimization initiatives to improve application efficiency.
- Held versatile roles across diverse applications, including Data Engineer, Developer, and QA Engineer, as well as automation projects.
- Able to assess business rules, collaborate with stakeholders, and perform source-to-target data mapping, design, and review.
- Handled multiple reads and writes against AWS and the Hadoop data lake (Enterprise Data Lake, EDL).
- Strong database experience with Oracle, DB2, and Hive.
- Familiar with Sqoop export, import, and upsert.
- Familiar with Hive queries and with loading data into Hive tables from files and databases.
- Ability to perform both unit and integration testing along with efficient debugging skills.
- Good hands-on experience with Eclipse and IntelliJ.
- Flexible, enthusiastic, and project-oriented team player with solid communication and leadership skills to develop creative solutions for challenging client needs.
- Excellent knowledge in Data warehousing concepts.
- Worked across all areas of ETL, including development, team leadership, and testing.
- Worked in Agile Methodologies like Scrum and Kanban.
- Good problem-solving and analytical skills, with a drive to innovate and improve.
- Strong interpersonal, communication, and people skills for managing a team.
- Led a team of 12 and met all goals on time.
- Domain experience includes Finance, Insurance, Retail, Travel.
- Guided and trained newcomers to the team.
- Received the GEM award twice and the WOW award three times for performance and leadership.
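The Sqoop-to-Ab Initio migration work above covered ingestion jobs of roughly the following shape. This is a minimal sketch only: the connection string, database, user, table, and HDFS paths are hypothetical placeholders, and the real framework generated the equivalent Ab Initio graphs and psets from metadata rather than running Sqoop directly.

    #!/bin/ksh
    # Illustrative example of the kind of Sqoop ingestion job migrated into
    # generic Ab Initio graphs. All names and paths are placeholders.

    sqoop import \
        --connect "jdbc:oracle:thin:@//dbhost:1521/SRCDB" \
        --username etl_user \
        --password-file /user/etl/.sqoop_pwd \
        --table CUSTOMER \
        --target-dir /data/raw/customer \
        --as-avrodatafile \
        --split-by CUSTOMER_ID \
        --num-mappers 4

    # Upsert pattern in the reverse direction: export back to the RDBMS,
    # updating existing rows on the key and inserting new ones.
    sqoop export \
        --connect "jdbc:oracle:thin:@//dbhost:1521/TGTDB" \
        --username etl_user \
        --password-file /user/etl/.sqoop_pwd \
        --table CUSTOMER_TGT \
        --export-dir /data/out/customer \
        --update-key CUSTOMER_ID \
        --update-mode allowinsert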
TECHNICAL SKILLS
Tools: Ab Initio, Express>It, Acquire>It, Conduct>It, Control Center, Autosys, Hadoop, Sqoop, Hive, HBase, AWS S3.
Databases: Oracle, MySQL, Hive, HBase, etc.
Languages: SQL, UNIX Shell Script
Operating Systems: Linux and Windows
Recent POCs
Integration of Ab Initio with AWS S3 (see the transfer sketch below).
Integration of Ab Initio with GCP and BigQuery.
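For the AWS S3 POC, moving data between HDFS and S3 alongside the Ab Initio graphs can be exercised with standard Hadoop tooling. The sketch below is illustrative only: the bucket name and paths are placeholders, and S3 credentials are assumed to be supplied through cluster configuration (e.g. core-site.xml) or instance roles rather than on the command line.

    # Copy a curated HDFS dataset to an S3 landing area (placeholder paths).
    hadoop distcp -update \
        hdfs:///data/out/customer \
        s3a://example-edl-bucket/landing/customer/

    # Pull the same data back from S3 into HDFS for downstream Ab Initio graphs.
    hadoop distcp -update \
        s3a://example-edl-bucket/landing/customer/ \
        hdfs:///data/raw/customer_from_s3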
PROFESSIONAL EXPERIENCE
Confidential
Senior Ab Initio (ETL) Developer
Responsibilities:
- Created an Ab-Initio framework to migrate on Sqoop jobs which ingest data from RDBMS to HDFS.
- Loaded and transformed large sets of semi-structured data such as XML, JSON, Avro, and Parquet.
- Line management of team members.
- Worked with other Delivery Leads to perform impact analysis on key initiatives.
- Performed import and export of remote data to AWS S3.
- Involved in data analysis, data quality, and data profiling work that helped the business team.
- Performed code and peer reviews of assigned tasks, along with unit testing and bug fixing.
- Worked with clients to create technical strategies and frameworks.
- Performed import and export of data into HDFS and Hive using Sqoop and managed data within the environment.
- Involved in creating Hive tables, loading data, and writing Hive queries (see the Hive sketch after this list).
- Managed Hive tables and created child tables based on partitions.
- Involved in delivering the resulting data to Cassandra.
- Understood and executed change and incident management processes.
- Worked with the Project Manager to produce Project Work Packages for production-related products.
- Good knowledge of HP ALM and of generating AQATDSR & AQATS reports for production releases.
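The Hive work described above followed the usual create-and-load pattern for partitioned tables. A minimal sketch is shown below, assuming a hypothetical edl database, customer table, columns, and HDFS paths; none of these names come from the actual project.

    # Create a partitioned external table and load one day's extract into it.
    # LOAD DATA only moves files, so the incoming files are assumed to already
    # be in the table's storage format (Parquet here).
    hive -e "
    CREATE DATABASE IF NOT EXISTS edl;

    CREATE EXTERNAL TABLE IF NOT EXISTS edl.customer (
        customer_id BIGINT,
        name        STRING,
        created_ts  TIMESTAMP
    )
    PARTITIONED BY (load_dt STRING)
    STORED AS PARQUET
    LOCATION '/data/edl/customer';

    LOAD DATA INPATH '/data/raw/customer/2020-01-01'
    INTO TABLE edl.customer PARTITION (load_dt='2020-01-01');
    "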
Confidential
Team Lead & ETL (Ab-Initio) Developer
Responsibilities:
- Liaised with business and project stakeholders to understand their business and then implemented new enhancements per the given requirements.
- Migrated data from multiple source systems to RDBMS/HDFS for data analysis.
- Upgraded Ab Initio objects and refreshed application configurations to version 4.0 while onboarding new projects to the EDL.
- Prepared test cases and performed end-to-end testing in the QA environment.
- Created Hive tables and handled data operations such as loading, unloading, deleting, partition creation, partition drops, partition repair, and DistCp between clusters (see the maintenance sketch after this list).
- Involved in all areas of project development and testing.
- Developed and maintained the applications using Ab Initio graphs and plans.
- Developed a file-sequencing process using Unix shell scripts.
- Created application configurations and business rules using Express>It.
- Good hands-on experience with Acquire>It.
- Configured and scheduled jobs using Ab Initio Control Center.
- Worked with the Autosys scheduler.
- Handled the delivery of the solutions on behalf of offshore team.
- Coordinated with the On-Site coordinator for the build requirements.
- Worked on Ad-hoc requests from the On-Site coordinator.
- Automated unit testing process using Ab Initio plans.
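The partition handling and cross-cluster copies mentioned in the bullet on Hive data operations typically reduce to a handful of commands. A rough sketch follows, with hypothetical table, partition, and namenode names.

    # Drop an expired partition (table and partition value are placeholders).
    hive -e "ALTER TABLE edl.customer DROP IF EXISTS PARTITION (load_dt='2019-12-01');"

    # Repair partition metadata after files were written to HDFS directly.
    hive -e "MSCK REPAIR TABLE edl.customer;"

    # Copy one partition's files to another cluster (hypothetical hosts).
    hadoop distcp -update \
        hdfs://prod-nn:8020/data/edl/customer/load_dt=2020-01-01 \
        hdfs://dr-nn:8020/data/edl/customer/load_dt=2020-01-01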
Confidential
ETL (Ab-Initio) Developer
Responsibilities:
- Analyzed the requirements and performed impact analysis based on them.
- Assisted the QA team during testing and defect fixes.
- Developed Ab Initio generic graphs and psets using partition, transform, and lookup components to validate, transform, and load data.
- Developed Unix shell scripts to execute ad hoc jobs (see the wrapper sketch after this list).
- Implemented ICFF lookups for performance tuning.
- Analyzed and resolved production issues.
- Supported code migration to higher environments.
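Ad hoc runs of deployed graphs can be driven through small shell wrappers along the lines below. This is a sketch only: it assumes graphs are deployed as Korn shell scripts, and the directory layout and script name are hypothetical.

    #!/bin/ksh
    # run_adhoc_graph.ksh -- illustrative wrapper for running a deployed
    # Ab Initio graph on demand; all paths and names are placeholders.

    GRAPH_DIR=/apps/abinitio/deploy/run
    LOG_DIR=/apps/abinitio/logs
    GRAPH=${1:?"usage: run_adhoc_graph.ksh <deployed_graph.ksh>"}

    TS=$(date +%Y%m%d_%H%M%S)
    LOG_FILE=${LOG_DIR}/${GRAPH%.ksh}_${TS}.log

    # Deployed Ab Initio graphs execute as shell scripts; capture output and rc.
    "${GRAPH_DIR}/${GRAPH}" > "${LOG_FILE}" 2>&1
    RC=$?

    if [ "${RC}" -ne 0 ]; then
        echo "Graph ${GRAPH} failed with rc=${RC}; see ${LOG_FILE}" >&2
        exit "${RC}"
    fi
    echo "Graph ${GRAPH} completed successfully; log: ${LOG_FILE}"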