
Ab Initio Tech Lead Resume


SUMMARY

  • Experienced Data Integration Manager with 15 years of IT experience managing, implementing, and delivering end-to-end global Data Integration programs and projects in the Banking and Insurance domains.
  • Upskilled and cross-skilled new and existing resources by facilitating training on the technologies required for each project.
  • Strong change management expertise, including risk mitigation, introducing new ideas, and providing value-adds.
  • Skilled in project implementation, project scoping, risk and feasibility analysis, functional mapping, designing project schedule, change management, quality and delivery management.
  • Delivered high-quality solutions in a timely manner, meeting and exceeding business expectations; this involved working with business change managers, technology managers, and business sponsors.
  • Acquired strong stakeholder management skills and exposure to proven industry standards, providing a basis for strategic input in any future role.
  • Experience in data warehousing and data integration/ingestion projects based on the Ab Initio ETL tool.
  • Experience architecting and designing both real-time and batch ETL applications using Ab Initio, a niche ETL tool.
  • Owns and drives the Project/Release planning framework, ensuring that plans are developed and tracked to appropriate standards, are integrated effectively across all work-streams, are unambiguous and achievable, and that dependencies are clearly understood and managed.
  • Strong leadership in a matrix environment, promoting best practices in managing interrelated projects and programs to ensure successful delivery.
  • Ensured project resources had the correct toolsets and processes in place and that all relevant staff were trained and equipped to deliver successfully.
  • Proficient in third-party vendor management and stakeholder management.
  • Solid understanding of IaaS, PaaS, and SaaS platform services.
  • In-depth knowledge of architecting, designing, managing, and delivering projects end to end.
  • Well versed in designing fault-tolerant, highly available, parallel- and distributed-processing applications.
  • Considered auditing, reconciliation, and disaster recovery scenarios while architecting applications.
  • Designed and developed ETL projects in line with Ab Initio best practices and standards.
  • Excellent exposure to delivering projects built with the Ab Initio ETL tool and big data lakes.
  • Experience in designing, developing, and deploying Ab Initio graphs and coordinating testing and production deployments; efficient in troubleshooting and performance enhancement.
  • Intermediate experience in the Big Data ecosystem, experimenting with and exploring Syncsort, GitHub, Pig, Hive, Hadoop, HDFS, and AWS.
  • Intermediate experience with the AWS cloud platform and services such as S3, EC2, Redshift, Athena, and Glue.
  • Strong exposure to data quality best practices and data governance.
  • Proficient with Ab Initio Best Practices and Standards.
  • Extensive experience working directly with bank business analysts and solution architects.
  • Hands-on with erwin Data Modeler for designing and editing business data models.
  • Ability to understand SQL statements and relational database internal processes.
  • Utilized Waterfall and Agile software development models.
  • Proficient with JIRA and ServiceNow; experienced in leading triage calls.
  • Demonstrated leadership skills in working with multiple cross-functional teams in major organizations.
  • Outstanding drive and focus on organizational objectives and customer needs.
  • Worked in distributed, multicultural environments, including on-site in the US, UK, and India; results-oriented and always keen to learn new technologies and acquire more business skills.

PROFESSIONAL EXPERIENCE

Confidential

Ab Initio Tech Lead

Responsibilities:

  • Performed requirement gathering and analysis of incoming reference data.
  • Designed programming logic for data transformation.
  • Created technical design documentation for real-time ETL processes built with Ab Initio continuous flows.
  • Designed, built, tested, and deployed interfaces using the Ab Initio ETL tool.
  • Created low-level designs based on high-level design documents.
  • Participated in design meetings, discussing FRDs (functional requirement documents) with BAs and creating HLDs for technical implementation.
  • Used JIRA for tracking defects and ongoing tasks within the team.
  • Used ServiceNow for raising issues and Change Requests.
  • Maintained RAID logs for technical discussions.
  • Led triage calls with multiple teams to align everyone on one platform and mitigate issues.
  • Handled an Enterprise Data Management project based on the data governance tool Collibra.
  • Set deadlines, assigned responsibilities, mentored team members, and monitored the team's progress.
  • Learned and developed some Collibra workflows integrating with DGC, and gained hands-on MuleSoft experience for self-development and a better understanding of the team's work.

Confidential

Principal Consultant

Responsibilities:

  • Defined strategy, roadmap, and data governance processes for Ab Initio ETL engagements.
  • Set deadlines, assigned responsibilities, mentored team members, and monitored the team's progress.
  • Worked with stakeholders and functional teams to understand and develop business requirements.
  • Participated in design meetings, discussing FRDs (functional requirement documents) with BAs and creating HLDs for technical implementation.
  • Held discussions with infrastructure teams to understand the storage allocations made for different environments to support daily data volumes.
  • Responsible for project delivery from the technology side, covering Ab Initio design, development, testing in Dev and all higher environments, and deployment to production.
  • Ensured best practices were followed to reduce code redundancy, with a major focus on improving performance and the reusability of artifacts across projects.
  • Managed a team of 8 onsite and 25 offshore resources, balancing work distribution and team satisfaction to deliver effective output on time.
  • Worked hands-on on Ab Initio plan design and development, and on the scheduling and execution of jobs through Control Center.
  • Focused on data quality and supported the DQ teams on all issues identified in testing environments.
  • Led the effort to drive innovation based on data, a first-time initiative in the organization to move toward a fully data-driven business.
  • Supported a POC to validate migration of data from Ab Initio to the Snowflake cloud database.
  • Built a POC big data ingestion tool for migrating an existing mainframe feed to the Hadoop environment.
  • Built a POC to improve data-processing performance using appropriate data compression formats (Parquet/Avro).
  • Built a POC for analytics reports using Tableau.
  • Used JIRA for tracking defects and ongoing tasks within the team.
  • Used ServiceNow for raising issues and Change Requests.
  • Maintained RAID logs for technical discussions.
  • Led triage calls with multiple teams to align everyone on one platform and mitigate issues.

Confidential

Application Delivery Lead

Responsibilities:

  • Handled new initiatives to explore new technologies, driving multiple teams to perform POCs addressing business needs.
  • Built a POC big data ingestion tool for migrating existing Ab Initio code.
  • Designed POCs for converting existing ETL graphs to the Hadoop HDFS ecosystem.
  • Designed an application to move data from the ETL space to AWS Redshift.
  • Managed multiple data warehousing/business intelligence related projects directed toward strategic business and other organizational objectives.
  • Supported Big Data Business Lake programs.
  • Defined strategy, roadmap, and data governance processes for ETL/Big Data engagements.
  • Designed and deployed AWS solutions using EC2, S3, and Redshift.
  • Installed applications on AWS EC2 instances and configured storage on S3 buckets.
  • Designed and programmed data transformation logic per the requirements of the business.
  • Built, tested, and deployed interfaces using the Ab Initio and SnapLogic ETL tools.
  • Interacted with clients on technical complexity and provided end-to-end solutions.
