
Big Data Tech Lead Resume

Phoenix, AZ

SUMMARY:

  • Over one year of experience using Hortonworks Hadoop: HDFS, Hive, Sqoop, and Pig.
  • Conceptual knowledge of HBase, Pig, and MapReduce programs.
  • Experienced in writing custom UDFs in Pig and Hive core functionality (a minimal Hive UDF sketch follows this list).
  • Experienced in writing Hive QL (HQL) and Pig scripts.
  • Worked on POCs for Ab Initio to Big Data conversion using Magellan.
  • Five years and four months of solid ETL experience using Ab Initio.
  • Experienced with Agile and Waterfall software development life cycles.
  • Proficient in developing Ab Initio graphs, writing SQL queries, and working in UNIX.
  • Designed end-to-end data warehouses.
  • Generated reports and statistics using Ab Initio.
  • Good at understanding requirements, developing and debugging Ab Initio graphs, and writing wrapper scripts.
  • Good understanding and implementation of XML components such as XML Read, XML Write, XML Split, XML Combine, and XML Reformat; good understanding of web service components.
  • Experience building Ab Initio continuous applications, queues, and web services.
  • Experience with metaprogramming functions and dynamic DML/transform generation.
  • Well versed in various Ab Initio parallelism techniques; implemented Ab Initio graphs using data parallelism and Multi File System (MFS) techniques.
  • Extensive use of Ab Initio Conduct-It for building plans.
  • Practical experience working in multiple environments (production, development, testing) and on multiple projects at one time. Proficient in ETL testing.
  • Hands-on experience in all phases of the software life cycle: planning, analysis, design, development, testing, documentation, and maintenance.
  • Understanding of EME concepts and proficient use of Ab Initio m commands and air commands.
  • Excellent interpersonal and communication skills.
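
As a concrete illustration of the Hive UDF work noted above, the following is a minimal sketch of a classic reflection-based Hive UDF in Java. The class name, masking rule, and registration name are hypothetical examples, not details taken from the projects described below.

```java
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Hypothetical example: mask all but the last four characters of a value,
// the kind of lightweight cleanup logic typically pushed into a Hive UDF.
public final class MaskValue extends UDF {
    public Text evaluate(Text input) {
        if (input == null) {
            return null;                  // pass nulls through unchanged
        }
        String s = input.toString();
        if (s.length() <= 4) {
            return input;                 // too short to mask
        }
        char[] out = s.toCharArray();
        for (int i = 0; i < out.length - 4; i++) {
            out[i] = '*';                 // mask the leading characters
        }
        return new Text(new String(out));
    }
}
```

Packaged into a JAR, such a function would be registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION mask_value AS 'MaskValue', then called from HQL like any built-in function.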

KEY SKILLS:

Tools: Ab Initio, Magellan for Big Data, Control-M, Event Engine

Technology: Hadoop, HDFS, Hive, Pig Latin, DB2, data warehousing, Oracle, UNIX

SDLC: Agile, Waterfall

EXPERIENCE:

Big Data Tech Lead

Confidential, Phoenix, AZ

Responsibilities:

  • Managing several Hadoop clusters and other Hadoop ecosystem services in development and production environments.
  • Working closely with engineering teams and participating in infrastructure and framework development.
  • Automated deployment and management of Hadoop services, including monitoring.
  • Created managed and external tables in Hive and implemented partitioning and bucketing techniques for space and performance efficiency (see the sketch after this list).
  • Developed Pig UDFs to preprocess data for analysis.
  • Created nodes in Event Engine and automated the use case across all environments.
  • Used a Git repository to check code in and out.
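
To illustrate the partitioning and bucketing bullet above, here is a minimal Java sketch using the HiveServer2 JDBC driver. The endpoint, credentials, table names, and column layout are placeholders, not details from the actual environment.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class CreateHiveTables {
    public static void main(String[] args) throws Exception {
        // Placeholder HiveServer2 endpoint and credentials.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:hive2://hiveserver:10000/default", "etl_user", "");
             Statement stmt = conn.createStatement()) {

            // Managed table: Hive owns the data. Partitioning by load date
            // prunes whole directories at query time; bucketing on account_id
            // spreads rows evenly and enables bucketed map joins.
            stmt.execute(
                "CREATE TABLE IF NOT EXISTS txn_managed ("
              + " account_id STRING, amount DOUBLE) "
              + "PARTITIONED BY (load_dt STRING) "
              + "CLUSTERED BY (account_id) INTO 32 BUCKETS "
              + "STORED AS ORC");

            // External table: Hive tracks only the metadata, so dropping
            // the table leaves the underlying HDFS files untouched.
            stmt.execute(
                "CREATE EXTERNAL TABLE IF NOT EXISTS txn_raw ("
              + " account_id STRING, amount DOUBLE, load_dt STRING) "
              + "ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' "
              + "LOCATION '/data/raw/txn'");
        }
    }
}
```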

Sr. ETL/Big Data Tech Lead

Confidential, Phoenix, AZ

Responsibilities:

  • Analyze and transform stored data by writing Pig/Hive jobs based on business requirements.
  • Developed Pig UDFs to preprocess data for analysis (a minimal UDF sketch follows this list).
  • Design and provide technical solutions to clients and team members to meet business requirements.
  • Understand and implement project requirements; interact with business users to gather requirements, then plan, facilitate, and coordinate project resources for implementation. Play a vital role in the design, development, and testing stages, covering process management, troubleshooting, and problem solving.
  • Provide presentations, project estimates, and periodic project status reports (daily, weekly, monthly). Conduct regular meetings with all project stakeholders for status, defect triage, idea generation, and effective management.
  • Perform code reviews, ensure best practices are followed, review and approve defect fixes, and sign off on production releases and implementation.
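
A minimal sketch of a preprocessing Pig UDF of the kind mentioned above, written against Pig's EvalFunc API; the class name and normalization rule are hypothetical.

```java
import java.io.IOException;
import org.apache.pig.EvalFunc;
import org.apache.pig.data.Tuple;

// Hypothetical preprocessing UDF: trims and upper-cases a field so that
// downstream grouping is not split by stray whitespace or casing.
public class NormalizeField extends EvalFunc<String> {
    @Override
    public String exec(Tuple input) throws IOException {
        if (input == null || input.size() == 0 || input.get(0) == null) {
            return null;              // propagate nulls unchanged
        }
        return input.get(0).toString().trim().toUpperCase();
    }
}
```

In a Pig script, the containing JAR would be loaded with REGISTER and the function applied inside a FOREACH ... GENERATE statement.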

Sr. ETL/Big Data Tech Lead

Confidential, Phoenix, AZ

Responsibilities:

  • Involved in enhancement activities based on requirements.
  • Developed new functionality and enhanced existing functionality by making changes to the programs.
  • Designed, tested, and debugged external and DB2 native stored procedures (a sketch of a JDBC call appears after this list).
  • Produced technical design documentation.
  • Involved in preparing test case scripts.
  • Coordinated the offshore team on development and technical issues.
  • Owner of all triage issues and SQP development.
  • Completely recreated the SAGE application in ETL with better performance and functionality.
  • Recognized and appreciated by clients for implementing generic solutions and reusing existing code; SAGE was one of the significant revenue-generating projects for Confidential.
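
To make the stored procedure bullet concrete, here is a minimal Java sketch of invoking a DB2 stored procedure through standard JDBC. The connection URL, credentials, procedure name, and parameters are hypothetical placeholders.

```java
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Types;

public class CallDb2Procedure {
    public static void main(String[] args) throws Exception {
        // Placeholder DB2 connection details.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:db2://dbhost:50000/SAMPLE", "etl_user", "secret");
             CallableStatement cs = conn.prepareCall(
                 "{call ETL.REFRESH_SUMMARY(?, ?)}")) {

            cs.setString(1, "2015-01-31");              // IN: business date
            cs.registerOutParameter(2, Types.INTEGER);  // OUT: rows processed
            cs.execute();
            System.out.println("Rows processed: " + cs.getInt(2));
        }
    }
}
```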

Sr. ETL/Data Warehouse Developer

Confidential

Responsibilities:

  • Ab Initio requirements analysis, graph development, and enhancement.
  • Designed project-related design documents for various files and their layouts.
  • Developed Ab Initio graphs, shell scripts, and SQL queries.
  • Created parameter sets, including PDLs.
  • Ensured successful data migration.
  • Developed stored procedures.
  • Involved in preparing test case scripts.
  • Provided testing support for SIT and UAT.
  • Handled code walkthroughs for the production support team.
  • Configured environment variables, changing abinitiorc and stdenv so that enterprise-level parameters can be used as standards across all applications.
  • Managed the resource server for Conduct-It and debugged any related issues.

Sr. ETL/Data Warehouse Developer

Confidential

Responsibilities:

  • Designed project-related design documents for various files and their layouts.
  • Interacted daily with the client to discuss queries, resolve coding issues, and properly document development work.
  • Involved in unit testing, QA testing, and the creation of unit test cases.
  • Scheduled and ran Control-M ETL jobs.
