
Hadoop Developer Resume


Sunrise, FL

EXPERIENCE SUMMARY:

  • Around 7 years of overall experience in the IT industry.
  • Confidential has worked on various development, maintenance, and enhancement projects.
  • Expert knowledge of the Big Data Platform (BDP): Hadoop, MapReduce, Hive.
  • Expert knowledge of the ETL tool Informatica PowerCenter.
  • Strong understanding of data warehousing concepts: DWH and data marts.
  • Expert knowledge of SQL and PL/SQL.
  • Expert knowledge of Teradata, Netezza, Oracle, and DB2 (UDB).
  • Good knowledge of job scheduling tools: Autosys and Tidal.
  • Good knowledge of UNIX scripting.
  • Good knowledge of Business Intelligence (BI) tools: MicroStrategy, OBIEE.

TECHNICAL SKILLS:

Technology: Big Data, Hadoop, Hive, Oracle PL/SQL, Informatica PowerCenter, Teradata, Netezza.

Database: Oracle 10g/11g, Teradata 14, Microsoft SQL Server, Netezza Aginity, DB2 UDB.

Operating System: Windows XP/7, UNIX

Big Data: Hadoop, Hive

ETL: Informatica PowerCenter 9.6.1

DB: Teradata, Netezza, PL/SQL, Hive, IBM DB2, Microsoft SQL Server

Job Scheduler and Monitor: Autosys, Tidal, Teradata Viewpoint

Unix: PuTTY, WinSCP, Reflection

Data Modeling: Erwin, Microsoft Visio

Others: StarTeam, ServiceNow

PROFESSIONAL EXPERIENCE:

Confidential, Sunrise, FL

Hadoop Developer

Responsibilities:

  • Developed MapReduce programs to parse raw data, populate staging tables, and store the refined data in partitioned tables in the EDW.
  • Created Hive queries that helped market analysts spot emerging trends by comparing fresh data with EDW reference tables and historical metrics.
  • Managed delivery of technology solutions in support of key product initiatives, working closely with product owners, architects, engineers, quality assurance, and third parties.
  • Worked within and across business units to prioritize, plan, and execute the development schedule for major product releases.
  • Monitored team velocity, financials, and other KPIs against the plan, and published progress reports.
  • Handled Customer Interaction Management (CIM) with clients and stakeholders.
  • Monitored project activities and performance across resource channels.
  • Maintained the dependency plan between planned sprints across engineering, infrastructure, and third parties.
  • Managed, and appropriately escalated, delivery impediments, risks, issues, and changes tied to product development initiatives.
  • Applied judgment and discretion on when to raise issues versus work through them with the team.
  • Resolved issues preventing engineers from making progress against sprints and deadlines.
  • Ensured deliverables across engineering teams were high quality and clearly documented.
  • Discussed code strategy, contributed to architecture discussions, and reviewed API specs.
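
The MapReduce parsing work above can be sketched as a Hadoop Streaming-style job in Python. This is a minimal illustration only; the pipe-delimited record layout and field positions are hypothetical, not taken from the actual project:

```python
def mapper(lines):
    """Parse raw pipe-delimited records and emit tab-separated (date, 1) pairs.
    Assumed (hypothetical) layout: event_date|record_id|payload."""
    for line in lines:
        fields = line.rstrip("\n").split("|")
        if len(fields) < 3:
            continue  # skip malformed raw records
        event_date = fields[0]
        yield f"{event_date}\t1"

def reducer(pairs):
    """Aggregate counts per partition key; in Hadoop the input arrives
    grouped and sorted by key, so a dict works for this local sketch."""
    counts = {}
    for pair in pairs:
        key, value = pair.split("\t")
        counts[key] = counts.get(key, 0) + int(value)
    return counts

if __name__ == "__main__":
    # In a real Streaming job, mapper and reducer run as separate processes
    # reading stdin; here they are chained locally for illustration.
    raw = ["2016-01-01|42|click", "2016-01-01|43|view", "2016-01-02|44|click"]
    print(reducer(mapper(raw)))
```

The same mapper/reducer pair could be wired into `hadoop jar hadoop-streaming.jar` with `-mapper`/`-reducer` flags, with the per-date counts then loaded into a dated EDW partition.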

Tools: Big Data Hadoop, Hive, MapReduce, Informatica PowerCenter 9.6.1, IBM Data Studio for DB2, FileZilla, Windows 7, UNIX

Confidential, Miami, FL

Senior ETL Developer

Responsibilities:

  • Identified and provided solutions for long-running jobs.
  • Interacted with the client and built good working relationships.
  • Designed a reconciliation process for the intraday data load.
  • Gathered requirements by interacting with business users.
  • Designed, and ensured adherence to, code review standards.
  • Prioritized work according to complexity level and impact on the systems.
  • Adhered to proper processes per client standards.
  • Provided solutions to the client and clarified work items for the offshore team.
  • Maintained schedule adherence with quality delivery.
  • Handled Customer Interaction Management (CIM) with clients and stakeholders.
  • Transferred gained knowledge to fellow team members.

Tools: Informatica PowerCenter 9.6.1, Teradata, TOAD for Oracle, SecureCRT, OpCon job scheduler, Windows 7, UNIX

Confidential, Broomfield, CO

Hadoop Developer

Responsibilities:

  • Developed HQL and loaded data into HDFS.
  • Maintained and enhanced the system by analyzing issues and implementing solutions within the Hadoop ecosystem.
  • Envisioned, estimated, defined, built, and deployed solutions using standard delivery assets to improve the client's delivery approach.
  • Prioritized work according to complexity level and impact on the systems.
  • Adhered to proper processes per client standards.
  • Gathered requirements by interacting with business users.
  • Prepared estimates with proper LOE according to requirements.
  • Coordinated with the offshore team and guided their day-to-day activity.
  • Handled Customer Interaction Management (CIM) with clients and stakeholders.
  • Provided solutions to the client and clarified work items for the offshore team.
  • Performed detailed gap analysis to enhance the system.
  • Delivered developed and tested code to stakeholders; accepted logged defects and modified code accordingly.
  • Maintained schedule adherence with quality delivery.
  • Transferred gained knowledge to fellow team members.
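
The HQL development and HDFS loading above is typically driven by a small wrapper that builds a `LOAD DATA INPATH ... PARTITION` statement per load date. A minimal sketch in Python, where the staging directory and table names are illustrative only:

```python
from datetime import date

def hql_load_statement(hdfs_dir: str, table: str, load_date: date) -> str:
    """Build an HQL statement that moves one day's HDFS files into a
    dated Hive partition. Directory and table names are hypothetical."""
    dt = load_date.strftime("%Y-%m-%d")
    return (
        f"LOAD DATA INPATH '{hdfs_dir}/dt={dt}' "
        f"INTO TABLE {table} PARTITION (dt='{dt}');"
    )

if __name__ == "__main__":
    stmt = hql_load_statement("/data/staging/events", "edw.events", date(2016, 3, 1))
    # The generated statement would be handed to `hive -e` (or beeline)
    # by the scheduler, e.g. Tidal, once the day's files land in HDFS.
    print(stmt)
```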

Tools: Big Data Hadoop, Informatica PowerCenter 9.6.1, Teradata SQL Assistant, Hive, TOAD for DB2, Tidal, PuTTY, WinSCP, StarTeam, Windows 7, Perl, UNIX

Confidential, Framingham, MA

Technical Lead

Responsibilities:

  • Prepared ETL transformations in Informatica.
  • Coded in PL/SQL.
  • Estimated according to requirements, prepared approach documents with design solutions, and handled Customer Interaction Management (CIM) with clients and stakeholders.
  • Prepared technical specifications.
  • Performed ETL design, development, unit testing, and QA testing.
  • Distributed work to fellow team members.
  • Prepared delivery execution plans with proper timelines and respective owners.
  • Delivered developed and tested code to stakeholders; accepted logged defects and modified code accordingly.
  • Maintained schedule adherence with quality delivery.
  • Transferred gained knowledge to fellow team members.

Tools: Informatica PowerCenter 9.6.1, Hive, Hadoop, Java, Teradata, Oracle SQL Developer, MS SQL Server, Tidal, PuTTY, WinSCP, Windows 7, UNIX

Confidential, Torrance, CA

ETL Developer

Responsibilities:

  • Performed data modeling in a star schema fact-dimension model.
  • Used conformed dimensions to optimize job run duration.
  • Understood business requirements and modeled them in a Kimball star schema.
  • Provided solutions to business users to improve the reporting system and addressed performance tuning.
  • Performed gap analysis on business requirements.
  • Documented business requirement gaps, system requirements, and system design architecture.
  • Coordinated with clients and stakeholders.
  • Researched approaches to implement complex functionality.
  • Developed and implemented project components.
  • Performed performance tuning.
  • Coordinated with business users for user acceptance testing.
  • Provided post-implementation support.
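
A star-schema fact load like the one described above typically replaces natural keys with surrogate keys looked up from a conformed dimension. A minimal sketch, where the dimension, column, and key names are hypothetical rather than from the actual project:

```python
def load_fact_rows(source_rows, customer_dim):
    """Swap natural customer keys for surrogate keys from a conformed
    customer dimension; route unmatched rows to an error list so
    late-arriving dimension members can be reprocessed."""
    facts, errors = [], []
    for row in source_rows:
        surrogate = customer_dim.get(row["customer_id"])
        if surrogate is None:
            errors.append(row)  # unknown dimension member: hold for reload
        else:
            facts.append({"customer_sk": surrogate, "amount": row["amount"]})
    return facts, errors

if __name__ == "__main__":
    dim = {"C001": 1, "C002": 2}  # natural key -> surrogate key
    src = [{"customer_id": "C001", "amount": 50.0},
           {"customer_id": "C999", "amount": 10.0}]
    print(load_fact_rows(src, dim))
```

Because the dimension is conformed, every fact table sharing it joins on the same surrogate keys, which is what lets downstream reports drill across subject areas consistently.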

Tools: Informatica PowerCenter 9.5.1, Netezza Aginity, TOAD for Oracle, Erwin, MS Visio, Oracle OBIEE, Autosys, SharePoint, Windows 7, UNIX

Confidential, Torrance, CA

Developer

Responsibilities:

  • Performed analysis and created analysis and design documents.
  • Coded in Informatica, SQL, PL/SQL, and UNIX shell scripts.
  • Wrote unit test cases.
  • Performed unit testing.
  • Fixed bugs.
  • Performed code reviews.
  • Performed performance tuning.

Tools: Informatica PowerCenter 9.1.1, Netezza Aginity, TOAD for Oracle, Autosys, SharePoint, Windows XP, UNIX
