
Etl & Hadoop Technical Lead Resume


Texas

PROFESSIONAL SUMMARY:

  • 7+ years of experience across ETL architecture, development, enhancement, maintenance, data modeling, data profiling, and reporting, including business and system requirement gathering.
  • Expertise in complete SDLC which involves Requirement Gathering, Proof of concept, Gap analysis, Converting functional to technical Spec, Design, Development, Testing and Implementation through Agile Methodology and the Scrum Process.
  • Good understanding of Data Warehouse Concepts/Dimensional Modeling (Slowly Changing Dimensions, Star and Snowflake Schema).
  • Hands-on experience in various Financial Advisory domain projects (Investment Management, Asset Management, and Wealth Management).
  • Strong experience in Creating Data Lake in Hadoop Hortonworks, Data Ingestion in HIVE Tables, writing UNIX Shell scripts, PL/SQL Scripts for development, automation of ETL process, error handling, and defect fixing.
  • Built Data Analytics Team at Onsite and Offshore locations.
  • A multi-talented motivator, skilled at optimizing team utilization and ensuring business requirements are delivered on time, per the client's requirements.
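The ETL automation and error handling mentioned above (shell/PL/SQL scripts wrapping batch steps) can be sketched as a small Python wrapper; the step names here are hypothetical, not from the original scripts:

```python
import logging
import sys

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")

def run_step(name, func, *args):
    """Run one ETL step: log the outcome, return its result,
    and abort the whole batch on failure."""
    try:
        result = func(*args)
        logging.info("step %s finished OK", name)
        return result
    except Exception:
        logging.exception("step %s failed; aborting batch", name)
        sys.exit(1)

# Hypothetical usage: every batch step goes through the same wrapper,
# so failures are logged uniformly and stop downstream loads.
# run_step("load_dim_customer", load_table, "DIM_CUSTOMER")
```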

TECHNICAL SKILLS:

ETL: IBM Infosphere Datastage 8.5/9.1/11.5

Big Data: Hadoop Hortonworks, Hive

Reporting: SAP Business Objects, Information Design Tool

Scripting: Unix Shell Script

Database: Oracle 10g, SQL Server, Netezza, DB2, Sybase

CRM: Salesforce

Tools: SQL Squirrel, SQL Developer, Aginity, SQL Server Management Studio, PuTTY, Control-M

Agile IT Project Management Tool: Rational Team Concert

WORK EXPERIENCE:

ETL & Hadoop Technical Lead

Confidential, Texas

Responsibilities:

  • Participated in Sprint Backlog Planning with product owners to prioritize the User Stories from Product Backlog.
  • Created Data Lake to store Structured/Semi-Structured data in Hadoop Hortonworks.
  • Created Hive tables to store legacy data.
  • Actively engaged in the client meetings to gather business requirements.
  • Prepared High Level design ETL Job Flow Mapping (End-to-End) document as per the business requirements.
  • Analyzed the existing source/target systems to create source-to-target mappings.
  • Determined the load strategy for storing cleansed data in warehouse/mart tables.
  • Created Data Warehouse/Data Mart objects using ETL/ELT methods.
  • Used various DataStage Designer stages such as Lookup, Join, Merge, Funnel, Filter, Copy, Aggregator, and Sort.
  • Automated the Health Meter tool to identify duplicates in the target system.
  • Introduced different validation methods to validate the input feed received from different systems before processing the same.
  • Integrated various data sources (DB2-UDB, SQL Server, PL/SQL, Oracle, Netezza, XML, Flat Files, Salesforce and MS-Access) into data staging area.
  • Fine-tuned DataStage jobs with large data volumes for better performance.
  • Created batch jobs using Control-M.
  • Worked with end users during User Acceptance Testing phase.
  • Worked in an onsite-offshore delivery model.
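The feed-validation methods described above (checking input feeds from different systems before processing) can be sketched in Python; the column names and rules here are hypothetical stand-ins for the project's actual checks:

```python
import csv
from io import StringIO

# Hypothetical feed layout; the real required columns came from the
# source-to-target mapping document.
REQUIRED_COLUMNS = {"account_id", "trade_date", "amount"}

def validate_feed(text):
    """Pre-load checks on a delimited feed: required header columns present,
    no empty key fields, and no duplicate business keys."""
    reader = csv.DictReader(StringIO(text))
    missing = REQUIRED_COLUMNS - set(reader.fieldnames or [])
    if missing:
        return [f"missing columns: {sorted(missing)}"]
    errors, seen = [], set()
    for lineno, row in enumerate(reader, start=2):  # line 1 is the header
        if not row["account_id"]:
            errors.append(f"line {lineno}: empty account_id")
        key = (row["account_id"], row["trade_date"])
        if key in seen:
            errors.append(f"line {lineno}: duplicate key {key}")
        seen.add(key)
    return errors
```

A feed is accepted only when the returned error list is empty; otherwise it is rejected before any staging load runs.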

Developer

Confidential

Responsibilities:

  • Created Stored Procedures, Functions, Views, and Triggers.
  • Migrated Stored Procedures, Functions, Views, and Triggers from Sybase to Oracle 11g.
  • Used Global Temporary Tables.
  • Performed Unit testing on the components migrated to Oracle.
  • Fixed defects throughout the project life cycle.
  • Fixed data issues during UAT period.
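A first mechanical pass of a Sybase-to-Oracle migration like the one above can be sketched as a rename table; the list here is illustrative only, since real migrations also involve temp tables, @@ variables, transaction semantics, and more:

```python
import re

# A few common Sybase T-SQL -> Oracle function renames (illustrative subset).
RENAMES = [
    (r"\bISNULL\s*\(", "NVL("),
    (r"\bGETDATE\s*\(\s*\)", "SYSDATE"),
]

def sybase_to_oracle(sql):
    """First-pass mechanical translation of Sybase SQL toward Oracle syntax.
    Output still needs compilation and unit testing against Oracle."""
    for pattern, replacement in RENAMES:
        sql = re.sub(pattern, replacement, sql, flags=re.IGNORECASE)
    return sql
```

Each translated object was then compiled in Oracle and unit tested, per the testing step above.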

Database Administrator

Confidential

Responsibilities:

  • Managed database security by automating the user creation process and assigning privileges, profiles, and roles to users based on user activity.
  • Administered database object privileges and managed user accounts.
  • Refreshed database tables using database links and cursors.
  • Monitored tablespace growth and mount points periodically and fixed issues with temp and undo tablespaces.
  • Gathered statistics for tables and schemas using DBMS_STATS.
  • Resource management - created and managed tables and indexes; allocated and managed physical storage structures such as data files, redo logs, and control files.
  • Monitored batch jobs and fixed issues as they occurred.
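The tablespace-growth monitoring above can be sketched as a small threshold check; the input rows stand in for the result of a query against Oracle's DBA views, and the 85% threshold is an assumption:

```python
def flag_full_tablespaces(stats, threshold_pct=85.0):
    """Given (tablespace, used_mb, total_mb) rows -- as would come from a
    query joining DBA_DATA_FILES and DBA_FREE_SPACE -- return the
    tablespaces at or above the usage threshold, with their usage percent."""
    alerts = []
    for name, used_mb, total_mb in stats:
        pct = 100.0 * used_mb / total_mb
        if pct >= threshold_pct:
            alerts.append((name, round(pct, 1)))
    return alerts
```

Flagged tablespaces would then get a data file added or resized before loads start failing on space errors.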
