
Sr ETL/Hadoop/Big Data Developer Resume


SUMMARY

  • 10+ years of IT experience with expertise in analysis, design, development and implementation of Data Warehousing applications using Hadoop, Hive, Sqoop, Oozie, Pig, HBase, Teradata utilities, UNIX, Teradata SQL Assistant 13.0, UNIX shell scripting, JCLs, COBOL, Vertica and SQL Workbench.
  • Extensive experience in Business Analysis, Application Design, Development, Implementation for banking and Financial Services.
  • Extensive experience in creating, configuring and managing Teradata load utilities such as BTEQ, FastLoad, MultiLoad, FastExport and TPump (see the load script sketch at the end of this summary).
  • Extensive experience in analyzing EXPLAIN plans, performance tuning, collecting statistics, and working with the different index types - PI, SI, JI and PPI.
  • Proficient in developing strategies for extracting, transforming and loading data using Informatica PowerCenter 9.1/8.6.1 and DataStage.
  • Proficient in the Integration of various data sources like Oracle, Teradata and Flat Files into the staging area, Data Warehouse and Data Mart.
  • Implemented data strategies, built data flows and developed conceptual, logical and physical data models to support new and existing projects.
  • Experience in all phases of the Software Development Life Cycle (SDLC) for IT projects, including analysis, design, coding, testing, deployment and production support.
  • Extensive experience in end-to-end data warehouse implementations and a strong understanding of data warehouse concepts and methodologies.
  • Worked for clients such as Confidential (financial and banking domain applications) and Confidential (healthcare and insurance domain applications).
  • Extracted data using SQL from various files and from tables in databases such as Teradata, Vertica and mainframe sources.
  • Imported and exported table data using the Teradata utilities and shared data with downstream systems through NDM and SFTP connections.
  • Designed data profiling checks to improve data quality.
  • Developed generic, reusable scripts to load data into tables on a given database, including a generic approach for loading historical data.
  • Extensive experience in loading high volume data, and performance tuning.
  • Experience converting business requirements into feasible technical solutions.
  • In-depth knowledge of Data Warehousing and Business Intelligence concepts with emphasis on full life cycle development, including requirement analysis, design, development, testing and implementation.
  • Knowledge of financial domain areas such as customer profile management and card processing.
  • Worked with NDM to integrate data from various source systems.
  • Experience in data integration and data conversion from sources such as DB2, SQL Server, MS Access and flat files into the staging area.
  • Thorough understanding of data warehouse concepts such as data marts, data mining, star and snowflake schemas, facts, dimensions, surrogate keys, and drill-down and drill-across approaches.
  • Developed Test Scripts, Test Cases, and SQL QA Scripts to perform Unit Testing, System Testing and Load Testing. Seamlessly migrated the Code from Development to Testing, UAT and Production.
  • Involved in initial and incremental loads, validation, and preparation of monitoring documents and scripts.
  • Experienced with the Autosys scheduling tool for creating JIL files and monitoring jobs.
  • Excellent communication and organizational skills; outgoing, self-motivated and hardworking, with the ability to work independently or cooperatively in a team and to learn quickly.
  • Excellent analytical and leadership qualities when working in a team.
  • Experience in mainframe and UNIX shell scripting, FTP and file management in various UNIX and mainframe environments.
  • Experience with job scheduling tools such as Autosys for UNIX and the CA7 scheduler for mainframe.
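
Below is a minimal, illustrative sketch of the kind of BTEQ load-and-statistics script referred to above; the logon, database, table and file names (tdprod, etl_user, stg_db.customer_stg, /data/in/customer.txt) are hypothetical placeholders rather than actual client objects, and TD_PWD is assumed to be supplied by the environment.

#!/bin/ksh
# Illustrative sketch only: land a delimited file into a Teradata staging table
# with BTEQ, then refresh optimizer statistics on the primary index column.
# All object names are placeholders; TD_PWD is expected to come from the environment.

bteq <<EOF
.LOGON tdprod/etl_user,${TD_PWD};

.IMPORT VARTEXT '|' FILE=/data/in/customer.txt;
.REPEAT *
USING (cust_id VARCHAR(18), cust_name VARCHAR(60), open_dt VARCHAR(10))
INSERT INTO stg_db.customer_stg (cust_id, cust_name, open_dt)
VALUES (:cust_id, :cust_name, CAST(:open_dt AS DATE FORMAT 'YYYY-MM-DD'));

-- Keep demographics current for the optimizer after the load
COLLECT STATISTICS ON stg_db.customer_stg COLUMN (cust_id);

.LOGOFF;
.QUIT;
EOF

For bulk volumes the same pattern would typically be expressed as a MultiLoad or FastLoad script rather than a BTEQ .IMPORT.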

TECHNICAL SKILLS

ETL Tools: Informatica Power Center 9/8.6.1

Databases: Teradata, Vertica, DB2, Hive, Hadoop

Languages: SQL, UNIX shell scripting, JCL and COBOL

Development Tools: SQL Assistant, SQL Workbench, SSH Tectia Client, Autosys, Toad for Data Analysts, ChangeMan, CA7, Endevor, Xpeditor, HP Quality Center 9.2, MS Excel, Oozie, Sqoop

Database Modeling: Microsoft Visio

Operating System: Windows XP/NT/2000, DOS, UNIX, Linux

PROFESSIONAL EXPERIENCE

Confidential

Sr ETL/Hadoop/Big Data Developer

Responsibilities:

  • Develop Big Data processing scripts using Hive, Oozie, Pig and Sqoop (see the sketch at the end of this section).
  • Develop UNIX scripts to process data using Teradata utilities such as BTEQ, MultiLoad, FastExport and TPump.
  • Test the process and hand it over to production teams for implementation.
  • Design and code from specifications; analyze, evaluate, test, debug, document and implement moderately complex software applications.
  • Use coding methods in specific programming languages to initiate or enhance program execution and functionality.
  • Under general direction, devise or modify procedures to solve complex problems, considering computer equipment capacity and limitations, operating time and the form of desired results.
  • Competent to work at the highest technical level in all phases of applications programming activities.
  • Monitor program execution for expected performance.
  • Modify, install and prepare technical documentation for system software applications.
  • Interface with different departments within the organization regarding new deployments.
  • Identify, escalate and document production-impacting issues for the environment to confirm delivery for clients and business notifications.
  • Develop, maintain and report intranet metrics.

Environment: HADOOP, HIVE, SQOOP, OOZIE, PIG, UNIX, TERADATA, SQL, UNIX shell scripting, SQuirreL SQL Client, Oracle, SQL ASSISTANT, MySQL, SQL Developer
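
The following is a minimal sketch of the kind of Sqoop ingest, Hive partition registration and Oozie submission referenced in the bullets above; the JDBC URL, schema, table, column and path names are hypothetical placeholders, and a Sqoop Teradata connector is assumed to be available on the cluster.

#!/bin/ksh
# Illustrative sketch only: pull a source table into HDFS with Sqoop, register the
# landed partition with an external Hive table, and submit the Oozie job that drives
# the downstream Pig/Hive steps. All names and paths are placeholders.

LOAD_DT=$(date +%Y-%m-%d)
RAW_DIR=/data/raw/account_txn/$(date +%Y%m%d)

sqoop import \
  --connect jdbc:teradata://tdprod/DATABASE=src_db \
  --username etl_user --password-file /user/etl/.td_pwd \
  --table ACCOUNT_TXN \
  --split-by ACCT_ID \
  --target-dir "${RAW_DIR}" \
  --fields-terminated-by ',' \
  -m 4

# Register the new partition with the external Hive table and run a quick row count
hive -e "
  ALTER TABLE raw.account_txn ADD IF NOT EXISTS PARTITION (load_dt='${LOAD_DT}')
  LOCATION '${RAW_DIR}';
  SELECT COUNT(*) FROM raw.account_txn WHERE load_dt='${LOAD_DT}';
"

# Kick off the Oozie workflow that orchestrates the remaining processing
oozie job -oozie http://oozie-host:11000/oozie -config job.properties -run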

Confidential

Sr Teradata Developer / ETL Developer

Responsibilities:

  • Interacted with users and Business Analysts to collect and understand the business requirements.
  • Prepared business requirement documents and reviewed them with clients, downstream application teams and users.
  • Performed impact analysis and checked the technical viability of the proposed solution.
  • Created standard abbreviation documents for conceptual, logical, physical and dimensional data models.
  • Created and documented conceptual, logical, physical and dimensional data models.
  • Prepared a combined high-level and low-level design document for all the applications.
  • Created scripts and Autosys JILs as per the design documents (see the JIL sketch after this section).
  • Created tables in Teradata and Vertica and set up the various environments (DEV, SIT, UAT and PROD).
  • Performed performance tuning and collected statistics for new and existing processes.
  • Prepared unit test case documents and performed unit testing and integration testing.
  • Coordinated with upstream and downstream application owners to process and make available the required source data for testing.
  • Performed unit testing of the changed/new code.
  • Performed SIT and UAT testing of the changed/new code.
  • Held joint working sessions with LOBs and reviewed the results after each level of testing.
  • Involved in CAB activities and release planning to move the components.
  • Deployed the components into production with the support teams.
  • Performed pre-production execution and compared the results with existing production results.
  • Held joint working sessions with LOBs to review and validate the pre-production execution results.

Environment: UNIX, TERADATA, VERTICA, COBOL, JCL, SQL, Autosys, SQL Workbench, SSH Tectia Client, UNIX shell scripting, Toad for Data Analysts, SQL ASSISTANT, TSO/ISPF, ChangeMan, Endevor
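
Below is a minimal sketch of the kind of Autosys JIL referenced above; the job name, machine, owner and script/log paths are hypothetical placeholders.

# Illustrative sketch only: define an Autosys command job and load it through the
# jil utility. The job name, owner, machine and paths are placeholders.
jil <<'EOF'
insert_job: DW_CUST_LOAD_D   job_type: CMD
command: /apps/dw/scripts/load_customer.ksh
machine: etlhost01
owner: etluser
date_conditions: 1
days_of_week: all
start_times: "02:00"
std_out_file: /apps/dw/logs/DW_CUST_LOAD_D.out
std_err_file: /apps/dw/logs/DW_CUST_LOAD_D.err
alarm_if_fail: 1
EOF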

Confidential

Teradata Developer / Data Analyst

Responsibilities:

  • Interacted with the individual line-of-business users and Business Analysts to collect information and understand their systems.
  • Thoroughly analyzed the source systems' data and documented it in the standard format.
  • Prepared business requirement documents and reviewed them with clients, downstream application teams and users.
  • Prepared the data mapping document.
  • Created standard abbreviation documents for conceptual, logical, physical and dimensional data models.
  • Created and documented conceptual, logical, physical and dimensional data models.
  • Prepared individual high-level and low-level design documents.
  • Created jobs, scripts and JILs as per the design documents.
  • Created test tables in Teradata and set up the various environments (DEV, SIT, UAT and PROD); see the table DDL sketch after this section.
  • Performed performance tuning and collected statistics for new and existing processes.
  • Coordinated with upstream and downstream application owners to process and make available the required source data for testing.
  • Performed unit testing of the changed/new code.
  • Performed SIT and UAT testing of the changed/new code.
  • Reviewed the test results with LOBs and logged the defects into QC.
  • Logged defects in QC, assigned them to the respective teams, and worked on defect fixes raised by business users and the Quality team.
  • Involved in CAB activities and release planning to move the components.
  • Deployed the components into production and monitored a couple of production executions.

Environment: UNIX, TERADATA, SQL, Autosys, SSH Tectia Client, UNIX shell scripting, Microsoft Visio
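
Below is a minimal sketch of the kind of Teradata test-table DDL and statistics collection referenced above; the database, table and column names, the partition range and the TD_PWD environment variable are hypothetical placeholders.

#!/bin/ksh
# Illustrative sketch only: create a partitioned (PPI) test table with an explicit
# primary index and collect statistics on it. All object names and the date range
# are placeholders; TD_PWD is expected to come from the environment.
bteq <<EOF
.LOGON tdprod/etl_user,${TD_PWD};

CREATE MULTISET TABLE test_db.txn_test
(
  acct_id  INTEGER       NOT NULL,
  txn_dt   DATE          NOT NULL,
  txn_amt  DECIMAL(18,2)
)
PRIMARY INDEX (acct_id)
PARTITION BY RANGE_N (txn_dt BETWEEN DATE '2012-01-01' AND DATE '2014-12-31'
                      EACH INTERVAL '1' MONTH);

COLLECT STATISTICS ON test_db.txn_test COLUMN (acct_id);
COLLECT STATISTICS ON test_db.txn_test COLUMN (txn_dt);

.LOGOFF;
.QUIT;
EOF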

Confidential

Technical Lead

Responsibilities:

  • Interacted with users and Business Analysts to collect and understand the business requirements.
  • Analyzed all the sub-applications under C&RA to determine whether they were impacted.
  • Thoroughly analyzed the impacted applications to determine the level of impact.
  • Interacted with customers to gather and document requirements used to produce business requirement documents.
  • Created a combined high-level and low-level design document to capture the changes to the impacted applications.
  • Coded as per the HLD/LLD.
  • Coordinated with downstream and upstream application owners to process and make available the required data for user reporting.
  • Performed performance tuning and collected statistics for new and existing processes.
  • Performed unit testing of the changed/new code.
  • Performed SIT and UAT testing of the changed/new code.
  • Validated data after each level of testing.
  • Participated in peer reviews and release planning.
  • Deployed changed components into production using the ChangeMan tool.
  • Involved in CAB activities and release planning to move the components.
  • Deployed the components into production and monitored a couple of production executions.

Environment: TERADATA, Autosys, SSH Tectia Client, UNIX shell scripting, SQL ASSISTANT, TSO/ISPF, ChangeMan, Endevor, COBOL, JCL, SQL, UNIX

Confidential

Technical Lead

Responsibilities:

  • Interacted with users and Business Analysts to collect and understand the business requirements.
  • Designed the process as per the requirements and documented it using MS Visio; walked Business Analysts and LOBs through the process design.
  • Prepared high-level and low-level design documents as per the requirements design document.
  • Created jobs, scripts and JILs as per the design documents.
  • Coordinated with downstream and upstream application owners to process and make available the required data for user reporting.
  • Performed unit testing of the changed/new code.
  • Performed SIT and UAT testing of the changed/new code.
  • Reviewed the test results with LOBs and logged the defects into QC.
  • Logged defects in QC, assigned them to the respective teams, and worked on defect fixes raised by business users and the Quality team.
  • Performed performance tuning and collected statistics for new and existing processes.
  • Involved in CAB activities and release planning to move the components.
  • Deployed the components into production with the support teams.
  • Provided one month of warranty support and completed the production handover.

Environment: COBOL, JCL, SQL, UNIX, TERADATA, Autosys, SSH Tectia Client, UNIX shell scripting, SQL ASSISTANT, TSO/ISPF, ChangeMan, Endevor

Confidential

Technical Analyst

Responsibilities:

  • Interacted with users and Business Analysts to collect and understand the business requirements.
  • Prepared high-level and low-level design documents as per the requirements design document.
  • Created jobs, scripts and JILs as per the design documents.
  • Coordinated with downstream and upstream application owners to process and make available the required data for user reporting.
  • Performed unit testing of the changed/new code.
  • Performed SIT and UAT testing of the changed/new code.
  • Reviewed the test results with LOBs and logged the defects into QC.
  • Logged defects in QC, assigned them to the respective teams, and worked on defect fixes raised by business users and the Quality team.
  • Involved in CAB activities and release planning to move the components.
  • Deployed the components into production with the support teams.
  • Provided one month of warranty support and completed the production handover.

Environment: TERADATA, Autosys, SSH Tectia Client, UNIX shell scripting, SQL ASSISTANT, TSO/ISPF, ChangeMan, Endevor, COBOL, JCL, SQL, UNIX

Confidential

Developer

Responsibilities:

  • Interacted with Technical Leads and Business Analysts to collect and understand the business requirements.
  • Involved in preparing high-level and low-level design documents as per the requirements design document.
  • Developed COBOL programs and JCLs as per the design documents.
  • Used Endevor to compile COBOL and DB2 programs and maintained the different versions.
  • Created packages using Endevor and froze the components.
  • Performed unit testing of the changed/new code.
  • Performed SIT and UAT testing of the changed/new code.
  • Validated data after each level of testing.
  • Involved in reviewing and documenting the test results.
  • Logged defects in QC and worked on defect fixes raised by business users and the Quality team.

Environment: COBOL, JCL, DB2, VSAM, TSO/ISPF, Endevor, File Aid, SPUFI, DB2TOOLS, Xpeditor, QMF.
