
Data Engineer Resume


Wilmington, DE

PROFESSIONAL SUMMARY:

  • 10+ years of experience in the Information Technology field, with experience as a Technical Lead, Big Data Developer, Data Analyst, and Business Intelligence (BI) Developer in the financial and insurance domains.
  • 5 years of intensive experience in AWS and Big Data technologies, including EC2, EMR, S3, Lambda, Spark, Hive, Pig, Hadoop, Hue, and Python.
  • More than 5 years of experience in Data Analytics using Business Objects, Cognos, and open-source Python libraries (Bokeh, Matplotlib).
  • CI/CD and test automation using Python (Behave), Java (Cucumber), and Selenium.
  • 7 years of experience with Business Intelligence and ETL tools such as Informatica, Cognos 10.1/8.x, Business Objects XI R2, Ab Initio, and Cognos ReportNet 1.
  • Proven experience leading large, geographically dispersed software development teams in a global delivery model to reduce project implementation time and cost.
  • Excellent skills in Python, Oracle SQL, Teradata SQL, ETL jobs, and BI reports.
  • Self-motivated; enjoy working in technically challenging environments. Excellent communication and interpersonal skills and a quick learner.

TECHNICAL SKILLS:

Business Intelligence Tools: AWS Lambda, Amazon EC2, Amazon EMR, Spark, Hadoop, Pig, Hive, Hue, Impala, Cognos, Business Objects, Crystal Reports, Ab Initio GDE, Informatica

Database: Redshift, Hive, Oracle, DB2, Teradata, Sybase, Impala

Scheduler Tools: Control-M, AWS Lambda

Version Control: GitHub, SVN

Operating Systems: Windows XP/95/98/2000/2007, UNIX and Linux

Programming Languages: Python, Java, HTML, Shell scripting, PL/SQL

Methodologies: Agile, Kanban and SDLC

PROFESSIONAL EXPERIENCE

Confidential, Wilmington, DE

Data Engineer

Responsibilities:

  • Design and develop self-service data pipelines using Spark and the AWS platform (EC2, EMR, Lambda, S3) that populate datasets in the cloud
  • Create several business metrics using Spark and AWS EMR
  • Create a data pipeline using Python, AWS EC2, and AWS Lambda to flag the data and send email notifications to business users
  • Build CI/CD pipelines using Python (Behave) and Java (Cucumber)
  • Build complex logic to create 16 metrics using HiveQL and S3, and load them into Redshift to support the overdraft pod
  • Create PySpark scripts for data transformation
  • Create a data ingestion framework using HDFS commands, and write Pig scripts to implement transformation logic and store files in Parquet format in the data lake
  • Perform code reviews and release the final code to production
  • Provide post-production support and post-production data validation support
  • Actively participate in story grooming and planning sessions, and create the stories in Jira
  • Create design documents, architecture diagrams, and presentations, and present them to technology teams
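The flag-and-notify pipeline described above can be sketched in plain Python. This is an illustrative sketch only: the threshold, record fields, and helper names (`OVERDRAFT_THRESHOLD`, `flag_records`, `build_notification`) are hypothetical, not details from the actual project, and a real implementation would read data from S3 and send the message from inside a Lambda handler via SES or SNS.

```python
# Hypothetical sketch of the "flag the data and notify business users" step.
# The threshold, record fields, and helper names are assumptions for
# demonstration, not details from the actual project.

OVERDRAFT_THRESHOLD = -100.0  # assumed business rule

def flag_records(records):
    """Attach a boolean 'flagged' field to each record based on its balance."""
    return [
        {**r, "flagged": r["balance"] < OVERDRAFT_THRESHOLD}
        for r in records
    ]

def build_notification(records):
    """Summarize flagged records into a plain-text email body."""
    flagged = [r for r in records if r["flagged"]]
    lines = [f"{len(flagged)} account(s) flagged for review:"]
    lines += [f"- account {r['account_id']}: balance {r['balance']}"
              for r in flagged]
    return "\n".join(lines)

sample = [
    {"account_id": "A1", "balance": -250.0},
    {"account_id": "A2", "balance": 40.0},
]
body = build_notification(flag_records(sample))
```

In a Lambda deployment, `body` would become the message of an SES or SNS notification rather than a local string.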

Environment: AWS EMR, AWS EC2, AWS S3, Spark, Python, Hadoop, Hive, Pig, Hue, GitHub, Jira, VersionOne

Confidential, Newark, DE

Hadoop Developer/BI Analyst

Responsibilities:

  • Participated in project estimation and in requirement and design discussions; prepared and sent notes from the discussions
  • Created design documents and architecture diagrams and reviewed them with team members
  • Led the offshore development team for product deliverables
  • Converted Teradata SQL scripts into HiveQL for better performance
  • Created an automated data pipeline using HDFS commands to load the data into the staging layer
  • Created Pig scripts to implement transformation logic and loaded the data into the integration layer
  • Applied data quality checks (data type, null validation, duplicate validation)
  • Applied incremental load logic (insert, update, delete)
  • Developed data models for reports using Cognos Framework Manager
  • Developed reports in Cognos Report Studio
  • Provided system testing/UAT support and fixed defects
  • Created the implementation plan and reviewed it with the Change Management team
  • Provided post-production support

Environment: Hive, HDFS commands, Cognos 10.1, Informatica, Teradata, DB2, Control-M, Unix, ALM

Confidential, Atlanta, GA

BI Analyst

Responsibilities:

  • Interacted with business users to gather, understand, and document the requirements
  • Developed Framework Manager models
  • Generated complex reports in Cognos 10.1 Report Studio, including drill-down reports
  • Created list reports and crosstab reports using Cognos 10.1
  • Led a 10-member team and sent weekly status reports to the stakeholders

Environment: Cognos 8.4, DB2, Oracle 10g, IBM ClearQuest

Confidential

Lead Cognos Developer

Responsibilities:

  • Led the Cognos development team
  • Developed list, crosstab, and drill-through reports involving multiple prompts and conditional formatting in Report Studio against a Sybase 15 database
  • Deployed packages and reports from one environment to another and provided UAT support
  • Analyzed the existing Business Objects universe and reports
  • Created Cognos Framework Manager models for report development
  • Performed Cognos unit testing and migrated Cognos reports
  • Wrote queries against the DB2 database for unit testing

Environment: Business Objects XI R2, Cognos 8 Report Studio, DB2, IDB, Mainframes, and Oracle
